You can't - robots.txt restricts crawling, not indexing.
So if those URLs are already in an index (like Google's), they will stay there even if you disallow crawling; after some time they just stop showing a description, since the crawler can no longer fetch the page.
If the content no longer exists, serving a 404 is the correct way to handle it.
If it does exist under a new URL, the correct way is to 301 redirect the old URL to the new one.
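For example, on an Apache server (an assumption — nginx and most other servers have an equivalent), a permanent redirect can be set up in `.htaccess` with `mod_alias`, using placeholder paths:

```apache
# .htaccess — assumes Apache with mod_alias enabled
# /old-page and /new-page are example paths; substitute your own
Redirect 301 /old-page /new-page
```

Once search engines recrawl the old URL and see the 301, they will transfer the indexed entry (and most of its ranking signals) to the new URL.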