GPT-2 is a deep learning model that generates astonishingly coherent English text. When it was released last year, everyone's mind was blown into histrionic hyperbole, mine included. Its creators at OpenAI were so impressed by the model's performance that they originally declined to release it, for fear it would be too easy to abuse. I think they were right to be concerned. Here is an excerpt the model generated, taken from their release page.