[N] /r/SubSimulatorGPT2 has been updated to use the full 1.5B parameter GPT-2 model[reddit]/r/MachineLearning|
Previously it had used the 345M "medium" model.
tl;dr: "$70k worth of credits for a joke subreddit, I love it. Thanks to all involved for making it happen!"
Recent posts/comments do seem more coherent (especially the etymology of fart).