Worth noting that the new 345M model is still far from the full 1.5B model they have been declining to release. The headline makes it sound like they finally decided to publish the full model, but it's just a slightly larger demo model.
It strikes me as weird that they are not publishing it, by the way. According to their rhetoric when they started, this was the whole purpose of OpenAI: acknowledging that we've reached a point where anybody with enough resources can produce something, let's say, interesting with ML, and striving to give everyone more or less equal access by serving as a kind of effective academic organization for the world, before Facebook or Google takes over completely.
Plus, it's not as if this thing is more "potentially harmful" than, well... basically anything useful: electricity, the internet, fire, weaker language models. In fact, it isn't even anything new; it's just a (possibly) less broken language model than what we already have.
Admittedly, it would be quite problematic to run the full model on today's mainstream GPUs, so I'm not that saddened by them hoarding it. It just seems curious to me.