A hangover is due

First reported in ‘The Information’, which needs a subscription, but TechCrunch has coverage too. It seems OpenAI has created a new foundations team because its newer models are not improving as expected:

“In other words, the rate of improvement seems to be slowing down. In fact, Orion might not be reliably better than previous models in some areas, such as coding.
In response, OpenAI has created a foundations team to figure out how the company can continue to improve its models in the face of a dwindling supply of new training data. These new strategies reportedly include training Orion on synthetic data produced by AI models, as well as doing more to improve models during the post-training process.”

Should be interesting, as most of the really knowledgeable people have quit.

I have posted quite often that improvements probably follow an ‘S-curve’, about the dangers of ingesting artificial (synthetic) data, and that the underlying foundational technology will not create AGI. In simple terms: buying lots of GPUs and building data centres won’t fix the problem.
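To make the ‘S-curve’ point concrete, here is a purely illustrative sketch (my own toy numbers, not anyone’s benchmark data): on a logistic curve, each extra unit of input buys less improvement once you are past the inflection point, which is the shape diminishing returns take.

```python
import math

def logistic(x, k=1.0, x0=4.0, L=1.0):
    """Toy S-curve: 'capability' as a function of cumulative investment/compute."""
    return L / (1.0 + math.exp(-k * (x - x0)))

# Marginal gain per extra unit of input grows early on,
# then shrinks sharply once past the inflection point (x0).
for x in range(0, 10, 2):
    gain = logistic(x + 1) - logistic(x)
    print(f"x={x}: capability={logistic(x):.3f}, marginal gain={gain:.3f}")
```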

Gary Marcus covers it well here:

CONFIRMED: LLMs have indeed reached a point of diminishing returns
Science, sociology, and the likely financial collapse of the Generative AI bubble

Silly numbers of billions have been invested in this, on the premise that more power means better models. It doesn’t.

There is a hell of a hangover looming.
