OpenAI’s next-gen Orion model is hitting a serious bottleneck, according to a new report – here’s why

OpenAI is running into difficulties with Orion, the next-gen model powering its AI. The company is struggling in certain areas when it comes to the performance gains realized with the successor to GPT-4.

This comes from a report by The Information, citing OpenAI employees, who claim that the increase in quality seen with Orion is ‘far smaller’ than that witnessed when moving from GPT-3 to GPT-4.

We’re also told that some OpenAI researchers say Orion “isn’t reliably better than its predecessor [GPT-4] in handling certain tasks.” Which tasks would those be? Apparently, coding is a weak point, with Orion possibly not outdoing GPT-4 in this arena – although it’s also noted that Orion’s language skills are stronger.

So, for general-use queries – and for jobs such as summarizing or rewriting text – it sounds like things are going (relatively) well. However, these rumors don’t sound quite as hopeful for those looking to use AI as a coding helper.

So, what’s the problem here?

By all accounts, OpenAI is running into something of a wall when it comes to the data available to train its AI. As the report makes clear, there’s a “dwindling supply of high-quality text and other data” that LLMs (Large Language Models) can work with in pre-release training to hone their powers in solving knottier problems like resolving coding bugs.

These LLMs have chomped through a lot of the low-hanging fruit, and now finding this good-quality training data is becoming a considerably more difficult process – slowing down advancement in some respects.

On top of that, this training will become more intensive in terms of computing resources, meaning that developing (and running) Orion – and further AI models down the line – will become much more expensive. Of course, the user of the AI will end up footing that bill, one way or another, and there’s even talk of more advanced models becoming effectively “financially unfeasible” to develop.

Not to mention the impact on the environment in terms of bigger data centers whirring away and sucking more power from our grids, all at a time of increasing concern around climate change.

While we need to take this report with an appropriate amount of caution, there are worrying rumblings here, foreshadowing a serious reality check for the development of AI going forward.

The Information further notes that a different approach may be taken in terms of improving AI models on an ongoing basis after their initial training – indeed, this may become a necessity from the sound of things. We shall see.

Orion is expected to debut early in 2025 (and not imminently, as some rumors have hinted), and it may not be called ChatGPT-5, with OpenAI possibly set to change the naming scheme of its AI completely with this next-gen model.
