OpenAI edges closer to making its first AI chip in bid to power your favorite new apps

OpenAI is a step closer to developing its first AI chip, according to a new report – as the number of developers making apps on its platform soars alongside cloud computing costs.

The ChatGPT maker was first reported to be in discussions with several chip designers, including Broadcom, back in July. Now Reuters is claiming that a new hardware strategy has seen OpenAI settle on Broadcom as its custom silicon partner, with the chip potentially landing in 2026.

Before then, it seems OpenAI will be adding AMD chips to its Microsoft Azure system, alongside the existing ones from Nvidia. The AI giant's plans to make a 'foundry' – a network of chip factories – have been scaled back, according to Reuters.

The reason for these reported moves is to help reduce the ballooning costs of AI-powered applications. OpenAI's new chip apparently won't be used to train generative AI models (which is the domain of Nvidia chips), but will instead run the AI software and respond to user requests.

During its DevDay London event today (which followed the San Francisco version on October 1), OpenAI announced some improved tools that it's using to woo developers. The biggest of these, the Realtime API, is effectively Advanced Voice Mode for app developers, and it now offers five new voices with improved range and expressiveness.
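For a sense of what developers actually work with, here is a minimal sketch of opening a Realtime API session over WebSocket, based on OpenAI's published docs. It assumes the third-party `websockets` package, an `OPENAI_API_KEY` environment variable, and the preview model and voice names current at the time of writing; the exact model string, voice list, and event format may change.

```python
# Minimal Realtime API sketch: open a WebSocket session, pick a voice,
# send one text prompt, and print server events until the response finishes.
import asyncio
import json
import os

import websockets  # pip install websockets

REALTIME_URL = "wss://api.openai.com/v1/realtime?model=gpt-4o-realtime-preview"

async def main():
    headers = {
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "OpenAI-Beta": "realtime=v1",
    }
    # Note: `extra_headers` is the keyword in websockets <= 12;
    # newer releases renamed it to `additional_headers`.
    async with websockets.connect(REALTIME_URL, extra_headers=headers) as ws:
        # Configure the session: one of the newer expressive voices, text + audio replies.
        await ws.send(json.dumps({
            "type": "session.update",
            "session": {"voice": "ash", "modalities": ["text", "audio"]},
        }))
        # Add a user message to the conversation and ask the model to respond.
        await ws.send(json.dumps({
            "type": "conversation.item.create",
            "item": {
                "type": "message",
                "role": "user",
                "content": [{"type": "input_text", "text": "Say hello in one sentence."}],
            },
        }))
        await ws.send(json.dumps({"type": "response.create"}))
        # Stream server events (audio deltas, transcripts, etc.) until the response is done.
        async for raw in ws:
            event = json.loads(raw)
            print(event.get("type"))
            if event.get("type") == "response.done":
                break

if __name__ == "__main__":
    asyncio.run(main())
```

In a real voice app the audio delta events would be decoded and played back rather than just printed, but the session flow is the same.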

Right now, three million developers from around the world are using OpenAI's API (application programming interface), but the problem is that many of its features are still too expensive to run at scale.

OpenAI says it's reduced the price of API tokens (in other words, how much it costs developers to use its models) by 99% since the launch of GPT-3 in June 2020, but there's still a long way to go – and this custom AI chip could be an important step towards making AI-powered apps cost-effective and truly mainstream.
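To see why per-token pricing dominates the economics, here is a rough, illustrative back-of-the-envelope calculator. The per-million-token prices and usage figures are placeholders, not OpenAI's actual rates, so they would need to be swapped for the numbers on the current pricing page.

```python
# Illustrative only: estimate a month's API spend for one AI-powered feature.
def monthly_cost(requests_per_day: int,
                 input_tokens: int,
                 output_tokens: int,
                 price_in_per_million: float,
                 price_out_per_million: float) -> float:
    """Return the estimated 30-day cost in dollars for a single feature."""
    per_request = (input_tokens * price_in_per_million +
                   output_tokens * price_out_per_million) / 1_000_000
    return per_request * requests_per_day * 30

# Hypothetical example: a summarisation feature handling 10,000 requests a day,
# each with ~4,000 input tokens and ~500 output tokens, at placeholder prices
# of $2.50 / $10.00 per million input / output tokens.
print(f"${monthly_cost(10_000, 4_000, 500, 2.50, 10.00):,.2f} per month")
```

Even at these placeholder rates the bill lands in the thousands of dollars per month for a single feature, which is why cheaper inference hardware matters so much to app developers.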

OpenAI-powered apps are coming

The sky-high cost of cloud AI processing is still a handbrake on developers building OpenAI's tools into their apps, but some startups have already taken the plunge.

The popular online video editor Veed plugs into several OpenAI models to offer features like automated transcripts and the ability to pick out the best soundbites from long-form videos. An AI-powered notepad called Granola also leverages GPT-4 and GPT-4o to transcribe meetings and send you follow-up tasks, without needing a meeting bot to join your call.

Away from consumer apps, a startup called Tortus is using GPT-4o and OpenAI's voice models to help doctors. Its tools can listen to doctor-patient chats and automate a lot of the admin like updating health records, while apparently also improving diagnosis accuracy.

Leaving aside the potential privacy and hallucination concerns of AI models, developers are clearly keen to tap into the power of OpenAI's tools – and there's no doubt that its low-latency, conversational voice mode has massive potential for customer service.

Still, while you can expect to be talking to one of OpenAI's voice models when calling a store or customer service line soon, those AI running costs could slow down the rate of adoption – which is why OpenAI is seemingly keen to develop its own AI chip sooner rather than later.
