AI Costs Drop as OpenAI Faces GPU Constraints, While Synthetic Data Eases Training Concerns
When the leading executive of one of the world's most valuable startups arrives at a conference, you're bound to be intrigued by what was said.
The Morgan Stanley tech conference was private, meaning no media or public access. But when Sam Altman, the CEO of OpenAI, speaks, you want to know the details of what went down. That's what we're sharing below, thanks to analysts' discussions.
Morgan Stanley has been working on AI for years to help transform the business world and society. That may be one reason Altman chose to speak there in the first place.
The cost of accessing and making the most of AI tools is falling. This has a lot to do with newer, more advanced techniques that simplify AI model production, along with the competition and variety now present in the industry.
"Inference and reasoning were growing faster than pre-learning and learning workloads. All acknowledged that the price of inference is falling, Sam Altman mentioned by 16X of the prior year - this is a good thing," explained Mark P. McDonald, Distinguished Vice President and Research Fellow at Gartner.
With so many models available, the laws of supply and demand apply: models are becoming commoditized and cheaper. This is wonderful news for people who need greater access to such models, as they're paying far less than they did in the past.
The company shared that OpenAI faces serious capacity constraints and that its GPUs remain saturated. It has never gone through a phase where it could not sell access to its GPUs at a reasonable price.
At a high level, the comments are consistent with thematic frameworks in which a small number of companies focus on training large language models, which requires major computing power. This comes amid ongoing expert debate about the strength of GPU demand and how it could shape the future.
OpenAI is not concerned about matters like training data or data supply for training models. The fact that the AI giant can use its GPUs and existing AI models to produce more data puts it at ease.
This process, dubbed synthetic data generation, is slowly becoming more useful at certain stages of model creation. According to analysts at Morgan Stanley, data should not be deemed a constraint the way computing has become one.