DeepSeek Just Confirmed My Suspicions About OpenAI
The ChatGPT maker has been playing a losing game

There are three typical reasons why OpenAI makes the news:
- There’s a fiasco among board members
- They’re launching a new product
- Or they're whining about how they're still not making a profit
They recently made the news for reason number 3.
In early January, Sam Altman dropped this irksome tweet.
[Screenshot of Altman's tweet admitting that OpenAI is losing money on ChatGPT Pro subscriptions because people use them more than expected.]
For context, on Dec 5, 2024, OpenAI launched the $200 ChatGPT Pro subscription which, in their own words, “enables scaled access to the best of OpenAI’s models and tools.”
A $200/month subscription is quite pricey, but subscribers get unlimited access to OpenAI's smartest models. We're talking some of the coolest tools on the AI market right now: OpenAI o1, o1-mini, GPT-4o, Advanced Voice, and more.
That OpenAI is still losing money despite this subscription model is not surprising to me at all. For one thing, it confirms a long-held position of mine: OpenAI’s approach of launching too many models is not a wise play.
I lost count of how many other shiny tools OpenAI has launched since ChatGPT debuted in 2022. Launching new and more advanced products seems to be their primary strategy for profitability.
There is always a new product from the AI maker before the market can even get used to the one they launched last.
But this isn't how things went for today's biggest companies. These tech giants had far fewer products when they turned profitable:
- Google turned profitable when it had only two products in the market: Google Search and Google AdWords.
- Facebook turned profitable with just the Facebook App and Ads as its primary offerings.
- Apple was pulling in profits two years after it was founded, with the Apple I and II as its main products.
- Microsoft started making a profit immediately after launch, by being the sole supplier of the BASIC interpreter for the Altair 8800 kit.
OpenAI, with more than eight product releases (I know I said I lost count, but I had to catch up so I could say this), has yet to turn a profit.
And I get it: the primary reasons for this unprofitability are the insane cost of training and running AI models, plus the heavy investments that go into research and development.
It costs OpenAI up to $700,000 a day to run its models, and the company spends billions annually on research and development. The result: despite pulling in more than $3.7 billion in revenue last year, OpenAI ran a net loss of over $5 billion.
Everything about OpenAI is big: users, valuation, revenue, number of products, and losses. Everything except profit!
I figured that launching too many products as a path to profitability is not a wise play, because it keeps driving training and running costs ever higher.
It is simple math.
If it costs OpenAI a hypothetical $0.10 per query when someone uses one of its products, that cost is bound to rise as usage grows, whether the usage comes from a single product or ten. More products and more users mean more cash burned.
Every startup would give anything to have a million users, and would go to any lengths to have more than one product that people chase after. With more than 300 million users for ChatGPT alone, and up to 10 products in the market, OpenAI ought to be living the dream. But given the cost realities involved, the dream of every startup is, in this case, OpenAI's nightmare.
I don't know why OpenAI has kept treading this path of more products. Apply a bit of first-principles thinking and it becomes clear that, rather than more products, OpenAI ought to focus on innovations geared towards reducing training and running costs.
Going back to our hypothetical scenario: if the cost per query can be reduced to $0.01 for every user, that would amount to a 10x reduction in running costs, as the back-of-envelope sketch below shows. And who said they'd stop there?
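To make that arithmetic concrete, here is a tiny Python sketch of the hypothetical scenario. The per-query costs come from the example above; the user count echoes the 300-million figure, and the queries-per-day number is an assumption made purely for illustration, not a real OpenAI metric:

```python
# Back-of-envelope inference cost model using the article's hypothetical numbers.

def monthly_inference_cost(users, queries_per_user_per_day, cost_per_query, days=30):
    """Rough monthly spend, assuming every query costs the same flat amount."""
    return users * queries_per_user_per_day * cost_per_query * days

users = 300_000_000           # roughly ChatGPT's reported user base
queries_per_day = 5           # assumed average, purely illustrative

baseline = monthly_inference_cost(users, queries_per_day, 0.10)   # $0.10 per query
optimized = monthly_inference_cost(users, queries_per_day, 0.01)  # $0.01 per query

print(f"Baseline:  ${baseline:,.0f}/month")
print(f"Optimized: ${optimized:,.0f}/month")
print(f"Reduction: {baseline / optimized:.0f}x")
```

However the assumed volumes are tweaked, the ratio stays the same: cutting the per-query cost by 10x cuts the whole inference bill by 10x.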
SpaceX followed this cost-reduction model with rocket building and launching.
Am I saying it is something that is easy to do? Definitely not.
I doubted whether this cost-reduction approach could be applied to training and running AI, especially when you consider how nascent the industry is, and the reality that OpenAI risks losing to competitors if it focuses on cost reduction rather than on launching new and better products.
Those were my doubts, until DeepSeek launched.
Lessons from SpaceX and DeepSeek
Space exploration used to be the biggest deal. But everything about it is dang complex.
From the theories that underpin it to the practical business of building the rockets that explore everything beyond Earth, space exploration is something only a few dare to go into. It goes over most people's heads.
Worse still, everything about space exploration is ridiculously expensive. Sending a rocket to space would typically cost NASA close to $2 billion. The implications of this expense are two-fold:
- Fewer rockets are sent to space
- Space exploration happens at a slower pace as a result
Then, Elon Musk decided to do something different.
Through SpaceX, Musk made rocket building and launching significantly cheaper. A SpaceX flight can cost between $62 million and $90 million, a price reduction of more than 90%. This has the two-fold effect of letting us send more rockets to space and, therefore, speeding up space exploration.
The company was able to achieve this through a combination of factors:
- In-house manufacturing (as opposed to outsourcing)
- Simplification of design
- And reusability
The result: as of April 2024, SpaceX was launching a mission every 2.7 days. That number's significance shines when you consider that from the mid-1980s through the 2010s, the entire world combined, not just NASA, was launching a mission every 2.8 days!
Enter DeepSeek
DeepSeek has been freaking the tech world out since launching its latest R1 model on January 20th.
You don't need any technical knowledge to understand the brilliance and significance of this launch.
Only three points are worth mentioning:
- Its performance matches or beats OpenAI’s o1 model on certain AI benchmarks.
- Usage is free, and API pricing is roughly one-thirtieth of OpenAI's o1.
- Training one of its models cost $5.6 million, compared with the $100 million to $1 billion often cited as the cost of building a comparable model.

Instead of doing what established players in the AI industry had been doing, DeepSeek chose to do things differently. This helped them cut costs while achieving results nearly equal to those of the other players in the game. They did at least four things differently:
- Used smart optimization to make their existing resources work smarter.
- Trained only the important parts of the model (see the sketch after this list).
- Went for a smaller memory footprint, leading to faster results and lower costs.
- Leveraged reinforcement learning.
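To give a rough feel for what "training only the important parts" can mean in practice, here is a minimal NumPy sketch of sparse expert routing, the general mixture-of-experts idea of activating only a few sub-networks per token so that most parameters sit idle. The sizes, the routing scheme, and every name here are arbitrary illustrations, not DeepSeek's actual architecture or code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture-of-experts layer: 8 experts, but only 2 are used per token.
n_experts, d_model, top_k = 8, 16, 2
experts = rng.standard_normal((n_experts, d_model, d_model))  # one weight matrix per expert
router = rng.standard_normal((d_model, n_experts))            # scores experts for each token

def moe_layer(x):
    """x: (d_model,) token vector -> output computed from only the top-k experts."""
    scores = x @ router                      # how relevant each expert looks for this token
    chosen = np.argsort(scores)[-top_k:]     # indices of the k highest-scoring experts
    weights = np.exp(scores[chosen])
    weights /= weights.sum()                 # softmax over the chosen experts only
    # Only top_k of the n_experts weight matrices are touched here, so the compute
    # (and the gradients during training) scales with k, not with n_experts.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

token = rng.standard_normal(d_model)
print(moe_layer(token).shape)                # (16,): same output size, a fraction of the FLOPs
```

The point of the toy example is the ratio: with 2 of 8 experts active, each token pays roughly a quarter of the dense compute, and that is the kind of lever reported for DeepSeek's mixture-of-experts design, at far larger scale.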
The details are more complex than this simple summary, but the result is plain for everyone to see.
Hopefully, OpenAI will see the need to innovate around cost-reduction rather than keep flooding the market with numerous products and charging users exorbitant subscription fees. It can be done.
Some netizens commented:
Lloyd Osler, P. Eng.: DeepSeek has certainly disrupted the AI landscape, redefining the cost-to-performance paradigm. U.S. tech companies will have no choice but to rethink their strategies, including pricing models and operational approaches.
DeepSeek R1 is already being integrated into third-party applications that rely on AI, simply because it costs one-tenth as much to run and has one-tenth the environmental impact. Its existence has significant ramifications for the global deployment of AI data centers, particularly outside the U.S., ensuring that AI technology cannot be leveraged as a bargaining chip or a coercive tool by the new Trump administration.
Andy Spence: Not convinced by Altman at all. The things he says sound like he is stuck in Fake It mode.
Bobby Dennie: So all is not good; there's a lot of shiny-object syndrome and short-term memory in this world. Suddenly the company that started the revolution isn't cool anymore. Still, OpenAI has the best ecosystem; everyone else needs to bolt something extra on to complete every task and make a real product or meaningful action. The standard and the ChatGPT brand still prevail. Let's see about the rest; nothing really changes, it just opens more doors.