Don't Get Used To Cheap AI (axios.com)
- Reference: 0180973826
- News link: https://news.slashdot.org/story/26/03/13/1926255/dont-get-used-to-cheap-ai
- Source link: https://www.axios.com/2026/03/12/ai-models-costs-ipo-pricing
> Flashback: Silicon Valley has seen this movie before. The so-called "[4]millennial lifestyle subsidy" meant VC money helped underwrite cheap Uber rides and DoorDash deliveries. Before that, Amazon built its base with low prices, free shipping and, for years, no sales tax in most states. Eventually, all of these companies had to charge enough to cover costs -- and make a profit.
>
> Follow the money: The current iteration of AI subsidies won't last forever. Both OpenAI and Anthropic are widely expected to go public. Public investors will demand earnings growth and expanding margins. Even as chips get more efficient, total spending keeps rising. Labs need more capacity, more upgrades and more supply to meet demand.
>
> The bottom line: The costs of AI will keep going down. But total spend from customers will need to keep going up if AI companies are going to become profitable and investors are ever going to get returns on their massive investments.
[1] https://www.axios.com/2026/03/12/ai-models-costs-ipo-pricing
[2] https://tech.slashdot.org/story/26/02/01/2255206/is-metas-huge-spending-on-ai-actually-paying-off
[3] https://slashdot.org/story/26/02/12/1931255/anthropic-raises-30-billion-at-380-billion-valuation-eyes-ipo-this-year
[4] https://www.theatlantic.com/newsletters/archive/2022/06/uber-ride-share-prices-high-inflation/661250/
The steps (Score:3)
1. Subsidize product running off "big iron"
2. Get people hooked
3. Jack up the price as high as it'll go
4. Wonder why everyone decided to run the software on their own hardware instead.

Rinse and repeat decade after decade.
Re: (Score:3)
The cloud was also more expensive than rolling your own, but you couldn't do that because the megacorps hired all the programmers. Then they realized hardware was cheaper than manpower, so those programmers were let go and the megacorps just bought up all the hardware to make sure nobody can run their own models.
Re: (Score:2)
Why wouldn't you be able to have your own private, open-source AI, without the need for an internet connection, on a simple desktop? Not possible today perhaps, but maybe in the not-too-distant future.
Re: (Score:2)
It's absolutely possible today. I've been using local models for code-type work (no, of course not vibe coding, but for identifying features or locating them inside a new-to-me codebase) entirely on the garbage Dell Windows laptop I was handed for work. Ollama and Open WebUI are free. Plenty of models are free.
Re: (Score:2)
#4 is where we should get to, so that only the people who really want or need AI get it, and at their own expense. Not at public expense, as happens now with these AI ops running in the cloud, where nobody really has to cover the actual cost.
Distorted reality. (Score:5, Insightful)
If AI execs think that most people would pay for AI, then they are truly delusional. We only do it because it's free and/or forced upon us.
Besides, our occasional use of your tools doesn't justify paying a subscription for it.
If you start charging, or charging more, we'll go somewhere else, even if that means reddit or StackOverflow. I'm sure they'd welcome us back with open arms.
Re: (Score:1)
I assume this guy got downvoted for the slur against the disabled ... but what he's saying about employers paying for AI is dead on.
If you make say $200k/year, even $500/month ($6k a year) is a relative drop in the bucket (3%). Claude makes me far, far more than 3% more productive.
Re: (Score:2)
The problem is that if you take the $200 Claude Max plan and calculate what the same usage would cost via the pay-as-you-go API, the difference is staggering. Over 10x. It would cost you thousands a month without the plan. Anthropic HAS to be losing money on those. Even users who are not maxing out their usage each month, unless they are using only 15-20% of it, are costing Anthropic more to serve than their subscription brings in. Eventually they will need to either find a way to drastically lower the
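The gap the parent describes can be sketched with a quick back-of-the-envelope script. All of the prices and token volumes below are invented assumptions for illustration, not Anthropic's actual rate card:

```python
# Back-of-the-envelope comparison of a flat monthly plan vs pay-as-you-go
# API pricing. Every number here is an assumption for illustration only.

def payg_monthly_cost(input_tokens_m, output_tokens_m,
                      price_in_per_m=3.00, price_out_per_m=15.00):
    """Monthly API cost in USD, given millions of input/output tokens
    and assumed per-million-token prices."""
    return input_tokens_m * price_in_per_m + output_tokens_m * price_out_per_m

subscription = 200.0  # assumed flat plan price, USD/month

# A heavy agentic-coding user might plausibly burn 500M input / 50M output
# tokens in a month of all-day sessions.
heavy = payg_monthly_cost(500, 50)  # 500*3 + 50*15 = 2250
print(f"PAYGO equivalent: ${heavy:,.0f}/mo vs ${subscription:,.0f} flat")
print(f"Roughly {heavy / subscription:.1f}x the subscription price")
```

Even if the real per-token prices are a fraction of these, the qualitative point stands: the heaviest flat-plan users can cost far more to serve than they pay.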
Re: um (Score:2)
But does your Claude subscription make you significantly more productive than my 30B qwen model running for pennies on a local 16gb gpu?
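For a sense of why a 30B model (just about) fits on a 16 GB card, here is a rough VRAM estimate. The overhead factor and bit-widths are assumptions; real memory use also depends on context length and the runtime:

```python
# Rough VRAM estimate for running an LLM locally: parameters x bytes per
# weight, plus an assumed overhead factor for KV cache and activations.
# The 1.2 overhead factor is a made-up ballpark, not a measured value.

def vram_gb(params_b, bits_per_weight, overhead=1.2):
    """Approximate VRAM in GB for params_b billion parameters."""
    weight_bytes = params_b * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for bits in (16, 8, 4):
    print(f"30B model @ {bits}-bit: ~{vram_gb(30, bits):.0f} GB")
# At 4-bit quantization a 30B model lands around 18 GB -- slightly over a
# 16 GB card, which is why people offload a few layers to system RAM.
```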
I doubt it.
And when 80GB GPUs inevitably become affordable in 3-5 years' time, even the slight edge of the large models is likely to be lost.
I just don't see how these hyperscalers will recoup their massive capital investments before consumer grade hardware catches up and their business model collapses.
Ridiculous Millennial Bashing (Score:1)
I see. So these stupid AI companies, with bots that are right only 60% of the time and no path to profitability, are a reason to drag Millennials for opportunistically slurping up subsidies from the world's stupidest investors: Silicon Valley VCs. Instead of a reason to point out how idiotic LLMs and VCs are.
Sure.
Re: (Score:2)
I understand the logic, though. The big tech companies want to get people hooked on cheap and easy AI, in the hope that they will pay for subscriptions once the free-tier plans get neutered.
What they seem to be missing is that running an AI model locally is also becoming cheap and easy. Installing OpenClaw is basically as easy as copying and pasting an install command into a command line window and entering an API key. I'm sure that we'll soon have GUI installers that make that even ea
Just a reminder AI is not for you (Score:1)
It is a tool for the Epstein class to eliminate you from their supply chains and dependency chains.
They have had enough of paying filthy peons and being dependent on them.
Seriously, try to imagine what somebody at the pinnacle of human power and wealth feels when they stop and think about how all of that power and all of that wealth is entirely dependent on the rest of the planet agreeing that they get it. It eats them alive.
Kings have traditionally used feudalism to create a top-down hierarchical
No problem. (Score:4, Insightful)
I've never used it and I never plan to
Yet, thanks to skyrocketing RAM prices, I've still had to pay for it.
I'll just run it locally (Score:2)
There are plenty of models you can run on your own hardware. Sure, the capabilities will be significantly less, but that's better than being trapped in an ever-increasing subscription price.
Re: (Score:2)
Yeah... memory and storage prices are going through the roof right now.
The higher prices do not seem to be impacting the large PC manufacturers as much, though. Amusingly, you can get an entire Macbook Neo with the education discount ($499) for less than the price of a 32 GB high-speed DDR5 memory kit ($550-$600).
Cheap AI is here to stay (Score:3)
I disagree completely with the premise. Prices don't necessarily have to increase. We have yet to reach maturity of chips optimized for inference, in addition to the regular old factor of computing work per dollar going up over time. Moore's law might be "slowing down" but we aren't at the end of the road yet. Keeping today's features running will cost less to deliver in both infrastructure and electricity in the future.
What will more likely happen is that features and functionality will keep expanding to use more processing power. But where is the limit? I say that comes when we can render, at near-realtime 8K/240Hz, a video (or video game if you prefer) with a procedurally generated world, characters, and storyline based on the user's input via whatever real-world data, UI or sensors you want to use. This might even be possible now if you are a billionaire with access to million-dollar server farms. Probably my imagination isn't broad enough in estimating the limit of "personal computing", but additional computing power beyond that seems pointless for any one individual.
In any case, the price of an AI product will depend on the features offered and how much hardware is needed. You can probably run a 500-billion-parameter LLM on a smart watch in 2055, but if you want that power today it seems to cost about $20 a month. I doubt anyone will ever try to charge more for today's $20 feature set. The price of this stuff will absolutely decrease; the only unknown is how companies will roll out new functionality and how specific future features fit into the pricing tiers over time.
Re: (Score:2)
A lot of the cost is in training. AI companies got caught with their hand in the "stealing public work" cookie jar, and that's unlikely to be allowed to happen again at any meaningful scale. That means new AI companies are unlikely to form. They are more or less out of places to steal from without actually paying someone, and that will increase costs as well.
Facebook's LLaMA is OSS only because of how far they know they are behind, and even to get there they got sued for stealing from pirate sites to train the
Re: (Score:2)
The price of this stuff will absolutely decrease, the only unknown is how companies will roll out new functionality and how specific future features fit into the pricing tiers over time.
I agree. The opportunity for AIs to act as Agents will be worth lots to providers. As always, we are the product.
Re: (Score:2)
Good models also get smaller. Did you try Ministral 8B or Qwen3.5 9B? They beat many much larger models from 6 months ago and run fast on cheap desktop hardware.
I question their strategy (Score:2)
AI companies released models early and for free
They also loudly and publicly made bold claims that all jobs will be replaced by AI
I can only guess that they did this to create excitement and attract investment
The general public reacted by using AI to create slop and scams
Non technical people saw only the slop, scams and predictions of job loss, and turned strongly against AI
Investors reacted irrationally
It might have been a better strategy to keep it in the lab and charge for early access, restricted to tec
Enterprise will bear the cost - you won't (Score:2)
Enterprise will pay anything if it's part of a contract bundle with the big guys, but individuals won't have to worry about cost increases. Chinese AI providers can easily undercut these services by offering 90% of the capability at a fraction of the cost (same goes for an increasing number of domestic hosts). Lots of software developers are getting by with $10/month plans from these guys. If they don't plateau, the best-case scenario for Anthropic and OpenAI is that the competition will always be a few mon
Technology will get cheaper (Score:2)
Small models are much faster and cheaper and are at most 2 years behind the frontier models. So in two years we can get something as good as any current AI and it won't cost those suppliers (or your own computer) any more than you are paying now and there will be plenty of competition. It may be that we will have to pay a premium for AI significantly smarter than current AI. It is possible that at some point there will be a bit of a jump as market conditions change.
Sam Altman: AI will be sold like a utility (Score:2)
And then there's this ... [1]Sam Altman says AI will eventually be sold like electricity and water — by companies like OpenAI [businessinsider.com] (Mar 13, 2026):
> "Fundamentally our business and I think the business of every other model provider is going to look like selling tokens," Altman said, referring to the units AI systems use to process and price input and output data.
> "We see a future where intelligence is a utility like electricity or water and people buy it from us on a meter and use it for whatever they want to use it for," he added.
[1] https://www.businessinsider.com/sam-altman-ai-utility-electricity-water-openai-2026-3
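Altman's metered-utility framing can be sketched as a simple billing function. The per-million-token price and the free allowance below are invented for illustration; real providers publish their own tiers:

```python
# Sketch of "intelligence on a meter": billing tokens like kilowatt-hours.
# Prices and the free baseline tier are made-up assumptions.

def metered_bill(tokens, price_per_m=2.0, free_allowance_m=1.0):
    """Charge (in USD) per million tokens beyond a free allowance,
    like a utility with a baseline tier."""
    used_m = tokens / 1e6
    billable_m = max(0.0, used_m - free_allowance_m)
    return billable_m * price_per_m

print(metered_bill(500_000))     # inside the free tier -> 0.0
print(metered_bill(11_000_000))  # 10M billable tokens @ $2/M -> 20.0
```

The design mirrors residential electricity billing: a baseline allowance, then a linear per-unit rate on the overage.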
Don't depend on OpenAI or Anthropic (Score:2)
AI providers that serve freely usable models are already profitable. Only the companies doing their own training are operating at a loss and may need to raise prices (they also have the highest prices currently).
And in the end, aim for self-hosting. Not because there won't be cheap APIs, but because you don't want others to see your data, and because you don't want a problem when the API provider decides that the five-year-old model is now too old and you need to switch to a newer one that reacts differently and need
No shit (Score:4, Interesting)
The only question is whether it will stay cheap enough to still replace people, or be increased until it's no longer economical.
Re: No shit (Score:4, Interesting)
First one, then the other.
AI companies are hoping that by then there will be enough platform lock-in, and enough hassle in transitioning back to human-powered production, that customers won't try to go back.