
Anthropic Announces Claude Subscribers Must Now Pay Extra to Use OpenClaw (venturebeat.com)

(Saturday April 04, 2026 @05:34PM (EditorDavid) from the talk-to-my-agent dept.)


Anthropic is making a big and sudden change, and connecting its Claude AI to third-party agentic tools "[1]is about to get a lot more expensive," writes the Verge:

> Beginning April 4th at 3PM ET, users will "no longer be able to use your Claude subscription limits for third-party harnesses including OpenClaw," according to an email sent to users on Friday evening. Instead, if users want to use OpenClaw with Claude, they'll have to use a "pay-as-you-go option" that will be billed separately from their Claude subscription.

Anthropic's announcement added that these extra usage bundles are "now available at a discount." Users can also try Anthropic's API, [2]notes VentureBeat, "which charges for every token of usage rather than allowing for open-ended usage up to certain limits, as the Pro and Max plans have allowed so far."

> The technical reality, according to Anthropic, is that its first-party tools like Claude Code, its AI vibe coding harness, and Claude Cowork, its business app interfacing and control tool, are built to maximize "prompt cache hit rates" — reusing previously processed text to save on compute. Third-party harnesses like OpenClaw often bypass these efficiencies... [Claude Code creator Boris Cherny [3]explained on X that "I did put up a few PRs to improve prompt cache hit rate for OpenClaw in particular, which should help for folks using it with Claude via API/overages."] Growth marketer [4]Aakash Gupta observed on X that the "all-you-can-eat buffet just closed," noting that a single OpenClaw agent running for one day could burn $1,000 to $5,000 in API costs. "Anthropic was eating that difference on every user who routed through a third-party harness," Gupta wrote. "That's the pace of a company watching its margin evaporate in real time."

>

> However, Peter Steinberger, the creator of OpenClaw who was [5]recently hired by OpenAI, took a more skeptical view of the "capacity" argument. "Funny how timings match up," [6]Steinberger posted on X. "First they copy some popular features into their closed harness, then they lock out open source." Indeed, Anthropic recently added some of the same capabilities that helped OpenClaw catch on — such as the ability to [7]message agents through external services like Discord and Telegram — to Claude Code...

>

> User @ashen_one, founder of Telaga Charity, [8]voiced a concern likely shared by other small-scale builders: "If I switch both [OpenClaw instances] to an API key or the extra usage you're recommending here, it's going to be far too expensive to make it worth using. I'll probably have to switch over to a different model at this point."

>

> "I know it sucks," Cherny replied. "Fundamentally engineering is about tradeoffs, and one of the things we do to serve a lot of customers is optimize the way subscriptions work to serve as many people as possible with the best mode..." OpenAI appears to be positioning itself as a more "harness-friendly" alternative, potentially using this moment as a customer acquisition channel for disgruntled Claude power users.

>

> By restricting subscription limits to their own "closed harness," Anthropic is asserting control over the UI/UX layer. This allows them to collect telemetry and manage rate limits more granularly, but it risks alienating the power-user community that built the "agentic" ecosystem in the first place. Anthropic's decision is a cold calculation of margins versus growth. As Cherny noted, "Capacity is a resource we manage thoughtfully." In the 2026 AI landscape, the era of subsidized, unlimited compute for third-party automation is over. For the average user on Claude.ai, the experience remains unchanged; for the power users running autonomous offices, the bell has tolled.
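Gupta's "$1,000 to $5,000 a day" figure and Cherny's point about prompt cache hit rates can be sanity-checked with a back-of-envelope sketch. All the rates and traffic volumes below are illustrative assumptions, not Anthropic's actual pricing or OpenClaw's actual usage; the cache-discount model is likewise a stand-in for whatever the real billing multiplier is.

```python
# Back-of-envelope sketch of per-token API cost for an always-on agent,
# and of why prompt cache hit rate dominates the bill.
# All rates, volumes, and the cache discount are ASSUMPTIONS for
# illustration, not real Anthropic pricing.

def daily_api_cost(input_tokens_per_call: int,
                   output_tokens_per_call: int,
                   calls_per_day: int,
                   usd_per_m_input: float,
                   usd_per_m_output: float,
                   cache_hit_rate: float = 0.0,
                   cached_discount: float = 0.9) -> float:
    """USD cost for one day of agent traffic.

    cache_hit_rate is the fraction of input tokens served from the
    prompt cache; cached tokens are billed at (1 - cached_discount)
    of the normal input rate.
    """
    cached = input_tokens_per_call * cache_hit_rate
    fresh = input_tokens_per_call - cached
    input_cost = (fresh + cached * (1 - cached_discount)) / 1e6 * usd_per_m_input
    output_cost = output_tokens_per_call / 1e6 * usd_per_m_output
    return (input_cost + output_cost) * calls_per_day

# An agent re-sending an 80k-token context every 30 seconds (2,880
# calls/day), at assumed rates of $3/M input and $15/M output:
no_cache = daily_api_cost(80_000, 1_000, 2_880, 3.0, 15.0, cache_hit_rate=0.0)
good_cache = daily_api_cost(80_000, 1_000, 2_880, 3.0, 15.0, cache_hit_rate=0.9)
print(f"no caching: ${no_cache:,.0f}/day")
print(f"90% cached: ${good_cache:,.0f}/day")
```

Under these assumptions a cache-blind harness lands in the high hundreds of dollars per day for a single agent, while a 90% cache hit rate cuts the same workload by roughly 4x — consistent with Anthropic's claim that third-party harnesses bypassing the cache are disproportionately expensive to serve.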



[1] https://www.theverge.com/ai-artificial-intelligence/907074/anthropic-openclaw-claude-subscription-ban

[2] https://venturebeat.com/technology/anthropic-cuts-off-the-ability-to-use-claude-subscriptions-with-openclaw-and

[3] https://x.com/bcherny/status/2040212589951774808

[4] https://x.com/aakashgupta/status/2040248998486061381

[5] https://venturebeat.com/technology/openais-acquisition-of-openclaw-signals-the-beginning-of-the-end-of-the

[6] https://x.com/steipete/status/2040209434019082522

[7] https://venturebeat.com/orchestration/anthropic-just-shipped-an-openclaw-killer-called-claude-code-channels

[8] https://x.com/ashen_one/status/2040208727949664767?s=20



I already cancelled my subscription (Score:3, Insightful)

by Hadlock ( 143607 )

I cancelled my subscription overnight, and I'm using the free credits they gave me to wrap up some things and transition away. I am not going to be locked into someone's walled garden again.

Re:I already cancelled my subscription (Score:4, Insightful)

by Fly Swatter ( 30498 )

Considering all the expensive data center build-outs, access is still cheap right now. I hope soon all AI services bill at the actual cost of 'running the AI.'

Right now AI is in the equivalent of a drug dealer's 'hook 'em for cheap, then run up the cost once they're addicted.' Or the bubble bursts, and any remaining companies will have survived by billing at a rate that sustains the business, instead of piling debt on debt while claiming a 'free tier.'

Re: (Score:2)

by Hadlock ( 143607 )

I ordered 64 GB of RAM about an hour ago and I'm planning on running either Qwen 35B-A3B at 8-bit or 122B-A10B at 3-bit in fully offline mode.

>the actual cost of 'running the AI.'

is a fixed $200 cost (RAM upgrade) + electricity
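Whether those two models actually fit in 64 GB is a quick calculation: weight size is roughly (parameter count × bits per weight) / 8, plus KV cache and runtime overhead on top. The 15% overhead figure below is a rough assumption, not a measured footprint.

```python
# Rough check that the models mentioned above fit in 64 GB of system
# RAM at the stated quantizations. The 15% overhead for KV cache and
# runtime state is an ASSUMED ballpark, not a measurement.

def model_ram_gb(params_billion: float, bits_per_weight: float,
                 overhead: float = 0.15) -> float:
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb * (1 + overhead)

for name, params, bits in [("35B @ 8-bit", 35, 8), ("122B @ 3-bit", 122, 3)]:
    need = model_ram_gb(params, bits)
    fits = "fits" if need < 64 else "does not fit"
    print(f"{name}: ~{need:.0f} GB ({fits} in 64 GB)")
```

The 35B model at 8-bit needs roughly 40 GB and the 122B model at 3-bit roughly 53 GB, so both clear 64 GB; the same 122B model at 8-bit would not.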

Re: (Score:2)

by EvilSS ( 557649 )

I hope you looked into benchmarks for running models in system RAM vs VRAM.

Re: I already cancelled my subscription (Score:2)

by Hadlock ( 143607 )

It's about 5 tokens/second, which is totally fine for an async assistant; 20 tokens/second is about the lower limit for usable in real time. You can also set it up to use a smaller model for quick questions (what are the next 6 items on my calendar/to-do list?) and drop through to the bigger, slower model for harder questions (can you add this feature to my internal ticketing system and redeploy?)
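The two-tier setup described above can be sketched in a few lines: quick, low-stakes questions go to the small fast model, everything else falls through to the big slow one. The keyword heuristic and the two stub model functions here are hypothetical placeholders for whatever real routing signal and local inference backend you'd actually use.

```python
# Sketch of two-tier model routing: small/fast model for quick
# questions, big/slow model for everything else. The heuristic and
# both model functions are HYPOTHETICAL stubs for illustration.

QUICK_HINTS = ("calendar", "to-do", "remind", "what time")

def is_quick_question(prompt: str) -> bool:
    # Naive stand-in for a real router: short prompt + a known keyword.
    p = prompt.lower()
    return len(p.split()) < 20 and any(h in p for h in QUICK_HINTS)

def ask_small_model(prompt: str) -> str:
    return f"[small/fast] {prompt!r}"   # stub for a small quantized model

def ask_large_model(prompt: str) -> str:
    return f"[large/slow] {prompt!r}"   # stub for the bigger offline model

def route(prompt: str) -> str:
    if is_quick_question(prompt):
        return ask_small_model(prompt)
    return ask_large_model(prompt)

print(route("What are the next 6 items on my calendar?"))
print(route("Add this feature to my internal ticketing system and redeploy."))
```

In practice the routing signal might be token count, whether tools are invoked, or an explicit user hint, but the control flow is the same.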

Re: (Score:2)

by Fly Swatter ( 30498 )

I assumed OP was adding that RAM to a system with unified memory, but I don't think Apple hardware allows user expansion anymore. What else is there?

Re: (Score:2)

by Hadlock ( 143607 )

I'm using a $200 used, roughly five-year-old (per the eBay listing) HP EliteDesk 805 G6 DM desktop (Ryzen 5 PRO 4650GE 3.3GHz, 32GB RAM, 512GB SSD, WiFi) in CPU mode... you don't need a GPU to run a single-user local LLM, just a bunch of RAM. This isn't 2022 anymore.

Who did not see this coming (Score:3)

by stabiesoft ( 733417 )

All the AI companies need to monetize. Those investors are not spending a trillion out of the goodness of their hearts. Expect many more free-to-paid and pay-to-pay-more announcements.

Re: (Score:3)

by scalptalc ( 6477834 )

> All the AI co's need to monetize.

This, or — as the party winds down and you see what your apartment actually looks like — it's the AI market discovering just how many *real* customers there might be. Re-think the build-out, modify budgets, lock down your IP as if it had real value. A Potemkin market. It's just a herd of cattle raging along, and at some point a single crow will dip in and whisper, 'You know this trail goes to the packing plant, don't you?'

Let the enshittification begin (Score:4, Insightful)

by ukoda ( 537183 )

Looks like they think they have passed the point where they need more customers and are now going to start the process of milking them for what they can get. Let the enshittification begin.

Re: (Score:2)

by PPH ( 736903 )

So, how much for [1]heated seats [slashdot.org]?

[1] https://tech.slashdot.org/story/26/02/05/0036222/bmw-commits-to-subscriptions-even-after-heated-seat-debacle

Re: (Score:2)

by az-saguaro ( 1231754 )

While I generally share the cynicism and doubt about AI seen here in many Slashdot comments, your comment made me wonder if this is enshittification versus shitting their pants.

When you talk about milking their customers for what they can get, to me that usually connotes that the company is doing well and has little competition, so they can get greedy and get away with it.

However, as I read the article, this sounds more like reality is setting in. After burning money and trying to establish some buzz and

Makes sense (Score:2)

by GeekWithAKnife ( 2717871 )

OpenClaw is free, so punish free things.

I'm using Claude (and other tools) to help train my specialised local agents... so by the time Claude Code becomes a lot more expensive I'll only need a fraction as much of it (or I'll transition to another service).

Re: (Score:1)

by retchdog ( 1319261 )

If by "punish", you mean "stop subsidizing" then yes; I suppose so.

Re: (Score:2)

by Fly Swatter ( 30498 )

Nothing in life is free. What is your soul worth?

Why is this surprising? (Score:2)

by Kiliani ( 816330 )

It should have been clear for a while (it was to me) that "AI" as SaaS, at its current level and in its current form, is drastically far from sustainable. When allegedly every $1 spent on Claude costs Anthropic $13... no degree in rocket science is required to imagine what will happen. I have told people for some time: "enjoy it while it lasts, since it *will* *not* *last*!"

What can you do?

First: Learn how to use tokens efficiently. That is an absolute must. Spend money on commercial "AI" only when you have to.

Re: (Score:2)

by Fly Swatter ( 30498 )

What the AI conglomerates did is give it away as a free taste while driving a huge hardware shortage and the price apocalypse that came with it, so that even once they jack up their prices toward sustainability, no one will be able to afford switching to local hardware that now costs a small fortune — if you can even find it in stock anywhere.

It is devious and evil, but shows a lot of forward thinking.

Anyone that comes out with a user expandable unified memory architecture using mostly common components could be a

Cloud computing anybody? (Score:2)

by n0w0rries ( 832057 )

Just like cloud computing.

Hey it's cheaper than hosting it yourself!

???????

Profit!

Inevitable (Score:2)

by sjames ( 1099 )

AI has been running at a big loss to get the users hooked. It was inevitable that prices would start climbing. That process is nowhere near done, running AI is expensive as hell.

Once the market starts reflecting the actual costs, you can bet the cost/benefit will not be nearly as rosy as it looks now. But some customers will already have gotten themselves between a rock and a hard place and will be sucked dry, then discarded. Those "expensive" people that are getting dumped will start looking like a bargain.

Another one that cannot get profitable (Score:2)

by gweihir ( 88907 )

I guess we are seeing the start of the end of the hype. "Investors," dumb and clueless as they may be, are no longer freely pouring money into the bottomless pit that LLMs are.
