Anthropic: No, absolutely not, you may not use third-party harnesses with Claude subs
(2026/02/20)
- Reference: 1771624480
- News link: https://www.theregister.co.uk/2026/02/20/anthropic_clarifies_ban_third_party_claude_access/
- Source link:
Anthropic this week revised its legal terms to clarify its policy forbidding the use of third-party harnesses with Claude subscriptions, as the AI biz attempts to shore up its revenue model.
Anthropic sells subscriptions to its Claude platform, which provides access to a family of machine learning models (e.g. Opus 4.6), and associated tools like Claude Code, a web-based interface at Claude.ai, and the Claude Desktop application, among others.
[1]Claude Code is a harness or wrapper – it integrates with the user's terminal and routes prompts to the available Claude model in conjunction with other tools and a control loop that, together, make it what Anthropic calls an agentic coding tool.
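In rough terms, a harness wraps each model call in a loop: send the transcript to the model, act on whatever it asks for (run a command, read a file), feed the result back, and repeat until the model produces a final answer. A minimal sketch in Python – the call_model and run_tool functions below are hypothetical stand-ins, not Anthropic's actual implementation:

    # Illustrative control loop for an agentic harness (not Claude Code itself).
    import subprocess

    def call_model(messages):
        # Stand-in for a real model API call: a real harness would send the
        # transcript to the model and parse its reply into text or a tool request.
        if len(messages) == 1:
            return {"type": "tool_use", "tool": "shell", "input": "ls"}
        return {"type": "text", "text": "Done – I listed the project files."}

    def run_tool(tool, tool_input):
        # Execute a tool on the user's machine and capture its output.
        if tool == "shell":
            return subprocess.run(tool_input, shell=True,
                                  capture_output=True, text=True).stdout
        raise ValueError(f"unknown tool: {tool}")

    def agent_loop(user_prompt, max_turns=10):
        messages = [{"role": "user", "content": user_prompt}]
        for _ in range(max_turns):
            reply = call_model(messages)
            if reply["type"] == "text":        # model says it is finished
                return reply["text"]
            # Model asked for a tool: run it and feed the output back in.
            result = run_tool(reply["tool"], reply["input"])
            messages.append({"role": "assistant", "content": str(reply)})
            messages.append({"role": "user", "content": "tool result:\n" + result})
        return "stopped: turn limit reached"

    print(agent_loop("List the files in this project"))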
Many other tools serve as harnesses for models, such as OpenAI Codex, Google Antigravity, Manus (recently acquired by Meta), OpenCode, Cursor, and Pi (the harness behind OpenClaw).
Harnesses exist because interacting with a machine learning model itself is not a great user experience – you feed it a prompt and it returns a result. That's a single-turn interaction. Input and output. To create a product that people care about, model makers have added support for multi-turn interaction, memory of prior interactions, access to tools, orchestration to handle data flowing between those tools, and so on. Some of this support has been baked into model platforms, but some of it has been added through harness tooling.
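The difference is easy to see in code. In the sketch below, complete is a hypothetical stand-in for any chat-model API: the raw interaction is a single call with no memory, while a harness keeps a growing transcript so later turns can build on earlier ones.

    # Single-turn vs multi-turn use of a model (illustrative stub, no real API).
    def complete(messages):
        # A real implementation would call a model API here.
        return f"(model reply based on {len(messages)} message(s) of context)"

    # Single turn: prompt in, result out, nothing remembered afterwards.
    print(complete([{"role": "user", "content": "Summarise this file"}]))

    # Multi-turn: the harness keeps the transcript so the model can draw on
    # earlier answers, tool output, and corrections.
    history = []
    for user_text in ["Summarise this file", "Now shorten it to one line"]:
        history.append({"role": "user", "content": user_text})
        reply = complete(history)
        history.append({"role": "assistant", "content": reply})
        print(reply)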
This can pose a business problem for frontier model makers – they've invested billions to train sophisticated models, but they risk being disintermediated by gatekeeping intermediaries that build harnesses around their models and offer a better user experience.
One of the ways Anthropic has chosen to build brand loyalty is by selling subscribers model access at a flat monthly price, subject to [6]usage limits, that works out cheaper than pay-as-you-go token purchases through the Claude API. Essentially, the economics resemble an all-you-can-eat buffet priced around certain usage expectations.
That practice, effectively a subsidy for subscribers, led to token arbitrage. Customers accessed Claude models via subscriptions linked to third-party harnesses because it cost less than doing the same work via API key.
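The arbitrage is simple arithmetic. A back-of-envelope sketch – every number below is an illustrative placeholder, not Anthropic's actual pricing or limits:

    # Flat-rate subscription vs metered API, with made-up numbers for illustration.
    subscription_per_month = 100.00      # hypothetical flat monthly fee, USD
    api_price_per_mtok = 10.00           # hypothetical blended $ per million tokens
    tokens_used_per_month = 100_000_000  # a heavy agentic-coding workload

    api_cost = tokens_used_per_month / 1_000_000 * api_price_per_mtok
    print(f"Metered API cost:  ${api_cost:,.2f}")                 # $1,000.00
    print(f"Subscription cost: ${subscription_per_month:,.2f}")   # $100.00
    print(f"Implied subsidy:   ${api_cost - subscription_per_month:,.2f}")

At heavy usage the flat fee undercuts the metered API by a wide margin, which is exactly the gap third-party harnesses were exploiting on their users' behalf.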
The AI biz's Consumer Terms of Service have forbidden the use of third-party harnesses, except with specific authorization, [7]since at least February 2024. The contractual language in Section 3.7, which remains unchanged from that time, says as much – any automated access tool not officially endorsed is forbidden.
You may not access or use, or help another person to access or use, our Services in the following ways:
Except when you are accessing our Services via an Anthropic API Key or where we otherwise explicitly permit it, to access the Services through automated or non-human means, whether through a bot, script, or otherwise.
Despite the presence of that passage for more than two years, a variety of third-party tools have flouted that rule and have allowed users to supply a Claude subscription account key.
The newly added language explicitly states that OAuth authentication, the access method used by Claude Free, Pro, and Max tier subscribers, is intended only for Claude Code and Claude.ai (the web interface for Claude models).
" Using OAuth tokens obtained through Claude Free, Pro, or Max accounts in any other product, tool, or service — including the [9]Agent SDK — is not permitted and constitutes a violation of the [10]Consumer Terms of Service ," the updated [11]legal compliance page says.
According to Anthropic, the update represents an attempt to clarify existing policy language to make it consistent throughout company documentation.
[12]AI coding assistant Cline compromised to create more OpenClaw chaos
[13]Ex-Google engineers accused of helping themselves to chip security secrets
[14]Accenture tells staffers: If you want a promotion, use AI at work
[15]EFF policy says bots can code but humans must write the docs
Anthropic appears to have decided to police its rules at the start of the year. In a January social media [16]thread , Anthropic engineer Thariq Shihipar said the company had taken steps to prevent third-party tools from "spoofing the Claude Code harness."
"Third-party harnesses using Claude subscriptions create problems for users and are prohibited by our Terms of Service," he wrote. "They generate unusual traffic patterns without any of the usual telemetry that the Claude Code harness provides, making it really hard for us to help debug when they have questions about rate limit usage or account bans and they don't have any other avenue for this support."
The prohibition proved unpopular enough to elicit a response from the competition. OpenAI's Thibault Sottiaux pointedly [17]endorsed the use of Codex subscriptions in third-party harnesses.
After banning accounts for attempting to game its pricing structure, Anthropic has now clarified its legalese, as Shihipar [18]indicated would happen, and makers of third-party harnesses are taking note.
On Thursday, OpenCode [19]pushed code to remove support for Claude Pro and Max account keys and Claude API keys. The commit cites "anthropic legal requests." ®
[1] https://code.claude.com/docs/en/overview
[6] https://support.claude.com/en/articles/9797557-usage-limit-best-practices
[7] https://www.anthropic.com/legal/archive/71085c3c-857c-464d-8075-ae918f0e5555
[9] https://platform.claude.com/docs/en/agent-sdk/overview
[10] https://www.anthropic.com/legal/consumer-terms
[11] https://code.claude.com/docs/en/legal-and-compliance
[12] https://www.theregister.com/2026/02/20/openclaw_snuck_into_cline_package/
[13] https://www.theregister.com/2026/02/20/google_ip_theft_charges/
[14] https://www.theregister.com/2026/02/20/accenture_tells_staffers_want_promotion/
[15] https://www.theregister.com/2026/02/20/eff_demands_human_documentation_as/
[16] https://x.com/trq212/status/2009689809875591565?s=20
[17] https://x.com/thsottiaux/status/2009742187484065881?s=20
[18] https://x.com/trq212/status/2009689814917083363?s=20
[19] https://github.com/anomalyco/opencode/commit/973715f3da1839ef2eba62d4140fe7441d539411