News: 0181720862

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Cal.com Is Going Closed Source Because of AI

(Wednesday April 15, 2026 @05:00PM (BeauHD) from the blueprint-to-the-bank-vault dept.)


Cal is [1]moving its flagship scheduling software from open source to a proprietary license, arguing that AI coding tools now make it much easier for attackers to scan public codebases for vulnerabilities. "Open source security always relied on people to find and fix any problems," [2]said Peer Richelsen, co-founder of Cal. "Now AI attackers are flaunting that transparency." CEO Bailey Pumfleet added: "Open-source code is basically like handing out the blueprint to a bank vault. And now there are 100x more hackers studying the blueprint." The company says it still supports open source and is releasing a separate [3]Cal.diy version for hobbyists, but doesn't want to risk customer booking data in its commercial product. ZDNet reports:

> When Cal was founded in 2022, Bailey Pumfleet, the CEO and co-founder, wrote, "Cal.com would be an open-source project [because] limitations of existing scheduling products could only be solved by open source." Since Cal was successful and now claims to be the largest Next.js project, he was on to something. Today, however, Pumfleet tells me that AI programs such as "Claude Opus can scour the code to find vulnerabilities," so the company is moving the project from the GNU Affero General Public License (AGPL) to a proprietary license to defend the program's security.

>

> [...] Cal also quoted Huzaifa Ahmad, CEO of Hex Security, "Open-source applications are 5-10x easier to exploit than closed-source ones. The result, where Cal sits, is a fundamental shift in the software economy. Companies with open code will be forced to risk customer data or close public access to their code." "We are committed to protecting sensitive data," Pumfleet said. "We want to be a scheduling company, not a cybersecurity company." He added, "Cal.com handles sensitive booking data for our users. We won't risk that for our love of open source."

>

> While its commercial program is no longer open source, Cal has released Cal.diy. This is a fully open-source version of its platform for hobbyists. The open project will enable experimentation outside the closed application that handles high-stakes data. Pumfleet concluded, "This decision is entirely around the vulnerability that open source introduces. We still firmly love open source, and if the situation were to change, we'd open source again. It's just that right now, we can't risk the customer data."



[1] https://www.zdnet.com/article/ai-security-worries-force-company-to-abandon-open-source/

[2] https://cal.com/blog/cal-com-goes-closed-source-why

[3] https://www.cal.diy/



AI can also FIX t (Score:4, Insightful)

by Archangel Michael ( 180766 )

Instead of fearing AI, use it to secure software and make it better.

We have nothing to fear but fear itself.

Re: (Score:3)

by rsilvergun ( 571051 )

Dude, they just wanted an excuse to close-source their software without getting the blowback from the community, that's all.

Every time you want to do a shitty thing in the world now you just say AI made me do it.

Re: AI can also FIX t (Score:2)

by Anamon ( 10465047 )

Yeah, so many holes in this justification that it's completely transparent.

If attackers can now so easily scan for vulnerabilities... so can Cal themselves. They have access to the same tools. Not to mention that these new approaches don't even really need access to the source.

He says they don't want to be a cybersecurity company, just quietly focus on handling the sensitive data of their customers. But you can't do one without the other.

If you don't want to build up the security know-how and processes in-house, th

Re: AI can also FIX t (Score:2)

by AvitarX ( 172628 )

I'm pretty sure they found more bugs than they wanted to fix when they checked the code.

Re: (Score:2)

by Junta ( 36770 )

GenAI is a bit nicer for offense than defense.

If you are an attacker, the time and consequences of GenAI mistakes can be more easily ignored. Whoops, an attack that didn't work, but you weren't going to succeed anyway. If it screws up the target in a way you didn't actually want, you may pay an opportunity cost because you wanted that data, or to ransom it, but you didn't care *that* much about the data. It's actually a pretty unambiguous 'win' for malicious users since the usual downsides do

Cool Story Bro (Score:2)

by DMJC ( 682799 )

They do realise that the next level of AI is binary-analysis/conversion LLMs, right? Everything will be open source going forward.

Re: (Score:2)

by Tailhook ( 98486 )

Never put off till tomorrow the poor decisions you can make today.

Also, I've never heard of Cal.com. I suspect nothing of value is being lost here.

Re: (Score:2)

by Junta ( 36770 )

That's assuming that you even publish the code at all. Looks like they would keep their codebase internal.

Not "flaunting" (Score:3)

by 50000BTU_barbecue ( 588132 )

But flauting, surely?

Re: Not "flaunting" (Score:3)

by 50000BTU_barbecue ( 588132 )

Flouting god damn it

Open-source code is basically like handing out... (Score:2)

by MpVpRb ( 1423381 )

... the blueprint to a bank vault.

Hmmm... arguing for security by obscurity.

Security researchers answered that question long ago.

Bullcrap. we already lived this before (Score:3)

by williamyf ( 227051 )

In the very late '90s and most of the '00s, automated fuzzing tools were invented. That led to a massive increase in vulnerability discoveries and reports, significantly increasing the workload of maintainers. Bad actors also started using those tools to discover vulns before the maintainers could find and patch them.

If you search tech websites of the era (including Slashdot) you will see the same set and tone of articles: maintainers complaining about the increased workload, the sky is falling, security-pocalypse...

In the end, the big corpos stepped up, giving tooling and compute capacity away for free to run the new tools against existing codebases, both for projects important to their infrastructure and for projects that would earn them good PR points.

Also, the maintainers were able to adapt their procedures, tooling and communities to the "new normal" of increased workload, and the software world kept turning without the sky falling off.

This shall also pass.

Yes, not all projects will survive, and of those that do, not all will get through unscathed, but stresses like this help separate the grain from the chaff.
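For anyone who missed that era: the mutation-fuzzing workflow the parent describes can be sketched in a few lines. This is a toy, coverage-blind fuzzer against a deliberately buggy parser; both the target function and the one-byte mutation strategy are hypothetical illustrations, not anything from a real codebase or tool.

```python
import random

def parse_record(data: bytes) -> int:
    """Hypothetical target: a parser that trusts its length byte."""
    if len(data) < 2:
        raise ValueError("too short")
    length = data[0]
    payload = data[1:]
    # Bug: indexing by the untrusted length byte can run past the payload.
    return payload[length]

def mutate(seed: bytes, rng: random.Random) -> bytes:
    """Simplest possible mutation: overwrite one random byte."""
    buf = bytearray(seed)
    buf[rng.randrange(len(buf))] = rng.randrange(256)
    return bytes(buf)

def fuzz(target, seed: bytes, iterations: int = 10_000) -> list[bytes]:
    """Feed mutated inputs to the target, collecting inputs that crash it."""
    rng = random.Random(0)
    crashers = []
    for _ in range(iterations):
        data = mutate(seed, rng)
        try:
            target(data)
        except ValueError:
            pass  # an expected, handled rejection, not a bug
        except Exception:
            crashers.append(data)  # unexpected crash: a candidate bug
    return crashers

# Seed b"\x02abc" is valid (length 2, payload of 3 bytes); mutating the
# length byte upward triggers the out-of-bounds index.
crashes = fuzz(parse_record, b"\x02abc")
print(f"found {len(crashes)} crashing inputs")
```

Real fuzzers of the era (and today) add coverage feedback, corpus management, and crash deduplication on top of this loop, which is exactly the tooling and compute the big companies ended up donating.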

Re: (Score:2)

by dfghjk ( 711126 )

"...and the software world kept turning without the sky falling off."

The teensy piece of the "software world" that was impacted, anyway. Embedded dominates the "software world": you know, all that notorious software vulnerable to "Automated Fuzzing tools". 98% of processors are used in embedded products, but sure, those "maintainers" are all that matters.

And what sky "falls off"?

If the tools are so good (Score:4, Insightful)

by Local ID10T ( 790134 )

If the tools are so good that you are afraid they will be used to expose your security flaws... maybe you should use the tools to find the security flaws yourself, and then fix them, rather than relying on security through obscurity.

This is a fig leaf over the desire to back out of the open source community now that the product has reached profitability.

Hopefully someone cares enough to fork the latest open source version and run them out of business with a better product that remains open.

Re: If the tools are so good (Score:2)

by AvitarX ( 172628 )

They probably did run the tools.

Then saw more work than they wanted to do.

AI can analyze machine code (Score:4, Insightful)

by Zero__Kelvin ( 151819 )

Unless they plan on not making the executable available, this won't help. Do they really think that AI can't understand machine code and find vulns that way?

Re: (Score:2)

by dfghjk ( 711126 )

"Do they really think that AI can't understand machine code and find vulns that way?"

They likely haven't thought of it, but why would you just assume that it can? At the very minimum, machine instructions would overwhelm systems sized for source-level analysis, requiring revisions and making inference far more expensive. More likely, machine code would be preprocessed first, you know, like has been done for decades. Not an AI thing.

Would be fun watching AI digest an executable and emit full source code fo

Re: AI can analyze machine code (Score:2)

by Zero__Kelvin ( 151819 )

I'm not "just assuming" it can. It is a known fact that it can and does.

AGPL + CLA (Score:2)

by Himmy32 ( 650060 )

The restrictiveness of the AGPL, when combined with license assignment or a CLA, achieves the opposite of the intended result. It's meant to keep contributions from the community from being taken by corporations who don't give back, but instead it's used to collect community contributions until the point when the company is ready to rug-pull, knowing that an open community fork won't survive under such a restrictive license.

At least here, they have the courtesy to pretend it's not the same old rug pull, with

We need our security through obscurity! (Score:2)

by Jeremi ( 14640 )

n/t

Local Man Abandons Honest Living (Score:1)

by Iamthecheese ( 1264298 )

A local warehouse clerk announced Tuesday that he intends to start mugging strangers because of recent developments in artificial intelligence security. Derek "Deke" Hargrove, 34, told reporters that powerful AI systems can now analyze any publicly available information about a person and turn it into a perfect plan for exploitation. "Any data you put out there -- your routines, your photos, your habits -- it is all open-source to these AI models now," Hargrove said outside a Loop coffee shop, with a ski ma

My father was a God-fearing man, but he never missed a copy of the
New York Times, either.
-- E. B. White