Cursor CEO Warns Vibe Coding Builds 'Shaky Foundations' That Eventually Crumble (fortune.com)
- Reference: 0180458117
- News link: https://developers.slashdot.org/story/25/12/26/0623233/cursor-ceo-warns-vibe-coding-builds-shaky-foundations-that-eventually-crumble
- Source link: https://fortune.com/2025/12/25/cursor-ceo-michael-truell-vibe-coding-warning-generative-ai-assistant/
"If you close your eyes and you don't look at the code and you have AIs build things with shaky foundations as you add another floor, and another floor, and another floor, and another floor, things start to kind of crumble," Truell said. Truell and three fellow MIT graduates created Cursor in 2022. The tool embeds AI directly into the integrated development environment and uses the context of existing code to predict the next line, generate functions, and debug errors. The difference, as Truell frames it, is that programmers stay engaged with what's happening under the hood rather than flying blind.
Re: (Score:1)
I was looking for a "vibe" as a gift to my girlfriend for when she's travelling, but when she saw me googling "AI assisted" she said "No way I'm putting that thing in me"
I refuse to use AI coding tools... (Score:2)
I don't need AI coding tools to write code for me, I am perfectly capable of doing that myself. Plus there isn't an AI coding tool on the planet that can do the stuff I do with reverse engineering proprietary file formats, interacting with obscure dead game engines and working with proprietary secret code that no AI would have ever seen before.
Re: (Score:2)
> I don't need AI coding tools to write code for me, I am perfectly capable of doing that myself.
Yes, but you can't do it as fast as you can using AI coding tools, according to your boss. He doesn't have any actual numbers to back up this assertion, and he wouldn't be looking at the impact on time in the QA/testing phase either. But he's sure he's right after talking to that guy he played golf with last month, and is willing to stake your job on it.
Re: (Score:2)
They save a lot of time on boilerplate / web design. That said, Gemini 3 (the current leaderboard champ) tried to talk me into doing something the other day that a junior dev might not have been skeptical enough of to catch, and would have ended with catastrophic downtime due to common exploits.
Re: (Score:2)
I am not surprised. The current hype-AI is incapable of writing reliably secure code.
Re: (Score:2)
> They save a lot of time on boilerplate / web design.
How much new boilerplate do you need in a project which you can't pull from the git repo of your older projects?
I don't see any non-trivial savings that can justify even the $20/mo for the cheapest "copilot" account.
Re: (Score:2)
You do know that most experienced coders keep their own code base around: stuff they tend to continually reuse for any program/website/etc.
So it's not that AI is a "shortcut"; what it actually is, is an exploit-generation system, a loose-code system, and a CPU-time waster.
Re: (Score:2)
Definitely don't use LLM-generated code without scrutiny, but it's not bad as a starting point or as a source of potential approaches. It's also not bad for code review: I asked gpt-oss to adapt some (not yet tested) code in a certain way, and it noticed a cut-and-paste error in addition to adapting the code. I ended up not using that adaptation, but the bug report was helpful.
One can -- and I think should -- be skeptical of lots of things about LLMs, from business models and environmental impacts to qua
Re: I refuse to use AI coding tools... (Score:3)
Don't use any output from any machine-learning model without checking: these are statistical models, which can make predictions better than chance but are often completely wrong. Always verify.
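As a minimal sketch of that "always verify" point (a hypothetical example, not from any of the tools discussed here): treat any model-generated function as untrusted until it passes checks you wrote yourself, independently of the model.

```python
# Hypothetical: a function pasted in from an LLM, claimed to turn
# arbitrary text into a URL slug. Treat it as untrusted until verified.
def slugify(text):
    # Lowercase, split on any whitespace, rejoin with hyphens.
    return "-".join(text.lower().split())

# Verification the human writes, independently of the model's claims:
assert slugify("Hello World") == "hello-world"
assert slugify("  extra   spaces ") == "extra-spaces"
```

The point isn't these two asserts in particular; it's that the acceptance criteria come from you, not from the thing being checked.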
Did you see that Microsoft? (Score:2)
Sure your coders will be really engaged with 1 million lines of Rust code per month per dev.
How will AI replace developers? (Score:2)
Until AI is able to understand the underlying foundations, frameworks, and code, it will always have this issue.
Did all those companies firing staff because of AI know this? Were they told by any shiny new AI company about these and many other limitations?
Sounds like an "oopsie" moment for one of those companies that laid off 20%+ of their developers because AI was going to replace them all.
Re: (Score:2)
That means until we have AGI, which we do not even know is possible and which, if it is, we will not get anytime soon. Maybe in 100 years, maybe 1000, maybe never.
I really do not understand why so many people are willing to believe AI is all-powerful. There must be a widespread mental defect somewhere that can explain this disconnected belief.
cursor ai (Score:1)
It costs $20 or so a month and lets me generate a framework and do simple CRUD (kinda). It does save me a lot of time.
That said, I have yet to see it produce any code that actually worked without fixes.
Duh (Score:3)
That is really all there is to say here. Still, I would like to see some research into why so many people massively overestimate what LLM-type AI can do. I took one look and was not impressed. Why do people seem to think AI must be regarded as all-knowing and all-powerful until the converse is proven? And sometimes not even then? I really do not get it.
Well that's unexpected... (Score:2)
It feels like AI companies are trying to rein in expectations lately and I wonder what the strategy is.
Are they insulating themselves from liability?
Re: (Score:2)
Either that, or his company just happens to sell a code management & analysis tool that allegedly reduces the chance of AI code-slop gumming up an app. The fire-insurance salesperson likes to remind customers of how hot and dry the weather feels.
Re: (Score:3)
> Are they insulating themselves from liability?
That's an angle I don't see mentioned very often when it comes to AI in the work world. The AI companies don't want to assume liability for anything their creation does (like giving away all the snacks in a vending machine for free, or deleting a bunch of corporate data in a fever dream), but then they try and sell AI as a replacement for humans doing many jobs.
Would a company hire a human who preemptively refuses to be held responsible for their actions on the job?
Re: (Score:2)
> Are they insulating themselves from liability?
That sounds quite plausible to me. Maybe they finally got some competent legal advice. They sure did not have that before.