News: 0180671078

Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

'Clawdbot' Has AI Techies Buying Mac Minis

(Wednesday January 28, 2026 @11:45AM (BeauHD) from the digital-life-management dept.)


An open-source AI agent originally called [1]Clawdbot (now renamed Moltbot) is [2]gaining cult popularity among developers for running locally, 24/7, and wiring itself into calendars, messages, and other personal workflows. The hype has gone so far that some users are buying Mac Minis just to host the agent full-time, even as its creator warns that's unnecessary. Business Insider reports:

> Founded by [creator Peter Steinberger], it's an AI agent that manages "digital life," from emails to home automation. Steinberger previously founded [3]PSPDFKit. In a key distinction from ChatGPT and many other popular AI products, the agent is open source and runs locally on your computer. Users then connect the agent to a messaging app like WhatsApp or Telegram, where they can give it instructions via text.

>

> The AI agent was initially named after the "little monster" that appears when you restart Claude Code, Steinberger said on the "[4]Insecure Agents" podcast. He formed the tool around the question: "Why don't I have an agent that can look over my agents?" [...] It runs locally on your computer 24/7. That's led some people to dust off their old laptops. "Installed it experimentally on my old dusty Intel MacBook Pro," one product designer wrote. "That machine finally has a purpose again."

>

> Others are buying up Mac Minis, Apple's 5"-by-5" computer, to run the AI. Logan Kilpatrick, a product manager for Google DeepMind, [5]posted : "Mac mini ordered." It could give a sales boost to Apple, some X users have [6]pointed [7]out -- and online searches for "Mac Mini" jumped in the last 4 days in the US, per Google Trends. But Steinberger said buying a new computer just to run the AI isn't necessary. "Please don't buy a Mac Mini," he [8]wrote . "You can deploy this on Amazon's Free Tier."



[1] https://clawd.bot/

[2] https://www.businessinsider.com/clawdbot-ai-mac-mini-2026-1

[3] https://github.com/pspdfkit

[4] https://open.spotify.com/episode/4WiWdSlovcYdMZhEYg0EvL?si=PpRZey7mQDeP2_muTGmPzg

[5] https://x.com/OfficialLoganK/status/2015279441962836170?s=20

[6] https://x.com/youngbloodcyb/status/2015528401462014398?s=20

[7] https://x.com/DaveThackeray/status/2015478224680050914?s=20

[8] https://x.com/steipete/status/2015252888528859603?s=20



This feels like the future (Score:4, Insightful)

by memory_register ( 6248354 )

I think many people will jump at having an agent they control and own. I want the convenience of an agent without giving Sam Altman access to my data.

Re:This feels like the future (Score:4, Informative)

by fuzzyfuzzyfungus ( 1223518 )

It's possible that people would; but this system is not that. It stores a variety of data locally to automatically generate the prompts that give a fresh session the illusion of continuity in the face of restrictive context windows, but farms out the bot part to the usual suspects (though the project creator seems to prefer Anthropic, so it's not Sam specifically who is getting the data).

It's 'local' in the sense that a real mail client sucks less than webmail.

Mac mini vs AWS free tier (Score:2)

by Ritz_Just_Ritz ( 883997 )

Of course, if you depend on your bot for anything of consequence in home automation, being dependent on internet access for it to function wouldn't be ideal. A number of folks run Home Assistant locally, and it functions just fine without the internet. It would be a shame to hamstring its "minder" if you lose internet connectivity or AWS has a bad hair day.

Best,

Knowing your (local) audience. (Score:4, Interesting)

by geekmux ( 1040042 )

> Steinberger said buying a new computer just to run the AI isn't necessary. "Please don't buy a Mac Mini," he wrote. "You can deploy this on Amazon's Free Tier."

Uh..

> ..gaining cult popularity among developers for running locally, 24/7..

When THE primary reason your product is making headlines today is its localized capability, it speaks to a bit of a disconnect in the founder's recommendation here.

Now where did I put that Beowulf Pi Cluster For Dummies book..

Re: (Score:3)

by unrtst ( 777550 )

> I haven't looked at what this thing is, but why can't it be run on ordinary PC hardware? Either CPU or GPU, Nvidia, etc? Why a Mac?

FYI, cause I looked... It has a gateway component (i.e., the server part) and companion apps (i.e., the part that brings it to your device). It's all written in TypeScript and runs on Node.js. The gateway runs on Windows, Mac, or Linux (maybe others?). They have a macOS menu bar app companion, and mobile apps for iOS and Android. Windows and Linux companion apps are planned but don't exist yet - thus the push for a Mac (or at least that seems like the obvious reason).

Re: (Score:3)

by SouthSeb ( 8814349 )

My guesses:

1) Unix-based, non-Windows OS

Yeah, of course you could also set up a Linux machine, but just buying a Mac is much easier.

2) Small form factor and nice product design

It occupies very little space on your desk and doesn't look ugly.

3) Established ecosystem

Many of these users already use Apple devices, so the Mini nicely integrates with them.

4) Relatively low price

It's not cheap, but not prohibitive.

5) Trendy

They're following the trend.

Re: (Score:3)

by Ol Olsoc ( 1175323 )

> My guesses:

> 1) Unix-based, non-Windows OS Yeah, of course you could also set up a Linux machine, but just buying a Mac is much easier.

> 2) Small form factor and nice product design It occupies very little space on your desk and doesn't look ugly.

> 3) Established ecosystem Many of these users already use Apple devices, so the Mini nicely integrates with them.

> 4) Relatively low price It's not cheap, but not prohibitive.

> 5) Trendy They're following the trend.

Having an M4 Mac Mini, I have to note that it is a damn nice little computer. A hella deal for the price. Fast to boot and run - the Adobe Creative Suite flies on it compared to the Intel Mac I traded in. And with Apple's trade-in program, I think it was 600 bucks I laid out in the end. Yes, the Unix base is important to me; since I also use Linux, the similarities are nice, and I spend a bit of time in Terminal.

And it is a cute little thing.

If there is one thing I don't care for, it is the placement of the power button.

Re: (Score:2)

by abulafia ( 7826 )

Because the author wrote it for the Mac.

You can run the gateway on anything.

Boy I feel old (Score:4)

by 50000BTU_barbecue ( 588132 )

What is the purpose of this? What problem am I solving?

PIaaS (Score:2)

by doragasu ( 2717547 )

Prompt Injection as a Service

Re: (Score:2)

by MtHuurne ( 602934 )

It does seem that way, judging from [1]this video on the "Low Level" channel [youtube.com].

[1] https://www.youtube.com/watch?v=kSno1-xOjwI

Not for me (Score:2)

by Buchenskjoll ( 762354 )

Checks me into flights? In 1991 I checked in by fax, not getting the information that the flight had been moved an hour. If I hadn't gone to the airport early to take a bath, I would have missed it. Instead I sat all the way from Singapore to Copenhagen smelling like a (not recently) dead animal. I will check myself into flights, thank you, spare me your fancy modern faxes and AI.

Re: (Score:2, Insightful)

by Anonymous Coward

> If I hadn't gone to the airport early to take a bath

WOT?

Local? (Score:2)

by SlashbotAgent ( 6477336 )

When the article says local AI, I'm thinking that the AI runs on the Mac Mini (or other local hardware).

But, you interact with it through WhatsApp? It uses AI backends like ChatGPT and Claude? It does stuff on website APIs?

How is this local? It sounds like it's just a front-end. What is its use case that other AI solutions don't have? It seems like Ollama is more local than this.

Re: (Score:3)

by FictionPimp ( 712802 )

It can run local models or (and most commonly) uses Anthropic/OpenAI api calls to their models.

If you have a high end mac mini you could probably do fairly well with local models.

Re: (Score:3)

by MtHuurne ( 602934 )

Mac Mini is a popular device for local inference, as it has a built-in GPU which shares the system RAM, so you can run relatively large models for its price.

I assume Clawdbot/Moltbot can work with any inference backend with an OpenAI-compatible API, so it's up to the user to choose between local inference or using a subscription to someone else's LLM service.

What the value is of running an agent locally when all your data is in cloud services is a good question, but I guess it could also use self-hosted data.
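If the agent really can talk to any OpenAI-compatible backend (an assumption on my part, as other comments in this thread dispute it), then switching between a hosted service and local inference is just a matter of pointing the same request shape at a different base URL. A minimal sketch in Python; the endpoint URLs and model names here are purely illustrative:

```python
import json
from urllib import request

def chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build a chat-completions request for any OpenAI-compatible backend.

    The payload shape is identical whether base_url points at a hosted
    service or a local server such as Ollama (http://localhost:11434/v1).
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Swapping backends is just a different base URL and model name:
local = chat_request("http://localhost:11434/v1", "llama3.2", "hello")
hosted = chat_request("https://api.openai.com/v1", "gpt-4o-mini", "hello")
print(local.full_url)  # http://localhost:11434/v1/chat/completions
```

That interchangeability is the whole appeal of "OpenAI-compatible": the user's choice of local versus subscription inference is a config change, not a code change.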

RNG computing (Score:2)

by Z80a ( 971949 )

This program demands absolutely precise and error-free operation from something that does include an RNG.

It's the epitome of not understanding the tech.

Source: Business Insider. (Score:2)

by greytree ( 7124971 )

Source: Business Insider.

That means the author got a mate to plant a story on the drivel-driven site Business Insider.

Which appears to be enough to get on Slashdot in 2026.

That's all, nothing to see here.

Not local inference (Score:2)

by Dan East ( 318230 )

Just to be clear here, Moltbot does not run AI inference locally. You connect it to your standard AI services (ChatGPT, Gemini, etc.), which do the actual AI processing. What Moltbot does is connect those things to other things, like WhatsApp.

In fact, even if you do have your own local inference engine running, like a llama model, Moltbot can't work with it currently. It ONLY works with the big AI services.

It really is just glue that connects things together, and is so lightweight it even runs on a Raspberry Pi.

Re: (Score:3)

by drinkypoo ( 153816 )

> Just to be clear here, Moltbot does not run AI inference locally. You connect it to your standard AI services (ChatGPT, Gemini, etc), which do the actual AI processing.

From the project page, the first point of pride is:

Runs on Your Machine

Mac, Windows, or Linux. Anthropic, OpenAI, or local models. Private by default--your data stays yours.

So no, that's only an option.

Re: (Score:2)

by Fly Swatter ( 30498 )

That 'or local models' part should be in fine print so small you can't see it. Few people have hardware capable of running the big models locally; and when they see the price to buy said hardware, for most, that ain't happening.

Re: (Score:2)

by Moridineas ( 213502 )

Mac Minis are solid at running models locally.

Re: (Score:2)

by karmawarrior ( 311177 )

> you aren't understanding what Moltbot is doing

This is true, but that's because the summary says the exact opposite of what you're saying, and I suspect it's you, not the summary, that's right, given that a "useful" spicy-autocomplete system generally requires more setup than "just install this easy-to-download package on a discarded Mac mini."

Urgh! Slashdot!

They are subject to a supply chain issue (Score:1)

by blueapples ( 614410 )

They auto-closed a report that they are subject to a supply chain issue. Installing based on the README in their official repo results in installing a package that is very much NOT Moltbot: [1]https://www.npmjs.com/package/... [npmjs.com]

[1] https://www.npmjs.com/package/moltbot?activeTab=code
