
'I see you're running a local LLM. Would you like some help with that?'

(2025/05/08)


Clippy is back - and this time, its arrival on your desktop as a front-end for locally run LLMs has nothing to do with Microsoft.

In what appears to be a first for the 90s icon, [1]Clippy has finally been made useful, ish, in the form of a small application that allows users to chat with a variety of AI models running locally, with Gemma 3, Qwen3, Phi-4 Mini and Llama 3.2 serving as built-in ready-to-download versions. Clippy can also be configured to run any other local LLM from a GGUF file.

Developed by San Francisco-based dev Felix Rieseberg, whom we've [2]mentioned [3]several [4]times before on The Register for his work maintaining the cross-platform development framework Electron and his passion for early Windows nostalgia, the app was written as a "love letter" to Clippy, he [5]wrote on the unofficial app's GitHub page.

[6] Nu-Clippy only has praise for our 'critical voice in the tech and business world' ... Thanks, buddy. Though the business model part is half wrong: We don't do subscriptions. We definitely do take revenue directly from advertisers and their ad agencies; that's how we get paid.

"Consider it software art," Rieseberg said. "If you don't like it, consider it software satire."

He added on the app's About page that he doesn't mean high art. "I mean 'art' in the sense that I've made it like other people do watercolors or pottery - I made it because building it was fun for me."

Rieseberg is one of the maintainers of Electron, which combines a Chromium engine with Node.js so that apps written in web technologies like HTML, CSS, and JavaScript can run as desktop applications regardless of the underlying platform - and that's what the latest variant of Clippy is all about demonstrating.
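
For the unfamiliar, the Electron side is tiny: a main process opens a Chromium window and points it at ordinary web files. A minimal sketch in TypeScript, with a hypothetical index.html standing in for Clippy's actual UI, looks something like this:

// Minimal Electron main process sketch: a Chromium window rendering plain
// web files, with Node.js available underneath. index.html is a placeholder
// for whatever HTML/CSS/JS front end the app ships.
import { app, BrowserWindow } from "electron";

function createWindow(): void {
  const win = new BrowserWindow({ width: 400, height: 600 });
  win.loadFile("index.html");
}

app.whenReady().then(createWindow);

// Quit when every window closes, except on macOS, where apps stay resident.
app.on("window-all-closed", () => {
  if (process.platform !== "darwin") app.quit();
});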

This Nu-Clippy is meant to be a reference implementation of Electron's [10]LLM module, Rieseberg wrote in the GitHub documentation, noting he's "hoping to help other developers of Electron apps make use of local language models." And what better way than with a bit of '90s tech nostalgia?

Who needs GitHub Copilot when you can roll your own AI code assistant at home [11]READ MORE

This unofficial AI iteration of Clippy (Clippy 2.0? [12]3.0?) may be more capable than its predecessor(s), but that's not to say it's loaded with features. Compared to a platform like LM Studio, which lets users chat with local LLMs and offers near-countless options for tweaking and modifying models, Clippy is just a chat interface that lets a user talk to a local LLM as they would to one that lives in a datacenter.

In that sense, it's definitely a privacy improvement over ChatGPT, Gemini, and their relatives, which are invariably trained on user data. Clippy doesn't go online for practically anything, Rieseberg said in its documentation.

"The only network request Clippy makes is to check for updates (which you can disable)," Rieseberg noted.

[13]Clippy designer was too embarrassed to include him in his portfolio

[14]You know something's wrong when Clippy fills you with nostalgia for simpler times

[15]How to run an LLM on your PC, not in the cloud, in less than 10 minutes

[16]Don't want Copilot app on your Windows 11 machine? Install this official update

AI Clippy is dead simple to run, too. In this vulture's test on his MacBook Pro, it was a snap to download the package file for Apple Silicon, unzip it, let it fetch its default model (Gemma 3 with 1 billion parameters), and start asking questions. When Clippy's Windows 95-themed chat window is closed, the paperclip itself remains on the desktop, and a click opens the window back up for a new round of queries.

As to what AI Clippy could do if Rieseberg had the time, he told The Register that node-llama-cpp, the Node.js bindings for llama.cpp used to run Llama and other models locally, could give Clippy access to all the usual llama.cpp inference settings available in other locally run AI tools. For now, only temperature, top-k, and system prompting are exposed, Rieseberg said.
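
For the curious, here's a rough sketch of what exposing those knobs can look like with node-llama-cpp's v3-style API - a local GGUF file wired to a chat session with a system prompt, temperature, and top-k. The model path and parameter values are illustrative assumptions, not Clippy's actual defaults:

// Hedged sketch: chatting with a local GGUF model via node-llama-cpp.
// The file name and sampling values below are placeholders.
import { getLlama, LlamaChatSession } from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({
  modelPath: "gemma-3-1b-it-Q4_K_M.gguf", // any GGUF file on disk
});
const context = await model.createContext();

const session = new LlamaChatSession({
  contextSequence: context.getSequenceAt(0),
  systemPrompt: "You are a helpful paperclip.",
});

const answer = await session.prompt("Why do people remember Clippy?", {
  temperature: 0.7, // sampling temperature
  topK: 40,         // top-k sampling
});
console.log(answer);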

"That's just a matter of me being lazy, though. The code to expose all those options is there," Rieseberg added. He's unlikely to get a chance to do so anytime soon, as he's scheduled to join Anthropic to work on Claude next week, meaning he'll be busy with more serious AI projects.

Rieseberg also said that he's not particularly worried about Microsoft coming at him for coopting their contentious desktop companion, but said that if they came after him he wouldn't fight.

"The moment they tell me to stop, shut it down, and hand over all my code, I will," Rieseberg told us. But he doesn't think it would make sense for Microsoft to do anything with Clippy and AI itself.

"Building a fun stupid toy like I have is an entirely different ballgame from building something really solid for the market," he said. "With Cortana and Copilot they have probably much better characters available."

The new Clippy is available for Windows, macOS, and Linux, which makes perfect sense given the developer's cross-platform background.®




[1] https://felixrieseberg.github.io/clippy/

[2] https://www.theregister.com/2020/01/27/electron_devs_bond_at_covalence_conference/

[3] https://www.theregister.com/2019/02/11/windows_95/

[4] https://www.theregister.com/2018/08/28/microsoft_round_up/

[5] https://github.com/felixrieseberg/clippy?tab=readme-ov-file

[6] https://regmedia.co.uk/2025/05/07/screenshot_clippy_ai_register.png

[10] https://github.com/electron/llm

[11] https://www.theregister.com/2024/08/18/self_hosted_github_copilot/

[12] https://www.theregister.com/2025/03/17/copilot_windows_update/

[13] https://www.theregister.com/2023/06/12/clippy_designer_embarrassed/

[14] https://www.theregister.com/2025/02/05/microsoft_clippy_post/

[15] https://www.theregister.com/2024/03/17/ai_pc_local_llm/

[16] https://www.theregister.com/2025/03/17/copilot_windows_update/




subscription model

Valeyard

Anyone can bullshit their way through something, but it takes AI to do it with such authority and confidence. I aspire to that level of barefaced confident delivery when blagging.

b0llchit

Maybe the best art and satire would be for it to ask you, "Would you like me to uninstall myself?" and then do so regardless of the answer.
