
Apple's New MacBook Pro Delivers 24-Hour Battery Life and Faster AI Processing (apple.com)

(Wednesday October 15, 2025 @11:20AM (msmash) from the 24-hours-battery dept.)


Apple unveiled [1]a new 14-inch MacBook Pro on Wednesday that features the company's M5 chip and represents what Apple describes as the next major advancement in AI performance for its Mac lineup. The laptop delivers up to 3.5 times faster AI performance than the M4 chip and up to six times faster performance than the M1 chip through a redesigned 10-core GPU architecture that incorporates a Neural Accelerator in each core.

The improvements extend beyond AI processing to include graphics performance that runs up to 1.6 times faster than the previous generation and battery life that reaches up to 24 hours on a single charge. Apple also integrated faster storage technology that performs up to twice as fast as the prior generation and allows configurations up to 4TB. The 10-core CPU delivers up to 20% faster multithreaded performance compared to the M4.

The laptop runs macOS Tahoe and includes a Liquid Retina XDR display available in a nano-texture option, a 12MP Center Stage camera, and a six-speaker sound system. The 14-inch MacBook Pro is available for pre-order starting Wednesday in space black and silver finishes and begins shipping October 22. The base model costs $1,599.



[1] https://www.apple.com/newsroom/2025/10/apple-unveils-new-14-inch-macbook-pro-powered-by-the-m5-chip/



Are people really using notebooks for AI? (Score:2)

by Parker Lewis ( 999165 )

I know that the main Apple motivation is to massage the investors, but are people really using their Macs to run local LLMs? (Genuine question, I have no idea.) Another question: is software that has real AI capabilities actually using those NPUs, or just exchanging data with company servers?

Re:Are people really using notebooks for AI? (Score:4, Informative)

by NoMoreACs ( 6161580 )

> I know that the main Apple motivation is to massage the investors, but are people really using their Macs to run local LLMs? (Genuine question, I have no idea.) Another question: is software that has real AI capabilities actually using those NPUs, or just exchanging data with company servers?

Yes, limited local AI. They have built a privacy-focused, three-tiered AI architecture, unique (AFAICT) in the industry.

Tom's Guide explains it fairly well:

[1]https://www.tomsguide.com/ai/a... [tomsguide.com]

[1] https://www.tomsguide.com/ai/apple-intelligence-everything-to-know-about-apples-ai

Re: (Score:2)

by beelsebob ( 529313 )

Apple ones, yes. The unified memory architecture makes them the fastest things you can buy for running AI outside of $5,000 Nvidia AI GPUs. Every desktop GPU is going to run out of memory on any serious model, and nothing else is as fast as these.

I mean, sure, you're likely going to want the M5 Ultra or M5 Max for that, rather than the base M5, but given that they're the same architecture, just with bits chopped off or doubled up, yeah... they're gonna need that.

Re: (Score:3)

by FictionPimp ( 712802 )

I do for my personal dev projects. It's getting to the point where I can see it being the new norm at my day job in the next 18 months.

Re: (Score:2)

by Moridineas ( 213502 )

Yeah, they absolutely are. There are some local LLM subreddits where people are doing some neat stuff.

The M* hardware is very impressive.

Re: (Score:2)

by Parker Lewis ( 999165 )

Very interesting! Do you mind sharing one of those sub links?

Re: (Score:2)

by Moridineas ( 213502 )

I'm a neophyte, but there's some interesting information here: [1]https://www.reddit.com/r/Local... [reddit.com]

[1] https://www.reddit.com/r/LocalLLaMA/

Re: (Score:2)

by EvilSS ( 557649 )

I am. M4 Max with 128GB RAM. It's one of the cheapest options to run large models (basically anything that won't fit on a 5090's 32GB).
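Back-of-envelope, the fit-in-memory math the parent is describing looks like this. This is a weights-only estimate (real usage adds KV cache and runtime overhead), and the model sizes are illustrative, not a claim about any specific product:

```python
# Weights-only memory estimate for a quantized LLM.
# Ignores KV cache, activations, and runtime overhead, which add more.

def model_mem_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB at a given quantization level."""
    return params_billion * 1e9 * (bits_per_weight / 8) / 1e9

def fits(params_billion: float, bits_per_weight: float, budget_gb: float) -> bool:
    """True if the weights alone fit in the given memory budget."""
    return model_mem_gb(params_billion, bits_per_weight) <= budget_gb

# A 70B-parameter model at 4-bit quantization needs ~35 GB of weights:
print(round(model_mem_gb(70, 4), 1))       # 35.0
# Too big for a 32 GB GPU, comfortable in 128 GB of unified memory:
print(fits(70, 4, 32), fits(70, 4, 128))   # False True
```

That's why "anything that won't fit on a 5090's 32GB" ends up on unified-memory Macs or multi-GPU servers.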

Re: (Score:2)

by DamnOregonian ( 963763 )

128GB of VRAM, yeah. I'm using mine to do it. Love it.

The only thing that really uses the NPU is the OS.

You can build Stable Diffusion checkpoints to use it, but the only advantage is power efficiency; the GPU is faster, at many times the power draw.

"base" model (Score:2)

by null etc. ( 524767 )

> The base model costs $1,599.

Journalism outlets should start calling these things what they truly are: desperation models. Apple ratchets up the price so much if you want to upgrade past the desperation model that it's practically comical.

Re: (Score:2)

by cmseagle ( 1195671 )

There was some validity to this sort of argument when their base models came with 8 GB of memory and 128/256 GB of storage. That was always pretty borderline and you needed to factor in ~$400 on top of the base price to get it to a reasonable spot.

These have 16 GB memory and 512 GB storage. That's plenty for a large portion of the market.

The base model costs $1,599 (Score:3)

by stealth_finger ( 1809752 )

> The base model costs $1,599

Fuck, I wish we could post gifs here, because of that one from Spider-Man where he says "oh, you're serious" and then laughs even harder.

Re: (Score:3)

by cmseagle ( 1195671 )

What comparable laptop can you get for cheaper?

Battery life sounds great, but AI? (Score:2)

by marcle ( 1575627 )

For one thing, it's got questionable usability. For another, if I'm interested in AI, Apple isn't the name that immediately comes to mind...

Re: (Score:3)

by blackomegax ( 807080 )

Apple is, shockingly, a strong name in local LLM use, just not at the base-model level. The Max and Pro chips have extremely wide memory buses and up to 192GB of unified RAM, but even the 96GB or 64GB options can run hefty LLMs.
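Those wide memory buses matter because token generation on a dense model is roughly memory-bandwidth-bound: each new token reads essentially all the weights once. A rough upper-bound sketch (the bandwidth and model-size figures are illustrative assumptions, not benchmarks of any particular chip):

```python
# Rough decode-speed ceiling for a dense LLM: generating one token streams
# the full weight set from memory, so throughput is capped by
# bandwidth / weight size. Real throughput is lower (compute, KV cache).

def max_tokens_per_sec(weight_gb: float, bandwidth_gb_s: float) -> float:
    """Bandwidth-bound upper limit on tokens generated per second."""
    return bandwidth_gb_s / weight_gb

# A 4-bit 70B model (~35 GB of weights) on a hypothetical ~800 GB/s chip:
print(round(max_tokens_per_sec(35, 800), 1))  # 22.9
```

Under this model, doubling the memory bus width roughly doubles the decode-speed ceiling, which is why the Max/Ultra parts pull so far ahead of base chips for local LLMs.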

Re: (Score:3)

by beelsebob ( 529313 )

Wait... Apple doesn't come to mind for AI? You clearly know absolutely nothing about buying hardware for running AI.

In order of speed, your options are:

A server with $5,000 Nvidia GPUs with lots of VRAM on board.

A Mac, preferably with their highest-end chip in it.

A desktop PC with a high-end desktop graphics card.

Thanks to their unified memory architecture, Apple's machines are the fastest thing that doesn't cost $5,000 for the GPU alone and doesn't instantly run out of memory running any vaguely serious model.

Re: (Score:2)

by Moridineas ( 213502 )

> For one thing, it's got questionable usability. For another, if I'm interested in AI, Apple isn't the name that immediately comes to mind..

Yeah, Siri sucks and Apple's models are behind, but Apple is doing some interesting research and the M chip architecture is very, very good for running local models.

Even if you ignore Apple's own AI software, it's popping up in 3rd party software all over the place, including graphics and video editing.

Re: Battery life sounds great, but AI? (Score:2)

by topham ( 32406 )

I have an M1 Max MacBook Pro, and I've run gpt-oss-20b on it. It wouldn't support a dozen users, but it worked fine for me for experimental and dev testing.

Considering I didn't buy this machine with AI as an intended use, that's pretty amazing.

There's definitely some room for Apple in this space, while everybody thinks the only player is Nvidia.

Sure (Score:1)

by devslash0 ( 4203435 )

If you keep the lid closed and don't do anything with it.

Re: (Score:3)

by Moridineas ( 213502 )

> If you keep the lid closed and don't do anything with it.

It's easy to tell you haven't used Apple hardware in a long, long time.

Up to (Score:2)

by 0xG ( 712423 )

> delivers up to 3.5 times faster AI performance

> up to 1.6 times faster than the previous generation

> up to 24 hours on a single charge.

> up to 20% faster multithreaded performance

Marketing weasel words.

Suuuure (Score:1)

by Reygle ( 5392954 )

> battery life that reaches up to 24 hours on a single charge

If you believe that, I have a bridge for sale.
