
Apple Introduces M1 Pro and M1 Max (apple.com)

(Monday October 18, 2021 @05:25PM (msmash) from the moving-forward dept.)


Apple today [1]announced M1 Pro and M1 Max, its new chips for the Mac. Apple:

> M1 Pro and M1 Max introduce a system-on-a-chip (SoC) architecture to pro systems for the first time. The chips feature fast unified memory, industry-leading performance per watt, and incredible power efficiency, along with increased memory bandwidth and capacity. M1 Pro offers up to 200GB/s of memory bandwidth with support for up to 32GB of unified memory. M1 Max delivers up to 400GB/s of memory bandwidth -- 2x that of M1 Pro and nearly 6x that of M1 -- and support for up to 64GB of unified memory. And while the latest PC laptops top out at 16GB of graphics memory, having this huge amount of memory enables graphics-intensive workflows previously unimaginable on a notebook. The efficient architecture of M1 Pro and M1 Max means they deliver the same level of performance whether MacBook Pro is plugged in or using the battery. M1 Pro and M1 Max also feature enhanced media engines with dedicated ProRes accelerators specifically for pro video processing. M1 Pro and M1 Max are by far the most powerful chips Apple has ever built.

>

> Utilizing the industry-leading 5-nanometer process technology, M1 Pro packs in 33.7 billion transistors, more than 2x the amount in M1. A new 10-core CPU, including eight high-performance cores and two high-efficiency cores, is up to 70 percent faster than M1, resulting in unbelievable pro CPU performance. Compared with the latest 8-core PC laptop chip, M1 Pro delivers up to 1.7x more CPU performance at the same power level and achieves the PC chip's peak performance using up to 70 percent less power. Even the most demanding tasks, like high-resolution photo editing, are handled with ease by M1 Pro. M1 Pro has an up-to-16-core GPU that is up to 2x faster than M1 and up to 7x faster than the integrated graphics on the latest 8-core PC laptop chip. Compared to a powerful discrete GPU for PC notebooks, M1 Pro delivers more performance while using up to 70 percent less power. And M1 Pro can be configured with up to 32GB of fast unified memory, with up to 200GB/s of memory bandwidth, enabling creatives like 3D artists and game developers to do more on the go than ever before.

>

> M1 Max features the same powerful 10-core CPU as M1 Pro and adds a massive 32-core GPU for up to 4x faster graphics performance than M1. With 57 billion transistors -- 70 percent more than M1 Pro and 3.5x more than M1 -- M1 Max is the largest chip Apple has ever built. In addition, the GPU delivers performance comparable to a high-end GPU in a compact pro PC laptop while consuming up to 40 percent less power, and performance similar to that of the highest-end GPU in the largest PC laptops while using up to 100 watts less power. This means less heat is generated, fans run quietly and less often, and battery life is amazing in the new MacBook Pro. M1 Max transforms graphics-intensive workflows, including up to 13x faster complex timeline rendering in Final Cut Pro compared to the previous-generation 13-inch MacBook Pro. M1 Max also offers a higher-bandwidth on-chip fabric, and doubles the memory interface compared with M1 Pro for up to 400GB/s, or nearly 6x the memory bandwidth of M1. This allows M1 Max to be configured with up to 64GB of fast unified memory. With its unparalleled performance, M1 Max is the most powerful chip ever built for a pro notebook.



[1] https://www.apple.com/newsroom/2021/10/introducing-m1-pro-and-m1-max-the-most-powerful-chips-apple-has-ever-built/
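
The bandwidth ratios Apple quotes are easy to sanity-check. A quick sketch in Swift: the 200GB/s and 400GB/s figures come from the press release above, while the original M1's 68.25GB/s (its 128-bit LPDDR4X-4266 memory) is an outside figure not stated in the announcement, so treat it as an assumption.

    import Foundation

    // Sanity check of the quoted memory-bandwidth ratios.
    // m1Pro and m1Max are from the press release; m1 is an assumed
    // outside spec (128-bit LPDDR4X-4266), not from the announcement.
    let m1 = 68.25, m1Pro = 200.0, m1Max = 400.0  // GB/s

    print(String(format: "M1 Max / M1 Pro: %.1fx", m1Max / m1Pro))  // 2.0x
    print(String(format: "M1 Max / M1:     %.1fx", m1Max / m1))     // 5.9x ("nearly 6x")
    print(String(format: "M1 Pro / M1:     %.1fx", m1Pro / m1))     // 2.9x

The numbers line up: 400/200 is exactly 2x, and 400/68.25 is about 5.9x, matching the "nearly 6x" claim.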



x86 is dead (Score:1, Funny)

by Anonymous Coward

Bow down to your new god.

Re:x86 is dead (Score:5, Interesting)

by dnaumov ( 453672 )

> Said like a true believer, if not just a fool.

Only a true believer would post something like you did. Others would see that neither Intel nor AMD _STILL_ has an answer even to the M1 when looking at it as a total price/performance/noise level/power draw/battery life package. If the claims of 1.7x M1 are true, things are getting outright embarrassing for x86.

Re: (Score:2)

by znrt ( 2424692 )

that true 1.7x only exists in apple's walled garden. embarrassing for x86 indeed (methinks they're getting used to that :D), but still very far from world domination. implying that this can anyhow dominate the broader market in the current context is pure fanboyism, and the person you are calling "true believer" is just pointing out that glaring evidence (granted, without much explanation).

Re: (Score:3)

by MightyMartian ( 840721 )

Strange statement. I've been using open source software on my M1 Mac for a couple of months now, as well as writing code. Java and C/C++ all seem to work just fine, and the battery life is just bloody amazing. Best of all, the command prompt is actual BSD Unix, so I have a powerful processor and the CLI I enjoy the most and am the most productive in. I have my old Dell laptop if I need to do anything Windows-specific, but honestly, other than to use it for video conferencing, I rarely even turn it on.

Re: (Score:2)

by znrt ( 2424692 )

that's indeed progress.

however, that's also ... openBSD :D

which is fine and dandy, by all means enjoy that luxury. i wouldn't expect any significant portion of the desktop userbase to follow, though. the fight for world domination will be long ... and lonely!

Re: (Score:1)

by Distortions ( 321282 )

If the performance is competitive, great.

But this is Apple under Tim whatshisface.

My assumption will be it is complete lies until thoroughly proven otherwise.

Re: (Score:1)

by BeepBoopBeep ( 7930446 )

Don't need to compete with AMD and Nvidia, no one plays games on Mac anyway. It has all the video encode/decode accelerators plus the AI blocks the target audience will use, so just scrap the crazy 3D stuff out of the silicon. I have an M1 Air, and would love to see the new Air next year. Fanless or nothing.

There are significant numbers, more coming (Score:1)

by SuperKendall ( 25149 )

No-one played games on Mac before because the GPU performance was not that great.

But lots and lots of people play games on iPads and iPhones, and that widespread use has been bleeding over to more and more Mac ports of high-end games. And with Apple now offering laptops that have the specs for premium-level gaming, I would not be surprised to see more game makers jumping on board with ports... after all, many games are based on Unity or Unreal, so it's not like it's a massive step to support the Mac.

Re: (Score:1)

by dfghjk ( 711126 )

"...and that widespread use has been bleeding over to more and more Mac ports of high end games."

Leading with a lie, as is usual for you, SuperKendall. Some tepid observation, followed by a fanboy lie.

"And with Apple offering now what are laptops that have the specs for premium level gaming, I would not be surprised to see more game makers jumping on board with ports..."

And we are not surprised by your pro-Apple propaganda. It's not what a company sells, it's what people have. Gamers do not have Macs.

Re: (Score:2)

by dfghjk ( 711126 )

Because gamers don't use Apple. When that changes, game makers may support the platform.

Not for laptops (Score:3)

by SuperKendall ( 25149 )

> Graphics, while a huge boost compared to the crappy M1, are still a long, long way behind offerings available from Nvidia and AMD.

If you watched the presentation they compared it against "the most expensive gaming laptop" and the performance of the M1 Max was about equal in terms of GPU...

Except that the performance of the Apple laptop doesn't drop when running on battery only.

And it used, I think, 100W less power?

Yes there are higher end GPU's more powerful. Apple will get around to trouncing those next year with the M1 Ultimate Pro Max chip or whatever they choose to call the Mac Pro desktop chip.

Maybe, not sure (Score:1)

by SuperKendall ( 25149 )

> Why didn't they compare it against the most powerful gaming laptop?

They might have, I don't remember the exact wording used. I await further spec tests to get real numbers compared against real systems.

Re:Not for laptops (Score:4, Insightful)

by _xeno_ ( 155264 )

I believe they did actually say it was the most "powerful" laptop they could find.

They didn't say what it was and their graph's vertical scale is meaningless (it's "relative performance" and it looks like the M1 Max caps out at "375" while the "most powerful gaming laptop" caps at over "400") so who freaking knows what that means.

Plus, how are they even comparing things? Did they use a benchmark? What benchmark? Relative to what?

Who knows. They didn't give exact figures and they didn't say what they were comparing against.

It'll be interesting to see these things benchmarked when they ship. I expect that they really will have some impressive 3D performance, but none of that really matters because it's not like anyone uses Macs for anything anyway. (Even creative types have mostly moved over to Windows tools thanks to Apple Silicon breaking almost every piece of software professionals use. Sure, the pro-sumer stuff has been ported, but not the pro stuff.)

Yes raytracing an interesting point (Score:1)

by SuperKendall ( 25149 )

> There's no hardware raytracing support

That is an interesting point about the raytracing. I wonder if not having specific hardware devoted to raytracing could be made up for by Metal optimizations for the GPU it does contain, which may have something like hardware support for raytracing calculations that we just don't know about yet...

Apple does have a [1]guide [apple.com] on how to use Metal to accelerate ray tracing.

I have no reason to expect Apple has yet caught up to that level of GPU support though. Maybe in the desktop chips.

[1] https://developer.apple.com/documentation/metalperformanceshaders/metal_for_accelerating_ray_tracing
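
For the curious, the linked guide is built around the MPSRayIntersector API. Below is a minimal sketch of GPU ray/triangle intersection with it; the single triangle, the test ray, and the buffer contents are all invented here for illustration, not taken from the guide.

    import Metal
    import MetalPerformanceShaders

    // Minimal sketch: intersect one ray with one triangle on the GPU
    // using MPSRayIntersector (the compute-based API the guide covers).
    guard let device = MTLCreateSystemDefaultDevice(),
          let queue = device.makeCommandQueue(),
          let cmdBuf = queue.makeCommandBuffer() else { fatalError("no Metal device") }

    // One hypothetical triangle in the z = 0 plane.
    let vertices: [SIMD3<Float>] = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
    let vertexBuffer = device.makeBuffer(bytes: vertices,
                                         length: MemoryLayout<SIMD3<Float>>.stride * vertices.count,
                                         options: .storageModeShared)!

    // Build the acceleration structure the intersector will traverse.
    let accel = MPSTriangleAccelerationStructure(device: device)
    accel.vertexBuffer = vertexBuffer
    accel.vertexStride = MemoryLayout<SIMD3<Float>>.stride
    accel.triangleCount = 1
    accel.rebuild()

    // One ray shot straight down the -z axis at the triangle.
    var ray = MPSRayOriginDirection(origin: [0.25, 0.25, 1], direction: [0, 0, -1])
    let rayBuffer = device.makeBuffer(bytes: &ray,
                                      length: MemoryLayout<MPSRayOriginDirection>.stride,
                                      options: .storageModeShared)!
    let hitBuffer = device.makeBuffer(length: MemoryLayout<MPSIntersectionDistance>.stride,
                                      options: .storageModeShared)!

    // Encode the intersection pass; the GPU does the traversal.
    let intersector = MPSRayIntersector(device: device)
    intersector.rayDataType = .originDirection
    intersector.intersectionDataType = .distance
    intersector.encodeIntersection(commandBuffer: cmdBuf,
                                   intersectionType: .nearest,
                                   rayBuffer: rayBuffer, rayBufferOffset: 0,
                                   intersectionBuffer: hitBuffer, intersectionBufferOffset: 0,
                                   rayCount: 1,
                                   accelerationStructure: accel)
    cmdBuf.commit()
    cmdBuf.waitUntilCompleted()

    // A negative distance means a miss; this ray should hit at ~1.0.
    let hit = hitBuffer.contents().load(as: MPSIntersectionDistance.self)
    print("hit distance:", hit.distance)

Note this runs ray traversal as compute work on the regular GPU cores, which is exactly the open question: dedicated raytracing hardware does that traversal in fixed-function units instead.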

Re: (Score:2)

by dfghjk ( 711126 )

You get your benchmarks from Apple presentations? Of course you do.

Since when do people who care about graphics performance operate on battery only? Oh yeah, since Apple wants to promote it.

"Yes there are higher end GPU's more powerful. Apple will get around to trouncing those next year with the M1 Ultimate Pro Max chip or whatever they choose to call the Mac Pro desktop chip."

Sounds objective. At least you aren't claiming the generation in the next 3 months, like happened here last time with the M1.

I guess...

Re: x86 is dead (Score:2)

by leonbev ( 111395 )

People are guessing from the Apple charts that the GPU performance will be comparable to a mobile GeForce 3060. It will be interesting to see if the actual benchmarks will meet those goals.

If it's true, it's basically the first APU in history where the integrated graphics do not suck!

Re: (Score:2)

by Ostracus ( 1354233 )

RISC-V says hi.

Re: (Score:2)

by raynet ( 51803 )

And goes back to sulk in the corner, and is only allowed to return once it reaches 2 GHz clock speeds and is at least barely usable for running any kind of desktop app.

Re: x86 is dead (Score:2)

by Noah Draper ( 5166365 )

It's all relative to the current garbage that is standard coding these days. I've got computers from 1982 that will still perform their relevant tasks quicker than modern-day computers, because of competent programmers who did not abstract the hell out of everything. A modern OS black-holes the majority of hardware resources.

Re: (Score:2)

by znrt ( 2424692 )

not really. this thing looks indeed very nice but if it can't run anything but mac-os then it's for sure not killing any competition whatsoever anytime soon. actually, it might even stimulate the competition, which is still good news. ;-)

Re: (Score:2)

by PCM2 ( 4486 )

> not really. this thing looks indeed very nice but if it can't run anything but mac-os then it's for sure not killing any competition whatsoever anytime soon. actually, it might even stimulate the competition, which is still good news. ;-)

It can run pretty much anything. The new macOS (the one that these things boot to, out of the box) has an emulation/translation layer that pretty much works. But major application suites have already been compiled and optimized for Apple silicon. Microsoft Office and Microsoft Edge already are, so I'm sure the Adobe stuff is too, and Apple's media stuff... etc. It just doesn't seem like it's anywhere near hard to cross-compile software to these chips.
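
For what it's worth, the cross-compile step really is small. A hypothetical one-file Swift example (the file name and build commands are illustrative, assuming a recent Xcode toolchain):

    // hello.swift -- made-up example of targeting Apple silicon.
    // Compile for arm64, for Intel, then glue both into one
    // "universal" binary:
    //
    //   swiftc hello.swift -target arm64-apple-macos11  -o hello-arm64
    //   swiftc hello.swift -target x86_64-apple-macos11 -o hello-x86_64
    //   lipo -create hello-arm64 hello-x86_64 -output hello
    //
    // The arm64 slice runs natively on Apple silicon; Rosetta 2
    // translates the x86_64 slice on machines that only have that.
    #if arch(arm64)
    print("Running natively on arm64")
    #else
    print("Running on x86_64 (possibly under Rosetta 2)")
    #endif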

Re: (Score:2)

by znrt ( 2424692 )

thanks for the update. then it's just a matter of apple providing a sensible usage license and other software vendors working on compilers and platforms. sounds good.

Re: (Score:2)

by dfghjk ( 711126 )

the dumber you are, the more you are willing to believe. See SuperKendall... and PCM2. It runs what it runs, and Apple fanboys will claim that anything it can't run is not "anything".

Re: (Score:2)

by ElitistWhiner ( 79961 )

CSAM child porn surveillance is Apple’s new mission.

Apple stepped away from computing to be the Man in the middle

Awesome (Score:1)

by fod_dzug ( 6598790 )

I welcome this new contender. Can't wait to see independent reviews! /drool

ffs (Score:2)

by richy freeway ( 623503 )

Hate to say it, but as an Android fanboy who utterly dislikes Apple's products, I'm a bit jealous.

Google has money, why aren't they doing this?

Re: (Score:1)

by BeepBoopBeep ( 7930446 )

Google is an OS supplier, they barely sell any Pixel phones of their own. No ROI, easy math. Even MSFT has to think carefully to go down this road with their own Surface laptops.

Re: (Score:2)

by AmiMoJo ( 196126 )

Google's CPU is launching with the Pixel 6... I think this month, certainly soon.

If you want a high performance laptop though, Ryzen is still king.

nice euphemism (Score:2)

by algaeman ( 600564 )

Unified certainly sounds much more robust than shared memory...

Re: (Score:3)

by Proudrooster ( 580120 )

Technically it is shared memory, but there is a 'switched fabric' memory controller between the CPU and GPU.

What is the significance?

It means the GPU doesn't have to bother the CPU to fetch memory for it, which would slow down the CPU and tie up the memory bus.

Right now, in most shared-memory setups (like Intel's), the CPU is the default memory controller: the only path to memory is through the DDR memory bus, which is connected to the CPU.

[1]https://www.intel.com/content/... [intel.com]

In the M1 architecture, both the CPU and GPU have their own paths to memory through that fabric.

[1] https://www.intel.com/content/dam/www/public/us/en/documents/white-papers/ia-introduction-basics-paper.pdf
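
A minimal sketch of what this looks like from the software side, assuming Metal's .storageModeShared (the buffer size and contents are arbitrary): the CPU writes a buffer and the GPU can bind that very same allocation, with no copy over a discrete bus to separate VRAM.

    import Metal

    // One allocation, visible to both CPU and GPU on Apple silicon.
    guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no Metal device") }

    let count = 1024
    let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    // CPU side: write directly into the shared allocation.
    let ptr = buffer.contents().bindMemory(to: Float.self, capacity: count)
    for i in 0..<count { ptr[i] = Float(i) }

    // GPU side: any compute/render encoder can now bind `buffer` directly,
    // e.g. encoder.setBuffer(buffer, offset: 0, index: 0) -- no upload step.
    // On a discrete-GPU system the same data would typically live in
    // .storageModePrivate VRAM and need an explicit blit to get there.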

Re: (Score:1)

by BeepBoopBeep ( 7930446 )

It's not just GPU access to memory, but also other functional blocks (AI, video accelerators, etc.) and the SSD. They don't need to go through the CPU and waste cycles. It's ideal, but not upgradeable.

Re: (Score:1)

by f00zbll ( 526151 )

I've done comparisons between my M1 MacBook Air, a Windows AMD 3700X box, and a Linux quad-core i7. I can tell you without any BS that for many daily programming activities, the M1 is faster. For example, starting a ReactJS project on the M1 MacBook Air is usually 4x faster and more responsive. That unified memory makes a big difference. My AMD 3700X system has 64GB of memory and 8 full-power cores, yet the M1 with 4 performance cores outperforms it.

I love being able to replace or upgrade components on my Windows workstation, but unified memory has its advantages.

Re: nice euphemism (Score:1)

by Osgeld ( 1900440 )

So an 8 core 64 gig monster is not fast enough to run your shitty code

No wonder things are so bloated nowadays

Re: nice euphemism (Score:2)

by fred6666 ( 4718031 )

How do you know the unified memory has anything to do with it? Perhaps the m1 CPU itself is just fast.

Re: (Score:2)

by dfghjk ( 711126 )

yes, that used to be a slam, but now Apple makes it so it is revolutionary

Re: (Score:2)

by edwdig ( 47888 )

Assuming they're using "Unified" in the same way you see in game consoles, then yeah, it's a big difference.

Shared memory on your typical Intel integrated GPU means your GPU's memory bus is wired directly to the CPU. All GPU memory requests get sent to the CPU, then they get processed like any memory access from the CPU would, including going thru the CPU's cache.

On game consoles with unified memory, there are often multiple memory busses. You've got one bus that goes RAM -> CPU -> GPU, and another that goes directly from RAM to the GPU.

(Slashdot Commenter Voice) (Score:1)

by Drew84WHEEE ( 1447189 )

More space than a Nomad and has WiFi now, but: No compatibility with TabWorks or WordPerfect 2.3, so my definitely-generalizable shop is out. It’s a real shame, because they seem neat.

Several steps backwards (Score:2)

by t0qer ( 230538 )

I don't mind SOC for certain things. My phone for instance, that's great. A hobbyist computer like the Pi, it's great there too. I also get why it's advantageous to have RAM on chip, and maybe even share it with the GPU.

OTOH I like the modularity a desktop provides, and some of the stuff from this article ("other laptops are limited to 16GB of GPU RAM") is kind of chicken shit. We're talking about shared vs. dedicated GPU RAM. There are plenty of Intel GPU laptops that use shared memory.

I grew up in an a

Kinda makes sense, but it is terrible (Score:2)

by stikves ( 127823 )

I am writing this on one of the weirdest collaborations ever designed: an Intel NUC with an AMD GPU embedded on the SoC. It has an i7-8809G, with an i7 core and a Radeon RX Vega combined on the same chip. It actually performs quite well, and can play many games at 1080p.

So I can see the appeal.

That being said, the Apple chip has many downsides for the public. First of all, lack of OS choice. Even though Linux is being hacked to run on their platform, many things do not work. Then there's the lack of proper PCIe extensibility,

Re: (Score:2)

by Dynedain ( 141758 )

I dunno, if you go back to the pre "PC-compatible" days, the variety of hardware and unique takes on what an OS should be was astoundingly diverse compared to what we saw during the Wintel-dominated eras of "flexible standards". Sure, the Wintel days meant cheap interchangeable hardware, but that also meant a very generic definition of physical form factors and lowest-common-denominator interoperability between software and hardware.

When? How Much? (Score:1)

by dudeus ( 664731 )

Guess I missed it in the press release, when are these available and how big a mortgage do I need to take out to own one? Or are these like phones now and I really only rent them?

The posts here so far are pretty predictable (Score:2)

by 93 Escort Wagon ( 326346 )

Let's take it up a notch, shall we?

Anandtech has great article on chips (Score:2)

by SuperKendall ( 25149 )

Even though they don't have access to the chips yet, Anandtech has a [1]great article [anandtech.com] up going through what details they can surmise, and giving more details on the Intel chips Apple was comparing against in the presentation.

One astounding point - the M1 Max has 57 BILLION transistors, built on a 5nm process... also from the article: "AMD advertises 26.8bn transistors for the Navi 21 GPU design at 520mm² on TSMC's 7nm process". So wow.

[1] https://www.anandtech.com/show/17019/apple-announced-m1-pro-m1-max-giant-new-socs-with-allout-performance

Re: 32GB? (Score:1)

by fod_dzug ( 6598790 )

I'm pretty sure my order was for 64GB.
