Apple’s new M1 computers max out at 16GB RAM and you can’t use an external GPU
Nov 12, 2020
The desktop on which I’m writing this article is a PC I built myself in 2013. I built it with 16GB RAM. Back then it was considered a “decent” amount of memory. It was plenty to satisfy most needs, but not overkill. Nor was it a particularly expensive quantity of RAM. Here we are seven years later, in 2020, and Apple thinks that’s the amount of RAM you need in a computer today.
Apple’s new M1 ARM-based Macs announced this week are limited to a maximum of only 16GB of RAM. That’s severely disappointing, given how excited creatives have been about the impending Apple announcement over the past few months. What makes it even worse, though, is that they won’t work with an external GPU, either.
The Verge reports that Apple is recommending that customers interested in buying a Mac with more than 16GB should just purchase one of the older Intel-based models, which doesn’t do a whole lot for confidence in their newly announced systems. A number of years ago when people were complaining about memory limitations in the MacBook Pro, Apple said that the limitations were to help conserve battery life. This time around, though, it looks like it’s just a limitation of the chips themselves.
And whatever capacity of system you do get, you’re stuck with it forever. Want to save a little money and get the 8GB system for now with plans to upgrade it to 16GB in the future? Nope, sorry. You can’t do that, either, because the RAM is built right into the chip. So it’s not replaceable or upgradeable.
Even if you think you can just about manage for your media creation with 16GB RAM, you’re still going to be somewhat limited as the new chip also won’t work with external GPUs. While PC users might not be all that familiar with external GPUs (because we can just put Nvidia ones right inside our machines), they’re quite common in Macs and MacBooks to provide more graphical processing power for demanding applications that have GPU acceleration – like most photo and video editors these days.
Although Apple had apparently planned to replace their entire lineup with the new ARM-based chips, there’s no way the current chip would even be able to keep up with the demands of content creators five years ago, let alone their needs today. When 4K, 6K and even 8K RAW footage is becoming common with cameras hitting 60+ megapixels for stills, it’s just not enough.
The “transition” to shift the Mac lineup over to the new chip begins now, Apple says, and will take two years to complete. But they’d better have a few tricks up their sleeve and some new processors that can utilise more RAM and more GPUs if they want to keep hold of creatives – which they may or may not care about, especially given some of their actions (or inactions) over the last few years.
[via The Verge]
John Aldred
John Aldred is a photographer with over 25 years of experience in the portrait and commercial worlds. He is based in Scotland and has been an early adopter – and occasional beta tester – of almost every digital imaging technology in that time. As well as his creative visual work, John uses 3D printing, electronics and programming to create his own photography and filmmaking tools and consults for a number of brands across the industry.
30 responses to “Apple’s new M1 computers max out at 16GB RAM and you can’t use an external GPU”
Oh look, Apple shoots itself in the foot!
If you run the iOS versions, 16GB is a big step up and cheaper. If you use desktop versions, I think it will still be better for most uses (not large video or 3D files) due to the way the M1 works with all its modules. Assuming you are not using the Intel version.
Wait for the update that allows you to spend more money to do what it should have done in the first place…
Apple keeps trying to get people who make $20k on their work to spend $13k on their computer.
For those complaining about the 16GB RAM cap: ARM processors use less RAM than the x86 architecture for the same operations.
But what about the actual image and video data? That’s still the same size, regardless of platform. :)
John Aldred I believe it uses system storage as RAM. Don’t know exactly, but he’s not wrong.
I’m pushing 4K 30 and HD 100 video (not ProRes raw) around very fluidly in LumaFusion on a 2018 iPad Pro and that thing has no more than 4-6GB of ram.
But without knowing what kind of project it is, that doesn’t really mean much. Are we talking hour-long projects with hundreds of clips and several video tracks and Fusion effects all running simultaneously?
I mean, you can edit basic single track 4K30 just adding music with pretty minimal hardware. But until I see some solid evidence, there’s no way anybody will be able to convince me an M1 with 16GB of RAM is going to be capable of the same kind of work as an Intel or AMD based desktop (PC or Mac) with 128GB RAM and a high end NVidia or AMD GPU. :)
John Aldred you know I’m a video newbie by comparison to you, and I’ve never done an hours-long production, but I’m saying try it before you write it off.
I don’t expect it to perform like a high-end workstation, but I’m suggesting you can’t simply judge it by past hardware metrics. Given what I’ve seen my iPad handle (tasks my Intel i7 32GB MacBook fights with), I think it will easily punch above its weight class. Obviously these machines aren’t intended for the likes of you, but if the advancements in ARM integration really do enhance performance with minimal power requirements, this is going to change a lot. MS has its toes in the pond too with the Surface Pro X. It still needs some time to mature, but I think it will change things for the better, as Intel (and x86 in general) seems to have stagnated.
BUILD YOUR OWN COMPUTER!
Why write an article like this without anyone actually having used the product yet? It’s a completely different setup to what we’re all accustomed to… plus, if the benchmark comparisons floating around are correct, would more than 16GB of RAM and an eGPU even boost the real-world experience?
Christian Nixon because you don’t have to be a rocket scientist to know how much RAM, CPU and GPU power a graphics program uses. I have my Mac with top-of-the-line stuff in it, and sometimes it sweats.
Christian Nixon and one more thing: an M1 CPU with only 16GB of RAM is just the beginning of forcing us to spend more and more money for better guts in a Mac system.
Marcin Pietrzak have you used any of these new systems or even the developer kit for photo or video work?
Marcin Pietrzak dude, no-one has used these setups yet. I’m saving my opinions, as we all should, for when they’re in people’s hands getting put through their paces. Anything before that is just arguing from ignorance.
If you can explain to me how you’ll fit 30-50GB worth of video (which always gets bigger when it’s decoded and decompressed into actual image data) into 16GB of RAM while editing (plus the operating system and all the other stuff it has to do in the background), then you have my attention. :)
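For a rough sense of why decoded footage balloons like that, here’s the back-of-the-envelope maths in a few lines of Python. The frame size and bit depth are assumed example parameters, not measurements from any particular editor:

```python
# Rough estimate of memory taken by decoded 4K video frames.
# Assumptions (illustrative only): DCI 4K frames (4096x2160),
# decoded to 3 channels at 16 bits (2 bytes) per channel.
width, height = 4096, 2160
bytes_per_pixel = 3 * 2                    # 3 channels x 2 bytes each
frame_bytes = width * height * bytes_per_pixel
frame_mb = frame_bytes / 1024**2
cache_gb = frame_bytes * 30 / 1024**3      # a one-second cache at 30 fps
print(f"One decoded frame: {frame_mb:.1f} MB")  # ~50.6 MB
print(f"30-frame cache: {cache_gb:.2f} GB")     # ~1.48 GB
```

So even a one-second working cache of uncompressed frames eats well over a gigabyte, before the OS, the application, effects and everything else get their share.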
If ARM is that much more efficient, why doesn’t the Raspberry Pi 4 outperform my 7-year old desktop? :)
Also tell me why Apple’s still telling people to buy the Intel ones if photography, video and content creation is their goal? :)
“anything before that is just arguing from ignorance”
Actually, having no opinion at all in this case would be ignorance. Decades of experience and understanding how computers work is what others are basing their opinions on.
I’m no fan of the Apple’s hardware walled garden, but there are so many things wrong with this statement @John Aldred that it would be tedious to go through them all. Now would be a good time to quietly close the book on this post and move on.
John Aldred
My iPad Pro can comfortably handle 4K footage with less RAM and an overall lower-powered chip than the M1. Will I ever edit a Hollywood blockbuster on it? Of course not, nor would I expect it to.
If ARM isn’t more efficient, then why does the iPad outperform 4-year-old laptops?
The systems announced so far are clearly base models to replace the Intel base models. If your needs have always been on the higher end, why would you suddenly want or expect that of a base model, or want Apple to recommend them to that audience?
Anything before that IS arguing from ignorance
While I don’t have decades of experience, I do have enough years (with formal education and work experience) to know that a new platform that has yet to be put into the hands of consumers should be fully tested before being judged in such a matter-of-fact way. You have no experience with THIS new chip, so your opinion is on the same footing as no opinion. Good read, though.
“If ARM isn’t more efficient, then why does the iPad outperform 4-year-old laptops?”
Because it doesn’t in all cases? Because you’re comparing it to a crap laptop? :)
Your ignorance is showing…
The graphics power built into these machines is better than some of the top end GPUs on the market right now…
That RAM is used simultaneously by the CPU and GPU, which means apps can be optimized depending on whether they need more CPU or GPU power…
In those benchmarks that you often see being used online, you often see the GPU being heavily taxed while the CPU throttles down, or vice versa… But seldom do you ever see any application maxing out the GPU and system RAM simultaneously.
There are lots of photographers out there who are perfectly happy using an iPad Pro… The M1 chip should be a noticeable improvement over that.
Don Barnard thanks for writing that. Finally, someone who understands computers.
Although the all-in-one is much more efficient, I will wait a year to see how graphics-intensive apps compete for memory with basic stuff like Chrome.
The graphics power built into these machines is better than some of the top end GPUs on the market right now.
Erm, that’s completely false. The M1 is comparable to a 1050 Ti or, at best, a 1650, which are both previous-gen entry-level Nvidia cards from 2016 and 2019. Forget Geekbench; in a non-synthetic test under continuous load this M1 chip would be destroyed by both of those cards.
Perhaps a bit less snark and a few more facts might help. Do your research before relying on The Verge and other PC-centric sites for your opinions. Look to Don Barnard, and also realize that larger file sizes are really storage issues, especially with the way the M1 chips allocate RAM usage.
Apple gave up on power users a while ago. Now it’s just another luxury lifestyle brand
Apple are complete control freaks when it comes to modifying any of their products. Their hardware is no better than a high-end Dell’s. Everything is proprietary. You’re paying ridiculous prices for a mediocre product. My first and last experience with Apple was an iPhone 5. It was just as I expected, and worse. However, if you want a product that thinks for you, Apple is for you.
Tha Train if your “first and last experience with Apple is an iPhone 5” …you have no clue what you’re talking about when it comes to their hardware and software integration.
It’s like writing a review article without actually experiencing the product.
There’s very little chance that Apple doesn’t release an M1 Pro chip in a larger form factor with 32+ GB of RAM. They control the socket.
Guys, this is the first version: M1. There will be new versions with more capability. It is normal and natural. Remember that these are entry-level machines, not the full pro line (apart from the 13″ Pro).
So you are writing an article condemning a product you have never used!
Borrow someone’s machine and run the projects you say the M1 chip cannot handle. Then you will at least appear well informed and able to make an honest judgment.