2 thoughts on “Why Apple Cares About ARM – TMO Daily Observations 2020-05-06”
Really? The lack of Intel x86 inside Macs is a deal-breaker for our university, because we need true, full compatibility with the rest (95%) of the world (Windows). And I do not mean Boot Camp alone, but also VMware Fusion, for instance. And last but not least, critically essential in our workflow, true, full compatibility with the Microsoft Office applications (Word, PowerPoint and Excel) to share documents and presentations for scientific meetings with colleagues, track changes while co-editing manuscripts for scientific publications, etc. If Apple does not release Intel x86 Macs, we will surely be forced to switch to Windows at our university. A shame for all. Did Apple remember and learn the lesson from the PowerPC (RISC, like ARM) fiasco years ago?
I don’t think it was mentioned, but Apple has sold many RISC-processor Macs – the G-series desktops and laptops. PowerPC chips hit a power/performance/heat roadblock, Apple switched to Intel Core processors, and PowerPC lived on in other platforms – servers, PlayStation/Xbox, etc. But I digress…
To simplify a little – fat binaries existed because the compiler could turn high-level code into instructions for two processors. If the developer wrote the app in high-level code, using Apple’s APIs (application programming interfaces), then it -could- be compiled for both. I said I was simplifying… for the sticklers out there.
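To make that concrete, here is a sketch using today’s toolchain (the x86_64/arm64 pairing is my illustration; the PowerPC-to-Intel era worked the same way, with -arch ppc -arch i386). Code that only touches high-level APIs needs nothing more than an extra compiler flag per target:

```c
/* hello.c – ordinary portable C, nothing processor-specific */
#include <stdio.h>

int main(void) {
    printf("Hello from whichever architecture ran me\n");
    return 0;
}

/*
 * On macOS, one clang invocation can put a slice for each instruction
 * set into a single "fat" (universal) file; the loader picks the
 * matching slice at launch:
 *
 *   clang -arch x86_64 -arch arm64 -o hello hello.c
 *   lipo -archs hello        # prints: x86_64 arm64
 */
```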
If a transition were planned, Apple would presumably already be shaping its APIs to work with both architectures, or at least with the new chips.
Where companies like Adobe come unstuck is in having so much low-level code, which talks directly to the chip (in machine language/binary), bypassing the high-level APIs. This is done for speed, is highly processor-specific, and by its nature is not re-compilable for another chip. Changing chip technology for applications with a lot of low-level code is a massive investment, possibly even requiring different developers with different skill sets, and potentially no knowledge of Adobe’s code/practices. A very sticky situation. Not that the likes of Adobe wouldn’t have had adequate warning, but newer players like Affinity won’t be dragging decades of legacy with them -if- a change happens.
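A minimal sketch of why that kind of code doesn’t just recompile (the add4 helper is hypothetical; the Intel side uses SSE intrinsics, the ARM side uses NEON, and the headers, vector types, and instruction names share nothing):

```c
#include <stdio.h>

#if defined(__x86_64__)
#include <immintrin.h>              /* Intel SSE intrinsics – x86-only header */
static void add4(const float *a, const float *b, float *out) {
    __m128 va = _mm_loadu_ps(a);    /* load 4 floats into an SSE register */
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb));
}
#elif defined(__aarch64__)
#include <arm_neon.h>               /* ARM NEON intrinsics – different types, names, header */
static void add4(const float *a, const float *b, float *out) {
    float32x4_t va = vld1q_f32(a);  /* load 4 floats into a NEON register */
    float32x4_t vb = vld1q_f32(b);
    vst1q_f32(out, vaddq_f32(va, vb));
}
#else
/* Portable fallback: plain C the compiler can target at anything */
static void add4(const float *a, const float *b, float *out) {
    for (int i = 0; i < 4; i++) out[i] = a[i] + b[i];
}
#endif

int main(void) {
    float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, out[4];
    add4(a, b, out);
    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```

Multiply that duplication across thousands of hand-tuned hot loops and you have Adobe’s problem; a newcomer like Affinity, starting fresh, mostly doesn’t.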
And you know how influential the high-powered applications are to the viability of Apple’s platform. Apple basically lost the professional audio and video market when it dropped the ball on the Mac Pro (Digital Performer, a music application I use that was Mac-only for over 30 years, had to create a Windows version or lose its customers). A change of processor at the Pro level could be another nail in the Mac Pro coffin before the newimprovedcheesegrater gains momentum. Making its own Mac processors seems inevitable for Apple, but watching how it’s done is going to be VERY INTERESTING.
Kelly – Westworld lost the plot (would have been -much- easier to take over Delos without a massacre in the park), Devs is the new shiny.