Many of you have heard of the term “fat binaries,” or “universal apps,” in which the code will work on more than a single platform or product. It may even offer a different look and feel depending on the needs of that product.
So when Apple moved to Intel CPUs beginning in 2006, it bundled a built-in emulator, dubbed Rosetta, to run PowerPC apps for several years. That way, you didn’t have to wait for developers to build compatible software. Developers could also serve both audiences by building apps that combined PowerPC and Intel code, so-called universal apps. It made for larger downloads, but at least you were assured of a version that would work even on your older Mac.
Eventually that all went by the wayside: Apple made Rosetta an optional install in OS X Snow Leopard, then removed it entirely beginning with OS X 10.7 Lion. After that it was Intel or nothing.
Microsoft has supported fat Windows 10 binaries that run on regular PCs, tablets and, while they lasted, mobile gear. But with the constrained resources of a smartphone, it would seem wasteful, unless the download process strips the unused code from the binary.
Now there are published reports that Apple is planning to take a similar approach with its current gear. So developers may someday be able to create one universal app that works on an iPhone, iPad and, yes, a Mac. The story comes from Bloomberg, which has a less-than-stellar reputation for accurate reporting about Apple, but it posits some intriguing possibilities.
As it stands, many iOS apps are universal in that they are optimized for both iPhone and iPad, with their very different display sizes and feature optimizations. That makes perfect sense, as does automatically stripping an installer of unneeded code to keep the download size as small as possible. It also simplifies the development process.
The Bloomberg writer’s theory goes that, if developers can build one version of an app for all three Apple platforms, more software will end up in the Mac App Store, and the selection there could grow accordingly.
But Apple’s own sandboxing restrictions already limit the kind of software available for iOS and macOS. So, for example, Rogue Amoeba’s Audio Hijack, used to capture and mix audio from a number of sources, wouldn’t be approved for Apple’s online software repositories. I should think ways can be found to ensure security in pushing audio from one app to another, but I don’t claim to be a developer.
In any case, the article cites unnamed sources in claiming the existence of a project being called Marzipan. But unnamed sources don’t make it official. It may even be that some developers would simply like something of this sort to happen. Then again, such a move would have little meaning for most people outside the developer community.
Besides, it’s not the same thing as running an iOS app on a Mac. That’s already done in Xcode’s Simulator so mobile apps can be developed, but as a practical matter, many iOS apps are limited-purpose or otherwise restricted compared to their macOS counterparts. Much of this is due to the constrained resources and requirements of Apple’s mobile hardware.
If true, this move has the potential of making it far easier for developers who have produced millions of iOS apps to embrace Apple’s traditional computing platform.
It would also demonstrate Apple’s ongoing commitment to the Mac, something that looked a little questionable in 2016, when only one product, the MacBook, received an update until fall, when the controversial MacBook Pro arrived. And some people weren’t even satisfied with that.
Yet another suggestion is that Marzipan is only the first step towards running macOS on Apple’s custom A-series silicon. Apple is already offering such chips for specialized tasks, such as the MacBook Pro’s Touch Bar and Touch ID implementations, and low-level functions on the iMac Pro.
So far, at least, they aren’t intended to replace Intel’s CPUs. But offloading certain functions to Apple-designed chips may open the way to better performance, and to Macs with features that no PC maker can easily duplicate.
Is that the first step towards a wholesale chip migration? I suppose you can romanticize the idea: today’s A-series CPUs can essentially match, or at least approach, Intel silicon in many performance parameters. Remember, too, that even today’s A11 Bionic chip, used in the iPhone 8 and the iPhone X, is probably not run at full bore because of the resource constraints of mobile hardware.
I don’t think it’ll happen in the foreseeable future, unless Intel falls down bigly in developing new Core chips. It’s not just a matter of performance, or of easing the migration with an Intel emulator that offers decent speed. Such a move would also handicap the ability to run Windows natively on a Mac with Boot Camp, or with really good performance in a virtual machine. Apple would have to develop CPUs with much faster performance than Intel offers to overcome the losses entailed in emulation.
But building a universal or fat binary shouldn’t represent a huge problem, if such a move makes any sense for developers, and, of course, to Apple.