To be sure, Mac users have had to undergo lots of change in the past twelve years. First there was the transition from 680x0 to PowerPC. It meant that applications had to be recoded to support the new processor technology, and the older stuff, even most of the operating system, ran in emulation. At first, the Power Macs were dog slow, until the software caught up. While some of the updates were free, you felt double-dipped by publishers who charged for PowerPC-native versions of their applications but somehow scrimped on the new features. Was greater performance alone sufficient to make you pay for the privilege?
In 1998, Apple did it again to you with the arrival of the original iMac, where Apple tossed traditional serial ports and SCSI out the window in favor of a new beast, known as USB. You had to look for adapters, or buy new peripherals, and external drives were pathetically slow, at least until FireWire came onboard for later iMacs. Oh yes, you had to buy new drives to support the feature, or some sort of SCSI to FireWire adapter, which sometimes created more problems than it solved.
All right, peripheral connection ports were one thing. But when Apple decided that the aging Mac OS was not long for this world, and went with Unix and Mac OS X, that was something else altogether. Again, developers were forced to rebuild their software, but there was a Classic environment that let you run all the old stuff with decent performance. Consider that you now had what was, in effect, double emulation: one layer to run software that wasn't compiled for PowerPC, and another for software that hadn't been updated for Mac OS X. That it worked at all with reasonable success must have been some sort of miracle.
It didn’t take long for the Mac OS X transition to be complete, but those heady days were downright painful. Performance was dreadful, and lots of stuff just didn’t work. It took a while for Apple to boost performance, both with the operating system and the hardware, and today things are pretty good. But that world-class architecture wasn’t quite as bulletproof as promised, although it was, in large part, way better than the Classic Mac OS, which had been collapsing under its own weight for years.
Just when you thought you could get on with your life, Apple did the unthinkable last year, and that was to impose another processor transition on you. This time, it was the move to half of the infamous Wintel alliance, Intel. After all these years of telling us that the PowerPC was way better, now we were being told that Intel’s roadmap was far superior. Goodbye Freescale. Goodbye IBM. Hello Intel, and to drive the point home in style, Steve Jobs shared the stage with Intel CEO Paul Otellini.
Of course, developers, having endured the transition to PowerPC and then to Mac OS X, were asked to get back to work yet again and make all their applications Universal. Not only that, they had to migrate all their code, which sometimes added up to millions of lines, to Apple’s own development environment. This made their task doubly difficult, which, in part, explains why such companies as Adobe and Microsoft are months and months away from getting the job done.
Once again, Apple delivered an emulation environment, this time for PowerPC software. Classic and 680x0 were buried for good. Imagine what might have happened if you had to emulate an emulator that ran atop another emulator. If that sounds confusing, bear in mind that third parties have been struggling to deliver a Classic replacement, but maybe it’s time to move on.
To be sure, the new MacIntels are, by and large, faster than the models they replace if you use native or Universal software, and the speed hit for the Rosetta emulation environment, while substantial, wasn’t enough to make you feel unproductive. Sure, there were a few teething pains with the new hardware. Some early adopters of the original MacBook Pro complained of excessive noise, excessive heat, or a combination of the two. If things got really bad, Apple would take the unit back and replace the defective parts, but these appear to have been early production glitches of one sort or another. Later models seem all right, and the 17-inch MacBook Pro I set up last week is evidently free of such ills, at least so far.
Now you can look at all these developments in a positive light: Apple made huge changes in its hardware and operating system and managed, somehow, to keep the pain to a minimum. Many of you survived the ordeal and remained Mac users. Yet I have read complaints from time to time that Apple was heaping abuse on its customers by making these changes. While I can understand why some of you might feel that way, each of these moves was made for valid commercial reasons.
Lest we forget, Apple is in business to make money, and to do so it must deliver better value to its customers. When the 680x0 processor topped out, the PowerPC emerged as a superior alternative, with more room to grow. Steve Jobs no doubt looked at USB as a better way to handle printers and other peripheral devices, and it made sense to use a technology that had already been introduced in the PC market.
There can be little argument that Apple’s efforts to build an industrial-strength operating system had failed, after years of trying. Buying NeXT made sense because it delivered what Apple needed to enter the 21st century of personal computing, even though it took a few years to meld the best of NeXT with the Mac OS. Moreover, if you wonder why Apple went with Intel, try to find the promised 3GHz Power Mac G5, or a PowerBook G5. Need I say any more?
In an ideal world, we all wish Apple had not found it necessary to inflict such changes upon us, but each and every one was absolutely essential to ensure the future of the company. Just think of what might have happened to the Mac if Apple had decided not to abandon old technologies and embrace new ones. What kind of computer do you think you’d be using now?