Up till now, Mac users have been spoiled. Each and every release of Mac OS X has been demonstrably faster than its predecessor, whereas with Microsoft Windows it’s usually the reverse.
However, Leopard includes a huge number of under-the-hood changes that tax the graphics processors fiercely, particularly on slower Macs. On the other hand, with less work for the CPU to handle, shouldn’t that signal speedier performance for most of you?
Well, maybe. But recent benchmarks from Bill Fox’s Macs Only site show a troubling trend, one that I hope doesn’t portend an unfortunate change in Apple’s direction. Before I give you my two cents about his results, though, let me tell you that Bill is extremely careful about his work, and he doesn’t take the task of measuring the speed of software and hardware casually. Everything is checked multiple times, with restarts between each set of tasks, to ensure the most accurate results. He also uses different sets of hardware, so you can get a fair basis on which to make a decision.
So what do we see from his tests?
Well, for one thing, it may take almost twice as long for a Mac to restart with Leopard installed as with Tiger. But it’s not an across-the-board situation. On his 15-inch 2.4GHz MacBook Pro, boot times increased from 31 seconds to 56 seconds. That’s one huge slowdown, and one you’d observe without the need to resort to a stopwatch. On the other hand, the startup times of his 24-inch 2.8GHz iMac were unchanged: 25 seconds in both cases.
This is damned peculiar, because the MacBook Pro and iMac share lots of hardware, except for the slower hard drive on a notebook. The slightly faster build-to-order processor configuration on the iMac shouldn’t account for an improvement of more than a few seconds. So I’m at a loss to guess why there’s a performance disparity here, unless it’s solely drive related.
When it comes to so-called user interface graphics, all that eye candy takes a huge toll on both models. Quartz graphics figures are improved somewhat with Leopard, but OpenGL, which is heavily used to generate all those special effects, has worsened.
Now, to be fair, it’s quite possible there are shortcomings in the canned benchmark applications, such as Xbench 1.3, which contribute to the results, and that they don’t reflect use in the real world. I know that, in my own subjective experience, only the slower startup times seem to be noticeable. Leopard’s interface seems perceptibly speedier in my personal equipment lineup, which includes a first-generation 17-inch MacBook Pro and a Power Mac G5 Quad.
So where does the fault lie? Is it Xbench 1.3, which hasn’t been updated since the middle of 2006? Or is it that the graphics drivers in Leopard aren’t fully optimized yet, and future maintenance updates will deliver a noticeable improvement?
To be sure, that’s happened in the past, as Mac OS X releases have grown faster between the major upgrades due to Apple’s constant stream of maintenance updates.
I don’t pretend to have the answers. Besides, none of those benchmarks should really impact your productivity, which you can’t say about Windows Vista, where molasses is the name of the game even on a speedy PC.
What does concern me more, however, is the continued variance in Leopard user experiences. As careful as Bill Fox is in maintaining his Macs, he’s seen a higher number of system crashes since he updated to Leopard, although the 10.5.1 update has improved things noticeably. And, by the way, if you want to hear more details about his Mac OS X benchmarking, you’ll want to listen to his guest shot this week on The Tech Night Owl LIVE.
What’s more, other benchmarks, using different criteria, show decidedly different results. Take the recent comparisons conducted over at Rob-Art Morgan’s Bare Feats site, where Leopard was, in many respects, faster than Tiger with a suite of high-power applications, where the performance improvements would be most noticeable. Indeed, in situations where Leopard came out second-best, the results were so close that I doubt most of you would notice the difference without resorting to some instrumentation. In other words, they aren’t significant.
Now when it comes to which benchmarks rate as the most important in your day-to-day use of Leopard, I would suggest that Rob’s tests are far more apt to reflect your productivity. It really doesn’t matter if some interface widget displays a fraction of a second faster or slower, unless the difference is quite significant, and it’s not.
More to the point, as more and more Mac applications receive Leopard-related updates, it’s quite possible developers will harness the updated tools in Apple’s Xcode to glean enhancements in application multithreading and other features that hold the promise of greater efficiency.
For me, it still seems that everything I do over the course of my workday is, subjectively at least, snappier in Leopard. Perhaps Apple concentrated on certain parameters that affect perceived performance, even if a stopwatch shows a contrary result. I don’t claim to know the reason, but I will watch developments carefully as Leopard gets maintenance updates and third-party applications become more compatible.
And, of course, I welcome your comments and your own experiences. Feel free to run your personal benchmarks and let us know the particulars here.