The other day, I read an article on Daniel Knight’s usually excellent LowEndMac site, in which he interviews musician and artist Scott Hansen, a devoted Mac user who is willing to overlook alleged performance limitations in Mac OS X because he still thinks it’s far, far better.
So what performance limitations are we talking about? Well, according to Knight, Hansen has benchmarked Photoshop on a fully decked-out system running Windows XP and Mac OS X, and found the former measurably faster. Even so, Hansen reports that Mac OS X runs far more reliably.
According to Knight, Hansen’s “conclusions are sure to disappoint some Mac partisans.”
I’m disappointed, all right, but not in the subject’s conclusions, such as they are. I’m more upset by the lack of journalism Knight exhibits in writing this article. Whenever you publish something about benchmarks, it’s always a good idea to be as specific about the methodology as possible, particularly when the test results seem at odds with other published reports.
Take, for example, this one, in which Photoshop CS3 is evaluated in four ways on a new MacBook. Here, Boot Camp and virtual machines are used to run Windows XP. Even under Boot Camp, where the Mac runs Windows as fast as any dedicated PC of comparable specs, Mac OS X runs Photoshop CS3 faster, and by a fair margin at that.
Now of course different benchmarking techniques may yield differing results, so it’s fair to say there may be some trials that give Windows XP the advantage, and I’ll grant that.
Unfortunately, Knight doesn’t tell us anything about Hansen’s tests that so impress him. And before you ask, yes, I sent two emails over the weekend asking for an explanation and further details. One even included a link to the tests that revealed a decidedly different result. There were no responses.
One extremely troubling aspect of this story is the fact that Hansen apparently ran these trials on his own custom-configured, overclocked 4.2GHz quad-core Intel Core 2 Extreme system. In other words, the test computer’s innards weren’t all built by Apple, but were heavily altered in ways that may certainly yield unpredictable results.
In describing the process, Hansen concedes, “I have no idea what I’m talking about,” and I would agree wholeheartedly. Worse, severe overclocking can heavily compromise the longevity of the system, resulting in overheating and shortened component life. That to me is not an optimized Photoshop computer, but a time bomb waiting to explode.
Besides, Apple already has a computer that’s optimized for high-end content creation. It’s called the Mac Pro, and it doesn’t require tricked up processors or other silliness to deliver excellent performance.
Even setting aside the questionable choice of a production system, when you run benchmarks of this sort, you carefully select a test methodology built around heavy-duty operations, say, rendering functions applied to a large photograph. That way, you aren’t dealing with possible differences of tenths of a second that might challenge your stopwatch, or whatever you use to time the runs.
Additionally, it’s a good idea to restart the computers after each trial, to make sure that everything is as pristine as possible. Running other applications at the same time is a no-no. What’s more, timed runs need to be repeated at least two or three times to remove the possibility of anomalous results. You then average the figures to get as accurate a measurement as possible.
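To make the discipline concrete, here’s a minimal sketch of the repeat-and-average routine described above. The `render_task` function is purely a hypothetical stand-in for a heavy Photoshop operation; the point is the timing harness around it, not the task itself.

```python
import statistics
import time

def render_task():
    # Hypothetical stand-in for a heavy image-rendering operation;
    # it just burns CPU so there is something measurable to time.
    total = 0
    for i in range(200_000):
        total += i * i
    return total

def benchmark(task, runs=3):
    """Time `task` over several runs on an otherwise idle machine,
    then average the results to smooth out anomalous trials."""
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        task()
        times.append(time.perf_counter() - start)
    return statistics.mean(times)

avg = benchmark(render_task, runs=3)
print(f"average over 3 runs: {avg:.4f}s")
```

This doesn’t capture the restart-between-trials step, of course, but it shows why averaging matters: a single timed run can be skewed by background activity, while the mean of several runs is far harder to fool.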
Well, at least he seems to have been running Photoshop CS3, to his credit, though it’s not at all clear which version of Mac OS X was used. I’ll grant, however, that Tiger and Leopard would probably yield fairly similar results regardless.
Of course, what I’m writing is just plain common sense. Even then, there are reasons why one might criticize the resulting benchmarks, perhaps suggesting that different combinations of Photoshop filters might favor one platform or the other. I accept that as well.
In fact, all things being equal, I would actually have expected Mac OS X and Windows XP to be fairly close in terms of ultimate performance potential. Windows Vista is another story, and almost every test I’ve seen shows it runs applications slower. No wonder Microsoft is having one hell of a time selling upgrades, and tens of thousands of Windows users are begging them to keep XP alive longer, and not just for a handful of cheap notebooks.
However, I can see where this poorly researched article is going to appeal to Windows diehards who are absolutely begging for evidence that their chosen platform is superior to the Mac. They have certainly lost the argument about Macs costing more when identically equipped, although that fiction is still being repeated to the point of total exhaustion. And, no, I’m not going to argue the point here. I’ve done that so often that I’m tired of the discussion, at least for now.
Now perhaps Knight was looking for a fast headline to generate some hits, though I always considered him to be a pretty careful reporter regardless. I also hope that he and Hansen will explain themselves a little further so we can see just what’s going on here.
Meantime, dear reader, I welcome your benchmarks too, with as much detail as you’re able to provide.