So who did ignite the personal computer revolution? Well, most of you know that Apple was a major force in the industry dating back to the 1970s, yet some want to convey the impression that Microsoft was the real innovator, and that the original IBM PC came first, even though it didn’t appear until the early 1980s.
Well on this week’s episode of The Tech Night Owl LIVE, we returned to “The David Biedny Zone” to set the record straight, as our Special Correspondent presented an insightful look at the real history of personal computers, not the fake version foisted upon us by Microsoft and some unsuspecting tech writers.
Macworld Senior Editor Rob Griffiths was in full rant mode in this episode, as he explained why he doesn’t like the glossy screens on the iMac and the latest Mac notebooks. He also presented his top ten pet peeves about Mac OS X Leopard.
Now, I bet if you have used Leopard for a year, as he has, you have your own expanding list of rants and raves. But I think you’ll agree that Rob was right on track with many of his complaints.
Our second journey through history featured John Larish, author of “Silver to Silicon,” a fascinating voyage to discover the progression of the photographic industry from roll film to all-digital.
You may not recall that Apple was one of the early pioneers in affordable digital photography, with its QuickTake cameras. However, those products were actually built by Kodak. That, and a whole lot more, was discussed by John during this fascinating segment.
On The Paracast this week, the “Culture of Ufology” is explored by veteran UFO author and publisher Tim “Mr. UFO” Beckley, experiencer Jeremy Vaeni, and your outspoken Paracast crew. Do UFO conferences help advance research or are they just entertainment and little more?
Coming November 9: Bill Birnes, from The History Channel’s “UFO Hunters” TV series and “UFO Magazine,” delivers new information about Philip Corso, insights into UFOs from time and space, and revelations about ongoing government conspiracy theories.
Some of the folks who bash Apple, unfortunately, do it based on erroneous assumptions. Take the claim that Macs are overpriced, or that there’s an Apple Tax of some sort that you pay for the privilege of buying a computer that just works. In fact, if you actually configure a Mac and a PC as closely as possible — from hardware to the bundled software, including the equivalent Vista version, which is Ultimate — the prices are really quite close.
But it’s also true that Apple does make its share of mistakes. Sometimes it learns from them, sometimes it doesn’t. One example might be the transition to glossy screens, which has made some of you absolutely scream! Now maybe they’re cheaper, maybe they’re more environmentally friendly. But what about just using some sort of anti-reflective coating? Wouldn’t that answer at least some of the complaints?
I don’t pretend to know the manufacturing obstacles involved. I’m just asking.
But I also think there have been ongoing decisions that, while they may have certain side benefits to Apple in terms of a smooth form factor, do not make sense from the point of practicality. I’m sure there are logical design considerations as to why these things were done, and that some of you will explain them to me. In the meantime, let me tell you what personally upsets me.
Take the Eject button on a Mac’s optical drive. No, not the dedicated one on your Apple keyboard, or the alternative on a third-party product. No, my friends, I’m talking about the one you will no longer find on your Mac. All right, maybe that little “bump” won’t look so pretty on an Apple notebook, an iMac or a Mac mini. But on the Mac Pro? Does it really matter?
Besides, I’m sure Jonathan Ive and the rest of Apple’s positively brilliant designers can conceive of a recessed button that will get the job done and not seriously detract from your Mac’s looks.
Under normal circumstances, of course, the keyboard shortcut, or holding down the mouse button at startup, will get the job done. But sometimes the action fails, because your input device is broken, or because third-party input drivers (usually for wireless devices) load too late during startup. I see situations where it would be more convenient to have an Eject button on the drive itself. Apple doesn’t.
You know that the new MacBook Pro is fitted with a single FireWire 800 port. That’s the fastest FireWire standard available now, though a faster one is in development. You can use your FireWire 400 devices, such as a camcorder, with an adapter. But that adapter comes at an extra cost, roughly $15 or thereabouts, and having a slower device on the same port also slows down a faster one.
Now paying another $15 for an adapter plug is no big deal after you’ve spent upwards of $2,000 on a new notebook computer, but why should it be necessary? Does Apple truly save that much money by omitting a second port for the slower FireWire protocol?
That takes us, of course, to the loss of FireWire on the regular MacBook. Steve Jobs feels that it shouldn’t matter, since today’s camcorders mostly support USB 2.0. But if you have an older camcorder, what do you do then, other than buy a MacBook Pro, or the white MacBook? Is this really a marketing decision to differentiate the regular model from the Pro alternative? The MacBook Pro does offer a bigger screen, an ExpressCard/34 slot and twin graphics chips. Isn’t that sufficient?
To be fair, Apple did listen to demands to make it easier to replace hard drives on the MacBook Pro. That’s certainly a positive development. At the same time, what about the iMac? Will the expected refreshed version make the process easier? Certainly the iMac’s larger enclosure offers more working area to make that parts swap a simple process.
That, of course, takes us to the poor, neglected Mac mini. Ever since I made my public plea for Apple to upgrade the mini and address some of its design shortcomings, a number of you wrote that this diminutive Mac remains popular in education, as an entry-level Web server and in other market segments that, one hopes, give Apple sufficient reason to keep it in production.
For the next version, if there is one, of course, I would hope Apple will seriously consider making it simple to snap off the bottom cover, so you can easily get to the RAM and hard drive.
Remember that when Apple makes it hard for you to do routine parts replacements, it may help dealers by giving them more service-related income. But the more complicated the installation process, the easier it is to make a mistake, and service people, even though they may be highly skilled and carefully trained, aren’t perfect. Do you think they enjoy picking apart a Mac mini for an upgrade or repair?
That takes us to the last entrant in my little rant list, and that’s the iPhone. Apple’s various firmware updates have pretty much eliminated most of the connection issues I’ve encountered. Part of that, of course, might be the result of AT&T improving the quality of its 3G network in my area, and I hope it’s better in your city too.
But for a smartphone that aspires to become an appliance for the enterprise, Apple has to rethink the battery replacement scheme. Forget about a dead battery. Consider a spent one, where you only have minutes left and you’re not near a power outlet (home, office or motor vehicle). What do you do then? You can’t very well remove the battery and install a spare, as you can even with the free wireless handsets your mobile carrier offers. Certainly there are expansion batteries you can attach to the iPhone, and there are good ones, although they make for a thicker or longer object for you to lug around.
For the next iPhone, Apple ought to seriously consider compromising for once on such matters. I’m sure there’s a way to craft a removable battery cover without seriously detracting from the iPhone’s looks. Apple has the best and the brightest, and I just know they can come up with a credible solution.
Beyond this tiny list, I’m sure you readers can double and triple the size without breaking a sweat. Whether Apple will listen is another thing entirely.
The conventional wisdom, such as it is, has it that the prolonged battle for supremacy between high definition DVD formats took a far greater toll than the consumer electronics makers and movie studios expected. Even though Blu-ray emerged the victor early this year in the battle to the death against HD DVD, the victory came too late to save the format for the long haul.
What that means, if you believe what they tell you, is that online downloads of HD content will quickly replace DVDs even before the format gains genuine traction. Already you are hearing dire predictions that holiday sales of Blu-ray players will be pitiful, even though pricing will begin at around $150.
Now I suppose any consumer electronics gadget will have difficulty, considering the shaky state of the world economy. It’s really hard to find any optimistic expectations from any company, though it does seem that Apple is trying real hard to maintain a stiff upper lip in light of diminished expectations for what would normally be a blow-out quarter.
At $150, though, why would Blu-ray fail? Well, the problem is that you can get a perfectly good up-converting DVD player for $100 less, and the argument for paying more money is one that’s very difficult to make. Sure, Blu-ray has genuine 1080p picture resolution and a growing number of even lower-priced flat panel TVs offer the capability of reproducing all of it on their large screens.
However, when you watch the pseudo high definition picture from an upconverting player at a normal viewing distance, it comes really close to Blu-ray. Sure, you may not get some of the fancy extras, but I think most people don’t really care. They just want a clear picture with great sound, and both types of player are capable of delivering the goods.
Did I say “really close”? Well, an upconverting DVD player uses digital tricks to make a standard definition picture look sharper. If you look carefully and are situated a couple of feet from your TV, you can probably see more digital artifacts on the screen, particularly when you compare it to the pristine image from a typical Blu-ray deck.
But most of you don’t watch TVs that close. I know I don’t. Just the other day, in fact, I saw the aging Harrison Ford demonstrate he hasn’t lost his action movie chops in “Indiana Jones and the Kingdom of the Crystal Skull.” I bought the two-disc special edition on Blu-ray, and it’s clear they did a wonderful transfer.
That evening, the Steinberg family viewed an episode from a DVD collection of the second season of Fox TV’s “House,” one of our network favorites. On our Panasonic DMP-BD30 Blu-ray player, the upconverted picture looked simply marvelous when viewed from our standard vantage point of about ten feet or so.
I would be hard-pressed to tell the difference between the faux high definition and the genuine article, although I suppose I might be able to succeed if I bothered to compare the two DVDs in rapid succession.
But does that difference make a difference? Sure, when DVD replaced VHS, you could see a vast improvement, and that’s a key reason why the former succeeded so quickly. However, unless the consumer electronics industry can knock the price of Blu-ray players down to below $100 really soon — and make the prices of Blu-ray content competitive with standard definition DVDs rather than significantly higher — I can see where Blu-ray may be one of those better ideas that ultimately failed in the marketplace.
And that would be a tragedy, even though there’s a tremendous chance it’s going to happen unless the industry acts fast.
THE FINAL WORD
The Tech Night Owl Newsletter is a weekly information service of Making The Impossible, Inc.
Publisher/Editor: Gene Steinberg
Managing Editor: Grayson Steinberg
Marketing and Public Relations: Barbara Kaplan
Worldwide Licensing and Marketing: Sharon Jarvis