Can you believe it? iTunes is 15 years old, and how time has flown. But have those years been pleasant ones for the tens of millions of iTunes users, or a growing source of trouble?
iTunes began as a jukebox app, debuting months after Apple bought its predecessor, SoundJam, from Casady & Greene in 2000. It let you organize your music library, rip and burn CDs, and manage files downloaded from other sources. Some of those sources weren't exactly legal, but Apple had a solution for that too.
That takes us to the iPod, which at first let you store 1,000 songs in your pocket. Seeing the music industry struggling to cope with illegal downloads, Apple managed to convince the labels to agree to a workable scheme for legal downloads, and the result was the iTunes Store. You could buy individual tracks for 99 cents, or full albums for prices that often averaged $9.99. You could argue the wisdom of selling individual songs, particularly from concept albums that depended on you listening from beginning to end.
Regardless, iTunes began to expand in a big way, particularly after the Windows version, with a mostly identical look and feel, was released in 2003. Over time, the feature set grew. Today iTunes is a one-stop shopping center for books, music, movies and other digital content. Indeed, it seems geared more toward marketing and selling stuff (or renting movies) than toward organizing your library. Users complain of a cumbersome, non-intuitive interface, mishandled metadata and the occasional loss or garbling of playlists.
The arrival of Apple Music last summer only made things worse.
One suggestion is to break it all up into smaller apps, in the tradition of the App Store on the Mac. That's already done in part on iOS, where monolithic apps are the exception rather than the rule. But if one app's interface can be confusing, what about having to use several apps to buy and play different types of digital content? What about having to update several apps rather than one when bug fixes or new versions arrive?
There are no doubt things Apple could do to make iTunes more intuitive to use, more user-centric. As the app enters its 16th year, maybe something good is on the drawing board. Or will marketing intervene again and make iTunes more confusing than ever?
In any case, on this weekend's episode of The Tech Night Owl LIVE, we observed the 15th anniversary of Apple's iTunes with commentator Kirk McElhearn, also known as Macworld's "iTunes Guy." He focused first on the history of Apple's original jukebox app, and how it's changed and expanded over the years. Kirk and Gene also talked about the need for Apple to overhaul the way movie and TV content is handled. As it is, movie rentals may suddenly disappear, while the option to buy a movie remains. Could Apple solve this by launching a TV subscription service? And will the entertainment companies accept a simpler buying and renting scheme?
You also heard from columnist Nancy Gravley, of The Mac Observer. The discussion was centered on her ongoing efforts to help senior citizens become competent with Macs, iPhones and iPads. So if you have someone in your family who is uncomfortable with technology, you’ll want to hear Nancy give you some quick pointers about what’s confusing about even the personal computer “for the rest of us,” the personal computer that is supposed to “just work.”
On this week’s episode of our other radio show, The Paracast: When it comes to UFO researchers, Chris Rutkowski is a class act. He returns to The Paracast for a 2015 UFO sighting update, his reaction to Hillary Clinton’s promise to look into UFOs, the “new” Ufology and other hot topics in the field. We also focus on pop culture and sci-fi. Says his bio: “Chris Rutkowski, B.Sc., Med, is a Canadian science writer and educator, with a background in astronomy but with a passion for teaching science concepts to children and adults. Since the mid-1970s, he also has been studying reports of UFOs and writing about his investigations and research.”
This weekend, while doing the family grocery shopping at a nearby Walmart Supercenter, I happened to notice racks and racks of Ultra HD/4K TV sets. A large number were on sale for less than $1,000, and a few sets with displays of 48 to 50 inches or more were selling for less than $500.
You might think that some of these sets represented unsold inventory from the holidays that Walmart wanted to move quickly. That's no doubt partly true. But I saw some of the very same sets, with similar prices, in December. So clearly the extra cost of building TV sets with four times as many pixels isn't terribly high. Or perhaps TV makers don't mind selling them cheap to boost sales and worrying about profits later. That's reminiscent of the PC and smartphone industries.
TV sales have been stagnant or declining in the last couple of years, and manufacturers continue to hope that the move to 4K will be the “next big thing” and result in a huge upgrade cycle that the advent of 3D failed to deliver.
Do you remember 3D TV?
After James Cameron’s “Avatar” became a huge hit in 2009, partly as the result of offering a superior 3D experience at the local movieplex, TV makers devised relatively affordable ways to bring that technology into your home. At first, 3D sets were expensive, but prices soon came down to affordable levels, with some models costing hardly more than regular sets.
While 3D makes for a wonderful shared experience in a large movie theater, it's not so wonderful at home. For one thing, colors are muted, and the viewing area is narrower than the already narrow range of the typical LCD TV. Then you have to put on those specs. While passive 3D sets worked with the same glasses served up at the movie theater, and TV makers often included several pairs, active 3D glasses could be costly. A large family might find people fighting amongst themselves, or doing without, if there weren't enough pairs to let everyone share the experience.
And you have to pay a little extra for Blu-ray 3D discs. Even for the relatively few titles offered in 3D, was it worth it? Evidently not. Sales remained in the doldrums. Some TV makers are ditching the feature; VIZIO's newest lineup doesn't include 3D at all, though it does offer 4K on all models save the entry-level E-Series.
Unfortunately 4K is no magic bullet, at least not yet.
While more and more affordable sets come with 4K, that's only half of the picture, if you'll forgive the pun. First and foremost, you need to sit close to the set or have a very large screen to see the resolution difference. Figure no farther than eight feet with a 55-inch set. Otherwise it looks no different from regular 1080p HD.
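That eight-foot figure can be sanity-checked with a back-of-envelope calculation, assuming 20/20 vision resolves detail down to roughly one arcminute: once you sit far enough away that a 1080p panel's pixels fall below that threshold, the extra pixels of 4K stop being visible. Here is a minimal sketch (the function name and the 16:9 screen assumption are mine, not an industry formula):

```python
import math

def max_benefit_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    """Farthest viewing distance (feet) at which individual pixels are
    still resolvable by 20/20 vision (~1 arcminute of acuity).
    Assumes a 16:9 screen; this is a rough model, not a standard."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # screen width from diagonal
    pixel_pitch_in = width_in / horizontal_px          # size of one pixel
    one_arcmin_rad = math.radians(1 / 60)              # acuity limit in radians
    return pixel_pitch_in / one_arcmin_rad / 12        # inches -> feet

# Beyond this distance a 55-inch 1080p panel already looks "pixel-perfect",
# so 4K's extra pixels stop being visible:
print(round(max_benefit_distance_ft(55, 1920), 1))  # ≈ 7.2 ft
```

The result, a bit over seven feet for a 55-inch 1080p panel, lines up with the eight-foot rule of thumb above.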
That’s one factor. The other is the scarcity of software. Very few movies and TV shows come in 4K. Sports would really benefit, at least if your set's screen is large enough to show a visible difference. Such streaming services as Amazon Instant Video and Netflix offer only a relative handful of Ultra HD titles. Try finding any at your cable or satellite provider. Still, support is growing, and the Ultra HD Blu-ray format arrives later this year.
So is it all worth it? Maybe not now, and not if you'll never sit close enough, or have a large enough display, to see a resolution advantage. But there's one more feature of the Ultra HD format that promises a superior viewing experience for most viewers, provided both the TV and the content support HDR.
Enter Ultra HD Premium.
This is a huge buzzword for a subset of the 4K standard that covers sets with enhanced color and dynamic range. Colors will pop, and the wider dynamic range will mean brighter pictures. Add it all together and you'll have a more immersive viewing experience even where the resolution advantage is not otherwise visible.
Right now, Ultra HD Premium will be confined to more expensive models, but it’s inevitable that HDR support will appear in cheaper sets come this fall or winter. That’s one important part of the equation. The other, of course, is having content with HDR support, so you can actually see the improved picture. It’s the chicken and egg syndrome that never quite worked out with 3D.
Certainly the TV makers are eager to boost sales, and they hope you'll be sufficiently impressed with Ultra HD, the Premium variant, and the promise of a much better picture to buy a new set. That may not be so easy. Existing sets can last five or ten years before they need major repairs, and maybe not even then.
So the 4K difference is not as great as the move from standard definition to HD, which also brought widescreen. This time you need 4K, HDR, and lots of content to exploit the new features. That will begin to happen this year, but it may take a couple more years before it filters down to the very cheapest sets. Meantime, if you need a new TV today and want to future-proof your purchase, there's probably nothing wrong with choosing 4K, though I'd suggest you look at models that bear the Ultra HD Premium label.
Assuming you can afford the extra price, of course.
Otherwise, you shouldn’t feel in any rush to embrace new technology. Even if you’re several years away from putting that old set out to pasture, or passing it off to another family member or friend, there are apt to be even more compelling technologies available. If you need a set now, great 1080p sets are cheaper than ever, and the 4K advantage may not mean a lot to you.
What about the potential for cheap OLED sets, with amazing blacks and infinite viewing angles? And I haven’t begun to consider 8K, which is already being tested and exhibited in prototype form.
THE FINAL WORD
The Tech Night Owl Newsletter is a weekly information service of Making The Impossible, Inc.
Publisher/Editor: Gene Steinberg
Managing Editor: Grayson Steinberg
Marketing and Public Relations: Barbara Kaplan
Sales and Marketing: Andy Schopick
Worldwide Licensing: Sharon Jarvis