One thing is sure: when a rival company brings out something new, critics make Apple look second best unless it duplicates the feature. No matter how often responsible tech pundits correct the record, the claim keeps being repeated.
A typical example is the claim that Apple lost because it did not originate something. From digital music players and smartphones to LTE support on the iPhone, Apple allegedly took its sweet time. But when its solution arrived, it usually worked without the downsides.
So before the iPod appeared in 2001, music players were little more than clumsy successors to the original Sony Walkman. They were about grabbing digital files from your computer, but no matter how quickly those files were transferred, the user interface was an afterthought. In those days, I reviewed such gear for ZDNet, sister site to CNET. Every single product I received was virtually unusable, so after the review was done, I sent it back and never thought about it again.
Not that the iPod was perfect. Those teeny tiny hard drives were fragile, and I bet there are millions of people out there with dead iPods, or iPods in which the drive was replaced. Apple waited until flash memory capacity was sufficient and inexpensive enough to switch, although an iPod Classic, with a regular drive, stayed in production for several years.
The iPhone success story? Well, before Apple introduced its game-changer in 2007, many smartphones were derived, more or less, from the BlackBerry, with clumsy physical keyboards. I realize some of you became quite nimble on them, and it’s also true that there are still BlackBerrys with the traditional keyboards on sale. Only they run Android these days. Even though Android handsets are more plentiful than iPhones, no single model outsells Apple’s.
LTE? Well, it’s simple. The first chips supporting the faster Internet speeds weren’t power efficient, and adding them would have severely reduced battery life. Apple waited for new generations of cellular modems that wouldn’t make the batteries die more quickly, and then joined the LTE revolution. But for most people, the higher broadband speeds offer only a modest advantage.
Now on this week’s episode of The Tech Night Owl LIVE, we featured commentator/podcaster Jeff Gamet, Managing Editor for The Mac Observer. Gene and Jeff did a pop culture segment, anticipating the movie version of “Wonder Woman,” the introduction of General Zod on the “Supergirl” TV show, and other movie and TV-related topics. The tech segment covered expectations for Mac notebook upgrades at the 2017 WWDC in June, whether actor Jeff Goldblum might have become the voice of Siri, the Microsoft Surface Laptop, and whether you can trust the cloud.
You also heard from ethical hacker Dr. Timothy Summers, President of Summers &amp; Company, a cyber strategy and organizational design consulting firm. Tim delivered a comprehensive look at the recent WannaCry ransomware attack that hit hundreds of thousands of institutions and businesses around the world running unpatched versions of Windows. The attack exploited a Windows flaw that Microsoft had already patched. You also learned more about the ongoing prospects of bitcoin, the controversial digital currency that is still regarded as a viable alternative payment method by some. The ransomware attack demanded bitcoin payments to unlock compromised PCs.
On this week’s episode of our other radio show, The Paracast: Gene and guest co-host Randall Murphy present a return visit from Walter Bosley. He’s an author, blogger, former AFOSI agent and a former FBI counterintelligence specialist. He has researched mass shootings, breakaway civilizations, lost civilizations and more. On this episode, Walter will discuss one of his books, “Shimmering Light: Lost In An MKULTRA House of Anu.” You’ll learn about his father’s bizarre story involving Roswell and a 1958 UFO retrieval operation in Arizona — and the curious role Operation Paperclip and the subsequent CIA MKULTRA mind control program may have played behind the scenes. Walter will also cover some of the mysteries of Antarctica.
You may not recall this, but HDTV was actually demonstrated in the U.S. in the late 1980s. After the standard became official, it took a while for broadcast stations to begin to adopt the technology. The first was WRAL-TV, a CBS affiliate in Raleigh, North Carolina, which began transmitting digital HD on July 23, 1996. But it took until November 1998 for HDTV sets to go on sale.
It must have seemed strange for a TV station to be offering a technology that benefited nobody, except manufacturers and professionals, for 28 months. The original HD sets used CRTs and were very expensive. Over the next decade, TV sets offering 720p and, later, 1080i and 1080p resolution blanketed the country, getting cheaper and cheaper until you could buy a decent set with a huge flat screen for only a few hundred dollars.
Once HDTV was ever-present in people’s homes, and many people had more than one set with high-definition capability, manufacturers had to find ways to persuade you to buy new sets. But a well-designed TV can easily survive for eight or 10 years before requiring major repairs, meaning a long replacement cycle. A standard definition CRT set that I bought around 1994 lasted 20 years before it was put out to pasture.
In addition to racing to the bottom with cheaper and cheaper sets, emulating what happened in the world of Windows PCs, manufacturers added extra features, such as full-array backlighting and digital enhancements, to improve picture quality. By and large, the differences were minor in the scheme of things unless you looked very closely.
That takes us to 4K, which offers a resolution of 3840 × 2160. 4K — also known as Ultra HD or UHD — was invented in 2003. The first 4K camera was introduced that year, but it took a number of years before the technology filtered down to commercially available TV sets.
As with flat screens and HDTV, early 4K gear was expensive, with prices above $5,000 commonplace, but you can bet that the TV manufacturers worked hard to make the technology affordable. These days, 4K sets aren’t much more expensive than decent quality HDTV gear of just a few years ago. Indeed, when you buy a new set, you’re very likely to choose 4K unless you want something real cheap.
But 4K has some problems.
So with HDTV, the difference between high-definition and standard-definition is crystal clear. Of course, that assumes you’re watching HD fare. If you’re watching a regular DVD, or a cable channel that isn’t HD, the content will be scaled up, but it’ll still look inferior to the real thing.
However, you may have spent a bundle for 4K, but when you bring your spanking new TV home, you might want to spank it. Or perhaps you’ll feel cheated, because you may not see a difference. The reason is similar to how Apple’s Retina displays work. Resolution is increased to the point where you won’t see the pixels at a normal viewing distance. This is why some Android smartphones, despite offering more pixels, don’t really look so different from iPhones. You can’t see the extra pixels, but the handsets need more powerful graphics to push all those useless extra pixels.
So if you watch a relatively small 4K set, say below 55 inches, at a normal viewing distance, it won’t look any better than HDTV. To see the advantage, you have to sit closer or get a larger screen. This shortcoming may help the industry move bigger displays, and thus earn higher profits.
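If you want to put rough numbers on this, a common rule of thumb is that an eye with 20/20 vision resolves about one arcminute of detail. The little sketch below uses that assumption (the function name and the 55-inch example are my own, purely for illustration) to estimate the farthest distance at which individual pixels are still distinguishable; beyond it, extra resolution is wasted.

```python
import math

def max_useful_distance_inches(diagonal_in, horiz_px, aspect=16 / 9):
    """Farthest viewing distance (inches) at which an eye with ~1 arcminute
    of acuity can still resolve individual pixels. Beyond this distance,
    additional resolution is invisible."""
    # Screen width derived from the diagonal and aspect ratio
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / horiz_px               # physical pixel pitch
    arcmin = math.radians(1 / 60)                # one arcminute in radians
    return pixel_in / math.tan(arcmin)

# A hypothetical 55-inch set: 1080p vs. 4K
print(round(max_useful_distance_inches(55, 1920)))  # ~86 inches (about 7 feet)
print(round(max_useful_distance_inches(55, 3840)))  # ~43 inches (under 4 feet)
```

By this estimate, a 55-inch 4K screen only shows its extra detail if you sit closer than roughly four feet, which squares with the point above: at typical couch distances, the 1080p and 4K pictures look the same.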
In order to drive home the 4K advantage, more and more sets support high dynamic range, or HDR, along with wider color gamuts. There are two main standards, HDR10 and Dolby Vision, and to make matters more confusing, some sets offer one but not the other. Samsung has a supposedly enhanced variation, HDR10+. An Ultra HD Premium label on a set is supposed to ensure that it properly supports the various HDR standards.
As with the latest iPhone, the 9.7-inch iPad Pro, and recent iMacs and MacBook Pros, a wider color gamut means richer, more lifelike color reproduction. It means that a 4K set can offer a visually improved picture even if the higher resolution advantage isn’t readily visible in your home.
Then there’s the software. It took a while before most TV channels were offered in HD. The arrival of Blu-ray and streaming video made high-definition fare commonplace. There’s not a whole lot of standard-definition left, although TV stations and cable and satellite systems still offer lower resolution content. Curiously, you may have to pay the cable and satellite companies a little extra for HD even though it took over years ago. Yes, it’s profiteering, pure and simple.
Now when it comes to 4K fare, there’s not much. TV stations are experimenting with Ultra HD broadcast transmissions, and sets will require a digital tuner that supports the forthcoming ATSC 3.0 standard. Meantime, such streaming services as Amazon Instant Video and Netflix offer some programming in 4K, but you’ll need a pretty fast broadband connection to reliably receive such content. A consistent connection speed of over 25 megabits per second is a given, and probably a lot more if you have an active family viewing content at the same time. Otherwise you’ll have to shut everybody else down to reduce buffering.
The first Ultra HD Blu-ray players are on sale, but you have to check the specs carefully. Many players merely upscale HD content to 4K resolution; they don’t offer the real thing. Even when you buy a new player from Samsung and other companies, with prices starting at around $200, there aren’t a whole lot of 4K Ultra HD discs to be had. When they are available, they are normally priced somewhat higher than regular Blu-rays and come on a separate disc. The packages I’ve seen also include a standard Blu-ray disc, so you can future-proof your library if you like.
Does 4K have any downsides, other than not being visible on smaller sets unless you look real close? Well, there’s the upscaling. When you watch HDTV content, it is scaled up to 4K; in other words, the extra pixels are interpolated, and the picture looks a little bit better. But the conversion may also add picture noise on some sets.
But what about standard-definition, which also has to be upscaled to 4K? Well, according to a USA Today article from my friend and colleague Rob Pegoraro — a regular guest on The Tech Night Owl LIVE — those old DVDs may “look awful.” He offers ways to configure the set to reduce the pixelation, at the cost of turning off key image processing features. Many sets let you customize the picture and save it as a preset, so if the changes are saved as an alternate setting, you may be able to switch back and forth with a quick visit to the settings menu.
The movie companies would rather you buy a Blu-ray or Ultra HD Blu-ray version of your favorite movies instead. The entertainment industry is happy to resell content as newer standards appear, just as they did when DVDs took over from videotape.
What this all means is that, assuming my aging VIZIO TV holds up, I don’t see any compelling reason to buy a 4K set, even if I had the extra cash for one. Besides, the industry is already working on 8K; the digital cameras used for some of your favorite blockbusters are already shooting in 8K.
THE FINAL WORD
The Tech Night Owl Newsletter is a weekly information service of Making The Impossible, Inc.
Publisher/Editor: Gene Steinberg
Managing Editor: Grayson Steinberg
Marketing and Public Relations: Barbara Kaplan
Sales and Marketing: Andy Schopick
Worldwide Licensing: Sharon Jarvis