Most TV stores now have Ultra HD (4K) sets for sale. Are the specifications established? Can we buy one without worrying that it won't work properly if the final specifications aren't part of the circuitry in the 4K TVs being sold right now?

– Andy L.
Sacramento, CA


The specifications associated with 4K or Ultra HD televisions fall into two major categories: connection options and image parameters. Both are in varying states of flux, but if you want to buy a 4K TV today, it's unlikely to become obsolete during its lifetime.

In the consumer television world, the only connection option for HD and UHD content is our old friend, HDMI. The latest version, 2.0, was introduced in September 2013, and the newest models from all the major manufacturers now include it. That matters because HDMI 2.0 is the only way to transmit a UHD signal at frame rates greater than 30 Hz, so it's one feature you should insist upon. Pretty much all the first-generation sets have HDMI 1.4, and they cannot be upgraded to 2.0 with a simple firmware flash.
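To see why the HDMI version matters, here's a rough back-of-envelope bandwidth check, sketched in Python. It assumes the 4400×2250 total raster (active pixels plus blanking) from the CTA timing tables and 8-bit RGB; real-world overheads vary, so treat the numbers as illustrative rather than a cable spec:

```python
def tmds_gbps(h_total, v_total, fps, bits_per_channel=8):
    """Total TMDS rate across HDMI's 3 data channels, with 8b/10b encoding."""
    pixel_clock = h_total * v_total * fps                  # Hz, incl. blanking
    return pixel_clock * 3 * bits_per_channel * 10 / 8 / 1e9

# 3840x2160 occupies a 4400x2250 total raster in the CTA timing tables.
uhd_30 = tmds_gbps(4400, 2250, 30)   # ~8.9 Gbps  -> fits HDMI 1.4's 10.2 Gbps
uhd_60 = tmds_gbps(4400, 2250, 60)   # ~17.8 Gbps -> needs HDMI 2.0's 18 Gbps
print(f"UHD@30: {uhd_30:.1f} Gbps, UHD@60: {uhd_60:.1f} Gbps")
```

Double the frame rate and the required bandwidth doubles, which is exactly the gap between HDMI 1.4's 10.2 Gbps ceiling and HDMI 2.0's 18 Gbps.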

Of course this only matters if you plan to send native Ultra HD content to your new TV. Currently, the only way to do this is from a server like Sony’s 4K Media Player or a RedRay player.

Since Sony has its own movie studio, it has taken the opportunity to re-master some of its titles in 4K. The 4K Media Player comes pre-loaded with a few films, and you can download more if you wish.

RedRay is another 4K-capable server solution from Red, maker of the 4K digital cameras many Hollywood movies are shot on today.

The second thing to consider is the image parameters. Right now there are two resolutions commonly called "4K": 3840×2160 and 4096×2160. Fortunately, you don't need to worry about the difference, because consumer TVs are all 3840×2160 and are likely to stay that way for the near future. 4096×2160 is a film-industry standard and is more likely to be found in cameras than in displays.
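The two "4K" formats are closer than the labels suggest. A quick sketch in Python comparing pixel counts and aspect ratios (the labels here are my own shorthand for the consumer UHD and cinema DCI standards):

```python
from fractions import Fraction

# Consumer UHD vs. the DCI cinema "4K" standard
formats = {"UHD (consumer)": (3840, 2160), "DCI 4K (cinema)": (4096, 2160)}

for name, (w, h) in formats.items():
    mp = w * h / 1e6                       # total pixels, in megapixels
    ratio = Fraction(w, h)                 # reduced aspect ratio
    print(f"{name}: {w}x{h} = {mp:.1f} MP, aspect {w/h:.2f}:1 ({ratio})")
```

Both carry roughly 8.3-8.8 megapixels per frame; the real difference is shape, with consumer UHD keeping HD's familiar 16:9 frame while DCI 4K is slightly wider.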

You may have read about a new color standard associated with Ultra HD, Rec.2020. It specifies a significantly larger color gamut than Rec.709, which is part of the HD standard. We haven't seen displays that can cover that gamut yet, but it could happen. Here's the rub, though: unless such a display correctly converts legacy material back to Rec.709, the color of all Blu-ray and DVD content would suddenly look very over-saturated and unnatural. While that might appeal to some viewers, more discriminating consumers like yourself would probably not be fans.

The last part of your question, concerning circuitry, bears looking at. Aside from the HDMI version, there's no reason an Ultra HD display you buy today won't be able to play 4K content created in the future. Even if you wind up with HDMI 1.4, it will still transmit a 4K signal at up to 30 Hz. That is enough for the 24p frame rate of movies on Blu-ray, or even native 4K film content streamed from a server.

My final advice is this: if you need a new television, 4K is certainly a compelling option. I've spent some time with 4K TVs and projectors, and in my opinion even up-scaled content looks better. It's not a night-and-day improvement, but there is a subtle difference. The downside is that you're limited to LCD panels if you want 4K. I'm not quite ready to give up the deep blacks and greater contrast of plasma just yet.

If you are simply considering upgrading an otherwise satisfactory set to get 4K, I would wait and see what happens over the next few years with OLED sets. Not only are they on a development path toward 4K, they can match or exceed the contrast performance of a plasma. And it's a good bet there will never be a 4K plasma: with only two companies still marketing that technology, it seems unlikely to be developed further.