The Chaos of 4K

3 Jul, 2016 | Tech | Features

As a recent recipient of a 4K monitor, I have taken a sudden in-depth interest in the subject of 4K, or Ultra HD (UHD) to give it its other name. What I found was a confusing mix of terminology and technology.

The Standards

First off, there are two standards that are referred to as 4K.

The most common use of 4K as a term is for UHD (3840x2160), as marketed in TVs, which doubles both the width and the height of HD (1080p, 1920x1080) for four times the pixels, hence the Ultra HD name.

The Digital Cinema Initiatives (DCI) established the 4K standard (4096x2160) for movie projectors, exactly double the width and height of their previous 2K standard (2048x1080).

By rights, UHD is not 4K (at 3,840 pixels wide it falls short of 4,000), but it is commonly referred to as such by manufacturers and retailers.
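
If you want to see the arithmetic, here's a quick Python sketch (the labels and layout are my own) comparing the pixel counts of the three resolutions:

    # Quick pixel-count comparison (plain Python, no libraries needed).
    resolutions = {
        "HD (1080p)": (1920, 1080),
        "UHD":        (3840, 2160),
        "DCI 4K":     (4096, 2160),
    }

    hd_pixels = 1920 * 1080
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name:11} {w}x{h} = {pixels:>9,} pixels "
              f"({pixels / hd_pixels:.2f}x HD)")

Running it shows UHD comes out at 8,294,400 pixels, exactly four times 1080p's 2,073,600, with DCI 4K slightly wider again.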


The Connections

To connect your 4K source to a screen, you're going to need one of these connectors:

Thunderbolt is primarily available on Macs, so will be of limited interest to anyone who doesn't have one. It uses either a Mini DisplayPort or USB-C (for Thunderbolt 3) connector. I suspect devices with this are going to remain pretty niche, especially screens.

USB-C is the latest USB connector, so it will be available on a ton of devices. It supports DisplayPort Alternate Mode over USB-C, which offers up to 8K resolutions at 60Hz with the latest DisplayPort versions. There will also likely be a bunch of converters, as most screens don't have a USB-C input.

DisplayPort is the second most common connector, with version 1.2 or later capable of up to 4K at 60Hz. It's most widely used on monitors, where higher refresh rates are more relevant. Not many TVs will have one of these connections.

The most common connector is HDMI. What they don't tell you is that only version 2.0 or later can drive UHD above 30Hz or DCI 4K above 24Hz. This is probably fine for your TV, but if you're buying a monitor you'll likely need a new graphics card too.
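
If you're curious why the version numbers matter, a rough back-of-the-envelope sketch of the raw pixel data rate (ignoring blanking intervals and encoding overhead, so the real requirement is a bit higher) makes the limits fairly obvious:

    # Raw (uncompressed) pixel data rate: width * height * refresh rate * bits per pixel.
    # This ignores blanking and encoding overhead, so treat it as a lower bound.
    def raw_gbps(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    for hz in (24, 30, 60):
        print(f"UHD (3840x2160) @ {hz}Hz needs at least "
              f"{raw_gbps(3840, 2160, hz):.1f} Gbit/s")

At 60Hz that's roughly 12 Gbit/s of pixel data alone, more than HDMI 1.4's 10.2 Gbit/s of total link bandwidth, but comfortably within HDMI 2.0's 18 Gbit/s or DisplayPort 1.2's 17.28 Gbit/s of usable data rate.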

My Brain Hertz

Aside from the differences in resolution (the number of pixels that fit onto the screen; generally the higher the better), there's also the refresh rate. This is the number of times the screen is updated each second and is what creates the illusion of movement. It's usually expressed in Hz, but sometimes as a frame rate with a 'p' suffix (for progressive scan), such as 24p.

Movies typically run at 24 frames a second (Hz), TV at around 25 (or 30, depending on region), and computer monitors at 60 or more. So a signal or screen that runs at 30Hz would probably be fine for a TV.

A higher refresh rate typically gives a smoother feel, so if you like action movies or sports you may see a benefit. On monitors the default is typically 60Hz, to provide smoother mouse movements and transitions, with variants designed for gaming now offering 120Hz or 144Hz versions.
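
To put those rates in perspective, here's a tiny sketch of the time between frames at the common refresh rates mentioned above:

    # Time between screen updates at common refresh rates (1000 ms divided by Hz).
    for hz in (24, 30, 60, 120, 144):
        print(f"{hz:>3}Hz -> a new frame every {1000 / hz:.1f}ms")

Going from 30Hz to 60Hz halves the wait between frames from about 33ms to 17ms, which is why mouse movement on a 30Hz screen feels noticeably laggy.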

So using a 4K TV as a monitor may be fine for static use (websites and spreadsheets, etc) but won't cut it for gameplay.

To Sum it Up

Hopefully that makes the picture clearer (pun intended). When we talk about 4K and UHD, assuming you're not a movie-maker, we're talking about a resolution with double the width and height of regular HD (1080p), or four times the pixels.

If refresh rate is important to you, you'll need to keep an eye on the connections on both the screen and the device you're using as a source (check the manufacturer's website). But even lowly graphics cards can be made to work in most cases, so don't feel you have to go out and upgrade everything just to support it.

And let's not get started on 5K and 8K just yet.