We examine HDR, a technology manufacturers are actively building into modern TVs, and ask which is worth prioritizing: HDR support or higher resolution.
Modern technologies are so firmly embedded in our lives that it is sometimes hard to keep track of their diversity. When choosing a TV, for example, you have to weigh various parameters, some of which can puzzle an unprepared buyer. In this article, we’ll talk about HDR, a technology that has become extremely relevant over the last year and a half.
Until early 2016, HDR was implemented only in a handful of TV models, and compatible content was vanishingly scarce. Fortunately, the situation began to change over the course of the year, with more and more manufacturers equipping their TVs with HDR support. Game console makers built the technology into their updated models, and Sony went even further, adding HDR support to the original PlayStation 4 via a software update.
Most importantly, suitable content that HDR-enabled devices can show off keeps appearing.
What does it look like?
Let’s try to make sense of it, starting with the essence of the technology.
Any TV is characterized by its contrast and its color accuracy. Contrast determines how bright and how dark the device can render colors while keeping them distinguishable to the viewer. Color accuracy, in turn, describes how close the shades displayed on the screen come to the real ones.
Curiously, if offered a choice between a higher-resolution TV and one with lower resolution but higher contrast, most potential buyers will choose the latter. Saturation and variety of colors are the priority when choosing. A brighter picture, in other words, still beats a resolution above 4K.
What’s the point?
HDR (High Dynamic Range), or extended dynamic range, makes the choice described above even more obvious: it renders light colors even lighter and dark colors even darker. HDR widens the color range and raises the maximum contrast, making the image deeper and richer. The standard primaries – red, green, and blue – gain additional shades and combinations, which directly improves the quality of the image.
Hand in hand with HDR goes WCG (Wide Color Gamut), which further expands the available set of colors. Viewers who have never encountered these technologies will be pleasantly surprised by how many more shades the same seemingly familiar colors acquire.
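For readers curious about the mechanics: HDR pipelines commonly encode this wider luminance range with the perceptual quantizer (PQ) curve standardized as SMPTE ST 2084. The Python sketch below (the function name and structure are my own; the constants come from the standard) maps a normalized 0–1 signal value to absolute brightness in nits:

```python
import math

# Constants from the SMPTE ST 2084 (PQ) specification.
M1 = 1305 / 8192      # 0.1593017578125
M2 = 2523 / 32        # 78.84375
C1 = 107 / 128        # 0.8359375
C2 = 2413 / 128       # 18.8515625
C3 = 2392 / 128       # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value (0.0 to 1.0) to luminance in nits."""
    e = signal ** (1 / M2)
    # Clamp at zero so signals at or below the curve's floor stay black.
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)
```

The non-linear shape is the point: it lets a limited number of code values stretch from 0 up to 10,000 nits while keeping dark shades finely graded, which is exactly the "lighter lights, darker darks" effect described above.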
It is important to understand that the HDR technology built into modern TVs and connected devices is very different from what has been present in our smartphone cameras for some time.
HDR in televisions increases contrast and widens the palette of available colors so the screen can show a scene realistically, in natural colors. HDR in cameras, by contrast, merges several exposures of the same scene into one image that combines the most successful elements of every frame taken. The difference between the two HDRs is therefore fundamental.
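As an illustration of the camera-side technique, here is a toy exposure-fusion sketch in Python (the weighting scheme and names are assumptions for illustration, not any vendor's actual algorithm): each pixel of the merged shot is a weighted average across the bracketed shots, trusting values near mid-gray the most.

```python
import math

def fuse_exposures(exposures):
    """Merge bracketed shots pixel-by-pixel, favoring well-exposed values.

    `exposures` is a list of images; each image is a list of grayscale
    pixel values in [0.0, 1.0].
    """
    def weight(v):
        # Gaussian centered on mid-gray: blown-out or crushed values get
        # little weight, well-exposed values dominate the average.
        return math.exp(-((v - 0.5) ** 2) / (2 * 0.2 ** 2))

    fused = []
    for pixels in zip(*exposures):
        total = sum(weight(v) for v in pixels)
        fused.append(sum(weight(v) * v for v in pixels) / total)
    return fused

# Three "shots" of the same 3-pixel scene: under-, normally, and overexposed.
dark   = [0.02, 0.10, 0.30]
normal = [0.20, 0.45, 0.80]
bright = [0.60, 0.90, 0.99]
merged = fuse_exposures([dark, normal, bright])
```

The merged result leans toward whichever frame exposed each region best, which is the "most successful elements of all the frames" idea in miniature.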
How is it implemented?
HDR technology consists of two integral parts: display and content.
The TV is actually the simpler part of the two: it merely needs to be able to illuminate certain areas of the screen far more vividly than a conventional, non-HDR set can.
The content side is much more complicated: the material itself needs HDR support for the extended dynamic range to reach the screen. Most films, including many shot in the last decade, captured enough dynamic range for HDR, and support can be added without any artificial insertions into the original picture. The main obstacle on HDR content’s way to your TV is data transmission itself.
Video created with an extended dynamic range is compressed before being streamed to your TV, computer, or other device. As a result, the user sees, at best, an image that the display tries to reconstruct using its built-in picture-enhancement systems.
Thus, only content from certain sources will be displayed in real HDR, because your TV must receive additional metadata telling it exactly how to display each particular scene. This, of course, also requires the playback device to support the technology.
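To make the metadata idea concrete, here is a hedged Python sketch of HDR10-style static metadata (field and function names are my own; real streams carry SMPTE ST 2086 mastering-display data plus MaxCLL/MaxFALL content light levels, which the player forwards so the TV can tone-map the title to its own panel):

```python
from dataclasses import dataclass

@dataclass
class HDRStaticMetadata:
    """Sketch of HDR10-style static metadata (illustrative field names)."""
    max_mastering_luminance: float   # nits, e.g. 1000.0
    min_mastering_luminance: float   # nits, e.g. 0.0001
    max_cll: int                     # brightest single pixel in the content, nits
    max_fall: int                    # brightest average frame, nits

def needs_tone_mapping(meta: HDRStaticMetadata, panel_peak_nits: float) -> bool:
    """The TV must compress highlights if the content outshines the panel."""
    return meta.max_cll > panel_peak_nits

movie = HDRStaticMetadata(1000.0, 0.0001, 1100, 400)
mid_range_panel_must_remap = needs_tone_mapping(movie, 650.0)
```

Without this metadata the TV can only guess; with it, each scene can be rendered the way the mastering studio intended, within the panel's limits.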
In addition, there are hardware requirements. Your TV and your player or set-top box must have HDMI version 2.0 or higher. Most hardware released from 2015 onward supports HDMI 2.0 and can be upgraded to HDMI 2.0a in software; that latest revision of the standard is what carries the metadata mentioned above.
Manufacturers have also begun awarding the UHD Premium certificate to TVs that combine 4K and HDR support; it is worth looking for when buying. Note, too, that the 4K Blu-ray format supports HDR by default.
Let’s sum it up
Of course, HDR in TVs is not as vital as manufacturers present it, but it is now the main driving force in the industry. The race for resolution above 4K has faded into the background, giving way to extended dynamic range.
While the best result comes from combining the two advanced standards, at this stage it is preferable to choose an HDR-enabled TV if you are not willing to overpay for a resolution above 4K. With the right content, the picture quality will pleasantly surprise you in any case. The eyes cannot be deceived: brighter, richer, and more varied colors win out over an ultra-high-resolution panel.
So, when buying a new TV at the end of 2016, it is advisable to make sure it at least supports HDR; resolution above 4K remains a pleasant addition, but one that strongly inflates the final cost of the purchase without delivering the same emotions.