Everything you need to know about HDR and its effect on image quality

Do you know what the HDR standard is? How does it improve image quality? What distinguishes the different versions of the HDR standard? And what features should a good HDR display have? In this article, we answer all of these questions and introduce you to HDR.

HDR, or High Dynamic Range, is one of the features that display manufacturers promote heavily to boost sales of their products. The feature, which applies to both still images and video, aims to make pictures more realistic and bring them closer to what we see with our own eyes in the real world.

The goal of HDR is to widen the gap between the darkest and brightest parts of the image. HDR and contrast are closely related features with the same purpose, and monitors that deliver high contrast are generally also successful at implementing HDR. In this article, we will first explain what HDR means in modern monitors and then look at how to view HDR images at the best possible quality.

What is HDR?

In addition to widening the dynamic range, HDR makes images look more realistic; but how do HDR monitors display such images? The key is covering a wider range of colors and increasing their richness.

Older monitors without HDR technology (such as CRT TVs and older LCD and LED TVs) only support the sRGB color gamut (also known as Rec. 709). This gamut covers a small portion of the visible spectrum, equivalent to roughly 25 to 33% of the colors we can see with our eyes. A display that covers only a quarter to a third of the colors recognizable to the human eye clearly cannot produce near-real images.

The HDR standard greatly reduces the limitations of sRGB and supports many more colors. Many experts in display and video quality believe that HDR content and monitors should at least support the DCI-P3 (Digital Cinema Initiatives P3) color gamut. This gamut covers about 25% more colors than sRGB, producing more vivid colors and more accurate color reproduction. Many movies are mastered in this color gamut.

Many HDR standards and formats are now preparing to support the Rec. 2020 color gamut, the newest gamut to be defined, which covers about 75% of the visible spectrum.

Beyond color: brightness, contrast, and bit depth
Brightness and contrast

Good performance in these three areas is essential for displaying real HDR content at very high quality. Let's start with brightness and contrast. Non-HDR monitors, known as SDR monitors, do not perform well at creating high contrast and reproducing the light and dark areas of an image; the pictures they produce therefore look washed out and flat.

A high contrast ratio allows both the intensity of bright highlights and the details of dark areas to be preserved. On high-contrast displays, the bright parts of the image are rendered distinctly from the dark parts; examples include the glint of light reflecting off a shiny surface or the edges of dark clouds.


Another hallmark of HDR displays is high color bit depth. Panels with high bit depth can produce richer colors. Color bit depth refers to the number of shades each RGB channel can reproduce: 8-bit SDR monitors can only produce 256 shades each of red, green, and blue, while 10-bit and 12-bit displays can create 1024 and 4096 shades of these three primaries, respectively.
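The shade counts above are simple powers of two, 2 raised to the bits per channel; a quick sketch:

```python
# Shades per channel and total displayable colors for common panel bit depths.
for bits in (8, 10, 12):
    shades = 2 ** bits   # shades per primary (red, green, blue)
    total = shades ** 3  # combinations across the three primaries
    print(f"{bits}-bit: {shades} shades/channel, {total:,} total colors")
```

An 8-bit panel therefore tops out at about 16.7 million colors, while a 10-bit panel reaches over a billion.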

To display real HDR images, you either need a 10-bit display or a display that can approximate 10-bit color with techniques such as dithering, which blends dots of the available colors so the eye perceives an intermediate color. In panels with high color depth, the transition between adjacent shades is smoother, which is essential for displaying high-quality, acceptable HDR images. Many monitors that use dithering to create 10-bit color rely on frame rate control (FRC): a rapid, continuous switching between similar shades that makes the viewer perceive a new shade in between. This technique allows an 8-bit panel to produce 10-bit color.
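The FRC idea can be sketched as temporal dithering: the panel alternates between the two nearest 8-bit levels so that, averaged over several frames, the eye perceives an intermediate 10-bit shade. The frame pattern below is illustrative only, not any vendor's actual algorithm:

```python
def frc_frames(target_10bit: int, n_frames: int = 8) -> list[int]:
    """Approximate a 10-bit level (0-1023) on an 8-bit panel by alternating
    between the two nearest 8-bit levels over successive frames."""
    low = target_10bit // 4   # nearest 8-bit level at or below the target
    high = min(low + 1, 255)  # next 8-bit level up, clamped to the panel range
    frac = target_10bit % 4   # how far toward `high`, in quarter steps
    # Show `high` in `frac` out of every 4 frames, `low` otherwise.
    pattern = [high if i < frac else low for i in range(4)]
    return [pattern[i % 4] for i in range(n_frames)]

frames = frc_frames(514)  # 10-bit level 514 sits at 128.5 in 8-bit terms
print(frames, "average:", sum(frames) / len(frames))  # averages to 128.5
```

Viewed at normal frame rates, the eye integrates the alternating levels into a single shade the 8-bit panel cannot show directly.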

8-bit panel compared to 10-bit panel

The HDR standard does not increase the sharpness or clarity of the image; that is determined by the monitor's resolution. The resolution of HDR monitors has increased dramatically over the last decade, enabling them to display images in incredible detail, and when HDR and high resolution are combined, the result is an extremely high-quality, high-resolution image; but the two characteristics are independent. You can enjoy a very good HDR image on a low-resolution display and vice versa; that said, almost all large HDR displays today have very high resolutions, mostly UHD.

Different HDR standard formats

The HDR standard comes in several formats, including HDR10, Dolby Vision, HDR10+, and HLG. These formats differ significantly in some respects. Fortunately, high-end HDR monitors usually support several of them. Below, we briefly introduce each format.


HDR10

HDR10, first introduced in 2015, was developed by the Consumer Technology Association. HDR10 is a completely open standard: any display maker can implement the technology in its products without restriction and create HDR10 content to promote its screens. The number 10 in the name refers to 10-bit color depth.

The HDR10 format also provides metadata to the display, describing the brightness and color levels of the content. In this HDR format, unlike the more advanced formats discussed below, the metadata is static; for example, a minimum and maximum brightness are defined once for a piece of content and apply to the entire video.
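The "static" nature of HDR10 metadata can be pictured as a single record attached to the whole stream. The field names below follow the commonly cited MaxCLL/MaxFALL values and mastering-display luminance; this is a simplified sketch, not a full implementation of the underlying SMPTE ST 2086 specification:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HDR10StaticMetadata:
    """One record for the entire video — HDR10 never varies it per scene."""
    max_cll: int          # Maximum Content Light Level, nits (brightest pixel anywhere)
    max_fall: int         # Maximum Frame-Average Light Level, nits
    mastering_max: int    # mastering display peak luminance, nits
    mastering_min: float  # mastering display minimum luminance, nits

# The same values apply to every scene of the video, bright or dark:
meta = HDR10StaticMetadata(max_cll=1000, max_fall=400,
                           mastering_max=1000, mastering_min=0.005)
print(meta)
```

Dynamic-metadata formats such as Dolby Vision and HDR10+ effectively replace this single record with per-scene or per-frame records.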

HDR10 is the baseline standard for both monitors and content, and almost all HDR monitors, from domestic and foreign brands alike, support it. Other platforms that support this format include UHD Blu-ray players, video streaming services, and even the previous generation of gaming consoles from Sony and Microsoft.

Dolby Vision


Dolby Vision is Dolby's proprietary version of HDR, and it surpasses HDR10 in several areas. First, Dolby Vision supports 12-bit color, unlike HDR10, which tops out at 10-bit. Dolby also requires creators of Dolby Vision content to use more advanced and accurate mastering equipment, and the minimum contrast and brightness Dolby demands are significantly higher than those specified for HDR10. Finally, the metadata in Dolby Vision content, unlike HDR10 metadata, is dynamic rather than static: it allows the contrast and brightness to change for each scene, or even each frame, of the content being displayed.

Early Dolby Vision content was not much different from HDR10, but over time it improved dramatically thanks to various technologies, and the gap between Dolby Vision and HDR10 widened as creators gained experience with the format. Note that display manufacturers who want to include Dolby Vision in their products must pay Dolby a licensing fee.

HDR10+

HDR10+ is a significantly upgraded version of HDR10. Like Dolby Vision, it uses dynamic metadata, and like HDR10, it is an open standard that any display maker can adopt without restriction.

Like HDR10, HDR10+ supports only 10-bit color. Since most monitors today have at most a 10-bit panel, 12-bit support is not yet essential; but the situation may change, and once 12-bit panels appear in displays, Dolby Vision will have a clear advantage over HDR10+.

HDR10+ also offers an adaptive mode. Simply put, on supported monitors it uses sensors to detect the ambient light in the room and then adjusts the image settings accordingly; of course, not all HDR10+ monitors have these sensors, which are found only on high-end HDR10+ models.

Hybrid Log Gamma (HLG)

HLG, or Hybrid Log-Gamma, is a royalty-free HDR standard used mostly for OTA (over-the-air) broadcasting of content, a transmission method with inherent limitations for video.

Unlike the standards introduced so far, HLG does not use metadata to communicate with the display. Interference is more likely during OTA transmission than over the Internet, so metadata could be lost along the way. Instead, the HLG standard achieves HDR by combining a conventional gamma curve with an additional logarithmic curve encoded in the content itself.

The first part, the gamma curve, is recognized by any display that receives the signal, because gamma is the standard used to describe the brightness of SDR content. The logarithmic part, on the other hand, describes brightness levels beyond what SDR displays can show, and that information is read only by HDR-compatible displays. This means HLG is fully backward compatible with SDR displays and eliminates the need to broadcast two separate video streams, which saves bandwidth. HLG is also not an absolute standard; it can adapt to displays with different brightness levels.
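The combination of a gamma-like curve and a logarithmic curve is spelled out in ITU-R BT.2100: below 1/12 of normalized scene light, the HLG transfer function is a square-root (gamma-like) segment, and above it a logarithmic segment takes over. A direct transcription of the BT.2100 HLG OETF:

```python
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # ~0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e in [0, 1] to an HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # square-root (SDR-gamma-like) segment
    return A * math.log(12 * e - B) + C  # logarithmic segment for highlights

print(hlg_oetf(1 / 12))  # 0.5 — the crossover between the two segments
print(hlg_oetf(1.0))     # ~1.0 — peak scene light maps to full signal
```

The lower segment is what legacy SDR displays interpret as ordinary gamma; the logarithmic upper segment carries the extra highlight range that only HDR displays render.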

Of course, the HLG standard has drawbacks that cannot easily be ignored. On older SDR displays that do not support a wide color gamut, colors may lack saturation and appear faded. Brightness on SDR monitors is also lower than usual, because the HLG white point sits below the standard BT.709 white level on most SDR displays. Even so, HLG images still look very good, and the quality of HLG content on an SDR display is remarkably better than that of other HDR content. Viewers who are not particularly sensitive to picture quality will not notice the lower color saturation or brightness of HLG content.

The BBC and NHK are among the leading broadcasters using the HLG standard to deliver HDR content on their television networks, and HLG looks set to become the standard for HDR on broadcast television.

HDR Content Provider Resources: Movies, Online Content, and Games

You might expect to see high-quality images just by buying an HDR monitor. Unfortunately, that is not the case: HDR monitors only excel when displaying HDR content, so to enjoy their true quality you must feed them HDR content. Here are some of the sources and platforms from which you can get it:

Online video streaming services: Netflix, Amazon Prime Video, Hulu, Disney Plus, Apple TV+, and Peacock all offer at least basic HDR10 content, and some of them, such as Apple TV+ and Netflix, also carry Dolby Vision content.

Video games: Sony and Microsoft began supporting the HDR standard a few years ago with their eighth-generation consoles, the PlayStation 4 Pro and the Xbox One X and One S. Some games released for Microsoft's ninth-generation consoles, the Xbox Series X and Series S, also support Dolby Vision. Although many games from major studios support HDR, many others still do not. Unfortunately, older graphics cards do not support the standard; only newer cards can output HDR. Games released for the Nintendo Switch are also not HDR.

Disc-based content: Although online content is easier to access, many people still prefer content published on discs, especially Blu-ray Discs, because of limited access to high-speed Internet and the higher quality of disc content compared to streaming. UHD (Ultra HD) Blu-ray Discs use HDR10 as the default standard, and some titles also support Dolby Vision and HDR10+. Note that to watch HDR content through a Blu-ray player, the HDR standard supported by the TV must match the standard of the disc's content.

Different types of HDR monitors


So far we have covered the HDR standards themselves and how they can significantly improve image quality. Next, we introduce the different types of HDR monitors. As noted, the various HDR standards do not offer the same quality, and some HDR monitors do not support every version of the standard, supporting only one or two.

Cheap HDR monitors lack the wide color gamut needed to display true HDR content. Such monitors accept the HDR signal without any problem, but they cannot display it properly, and in some cases the HDR content ends up looking even worse than SDR content!

If you care about the quality of HDR content, be careful when shopping for cheap monitors, and if you can afford a better, higher-end model, avoid the cheap ones; they are effectively incapable of displaying HDR at real quality. Even low-cost monitors whose manufacturers claim HDR10 support do not support the standard as they should; touting HDR10 on such monitors is a marketing ploy to boost sales, and their "support" often means nothing more than compatibility with 10-bit content.

In principle, a monitor's support for HDR10 should mean it has the features needed to display HDR content at real quality; but the HDR10 implemented in low-end, low-cost displays is not genuine. Advertising HDR support on such TVs gives people a poor impression of the standard and leads them to dismiss it as ineffective, while the stark difference between genuine and fake HDR pushes informed buyers toward high-end displays.

You may now be wondering how to choose a display that delivers real HDR quality. We recommend first checking the specification sheet of the monitor you are considering and making sure it offers all of the following:

Wide color gamut support: The display must cover at least 80% of the DCI-P3 color space. Support for the Rec. 2020 (BT.2020) color gamut is a plus, but not a necessity.

Minimum brightness between 500 and 800 nits: Note that some display manufacturers overstate their products' brightness. The higher the brightness, the better; some high-end displays exceed 1000 or even 2000 nits.
An efficient backlight system: We recommend buying a display with a local dimming backlight or a Mini LED panel, as such displays deliver high contrast. Displays with VA panels sometimes omit local dimming, since the structure of this panel type already produces good contrast on its own.

Support for multiple HDR standards: We recommend high-end HDR TVs so that you can enjoy the different HDR standards described above. Some low-cost HDR monitors support only HDR10, while high-end displays also support Dolby Vision and HLG. Which HDR standard matters most depends largely on the source from which you plan to get HDR content.
Unfortunately, some of these specifications are not easy to verify or understand when buying a monitor. Consumers have been calling for universal, well-defined HDR standards for years, like those created for USB and HDMI. The VESA standards group has already created the DisplayHDR certification, which you can use as a reference.

Introducing the DisplayHDR standard

Although DisplayHDR is not a universal standard, it can help you choose the right HDR monitor. Under this certification, each product is tested separately, and only if the monitor passes the tests can the manufacturer claim compliance. DisplayHDR currently defines eight performance levels, three of which are intended for OLED and micro-LED displays.

The lowest level is DisplayHDR 400. Naturally, displays at this level are not considered high-end. They typically have an 8-bit panel and a maximum brightness of 400 nits, and they do not cover much of the DCI-P3 gamut. Such specifications fall well short of what real HDR requires; the DisplayHDR 400 level exists solely to guarantee image quality better than SDR.

Displays certified at DisplayHDR 500 and above perform much better with HDR content. They cover a wider color gamut, are equipped with local dimming for better contrast, have brightness significantly higher than SDR screens, and use 10-bit panels.

DisplayHDR True Black is the other branch of the DisplayHDR standard. Screens certified at these levels have a brightness between 400 and 600 nits and excel at producing deep, rich blacks and high, effective contrast. OLEDs are among the best displays of this kind: they render true, deep blacks with effectively infinite contrast, display HDR images with stunning quality, and are perfectly suited to HDR gaming. Their brightness, however, does not match LCD displays with Quantum Dot technology. Higher levels of this standard may be introduced in the future as displays with self-illuminating pixels and no backlight mature.

Source: androidauthority


