What Is HDR? HDR10 vs HDR10+ vs Dolby Vision Explained
A thorough guide to High Dynamic Range display technology, the standards that power it, and how to get the most out of HDR on your screen.
Last updated: February 2026
High Dynamic Range, or HDR, is a display technology that expands the range of brightness and color a screen can reproduce. The term "dynamic range" refers to the gap between the darkest black and the brightest white a display can show at the same time. In a traditional Standard Dynamic Range (SDR) image, that range is relatively narrow. Highlights clip to a flat white, shadows crush to a featureless black, and the overall picture, while perfectly watchable, lacks the intensity and depth your eyes perceive in the real world.
HDR changes the equation by letting displays push peak brightness well beyond SDR limits while simultaneously deepening black levels. This means a sunset scene can render a blazing sun alongside shadowed foreground detail, rather than forcing one or the other to lose information. The result is an image that feels more three-dimensional and lifelike, with specular highlights that actually appear to glow and dark areas that retain texture.
Beyond brightness, HDR also unlocks a wider color gamut. Where SDR content is typically mastered to the sRGB or Rec. 709 color space, HDR content targets the significantly larger Rec. 2020 container (with most real-world content hitting somewhere around the DCI-P3 range). This means more saturated reds, greens, and blues without the artificial look of simply cranking up the saturation slider. The combination of extended brightness, deeper blacks, and wider color is what gives HDR its visual impact.
You can test your display's color capabilities using the DisplayPixels Color/HDR tool, which checks both gamut coverage and HDR readiness right in your browser.
Understanding HDR starts with understanding what SDR gets wrong, or rather, what it was never designed to do. SDR was standardized decades ago for CRT televisions that topped out around 100 nits of brightness. Modern LCD and OLED panels can far exceed that, but SDR content does not take advantage of the extra headroom.
SDR content uses 8 bits per color channel, which provides 256 levels per channel and roughly 16.7 million total colors. That sounds like a lot, but it creates visible banding in smooth gradients, especially in dark scenes. HDR uses 10-bit (over 1 billion colors) or even 12-bit color depth. The extra levels eliminate banding and allow for smooth, artifact-free transitions across the expanded brightness range.
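As a rough illustration of the arithmetic, here is a short sketch (assuming three color channels per pixel and no chroma subsampling) showing how the per-channel level count and total color count grow with bit depth:

```typescript
// Number of distinct levels per channel and total colors for a given bit depth.
// Assumes three color channels (R, G, B) per pixel.
function colorStats(bitsPerChannel: number): { levels: number; totalColors: number } {
  const levels = 2 ** bitsPerChannel; // e.g. 2^8 = 256, 2^10 = 1024
  const totalColors = levels ** 3;    // one value per R, G, B channel
  return { levels, totalColors };
}

console.log(colorStats(8));  // 256 levels,  16,777,216 colors (~16.7 million)
console.log(colorStats(10)); // 1024 levels, 1,073,741,824 colors (~1.07 billion)
console.log(colorStats(12)); // 4096 levels, ~68.7 billion colors
```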
SDR content is typically mastered for displays that peak between 100 and 300 nits. HDR content can be mastered for peak luminance values of 1,000 nits, 4,000 nits, or even 10,000 nits in theory. In practice, most consumer displays reach between 500 and 2,000 nits in HDR mode, with high-end models pushing past 3,000 nits. This extra brightness gives specular highlights genuine intensity. A reflection off water, a spark from a welder, or the gleam of polished metal all look dramatically more convincing.
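HDR10 and Dolby Vision encode these absolute luminance values with the SMPTE ST 2084 "PQ" transfer function, which is what makes a ceiling as high as 10,000 nits representable in a 10- or 12-bit signal. A minimal sketch of the decoding (EOTF) side, using the constants from the published standard, might look like this:

```typescript
// SMPTE ST 2084 ("PQ") EOTF: maps a normalized code value (0..1) to absolute
// luminance in nits, up to the format's 10,000-nit ceiling.
const m1 = 2610 / 16384;
const m2 = (2523 / 4096) * 128;
const c1 = 3424 / 4096;
const c2 = (2413 / 4096) * 32;
const c3 = (2392 / 4096) * 32;

function pqToNits(codeValue: number): number {
  const e = Math.pow(codeValue, 1 / m2);
  const num = Math.max(e - c1, 0);
  const den = c2 - c3 * e;
  return 10000 * Math.pow(num / den, 1 / m1);
}

console.log(pqToNits(0.5).toFixed(1)); // ≈ 92 nits: half the code range is still quite dim
console.log(pqToNits(1.0));            // 10000 nits at the top of the range
```

Note how perceptually weighted the curve is: half of the available code values are spent on luminance below roughly 100 nits, which is why banding would be so visible if HDR were limited to 8 bits.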
True HDR impact requires not just bright highlights but also deep blacks. An OLED panel can turn off individual pixels for perfect blacks, giving it essentially infinite contrast. LCD panels rely on local dimming (more on that below) to deepen blacks in certain zones. The contrast ratio between the deepest black and the brightest highlight is arguably more important than peak brightness alone.
Color volume is a concept that goes beyond flat gamut coverage. It describes how saturated a display can remain at different brightness levels. A display might cover 95% of DCI-P3 at moderate brightness but lose saturation as it pushes toward its peak luminance. Displays with high color volume maintain vivid, accurate color even in the brightest HDR highlights, which is critical for scenes with bright, colorful elements like neon signs or sunlit flowers.
Not all HDR is created equal. Several competing standards exist, each with different capabilities and licensing models. Here is how they stack up.
| Feature | HDR10 | HDR10+ | Dolby Vision |
|---|---|---|---|
| Bit Depth | 10-bit | 10-bit | Up to 12-bit |
| Max Brightness (mastering) | 1,000 - 4,000 nits | 4,000 nits | 10,000 nits |
| Metadata Type | Static | Dynamic | Dynamic |
| Licensing | Royalty-free | Royalty-free | Licensed (Dolby) |
| Backward Compatible | Base layer for all | Falls back to HDR10 | Falls back to HDR10 |
HDR10 is the most widely supported HDR standard. It uses static metadata, meaning a single set of brightness and color parameters is applied to the entire movie or video. The content creator picks one set of values that works for the whole piece, and the display uses those values from start to finish. The advantage is simplicity and universal compatibility. The downside is that a dark, moody scene must use the same tone-mapping parameters as a bright outdoor scene, which can lead to compromises. Despite this limitation, HDR10 delivers a significant upgrade over SDR when the display hardware is capable.
HDR10+ was developed by Samsung and Amazon to address HDR10's static metadata limitation. It adds dynamic metadata that can change on a scene-by-scene or even frame-by-frame basis. This means the display receives specific instructions for each moment, allowing it to optimize tone-mapping continuously. A dark interior scene gets one set of instructions, while the following bright exterior shot gets another. HDR10+ is royalty-free, which encourages adoption, and it falls back gracefully to standard HDR10 on devices that do not support it. Samsung TVs, Amazon Prime Video, and a growing number of Blu-ray releases support HDR10+.
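To make the static-versus-dynamic distinction concrete, here is a hypothetical sketch of what each kind of metadata carries. The field names are simplified for illustration and are not the actual SMPTE bitstream syntax, but HDR10's MaxCLL and MaxFALL values are real static fields that apply to the whole title, while dynamic formats attach values per scene:

```typescript
// Illustrative only: simplified field names, not the real bitstream syntax.

// HDR10: one set of static values describes the entire title.
interface StaticHdrMetadata {
  masteringPeakNits: number; // peak luminance of the mastering display, e.g. 1000 or 4000
  masteringMinNits: number;  // black level of the mastering display
  maxCLL: number;            // brightest single pixel anywhere in the content (MaxCLL)
  maxFALL: number;           // highest frame-average light level (MaxFALL)
}

// HDR10+ / Dolby Vision: tone-mapping hints can change scene by scene or frame by frame.
interface DynamicHdrMetadata {
  scenes: Array<{
    startFrame: number;
    sceneMaxNits: number;    // how bright this particular scene actually gets
    sceneAvgNits: number;    // average light level for this scene
  }>;
}
```

With static metadata the display tone-maps every scene against the single worst-case numbers; with dynamic metadata a dark interior can be mapped gently while the bright exterior shot that follows gets its own, more aggressive compression.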
Dolby Vision is the most technically advanced HDR format. It supports up to 12-bit color depth and can be mastered for peak brightness up to 10,000 nits, far beyond what current displays can achieve, which provides future-proofing. Like HDR10+, it uses dynamic metadata, but Dolby Vision's metadata is typically authored with more rigorous quality control during the mastering process. Dolby licenses the technology to both content creators and hardware manufacturers, which adds cost but also ensures a baseline level of quality certification. Netflix, Disney+, Apple TV+, and most major streaming services offer Dolby Vision content. Most premium TVs from LG, Sony, and others support it.
Hybrid Log-Gamma (HLG) deserves a mention as the HDR standard designed for live broadcast. Developed by the BBC and NHK, HLG does not use metadata at all. Instead, it encodes the HDR information directly into the signal using a transfer function that SDR displays can interpret without looking broken. This makes it ideal for live sports, news, and broadcast content where backward compatibility with millions of existing SDR TVs is essential.
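HLG's backward compatibility comes from the shape of its transfer function: the lower half of the signal range behaves like a conventional gamma curve that an SDR display can show acceptably, while the upper half switches to a logarithmic curve that makes room for highlights. A minimal sketch of the BT.2100 HLG OETF, with the constants from the standard, looks like this:

```typescript
// ITU-R BT.2100 HLG OETF: maps normalized scene light (0..1) to a signal value (0..1).
// Square-root (gamma-like) below 1/12 of peak, logarithmic above it.
const a = 0.17883277;
const b = 1 - 4 * a;
const c = 0.5 - a * Math.log(4 * a);

function hlgOetf(sceneLight: number): number {
  return sceneLight <= 1 / 12
    ? Math.sqrt(3 * sceneLight)
    : a * Math.log(12 * sceneLight - b) + c;
}

console.log(hlgOetf(1 / 12).toFixed(3)); // 0.500 (the two curve segments meet here)
console.log(hlgOetf(1).toFixed(3));      // 1.000
```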
If you are shopping for an HDR-capable LCD monitor or TV, local dimming is one of the most important specifications to understand. LCD panels use a backlight behind the liquid crystal layer, and by default, that backlight illuminates the entire screen uniformly. Local dimming divides the backlight into zones that can be brightened or dimmed independently.
When a scene has a bright object against a dark background, the zones behind the bright area increase in intensity while zones behind the dark areas dim down. This dramatically improves contrast compared to a panel with no local dimming, where the backlight stays at a uniform level and blacks appear gray and washed out.
The number of dimming zones matters enormously. A budget display might have 8 to 16 zones, which produces noticeable halos around bright objects because each zone covers a large area. Mid-range displays may have 100 to 500 zones, offering much better precision. High-end mini-LED displays can have 1,000 to 2,000 or more zones, approaching OLED-like contrast in many scenes.
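Conceptually, the backlight controller computes one brightness level per zone from the brightest pixel that zone has to display; the fewer the zones, the larger the area that must be lit for a single bright pixel, which is exactly what produces halos. The sketch below is a deliberately simplified model (real controllers add spatial and temporal filtering) just to illustrate the idea:

```typescript
// Simplified local-dimming model: split a frame's luminance map into a grid of
// zones and drive each zone's backlight from the brightest pixel it contains.
function zoneBacklightLevels(
  luma: number[][], // per-pixel luminance, normalized 0..1
  zonesX: number,
  zonesY: number
): number[][] {
  const height = luma.length;
  const width = luma[0].length;
  const levels: number[][] = Array.from({ length: zonesY }, () =>
    new Array<number>(zonesX).fill(0)
  );

  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const zy = Math.min(zonesY - 1, Math.floor((y / height) * zonesY));
      const zx = Math.min(zonesX - 1, Math.floor((x / width) * zonesX));
      // Each zone must be at least as bright as its brightest pixel.
      levels[zy][zx] = Math.max(levels[zy][zx], luma[y][x]);
    }
  }
  return levels;
}
```

With only 8 to 16 zones, one bright pixel forces a huge region of backlight to full intensity and lifts the blacks around it; with thousands of mini-LED zones, the lifted area shrinks to a small patch around the highlight.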
OLED panels sidestep this entirely because each pixel is self-emitting. There is no backlight, so each pixel can turn off completely for true black. This gives OLED displays a natural advantage for HDR content, though they typically cannot reach the same peak brightness as the brightest mini-LED panels.
Gaming in HDR can be transformative, but the experience varies wildly depending on your hardware and the game's HDR implementation. Here is what you need to know to get the best results.
First, your GPU must support HDR output. Modern NVIDIA, AMD, and Intel GPUs all support HDR over HDMI 2.0+ and DisplayPort 1.4+. You will also need a cable that can handle the bandwidth. For 4K at 120Hz with HDR (10-bit color at 4:4:4 chroma), you need HDMI 2.1 or DisplayPort 1.4 with Display Stream Compression (DSC).
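The bandwidth requirement follows directly from the numbers. A back-of-the-envelope calculation (ignoring blanking intervals and link overhead, which push the real figure higher) shows why this signal exceeds what HDMI 2.0 can carry uncompressed:

```typescript
// Rough pixel data rate for 4K 120 Hz, 10-bit RGB / 4:4:4 video.
// Ignores blanking and link overhead, so the real requirement is higher.
const width = 3840;
const height = 2160;
const refreshHz = 120;
const bitsPerPixel = 10 * 3; // 10 bits per channel, three channels at 4:4:4

const gbps = (width * height * refreshHz * bitsPerPixel) / 1e9;
console.log(gbps.toFixed(1)); // ≈ 29.9 Gbit/s of pixel data alone

// HDMI 2.0 tops out at 18 Gbit/s of raw link bandwidth, so this signal needs
// HDMI 2.1 (48 Gbit/s) or DisplayPort 1.4 with Display Stream Compression.
```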
On Windows, you must enable HDR in Settings under System, then Display. Once enabled, the desktop composites everything in HDR, and SDR content gets tone-mapped. Some users find the SDR tone-mapping in Windows imperfect, leading to slightly washed-out or too-bright SDR content. Windows 11 has improved this significantly with the SDR content brightness slider, which lets you fine-tune how SDR content looks when HDR mode is active.
Game-side implementation quality matters a great deal. Some games implement HDR beautifully, with proper per-scene tone-mapping and accurate peak brightness calibration. Others bolt on HDR as an afterthought, producing results that look worse than SDR. Always check the in-game HDR calibration screen, if one exists, and set the peak brightness to match your display's actual capabilities. Setting it too high causes clipping; setting it too low wastes your display's headroom.
Auto HDR, available on Xbox consoles and Windows 11, applies machine-learning-based HDR conversion to SDR games. The results range from impressive to mediocre depending on the game, but it is generally worth trying on games that lack native HDR support.
Having an HDR display is only half the equation. You need HDR content to feed it. The primary sources available today are streaming services such as Netflix, Disney+, Apple TV+, and Prime Video, Ultra HD Blu-ray discs, and games with native HDR support.
Determining your display's HDR capabilities is straightforward on most platforms. Here are the steps for each major operating system.
On Windows, open Settings, navigate to System, then Display. Select your monitor and look for the "HDR" or "Use HDR" toggle. If present, your display has been detected as HDR-capable. Click "HDR display capabilities" for more detail, including color space coverage and peak brightness tier. You can also run the DisplayPixels Color/HDR test directly in your browser for a quick check.
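Modern browsers also expose a quick way to query this from a web page, which is the kind of check a browser-based tool can perform. Here is a minimal sketch using the standard `dynamic-range` and `color-gamut` CSS media features (what any particular tool checks beyond this is not specified here):

```typescript
// Minimal browser-side probe of HDR and wide-gamut support using standard
// CSS media features. Results depend on the display, OS settings, and
// browser support for these media queries.
function probeDisplayCapabilities(): { hdr: boolean; p3: boolean; rec2020: boolean } {
  return {
    hdr: window.matchMedia("(dynamic-range: high)").matches,
    p3: window.matchMedia("(color-gamut: p3)").matches,
    rec2020: window.matchMedia("(color-gamut: rec2020)").matches,
  };
}

console.log(probeDisplayCapabilities());
// e.g. { hdr: true, p3: true, rec2020: false } on a typical wide-gamut HDR monitor
```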
On macOS, go to System Settings, then Displays. If your display supports HDR, you will see "High Dynamic Range" as an option. Apple's built-in Retina displays on MacBook Pro (2021 and later) support HDR natively with peak brightness up to 1,600 nits. External displays require a Thunderbolt or HDMI connection and macOS Ventura or later for full HDR support.
Beyond operating system settings, look for these specs in your monitor's documentation: VESA DisplayHDR certification (DisplayHDR 400, 600, 1000, or 1400), supported HDR formats (HDR10, HDR10+, Dolby Vision), peak brightness in nits, local dimming zone count, and color gamut coverage (DCI-P3 percentage). A monitor advertising "HDR support" but only reaching 350 nits with no local dimming will deliver a marginal HDR experience at best.
The VESA DisplayHDR certification provides a standardized way to compare HDR capabilities across monitors. Each tier has specific requirements for peak brightness, black level, color gamut, and bit depth; the table below lists the minimum peak brightness each tier must reach.
| Tier | Minimum Peak Brightness |
|---|---|
| DisplayHDR 400 | 400 nits |
| DisplayHDR 500 | 500 nits |
| DisplayHDR 600 | 600 nits |
| DisplayHDR 1000 | 1,000 nits |
| DisplayHDR 1400 | 1,400 nits |
| DisplayHDR True Black 400 | 400 nits (OLED) |
| DisplayHDR True Black 600 | 600 nits (OLED) |
The "True Black" tiers are specifically for OLED and similar self-emissive displays that can achieve true 0-nit black levels. For a genuinely impressive HDR experience, most enthusiasts recommend DisplayHDR 600 or higher for LCD panels, or True Black 400 and above for OLEDs.
HDR can be finicky. Here are the most common issues users encounter and practical solutions for each.
Washed-out, gray-looking color is the most frequently reported HDR problem, especially on Windows. It usually happens when HDR mode is enabled in the OS but the display is not properly configured. First, ensure your display's firmware is up to date. Second, check that the input port is set to an HDR-compatible mode (some monitors have a specific "HDR Mode" setting in their OSD). Third, adjust the SDR content brightness slider in Windows HDR settings to restore proper SDR appearance. Finally, make sure your cable supports the required bandwidth. An older HDMI 1.4 cable will silently drop to SDR even if both your GPU and monitor support HDR.
Some HDR content, particularly movies mastered for bright cinema screens, can look dim on a display that does not reach sufficient peak brightness. If your display peaks at 400 nits, it may struggle with content mastered at 1,000 nits. The tone-mapping algorithm must compress a wide brightness range into a smaller one, which can make the overall image appear darker than SDR. Try adjusting your display's tone-mapping settings if available, or ensure your media player's HDR processing is set appropriately.
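The compression step can be pictured as a roll-off curve: luminance values below a knee point pass through unchanged, and everything above is squeezed into the remaining headroom up to the display's peak. The sketch below uses a simple, purely illustrative roll-off, not the specific algorithm any particular display or player uses:

```typescript
// Illustrative highlight roll-off: values below the knee pass through unchanged,
// values above are compressed asymptotically toward the display's peak.
function toneMapNits(contentNits: number, displayPeak: number, knee = 0.75): number {
  const kneeNits = displayPeak * knee;
  if (contentNits <= kneeNits) return contentNits;

  // Map everything above the knee into the remaining (displayPeak - kneeNits) headroom.
  const excess = contentNits - kneeNits;
  const headroom = displayPeak - kneeNits;
  return kneeNits + headroom * (excess / (excess + headroom));
}

console.log(toneMapNits(200, 400));  // 200  (below the knee, unchanged)
console.log(toneMapNits(1000, 400)); // ~387 (a 1,000-nit highlight squeezed under 400)
```

Because everything above roughly 300 nits in this example has to share about 100 nits of headroom, a scene full of bright highlights ends up looking flatter and dimmer than it would on a 1,000-nit panel, which is exactly the effect described above.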
On LCD displays with local dimming, bright objects against dark backgrounds can produce visible halos. This is a hardware limitation related to the number and size of dimming zones. You can reduce it by lowering the local dimming aggressiveness setting on your display, though this trades halo reduction for reduced contrast. If blooming is a major concern, consider an OLED display, which eliminates the issue entirely.
When HDR mode is active on Windows, the desktop compositor converts all SDR content to HDR. Some older applications may look incorrect because they were never designed with HDR compositing in mind. Adjusting the SDR content brightness slider helps. For persistent issues with specific apps, you can use the per-app HDR toggle in Windows 11 to disable HDR for individual applications.
For a deeper understanding of color accuracy in HDR and SDR, see our guides on how to calibrate your monitor's colors and color spaces explained.
HDR technology continues to evolve rapidly. Mini-LED backlights with thousands of dimming zones are closing the gap between LCD and OLED contrast performance. Micro-LED technology, which uses individual self-emissive LEDs at the pixel level, promises OLED-level blacks with much higher peak brightness and no risk of burn-in. Meanwhile, QD-OLED (Quantum Dot OLED) panels already deliver the best of both worlds for many users, combining deep OLED blacks with Samsung's quantum dot color technology for excellent brightness and color volume.
On the content side, more streaming services are adopting Dolby Vision and HDR10+ simultaneously, and the tools for creating HDR content are becoming more accessible to independent filmmakers and content creators. As display hardware improves and content libraries grow, HDR is steadily transitioning from a premium feature to a baseline expectation for any serious display.
SDR (Standard Dynamic Range) typically uses 8-bit color and peaks at around 100-300 nits of brightness. HDR (High Dynamic Range) uses 10-bit or 12-bit color depth and can reach 1,000 nits or more, delivering brighter highlights, deeper blacks, and a wider range of visible colors for a more lifelike image.
Dolby Vision is generally considered superior to HDR10+ because it supports 12-bit color depth and uses dynamic metadata that adjusts on a scene-by-scene or even frame-by-frame basis. HDR10+ also uses dynamic metadata but is limited to 10-bit color. However, HDR10+ is royalty-free, making it more widely adopted by some manufacturers. Many premium displays now support both.
For a genuinely impactful HDR experience, a display should reach at least 600 nits of peak brightness, though 1,000 nits or more is ideal. Displays certified as VESA DisplayHDR 400 provide a basic HDR experience, while DisplayHDR 1000 and above deliver the dramatic contrast and highlight detail that HDR was designed for.
You can play HDR content on a non-HDR monitor, but your operating system or media player will tone-map the HDR signal down to SDR. The result will look like standard video and you will not see the expanded brightness range or wider color gamut that HDR provides. Some tone-mapping solutions produce acceptable results, but the intended experience requires an HDR-capable display.
Washed-out HDR is usually caused by incorrect settings. Common fixes include enabling HDR in your operating system display settings, making sure your cable supports the required bandwidth (HDMI 2.0 or higher, or DisplayPort 1.4+), setting your display to the correct HDR input mode, and ensuring the application or game is actually outputting an HDR signal rather than SDR.