Hi, how are you all. What a brutal logo burn-in on that E6. That's just how it goes: it happens even on the best display technologies, like plasma and CRT, with permanently static images at crazy-high brightness. Don't push the TV that hard; give it a mix of all kinds of video.
The reality is that to view HDR imagery properly, environmental light levels will have to be very carefully controlled - far more so than for SDR viewing. This really does mean using a true home cinema environment.
Excellent article, Steve. And in that case, now that the diffuse white point for HDR has been raised to 203 nits:
Would you recommend a bias light, located behind the screen, of up to 5 nits D65, for HDR in a home theater environment dedicated to television? Or do you recommend a typical home theater environment for video projection, with absolutely all the lights off?
Thank you very much for your help
The following defines the basic standards for an HDR Reference Viewing Environment, which match those for HDTV.
- Monitor surround luminance to be 5 nits*
- Colour of monitor surround to be D65
- Surround extent to be 90° horizontal, 60° vertical
- Remaining surfaces to be dark matte to prevent stray light
- Average room illumination ≤ 5 nits
- Viewing distance 1.5 screen heights
- Viewing angle to give < 0.001 Δuv

*A 5 nit surround is defined due to the potentially low luma of shadow detail, based on the specified EOTF for both PQ- and HLG-based HDR. This is part of the issue with home viewing environments, where the surround illumination is often difficult to control, especially down to the specification levels.
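A minimal sketch of checking a measured room against the luminance limits above (the function name, tolerance, and pass/fail logic are my own illustration, not part of any standard):

```python
# Sketch: check measured room conditions against the listed limits.
# The function name, tolerance, and pass/fail logic are illustrative only.

def meets_hdr_reference_environment(surround_nits: float,
                                    avg_room_nits: float,
                                    tolerance: float = 0.5) -> bool:
    """True when the surround is at the 5 nit target (within tolerance)
    and average room illumination does not exceed 5 nits."""
    return abs(surround_nits - 5.0) <= tolerance and avg_room_nits <= 5.0

print(meets_hdr_reference_environment(5.0, 4.0))    # True - controlled room
print(meets_hdr_reference_environment(20.0, 30.0))  # False - typical lit living room
```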
Regards

Unfortunately, PQ-based HDR never took that into account, and was only valid for a 'grading' environment.
Something we pointed out to Dolby when the first iteration of PQ was defined.
And that is why IQ now exists - to break the PQ standard for non-critical viewing environments.
But basing it on a non-calibrated light sensor, and dynamically changing the image based on that sensor, is, well, just bollocks.
Unlike SDR, which uses a relative curve, HDR is absolute. A level that sits, say, halfway up the display's capability is exactly as dark as the film maker intended. HDR needs a light-controlled room much more than SDR does.
Just to reiterate: HDR needs a light-controlled room far more than SDR. SDR needs it too, but not nearly as badly as HDR.
I mean no ambient light from windows or the lights on in the room. A bias light is fine and recommended.
As I mentioned, with SDR, the range is stretched up to the peak brightness of the panel. For HDR, nothing should be stretched since a given code value maps to a specific luminance value. Content creators have more control with HDR than they do with SDR to get the desired look.
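That fixed code-value-to-luminance mapping is the SMPTE ST 2084 (PQ) EOTF. As a sketch, using the constants published in the standard:

```python
# SMPTE ST 2084 (PQ) EOTF: normalised signal value -> absolute luminance (nits).
# Constants are as published in the standard.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalised PQ signal (0.0-1.0) to luminance in cd/m2."""
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# The same code value means the same luminance on any compliant display:
print(pq_eotf(592 / 1023))  # ≈ 199.2 nits for 10-bit full-range code 592
print(pq_eotf(1.0))         # 10000.0 nits at the top of the PQ range
```

Nothing here depends on the panel's peak brightness; a display that cannot reach a requested level has to clip or roll off, which is a display limitation rather than a remapping of the signal.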
HDR - Shadows Too
The reality is PQ based HDR does nothing for black levels, and that is true of shadow detail too - no matter what those less knowledgeable or marketing material may say.
A good example of inaccurate information used to promote the 'benefits' of HDR can be seen in a presentation on YouTube, where improved shadow detail was cited as an example of the benefits HDR brings over SDR... which is incorrect. The reality is the SDR image is probably just poorly graded, potentially even deliberately so, to promote HDR. HDR provides no such benefit over SDR shadow detail.
And in reality, due to the EOTF curve used in PQ-based HDR, blacks under normal home viewing conditions will often be 'crushed' compared to SDR versions of the same image. This is borne out by the surround illumination level specified as preferred for HDR - 5 nits - while for SDR it was originally specified as 10% of the maximum brightness of the display (so 10 nits for a 100 nit SDR display, double the HDR figure). That is a large discrepancy, and it shows that HDR blacks/shadows will often be washed out/clipped when viewed in any environment where the ambient light levels cannot be controlled.
In reality a 10 bit SDR image will have potentially better black/shadow detail than a PQ based HDR image.
Different viewing environments really need different display gamma, which the 'Absolute' PQ based HDR standard cannot address.
While I really do not agree with the individual in question, I do have to say that most of the SDR/HDR comparisons I see are wrong.
The whole concept of HDR is that up to a given point (diffuse white), when displayed on calibrated displays, SDR and HDR will effectively be of similar brightness.
The initial idea was the point of deviation would be around 100 nits, based on that being the diffuse white point when grading.
But, in home viewing situations that made PQ too dark, and so that was changed to be around 200 nits, based on home TVs and their viewing environments, rather than grading display setups.
This is explained here: UHDTV - HDR and WCG
So, in reality, any demo that shows an overall difference in brightness between SDR and HDR below diffuse white is invalid.
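The relative/absolute distinction can be sketched numerically (a simplified gamma 2.4 SDR model, ignoring BT.1886 black-level terms; the values are illustrative). The same SDR signal gets brighter as the panel's peak rises, while a PQ code value always decodes to the same nits, so an overall brightness gap below diffuse white points to a setup or grading difference, not an inherent HDR benefit:

```python
# Simplified SDR model: luminance is relative, scaled by the display's peak.
def sdr_luminance(signal: float, peak_nits: float, gamma: float = 2.4) -> float:
    return peak_nits * signal ** gamma

mid_grey = 0.5
print(sdr_luminance(mid_grey, 100))  # ≈ 18.9 nits on a 100 nit display
print(sdr_luminance(mid_grey, 400))  # ≈ 75.8 nits on a 400 nit display, 4x brighter
```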
I mean no ambient light from windows, or the lights on in the room.
Colourist Alexis Van Hurkman also has an LG OLED TV:

“My C9 is calibrated for 100 nit, BT.1886. (same as 2.4 on the C9). My room is light controlled with a bias light. I use Cinema mode for SDR and HDR”
What did I tell you? Hahaha, but he hasn't recommended that brand. He was speaking generically, based on the norms and standards in which he is an active professional participant: that for a TV in a light-controlled room you should use a bias light. Not that you should buy a specific model of system, let alone from a specific shop just because they distribute his disc.
EBU TECH 3320 (Version 4.1 - September 2019) - User Requirements for Video Monitors in Television Production, defines the technical characteristics for video broadcast monitors used in a professional TV production environment for evaluation and control of the images being produced.
From Version 4.0 onwards, a section dedicated to High Dynamic Range and Wide Color Gamut has been added, covering UHD and 1080p HD monitors.
That section splits Grade 1 HDR monitors into two types: Grade 1A HDR and Grade 1B HDR. A Grade 1A monitor can accurately reproduce all aspects of the standard it was designed to display.
A Grade 1B monitor may not be capable of reproducing the full range of colour or brightness defined in a standard, but will otherwise fulfil all the requirements of a Grade 1A monitor.
This novel approach was taken to bridge the gap between what a video standard may define and what monitors currently available on the market are able to reproduce.
Grade 1B HDR monitors require ≥1000 nits peak white, but may have reduced gamut and relaxed brightness specifications. The Grade 1B HDR category may be withdrawn at a future date.
When a Grade 1B monitor is unable to correctly display an input signal, e.g. it cannot physically display colors conveyed in an ITU-R BT.2100 signal, it shall by default apply a hard clip of the linear display signals to the available color volume whilst maintaining the ITU-R BT.2100 white point, rather than applying a soft clip (roll-off).
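That default behaviour amounts to a per-channel clamp of the linear display signal (a minimal sketch; it assumes the signal has already been converted to the monitor's linear display RGB, and the 0.0 to 1.0 range is illustrative):

```python
# Sketch of the default hard-clip behaviour: clamp each linear display
# channel with no roll-off. Assumes the signal is already in the monitor's
# linear display RGB; the 0.0-1.0 range here is illustrative.
def hard_clip_linear_rgb(rgb, max_level=1.0):
    """Clamp each channel to [0, max_level]. Equal R=G=B stays equal,
    so achromatic signals keep the ITU-R BT.2100 white point."""
    return tuple(min(max(c, 0.0), max_level) for c in rgb)

# A colour outside the monitor's volume (e.g. negative after the gamut
# conversion matrix) is clipped hard to the displayable range:
print(hard_clip_linear_rgb((-0.2, 0.5, 1.3)))  # (0.0, 0.5, 1.0)
```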
For Grade 1A HDR PQ or Grade 1B HDR PQ monitors, a 199.2 cd/m² (code value 592, 10-bit full range) full-screen, uniform field input signal shall be displayed without power limiting.
LG 2019 OLED TVs unfortunately can't meet these specifications, and can't be recommended for color grading of HDR content.
But LG 2019 OLED TVs can be used for client viewing (review screening), on-set, VFX, editing, or QC applications.