We have a large number of HDR (High Dynamic Range) TVs in the office because of our work on HDR standards, and we frequently run HDR-capable content on them.
I was recently showing one of our non-HDR demos at a conference. The conference organizer was very nice and provided an HDR10 TV for us. I was grateful for that, because it meant we didn't have to ship a large, heavy TV over a long distance.
Usually, I am not the one who sets up our TVs; one of my co-workers with more experience handles this type of work.
The demo I was going to show didn't have any special HDR treatment. It is not supposed to showcase the advantages of a higher dynamic brightness range, so all the art assets are LDR and no effort was made to make it look beautiful on an HDR TV. It had been demoed on many LDR TVs and monitors before.
I was stunned to see how ugly this demo looked on this particular HDR10-capable TV. Up until then it hadn't occurred to me that we would have to change the art assets, or add a tone mapper and tune it (the demo didn't have one), to make an LDR demo look good on an HDR10-capable TV. My assumption was that the LDR demo should still look good on an HDR TV; after all, LDR is a subset of HDR, right?
Also, as a programmer, I consider going from LDR to HDR a solved problem ... or at least one that is documented well enough, so I didn't expect any challenge in simply showing our little demo, which had already run on many LDR monitors and TVs, on an HDR10 TV.
So after my initial shock, a friendly person suggested going to the following website to tune the TV:
http://www.rtings.com/tv/reviews/vizio/m-series-2015/settings
At that point, it occurred to me that an HDR10-capable TV has dozens of sliders spread across several sub-menus. I had a hard time understanding what all these sliders do and how they interact. There is no way an end-user will go through them and understand even 5% of them.
How do we make sure that our games run consistently on a wide range of HDR10-capable TVs? Will every game come with a handbook explaining the best settings for a few dozen TVs? We cannot possibly assume any end-user will go through the exercise of figuring this out. Previous experience with just adjusting the gamma value indicates a low "adoption rate" for such menu options.
Will we give recommendations like we do for PC graphics cards now, saying this game is best played with this display, and expect users to upgrade their displays for our game? What happens if two games recommend different displays? Will users need different displays in their main living room? Obviously, I am exaggerating.
For gamma, we invented our own gamma calibration test screens, and they worked well. Is there a way to do this for color as well? Maybe color wheels? That way we could guide users who are inclined to work their way through the menu options toward a more optimal image.
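For reference, here is a minimal sketch of the idea behind such a gamma test screen (just an illustration, not our actual tool). It writes an 8-bit grayscale PGM: the left half alternates full-black and full-white scanlines, which average to roughly 50% physical luminance, and the right half is a solid gray whose code value should also appear as 50% luminance if the display gamma matches the assumed value. When both halves look equally bright from a distance, the assumed gamma is close to the display's actual gamma.

#include <cmath>
#include <cstdio>
#include <vector>

int main()
{
    const int width = 512, height = 256;
    const double assumedGamma = 2.2; // candidate display gamma to test
    // Encoded gray value that should appear as 50% luminance if the display
    // really has assumedGamma: (gray/255)^gamma == 0.5.
    const unsigned char gray = static_cast<unsigned char>(
        std::lround(255.0 * std::pow(0.5, 1.0 / assumedGamma)));

    std::FILE* f = std::fopen("gamma_test.pgm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P5\n%d %d\n255\n", width, height); // 8-bit grayscale PGM header
    for (int y = 0; y < height; ++y)
    {
        const unsigned char stripe = (y % 2 == 0) ? 255 : 0; // alternating black/white lines
        std::vector<unsigned char> row(width);
        for (int x = 0; x < width; ++x)
            row[x] = (x < width / 2) ? stripe : gray; // left: stripes, right: solid gray
        std::fwrite(row.data(), 1, row.size(), f);
    }
    std::fclose(f);
    return 0;
}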
Has anyone already thought this through?
For the last 20 years we've been living in a Rec.709 world where RGB "just works". You take a picture on your cellphone, move it to your PC, cast it to your TV screen, and it just works. That's because all of those devices have been Rec.709 (OK, flat panels use BT.1886 to correct for gamma, but that's a small detail). Welcome to the bad old days of 1990s color displays and printing, the days before the HDTV standard. Remember picking your "color intent" before printing a document? They're back!
The problem is that the SDR signal is mapped according to the Rec.709 standard, where the maximum intensity of your (1,1,1) pixel is 100 nits. But almost no TV or monitor you use today follows that convention; on average they use 300 nits for a (1,1,1) pixel, and further "vividness" controls rotate the RGB colors out of the Rec.709 color space into something more saturated. You are used to over-sugared content.
The Rec.2100 standard was designed around SDR signals at their Rec.709 reference values, so you have to add back the sugar you are used to. Scale your SDR content to 300 nits and rotate your Rec.709 colors into the Rec.2100 (Rec.2020) color space before encoding to PQ for output; that basic transform will solve the "too dim" and "too unsaturated" problems.
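To make that concrete, here is a minimal C++ sketch of that basic transform, assuming the input is already linear (de-gammaed) Rec.709 RGB in [0,1] and using the 300-nit SDR white level mentioned above (the 300-nit target is a tuning choice, not a standard value):

#include <array>
#include <cmath>
#include <cstdio>

using Vec3 = std::array<double, 3>;

// Linear Rec.709 RGB -> linear Rec.2020 RGB (standard 3x3 primary conversion).
Vec3 Rec709ToRec2020(const Vec3& c)
{
    return {
        0.6274 * c[0] + 0.3293 * c[1] + 0.0433 * c[2],
        0.0691 * c[0] + 0.9195 * c[1] + 0.0114 * c[2],
        0.0164 * c[0] + 0.0880 * c[1] + 0.8956 * c[2]
    };
}

// SMPTE ST 2084 (PQ) encoding; input is absolute luminance in nits (cd/m^2).
double PQEncode(double nits)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    const double y  = std::fmin(std::fmax(nits / 10000.0, 0.0), 1.0); // PQ is defined up to 10,000 nits
    const double yp = std::pow(y, m1);
    return std::pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}

// ldr: linear Rec.709 RGB in [0,1]; returns PQ-encoded Rec.2020 RGB for HDR10 output.
Vec3 SdrToHdr10(const Vec3& ldr, double sdrWhiteNits = 300.0)
{
    const Vec3 wide = Rec709ToRec2020(ldr);
    return {
        PQEncode(wide[0] * sdrWhiteNits),
        PQEncode(wide[1] * sdrWhiteNits),
        PQEncode(wide[2] * sdrWhiteNits)
    };
}

int main()
{
    const Vec3 white = SdrToHdr10({1.0, 1.0, 1.0});
    std::printf("SDR white -> PQ code values: %.4f %.4f %.4f\n", white[0], white[1], white[2]);
}

In a real renderer this would live in the output shader, but the math is the same: primary rotation in linear light first, then the absolute-luminance PQ encode last.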
After that you can start to tame overbright pixels or start generating Wide Color Gamut values using WCG content, filmic curves and all the bells and whistles.
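As one example of such a filmic curve (an illustration, not the only choice), the widely published Hable / Uncharted 2 curve can be used to tame overbright values before the PQ encode; the white point W below is an assumption to tune per content:

// Hable (Uncharted 2) filmic curve with the published default constants.
float HableCurve(float x)
{
    const float A = 0.15f, B = 0.50f, C = 0.10f, D = 0.20f, E = 0.02f, F = 0.30f;
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F;
}

// Maps linear scene values to [0,1]; apply before scaling to nits and PQ-encoding.
float FilmicTonemap(float x)
{
    const float W = 11.2f;                 // linear white point (content-dependent assumption)
    return HableCurve(x) / HableCurve(W);  // normalize so W maps to 1.0
}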