We have a large number of HDR (High Dynamic Range) TVs in the office due to our work on HDR standards, and we frequently run HDR-capable content on them.
I was recently showing one of our non-HDR demos at a conference. The conference organizer was very nice and provided an HDR10 TV for us. I was grateful for that, because it meant we didn't have to ship a large and heavy TV over a long distance.
Usually, I am not the one who sets up our TVs. One of my more experienced co-workers handles this type of work.
The demo I was going to show didn't have any special HDR treatment. It is not supposed to showcase the advantages of a higher dynamic brightness range, so all the art assets are LDR and no effort was made to make it look beautiful on an HDR TV. It had been demoed on many LDR TVs and monitors before.
I was stunned to see how ugly this demo looked on this particular HDR10-capable TV. Up until then it hadn't occurred to me that we would have to change the art assets or add a tone mapper and tune it (the demo didn't have one) to make an LDR demo look good on an HDR10-capable TV. My assumption was that the LDR demo should still look good on an HDR TV; after all, LDR is a subset of HDR, right?
Also, as a programmer, I considered going from LDR to HDR a solved problem ... or at least one that is documented well enough, so I didn't expect any challenge in taking our little demo, which had already run on many LDR monitors and TVs, and outputting it on an HDR10 TV.
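To illustrate what "just outputting" actually involves, here is a minimal sketch of an HDR10 encode step for one pixel; this is my own illustration, not code from the demo, and the choice of mapping paper white to 100 nits is an arbitrary assumption. The frame has to be converted from Rec.709 to Rec.2020 primaries, scaled to absolute luminance, and encoded with the SMPTE ST 2084 (PQ) transfer function before it is handed to the TV.

```cpp
#include <algorithm>
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Rec.709 -> Rec.2020 primaries (standard conversion matrix).
Vec3 rec709_to_rec2020(const Vec3& c)
{
    return {
        0.6274 * c[0] + 0.3293 * c[1] + 0.0433 * c[2],
        0.0691 * c[0] + 0.9195 * c[1] + 0.0114 * c[2],
        0.0164 * c[0] + 0.0880 * c[1] + 0.8956 * c[2]
    };
}

// SMPTE ST 2084 (PQ) encode; input is absolute luminance in nits (0..10000).
double pq_encode(double nits)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;

    const double y = std::pow(std::max(nits, 0.0) / 10000.0, m1);
    return std::pow((c1 + c2 * y) / (1.0 + c3 * y), m2);
}

// LDR pixel (linear Rec.709, 0..1) -> HDR10 signal (PQ-encoded Rec.2020, 0..1).
// paperWhiteNits is an assumption: where LDR white lands in absolute luminance.
Vec3 ldr_to_hdr10(const Vec3& linear709, double paperWhiteNits = 100.0)
{
    const Vec3 c2020 = rec709_to_rec2020(linear709);
    return {
        pq_encode(c2020[0] * paperWhiteNits),
        pq_encode(c2020[1] * paperWhiteNits),
        pq_encode(c2020[2] * paperWhiteNits)
    };
}
```

Even with that in place, the TV still applies its own processing on top of the PQ signal, which is exactly where the trouble described next begins.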
So after my initial shock, a friendly person suggested going to the following website to tune the TV:
http://www.rtings.com/tv/reviews/vizio/m-series-2015/settings
At that point, it occurred to me that an HDR10-capable TV has dozens of sliders in several sub-menus. I had a hard time understanding what all these sliders do and how they interact. There is no way an end user will go through them and understand even 5% of them.
How do we make sure that our games run consistently on a wide range of HDR10-capable TVs? Will every game come with a handbook explaining the best settings for a few dozen TVs? We cannot possibly expect end users to go through the exercise of figuring this out. Previous experience with just adjusting the gamma value indicates a low "adoption rate" for menu options.
Will we give recommendations like we do for PC graphics cards now, saying this game is best played on this display, and expect users to upgrade their displays for our game? What happens if two games recommend different displays? Will users need several displays in their living room? Obviously, I am exaggerating.
For gamma, we invented our own gamma calibration test screens, which worked well. Is there a way to do this for color as well? Maybe color wheels? That way we could guide users who are inclined to work their way through the menu options toward a more optimal image.
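For readers unfamiliar with this kind of chart, here is a sketch of one common construction (not necessarily the exact screens we shipped): each row pairs a pattern of alternating black and white scanlines, which averages to half brightness in linear light, with a solid patch encoded for one candidate gamma; the patch that visually matches the line pattern tells you the display's effective gamma. The resolution, the set of gamma values, and the PGM output are all arbitrary choices for illustration.

```cpp
#include <cmath>
#include <cstdint>
#include <fstream>
#include <vector>

int main()
{
    const int width = 512;
    const int rowHeight = 64;
    const double gammas[] = { 1.8, 2.0, 2.2, 2.4, 2.6 };
    const int rows = static_cast<int>(sizeof(gammas) / sizeof(gammas[0]));
    const int height = rowHeight * rows;

    std::vector<std::uint8_t> img(static_cast<size_t>(width) * height);

    for (int y = 0; y < height; ++y)
    {
        const double gamma = gammas[y / rowHeight];
        // Encoded value that displays as 50% luminance if the display
        // follows this gamma.
        const std::uint8_t patch = static_cast<std::uint8_t>(
            std::round(255.0 * std::pow(0.5, 1.0 / gamma)));

        for (int x = 0; x < width; ++x)
        {
            // Left half: alternating black/white scanlines.
            // Right half: solid patch for the row's candidate gamma.
            const bool leftHalf = x < width / 2;
            img[static_cast<size_t>(y) * width + x] =
                leftHalf ? ((y & 1) ? 255 : 0) : patch;
        }
    }

    // Write as a binary PGM so any image viewer can show it 1:1.
    std::ofstream out("gamma_chart.pgm", std::ios::binary);
    out << "P5\n" << width << " " << height << "\n255\n";
    out.write(reinterpret_cast<const char*>(img.data()),
              static_cast<std::streamsize>(img.size()));
}
```

It is not obvious to me how to extend this idea to HDR10: instead of matching a single mid gray, you would presumably need patches targeting specific nit levels and saturated primaries, and it is unclear whether that can be made as intuitive as the gamma chart.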
Has anyone already thought this through?