True HDR and True 10 bit monitors. Is this possible?

#1 - offbeatbryce, Senior Member (Joined Jul 2016, 271 posts)
    Hello,

    Hopefully I'm posting this in the right section.

    I am looking into buying an HDR 10-bit 4K monitor, but I've read that many of them are not true HDR or true 10-bit and are essentially fake knock-offs. How do I verify which monitors are really 10-bit and really HDR? I know an editor who has this monitor: https://www.bhphotovideo.com/c/produ...ide.html/specs Is it really true 10-bit and true HDR? Also, a random person on Reddit told me that no real colorist should ever grade on anything but a reference monitor, because HDMI and DisplayPort never output true HDR or true 10-bit regardless of what monitor is sold for computers, that even if they did the colors wouldn't be right on a regular computer monitor or standard TV, and that reference monitors require SDI for a true output. This confuses me a little, because to my knowledge no computer graphics card has SDI. My colorist friend is telling me to ignore this person on Reddit and that they don't know what they are talking about.

    Premiere Pro has SDR conversion built into the export settings as an option specifically for HDR-to-SDR content. If this option is there but is totally useless, as the redditor tells me, then why does it exist?

    Also, how does YouTube supposedly support HDR content when it's all one file uploaded to YouTube? I tried watching a YouTube video with the HDR setting turned on, on a non-HDR monitor, and all the brightness levels were too dark. There was no way to fix this. So how can anyone on YouTube watch a video that's made for HDR on a non-HDR display if there is no option to shut it off?

    Can someone here please help me understand this and set this straight?

    Thanks,

    Bryce



#2 - cyvideo, Senior Member, Sydney, Australia (Joined Feb 2013, 913 posts)
    If you buy a monitor like the one below (there are obviously other brands), then yes, you can get a true 10-bit PQ HDR/HLG display, but you won't be running it through an HDMI connection; it will need to be driven via DisplayPort from a graphics card that can deliver a full 10-bit signal. I'm currently doing that with an Nvidia GTX 1080 Ti. See the JPG below. The output levels can be FULL or LIMITED depending on whether you are working with 0-1023 data levels or 64-940 video levels. The 10-bit color depth selection in the Nvidia control panel is NOT accessible unless the graphics card and the monitor handshake with one another and confirm that there is indeed a true 10-bit graphics path in place.

    Once you have that full 10-bit 4:2:2 workflow in place it's hard to go back to anything else. 10-bit HD-SDI is what I was used to with reference SDI CRT monitors, so the whole 8-bit PC monitor viewing workflow was something I found hard to work with, because I never knew exactly how my footage was going to look. In most cases where I saw color banding in 8-bit and even 10-bit workflows, it was introduced by the limitations of the 8-bit graphics path; the same footage on the 10-bit path exhibited none of the artifacts seen on the 8-bit path.
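    To put some numbers on the FULL vs LIMITED setting, here is a minimal Python sketch (my own illustration, not anything from the Nvidia driver) of how 10-bit luma code values map between the 0-1023 full range and the 64-940 video range:

    Code:
        # Illustrative only: mapping 10-bit luma code values between
        # full range (0-1023) and limited/video range (64-940).

        def full_to_limited(code_full: int) -> int:
            """Map a full-range 10-bit code (0-1023) to limited/video range (64-940)."""
            return round(64 + (code_full / 1023) * (940 - 64))

        def limited_to_full(code_limited: int) -> int:
            """Map a limited/video-range 10-bit code (64-940) back to full range (0-1023)."""
            return round((code_limited - 64) / (940 - 64) * 1023)

        print(full_to_limited(0), full_to_limited(1023))   # 64 940
        print(limited_to_full(64), limited_to_full(940))   # 0 1023

    The point being that the card and the monitor have to agree on which of those two mappings is in play, otherwise blacks and whites land on the wrong code values.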

    One thing to note if you're using Resolve: up until v16.1 you had to have a BM video card or breakout box to be able to output a 10-bit signal. With 16.1 there is a new option, a separate "VIDEO CLEAN FEED"; see the JPG. What I haven't confirmed yet is whether this new output IS actually 10-bit or not. Prior to Resolve 16.1 I know all the GUI outputs were only 8-bit.

    Re the YouTube thing: if the upload has the embedded HDR10, HDR10+ or HLG information, then HDR sets or monitors viewing that feed will see the embedded metadata and display the vision at the correct levels. Note: not all HDR sets can display HLG yet. If you watch these videos on non-HDR sets they will look flatter and less vibrant, because the set doesn't recognize the signal and just treats it as a 709 signal. If the embedded metadata is HLG and an HLG (Hybrid Log Gamma) TV is used, then yes, the full HLG signal will be decoded for display. If the TV is only a 709 HD TV, it will see the 709-level content embedded in the HLG stream and it will look a lot better, but still a bit flatter than a normal 709 encode.

    If you need to get a better handle on this whole HDR hornet's nest, as we had to recently on a broadcast delivery, you may find Mystery Box's walkthrough series pretty helpful. It helped me no end. Here is one of their overviews. BTW the HDR workflow Mystery Box outlines is what we used in Resolve, and it worked perfectly; the program sailed through the network's technical HDR QC without a hitch or query.

    https://www.mysterybox.us/blog/2016/...delivering-hdr

    https://www.eizo.com/news/2018/04/06...ith-hdr-gamma/

    Attached: Nvidia 10-bit.jpg | Resolve output.jpg
    Last edited by cyvideo; 11-12-2019 at 09:55 PM. Reason: added info



#3 - offbeatbryce, Senior Member (Joined Jul 2016, 271 posts)
    Quote Originally Posted by cyvideo
    If you buy a monitor like the one below (there are obviously other brands), then yes, you can get a true 10-bit PQ HDR/HLG display, but you won't be running it through an HDMI connection ... Prior to Resolve 16.1 I know all the GUI outputs were only 8-bit.
    Thanks,

    How come this person on Reddit is telling me colorists and editors don't use HDR monitors because it will look wrong? I'm confused. He even told me that he would never pay an editor or colorist who uses an HDR monitor unless they are using a reference monitor via SDI. I'm confused about where his info comes from. There are fake HDR monitors, correct?

    Can you explain the YouTube issue, with HDR video on YouTube being too dark on non-HDR monitors?



#4 - offbeatbryce, Senior Member (Joined Jul 2016, 271 posts)
    Quote Originally Posted by cyvideo
    If you buy a monitor like the one below (there are obviously other brands), then yes, you can get a true 10-bit PQ HDR/HLG display, but you won't be running it through an HDMI connection ... Prior to Resolve 16.1 I know all the GUI outputs were only 8-bit.

    Here is the full Reddit discussion. Is this guy who keeps giving me this info wrong? I'm trying to understand what he means. He literally told me editors don't use HDR10 monitors. https://www.reddit.com/r/editors/com...tm_name=iossmf



#5 - cyvideo, Senior Member, Sydney, Australia (Joined Feb 2013, 913 posts)
    Sorry to say, but in Australia, putting it politely, we might call guys like him Muppets. Muppets aren't too bright. In other words, don't worry about his viewpoint. It's your judgement that matters once you have researched the subject properly. There are a number of post houses using a range of HDR flat-panel monitors, from TVs to $40K bank-buster studio reference monitors.

    Editors don't use HDR monitors? You've gotta be kidding. Ask him to explain the anomalies in his obviously highly researched blanket statement that editors don't use HDR 10-bit monitors. Ask him why the people in the links below are using HDR panels. Why does one of the leading lights in the world of color imaging and image delivery, Technicolor, use them? I wouldn't even bother to listen to his reasons, because it is obvious from the get-go that he is somewhat under-informed on the subject.

    Senior Colorist Stephen Nakamura Discusses HDR and Panasonic OLED at Deluxe’s Company 3
    https://vimeo.com/301259898

    Technicolor punches high-def video into higher dynamic range
    https://www.cnet.com/news/technicolo...ideo-into-hdr/

    I guess someone should tell Canon not to use HDR monitors for grading!
    How-To: Canon EOS C200 and C200B Video Training Series- Cinema RAW Light & Post Production
    https://www.youtube.com/watch?v=O9DZ87cusiU

    Another "non expert" viewpoint on HDR monitors (couldn't resist the sarcasm) Blake Jones a legend in the grading business.
    https://www.redsharknews.com/product...-proart-pq22uc

    https://www.displayspecifications.com/en/model/830518cd

    I guess these guys are making these monitors for fun. They have both SDI and HDMI 2.0. HDMI 2.0 significantly increases bandwidth to 18 Gbps and includes the following advanced features: resolutions up to 4K@50/60 (2160p), which is 4 times the clarity of 1080p/60 video resolution.

    Up to 512-zone backlights
    17, 24, 31, 55" monitors
    1,000,000:1 contrast ratio
    1000 nits display brightness
    AtomHDR HDR technology
    15 stops dynamic range

    https://www.atomos.com/neon

    available from
    https://www.bhphotovideo.com/c/produ...l?sts=pi&pim=Y

    https://www.provideocoalition.com/X300_Technicolor

    A great variety of HDR monitoring from various viewpoints. But then, who am I to advise you? What do I know? I'm only a film/video cameraman/editor/TD and ex-SMPTE registered engineer with fifty years' experience in the business. His viewpoint is just as valid as mine. It's up to you to learn and research where the truth lies. There is enough information out there to form a valid judgement... which in the end is what you have to do. Just be aware of the 'fake news'; be discerning in your evaluation and research, and cross-check information.

    Chris Young
    Last edited by cyvideo; 11-13-2019 at 04:28 AM. Reason: added info



#6 - offbeatbryce, Senior Member (Joined Jul 2016, 271 posts)
    Quote Originally Posted by cyvideo
    Sorry to say, but in Australia, putting it politely, we might call guys like him Muppets ... There is enough information out there to form a valid judgement... which in the end is what you have to do.
    I understand all of that, but why is it that some HDR displays aren't true HDR and some are? I've read that if a monitor is advertised as HDR but doesn't reach a certain nits range, it's not real HDR. I also don't understand how YouTube is playing back HDR content way too dark on non-HDR monitors. Also, isn't there true 10-bit vs fake 10-bit?

    Here are a couple more articles I found: https://referencehometheater.com/201...nd-statistics/ https://www.pugetsystems.com/labs/ar...-Hardware-987/

    Are these legit articles?

    Also



#7 - cyvideo, Senior Member, Sydney, Australia (Joined Feb 2013, 913 posts)
    Some supposed "10 bit HDR" monitors are actually 8+2 FRC (Frame Rate Control) panels so not are not true 10-bit. They use a dithering algorithm. FRC is a form of temporal dithering which cycles between different color shades with each new frame to simulate an intermediate shade. Unless you know the authenticity of YouTube videos and whether they complied with the YouTube HDR metadata upload requirements you won't know if what you are looking at is true HDR and therefore it's impossible to know whether your screen is receiving HDR. If it is you should get an HDR notification on screen when the signal first hits.

    The best thing to test a screen with is an HDR video from a known reliable HDR source such as the Sony channel. Watch the following video (link below) on an HDR screen and then on a 709 HD screen and you will see a stark difference. The necessary HDR10+ metadata will be recognized by a true HDR panel and the image will be displayed correctly. Then again, most HDR OLED panels range from around 500 to 800 nits, while the accepted domestic LCD panel averages 1000 nits, whereas Dolby Vision is based around 4000 nits. Bear in mind that a lot of people are unaware the Dolby 4000-nit standard is designed to be viewed in almost dark conditions, basically cinema ambient light levels. So HDR vision on one HDR technology can look quite different on a different HDR technology. To test things out, what we did in the early stages was to edit in Resolve using color management set to the HDR10+ standard at 1000 nits and then output Rec.2100 ST 2084 for the broadcaster. In the end that all worked perfectly. For viewing on an HLG-compatible TV we output the file as Rec.2020 HLG (ARIB STD-B67).
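    For reference, those nit figures map onto 10-bit code values through the ST 2084 (PQ) curve. The constants below are the published SMPTE ST 2084 ones; the little Python script around them is just my own illustration:

    Code:
        # SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal 0..1.
        M1 = 2610 / 16384           # 0.1593017578125
        M2 = 2523 / 4096 * 128      # 78.84375
        C1 = 3424 / 4096            # 0.8359375
        C2 = 2413 / 4096 * 32      # 18.8515625
        C3 = 2392 / 4096 * 32      # 18.6875

        def pq_encode(nits: float) -> float:
            y = (nits / 10000.0) ** M1
            return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

        for nits in (100, 1000, 4000, 10000):
            code10 = round(pq_encode(nits) * 1023)   # full-range 10-bit code value
            print(f"{nits:5d} nits -> PQ {pq_encode(nits):.3f} (10-bit code {code10})")
        # Roughly: 100 nits ~ code 520, 1000 nits ~ 769, 4000 nits ~ 923, 10000 nits = 1023.

    Which is why a 1000-nit grade only occupies about three quarters of the PQ signal range.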

    The best thing you can do is edit something to the HDR standard, make sure you embed the necessary HDR10+ metadata in the render, and strictly follow YouTube's "Upload High Dynamic Range (HDR) videos" workflow. Then download the video and inspect it in something like MediaInfo to confirm it is Rec.2020 compliant with the correct color primaries and transfer characteristics. If it is, you have hit a home run.
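    If you'd rather script that check than eyeball it in MediaInfo, ffprobe reports the same color metadata. A rough Python sketch (it assumes FFmpeg is installed, and "downloaded.mp4" is just a placeholder for whatever file you pulled back from YouTube):

    Code:
        # Sketch: confirm a downloaded file carries the expected HDR color metadata.
        # Requires ffprobe (part of FFmpeg) on the PATH; "downloaded.mp4" is a placeholder.
        import json
        import subprocess

        out = subprocess.run(
            ["ffprobe", "-v", "quiet", "-print_format", "json",
             "-show_streams", "-select_streams", "v:0", "downloaded.mp4"],
            capture_output=True, text=True, check=True,
        )
        video = json.loads(out.stdout)["streams"][0]

        print("primaries :", video.get("color_primaries"))   # expect "bt2020"
        print("transfer  :", video.get("color_transfer"))    # "smpte2084" (PQ) or "arib-std-b67" (HLG)
        print("matrix    :", video.get("color_space"))       # expect "bt2020nc"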

    Re your links: the Puget Systems article is nearly three years old. It was very accurate at the time, but display cards and panels have advanced enormously since then, so it is no longer a definitive resource, though still useful. The Home Theater link is again a couple of years old, and as for its reference to "Fake HDR," I think that is a bit over the top. There is nothing 'fake' about 8+2 FRC panels; they are just a cheaper, less capable technology. "Caveat emptor," buyer beware: learn and understand the technology and make your own decision as to whether to buy FRC or full 10-bit technology.

    Chris Young

    Sony HDR video
    https://www.youtube.com/watch?v=aCr2lfMoMPY

    YouTube HDR upload specs.
    https://support.google.com/youtube/answer/7126552?hl=en



#8 - offbeatbryce, Senior Member (Joined Jul 2016, 271 posts)
    Quote Originally Posted by cyvideo
    Some supposed "10-bit HDR" monitors are actually 8+2 FRC (Frame Rate Control) panels, so they are not true 10-bit ... Learn and understand the technology and make your own decision as to whether to buy FRC or full 10-bit technology.

    How would I create the proper metadata in a render using Premiere Pro? I do see an SDR conversion option, but it creates a separate file. I was told you need one file that carries both SDR and HDR metadata so YouTube can detect what display you have, but when I googled this, YouTube says they have no such feature and you just upload HDR to YouTube. And when I go to the link you sent me, everything is dark even if I select regular 1080p.



#9 - cyvideo, Senior Member, Sydney, Australia (Joined Feb 2013, 913 posts)
    Anything to do with Premiere I can't help you with; Premiere 6 was the last time I crossed its path.

    With YouTube, if you upload an HLG file then HDR TVs that can decode HLG will display an HDR signal, albeit an HLG-encoded one. Embedded in an HLG render is the information for Rec.709 TVs. A pure HDR10 signal does not carry the 709 information in that way. Here is a quick, simplified overview of what the signal differences are and how cross-compatible they are:

    Chris Young

    https://www.cnet.com/news/hdr10-vs-d...rmats-compare/



#10 - Tom Roper, Senior Member (Denver, Colorado, Joined Jun 2008, 1,234 posts)
    Quote Originally Posted by offbeatbryce
    Can someone here please help me understand this and set this straight?
    Misinformation about HDR has swirled since I started with it in 2015. I've converted, transformed, and graded HDR in all the major formats: Dolby Vision, HDR10, HDR10+, HLG, YouTube, Amazon. Some of it has played around the globe on Sony television sets in the Best Buy Magnolia showrooms. If you want a recommendation for an HDR display, PM me and I'll tell you about the low-cost one that Technicolor uses. There are other good choices as well.

    As for YouTube, under Help you can keyword "HDR" and find out how they do it, or I'll give you the spoiler here (and this is a YouTube-only workflow): you upload an HDR version of your video, and YT will automatically generate an SDR copy. But any time you convert from a larger color volume (gamma + gamut), i.e. ST 2084/2020, to a smaller one (709), there can be out-of-bounds values (clipping) of chroma and luma. So the YT-generated SDR copy is a reasonable approximation, but if you want the SDR copy to actually look as you intended, you upload a LUT attached to your HDR file with metadata: container, file, metadata, LUT. YouTube will automatically stream the appropriate version for the viewer's device: HDR, SDR, iPhone, tablet, whatever.
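    Those out-of-bounds values are easy to see numerically. A rough sketch: the 3x3 matrix below is the commonly published linear BT.2020-to-BT.709 conversion (per ITU-R BT.2087), and the example color is just an extreme case chosen to show the clipping:

    Code:
        # Illustration: a saturated linear BT.2020 color falls outside BT.709.
        M = [
            [ 1.6605, -0.5876, -0.0728],
            [-0.1246,  1.1329, -0.0083],
            [-0.0182, -0.1006,  1.1187],
        ]

        def bt2020_to_bt709(rgb2020):
            return [sum(m * c for m, c in zip(row, rgb2020)) for row in M]

        pure_2020_green = [0.0, 1.0, 0.0]            # perfectly legal in BT.2020
        print(bt2020_to_bt709(pure_2020_green))      # ~[-0.588, 1.133, -0.101]
        # Negative and >1.0 components are out of gamut in 709 and get clipped,
        # which is why an automatic HDR->SDR conversion can shift saturated colors
        # unless a LUT tells it how you want them rolled off.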

    When I first started, I had a display that supported HDR but I didn't have a proper way of grading the content, so I learned by trial and error: creating and grading in SDR, exporting in HDR and seeing how that looked. I would have two versions up at the same time on different displays, the SDR and the HDR. It was a pain, but I got my first break from a Spanish filmmaker, and that work went on to be sold to Sony and played around the globe. But a trial-and-error workflow couldn't go on like that. Then I bought a BMD DeckLink card and it got much easier. Resolve has gotten so good today with its tone mapping that it's possible to grade a project entirely in SDR and make an HDR conversion that is excellent, although I don't recommend it. It's possible, but backwards.

    From creation to delivery there are a lot of potential traps along the way, most of them related to levels mismatches: how the levels can be interpreted differently from encoding to playback, from camera log to NLE, in the intermediate, HEVC, H.264, DNxHD, sometimes between RGB and YUV, and in playback too: Edge, Explorer, TV apps, wifi streaming, USB, HDMI. When it looks too dark, or it looks too bright, it is assuredly a levels mismatch somewhere in the chain from the creator's desk to the viewer's TV. Levels can go from Full (PC) to Video (TV) and back at multiple places in the chain. The biggest challenge is making sure the levels are correct for as many devices as possible, some of which you are not in control of, like the viewer's TV. And it takes knowledge of how it's going to be input; it's different for a USB flash drive (Full) than for HDMI (Video). And if the levels are mismatched out of your control, you want the content to still not look too bad: not overly crushed, nor overly grayed.
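    As a quick numeric illustration of that "crushed vs grayed" failure mode, building on the 10-bit ranges mentioned earlier in the thread (again, a toy example of my own, not anyone's playback code):

    Code:
        # Toy example: what a Full/Video levels mismatch does to black and white.
        # 10-bit video (limited) range puts reference black at 64 and white at 940.

        def shown_as_full(code_video: int) -> float:
            """A video-range code displayed by a device expecting full range (0-1023)."""
            return code_video / 1023

        def shown_as_video(code_full: int) -> float:
            """A full-range code displayed by a device expecting video range (64-940)."""
            return (code_full - 64) / (940 - 64)

        # Video-range file, player assumes full range: blacks lift, whites dim -> "grayed".
        print(shown_as_full(64), shown_as_full(940))    # ~0.063 and ~0.919 instead of 0.0 and 1.0
        # Full-range file, player assumes video range: ends overshoot and clip -> "crushed".
        print(shown_as_video(0), shown_as_video(1023))  # ~-0.073 and ~1.095, clipped to 0..1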


