    #11 · Senior Member · Joined Feb 2013 · Sydney, Australia · 719 posts
    Nice to have another opinion from someone who has been through the minefield. As you say, there are a lot of minefields, and like you, a lot of it has been trial and error from my end. How else do you get there? You learn from others. There are so many permutations and so many opinions out there that, from all of that, you just have to work out a workflow that does it for you and your client/network, both technically and aesthetically.

    Chris Young


     

    #12 · Senior Member · Joined Jul 2016 · 255 posts
    Quote Originally Posted by cyvideo View Post
    [quotes post #11 above in full]
    Just sent you a message.


     

    #13 · Senior Member · Joined Jun 2008 · Denver, Colorado · 943 posts
    Quote Originally Posted by cyvideo View Post
    [quotes post #11 above in full]
    Exactly, and there are so many places where it can go wrong, from the metadata all the way to the display. For example, Windows 10 will play HDR10 files, but only if HDR is enabled in the settings. You can't just give someone the file; they'll come back complaining about the grade, "it looks flat!", when it has nothing to do with the grade. They just aren't configured to see it in HDR, and they don't know that. Google Chromecast Ultra and TV apps: some support HDR and some don't. A lot of people don't even know what metadata is, but without it the TV doesn't switch to the correct mode. Some people get fake HDR because the TV itself has a setting that turns SDR into an HDR lookalike. The most reliable route for most people is probably the Netflix or Amazon app in their smart TV; YouTube is going to be hit or miss, with the misses outnumbering the hits.

    There is a lot of material on YouTube that is mislabeled as HDR or poorly done, some of it mine. I had one of the very first five or ten, actually I think it was number 3 or 4, YouTube HDR videos from when they made the feature available, around 2016 as I remember. I can hardly stand looking at it now, it seems so bad. There is a lot of horribly done YouTube HDR out there, much of it blazing neon that some might find pretty if it weren't sunburning their faces, while others complain it's too dark. It's a piece of cake now by comparison. But since most people will still view your video in SDR, it's far more important that the SDR version looks good, and because that version comes from YouTube's HDR-to-SDR down-conversion, the 3D cube LUT attached to the file needs to be created properly.
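    As a rough illustration only, here is a minimal sketch of checking whether a file actually carries that signalling before handing it to anyone; it assumes ffprobe is on the PATH, and the filename is just a placeholder:

    # Minimal sketch: ask ffprobe which colour metadata the first video stream carries.
    import json
    import subprocess

    def probe_hdr_flags(path: str) -> dict:
        cmd = [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=color_primaries,color_transfer,color_space",
            "-of", "json", path,
        ]
        out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
        stream = json.loads(out)["streams"][0]
        return {
            "primaries": stream.get("color_primaries"),  # expect "bt2020" for HDR10/HLG
            "transfer": stream.get("color_transfer"),    # "smpte2084" (PQ) or "arib-std-b67" (HLG)
            "matrix": stream.get("color_space"),         # usually "bt2020nc"
        }

    print(probe_hdr_flags("graded_master.mkv"))  # placeholder filename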

    As you say, it is truly a minefield, or was. It's gotten a lot easier now, and the tools are worlds better, but a comfort level with the command-line apps is pretty much mandatory for anyone doing HDR seriously: ffmpeg, mkvmerge, x265, x264, even making your own libx builds, muxing your own audio with video, inserting metadata and attaching files. And they need to read up on the white papers from Dolby, BBC/EBU and SMPTE, and let's not forget HDMI 2.0a, as a minimum starting point.
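    For anyone who hasn't lived in those tools, a hedged sketch of what that command-line side can look like: an HDR10 encode with libx265 carrying the static metadata, then mkvmerge attaching the down-conversion LUT. The filenames and the mastering-display numbers (a generic P3-D65, 1000-nit example) are placeholders rather than a recipe, and it assumes ffmpeg built with libx265 plus mkvmerge on the PATH:

    # Sketch of an HDR10 encode plus LUT attachment, not a project-specific recipe.
    import subprocess

    SRC = "graded_master.mov"       # placeholder: 10-bit PQ/BT.2020 master
    HEVC_OUT = "hdr10_video.mkv"
    FINAL = "hdr10_with_lut.mkv"
    LUT = "hdr_to_sdr.cube"         # placeholder: the HDR-to-SDR conversion LUT

    # Static HDR10 metadata: SMPTE ST 2086 mastering display plus MaxCLL/MaxFALL,
    # here an example P3-D65 display mastered to 1000 nits.
    x265_params = (
        "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:"
        "master-display=G(13250,34500)B(7500,3000)R(34000,16000)"
        "WP(15635,16450)L(10000000,1):max-cll=1000,400"
    )

    subprocess.run([
        "ffmpeg", "-i", SRC,
        "-c:v", "libx265", "-pix_fmt", "yuv420p10le",
        "-color_primaries", "bt2020", "-color_trc", "smpte2084", "-colorspace", "bt2020nc",
        "-x265-params", x265_params,
        "-c:a", "copy",
        HEVC_OUT,
    ], check=True)

    # Attach the LUT as a file inside the Matroska container so it travels with the video.
    subprocess.run([
        "mkvmerge", "-o", FINAL,
        "--attachment-mime-type", "application/octet-stream",
        "--attach-file", LUT,
        HEVC_OUT,
    ], check=True)

    The master-display string uses x265's units (0.00002 for the chromaticities, 0.0001 cd/m² for the luminance range), which is why the numbers look so large.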


     

    #14 · Senior Member · Joined May 2010 · 2,430 posts
    Back to the question of monitors


     

     

    #16 · Senior Member · Joined Jul 2016 · 255 posts
    Quote Originally Posted by Razz16mm View Post
    This is true HDR and true 10-bit, I take it?


     

    #17 · Senior Member · Joined Feb 2013 · Sydney, Australia · 719 posts
    Quote Originally Posted by Razz16mm View Post
    Back to the question of monitors
    That's why I linked to this in post #5. He is a credible source.

    Chris Young


     

    #18 · Senior Member · Joined Feb 2013 · Sydney, Australia · 719 posts
    Yes it is, bar brightness levels, where it sits at the lower end, peaking at 330 nits. That is why high-end outfits like Technicolor will use a high-end reference panel like the one below

    https://pro.sony/en_AR/products/broa...rs/bvm-x300-v2

    as their primary grading monitor, but will also use high-end consumer panels like the 65-inch Samsung JS9500 SUHD TVs as the final arbiter of what the end viewer is going to see. If Technicolor considers panels like the Samsung JS9500 good enough to judge the final result from an end-viewer point of view, then that is good enough for me, as I can't justify $30K-plus for an HDR reference monitor.

    I do work for a government production facility where we use two of the BVM-X300 monitors, and they are indeed a beautiful thing, but they have their annoyances, like going into a sleep mode to protect the OLED when they don't detect any movement on screen. That can be very annoying when working on docos that use a lot of stills, à la Ken Burns. The Technicolor link I provided earlier shows how two of these JS9500s sit front and center in the grading suite.

    A lot of monitor snobbery permeates this industry, as with a lot of its other aspects: sensor size, number of photosites, number of Mbps and so on. You are doing the right thing. Continue your research and, through many knowledgeable posts in various places, such as Tom's post, you will eventually find the path to a decision. Filter carefully everything you research and you will find the answers you are after.

    Chris Young

    "The demo

    Both Samsung TVs were running the same demo reel. It consisted of a series of clips, commissioned by Technicolor, and edited and processed in house. Each version of this reel was created from the same camera files. Each version was "tuned" to look as good as possible in its specific color space and dynamic range.

    https://www.cnet.com/news/technicolo...ideo-into-hdr/

    "The left JS9500 was running a standard image, with the Rec 709 color space It was essentially HD-quality, and looked good. The right JS9500 showed a split screen image. The right half was true HDR content displayed in the P3 color space. It looks fantastic, as HDR can, with rich cyan skies, deep red bathing suits, bright highlights on metal that really pop and more."

    [Attached images: Technicolor grading suite monitors]


     

    #19 · Batutta · Senior Member · Joined Mar 2005 · Planet 10 · 7,462 posts
    Having just set up a new 4K HDR projector and checked various media, HDR seems wildly inconsistent from film to film and a giant pain in the ass to deal with. I have to change my brightness and contrast settings all over the place with every movie or program I put on to avoid clipping whites. Also, some devices will only stream at 60Hz with 4:2:0 chroma subsampling, which causes awful magenta blocking on white surfaces. I tried watching The Empire Strikes Back in 4K HDR on Disney Plus, and the snow was a pink blotchy mess. I saw this at first on The Irishman too, but I streamed it from another device that can play it at 24Hz, which switches it to 4K 4:2:2 HDR and makes the magenta blocking disappear. The Disney Plus app only works at 60Hz.
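    For what it's worth, the limitation is usually link bandwidth rather than the grade. A very rough, active-pixels-only calculation (real HDMI timing adds blanking, so the true requirements are higher) of which 4K 10-bit modes fit inside HDMI 2.0's roughly 14.4 Gb/s of usable data rate:

    # Back-of-envelope arithmetic only: active pixels, no blanking intervals.
    HDMI20_USABLE = 14.4e9  # bits/s left on an 18 Gb/s HDMI 2.0 link after 8b/10b coding

    def payload_gbps(width, height, fps, bit_depth, samples_per_pixel):
        # samples_per_pixel after chroma subsampling: 3.0 (4:4:4), 2.0 (4:2:2), 1.5 (4:2:0)
        return width * height * fps * bit_depth * samples_per_pixel / 1e9

    modes = [("4K60 4:4:4", 60, 3.0), ("4K60 4:2:2", 60, 2.0),
             ("4K60 4:2:0", 60, 1.5), ("4K24 4:2:2", 24, 2.0)]
    for label, fps, spp in modes:
        rate = payload_gbps(3840, 2160, fps, 10, spp)
        verdict = "fits" if rate * 1e9 < HDMI20_USABLE else "exceeds"
        print(f"{label}: {rate:.1f} Gb/s ({verdict} HDMI 2.0)")

    So at 60Hz something has to give on chroma, and whether you end up with 4:2:2 or 4:2:0 (and how cleanly the display handles it) is down to the particular device and app.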
    "Money doesn't make films...You just do it and take the initiative." - Werner Herzog


     

    #20 · Senior Member · Joined Apr 2013 · Vietnam · 155 posts
    Quote Originally Posted by offbeatbryce View Post
    Hello,

    Hopefully I'm posting this in the right section.

    I am looking into buying an HDR 10-bit 4K monitor, but I've read many of them are not true HDR or true 10-bit and are fake knock-offs. How do I verify which is really 10-bit and which is really HDR? I know an editor who has this monitor: https://www.bhphotovideo.com/c/produ...ide.html/specs Is this really true 10-bit and true HDR? Also, I was told by some random person on Reddit that no real colorist should ever color on anything but a reference monitor, because HDMI and DisplayPort never output true HDR or true 10-bit regardless of what monitor is sold for computers, and that even if they did, the colors won't be right on a regular computer monitor or standard TV, and that reference monitors require SDI for real, true output. This slightly confuses me because no computer graphics card has SDI to my knowledge. My colorist friend is telling me to ignore this person on Reddit, that they don't know what they are talking about.

    Premiere Pro has SDR conversion built into the export settings as an option specifically for HDR-to-SDR content. If this option is there but is totally useless, as the Redditor tells me, then why is it there?

    Also, how is YouTube supposedly supporting HDR content when it's all one file uploaded to YouTube? I tried watching a YouTube video with the HDR setting turned on on a non-HDR monitor, and all the brightness levels were too dark. There was no way to fix this. So how can anyone on YouTube watch a video that's made for HDR on a non-HDR display if there is no option to shut it off?

    Can someone here please help me understand this and set this straight?

    Thanks,

    Bryce
    The monitor you listed only has 350 nits of brightness, far below the required minimum for HDR. As far as I can tell, it also lacks FALD, so it is absolutely not HDR; LCD monitors require full-array local dimming to reproduce rich blacks. If you are serious about HDR grading, you will want to work in Resolve - I don't think Adobe will cut it. If you have a Mac, you'll also need to set aside an extra grand for an I/O device to connect the monitor. Practically all mirrorless cameras today shoot HLG, which can be uploaded directly to YouTube for a gorgeous HDR image, assuming white balance and exposure are correct to begin with.
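    One caveat, offered as a sketch rather than a recipe: if the clip gets trimmed or re-encoded on the way to YouTube, the HLG flags have to survive or it won't be treated as HDR. Something along these lines, assuming ffmpeg built with libx265 and placeholder filenames, keeps the BT.2020/HLG signalling on a re-encode:

    # Re-encode a camera HLG clip while keeping the BT.2020 / HLG colour flags.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "camera_hlg_clip.mov",   # placeholder input
        "-c:v", "libx265", "-pix_fmt", "yuv420p10le", "-crf", "18",
        "-color_primaries", "bt2020",
        "-color_trc", "arib-std-b67",            # HLG transfer characteristic
        "-colorspace", "bt2020nc",
        "-c:a", "copy",
        "hlg_for_youtube.mkv",                   # placeholder output
    ], check=True)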

    The Asus PA32UC has 1,000 nits brightness, 384 local dimming zones and can handle HDR10. It runs $1,200. AFAIK, it does not do HLG or Dolby Vision. It is also FRC. For around $4,000, the UCX has greater brightness, 1,152 local dimming zones, true 10-bit and can handle all flavors of HDR except HDR10+.

    The PQ22UC has a peak brightness of only 330 nits, so technically does not even meet the minimum standard for HDR displays.
    Last edited by jonpais; 12-04-2019 at 07:33 AM.


     
