4k vs more DR/color resolution w/ 1080p....moot point?

  1. #1: hojomo (Member, Seattle)
    There's been quite a bit of back and forth on this subject in various threads. I don't have the greatest understanding of color resolution in video/codecs, or of the surrounding variables and technology that deliver it. I've recently jumped into the video world as a longtime photographer.

    Anyway, what occurred to me is to ask this question: wouldn't any given camera system -- say one with an APS-C sensor, for the sake of argument -- that can deliver 4K 24p 8-bit 4:2:2 (at a decent bitrate/codec to allow for grading) also be capable of higher color resolution/bitrate 1080p 24p? Isn't it ultimately just a matter of having the overall processing capability and software needed to produce any given quality of video (framerate, pixel resolution, bitrate, color resolution) from the raw sensor data?

    A lot of comments cast 4K and better 1080p as mutually exclusive. I can imagine a myriad of reasons any given camera manufacturer might or might not allow for both, but I would appreciate clarification on whether my assumptions here are correct.

    I've also read a few times that downscaling 4K video to 2K increases the color resolution... is this correct? It makes sense to me that the sampling would increase but not the bit depth... so what does that really give you? I'll admit I still don't quite understand how color bit depth relates to color sampling (http://www.dvxuser.com/articles/colorspace/) in video. Are they related or entirely separate?

    Bit depth makes sense to me coming from the photography world; color sampling is still a murky one.


  2. #2
    Quote Originally Posted by hojomo:
    wouldn't any given camera system -- say one with an APS-C sensor, for the sake of argument -- that can deliver 4K 24p 8-bit 4:2:2 (at a decent bitrate/codec to allow for grading) also be capable of higher color resolution/bitrate 1080p 24p? Isn't it ultimately just a matter of having the overall processing capability and software needed to produce any given quality of video (framerate, pixel resolution, bitrate, color resolution) from the raw sensor data?
    Yes.

    Quote: A lot of comments cast 4K and better 1080p as mutually exclusive. I can imagine a myriad of reasons any given camera manufacturer might or might not allow for both, but I would appreciate clarification on whether my assumptions here are correct.
    Pixel count and pixel size are at odds with each other, as you would guess. More pixels bring more detail, but bigger pixels bring (a) more dynamic range and (b) better light sensitivity. It's a balance. I've posted enough on DVXUser, and these questions return so often, that I can now just post links to my answers in other threads. Here ya go! http://www.dvxuser.com/V6/showthread...=1#post2460586
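    To put rough numbers on that tradeoff, here is a back-of-the-envelope Python sketch (mine, not from this thread; the sensor figures are made-up but in a realistic range). It assumes full-well capacity scales with pixel area, so doubling the pixel pitch quadruples the charge a pixel can hold before clipping, which buys about two stops of dynamic range at the same read noise:

        import math

        def dynamic_range_stops(full_well_e, read_noise_e):
            # DR in stops: log2 of the ratio between the largest signal a
            # pixel can hold (full well, in electrons) and its noise floor.
            return math.log2(full_well_e / read_noise_e)

        # Hypothetical pixels; full-well capacity scales with area (pitch^2).
        small = {"pitch_um": 2.0, "full_well": 10_000, "read_noise": 3.0}
        big   = {"pitch_um": 4.0, "full_well": 40_000, "read_noise": 3.0}

        for p in (small, big):
            print(f"{p['pitch_um']} um pixel: "
                  f"{dynamic_range_stops(p['full_well'], p['read_noise']):.1f} stops")
        # 2.0 um pixel: 11.7 stops
        # 4.0 um pixel: 13.7 stops -> same sensor area, 1/4 the pixel count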

    Quote: I've also read a few times that downscaling 4K video to 2K increases the color resolution... is this correct?
    Yes.

    Quote: It makes sense to me that the sampling would increase but not the bit depth... so what does that really give you?
    More accurate color.

    Quote: I still don't quite understand how color bit depth relates to color sampling.
    Depth is how many colors you have to paint with for the whole picture. 8-bit means you have about 16.7 million colors, 10-bit about a billion, and 12-bit about 69 billion.

    Sampling is whether each pixel carries its own color, or whether pixels are grouped into blocks of 2 or 4 that share one color (each pixel still keeps its own brightness). http://en.wikipedia.org/wiki/Chroma_subsampling
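    To make both ideas concrete, here is a quick Python sketch (my illustration, not the poster's) that computes the palette sizes above and the chroma sample counts behind the J:a:b notation:

        # Bit depth: each of the three channels gets 2**bits levels,
        # so the total palette is 2**(3 * bits) colors.
        for bits in (8, 10, 12):
            print(f"{bits}-bit: {2 ** (3 * bits):,} colors")
        # 8-bit:  16,777,216       (~16.7 million)
        # 10-bit: 1,073,741,824    (~1 billion)
        # 12-bit: 68,719,476,736   (~69 billion)

        # Chroma subsampling: chroma (color) samples kept per 4x2 block of
        # pixels; every block always keeps all 8 luma (brightness) samples.
        schemes = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2}
        for name, kept in schemes.items():
            print(f"{name}: {kept} of 8 pixels carry their own color sample")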


  3. #3: Samuel H (Senior Member, Madrid, Spain)
    A 4K image with 8-bit 4:2:0 color has enough information to give you a 2K image with 10-bit luminance and 8-bit 4:4:4 color.
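    A minimal NumPy sketch of why that works (my illustration, assuming uncompressed planar YUV; as noted later in the thread, compression weakens the claim). Summing each 2x2 block of 8-bit luma gives values from 0 to 1020, i.e. 10 bits, and the 4K 4:2:0 chroma planes are already at 2K resolution, so they map one-to-one onto the downscaled pixels:

        import numpy as np

        # Hypothetical uncompressed 4K 4:2:0 frame (random stand-in data).
        H, W = 2160, 3840
        y4k = np.random.randint(0, 256, (H, W), dtype=np.uint16)  # 8-bit luma
        cb = np.random.randint(0, 256, (H // 2, W // 2))          # chroma planes are
        cr = np.random.randint(0, 256, (H // 2, W // 2))          # already 1920x1080

        # Sum each 2x2 luma block: four 0..255 samples span 0..1020 = 10 bits.
        y2k_10bit = (y4k[0::2, 0::2] + y4k[0::2, 1::2]
                     + y4k[1::2, 0::2] + y4k[1::2, 1::2])

        print(y2k_10bit.shape)               # (1080, 1920)
        print(int(y2k_10bit.max()) <= 1020)  # True: fits in 10 bits
        print(cb.shape == y2k_10bit.shape)   # True: one chroma sample per
                                             # output pixel -> 4:4:4 (still 8-bit)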


  4. #4: Senior Member (Sweden)
    I've been much more interested in a 2K 10-bit image with a decent but size-efficient codec, nice motion, and clean blacks than in 4K. But if 4K really gives you all this color information, downscaling could be an alternative, I suppose. I also fear that early 4K cameras with compressed codecs will deliver very smeared motion, and that all this information will not really be captured consistently.

    Also, the look of 4K seems to be an aesthetic choice even if you downscale. I wish we would get something like the D5300 image on a full-frame sensor with a slightly better codec before everyone goes 4K; it would be really great for my type of work.


  5. #5: joe 1008 (Senior Member, Barcelona)
    I'm somewhat intrigued by the possibility of getting a kind of 10-bit HD image out of a 4K 8-bit image. It might be as simple as converting the 4K H.264 files into HD ProRes, with the extra benefit of using more of the codec's possible colour space.


  6. #6: Senior Member
    Quote Originally Posted by joe 1008:
    I'm somewhat intrigued by the possibility of getting a kind of 10-bit HD image out of a 4K 8-bit image. It might be as simple as converting the 4K H.264 files into HD ProRes, with the extra benefit of using more of the codec's possible colour space.
    You have a little more room to manipulate color after converting to ProRes, but highly compressed codecs like 4:2:0 H.264 are very good at throwing out every bit of data not needed to represent the image as it was encoded. There is no extra data there to work with. Start pushing the color and you quickly get ugly artifacts like color bleed and macroblocking.
    Oversampling 4K for HD can give you a richer gamut within the limits of REC709, but you have to shoot to at least a 10-bit 4:2:2 codec at 50 Mbps or greater to gain any real advantage in post.
    Nothing beats uncompressed raw for a gradable color gamut.


  7. #7: Thomas Smet (Senior Member, Colorado)
    The 10-bit part may be a bit of a stretch, but you really do get much better quality 1080p from a 4K source.

    1. 4K 4:2:0 scaled down to HD gives you close to 4:4:4 1080p. It isn't perfect, true 4:4:4, because of how the scaler uses surrounding pixels to calculate each value, but it is pretty darn close, especially if you smooth the chroma first like After Effects or 5DtoRGB does (see the sketch after this list).
    2. Compression artifacts tend to disappear when you scale an image down to 1/4 the area. For example, a highly compressed 4K video scaled down to 1080p loses most of its artifacts and blocking: 2x2 macroblocks get canceled out, 4x4 macroblocks become 2x2 blocks, and so forth. You really do end up with 1080p that looks like it was recorded at a very high bitrate.
    3. Aliasing and moiré are for the most part gone when shooting 4K. This lets users downscale to 1080p, which typically gives a much cleaner down-conversion than the binning process in most DSLRs. Even if DSLRs added a higher-quality 1080p recording mode, they would still be plagued with aliasing and moiré.
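    A rough NumPy sketch of point 1 (my code; the box filter is just a stand-in for the bilinear chroma smoothing that tools like 5DtoRGB apply). Luma gets averaged 2x2, and the 4:2:0 chroma planes are already at 1080p, so after smoothing each output pixel has its own chroma sample:

        import numpy as np

        def smooth_chroma(c):
            # Light box smoothing of a chroma plane (stand-in for the
            # fancier chroma filtering in 5DtoRGB / After Effects).
            p = np.pad(c.astype(np.float32), 1, mode="edge")
            return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2]
                    + p[1:-1, 2:] + p[1:-1, 1:-1]) / 5.0

        def downscale_2x(plane):
            # Average each 2x2 block: the oversampling step for luma.
            p = plane.astype(np.float32)
            return (p[0::2, 0::2] + p[0::2, 1::2]
                    + p[1::2, 0::2] + p[1::2, 1::2]) / 4.0

        # 4K 4:2:0 frame: full-res luma, half-res chroma (random stand-ins).
        y = np.random.randint(0, 256, (2160, 3840))
        cb = np.random.randint(0, 256, (1080, 1920))

        y_hd = downscale_2x(y)      # (1080, 1920) luma
        cb_hd = smooth_chroma(cb)   # (1080, 1920): one chroma sample per HD pixel
        print(y_hd.shape == cb_hd.shape)  # True -> near-4:4:4 at 1080p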


  8. #8: joe1946 (Senior Member, Millstone, NJ USA)
    Color: Ultra-Color Spectrum and True 10-Bit Color

    VIZIO’s Ultra-Color Spectrum fully supports true uncompressed 10-bit color for the most life-like colors ever seen on a TV. Unlike most TVs, which can only reproduce a conventional color gamut of Rec 709, Ultra-Color Spectrum in the Reference Series widens the color gamut, rendering colors closer to a level the human eye can discern. Full 10-bit color enables over one billion color shades for the most life-like imagery. Collectively, these technologies push the boundaries of color and contrast to create an entirely new level of realism.

    Contrast: Active LED Zones x 384 and High Dynamic Range

    Along with the ultra-bright, 800 nits backlight, the Reference Series features Full-Array LED backlighting with 384 individual local dimming zones known as Active LED Zones, which dynamically adjust to match on-screen content for the deepest, purest black levels and highest contrast. VIZIO was one of the industry’s first to implement Full-Array LED backlighting in its TV collections, and over the years has continued to invest in the technology to advance local dimming capabilities to new levels of performance and definition. For 2014, all VIZIO TV product lines will support Active LED Zones, with the Reference Series serving as the pinnacle of local dimming performance with 384 zones. In addition, Active Pixel Tuning delivers intelligent brightness adjustments at the individual pixel level for increased picture accuracy and contrast.

    Analyzing the lightest and darkest areas on the screen and extending the dynamic range without loss of detail in bright or dark areas, High Dynamic Range creates a contrast range with true-to-life intensity, more accurately reproducing the nuances of the picture and revealing fine details found in real scenes. Working together with an Ultra-Bright LED Backlight of 800 nits, the Reference Series delivers an unprecedented range of contrast with nearly twice the luminance of standard HDTVs.
    http://store.vizio.com/news/vizio-un...ference-series


  9. #9: Razz16mm (Senior Member)
    Quote Originally Posted by Thomas Smet:
    The 10-bit part may be a bit of a stretch, but you really do get much better quality 1080p from a 4K source.

    1. 4K 4:2:0 scaled down to HD gives you close to 4:4:4 1080p. It isn't perfect, true 4:4:4, because of how the scaler uses surrounding pixels to calculate each value, but it is pretty darn close, especially if you smooth the chroma first like After Effects or 5DtoRGB does.
    2. Compression artifacts tend to disappear when you scale an image down to 1/4 the area. For example, a highly compressed 4K video scaled down to 1080p loses most of its artifacts and blocking: 2x2 macroblocks get canceled out, 4x4 macroblocks become 2x2 blocks, and so forth. You really do end up with 1080p that looks like it was recorded at a very high bitrate.
    3. Aliasing and moiré are for the most part gone when shooting 4K. This lets users downscale to 1080p, which typically gives a much cleaner down-conversion than the binning process in most DSLRs. Even if DSLRs added a higher-quality 1080p recording mode, they would still be plagued with aliasing and moiré.
    The problem is not the 4K downscale; the problem is that 4:2:0 AVC encoding throws away 90%+ of your data. You don't get the benefit of oversampling in the color data. You do get better image resolution and cleaner images with fewer artifacts, but it won't widen the color gamut and it doesn't give you more grading room.

    As far as any form of 1080p HD video is concerned, you can't cheat the color gamut of REC709, because that is all the monitor can show, whether 8-bit or 10-bit. Dynamic contrast manipulation by the TV does not give you anything that is not already contained in the REC709-encoded video. It may or may not look better than straight video into a properly calibrated monitor. There is no standard for active contrast manipulation; it is different for every TV manufacturer. You would not want to grade using such a setting, because you don't know what it will look like on any monitor other than your own. This is just marketing hype.
    If you look at test reports for most consumer HDTVs, you will find that many will not actually reproduce 100% of the REC709 gamut, much less anything broader.

    REC2020-compliant 4K UHD TVs will show a much broader color gamut, but most of the first-generation 4K TVs still only cover the REC709 gamut.
    The REC709 HD standard will not change.

    I will add that I don't see many samples of graded video that even come close to using the full gamut of REC709.


  10. #10: Samuel H (Senior Member, Madrid, Spain)
    * You're right, my statement that "a 4K image with 8-bit 4:2:0 color has enough information to give you a 2K image with 10-bit luminance and 8-bit 4:4:4 color" is only 100% true if you record uncompressed.
    * On my NEX-5N (1920x1080 at 24 fps with 8-bit 4:2:0 color encoded into 24 Mbps), the 4:2:0 part throws away half the original information, then the codec throws away 96% of what it receives (quick arithmetic check below). Mostly redundant stuff, really...
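    The arithmetic behind those two percentages, as a quick Python sanity check (my numbers, derived from the frame size and bitrate above):

        w, h, fps = 1920, 1080, 24

        # Uncompressed bitrates in Mbps: 4:4:4 carries 3 bytes per pixel,
        # 4:2:0 carries 1.5 (full-res luma + two quarter-res chroma planes).
        rgb_444 = w * h * 3 * fps * 8 / 1e6    # ~1194 Mbps
        yuv_420 = w * h * 1.5 * fps * 8 / 1e6  # ~597 Mbps

        print(yuv_420 / rgb_444)  # 0.5   -> 4:2:0 keeps half the data
        print(1 - 24 / yuv_420)   # ~0.96 -> a 24 Mbps codec drops ~96% of the rest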

