8K video over IP has been achieved
#1
Senior Member · Join Date: May 2010 · Posts: 2,595
New JPEG XS codec for 8K streaming and production. Visually lossless at up to 6:1 compression.
    https://www.redsharknews.com/8k-over...d-with-jpeg-xs


    2 out of 2 members found this post helpful.

#2
Senior Member ahalpert · Join Date: Apr 2011 · Location: NYC · Posts: 2,766
Interesting. I wonder how, or if, this relates to H.266, as both are Fraunhofer inventions.



#3
Senior Member · Join Date: May 2010 · Posts: 2,595
H.266 is the VVC codec for high-compression streaming: roughly half the bandwidth of H.265 for similar image quality.
JPEG XS is probably more closely related to JPEG 2000, aimed at very low-latency streaming applications, but at much higher data rates. It will require 10G networks.
    https://www.cnx-software.com/2020/07...ents-of-h-265/


    1 out of 1 members found this post helpful.

#4
Senior Member · Join Date: May 2011 · Posts: 407
    JPEG XS and H.266 are two different codecs for two different purposes.

JPEG XS is a lightly compressed (visually lossless) codec with very low latency. It is a line-based codec, so it compresses a few lines of video at a time rather than whole frames, with a latency of about 32 lines, which works out to about 0.12 ms in 8K 60. This is not really a codec intended for live streaming, as the data rates are way too high: at a 6:1 compression ratio, 8K 60 video would have a bitrate of about 8 Gbps, which is much too high for a delivery codec over the internet. But since it does get the bandwidth under 10 Gbps, it makes it possible to send the video over local IP networks using 10G Ethernet, which is significant since 10G Ethernet equipment is fairly widely available and significantly less expensive than 40G or 100G Ethernet.
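For anyone who wants to sanity-check those figures, here's a quick back-of-the-envelope sketch. The 24 bits per pixel is my assumption (e.g. 8-bit 4:4:4); it happens to reproduce the ~8 Gbps number above.

```python
LINES_PER_FRAME = 4320   # 8K UHD vertical resolution
WIDTH = 7680             # 8K UHD horizontal resolution
FPS = 60

# Latency of a 32-line compression window
line_time_ms = 1000 / (FPS * LINES_PER_FRAME)
print(f"latency ~ {32 * line_time_ms:.2f} ms")        # ~0.12 ms

# Bitrate after 6:1 compression, assuming 24 bits/pixel
raw_gbps = WIDTH * LINES_PER_FRAME * FPS * 24 / 1e9   # ~48 Gbps uncompressed
print(f"compressed ~ {raw_gbps / 6:.1f} Gbps")        # ~8 Gbps, fits a 10G link
```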

Codecs like JPEG XS that are designed for use in SMPTE 2110 video-over-IP transport are really intended as an alternative to SDI. So where you are starting to see this type of video over IP used is in multi-camera live production environments: production trucks, broadcast centers, places where you have a lot of different video signals that need to be sent to a lot of different places (from cameras to video switchers, recorders, multi-viewers, etc.).

There are a few reasons why interest in IP video transport is on the rise for these types of applications. One is cost: Ethernet networking hardware is much more widely used than SDI hardware, which is specific to the video industry, and that helps drive down the cost of higher-bandwidth networking infrastructure. Another advantage is routing flexibility. For example, if you can send one 8K 60 (JPEG XS) video signal over a 10G Ethernet link, you could also send eight 4K 30 video signals (or 32 HD signals) over the same link. A traditional SDI video router would require a separate SDI port and cable for each video signal, whereas with video over IP you could do the same thing with a single port of an off-the-shelf 10G Ethernet router. A final reason for pursuing video over IP is more practical: there aren't really standards or hardware for transmitting anything higher than 4K 60 video over SDI (which uses 12G SDI). So most of the SDI hardware that supports 8K video signals uses four 12G SDI cables to do so, and obviously there is interest in a single-cable approach.
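That routing arithmetic just scales with pixel rate. A rough sketch, using the ~8 Gbps estimate for one 8K 60 JPEG XS stream from above (real SMPTE 2110 flows add RTP/IP packet overhead, so treat these as ballpark numbers):

```python
BASE_8K60_GBPS = 8.0   # one 8K 60 JPEG XS stream, from the estimate above
pixel_rate_vs_8k60 = {"8K 60": 1.0, "4K 30": 1 / 8, "HD 30": 1 / 32}

for name, scale in pixel_rate_vs_8k60.items():
    per_stream = BASE_8K60_GBPS * scale
    count = int(BASE_8K60_GBPS / per_stream)
    print(f"{name}: {per_stream:.2f} Gbps/stream -> {count} streams in one 8K 60's bandwidth")
# 8K 60: 1, 4K 30: 8, HD 30: 32 -- matching the figures above
```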

    Alternative codecs to JPEG XS include JPEG 2000 (used by ASPEN), NDI (https://www.ndi.tv/about-ndi), and the original RDD35 version of TICO (https://www.tico-alliance.org/technology). Technically speaking, I believe that JPEG XS is an updated version of TICO (which is referred to as TICO-XS) that allows for more aggressive compression ratios up to 15:1. The original version of TICO which was submitted to SMPTE for standardization in RDD35 only allowed for compression ratios of up to 4:1. A lot of the standards around IP video transport are still in flux, however, so there isn't really a universal standard that all of the hardware manufacturers are using yet.

In contrast, H.266 (or VVC) is intended as a delivery codec and is the successor to H.265 (HEVC). As such, it is an interframe (temporally predicted) codec that allows for very high compression ratios, with a goal of 50% lower bitrates than H.265 at the same visual quality. This would allow for compressing 8K video into the hundreds-of-Mbps range (or in many cases even lower) that is needed for internet delivery. But the latency of such codecs is typically in the hundreds of milliseconds to seconds range (in interframe modes), and they can introduce visible compression artifacts (as we are all well aware).
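To put the two design points side by side (illustrative numbers only: the 24 bits/pixel and the 100 Mbps delivery bitrate are my assumptions, the latter just a stand-in for the "hundreds of Mbps" range):

```python
# Contrast the two codecs' operating points for 8K 60
raw_gbps = 7680 * 4320 * 60 * 24 / 1e9          # ~48 Gbps uncompressed
print(f"JPEG XS mezzanine: {raw_gbps / 6:.1f} Gbps (6:1)")
vvc_mbps = 100                                   # assumed delivery bitrate
print(f"VVC delivery: {vvc_mbps} Mbps (~{raw_gbps * 1000 / vvc_mbps:.0f}:1)")
```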

At the moment, H.266 seems to be facing a somewhat uncertain future, and may not ever be as widely adopted as H.264 or H.265. There is a competing delivery codec in AV1 (https://aomedia.org), which is the successor to VP9 and also aims to achieve lower bitrates than H.265. The advantage AV1 has is that it is royalty-free (unlike H.265 or H.266), so hardware manufacturers, internet streaming platforms, etc. don't have to pay anything to use it. The AV1 codec specification was also completed more than two years before H.266 (which was only finalized in July of this year), which means it has a significant head start in terms of developing software and hardware encoders and decoders.

As you might expect, these next-generation codecs like H.266 and AV1 are even more computationally intensive than earlier codecs, so hardware decoding is pretty much a must (if you think software decoding of H.265 footage is bad, wait till you try AV1). But the good news is that AV1 hardware decoding has already started to make its way into video processors and system-on-a-chip designs used in smart TVs from Samsung and LG, set-top boxes such as the updated Roku Ultra, and computer processors and GPUs such as the 11th Gen Intel Core processors and NVIDIA GeForce RTX 30 series.

    So while H.266 might actually end up being a bit better in terms of compression than AV1, I'm not sure whether it will be able to overcome the licensing issues and the significant head start AV1 has in terms of hardware support.


    3 out of 3 members found this post helpful.

#5
Senior Member · Join Date: Mar 2010 · Location: Central NC, USA · Posts: 1,522
    Quote Originally Posted by Razz16mm View Post
New JPEG XS codec for 8K streaming and production. Visually lossless at up to 6:1 compression.
    https://www.redsharknews.com/8k-over...d-with-jpeg-xs
    A solution in search of a problem. There's zero need for 8k streaming. Just like there won't be any need for 16k streaming. Or 32k.

    There's way more to an image than just simple resolution.



#6
Senior Member James0b57 · Join Date: Dec 2008 · Posts: 6,086
    Steve Yedlin argued that we need 8K streaming.

    "...without 8K streaming, no one will ever truly know if resolution really matters all that much in movies and TV. So, we MUST get 8K streaming ASAP, so that we can then properly show my resolution demo to the masses."
    - Steve Yedlin



#7
Senior Member · Join Date: Mar 2010 · Location: Central NC, USA · Posts: 1,522
    Quote Originally Posted by James0b57 View Post
    Steve Yedlin argued that we need 8K streaming.
Sony argued in a technical white paper they released about a decade ago that we already knew the limits. They showed that we barely need 4k, and then only if you're sitting close enough to see it (that is, in the first couple of rows of stadium seating at a theater). Most of what you get in theaters is equivalent to HD (the 80% of the seats between the far front and the far rear). The rearmost seats are often only the equivalent of 720p. IOW, for most viewers 4k is a waste of resources. This explains why movies like Skyfall look so good in theaters when shot mostly on ARRI cameras using the 2.8K ARRIRAW format. Mind, it might have looked so good because of Roger Deakins.

    That Sony research white paper used to be here:

    http://pro.sony.com/bbsccms/static/f...K_WP_Final.pdf

    but the "pro.sony.com" webpages seem to have been removed. I made a quick pass but couldn't turn it up again. Pity, it was well worth reading.

Without the paper, we'll have to resort to math. The human eye can, on average, resolve detail down to about one arcminute (1/60 of a degree). This is largely due to the spacing of the cones in the retina, which defines the limit of what you can resolve, and what you can't.

    Other people have already built viewing distance calculators based on the physics of the human eye. So we don't have to do all the math -- most of it has already been done for us.

I plugged some numbers into one of those calculators just for fun. I found that a 77" TV displaying a 7680x4320 (8K UHD) stream has a "visual acuity distance" of 2.6'. That puts my knees about six inches away from the screen. And it's well inside the shortest recommended distance based on field of view (4.0'), so there's going to be a lot of left-right head turning going on. Needless to say, I'm not doing that. Not going to happen. Sorry.

    Backing off to 4k UHD (3840x2160) gets you a 4.9' visual acuity distance, which is just outside the 4.0' minimum recommended distance. So you could watch from here if you wanted to, without excessive head turning. Still, I've tried this and found it to be uncomfortably close.

Backing off to dumb old HD (1920x1080) gets you a visual acuity distance of 10'. Which, weirdly enough, is just where my wife insisted our couch be placed. She likes it. I like it. And it's just under the THX maximum recommendation of 10.8'. That puts us four or five rows down from the very back of movie theater stadium seating, which is further back than I'd want to sit in a theater, but works just fine at home. So I might have a 4k TV, but that doesn't mean I can tell the difference between HD and UHD from where I sit. I've tried it; I cannot see a difference. That's why I pay Netflix for HD streaming, but not 4k UHD. And if I won't pay an upcharge for 4k, I surely won't pay an upcharge for 8k.
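A minimal sketch of the math those calculators use: the distance at which one pixel subtends one arcminute (the usual 20/20 acuity criterion), assuming a 16:9 panel. It lands close to the 2.6' / 4.9' / 10' figures above.

```python
import math

def acuity_distance_ft(diag_in, h_pixels, aspect=16 / 9):
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # horizontal screen width
    pixel_pitch_in = width_in / h_pixels                 # width of one pixel
    one_arcmin = math.radians(1 / 60)                    # 1/60 of a degree
    return pixel_pitch_in / math.tan(one_arcmin) / 12    # inches -> feet

for label, h in [("8K", 7680), ("4K", 3840), ("HD", 1920)]:
    print(f'{label} on a 77" TV: {acuity_distance_ft(77, h):.1f} ft')
# 8K: 2.5 ft, 4K: 5.0 ft, HD: 10.0 ft
```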

    All of the above gorp is just my way of saying that I think we already know, or at least Sony and THX know, that there's no need at all for 8k delivery over either discs or streaming. Maybe Mr. Yedlin knows stuff that Sony and THX do not.


    1 out of 1 members found this post helpful.

#8
Senior Member · Join Date: Oct 2014 · Posts: 7,128
    I want my ...
    I want my ...
    I want 8K TV.


    1 out of 1 members found this post helpful.

#9
Senior Member James0b57 · Join Date: Dec 2008 · Posts: 6,086
    Quote Originally Posted by Bruce Watson View Post
Sony argued in a technical white paper they released about a decade ago that we already knew the limits. [...] Maybe Mr. Yedlin knows stuff that Sony and THX do not.
Oh, I was joking. Steve Yedlin did a test that not only explained but showed how little visible difference there is between resolutions above 2K at normal viewing distances.

Steve Yedlin did the test because cinematographers were, and still are, being forced to use the wrong camera because somebody decided they need 4K - or 8K - or 12K. Mr. Yedlin wanted to put the choice of camera back in the cinematographer's hands.

However, a lot of people seem to miss the main point of Yedlin's test, and they use it to say that 2K is the best resolution and the Arri Alexa is the best camera. That isn't it: it is supposed to be a creative choice, and image quality isn't just one spec, it is a lot of different things. Sometimes more resolution will be important, but one would hope that the cinematographer gets to choose. In fact, some people use Steve Yedlin's test as an excuse to force a cinematographer to use the Alexa!


My personal opinion is that for fiction narrative, I'll only ever need 2K. In fact, for TV shows and movies, I am nine times out of ten happy with 720p, as long as it is not too compressed. But I do not see the television and internet at home as only ever being used for vegging out or watching movies. The TV is a wonderful information tool. Having a larger surface and more resolution allows for more information on the screen - like having a huge chalkboard or corkboard with all the articles and images splayed out when studying. Or maybe all those thousands of Nikon D800 images that will never be printed will finally be seen at full resolution on an 80" 8K screen? To me that is awesome. And I think it is important.

But yeah, for movies and TV, I think resolution is a creative choice, and typically resolution isn't important in movies. Editors take care of cutting to a close-up if you need more detail; you don't have to get out of your seat and run up to the screen for a close-up. The filmmakers guide you through the journey and show you what you need.

Of course there are a few genres and types of films that benefit from the creative choice of more resolution. 'Samsara' is gorgeous in 4K, and I'd like to see it in 8K OLED to be honest. But Guardians of the Galaxy, Sophie's Choice, even The Godfather - I'd be cool with 2K and HD.


Years ago, this would be an argument, but TVs at Costco are already boasting 8K. And Blackmagic has a $10K 12K camera, meaning it can be cheaper to produce an 8K feature than a 2K Alexa one. Not to mention that playback hardware for the BM 12K seems to have lower requirements than 5K REDCODE. That said, the choice should still be the cinematographer's, as image quality is more than one spec.


But yes, it has been proven time and time again that 2K at normal viewing distances meets the minimum requirement of the human ability to resolve visual detail. There are instances in 2K where aliasing shows up on text, and there is a visible difference in detail for ultra-wide shots in 4K, but for the average frame, no one is going to see a difference. And that's just the people with good vision!


One of the worst things is that YouTube compresses the heck out of 1080p, so going to 4K does help hide some of the compression; even on an HD TV set I will sometimes choose 4K. But this is a compression issue, because I prefer the look of 1080p Blu-ray over 4K YouTube.

So, unfortunately for filmmakers, we are sharing the murky waters of the multitasking at-home TV screen. It streams, it casts, it mirrors - it does it all these days. I think it is a great tool for all the homeschooling people right now. Students have to study virtually, and if they could just lay out the research material as if it were a big library table, they wouldn't need to click through 20 browser tabs and jump between layouts to read and type. But by all means, let's hold back progress so that people can just sit and watch movies all day in super-efficient 2K! Woohoo! (OK, I know that even if we had 16K, people would spend more time watching Friends in standard def, lol.) Though this progress unfortunately trickles backwards: people buy an 8K TV, so they pay for 8K Netflix, and then Netflix is legally obligated to have 8K content... so now filmmakers are getting forced to shoot 8K projects, and cinematographers are being forced to use 8K cameras that are not a good creative choice for the film... So, Mr. Yedlin makes a test that shows 2K is good enough, so that cinematographers can be free to choose their preferred camera based on total image quality, but ultimately gets quoted out of context to mean we don't need 4K or 8K TV screens and cameras.


    ....and then I start sarcastically making fake quotes with Yedlin's name.



... why would I do that? Because Steve Yedlin's test was ultimately pointless: it got people arguing that 2K is the best, and not talking about the cinematographer's or director's creative choices. People are still talking about resolution, when 2K isn't the goal - creativity is.
    Last edited by James0b57; 10-17-2020 at 06:09 PM.


    1 out of 1 members found this post helpful.

#10
Senior Member · Join Date: Jan 2006 · Location: Atlanta, GA · Posts: 722
VR and 360 video headset resolution is now catching up with the camera tech and approaching 8K, so there is some burgeoning demand for 8K streaming. An 8K 360 source would give you about 2K across your FoV.
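A rough sketch of that FoV arithmetic (the ~90 degree horizontal field of view is my assumption for a typical headset, and this ignores lens distortion and projection details):

```python
SOURCE_WIDTH_PX = 7680   # horizontal pixels in an 8K equirectangular 360 source
FOV_DEG = 90             # assumed headset horizontal field of view

# Fraction of the 360-degree sphere visible horizontally at any moment
fov_px = SOURCE_WIDTH_PX * FOV_DEG / 360
print(f"{fov_px:.0f} px across the FoV")   # ~1920 px, i.e. roughly 2K
```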


    1 out of 1 members found this post helpful.
