Mbps vs Resolution for Grading

  #1 · Albino_BlacMan
    I'm looking at cameras for a video project I have in mind. For this, I think the degree to which I can push the video files in post is more important than the actual resolution. Output resolution will be 1080p.

    It looks like there are generally two options for most cameras:
    4K at a lower bitrate (e.g. 100 Mbps on the GH4), or
    1080p at a higher bitrate (e.g. 200 Mbps on the GH4)

    I'm wondering which would give me more latitude for grading: the higher-bitrate, lower-res option, or the lower-bitrate, higher-res option? (An explanation of why would be great!) I also understand that I can downres the 4K footage to increase sharpness and malleability.

    Also, thoughts on why the lower-resolution option allows for a higher bitrate would be interesting as well.
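
    For scale, here's my rough back-of-the-envelope math on the per-pixel bit budget (a toy Python calc; I'm assuming 24 fps and ignoring codec efficiency and chroma subsampling):

    Code:
    # Rough per-pixel bit budget for the two GH4 options (assumes 24 fps).
    UHD_PIXELS = 3840 * 2160   # "4K" UHD frame
    HD_PIXELS = 1920 * 1080    # 1080p frame

    def bits_per_pixel(mbps, pixels, fps=24):
        """Average compressed bits available per pixel per frame."""
        return (mbps * 1_000_000) / (pixels * fps)

    print(f"4K @ 100 Mbps: {bits_per_pixel(100, UHD_PIXELS):.2f} bits/pixel")
    print(f"HD @ 200 Mbps: {bits_per_pixel(200, HD_PIXELS):.2f} bits/pixel")
    # -> 4K @ 100 Mbps: 0.50 bits/pixel
    # -> HD @ 200 Mbps: 4.02 bits/pixel

    So per pixel, the 1080p option actually gets about 8x the data, if that's what matters for grading.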



  #2 · Senior Member · Long Island · Joined Feb 2009 · 6,163 posts
    For consumer/prosumer cameras that was mostly true up until a few years ago, when more companies started offering ~400 Mbps options or ProRes at higher data rates (Blackmagic).

    There isn't a simple answer to your question, because people have graded both kinds of footage with great success. And if you get to use a lot of cameras, you'll quickly learn that 100 Mbps from one company may hold up better than 100 Mbps from another. The internal color science, processing, sharpening... even motion rendering will all affect the final image. Over time you may learn that you need to push one camera more or less depending on how pleased you are with its output, regardless of its specs.

    With that said, the general consensus seems to be that 10-bit is preferred along with a proper implementation of a LOG profile, and that you won't see much of a difference between 200 Mbps and 400 Mbps.
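
    If you want a feel for why bit depth matters more than those top-end bitrates, here's a toy sketch (my own illustration on a synthetic ramp, not real footage): quantize the ramp at 8 and 10 bits, apply an aggressive shadow lift, and look at the gaps between the resulting output codes.

    Code:
    import numpy as np

    # Toy banding check: quantize a ramp at a given bit depth, "grade" it
    # with a strong lift (sqrt), and measure the widest jump between
    # adjacent 8-bit output codes. A bigger jump means more visible banding.
    gradient = np.linspace(0.0, 1.0, 100_000)

    def widest_band(bits):
        levels = 2 ** bits - 1
        quantized = np.round(gradient * levels) / levels  # camera bit depth
        graded = np.sqrt(quantized)                       # aggressive lift in post
        codes = np.unique(np.round(graded * 255))         # final 8-bit deliverable
        return int(np.max(np.diff(codes)))

    print("8-bit source, widest gap after grade:", widest_band(8))    # -> 16
    print("10-bit source, widest gap after grade:", widest_band(10))  # -> 8

    The 10-bit source halves the worst gap here. A real LOG curve and a real grade behave differently, but the direction is the same.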

    As far as lower resolutions being available at higher bit rates, it's most likely because of the following:

    (1) It's easier on the camera (less sensor data to read out and encode).
    (2) It's done to protect higher-end models (market segmentation).

    Blackmagic's 4K cameras, Canon's 1DX Mark II/5D Mark IV/EOS R, Fujifilm's X-T3, and Panasonic's GH5/S all offer higher data rates in 4K. Maybe some others too, like the Z Cam E2. And of course more expensive cinema systems do.



  #3 · Cary Knoop · Senior Member · Newark CA, USA · Joined Jan 2017 · 1,524 posts
    Quote Originally Posted by Albino_BlacMan:
    I'm looking at cameras for a video project I have in mind. For this, I think the degree to which I can push the video files in post is more important than the actual resolution. Output resolution will be 1080p.
    Your main limitations are the bit depth and the compression ratio. For good grading you need a minimum of 10-bit depth.
    If you only have 8 bits, I would use 4K and scale to HD after you grade.
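
    A quick numpy sketch of why that downscale helps (a synthetic ramp, not a real decode; the 2x2 averaging stands in for a good scaler):

    Code:
    import numpy as np

    # Averaging each 2x2 block of 8-bit UHD pixels into one HD pixel creates
    # values between the 8-bit steps -- with noisy/dithered footage, roughly
    # 2 extra bits of precision (4 samples per output pixel).
    rng = np.random.default_rng(0)

    # Smooth ramp plus mild sensor noise, quantized to 8 bits by the "camera".
    uhd = np.linspace(0.2, 0.3, 3840)[None, :] + rng.normal(0, 0.002, (2160, 3840))
    uhd8 = np.round(np.clip(uhd, 0.0, 1.0) * 255) / 255

    # Downscale UHD -> HD by averaging 2x2 blocks.
    hd = uhd8.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

    print("distinct values in the 8-bit UHD ramp:", np.unique(uhd8).size)
    print("distinct values after the 2x2 average:", np.unique(hd).size)

    The averaged HD image lands on many more intermediate values than the 8-bit source, which is the extra headroom you get for the grade.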

    An additional issue: if the footage is recorded in Rec.709, that further limits your options, since a display-referred Rec.709 curve bakes in contrast that a LOG profile would preserve for grading.



  #4 · Senior Member · San Francisco, CA · Joined Dec 2017 · 167 posts
    Quote Originally Posted by Cary Knoop:
    If you only have 8 bits, I would use 4K and scale to HD after you grade.
    If you are editing with Premiere (4K/UHD on an HD timeline), be sure to use "Set to Frame Size" (vs. "Scale to Frame Size"). This will keep your pixels intact, so you won't be grading a downsampled image after you edit. The 4K/UHD downsampling will occur on export rather than in your timeline. Good luck!


