Page 2 of 4
Results 11 to 20 of 40
    #11
    Senior Member PaPa's Avatar
    Join Date
    Nov 2005
    Location
    Toronto, Ontario
    Posts
    7,643
    Are you using CS6?

    http://www.imdb.com/name/nm2393063/
    Actor
    Musician
    Filmmaker


    #12
    Senior Member starcentral's Avatar
    Join Date
    Nov 2004
    Location
    Toronto, Canada
    Posts
    6,405
    Dennis Hingsberg


    #13
    Senior Member PaPa's Avatar
    Join Date
    Nov 2005
    Location
    Toronto, Ontario
    Posts
    7,643
Interesting article, Star. Forgive me, but I fail to see any relationship between the ideas discussed there and the topic of this thread.

    http://www.imdb.com/name/nm2393063/
    Actor
    Musician
    Filmmaker


    #14
    Senior Member PaPa's Avatar
    Join Date
    Nov 2005
    Location
    Toronto, Ontario
    Posts
    7,643
Wondering if anyone could chime in on this? Curious to know why I have no issue importing a 10-bit GH5 clip into Premiere Pro CS6.

    http://www.imdb.com/name/nm2393063/
    Actor
    Musician
    Filmmaker


    #15
    Senior Member Thomas Smet's Avatar
    Join Date
    Jul 2005
    Location
    Colorado
    Posts
    1,770
Quote Originally Posted by PaPa
Wondering if anyone could chime in on this? Curious to know why I have no issue importing a 10-bit GH5 clip into Premiere Pro CS6.
    I doubt a lot of people are still using CS6.

If I had to guess, I would say it is because Adobe uses different decoders in different versions, and perhaps the one in CS6 handles the 10-bit files better. You should also make sure the footage is actually showing up as 10-bit. When GH5 footage first came out, the 10-bit files worked fine in Premiere, but the 10-bit didn't look much better than 8-bit, which was the source of that horrible Cinema5D article. Adobe then updated Premiere, which caused the 10-bit files to stop working. So it is possible you can load 10-bit files in CS6, but they may get decoded as if they were 8-bit.


    #16
    Senior Member
    Join Date
    Jan 2014
    Posts
    113
I am using CS6, and I have loaded a V-Log clip recorded with the GH5 and an Atomos Shogun in 10-bit Avid DNxHR. CS6 does indeed recognize it as a 10-bit file.


    #17
    Senior Member
    Join Date
    Jan 2011
    Posts
    176
Quote Originally Posted by PaPa
Wondering if anyone could chime in on this? Curious to know why I have no issue importing a 10-bit GH5 clip into Premiere Pro CS6.
Part of the issue was that levels weren't mapped correctly for full-range files (when 0-255 luminance levels are set in the camera's menus). I'm not sure whether the new version actually fixed that. In any case, the issue is avoidable by selecting 16-235 or 16-255.

    And have you verified that the files are imported with 10-bit precision? That's not an easy thing to test.


    #18
    Senior Member
    Join Date
    Mar 2010
    Posts
    2,480
Quote Originally Posted by PaPa
I recently downloaded some GH5 10-bit footage (as described by the poster) and quickly brought it into Adobe Premiere Pro CS6, only to find it plays back with absolutely no problem.

Am I crazy?
No, you're not crazy; that's just one reason many of us stick with stable production tools with well-earned track records.


    #19
    Senior Member PaPa's Avatar
    Join Date
    Nov 2005
    Location
    Toronto, Ontario
    Posts
    7,643
Quote Originally Posted by Thomas Smet
I doubt a lot of people are still using CS6.

If I had to guess, I would say it is because Adobe uses different decoders in different versions, and perhaps the one in CS6 handles the 10-bit files better. You should also make sure the footage is actually showing up as 10-bit. When GH5 footage first came out, the 10-bit files worked fine in Premiere, but the 10-bit didn't look much better than 8-bit, which was the source of that horrible Cinema5D article. Adobe then updated Premiere, which caused the 10-bit files to stop working. So it is possible you can load 10-bit files in CS6, but they may get decoded as if they were 8-bit.
I've spent a bit of time trying to figure out how I can even tell whether CS6 is treating it as 10-bit. I've tried looking at the file properties, but to no avail.

    http://www.imdb.com/name/nm2393063/
    Actor
    Musician
    Filmmaker


    #20
    Senior Member Cary Knoop's Avatar
    Join Date
    Jan 2017
    Location
    Newark CA, USA
    Posts
    522
Quote Originally Posted by PaPa
I've spent a bit of time trying to figure out how I can even tell whether CS6 is treating it as 10-bit. I've tried looking at the file properties, but to no avail.
Increase the contrast or make other big adjustments: do you start to see stripes (banding) in the waveform? If so, your footage may be handled as 8-bit.
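That banding check can also be done numerically. Here's a rough sketch, not anything from Premiere itself: if you export a still frame from the timeline as a 16-bit image and read out its sample values, footage that was decoded through an 8-bit path only ever lands on an 8-bit grid, so the low-order bits are all zero. The function name and the 16-bit-container assumption here are illustrative, not from the thread.

```python
def effective_bit_depth(samples, container_bits=16):
    """Estimate the real precision of decoded video samples.

    If a "10-bit" clip was actually decoded as 8-bit, every sample sits
    on an 8-bit grid when scaled into the 16-bit container, so all
    values are multiples of 2**(16 - 8) = 256.
    """
    for bits in range(1, container_bits + 1):
        step = 1 << (container_bits - bits)  # grid spacing at this depth
        if all(v % step == 0 for v in samples):
            return bits
    return container_bits

# Samples that only ever land on an 8-bit grid (value << 8):
print(effective_bit_depth([v << 8 for v in range(256)]))   # 8
# Samples that use a genuine 10-bit grid (value << 6):
print(effective_bit_depth([v << 6 for v in range(1024)]))  # 10
```

If the function reports 8 on a clip you exported as "10-bit", the decoder most likely threw the extra precision away somewhere in the chain.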


