
How interested are you in a GH6 anymore?


    Originally posted by Thomas Smet View Post
    We are throwing things into slow motion now just because we can without any real aesthetic value.
    I think the campaign to overuse slow motion has been in full swing at least since the FS700. And, y'know, a year ago I would have agreed with what you wrote. But now that I have the a7siii and have been playing with 120fps, my thinking has changed.

    First of all, I use slow motion with the opposite strategy. Mostly, I slow down things that were already slow, and they become more dreamlike and epic. Things that were fast I usually let remain fast. I also use it a lot on static subjects just to smooth out my camera movement.

    But the larger issue here is that I'm debating 120fps vs 60fps, not 120fps vs 24fps. There are already many occasions where I'm required to shoot 60fps continuously. The primary reason is so the editor can extend a shot from 1 second to 2.5 seconds if the shot wasn't captured for long enough to make the edit. And I don't like the way 60fps conforms to 24fps when they run it in real time, which is more common than slow-moing it.

    If you shoot 120fps with a 1/120 shutter, then you still have flawless 60fps @ 1/120 available. Your 24 is still @ 1/120, but now you don't have to interpolate frames. And as a bonus you get slightly blurry 120fps and slightly crisp 40fps to play with. A major win over shooting 60fps.
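
    To spell out the arithmetic, here's a back-of-the-envelope sketch (pure math, not tied to any camera or NLE) of what a 120fps, 1/120-shutter capture gives you, and why a 60fps master conforms poorly to 24:

```python
# Frame rates derivable from a 120fps capture shot at a 1/120s shutter.
# A 180-degree shutter angle is the conventional "normal" amount of blur.
CAPTURE_FPS = 120
SHUTTER = 1 / 120  # seconds

for target_fps in (120, 60, 40, 30, 24):
    step = CAPTURE_FPS / target_fps        # keep every Nth captured frame
    angle = 360 * target_fps * SHUTTER     # shutter angle at the target rate
    verdict = "clean" if step.is_integer() else "needs interpolation"
    print(f"{target_fps:>3}fps: every {step:g} frames ({verdict}), {angle:.0f} deg shutter")

# The 60fps problem: 60/24 = 2.5, a non-integer step, so a real-time 24fps
# conform has to blend or unevenly repeat frames.
print(f"60fps source -> 24fps: every {60 / 24:g} frames (needs interpolation)")
```

    Every target on that list divides 120 evenly, which is the whole trick; 24 doesn't divide 60, hence the blended real-time conform.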

    You can jump cut or speed ramp if you want to show a longer portion of a slow-mo clip in a short time. Your editing rhythm doesn't necessarily need to be altered.
    Last edited by ahalpert; 01-09-2022, 02:02 PM.
    www.VideoAbe.com

    "If you’re really in favor of free speech, then you’re in favor of freedom of speech for precisely the views you despise. Otherwise, you’re not in favor of free speech." - Noam Chomsky



      Originally posted by ahalpert View Post
      If you shoot 120fps with 1/120 shutter, then you still have flawless 60fps @ 1/120 available. Your 24 is still @ 1/120 but now you don't have to interpolate frames. And as a bonus you get slightly blurry 120fps and slightly crisp 40fps to play with. A major win over shooting 60fps
      Is the camera more likely to overheat at 120 than 60 FPS or does that not make much difference?



        Originally posted by Samuel Dilworth View Post

        Is the camera more likely to overheat at 120 than 60 FPS or does that not make much difference?
        It is more likely to overheat at 120fps. But truly I've only found it to be a problem while using wifi smartphone control, which apparently heats things up. If I get a heat warning, I turn off the wifi and it goes away. So the net effect is that sometimes I lose my smartphone controller when shooting 120fps, which doesn't happen at 60fps. (Note: any 10-min+ recordings I do are usually at 24fps)

        There are a couple of other downsides on the a7siii and fx3. 120fps is only available with a 1.1x crop, and you lose the options to do distortion/vignetting/CA correction as well as electronic "active" IBIS. (So, when I used the 12mm on this recent music video, I dropped to 60fps because the distortion is awful without correction.)

        But I'm sort of discussing the theoretical benefits of the two frame rates. It's easy to imagine another camera (especially an m43 camera) that can do 120fps without any drawbacks.
        Last edited by ahalpert; 01-09-2022, 04:06 PM.
        www.VideoAbe.com

        "If you’re really in favor of free speech, then you’re in favor of freedom of speech for precisely the views you despise. Otherwise, you’re not in favor of free speech." - Noam Chomsky



          Originally posted by ahalpert View Post
          It's easy to imagine another camera (especially an m43 camera) that can do 120fps without any drawbacks
          Yeeeaaaaahhh!



            A tease with a 2/22 announcement (at CP+).

            FWIW, Fuji Africa teased the X-H2 for May. That one is supposed to be an 8K "high end" APS-C camera for the same $2,500.



              Originally posted by DLD View Post
              That one is supposed to be an 8K "high end" APS-C camera for the same $2,500.
              You’re obsessed with 8K. Is everyone else? If so, the GH6 is doomed.



                Originally posted by Samuel Dilworth View Post

                You’re obsessed with 8K. Is everyone else? If so the GH6 is doomed.
                What starts with large TVs eventually filters down to capture devices. As far as I can tell, CES was very much about 8K displays. I have to say that I was tempted way back by the X-H1 for the Fuji colors but for its lack of IBIS. I'm sure it's coming to the X-H2, but no, the GH6 isn't doomed. Panasonic understands the need for video features in hybrids better than anyone, especially if one is interested in anamorphic capture. 5.7K is plenty.
                Last edited by stoneinapond; 01-11-2022, 06:21 AM.



                  X-H1 was the only Fujifilm then that actually had IBIS. lol

                  [That's actually what made it extremely popular at the time!]



                    The GH5 had 5.2k (they called it 6k) for years and not many seemed to care. Not sure if people are only capable of thinking of higher detail in terms of 2k, 4k, and 8k, or if it's because 8k TVs are more of a reality now.

                    Even if we are getting 8k TVs, it's going to be years before we get a delivery system for that. Apple AirPlay from within a home doesn't even support 8k yet.

                    5k, 5.7k and 6k will all upscale very well to 8k and downscale even better to 4k.
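
                    For a rough sense of the ratios (ballpark widths below; exact sensor modes vary by camera):

```python
# Linear scale factors between common capture widths and 8k/4k delivery.
# Widths are ballpark figures; exact values vary by camera and mode.
WIDTHS = {"4k": 3840, "5k": 5120, "5.7k": 5760, "6k": 6144, "8k": 7680}

for name in ("5k", "5.7k", "6k"):
    w = WIDTHS[name]
    up = WIDTHS["8k"] / w      # upscale factor to 8k
    down = w / WIDTHS["4k"]    # downscale factor to 4k
    print(f"{name}: {up:.2f}x up to 8k, {down:.2f}x down to 4k")
```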

                    I just watched DP Steve Yedlin's thesis on resolution again, and he really feels there isn't much advantage over 2k, with 4k being the upper limit of usable detail. Anything beyond that is purely marketing and oversampling potential.

                    I have a pretty large 65" 4k TV and I honestly cannot tell the difference between good 2k and 4k. Maybe someday I will get an 80", but even then I think 2k and 4k will be perfectly wonderful.
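
                    That squares with the usual viewing-distance math, assuming roughly 1 arcminute of visual acuity (a common rule of thumb, not a hard limit):

```python
# Distance beyond which adjacent pixels on a 16:9 panel blur together,
# assuming ~1 arcminute of visual acuity (rule of thumb, not a hard limit).
from math import hypot, radians, tan

ARCMIN = radians(1 / 60)

def pixel_blend_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    width_in = diagonal_in * 16 / hypot(16, 9)   # panel width from diagonal
    pitch_in = width_in / horizontal_px          # size of one pixel
    return pitch_in / tan(ARCMIN) / 12           # inches -> feet

for px, label in [(1920, "2k"), (3840, "4k"), (7680, "8k")]:
    print(f'65" {label}: pixels blend beyond ~{pixel_blend_distance_ft(65, px):.1f} ft')
```

                    At a typical 8-10 ft couch distance, even 2k is already near the acuity limit on a 65" panel, which is why 4k vs 8k is so hard to see.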

                    I would rather see all TVs go to 1600 nits minimum for brightness and at least 90% rec2020 than go to 8k. HDR has a much bigger future than impossible-to-see extra detail that most will not be able to afford for many years. 8k only makes sense on the massive TVs, which are still priced out of most people's budgets, especially when the 50-75" options look so good and are so much more affordable. Heck, we can barely get 4k to customers and have them actually enjoy the real benefits of 4k.

                    Gaming is still clinging hard to 1080p, and gamers prefer low latency and higher frame rates to resolution. Graphics cards are finally able to push some gaming to 4k, but not without compromises. The graphics cards to push those levels right now cost more than what an entire gaming rig used to cost, thanks to the jacked-up market.

                    On the computer side, many of us now have UHD monitors, and 4k on YouTube can look a bit more detailed. 8k computer monitors are many years off as the norm and not very practical at what are considered normal computer monitor sizes. A 27" UHD monitor is already a 2x Retina display; an 8k display would be 4x Retina, and we don't really need that. It's already almost impossible to see individual pixels on a UHD display. So 8k computer viewing is almost completely worthless and an epic waste of money, unless one loves using a 42" as a computer monitor. Ultrawides have some potential, but who produces video that wide to really take advantage of it?
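
                    The arithmetic behind that 2x/4x claim, assuming a 1920x1080 logical desktop (macOS-style integer scaling, which is an assumption here):

```python
# Scaling factor and pixel density for a 27" monitor, assuming the logical
# desktop is 1920x1080 (an assumption -- macOS-style integer scaling).
from math import hypot

LOGICAL_WIDTH = 1920

for label, w, h in [("UHD", 3840, 2160), ("8k", 7680, 4320)]:
    scale = w / LOGICAL_WIDTH          # linear scaling factor
    ppi = hypot(w, h) / 27             # pixels per inch on a 27" diagonal
    print(f'27" {label}: {scale:g}x scaling, {ppi:.0f} ppi')
```

                    That puts a 27" 8k panel at roughly 326 ppi, double UHD's ~163 ppi and well past what anyone resolves at desk distance.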

                    I would say an 8k Bayer sensor can make somewhat more pristine 4k than a 4k Bayer sensor, but the detail is so fine at that point that I doubt it matters. It really is splitting hairs. I know we have this discussion each new generation, but each time it matters less. That's the nature of the physics of resolution. The jump from SD to HD was big. The jump from HD to 4k was decent. The jump from 4k to 8k will be a whimper.

                    That's why I would rather see TVs stop cutting corners on brightness and color. HDR is a joke on 90% of the TVs out there. It's barely usable. I have a Vizio from a few years ago that has the highest-rated rec2020 color out there. I wouldn't normally buy Vizio, but its color was very impressive. The brightness is only 400 nits, however, and that shows. HDR looks amazing on my iPad's XDR display, but its color is a bit more limited, to only P3. I would like to see TVs of all sizes and budgets no longer push 250 nits and 300 nits as HDR. We should have at least 1000 nits sustained as the standard now if one is going to sell a TV as HDR. And there should be at least 80% rec2020 as the standard, which is pretty much P3 level. Many TVs can't even do decent P3 color. There are so many other areas of improvement to make TVs stand out without using 8k to fool people.

                    Where do we go after 8k? Are we really going to pretend 16k has some kind of realistic future in a few years when TV sales are down? 32k? When does the madness end? If Apple someday made an XDR TV at 4k, that thing would be a lot more desirable than an 8k TV where nobody can really tell whether they are even watching 8k. If a viewer has to look up specs to see if they are watching 4k or 8k, then that right there means it no longer matters.



                      Originally posted by NorBro View Post
                      X-H1 was the only Fujifilm then that actually had IBIS. lol

                      [That's actually what made it extremely popular at the time!]
                      I stand corrected. Memory is failing.

                      Which means I forget what put me off purchasing the camera.

                      But thanks for the correction.



                        Maybe because of no 4K/60p...but then the X-T3 was announced a few months later, which was one of the best-selling cameras of all time and the first larger-sensor camera with 4K/60p at that price (and of course no IBIS, per usual Japanese decision-making).

                        They were Fujifilm's GH5 and GH5S.



                          Originally posted by Thomas Smet View Post
                          When does the madness end?
                          When the consumer has no more money for the latest gadget.



                            Originally posted by NorBro View Post
                            Maybe because of no 4K/60p...
                            Now I remember. It was because there was no headphone jack unless you added the battery grip, and video recording was limited to 15 minutes, extended to 30 minutes with the add-on grip.



                              Originally posted by Samuel Dilworth View Post

                              You’re obsessed with 8K. Is everyone else? If so the GH6 is doomed.
                              People - well, except NorBro - don't buy several cameras a year. (OK, they don't sell several cameras a year either.) 8K is a solid buy now and will remain a viable product into the future. You can call it future-proofing or whatnot, but most buyers will go for the better specs at a given price.

                              Besides, an 8K sensor (~42 MP) suits the stills community better too.



                                I stopped doing that, but I did average about 5-7 cameras per year during the golden years of new cameras, 2014-2019.

                                Everything kind of sucks right now, and the few interesting cameras to me are $6K+.

