Other: The Sony ILME-FX30 Owners Club

That kind of transcoding is useful, but only for performance reasons, and only for certain people.

It's totally unnecessary for anyone else with the proper hardware, and any kind of perceived improvement in quality is pseudo-science, IMO.
 
I really can't comment on the strengths and weaknesses of IBIS vs. OSS. All I know is that OSS works well for me, but the lenses I have with OSS are "old" (e.g., F4 ZA 16-35, 28-135, etc.) and the "new" lenses don't have it (e.g., F4 PZ 16-35). The result is that with no IBIS in camera and no OSS in the new glass, I have no stabilization outside post-production (and I get weird artifacts doing that).
 
My A1 has served me well in those situations. The big advantage it has over the FX3 and FX30 is that it has a fantastic OLED viewfinder. Built-in ND would be nice though.

BTW, the FX3 and FX30 aren't cinema cameras even though that's what Sony calls them. They're just mirrorless cameras without an EVF.

The a7siii also has a nice viewfinder.
 
I imagine that in the next series of cameras, stabilization will be handled by oversampled sensors, gyroscope data, and (dare I say) AI, thus cheapening lenses and camera bodies and allowing VNDs. The trick is steadying low-shutter-speed footage, which is where the current IBIS implementation remains superior for the time being.

Camera usage trends are also pointing towards handheld being a thing of the past and gimbals taking over, so I can see why Sony might be moving away from OSS. The FX3/FX30 are priced so low that they could be someone's exclusive gimbal cam.

I think there are probably advantages to IBIS and OSS over some sort of oversampled, cropping-only digital stabilization, in that you can pivot the angle of the sensor or optical element rather than being limited to strict translational movement and digital interpolation.
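To make the distinction concrete, here's a toy sketch of what cropping-only digital stabilization does (this is just an illustration with made-up frame data and per-frame offsets, not any camera maker's actual implementation): it slides a fixed crop window inside an oversampled frame to counter translational shake, which is exactly why it can't pivot the imaging plane the way IBIS or OSS can.

```python
import numpy as np

def stabilize_crop(frames, offsets, margin):
    """Cropping-only stabilization: shift a fixed crop window to counter
    each frame's measured (dy, dx) shake. Purely translational -- unlike
    IBIS/OSS, it cannot counter rotation or pivot the imaging plane."""
    h, w = frames[0].shape[:2]
    out = []
    for frame, (dy, dx) in zip(frames, offsets):
        # Clamp so the crop window stays inside the oversampled frame.
        dy = int(np.clip(dy, -margin, margin))
        dx = int(np.clip(dx, -margin, margin))
        out.append(frame[margin + dy : h - margin + dy,
                         margin + dx : w - margin + dx])
    return out

# Eight fake oversampled 120x160 frames with hypothetical shake offsets.
rng = np.random.default_rng(0)
frames = [rng.integers(0, 255, (120, 160), dtype=np.uint8) for _ in range(8)]
offsets = [(3, -2), (0, 5), (-4, 1), (2, 2), (-1, -3), (5, 0), (0, 0), (-2, 4)]
stab = stabilize_crop(frames, offsets, margin=8)
print(stab[0].shape)  # every output frame is (104, 144)
```

The margin is the oversampling headroom: the bigger the shake you want to absorb, the more sensor resolution you have to give up to the crop.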

But I think Sony will never put an eND in a mirrorless body, because they want to keep the cost/price down and maintain a distinction for the FX6 and up.

Do you have some sort of statistical data for the decline of handheld shooting? Personally, I hardly shoot handheld anymore, and that's been true for several years. But it seems that a lot of people on this forum do, and you see a lot of it in commercials, movies, etc. News of its death is greatly exaggerated, methinks.

FX3 as exclusive gimbal cam due to price -- I actually do a lot of b-roll shoots with three gimbals and cameras rigged with different prime lenses (typically starting on 35/50/135) to speed up lens changes. The low prices of the cameras (and their small size) definitely help that equation. I'm currently using the FX3, a7siii, and a7iv for this purpose, but I plan to boot the a7iv out of the rotation as soon as Sony releases a full-frame camera with their new AI AF implementation, 4k120p from a full readout in full-frame and aps-c crop, a variable shutter setting, and dual native ISO.

The Creator was just shot on an FX3 with a budget of $86 million. People on here are always talking about what would happen if a big-budget movie were shot on a mirrorless.

Re: codecs, I only shoot either XAVC-I (technically XAVC S-I on the mirrorless cameras, but it's the same thing: 240 Mbps for 24p) or H.265. Any time the choice is mine, I just shoot H.265. I don't notice a visual difference, and computers with hardware acceleration are up to the challenge. A lot of my footage is delivered through the cloud, and it's obviously easier to transfer 400GB rather than 1.2TB. I only shoot XAVC-I by client request.
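For anyone curious where those numbers come from, it's simple bitrate arithmetic. A rough sketch: the 240 Mbps figure is XAVC S-I at 24p as mentioned above; the ~80 Mbps H.265 figure is my assumption, picked to match the roughly 3x size difference described (actual Long-GOP bitrates vary by mode).

```python
def recording_size_gb(bitrate_mbps, minutes):
    """Approximate video-only file size: bitrate (megabits/s) x duration."""
    return bitrate_mbps / 8 * 60 * minutes / 1000  # -> gigabytes

xavc_i = recording_size_gb(240, 60)  # XAVC S-I, 24p (from the post)
h265 = recording_size_gb(80, 60)     # assumed Long-GOP H.265 bitrate
print(f"1 hour XAVC-I: {xavc_i:.0f} GB, H.265: {h265:.0f} GB, "
      f"ratio {xavc_i / h265:.1f}x")
# prints: 1 hour XAVC-I: 108 GB, H.265: 36 GB, ratio 3.0x
```

Scale that over a multi-day shoot and the 400GB-vs-1.2TB cloud-transfer gap above falls right out of it.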
 
I think transcoding to H.265 for performance and file-size savings is an eminently good idea, and it's something that I wouldn't have considered.

What an utter waste of time and effort. There's so much misinformation and bad practices going on out there among so-called "professionals".
 
H.265 is more compressed than H.264, so the file sizes are smaller, but it isn't a "better" quality codec. If you have some side-by-side examples that prove your claim that H.265 is better for grading (assuming all other things are equal), I'd love to see those too. What camera are you basing your analysis on? If H.265 truly makes a "big difference," it should be easy to demonstrate, if you don't mind.

In the meantime, I ain't buying it. In fact, due to the horsepower needed to decompress H.265, I'd say it is a worse codec for post. I won't touch it except for testing purposes.

Doug, have you still not upgraded to an M1 Max or something? I really recommend not waiting if you've got the cash. Not that you'll be transcoding to H.265 but Apple Silicon is such a dramatic step up in every way. Including with codecs - if they're on the hardware-accelerated list they all feel the same, even in Premiere.
 

No, I haven't had the time to migrate to a new computer. Money is not an issue; I'd upgrade if I could snap my fingers and have it done, but I'm too busy to deal with it. In general, I'm all for upgrading whenever it makes sense, but to be honest, my current MacBook Pro can handle anything I throw at it with ease -- except for H.265, and I don't own a single camera of my own that can shoot that codec. Even if I had a new Mac, I wouldn't use that codec anyway, because I don't care about reducing file sizes, which is the only argument I can see for it.

Maybe this summer I'll have a break. Heck, the longer I wait, the better the new machine will be when I'm actually ready for it.

I actually have this one sitting in my cart already at B&H.
https://www.bhphotovideo.com/c/prod...16m2_33_mbp_16_m2m_12c_38c_gpu_96_2tb_sg.html
 
Apple Silicon is such a dramatic step up in every way. Including with codecs - if they're on the hardware-accelerated list they all feel the same, even in Premiere.

I largely agree, but just for the sake of accuracy, I do find that H.265 footage has a slight lag when seeking relative to intraframe H.264, at least in programs like VLC. Playback is otherwise smooth. It also probably struggles a bit more under heavy effects, but I haven't done a side-by-side comparison of the two codecs. If I were really pressed for editing time, I would probably shoot in XAVC-I. But generally speaking, I can edit H.265 on my M1 Max without transcoding and without issues.
 
I don't care about reducing file sizes, which is the only argument I can see for it.

I would agree with that.

The longer I wait, the better the new machine will be when I'm actually ready for it.

Supposedly Apple is moving to a 3nm process with their next release. My thinking was to wait for the second generation of that architecture (so, likely the M4 chip) before upgrading from my M1 Max.
 
That's the problem with computers and cameras -- there's always a better one around the corner.

Well, if you only buy when you NEED to make an upgrade, then even if a better one comes out shortly after, it's a moot point since it wasn't available when you needed it.

Of course, by that logic you probably don't need a new computer at all. But I like to buy these things when there's a concrete generational leap. I bought a USB-C MBP, and then I could live in a USB-C world from then on. I bought the M1 Max after they went to Silicon. I'll probably wait for the 2nd generation of their 3nm architecture since 3nm should be an appreciable upgrade but they seem to make refinements on the 2nd gen of everything they do.

And generally I feel like a person would be better served by buying a new mid-tier computer every 3-4 years than buying a top of the line machine and trying to keep it for a decade. 3 or 4 years in, it will be bested by new mid-tier machines.
 

When Apple moves to 3nm, so will NVIDIA, and the 5000-series GPUs are expected to deliver 2x performance and significant power savings. So the gulf in performance is likely only going to get wider.

I have been researching Mac vs. PC performance in Resolve for some time, as I need to update my 5.5-year-old PC to cope better with modern codecs. I'm entirely agnostic and have owned and used more Macs than PCs, but the performance numbers don't lie: an Intel CPU paired with a 4090 comprehensively beats Apple's fastest Mac by a considerable margin. It's not even close.

The biggest issue with Apple Silicon is that it lacks pure GPU performance; the architecture is weighted far too heavily towards CPU cores, which can't be efficiently multithreaded for many workloads. For modern creative tasks you can get away with an 8-core CPU and a high-performance GPU. If Apple's 3nm node marks a real refresh of the M chips, then they should seriously consider giving more of the SoC die over to GPU cores with the M3.

I still have two classic Mac Pros in the back of my studio (collecting dust), and I can't quite bring myself to take them to the local dump. They're the best computers Apple ever released, IMHO -- absolute classics. So I was very surprised by the Apple Silicon Mac Pro: not just the ridiculous price, but no custom GPUs and no way for high-end customers to add greater levels of compute. It's just a Mac Studio with an expensive PCIe expander box. It has even got the fanboys enraged enough for Apple to come out and 'defend' their decision, which they didn't do at all well. Apple has shown that a well-engineered system can remove many of the old bottlenecks to performance, but it has also shown there's a hard limit to absolute performance too.

Once again Apple is relying on Steve Jobs' reality distortion field to sell what is a modest performance improvement of the M2 chips over the M1 as a 2x performance jump. It's quite sad they have to resort to these tactics. If you can get a well-specced M1 Mac Studio, either secondhand or refurbished, it would be a much better deal than a new M2 Studio. Many anticipate the M3 being a big change; if I were a Mac buyer, I'd be waiting and hoping for the M3 to make a step change in performance, particularly in the GPU area.
 