I've thrown everything at it that I have in the way of codecs, frame rates, and esoteric video files (like alpha-channel HD footage), and nothing bogs it down.
Multiple streams are not a problem. Grading with LUTs, titling, and graphics within the normal workflow don't affect it either.
I've tested the same projects on the 16" Intel with twice the RAM against the 13" M1, and the fans really ramp up on one project on the 16" while the M1 is absolutely quiet. Rendering and exporting is up to 6x as fast. I don't have any 5K, 6K or 8K footage to test.
12-11-2020 10:08 AM
12-11-2020 11:05 AM
One of the goals I think Microsoft and Sony have is to outperform those budget gaming systems with their consoles, which I think they'll accomplish. The all-in-one design, combined with being less expensive and able to do double duty as a living room media player, all serves that goal. Apple TV is heading down the same road, though I can't speak to its gaming chops since I don't know what sort of hardware is in it; I just haven't looked.
In the video world we're used to buying higher-end machines, but most people buy rather dumbed-down computers, and I'd argue there are likely more PC users with machines only capable of casual gaming than there are Mac users in total.
There are more PC users overall, and a larger percentage of them use crap business-level machines that can barely run Word and Zoom at the same time.
There may be more 16" MBP users than there are ROG or Razer Blade users, given how many people know and use the 16" MBP for video, music, photography, graphic design and so forth. I know a ton of Windows users but not a single one who owns a dedicated Windows gaming laptop. I know a ton of 16" MBP users, however. I also know I can't get my hands on an M1 MBA to save my life: even the base models are back-ordered until the end of the year, and a custom configuration is likely end of January. That feels like pretty good sales to me.
I also think a lot of casual-to-budding gamers will appreciate being able to game on machines that have long battery life and make little to no noise.
To be honest, ever since I got a Switch I don't care about gaming on a laptop as much anymore. The Switch's online store has a ton of games I would otherwise only be able to get on Steam and Windows.
12-11-2020 11:08 AM
You've confused one thing -- tense. I didn't say that PC gaming IS passé; I suggested that it is heading that way. And I don't think that's because gaming is going to stop growing; it's because most gamers are casual gamers who don't need anything more powerful than a current cell phone, high-end gamers are a relatively small niche, and the new flagship consoles are aimed squarely at that middle ground where the volume is.
12-11-2020 12:26 PM
Sure, I got that - I am simply saying that the data points to the opposite being true. PC gaming has been growing year over year, and saw record growth this year. It is shrinking in terms of market share due to market expansion, but growing in terms of user base and revenue. I'm not sure how this got so unclear, but it was a side note to the real point - which is that the ARM chips are impressive and could also revolutionize the gaming industry.
Great overview. I am not sure AMD/Intel are in as much trouble as some might think, because they aren't inherently x86 companies (even though that is the architecture they have focused on) as much as they are semiconductor companies. They are software-limited right now, but if Apple leads the way to ARM-based software, it would stand to reason that we'll see a well-optimized Windows 10 with ARM support and x86 emulation, with plenty of developer support. In that case AMD/Intel would be free to develop CPU (and full SoC) designs based on new instruction sets. I don't think they're so far away from SoC-based design, either. AMD produces top-tier CPUs and GPUs, Intel has been working on GPUs for years, and both develop motherboard chipsets. Meanwhile we have NVIDIA, which has been looking at acquiring ARM and could move from GPU to CPU/SoC designs.
The whole industry may be on the verge of a giant shake-up away from x86 - but it won't be Apple-exclusive, though they will have an advantage for a while. At this rate, the PC world may be years behind. It will be interesting.
And in the meantime, this is hard to beat. I'm extremely curious to see more over the next two years; many pros may not wait that long to see what happens. And making a platform switch means wherever someone lands, they will stay for years. This is a huge deal for Apple.
12-11-2020 01:13 PM
I like the tests from Puget Systems; they know what they're doing:
https://www.pugetsystems.com/labs/ar...ve-Cloud-1975/
This M1 seems to work well for Davinci too.
12-11-2020 01:29 PM
Unless the next generation gaming consoles are ARM based, that's very unlikely. And since AMD has the contracts for all of them, it's even more unlikely. And since part of why AMD has those contracts is that they're custom, the odds of Apple getting involved are somewhere between slim and none.
So for Apple to have any real effect on PC gaming it would have to launch a competitive console.
It does appear that AMD and nVidia both are aware of the possibility of the gaming consoles shaking up the PC gaming industry in a big way by attracting the gamers in the middle.
Quote:
Great overview. I am not sure AMD/Intel are in as much trouble as some might think, because they aren't inherently x86 companies (even though that is the architecture they have focused on) as much as they are semiconductor companies. They are software-limited right now, but if Apple leads the way to ARM-based software, it would stand to reason that we'll see a well-optimized Windows 10 with ARM support and x86 emulation, with plenty of developer support. In that case AMD/Intel would be free to develop CPU (and full SoC) designs based on new instruction sets. I don't think they're so far away from SoC-based design, either. AMD produces top-tier CPUs and GPUs, Intel has been working on GPUs for years, and both develop motherboard chipsets. Meanwhile we have NVIDIA, which has been looking at acquiring ARM and could move from GPU to CPU/SoC designs.
This is what is most amusing, however: so many people who were completely unaware of how vast the ARM ecosystem already was suddenly think that Apple has done something new by launching an ARM processor, even though Apple has been shipping ARM processors for years -- by the millions. Business-wise, it makes no difference to Apple whether or not it can keep up with x86, because as long as it's competitive in the low-power-consumption niches, it's going to keep selling chips, and now it doesn't have to share its profits with Intel.
That's not a prediction that Apple won't compete with x86 at the high end, just making the point that Apple might not care enough to bother with that niche.
Quote:
The whole industry may be on the verge of a giant shake up away from x86 - but it won't be Apple exclusive, though they will have an advantage for a while. At this rate, the PC world may be years behind. Will be interesting.
And in the meantime, this is hard to beat. Extremely curious to see more over the next 2 years, many pros may not wait years to see what happens. And making a platform switch means wherever someone lands, they will stay for years. This is a huge deal for Apple.
I'm 100% on Windows right now, but now that there's finally a decent Mac available, I do plan on adding one at some indeterminate point in the future (depends on job, bux, etc.).
12-11-2020 02:15 PM
Tamerlin - ARM's potential to revolutionize gaming has nothing to do with today's Sony or Microsoft consoles; it has to do with developers. The next ATV is rumored to be a major upgrade for the sake of gaming... ARM based. The Nintendo Switch is an interesting case study as well (also ARM based). There is no reason to think Sony/MS will remain dominant, nor any reason to think that a future Sony/MS console generation in 5-10 years won't be ARM based, especially when x86 emulation is evidently already so strong, and when AMD/Intel may move on from x86 themselves. We'll see what happens. As well, I don't think very many tech-savvy people think Apple did anything new simply by using ARM processors; they are impressed that Apple has led the charge into the desktop world on their mainstream systems, with an all-ARM architecture - and hit it out of the park so strongly on day 1.
Platform switching is a pain for most organizations for various reasons. I use an MBP and a desktop Windows PC, and I don't find it a big deal - but I am also a solo operator, and a geek. Platforms, especially Apple's, are designed from the ground up to be as sticky as possible. Any platform conversions Apple nets in the coming years are statistically meaningful in the longer term. The vast majority of users and businesses do not hop back and forth between platforms every few years; it takes a major inciting incident (say the Windows Vista dismay, the FCP 7 to X dismay, the Adobe subscription model, or an ARM vs. x86 gap). If a small business uproots from a Windows-based ecosystem to move to ARM-based Macs in 2022 because the PC side doesn't yet have a compelling answer, that's a big deal and won't be quickly reversed.
Not surprising... the real test will be comparing the universal, ARM-optimized Adobe versions of those apps, running on the pro-level M1X CPUs, against these desktop workstations. That's a big gap to close, and based on Apple's keynote marketing, where they showed performance-per-watt curves and where the M1 CPUs sit on them, I am curious to see how well these ARM chips will scale to higher power loads. I'm simultaneously skeptical and optimistic. It seems reasonable to me that Apple will be able to roughly match the performance of a 5950X and RTX 3080, but we'll see if they can go beyond that, and if so, how far.
Perhaps the biggest point of interest is how the M1 chips decode H.265 so well (a dedicated block?) and how that could be replicated on the PC side (a next-gen AMD/NVIDIA chipset, or a CPU instruction set?).
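For what it's worth, the way an app typically reaches that dedicated decode block today is through the platform decode APIs (VideoToolbox on the Mac, NVDEC or Quick Sync on the PC side), and ffmpeg exposes them behind a single flag, so you can see the difference for yourself. A minimal sketch, assuming ffmpeg is installed and on PATH; the file name is just a placeholder:
[CODE]
# Minimal sketch: time hardware-accelerated vs. pure software decode of a clip.
# Assumes ffmpeg is on PATH; "clip.mov" is a placeholder file name.
# "videotoolbox" is the macOS decoder (used on the M1 as well); on a PC you
# would swap in "cuda" (NVDEC) or "qsv" (Quick Sync) if your build supports them.
import subprocess
import time

def decode_seconds(src="clip.mov", hwaccel=None):
    cmd = ["ffmpeg", "-v", "error"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]
    # Decode every frame but write nothing, which isolates decode speed.
    cmd += ["-i", src, "-f", "null", "-"]
    start = time.time()
    subprocess.run(cmd, check=True)
    return time.time() - start

print("software decode:", round(decode_seconds(), 1), "s")
print("hardware decode:", round(decode_seconds(hwaccel="videotoolbox"), 1), "s")
[/CODE]
If the two numbers come out the same, the hardware path has most likely fallen back to software, which usually comes down to the exact encoding flavor the file uses.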
12-11-2020 04:15 PM
That is one point I don't understand from the article I shared myself. For many years I have heard that computers and phones include silicon dedicated to codecs like H.264 and H.265. So how is that new? When ProRes first came out, one of its selling points was that it was so efficient because it reused the hardware acceleration that was already in computers for JPEG (I think?). And how else could phones record 4K 60p MPEG-4, or even more than that, if not for dedicated silicon?
Maybe another way to phrase my question is: given the long-time presence of specialized chips (or at least sections of chips) for video encoding and decoding, why does your computer still suck?
12-12-2020 05:54 AM
Excellent question. Several reasons:
- Codec complexity: We speak of "the H.264 codec" or H.264/HEVC, but they are not one thing. Internally *each* has many different encoding formats, with varying GOP lengths, varying I-frames, B-frames, bit rates, chroma sampling, bit depths, etc. The acceleration hardware isn't like a GPU; it is specific fixed-function hardware that can only handle certain encoding formats (the first sketch at the end of this post shows how to check which flavor a given file actually uses).
- What about cell phones recording and smoothly editing HEVC? The phone knows the exact flavor of HEVC it is using, thus the hardware acceleration can be designed for that.
- Intel's Quick Sync first shipped in 2011, so why all the problems? What about nVidia's NVDEC/NVENC and AMD's UVD/VCE? Each of those has been through multiple versions, each requiring specific (often buggy) software frameworks to access. Since Quick Sync was more widespread, developers (if they wrote for anything) would typically use that. However, app support was often late in coming. I don't think Premiere Pro supported Quick Sync until around 2018 (at least on Mac) -- seven years after it was released.
- CPU architects' bias against special hardware: At Intel especially, the focus was to minimize special-purpose hardware. With few exceptions such as vector instructions, the viewpoint was that the transistor and power budgets should be used to improve the CPU cores, not squandered on a narrow special-purpose application. To this day Xeon still does not have Quick Sync. All Xeon-based Mac Pros before 2019 had no hardware video acceleration, and the 2019 model required Apple to create their own version on the T2 chip.
- Traditional view (now outdated) that hardware-accelerated video is lower quality.
- Traditional view (now somewhat outdated) that serious video productions don't use compressed Long GOP codecs. Initially (and for quite a while) H.264 was viewed as a consumerish toy. I'm not sure if even today there is a widespread standard for 10-bit 4:2:2 H.264, but there is for HEVC. Today higher-end cameras like the Sony FX6 and Panasonic EVA-1 can record 10-bit 4:2:2 HEVC, so the old idea that pros never use Long GOP codecs is gradually fading away. But that early viewpoint diminished the urgency of Intel and app developers making big investments in hardware acceleration.
- Traditional view that transcoding to an intermediate codec is a way of life. Around 2010 Premiere Pro's Mercury Playback Engine became fast enough on a higher-end PC to smoothly edit 1080p H.264 -- even without hardware acceleration. However, in that era few cameras used flash storage; hard to believe, but only 10 years ago much acquisition was tape-based and captured to an editable format. Then a few years later Mercury Playback could not handle 4K H.264 without hardware support or proxies, and at long last Adobe finally added those features. FCPX had them for years before that, but it's just one NLE. With multiple splintered versions of hardware acceleration, NLE developers indifferent to making and supporting the investment to use them, and a long-established post-production practice of transcoding, progress on hardware acceleration was slow and halting.
- Lack of collaboration between codec designers and CPU designers. It takes years and millions of dollars to design a full-custom CPU or ASIC. During that lag time, camera designers can easily incorporate new codecs which computer hardware cannot support. Think about how often you get a new still camera and Photoshop won't support its RAW format until you update Adobe Camera Raw. That is just still images, which can easily be software decoded. Now imagine that was a video codec that required new full-custom hardware.
- Why can't the computer use the GPU for this? Because decode/encode of the most difficult codecs uses an inherently sequential algorithm. Within a GOP, one frame must be decoded or encoded before the next frame. The only way to accelerate that is via pure hardware, running the core algorithm faster.
- Each GOP is separate. Why can't each CPU core handle one in parallel? They do -- that is why all cores are often pegged when processing Long GOP video without hardware acceleration. However, some encoding formats use dependent GOPs, so they are not always fully independent.
- Each GOP is separate, so why can't the heavily parallel GPU handle each one -- it has hundreds of cores? Because the GPU's lightweight threads are individually not that powerful, and within a GOP the sequential algorithm means only one GPU thread could be used (the second sketch at the end of this post is a toy illustration of this).
- Given all of the above, why are Apple Silicon SoCs so much better at decode/encode of Long GOP video? They are not always better, but given more agile hardware development and matching software, the results are good in an area where traditional CPUs and NLEs have lagged. But to show there is no magic all-encompassing solution, note the test linked below comparing FCP and Resolve Beta 17 on various machines and codecs. In general FCP is very fast on M1 hardware for many difficult codecs, but not always; Resolve is currently much less so. However, Blackmagic's developers are very responsive, so this will likely improve in the near future.
"M1 Vs INTEL MAC: Video Render SHOWDOWN": https://youtu.be/thaon3b6yEs
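And a toy sketch of the parallelism point from the last few bullets: decode is strictly sequential inside a GOP, but independent GOPs can fan out across CPU cores. The "decode" here is fake arithmetic purely to show the dependency structure, not a real decoder.
[CODE]
# Toy illustration: each frame in a GOP depends on the previous frame, so one
# GOP is a sequential chain, while independent (closed) GOPs can be handed to
# separate CPU cores. This is why all cores peg during software decode of
# Long GOP video, yet hundreds of lightweight GPU threads don't help much.
from concurrent.futures import ProcessPoolExecutor

FRAMES_PER_GOP = 30

def decode_frame(previous_state, frame_index):
    # Stand-in for real decode work: the result depends on the previous
    # frame's state, which is what forces the in-order processing.
    return (previous_state * 31 + frame_index) % 1_000_003

def decode_gop(gop_index):
    state = gop_index  # a closed GOP starts fresh from its own I-frame
    for i in range(FRAMES_PER_GOP):
        state = decode_frame(state, i)  # must run in order; no parallelism here
    return state

if __name__ == "__main__":
    # No dependency between closed GOPs, so this maps cleanly onto all cores.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(decode_gop, range(64)))
    print(len(results), "GOPs decoded")
[/CODE]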
12-12-2020 08:05 AM
Apple bought Intel’s cellular modem business last year for something like a billion dollars. Is Intel starting again from scratch? That would be a peculiar about-turn.
Apple’s goal as I understand it is to get rid of Qualcomm modems in its phones as soon as it can, likely making its own ‘M1 of 5G modems’ before very long. I wouldn’t want to be a Qualcomm shareholder when that happens.