  1. Moderator | Join Date: Dec 2005 | Location: Inland Northwest | Posts: 13,881
    I've thrown everything at it that I have in the way of codecs, frame rates, esoteric video files (like alpha channel HD footage), and nothing bogs it down.

    Multiple streams do not present a problem. Grading with LUTs, titling, and graphics within the normal workflow also do not affect it.

    I've tested the same projects on the 16" Intel (with twice the RAM) against the 13" M1: the fans really ramp up on one project on the 16" while the M1 is absolutely quiet. Rendering and exporting are up to 6x as fast. I don't have any 5K, 6K, or 8K footage to test.
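    For anyone who wants to reproduce a rough export-timing comparison like this outside an NLE, here is a minimal sketch using ffmpeg from Python, assuming ffmpeg is installed; "source.mov" and the x264 settings are hypothetical stand-ins, not the project settings described above.

```python
import subprocess
import time

# Rough export-timing harness: transcode a clip with ffmpeg and time it.
# "source.mov" and the encoder settings below are hypothetical stand-ins.
start = time.perf_counter()
subprocess.run(
    ["ffmpeg", "-y", "-i", "source.mov",
     "-c:v", "libx264", "-crf", "18", "-preset", "medium",
     "-c:a", "aac", "export.mp4"],
    check=True,
)
print(f"Export took {time.perf_counter() - start:.1f} s")
```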
    David S.




  2. Senior Member | Join Date: Feb 2013 | Location: Vancouver, BC (starting Feb 2021) | Posts: 548
    Quote Originally Posted by Thomas Smet View Post
    I think consoles outsell gaming PCs as well because a good gaming PC pretty much has to be built, which not everyone has the stones to do. It's getting a bit better now that Walmart sells gaming systems and laptops are getting some decent graphics options if you know where to look for them. It's easier to walk into a Walmart, Target or Best Buy to buy a console vs trying to find that just-right laptop for gaming. Plus, as you mentioned, the insane cost for a halfway decent gaming PC.
    One of the goals that I think Microsoft and Sony have is to outperform those budget gaming systems with their consoles, which I think they'll accomplish. The all-in-one design, combined with being less expensive and doing double duty as a living room media player, is all related to that goal. Apple TV is heading down the same road, though I can't speak to its gaming chops because I don't know what sort of hardware is in it. I just haven't looked.

    In the video world we are kind of used to buying higher-end machines, but most people are buying rather dumbed-down computers, and I would argue there are likely more PC users with computers only capable of casual gaming than there are Mac users.
    By a factor of probably 10-20x.

    There are more PC users and a larger percentage using crap business level machines that can barely run Word and Zoom at the same time.
    Yes, that's one of the biggest markets for volume PCs, and Macs comprise a fairly small fraction of those machines. I've worked in several Mac-based development shops, and even there the Macs are a minority. The IT staff have been consistent about their reason: cost. The Macs cost more than the Windows PCs.

    There may be more 16" MBP users than there are ROG or Razer Blade users due to how many people know and use the 16" MBP for video, music, photography, graphic design and so forth. I know a ton of Windows users but not a single one who owns a dedicated Windows gaming laptop. I know a ton of 16" MBP users, however. I also know I can't get my hands on an M1 MBA to save my life, and even the base models are back-ordered until the end of the year. A custom model is likely end of January. That feels like pretty good sales to me.
    There's the benefit of having just one brand to choose from in Apple, compared to dozens on the Windows side, even though most of those Windows systems are made by the same 4-5 manufacturers.

    I also think a lot of casual to budding gamers will appreciate being able to game on machines that have a long battery life and have little to no noise.
    I think you're completely correct, but those machines outsell laptops hand over fist: iPads, Chromebooks, phones...

    To be honest, ever since I got a Switch I don't care about gaming on a laptop as much anymore. The Switch with its online store has a ton of games I would otherwise only be able to get on Steam and Windows.
    That proves my point. There's a reason so many ad-supported casual games are available and keep making money.



  3. Senior Member | Join Date: Feb 2013 | Location: Vancouver, BC (starting Feb 2021) | Posts: 548
    Quote Originally Posted by filmguy123 View Post
    Put another way, the idea that PC gaming is passe because it represents a small percentage of the market is, IMO, an improper interpretation of the data. The gaming market at large has grown exponentially, but PC gaming has not become any less popular. High-end PC gaming has seen higher numbers and revenues in recent years than at any point in history - it's a bigger market than ever. And the new consoles' primary contribution will be bringing existing technology mainstream...
    You've confused one thing -- tense. I didn't say that PC gaming IS passe; I suggested that it is heading that way. And I don't think that's because gaming is going to stop growing. It's because most gamers are casual gamers who don't need anything more powerful than a current cell phone, high-end gamers are a relatively small niche, and the new flagship consoles are aimed squarely at the middle ground where the volume is.



  4. Senior Member | Join Date: Jan 2007 | Location: Portland, OR | Posts: 4,823
    Quote Originally Posted by Tamerlin View Post
    I didn't say that PC gaming IS passe, I suggested that it is heading that way.
    Sure, I got that - I am simply saying that the data points to the opposite being true. PC gaming has been growing year over year, and saw record growth this year. It is shrinking in terms of market share due to market expansion, but growing in terms of user base and revenue. I'm not sure how this got so unclear, but it was a side note to the real point - which is that the ARM chips are impressive and could also revolutionize the gaming industry.

    Quote Originally Posted by combatentropy View Post
    Great overview. I am not sure AMD/Intel are in as much trouble as some might think, because they aren't inherently x86 companies (even though that is the architecture they have focused on) as much as they are semiconductor companies. They are software-limited right now, but if Apple leads the way to ARM-based software, it would stand to reason that we'll see a well-optimized Windows 10 with ARM support and x86 emulation, with plenty of developer support. In which case AMD/Intel would be free to develop CPU (and full SoC) designs based on new instruction sets. I don't think they're so far away from SoC-based design, either. AMD produces top-tier CPUs and GPUs, Intel has been working on GPUs for years, and both develop motherboard chipsets. Meanwhile we have NVIDIA, which has been looking at acquiring ARM and could move from GPU to CPU/SoC designs.

    The whole industry may be on the verge of a giant shake-up away from x86 - but it won't be Apple-exclusive, though they will have an advantage for a while. At this rate, the PC world may be years behind. Will be interesting.
    Quote Originally Posted by David Saraceno View Post
    We imported 4K 60p 4:2:2 10-bit All-I footage from our Sony A7S III. The data rate was 512 Mbit/s.

    The M1 played it at best quality without any issues at all. The same footage on our 16" 2019 MacBook Pro i9 8-core, 32GB RAM machine brought it to a standstill.
    And in the meantime, this is hard to beat. I'm extremely curious to see more over the next 2 years, but many pros may not wait that long to see what happens. And making a platform switch means that wherever someone lands, they will stay for years. This is a huge deal for Apple.
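    As a quick sanity check on the quoted data rate, here is a back-of-the-envelope conversion of 512 Mbit/s into storage terms (pure arithmetic, decimal units, shown as a small Python sketch):

```python
# Back-of-the-envelope storage math for the quoted 512 Mbit/s All-I footage.
mbit_per_s = 512
mb_per_s = mbit_per_s / 8            # 64 MB/s of sustained throughput
gb_per_min = mb_per_s * 60 / 1000    # ~3.84 GB per minute of footage
gb_per_hour = gb_per_min * 60        # ~230 GB per hour
print(f"{mb_per_s:.0f} MB/s | {gb_per_min:.2f} GB/min | {gb_per_hour:.0f} GB/hour")
```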



  5. Senior Member | Join Date: Jul 2015 | Location: Bergen, Netherlands | Posts: 1,304
    I like Puget Systems' tests; they know what they're doing:
    https://www.pugetsystems.com/labs/ar...ve-Cloud-1975/

    The M1 seems to work well for DaVinci Resolve too.



  6. Senior Member | Join Date: Feb 2013 | Location: Vancouver, BC (starting Feb 2021) | Posts: 548
    Quote Originally Posted by filmguy123 View Post
    Sure, I got that - I am simply saying that the data points to the opposite being true. PC gaming has been growing year over year, and saw record growth this year. It is shrinking in terms of market share due to market expansion, but growing in terms of user base and revenue. I'm not sure how this got so unclear, but it was a side note to the real point - which is that the ARM chips are impressive and could also revolutionize the gaming industry.
    Unless the next-generation gaming consoles are ARM based, that's very unlikely, and since AMD has the contracts for all of them, it's even more unlikely. Part of why AMD has those contracts is that its chips are custom, so the odds of Apple getting involved are somewhere between slim and none.

    So for Apple to have any real effect on PC gaming it would have to launch a competitive console.

    It does appear that AMD and nVidia both are aware of the possibility of the gaming consoles shaking up the PC gaming industry in a big way by attracting the gamers in the middle.

    Great overview. I am not sure AMD/Intel are in as much trouble as some might think, because they aren't inherently x86 companies (even though that is the architecture they have focused on) as much as they are semiconductor companies.
    Neither is actually in trouble. Most people who proclaim that are unaware that both have more than one product. Pseudo-pundits have been claiming that Intel is in trouble for the past few years because AMD has taken so much of its market share, but the reality is that Intel anticipated that and responded accordingly. In semiconductors, though, these responses are multi-year strategies, not off-the-cuff launches. Intel's been branching out quite a bit, expanding its product portfolio into several new (for Intel) markets, including cellular modems and FPGAs. And its Xe GPUs are turning out to be pretty promising, to boot.

    They are software-limited right now, but if Apple leads the way to ARM-based software, it would stand to reason that we'll see a well-optimized Windows 10 with ARM support and x86 emulation, with plenty of developer support. In which case AMD/Intel would be free to develop CPU (and full SoC) designs based on new instruction sets. I don't think they're so far away from SoC-based design, either. AMD produces top-tier CPUs and GPUs, Intel has been working on GPUs for years, and both develop motherboard chipsets. Meanwhile we have NVIDIA, which has been looking at acquiring ARM and could move from GPU to CPU/SoC designs.
    I'm sure nVidia plans on adding higher-end SoCs to its lineup, but I also expect that nVidia would like to get the AMD processors out of its products. And what better way to do that than to develop an ARM processor that looks like Fuji's monster?

    This is what is most amusing, however: so many people who were completely unaware of how vast the ARM ecosystem already was suddenly think that Apple has done something new by launching an ARM processor, even though Apple has been shipping ARM processors for years -- by the millions. Business-wise, it makes no difference to Apple whether or not it can keep up with x86, because as long as it's competitive in the low-power-consumption niches, it's going to keep selling them, and now it doesn't have to share its profits with Intel.

    That's not a prediction that Apple will not compete with x86 in the high end, just making the point that Apple might not care enough to bother with that niche.

    The whole industry may be on the verge of a giant shake-up away from x86 - but it won't be Apple-exclusive, though they will have an advantage for a while. At this rate, the PC world may be years behind. Will be interesting.
    The PC world will almost certainly continue to lead because of how vast the infrastructure that relies on it is, but x86 itself has been heading toward irrelevance for a while thanks to the rapid advance on the GPU side. AMD turned that around a bit with its Zen monsters, but it also kept the cap of 16 cores and 32 threads on its higher-end Ryzen 5000 desktop processors. Its GPUs are performing beyond expectation because AMD was so conservative with its leaks and because the Big Navi clock speeds are just ridiculous. So why keep pushing the CPU when it can instead join nVidia and Intel in pushing the GPU to massive extremes? GPUs also deliver higher performance per watt than any CPU outside of the M1...

    And in the meantime, this is hard to beat. I'm extremely curious to see more over the next 2 years, but many pros may not wait that long to see what happens. And making a platform switch means that wherever someone lands, they will stay for years. This is a huge deal for Apple.
    I don't really understand this platform-switch silliness. It's like choosing to ONLY shoot with one brand of cinema camera, even though nearly everything related, like lenses and follow-focus systems (etc.), is interchangeable. Of course it's harder to own multiple brands when you're looking at Arri and DSMC2 price points, but people rent whatever fits the job there, so...

    I'm 100% on Windows right now, but now that there's finally a decent Mac available, I do plan on adding one in the indeterminate (depends on job, bux, etc.) future.



  7. Senior Member | Join Date: Jan 2007 | Location: Portland, OR | Posts: 4,823
    Tamerlin - ARM's potential to revolutionize gaming has nothing to do with today's Sony or Microsoft consoles; it has to do with developers. The next ATV is rumored to be a major upgrade for the sake of gaming... ARM based. The Nintendo Switch is an interesting case study as well (also ARM based). There is no reason to think Sony/MS will remain dominant, or that a future Sony/MS console generation in 5-10 years won't be ARM based, especially when x86 emulation is evidently already so strong, and when AMD/Intel may move on from x86 themselves. We'll see what happens. As well, I don't think very many tech-savvy people think Apple did anything new simply by using ARM processors; they are impressed that Apple has led the charge into the desktop world on their mainstream systems, with an all-ARM architecture - and hit it out of the park so strongly on day 1.

    Platform switching is a PIA for most organizations for various reasons. I use an MBP and a desktop Windows PC, and I don't find it a big deal - but I am also a solo operator, and a geek. Platforms, especially Apple's, are designed from the ground up to be as sticky as possible. Any platform conversions Apple nets in the coming years are statistically meaningful in the longer term. The vast majority of users and businesses do not hop back and forth between platforms every few years; it takes a major inciting incident (say Windows Vista dismay, FCP 7.0 to X dismay, the Adobe subscription model, or an ARM vs x86 gap). If a small business uproots from a Windows-based ecosystem to move to ARM-based Macs in 2022 because the PC side doesn't yet have a compelling answer, that's a big deal and won't be quickly reversed.


    Quote Originally Posted by Publimix View Post
    I like Puget Systems' tests; they know what they're doing:
    https://www.pugetsystems.com/labs/ar...ve-Cloud-1975/

    The M1 seems to work well for DaVinci Resolve too.
    Not surprising... the real test will be comparing the universal, ARM-optimized Adobe versions of those apps, running on the pro-level M1X CPUs, to these desktop workstations. That's a big gap to close, and based on Apple's keynote marketing, where they showed the performance-per-watt curves and where the M1 CPUs sit on that curve, I am curious to see how well these ARM chips will scale to higher power loads. I'm simultaneously skeptical and optimistic. It seems reasonable to me that Apple will be able to roughly match the performance of a 5950X and RTX 3080, but we'll see if they can go beyond that, and if so, how far.

    Perhaps the biggest point of interest is how the M1 chips decode H.265 so well (a dedicated processor?) and how that could be replicated on the PC side (a next-gen AMD/NVIDIA chipset, or a CPU instruction set?).
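    On that last question, one way to see which hardware decode paths a machine already exposes is to ask ffmpeg, which fronts VideoToolbox on Macs, NVDEC/CUDA on NVIDIA GPUs, and Quick Sync (qsv) on Intel. A minimal sketch, assuming ffmpeg is installed; "clip_h265.mov" is a hypothetical HEVC file:

```python
import subprocess

# List the hardware acceleration methods this ffmpeg build/machine exposes,
# e.g. videotoolbox on Macs, cuda (NVDEC) on NVIDIA, qsv (Quick Sync) on Intel.
out = subprocess.run(["ffmpeg", "-hide_banner", "-hwaccels"],
                     capture_output=True, text=True).stdout
print("Hardware decoders:", out.splitlines()[1:])  # first line is a header

# Decode-only run of a hypothetical HEVC clip through VideoToolbox; on a PC,
# swap "videotoolbox" for "cuda" or "qsv". Output goes to the null muxer,
# so this exercises pure decode throughput.
subprocess.run(["ffmpeg", "-hide_banner", "-hwaccel", "videotoolbox",
                "-i", "clip_h265.mov", "-f", "null", "-"])
```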



  8.
    That is one point I don't understand from the article that I myself shared. For many years I have heard that computers and phones included silicon dedicated to codecs like H.264 and H.265. So how is that new? When ProRes first came out, one of its selling points was that it was so efficient because of its reuse of the hardware acceleration that was already in computers for JPEG (I think?). And how else can phones record 4K 60p MPEG-4, or even more than that, if it were not for dedicated silicon?

    Maybe another way to phrase my question is, given the long-time presence of specialized chips (or at least, sections of chips) for video encoding and decoding, why does your computer still suck?



  9. Senior Member joema | Join Date: May 2010 | Location: Nashville, TN | Posts: 126
    Quote Originally Posted by combatentropy View Post
    That is one point I don't understand...For many years I have heard that computers and phones included silicon dedicated to codecs like H.264 and H.265. So how is that new? When ProRes first came out, one of its selling points was that it was so efficient because of its reuse of the hardware acceleration that was already in computers for JPEG (I think?). And how else can phones record 4K 60p MPEG-4, or even more than that, if it were not for dedicated silicon?...given the long-time presence of specialized chips (or at least, sections of chips) for video encoding and decoding, why does your computer still suck?
    Excellent question. Several reasons:

    - Codec complexity: We speak of "the H.264 codec" or H.264/HEVC as if each were a single thing, but internally *each* has many different encoding formats, with varying GOP lengths, I-frame/B-frame structures, bit rates, chroma sampling, bit depths, etc. The acceleration hardware isn't like a GPU; it is specific fixed-function hardware that can only handle certain encoding formats (the first sketch at the end of this post shows how to inspect exactly those parameters for a given file).

    - What about cell phones recording and smoothly editing HEVC? The phone knows the exact flavor of HEVC it is using, thus the hardware acceleration can be designed for that.

    - Intel's Quick Sync first shipped in 2011, so why all the problems? What about nVidia's NVDEC/NVENC and AMD's UVD/VCE? Each of those has been through multiple versions, each requiring specific (often buggy) software frameworks to access. Since Quick Sync was more widespread, developers (if they wrote for anything) would typically use that. However, app support was often late in coming. I don't think Premiere Pro supported Quick Sync until around 2018 (at least on Mac) -- seven years after it was released.

    - CPU architect bias against special hardware: At Intel especially, the focus was to minimize special-purpose hardware. With few exceptions such as vector instructions, the viewpoint was that the transistor and power budgets should be used to improve the CPU cores, not squandered on a narrow special-purpose application. To this day Xeon still does not have Quick Sync. All Xeon-based Mac Pros before 2019 had no hardware video acceleration, and the 2019 model required Apple to create its own version on the T2 chip.

    - Traditional view (now outdated) that hardware-accelerated video is lower quality.

    - Traditional view (now somewhat outdated) that serious video productions don't use compressed Long GOP codecs. Initially (and for quite a while) H.264 was viewed as a consumerish toy. I'm not sure if even today there is a widespread standard for 10-bit 4:2:2 H.264, but there is for HEVC. Today higher-end cameras like the Sony FX6 and Panasonic EVA-1 can record 10-bit 4:2:2 HEVC, so the old idea that pros never use Long GOP codecs is gradually fading away. But that early viewpoint diminished the urgency of Intel and app developers making big investments in hardware acceleration.

    - Traditional view that transcoding to an intermediate codec is a way of life. Around 2010 Premiere Pro's Mercury Playback Engine became fast enough on a higher-end PC to smoothly edit 1080p H.264 -- even without hardware acceleration. However, in that era few cameras used flash storage; hard to believe, but only 10 years ago much acquisition was tape-based and captured to an editable format. Then a few years later Mercury Playback could not handle 4K H.264 without hardware support or proxies. At long last Adobe added those features. FCPX had them for years before that, but it's just one NLE. With multiple splintered versions of hardware acceleration, NLE developers indifferent to making and supporting the investment to use them, and a long-established post-production procedure of transcoding, progress on hardware acceleration was slow and halting.

    - Lack of collaboration between codec designers and CPU designers. It takes years and millions of dollars to design a full-custom CPU or ASIC. During that lag time, camera designers can easily incorporate new codecs which computer hardware cannot support. Think about how often you get a new still camera and Photoshop won't support its RAW format until you update Adobe Camera Raw. That is just still images, which can easily be software-decoded. Now imagine it was a video codec that required new full-custom hardware.

    - Why can't the computer use the GPU for this? Because decode/encode of the most difficult codecs uses an inherently sequential algorithm: within a GOP, one frame must be decoded or encoded before the next frame. The only way to accelerate that is via pure hardware, running the core algorithm faster.

    - Each GOP is separate. Why can't each CPU core handle each one in parallel? They do -- that is why all cores are often pegged when processing Long GOP video without hardware acceleration. However, some encoding formats use dependent GOPs, so they are not always fully independent (the second sketch at the end of this post illustrates this GOP-level parallelism).

    - Each GOP is separate, so why can't the heavily parallel GPU handle each one -- it has hundreds of cores? Because the GPU's lightweight threads are individually not that powerful, and within a GOP the sequential algorithm means only one GPU thread could be used.

    - Given all of the above, why are Apple Silicon SoCs so much better at decode/encode of Long GOP video? They are not always better, but given more agile hardware development and matching software, the results are good in an area where traditional CPUs and NLEs have lagged. But to show there is no magic all-encompassing solution, note the test below comparing FCP and Resolve Beta 17 on various machines and codecs. In general FCP is very fast on M1 hardware for many difficult codecs, but not always; Resolve is currently much less so. However, Blackmagic's developers are very responsive, so this will likely improve in the near future.

    "M1 Vs INTEL MAC: Video Render SHOWDOWN": https://youtu.be/thaon3b6yEs



  10. Member | Join Date: Nov 2020 | Location: Paris, France | Posts: 89
    Quote Originally Posted by Tamerlin View Post
    Intel's been branching out quite a bit, expanding its product portfolio into several new (for Intel) markets, including cellular modems and FPGAs.
    Apple bought Intel’s cellular modem business last year for something like a billion dollars. Is Intel starting again from scratch? That would be a peculiar about-turn.

    Apple’s goal as I understand it is to get rid of Qualcomm modems in its phones as soon as it can, likely making its own ‘M1 of 5G modems’ before very long. I wouldn’t want to be a Qualcomm shareholder when that happens.


