Page 63 of 91 FirstFirst ... 135359606162636465666773 ... LastLast
Results 621 to 630 of 903
  1.
    Senior Member
    Join Date
    Feb 2013
    Location
    Vancouver, BC (starting Feb 2021)
    Posts
    548
    Default
    Quote Originally Posted by DLD View Post
    Not quite on par - $6B for Apple vs $15B for Netflix - but close to Amazon Prime.
    I have a feeling that's where Apple will dedicate its efforts, rather than trying to break into servers... Foundation actually has me thinking about getting an Apple TV subscription. The one trailer I saw for Foundation looks fantastic.



  2.
    Default
    Quote Originally Posted by Tamerlin View Post
    It's not a PowerPC, either.

    A - But it's not Intel; my reply was to JAMedia.


    Also wrong. Servers do a lot more than that. At least the ones that I typically code for do.

    A - I am sure servers do a lot more, but their main task is moving data, is it not, be it for online shopping, banking or video streaming? Which tasks did you have in mind: housekeeping, or specific dedicated applications? My question to you is: what process does an Intel x86 or an NVIDIA A100 run that cannot be coded for an Apple M1?
    If it's something very esoteric then perhaps it's a custom build, but what percentage of the server market is that?


    Not entirely. A datacenter built around efficient but slow processors can actually end up consuming MORE power than one built on high performance processors that burn more power. The nVidia A100 is actually a great example of how -- replacing an ENTIRE datacenter with a single rack that's 2-3x faster than the entire datacenter saves a lot of compute time. The A100 isn't a power miser, but it's so powerful that the overall hardware requirements decrease drastically.

    A - Well, I had a look at the NVIDIA A100 spec sheet. Power-wise it draws anywhere from 250 to 400 watts, compared to 13 watts for an M1; that's a lot more electricity to be paid for. The M1 consumes 20-30 times less power and also benefits from a greatly reduced bill for expensive HVAC management.
    And then there is the price: the NVIDIA A100 is ballpark $12,000, compared to $1,000 for a Mac mini - 12 times the cost. These are very compelling figures for future project planning, i.e. when the accountants run the numbers.
    Is the NVIDIA A100 faster? I have no doubt it is, but how much faster, and does it make sense financially? It depends on your needs.
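    The arithmetic in this reply can be sketched in a few lines. This is only a back-of-the-envelope check using the numbers quoted in the post (250-400 W and ~$12,000 for the A100; 13 W and ~$1,000 for a Mac mini); the $0.10/kWh electricity rate is an assumption of mine:

```python
# Back-of-the-envelope comparison using only the figures quoted above.
# Power draws and prices come from the post; the electricity rate is a
# hypothetical $0.10/kWh, and real bills depend on utilization and cooling.

A100_WATTS = 400      # upper end of the 250-400 W range quoted
M1_WATTS = 13         # M1 figure quoted above
A100_PRICE = 12_000   # ballpark USD
M1_PRICE = 1_000      # Mac mini ballpark USD

def ratios():
    """(power ratio, purchase-price ratio) of the A100 vs the M1."""
    return A100_WATTS / M1_WATTS, A100_PRICE / M1_PRICE

def annual_energy_cost(watts, usd_per_kwh=0.10, hours=24 * 365):
    """Cost of running a device flat out for a year at a flat rate."""
    return watts / 1000 * hours * usd_per_kwh

power_ratio, price_ratio = ratios()
print(f"Power:  A100 draws ~{power_ratio:.0f}x an M1")        # ~31x
print(f"Price:  A100 costs ~{price_ratio:.0f}x a Mac mini")   # 12x
print(f"Energy: ${annual_energy_cost(A100_WATTS):.0f}/yr vs "
      f"${annual_energy_cost(M1_WATTS):.0f}/yr")
```

    Of course, this deliberately ignores throughput; the counter-argument in this thread is precisely that one A100 can replace many slower nodes, so the meaningful metric is performance per watt and per dollar, not watts or dollars alone.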


    You're acting like that's news. In reality, you're just not aware of all of the low power datacenter designs that are already out there. Zen3 looks like a power monster on paper when you look only at the wattage per socket, but when you look at its power consumption per core, it ends up not being that much higher than for the M1. Part of that is due to the fact that a lot of the power consumption isn't in the cores themselves, but rather in the interconnects and external I/Os.

    A - I disagree; where exactly did I say this is news? In the context of paying very high costs in Greenland for staff, buildings, local taxes, electricity etc., considerable savings can be made by moving from Intel to, say, the M1 and by moving the entire server farm to a country where costs are considerably cheaper. Do you not think this will factor into future planning for large datacentres?

    And yet Intel's marketing department did the hard sell, stating their chips can save money by running a few degrees hotter, saving HVAC costs.


    That's why several companies have been building ARM based servers for years. Mostly, because they've been nowhere near as powerful as x86 processors, they've been going into web servers because web servers are latency constrained more than compute constrained, because they spend more time waiting for database queries to complete than they do to complete their own business logic and HTTP request/response processing.

    A - I agree many companies have moved to ARM, so why not to the Apple M1 chip? It offers remarkable performance and appears to really hammer the Intel offerings in overall performance, especially for the server market.


    Even if Apple did decide to start selling M1 CPUs to 3rd parties, it would be the newcomer trying to break into a market that several companies are already in, with products that are more suitable for that niche, and much higher end. There are already some 100+ core ARM server processors in the works. They'd be pretty unappealing for most of our needs though, due to not having GPUs and not having as much FPU throughput -- again by design. If your primary use case doesn't need it, it's wasted silicon and not a feature.
    A - Apple did sell a server in the past; I have no idea if they would ever enter this market again. As JAMedia pointed out, Dell beat them on after-sales service, but why not enter the market again? Either sell their own servers, or sell/licence some of the new offerings to Dell etc. It would build the Apple ecosystem enormously, and they have stated they are going after the higher-end PC offerings with the M2 or M1X or whatever they decide to call it.



  3.
    Senior Member JAMedia's Avatar
    Join Date
    Jun 2019
    Location
    Birmingham UK
    Posts
    176
    Default
    Quote Originally Posted by vultch2 View Post
    A - Apple did sell a server in the past; I have no idea if they would ever enter this market again. As JAMedia pointed out, Dell beat them on after-sales service, but why not enter the market again? Either sell their own servers, or sell/licence some of the new offerings to Dell etc. It would build the Apple ecosystem enormously, and they have stated they are going after the higher-end PC offerings with the M2 or M1X or whatever they decide to call it.
    I think that whilst Dell is mainstream, Apple would do a server, as they did before, to round out their offering to Mac users. There are many companies using iMacs, Minis and Pros (also iPads and iPhones) who would love a Mac server, and many small companies with Mac laptops. If they all tend toward the same CPU and OS, it will be seamless.

    Re the new Mac Pro: it looks like being a "mini Pro" with 12 cores.

    I have stopped trusting Apple. I still have a Mac PowerBook G4 (retired) as well as several Intel Macs (MacBooks and Pros), but Apple's direction of travel over the last few years is diverging from mine. It seems that since Steve Jobs took his eye off the ball, the vision has changed, primarily toward the phone/pad/air user groups.



  4.
    Default
    In the "net neutrality" kerfuffle, the big battle was between Netflix and the ISPs, because the former overwhelmed the pipes of the latter while refusing to pay for it. As the (presumably) outgoing administration adopted more of a free-market approach, Netflix and the ISPs settled on the streamer paying for its own server clusters inside the ISP hubs. That wasn't exactly inexpensive, amounting to billions of dollars in investment. With the video market pretty much switching to streaming and, for better or worse, with 8K coming up, owning these server clusters is becoming a must. Avoiding ISP throttling is going to be even more important going forward, and this is where Apple can assert its advantage - the financial and the technical muscle - over the others.



  5.
    Senior Member indiawilds's Avatar
    Join Date
    Apr 2011
    Location
    New Delhi, India
    Posts
    1,344
    Default
    Given the leaks about the M2, how many of you are cancelling your Mac Mini and MacBook orders and waiting it out?



  6.
    Senior Member Thomas Smet's Avatar
    Join Date
    Jul 2005
    Location
    Colorado
    Posts
    2,696
    Default
    Quote Originally Posted by indiawilds View Post
    Given the leaks about M2, how many of you are cancelling your Mac Minis and Macbooks and waiting out?
    Not me, because we always knew the 16" and iMac were going to be a lot better; not sure how anybody could think otherwise. The M1 is exactly what it is: a very impressive entry-level chip, but at the end of the day still an entry-level chip. I'm getting an MBA not as a main system for editing but as a smaller, lighter, long-battery portable machine to do a lot of the other stuff I do. It can be used as a very affordable machine in the field to check footage, organize media and do a rough cut. I don't see the point of buying a 16" for that when I would still do most of my editing on my main desktop, like the iMac or my current Mac mini with eGPU.

    It would be nice at times to have a bit more power on the road, but it isn't a must. The M1 is fully capable of doing just about anything I can do on my desktop, just with slower rendering, which I tend not to do much on the road.

    As for gaming, I still think one is better off just getting a cheap PC laptop with a 2060 GPU. Not only is it much faster than the M1 GPU for gaming, but Windows will still command an AAA gaming presence for at least a few years. It's going to take years for developers to get on board with the M1, and many existing games likely will not be ported to macOS. That's why I would rather get a $999 MBA and a $1,300 Windows laptop with a 2060 for the same cost as a 16" MBP and have the best of both worlds.



  7.
    Senior Member puredrifting's Avatar
    Join Date
    Nov 2004
    Location
    Los Angeles, Ca.
    Posts
    11,457
    Default
    Quote Originally Posted by Thomas Smet View Post
    Not me, because we always knew the 16" and iMac were going to be a lot better; not sure how anybody could think otherwise. The M1 is exactly what it is: a very impressive entry-level chip, but at the end of the day still an entry-level chip. I'm getting an MBA not as a main system for editing but as a smaller, lighter, long-battery portable machine to do a lot of the other stuff I do. It can be used as a very affordable machine in the field to check footage, organize media and do a rough cut. I don't see the point of buying a 16" for that when I would still do most of my editing on my main desktop, like the iMac or my current Mac mini with eGPU.

    It would be nice at times to have a bit more power on the road, but it isn't a must. The M1 is fully capable of doing just about anything I can do on my desktop, just with slower rendering, which I tend not to do much on the road.

    As for gaming, I still think one is better off just getting a cheap PC laptop with a 2060 GPU. Not only is it much faster than the M1 GPU for gaming, but Windows will still command an AAA gaming presence for at least a few years. It's going to take years for developers to get on board with the M1, and many existing games likely will not be ported to macOS. That's why I would rather get a $999 MBA and a $1,300 Windows laptop with a 2060 for the same cost as a 16" MBP and have the best of both worlds.
    That would/will be so strange if, in a couple of years, most gamers migrate to Macs because of Apple silicon and its speed. I have seen all of you computer nerds (I don't know nearly enough about computers to wear the badge 'computer nerd') posting about the battle between ARM/M1/Intel/AMD processing, but it would be a weird world if PCs just became passé for gaming. I doubt that will happen, but it would be funny if it did.
    It's a business first and a creative outlet second.
    G.A.S. destroys lives. Stop buying gear that doesn't make you money.



  8.
    Senior Member
    Join Date
    Feb 2013
    Location
    Vancouver, BC (starting Feb 2021)
    Posts
    548
    Default
    Quote Originally Posted by vultch2 View Post
    A - Apple did sell a server in the past; I have no idea if they would ever enter this market again. As JAMedia pointed out, Dell beat them on after-sales service, but why not enter the market again? Either sell their own servers, or sell/licence some of the new offerings to Dell etc. It would build the Apple ecosystem enormously, and they have stated they are going after the higher-end PC offerings with the M2 or M1X or whatever they decide to call it.
    I know about that server. It wasn't exactly a success, and that was during a time when servers were a lot more lucrative than things like cell phones.

    Apple also introduced a SAN which didn't sell well either, but Apple's reps clearly didn't understand why. It was neither the hardware nor the price.



  9.
    Senior Member
    Join Date
    Feb 2013
    Location
    Vancouver, BC (starting Feb 2021)
    Posts
    548
    Default
    Quote Originally Posted by puredrifting View Post
    That would/will be so strange if in a couple of years, most gamers migrate to Macs because of the Silicon and speed.
    I have seen all of you computer nerds (I don't know nearly enough about computers to wear the badge 'computer nerd')
    posting about the battle between ARM/M1/Intel/AMD processing but it would be a weird world if
    PCs just became passé for gaming. I doubt that will happen but it would be funny if it did.
    Lately, mobile gaming systems have been gaining popularity over desktop gaming systems, and the two flagship consoles offer 4K gaming with hardware ray tracing.

    PC gaming IS becoming passe, but Apple, having few AAA games available for its platform, has next to nothing to do with it.



  10.
    Senior Member
    Join Date
    Jan 2007
    Location
    Portland, OR
    Posts
    4,902
    Default
    Quote Originally Posted by Tamerlin View Post
    PC gaming IS becoming passe
    Heh - based on what? The new consoles can barely pull decent frames with raytracing, let alone RT at full resolution. While they are a good bump from past generations, they are also underwhelming, being slightly slower than a PC's GTX 1080 Ti from three years ago. And right now the new RTX 3080 series pushes 3x the Tflops of the new generation... today... and these consoles have a lifespan of 8-10 years. With raytracing and VR coming into their own, gaming needs all the power it can get. If anything, PC gaming is growing - it's the only place you can get a "true" next-gen experience with high-resolution raytracing and consistently fast frame rates - and that divide will likely grow starker in coming years.

    As of yesterday, the latest snafu is the big game of the year basically only running well on top-end PCs: https://wccftech.com/cyberpunk-2077-...-high-end-pcs/ - even the new Xbox/PS5 struggle, as the new generation of consoles brings parity with a typical 2017 high-end PC build.

    I believe it would be more accurate to say that gaming has gone mainstream and the market has expanded exponentially. PC gaming is stronger than ever, but gaming overall has grown so much that PC gaming no longer has as large a piece of the pie - and that's based on percentages, not raw revenue or player counts.


    Quote Originally Posted by puredrifting View Post
    That would/will be so strange if in a couple of years, most gamers migrate to Macs because of the Silicon and speed... it would be a weird world if
    PCs just became passé for gaming. I doubt that will happen but it would be funny if it did.
    It's important to separate Apple/Mac from their custom ARM architecture. I think it's very possible that gaming could migrate to the ARM architecture if Intel/AMD can't match what Apple is doing via x86. Apple is first to market with a true x86 desktop rival based on ARM; we'll see what the competition does. Windows already supports the ARM architecture, so the questions are: will someone come along to compete with the big x86 players in the PC space? Will Intel/AMD jump to ARM? Will x86 see architecture redesigns that leapfrog ARM? (Intel has said they are working on their own big/little config.) Etc.

    Actually, what is more likely - and addresses the issues with both the underwhelming new console hardware (they are good; I only mean compared to enthusiast PCs) and with x86 vs ARM - is the trend towards streaming gaming and cloud-based computing. Google Stadia is the best example of this today, and both Sony and Microsoft have seemingly set up their consoles for a future of streaming gaming where you only need hardware good enough to decompress the stream and maintain a high-speed, low-latency network connection. In that case, gaming will move away from the PC, away from the console, away from x86 or ARM, and to literally any connected device.

    This comes with a fun side bonus of corporations keeping their customers hooked into a rent everything, own nothing model.


