Intel has a faster processor than the M2 Max, but at what cost?


Comments

  • Reply 61 of 80
    Hreb said:
    A comparison between an ultra high end macbook pro (which will never replace a gaming laptop) and an ultra high end gaming laptop (which will never replace a mac, or really be used for anything other than gaming).  Such foolishness.  But that's EXACTLY the comparison Apple went for when they first introduced the Apple silicon macbook pro.  So here we are.
    Nonsense. The comparisons they went for were largely professional video work, which is a large segment for them and happens to overlap with the PC gaming market because, duh, they all use the same processors and buy similar laptops. Go figure.
  • Reply 62 of 80
    avon b7 said:

    A MBP will give you more battery time but an Intel laptop of this kind will do the job well enough if you aren't pushing it to the limit (which you would only do with it plugged in anyway). 
    As someone who works in animation and video, I can tell you firsthand that the difference between battery life on my 16" M1 Max and my previous 2018 i9 MBP is *HUGE*, and I work away from my desks all the time. Being able to work for hours in After Effects/FCP without having to look for an outlet is a game changer for me, and for a lot of people (see the M2 Pro/Max unplugged challenge video Apple showed at that event). My 140W charger rarely comes out of my bag.

    You're completely missing the point when you say "which you would only do with it plugged in anyway" — the point is, you no longer have to do that with Apple Silicon. That's game changing.
  • Reply 63 of 80
    DuhSesame said:
    Don’t be so eager to defend Apple here.  If Apple were truly comfortable with the M2 Max, they wouldn’t have been prototyping a 2-die variant for so long.

    Let’s be honest, Apple is playing catch-up in performance.
    What, the M2 Ultra? What makes you think they're "prototyping it for so long" and not just waiting to finish Mac Pro development before releasing it? They literally just released the M2 Pro/Max a month ago. The Ultra is just two Max dies stitched together, so what's there to prototype?

    Let's be honest, you have no idea what you're talking about.

    DuhSesame said:
    If you don’t understand what I mean: Apple could easily build silicon that’s faster than x86 flagships while running at lower power.  So much so that it could fit into a laptop.

    Instead we’re stuck at a four-year-old performance level for whatever reason Apple has.
    LOL "four year old performance". According to who? Compared to what? What's the basis for your claim that Apple can "easily" do what you say?

    Further evidence you have no idea what you're talking about.
  • Reply 64 of 80
    dewme said:

    The operating systems and the applications that are optimized for those operating systems have a hell of a lot more sway over buying behaviors than do benchmarks or narrow slices of very domain specific applications. No matter what platform I’ve ever used, the computer always spends a heck of a lot more time waiting on me than I spend waiting on the computer. But if you get paid to run benchmarks the situation may be reversed. 
    Or, maybe if you actually work in the applications they used for these benchmarks? What a weird dismissal of the target audience for this comparison, because you're not one of those people.
  • Reply 65 of 80
    dewme Posts: 5,372member
    dewme said:

    The operating systems and the applications that are optimized for those operating systems have a hell of a lot more sway over buying behaviors than do benchmarks or narrow slices of very domain specific applications. No matter what platform I’ve ever used, the computer always spends a heck of a lot more time waiting on me than I spend waiting on the computer. But if you get paid to run benchmarks the situation may be reversed. 
    Or, maybe if you actually work in the applications they used for these benchmarks? What a weird dismissal of the target audience for this comparison, because you're not one of those people.
    I have to agree with you. If your productivity and/or your happiness is directly tied to running the exact applications with the same or similar datasets to those used in a benchmark comparison then you should be seeking out the best computer that delivers those things for you. If you’re faced with a choice between computer A and computer B a particular benchmark that you care about may sway you one way or the other, everything else being equal. 

    The problem I have with Mac vs PC comparisons is that it is fairly rare for “everything else being equal.” This leads me to be dismissive of most benchmarks in such scenarios like the one in this article, mostly because they tend to be too narrowly focused rather than geared towards the more typical way most people use computers for a variety of tasks. They also tend to overlook the entrenchment effect that exists in both camps. 

    A lot of the reasons and rationale being used on both sides of the debate are mostly being used to justify the individual choices and preferences that have already been established and ingrained into one’s existing belief system, like the necessity of plugging the PC into an existing power source to attain full performance. If the tables were turned, say by Apple offering up a big performance boost when plugged in, there would be plenty of defenders of that option. 

    A lot of what we’re seeing is similar to car comparisons and benchmarks like 0-60 times. The 0-60 benchmarks may be completely legitimate, but they are rarely the deciding factor in choosing one vehicle type or vehicle brand over another. The entire vehicle taken in aggregate plus the needs of the buyer and their budget have greater sway than 0-60 times. But sure, to your point, if you are a professional drag racer of showroom stock cars and want to win races, you’re going to pick the car with the fastest 0-60 time and probably don’t care as much about the leather seats, cargo space, and seating capacity for 8 people. 
    edited February 2023
  • Reply 66 of 80
    dewme said:
    dewme said:

    The operating systems and the applications that are optimized for those operating systems have a hell of a lot more sway over buying behaviors than do benchmarks or narrow slices of very domain specific applications. No matter what platform I’ve ever used, the computer always spends a heck of a lot more time waiting on me than I spend waiting on the computer. But if you get paid to run benchmarks the situation may be reversed. 
    Or, maybe if you actually work in the applications they used for these benchmarks? What a weird dismissal of the target audience for this comparison, because you're not one of those people.
    I have to agree with you. If your productivity and/or your happiness is directly tied to running the exact applications with the same or similar datasets to those used in a benchmark comparison then you should be seeking out the best computer that delivers those things for you. If you’re faced with a choice between computer A and computer B a particular benchmark that you care about may sway you one way or the other, everything else being equal. 

    The problem I have with Mac vs PC comparisons is that it is fairly rare for “everything else being equal.” This leads me to be dismissive of most benchmarks in such scenarios like the one in this article, mostly because they tend to be too narrowly focused rather than geared towards the more typical way most people use computers for a variety of tasks. They also tend to overlook the entrenchment effect that exists in both camps. 

    A lot of the reasons and rationale being used on both sides of the debate are mostly being used to justify the individual choices and preferences that have already been established and ingrained into one’s existing belief system, like the necessity of plugging the PC into an existing power source to attain full performance. If the tables were turned, say by Apple offering up a big performance boost when plugged in, there would be plenty of defenders of that option. 

    A lot of what we’re seeing is similar to car comparisons and benchmarks like 0-60 times. The 0-60 benchmarks may be completely legitimate, but they are rarely the deciding factor in choosing one vehicle type or vehicle brand over another. The entire vehicle taken in aggregate plus the needs of the buyer and their budget have greater sway than 0-60 times. But sure, to your point, if you are a professional drag racer of showroom stock cars and want to win races, you’re going to pick the car with the fastest 0-60 time and probably don’t care as much about the leather seats, cargo space, and seating capacity for 8 people. 
    This isn't a Mac vs PC comparison so much as a comparison of raw power doing tasks that can measure exactly that on these chips. It's very plainly stated in the article what they're measuring. Not "the more typical way most people use computers for a variety of tasks". Not "leather seats" etc. And, once again, there are plenty of people who work in software like Premiere, Cinema 4D, Blender, etc. all day long to whom this information is useful, myself included. 
  • Reply 67 of 80
    dewme Posts: 5,372member
    dewme said:
    dewme said:

    The operating systems and the applications that are optimized for those operating systems have a hell of a lot more sway over buying behaviors than do benchmarks or narrow slices of very domain specific applications. No matter what platform I’ve ever used, the computer always spends a heck of a lot more time waiting on me than I spend waiting on the computer. But if you get paid to run benchmarks the situation may be reversed. 
    Or, maybe if you actually work in the applications they used for these benchmarks? What a weird dismissal of the target audience for this comparison, because you're not one of those people.
    I have to agree with you. If your productivity and/or your happiness is directly tied to running the exact applications with the same or similar datasets to those used in a benchmark comparison then you should be seeking out the best computer that delivers those things for you. If you’re faced with a choice between computer A and computer B a particular benchmark that you care about may sway you one way or the other, everything else being equal. 

    The problem I have with Mac vs PC comparisons is that it is fairly rare for “everything else being equal.” This leads me to be dismissive of most benchmarks in such scenarios like the one in this article, mostly because they tend to be too narrowly focused rather than geared towards the more typical way most people use computers for a variety of tasks. They also tend to overlook the entrenchment effect that exists in both camps. 

    A lot of the reasons and rationale being used on both sides of the debate are mostly being used to justify the individual choices and preferences that have already been established and ingrained into one’s existing belief system, like the necessity of plugging the PC into an existing power source to attain full performance. If the tables were turned, say by Apple offering up a big performance boost when plugged in, there would be plenty of defenders of that option. 

    A lot of what we’re seeing is similar to car comparisons and benchmarks like 0-60 times. The 0-60 benchmarks may be completely legitimate, but they are rarely the deciding factor in choosing one vehicle type or vehicle brand over another. The entire vehicle taken in aggregate plus the needs of the buyer and their budget have greater sway than 0-60 times. But sure, to your point, if you are a professional drag racer of showroom stock cars and want to win races, you’re going to pick the car with the fastest 0-60 time and probably don’t care as much about the leather seats, cargo space, and seating capacity for 8 people. 
    This isn't a Mac vs PC comparison inasmuch as it's a comparison of raw power doing tasks that can measure just that using these chips. It's very plainly stated in the article what they're measuring. Not "the more typical way most people use computers for a variety of tasks". Not "leather seats" etc. And, once again, there are plenty of people who work in software like Premiere, Cinema 4D, Blender, etc all day long to whom this information is useful, myself included. 
    Would you personally switch platforms based on one particular product’s measurable performance advantage on a set of benchmarks that matter to you? If a year later the tables are turned, would you then go back the other way? If my income was tightly bound to the performance deltas and the apps in question were a revenue affecting constraint/bottleneck in my overall business process then I would have to answer “yes.” Most business people don’t want to leave money on the table, regardless of their personal preferences.

    On the other hand, I do understand what you and others are saying with respect to a particular processor+GPU combination from the overwhelmingly dominant personal computing platform clearly demonstrating that for certain applications, like GPU intensive creativity apps and games, the current “best benchmark leader” that Apple has available is clearly not the fastest option in the overall market, a market that Apple owns about 10% of. But this has generally been the case for a very long time, especially when it comes to gaming.

    The real question is whether these obvious disparities at the processing level will inspire Apple to up their game. I think that for as long as Apple enjoys its advantage across the “variety of tasks” spectrum with its intensely loyal 10% market share customer base the answer is “eventually.” Apple set out on a path with Apple Silicon knowing where their pros/cons were and with some stark advantages in some areas like performance per watt, single threaded performance, and graphics performance for an integrated CPU based system. They are now iterating on their initial design to improve on the pros and lessen the cons.

    Apple’s competition is absolutely doing the same thing in terms of their own set of pros/cons. I’m sure that the competition’s answer to the same question, whether their obvious disparities in areas where Apple dominates their current offerings will inspire them to up their game is also “eventually.” The relative market share disparity hasn’t changed a whole lot either way in a very long time and Apple is doing extremely well on the revenue front with their small slice. They are sticking to a roadmap that is serving them very well. In the meantime don’t venture too far from a wall plug if you have a discrete GPU based PC and try to force yourself to get excited about Apple Arcade if you’re a Mac user.
  • Reply 68 of 80
    Look below. The reality: Intel's competition is AMD and vice versa. Neither of them - nor the fans of the PC ecosystem (inclusive of Windows, ChromeOS, and Linux, with the latter having been a real factor in workstations and servers for decades and now a factor in gaming thanks to the AMD-powered Steam Deck) - cares about Macs. 

    https://videocardz.com/newz/amd-ryzen-9-7945hx-16-core-laptop-cpu-trades-places-with-intel-13th-gen-core-i9-hx-series-in-geekbench-leak
  • Reply 69 of 80
    avon b7 Posts: 7,693member
    avon b7 said:

    A MBP will give you more battery time but an Intel laptop of this kind will do the job well enough if you aren't pushing it to the limit (which you would only do with it plugged in anyway). 
    As someone who works in animation and video, I can tell you firsthand that the difference between battery life on my 16" M1 Max and my previous 2018 i9 MBP is *HUGE*, and I work away from my desks all the time. Being able to work for hours in After Effects/FCP without having to look for an outlet is a game changer for me, and for a lot of people (see the M2 Pro/Max unplugged challenge video Apple showed at that event). My 140W charger rarely comes out of my bag.

    You're completely missing the point when you say "which you would only do with it plugged in anyway" — the point is, you no longer have to do that with Apple Silicon. That's game changing.
    I am not missing the point. 

    The point isn't battery life. The point is performance. Reduced battery life is simply a byproduct of the performance and is rendered (pun?) worthless when evaluating exactly and only that.

    As I said, anyone considering this kind of machine is fully aware of the battery powered tradeoffs. 

    You may work away from your desk most of the time but are you claiming there are no sockets on hand almost everywhere you work? 
  • Reply 70 of 80
    dewme said:
    dewme said:
    dewme said:

    The operating systems and the applications that are optimized for those operating systems have a hell of a lot more sway over buying behaviors than do benchmarks or narrow slices of very domain specific applications. No matter what platform I’ve ever used, the computer always spends a heck of a lot more time waiting on me than I spend waiting on the computer. But if you get paid to run benchmarks the situation may be reversed. 
    Or, maybe if you actually work in the applications they used for these benchmarks? What a weird dismissal of the target audience for this comparison, because you're not one of those people.
    I have to agree with you. If your productivity and/or your happiness is directly tied to running the exact applications with the same or similar datasets to those used in a benchmark comparison then you should be seeking out the best computer that delivers those things for you. If you’re faced with a choice between computer A and computer B a particular benchmark that you care about may sway you one way or the other, everything else being equal. 

    The problem I have with Mac vs PC comparisons is that it is fairly rare for “everything else being equal.” This leads me to be dismissive of most benchmarks in such scenarios like the one in this article, mostly because they tend to be too narrowly focused rather than geared towards the more typical way most people use computers for a variety of tasks. They also tend to overlook the entrenchment effect that exists in both camps. 

    A lot of the reasons and rationale being used on both sides of the debate are mostly being used to justify the individual choices and preferences that have already been established and ingrained into one’s existing belief system, like the necessity of plugging the PC into an existing power source to attain full performance. If the tables were turned, say by Apple offering up a big performance boost when plugged in, there would be plenty of defenders of that option. 

    A lot of what we’re seeing is similar to car comparisons and benchmarks like 0-60 times. The 0-60 benchmarks may be completely legitimate, but they are rarely the deciding factor in choosing one vehicle type or vehicle brand over another. The entire vehicle taken in aggregate plus the needs of the buyer and their budget have greater sway than 0-60 times. But sure, to your point, if you are a professional drag racer of showroom stock cars and want to win races, you’re going to pick the car with the fastest 0-60 time and probably don’t care as much about the leather seats, cargo space, and seating capacity for 8 people. 
    This isn't a Mac vs PC comparison inasmuch as it's a comparison of raw power doing tasks that can measure just that using these chips. It's very plainly stated in the article what they're measuring. Not "the more typical way most people use computers for a variety of tasks". Not "leather seats" etc. And, once again, there are plenty of people who work in software like Premiere, Cinema 4D, Blender, etc all day long to whom this information is useful, myself included. 
    Would you personally switch platforms based on one particular product’s measurable performance advantage on a set of benchmarks that matter to you? If a year later the tables are turned, would you then go back the other way? If my income was tightly bound to the performance deltas and the apps in question were a revenue affecting constraint/bottleneck in my overall business process then I would have to answer “yes.” Most business people don’t want to leave money on the table, regardless of their personal preferences.

    On the other hand, I do understand what you and others are saying with respect to a particular processor+GPU combination from the overwhelmingly dominant personal computing platform clearly demonstrating that for certain applications, like GPU intensive creativity apps and games, the current “best benchmark leader” that Apple has available is clearly not the fastest option in the overall market, a market that Apple owns about 10% of. But this has generally been the case for a very long time, especially when it comes to gaming.

    The real question is whether these obvious disparities at the processing level will inspire Apple to up their game. I think that for as long as Apple enjoys its advantage across the “variety of tasks” spectrum with its intensely loyal 10% market share customer base the answer is “eventually.” Apple set out on a path with Apple Silicon knowing where their pros/cons were and with some stark advantages in some areas like performance per watt, single threaded performance, and graphics performance for an integrated CPU based system. They are now iterating on their initial design to improve on the pros and lessen the cons.

    Apple’s competition is absolutely doing the same thing in terms of their own set of pros/cons. I’m sure that the competition’s answer to the same question, whether their obvious disparities in areas where Apple dominates their current offerings will inspire them to up their game is also “eventually.” The relative market share disparity hasn’t changed a whole lot either way in a very long time and Apple is doing extremely well on the revenue front with their small slice. They are sticking to a roadmap that is serving them very well. In the meantime don’t venture too far from a wall plug if you have a discrete GPU based PC and try to force yourself to get excited about Apple Arcade if you’re a Mac user.
    Apple's "stark advantages" in single threaded performance was never as big as the Apple fandom claimed, as benchmarks done not long after the M1 became available showed, and became even less so after 12th gen, 13th gen Intel and AMD Ryzen 4 was released. Releasing the M1 Pro/Max/Ultra and the M2 line widened the gap back again, but it was obvious that this was temporary. The main thing is that no one in the x86 world cares about single thread performance anyway because Intel and AMD cores have had 2 threads for decades and most applications have been written in that time to take advantage of them. So the gloating over single threaded performance was funny because it is a useless metric: single core performance (which measures ALL the threads available on the core) is far more useful. Ironically, if there is an application where single threaded performance matters it is gaming! Where the situation for Apple is hopeless because AAA games - whether console or "PC" - are nearly all 32 bit and use DirectX or Vulkan drivers. 

    Apple's "stark advantage" in power per watt is mostly due to being on TSMC 5 where AMD until recently was on TSMC 6 or TSMC 7 and Intel was on 14nm and is now on 10nm. You have already seen where AMD's 7040 4nm laptop chips outperform certain Apple Silicon chips on power per watt. But the big issue is going to be Intel because - unlike AMD - they have adopted big.LITTLE. When their laptop chips reach 5nm (Intel calls it 20A) they are going to rival Apple Silicon in power per watt also.

    Apple's market share was only 10% in the past year. Normally it is 5% to 7%. Also, "In the meantime don’t venture too far from a wall plug if you have a discrete GPU based PC" simply isn't true. It wasn't true of the Intel MacBook Pros that you guys happily used right up until 4Q 2020, so don't pretend that it is the case now. Even Intel Core i9 laptops with the most power-hungry Nvidia graphics cards get over 3 hours of battery life. CPUs and discrete GPUs that are actually designed for mobile laptops - as opposed to "laptops as workstations" - get 7 to 8 hours. Also, I would like to meet the people who are spending 11 hours editing 4K video with their MacBook Pros on their laps. They aren't. Their MacBooks are plugged into the wall just as their Wintel brethren are, because pretty much any actual work on a laptop requires a monitor. The only performance-intensive application where you would use the 14" laptop screen and keyboard/trackpad for hours instead of a monitor (when a "laptop" actually sits in your lap) is - again - gaming, where Apple will need to address the 32-bit and driver situation. 


  • Reply 71 of 80
    spheric Posts: 2,564member
    I don't know how other DAWs handle it, but in Logic, each channel strip is its own monolithic thread. 

    So single-thread performance directly affects which plugins, and how many, you can insert on a single channel strip. 

    There are workarounds, but if a single channel exceeds the single-thread performance of the CPU, the whole project comes to an abrupt halt, even if 80% of the total power goes unused. 

    I'm sure there must be other examples in the Windows world outside of gaming where tasks cannot simply be broken up and multi-threaded quite so easily. 
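    For readers unfamiliar with the constraint spheric describes, here is a rough Swift sketch of the idea, assuming a toy model where a "plugin" is just a buffer-transforming closure; ChannelStrip and renderAll are illustrative names, not Logic's actual API. The inserts on one channel strip must run one after another, so that chain is capped by single-thread speed, while independent channel strips can be spread across cores.

```swift
import Foundation

// Toy model: a "plugin" transforms an audio buffer and feeds the next plugin.
typealias Plugin = ([Float]) -> [Float]

struct ChannelStrip {
    let plugins: [Plugin]

    // The insert chain is inherently serial: each plugin needs the previous
    // plugin's output, so one heavy chain is limited by one core's speed.
    func render(_ buffer: [Float]) -> [Float] {
        plugins.reduce(buffer) { signal, plugin in plugin(signal) }
    }
}

// Different channel strips are independent of each other, so a DAW can
// render them concurrently across cores.
func renderAll(strips: [ChannelStrip], input: [Float]) -> [[Float]] {
    let lock = NSLock()
    var results = [Int: [Float]]()
    DispatchQueue.concurrentPerform(iterations: strips.count) { i in
        let rendered = strips[i].render(input) // chains run in parallel with each other
        lock.lock()
        results[i] = rendered                  // serialize only the bookkeeping
        lock.unlock()
    }
    return (0..<strips.count).map { results[$0] ?? [] }
}
```

    Nothing Logic-specific here, but it shows why, as spheric says, a single over-stuffed channel can stall a project even while most of the CPU sits idle.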
  • Reply 72 of 80
    tht Posts: 5,450member
    spheric said:
    I don't know how other DAWs handle it, but in Logic, each channel strip is its own monolithic thread. 

    So single-thread performance directly affects which plugins, and how many, you can insert on a single channel strip. 

    There are workarounds, but if a single channel exceeds the single-thread performance of the CPU, the whole project comes to an abrupt halt, even if 80% of the total power goes unused. 

    I'm sure there must be other examples in the Windows world outside of gaming where tasks cannot simply be broken up and multi-threaded quite so easily. 
    Basically 90% of the software that 90% of the market uses is dominated by single-threaded performance, and I think I'm being generous with the percentages. Web browsers are multi-process, multi-threaded monstrosities, but people don't use multiple web pages simultaneously. It's look at this web page, then look at this other web page, then look at yet another page. They aren't interacting with pages simultaneously. The CPU is idle 99% of the time during this process.

    The cores and memory are probably helpful because all the adware inside browsers is being run simultaneously. Yup, 99% of a typical user's computing resources are used by ads. Every time I visit a webpage with a gazillion pop-up ads, I basically cringe. What a waste. Otherwise, most multithreaded apps are single-thread dominant, with all but one thread waiting on user input. Even the OS is like this. Single-thread performance is hugely important for low-latency UI operations. So, the more single-thread performance, the better.

    If you are talking about compute intensive workflows, like solving equations, crunching data, then yes, it's nice to have a lot of cores as the embarrassingly parallel workflows will scale well with cores. But those are the "pros". They shouldn't be taking advice from the peanut gallery, which includes reviewers and mass market media.
    edited February 2023
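    tht's point about single-thread-dominant software versus embarrassingly parallel work can be made concrete with Amdahl's law: if only a fraction p of a task can run in parallel, n cores give at most 1 / ((1 - p) + p/n) speedup. The hedged Swift sketch below just evaluates that formula; the 10% and 95% figures are illustrative guesses in the spirit of tht's percentages, not measurements.

```swift
// Amdahl's law: with parallel fraction p and n cores,
// the best possible speedup is 1 / ((1 - p) + p / n).
func amdahlSpeedup(parallelFraction p: Double, cores n: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

// Latency-bound, interactive work (assume ~10% parallelizable) barely gains
// from extra cores, so single-thread speed dominates how fast it feels:
print(amdahlSpeedup(parallelFraction: 0.10, cores: 12)) // ≈ 1.10x

// An embarrassingly parallel render (assume ~95% parallelizable) scales far better:
print(amdahlSpeedup(parallelFraction: 0.95, cores: 12)) // ≈ 7.74x
```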
  • Reply 73 of 80
    spheric Posts: 2,564member
    Well, thanks for clarifying that the machines in this comparison are completely irrelevant to 90% of 90% of the market. 

    They’re perfectly well-served by a MacBook Air.


  • Reply 74 of 80
    dewme said:
    dewme said:
    dewme said:

    The operating systems and the applications that are optimized for those operating systems have a hell of a lot more sway over buying behaviors than do benchmarks or narrow slices of very domain specific applications. No matter what platform I’ve ever used, the computer always spends a heck of a lot more time waiting on me than I spend waiting on the computer. But if you get paid to run benchmarks the situation may be reversed. 
    Or, maybe if you actually work in the applications they used for these benchmarks? What a weird dismissal of the target audience for this comparison, because you're not one of those people.
    I have to agree with you. If your productivity and/or your happiness is directly tied to running the exact applications with the same or similar datasets to those used in a benchmark comparison then you should be seeking out the best computer that delivers those things for you. If you’re faced with a choice between computer A and computer B a particular benchmark that you care about may sway you one way or the other, everything else being equal. 

    The problem I have with Mac vs PC comparisons is that it is fairly rare for “everything else being equal.” This leads me to be dismissive of most benchmarks in such scenarios like the one in this article, mostly because they tend to be too narrowly focused rather than geared towards the more typical way most people use computers for a variety of tasks. They also tend to overlook the entrenchment effect that exists in both camps. 

    A lot of the reasons and rationale being used on both sides of the debate are mostly being used to justify the individual choices and preferences that have already been established and ingrained into one’s existing belief system, like the necessity of plugging the PC into an existing power source to attain full performance. If the tables were turned, say by Apple offering up a big performance boost when plugged in, there would be plenty of defenders of that option. 

    A lot of what we’re seeing is similar to car comparisons and benchmarks like 0-60 times. The 0-60 benchmarks may be completely legitimate, but they are rarely the deciding factor in choosing one vehicle type or vehicle brand over another. The entire vehicle taken in aggregate plus the needs of the buyer and their budget have greater sway than 0-60 times. But sure, to your point, if you are a professional drag racer of showroom stock cars and want to win races, you’re going to pick the car with the fastest 0-60 time and probably don’t care as much about the leather seats, cargo space, and seating capacity for 8 people. 
    This isn't a Mac vs PC comparison inasmuch as it's a comparison of raw power doing tasks that can measure just that using these chips. It's very plainly stated in the article what they're measuring. Not "the more typical way most people use computers for a variety of tasks". Not "leather seats" etc. And, once again, there are plenty of people who work in software like Premiere, Cinema 4D, Blender, etc all day long to whom this information is useful, myself included. 
    Would you personally switch platforms based on one particular product’s measurable performance advantage on a set of benchmarks that matter to you? If a year later the tables are turned, would you then go back the other way? If my income was tightly bound to the performance deltas and the apps in question were a revenue affecting constraint/bottleneck in my overall business process then I would have to answer “yes.” Most business people don’t want to leave money on the table, regardless of their personal preferences.

    On the other hand, I do understand what you and others are saying with respect to a particular processor+GPU combination from the overwhelmingly dominant personal computing platform clearly demonstrating that for certain applications, like GPU intensive creativity apps and games, the current “best benchmark leader” that Apple has available is clearly not the fastest option in the overall market, a market that Apple owns about 10% of. But this has generally been the case for a very long time, especially when it comes to gaming.

    The real question is whether these obvious disparities at the processing level will inspire Apple to up their game. I think that for as long as Apple enjoys its advantage across the “variety of tasks” spectrum with its intensely loyal 10% market share customer base the answer is “eventually.” Apple set out on a path with Apple Silicon knowing where their pros/cons were and with some stark advantages in some areas like performance per watt, single threaded performance, and graphics performance for an integrated CPU based system. They are now iterating on their initial design to improve on the pros and lessen the cons.

    Apple’s competition is absolutely doing the same thing in terms of their own set of pros/cons. I’m sure that the competition’s answer to the same question, whether their obvious disparities in areas where Apple dominates their current offerings will inspire them to up their game is also “eventually.” The relative market share disparity hasn’t changed a whole lot either way in a very long time and Apple is doing extremely well on the revenue front with their small slice. They are sticking to a roadmap that is serving them very well. In the meantime don’t venture too far from a wall plug if you have a discrete GPU based PC and try to force yourself to get excited about Apple Arcade if you’re a Mac user.
    So, the short answer is no, I would not switch. I’ve been a Mac user since the originals and am not switching, ever. But many people in my industries did exactly that because they were working heavily in 3D, for example, and Nvidia owns that market, so Macs dropping support for their cards meant having to go PC. I mostly work in 2D and video editing, where my Macs have been fine for the level of work I’ve done. I’m getting back into 3D, but if I ever needed the power I’d probably just use online render farms rather than getting even a headless PC render box. My hope is that by the time I need that, Apple *has* upped their GPU game further. And then I'd build my own render cluster of Mac minis, or get a Mac Pro if they come up with an adequate solution there.
  • Reply 75 of 80
    avon b7 said:
    avon b7 said:

    A MBP will give you more battery time but an Intel laptop of this kind will do the job well enough if you aren't pushing it to the limit (which you would only do with it plugged in anyway). 
    As someone who works in animation and video, I can tell you firsthand that the difference between battery life on my 16" M1 Max and my previous 2018 i9 MBP is *HUGE*, and I work away from my desks all the time. Being able to work for hours in After Effects/FCP without having to look for an outlet is a game changer for me, and for a lot of people (see the M2 Pro/Max unplugged challenge video Apple showed at that event). My 140W charger rarely comes out of my bag.

    You're completely missing the point when you say "which you would only do with it plugged in anyway" — the point is, you no longer have to do that with Apple Silicon. That's game changing.
    I am not missing the point. 

    The point isn't battery life. The point is performance. Reduced battery life is simply a byproduct of the performance and is rendered (pun?) worthless when evaluating exactly and only that.

    As I said, anyone considering this kind of machine is fully aware of the battery powered tradeoffs. 

    You may work away from your desk most of the time but are you claiming there are no sockets on hand almost everywhere you work? 
    There’s a difference between a Mac that can run under load for hours without any performance drop and a PC that loses all of its power advantages when not plugged in and lasts far less time. The availability of sockets is not the issue; there are myriad scenarios where it’s inconvenient or impossible to plug in, or where pulling out an already-depleted laptop in a meeting is a problem, all of which matter. Just saying "which you would only do with it plugged in anyway" conveniently waves away all of those concerns when in reality it is a major differentiator.
  • Reply 76 of 80
    thadec said:
    dewme said:
    dewme said:
    dewme said:

    The operating systems and the applications that are optimized for those operating systems have a hell of a lot more sway over buying behaviors than do benchmarks or narrow slices of very domain specific applications. No matter what platform I’ve ever used, the computer always spends a heck of a lot more time waiting on me than I spend waiting on the computer. But if you get paid to run benchmarks the situation may be reversed. 
    Or, maybe if you actually work in the applications they used for these benchmarks? What a weird dismissal of the target audience for this comparison, because you're not one of those people.
    I have to agree with you. If your productivity and/or your happiness is directly tied to running the exact applications with the same or similar datasets to those used in a benchmark comparison then you should be seeking out the best computer that delivers those things for you. If you’re faced with a choice between computer A and computer B a particular benchmark that you care about may sway you one way or the other, everything else being equal. 

    The problem I have with Mac vs PC comparisons is that it is fairly rare for “everything else being equal.” This leads me to be dismissive of most benchmarks in such scenarios like the one in this article, mostly because they tend to be too narrowly focused rather than geared towards the more typical way most people use computers for a variety of tasks. They also tend to overlook the entrenchment effect that exists in both camps. 

    A lot of the reasons and rationale being used on both sides of the debate are mostly being used to justify the individual choices and preferences that have already been established and ingrained into one’s existing belief system, like the necessity of plugging the PC into an existing power source to attain full performance. If the tables were turned, say by Apple offering up a big performance boost when plugged in, there would be plenty of defenders of that option. 

    A lot of what we’re seeing is similar to car comparisons and benchmarks like 0-60 times. The 0-60 benchmarks may be completely legitimate, but they are rarely the deciding factor in choosing one vehicle type or vehicle brand over another. The entire vehicle taken in aggregate plus the needs of the buyer and their budget have greater sway than 0-60 times. But sure, to your point, if you are a professional drag racer of showroom stock cars and want to win races, you’re going to pick the car with the fastest 0-60 time and probably don’t care as much about the leather seats, cargo space, and seating capacity for 8 people. 
    This isn't a Mac vs PC comparison inasmuch as it's a comparison of raw power doing tasks that can measure just that using these chips. It's very plainly stated in the article what they're measuring. Not "the more typical way most people use computers for a variety of tasks". Not "leather seats" etc. And, once again, there are plenty of people who work in software like Premiere, Cinema 4D, Blender, etc all day long to whom this information is useful, myself included. 
    Would you personally switch platforms based on one particular product’s measurable performance advantage on a set of benchmarks that matter to you? If a year later the tables are turned, would you then go back the other way? If my income was tightly bound to the performance deltas and the apps in question were a revenue affecting constraint/bottleneck in my overall business process then I would have to answer “yes.” Most business people don’t want to leave money on the table, regardless of their personal preferences.

    On the other hand, I do understand what you and others are saying with respect to a particular processor+GPU combination from the overwhelmingly dominant personal computing platform clearly demonstrating that for certain applications, like GPU intensive creativity apps and games, the current “best benchmark leader” that Apple has available is clearly not the fastest option in the overall market, a market that Apple owns about 10% of. But this has generally been the case for a very long time, especially when it comes to gaming.

    The real question is whether these obvious disparities at the processing level will inspire Apple to up their game. I think that for as long as Apple enjoys its advantage across the “variety of tasks” spectrum with its intensely loyal 10% market share customer base the answer is “eventually.” Apple set out on a path with Apple Silicon knowing where their pros/cons were and with some stark advantages in some areas like performance per watt, single threaded performance, and graphics performance for an integrated CPU based system. They are now iterating on their initial design to improve on the pros and lessen the cons.

    Apple’s competition is absolutely doing the same thing in terms of their own set of pros/cons. I’m sure that the competition’s answer to the same question, whether their obvious disparities in areas where Apple dominates their current offerings will inspire them to up their game is also “eventually.” The relative market share disparity hasn’t changed a whole lot either way in a very long time and Apple is doing extremely well on the revenue front with their small slice. They are sticking to a roadmap that is serving them very well. In the meantime don’t venture too far from a wall plug if you have a discrete GPU based PC and try to force yourself to get excited about Apple Arcade if you’re a Mac user.
    Apple's "stark advantages" in single threaded performance was never as big as the Apple fandom claimed, as benchmarks done not long after the M1 became available showed, and became even less so after 12th gen, 13th gen Intel and AMD Ryzen 4 was released. Releasing the M1 Pro/Max/Ultra and the M2 line widened the gap back again, but it was obvious that this was temporary. The main thing is that no one in the x86 world cares about single thread performance anyway because Intel and AMD cores have had 2 threads for decades and most applications have been written in that time to take advantage of them. So the gloating over single threaded performance was funny because it is a useless metric: single core performance (which measures ALL the threads available on the core) is far more useful. Ironically, if there is an application where single threaded performance matters it is gaming! Where the situation for Apple is hopeless because AAA games - whether console or "PC" - are nearly all 32 bit and use DirectX or Vulkan drivers. 

    Apple's "stark advantage" in power per watt is mostly due to being on TSMC 5 where AMD until recently was on TSMC 6 or TSMC 7 and Intel was on 14nm and is now on 10nm. You have already seen where AMD's 7040 4nm laptop chips outperform certain Apple Silicon chips on power per watt. But the big issue is going to be Intel because - unlike AMD - they have adopted big.LITTLE. When their laptop chips reach 5nm (Intel calls it 20A) they are going to rival Apple Silicon in power per watt also.

    Apple's market share was only 10% in the past year. Normally it is 5% to 7%. Also "In the meantime don’t venture too far from a wall plug if you have a discrete GPU based PC" simply isn't true. It wasn't true of the Intel MacBook Pros that you guys happily used right up until 4Q 2020 so don't pretend that it is the case now. Even Intel Core i9 laptops with the most power-hungry Nvidia graphics cards get over 3 hours battery life. CPUs and discrete GPUs that are actually designed for mobile laptops - as opposed to "laptops as workstations" - get 7 to 8 hours. Also, I would like to know the people that are spending 11 hours editing 4K videos with their MacBook Pros on their laps. They aren't. Their MacBooks are plugged into the wall just as their Wintel brethren are, because pretty much any actual work on a laptop requires a monitor. The only performance-intensive application where using the 14" laptop screen and keyboard/trackpad for hours instead of a monitor (when a "laptop" actually sits in your lap) is - again - gaming where Apple will need to address the 32 bit and driver situation. 


    Lotta BS here. 

     - Gaming — you’re ignoring cross platform engines like Unreal/Unity. Companies can port their engines to Apple silicon — look at Capcom’s RE Engine and Resident Evil Village that just came out. DICE, Infinity Ward, et al could do the same for their engines as well. To claim the future is 32bit is just absurd. When Apple dropped 32 bit support, I lost access to maybe 2-3 really old games in my Steam library or from the App Store, out of like 30 maybe. Most everything was 64bit. 

     - “It wasn't true of the Intel MacBook Pros that you guys happily used right up until 4Q 2020 so don't pretend that it is the case now.” — Nonsense. My i9 MBP would maybe get 90 minutes under load, if I was lucky. My M1 Max gets easily 5-7 hours under the same kind of load (I think? It’s rare I need to stay on battery until it completely depletes because it lasts so long), and is quiet and cool where the i9 sounded like a hair dryer and was hot af. 

     - The PC laptop in this article dropped in performance greatly without a wall plug and does not have anywhere near the battery life the Mac does. This part particularly contradicts your claim: “In Miani's test, the battery loss from rendering that same 30-minute 4K 60FPS clip was also dramatic. From 100 percent battery, the MBP with M2 Max lost 10 percent of its battery capacity, while the MSI dropped from 100 to 42 percent doing the same job.” 

     - “Their MacBooks are plugged into the wall just as their Wintel brethren are, because pretty much any actual work on a laptop requires a monitor. The only performance-intensive application where using the 14" laptop screen and keyboard/trackpad for hours instead of a monitor (when a "laptop" actually sits in your lap) is - again - gaming where Apple will need to address the 32 bit and driver situation.” — Nonsense. I do tons of performance intensive work without my external display and not plugged into anything. I bet you think you need a mouse to do any serious work as well. Also, the 32-bit thing again — L O L.
    edited February 2023
  • Reply 77 of 80
    Users who want this kind of performance don't really care that much about having to plug in their laptop. They are not going to do their expert magic at Starbucks. Instead they might work at home or at a co-working space, where outlets aren't exactly an issue.

    Yes, it all might be less elegant than a Mac, but the raw power offered here for the dollars paid is much better value than an M2 Max.
    Users who care about this kind of performance just don't buy a laptop...
  • Reply 78 of 80
    chasm Posts: 3,304member
    What about noise, heat and weight? It’s not really a laptop; it’s more of a mobile workstation, basically a desktop with a battery so they can call it a laptop. Reminds me of how Apple used to advertise the portability of the original Mac, which could be carried in a pack. Why bother with the battery at all?
    Funnily enough, noise, heat, and weight are all addressed — in both the article and the video. Not sure how you missed it?
  • Reply 79 of 80
    kellie Posts: 50member
    I think the EPA needs to regulate power consumption for computers. They are getting ready to impose even crazier energy and water consumption standards on dishwashers, and there are many more computers in use than dishwashers. Save the planet with a new dishwasher and more efficient computers. Why is Apple Silicon so much more energy efficient? 5nm technology? On-chip integration of memory, GPU, and neural engines? Better integration between the OS and the hardware? Intel and Windows have always had the challenge of having to be more generic in architecture in order to work with so many third-party hardware vendors. The MSI config does trounce the M2 Max on a variety of performance metrics, so I guess it’s good to know you have that level of performance if you need it. But it would be nice if Intel could design better power efficiency into their chips. It seems like Intel is way behind TSMC in fab technology, including the soon-to-be-used 3nm process.