thadec

About

Username
thadec
Joined
Visits
18
Last Active
Roles
member
Points
469
Badges
0
Posts
97
  • Apple Silicon dominates the market for ARM chips & will drive growth

    Good grief. There is so much wrong here.

    Unlike the x86 chip architecture from Intel, designed to be general-purpose processors, ARM-based system-on-a-chips (SoC) are highly customizable. Companies can design these processors as Apple has to include more high-performance CPU cores and integrated memory to compete with x86 CPUs.

    Totally ignoring that Apple went to Intel for a custom-packaged Core 2 Duo for the original MacBook Air. Or that Microsoft and Sony get custom-made chips from AMD for their Xbox and PlayStation consoles every generation. Or that Valve and Asus went to AMD for custom-made chips for their handheld consoles.

    The custom cores allow advanced features such as artificial intelligence and machine learning tasks, such as Apple's Neural Engine it includes in its processors.

    Yeah ... Intel and AMD have their own integrated NPUs also.

    ARM chips can also be more efficient for power and customized to meet specific power requirements.

    As can x86 chips. The latest AMD 7040 laptop chips deliver better performance per watt than the M1 and M1 Pro. Intel has a whole range of laptop chips that deliver better performance per watt than the iPad Pro does.

    Integrating GPUs into the chips also improves performance by accelerating computational tasks such as machine learning and image recognition. As a result, it allows for faster and more accurate results in computing and makes it possible to run advanced apps and software on ARM-based computers.

    Literally every single Intel and AMD laptop chip has an integrated GPU, and has for decades. Most Intel desktop chips have integrated GPUs too; only AMD desktop chips regularly lack them. Are they as good as Apple's integrated GPUs? Not yet, though the integrated GPU on the latest AMD laptop chips performs as well as an Nvidia GTX 1650: https://wccftech.com/amd-radeon-780m-rdna-3-igpu-comes-within-striking-distance-of-nvidia-gtx-1650-dgpu/ But they don't have to be. Because - unlike Apple Silicon - you have the option of using discrete GPUs with Intel and AMD chips.

    As for the unified memory thing: face reality. First off, no one needs it. Apple uses unified memory to maximize performance; Intel and AMD can get the performance they need without it. Second, no one wants it. People prefer the ability to cheaply expand RAM on low and midrange systems, and they prefer the ability to add massive amounts of RAM on high end systems and servers. Being able to buy an entry level system for cheap and upgrade the RAM to 64 GB for $100 is huge. And this site states that actual machine learning workstations often require 1 TB of RAM https://www.pugetsystems.com/solutions/scientific-computing-workstations/machine-learning-ai/hardware-recommendations/#ram so you can forget about the proposed 192 GB RAM Mac Pro being used by that crowd.

    Look, I am not saying that the people who write for Apple Insider need a computer engineering background or anything like that, but so much in this article can be easily debunked just by spending 30 minutes on Anandtech. At least the magical and wishful thinking that Daniel Eran Dilger was infamous for on Apple Insider was mostly opinion.
  • Apple struggles to break free from Samsung display reliance

    The editorial slant of this article is amazing. What is Samsung supposed to do? Let Apple take their IP and use it to create a competing product? Without even licensing it or paying for it? I know that people have this "every other global corporation is cutthroat and sometimes shady, but Apple is 100% above board" mindset, but consider this context:

    1. Right now, having the best screens in the business is Samsung's sole market advantage. No one cares about "thin and light" anymore (something that Samsung refuses to acknowledge because they sunk hundreds of billions in R&D into competing with Apple in this area). Google has adopted much of their UX and multitasking software - the portions that most people actually like - into base Android. The Chinese Android phones are faster - as they are designed to be overclocked without throttling rather than to be as thin and light as possible - and cheaper (state subsidized). So if Apple does the same, what will Samsung have left?

    2. It isn't just smartphones and tablets either. Samsung uses their screens to push their (overpriced) laptops - it is the reason why they don't sell desktops - as well as their TVs and monitors. No, Apple doesn't make Windows or ChromeOS laptops, TVs or general purpose monitors. Fine: what keeps Apple from licensing the tech (or worse, the process know-how that can't be copyrighted or patented) to companies that do, just to spite Samsung?

    3. Don't forget that Apple is the same company that used Qualcomm's IP for years without paying for it, using the nonsense justification "we invented the smartphone and tablet so without us your IP wouldn't be worth anything anyway ... so we should pay substantially less for the same tech than everyone else including our direct competition." (Yes, the iPhone made Qualcomm's LTE and 5G patents more valuable. But without mobile radio technology, mobile phones like the iPhone wouldn't exist in the first place. Anyone remember the Apple Newton? Exactly. By contrast, though it wasn't making as much money, Qualcomm was doing fine selling its tech to Nokia, BlackBerry, Motorola etc. before the iPhone came along.)

    4. And finally, do not forget that Apple spent years suing Samsung with patent infringement lawsuits that they knew they had no chance of winning, but that were designed to either A. intimidate Samsung into abandoning the smartphone market (this was before Android took off, which Apple honestly never thought would happen, as some old "Android will never outsell iOS!" articles on Apple Insider from around 2010 and 2011 can attest) or B. ruin Samsung's reputation by branding them a "copier" and "thief" - which worked, as people still refer to Samsung as such to this day, even though the fingerprint sensor and abandoning the headphone jack are the only two things Samsung has copied from Apple in over 10 years (and even there, Motorola was technically first with the fingerprint sensor, and Samsung's budget phones still have the headphone jack). Meanwhile, Apple freely appropriates features and design language from Samsung wholesale - it has been the basis of nearly every major change to the iPhone and iPad since the iPhone 5 - and no one bats an eye.

    Samsung and Apple need each other. But they don't like or trust each other. And the reasons for their enmity are mutual. (Another example: Apple bought Beats for the sole purpose of short-circuiting Samsung's very successful cross-licensing deal with them. Proof? What has Apple done with Beats since buying them? They haven't released new headphones in 6 years, haven't released earbuds that were actually good in 4 years, and I can't even remember the last major Beats advertising campaign.) Oh yeah, and the bit about how Samsung refuses to agree to the terms that Apple usually forces on its suppliers ignores how many Apple suppliers have been bankrupted by those same terms. Samsung has the clout to prevent Apple from taking them to the cleaners like what happened to GT Advanced Technologies. Who ... doesn't exist anymore. The reason: https://www.reuters.com/article/us-apple-gt-advanced-tech/gt-advanced-bankruptcy-offers-warning-to-apple-suppliers-idUSKCN0HX0XV20141008

    Amazing how Samsung's actions to protect itself are being spun as a negative here.
  • Razer Blade 16 vs Apple MacBook Pro 16-inch - compared

    There ARE some tests that could have been run that would have been meaningful:
    4K/8K video editing using the same apps (DaVinci Resolve, the Adobe suite).
    Handbrake video encoding.
    Data science and/or machine learning benchmarking (Python, etc.).

    But with the single core scores so close and the graphics scores overwhelmingly favoring the gaming laptop, it is easy to guess why they weren't done.
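    To make the data science suggestion concrete, here is a minimal sketch of the kind of cross-platform benchmark I mean - assuming nothing beyond Python and numpy, with the matrix size and repeat count as arbitrary placeholders. Since numpy dispatches to each platform's native BLAS (Accelerate on macOS, OpenBLAS or MKL on Windows/Linux), the same script exercises both machines' math stacks:

        # Minimal matrix-multiply throughput benchmark (sizes are placeholders).
        import time
        import numpy as np

        def time_matmul(n=4096, repeats=5):
            rng = np.random.default_rng(0)  # fixed seed so both machines get identical inputs
            a = rng.standard_normal((n, n))
            b = rng.standard_normal((n, n))
            best = float("inf")
            for _ in range(repeats):
                start = time.perf_counter()
                np.matmul(a, b)
                best = min(best, time.perf_counter() - start)
            # Multiplying two n x n matrices costs about 2*n^3 flops.
            return best, 2 * n**3 / best / 1e9

        if __name__ == "__main__":
            secs, gflops = time_matmul()
            print(f"best run: {secs:.2f}s ({gflops:.1f} GFLOPS)")

    The same harness could wrap a Handbrake encode or a model-training step instead; the point is running identical code on identical inputs on both machines.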
  • Intel has a faster processor than M2 Max, but at what cost?

    dewme said:

    The operating systems and the applications that are optimized for those operating systems have a hell of a lot more sway over buying behaviors than do benchmarks or narrow slices of very domain specific applications. No matter what platform I’ve ever used, the computer always spends a heck of a lot more time waiting on me than I spend waiting on the computer. But if you get paid to run benchmarks the situation may be reversed. 
    Or, maybe if you actually work in the applications they used for these benchmarks? What a weird dismissal of the target audience for this comparison, because you're not one of those people.
    I have to agree with you. If your productivity and/or your happiness is directly tied to running the exact applications with the same or similar datasets to those used in a benchmark comparison then you should be seeking out the best computer that delivers those things for you. If you’re faced with a choice between computer A and computer B a particular benchmark that you care about may sway you one way or the other, everything else being equal. 

    The problem I have with Mac vs PC comparisons is that it is fairly rare for “everything else being equal.” This leads me to be dismissive of most benchmarks in such scenarios like the one in this article, mostly because they tend to be too narrowly focused rather than geared towards the more typical way most people use computers for a variety of tasks. They also tend to overlook the entrenchment effect that exists in both camps. 

    A lot of the reasons and rationale being used on both sides of the debate are mostly being used to justify the individual choices and preferences that have already been established and ingrained into one’s existing belief system, like the necessity of plugging the PC into an existing power source to attain full performance. If the tables were turned, say by Apple offering up a big performance boost when plugged in, there would be plenty of defenders of that option. 

    A lot of what we’re seeing is similar to car comparisons and benchmarks like 0-60 times. The 0-60 benchmarks may be completely legitimate, but they are rarely the deciding factor in choosing one vehicle type or vehicle brand over another. The entire vehicle taken in aggregate plus the needs of the buyer and their budget have greater sway than 0-60 times. But sure, to your point, if you are a professional drag racer of showroom stock cars and want to win races, you’re going to pick the car with the fastest 0-60 time and probably don’t care as much about the leather seats, cargo space, and seating capacity for 8 people. 
    This isn't a Mac vs PC comparison inasmuch as it's a comparison of raw power doing tasks that can measure just that using these chips. It's very plainly stated in the article what they're measuring. Not "the more typical way most people use computers for a variety of tasks". Not "leather seats" etc. And, once again, there are plenty of people who work in software like Premiere, Cinema 4D, Blender, etc all day long to whom this information is useful, myself included. 
    Would you personally switch platforms based on one particular product’s measurable performance advantage on a set of benchmarks that matter to you? If a year later the tables are turned, would you then go back the other way? If my income was tightly bound to the performance deltas and the apps in question were a revenue affecting constraint/bottleneck in my overall business process then I would have to answer “yes.” Most business people don’t want to leave money on the table, regardless of their personal preferences.

    On the other hand, I do understand what you and others are saying with respect to a particular processor+GPU combination from the overwhelmingly dominant personal computing platform clearly demonstrating that for certain applications, like GPU intensive creativity apps and games, the current “best benchmark leader” that Apple has available is clearly not the fastest option in the overall market, a market that Apple owns about 10% of. But this has generally been the case for a very long time, especially when it comes to gaming.

    The real question is whether these obvious disparities at the processing level will inspire Apple to up their game. I think that for as long as Apple enjoys its advantage across the “variety of tasks” spectrum with its intensely loyal 10% market share customer base the answer is “eventually.” Apple set out on a path with Apple Silicon knowing where their pros/cons were and with some stark advantages in some areas like performance per watt, single threaded performance, and graphics performance for an integrated CPU based system. They are now iterating on their initial design to improve on the pros and lessen the cons.

    Apple’s competition is absolutely doing the same thing in terms of their own set of pros/cons. I’m sure that the competition’s answer to the same question, whether their obvious disparities in areas where Apple dominates their current offerings will inspire them to up their game is also “eventually.” The relative market share disparity hasn’t changed a whole lot either way in a very long time and Apple is doing extremely well on the revenue front with their small slice. They are sticking to a roadmap that is serving them very well. In the meantime don’t venture too far from a wall plug if you have a discrete GPU based PC and try to force yourself to get excited about Apple Arcade if you’re a Mac user.
    Apple's "stark advantages" in single threaded performance were never as big as the Apple fandom claimed, as benchmarks done not long after the M1 became available showed, and became even less so after Intel's 12th and 13th gen and AMD's Zen 4 were released. Releasing the M1 Pro/Max/Ultra and the M2 line widened the gap again, but it was obvious that this was temporary. The main thing is that no one in the x86 world cares about single thread performance anyway, because Intel and AMD cores have run 2 threads each for two decades and most applications written in that time take advantage of them. So the gloating over single threaded performance was funny, because it is a useless metric: single core performance (which measures ALL the threads available on the core) is far more useful. Ironically, if there is an application where single threaded performance matters, it is gaming! And there the situation for Apple is hopeless, because AAA games - whether console or "PC" - are built against DirectX or Vulkan and many are still 32 bit.
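    If the distinction between single thread and single core sounds abstract, a quick illustration - just a sketch; psutil is a third-party package (pip install psutil) - makes it concrete:

        # Logical CPUs count hardware threads; "physical" counts cores.
        # On SMT/Hyper-Threading x86 parts the logical count is typically
        # twice the physical count, while Apple Silicon reports them as equal.
        import os
        import psutil

        logical = os.cpu_count()
        physical = psutil.cpu_count(logical=False)
        print(f"physical cores: {physical}, hardware threads: {logical}")

    A "single core" score on an SMT chip therefore reflects two hardware threads working together, while a "single thread" score deliberately uses only one of them.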

    Apple's "stark advantage" in performance per watt is mostly due to being on TSMC's 5nm node while AMD until recently was on TSMC 6nm or 7nm and Intel was on 14nm and is now on 10nm. You have already seen AMD's 4nm 7040 laptop chips outperform certain Apple Silicon chips on performance per watt. But the big issue is going to be Intel because - unlike AMD - they have adopted big.LITTLE-style hybrid cores. When their laptop chips reach the 20A node, they are going to rival Apple Silicon in performance per watt too.

    Apple's market share was only 10% in the past year. Normally it is 5% to 7%. Also, "In the meantime don't venture too far from a wall plug if you have a discrete GPU based PC" simply isn't true. It wasn't true of the Intel MacBook Pros that you guys happily used right up until 4Q 2020, so don't pretend that it is the case now. Even Intel Core i9 laptops with the most power-hungry Nvidia graphics cards get over 3 hours of battery life. CPUs and discrete GPUs that are actually designed for mobile laptops - as opposed to "laptops as workstations" - get 7 to 8 hours. Also, I would like to meet the people who are spending 11 hours editing 4K videos with their MacBook Pros on their laps. They aren't. Their MacBooks are plugged into the wall just as their Wintel brethren's are, because pretty much any actual work on a laptop requires a monitor. The only performance-intensive application where you would use the 14" laptop screen and keyboard/trackpad for hours instead of a monitor (where a "laptop" actually sits in your lap) is - again - gaming, where Apple will need to address the 32 bit and driver situation.


  • Intel has a faster processor than M2 Max, but at what cost?

    Look below. The reality: Intel's competition is AMD and vice versa. Neither of them - nor the fans of the PC ecosystem (inclusive of Windows, ChromeOS and Linux, the latter having been a real factor in workstations and servers for decades and now a factor in gaming thanks to the AMD-powered Steam Deck) - cares about Macs.

    https://videocardz.com/newz/amd-ryzen-9-7945hx-16-core-laptop-cpu-trades-places-with-intel-13th-gen-core-i9-hx-series-in-geekbench-leak