dewme

About

Username: dewme
Joined:
Visits: 560
Last Active:
Roles: member
Points: 10,556
Badges: 2
Posts: 4,250
  • 13-inch MacBook Pro with M2 review: Incremental upgrade and unexciting

    Alex_V said:
    Without contradicting any of the opinions expressed here, I’d like to remind everyone of one important concept, something that Apple does better than anyone in the industry: ‘amortisation’—the gradual process whereby a manufacturer recovers the initial capital investment on a new product. The sums invested in setting up production lines or factories are staggering. Ideally, as with iPhones, that investment is recovered quickly because of the fantastic volumes sold. But in product models that don’t sell in such great numbers, the longer you can keep that production line going the more profitable it becomes. It is like an orange where, instead of giving less juice the more you squeeze, it gives more. If Apple’s margin on hardware averages 25%, then the more production lines that it has churning out older models the more profitable the company can be, because older models make 35–40% (guesstimate). Also, by limiting their product range, Apple can go to their supplies and say: “We want this component, we plan to order 2 million a year, for the next 7 years.” If you are a supplier like LG or Sanyo or Samsung, you cry tears of joy when you win an Apple order, because you’re guaranteed revenue for so long! E.g. imagine how many LG-manufactured 27" 5K monitors Apple has sold since 2015. Apple can use that fact to bargain and drive component costs down, which is why no company can match Apple quality, for the same price, while hoping to come anywhere near Apple’s margins.
    Those are all sound and indisputable product development and supplier management concepts, all part of the Tim Cook mastery that has been executed within Apple with amazing precision during his time at the helm. This may have some bearing on the new M2 MacBook Pro feeling left behind compared to its MacBook Air sibling, but I think the reasons are diffused across a wider set of factors beyond amortizing up-front, non-recurring, and component-supplier-related investment costs.

    In my opinion, it’s probably a contributing factor, but not the primary one. I believe the primary throttle on Apple’s ability to move even faster is human resource limitations, as in getting enough people with the right skills assigned to its product teams. Apple threw down the gauntlet when they committed to Apple Silicon over Intel. To make everything more challenging they put a hard timeline around completing the transition, not simply starting it. This basically rewound every Mac product team back to square one and chopped the tail off of the ability to amortize the cost of certain components over a longer period of time. Fortunately, Apple’s profitability and war chest of funds allows it to absorb substantial unrecoverable sunk costs. 

    My opinion is that Apple isn’t putting older technology, like the Touch Bar, into products like the M2 MacBook Pro to “use up” their stash of Touch Bar assemblies acquired in great volumes, but rather that they lack the human resources to simultaneously redesign more than a couple of high demand new products at a time under tight schedule demands. If they could trickle out releases over an extended time frame things would be much different. But they set a tight timeline for the Apple Silicon transition so trickling out new products is not an option.

    The first generation of M1 Macs were obviously “transitional” designs, but now they are getting down to the serious business of designing products that fully exploit the benefits of Apple Silicon. It’s very telling that we have not seen a “transitional” version of the Mac Pro. As Apple’s top tier flagship Mac it would be a serious mistake for Apple to try to bring a transitional design of the Mac Pro to market. The next Mac Pro has to exude the benefits of Apple Silicon to its core, or more likely, its plethora of cores.
  • 13-inch MacBook Pro with M2 review: Incremental upgrade and unexciting

    Very well written review. I think you hit the nail on the head with this statement:

    Nearly every workload applied by nearly every user is single-core and a burst process. The processor isn't running long enough, hot enough, for any of this to make a real difference to the overwhelming majority of Mac users looking at this price point.

    If notebook computers were highly task-driven dogs like a Belgian Malinois or Border Collie, they'd be chewing holes in our sheetrock and destroying our sofa cushions due to our utter inattention and lack of tasking. All those idle cycles, with the computer just waiting for us to do anything to raise its pulse above neutral before we suddenly throw a big job at it for the tiniest period of time, don't add up to much of anything. The M1 is so power efficient that I'm surprised the fans in the 13" MBP don't suffer from stiction problems due to their infrequent use.

    I guess a few folks do throw some big workloads at their lower end notebooks, but if you're really hellbent on throwing big jobs at a lower tier Apple computer shouldn't you be using something like a Mac mini at the very least? Relatively speaking, its case has a lot more cooling capacity than just about any MBP. If you're using a 13" MBP that's hammering the fan more than 50% of the time, maybe you should be looking at the much more powerful 14" or 16" MacBook Pro. Time is money.

    Not trying to sound glib, but if the only "penalty" you're paying for not having active cooling in your MacBook Air is a bit of a slowdown in the rare instances where you need enough processing power over a short duration of time that is sufficient to cause your processor to break a sweat, what is the big deal? For the other 99% of the time the blissful silence, better screen, lighter weight, and better camera on the MacBook Air will more than offset those brief periods of incremental slowdown. 

    Like you said, the 13" M2 MBP slots into a price point that may make it attractive to some buyers. But there is no denying that the new M2 equipped MacBook Air is a far better and more attractive value for the vast majority of buyers in this price range. Heck, even the M1 MacBook Air is still more attractive than the 13" M2 MBP for the vast majority of users who will see little to no benefit from the incremental M2 enhancements.

    Anyone who's been following the "Air" lineage of Apple products, including MacBooks and iPads, fully understands that an "Air" product always implies some compromises versus "Pro" products, but the price-versus-performance ratio is always favorable and satisfying. The "Pro" moniker typically means that compromises are not to be expected, at least not relative to where the specific Pro product sits in the Pro price hierarchy. The 13" MBP simply does not live up to its Pro designation. It's too close to the Air and carries nearly all of the Air's compromises. When the Pro is a generation behind the Air on design, things only get worse for the Pro.

    I think the current 13" M2 MacBook Pro was a fairly easy product to get to market in a short period of time. Apple undoubtedly has a next-gen 13" Pro in the pipeline and they are using the current one as a placeholder to keep their product offerings fleshed out as much as they can. It is what it is, but there is very little to get excited about.
  • Original Apple-1 computer signed by Steve Wozniak sold for $340,100

    While the 6502 was a derivative of Motorola 8-bit designs, I believe it was a MOS Technology part. Cool to see all those socketed (probably TTL) chips and big electrolytic capacitors, one of which appears to be leaking. Of course it would be totally insane to “fix” anything at all on the board, because it’s a museum piece, not a functional tool.
  • New 'PacMan' flaw in Apple Silicon is an echo of Spectre and Meltdown

    Marvin said:
    dewme said:
    How did AppleInsider determine that physical access to the target device is required? I didn't see that mentioned at all in the MIT whitepaper. I've seen conflicting statements on other sites, for example https://www.tomshardware.com/news/mit-finds-vulnerability-in-arm-chips-demos-pacman-attack-on-apple-m1.
    The main site for it says it can be used remotely but it needs to be able to run code and they use C++, which sounds like they are transferring a compiled C++ executable over the network:

    https://pacmanattack.com
    https://pacmanattack.com/paper.pdf

    "Does this attack require physical access?
    Nope! We actually did all our experiments over the network on a machine in another room. PACMAN works just fine remotely if you have unprivileged code execution."

    This is to circumvent a protection used to stop other attacks. So they have to get native code with memory access onto the device, use this attack successfully to bypass the protection, then run a different attack to do some damage. Getting native code onto a device in the first place would allow someone to do all kinds of damage anyway.

    Security researchers get all excited about this stuff because they are stuck in a room all day testing these things but the results are mainly of interest to other security people. Apple commented on it with the following statement:

    "Based on our analysis as well as the details shared with us by the researchers, we have concluded this issue does not pose an immediate risk to our users and is insufficient to bypass operating system security protections on its own."
    Agreed, the researchers have in effect demonstrated something that is possible, but carrying it out in the wild rather than in a lab and overcoming the obstacles and satisfying all the caveats is another thing altogether.

    I have nothing but respect and gratitude for the researchers who are doing these kinds of investigations. It takes us so far beyond where we were 20 or 30 years ago, when "just make it work" rarely (okay, never) involved doing even a shallow dive on potential security or privacy concerns. They are doing a fabulous job of finding latent issues baked into designs, not only at the surface but now much deeper down, and conditional ones exposed by other issues. In most cases they are discovering these using reverse engineering and behavioral analysis, not by digging into the design artifacts that are only in the hands of the product owners. That's some pretty good detective work.

    The only issue I have with some of these discoveries is that the target audience for the subsequent disclosures is often just the product owner, not the general public. In the spirit of transparency everything gets pushed out to everyone and makes for some exciting large font headlines, especially when a clever name is assigned. However, doing so can create anxiety in end users for issues that either don't really impact them or for which there is no mitigation, other than the product owner reassuring them that they have nothing to worry about.

    Every controlled process requires a negative feedback loop. These researchers provide that necessary function and I wouldn't want it any other way. As far as public anxiety and scary headlines, it's just another one of many that everyone has to learn to deal with in our media saturated world.

  • New 'PacMan' flaw in Apple Silicon is an echo of Spectre and Meltdown

    chadbag said:

    The flaw affects all kinds of ARM-based chips — not just Apple's. 

    But right above this line it says that the M1 is the first commercially available ARM based chip that offers the feature the flaw was found in.  In the last year have other ARM chips become commercially available with this pointer protection feature?

    Also, chips without the feature don't have the protection in the first place, so it stands to reason that all other ARM chips are by default in the same state as one where this exploit has already succeeded.

    Yes, Apple is the first one to ship ARM-based products with pointer authentication. Finding this vulnerability early in the lifetime of this feature's implementation is a good thing, and stands in stark contrast to the Spectre and Meltdown vulnerabilities, which were discovered much later in the lifetimes of the features that proved to be vulnerable. In some sense, finding PACMAN now has allowed an opportunity to close the barn door after only a couple of horses have escaped. The Intel flaws were discovered after the entire herd had been on the loose for many years.

    As far as ARM chips that don't support pointer authentication are concerned, they are no more or less vulnerable to pointer-related exploits than they were prior to the PACMAN discovery, but they are not exploitable by the PACMAN mechanism. There is a big difference. Any kind of pointer-related exploit always runs the risk of being sussed out or revealed by virtue of it causing the attacked program to crash. The PACMAN attack takes advantage of two vulnerabilities to allow the attacker to use brute-force techniques to discover the protected pointer's authentication code (the secret) without crashing the program. Normally, you would expect any authentication protocol to catch anyone trying to guess the secret, as Apple does with device logins, so coming up with a way to neuter that added layer of protection is a big deal.
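    To make the "guessing normally crashes the program" point concrete, here is a toy Python model of pointer authentication. This is a sketch only: the key, the 16-bit PAC width, and the HMAC-based code are illustrative stand-ins, not the real ARMv8.3 PAC scheme, and PACMAN's actual contribution (a speculative side channel for testing guesses silently) is not modeled here.

    ```python
    # Toy model: a "signed" pointer carries a truncated MAC of its address
    # in otherwise-unused high bits; authentication recomputes and compares.
    import hmac
    import hashlib

    PAC_BITS = 16                       # illustrative width, not the real PAC size
    SECRET_KEY = b"per-process-key"     # hypothetical per-process signing key

    def sign(ptr: int) -> int:
        """Embed a truncated MAC of the 48-bit pointer into its high bits."""
        mac = hmac.new(SECRET_KEY, ptr.to_bytes(8, "little"), hashlib.sha256).digest()
        pac = int.from_bytes(mac[:2], "little") % (1 << PAC_BITS)
        return (pac << 48) | ptr

    def authenticate(signed_ptr: int) -> int:
        """Verify the PAC; a mismatch raises, modeling the hardware fault/crash."""
        ptr = signed_ptr & ((1 << 48) - 1)
        if signed_ptr != sign(ptr):
            raise RuntimeError("PAC mismatch -> program crashes")
        return ptr

    target = 0x1000
    good = sign(target)
    assert authenticate(good) == target     # legitimate use succeeds

    # An attacker forging a pointer with a wrong guessed PAC faults, which is
    # exactly what makes naive brute force so noisy and detectable.
    forged = ((good >> 48) ^ 1) << 48 | target
    try:
        authenticate(forged)
    except RuntimeError:
        pass  # the guess crashed the "program"
    ```

    The takeaway is that with even a modest PAC width there are thousands of possible codes, and in this naive model every wrong guess is a crash; what the MIT team showed is a way to evaluate such guesses speculatively without the telltale crash.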

    The attacker has to set up a lot of non-trivial scaffolding to implement a PACMAN attack, but this team at MIT has demonstrated that it is possible.