Upon further testing with my own production apps, I'm seeing instances of severe over-throttling of the i9 when running processes that both use a lot of memory (over 4 GB) and that are dominated by random memory accesses. At the extreme (memory usage over 20 GB on a system with 32 GB total memory), I'm seeing steady-state frequencies under 2 GHz and total processor package power usage of only ~11 watts. If other, smaller jobs are run simultaneously with a large one, the frequency immediately jumps to 2.7 GHz or more. If smaller jobs are run alone, the frequency is 2.9 GHz or higher.
It's as though a hypervisor detects that my large process is spending most of its time waiting for random memory accesses to be satisfied, so it decides to throttle the CPU to make the user wait even longer for the results. (Gee, thanks!)
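For anyone who wants to reproduce this, a workload like the one described can be approximated with a pointer-chasing loop: a chain of dependent random loads that defeats the prefetcher. This is only a minimal sketch in Python (a compiled language would stress memory far harder), and all names are illustrative:

```python
import random
import time

def build_chain(n):
    """Build a single random cycle over n slots, so every load depends on
    the previous one (pointer chasing) and the prefetcher can't help."""
    perm = list(range(n))
    random.shuffle(perm)
    chain = [0] * n
    for i in range(n - 1):
        chain[perm[i]] = perm[i + 1]
    chain[perm[-1]] = perm[0]
    return chain

def chase(chain, steps):
    """Follow the chain for `steps` dependent loads; return (index, seconds)."""
    idx = 0
    start = time.perf_counter()
    for _ in range(steps):
        idx = chain[idx]
    return idx, time.perf_counter() - start

# Demo sizes are deliberately small; scale n toward hundreds of millions of
# entries to get into the multi-GB, cache-hostile regime described above, and
# watch CPU frequency in a tool such as Intel Power Gadget while it runs.
chain = build_chain(1 << 16)
idx, elapsed = chase(chain, 100_000)
print(f"{100_000 / elapsed:,.0f} dependent loads/sec")
```

Sequential access over the same array will run far faster per load, which is the contrast the throttling behavior above seems to key off.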
Did it try to overclock the memory for you?
That is exactly what is happening. Not the overclocking part from ascii, but the hypervisor racing to low power.
dewme said: This is another black eye on Apple’s quality process and needs to be fixed immediately.
Unfortunately, this is the case. This was a major "oversight" and one that clearly indicates a lack of testing by Apple QC.
Either that or they knew of the issue, but released anyway, with a desire to remedy post-launch (which would be troubling).
But it sounds like they were inexplicably unaware, as some rando was able to identify a major issue by simply running a couple of tests that represented what a huge portion of Mac customers would be doing on a daily basis.
This is pretty bad negligence. There is no way around it. Would be great to see this kind of thing improve in the coming year.
They are human beings, so mistakes happen. But this one in particular was entirely avoidable.
And the whole idea of Apple "working with Lee" is really lame. As if they needed his testing to identify the issue. They could just test themselves.
I have the 2016 MBP 15 fully upgraded. And I love it. No keyboard issues, no nothing. The Touch Bar DID lock up and crash on me a few times when I first bought it - but that was remedied by a patch that came out shortly after I made the purchase.
But if you upgrade to the fastest CPU possible and it throttles that severely from doing some work on a very popular app, and it goes unnoticed by Apple prior to release... I am not feeling the love.
It's obvious that this is basically a release bug; those are the hardest to QA properly, since they're not really part of the normal software but are added in deployment, once the main software has been developed and tested extensively.
The normal QA process for the software probably didn't see it because, on their machines, the drivers and low-level software are not signed.
The device mostly functioned as expected for most people, so it's not like the machine was broken. Lower-end machines (the i7) seemingly had the same bug, but it didn't show up substantially. That means it wasn't an easy thing to see.
It was a release/deployment bug that seemingly only appeared clearly on one configuration in certain circumstances.
As a developer I always tried to get a "parallel" test on any new system -- where the whole system was run under real life conditions -- just to catch those bugs that 'slip through the cracks' and destroy credibility. It sounds like Apple skipped that step. Well, they're young. They'll learn.
Right... "real life conditions". Why not fire QA and stop betas and just automate everything by putting a few machines on test beds, since testing in "real life conditions"™ will fix everything. How the frack do you define those conditions, genius? You think there are no bugs that can run through your tests if you are not testing all possible use cases and system variants (which is not feasible).
I've been doing very, very large systems for 30 years, and what you describe is garage-level development; things like what Apple is doing are a lot more involved.
There was also a time when security was nonexistent in drivers (not long ago at all), and that kind of bug would never have occurred.
Testing deployment procedures is a lot harder; that's why you tend to keep things as simple as possible. Maybe they actually signed those firmwares before, and some script change introduced a bug (not signed, wrong cert, etc.) that wasn't caught by existing regression testing. They do need to make sure the testing tests the right thing, for sure; and they no doubt have fixed that by now.
Considering the bug was not obvious and only clearly hit the i9 while existing in all configs, it's highly possible it actually passed integration testing (or even regression tests from previous deployments). So, not "no test", but possibly a flaw in the testing.
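For what it's worth, the kind of release gate that would catch an unsigned or wrongly signed artifact is easy to sketch. The snippet below uses an HMAC as a stand-in for real code signing (a real pipeline would verify a certificate chain); all names, including `SIGNING_KEY` and `release_gate`, are hypothetical:

```python
import hmac
import hashlib

# HMAC-SHA256 stands in for real code signing here; an actual pipeline
# would verify a certificate chain instead. SIGNING_KEY is a made-up name.
SIGNING_KEY = b"build-server-secret"

def sign(artifact: bytes) -> bytes:
    return hmac.new(SIGNING_KEY, artifact, hashlib.sha256).digest()

def verify(artifact: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(artifact), signature)

def release_gate(artifacts: dict) -> list:
    """Return the names of artifacts whose signature is missing or invalid.
    Deployment should abort unless this list comes back empty."""
    return [name for name, (data, sig) in artifacts.items()
            if sig is None or not verify(data, sig)]

firmware = b"\x01\x02firmware-image"
good_sig = sign(firmware)
bad_release = {"firmware.bin": (firmware, None)}       # the bug: never signed
good_release = {"firmware.bin": (firmware, good_sig)}
print(release_gate(bad_release))    # ['firmware.bin'] -- gate catches it
print(release_gate(good_release))   # []
```

The point is less the crypto than the gate itself: the check has to run against the artifacts the deployment actually ships, or a signing regression sails straight through.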
If you've been "doing" (whatever that means) systems for 30 years I feel sorry for your customers...
If you put all your faith in technical, piecemeal QA tests and neglect to run a final check of the entire, integrated system under real world conditions, your testing is deficient. Neither one is sufficient by itself. For true quality, you need both.
And, if you don't know how to define "real life", then the system you design probably won't work well under real life conditions. But, even then, the point of a real life test is to take the system out from under the protective umbrella of the developer and QA and let it run out in the real world. That's pretty much what a full beta does. Again, Apple apparently skipped that step.
And, by the way: the bug that you say "was not obvious" was caught almost immediately by an outside tester who spotted something wrong. It was obvious to him. So, why was it not obvious to Apple? (Apparently because they forgot the last and final test: a real-world test of the final, integrated system.)
foggyhill said: You think there are not bugs that can run trough your tests if you are not testing all possible use cases and system variants (which is is not probable).
I think what people are a bit shocked about... is we're not talking all possible variants here. If the bug had showed up running version 3.12 of WizardCAD while the user stood on their head, that would be one thing. But, if it shows up running one of the more popular apps the machine is designed to run, people think, 'Why didn't they test that?!'
But, yea, the grid could get pretty big, pretty fast. And, what kinds of things do you test with each of those popular apps, etc.? I actually think it's more problematic that it was missed in a 'final build' kind of way... a bit like if all the MBPs went out missing the right rear screw.
It's not really in the "final build"; it seems it's a bit apart, and that's the issue. Some regression probably got introduced because they changed the way they signed the firmware and didn't think it would impact deployment (or didn't see it in their suite of tests, so regression testing was not sufficient).
That's the excuse used by the developer right before he hears: "YOU'RE FIRED!"
Buddy, I've been in charge of overall engineering of systems in the tens of millions, go sell your 101 shit somewhere else.
You sound like someone who develops software in their god damn garage after they "read" a book.
I'm not answering anything else from you, so wallow in your "wisdom" all you like.
Right.. More crap from mister amateur; read that second book now.
GeorgeBMac said: ... That's pretty much what a full beta does. Again, Apple apparently skipped that step.
And, by the way: the bug that you say "was not obvious" was caught almost immediately by an outside tester who spotted something wrong. It was obvious to him. So, why was it not obvious to Apple? (Apparently because they forgot the last and final test: a real-world test of the final, integrated system.)
Yeah, agreed. And, I'm still seeing excuses like this was some unique thing (i.e.: Apple had to work with Lee to determine it was 4k Red video out of Premiere or something). Yet once Lee caught it, others (including our own AI folks here) were able to replicate the problem, generally, using other software and benchmark tools. In other words, while the fix might have been a bit obscure, the problem showing up wasn't.
That said, I suppose they (Apple) were no longer really doing performance stats or comparisons at that point, so someone would have had to notice a percentage performance difference from the expectation, or be graphing things and notice the graphs didn't look right.
1) Get a spare bottom case cover. This you can modify to improve function. Then return to the original bottom case cover when you need service.
2) Create strategic holes over the CPU and GPU fans to further improve airflow.
3) Use thermal tape to connect the heat sink to the aluminum bottom case cover. This allows the MacBook Pro’s unibody aluminum case to act like a real heat sink. Currently it does not since it has no contact with the motherboard.
The author’s MacBook Pro 2012 has been able to run without throttling at all with his mod. And his MacBook Pro has run without a glitch for the past six years since he modded it, despite frequent trips in his backpack.
4) Attach fans to the two bottom case holes to push air into the MacBook Pro. This will accelerate cooling, allowing even higher stable frequencies under heavy load.
Mike, what's your (hot) take on the temperature of the i9 MBP under load?
The chip itself will periodically surge to just below 100 C, which is fine. However, the fans and heat pipes bring it down by 20 C very rapidly when the clock speed decreases to around the rated speed.
This is a solid patch. In all likelihood, this will be the last standalone article about this until we publish the full reviews of our gear later this week.
Thanks. I was actually thinking more about overall top case heat and fan noise. I saw something where the reviewer said the MBP got uncomfortably hot. I'm still leaning towards the i9, but if the i7 is noticeably cooler/quieter and not making hands sweat when under load I could see that being a deciding factor.
Fan noise is about the same as it always has been, and it's the same across the i7 and i9.
As far as the keyboard goes, it's not really uncomfortably hot to type on, but if you rest your hand on the keyboard itself, that's a little too toasty for comfort. It's about the same across the i7 and i9.
Combined with how Macs have both become niche products and consistently fallen behind current technology, I have to assume that Apple is looking at making some upgrades to the Mac Team.
What technology are you talking about? The same Intel updates everyone else gets? Yes, they are slow to update, but it's not like PC manufacturers are anything other than generic parts assemblers. They assemble generic parts for a generic OS that they all sell with minor hardware and software tweaks.
T-1 and T-2 chips (custom silicon) are well ahead of what’s offered on the PC, and USB-C/Thunderbolt adoption is the future. As with USB back in the ’90s, Apple bit the bullet and pioneered its use over older, inferior tech when PCs lagged behind with Serial and Parallel ports as standard gear.
It’s clear Apple is moving towards a custom ARM CPU and possibly custom graphics chips. On that side of the fence, Apple is ahead of the pack. It’s worth noting that iPhones are already as powerful as base-level Macs and PCs, and with their own graphics API and custom silicon, they can get more performance with less silicon than Android. We already see Android devices needing more cores and more RAM to match the iPhone’s performance. Don’t be surprised if the PC market goes the same way down the road. Apple’s custom SSD configurations (no onboard controller) in the iMac Pro are already a break with the norm, which gives them above-average speed.
So you're admitting the T chips are the only thing Apple has done in the Mac lineup lately. What do those things do? Provide variable function keys that nobody uses and a finger print reader? Color me not impressed.
And now, a rookie error that somebody else had to catch.
Time for an upgrade!
The T-1 handles the fingerprint reader and contains the "secure enclave" to hold the credentials separate from the OS and CPU, as it does on the iPhone. It's now a part of the Mac. If you remember, Samsung got caught storing fingerprint credentials in the open, in plain-text files. I'm not sure if any other fingerprint reader on a PC has the equivalent of a "secure enclave".
The T-2 chip handles all I/O functions, including the microphone, camera, secure storage, and Secure Boot, separate from the OS and CPU. Secure Boot verifies the authenticity of the OS through a chain of signatures and checks from the firmware on up to ensure the system has not been altered, and can prevent booting from external drives. Secure storage handles all of the device I/O and on-the-fly encryption for storage; the MBP and iMac Pro have raw SSD chips in parallel (no onboard controller) that are managed by the T-2 chip with no help from the CPU. This is how Apple achieves 2.5 GB/s read/write disk speeds without taxing the CPU. Microphone and camera access are also managed by the T-2, and all access must be authorized through it.
Combined, these two chips greatly enhance the security and speed of the platform and are not dependent on the CPU.
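The "chain of signatures and checks from the firmware on up" can be illustrated with a toy hash chain: each stage carries the expected digest of the next, rooted in an immutable ROM. This is a conceptual sketch only, not Apple's actual scheme, and every name in it is made up:

```python
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Toy model of a verified boot chain (NOT Apple's actual scheme): each stage
# embeds the expected digest of the next, rooted in an immutable boot ROM.
boot_kernel = b"kernel image"
boot_loader = b"bootloader" + digest(boot_kernel).encode()
rom_expected_loader = digest(boot_loader)   # conceptually burned into ROM

def verified_boot(loader: bytes, kernel: bytes) -> bool:
    """The ROM checks the loader; the loader then checks the kernel."""
    if digest(loader) != rom_expected_loader:
        return False                         # loader was tampered with
    expected_kernel = loader[-64:].decode()  # digest carried by the loader
    return digest(kernel) == expected_kernel

print(verified_boot(boot_loader, boot_kernel))      # True
print(verified_boot(boot_loader, b"evil kernel"))   # False
```

A real scheme uses public-key signatures rather than bare hashes, so later stages can be updated without reburning the ROM; the chain-of-trust structure is the same idea.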
Then of course there is macOS, which is a part of the Mac lineup. No other PC manufacturer other than Microsoft, when they choose to compete with their customers, has its own OS. All the other PC manufacturers are just third-party parts assemblers.
It's easy to forget that Apple does what Microsoft (Windows), Google (Android), and any Android phone or PC manufacturer does, plus chip design, all inside of one company. No one else can claim that achievement. Have no doubt, the T-1 and T-2 chips, along with macOS, are just the beginning of the resurgence of the desktop.
Have no doubt, the T-1 and T-2 chips, along with macOS, are just the beginning of the resurgence of the desktop.
Ha - hyperbolic, but I won't deny that does excite me a bit.
Can anyone explain a bit more as to why cryptographic (I presume) encryption+keys are necessary within the firmware/hardware? Are they trying to prevent access, or do they consider some of the functionality trade secret stuff, or - what? Just to me, it sounds like additional overhead but I can't work out what justifies it. But I nearly left my house with one shoe on today, so figures I can't.
inequals said: Can anyone explain a bit more as to why cryptographic (I presume) encryption+keys are necessary within the firmware/hardware? Are they trying to prevent access, or do they consider some of the functionality trade secret stuff, or - what? Just to me, it sounds like additional overhead but I can't work out what justifies it. But I nearly left my house with one shoe on today, so figures I can't.
I think it is doing the stuff we've had for a while via software/firmware, but in a more secure and higher-performance way. For example, a lot of people use disk encryption (FileVault), but this does it on the fly via independent specialized hardware.
As to the why... well, there could be many reasons. If someone steals my laptop, I don't want them getting into the data on it. And, in our current culture, you don't have to be a criminal to want to hide your data, even if you don't have a bunch of extremely confidential information.
As I'm paying for something that is supposed to go up to 4.8 GHz when necessity arises, is there a way to test if that is going to happen?
It does.
To be clear, that's not what you're buying. You're buying a machine that can maintain the rated speed (2.9 GHz) and turbo up to 4.8 GHz for brief periods of time.
It does the former after the patch, and also the latter.
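If you want a rough self-check, you can time a fixed CPU-bound workload repeatedly and watch for a drop-off in sustained throughput. This is only a sketch: Python timing is noisy, you'd need to run for minutes to hit thermal limits, and tools like Intel Power Gadget or macOS's `powermetrics` report the actual clock speeds.

```python
import time

def busy_work(n=200_000):
    """A fixed CPU-bound workload: a tight integer loop."""
    acc = 0
    for i in range(n):
        acc += i * i
    return acc

def sample_throughput(rounds=5):
    """Time the identical workload repeatedly; durations that grow steadily
    over a long run suggest the clock is dropping under sustained load."""
    durations = []
    for _ in range(rounds):
        start = time.perf_counter()
        busy_work()
        durations.append(time.perf_counter() - start)
    return durations

durations = sample_throughput()
slowdown = durations[-1] / durations[0]
print([f"{d * 1000:.1f} ms" for d in durations])
print(f"last/first duration ratio: {slowdown:.2f}")
```

Run the sampler for thousands of rounds before drawing conclusions; short runs mostly measure interpreter and scheduler noise rather than throttling.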
Ha, ha, I meant that in a good way! Thanks for all the great news and reviews AI. Despite my profile I've been with you since 1998.