Apple says hidden Safari setting led to flawed Consumer Reports MacBook Pro battery tests


Comments

  • Reply 101 of 118
    I am confused.

    CR have been using the same tests for years and haven't encountered any issues when previously reviewing Macs. In the main they have reported battery lives that are consistent with Apple's.

    Why are CR now the spawn of Satan? 

    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?

    Sorry, but I don't understand why any single firm should be given special treatment. If Apple have fixed the issue, then yes, perhaps CR should repeat the cacheless test and provide an update, but only if they allow other firms to do the same.
    edited January 2017
  • Reply 102 of 118
    Soli (Posts: 10,035, member)
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.

    And this isn't the first time CR has used poor methodology to test technology. There is no skullduggery on their part; they simply don't have the expertise needed to do a comprehensive test that can be trusted, regardless of whether it's above or below a company's own battery life results.
  • Reply 103 of 118
    gatorguy (Posts: 24,213, member)
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.


    Isn't that what they did, Soli, reaching out to Apple for assistance in figuring out the odd results?
  • Reply 104 of 118
    Soli (Posts: 10,035, member)
    gatorguy said:
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.
    Isn't that what they did, Soli, reaching out to Apple for assistance in figuring out the odd results?
    They first came out to the public with results saying they don't recommend this MBP (HW) because of the results of the test. How many weeks ago was that? Only yesterday did we see articles showing that Apple had figured out the issue and that CR was going to retest.
  • Reply 105 of 118
    gatorguy (Posts: 24,213, member)
    Soli said:
    gatorguy said:
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.
    Isn't that what they did, Soli, reaching out to Apple for assistance in figuring out the odd results?
    They first came out to the public with results saying they don't recommend this MBP (HW) because of the results of the test. How many weeks ago was that? Only yesterday did we see articles showing that Apple had figured out the issue and that CR was going to retest.
    I think the ORIGINAL Consumer Reports article pointed out they had been in contact with Apple and would update their findings once Apple had weighed in.

  • Reply 106 of 118
    nobelpeaceprize (Posts: 5, unconfirmed member)
    Although Apple did have a bug in the developer mode when cache was off, this whole thing is really CR's fault. Of course, they didn't know that turning off cache would reduce the battery life so dramatically, and they didn't know about the bug, but they claimed that they test all laptops using that method. If so, their results cannot be attributed to real-world usage. So essentially CR was manufacturing a problem that wasn't really there in the first place. Although I'm glad they collaborated with Apple, or Apple might not have found the bug yet!
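    For the curious, a minimal sketch of the "reload everything, every time" pattern being argued about, assuming made-up local mirror URLs and the Python requests library. It is purely illustrative: CR's actual harness drives Safari itself with its cache disabled via the Develop menu, and the site count, URLs and dwell time here are my assumptions.

        # Hypothetical sketch of a cache-busting page-cycling battery test.
        # Not CR's actual harness; it only shows why a disabled cache turns
        # every page view into full network work.
        import time
        import requests

        # Made-up local mirrors standing in for the test sites.
        SITES = [f"http://mirror.local/site{i:02d}" for i in range(1, 11)]

        def cycle_once() -> None:
            for url in SITES:
                # "no-cache" forces revalidation on each request, roughly
                # approximating a browser whose cache has been turned off.
                requests.get(url, headers={"Cache-Control": "no-cache"}, timeout=30)
                time.sleep(10)  # assumed dwell time on each page

        if __name__ == "__main__":
            while True:  # a real rundown test loops until the battery dies
                cycle_once()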
  • Reply 107 of 118
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.

    And this isn't the first time CR has used poor methodology to test technology. There is no skullduggery on their part; they simply don't have the expertise needed to do a comprehensive test that can be trusted, regardless of whether it's above or below a company's own battery life results.
    Apple say that their own test consists of visiting only 25 "popular" sites over Wi-Fi until the battery dies.

    In order to ensure continuity I would imagine that Apple store those sites in a cache on their own local servers. And that those same caches are used year in, year out, to ensure that each new Mac is tested under the same stresses as the previous ones. 

    If my assumption is correct (which I accept might not be the case) then the only difference is that Apple cache those sites on the Mac, CR don't. Neither test strikes me as being particularly complex or scientific though (and neither is particularly "real world" unless the "typical" user only ever visits the same 25 static websites).
    edited January 2017
  • Reply 108 of 118
    nht (Posts: 4,522, member)
    hungover said:
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.

    And this isn't the first time CR has used poor methodology to test technology. There is no skullduggery on their part; they simply don't have the expertise needed to do a comprehensive test that can be trusted, regardless of whether it's above or below a company's own battery life results.
    Apple say that their own test consists of visiting only 25 "popular" sites over Wi-Fi until the battery dies.

    In order to ensure continuity I would imagine that Apple store those sites in a cache on their own local servers. And that those same caches are used year in, year out, to ensure that each new Mac is tested under the same stresses as the previous ones. 

    If my assumption is correct (which I accept might not be the case) then the only difference is that Apple cache those sites on the Mac, CR don't. Neither test strikes me as being particularly complex or scientific though (and neither is particularly "real world" unless the "typical" user only ever visits the same 25 static websites).
    You can't read? Apple runs 3 tests: website, video, and standby. Had CR not been lazy and run a more comprehensive test, they would have found strange results in 1 of the 3 tests and known there was a bug. They did know, because they tried Chrome and found consistently high battery life, but they ignored it to go with a sensationalist result they KNEW was wrong.

    Finally, how many sites do you visit? I doubt it's actually more than 25. Hitting the top 25 most popular sites sequentially is a lot more realistic than what CR did. As the sites change over the course of the test, the cache gets updated just like it would in real life, not artificially forced to reload every time.

    Your assumption is silly, as they would not likely reuse the same "static" sites every year: the internet evolves with ever more complex standards, web frameworks and features.
  • Reply 109 of 118
    crowley (Posts: 10,453, member)
    nht said:
    hungover said:
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.

    And this isn't the first time CR has used poor methodology to test technology. There is no skullduggery on their part; they simply don't have the expertise needed to do a comprehensive test that can be trusted, regardless of whether it's above or below a company's own battery life results.
    Apple say that their own test consists of visiting only 25 "popular" sites over Wi-Fi until the battery dies.

    In order to ensure continuity I would imagine that Apple store those sites in a cache on their own local servers. And that those same caches are used year in, year out, to ensure that each new Mac is tested under the same stresses as the previous ones. 

    If my assumption is correct (which I accept might not be the case) then the only difference is that Apple cache those sites on the Mac, CR don't. Neither test strikes me as being particularly complex or scientific though (and neither is particularly "real world" unless the "typical" user only ever visits the same 25 static websites).
    You can't read? Apple runs 3 tests: website, video, and standby. Had CR not been lazy and run a more comprehensive test, they would have found strange results in 1 of the 3 tests and known there was a bug. They did know, because they tried Chrome and found consistently high battery life, but they ignored it to go with a sensationalist result they KNEW was wrong.
    They didn't ignore it, they explicitly called it out, which is why you know about it.  However, Chrome is not the browser that ships with a MacBook Pro.
  • Reply 110 of 118
    crowley (Posts: 10,453, member)
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) 
    The 19.5 hours seems very odd.  I wonder if the display was off for these tests?  Or if in some other way the test case amounted to less than normal use.

    Doesn't seem likely that a possible side effect of the bug is to increase battery life.
  • Reply 111 of 118
    nht (Posts: 4,522, member)
    crowley said:
    nht said:
    hungover said:
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.

    And this isn't the first time CR has used poor methodology to test technology. There is no skullduggery on their part; they simply don't have the expertise needed to do a comprehensive test that can be trusted, regardless of whether it's above or below a company's own battery life results.
    Apple say that their own test consists of visiting only 25 "popular" sites over Wi-Fi until the battery dies.

    In order to ensure continuity I would imagine that Apple store those sites in a cache on their own local servers. And that those same caches are used year in, year out, to ensure that each new Mac is tested under the same stresses as the previous ones. 

    If my assumption is correct (which I accept might not be the case) then the only difference is that Apple cache those sites on the Mac, CR don't. Neither test strikes me as being particularly complex or scientific though (and neither is particularly "real world" unless the "typical" user only ever visits the same 25 static websites).
    You can't read? Apple runs 3 tests: website, video, and standby. Had CR not been lazy and run a more comprehensive test, they would have found strange results in 1 of the 3 tests and known there was a bug. They did know, because they tried Chrome and found consistently high battery life, but they ignored it to go with a sensationalist result they KNEW was wrong.
    They didn't ignore it, they explicitly called it out, which is why you know about it.  However, Chrome is not the browser that ships with a MacBook Pro.
    They ignored it when making their sensationalist and incorrect assertion that MBP battery life sucked. They even stated they ignored it.
  • Reply 112 of 118
    crowley (Posts: 10,453, member)
    nht said:
    crowley said:
    nht said:
    hungover said:
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.

    And this isn't the first time CR has used poor methodology to test technology. There is no skullduggery on their part; they simply don't have the expertise needed to do a comprehensive test that can be trusted, regardless of whether it's above or below a company's own battery life results.
    Apple say that their own test consists of visiting only 25 "popular" sites over Wi-Fi until the battery dies.

    In order to ensure continuity I would imagine that Apple store those sites in a cache on their own local servers. And that those same caches are used year in, year out, to ensure that each new Mac is tested under the same stresses as the previous ones. 

    If my assumption is correct (which I accept might not be the case) then the only difference is that Apple cache those sites on the Mac, CR don't. Neither test strikes me as being particularly complex or scientific though (and neither is particularly "real world" unless the "typical" user only ever visits the same 25 static websites).
    You can't read? Apple runs 3 tests: website, video, and standby. Had CR not been lazy and run a more comprehensive test, they would have found strange results in 1 of the 3 tests and known there was a bug. They did know, because they tried Chrome and found consistently high battery life, but they ignored it to go with a sensationalist result they KNEW was wrong.
    They didn't ignore it, they explicitly called it out, which is why you know about it.  However, Chrome is not the browser that ships with a MacBook Pro.
    They ignored it when making their sensationalist and incorrect assertion that MBP battery life sucked. They even stated they ignored it.
    No, they didn't ignore it, they explicitly called it out in the original blog post, even though it didn't form part of their standard testing regimen:
    Once our official testing was done, we experimented by conducting the same battery tests using a Chrome browser, rather than Safari. For this exercise, we ran two trials on each of the laptops, and found battery life to be consistently high on all six runs. That’s not enough data for us to draw a conclusion, and in any case a test using Chrome wouldn’t affect our ratings, since we only use the default browser to calculate our scores for all laptops. But it’s something that a MacBook Pro owner might choose to try.
    They also plainly suggest that the problems are due to software bugs (no further "comprehensive" testing needed) with their reaction and communication to Apple:
    Consumer Reports has shared diagnostic files pulled from all three computers with Apple in the hope that this will help the company diagnose and fix any problem. We will report back with any updates.
    (Both quotes from http://www.consumerreports.org/laptops/macbook-pros-fail-to-earn-consumer-reports-recommendation/)


    And I hardly think calling a result "varied" and "inconsistent" when it has been exactly that is "sensationalist", or even an assertion that it "sucked". Indeed, I rather think you're being sensationalist with your pillorying of Consumer Reports.

    Their standard tests were hampered by a software bug that didn't exist on previous products; Apple are fixing it, and CR will retest. No need to accuse anyone of being unreasonably lazy or negligent.
  • Reply 113 of 118
    nht said:
    hungover said:
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.

    And this isn't the first time CR has used poor methodology to test technology. There is no skullduggery on their part; they simply don't have the expertise needed to do a comprehensive test that can be trusted, regardless of whether it's above or below a company's own battery life results.
    Apple say that their own test consists of visiting only 25 "popular" sites over Wi-Fi until the battery dies.

    In order to ensure continuity I would imagine that Apple store those sites in a cache on their own local servers. And that those same caches are used year in, year out, to ensure that each new Mac is tested under the same stresses as the previous ones. 

    If my assumption is correct (which I accept might not be the case) then the only difference is that Apple cache those sites on the Mac, CR don't. Neither test strikes me as being particularly complex or scientific though (and neither is particularly "real world" unless the "typical" user only ever visits the same 25 static websites).
    You can't read? Apple runs 3 tests: website, video, and standby. Had CR not been lazy and run a more comprehensive test, they would have found strange results in 1 of the 3 tests and known there was a bug. They did know, because they tried Chrome and found consistently high battery life, but they ignored it to go with a sensationalist result they KNEW was wrong.

    Finally, how many sites do you visit? I doubt it's actually more than 25. Hitting the top 25 most popular sites sequentially is a lot more realistic than what CR did. As the sites change over the course of the test, the cache gets updated just like it would in real life, not artificially forced to reload every time.

    Your assumption is silly, as they would not likely reuse the same "static" sites every year: the internet evolves with ever more complex standards, web frameworks and features.
    If you must insist on being rude, then I suggest you try to get your facts right first, lest you end up looking silly when corrected.

    It is true that Apple run 3 different tests, but from what I can see they do not average out those results.

    On the tech specs page for the new MBP they list the results as follows:
    • Up to 10 hours wireless web
    • Up to 10 hours iTunes movie playback
    • Up to 30 days of standby time
    If Apple were to give equal weighting, the average would be 246.7 hours. 
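    (To check that arithmetic in a couple of lines of Python, converting the standby figure to hours first:)

        # Equal-weight average of Apple's three quoted figures, in hours.
        wireless_web = 10          # hours
        movie_playback = 10        # hours
        standby = 30 * 24          # 30 days = 720 hours

        print(round((wireless_web + movie_playback + standby) / 3, 1))  # 246.7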

    My understanding is that Apple publish the wireless web results as the headline figure. The CR test is therefore pretty damned close to the Apple test. CR did report that Chrome offered different results, but it is not their job to troubleshoot bugs in the native browsers.

    Is my assumption as silly as you think? If Apple were to visit (real) dynamic sites, the results would vary massively based upon variable factors such as Java/Flash/adverts/traffic/hops/DNS errors. It would make a mockery of their comparative results over time. Let's say that, for example, site number 2 decides to add lots of Flash-based video in 2017; the Wi-Fi browsing figure would then be significantly lower than in 2016.

    Thus I conclude that Apple are caching static pages on their own local servers. If you can provide evidence to the contrary I will concede that I am wrong; until such a time, perhaps you should accept that you know less than you assume.

    How many different pages/sites do I visit each day? A shed load more than 25, and yes, I frequently (manually) dump the web cache for each of those pages.
  • Reply 114 of 118
    nht said:
    crowley said:
    nht said:
    hungover said:
    Soli said:
    hungover said:
    They repeated the (non-cached) test a number of times, visiting the same local websites and got massively different results. And still they are at fault?
    Regarding that, yes they are. When you go from 4.5 hours to 19.5 hours (nearly doubling Apple's statement of 10 hours under normal use) while doing a very specific test that only uses website page loads, and does so with a developer setting disabled, they should've considered there was something else going on.

    And this isn't the first time CR has used poor methodology to test technology. There is no skullduggery on their part; they simply don't have the expertise needed to do a comprehensive test that can be trusted, regardless of whether it's above or below a company's own battery life results.
    Apple say that their own test consists of visiting only 25 "popular" sites over Wi-Fi until the battery dies.

    In order to ensure continuity I would imagine that Apple store those sites in a cache on their own local servers. And that those same caches are used year in, year out, to ensure that each new Mac is tested under the same stresses as the previous ones. 

    If my assumption is correct (which I accept might not be the case) then the only difference is that Apple cache those sites on the Mac, CR don't. Neither test strikes me as being particularly complex or scientific though (and neither is particularly "real world" unless the "typical" user only ever visits the same 25 static websites).
    You can't read? Apple runs 3 tests: website, video, and standby. Had CR not been lazy and run a more comprehensive test, they would have found strange results in 1 of the 3 tests and known there was a bug. They did know, because they tried Chrome and found consistently high battery life, but they ignored it to go with a sensationalist result they KNEW was wrong.
    They didn't ignore it, they explicitly called it out, which is why you know about it.  However, Chrome is not the browser that ships with a MacBook Pro.
    They ignored it when making their sensationalist and incorrect assertion that MBP battery life sucked. They even stated they ignored it.
    Simple question.

    When was the last time that CR refused to recommend a Mac?


  • Reply 115 of 118
    nht (Posts: 4,522, member)
    crowley said:
    nht said:
    They ignored it when making their sensationalist and incorrect assertion that MBP battery life sucked. They even stated they ignored it.
    No, they didn't ignore it, they explicitly called it out in the original blog post, even though it didn't form part of their standard testing regimen:
    Once our official testing was done, we experimented by conducting the same battery tests using a Chrome browser, rather than Safari. For this exercise, we ran two trials on each of the laptops, and found battery life to be consistently high on all six runs. That’s not enough data for us to draw a conclusion, and in any case a test using Chrome wouldn’t affect our ratings, since we only use the default browser to calculate our scores for all laptops. But it’s something that a MacBook Pro owner might choose to try.
    They also plainly suggest that the problems are due to software bugs (no further "comprehensive" testing needed) with their reaction and communication to Apple:
    Consumer Reports has shared diagnostic files pulled from all three computers with Apple in the hope that this will help the company diagnose and fix any problem. We will report back with any updates.
    (Both quotes from http://www.consumerreports.org/laptops/macbook-pros-fail-to-earn-consumer-reports-recommendation/)

    And I hardly think calling a result "varied" and "inconsistent" when it has been exactly that is "sensationalist", or even an assertion that it "sucked". Indeed, I rather think you're being sensationalist with your pillorying of Consumer Reports.

    Their standard tests were hampered by a software bug that didn't exist on previous products; Apple are fixing it, and CR will retest. No need to accuse anyone of being unreasonably lazy or negligent.
    They posted a sensationalist "not recommended" because they deviated from their own methodology, rather than stating that they could not provide a rating at this time.

    "However, with the widely disparate figures we found in the MacBook Pro tests, an average wouldn’t reflect anything a consumer would be likely to experience in the real world. For that reason, we are reporting the lowest battery life results, and using those numbers in calculating our final scores. It’s the only time frame we can confidently advise a consumer to rely on if he or she is planning use the product without access to an electrical outlet."

    They deviated from their own method by using the lowest score rather than the average they use for all other laptops. Why? Because they knew that if they flunked the MBP it would make headlines. They have NO idea what a consumer would experience in the real world because they were not conducting a real-world test.

    The unbiased and proper thing to do would have been to report that they found significant problems during battery testing and could not provide any recommendation until it had been resolved.  

    They KNEW the issue was a bug with Safari because they couldn't replicate it in Chrome.

    They DELIBERATELY deviated from their established procedures to produce an artificially low score and generate a sensationalist result.
  • Reply 116 of 118
    nht (Posts: 4,522, member)

    hungover said:

    Is my assumption as silly as you think? If Apple were to visit (real) dynamic sites the results would vary massively based upon variable factors such as Java/Flash/adverts/traffic/hops/DNS errors. 
    My apologies, you are correct. They use a dedicated web server and snapshots, rather than live sites, for Wi-Fi internet tests on the iPhone. They would do the same for laptops.

    Whether the snapshots change over the duration of the test is not listed.
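    For anyone picturing that setup, here is a minimal sketch of serving such snapshots for a repeatable browsing test, using only Python's standard library; the directory layout and port are my assumptions, not Apple's actual rig.

        # Serve static snapshots of the test sites from one local directory,
        # so every battery rundown sees byte-identical pages.
        # Assumed layout: ./snapshots/site01/index.html, ./snapshots/site02/...
        import functools
        from http.server import HTTPServer, SimpleHTTPRequestHandler

        handler = functools.partial(SimpleHTTPRequestHandler, directory="snapshots")
        HTTPServer(("0.0.0.0", 8080), handler).serve_forever()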
  • Reply 117 of 118
    nht said:

    They DELIBERATELY deviated from their established procedures to produce an artificially low score and generate a sensationalist result.

    You're right that reporting an average is the typical approach, but I don't think baiting clicks was the reason they didn't in this case.

    Averages are meaningful when they're derived from a limited range. If the machine ran for 9 hours in one test and 11 in another, the average would be a meaningful number. When the range is inexplicably wide, an average would actually be misleading. I forget the actual numbers now, but weren't they something like less than 4 at the low end and over 19 at the high end? Reporting an average of 11.5 (or whatever it would actually be) would be an obfuscation of the strange results.
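    To put rough numbers on that, using the two extremes cited in this thread (illustrative only):

        # Why a mean misleads when the spread dwarfs it.
        runs = [4.5, 19.5]              # hours: the low and high results cited above

        mean = sum(runs) / len(runs)    # 12.0, which looks like a healthy result
        spread = max(runs) - min(runs)  # 15.0, larger than the mean itself

        print(f"mean={mean}, spread={spread}")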

    If one is advocating for Apple, CR's choice to present only the lowest number in their summary is frustrating, but we have to recognize that this was a very unusual case with the results being atypical in the extreme. Perhaps instead of using the lowest number they could have listed the range of results, but let's remember that it's only the summary we're talking about. They were pretty clear in the text of the report that the tests produced widely varying results, actual numbers were provided, and the unusual nature of the situation was noted.

    There may have been a better way to indicate the strange and unusual findings of their tests, but the decision not to just publish an average strikes me as the right one.
  • Reply 118 of 118
    Skad (Posts: 1, unconfirmed member)
    For me it's not a question of CR changing or not changing settings. The general complaints from users much more reflect the fact that the batteries of these computers are just not good enough. Apple should just accept that and work at producing a better product next time.