Apple working with Consumer Reports on MacBook Pro battery findings, says Phil Schiller


Comments

  • Reply 41 of 147
    Sure, there is a software issue.  They'll fix it.

    What I found shocking is the battery hit from the Touch Bar.  A 20% battery hit?  What junk!  No one should buy that model!

    What this means is everyone should be buying the heavily discounted old model.
  • Reply 42 of 147
    lkrupp Posts: 10,557 member
    “Funny” is the new down-vote. Get used to it.
  • Reply 43 of 147
    Soli said:
    Do they just buy a single MBP for their testing?
    How many should they buy?
  • Reply 44 of 147
    Consumer Reports is not into the "clickbait" thing. Also, they have no reason to target the MacBook Pro, which accounts for a very small percentage of PC sales. Note: this same fact is cited by an op-ed down the page, "No, Apple did not switch to USB-C on its new MacBook Pros to profit from dongle & adapter sales," which points out that Apple switching to USB Type-C on MacBooks is not some accessories-profiteering scheme, stating "Apple's best quarter ever for the Mac was the September 2015 frame, when the company sold 5.7 million computers."

    You may find fault with Consumer Reports' testing methodology, but the important thing to remember is that they use the same methodology to test devices from all manufacturers. Suggesting they adopt Apple's methodology for testing Apple devices, or have Apple play a role in designing the test for them, A) gives Apple an advantage that other manufacturers do not enjoy, and B) would not necessarily be helpful in identifying something that Apple may have missed due to flaws in their own QA program, "tunnel vision," etc.

    I will state that Consumer Reports' methodology is not that different from the "benchmarking" tests that I see on a lot of sites. Add that to the fact that Consumer Reports is hardly a tech site like AnandTech, so their testing is going to be more "general purpose," geared to the needs and use cases of the average user, just as they are not Motor Trend when it comes to their car reviews. AnandTech, ComputerWorld, Tom's Hardware, etc. probably would have followed up the Safari tests with a bunch of different tests on a variety of applications, but that isn't really Consumer Reports' job.

    And yes, this is likely a software issue as opposed to a hardware issue. Consumer Reports' own review stated as much, and also stated that they are going to revise it later when the software issue is fixed. And since the notorious resource hog and bug magnet "platform-as-a-browser" application Chrome gave better results than Safari, the problem is almost certainly Safari itself. Do not pretend as if Safari hasn't been a huge headache for years. Apple will release a patch for Safari to resolve this issue - or a patch in the OS that addresses whatever is causing the issue with Safari - Consumer Reports will revise the review, and all this will be forgotten. If it takes more than a week, it will only be because everyone is on Christmas vacation.

    So no conspiracies, no nefarious attempt by Consumer Reports to undermine Apple or make money. (Funny, none of these accusations are made when these same companies give Apple good reviews and good PR ... and when they give bad reviews and PR to Microsoft, Samsung and the other competition.) This is just yet another "Apple PLEASE do something about Safari" in a long line of them. And please improve iTunes while you are at it.
  • Reply 45 of 147
    Rayz2016 said:
    Soli said:
    Do they just buy a single MBP for their testing?
    They bought two machines: a 13-inch and a 15-inch model. Ideally they should have bought two of each, but I'm not sure it would have made a difference in this case. 
    How would two of each help? What if one pair had shown inconsistent results, the other consistent? Get a third, tie-breaker pair? 

    The only relevant question is, did they do anything differently on this front than what they have done in the past with Apple and non-Apple products in these types of tests? Everything else is cherry-picking or sour-graping the results of the tests. 
  • Reply 46 of 147
    cali Posts: 3,494 member
     Imagine if Apple finds this is foul play? They should sue the crap out of them if so. 

    Nowadays you never know. 
  • Reply 47 of 147
    MplsP Posts: 3,931 member
    MplsP said:
    "...the publication's results are not in line with Apple's own "extensive lab tests or field data..."

    I'm not sure what field tests Apple has done, but there are tons of reports/complaints of poor battery life. It's pretty clear that there is an issue with the machines; whether it's software or hardware based is still up for debate. Either way I'm sticking with my mid-2011 MacBook Air. I still get 3 hours of battery life on it, which is just as good as half the people are getting with their brand new MacBook Pros. I really can't disagree with Consumer Reports - I wouldn't advise buying one of these either.
    How many of the "tons of complaints" went away after the indexing and iCloud upload finished?
    Really no way to tell, and the internet is of course the internet, so it's very difficult to accurately quantify any of this. Even taking into account the fact that people are more likely to complain than gush online, the general view I've gotten from various online sites is that the 2016 MacBook Pro rollout has been one of the worst in a long time. Perhaps the battery life improved after OS X finished indexing, but I also haven't seen many reports saying "my battery life sucked at first, but now it's great!" Apple removing the 'time remaining' indicator also gave the appearance of them trying to cover up the problem rather than admitting and fixing it.

    Consumer Reports is not the first place I go for tech advice, but in their defense, their test is more realistic than some of the benchmarking tests, and the fact that they got such wildly discordant results indicates that something is askew and deserves an explanation. A lot of people also trust them, so having a "Consumer Reports Fails to Recommend New MacBook Pro" headline is significant bad PR for Apple. Given Apple's history of being completely aloof to reviews and press reports, the fact that Schiller is both talking to CR and publicizing it is pretty significant.
  • Reply 48 of 147
    cali Posts: 3,494 member
    Damage control!
    Damage control? Did you actually read their report? They got wildly different battery results, and on the high end much higher than anyone else has gotten. How could they publish a report with such variation?
    Yes, I did. The fact that Phil is reacting so quickly to just about every slam against the new MBPs means there is really something wrong with the batteries, and no one believes Apple's explanation that removing a certain battery display feature will magically fix the problem.
    When Apple fixes the problem behind the scenes, iHaters scream "Apple doesn't care about its customers!!"


  • Reply 49 of 147
    After Consumer Reports refused to assign a recommend rating to any of Apple's three new MacBook Pro models, a first for the laptop line, Apple SVP of Worldwide Marketing Phil Schiller said the company is working with the publication to resolve the apparent battery issue.




    In a surprise pre-Christmas report published Thursday, Consumer Reports said it could not recommend any new MacBook Pro model due to battery life concerns. Specifically, the publication's in-house testing revealed wild fluctuations in unplugged operating survivability, in some cases ranging from 16 hours to as little as 3.75 hours.

    Consumer Reports assigned the 15-inch MacBook Pro with Touch Bar a mediocre score of 56 points out of 100. The 13-inch versions with and without Touch Bar received similarly poor ratings of 40 and 47, respectively. With such a dismal showing, no MacBook achieved a "recommended" designation, a first for Apple.

    Responding to the critique, Schiller in a tweet on Friday said Apple is "[w]orking with [Consumer Reports] to understand their battery tests," noting the publication's results are not in line with Apple's own "extensive lab tests or field data."

    Working with CR to understand their battery tests. Results do not match our extensive lab tests or field data. https://t.co/IWtfsmBwpO

    -- Philip Schiller (@pschiller)


    Whereas Apple spends substantial capital on special machinery, facilities and man hours to perform rigorous quality assurance testing, Consumer Reports applied an arguably less scientific methodology in its trials. As noted in the original review, the publication ran a series of tests that involved downloading ten pre-selected web pages over Wi-Fi using Safari. Screen brightness settings were consistent, and the trial runs proceeded until the laptop shut down.
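
    (For illustration, here is a minimal sketch in Python of what a page-cycling rundown along those lines could look like. It is not Consumer Reports' actual harness; the example.com URLs, the dwell time, and the pmset-based battery logging are assumptions.)

    #!/usr/bin/env python3
    # Rough sketch of a page-cycling battery rundown, loosely modeled on the
    # procedure described above. NOT Consumer Reports' actual harness: the URLs,
    # dwell time, and logging approach are placeholders/assumptions.
    import datetime
    import subprocess
    import time

    PAGES = [f"https://example.com/page{i}" for i in range(1, 11)]  # hypothetical ten pages
    DWELL_SECONDS = 60  # assumed time spent on each page before loading the next

    def battery_status() -> str:
        """Return the charge percentage reported by macOS's pmset utility."""
        out = subprocess.run(["pmset", "-g", "batt"], capture_output=True, text=True).stdout
        return next((tok.rstrip(";") for tok in out.split() if tok.endswith("%;")), "unknown")

    # Cycle through the pages in Safari until the machine shuts itself down;
    # the last timestamp written to the log approximates the battery runtime.
    while True:
        for url in PAGES:
            subprocess.run(["open", "-a", "Safari", url])
            print(datetime.datetime.now().isoformat(), battery_status(), url, flush=True)
            time.sleep(DWELL_SECONDS)

    (In practice a run like this would be launched with its output redirected to a file, so the final timestamp survives the shutdown and approximates the measured battery life.)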

    That said, the low-end results cannot be ignored.

    In its evaluation, Consumer Reports found 15-inch MacBook Pro with Touch Bar battery life varied from 18.5 hours to 8 hours, while the 13-inch variant with Touch Bar ran for between 16 hours and 3.75 hours. The 13-inch MacBook Pro without Touch Bar managed 19.5 hours at its best, but conked out in only 4.5 hours in a subsequent test. Final ratings were calculated using the lowest battery life results.

    Apple initially declined to comment on the publication's findings, saying only that customers can contact AppleCare if they have questions or concerns about MacBook performance. The publication did leave the door open for an Apple response, noting that the battery life of many modern products is "influenced" by software updates.

    "If Apple updates its software in a way that the company claims will substantively change battery performance, we will conduct fresh tests," the original report said.

    Apparently Consumer Reports got its wish.
    I don't think Consumer Reports ever "works with" manufacturers. They buy the units they test themselves and stay independent of the manufacturers. I can see them providing their testing methods, which they often do in the magazine report itself.
  • Reply 50 of 147
    avon b7 Posts: 7,691 member
    Soli said:
    Do they just buy a single MBP for their testing?
    How many should they buy?
    Exactly. This is a consumer magazine and their testing should reflect the experience Joe Public could get. That means buying a machine from the retail channel, putting it through its paces and reporting the results.

    There is no need to buy a spread of machines to see if some vary in test results from others.

    If testing was skewed or wrong in some way, that can be looked into and corrected on the same machine.

    Apple can examine the machine in question and the testing that was done. If the machine is deemed to have some kind of problem, it should be reported with absolute transparency and the magazine should go out and purchase another random unit and retest it.

    If the testing itself is deemed to be flawed and the magazine agrees that that is the case, then they can retest everything having eliminated the testing flaw and present new results.

    I'm glad Phil is taking care of his hens in such a proactive way. I can't remember any product that has had him come out publicly to defend on so many occasions.


  • Reply 51 of 147
    Soli Posts: 10,035 member
    Soli said:
    Do they just buy a single MBP for their testing?
    How many should they buy?
    I'd say at least two of each size class, but ideally at least one in each of the major configurations.
  • Reply 52 of 147
    Rayz2016 Posts: 6,957 member
    Rayz2016 said:
    Soli said:
    Do they just buy a single MBP for their testing?
    They bought two machines: a 13-inch and a 15-inch model. Ideally they should have bought two of each, but I'm not sure it would have made a difference in this case. 
    How would two of each help? What if one pair had shown inconsistent results, the other consistent? Get a third, tie-breaker pair? 

    The only relevant question is, did they do anything differently on this front than what they have done in the past with Apple and non-Apple products in these types of tests? Everything else is cherry-picking or sour-graping the results of the tests. 

    Okay, let me introduce you to a little testing methodology I like to call 'leaving no stone unturned'. With the internet and its mum screaming that every other MacBook Pro has a failing battery, I would have bought two machines of each configuration to make sure that I wasn't looking at a hardware problem. If I'm seeing the same problem with the same test on all machines, then chances are we're looking at a problem with the software. If some of the machines behave normally and one or two of them exhibit the odd behaviour, then I would be leaning towards a problem with the hardware. 

    I would also check memory while the test is running, but unlike you, I've tested hardware/software combinations before. If I see results that don't make sense, I try a second machine to make sure I'm not dealing with dodgy hardware, especially if I've run the same tests on other configurations. It saves a lot of time. And also, unlike you, I wouldn't count their second laptop, which happens to be a completely different configuration, as the second hardware-check machine. Two machines of each type; that's what I would go with.
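
    (A minimal sketch, for illustration only, of the "check memory while the test is running" idea above: it polls macOS's pmset and vm_stat together so memory behaviour and battery drain land in the same log. The five-minute interval and plain-text format are assumptions, not any publication's actual setup.)

    # Illustrative only: sample battery level and free-memory pages together so
    # memory behaviour can be correlated with battery drain during a test run.
    # The five-minute interval and plain-text log format are assumptions.
    import datetime
    import subprocess
    import time

    def battery_percent() -> str:
        """Charge level as reported by macOS's pmset utility."""
        out = subprocess.run(["pmset", "-g", "batt"], capture_output=True, text=True).stdout
        return next((tok.rstrip(";") for tok in out.split() if tok.endswith("%;")), "unknown")

    def pages_free() -> str:
        """The 'Pages free' line from macOS's vm_stat, as a quick memory indicator."""
        out = subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
        return next((line.strip() for line in out.splitlines() if line.startswith("Pages free")), "unknown")

    while True:
        print(datetime.datetime.now().isoformat(), battery_percent(), pages_free(), flush=True)
        time.sleep(300)  # assumed five-minute sampling interval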

    As for 'cherry-picking' and 'sour-graping'? I think you're mistaking that for people asking for more details of CR's methodology. Their first run recorded a battery life much higher than anything else recorded for this laptop. Rather than saying 'Wow, this laptop can go for 18 hours', the sour-grapers are saying, 'That sounds a bit high.' 

    What I suspect will happen is that this problem will be traced to a bug in the software (and since this doesn't seem to happen with Chrome, it looks like Safari might be the culprit) that Apple will fix.  What people are hoping for is that the test will show a problem with the hardware, which they somehow think will cause Apple to have a hallelujah moment and start making laptops bristling with legacy ports and a battery the size of an aircraft carrier.


  • Reply 53 of 147
    I am curious how many pay attention to CR.  I have never bothered and have had great results with what I use.
  • Reply 54 of 147
    Sure, there is a software issue.  They'll fix it.

    What I found shocking is the battery hit from the Touch Bar.  A 20% battery hit?  What junk!  No one should buy that model!

    What this means is everyone should be buying the heavily discounted old model.
    Too late.  Millions already have and, along with most reviewers, they report loving it.  I love mine and am getting 7 to 11 hours consistently.   What should we do now oh great swami?
  • Reply 55 of 147
    freeper said:
    Consumer Reports is not into the "clickbait" thing. Also, they have no reason to target the MacBook Pro, which accounts for a very small percentage of PC sales. Note: this same fact is cited by an op-ed down the page, "No, Apple did not switch to USB-C on its new MacBook Pros to profit from dongle & adapter sales," which points out that Apple switching to USB Type-C on MacBooks is not some accessories-profiteering scheme, stating "Apple's best quarter ever for the Mac was the September 2015 frame, when the company sold 5.7 million computers."

    You may find fault with Consumer Reports' testing methodology, but the important thing to remember is that they use the same methodology to test devices from all manufacturers. Suggesting they adopt Apple's methodology for testing Apple devices, or have Apple play a role in designing the test for them, A) gives Apple an advantage that other manufacturers do not enjoy, and B) would not necessarily be helpful in identifying something that Apple may have missed due to flaws in their own QA program, "tunnel vision," etc.

    I will state that Consumer Reports' methodology is not that different from the "benchmarking" tests that I see on a lot of sites. Add that to the fact that Consumer Reports is hardly a tech site like AnandTech, so their testing is going to be more "general purpose," geared to the needs and use cases of the average user, just as they are not Motor Trend when it comes to their car reviews. AnandTech, ComputerWorld, Tom's Hardware, etc. probably would have followed up the Safari tests with a bunch of different tests on a variety of applications, but that isn't really Consumer Reports' job.

    And yes, this is likely a software issue as opposed to a hardware issue. Consumer Reports' own review stated as much, and also stated that they are going to revise it later when the software issue is fixed. And since the notorious resource hog and bug magnet "platform-as-a-browser" application Chrome gave better results than Safari, the problem is almost certainly Safari itself. Do not pretend as if Safari hasn't been a huge headache for years. Apple will release a patch for Safari to resolve this issue - or a patch in the OS that addresses whatever is causing the issue with Safari - Consumer Reports will revise the review, and all this will be forgotten. If it takes more than a week, it will only be because everyone is on Christmas vacation.

    So no conspiracies, no nefarious attempt by Consumer Reports to undermine Apple or make money. (Funny, none of these accusations are made when these same companies give Apple good reviews and good PR ... and when they give bad reviews and PR to Microsoft, Samsung and the other competition.) This is just yet another "Apple PLEASE do something about Safari" in a long line of them. And please improve iTunes while you are at it.
    CR is very well into the clickbait thing, as it sells visitors' data to advertisers, and "doesn't recommend" always brings more clicks than "recommends" where famous brands are concerned. Their vague statements about the tests and their refusal to give any meaningful description of them ("download ten pages" is not a test and would not exhaust the battery) show that this time CR has been caught with dirty hands. They may be "hardly a tech site," but measuring battery performance is still a tech issue that should be done under proper engineering rules; they cannot escape with that statement...
  • Reply 56 of 147
    Rayz2016 said:
    Rayz2016 said:
    Soli said:
    Do they just buy a single MBP for their testing?
    They bought two machines: a 13-inch and a 15-inch model. Ideally they should have bought two of each, but I'm not sure it would have made a difference in this case. 
    How would two of each help? What if one pair had shown inconsistent results, the other consistent? Get a third, tie-breaker pair? 

    The only relevant question is, did they do anything differently on this front than what they have done in the past with Apple and non-Apple products in these types of tests? Everything else is cherry-picking or sour-graping the results of the tests. 

    Okay, let me introduce you to a little testing methodology I like to call 'leaving no stone unturned'. With the internet and its mum screaming that every other MacBook Pro has a failing battery, I would have bought two machines of each configuration to make sure that I wasn't looking at a hardware problem. If I'm seeing the same problem with the same test on all machines, then chances are we're looking at a problem with the software. If some of the machines behave normally and one or two of them exhibit the odd behaviour, then I would be leaning towards a problem with the hardware. 

    I would also check memory while the test is running, but unlike you, I've tested hardware/software combinations before. If I see results that don't make sense, I try a second machine to make sure I'm not dealing with dodgy hardware, especially if I've run the same tests on other configurations. It saves a lot of time. And also, unlike you, I wouldn't count their second laptop, which happens to be a completely different configuration, as the second hardware-check machine. Two machines of each type; that's what I would go with.

    As for 'cherry-picking' and 'sour-graping'? I think you're mistaking that for people asking for more details of CR's methodology. Their first run recorded a battery life much higher than anything else recorded for this laptop. Rather than saying 'Wow, this laptop can go for 18 hours', the sour-grapers are saying, 'That sounds a bit high.' 

    What I suspect will happen is that this problem will be traced to a bug in the software (and since this doesn't seem to happen with Chrome, it looks like Safari might be the culprit) that Apple will fix.  What people are hoping for is that the test will show a problem with the hardware, which they somehow think will cause Apple to have a hallelujah moment and start making laptops bristling with legacy ports and a battery the size of an aircraft carrier.


    You can leave as many stones turned or unturned as you want, but it's irrelevant. Are you implying that CR should do special testing for Apple? Why? Do you have any evidence that they test other products in the way you suggest? C'mon.

    Their test replicates a real-world case where a typical consumer buys one randomly and uses it. Period. They're not a very wealthy publication (they might even be a non-profit, for all I know); they don't even accept ads. Cut them a little slack. There are a number of idiots posting here making all sorts of claims about these guys looking for bribes, looking for clickbait, etc. That's offensive nonsense.

    If the problem is a SW problem, great. Apple should -- and will -- fix it. I am not "hoping for" anything other than: (i) if it's not a problem to begin with, let's find out that it's not; (ii) if it's a problem, then let it get fixed. End of story.
  • Reply 57 of 147

    Soli said:
    Soli said:
    Do they just buy a single MBP for their testing?
    How many should they buy?
    I'd say at least two of each size class, but ideally at least one in each of the major configurations.
    Please see Para 2 of my reply to @Rayz2016 above. (The rest of my reply to him does not apply to yours).
  • Reply 58 of 147

    I am curious how many pay attention to CR.  I have never bothered and have had great results with what I use.
    Judging from the reactions here and elsewhere -- not to mention attracting the attention of Apple itself -- I guess plenty of people do. The fact that you "have never bothered" amounts to one data point that's neither here nor there.
  • Reply 59 of 147

    CR is very well into the clickbait thing, as it sells visitors' data to advertisers, and "doesn't recommend" always brings more clicks than "recommends" where famous brands are concerned. Their vague statements about the tests and their refusal to give any meaningful description of them ("download ten pages" is not a test and would not exhaust the battery) show that this time CR has been caught with dirty hands. They may be "hardly a tech site," but measuring battery performance is still a tech issue that should be done under proper engineering rules; they cannot escape with that statement...
    Groan. Please stop with this bullshit. See excerpt cut and pasted below (the full page here): https://ec.consumerreports.org/ec/cro/order.htm?EXTKEY=SG72CR0&gclid=COq81-7ijdECFQhMDQodUHEI1A
  • Reply 60 of 147
    slurpy Posts: 5,384 member
    voodooru said:
    I miss Steve. 

    Tim's too busy doing exclusive ABC interviews to repeat the same boring 'we're great' corporate lines. 

    Poor Phil has to do work during holiday week. 

    Maybe more attention to detail before release?? Ya'know... what Apple is actually famous for. 

    Working with CR??? How the mighty have taken a bruising! Not to mention looking to hire battery experts. One would think Apple had that under control with over two decades of laptop experience. 

    Thanks for the useless, shitty post. Obviously, Phil was trying to convey that he didn't think their results made any sense, and Apple is trying to find out WTF they did to get those results. He's casting doubt on them in a polite way, and it's justified. As for your trolling, if you were actually around during the Steve days, you would know that Apple had a lot of fuckups, most of which were much larger than anything under Cook's watch (when Apple is selling an order of magnitude more products, to a much whinier, entitled set of people who make 50,000 YouTube videos shared a billion times for every perceived and imagined bug). CR didn't recommend the iPhone 4 during Steve's watch, you hypocrite.