wiggin

About

Username
wiggin
Joined
Visits
32
Last Active
Roles
member
Points
258
Badges
0
Posts
2,265
  • Consumer Reports now recommends MacBook Pro after Apple software fix

    nht said:
    gatorguy said:
    nht said:

    lkrupp said:
    The kerfuffle uncovered a bug. Apple admits a bug caused the issue for CR’s testing. The bug was squashed. The new tests caused CR to change its recommendation. We should be happy.
    Exactly right. All the bile being vented over CR is really misplaced.  CR used the same testing parameters on multiple brands, and Apple alone failed. That's good science.  Apple found that the failure was due to "an obscure bug", and they fixed it.  Now CR is being responsible and cooperative by re-testing the Apple product(s) and giving a passing score.  Bravo!  Rather than pretend the bug didn't exist and point fingers, let's be happy that everything has worked out.

    Just because the average user might not have encountered the bug doesn't mean it didn't exist.  And CR found it.  GREAT!  Apple should have found it first.
    FALSE.  CR did NOT use the same testing parameters as on other brands.  Had they kept to their standard testing procedure they would have ended up with a 10.4 hour battery life.  Instead they changed their standard testing method to use only the lowest battery life score instead of an average, artificially scoring the MBP low even though they KNEW the test was flawed, given that Chrome produced the correct run times.

    You are mistaken, sir. Chrome is not a browser that the MacBook would have come with. That's why they used Safari, which would be the more typical one that Apple buyers would work with and the one pre-installed as the default. AFAIK they used the same test procedures with all other laptops, including previous Apple models, without major issue, and using whatever the default browser was. This was an odd man out, hence CU/CR reaching out to Apple for assistance in discovering the problem, which Apple identified and apparently has now fixed. That's a good thing, isn't it?
    I am not mistaken at all.  They admitted they deviated from their standard procedures in their article. Their standard procedure is to average the results of the tests.  They chose not to do so, using the lowest time instead.  Had they kept to their standard procedure they would have gotten a score that may have been lower than other MBPs but likely above their recommendation threshold.

    Perhaps true, but I'd bet that those other results that were averaged probably were all within the same ballpark, so taking the average as a single, representative number is valid. But if you have hugely varying numbers as in the first set of tests on the MBP, for a test that should produce fairly consistent, repeatable results, then presenting only a single number (the average) would border on testing malpractice. In the world of statistics it would be an average with a very low confidence level.

    And you seem to be confusing testing procedure with reporting. The report is the interpretation of the testing results, and that includes pointing out any anomalies. You don't just publish a low-confidence average without discussing the outliers. The fact that they told you they normally present the average, and then explained why they didn't in this case, is a level of transparency we should beg for more of in all of our news sources!

    Now, if you can show us where they tested other computers and got similarly wild results but then presented only the average, then we can talk about conspiracy, bias, and unfairness to Apple. Until then, these are just the facts.
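The averaging argument above can be illustrated with a quick sketch. The run times below are made up for illustration (not CR's actual numbers); the point is that the same "one average" summary hides very different realities depending on the spread:

```python
import statistics

# Hypothetical run times in hours -- invented for illustration only.
consistent = [10.1, 10.3, 10.2, 10.4]  # tight cluster: the average is representative
erratic = [3.75, 19.5, 8.0, 16.0]      # wild spread: the average hides the story

for label, runs in [("consistent", consistent), ("erratic", erratic)]:
    mean = statistics.mean(runs)
    stdev = statistics.stdev(runs)
    # Coefficient of variation: spread relative to the mean. A large value
    # means a single average is a low-confidence summary of the data.
    cv = stdev / mean
    print(f"{label}: mean={mean:.1f}h stdev={stdev:.1f}h cv={cv:.2f}")
```

For the tight cluster the coefficient of variation is around 1%, so reporting only the mean loses almost nothing; for the erratic set it is over 60%, which is exactly the situation where reporting a bare average without the outliers would be misleading.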
  • Consumer Reports now recommends MacBook Pro after Apple software fix

    bulk001 said:
    How is this a backpedal? CR had an issue with the battery and reported it. The report got Apple's attention and a fix. As the problem doesn't exist any longer they change their recommendation.

    They reported it as a problem with the MacBook Pro, when in fact it was a problem with Safari (which runs on any Mac). That's an important distinction. This was not a MacBook Pro design flaw or battery issue.
    Isn't it Apple's claim that their products are better because they "make the whole widget"? Therefore the whole widget should be subject to testing. Battery life testing has little meaning outside of the context of the software being used.
  • Consumer Reports now recommends MacBook Pro after Apple software fix

    Soli said:
    Apropos of nothing, I'm just wondering how one gets a job doing these battery tests. You'd be paid basically to browse the web, watch videos, etc. for hours until the battery runs out. Sounds like a pretty good gig.
    It's a script that pulls locally cached pages repeatedly, hence the need to disable caching when running their test.

    It's not an unsound test, but it's an incomplete way to test the battery life of any modern personal computer. There may never be one magic suite of scripts that suits everyone, but a better approach would be: run a script that exercises a single type of task through a complete test, repeat it at least twice, then run a script that exercises another app for a common usage in the same way, and so on. That should give us enough info to figure out how well any computing device will work for us. Too bad macOS doesn't have the same nifty "which apps have used what percentage of power since the last charge?" option that iOS has.
    Testing is far more complicated than many people here seem to realize. Sure, you could create a script with 10,000 real-world websites to load rather than the static set of in-house sites CR used (was it 10 "fake" sites, if I recall correctly?). That way you could leave the cache enabled. But that would be a completely non-repeatable test, as the content of those sites could change from one run to the next. And besides, if you cycled through that list of sites one by one, you'd be loading new content with every page load...which would have the same effect as disabling the cache!

    Disabling the cache is the simplest way to get repeatable results. The fact that they got very erratic results would in no way be explainable by the disabled-cache setting...that should increase consistency, not decrease it. In hindsight, with Apple's assistance, yes, it's a head-slapper when you tie it back to the developer setting. But at the time of testing there was probably little to suggest that the developer setting they used was what activated the bug it uncovered.
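The "new content with every page load defeats the cache anyway" point can be sketched with a tiny cache-busting helper. This is hypothetical illustration code, not CR's actual test harness, and the localhost page URLs are invented:

```python
import itertools

# Monotonic token generator so each generated URL is guaranteed unique.
_token = itertools.count()

def bust_cache(url: str) -> str:
    """Tag a URL with a unique query string so any cache treats each
    request as brand-new content -- the same effect as disabling the cache."""
    return f"{url}?nocache={next(_token)}"

# A repeatable battery test would cycle a fixed, locally served page set;
# because every generated URL is unique, each pass forces a full re-fetch.
pages = [f"http://localhost:8000/page{i}.html" for i in range(10)]
one_pass = [bust_cache(p) for p in pages]
```

The design point is the trade-off discussed above: a fixed local page set keeps the workload repeatable, while the unique query string (or an outright disabled cache) keeps every load doing real network and rendering work.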
  • Consumer Reports now recommends MacBook Pro after Apple software fix

    NY1822 said:
    Apple did not "fix" any bug....the only evidence we have of what Apple did is in the last line of the article:
    "After we asked Consumer Reports to run the same test using normal user settings, they told us their MacBook Pro systems consistently delivered the expected battery life"

    Consumer Reports' comment of "with the updated software" could mean "us doing it the correct way this time," NOT "Apple updated their software."
    Wrong. From what I've read elsewhere, the new tests with the Apple bug fix were run with the cache still disabled.
  • Consumer Reports now recommends MacBook Pro after Apple software fix

    No. Not after "software fix", after their FRAUD was exposed to everyone.

    Running the machine with a hidden developer setting that is DESIGNED to reduce performance, and then not disclosing that fact, and then claiming an independent test shows the machine under-performs the manufacturer's claims, is FRAUD.  Possibly libel/slander as well.

    CU is not a credible organization.
    And yet the new tests were run with the cache still disabled, and they got over 15 hours of run time on every MBP they tested. So clearly disabling the cache was not the cause of the battery issue. Apple's bug was the cause. Would this still be considered a "non-standard" configuration? Sure. But it was not the cause of the erratic battery performance.

    In fact, we could now argue that CR's testing methodology is flawed because it wildly over-estimates battery life. 18+ hours in one test?!?! That's absurd! But I don't see many people complaining about that result.