AppleZulu

About

Username: AppleZulu
Joined:
Visits: 261
Last Active:
Roles: member
Points: 9,259
Badges: 2
Posts: 2,580
  • Can Apple's HomePod take on a surround sound theater system?

    AI really does need to think harder about methodologies for these analyses. In this case the outcome probably doesn’t change much, but wow. Using poor source material like a YouTube clip is dubious enough. Then, after a fail, the tester proceeded to run a comparison with completely different source hardware and software. I’m not sure why anyone would have decided to play the clip from an iPhone over AirPlay in the first place, but when the HomePod setup ruled that option out, any good tester would have redone the surround-setup test with the AppleTV’s native YouTube app, in order to match the HomePod test as closely as possible with the same source material. Instead, AI used a completely different source setup for the two tests, which is about as arbitrarily unscientific as you can get.

    There was a recent controversy over HomePod testing at Consumer Reports. I pointed out then that CR erred in its effort to be fair and consistent by testing all the hardware in a sound-deadened room. CR is used to comparing standard speakers, and didn’t want room acoustics to affect comparisons of their output. When testing HomePods, however, such a deadened room defeats the key HomePod feature of actively measuring an imperfect room’s acoustics and using them to create its adapted output. The solution would have been for CR to compare the HomePod to other devices in a real-world setup, not to put the HomePods in a different environment and leave the others in the ‘dead’ room.

    Likewise, in this test, AI should have matched everything that is possible to match of the source hardware, software and content in order to make a direct comparison. Instead, they punted halfway through the test and compared completely different setups, which is just plain amateurish.
  • Review: HomeKit compatible First Alert Onelink Safe & Sound smoke detector more than it ap...

    macxpress said:
    williamh said:
    crabby said:
    In our end of the valley we get episodic power outages. Absent a $6500 Tesla Powerwall , that leaves a window of vulnerability which I have managed with battery-operated Nests and an UPS for the WiFi set up. It does not appear this device has that option.
    I read the listing on Amazon and it has a built-in 10 year backup battery.  I think they addressed that pretty well.
    Most smoke detectors today have a 10yr lifespan. If they don't, then I wouldn't buy it. 
    The actual sensor degrades below acceptable levels after ten years, and then the random false alarms start happening, but only between midnight and six a.m., which is pretty remarkable even for non-‘smart’ devices that don’t have any sort of clock in them. Seriously, though, it’s because of this degradation that a long-term built-in backup battery is actually mandated not to last longer than ten years. The sensor becomes less reliable, so a twenty-year battery isn’t allowed, because it would motivate owners to keep using the devices beyond the reliable lifespan of the sensors.

    I’d be more interested in the reviewed product if it had a version without Alexa spyware, without the added expense of the music feature (I already have HomePods, which surely sound way better than smoke-alarm Muzak), and with a false-alarm silencer button on the bottom of the device, where you can actually reach it with a stick or kitchen utensil without climbing on the furniture. My significant other is tolerating my rollout of HomeKit stuff, but would probably just go ahead and burn the house down with me in it if told that silencing a cooking-induced alarm now requires opening an app on a phone left out in the car, or a conversation with Siri made more difficult by the screaming smoke alarm in the background, or climbing on the furniture to push a button inexplicably located on the side of the alarm. That last option invites hearing damage, since it puts the user’s head inches away from an alarm blasting at probably 120+ decibels at the source, just to reach up and push that side button.
  • 2016 MacBook Pro butterfly keyboards failing twice as frequently as older models

    cgWerks said:
    AppleZulu said:
    A sample has to have a valid relationship to the whole, and you can’t ignore a decrease in your sample denominator when comparing one sample to another and then claim that a portion of that sample has thus ‘doubled.’ That’s a huge, glaring math error. AI made some very broad and dubious assumptions about how their samples relate to the whole, and then made comparatively fine-grained year-to-year comparisons of data in their samples while ignoring important aspects of the required math. They then turned their bottom-line calculation, that keyboard failures have ‘doubled,’ into a headline, and others picked it up as though it’s the gospel truth. It’s not. It’s just not.

    The keyboard issue could be a big deal, or it could be another case of a problem amplified by the fact that the people who don’t experience the problem don’t bother to post their nominal experience online, or bring their perfectly functional machine to the shop for repairs. AI has essentially taken that scenario and wrapped it in a pretend statistical analysis and presented it as though it’s scientifically proven to be a big deal. The truth is, we don’t know. We just have anecdotal complaints.
    Maybe the problem is that I'm not a statistician, but I'm not understanding the problem. If you take a sampling of repair centers and compare the number of repairs for one series vs another series of product, I'd think that's fine... as far as it goes. Or, are you saying there is a problem in how they calculated the 'double' aspect based on that data?

    Anyway, the actual failure rate is fairly irrelevant to me, aside from this possibly backing up everything I'm hearing anecdotally. IMO, there are way more 'failed' keyboards for this series, even if they never made it to a repair center. Only once have I had to take a can of compressed air to my MacBook Pro keyboard in decades of using them... and then it was my fault for too much snacking over my laptop. If dust specks are causing keyboards to 'fail,' then it's a problem even if they aren't fully breaking.

    But, sure, I'd like such evidence to be as accurate as possible, so your remarks make me curious.
    Let’s back into the question. What would a failure rate for keyboards on MacBooks mean? Ultimately, it would be a percentage of the machines made that have keyboards that go bad. So that would be a scalable number, right? The AI article talks about a keyboard failure rate of 11.8 (almost 12!) percent for 2016 machines, nearly doubled from 6.0 percent in 2014. 

    Does that mean that of every 100 MacBooks sold, nearly 12 of them had bad keyboards? For every thousand machines, nearly 120 had bad keyboards? That would be catastrophic! But see, that’s not what their number actually says. They’re saying that 12% of the reported service tickets at the shops they surveyed were for keyboard issues. That’s an entirely different thing, and it doesn’t say anything predictive about the failure rate for a given number of machines sold. It’s not scalable. It doesn’t actually tell you anything. 

    If you were going to take a sample to determine the failure rate for keyboards on MacBooks, you would need a randomized sample of machines out there being used, and the sample would have to be large enough to statistically represent the whole population. So it would be something like: of 1,000 MacBooks in use during the first year of a given model, how many required keyboard repairs? Then you could compare that rate from one year to the next. But you’d have to include in that sample all the machines, including those that needed no repairs at all. That’s not what AI did here. They just looked at the machines that showed up at repair shops, and calculated what share of those repairs were specifically keyboard repairs. Without knowing how many machines never showed up at a repair shop at all, you can’t extrapolate anything back out to the total population. It’s essentially self-selection bias.

    Second, the number of repair tickets in their sample was different for each year, 2014, ‘15, and ‘16. The 11.8 percent ‘keyboard failure rate’ they reported for ‘16 came from a pool of repair tickets one-third smaller than 2014’s. So, a third fewer 2016 MacBooks came in for repairs, but a larger percentage of those repairs were for keyboards. If that information conveys anything you could extrapolate back out to the total population, it’s that slightly more 2016 MacBooks than 2014 models appeared to have keyboard issues (but nowhere near double), while 2016 models were actually a third less likely to need any sort of repair at all. There are enough questions about their sampling methodology that I wouldn’t actually assert that, either, but a better case could be made for it than for the sensationalized but unsubstantiated assertion that keyboard failures ‘doubled.’
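    The denominator effect described above is easy to sketch with hypothetical, illustrative numbers (these are not AI's actual survey figures): hold unit sales constant, shrink the total ticket pool by a third, and the keyboard share of tickets can nearly double while the per-machine rate rises only modestly.

```python
# Hypothetical, illustrative numbers -- not AI's actual survey data.
# They show how the keyboard share of repair tickets can nearly double
# even though the per-machine failure rate rises only modestly, simply
# because the total ticket pool shrank by a third.

machines_sold = 100_000            # assume similar unit volume each year

tickets_2014 = 1_500               # total repair tickets in the sample
keyboard_2014 = 90                 # 6.0% of tickets were keyboard repairs

tickets_2016 = 1_000               # a one-third smaller ticket pool
keyboard_2016 = 118                # 11.8% of tickets were keyboard repairs

share_2014 = keyboard_2014 / tickets_2014      # 0.060
share_2016 = keyboard_2016 / tickets_2016      # 0.118 -- "nearly doubled!"

# The scalable statistic would be failures per machine in use:
rate_2014 = keyboard_2014 / machines_sold      # 0.00090
rate_2016 = keyboard_2016 / machines_sold      # 0.00118 -- up ~31%, not 2x

print(f"ticket share: {share_2014:.1%} -> {share_2016:.1%}")
print(f"per-machine rate: {rate_2014:.3%} -> {rate_2016:.3%}")
```

    The ticket-share comparison only looks dramatic because machines that never needed repair drop out of the denominator, which is exactly the self-selection problem.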
  • Consumer Reports' dismissal of HomePod a familiar tale to Apple fans [u]

    gatorguy said:
    gatorguy said:
    "Apple says that every time you move the speaker, it senses the motion, then automatically adjusts itself to its placement in the room using a series of test tones and complex algorithms to minimize reflections from nearby walls or other objects. That’s not a feature we evaluated."

    ...is such a core aspect of the design...?  Could that explain why they found the sound (or EQ) less to their liking...?
    Is that quote from their actual review?

    Not evaluating a core feature/aspect of a product's design that has a direct bearing on performance is very un-scientific.
    It is an actual quote, yes.
    Mike, since it's something that's done seamlessly, it's not something they should have to consider, is it? The HomePod should sound its best no matter whether it's been moved from one spot to another.
    I don't have an issue with them not testing that particular feature, and I didn't mention it in the story because it wasn't relevant to the larger point and would inflame for no real reason.

    My issue is the drive-by assessment by CR, with no discussion behind the opinion, made to try to capture the headline cycle. Why didn't they finish the evaluation and publish after their customary month?
    With that I agree. I've not read the original CR piece. Did they say they were doing a more complete review this next month? If not, it does lean towards a drive-by.
    "Full test results for these speakers, which also incorporate factors such as ease of use and versatility, will be released in the next few weeks."

    As far as I can tell, this is the first time they've pre-announced results from a test that won't be complete for a few weeks.
    That really does fly in the face of what Consumer Reports claims to do. Publishing a conclusion while offering no reasons for it is of little use to the reader. 

    I’ll make some predictive notes, though, based on what you can see in the little video clip that appears to show their testing room. First, that room is a horrible setup for any speaker. Every set of speakers they tested is piled together on a couple of rows of shelving, and they’re all in a room with sound baffling on the walls and ceiling.

    Second, that’s not a particularly conducive environment for any speaker, but (much like they did in their recent MBP test) they’ve actually created an environment that defeats the HomePod’s key feature. All the sound baffling will kill much of the testing and calibration that the HomePod is programmed to do. Plus, most people’s homes have hard surface walls and ceilings, which the HomePod uses to its advantage. 

    So their testing methodology appears to compare the speakers’ raw output in a dead space, with the listener positioned right in front of the speaker. Plus, the testers have manually adjusted EQ and whatever else for the other speakers to try to match them to the output of their reference speakers (wouldn’t this introduce a certain sort of bias?), while the HomePod must simply run its programs to create a flat frequency response in a dead room that is unlike most real-world environments and is probably a standard deviation or two away from what it’s normally programmed to handle. They think they’re setting up a dispassionate laboratory test, but by ignoring the fact that a new device functions in a fundamentally different way, their test is anything but dispassionate and scientific. It entirely misses the point of innovation.
  • HomePod review: Your mileage may vary, but crank it up for the ride

    chasm said:
    Obviously this is a very sincere review from a real HomePod owner; I've been paying attention to those. The things Neil singled out as the biggest issues, it seems to me, could all be fixed in future software updates, so I'm not particularly concerned about them -- to me it is obvious that you will eventually be able to set up a HomePod with an Android phone, since you can subscribe to Apple Music on those devices (as an example), but it's not here yet. The first year, I predict, will bring a number of software updates to HomePod (starting this spring).

    I agree with Foggyhill that this review's overall score should have been broken out, primarily because of the "potential" as Neil called it. From every review I've seen, the speaker is a 4.5 or 5 out of five, and Siri is a 2.5 or 3 out of five. Since Siri will do nothing but grow (though I very much doubt it will ever be on par with the iOS version, since it's not really intended to do so), breaking the score out in this manner makes it clearer to skimmers who don't read much or any of the article that if you're buying it for music, it's a great deal for the money. If you're buying it to be as much of a vocal assistant as Alexa, look elsewhere -- or check back in a year or two.
    I wouldn't hold my breath for the HomePod-setup-with-an-Android-phone thing. Because Apple has positioned itself as the company that pays attention to security, it's highly unlikely they will provide a non-Apple portal into the HomePod's setup and security settings. Security is only as strong as its weakest link, so enabling a much more easily hackable device to set up an Apple device would represent an entirely unacceptable security hole. Because the HomePod can serve as a HomeKit hub, keeping its setup secure is paramount. The whole point of a HomeKit hub is to ensure a secure, encrypted connection from outside the home to inside it, so that there is no insecure point of entry allowing surreptitious control of automated lights, locks and garage door openers. They're not going to risk a hacked Android phone being used to transfer hacked settings to a HomePod, which would then open the door (in some cases literally) to nefarious access to HomeKit-controlled devices.