Apple's HomePod isn't about Siri, but rather the future of home audio

Comments

  • Reply 81 of 142
    chelin Posts: 106, member
    I’m starting to think that there will never be a HomePod. Who would buy it?

    In order for it to compete someone has to be willing to give up their existing system. Perhaps someone moving into their first apartment without a sound system?

    For us, the sound has to be substantially better than our Sonos system. We also gave up HomeKit, as the latency with Alexa is about 2-3 seconds versus a minute with Siri. There's also only a 50/50 chance that the HomeKit hookup works, compared to 100%.
  • Reply 82 of 142


    This is a very basic example of what the multiple tweeters in the HomePod can do to create a wider soundstage and also to eliminate problems with phase (and it will look familiar to anyone who watched the HomePod video).

    When sound is reflected off the rear walls, it has a longer path to take to get to the listener than sound coming directly from the speaker. In my example above, the total time for sound to reach the listener is 2.5ms from the front driver and a total of 8.0ms from the rear drivers. When the sound reaches your ears it could be perfectly in phase, completely out of phase or most likely, somewhere in between.

    A 360 degree speaker like the Google Home or Amazon Echo can't do anything to compensate for any potential phase issues that can arise when sound reflected off walls and other objects interacts with sound that travels directly to the listener. And given their target audience and use, it doesn't really matter. These devices are not used for any serious music listening and are typically just for background music.

    Apple has told us the HomePod can use beamforming to direct sound. Beamforming has two requirements to work: you need multiple drivers and you need to be able to adjust the phase individually for each driver. Phase can be adjusted mechanically (physically changing the position of a driver or speaker) or electrically (through digital time delay). Obviously, the HomePod uses digital time delay to adjust phase.

    In the example above, the sound to the rear tweeters would be sent out normally. However, the sound to the front tweeter would be delayed by 5.5ms. This delay allows the rear sound to "catch up" to the direct sound such that by the time it's reflected off the rear walls and starts moving forward it will end up being in phase with the sound from the front tweeter. All the possible issues that can arise with sounds being out of phase are thus eliminated. Since the HomePod also has 6 microphones, calculating this delay time would be fairly straightforward. A few clicks or other test tones played through the individual tweeters can be measured by the microphones so the HomePod can determine exactly how far away it is from any walls and set the appropriate delay time accordingly.
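    As a rough illustration of the arithmetic involved (my own sketch, not anything Apple has published; the path lengths are back-calculated from the 2.5ms/8.0ms example), the delay is just the path-length difference divided by the speed of sound:

```python
# Illustrative sketch of the time-delay calculation described above.
# Function names and example distances are mine, not Apple's.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def path_delay_ms(path_length_m):
    """Milliseconds for sound to travel the given path length."""
    return path_length_m / SPEED_OF_SOUND * 1000.0

def front_delay_ms(direct_path_m, reflected_path_m):
    """Delay to apply to the front tweeter so its direct sound ends up
    in phase with the wall-reflected sound from the rear tweeters."""
    return path_delay_ms(reflected_path_m) - path_delay_ms(direct_path_m)

# The example above: ~0.86 m direct (2.5 ms) vs ~2.74 m reflected (8.0 ms)
print(round(front_delay_ms(0.8575, 2.744), 1))  # prints 5.5
```

    The microphone-measured click timings described above would feed straight into `reflected_path_m`.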

    It's important to note that what I described above isn't actually beamforming. It's a well established method of using time delay to control phase and improve sound quality. Beamforming is more complex but still relies on the same basic principles (precisely controlling phase to multiple drivers to direct where sound goes).

    When Apple says beamforming it's not a marketing term. The HomePod has the proper hardware and layout to enable beamforming. All the rest (Sonos, Google, Amazon) don't.
  • Reply 83 of 142
    gatorguy Posts: 24,176, member


    This is a very basic example of what the multiple tweeters in the HomePod can do to create a wider soundstage and also to eliminate problems with phase (and it will look familiar to anyone who watched the HomePod video).

    When sound is reflected off the rear walls, it has a longer path to take to get to the listener than sound coming directly from the speaker. In my example above, the total time for sound to reach the listener is 2.5ms from the front driver and a total of 8.0ms from the rear drivers. When the sound reaches your ears it could be perfectly in phase, completely out of phase or, most likely, somewhere in between. Phase can be adjusted mechanically (physically changing the position of a driver or speaker) or electrically (through digital time delay). Obviously, the HomePod uses digital time delay to adjust phase.

    In the example above, the sound to the rear tweeters would be sent out normally. However, the sound to the front tweeter would be delayed by 5.5ms. This delay allows the rear sound to "catch up" to the direct sound such that by the time it's reflected off the rear walls and starts moving forward it will end up being in phase with the sound from the front tweeter. All the possible issues that can arise with sounds being out of phase are thus eliminated. Since the HomePod also has 6 microphones, calculating this delay time would be fairly straightforward. A few clicks or other test tones played through the individual tweeters can be measured by the microphones so the HomePod can determine exactly how far away it is from any walls and set the appropriate delay time accordingly.

    It's important to note that what I described above isn't actually beamforming. It's a well established method of using time delay to control phase and improve sound quality. Beamforming is more complex but still relies on the same basic principles (precisely controlling phase to multiple drivers to direct where sound goes).

    When Apple says beamforming it's not a marketing term. The HomePod has the proper hardware and layout to enable beamforming. All the rest (Sonos, Google, Amazon) don't.
    But do at least a couple of those (Sonos/Google) have the hardware needed to accomplish what you diagrammed, other than reflecting off the back wall, which doesn't seem like a necessary feature? (Why not just compute the time delay firing forward?) I think so. Both the Sonos One and Home Max appear perfectly capable of computing delay times and adjusting phase to match your room structure (wall angles, size, furniture placement, other reflecting/absorbing surfaces, etc.). We already know for a fact that Google uses beamforming in concert with the microphones for better voice recognition. It's a short drive from there to using the same technology for much of the same sound adjustment Apple might be accomplishing through other methods, isn't it? As for beamforming, and despite your stated years of professional audio experience, I think you yourself have realized your knowledge of what Apple may be doing with it is somewhat lacking, since you are having some difficulty expressing how it will benefit the HomePod or why it disadvantages others who lack it.

    Now having said all that I personally expect the Google Home Max to be noticeably bass-heavy just as the original 2016 Google Home was. I'm not a fan of boom-sound myself and I've no doubt many others would agree. Fortunately for those owners there is an included equalizer now. It's certainly possible the Home Max may suffer some muddiness.

    Apple will of course have a well-thought-out Home Pod with very good sound and well matched to other Apple products. It may well bring in more revenues than other mid-range smart-speakers by a significant margin, which at the end of the day is all this is about: More profits.  

    Sonos One with Alexa (other voice assistant support coming soon) is what I would expect to have the better features for most folks and at least comparable overall sound to the Home Pod if not a bit fuller, particularly for the price which is significantly undercutting both Apple and Google. That's before the expected Play:5 replacement this next year that will likely support Amazon Alexa, Google Assistant and Apple's Siri at launch.

    Another plus for some HomePod competitors is much better cross-platform and third-party support: services like Spotify, Pandora, iHeartRadio, Amazon Music, TuneIn and others that won't be supported by Apple. IMHO, if you're not already deep into the Apple ecosystem there are better-featured products for you than the HomePod, but for dedicated Apple fans the HomePod might be (and probably will be) the best choice among the three, as long as your music comes only from Apple Music/iTunes or your personal library.

    And again this is just personal opinion but at some point Apple's Homepod will have to support some 3rd party streaming. As is they are too limited IMO, assuming of course that the product they eventually ship is the same as the one they demo'd and spec'd back in the summer.
    edited December 2017
  • Reply 84 of 142
    I wonder if Apple went after the wrong market here. I just helped my sister set up an Echo yesterday and she absolutely loves it. And she’s someone all-in on Apple devices. But she would never pay $350 for a speaker. The Echo she got was $69 with tax. The audio isn’t amazing, but I’ve heard worse and she thought it sounded great. On TV I’m constantly seeing commercials for the Google Home Mini starting at $29. Where is the evidence people are willing to pay a significant premium for smart speakers, and where is the evidence the HomePod will be good enough for serious audiophiles? It seems like it’s in this niche space that will appeal to Apple die-hards but not a mass market.
  • Reply 85 of 142
    cropr Posts: 1,122, member
    kevin kee said:
    waverboy said:
    Regardless of this hyperbiased article, Siri still needs a lot of work.  There's simply zero excuse for it not being up to the level of Alexa.  It's half-assed in comparison, which I would hope is uncharacteristic of Apple.
    Says you. I don't think Siri is worse than Alexa, or even that Alexa is better than Siri, because the two are not exactly the same. I have never had any problem with Siri; she understands me 100% of the time. What I think Siri lacks is the ability to store information about you. Alexa is equipped to gather as much information about you as possible so that it can cater to your needs even when you don't ask. Siri will not do that, for privacy reasons obviously. But for day-to-day use, Siri is more than capable of handling any of your requests. Play music, set up a calendar, set up a meeting, set a timer, check email, check game scores, etc.: she does it brilliantly. Asking if you need to buy new underwear? Not so much; that is what Alexa does.
    I live in Antwerp, Belgium. I did a small test asking Siri and Google Now for the route to 5 main streets in Antwerp: Americalei, Grote Steenweg, Meir, Noorderlaan, and Desguinlei. I did the test in Dutch, the local language in Antwerp.
    Siri had "Noorderlaan" correct and recognized "Grote Steenweg", but pointed to the "Grote Steenweg" in Mortsel, a nearby city. It failed on the rest, which is very disappointing.
    Google Now had 4 correct, only missing "Desguinlei", which is a tricky one because of its difficult pronunciation.

    Although this is only a small test that does not cover all the use cases, it shows that Siri is nothing to be proud of.
  • Reply 86 of 142
    dewme Posts: 5,335, member

    It's important to note that what I described above isn't actually beamforming. It's a well established method of using time delay to control phase and improve sound quality. Beamforming is more complex but still relies on the same basic principles (precisely controlling phase to multiple drivers to direct where sound goes).

    When Apple says beamforming it's not a marketing term. The HomePod has the proper hardware and layout to enable beamforming. All the rest (Sonos, Google, Amazon) don't.
    Your explanation is a reasonable approximation, and in fact early beamformers used simple electromagnetic delay lines to achieve spatial definition. What's not apparent from your diagram is that the HomePod (as described by Apple) is not simply a way to "optimize" the listening experience for a listener sitting in one static position in the room. It is also not tracking listeners as they move around the room. What I suspect it is doing is forming multiple beams radially around the speaker (360 degrees), kind of like the petals on a daisy viewed from above. A listener situated anywhere around the HomePod should hear essentially the same composite sound.

    This would work ideally in an open auditorium where there are no obstructions anywhere in the 360 degrees around the HomePod. When there are obstructions, you will get multi-path interference at multiple listening positions, as ericthehalfbee shows. The HomePod will have to sample the generated sound from every beam to determine what type of compensation is needed to reduce the interference for all beams. This may include attenuating some of the speakers in the array that are projecting into the obstruction as well as adapting the beamforming, effectively killing certain beams. Whatever it does, it must be a compromise strategy that works well enough for all transmit beams, since the goal of the HomePod is always to provide listeners situated anywhere around the room with a subjectively good listening experience, not just one listener in one location.

    I'm fairly certain there will be some measurable differences in the sound at different angles due to the compromises and unique geometries involved with different room layouts, but it should still be qualitatively and quantitatively better than the sound produced without beamforming. Using 2 HomePods together will also change the sound dynamic because each speaker may have different obstructions and room geometries to contend with.
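    To make the radial beam-steering idea concrete, here is a toy delay-and-sum model for a circular driver array. This is purely illustrative: the driver count, array radius, and geometry are assumptions for the sketch, not Apple's actual specs or algorithm. Each driver is delayed so that the individual wavefronts line up on a plane perpendicular to the chosen beam direction:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def steering_delays(n_drivers, radius_m, beam_angle_rad):
    """Per-driver delays (seconds) that steer one beam from a circular
    array toward beam_angle_rad, delay-and-sum style. Drivers are
    assumed evenly spaced around the ring (an illustrative model)."""
    delays = []
    for k in range(n_drivers):
        driver_angle = 2 * math.pi * k / n_drivers
        # Projection of this driver's position onto the beam direction.
        projection = radius_m * math.cos(driver_angle - beam_angle_rad)
        # Drivers farther along the beam direction fire later, so their
        # output joins the wavefront arriving from the drivers behind them.
        delays.append((projection + radius_m) / SPEED_OF_SOUND)
    return delays

# e.g. a 7-tweeter ring of 8 cm radius, one beam steered straight ahead
print(steering_delays(7, 0.08, 0.0))
```

    A multi-beam device would compute a delay set like this per beam (one "daisy petal" each) and mix them, attenuating or dropping beams aimed into obstructions, as described above.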

    Beamforming is not a marketing term by itself. But technical terms are quite often used to create an air of sophistication and/or technical prowess that is absolutely intended to convey product superiority. In the case of the HomePod, Apple is using beamforming terminology to back up their assertion that HomePod will provide a better listening experience for all listeners in a room, regardless of where they are situated. It's a bit of technical "why" to bolster a claim and is clearly placed at the intersection of technology and marketing. I don't see this as a negative unless the actual product does not live up to expectations and beamforming subsequently takes on a negative connotation. But yeah, at this point Apple is using beamforming as a marketing term for the majority of consumers because they could have simply said "HomePod will sound great no matter where you are sitting or standing in the room."
    edited December 2017
  • Reply 87 of 142
    k2kw Posts: 2,075, member
    I wonder if Apple went after the wrong market here. I just helped my sister set up an Echo yesterday and she absolutely loves it. And she’s someone all-in on Apple devices. But she would never pay $350 for a speaker. The Echo she got was $69 with tax. The audio isn’t amazing, but I’ve heard worse and she thought it sounded great. On TV I’m constantly seeing commercials for the Google Home Mini starting at $29. Where is the evidence people are willing to pay a significant premium for smart speakers, and where is the evidence the HomePod will be good enough for serious audiophiles? It seems like it’s in this niche space that will appeal to Apple die-hards but not a mass market.
    They had better either have a vastly improved Siri built in or drop the price to move volume on this ($299, or $999 for 4). The fact that Amazon developed Alexa in just a few years shows that you can build an assistant with very good voice recognition quickly. I just don't think it's been a priority at Apple. It has been treated more like the proverbial stepchild.
  • Reply 88 of 142
    gatorguy said:
    FWIW the current Google Home (from last year) has 360 degree sound, and beamforming is used at least for voice recognition purposes. The current Echo also features 360 degree sound. 
    The Amazon Echo (2nd gen) has a single woofer and a single tweeter (no stereo). The woofer fires down and the tweeter fires up. They claim it's "360 degree sound" simply because the speakers aren't outward facing and the circular grille allows sound to escape on all sides. You could place a boombox on the floor facing up and basically claim the same thing. In other words, it's a half-baked solution. The current Google Home has speakers that face outward... basically one on each side + woofer. Does that really give you 360 degree sound? Only if you want to include reflected sound waves. The side facing speakers aren't really going to cover 360 degrees themselves. Also, you can forget about stereo since the tweeters face in opposite directions. 
  • Reply 89 of 142
    smaffei Posts: 237, member
    The assertion that comparing HomePod and Echo is like comparing "Apples to Oranges" is simply ludicrous. They both offer similar features and provide similar functionality. 

    This is just fanboi back-pedaling because the Echo is selling like hotcakes this holiday and Apple FAILED to get their overpriced, seriously limited product to market in time for Christmas. Denial in its finest form.

  • Reply 90 of 142
    If HomePod performs as well as a Sonos Play 5, I’ll eat my shoe. 
  • Reply 91 of 142
    gatorguy Posts: 24,176, member
    dewme said:

    It's important to note that what I described above isn't actually beamforming. It's a well established method of using time delay to control phase and improve sound quality. Beamforming is more complex but still relies on the same basic principles (precisely controlling phase to multiple drivers to direct where sound goes).

    When Apple says beamforming it's not a marketing term. The HomePod has the proper hardware and layout to enable beamforming. All the rest (Sonos, Google, Amazon) don't.
    Your explanation is a reasonable approximation, and in fact early beamformers used simple electromagnetic delay lines to achieve spatial definition. What's not apparent from your diagram is that the HomePod (as described by Apple) is not simply a way to "optimize" the listening experience for a listener sitting in one static position in the room. It is also not tracking listeners as they move around the room. What I suspect it is doing is forming multiple beams radially around the speaker (360 degrees), kind of like the petals on a daisy viewed from above. A listener situated anywhere around the HomePod should hear essentially the same composite sound. This would work ideally in an open auditorium where there are no obstructions anywhere in the 360 degrees around the HomePod. When there are obstructions, you will get multi-path interference at multiple listening positions, as ericthehalfbee shows. The HomePod will have to sample the generated sound from every beam to determine what type of compensation is needed to reduce the interference for all beams. This may include attenuating some of the speakers in the array that are projecting into the obstruction as well as adapting the beamforming, effectively killing certain beams. Whatever it does, it must be a compromise strategy that works well enough for all transmit beams, since the goal of the HomePod is always to provide listeners situated anywhere around the room with a subjectively good listening experience, not just one listener in one location. I'm fairly certain there will be some measurable differences in the sound at different angles due to the compromises and unique geometries involved with different room layouts, but it should still be qualitatively and quantitatively better than the sound produced without beamforming (all other things, including speaker quality/enclosure, being equal). Using 2 HomePods together will also change the sound dynamic because each speaker may have different obstructions and room geometries to contend with.

    Beamforming is not a marketing term by itself. But technical terms are quite often used to create an air of sophistication and/or technical prowess that is absolutely intended to convey product superiority. In the case of the HomePod, Apple is using beamforming terminology to back up their assertion that HomePod will provide a better listening experience for all listeners in a room, regardless of where they are situated. It's a bit of technical "why" to bolster a claim and is clearly placed at the intersection of technology and marketing. I don't see this as a negative unless the actual product does not live up to expectations and beamforming subsequently takes on a negative connotation. But yeah, at this point Apple is using beamforming as a marketing term for the majority of consumers because they could have simply said "HomePod will sound great no matter where you are sitting or standing in the room."
    Thanks. Your explanation is far more helpful. I like the visual of petals on a daisy, very descriptive. 

    BTW, something I meant to comment on earlier: IMO Amazon and Google, and to a lesser extent Sonos, are competing more with each other than with Apple's HomePod. Well, perhaps Sonos has an eye on the HomePod (adding Alexa was relatively easy), but not so much Google and Amazon. The HomePod is for those already embedded in Apple's ecosystem, and that's a pretty big segment. They'll of course be successful with it, just as they are with nearly every Apple product.

    The Home Max has been in the works for over a year, and Amazon is of course always actively developing newer versions of its Echo products. Neither was developed as a knee-jerk response to the HomePod, but between Amazon and Google, the former is well in the lead, at least in mindshare.
    edited December 2017
  • Reply 92 of 142
    I wonder if Apple went after the wrong market here. I just helped my sister set up an Echo yesterday and she absolutely loves it. And she’s someone all-in on Apple devices. But she would never pay $350 for a speaker. The Echo she got was $69 with tax. The audio isn’t amazing, but I’ve heard worse and she thought it sounded great. On TV I’m constantly seeing commercials for the Google Home Mini starting at $29. Where is the evidence people are willing to pay a significant premium for smart speakers, and where is the evidence the HomePod will be good enough for serious audiophiles? It seems like it’s in this niche space that will appeal to Apple die-hards but not a mass market.
    Look at the price range for headphones. Then compare it to the price range for compact speaker systems. They're not that different, so there's little doubt that a market exists at Apple's product/price point. For example, the Sonos Play:5 compact speaker retails for $499, and that isn't really an audiophile system. That's more mid-range. $350 for all of the technology Apple is packing into the HomePod is not a premium at all in the current market.
  • Reply 93 of 142
    gatorguy said:


    This is a very basic example of what the multiple tweeters in the HomePod can do to create a wider soundstage and also to eliminate problems with phase (and it will look familiar to anyone who watched the HomePod video).

    When sound is reflected off the rear walls, it has a longer path to take to get to the listener than sound coming directly from the speaker. In my example above, the total time for sound to reach the listener is 2.5ms from the front driver and a total of 8.0ms from the rear drivers. When the sound reaches your ears it could be perfectly in phase, completely out of phase or, most likely, somewhere in between. Phase can be adjusted mechanically (physically changing the position of a driver or speaker) or electrically (through digital time delay). Obviously, the HomePod uses digital time delay to adjust phase.

    In the example above, the sound to the rear tweeters would be sent out normally. However, the sound to the front tweeter would be delayed by 5.5ms. This delay allows the rear sound to "catch up" to the direct sound such that by the time it's reflected off the rear walls and starts moving forward it will end up being in phase with the sound from the front tweeter. All the possible issues that can arise with sounds being out of phase are thus eliminated. Since the HomePod also has 6 microphones, calculating this delay time would be fairly straightforward. A few clicks or other test tones played through the individual tweeters can be measured by the microphones so the HomePod can determine exactly how far away it is from any walls and set the appropriate delay time accordingly.

    It's important to note that what I described above isn't actually beamforming. It's a well established method of using time delay to control phase and improve sound quality. Beamforming is more complex but still relies on the same basic principles (precisely controlling phase to multiple drivers to direct where sound goes).

    When Apple says beamforming it's not a marketing term. The HomePod has the proper hardware and layout to enable beamforming. All the rest (Sonos, Google, Amazon) don't.
    But do at least a couple of those (Sonos/Google) have the hardware needed to accomplish what you diagrammed, other than reflecting off the back wall, which doesn't seem like a necessary feature? (Why not just compute the time delay firing forward?) I think so. Both the Sonos One and Home Max appear perfectly capable of computing delay times and adjusting phase to match your room structure (wall angles, size, furniture placement, other reflecting/absorbing surfaces, etc.). We already know for a fact that Google uses beamforming in concert with the microphones for better voice recognition. It's a short drive from there to using the same technology for much of the same sound adjustment Apple might be accomplishing through other methods, isn't it? As for beamforming, and despite your stated years of professional audio experience, I think you yourself have realized your knowledge of what Apple may be doing with it is somewhat lacking, since you are having some difficulty expressing how it will benefit the HomePod or why it disadvantages others who lack it.

    Now having said all that I personally expect the Google Home Max to be noticeably bass-heavy just as the original 2016 Google Home was. I'm not a fan of boom-sound myself and I've no doubt many others would agree. Fortunately for those owners there is an included equalizer now. It's certainly possible the Home Max may suffer some muddiness.

    Apple will of course have a well-thought-out Home Pod with very good sound and well matched to other Apple products. It may well bring in more revenues than other mid-range smart-speakers by a significant margin, which at the end of the day is all this is about: More profits.  

    Sonos One with Alexa (other voice assistant support coming soon) is what I would expect to have the better features for most folks and at least comparable overall sound to the Home Pod if not a bit fuller, particularly for the price which is significantly undercutting both Apple and Google. That's before the expected Play:5 replacement this next year that will likely support Amazon Alexa, Google Assistant and Apple's Siri at launch.

    Another plus for some HomePod competitors is much better cross-platform and third-party support: services like Spotify, Pandora, iHeartRadio, Amazon Music, TuneIn and others that won't be supported by Apple. IMHO, if you're not already deep into the Apple ecosystem there are better-featured products for you than the HomePod, but for dedicated Apple fans the HomePod might be (and probably will be) the best choice among the three, as long as your music comes only from Apple Music/iTunes or your personal library.

    And again this is just personal opinion but at some point Apple's Homepod will have to support some 3rd party streaming. As is they are too limited IMO, assuming of course that the product they eventually ship is the same as the one they demo'd and spec'd back in the summer.

    Everything you said regarding phase and beamforming is, again, completely false.

    My only problem is trying to explain complex ideas so the layperson can understand. Your comments about phase make this abundantly clear.

  • Reply 94 of 142
    gatorguy Posts: 24,176, member
    gatorguy said:


    This is a very basic example of what the multiple tweeters in the HomePod can do to create a wider soundstage and also to eliminate problems with phase (and it will look familiar to anyone who watched the HomePod video).

    When sound is reflected off the rear walls, it has a longer path to take to get to the listener than sound coming directly from the speaker. In my example above, the total time for sound to reach the listener is 2.5ms from the front driver and a total of 8.0ms from the rear drivers. When the sound reaches your ears it could be perfectly in phase, completely out of phase or, most likely, somewhere in between. Phase can be adjusted mechanically (physically changing the position of a driver or speaker) or electrically (through digital time delay). Obviously, the HomePod uses digital time delay to adjust phase.

    In the example above, the sound to the rear tweeters would be sent out normally. However, the sound to the front tweeter would be delayed by 5.5ms. This delay allows the rear sound to "catch up" to the direct sound such that by the time it's reflected off the rear walls and starts moving forward it will end up being in phase with the sound from the front tweeter. All the possible issues that can arise with sounds being out of phase are thus eliminated. Since the HomePod also has 6 microphones, calculating this delay time would be fairly straightforward. A few clicks or other test tones played through the individual tweeters can be measured by the microphones so the HomePod can determine exactly how far away it is from any walls and set the appropriate delay time accordingly.

    It's important to note that what I described above isn't actually beamforming. It's a well established method of using time delay to control phase and improve sound quality. Beamforming is more complex but still relies on the same basic principles (precisely controlling phase to multiple drivers to direct where sound goes).

    When Apple says beamforming it's not a marketing term. The HomePod has the proper hardware and layout to enable beamforming. All the rest (Sonos, Google, Amazon) don't.
    But do at least a couple of those (Sonos/Google) have the hardware needed to accomplish what you diagrammed, other than reflecting off the back wall, which doesn't seem like a necessary feature? (Why not just compute the time delay firing forward?) I think so. Both the Sonos One and Home Max appear perfectly capable of computing delay times and adjusting phase to match your room structure (wall angles, size, furniture placement, other reflecting/absorbing surfaces, etc.). We already know for a fact that Google uses beamforming in concert with the microphones for better voice recognition. It's a short drive from there to using the same technology for much of the same sound adjustment Apple might be accomplishing through other methods, isn't it? As for beamforming, and despite your stated years of professional audio experience, I think you yourself have realized your knowledge of what Apple may be doing with it is somewhat lacking, since you are having some difficulty expressing how it will benefit the HomePod or why it disadvantages others who lack it.

    Now having said all that I personally expect the Google Home Max to be noticeably bass-heavy just as the original 2016 Google Home was. I'm not a fan of boom-sound myself and I've no doubt many others would agree. Fortunately for those owners there is an included equalizer now. It's certainly possible the Home Max may suffer some muddiness.

    Apple will of course have a well-thought-out Home Pod with very good sound and well matched to other Apple products. It may well bring in more revenues than other mid-range smart-speakers by a significant margin, which at the end of the day is all this is about: More profits.  

    The Sonos One with Alexa (support for other voice assistants coming soon) is what I would expect to have the better feature set for most folks, and at least comparable overall sound to the HomePod, if not a bit fuller, particularly at a price that significantly undercuts both Apple and Google. And that's before the expected Play:5 replacement next year, which will likely support Amazon Alexa, Google Assistant, and Apple's Siri at launch.

    Another plus for some HomePod competitors is much better cross-platform and 3rd-party support: services like Spotify, Pandora, iHeartRadio, Amazon Music, TuneIn, and others that won't be supported by Apple. IMHO, if you're not already deep into the Apple ecosystem there are better-featured products for you than the HomePod, but for dedicated Apple fans the HomePod might be (and probably will be) the best choice among the three, as long as your music comes only from Apple Music/iTunes or your personal library.

    And again, this is just personal opinion, but at some point Apple's HomePod will have to support some 3rd-party streaming. As is, it's too limited IMO, assuming of course that the product they eventually ship is the same as the one they demo'd and spec'd back in the summer.

    Everything you said in regards to phase and beamforming is, again, completely false.

    My only problem is trying to explain complex ideas so the layperson can understand. Your comments about phase make this abundantly clear.

    Ummm... Well, since I didn't state any facts about either beamforming or phase in that post, I suppose you must be referring to the questions I asked you as being "completely false"? :eyeroll:

    What I specifically wanted you to comment on, and what you successfully avoided addressing, was whether Google's Home Max and the Sonos One have the necessary components to accomplish what you diagrammed: a "well established method of using time delay to control phase and improve sound quality". The obvious exception to your diagram is bouncing sound off a rear wall; if that's a mandatory element, please explain why. Perhaps you haven't yet explained it properly, but you've made no attempt to explain why not relying on beamforming would automatically disadvantage Sonos or anyone else, which was one of the original points you made and what led to much of this back and forth.

    Your quote: "This is far more complex than simple EQ (which has been around forever) and will give the HomePod a huge advantage over Sonos, Google or anyone else", while also mentioning Apple would be "analyzing sound in the time domain", without explaining why you believe Sonos, Google, and anyone else's hardware not using "beamforming" would be incapable of doing so. As you've explained things so far, beamforming is not a mandatory feature for that. Explaining in layman's terms is of course what you should strive for, because that's who most of us are when discussing audio tech: we're laymen, while you claim not to be. 

    My sole comment about beamforming in that post was not a statement of fact but a question posed to you, which apparently you cannot answer? Fair enough. TBH, DewMe is much more helpful at explaining what he believes Apple is doing. 
    edited December 2017
  • Reply 95 of 142
    smaffei said:
    The assertion that comparing HomePod and Echo is like comparing "Apples to Oranges" is simply ludicrous. They both offer similar features and provide similar functionality. 

    This is just fanboi back paddle because Echo is selling like hotcakes this holiday and Apple FAILED to get their overpriced, seriously limited product to market in time for Christmas. Denial in its finest form.

    You clearly missed the thesis of this article.

    If Car and Driver said "Comparing a $100,000 Porsche with a $15,000 Kia is comparing apples to oranges" would you post a comment saying "This is simply ludicrous. They both offer similar features and provide similar functionality. This is just fanboi back paddle because Kia is selling like hotcakes this holiday and Porsche FAILED to get their overpriced, seriously limited product to market in time for Christmas. Denial in its finest form."?

    You sound like those people who blasted the iPod when it was first announced, except that at least in that case the iPod was specifically competing against cheap, small-capacity MP3 players.  
  • Reply 96 of 142
    k2kw Posts: 2,075 member
    smaffei said:
    The assertion that comparing HomePod and Echo is like comparing "Apples to Oranges" is simply ludicrous. They both offer similar features and provide similar functionality. 

    This is just fanboi back paddle because Echo is selling like hotcakes this holiday and Apple FAILED to get their overpriced, seriously limited product to market in time for Christmas. Denial in its finest form.

    Yep,
        11 months ago DED was saying great improvements are coming to Siri because of Alexa.

    http://iphone.appleinsider.com/articles/17/01/07/is-apple-getting-siri-ous-in-the-face-of-amazons-alexa-echo

    Siri should be so much better just because of the microphones in the HomePod.

    It would be frightening if Dilger is basing his current spin on inside knowledge that Siri just isn't measuring up.

  • Reply 97 of 142
    Soli Posts: 10,035 member
    k2kw said:
    Yep,
        11 months ago DED was saying great improvements are coming to Siri because of Alexa.

    http://iphone.appleinsider.com/articles/17/01/07/is-apple-getting-siri-ous-in-the-face-of-amazons-alexa-echo

    Siri should be so much better just because of the microphones in the HomePod.

    It would be frightening if Dilger is basing his current spin on inside knowledge that Siri just isn't measuring up.
    I'd assume that just having an array of far-field microphones will make Siri appear more intelligent, because she'll objectively be a better listener. That may be the thread where I was told that Amazon is pathetic for needing more than one microphone.
  • Reply 98 of 142
    foggyhill Posts: 4,767 member
    gatorguy said:



    In the example above, the sound to the rear tweeters would be sent out normally. However, the sound to the front tweeter would be delayed by 5.5ms. This delay allows the rear sound to "catch up" to the direct sound such that by the time it's reflected off the rear walls and starts moving forward it will end up being in phase with the sound from the front tweeter. All the possible issues that can arise with sounds being out of phase are thus eliminated. Since the HomePod also has 6 microphones, calculating this delay time would be fairly straightforward. A few clicks or other test tones played through the individual tweeters can be measured by the microphones so the HomePod can determine exactly how far away it is from any walls and set the appropriate delay time accordingly.
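The arithmetic in that example is simple enough to sketch. Assuming sound travels at roughly 343 m/s, the delay applied to the front tweeter is just the difference between the reflected and direct path times (the 5.5 ms figure above). A rough illustration, with hypothetical helper names:

```python
SPEED_OF_SOUND_M_S = 343.0  # m/s in air at roughly 20 degrees C

def path_time_ms(path_m: float) -> float:
    """Time in milliseconds for sound to travel path_m metres."""
    return path_m / SPEED_OF_SOUND_M_S * 1000.0

def front_delay_ms(front_path_m: float, rear_reflected_path_m: float) -> float:
    """Delay for the front driver so the rear, wall-reflected sound
    arrives at the listener in phase with the direct sound."""
    return path_time_ms(rear_reflected_path_m) - path_time_ms(front_path_m)

# A 2.5 ms direct path (~0.86 m) vs. an 8.0 ms reflected path (~2.74 m)
# gives a 5.5 ms delay for the front tweeter, matching the example above.
```

The same relation run in reverse is how the test clicks would work: measure the round-trip time of a click reflected off a wall, multiply by the speed of sound, halve it, and you have the wall distance needed to set the delay.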

    Everything you said in regards to phase and beamforming is, again, completely false.

    My only problem is trying to explain complex ideas so the layperson can understand. Your comments about phase make this abundantly clear.

    He doesn't give a hoot if he's in error (it's the new thing, don't you know: not caring about being wrong, without embarrassment). Just throw things out, lie, distort, withhold, cherry-pick, and more, and feel smug doing it. It's "working" for that orange juice man wringing the life out of the US right now, and he's got legions of copiers.
    edited December 2017
  • Reply 99 of 142
    foggyhill Posts: 4,767 member
    smaffei said:
    The assertion that comparing HomePod and Echo is like comparing "Apples to Oranges" is simply ludicrous. They both offer similar features and provide similar functionality. 

    This is just fanboi back paddle because Echo is selling like hotcakes this holiday and Apple FAILED to get their overpriced, seriously limited product to market in time for Christmas. Denial in its finest form.

    Right... Similar features? Only if sound from 128kb/s MP3s coming out of something that sounds like a 1999 portable MP3 player is the same as 256kb/s AAC going out to an array of speakers. They both make sound, so they're the same... that's how your assessment goes, seemingly.

    The fact that you went for the whole false equivalency means you don't care about facts, and the use of "fanboi" cements that status.
    edited December 2017
  • Reply 100 of 142
    fmalloy Posts: 105 member
    Until this is actually delivered, all this talk of beamforming, phase, multiple drivers, and microphones is just hype. We need to hear it for ourselves, in our own rooms. I have a high-quality (depends on your standards) amp and speakers with 12" woofers, along with midrange and tweeter drivers. I'll be very interested to hear what it sounds like with only 4" drivers. Even with long-throw woofers, excursion needs to be carefully controlled; they have to move a lot of air. I'm skeptical. And it's not stereo?