How HomePod leverages Apple's silicon expertise to deliver advanced audio performance

Comments

  • Reply 81 of 117
    analogjackanalogjack Posts: 1,073member
    "There's no need to adjust EQ settings; the product is designed to play great sounding music, movie audio and voice content without futzing with controls."

    That's all well and good if one has perfect hearing. My hearing is pretty good but I know I'm losing a bit of high end. If Apple wants to really do something clever, they should have a setup where a user sits in a predetermined spot and the HomePod plays a series of tones at various frequencies, asking whether you can hear them, just as they do in a hearing test centre. It could be done with the HomePods, or better still they could send the signal to your AirPods or other Apple wireless headphones. That setting could then be saved as a personal profile. The technology is already there inside the HomePod.

    Also, Apple should plan to release a dedicated sub pod, with maybe an 8" high-displacement woofer that could be placed almost anywhere. With two normal HomePods, the sub would take considerable pressure off the current 4" woofers, which would in effect become solely midrange drivers; you'd probably end up with an awesome setup that many audiophiles would be interested in. Best of all, it's modular: only upon purchasing a sub would the processing change for the current HomePods.
    edited March 2018
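The hearing-test idea above can be sketched in a few lines. This is purely hypothetical, nothing Apple ships; the test frequencies, the fixed boost amount, and the profile format are all invented for illustration:

```python
# Hypothetical sketch of the hearing-test idea: play a tone in each band,
# record whether the listener heard it, and keep the answers as a personal
# EQ profile. Band frequencies and the 6 dB boost are illustrative only.

TEST_BANDS_HZ = [250, 500, 1000, 2000, 4000, 8000, 12000, 16000]

def build_profile(heard_by_band):
    """Map each test band to a gain in dB: flat where the tone was heard,
    a modest fixed boost where it was not."""
    return {f: (0.0 if heard_by_band.get(f, True) else 6.0)
            for f in TEST_BANDS_HZ}

# Example: a listener who no longer hears anything above 8 kHz.
responses = {f: f <= 8000 for f in TEST_BANDS_HZ}
profile = build_profile(responses)
```

A real implementation would refine the boost per band and cap it to protect everyone else in the room, which is exactly the difficulty raised in the replies below.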
  • Reply 82 of 117
    foggyhillfoggyhill Posts: 4,767member
    "There's no need to adjust EQ settings; the product is designed to play great sounding music, movie audio and voice content without futzing with controls."

    That's all well and good if one has perfect hearing. My hearing is pretty good but I know I'm losing a bit of high end. If Apple wants to really do something clever, they should have a set up where a used sits in a pre determined spot and the HP plays a series of tones at various frequencies and asks you if you can hear them, just as they do in a hearing test centre. It could be done with the home pods or even better they could send a signal to your wireless ear pods or other wireless Apple proprietary headphone. Then that setting could be saved as a personal setting. The technology is there already contained within the HP. 

    Also Apple should plan to release a dedicated sub pod, with maybe an 8" high displacement single woofer that could be placed almost anywhere, then if you had two normal HP's the sub would take considerable pressure off the current 4" HP woofers so they would in effect become solely midrange, then you'd probably have an awesome set up that many audiophiles would be interested in. And best of all it's modular. Only upon purchasing a HP sub would the processing change for the current HP's
    They'll likely add the ability to put a listening profile on the HomePod eventually, though you can already do it through AirPlay.

    The main difficulty is that different frequencies are not reflected the same way (a material's reflectivity changes with frequency), so if they boost the high end they'd likely need to direct the sound from each speaker differently to keep the same large sweet spot.

    Everyone around them would have a bad listening experience while the person with the hearing loss would be fine. We can imagine, considering the many difficulties, why they decided not to do this at launch.

    It's never as simple as people say, because this speaker doesn't have just one driver and one channel; they're not all pointing in the same direction, and some are pointing at the wall.
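The frequency dependence foggyhill describes can be shown with a toy calculation. The absorption coefficients below are invented for a hypothetical wall surface, not measured data; the point is only that a reflection returns with a different spectral balance than the direct sound:

```python
import math

# Toy model: the fraction of energy a hypothetical wall absorbs varies
# with frequency, so a reflected beam loses more of some bands than
# others. Coefficients are invented for illustration.
ABSORPTION = {125: 0.05, 1000: 0.15, 8000: 0.40}

def reflected_level_db(incident_db, freq_hz):
    """Level of one reflection off the wall: (1 - alpha) of the energy
    survives, expressed as a dB loss relative to the incident level."""
    return incident_db + 10 * math.log10(1 - ABSORPTION[freq_hz])

for f in sorted(ABSORPTION):
    print(f, round(reflected_level_db(80.0, f), 1))
```

Under these made-up numbers an 80 dB beam loses about 0.2 dB at 125 Hz but over 2 dB at 8 kHz per bounce, which is why a treble boost aimed at a wall behaves differently from one aimed at the listener.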
  • Reply 83 of 117
    plovellplovell Posts: 824member
    I have a three-point AirPlay music system at home, driven from my Mac. Adding a HomePod in the kitchen makes four. It sounds great but is hopeless at integration. If I try to control the volume with the HomePod's "buttons", the system-wide volume is reduced to almost nothing, no matter whether I use "+" or "-". If I tell Siri to stop (i.e. pause), that works OK, but trying to resume starts playback at the first track in my library (I have iCloud Library but not Apple Music). So: nice engineering, but the use cases are woefully deficient.
  • Reply 84 of 117
    foggyhillfoggyhill Posts: 4,767member
    plovell said:
    I have a three-point AirPlay music system at home, driven from my Mac. Adding a HomePod in the kitchen makes four. It sounds great but is hopeless at integration. If I try to control the volume with the HomePod's "buttons", the system-wide volume is reduced to almost nothing, no matter whether I use "+" or "-". If I tell Siri to stop (i.e. pause), that works OK, but trying to resume starts playback at the first track in my library (I have iCloud Library but not Apple Music). So: nice engineering, but the use cases are woefully deficient.
    Maybe that's why AirPlay 2 is coming (I haven't checked whether it covers those issues).
  • Reply 85 of 117
    racerhomie3racerhomie3 Posts: 1,264member
    I hope Apple allows a HomePod App Store this year. The potential for excellent audio apps is endless.
    But if I ever buy a good speaker in my lifetime, it will definitely be a HomePod.
  • Reply 86 of 117
    igerardigerard Posts: 14member
    The issue with the Pogue test is that it sends the sound over AirPlay, so it doesn't use the EQ produced by Apple Music's analysis ... it would need some work on the EQ side.

    It also misses the reason Apple promotes Apple Music on the HomePod: dedicated EQ for each song it plays...
  • Reply 87 of 117
    TuuborTuubor Posts: 53member
    Most people commenting that the Google Home Max has equal sound quality forget that the HomePod is almost a third of the size, which makes it leaps and bounds better in sound quality and engineering. The competition won't be able to reproduce the same kind of sound quality at that size for many years to come. Also, most reviewers say the HomePod does sound better; the Google Home Max is just louder because it tolerates distorting the sound above 80% volume.
  • Reply 88 of 117
    AppleInsider said:

    Having listened to a pre-release demonstration of two HomePods in action...

    Following the above link to your previous article in which you say:

    Listening to itself
    HomePod also listens in all directions using 6 microphones. In addition to listening for your Siri commands (which I didn't have the ability to try out), it also listens to its own music playback, specifically for analyzing reflected sounds that help it tune itself for its surroundings, dramatically widening the "sweet spot" in the room where its audio reproduction sounds best. 
     it also listens to its own music playback, specifically for analyzing reflected sounds that help it tune itself for its surroundings,

    From this I infer that the HomePods are dynamically tuning themselves to the audio being played while it is being played!

    Is that what you meant to say?  If so, can you provide a citation?
  • Reply 89 of 117
    MisterKit said:
    I can’t imagine the engineering that will go into getting two HomePods to work in stereo. There are many factors that have to be worked out. One HomePod is analyzing the characteristics of the environment and making intelligent decisions about compensation and “opening up” the sound to the room. A second HomePod would do the same and have to be aware of the first HomePod. There are sooo many timing and phase issues that it is not funny. And on top of that it has to maintain the “stereo” image that the mixing engineers/producers intended. It is no surprise that Apple is still working on this.
    You can create an ersatz stereo effect with two HomePods. On a Mac, you run two copies of the Airfoil app, each with its own equalizer:



    But, from what I've read here and elsewhere, Apple may be developing a new type of sound -- an immersive sound that subsumes surround sound, stereo and hi-fi.

    'Course you need to wear a pair of these to get the full  immersion sound effect!


    edited March 2018
  • Reply 90 of 117
    gatorguygatorguy Posts: 24,176member
    AppleInsider said:

    Having listened to a pre-release demonstration of two HomePods in action...

    Following the above link to your previous article in which you say:

    Listening to itself
    HomePod also listens in all directions using 6 microphones. In addition to listening for your Siri commands (which I didn't have the ability to try out), it also listens to its own music playback, specifically for analyzing reflected sounds that help it tune itself for its surroundings, dramatically widening the "sweet spot" in the room where its audio reproduction sounds best. 
     it also listens to its own music playback, specifically for analyzing reflected sounds that help it tune itself for its surroundings,

    From this I infer that the HomePods are dynamically tuning themselves to the audio being played while it is being played!

    Is that what you meant to say?  If so, can you provide a citation?
    The bass certainly is, in order to help minimize distortion, using a microphone dedicated to that -- Apple says so in writing. If the HomePod is also continually adapting the tweeters/mids for sound, tone and direction while music is playing, I'd love to see that same citation myself.

    Dick, while waiting for DED's citation, here's something you might try that would help point things in the proper direction. While a single HomePod of yours is playing music, and without moving it at all, place a pillow fairly close on one side of it and see if you detect the high end dynamically adjusting to the new obstruction. Moving it should cause it to remap, of course, since Apple says that's what should happen when the accelerometer registers significant movement, but I would bet dollars to donuts it won't adapt the overall sound to a new modifier otherwise. Why would Apple even mention what happens when it's moved, otherwise?

    Since Apple themselves say the HomePod "maps" the room's overall sound signature by analyzing reflected sound when it's initially set up, and then again each time it's moved to a new location as registered by the accelerometer, the logical inference is that if it is NOT moved it will not be tuning the overall sound in real time.
    edited March 2018
  • Reply 91 of 117
    foggyhill said:
    "There's no need to adjust EQ settings; the product is designed to play great sounding music, movie audio and voice content without futzing with controls."

    That's all well and good if one has perfect hearing. My hearing is pretty good but I know I'm losing a bit of high end. If Apple wants to really do something clever, they should have a set up where a used sits in a pre determined spot and the HP plays a series of tones at various frequencies and asks you if you can hear them, just as they do in a hearing test centre. It could be done with the home pods or even better they could send a signal to your wireless ear pods or other wireless Apple proprietary headphone. Then that setting could be saved as a personal setting. The technology is there already contained within the HP. 

    Also Apple should plan to release a dedicated sub pod, with maybe an 8" high displacement single woofer that could be placed almost anywhere, then if you had two normal HP's the sub would take considerable pressure off the current 4" HP woofers so they would in effect become solely midrange, then you'd probably have an awesome set up that many audiophiles would be interested in. And best of all it's modular. Only upon purchasing a HP sub would the processing change for the current HP's
    They'll likely give the ability to put a listening profile on the Homepod eventually, though through Airplay, you can do it already.

    The main difficulty is that different frequencies are not reflected the same (and a material's reflectivity will change depending on frequency), so if they boost the high end you lkely need to direct your sound from all speakers differently to get the same large sweet spot effect.

    Everyone around then would have a bad listening experience while the person with the hearing loss would be fine.. We can imagine considering the many difficulties, why they decided not to do this at launch.

    It's never as simple as people say cause this speaker doesn't have just one speaker and one channel and they're not all pointing the same direction and some are pointing at the wall.
    I suspect what Apple will do is:

    1. create a unique individual listening profile on your iPhone
    2. have the HomePods look for individuals' iPhones, their listening profiles and their locations within range (the room)
    3. adjust the HomePods' output, as best as possible, to provide sweet spots for all the individuals within range
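Step 3 raises an obvious question: whose profile wins when listeners disagree? One naive compromise is to average the requested per-band gains. A hypothetical sketch -- the band values and the averaging strategy are assumptions, not anything Apple has described:

```python
def compromise_eq(profiles):
    """profiles: list of {band_hz: gain_db} dicts, one per listener.
    Returns the per-band arithmetic mean as a single shared EQ."""
    bands = set().union(*(p.keys() for p in profiles))
    return {b: sum(p.get(b, 0.0) for p in profiles) / len(profiles)
            for b in sorted(bands)}

adult = {100: -3.0, 8000: 6.0}   # dislikes heavy bass, needs treble help
teen = {100: 6.0, 8000: 0.0}     # wants the bass boosted
shared = compromise_eq([adult, teen])
print(shared)  # {100: 1.5, 8000: 3.0}
```

Of course, an average satisfies nobody fully, which is the objection raised a few replies down about bass preferences.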
  • Reply 92 of 117
    talexytalexy Posts: 80member
    macapfel said:
    I really like the HomePod. What I don't understand is that it 1. does not understand non-English song titles or artist names, and 2. when asked what song is playing and it isn't English, Siri just produces gibberish. It shouldn't be too difficult to tag songs with the language of the title/artist so that Siri 1. understands what one is asking for and 2. can correctly say which song/artist the HomePod is playing. I am a bit surprised by this not very good implementation.
    That is exactly what I have been wondering about since the introduction of Siri. Apple has always advertised the way you can command music with Siri, but even today Siri isn't capable of understanding song names or band names in a language other than your own -- even when you start your sentence with: "Hey Siri, play songs from *any band from a country with a language other than the one your iPhone is set to*". With all their R&D money they should definitely be able to get this right.
    This is important! Surely more important than asking Siri how your favorite football team played last weekend.
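The tagging idea could be as simple as carrying a language code in each track's metadata and handing it to the recognizer as a locale hint. A hypothetical sketch -- the field names and locale strings are invented for illustration:

```python
# Hypothetical track metadata carrying a language tag per title/artist.
TRACKS = [
    {"title": "99 Luftballons", "artist": "Nena", "lang": "de-DE"},
    {"title": "La Vie en rose", "artist": "Édith Piaf", "lang": "fr-FR"},
    {"title": "Help!", "artist": "The Beatles", "lang": "en-GB"},
]

def recognition_hints(tracks, default="en-US"):
    """Locales a speech recognizer should consider when matching spoken
    song or artist names against this library."""
    return sorted({t.get("lang", default) for t in tracks})

print(recognition_hints(TRACKS))  # ['de-DE', 'en-GB', 'fr-FR']
```

The same tags would work in reverse, letting the assistant switch pronunciation language when reading a title back.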
  • Reply 93 of 117
    cpsro said:
    cpsro said:
    Too bad it (re)produces infrasound found in many live audio streams.  Very annoying.
    What does that mean?  
    tmay said:
    cpsro said:
    Too bad it (re)produces infrasound found in many live audio streams.  Very annoying.
    Infrasound is below the limits of human hearing, first of all, and almost certainly below the limits of what the HomePod can reproduce.
    Infrasound is below the frequency where you'd hear a "tone," but it's very perceptible as pressure waves hitting the eardrums. The HomePod is quite capable of producing annoying infrasound -- predictable from the high-excursion woofer it contains -- as evidenced by my personal experience listening to NPR radio on the device. Outdoor venues often have wind noise, and in-studio programs sometimes pick up people blowing on the mic as they speak.
    Isn't that like complaining about HD video because you can get distracted by the cosmetic imperfections of the TV personalities?  (Which is true, but not the fault of the TV.)
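For what it's worth, the rumble cpsro describes is what a high-pass (rumble) filter on the stream would remove. A one-pole sketch in plain Python -- the cutoff and the filter order are illustrative; a real product would use something steeper:

```python
import math

def highpass(samples, sample_rate=48000, cutoff_hz=20.0):
    """First-order high-pass filter: attenuates content below cutoff_hz,
    e.g. infrasonic wind or mic-blowing rumble in a live stream."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = rc / (rc + dt)
    out, prev_in, prev_out = [], 0.0, 0.0
    for x in samples:
        prev_out = alpha * (prev_out + x - prev_in)
        prev_in = x
        out.append(prev_out)
    return out

# A constant (0 Hz) offset decays toward zero after filtering.
filtered = highpass([1.0] * 48000)
```

Whether a smart speaker should filter this for you or reproduce the stream faithfully is exactly the disagreement above.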
  • Reply 94 of 117
    gatorguygatorguy Posts: 24,176member
    foggyhill said:
    "There's no need to adjust EQ settings; the product is designed to play great sounding music, movie audio and voice content without futzing with controls."

    That's all well and good if one has perfect hearing. My hearing is pretty good but I know I'm losing a bit of high end. If Apple wants to really do something clever, they should have a setup where a user sits in a predetermined spot and the HomePod plays a series of tones at various frequencies, asking whether you can hear them, just as they do in a hearing test centre. It could be done with the HomePods, or better still they could send the signal to your AirPods or other Apple wireless headphones. That setting could then be saved as a personal profile. The technology is already there inside the HomePod.

    Also, Apple should plan to release a dedicated sub pod, with maybe an 8" high-displacement woofer that could be placed almost anywhere. With two normal HomePods, the sub would take considerable pressure off the current 4" woofers, which would in effect become solely midrange drivers; you'd probably end up with an awesome setup that many audiophiles would be interested in. Best of all, it's modular: only upon purchasing a sub would the processing change for the current HomePods.
    They'll likely add the ability to put a listening profile on the HomePod eventually, though you can already do it through AirPlay.

    The main difficulty is that different frequencies are not reflected the same way (a material's reflectivity changes with frequency), so if they boost the high end they'd likely need to direct the sound from each speaker differently to keep the same large sweet spot.

    Everyone around them would have a bad listening experience while the person with the hearing loss would be fine. We can imagine, considering the many difficulties, why they decided not to do this at launch.

    It's never as simple as people say, because this speaker doesn't have just one driver and one channel; they're not all pointing in the same direction, and some are pointing at the wall.
    I suspect what Apple will do is:

    1. create a unique individual listening profile on your iPhone
    2. have the HomePods look for individuals' iPhones, their listening profiles and their locations within range (the room)
    3. adjust the HomePods' output, as best as possible, to provide sweet spots for all the individuals within range
    Wouldn't everyone still receive the same bass no matter what their personal preference is, since it isn't directional (beam-formed) and there's only a single driver delivering it? If you don't like your bass heavy and your listening profile says so, while the teen in the room likes it booming and their profile disagrees with yours, well... you can't both be happy.
  • Reply 95 of 117
    dick applebaumdick applebaum Posts: 12,527member
    gatorguy said:
    AppleInsider said:

    Having listened to a pre-release demonstration of two HomePods in action...

    Following the above link to your previous article in which you say:

    Listening to itself
    HomePod also listens in all directions using 6 microphones. In addition to listening for your Siri commands (which I didn't have the ability to try out), it also listens to its own music playback, specifically for analyzing reflected sounds that help it tune itself for its surroundings, dramatically widening the "sweet spot" in the room where its audio reproduction sounds best. 
     it also listens to its own music playback, specifically for analyzing reflected sounds that help it tune itself for its surroundings,

    From this I infer that the HomePods are dynamically tuning themselves to the audio being played while it is being played!

    Is that what you meant to say?  If so, can you provide a citation?
    The bass certainly is, in order to help minimize distortion, using a microphone dedicated to that. If the HomePod is also continually adapting the tweeters/mids for both sound and direction while music is playing, I'd love to see that same citation myself.

    Dick, while waiting for DED's citation, here's something you might try that would help point things in the proper direction. While a single HomePod of yours is playing music, and without moving it at all, place a pillow on one side of it and see if the high end dynamically adjusts to the new obstruction. Moving it should cause it to remap, since Apple says that's what should happen when the accelerometer registers significant movement, but I would bet dollars to donuts it won't adapt to a new sound modifier otherwise.

    Since Apple themselves say the HomePod "maps" the room's overall sound signature by analyzing reflected sound when it's initially set up, and then again each time it's moved to a new location as registered by the accelerometer, the logical inference is that if it is NOT moved it will not be tuning the overall sound in real time.
    Good idea!  I can't do that now, as it's ~5:AM and the household is still asleep.  I will try later.

    I suspect that, with its current software/firmware, the HomePods won't adjust in real time -- but I believe they are capable of doing so.

    One consideration is how the HomePods buffer/adjust their output. From my tests with AirPlay, it appears that EQ adjustments are made at buffer-in... subsequent EQ adjustments take effect only after the buffer clears and sound with the new adjustments is buffered. HomePod Siri volume adjustments (usually) take effect immediately.

    To reach its full potential I think that, with AirPlay 2, the HomePods will have to adjust at buffer-out. These adjustments could include:
    1. the room profile (from setup)
    2. the music profile from metadata (created at recording -- or post analysis)
    3. the music as it is being played (heard by the HomePods)
    4. the listeners' individual hearing profiles and locations in the room

    Item 4 may not be necessary to realize immersive audio for all in the room!
    edited March 2018
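The buffer-in vs. buffer-out distinction is easy to demonstrate. In this hypothetical sketch, samples are queued untouched and the current gain is applied only as they leave the buffer, so a settings change takes effect immediately rather than after the queue drains:

```python
from collections import deque

class BufferOutGain:
    """Apply gain at buffer-out: queued audio is stored untouched and the
    *latest* gain value is applied as each sample is dequeued."""
    def __init__(self):
        self.queue = deque()
        self.gain = 1.0

    def push(self, sample):          # buffer-in: no processing here
        self.queue.append(sample)

    def pop(self):                   # buffer-out: latest settings win
        return self.queue.popleft() * self.gain

player = BufferOutGain()
for s in (1.0, 1.0, 1.0):
    player.push(s)
player.gain = 0.5                    # volume changed while audio is queued
print(player.pop())                  # 0.5 -- change heard immediately
```

Applying EQ at buffer-in would have baked the old gain into those three queued samples; the same trade-off applies to room and listener profiles.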
  • Reply 96 of 117
    gatorguygatorguy Posts: 24,176member
    gatorguy said:
    AppleInsider said:

    Having listened to a pre-release demonstration of two HomePods in action...

    Following the above link to your previous article in which you say:

    Listening to itself
    HomePod also listens in all directions using 6 microphones. In addition to listening for your Siri commands (which I didn't have the ability to try out), it also listens to its own music playback, specifically for analyzing reflected sounds that help it tune itself for its surroundings, dramatically widening the "sweet spot" in the room where its audio reproduction sounds best. 
     it also listens to its own music playback, specifically for analyzing reflected sounds that help it tune itself for its surroundings,

    From this I infer that the HomePods are dynamically tuning themselves to the audio being played while it is being played!

    Is that what you meant to say?  If so, can you provide a citation?
    The bass certainly is, in order to help minimize distortion, using a microphone dedicated to that. If the HomePod is also continually adapting the tweeters/mids for both sound and direction while music is playing, I'd love to see that same citation myself.

    Dick, while waiting for DED's citation, here's something you might try that would help point things in the proper direction. While a single HomePod of yours is playing music, and without moving it at all, place a pillow on one side of it and see if the high end dynamically adjusts to the new obstruction. Moving it should cause it to remap, since Apple says that's what should happen when the accelerometer registers significant movement, but I would bet dollars to donuts it won't adapt to a new sound modifier otherwise.

    Since Apple themselves say the HomePod "maps" the room's overall sound signature by analyzing reflected sound when it's initially set up, and then again each time it's moved to a new location as registered by the accelerometer, the logical inference is that if it is NOT moved it will not be tuning the overall sound in real time.
    Good idea!  I can't do that now, as it's ~5:AM and the household is still asleep.  I will try later.

    I suspect that, with its current software/firmware, the HomePods won't adjust in real time -- but I believe they are capable of doing so.

    One consideration is how the HomePods buffer/adjust their output. From my tests with AirPlay, it appears that EQ adjustments are made at buffer-in... subsequent EQ adjustments take effect only after the buffer clears and sound with the new adjustments is buffered. HomePod Siri volume adjustments (usually) take effect immediately.

    To reach its full potential I think that, with AirPlay 2, the HomePods will have to adjust at buffer-out. These adjustments could include:
    1. the room profile (from setup)
    2. the music profile from metadata (created at recording -- or post analysis)
    3. the music as it is being played (heard by the HomePods)
    4. the listeners' individual hearing profiles and locations in the room

    Item 4 may not be necessary to realize immersive audio for all in the room!
    I agree that Apple should be capable of doing what you suggest, and those are excellent suggestions. Apple is obviously delaying AirPlay 2, no doubt for good reasons. Perhaps they recognized some deficiencies that weren't apparent before the HomePod got out into the wild, and they're now taking their time to address them.
  • Reply 97 of 117
    GG1GG1 Posts: 483member
    Tuubor said:
    Most people commenting that the Google Home Max has equal sound quality forget that the HomePod is almost a third of the size, which makes it leaps and bounds better in sound quality and engineering. The competition won't be able to reproduce the same kind of sound quality at that size for many years to come. Also, most reviewers say the HomePod does sound better; the Google Home Max is just louder because it tolerates distorting the sound above 80% volume.
    If the HomePod is successful, maybe we'll see a larger HomePod (BoomPod?) that will really compete with much bigger speakers.

    I wouldn't be surprised if Apple's engineering that went into the HomePod will trickle down to Apple's forthcoming over-the-ear headphones (using the same multi-microphone design for active noise cancelling) and trickle up into a larger HomePod.

    There are already the M-series chips (motion coprocessors whose functions were absorbed by the A-series), S-series (Apple Watch), T-series (the MacBook Pro's Touch Bar and other functions), and W-series (improved Bluetooth, maybe AirPlay). There may be a new series with the headphones and later HomePod variants.
  • Reply 98 of 117
    I suffer from modest hearing loss at both ends of the spectrum in one ear while the other ear is a bit better.  Maybe this could be "corrected" with expensive hearing aids, but I really doubt it's possible with any external speakers.  Having the speaker crank up the high and low frequencies would likely just make everything sound like crap for me and especially for anyone else around.  The fact is, with today's technology, I will never hear music the same way I could 20 or 30 years ago.  I don't expect any home speaker to address that.  The gold standard should be whether the performance sounds the same to me as it would if I were there when it was recorded.  Thus, no custom equalization required to suit my tastes.
  • Reply 99 of 117
    dick applebaumdick applebaum Posts: 12,527member
    gatorguy said:
    gatorguy said:

    The bass certainly is, in order to help minimize distortion, using a microphone dedicated to that. If the HomePod is also continually adapting the tweeters/mids for both sound and direction while music is playing, I'd love to see that same citation myself.

    Dick, while waiting for DED's citation, here's something you might try that would help point things in the proper direction. While a single HomePod of yours is playing music, and without moving it at all, place a pillow on one side of it and see if the high end dynamically adjusts to the new obstruction. Moving it should cause it to remap, since Apple says that's what should happen when the accelerometer registers significant movement, but I would bet dollars to donuts it won't adapt to a new sound modifier otherwise.

    Since Apple themselves say the HomePod "maps" the room's overall sound signature by analyzing reflected sound when it's initially set up, and then again each time it's moved to a new location as registered by the accelerometer, the logical inference is that if it is NOT moved it will not be tuning the overall sound in real time.
    Good idea!  I can't do that now, as it's ~5:AM and the household is still asleep.  I will try later.

    I suspect that, with its current software/firmware, the HomePods won't adjust in real time -- but I believe they are capable of doing so.

    One consideration is how the HomePods buffer/adjust their output. From my tests with AirPlay, it appears that EQ adjustments are made at buffer-in... subsequent EQ adjustments take effect only after the buffer clears and sound with the new adjustments is buffered. HomePod Siri volume adjustments (usually) take effect immediately.

    To reach its full potential I think that, with AirPlay 2, the HomePods will have to adjust at buffer-out. These adjustments could include:
    1. the room profile (from setup)
    2. the music profile from metadata (created at recording -- or post analysis)
    3. the music as it is being played (heard by the HomePods)
    4. the listeners' individual hearing profiles and locations in the room

    Item 4 may not be necessary to realize immersive audio for all in the room!
    I agree that Apple should be capable of doing what you suggest, and those are excellent suggestions. Apple is obviously delaying AirPlay 2, no doubt for good reasons. Perhaps they recognized some deficiencies that weren't apparent before the HomePod got out into the wild, and they're now taking their time to address them.
    Yup!

    And I suspect that Apple is quite interested in that Google patent for instrument separation that you linked in a prior thread... I don't know if the single DSP in the HomePod is robust enough to handle instrument separation at listening time -- but wow, being able to isolate an individual instrument changes audio as we know it.

    It's John Williams' time!
  • Reply 100 of 117
    dick applebaumdick applebaum Posts: 12,527member
    I suffer from modest hearing loss at both ends of the spectrum in one ear while the other ear is a bit better.  Maybe this could be "corrected" with expensive hearing aids, but I really doubt it's possible with any external speakers.  Having the speaker crank up the high and low frequencies would likely just make everything sound like crap for me and especially for anyone else around.  The fact is, with today's technology, I will never hear music the same way I could 20 or 30 years ago.  I don't expect any home speaker to address that.  The gold standard should be whether the performance sounds the same to me as it would if I were there when it was recorded.  Thus, no custom equalization required to suit my tastes.
    Why couldn't that be possible? Say the recording studio records metadata along with the performance; then on playback (on any system) the metadata is used to recreate the recorded performance, compensating for the room, the speakers, etc.

    I think that's what this disruption (Apple, Google, B&O et al.) is all about.
    edited March 2018