Last Active
  • Apple's HomePod isn't about Siri, but rather the future of home audio

    foggyhill said:
    gatorguy said:

    This is a very basic example what the multiple tweeters in the HomePod can do to create a wider soundstage and also to eliminate problems with phase (and will look familiar to anyone who watched the HomePod video).

    When sound is reflected off the rear walls, it has a longer path to take to get to the listener than sound coming directly from the speaker. In my example above, the total time for sound to reach the listener is 2.5ms from the front driver and a total of 8.0ms from the rear drivers. When the sound reaches your ears it could be perfectly in phase, completely out of phase or, most likely, somewhere in between. Phase can be adjusted mechanically (physically changing the position of a driver or speaker) or electrically (through digital time delay). Obviously, the HomePod uses digital time delay to adjust phase.

    In the example above, the sound to the rear tweeters would be sent out normally. However, the sound to the front tweeter would be delayed by 5.5ms. This delay allows the rear sound to "catch up" to the direct sound such that by the time it's reflected off the rear walls and starts moving forward it will end up being in phase with the sound from the front tweeter. All the possible issues that can arise with sounds being out of phase are thus eliminated. Since the HomePod also has 6 microphones, calculating this delay time would be fairly straightforward. A few clicks or other test tones played through the individual tweeters can be measured by the microphones so the HomePod can determine exactly how far away it is from any walls and set the appropriate delay time accordingly.
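    The delay arithmetic described above can be sketched in a few lines of Python. This is a hypothetical illustration, not Apple's actual DSP code; the path lengths are simply back-computed from the 2.5ms/8.0ms figures in the example:

    ```python
    # Hypothetical sketch of the time-alignment math described above.
    SPEED_OF_SOUND = 343.0  # m/s, dry air at roughly 20 C

    def alignment_delay_ms(front_path_m: float, rear_path_m: float) -> float:
        """Delay (ms) to apply to the front tweeter so its output arrives
        in phase with the longer, wall-reflected path from the rear tweeters."""
        front_ms = front_path_m / SPEED_OF_SOUND * 1000.0
        rear_ms = rear_path_m / SPEED_OF_SOUND * 1000.0
        return rear_ms - front_ms

    # Path lengths chosen to reproduce the example's 2.5 ms and 8.0 ms:
    front_path = 0.0025 * SPEED_OF_SOUND  # ~0.86 m, speaker to listener
    rear_path = 0.0080 * SPEED_OF_SOUND   # ~2.74 m, via the rear wall
    print(round(alignment_delay_ms(front_path, rear_path), 1))  # 5.5
    ```

    In practice the distances would come from the microphone measurements described above rather than being known in advance.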

    It's important to note that what I described above isn't actually beamforming. It's a well established method of using time delay to control phase and improve sound quality. Beamforming is more complex but still relies on the same basic principles (precisely controlling phase to multiple drivers to direct where sound goes).
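    Those "same basic principles" can be illustrated with the textbook delay calculation for a uniform linear array. This is a generic beamforming sketch, not the HomePod's implementation; the element count, spacing, and steering angle are made-up illustrative values:

    ```python
    import math

    SPEED_OF_SOUND = 343.0  # m/s

    def steering_delays_us(n_elements: int, spacing_m: float,
                           angle_deg: float) -> list:
        """Per-driver firing delays (microseconds) that steer a uniform
        linear array toward angle_deg off its broadside axis.

        Delaying each successive element by spacing * sin(theta) / c makes
        the individual wavefronts add constructively in the steered
        direction and partially cancel elsewhere -- that is the "beam"."""
        theta = math.radians(angle_deg)
        step_s = spacing_m * math.sin(theta) / SPEED_OF_SOUND
        return [i * step_s * 1e6 for i in range(n_elements)]

    # Seven drivers treated as a straight line (a simplification of a
    # circular layout), 5 cm apart, beam steered 30 degrees off-axis:
    delays = steering_delays_us(7, 0.05, 30.0)
    # delays[1] is about 72.9 us; each successive driver fires that much later
    ```

    A real circular array needs per-element geometry rather than a constant step, but the idea of shaping the radiation pattern purely with per-driver delays is the same.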

    When Apple says beamforming it's not a marketing term. The HomePod has the proper hardware and layout to enable beamforming. All the rest (Sonos, Google, Amazon) don't.
    But do at least a couple of those (Sonos/Google) have the hardware needed to accomplish what you diagrammed, other than reflecting off the back wall, which doesn't seem like a necessary feature? (Why not just compute the time delay firing forward?) I think so. Both the Sonos One and Home Max appear perfectly capable of computing delay times and adjusting phase to match your room structure (wall angles, size, furniture placement, other reflecting/absorbing surfaces, etc.). We already know for a fact that Google uses beamforming in concert with the microphones for better voice recognition. It's a short drive from there to using the same technology for much of the same sound adjustment Apple might be accomplishing through other methods, isn't it?

    As for beamforming, and despite your stated years of professional audio experience, I think you yourself have realized your knowledge about what Apple may be doing with it is somewhat lacking, since you're having some difficulty expressing how it will benefit the HomePod or why it disadvantages others who lack it.

    Now having said all that I personally expect the Google Home Max to be noticeably bass-heavy just as the original 2016 Google Home was. I'm not a fan of boom-sound myself and I've no doubt many others would agree. Fortunately for those owners there is an included equalizer now. It's certainly possible the Home Max may suffer some muddiness.

    Apple will of course have a well-thought-out Home Pod with very good sound and well matched to other Apple products. It may well bring in more revenues than other mid-range smart-speakers by a significant margin, which at the end of the day is all this is about: More profits.  

    The Sonos One with Alexa (other voice assistant support coming soon) is the one I'd expect to have the better features for most folks, and at least comparable overall sound to the HomePod, if not a bit fuller, particularly for the price, which significantly undercuts both Apple and Google. That's before the expected Play:5 replacement next year, which will likely support Amazon Alexa, Google Assistant and Apple's Siri at launch.

    Another plus for some HomePod competitors is the much better cross-platform and third-party support: services like Spotify, Pandora, iHeartRadio, Amazon Music, TuneIn and others that won't be supported by Apple. IMHO, if you're not already deep into the Apple ecosystem there are better-featured products for you than the HomePod, but for dedicated Apple fans the HomePod might be (and probably will be) the best choice among the three, as long as your music comes only from Apple Music/iTunes or your personal library.

    And again, this is just personal opinion, but at some point Apple's HomePod will have to support some third-party streaming. As is, it's too limited IMO, assuming of course that the product they eventually ship is the same as the one they demo'd and spec'd back in the summer.

    Everything you said in regards to phase and beamforming is, again, completely false.

    My only problem is trying to explain complex ideas so the layperson can understand. Your comments about phase make this abundantly clear.

    He doesn't give a hoot if he's in error (it's the new thing, don't you know: not caring about being wrong, without embarrassment). Just throw things out, lie, distort, withhold, cherry-pick, and more, and feel smug doing it; it's "working" for that orange juice man wringing the life out of the US right now and he's got legions of copiers.
    Reminds me of you... You're just as quick to insult as you are to turn tail and run.

    It baffles me how you're both rude and insulting to almost everyone you might disagree with. You'll catch more flies with honey than with vinegar.
  • Apple's HomePod isn't about Siri, but rather the future of home audio

    DanielEran said:

    False. Smart Sound is Google's brand for adjusting the volume to ambient sounds. It and Sonos Trueplay adapt sound to the room, but are not on the same level as what Apple is doing with its A8-powered hardware. But really, if Google could sell hardware it would be, wouldn't it? Or is it just "showing other companies how to do things" again? Really impressive how you run those goalposts around.  
    The Google Home Max includes a new feature dubbed Smart Sound, which taps into Google’s machine learning expertise to tailor the audio experience to your tastes—and even your physical surroundings. Smart Sound will dynamically adjust the Google Home Max’s audio output based on its physical surroundings; if you move it from a crowded corner spot to an open kitchen counter, the speaker will automatically change how it sounds for optimal audio quality. Over time, Google’s AI will learn context about your home and adjust sound based on that, too. Examples given onstage included raising the volume if the Google Home Max detects your dishwasher running in the background, or lowering it during those bleary morning hours. Google claims the voice recognition will discern the different people talking to it, and thus create custom playlists tailored to particular music tastes.

    Do you mind clarifying what Apple is doing different and/or better?

  • Video: Apple iPhone X versus Samsung Galaxy Note 8 benchmark comparison

    foggyhill said:
    VRing said:
    foggyhill said:
    lkrupp said:
    gatorguy said:
    They'll spin into "it doesn't matter". As the AI author alluded to both phones are so capable no one will notice in actual use. 
    And right on cue they’re doing it right now in this thread. When Apple was behind they trotted out every benchmark they could find and declared victory. Now the shoe is on the other foot and... it doesn’t matter because no one will notice in real-world use. Apple fans tried to say the same thing (it doesn’t matter) and were laughed at as clueless iSheep. Look at what’s being said by them here. What benchmark are they gushing over? Why, the one that shows Samsung either equal to or slightly ahead of Apple. 
    "Why the one that shows Samsung either equal to or slightly ahead of Apple." - I am actually surprised that the iPhone X is marginally losing to the Samsung Note in 1 or 2 benchmarks. I don't think this would be the case with the iPhone 8 or even the 8 Plus. The X seems to be using a much higher internal resolution than the 8 Plus, which seems to affect the benchmark scores.
    The "slightly ahead" is in fact a statistical tie considering sampling; I just hate false facts. Those few benchmarks rely on cores, and the Note still has more than the X.
    In fact, these benchmarks are probably the least relevant of all to an actual user.
    The actual results for the Note 8 on that test are considerably higher from every single source that's not Apple Insider (including Futuremark).

    3DMark uses the same physics engine as popular games, so it's actually quite relevant to real world performance.
    Well, give me your sources, bud, all of them, so I can see if you're not pulling things out of your ass. If it comes from a moron on youtube, forget about it and keep it to yourself.
    Gladly. For reference, Apple Insider's score for the Note 8 running Sling Shot Extreme was 2614.

    Futuremark (the makers of the 3DMark benchmark)
    • Sling Shot Extreme: 3602
    • Sling Shot Extreme Unlimited: 3975
    UL Benchmarks
    • Sling Shot Extreme: 3595
    • Sling Shot Extreme Unlimited: 3969
    Digital Trends
    • Sling Shot Extreme: 3577
    • Sling Shot Extreme Unlimited: Not Tested
    Hot Hardware
    • Sling Shot Extreme: Not Tested
    • Sling Shot Extreme Unlimited: 4068

    As a number of sites use the "Unlimited" benchmark, I also posted those results to show consistency with the results on Futuremark's official page. Additionally, here are some numbers to compare for the iPhone X in this benchmark.

    Futuremark's iPhone X scores:
    • Sling Shot Extreme: 2681
    • Sling Shot Extreme Unlimited: 3175
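    As a rough sanity check, the gap between AppleInsider's Note 8 number and the other sources is easy to quantify. A quick sketch using the scores copied from the lists above:

    ```python
    # Note 8 Sling Shot Extreme scores from the sources listed above.
    note8_scores = {
        "AppleInsider": 2614,
        "Futuremark": 3602,
        "UL Benchmarks": 3595,
        "Digital Trends": 3577,
    }
    iphone_x_futuremark = 2681  # Futuremark's own iPhone X score

    # AppleInsider's Note 8 result sits roughly 27% below Futuremark's:
    gap_pct = (1 - note8_scores["AppleInsider"] / note8_scores["Futuremark"]) * 100
    print(round(gap_pct, 1))  # 27.4
    ```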
  • Video: Apple iPhone X versus Samsung Galaxy Note 8 benchmark comparison

    I'm curious to see what the long-term performance is. In other words, does it heavily throttle after a couple of minutes of use? Would the benchmark scores change drastically on the 2nd or 3rd run? It's hard to say, as some of the benchmarks, such as Geekbench, insert a pause between tests to prevent thermal throttling. Are the numbers the Galaxy and iPhone are putting out even sustainable beyond a couple minutes of real-world use?
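    One way to probe this is simply to time the same workload back-to-back and watch whether later runs slow down. A toy Python sketch (the workload and any threshold you'd apply are arbitrary stand-ins, not a real benchmark):

    ```python
    import time

    def sustained_run_times(workload, runs: int = 5) -> list:
        """Time the same workload several times with no cool-down pause;
        a steady rise in per-run time hints at thermal throttling."""
        times = []
        for _ in range(runs):
            start = time.perf_counter()
            workload()
            times.append(time.perf_counter() - start)
        return times

    def toy_workload():
        # CPU-bound busywork standing in for a real benchmark scene.
        sum(i * i for i in range(200_000))

    times = sustained_run_times(toy_workload, runs=3)
    slowdown = times[-1] / times[0]  # well above 1.0x on a hot device
    ```

    A phone that throttles hard would show the later runs taking noticeably longer than the first, which is exactly what a single-pass benchmark score hides.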
    chasm said:
    Apple’s use of Metal (which nearly all developers are going to take advantage of, giving Apple a significant win now and going forward).
    I think you're forgetting about Vulkan which launched a year or two back on Android. Most major engines already support it (Unreal, Unity, CryEngine, Source 2, etc.). 
    sflocal said:
    It's an absolute embarrassment that Samsung with more cores does much worse than an iPhone with what is technically a "slower" chip.  
    Since when did having more cores make something "technically" faster? Your logic doesn't make any sense.