I've long wondered when someone would merge head tracking (obtained from the camera) to modify sound sent to headphones to lock the sound source to a world frame of reference.
Uh, I experience this every single time I use my PSVR. 3D audio has been a thing in the VR world for years. Have you never used a VR headset before?
However, that's not what they're doing here. Headphones move with your head, so they have to simulate a fixed sound source; speakers sit in fixed positions, so they have to compensate for head movement with audio trickery from those fixed points to fool the ears, which is what this is.
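To make the head-tracking compensation idea concrete, here's a minimal sketch using Woodworth's classic interaural-time-difference model. This is just an illustration of the principle (the head radius and angles are generic textbook values, and this is not Apple's actual algorithm): subtracting the head yaw from the source azimuth keeps the virtual source fixed in the world frame as the listener turns.

```python
import math

def itd_seconds(source_az_deg, head_yaw_deg, head_radius_m=0.0875, c=343.0):
    """Woodworth's interaural-time-difference model: the arrival-time
    difference between the two ears for a source at `source_az_deg` in the
    room, compensated for the listener's head yaw so the source stays
    anchored to the world frame rather than the head."""
    # Angle of the source relative to where the head currently points.
    theta = math.radians(source_az_deg - head_yaw_deg)
    # Woodworth formula: ITD = (a / c) * (theta + sin(theta)),
    # valid for sources roughly within +/-90 degrees of straight ahead.
    return (head_radius_m / c) * (theta + math.sin(theta))
```

When the head turns to face the virtual source (yaw equals azimuth), the ITD goes to zero, exactly as it would for a real source straight ahead.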
Instead of fooling around with this stuff and destroying the integrity of the original recordings through phase manipulation and other trickery, Apple should simply come up with a way to add more bass to the internal speakers (if this is for gaming apps, then it's fine).
There's no mention of music in this article. This is for enhancing perception of on screen objects.
"Audio signal processing for virtual acoustics can greatly enhance a movie, a sports even, a videogame or other screen viewing experience, adding to the feeling of "being there," says Apple
"Sonic holography" has been a thing since the 70s or so. It's pretty cool. Gives you the perception of a wider soundstage without the physical distance between speakers to support it. I have a Carver C-4000 preamp (and an M-400 cube!) from the 80s which can do some pretty amazing things.
Yeah. Though not geared towards music reproduction, active sonar systems have been using transmit beamforming and beam steering since the 1950s. What is being described here is a direct adaptation of the exact same acoustic principles used by active sonars for sound transmission, and by all array-based sonars (and the brains of living creatures that hear) for sound reception. Very cool stuff.
My understanding is active phased array sonar is more like parametric speakers. I don't think that's what Apple is talking about here, but maybe!
It all ties back to the principle of the superposition of (sound) waves, so both approaches share the same underlying principles. Nobody is copying anyone else, they’re simply all taking advantage of a common knowledge base of the physics of acoustics.
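For anyone curious how superposition turns into beam steering in practice, here's a minimal delay-and-sum sketch (the array geometry and numbers are made up for illustration; this isn't Apple's or any real sonar's implementation). Delaying each element slightly makes the individual wavefronts add in phase along the chosen direction and partially cancel elsewhere:

```python
import math

def steering_delays(n_elements, spacing_m, angle_deg, c=343.0):
    """Per-element firing delays (seconds) that steer a uniform linear
    array's beam to `angle_deg` off broadside. Classic delay-and-sum:
    each element is delayed so its wavefront arrives in phase with the
    others along the target direction (superposition does the rest)."""
    sin_a = math.sin(math.radians(angle_deg))
    # Element i must fire spacing * sin(angle) / c later than element i-1.
    raw = [i * spacing_m * sin_a / c for i in range(n_elements)]
    base = min(raw)  # shift so all delays are non-negative
    return [d - base for d in raw]

# 8-element array, 5 cm spacing, steered 30 degrees off broadside.
delays = steering_delays(8, 0.05, 30.0)
```

At 30 degrees the inter-element delay works out to about 73 microseconds per 5 cm of spacing, which is why this is practical at audio sample rates.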
What Apple is doing is very similar to an approach one might take when designing a target simulator for a passive sonar array, where you’d have to model both the distance and angle between the sound source (target) and the receiving array. Doing this properly requires fairly intimate knowledge of the acoustic processing characteristics of the receiver, including both the “array” and receiver signal processing. This really highlights how far Apple has progressed towards combining their understanding of how their technology best interacts with innate human sensory abilities.
The physics parts are relatively straightforward because they are based on sound underlying principles. The human side of their solution is where Apple really shines. Like I said, very cool stuff.
I've built (and designed) active phased array radio antennas, so I'm very familiar with the physics behind wave interference (particularly self-interference, as seen in Yagi-Uda antennas). This patent sounds like it blends a little of that with a heaping dose of psychoacoustics. My reading is it's more about faking the reflections so the listener thinks the audio is coming from a certain position rather than using wave interference to physically cause it to come from the right angle.
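The "faking the reflections" part can be illustrated with the standard image-source trick from room acoustics. This is a toy 2-D sketch under simplifying assumptions (one rigid wall, 1/r spreading loss, no frequency-dependent absorption), not anything from the patent: a reflection off a wall behaves like a direct path from a phantom source mirrored across that wall, so you can synthesize a convincing reflection by adding a delayed, attenuated copy of the signal.

```python
import math

def first_reflection(src, listener, wall_x, c=343.0):
    """Toy first-order image-source model in 2-D. A reflection off a rigid
    wall at x = wall_x is equivalent to a direct path from the source
    mirrored across that wall. Returns (delay_seconds, amplitude) using
    simple 1/r spreading loss and no wall absorption."""
    mirrored = (2 * wall_x - src[0], src[1])   # mirror source across the wall
    r = math.dist(mirrored, listener)          # reflected path length
    return r / c, 1.0 / max(r, 1e-9)
```

Feed the listener the direct path plus a handful of these image-source copies and the brain infers a room and a source position, which is the psychoacoustic side of the trick.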
Talking of HomePods ... Siri still can't access my iTunes Match from any HomePods we have, instead always attempting to open Apple Music which I don't subscribe to. All other devices including Car Play and Apple TV 4K using the remote's mic have no such problem. Everything is up to date. Has anyone else seen this problem?
I have the same issue and Apple Support confirmed the bug to me a couple weeks ago. It has to do with the multi-user features in iOS 13.3. Hopefully 13.4 will finally fix the issue. Turn off "recognise my voice" in the Home app and you will be able to access your playlists again.
Thank you! That's been driving me nuts. The really weird thing is it worked for some artists and not others. It's working normally now that I've turned off "recognize my voice" in the Home app. Wow, Apple's GUI in iOS is getting pretty unintuitive, speaking as a Mac user since day one.