
Core Audio in Mac OS X Tiger to improve audio handling

post #1 of 28
Thread Starter 
Apple's next-generation OS will include an updated version of the company's Core Audio technology that will deliver new tools and functions to improve the audio handling experience of Mac OS X applications.

Sources close to Apple say that the new version of Core Audio will introduce two new audio units, aggregate device support, nonnative audio file format support, and other tools that let developers perform live mixing and playback of audio content.

New Audio Units

Tiger's version of Core Audio will present two new audio units that developers can use in their applications.

A file-playback audio unit will make it possible for applications to use an existing sound file as an input source. The audio unit will convert file data from any format to linear PCM so that it can be processed by other audio units, sources said. Linear PCM (LPCM) is an uncompressed audio format similar to CD audio, but with support for higher sampling frequencies and bit depths. The format offers up to 8 channels at 48kHz or 96kHz sampling frequencies and 16, 20, or 24 bits per sample, though not every combination of channels, rate, and depth at once.
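
For context, Core Audio describes an LPCM stream with its AudioStreamBasicDescription structure. Below is a minimal sketch in C of a 2-channel, 16-bit, 48kHz description of the sort the file-playback unit would presumably hand downstream; the field values are illustrative only:

    #include <CoreAudio/CoreAudioTypes.h>

    /* Sketch: an AudioStreamBasicDescription for interleaved, 2-channel,
       16-bit, 48 kHz linear PCM. Field values are illustrative. */
    static AudioStreamBasicDescription MakeLPCMFormat(void)
    {
        AudioStreamBasicDescription asbd = { 0 };
        asbd.mSampleRate       = 48000.0;
        asbd.mFormatID         = kAudioFormatLinearPCM;
        asbd.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                               | kAudioFormatFlagIsPacked;
        asbd.mChannelsPerFrame = 2;
        asbd.mBitsPerChannel   = 16;
        asbd.mBytesPerFrame    = asbd.mChannelsPerFrame * (asbd.mBitsPerChannel / 8);
        asbd.mFramesPerPacket  = 1;  /* uncompressed PCM: one frame per packet */
        asbd.mBytesPerPacket   = asbd.mBytesPerFrame;
        return asbd;
    }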

The second audio unit will handle time and pitch transformations of audio data. Developers will be able to use this audio unit to change the playback speed without changing the pitch, or vice versa.
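
If the shipping unit follows the pattern of existing format-converter units, opening it might look roughly like the sketch below, which uses the Component Manager calls current at the time; the kAudioUnitSubType_TimePitch subtype and the parameter constant are assumptions on our part, not details from the sources:

    #include <AudioUnit/AudioUnit.h>
    #include <CoreServices/CoreServices.h>

    /* Sketch: open a time/pitch format-converter unit and halve playback
       speed while leaving pitch alone. The subtype is an assumption;
       error handling omitted. */
    static AudioUnit OpenTimePitchUnit(void)
    {
        ComponentDescription desc = { 0 };
        desc.componentType         = kAudioUnitType_FormatConverter;
        desc.componentSubType      = kAudioUnitSubType_TimePitch;
        desc.componentManufacturer = kAudioUnitManufacturer_Apple;

        Component comp = FindNextComponent(NULL, &desc);
        AudioUnit unit = NULL;
        OpenAComponent(comp, &unit);
        AudioUnitInitialize(unit);

        /* Rate 0.5 = half speed; the unit holds pitch constant. */
        AudioUnitSetParameter(unit, kTimePitchParam_Rate,
                              kAudioUnitScope_Global, 0, 0.5, 0);
        return unit;
    }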

Aggregate Device Support

Tiger's version of Core Audio will also support the creation of aggregate devices, sources said, allowing developers to combine multiple audio devices into what applications see as a single audio device. For example, developers can take two devices with two input channels each, combine them to create a single aggregate device with four input channels, and then let Core Audio take care of routing the audio data to the appropriate hardware device.
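
Because the aggregate is presented as an ordinary device, existing device-enumeration code should pick it up unchanged. The sketch below uses the HAL's current AudioHardwareGetProperty calls; an aggregate built from two 2-input interfaces would appear here as a single AudioDeviceID:

    #include <CoreAudio/CoreAudio.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Sketch: enumerate devices through the HAL property calls.
       An aggregate device would show up here as one ordinary
       AudioDeviceID. Error handling omitted. */
    static void ListAudioDevices(void)
    {
        UInt32 size = 0;
        AudioHardwareGetPropertyInfo(kAudioHardwarePropertyDevices, &size, NULL);

        AudioDeviceID *devices = (AudioDeviceID *)malloc(size);
        AudioHardwareGetProperty(kAudioHardwarePropertyDevices, &size, devices);

        UInt32 count = size / sizeof(AudioDeviceID);
        for (UInt32 i = 0; i < count; i++)
            printf("device ID %u\n", (unsigned)devices[i]);
        free(devices);
    }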

Nonnative Audio File Format Support

Another planned enhancement to Core Audio is an extension mechanism for audio file formats that Core Audio in Mac OS X does not support natively. This mechanism will reportedly allow developers to create components for reading and writing these audio file formats, while an updated API will aid applications in detecting and utilizing the custom components.
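
Sources did not describe the detection API itself. As an illustration of the idea only, the sketch below uses AudioToolbox's AudioFileGetGlobalInfo query to list every readable file type; that third-party format components would surface in this same list is our assumption:

    #include <AudioToolbox/AudioFile.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Sketch: print the four-character codes of every audio file type
       the system can read. Whether third-party format components feed
       this same list is an assumption. Error handling omitted. */
    static void ListReadableFileTypes(void)
    {
        UInt32 size = 0;
        AudioFileGetGlobalInfoSize(kAudioFileGlobalInfo_ReadableTypes,
                                   0, NULL, &size);

        UInt32 *types = (UInt32 *)malloc(size);
        AudioFileGetGlobalInfo(kAudioFileGlobalInfo_ReadableTypes,
                               0, NULL, &size, types);

        for (UInt32 i = 0; i < size / sizeof(UInt32); i++)
            printf("%c%c%c%c\n",
                   (char)(types[i] >> 24), (char)(types[i] >> 16),
                   (char)(types[i] >> 8),  (char)(types[i]));
        free(types);
    }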

Two New APIs

Tipsters also said that Core Audio will add two new APIs to its toolbox.

A new extended audio file API will ease the complexity previously associated with converting files from one format to another. The API will read data in any format supported by Core Audio and then convert it to and from linear PCM format.
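
Assuming the shipping API takes the shape of the ExtAudioFile calls in AudioToolbox, a convert-on-read could look roughly like this sketch (FSRef-based open, illustrative arguments, error handling omitted):

    #include <AudioToolbox/ExtendedAudioFile.h>
    #include <CoreServices/CoreServices.h>

    /* Sketch: open a file in any Core Audio-supported format and read
       decoded linear PCM from it; the API performs the conversion. */
    static void ReadFileAsPCM(const char *path, AudioStreamBasicDescription pcm)
    {
        FSRef ref;
        FSPathMakeRef((const UInt8 *)path, &ref, NULL);

        ExtAudioFileRef file;
        ExtAudioFileOpen(&ref, &file);

        /* Ask for LPCM regardless of what is actually in the file. */
        ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                                sizeof(pcm), &pcm);

        char buffer[4096];
        AudioBufferList list;
        list.mNumberBuffers              = 1;
        list.mBuffers[0].mNumberChannels = pcm.mChannelsPerFrame;
        list.mBuffers[0].mDataByteSize   = sizeof(buffer);
        list.mBuffers[0].mData           = buffer;

        UInt32 frames = sizeof(buffer) / pcm.mBytesPerFrame;
        ExtAudioFileRead(file, &frames, &list);  /* frames = frames actually read */

        ExtAudioFileDispose(file);
    }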

Meanwhile, a new clock API will allow developers to create and track MIDI time formats. The API will reportedly support the use of the MIDI Time Code (MTC) and MIDI Beat Clock protocols to synchronize operations between MIDI-based controllers.
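
A guess at how such a clock might be driven, based on the CoreAudioClock (CAClock) interface; the constant names below are best-effort recollection rather than confirmed details, and the MIDI sync-source setup is omitted:

    #include <AudioToolbox/CoreAudioClock.h>

    /* Sketch: create a Core Audio clock and slave it to MIDI Time Code.
       A real client would also set kCAClockProperty_SyncSource to a
       CoreMIDI endpoint, omitted here. Error handling omitted. */
    static CAClockRef StartMTCSlavedClock(void)
    {
        CAClockRef clock = NULL;
        CAClockNew(0, &clock);

        CAClockSyncMode mode = kCAClockSyncMode_MIDITimeCodeTransport;
        CAClockSetProperty(clock, kCAClockProperty_SyncMode,
                           sizeof(mode), &mode);

        CAClockStart(clock);
        return clock;
    }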

Developer Tools

Finally, sources expect Tiger to ship with a new Audio Unit Lab tool, which will let developers graphically host audio units, use them for live mixing and playback of audio content, and examine the results.

In recent weeks AppleInsider has provided extensive coverage of Mac OS X 10.4 Tiger. Previous reports include coverage of Tiger's Spotlight search, Safari with RSS, Mail 2.0 with smart mailboxes, iCal 1.5.3, Resolution Independent UI and 256x256 Icons, AppleScript 1.10, Installer 2.0, web enabled Help, Fast Logout, Access Control Lists, OpenGL enhancements, adoption of OpenAL, Core Data, PDF Kit, SQLite, and networking-related enhancements.
post #2 of 28
As I (silently) predicted, Tiger will be more than what was demoed at WWDC and on Apple's website.

This is great news.
post #3 of 28
The "beat" goes on! (hardy har har) Really, though, the enhancements sound great. Core Audio is shaping up to be a very nice architecture for developers. I'm impressed with Apple.
post #4 of 28
Umm...can anyone say iTunes Extreme?

People will now be able to add audio filters onto MP3s and AAC files. Or pretty much anything that makes sound waves.

Does Microsoft have something similar in the pipeline? The Mac is becoming a dream machine for graphics, video and audio artists.
post #5 of 28
Aggregate device support sounds really, really cool. I wonder how much control over routing and all of that stuff we'll be able to have in the end? I'm guessing a new version of Logic to go along with Tiger to better implement all of these fun things?
post #6 of 28
Quote:
Originally posted by Big Red
Aggregate device support sounds really, really cool. I wonder how much control over routing and all of that stuff we'll be able to have in the end? I'm guessing a new version of Logic to go along with Tiger to better implement all of these fun things?

What does that mean anyways? Does it mean you'd be able to plug in a guitar and a keyboard piano and play them simultaneously?
post #7 of 28
Quote:
Originally posted by kim kap sol
What does that mean anyways? Does it mean you'd be able to plug in a guitar and a keyboard piano and play them simultaneously?

No, I believe it means you can "combine" two hardware audio interfaces into one "logical" interface and keep sync and routing functions transparent to the end user.

The MIDI sync features will be nice as well. Any sync improvements are important in audio, where sync is God.
post #8 of 28
I think it's similar to what Propellerhead's Reason offers. I'm not sure, but it sounds like the routing features of Reason. :-)
post #9 of 28
Quote:
Originally posted by kim kap sol
Umm...can anyone say iTunes Extreme?

"QuickTime 7."

QuickTime is dead. Long live QuickTime!
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
Reply
"...within intervention's distance of the embassy." - CvB

Original music:
The Mayflies - Black earth Americana. Now on iTMS!
Becca Sutlive - Iowa Fried Rock 'n Roll - now on iTMS!
Reply
post #10 of 28
Indeed.

It'll be interesting to see what happens with the CoreVIA APIs and the QuickTime brand.
post #11 of 28
Are we going to get a simple built-in voice recorder yet? Remember SimpleText? I shouldn't have to go to iMovie or GarageBand to record a simple voice memo. I've downloaded Audacity but haven't tried it yet. Nevertheless, such a tool ought to come with the OS.

I hope this also indicates that neglected technologies like voice synthesis and voice recognition are going to be improved. I haven't noticed any real improvement in either since 1997, using OS 8.5 on a Performa 6400 at 200 MHz. Now I have a dual 2GHz G5 using the latest OS X, and the biggest thing is Victoria, high-quality. They need to get rid of the 90% of the voices that are jokes (Bad News, Good News, Hysterical, Bubbles) and have little practical application. More human choices would be great. And don't get me started about voice recognition!
post #12 of 28
Quote:
Originally posted by kim kap sol
What does that mean anyways? Does it mean you'd be able to plug in a guitar and a keyboard piano and play them simultaneously?

No, it's a bit more than that. At present Core Audio only allows one input/output stream to be active at any one time. That stream can have multiple channels (my Metric Halo IO has 18 in and out) but you can't have multiple streams. This is a pain if you need large numbers of ins and outs. A G5 could easily handle hundreds of simultaneous channels. MOTU have their own solution for Digital Performer, but it's a little kludgy. Pro Tools of course does a similar proprietary thing. What the update will do (and it's about time) is allow multiple interfaces from multiple vendors to be treated and routed as one single large in/out audio interface at the core level.

Tiger is shaping up to be the sexiest development environment on the planet.
post #13 of 28
This feature is already available. It is known as JackOSX.

Check it out:
http://www.jackosx.com/
post #14 of 28
Quote:
Originally posted by MoBird
This feature is already available. It is known as JackOSX.

Check it out:
http://www.jackosx.com/

Now I may be wrong here but I don't see anywhere on the JACK site where it says you can aggregate I/O hardware. I've heard people talk about JACK but only within the context of routing audio from one app to another in ways that were previously impossible.
post #15 of 28
Sources close to Apple say that the new version of Core Audio will introduce two new audio units, aggregate device support, nonnative audio file format support, and other tools that let developers perform live mixing and playback of audio content.

Does this mean support for Multiple Audio Interfaces?

Could anyone here answer that question? Thanks!
post #16 of 28
Quote:
Originally posted by MoBird
This feature is already available. It is known as JackOSX.

Check it out:
http://www.jackosx.com/

Er...not the same thing at all.
post #17 of 28
Quote:
Originally posted by donnie7@bellatlantic.net
Sources close to Apple say that the new version of Core Audio will introduce two new audio units, aggregate device support, nonnative audio file format support, and other tools that let developers perform live mixing and playback of audio content.

Does this mean support for Multiple Audio Interfaces?

Could anyone here answer that question? Thanks!

Read the thread dude!
post #18 of 28
Quote:
Originally posted by AppleInsider
A file-playback audio unit will make it possible for applications to use an existing sound file as an input source. The audio unit will convert file data from any format to linear PCM so that it can be processed by other audio units, sources said. Linear PCM (LPCM) is an uncompressed audio format similar to CD audio, but with support for higher sampling frequencies and bit depths. The format offers up to 8 channels at 48kHz or 96kHz sampling frequencies and 16, 20, or 24 bits per sample, though not every combination of channels, rate, and depth at once.

This is cool, but I wonder how it interacts with the FairPlay DRM?
post #19 of 28
JACK is for routing multiple apps through an interface. That's cool, but many people want to route multiple apps through multiple hardware interfaces. This is done using custom code for many high-end interfaces, but now that Core Audio supports it, even the small guys can gang their cards together and reap Core Audio's stability and low latency.

Quote:
Jack (the Jack Audio Connection Kit) is a low-latency audio server, written originally for the GNU/Linux operating system, and now with Mac OS X support. It can connect any number of different applications to a single hardware audio device; it also allows applications to send and receive audio to and from each other.
post #20 of 28
>Meanwhile, a new clock API will allow developers to create and track
>MIDI time formats. The API will reportedly support the use of the
>MIDI Time Code (MTC) and MIDI Beat Clock protocols to synchronize
>operations between MIDI-based controllers.

Now... if iTunes and AirPort Express both used these MIDI technologies, could I play music in my study SYNCHRONISED with the music in my living room?

Hmmm...
What do you think?
post #21 of 28
I think the vagaries of network latency would eat it alive.
post #22 of 28
Quote:
Originally posted by Kickaha
I think the vagaries of network latency would eat it alive.

And yet, MIDI is designed specifically to allow multiple instruments to be digitally connected and play in sync, isn't it?

I know it's not applied to playing songs [top 40 hits]... but surely the technology and timecodes to get things synchronised would work for either?

Though, I am no MIDI expert!!!!

[edit: definition of 'songs']
post #23 of 28
MIDI has tight synchronization across all devices by means of a clock signal.

Networking is deliberately *A*synchronous. You can't make it work in lockstep.
post #24 of 28
I hereby nominate this thread title for the Captain Obvious Award.

"I do not fear computers. I fear the lack of them" -Isaac Asimov
Reply
"I do not fear computers. I fear the lack of them" -Isaac Asimov
Reply
post #25 of 28
Yeah I guess the title is obvious: "Core Audio in Mac OS X Tiger to improve audio handling". If it didn't improve audio why would they do it!? hehe

Anyway, back to MIDI -

I've done some reading around on various sites about the relationship of MIDI to Ethernet. MIDI currently IS used over Ethernet; it uses time servers (including standard internet NTP), and the greatest network latency among your connected devices is found and used to set a standard delay, hopefully quite short.

It is hopefully short with MIDI because they want to work with instruments in REAL TIME. Of course, a delay when playing music on 2 separate computers doesn't matter even if it's a few seconds as long as they are playing in time with each other - it just means that when you skip to the next song, it will take that long to start playing.

BUT, that said - MIDI is really using multiple networking technologies to synchronise the devices over Ethernet. So yes, it's obviously not using the old-style MIDI that required quality 5-pin cables under 50 feet. In fact, Apple could probably just say "we're synchronising our audio and video now" and put together the technologies without using MIDI at all.

So who knows if Apple are planning on synchronising video, audio, etc in Tiger - maybe this rumour relates to that, maybe not - though I hope when they do work in this area they use an existing standard.

Hope that's useful.

Main source (plus the relevant links for timecodes!):
http://www.fact-index.com/m/mu/music...interface.html
http://www.grame.fr/MidiShare/fr/Reseaux/Reseaux.html
post #26 of 28
Quote:
Originally posted by GregAlexander
>Meanwhile, a new clock API will allow developers to create and track
>MIDI time formats. The API will reportedly support the use of the
>MIDI Time Code (MTC) and MIDI Beat Clock protocols to synchronize
>operations between MIDI-based controllers.

This is another great little code nugget for developers. It means that MIDI sync becomes part of the core APIs, which basically means that any app, and several apps together, can be time synchronised. Lots of possibilities here... a few off the top of my head: how about MS Word flipping pages in sync with Logic, karaoke style; Keynote changing slides synced with an external sequencer; FCP playout synced to GarageBand; Maya animation timed in response to Digital Performer. It should make it much easier to obtain MIDI sync over Ethernet/WiFi. Of course, the main benefit is to encourage new apps that we can't even imagine yet.

Man I'm going to have to hit those Cocoa Programming books again. Tiger rocks!
post #27 of 28
Quote:
Originally posted by GregAlexander
Yeah I guess the title is obvious: "Core Audio in Mac OS X Tiger to improve audio handling". If it didn't improve audio why would they do it!? hehe

Anyway, back to MIDI -

I've done some reading around on various sites about the relationship of MIDI to Ethernet. MIDI currently IS used over Ethernet; it uses time servers (including standard internet NTP), and the greatest network latency among your connected devices is found and used to set a standard delay, hopefully quite short.

It is hopefully short with MIDI because they want to work with instruments in REAL TIME. Of course, a delay when playing music on 2 separate computers doesn't matter even if it's a few seconds as long as they are playing in time with each other - it just means that when you skip to the next song, it will take that long to start playing.

BUT, that said - MIDI is really using multiple networking technologies to synchronise the devices over Ethernet. So yes, it's obviously not using the old-style MIDI that required quality 5-pin cables under 50 feet. In fact, Apple could probably just say "we're synchronising our audio and video now" and put together the technologies without using MIDI at all.

So who knows if Apple are planning on synchronising video, audio, etc in Tiger - maybe this rumour relates to that, maybe not - though I hope when they do work in this area they use an existing standard.

Hope that's useful.

Main source (plus the relevant links for timecodes!):
http://www.fact-index.com/m/mu/music...interface.html
http://www.grame.fr/MidiShare/fr/Reseaux/Reseaux.html

Good links... but they don't address the real problems I was thinking of.

1) That's MIDI over Ethernet, not MIDI over a full-fledged network. What's the difference? Their network topology wasn't described in #2, but from that I assume that it was pretty flat, without routers, switches, and such. Heck, mLAN from Yamaha is MIDI over FireWire, and now we have FireWire networking too - saying that Ethernet has replaced the old serial MIDI wiring is still a long way from MIDI over a full network. Now, they could very well have implemented this on, say, a campus network, and just didn't state so clearly.

2) WiFi (the original technology under discussion) makes Ethernet look positively deterministic. It is much, much worse, as just a physical layer.

Edit: Ok, drilled through to the OpenSound Control page... very interesting stuff! *Almost* makes me think a MIDI *replacement* over the network is a solved problem.
post #28 of 28
Quote:
Originally posted by MoBird
This feature is already available. It is known as JackOSX.

Check it out:
http://www.jackosx.com/

Jack does NOT support multiple interfaces in OS X! I contacted the developers, and they said that it is a function of the OS and the app to get multi-interface support. Still waiting for it in Tiger and Logic 7. One interface is all you can use with Logic 6 Pro and OS X 10.3.x.