Audio Units are used for a variety of purposes: to generate, process, receive, or otherwise manipulate streams of audio. They are building blocks that may be used singly or connected together to form an audio signal graph.
Whilst they can generally be thought of as Audio Plug-Ins or Software Synthesizers, they can also perform other roles, such as interfacing a graph of audio units to an audio device, or writing the results of a processing graph out to a file.
They are presented on the system as Components, and like all components they are identified by four-character codes that describe the component's type, sub-type and manufacturer ID.
Components allow any application to dynamically load code that conforms to a particular set of C functions, defined (typically by Apple) as the API that a component of that type should support. This allows applications to expect a particular set of behaviours from particular component types, and developers of Components to expect that their code will be exercised in particular and well-understood ways.
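For example, a client can locate and open an Audio Unit through the Component Manager by filling out a ComponentDescription with these codes. The following is a minimal sketch (error handling is abbreviated, and the helper name is illustrative) that opens Apple's default output unit:

    #include <AudioUnit/AudioUnit.h>

    // A minimal sketch: locate and open the default output unit via the
    // Component Manager, using the type/sub-type/manufacturer codes.
    static AudioUnit OpenDefaultOutputUnit(void)
    {
        ComponentDescription desc;
        desc.componentType         = kAudioUnitType_Output;
        desc.componentSubType      = kAudioUnitSubType_DefaultOutput;
        desc.componentManufacturer = kAudioUnitManufacturer_Apple;
        desc.componentFlags        = 0;
        desc.componentFlagsMask    = 0;

        Component comp = FindNextComponent(NULL, &desc);
        if (comp == NULL) return NULL;

        AudioUnit unit;
        if (OpenAComponent(comp, &unit) != noErr) return NULL;

        // The unit must be initialized before it can render audio.
        AudioUnitInitialize(unit);
        return unit;
    }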
The contract that is expressed by the Audio Unit API therefore embodies a considerable degree of expected behaviour. As such, a collection of C++ base classes is used to implement the Audio Units that are provided with any particular release of Mac OS X. To facilitate the deployment of Audio Units from other developers, Apple also provides these base classes in the SDK. Developers are strongly encouraged to utilise these classes for their own audio units to ensure consistency of behaviour across a broad spectrum of audio units. Furthermore, much of an audio unit's state is managed by these base classes, making the task of implementing an audio unit considerably easier. Separate documentation is provided for those classes in the SDK as well.
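As an illustrative sketch of how small a derived unit's processing code can be, the fixed-gain kernel below builds on the SDK's AUEffectBase and AUKernelBase classes. The class name and gain value are hypothetical, and exact base-class signatures may differ between SDK releases:

    #include "AUEffectBase.h"   // Core Audio SDK base classes

    // An illustrative per-channel kernel for a fixed-gain effect.
    // AUEffectBase creates one kernel per channel and calls Process()
    // each render cycle; formats, buffers and unit state are handled
    // by the base classes.
    class GainKernel : public AUKernelBase {
    public:
        GainKernel(AUEffectBase *inAudioUnit) : AUKernelBase(inAudioUnit) {}

        virtual void Process(const Float32 *inSourceP, Float32 *inDestP,
                             UInt32 inFramesToProcess, UInt32 inNumChannels,
                             bool &ioSilence)
        {
            for (UInt32 frame = 0; frame < inFramesToProcess; ++frame)
                inDestP[frame] = inSourceP[frame] * 0.5f;  // fixed -6 dB gain
        }
    };

A complete effect would additionally subclass AUEffectBase and return this kernel from its NewKernel() override; property management, initialization and rendering are inherited from the base classes.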
This document describes the API of Audio Units; additional documentation is available describing the C++ implementation. The implementation document presumes an understanding of the basic operational parameters of an Audio Unit, which this document describes.
There are also a number of other API services that directly use and support Audio Units. There are some utility calls in the AudioToolbox.framework that allow an external client to establish listeners that receive notifications when parameter values change. There is also an AUMIDIController API that provides a simple interface to map incoming MIDI messages from a designated MIDI Source to an Audio Unit's parameters. Finally, there is also an AudioUnitCarbonView component (declared in the AudioUnit.framework) that provides a component interface through which Audio Units can present a UI. Apple also ships a GenericAUView component, a version of this component that can be used to display a generic view of an Audio Unit's parameters. These topics are discussed here as well.
Finally, the AudioToolbox.framework provides a number of API services that use Audio Units or are used by Audio Units. The AUGraph provides services to construct a processing graph of Audio Units. The MusicSequence and MusicPlayer objects provide the ability to schedule events, and these events can be delivered to the nodes of an AUGraph (which are in turn Audio Units). The AudioConverter is used by both the AUConverter unit and the AudioDevice Output units to allow these units to perform simple conversions of audio data, such as int->float or sample rate conversions. The AudioConverter provides additional functionality that is not directly relevant to Audio Units (such as the use of Audio Codecs for encoding and decoding non-PCM based audio data).
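The sketch below uses the AUGraph calls to connect an Apple effect unit to the default output unit and start it rendering. The helper name is illustrative, and the node-creation call shown (AUGraphNewNode) is the one found in early AudioToolbox headers; later releases rename it:

    #include <AudioToolbox/AudioToolbox.h>

    // A minimal sketch: build a graph in which a matrix reverb effect
    // feeds the default output unit, then start it rendering.
    static OSStatus MakeSimpleGraph(AUGraph *outGraph)
    {
        OSStatus err;
        AUGraph graph;
        if ((err = NewAUGraph(&graph)) != noErr) return err;

        ComponentDescription cd;
        cd.componentFlags        = 0;
        cd.componentFlagsMask    = 0;
        cd.componentManufacturer = kAudioUnitManufacturer_Apple;

        AUNode effectNode, outputNode;
        cd.componentType    = kAudioUnitType_Effect;
        cd.componentSubType = kAudioUnitSubType_MatrixReverb;
        if ((err = AUGraphNewNode(graph, &cd, 0, NULL, &effectNode)) != noErr)
            return err;

        cd.componentType    = kAudioUnitType_Output;
        cd.componentSubType = kAudioUnitSubType_DefaultOutput;
        if ((err = AUGraphNewNode(graph, &cd, 0, NULL, &outputNode)) != noErr)
            return err;

        // Connect the effect's output 0 to the output unit's input 0.
        if ((err = AUGraphConnectNodeInput(graph, effectNode, 0,
                                           outputNode, 0)) != noErr)
            return err;

        // Open instantiates the units; Initialize prepares them to render.
        if ((err = AUGraphOpen(graph)) != noErr) return err;
        if ((err = AUGraphInitialize(graph)) != noErr) return err;
        if ((err = AUGraphStart(graph)) != noErr) return err;

        *outGraph = graph;
        return noErr;
    }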