Header file: AudioUnit/MusicDevice.h
Music Device Components are Audio Units - they implement all of the standard selectors as described in previous sections of this document. Their additional APIs are designed around the concept of the creation of notes, where notes ultimately generate audio data in response to some control protocol. Essentially, these are the tasks required to build software-based synthesizers, virtual instruments, and so forth.
Contents

  The MusicDevice and MIDI
  The Extended Control Protocol of the MusicDevice
  Music Device Parameters
  Types and Structures

Functions

  MusicDeviceMIDIEvent          - Send a MIDI event to a Music Device.
  MusicDeviceSysEx              - Send a MIDI system-exclusive message to a Music Device.
  MusicDevicePrepareInstrument  - Load an instrument.
  MusicDeviceReleaseInstrument  - Release an instrument's resources.
  MusicDeviceStartNote          - Start a note, using the extended control protocol.
  MusicDeviceStopNote           - Stop a note, using the extended control protocol.

Defined Types

  MusicDeviceInstrumentID
  MusicDeviceNoteParamsPtr
  MusicDeviceGroupID
  NoteInstanceID
  MusicDeviceComponent

Structs

  MusicDeviceNoteParams
  MusicDeviceNoteParams3
  MusicDeviceNoteParams16
MIDI (Musical Instrument Digital Interface) was designed as a protocol to allow the communication of data to control synthesizers, particularly from a performance perspective. It has been very successful and widely adopted since its introduction in the early 1980s. The MIDI specification is now multi-faceted, with various standards that apply to a wide variety of situations and uses. At its inception, MIDI contained two primary features: the specification of a transport layer (baud rate, type of plug and cable, etc.) and a protocol for how these bits would be transmitted and recognised by the receiver.
There are many references and notes about MIDI freely available, so for the purposes of this discussion we will assume that you understand how MIDI works.
Music Device components have two functions that provide a MIDI-based protocol interface.
ComponentResult MusicDeviceMIDIEvent(
    MusicDeviceComponent   ci,
    UInt32                 inStatus,
    UInt32                 inData1,
    UInt32                 inData2,
    UInt32                 inOffsetSampleFrame);

This function is used for any of the MIDI channel events.
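For example, here is a minimal sketch of sending a note-on event, assuming myMusicDevice is an opened MusicDevice component (the same assumed variable used in the examples later in this section):

    // Send a note-on for middle C (note 60) at velocity 100 on MIDI channel 0,
    // scheduled at the very start of the MusicDevice's next render cycle.
    MusicDeviceMIDIEvent (myMusicDevice,
        0x90,    // status byte: note-on, channel 0
        60,      // first data byte: note number
        100,     // second data byte: velocity
        0);      // sample frame offset into the next render cycle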
ComponentResult MusicDeviceSysEx(
    MusicDeviceComponent   ci,
    const UInt8 *          inData,
    UInt32                 inLength);
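As a sketch, sending the standard General MIDI System On message (again assuming the opened MusicDevice in myMusicDevice):

    // The complete system-exclusive message, including the leading 0xF0
    // and trailing 0xF7 bytes.
    const UInt8 gmSystemOn[] = { 0xF0, 0x7E, 0x7F, 0x09, 0x01, 0xF7 };
    MusicDeviceSysEx (myMusicDevice, gmSystemOn, sizeof (gmSystemOn));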
The MIDI protocol for starting and controlling notes is based around channels. The NoteOn event starts a note on the specified channel, and the various control events are, by and large, addressed to channel numbers. When stopping a note, both the channel and the note number are specified. MIDI also contains the concept of instruments, where in normal usage one instrument is active per channel (defined through the channel's bank and program change number). Some hardware synthesizers work around this limitation by defining performances, where a channel may have different instruments assigned to different key ranges within the same channel.
There are a number of limitations to MIDI that need not apply to a software-based system. The resolution of MIDI messages is often less than is desirable. Control messages are directed solely at a particular channel, so finer-grained control of instruments sharing a channel number is lost. MIDI makes no distinction between the act of making an instrument active on a particular channel and preparing the instrument to be used at a later stage. And there are insufficient channels within a single MIDI stream to control complex scores.
Music Device Components provide an extended protocol for dealing with music instruments. This protocol is intended to address the shortcomings of MIDI, whilst retaining a compatibility mode (i.e. the MIDI protocol can be represented completely within this extended protocol, without inheriting its limitations). It is intended that any component that implements the MusicDevice API can do so by implementing just this extended API, with the MIDI messages translated to that API internally. As the MIDI messages fit within the extended protocol, a host can use either set of methods interchangeably to address the MusicDevice, and there should be no inconsistency as a result, provided of course that the host is using the extended API within the restrictions of the MIDI protocol.
Firstly, the extended protocol is note based. A note is started, and when started it is assigned both an InstrumentID that the note should be played on and the GroupID that this particular note should belong to. Control messages are sent to a group, as in MIDI, but because notes are assigned to their instrument when they are begun, there is no strong correlation between an instrument and a group as there is in MIDI. The GroupID that a note is assigned to is an arbitrary value. For maximum flexibility each note can be assigned its own GroupID; alternatively, to control a group of notes with single commands, a collection of notes can all be assigned to the same GroupID. A note is finished in one of two ways - the individual note can be stopped, or all of the notes belonging to a particular group can be stopped with a single command to that GroupID.
ComponentResult MusicDevicePrepareInstrument(
    MusicDeviceComponent      ci,
    MusicDeviceInstrumentID   inInstrument);

Used to prepare an instrument without explicitly setting the instrument for a specific channel/group. The InstrumentID that is passed in to PrepareInstrument can be encoded in a manner that allows instruments to be read from MIDI-based sample collections (which both DLS and SoundFont formats are). In that case, the ID is formatted as:

0xMMLLPP

  MM - the MSB of a bank select control change message (range 0-127 when shifted down)
  LL - the LSB of a bank select control change message (range 0-127 when shifted down)
  PP - the 0-127 value of a program change message

ComponentResult MusicDeviceReleaseInstrument(
    MusicDeviceComponent      ci,
    MusicDeviceInstrumentID   inInstrument);

This call tells the MusicDevice to release any resources associated with the specified instrument.
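A minimal sketch of this encoding follows; MakeInstrumentID is a hypothetical helper, not part of the API:

    // Hypothetical helper: pack bank-select MSB/LSB and a program change
    // number into the 0xMMLLPP layout described above.
    MusicDeviceInstrumentID MakeInstrumentID (UInt8 bankMSB, UInt8 bankLSB, UInt8 program)
    {
        return ((UInt32)(bankMSB & 0x7F) << 16)
             | ((UInt32)(bankLSB & 0x7F) << 8)
             |  (UInt32)(program & 0x7F);
    }

    // Prepare bank (0,0), program 0 - the first instrument in the collection.
    MusicDevicePrepareInstrument (myMusicDevice, MakeInstrumentID (0, 0, 0));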
ComponentResult MusicDeviceStartNote(
    MusicDeviceComponent            ci,
    MusicDeviceInstrumentID         inInstrument,
    MusicDeviceGroupID              inGroupID,
    NoteInstanceID *                outNoteInstanceID,
    UInt32                          inOffsetSampleFrame,
    const MusicDeviceNoteParams *   inParams);

StartNote returns a NoteInstanceID that is used to stop a note. The group that is specified when a note is started is the groupID that is used in the AudioUnitSetParameter call to control the notes on that group. Because you can specify the InstrumentID as well when you start a note, it is possible to start notes on different instruments but in the same group, allowing a single parameter value to alter notes on different instruments but on the same group.
The returned NoteInstanceID follows this convention: if the supplied note number (which is, by convention, the first value passed in the inParams struct) is an integral number, then the returned noteID will be that integer. If the note number is a floating point number, then a randomly created unique ID will be returned. Thus, if you're using this function with integral note numbers, that same note number can be used to turn the note off on that group. Because of this, the outNoteInstanceID parameter can be NULL.
The inParams is a variable-length struct. By convention, a two-value struct has the same semantic as MIDI, with the first value being equivalent to a MIDI note number, and the second velocity. Because the struct allows floating point numbers, the note can specify a fractional pitch; for instance, 60.5 would indicate note number 60 (in MIDI terms) with a quarter semitone added (in a twelve tone scale).
However, this is by convention and is not required. The start note params can correspond to any number of parameters (and any range of values for those parameters) based on the synthesizer that is being used.
ComponentResult MusicDeviceStopNote(
    MusicDeviceComponent   ci,
    MusicDeviceGroupID     inGroupID,
    NoteInstanceID         inNoteInstanceID,
    UInt32                 inOffsetSampleFrame);

This call stops the specified note that is currently playing on that group at the specified sample offset into the next render call of the MusicDevice component.
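Putting these together, here is a minimal sketch of starting and then stopping a fractional-pitch note. The myInstrumentID and myGroupID values are assumptions - an instrument prepared earlier and an arbitrary group of our own choosing:

    NoteInstanceID noteID;
    MusicDeviceNoteParams3 params;   // room for up to 3 values; we use 2
    params.argCount = 2;             // the MIDI-like convention: pitch, velocity
    params.args[0]  = 60.5;          // middle C plus a quarter semitone
    params.args[1]  = 100.0;         // velocity

    MusicDeviceStartNote (myMusicDevice, myInstrumentID, myGroupID,
        &noteID, 0, (const MusicDeviceNoteParams *)&params);

    // ... some time later, stop that same note on that same group.
    // (With an integral note number we could have passed NULL above and
    // used the note number itself here in place of noteID.)
    MusicDeviceStopNote (myMusicDevice, myGroupID, noteID, 0);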
Parameters are discussed in some detail in the section on Audio Unit Parameters. However, it is worthwhile to reiterate some of that here.
ComponentResult AudioUnitSetParameter(
    AudioUnit              ci,
    AudioUnitParameterID   inID,
    AudioUnitScope         inScope,
    AudioUnitElement       inElement,
    Float32                inValue,
    UInt32                 inBufferOffsetInFrames);

When used with a MusicDevice component, this call (and the others for parameter values) can be used in two ways. Firstly, it is used in the same way as with other Audio Units and their parameters. For instance, Apple's DLSMusicDevice publishes two parameters (tuning and volume) in 10.2 in the same manner as other Audio Units; these are published in the global scope with their parameter value type, range of values, etc.
Secondly, these calls can be used by specifying kAudioUnitScope_Group for the scope. In this usage, the IDs in the range of 0-127 will correspond with the standard attribution of MIDI controls, e.g. 1 is modulation wheel, 7 is volume, 10 is pan, 123 is All Notes Off, etc. For those MIDI controls that have their own command status (pitch bend, channel pressure), those values are used with the channel set to zero. Thus, to apply Pitch Bend the parameterID is 0xE0; for Channel Pressure (commonly called After Touch), the parameterID is 0xD0. The element is the group number (0-15 would map to the 16 MIDI channels) that the control should be applied to.
In this usage, neither Program Change nor Poly Pressure is required. Program Change is not needed because the instrumentID is specified when the note is started (and if a particular channel selection is required for patch and bank, then the appropriate MIDI message can be sent, though this is really redundant if you're using the extended API). Poly Pressure is not needed because the program can arbitrarily assign notes to groups and instruments, and then apply the after-touch controller to the required groups.
Thus, to turn off all notes sounding on group 12:
AudioUnitSetParameter (myMusicDevice,
    kAllNotesOff,            // 123
    kAudioUnitScope_Group,
    12,                      // my group ID
    0,
    0 /* do this as soon as you can */);

Similarly, to apply PitchBend of zero to all notes sounding on channel or group 4:
// kPitchBendOffset == 8192
AudioUnitSetParameter (myMusicDevice,
    kPitchBend,              // 0xE0
    kAudioUnitScope_Group,
    4,                       // my group ID
    0 + kPitchBendOffset,
    0);

Why 8192? PitchBend is a 14-bit MIDI control message, and so (for compatibility) when using these MIDI-equivalent control IDs, you should use the normal range associated with that MIDI message. (PitchBend's range is 0 to 16383 with 8192 being the zero point - you can think of this as -8192 to 8191.) The above message would look like 0xE4 0x00 0x40 in MIDI.
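As a quick sketch of how that 14-bit value maps onto the two 7-bit MIDI data bytes (the bend, lsb, and msb variables are illustrative; the equivalent MusicDeviceMIDIEvent call is shown for comparison):

    UInt32 bend = 8192;                // the zero point
    UInt32 lsb  = bend & 0x7F;         // low 7 bits  -> 0x00
    UInt32 msb  = (bend >> 7) & 0x7F;  // high 7 bits -> 0x40
    MusicDeviceMIDIEvent (myMusicDevice, 0xE4, lsb, msb, 0);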
There are a number of properties that Apple's DLSMusicDevice supports, which are discussed in the section on Music Device Properties. Of particular interest is the property kMusicDeviceProperty_GroupOutputBus. This will separate the notes that are rendered for different groups to different output busses of the MusicDevice, essentially allowing an external client to have a finer degree of control over the mix between the different groups' output. Bear in mind that this does not necessarily mean that each group has only one instrument sound. For example, let us imagine we want to have 4 groups from our synthesizer, and want them to go to 4 different busses...
// FIRST, we need to allocate 4 output busses for our music device
UInt32 outBusses = 4;
AudioUnitSetProperty (myMusicDevice,
    kAudioUnitProperty_BusCount,
    kAudioUnitScope_Output,
    0,
    &outBusses,
    sizeof (outBusses));

for (int i = 0; i < 4; ++i) {
    UInt32 theBusNumber = i;
    AudioUnitSetProperty (myMusicDevice,
        kMusicDeviceProperty_GroupOutputBus,
        kAudioUnitScope_Group,
        i,                   // this is our groupID
        &theBusNumber,
        sizeof (theBusNumber));
}

We're assigning output busses and groups here as the same number (but they could be different). Then, any notes that are produced on groupID == 0 will be output on this MusicDevice's output bus (elementID) 0.
typedef UInt32 MusicDeviceInstrumentID;
typedef MusicDeviceNoteParams *MusicDeviceNoteParamsPtr;
typedef UInt32 MusicDeviceGroupID;
typedef UInt32 NoteInstanceID;
typedef ComponentInstance MusicDeviceComponent;
struct MusicDeviceNoteParams {
    UInt32   argCount;
    float    args[1];
};

struct MusicDeviceNoteParams3 {
    UInt32   argCount;
    float    args[3];
};

struct MusicDeviceNoteParams16 {
    UInt32   argCount;
    float    args[16];
};
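MusicDeviceNoteParams is the variable-length form: args[1] is a placeholder for however many values follow the argCount. A minimal sketch of allocating one for an arbitrary number of values (5 here is an assumed count, purely for illustration):

    UInt32 count = 5;
    // Allocate the header plus (count - 1) floats beyond the declared args[1].
    MusicDeviceNoteParams *params = (MusicDeviceNoteParams *)
        malloc (sizeof (MusicDeviceNoteParams) + (count - 1) * sizeof (float));
    params->argCount = count;
    params->args[0] = 60.0;    // note number, per the convention described earlier
    params->args[1] = 100.0;   // velocity
    // ... args[2] through args[4] would carry synth-specific values ...

    // use with MusicDeviceStartNote, then:
    free (params);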