DirectMusic and MIDI
We've already alluded several times to DirectMusic's support for the MIDI format. This support can be both a strength and a weakness. Interoperability with the standard sequencing format allows for easy authoring in the more sequencing-focused tools of the composer's palette (such as Sonar or Cubase SX). But using DirectMusic for sound effects can work against this support: a sound effect is often meant to be played once or looped indefinitely, so the concept of a finite "note on" with a "duration" is somewhat foreign. As we've discussed already, clock-time Segments can mitigate this for sound effects, since they are given a fixed duration in milliseconds that is independent of tempo.
While DirectMusic supports MIDI, it also extends it in several significant ways that overcome basic limitations of MIDI devices. For instance, typical devices are limited to 16 MIDI channels. The Microsoft Software Synthesizer, the synthesis engine used by DirectMusic, allows you to reference up to 32 million discrete channels. Using DirectMusic Producer, you can use 1,000 of these performance channels, or pchannels, per Segment.
DirectMusic also adds some basic optimizations to MIDI to allow for easier manipulation of controller data. Traditionally, sweeping a controller from one value to another consisted of many rapid MIDI controller events. As a file format (and authoring convenience) optimization, DirectMusic allows the audio producer to instead use continuous controller curves. The start time, end time, initial value, end value, and curve type can all be specified. When the content plays back, the intermediate values are dynamically generated to remain compatible with traditional MIDI.
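The expansion of a controller curve back into discrete events is easy to sketch. The following Python fragment is illustrative only; the function name, the fixed step count, and the restriction to a linear curve shape are assumptions of this sketch, not part of the DirectMusic API:

```python
def render_curve(start_time, end_time, start_value, end_value, steps=8):
    # Expand one continuous controller curve into the discrete MIDI CC
    # events a traditional device expects. DirectMusic performs this
    # expansion dynamically at playback time, so authored content stays
    # compact while remaining compatible with plain MIDI.
    events = []
    for i in range(steps + 1):
        t = start_time + (end_time - start_time) * i / steps
        v = round(start_value + (end_value - start_value) * i / steps)
        events.append((t, max(0, min(127, v))))  # clamp to MIDI 0-127 range
    return events

# A 400 ms linear sweep from 0 up to 127, rendered in four steps:
sweep = render_curve(0, 400, 0, 127, steps=4)
```

A real implementation would also handle the other curve shapes the author can specify; only the linear case is shown here.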
A common question raised when a game is layering 20 pieces of music is whether the composer needs to worry about authoring each piece of audio onto its own set of pchannels. With traditional MIDI, you cannot play two separate Instruments on the same channel at the same time; the patch change from one overrides the other. Similarly, if two Tracks on the same channel use the same MIDI controllers, they will override each other. Because DirectMusic allows the composer to play multiple pieces of content simultaneously, it provides solutions to these basic MIDI limitations.
From the point of view of MIDI, DirectMusic AudioPaths effectively act as unique synthesizers with their own unique pchannels. An AudioPath is pretty much what it sounds like: a route that audio data follows through hardware or software before it is output. PC AudioPaths can include DirectX Media Objects (DMOs), which are software audio processing effects. AudioPaths also define the behavior of content played on them, such as whether it can be positioned in 3D space (kind of like surround sound but different; more on this later), whether it should be played as stereo or mono, and so on. As far as MIDI is concerned, each AudioPath gets its own unique set of "physical" pchannels (granted, this is all in software, so we're not talking about tangible channels). For instance, if two sequences are authored, both using pchannels 1 through 16, playing them on separate AudioPaths keeps the two from interacting. If one Segment on AudioPath A has a piano on pchannel 1 and another Segment on AudioPath B has a French horn on pchannel 1, they will both play happily. If we tried to play these two Segments on a single AudioPath, one of the patch changes would win out, and one Segment would play the wrong instrument.
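One way to picture this channel isolation is as a synthesizer keyed on the pair (AudioPath, pchannel) rather than on the pchannel alone. This toy Python sketch (not the actual DirectMusic API; all names here are hypothetical) shows why the piano and French horn above coexist:

```python
class AudioPath:
    # Each AudioPath instance gets a unique id, giving its pchannels
    # a private namespace inside the synthesizer.
    _next_id = 0
    def __init__(self):
        self.id = AudioPath._next_id
        AudioPath._next_id += 1

# The synthesizer's patch table: (audiopath id, pchannel) -> instrument.
synth_patches = {}

def patch_change(path, pchannel, instrument):
    # Identical pchannel numbers on different AudioPaths map to
    # different keys, so they never override each other.
    synth_patches[(path.id, pchannel)] = instrument

path_a, path_b = AudioPath(), AudioPath()
patch_change(path_a, 1, "piano")
patch_change(path_b, 1, "french horn")  # does not displace the piano
```

Played on a single AudioPath instead, both patch changes would land on the same key, and the second would win out exactly as in plain MIDI.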
That said, sometimes playing multiple pieces of content onto the same AudioPath is desirable. For instance, a Segment could be authored simply to alter the mix of music played by another Segment (using volume controller curves). Alternatively, in the example of 3D audio above, we probably would want to play everything that will emanate from this single position onto the same AudioPath.
So we have a solution for playing multiple simultaneous pieces of music. But what about interacting MIDI controller curves? Of course, if the pchannels are kept unique (either at authoring time or by playing the content on separate AudioPaths), the MIDI controllers won't conflict. But what about the above example of a Segment that exists only to alter the mix? If the Segment it alters has its own volume controller curves, the two will conflict, and the volume might jump rapidly between the two values. The classic solution is to use volume curves in one Segment and expression curves in the other. This is a common approach in sequenced music, as both affect perceived volume and their attenuations combine. This way the audio producer sets Volume (MIDI Continuous Controller #7) in the primary piece of music and uses Expression (MIDI Continuous Controller #11) to reduce that volume per performance channel in the second, layered piece of music.
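The reason this trick works is that, under the common General MIDI/DLS convention, both controllers map to gain as roughly 40·log10(value/127) dB, and the two attenuations simply add. A minimal sketch, with hypothetical function names, assuming that convention:

```python
import math

def cc_gain_db(value):
    # Map a MIDI controller value (0-127) to gain in dB, using the
    # common GM/DLS curve of 40 * log10(v / 127). Value 127 -> 0 dB
    # (no attenuation); value 0 -> silence.
    if value <= 0:
        return float("-inf")
    return 40.0 * math.log10(value / 127.0)

def combined_gain_db(volume_cc7, expression_cc11):
    # Volume (CC#7) and Expression (CC#11) attenuations add in dB,
    # so a layered Segment can trim the mix with expression curves
    # without fighting the primary Segment's volume curves.
    return cc_gain_db(volume_cc7) + cc_gain_db(expression_cc11)
```

With both controllers at 127 the channel plays at full level; pulling expression down attenuates further without ever touching the authored volume curve.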
DirectMusic provides a more generalized solution to this problem — not only for volume but also for pitch bend, chorus send, reverb send, and numerous other MIDI continuous controllers. The audio producer specifies an additive curve ID for each MIDI continuous controller curve created. Curves that share an ID override each other, as in traditional MIDI. Curves with different IDs are applied additively (or, more accurately in the case of volume, subtractively).
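The rule can be sketched in a few lines of illustrative Python (the function is hypothetical; the actual runtime applies this logic per controller type and pchannel):

```python
def resolve_controller(curves):
    # curves: list of (curve_id, value) events for one continuous
    # controller, in the order they occur.
    latest = {}
    for curve_id, value in curves:
        # Curves sharing an additive curve ID override each other,
        # last writer wins, just as in traditional MIDI.
        latest[curve_id] = value
    # Curves with distinct IDs contribute additively to the result.
    return sum(latest.values())
```

For example, two curves with ID 1 (values 10 then 5) plus one curve with ID 2 (value 3) resolve to 5 + 3 = 8: the second ID-1 curve replaces the first, and the ID-2 curve adds on top.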