Chapter 11: AudioPaths


Overview

Todor Fay

AudioPaths bring a whole new level of flexibility and control to your music and sound effects. So far, everything we have done has played through a single AudioPath. That has worked well for the programming examples up to this point; however, it becomes limiting once we want to add sound effects and increase musical complexity. Let's take a look at what is missing:

  • No way to address 3D: For sound effects work, it is critical that one or more sound-producing Segments can be routed to a specific 3D position in space. It is also required that the 3D position be directly accessible to the host application so it can be controlled during gameplay (a short sketch of this follows at the end of this overview).

  • Pchannel collision: Segments authored with the same pchannels conflict when played at the same time. For example, instrument choices from one Segment are applied to another, and a piano part is played by a bassoon. It should be possible to play two Segments in their own "virtual" pchannel spaces, so they cannot overlap.

  • No independent audio processing: It should be possible to set up different audio processors to affect different sound effects paths in different ways, such as applying echo to sounds going under a tunnel or distortion to an engine. Likewise, different musical instruments as well as sections can benefit from individualized audio processing (reverb, chorus, compression, etc.).

  • No independent MIDI processing: In the same vein, it should be possible to set up different MIDI, or pmsg, processors (tools) to manipulate individual musical parts and sound effects.

In fact, this was the status quo with DirectX 7.0. AudioPaths, introduced in DirectX 8.0, set out to solve these issues.
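To make the first two points concrete, here is a minimal sketch of the kind of thing an AudioPath makes possible. It assumes DirectX Audio has already been initialized as in the earlier programming examples; the performance and Segment pointers, and the helper name PlaySegmentIn3D itself, are placeholders rather than anything authored in this chapter. The sketch creates one of the standard AudioPath configurations (a dynamic 3D path), reaches into it for the 3D buffer interface so the host application can position the sound, and plays a Segment on that path in its own pchannel space.

#include <dmusici.h>   // IDirectMusicPerformance8, IDirectMusicAudioPath
#include <dsound.h>    // IDirectSound3DBuffer, DS3D_IMMEDIATE

// Hypothetical helper: create a dynamic 3D AudioPath, place it at (x, y, z),
// and start pSegment playing on it. The caller keeps *ppPath so it can keep
// repositioning the sound during gameplay, and releases it when done.
HRESULT PlaySegmentIn3D(IDirectMusicPerformance8 *pPerformance,
                        IDirectMusicSegment8 *pSegment,
                        float x, float y, float z,
                        IDirectMusicAudioPath **ppPath)
{
    *ppPath = NULL;

    // A standard 3D path with 64 pchannels of its own. Because the Segment
    // plays in this path's virtual pchannel space, its instrument choices
    // cannot collide with Segments playing on other paths.
    HRESULT hr = pPerformance->CreateStandardAudioPath(
        DMUS_APATH_DYNAMIC_3D, 64, TRUE, ppPath);
    if (FAILED(hr))
        return hr;

    // Pull the IDirectSound3DBuffer out of the path so the host application
    // can control the 3D position directly.
    IDirectSound3DBuffer *p3D = NULL;
    hr = (*ppPath)->GetObjectInPath(
        DMUS_PCHANNEL_ALL,      // any pchannel on the path
        DMUS_PATH_BUFFER, 0,    // the DirectSound buffer stage, first buffer
        GUID_NULL, 0,           // first object of any class
        IID_IDirectSound3DBuffer, (void **)&p3D);
    if (SUCCEEDED(hr))
    {
        p3D->SetPosition(x, y, z, DS3D_IMMEDIATE);
        p3D->Release();
    }

    // Play the Segment on this specific AudioPath instead of the default one.
    return pPerformance->PlaySegmentEx(
        pSegment, NULL, NULL,
        DMUS_SEGF_SECONDARY,    // play as a secondary Segment
        0,                      // start immediately
        NULL, NULL,
        *ppPath);               // the AudioPath to play on
}

A second Segment played through a second path created the same way lives in its own virtual pchannel space, so the instrument choices of one never bleed into the other.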



