Using a live orchestra to record your game's music is often key to achieving the cinematic scope and feel many games are after. But a beautifully composed, solidly performed, precisely recorded soundtrack is only half the battle. Implementing that music in an adaptive system that lets it interact with the gameplay, setting it up to be mixed by the game's runtime mix system, and testing it to ensure it behaves correctly under all circumstances is at least as important--and as difficult--as composing it in the first place.
This session follows several pieces of music through every stage of production, in context, from concept to composition and from recording to implementation. It explores the complex iterative process required for the soundtrack to interlock with the gameplay, and it discusses the trade-offs made at every stage among budget, schedule, and creative ambition. Extensive real-world examples from games such as DEMON STONE and LORD OF THE RINGS: THE TWO TOWERS provide a solid foundation for the analysis.