This paper presents an experimental application that shows how an interactive process can save time and production resources in the music-scoring pipeline of a 3D animation movie. The investigation starts from a set of about 30 rules that performers use to add expressive emotion to a score, covering different aspects of music performance.

Applying the rules essentially rearranges timing and dynamics parameters such as microtiming, articulation, tempo, and sound level. The resulting application takes as input a linearly quantized music score (MIDI) and imprints expressiveness and emotion onto this music in sync with the expressive cues of the timeline exported from the animation project. These acoustic cues are driven by the characters' facial and gestural expressions.
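As an illustration of the kind of rule-based transformation described above, the sketch below applies two hypothetical performance rules to a quantized MIDI-like score: a velocity accent on bar downbeats and a slight onset delay on off-beat eighth notes (microtiming). The `Note` class, rule names, and parameter values are assumptions for this example, not the paper's actual rule set.

```python
from dataclasses import dataclass

@dataclass
class Note:
    onset: float     # onset time in beats (quantized input)
    duration: float  # duration in beats
    pitch: int       # MIDI pitch number
    velocity: int    # MIDI velocity (0-127)

def accent_downbeats(notes, beats_per_bar=4, boost=12):
    """Illustrative rule: raise the velocity of notes on a bar's first beat."""
    out = []
    for n in notes:
        v = n.velocity
        if n.onset % beats_per_bar == 0:
            v = min(127, v + boost)
        out.append(Note(n.onset, n.duration, n.pitch, v))
    return out

def humanize_timing(notes, swing=0.02):
    """Illustrative microtiming rule: delay off-beat eighth notes slightly."""
    out = []
    for n in notes:
        onset = n.onset
        if (n.onset * 2) % 2 == 1:  # off-beat eighth position
            onset += swing
        out.append(Note(onset, n.duration, n.pitch, n.velocity))
    return out

# Chain the rules over a flat, quantized score
score = [Note(0.0, 0.5, 60, 80), Note(0.5, 0.5, 62, 80),
         Note(1.0, 0.5, 64, 80), Note(4.0, 1.0, 67, 80)]
expressive = humanize_timing(accent_downbeats(score))
```

In a full system, each rule's strength would be modulated by the expressive cues on the animation timeline rather than fixed constants.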

Project funded by an EU COST Action.