
Affective Avatars

Activity Period: 
01/01/2008 - 31/12/2009
Status: 
Finished

Several economic players have invested heavily in creating services, games, and virtual worlds in which users interact with one another through their "avatars".

The Affective Avatars project's goal is to create affective avatars in real time, with the avatar's expressiveness controlled by the user's voice.

The expressive and emotional parameters extracted from real-time processing of the vocal signal are used to drive the expressiveness of the avatars' lips, faces, and bodies. The user's vocal pitch is also transformed in real time, giving the avatar a voice that is in keeping with its image.

Four scientific and technological issues were addressed in this project:

  • detection of emotions in the human voice,
  • modeling physical expressiveness (bodytalk),
  • transformation of vocal timbre in real-time,
  • modeling a catalogue of vocal profiles with expressiveness and multimodal cohesion.
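The first of these issues starts from low-level vocal descriptors such as pitch and energy, which can then be mapped to expressive animation parameters. As a minimal sketch of that front end (not the project's actual implementation; the function name, frame sizes, and pitch-search band are illustrative assumptions), per-frame pitch and energy can be estimated like this:

```python
import numpy as np

def frame_features(signal, sr, frame_len=1024, hop=512):
    """Per-frame pitch (Hz, via autocorrelation) and RMS energy --
    the kind of low-level descriptors an expressive-avatar pipeline
    could map onto lip, face, and body animation parameters.
    (Illustrative sketch, not the Affective Avatars implementation.)"""
    feats = []
    for start in range(0, len(signal) - frame_len, hop):
        frame = signal[start:start + frame_len]
        rms = float(np.sqrt(np.mean(frame ** 2)))
        # Autocorrelation pitch estimate: strongest lag in a
        # plausible speaking range (assumed here as 60-400 Hz).
        ac = np.correlate(frame, frame, mode="full")[frame_len - 1:]
        lo, hi = int(sr / 400), int(sr / 60)
        lag = lo + int(np.argmax(ac[lo:hi]))
        feats.append((sr / lag, rms))
    return feats

# Synthetic "voice": one second of a 200 Hz tone at 16 kHz.
sr = 16000
t = np.arange(sr) / sr
feats = frame_features(np.sin(2 * np.pi * 200 * t), sr)
pitches = [f0 for f0, _ in feats]
```

A real system would add voiced/unvoiced detection and smoothing before using such a pitch track to control animation or real-time timbre transformation.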

Project Details

Program

ANR - TLOG (Technologies Logicielles de l'Agence Nationale de la Recherche)

Participants

Coordinator

Limsi, FR (Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur)

Partners

Cantoche
IRCAM, FR (Institut de Recherche et Coordination Acoustique/Musique)
Limsi, FR (Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur)
Voxler
