Project type: interactive video performance (AI music composition).

This is an AI-based machine-learning project that detects human body movement and translates those gestures into music. For instance, a visitor stands in front of a screen and watches a short, picturesque film without sound. They are asked to express what the scenes make them feel through hand movements. Based on this body language, the system generates a piece of music, combining the short film with their "own music".
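The gesture-to-music idea could be sketched, in a highly simplified form, as a mapping from tracked hand features to musical parameters. The function below is a hypothetical illustration under assumed inputs (normalized hand height and speed, e.g. from a pose-tracking model), not the project's actual pipeline.

```python
# Hypothetical sketch: map hand-movement features to musical parameters.
# The feature names and mapping rules here are illustrative assumptions.

def movement_to_note(hand_y, hand_speed):
    """Map normalized hand height (0 = bottom, 1 = top) to a MIDI pitch,
    and hand speed to a note velocity (loudness)."""
    # C major scale over two octaves, as MIDI note numbers.
    scale = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76, 77, 79, 81, 83]
    idx = min(int(hand_y * len(scale)), len(scale) - 1)
    velocity = max(20, min(127, int(hand_speed * 127)))
    return scale[idx], velocity

# A hand held high and moving fast yields a high, fairly loud note.
note, vel = movement_to_note(hand_y=0.9, hand_speed=0.6)
```

In a real installation, these per-frame notes would be scheduled against the film's timeline and smoothed so that jittery tracking does not produce erratic music.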

Artist/idea author: Seg Kirakossian

Student developer: Mohamed Shahidul Islam
