Project type: interactive video performance (AI music composition).

This is a machine-learning project that detects human body movement and translates that gesticulation into music. For instance, a visitor stands in front of a screen and watches a short, picturesque film without sound. They are asked to express what the scenes make them feel through hand movements. Based on their body language, the system generates a piece of music, combining the short film with the visitor's "own music".
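The description does not specify how gestures are mapped to sound, so the following is only an illustrative sketch: it assumes two hypothetical hand-tracking features (normalized hand height and movement speed) and maps them to MIDI pitch and velocity. The function name and feature choices are assumptions for illustration, not the project's actual model.

```python
def gesture_to_note(hand_y: float, speed: float) -> dict:
    """Map a normalized hand height (0.0 = bottom of frame, 1.0 = top)
    and movement speed (0.0..1.0) to a MIDI pitch and velocity.

    Hypothetical mapping: higher hands play higher notes (C3..C5),
    faster movement plays louder notes.
    """
    pitch = 48 + round(hand_y * 24)      # 48 = C3, 72 = C5
    velocity = 40 + round(speed * 87)    # soft (40) .. loud (127)
    return {"pitch": pitch, "velocity": min(velocity, 127)}

# A slow gesture near the bottom of the frame yields a soft, low note;
# a fast gesture near the top yields a loud, high note.
low = gesture_to_note(0.0, 0.0)   # {"pitch": 48, "velocity": 40}
high = gesture_to_note(1.0, 1.0)  # {"pitch": 72, "velocity": 127}
```

In a real pipeline, the input features would come from a pose- or hand-tracking model running on the camera feed, and the note events would be sent to a synthesizer.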

Artist/idea author: Seg Kirakossian

Student developer: Mohamed Shahidul Islam
