Deep Learning and Machine Learning for Music and Interactive Arts
SNIC Project: Berzelius-2022-122
Project Type: LiU Berzelius
Principal Investigator: Kivanc Tatar
Affiliation: Chalmers tekniska högskola
Duration: 2022-06-01 – 2022-12-01
Classification: 10209


Music and the Arts play a powerful role in the progress of society, and new Machine Learning and Deep Learning technologies offer promising affordances for artistic creation. This project focuses on Machine Learning and Deep Learning architectures for real-time interaction in Music and Interactive Arts practices. On the technology side, the project researches algorithms that bring together (1) movement and sound, (2) sound and moving images, and (3) movement, sound, and moving images. Multimodality is a key aspect of many artistic practices, whether musical practices that are inseparable from movement and embodiment, or moving images that are often accompanied by music or sound. Hence, this project starts from an interdisciplinary perspective with the goal of transdisciplinary knowledge creation.

The project specifically addresses the accessibility issues of Deep Learning, which limit artists' ability to explore these new technologies in their artistic practices. Beyond easing the integration of Deep Learning architectures into artistic practices, the project aims to shift the perspective of Deep Learning research to take the human factors of artistic practice into consideration. These new research perspectives accommodate design iterations and human factors in AI and Machine Learning, moving the research emphasis from gestalt generation to design processes. In parallel, the project aims to contribute new research methodologies that concentrate on descriptive knowledge creation, with the mindset of closing the gap between scientific research outcomes and their impact on artistic communities and society in general.