Interactive Musical Setting with Deep Learning and Object Recognition (Conference Paper)

abstract

  • The SeMI (Interactive Musical Setting) explores the possibilities of joining machine learning with the physical and sound worlds. In this context, a machine learning model was used to identify physical objects through image processing. Each physical object is associated with a musical texture produced by a student, which starts playing when the device recognizes the object. This enables use cases in which students have to develop diverse yet interrelated sound textures and combine them with the physical world, for example in a fake orchestra that reacts to people and objects placed in front of it, or in mood rooms. The application was developed for iPad and iPhone, using the Swift programming language and the iOS operating system, and was used in classes of the Master's in Teaching of Musical Education in Basic School. A minimal sketch of the recognition-to-playback pipeline is given after the abstract.
  • This work has been supported by FCT - Fundação para a Ciência e Tecnologia within the Project Scope: UIDB/05757/2020.
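
The paper's abstract does not detail the implementation, but the described pipeline (on-device object recognition triggering a pre-recorded musical texture) can be illustrated with a minimal Swift sketch. It assumes a Core ML image classifier driven through the Vision framework and audio playback via AVFoundation; the model class ObjectClassifier, the label-to-file mapping, and the audio file names are placeholders, not taken from the paper.

```swift
import Vision
import CoreML
import AVFoundation

// Hypothetical mapping from recognized object labels to student-produced
// musical textures bundled with the app (labels and file names are placeholders).
let textureFiles: [String: String] = [
    "violin": "texture_violin",
    "mask": "texture_mask",
    "lamp": "texture_lamp"
]

final class ObjectTexturePlayer {
    private var player: AVAudioPlayer?
    private var currentLabel: String?

    // Wrap a Core ML image classifier in a Vision request.
    // "ObjectClassifier" stands in for whatever model the authors trained.
    private lazy var request: VNCoreMLRequest? = {
        guard let coreMLModel = try? ObjectClassifier(configuration: MLModelConfiguration()).model,
              let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
            return nil
        }
        let request = VNCoreMLRequest(model: visionModel) { [weak self] request, _ in
            // Take the top classification and only react to confident matches.
            guard let best = (request.results as? [VNClassificationObservation])?.first,
                  best.confidence > 0.8 else { return }
            self?.play(textureFor: best.identifier)
        }
        request.imageCropAndScaleOption = .centerCrop
        return request
    }()

    // Classify one camera frame, e.g. from an AVCaptureVideoDataOutput callback.
    func classify(pixelBuffer: CVPixelBuffer) {
        guard let request = request else { return }
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
        try? handler.perform([request])
    }

    // Start the musical texture associated with the recognized object,
    // looping it until a different object is recognized.
    private func play(textureFor label: String) {
        guard label != currentLabel,
              let fileName = textureFiles[label],
              let url = Bundle.main.url(forResource: fileName, withExtension: "m4a") else { return }
        currentLabel = label
        player?.stop()
        player = try? AVAudioPlayer(contentsOf: url)
        player?.numberOfLoops = -1
        player?.play()
    }
}
```

In such a setup, each camera frame is fed to classify(pixelBuffer:), and a change in the recognized object swaps the looping texture, which matches the interaction described in the abstract.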

publication date

  • January 1, 2020