AI-terity 2.0: An Autonomous NIME Featuring GANSpaceSynth Deep Learning Model
In this paper we present recent developments in the AI-terity instrument. AI-terity is a deformable, non-rigid musical instrument that incorporates a particular artificial intelligence (AI) method for generating audio samples for real-time audio synthesis. As an improvement, we extended the control interface with additional sensor hardware. In addition, we implemented a new hybrid deep learning architecture, GANSpaceSynth, in which we apply the GANSpace method to the GANSynth model. Building on this model improvement, we developed new autonomous features for the instrument that aim to keep the musician in an active and uncertain state of exploration. Through these new features, the instrument enables more accurate control over the GAN latent space. Further, we investigate the current developments through a musical composition that idiomatically reflects the new autonomous features of the AI-terity instrument. We argue that present AI technology is suitable for enabling alternative autonomous features in the audio domain for the creative practices of musicians.
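The core idea behind GANSpaceSynth, as the abstract describes it, is applying the GANSpace method to a GAN model: GANSpace finds dominant directions in the latent space via PCA over sampled latent vectors, and the instrument then steers synthesis along those directions. The following is a minimal sketch of that PCA-and-steer mechanism only, not the actual GANSpaceSynth implementation; the toy latent dimension, sample count, and step size are illustrative assumptions, and a real system would feed the steered latent into the GANSynth generator.

```python
import numpy as np

# Illustrative sketch of the GANSpace mechanism used by GANSpaceSynth:
# find principal directions in a GAN's latent space, then move a latent
# point along one of them for more controllable exploration.
rng = np.random.default_rng(0)
latent_dim = 64  # toy size; the real model's latent dimension may differ

# 1. Sample a batch of latent vectors, as GANSpace does before its PCA step.
z = rng.standard_normal((2000, latent_dim))

# 2. PCA via SVD on the centered samples; rows of vt are principal directions.
z_centered = z - z.mean(axis=0)
_, _, vt = np.linalg.svd(z_centered, full_matrices=False)
components = vt  # components[k] is the k-th principal direction (unit norm)

# 3. Steering: offset a latent point along the first principal direction.
#    The scale 3.0 is an arbitrary control value, e.g. mapped from a sensor.
z0 = rng.standard_normal(latent_dim)
z_steered = z0 + 3.0 * components[0]
```

In the instrument's context, the steering scale would be driven by the deformable interface's sensor input, giving the musician a small set of meaningful axes instead of the full unstructured latent space.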
Title of host publication: Proceedings of the International Conference on New Interfaces for Musical Expression
Publisher: International Conference on New Interfaces for Musical Expression
Publication status: Published - 15 Jun 2021
MoE publication type: A4 Article in a conference publication
Event: International Conference on New Interfaces for Musical Expression - Shanghai, China
Duration: 15 Jun 2021 → 18 Jun 2021
- Artificial Intelligence (AI)
- new interfaces for musical expression
- Digital musical instruments
- Deep Learning