Authors

Mathew Vallejo

Files

Available for download on Monday, July 01, 2024

Full Text (38 KB)

Abstract

This research addresses the viability of triggering elements in the Max/MSP and Max for Live visual programming environments with atypical, gesture-based external control stimuli, including cameras and infrared sensors, in an effort to incorporate them into a real-time live audiovisual performance as a New Interface for Musical Expression (NIME). The work tests the usability of combinations of established hardware mapped to specific parameters within the software; the software's response to the variable nature of these converted inputs; the parameters required for reasonable usability; and the artistic benefits of their incorporation in a live performance. The research methodology incorporates live trials of the instrument by users of varying experience levels, followed by a standardized survey to gauge general usability and intuitiveness. The challenge of this research centers on the highly volatile nature of gesture-based control, the added complication of blending multiple gesture-controlled hardware interfaces, and the development of a software environment that can effectively capture input data and manipulate the visual outcome. The conclusions suggest strong interest among users in gesture-controlled visuals, owing to a heightened sense of connection with the performance process and visual outcome.
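The volatility of gesture-based input described above is commonly tamed by smoothing and range-mapping sensor values before they drive a parameter. The sketch below is a minimal illustration of that general technique in Python, not the project's actual Max/MSP patch; the sensor range, parameter range, and smoothing constant are all assumed for demonstration.

    # Hypothetical sketch (not the author's Max patch): exponential smoothing
    # plus range mapping, one common way to stabilize volatile gesture input
    # before it drives a visual parameter. All names and ranges are assumed.

    def smooth(previous: float, raw: float, alpha: float = 0.2) -> float:
        """Exponential moving average: smaller alpha = heavier smoothing."""
        return previous + alpha * (raw - previous)

    def scale(value: float, in_lo: float, in_hi: float,
              out_lo: float, out_hi: float) -> float:
        """Map a sensor reading into a parameter range, clamping overshoot."""
        t = (value - in_lo) / (in_hi - in_lo)
        t = max(0.0, min(1.0, t))
        return out_lo + t * (out_hi - out_lo)

    # Example: an infrared distance reading (assumed 0-1023) driving a
    # visualizer brightness parameter (assumed 0.0-1.0).
    state = 0.0
    for raw_reading in [512, 530, 940, 100, 505]:  # stand-in sensor values
        state = smooth(state, raw_reading)
        brightness = scale(state, 0, 1023, 0.0, 1.0)
        print(f"raw={raw_reading:4d}  smoothed={state:7.1f}  brightness={brightness:.2f}")

In a Max/MSP context the same idea would typically be patched with scaling and smoothing objects rather than written as code; the point here is only the data flow from noisy sensor reading to stable parameter value.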

Publication Date

7-1-2022

Campus

Valencia (Spain) Campus

Keywords

Research Project; Gesture Control; Creative Coding; Audiovisual Research; Max/MSP; Max for Live

Project Components

Project video (.mov), reflection paper (.pdf), audio tracks (ZIP file containing 3 .wav files), visualizer (.amxd).

Title

Infinite Wicked Desert
