Inclusive music activities for people with physical disabilities commonly emphasise facilitated processes, owing both to constrained gestural capabilities and to the simplicity of the available interfaces. Inclusive music processes employ consumer controllers, computer access tools and/or specialized digital musical instruments (DMIs). The first category reveals a design ethos identified by the authors as artefact multiplication – many sliders, buttons, dials and menu layers; the latter categories offer ergonomic accessibility through artefact magnification. We present a prototype DMI that eschews artefact multiplication in pursuit of enhanced real-time creative independence. We reconceptualise the universal click-drag interaction model via a single sensor type, which affords both binary and continuous performance control. Accessibility is optimized via a familiar interaction model and through customized ergonomics, but it is the mapping strategy that emphasizes transparency and sophistication in the hierarchical correspondences between the available gesture dimensions and expressive musical cues. Through a participatory and progressive methodology we identify an ostensibly simple targeting gesture rich in dynamic and reliable features: (1) contact location; (2) contact duration; (3) momentary force; (4) continuous force; and (5) dyad orientation. These features are mapped onto dynamic musical cues, most notably via new mappings for vibrato and arpeggio execution.
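The mapping strategy described above pairs binary cues (click-like contact events) with continuous cues (sustained force, dyad orientation) drawn from a single sensor type. A minimal sketch of such a hierarchical mapping might look as follows; note this is an illustration of the general technique, not the authors' implementation, and every threshold and scaling constant here is a hypothetical placeholder.

```python
# Illustrative feature-to-cue mapping: binary onset from momentary force,
# continuous vibrato depth from sustained force, arpeggio rate from dyad
# orientation. All constants are hypothetical, not taken from the paper.

NOTE_ON_THRESHOLD = 0.15   # normalised force (0..1) needed to trigger a note
MAX_VIBRATO_CENTS = 50.0   # vibrato depth at full sustained force
BASE_ARP_RATE = 2.0        # notes per second at 0 degrees dyad orientation
MAX_ARP_RATE = 8.0         # notes per second at 90 degrees dyad orientation


def clamp(x, lo=0.0, hi=1.0):
    """Keep a sensor reading inside its expected range."""
    return max(lo, min(hi, x))


def note_on(momentary_force):
    """Binary cue: a 'click' fires once momentary force crosses the threshold."""
    return momentary_force >= NOTE_ON_THRESHOLD


def vibrato_depth(continuous_force):
    """Continuous cue: map sustained force (0..1) to vibrato depth in cents."""
    return clamp(continuous_force) * MAX_VIBRATO_CENTS


def arpeggio_rate(dyad_angle_deg):
    """Map two-point (dyad) orientation, 0..90 degrees, to arpeggio speed."""
    t = clamp(dyad_angle_deg / 90.0)
    return BASE_ARP_RATE + t * (MAX_ARP_RATE - BASE_ARP_RATE)
```

In this sketch the same force sensor yields both a binary event (`note_on`) and a continuous control (`vibrato_depth`), mirroring the click-drag model's dual use of a single input channel.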
|Journal||Proceedings of the International Conference on NIME, Louisiana, USA. June 2015|
|Publication status||Published - 1 Jun 2015|
- bespoke design
- cerebral palsy
- customized mappings
- feature extraction