Abstract: This paper describes the design of the Electronic Sitar (ESitar) controller, a digitally modified version of the traditional 19-stringed, pumpkin-shelled North Indian instrument associated with Saraswati, the Hindu Goddess of Music. The ESitar uses sensor technology to extract gestural information from a performer, deducing musical information such as pitch, pluck timing, thumb pressure, and 3 axes of head tilt to trigger real-time sounds and graphics. It allows for a variety of traditional sitar techniques as well as new performance methods. Graphical feedback allows for artistic display and pedagogical feedback. The ESitar is built around a programmable Atmel microprocessor that outputs control messages via a standard MIDI jack.
Published: Mar 7, 2017
Keywords: Control Information; Head Tilt; Acoustic Sound; Pitch Detection; Loop Buffer
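The abstract sketches a sensor-to-MIDI pipeline: analog sensors on the instrument are sampled by an Atmel microcontroller, which encodes the readings as MIDI control messages. As a rough illustration of that path (not the actual ESitar firmware, which the abstract does not detail), the C sketch below reads a thumb-pressure sensor on one ADC channel of an ATmega16-class AVR and transmits the value as a MIDI Control Change message over the UART at the standard 31250 baud MIDI rate. The ADC channel, controller number, and register names are all assumptions made for illustration.

    /*
     * Hypothetical sketch of the sensor-to-MIDI path described in the
     * abstract: sample a thumb-pressure sensor and emit a MIDI Control
     * Change message. Register names follow an ATmega16-class AVR; the
     * real ESitar firmware, pin assignments, and CC numbers are not
     * specified in the abstract, so everything here is illustrative.
     */
    #include <avr/io.h>
    #include <stdint.h>

    #define F_CPU     16000000UL   /* assumed 16 MHz system clock */
    #define MIDI_BAUD 31250UL      /* standard MIDI baud rate */
    #define CC_THUMB  12           /* assumed controller number */

    static void uart_init(void) {
        uint16_t ubrr = F_CPU / 16 / MIDI_BAUD - 1;  /* = 31 at 16 MHz */
        UBRRH = (uint8_t)(ubrr >> 8);
        UBRRL = (uint8_t)ubrr;
        UCSRB = (1 << TXEN);                             /* TX only */
        UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0); /* 8N1 */
    }

    static void uart_send(uint8_t b) {
        while (!(UCSRA & (1 << UDRE)))
            ;                      /* wait until transmit buffer empty */
        UDR = b;
    }

    static uint8_t adc_read(uint8_t ch) {
        ADMUX  = (1 << REFS0) | (ch & 0x07);     /* AVcc ref, channel */
        ADCSRA = (1 << ADEN) | (1 << ADSC) | 7;  /* start, prescaler 128 */
        while (ADCSRA & (1 << ADSC))
            ;                      /* wait for conversion to finish */
        return (uint8_t)(ADC >> 3);   /* 10-bit -> 7-bit MIDI range */
    }

    int main(void) {
        uart_init();
        uint8_t last = 0xFF;          /* impossible value: force first send */
        for (;;) {
            uint8_t val = adc_read(0);        /* thumb-pressure sensor */
            if (val != last) {                /* send only on change */
                uart_send(0xB0);              /* Control Change, channel 1 */
                uart_send(CC_THUMB);
                uart_send(val);
                last = val;
            }
        }
    }

Sending only on change keeps the MIDI stream sparse; a full firmware would presumably also smooth or threshold the readings and multiplex the other sensors the abstract mentions (fret position for pitch, pluck triggers, accelerometer axes for head tilt) onto distinct controller numbers.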