- The paper introduces design criteria emphasizing minimal latency (a 10 ms upper bound, with roughly 7 ms achieved) and latency variation held within 1 ms to ensure heightened control intimacy.
- It demonstrates the integration of OSC and programmable connectivity processors to support continuous, real-time musical gestures in live settings.
- The study advocates innovative interface metaphors that transform traditional inputs into intuitive controllers, enhancing musical expressiveness.
Intimate Musical Control of Computers
Wessel and Wright's paper, "Problems and Prospects for Intimate Musical Control of Computers," presents a comprehensive examination of computer-based musical instruments, focusing on live performance applications. Their work is situated within the broader context of enhancing musician-computer interaction through innovative design principles and technologies.
Design Criteria and Control Intimacy
The authors emphasize a set of design criteria for musical instruments that offer initial ease of use alongside long-term potential for virtuosity. They identify minimal latency and low-variance latency as critical factors for achieving a high degree of control intimacy. The paper specifies an acceptable upper latency bound of 10 milliseconds, with their systems achieving roughly 7 milliseconds, and argues that latency variation must be kept within 1 millisecond to preserve the performer's expressive capabilities.
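These two criteria can be stated concretely as a simple check over measured latencies. The sketch below is illustrative only; the function name and the sample values are hypothetical, not data from the paper.

```python
def meets_criteria(latencies_ms, max_latency=10.0, max_jitter=1.0):
    """Return True if every sample is under the 10 ms latency bound and
    the spread between samples (jitter) stays within the 1 ms bound."""
    jitter = max(latencies_ms) - min(latencies_ms)
    return max(latencies_ms) <= max_latency and jitter <= max_jitter

# A system hovering around 7 ms with sub-millisecond spread passes:
print(meets_criteria([6.8, 7.1, 7.0, 6.9, 7.2]))  # True
# One with occasional 12 ms spikes fails both bounds:
print(meets_criteria([6.8, 12.0, 7.0]))  # False
```

The point of the second, jitter bound is that a performer can adapt to a consistent delay but not to an unpredictable one.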
Technological Framework
Wessel and Wright introduce various technologies supporting their vision, such as customized gestural controllers and a programmable connectivity processor. A notable contribution is the Open Sound Control (OSC) protocol, a message-based protocol optimized for modern networking that has been integrated into platforms such as Max/MSP and SuperCollider. OSC's symbolic, hierarchical address naming and its support for time-tagged synchronization provide robust solutions for real-time musical interaction.
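An OSC message pairs a symbolic, URL-style address with typed arguments in a simple binary layout. As a minimal sketch of that layout (the address `/synth/freq` is a hypothetical name; a real application would typically use a library such as python-osc), a message carrying 32-bit floats can be encoded by hand:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """NUL-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    msg = osc_pad(address.encode("ascii"))          # symbolic address
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))  # type tags
    for value in floats:
        msg += struct.pack(">f", value)             # big-endian float32
    return msg

packet = osc_message("/synth/freq", 440.0)
print(len(packet))  # 20 bytes: padded address + type tags + one float
```

The symbolic address is what distinguishes OSC from MIDI's fixed, numeric channel/controller scheme: the namespace is open-ended and self-describing.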
Continuous vs. Discrete Control
The paper critiques the traditional reliance on MIDI's discrete event protocol, advocating for systems that accommodate continuous gestural data, more aligned with natural musical expression. The authors propose their connectivity processor as a solution for synchronizing continuous gestures with audio streams, ensuring a cohesive, real-time performance environment.
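The resolution gap the authors critique can be made concrete with a small hypothetical sketch: quantizing a smoothly varying gesture to MIDI's 7-bit controller range collapses most of its detail.

```python
def to_midi_cc(value: float) -> int:
    """Quantize a normalized gesture value in [0, 1] to a 7-bit MIDI CC."""
    return round(value * 127)

def from_midi_cc(cc: int) -> float:
    """Map a 7-bit MIDI CC value back to the normalized range."""
    return cc / 127

# A slow, continuous gesture sampled at fine resolution:
gesture = [i / 1000 for i in range(1001)]
roundtrip = [from_midi_cc(to_midi_cc(v)) for v in gesture]

# 1001 distinct input values collapse to at most 128 distinct outputs.
print(len(set(gesture)), len(set(roundtrip)))  # 1001 128
```

A continuous protocol that transmits full-resolution gesture streams alongside audio, as the connectivity processor does, avoids this staircase effect entirely.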
Innovative Interfaces and Control Metaphors
The authors explore various metaphors for musical control to facilitate intuitive interaction. They detail how digitizing tablets and other gestural interfaces can be repurposed for complex musical mappings beyond simple parameter control. Concepts such as "drag and drop," "scrubbing," and "dipping" serve as foundational metaphors that inform the development of their software, aligning musical processes with human cognitive models.
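As an illustration of the "scrubbing" metaphor, a minimal sketch (all names hypothetical, not from the paper): the tablet's x-position, normalized to [0, 1], is mapped directly onto a playback position in a sample buffer, so dragging the pen drags the playhead through the sound.

```python
def scrub_position(x_norm: float, buffer_len: int) -> int:
    """Map a normalized tablet x-position in [0, 1] to a sample index."""
    x_norm = min(max(x_norm, 0.0), 1.0)  # clamp out-of-range input
    return int(x_norm * (buffer_len - 1))

buffer_len = 44100  # one second of audio at 44.1 kHz
print(scrub_position(0.0, buffer_len))   # 0
print(scrub_position(0.5, buffer_len))   # 22049
print(scrub_position(1.0, buffer_len))   # 44099
```

What makes this a metaphor rather than a mere parameter assignment is the cognitive fit: the performer thinks in terms of rubbing through the sound, not of setting a numeric value.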
Implications and Future Directions
The paper outlines significant implications for both theoretical and practical developments in human-computer interaction within music. By advocating for more nuanced interfaces that mirror the adaptability and precision of traditional instruments, the authors pave the way for future research to refine these technologies. Their work also pushes the boundaries of computer music performance, encouraging further exploration of network-based collaboration and innovative control paradigms.
Conclusion
Wessel and Wright's research offers significant insights into developing more intuitive and expressive computer-based musical instruments. By addressing latency issues, advocating for continuous control, and proposing innovative metaphors, they contribute substantially to the evolution of live performance technologies. Their work continues to serve as a reference point for exploring the intersection of technology and musical expression in the ongoing advancement of musical interfaces.