Quantum-Musical Expression

Updated 24 September 2025
  • Quantum-Musical Expression is an interdisciplinary field applying quantum mechanics to music creation, performance, and representation through phenomena like superposition and entanglement.
  • It leverages quantum formalisms to map musical states onto Hilbert spaces, enabling probabilistic sound outcomes and novel instrument designs such as the Quantum Guitar.
  • Advanced quantum algorithms facilitate algorithmic composition and reactive music systems, opening new avenues for sound synthesis and real-time performance.

Quantum-Musical Expression refers to the application of quantum mechanical principles, formalism, and technology to the representation, creation, performance, and perception of music. This emerging field spans theoretical frameworks, practical computational methods, live instrument design, systems for algorithmic composition, and artistic performances that are fundamentally shaped by quantum-specific features such as superposition, entanglement, measurement uncertainty, and quantum randomness.

1. Quantum Representation of Musical States and Superposition

Quantum-musical frameworks often begin by quantizing musical entities—tones, motifs, or entire scores—using the mathematical structures of quantum theory. For example, a musical octave with tones {c, d, e, f, g, a, b} is mapped to an orthonormal basis in a Hilbert space (e.g., ℂ⁷), so that any musical state is a normalized superposition:

|\Psi\rangle = a_c |V_c\rangle + a_d |V_d\rangle + \ldots + a_b |V_b\rangle, \quad \sum_k |a_k|^2 = 1

This model generalizes to both individual notes (modeled as qubits or qudits) and polyphonic structures (multi-qubit systems). Coherent superposition allows a tone to be in multiple classical states "at once" until a measurement (e.g., listening) collapses it to a specific outcome (Putz et al., 2015). For example, the superposition

|\psi\rangle = \frac{1}{\sqrt{2}}(|0\rangle - |1\rangle)

produces a quantum “50:50” tone whose perception upon repeated listening events is fundamentally probabilistic (Putz et al., 2021).
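The collapse statistics above can be sketched classically. The following is a minimal NumPy simulation (not a quantum SDK) of repeated "listening events" on the 50:50 tone, applying the Born rule to the amplitudes given above:

```python
import numpy as np

# Amplitudes of the "50:50" tone |psi> = (|0> - |1>) / sqrt(2)
psi = np.array([1.0, -1.0]) / np.sqrt(2)

# Born rule: the probability of each outcome is |amplitude|^2,
# so the relative sign of the amplitudes does not affect statistics.
probs = np.abs(psi) ** 2          # [0.5, 0.5]

# Each listening event is modeled as a projective measurement that
# collapses the state to |0> or |1> with these probabilities.
rng = np.random.default_rng(0)
outcomes = rng.choice([0, 1], size=10_000, p=probs)

print(probs)            # [0.5 0.5]
print(outcomes.mean())  # close to 0.5 over many listens
```

Note that a classical sampler reproduces only the outcome statistics; the phase information (the minus sign) would matter under interference, which is exactly what distinguishes the quantum tone from a coin flip.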

2. Entanglement and Correlation in Quantum Music

Quantum entanglement, a hallmark of non-classicality, is exploited to encode nonseparable correlations between musical elements. For instance, two tones e and a can form a Bell-like quantum state:

|\Psi^{\pm}\rangle = \frac{1}{\sqrt{2}}(|0_e\rangle |1_a\rangle \pm |1_e\rangle |0_a\rangle)

Here, no definite state can be ascribed to either note individually; only joint measurement has physical meaning (Putz et al., 2015, Putz et al., 2021). This structure enables "entangled ensembles," where multiple instruments or players become dynamically linked via shared quantum states (Carney, 22 Sep 2025). The "Sound of Entanglement" (Rodríguez et al., 10 Sep 2025) demonstrates the use of live measurements of polarization-entangled photons in a Bell test to control motif selection and sound processing, ensuring strong non-local correlations in both sound and visuals.
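The joint-versus-local character of such a state can be illustrated with a small NumPy sketch (again a classical simulation of the measurement statistics, not entangled hardware), sampling the |Ψ⁻⟩ state for the tone pair (e, a):

```python
import numpy as np

# |Psi^-> = (|0_e 1_a> - |1_e 0_a>) / sqrt(2), amplitudes listed in
# the computational basis order |00>, |01>, |10>, |11>
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
probs = np.abs(psi) ** 2   # joint outcome distribution

rng = np.random.default_rng(1)
joint = rng.choice(4, size=10_000, p=probs)
e_bits, a_bits = joint // 2, joint % 2   # split index into the two tones

# The two tones are perfectly anti-correlated: measuring e fixes a.
assert np.all(e_bits != a_bits)

# Yet each tone alone looks maximally random - no definite local state.
print(e_bits.mean())   # close to 0.5
```

This mirrors the point in the text: the marginal statistics of either note are featureless noise, while the joint measurement carries the musical structure.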

3. Quantum Devices and Quantum-Inspired Musical Instruments

The physical realization of quantum-musical expression is progressing via two main routes:

  • Quantum instrumentation and live interfaces: The Quantum Guitar (Coecke, 3 Sep 2025) maps string microstates to qubits, enabling real-time quantum state manipulation via MIDI foot controllers and quantum operations (e.g., R_x(θ) rotations). Synthesis is achieved with platforms such as Moth's Actias, allowing continuous blending between classical and quantum sound outputs. Superconducting qubit circuits have also been used as musical synthesizers, processing GHz quantum signals into audio through analog and digital sonification (Topel et al., 2022).
  • Embedded quantum simulations and ensemble entanglement: In (Carney, 22 Sep 2025), tonal centrality is captured from live MIDI and fed into a quantum simulation (e.g., on a Raspberry Pi Pico); the resulting quantum state (e.g., |Φ⁺⟩ or |Ψ⁺⟩) is mapped back to MIDI parameters for each player's instrument, producing dynamically correlated or anti-correlated musical behavior.
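A foot-pedal sweep through an R_x(θ) rotation, as on the Quantum Guitar, can be sketched as follows. This is a hedged NumPy illustration: the matrix is the standard single-qubit R_x gate, but the "blend" mapping from measurement probability to a classical/quantum mix amount is a hypothetical example, not the instrument's documented implementation:

```python
import numpy as np

def rx(theta):
    """Standard single-qubit R_x(theta) rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

# A pedal sweep: theta goes from 0 to pi, acting on the initial state |0>
ket0 = np.array([1.0, 0.0], dtype=complex)
for theta in np.linspace(0, np.pi, 5):
    state = rx(theta) @ ket0
    p1 = abs(state[1]) ** 2   # P(|1>) = sin^2(theta/2)
    # Hypothetical mapping: treat p1 as the classical/quantum blend amount
    print(f"theta={theta:.2f}  blend={p1:.3f}")
```

At θ = 0 the output is purely one sound source (blend 0), at θ = π purely the other (blend 1), with a smooth sinusoidal crossfade between, which is what makes a continuous rotation parameter musically usable as an expression control.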

4. Algorithmic and Computational Approaches

Quantum computing provides a platform for algorithmic composition and adaptive sequencing beyond classical stochastic or rule-based methods:

  • Quantum annealing and QUBO modeling: Musical composition can be formulated as a binary optimization problem; variables encode the presence of notes, chords, or rhythmic elements, with constraints and preferences mapped to a cost (energy) function. Embedding such QUBO formulations into quantum annealers (e.g., D-Wave) allows for the exploration of combinatorially rich musical spaces, minimizing a function like:

Q(n) = \sum_i a_i n_i + \sum_{i,j} b_{ij} n_i n_j

which is then mapped to an Ising Hamiltonian for quantum processing (Arya et al., 2022, Itaboraí et al., 11 Sep 2024).
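A toy version of such a QUBO can be written down directly. The coefficients below are purely illustrative (not taken from the cited works), and for a handful of variables the minimum is found by classical enumeration, which stands in for the annealer's sampling of low-energy states:

```python
import itertools
import numpy as np

# Toy QUBO for a 4-note choice: n_i = 1 means "include note i".
# Linear terms a_i reward individual notes; quadratic terms b_ij
# penalize dissonant pairs (all values here are illustrative only).
a = np.array([-1.0, -0.5, -0.8, -0.2])
b = np.zeros((4, 4))
b[0, 1] = 2.0   # notes 0 and 1 clash
b[2, 3] = 1.5   # notes 2 and 3 clash

def cost(n):
    n = np.array(n)
    return a @ n + n @ b @ n

# A quantum annealer samples low-energy assignments; for 4 binary
# variables we can simply enumerate all 2^4 of them classically.
best = min(itertools.product([0, 1], repeat=4), key=cost)
print(best, cost(best))   # (1, 0, 1, 0) with cost -1.8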

  • Variational quantum algorithms for sonification and composition: Platforms such as the Variational Quantum Harmonizer (VQH) (Itaboraí et al., 11 Sep 2024) enable real-time mapping of VQE (and related) outputs into musical parameters, where, for example, the marginal probabilities of qubits in state |1⟩ determine the amplitudes, frequencies, or filter settings of oscillators (see Table 1). Adiabatic progression and the iterative outputs of the VQA thus drive evolving musical textures:

| Quantum Output | Sonic Parameter | Example Mapping |
|---|---|---|
| Marginal probability c_n(t) | Oscillator amplitude | A_n ∼ c_n(t) |
| Energy expectation E_0(t) | Filter or envelope | e.g., filter gain modulated by E_0(t) |
| Iteration index | Temporal evolution | Control points for live-coding timing |
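The first row of the table, marginal probabilities driving oscillator amplitudes, can be sketched in a few lines. Here a random normalized statevector stands in for real optimizer output (an assumption; a VQE iteration would supply this vector in practice), and the bit convention for qubit indexing is chosen for illustration:

```python
import numpy as np

# A random normalized 3-qubit statevector stands in for one VQE
# iteration's output (sketch only; a real VQH run supplies this).
rng = np.random.default_rng(2)
vec = rng.normal(size=8) + 1j * rng.normal(size=8)
vec /= np.linalg.norm(vec)
probs = np.abs(vec) ** 2

def marginal_one(probs, n):
    """Marginal probability that qubit n is |1>: sum the probabilities
    of all basis states whose n-th bit is set."""
    return sum(p for i, p in enumerate(probs) if (i >> n) & 1)

# Each marginal drives one oscillator's amplitude, A_n ~ c_n(t).
amps = [marginal_one(probs, n) for n in range(3)]
print(amps)
```

As the optimizer iterates, these marginals drift toward the ground-state configuration, so the three oscillator amplitudes trace the adiabatic progression described above.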

5. Perceptual and Artistic Implications

Quantum-musical expression fundamentally alters the reproducibility and perceptual variability of artistic experience:

  • Quantum parallelism and indeterminacy: Each listener’s measurement collapses the musical state differently, creating aleatoric experiences surpassing prior indeterminate techniques (Putz et al., 2015, Putz et al., 2021). Reception is inherently probabilistic; the same score “contains” a distribution of performances.
  • Gestaltic and ambiguous representations: The representation of musical themes as quantum data sets, including positive centroids and mixed states, models the inherent ambiguity and multi-layered perception of musical ideas. Quantum fidelity functions allow recognition and classification that respect this ambiguity (Chiara et al., 2022).

6. Sound Processing, Sonification, and Novel Methodologies

Quantum-theoretic models have influenced both analysis and synthesis of sound:

  • Quantum Vocal Theory of Sounds (QVTS): Models sounds as Bloch sphere superpositions of “vocal primitives” (phonation, turbulence, myoelastic vibrations), enabling systematic analysis, decomposition, and creative sound manipulation (Mannone et al., 2021).
  • Sonification of quantum scientific data: Physical quantum processes (e.g., Rabi oscillations, non-classical Wigner functions) are directly mapped to frequency, amplitude, timbre, or spatial parameters, resulting in dynamic soundscapes that reflect genuine quantum dynamics, randomness, and non-classicality (Yamada et al., 2023).
  • Reactive systems and beat generation: Quantum machine learning techniques and encoding strategies have been employed to generate interactive beat patterns using measured quantum noise to introduce “soft rules” and responsive, improvisatory structures (Oshiro, 2022).
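A direct sonification of the kind described for Rabi oscillations can be sketched as follows: the excited-state population p(t) = sin²(Ωt/2) is used as the amplitude envelope of an audible carrier tone. The specific frequencies and the envelope-to-amplitude mapping are illustrative choices, not parameters from the cited work:

```python
import numpy as np

# Rabi oscillation: excited-state population p(t) = sin^2(Omega * t / 2).
# Map p(t) onto the amplitude envelope of a tone (illustrative mapping).
sample_rate = 8000           # Hz
omega_rabi = 2 * np.pi * 4   # 4 Hz Rabi frequency
t = np.arange(0, 2.0, 1 / sample_rate)

envelope = np.sin(omega_rabi * t / 2) ** 2
carrier = np.sin(2 * np.pi * 440 * t)   # audible 440 Hz tone
audio = envelope * carrier              # samples bounded in [-1, 1]

print(audio.shape)
```

Writing `audio` to a WAV file (e.g., via the standard-library `wave` module) yields a tone that pulses at the Rabi rate, so the listener literally hears the population oscillating between the two levels.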

7. Future Directions and Interdisciplinary Impact

Quantum-musical expression establishes new paradigms for collaboration and research:

  • Hybrid and networked instruments: Integration of quantum processing into traditional instrument interfaces (Quantum Guitar, embedded simulation on microcontrollers) allows for new forms of expressive control and audience engagement (Coecke, 3 Sep 2025, Carney, 22 Sep 2025).
  • Interactive compositions and pedagogical tools: Real-time sonification of optimization, entanglement-driven reactive systems, and sonified quantum simulations enable educational platforms where listeners and performers develop intuition for abstract quantum phenomena (Itaboraí et al., 11 Sep 2024).
  • Artistic exploration of non-locality and contextuality: The unique ability of quantum processes to enforce non-classical correlations, unpredictability, and context-dependent outcomes expands artistic, philosophical, and scientific horizons, setting the stage for the development of “entangled ensembles” and quantum aleatoric music (Rodríguez et al., 10 Sep 2025, Carney, 22 Sep 2025).

In summary, quantum-musical expression synthesizes quantum physics, advanced computation, cognitive modeling, and artistic practice. It redefines the compositional and performance ecology by introducing features—superposition, entanglement, contextual measurement, stochasticity—that are impossible to realize within classical frameworks. As quantum technology matures, the field is poised for deepened integration into both the technical development of music technology and the conceptual fabric of musical art.
