Long Baseline Quantum Telescopy

Updated 16 August 2025
  • Long Baseline Quantum Telescopy is a quantum-enhanced method that uses entanglement and repeaters to overcome photon loss and phase noise in astronomical interferometry.
  • It employs advanced measurement protocols such as spatial mode sorting and post-selection to extract visibility and achieve microarcsecond angular resolution.
  • The approach enables scalable multi-telescope networks and superresolution imaging, while addressing challenges in entangled photon sources, quantum memories, and repeater bandwidth.

Long Baseline Quantum Telescopy (LBQT) is a quantum-enhanced approach to astronomical interferometry that leverages quantum entanglement, quantum communication, and quantum measurement protocols to reach and surpass the resolution limits of classical long-baseline telescopes. By employing quantum repeaters, distributed entanglement, quantum memories, and advanced receiver architectures, LBQT aims to circumvent the constraints imposed by optical loss, phase noise, and signal degradation over large baselines, offering routes to microarcsecond and sub-microarcsecond angular resolution at optical and near-IR wavelengths.

1. Fundamental Concepts and Quantum State Description

Traditional optical interferometry measures the spatial coherence of starlight by physically recombining light collected at separated telescopes. The quantum state of a single astronomical photon detected at two telescopes (denoted L and R) is

\ket{0}_L\ket{1}_R + e^{i\phi}\ket{1}_L\ket{0}_R,

where $\phi$ encodes the phase difference induced by the baseline and the source geometry. The classical interference pattern is extracted by combining the beams and measuring as a function of tunable delay.
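As a minimal numerical sketch of this measurement (assuming an ideal 50:50 recombiner, a single detected photon, and a tunable delay phase; a visibility parameter below 1 models partial coherence), the fringe scan and its visibility can be computed directly:

```python
import numpy as np

# Sketch only: the state |0,1> + e^{i*phi}|1,0> interfered on a balanced
# beam splitter yields output-port probabilities (1 +/- V*cos(phi + delta))/2,
# where delta is a tunable delay phase and V is the fringe visibility.
def port_probabilities(phi, delta, V=1.0):
    p_plus = 0.5 * (1.0 + V * np.cos(phi + delta))
    return p_plus, 1.0 - p_plus

phi = 0.3                                   # example source/baseline phase
deltas = np.linspace(0, 2 * np.pi, 200)     # delay scan
p_plus, p_minus = port_probabilities(phi, deltas)

# Fringe visibility recovered from the scan: (max - min) / (max + min)
vis = (p_plus.max() - p_plus.min()) / (p_plus.max() + p_plus.min())
```

For the ideal pure state the recovered visibility is 1; loss and phase noise over the baseline suppress it, which is exactly the degradation LBQT is designed to avoid.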

The sensitivity and resolution are fundamentally limited when the baseline grows: photon loss, uncontrolled phase drifts, and the technical difficulty of maintaining phase-stable optical links cause signal degradation that worsens rapidly with increasing distance. In the LBQT paradigm, quantum repeaters and distributed entangled states (such as Bell pairs or multi-site “W states”) are introduced to act as quantum channels, enabling remote, high-fidelity sharing of quantum information without requiring physical transmission of photons over the entire baseline (Gottesman et al., 2011).

Quantum repeaters partition the channel into short segments, allowing for entanglement distillation and swapping protocols to purify shared states over long distances. This decouples the baseline from the loss-induced limits prevalent in direct detection schemes.
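The benefit of segmentation can be illustrated with back-of-envelope arithmetic (the 0.2 dB/km fiber loss, 600 km baseline, and 8 segments below are illustrative assumptions, not figures from the cited work; ideal swapping and memories are assumed):

```python
import math

# Illustrative only: expected heralding attempts for direct transmission
# over the full baseline vs. a segmented repeater chain.
def transmissivity(length_km, loss_db_per_km=0.2):
    # Fractional photon survival over a fiber link (assumed 0.2 dB/km loss).
    return 10 ** (-loss_db_per_km * length_km / 10)

baseline_km = 600.0
p_direct = transmissivity(baseline_km)        # survival over the full baseline

m = 8                                         # number of repeater segments
p_segment = transmissivity(baseline_km / m)   # survival over one segment

attempts_direct = 1 / p_direct                # expected attempts, direct scheme
attempts_segmented = m / p_segment            # crude estimate, ignores multiplexing
```

Direct transmission over 600 km costs roughly 10^12 attempts per success, while the segmented chain needs only hundreds; this is the sense in which repeaters decouple the baseline from loss-induced limits.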

2. Entangled-State Interferometry and Measurement Protocols

In an entanglement-assisted interferometer, a resource state (e.g., $\ket{0}_L\ket{1}_R + e^{i\delta}\ket{1}_L\ket{0}_R$, where $\delta$ can be tuned via delay) is generated and distributed to the telescope sites (potentially using repeaters). The astronomical photon is then interfered locally at each site with the corresponding half of the entangled state. The physical signal is never required to traverse the full baseline—only local interference and measurement are needed.

After beam splitting and post-selection (retaining detection events where one photon is found at each site), the probability of correlated outcomes is determined by

P_{\text{corr}} = \frac{1 + \mathrm{Re}(e^{-i\delta})}{2},

while anticorrelation probability is

P_{\text{anti}} = \frac{1 - \mathrm{Re}(e^{-i\delta})}{2}.

By scanning the auxiliary phase $\delta$, one can recover the source's phase information, emulating the role of direct detection interferometry but over arbitrary distances and with potentially superior fidelity (Gottesman et al., 2011).
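A Monte Carlo sketch of these post-selected statistics (assuming unit visibility and an ideal entangled resource; shot count and seed are arbitrary) reproduces the correlated-outcome probability above:

```python
import numpy as np

rng = np.random.default_rng(7)

# Each post-selected event is "correlated" with probability
# P_corr = (1 + cos(delta))/2 and "anticorrelated" otherwise.
def simulate_coincidences(delta, shots=20000):
    p_corr = 0.5 * (1 + np.cos(delta))
    outcomes = rng.random(shots) < p_corr
    return outcomes.mean()                    # empirical estimate of P_corr

deltas = np.array([0.0, np.pi / 2, np.pi])
estimates = np.array([simulate_coincidences(d) for d in deltas])
# Expected values from the formula: 1.0, 0.5, 0.0
```

Scanning $\delta$ and fitting the fringe in these counts is how the complex visibility is recovered in practice.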

Extension to $n$ telescopes is achieved using a $W$ state: $|W\rangle = e^{i\delta_1}|1,0,\ldots,0\rangle + e^{i\delta_2}|0,1,\ldots,0\rangle + \ldots + e^{i\delta_n}|0,\ldots,0,1\rangle$, reducing the post-selection loss from 50% per pair (Bell state) to $1/n$ per site.

3. Mathematical Framework and Performance

The detection statistics and estimation precision can be formalized through the density matrix formalism for the photon state, e.g.

\rho = \frac{1}{2} \begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & 1 & * & 0 \\ 0 & * & 1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix},

where off-diagonal terms (“*”) encode coherence (i.e., visibility). For incoherent or extended sources, off-diagonal elements are reduced by spatial averaging.
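A minimal sketch of this density matrix (in the assumed $\{|00\rangle, |01\rangle, |10\rangle, |11\rangle\}$ basis, with the off-diagonal coherence parameterized by a complex visibility $V$):

```python
import numpy as np

# Single-photon two-site density matrix; the off-diagonal entries carry
# the complex coherence (visibility) V, |V| <= 1.
def photon_state(V):
    rho = np.zeros((4, 4), dtype=complex)
    rho[1, 1] = rho[2, 2] = 0.5        # photon at one site or the other
    rho[2, 1] = 0.5 * V                # coherence terms ("*" in the matrix above)
    rho[1, 2] = 0.5 * np.conj(V)
    return rho

rho = photon_state(0.8 * np.exp(0.4j))  # example: partially coherent source
visibility = 2 * rho[2, 1]              # recover the complex visibility
```

Spatial averaging over an extended source shrinks $|V|$ toward zero while leaving the diagonal untouched, which is why visibility, not raw intensity, is the quantity interferometry must extract.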

The measurement protocols—beam splitters, variable delays, parity checks—ultimately aim to extract the "visibility" (complex coherence) as efficiently as possible. The quantum Fisher information (QFI) formalism is used to assess fundamental sensitivity limits, with post-selected quantum measurements designed to saturate the QFI for parameter estimation tasks such as source localization and separation (Sajjad et al., 2023, Padilla et al., 4 Apr 2025).
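The Fisher-information bookkeeping can be illustrated for the binary correlated/anticorrelated statistics above (a sketch under the idealized pure-state model; the unit-QFI normalization per detected photon is an assumption of that model, not a quoted result):

```python
import numpy as np

# Classical Fisher information of the binary measurement with
# P_corr = (1 + V*cos(delta))/2. For a pure path state (V = 1) this is
# constant and equal to the quantum Fisher information (1 per photon),
# i.e. the post-selected measurement saturates the QFI.
def fisher_information(delta, V=1.0):
    p = 0.5 * (1 + V * np.cos(delta))
    dp = -0.5 * V * np.sin(delta)
    return dp**2 * (1 / p + 1 / (1 - p))

deltas = np.linspace(0.1, np.pi - 0.1, 50)
F_pure = fisher_information(deltas, V=1.0)    # ~1 everywhere
F_mixed = fisher_information(deltas, V=0.7)   # strictly below the pure-state value
```

Reduced visibility lowers the extractable information at every phase setting, quantifying how decoherence over the baseline degrades estimation precision.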

For multi-telescope arrays, use of spatial-mode sorting (e.g., SPADE) at each site further enables the decomposition of the incoming field into orthogonal modes, which can be combined with quantum memory encoding and entanglement-assisted nonlocal operations to achieve the quantum limit for, e.g., two-point resolution and superresolution imaging (Padilla et al., 24 Jun 2024, Padilla et al., 4 Apr 2025).
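A small numerical check of the SPADE idea (assuming a Gaussian point-spread function of width sigma; the Poissonian mode weights $p_q = e^{-a^2} a^{2q}/q!$ with $a = d/(4\sigma)$ are a standard result for this Gaussian model, not taken from the cited papers):

```python
import math
import numpy as np
from numpy.polynomial.hermite import hermval

sigma, d = 1.0, 0.6               # PSF width and source displacement (arbitrary units)
x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]

def hg_mode(q):
    # q-th Hermite-Gauss mode matched to a Gaussian PSF of width sigma
    norm = (2 * np.pi * sigma**2) ** -0.25 / math.sqrt(2.0**q * math.factorial(q))
    coeffs = np.zeros(q + 1)
    coeffs[q] = 1.0
    return norm * hermval(x / (sigma * np.sqrt(2)), coeffs) * np.exp(-x**2 / (4 * sigma**2))

# Amplitude PSF of a point source displaced by d/2
psf = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-((x - d / 2) ** 2) / (4 * sigma**2))

a = d / (4 * sigma)
weights = [(np.sum(hg_mode(q) * psf) * dx) ** 2 for q in range(3)]
expected = [math.exp(-a**2) * a ** (2 * q) / math.factorial(q) for q in range(3)]
```

The probability of finding a photon in the higher-order modes grows with the displacement, which is the signal a SPADE receiver exploits for sub-Rayleigh separation estimation.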

4. Advantages and Experimental/Technological Challenges

Advantages

  • Baseline Extension: By distributing entanglement instead of the starlight itself, LBQT bypasses transmission loss and phase errors, enabling practical interferometry over baselines ranging from continental to interplanetary distances. This allows pursuit of angular resolutions orders of magnitude beyond classical capability (Gottesman et al., 2011).
  • Error Correction and Noise Management: Quantum repeater protocols support entanglement purification, correcting phase/damping noise and preserving visibility on timescales comparable to atmospheric fluctuations ($\sim$10 ms).
  • Resource Scalability: Moving from direct detection to entanglement-distributed architectures allows $n$-telescope arrays to scale loss as $1/n$ rather than $1/2$ per pair, improving multi-site efficiency.
  • Generalizability: The ability to emulate arbitrary multimode interferometric measurements via quantum circuits and nonlocal gates (Padilla et al., 4 Apr 2025, Padilla et al., 24 Jun 2024) removes constraints inherent to fixed physical layouts.
  • Feasibility of Superresolution and Sub-Rayleigh Imaging: By designing optimal receivers based on QFI, spatial mode sorting, and entanglement distribution, LBQT achieves superresolution—even in cases where the Rayleigh criterion is violated classically (Sajjad et al., 2023, Padilla et al., 4 Apr 2025).

Experimental and Technological Challenges

  • Photon Source and Entanglement Distribution: LBQT requires on-demand, high-brightness, true single-photon sources, or high-fidelity spontaneous parametric down-conversion with quantum network compatibility. Weak coherent states decrease interference visibility.
  • Quantum Memory and Synchronization: Quantum memory lifetimes must cover gate times and classical communication round-trip; timing jitter must be kept to tens of picoseconds or better. Bandwidth narrowness (e.g., $\Delta\lambda \approx 0.025$ nm at $\lambda = 800$ nm for $\sim$30 ps detectors) is challenging.
  • Quantum Repeater Protocol Bandwidth: Present repeaters are not yet high bandwidth, presenting a bottleneck for real-time interferometry.
  • Post-selection and Loss: The protocol incurs signal loss due to the need for post-selection (e.g., to remove events where both photons hit the same site), although this can be partially mitigated by using more complex states ($W$ states or multiport schemes).
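The bandwidth constraint quoted above follows from a transform-limit estimate (a sketch: the time-bandwidth factor k is shape-dependent and assumed here, so the arithmetic reproduces the quoted 0.025 nm figure only to within a factor of a few):

```python
# Rough arithmetic: detector timing resolution tau limits the usable
# optical bandwidth via dnu ~ k/tau, then dlambda = lambda^2 * dnu / c.
c = 3.0e8          # speed of light, m/s
lam = 800e-9       # wavelength, m
tau = 30e-12       # detector timing resolution, s

dlams = []
for k in (0.44, 1.0):                     # Gaussian time-bandwidth product vs. crude 1/tau
    dnu = k / tau                         # bandwidth, Hz
    dlams.append(lam**2 * dnu / c * 1e9)  # bandwidth in nm
```

Both assumed factors land in the few-hundredths-of-a-nanometer range, confirming that picosecond-class detectors force extremely narrow optical filtering.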

5. Feasibility, Theoretical and Experimental Results

Proof-of-principle protocols, such as entanglement distillation using linear optics and the DLCZ architecture for heralded entanglement between atomic ensembles, have demonstrated viability of key LBQT components (Gottesman et al., 2011). Quantum memory and light-matter interfaces continue to improve; practical binary encoding of arrival times (requiring only $\lceil \log_2(N+1) \rceil$ ancilla pairs for $N$ time bins) has been refined to minimize resource demands (Czupryniak et al., 2021). Quantum error-corrected memory and repeater networks remain under active development, with promising progress in both systems and integrated photonics platforms.
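The resource count for the binary arrival-time encoding is simple to tabulate:

```python
import math

# Ancilla Bell pairs needed to binary-encode the arrival time of a photon
# into one of N time bins: ceil(log2(N + 1)).
def ancilla_pairs(n_bins):
    return math.ceil(math.log2(n_bins + 1))

pairs_needed = {n: ancilla_pairs(n) for n in (1, 7, 8, 1000)}
```

The logarithmic scaling is the point: even a thousand time bins costs only ten ancilla pairs per detected photon.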

Detector advances (e.g., in fast photon-counting and time-tagging) increasingly match the requirements, but high-fidelity, on-demand entangled state delivery over long distances is the primary scaling challenge.

6. Applications, Implications, and Future Directions

Quantum repeaters enable linking of optical telescopes at arbitrary separations, thus directly translating baseline increases into enhanced spatial resolution. LBQT opens the path to:

  • Microarcsecond or superior imaging: Resolving stellar surfaces, exoplanet atmospheres, or black hole environments beyond current means.
  • Flexible, array-based architectures: $n$-site arrays with scalable loss properties and theoretically quantum-optimal performance for multiparameter imaging (Padilla et al., 24 Jun 2024, Padilla et al., 4 Apr 2025).
  • Exploitation of quantum networks: Enabling planetary-scale or deep-space arrays using quantum communication infrastructure, with applications in astrophysics, navigation, and fundamental physics tests.
  • Foundational studies: Examining quantum-classical boundaries in astronomical environments, and utilizing protocols that allow quantum nonlocality or relativistic quantum information tests at unprecedented scales (Lin et al., 2020, Mohageg et al., 2021).
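The microarcsecond figures above follow from the basic diffraction scaling (a back-of-envelope estimate using $\theta \approx \lambda/B$; the 800 nm wavelength and 1 microarcsecond target are illustrative choices):

```python
import math

# Baseline B needed for a target angular resolution theta at wavelength lambda,
# from the diffraction scaling theta ~ lambda / B.
lam = 800e-9                                         # wavelength, m
theta_rad = 1.0e-6 * math.pi / (180 * 3600)          # 1 microarcsecond in radians
baseline_km = lam / theta_rad / 1e3                  # required baseline, km
```

A baseline of order a hundred kilometers already reaches the microarcsecond regime at optical wavelengths, well beyond what phase-stable direct optical links can span but squarely within reach of a repeater-connected network.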

Significant challenges remain in terms of entanglement distribution rate, memory/logic gate error rates, and system integration, but the theoretical and proof-of-principle results suggest that, with technological advances in quantum networking and photonics, long baseline quantum telescopy could become a realistic component of next-generation high-resolution astronomical observatories.