Harnessing high-dimensional hyperentanglement through a biphoton frequency comb (1506.04231v1)

Published 13 Jun 2015 in physics.optics and quant-ph

Abstract: Quantum entanglement is a fundamental resource for secure information processing and communications, where hyperentanglement and high-dimensional entanglement have each been proposed separately as routes toward high data capacity and error resilience. The continuous-variable nature of energy-time entanglement makes it an ideal candidate for efficient high-dimensional coding with minimal limitations. Here we demonstrate the first simultaneous high-dimensional hyperentanglement, using a biphoton frequency comb to harness the full potential of both the energy and time domains. The long-postulated Hong-Ou-Mandel quantum revival is exhibited, with up to 19 time bins and 96.5% visibility. We further witness the high-dimensional energy-time entanglement through Franson revivals, observed periodically at integer time bins with 97.8% visibility. This qudit state simultaneously violates the generalized Bell inequality by up to 10.95 standard deviations while exhibiting recurrent Clauser-Horne-Shimony-Holt S-parameters of up to 2.76. Our biphoton frequency comb provides a platform for photon-efficient quantum communications toward the ultimate channel capacity through energy-time-polarization high-dimensional encoding.
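
As an illustrative aside, the reported Franson visibility and CHSH value are consistent with the standard relation S = 2*sqrt(2)*V between two-photon interference visibility V and the CHSH parameter S. The short Python sketch below (an illustration under that assumption, not code or data from the paper) estimates the S-parameter implied by the quoted 97.8% visibility and compares it against the classical bound of 2 and the Tsirelson bound of 2*sqrt(2).

import math

# Franson interference visibility reported in the abstract (97.8%).
visibility = 0.978

# For Franson-type two-photon interference with ideal analyzer settings,
# the CHSH parameter is commonly estimated as S = 2*sqrt(2)*V (assumption).
S = 2 * math.sqrt(2) * visibility

classical_bound = 2.0                # local hidden-variable limit
tsirelson_bound = 2 * math.sqrt(2)   # quantum-mechanical maximum

print(f"Estimated CHSH S-parameter: {S:.2f}")   # ~2.77, consistent with the reported 2.76
print(f"Violates classical bound (S > 2)? {S > classical_bound}")
print(f"Fraction of Tsirelson bound: {S / tsirelson_bound:.3f}")
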

Citations (48)