
A compact entanglement distillery using realistic quantum memories (1308.0842v3)

Published 4 Aug 2013 in quant-ph

Abstract: We adopt the beam splitter model for losses to analyse the performance of a recent compact continuous-variable entanglement distillation protocol [Phys. Rev. Lett. 108, 060502 (2012)] implemented using realistic quantum memories. We show that the decoherence undergone by a two-mode squeezed state while stored in a quantum memory can strongly modify the results of the preparatory step of the protocol. We find that the well-known method for locally increasing entanglement, phonon subtraction, may not result in entanglement gain when losses are taken into account. Thus, we investigate the critical number $m_c$ of phonon subtraction attempts from the matter modes of the quantum memory. If the initial state is not de-Gaussified within $m_c$ attempts, the protocol should be restarted to obtain any entanglement increase. Moreover, the condition $m_c>1$ implies an additional constraint on the subtraction beam splitter interaction transmissivity, viz. it should be about 50% for a wide range of protocol parameters. Additionally, we consider the average entanglement rate, which takes into account both the unavoidable probabilistic nature of the protocol and its possible failure as a result of a large number of unsuccessful subtraction attempts. We find that a higher value of the average entanglement can be achieved by increasing the subtraction beam splitter interaction transmissivity. We conclude that the compact distillation protocol, with the practical constraints coming from realistic quantum memories, allows a feasible experimental realization with existing technologies.
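To make the beam splitter loss model mentioned in the abstract concrete, the sketch below builds the covariance matrix of a two-mode squeezed vacuum, applies symmetric loss to emulate storage in a lossy memory, and evaluates the logarithmic negativity. This is a standard textbook calculation, not the paper's analysis: the squeezing parameter and transmissivities are illustrative values, and the subtraction step of the protocol itself is not modelled.

```python
import numpy as np

def tmsv_cov(r):
    """Covariance matrix of a two-mode squeezed vacuum (vacuum variance = 1)."""
    c, s = np.cosh(2 * r), np.sinh(2 * r)
    Z = np.diag([1.0, -1.0])
    I2 = np.eye(2)
    return np.block([[c * I2, s * Z],
                     [s * Z, c * I2]])

def apply_loss(V, eta_a, eta_b):
    """Beam splitter loss: each mode is mixed with vacuum at transmissivity eta."""
    T = np.diag([np.sqrt(eta_a)] * 2 + [np.sqrt(eta_b)] * 2)
    return T @ V @ T + (np.eye(4) - T @ T)

def log_negativity(V):
    """Logarithmic negativity of a two-mode Gaussian state from its covariance matrix."""
    A, B, C = V[:2, :2], V[2:, 2:], V[:2, 2:]
    delta = np.linalg.det(A) + np.linalg.det(B) - 2 * np.linalg.det(C)
    nu_minus = np.sqrt((delta - np.sqrt(delta**2 - 4 * np.linalg.det(V))) / 2)
    return max(0.0, -np.log2(nu_minus))

# Illustrative numbers (not taken from the paper): ~3 dB of squeezing,
# 20% loss per mode during storage.
r = 0.35
V0 = tmsv_cov(r)
V1 = apply_loss(V0, eta_a=0.8, eta_b=0.8)
print(log_negativity(V0), log_negativity(V1))
```

Running the example shows the logarithmic negativity dropping under loss, which is the qualitative effect the paper quantifies: once memory decoherence is included, the local subtraction step is no longer guaranteed to increase entanglement, motivating the critical number $m_c$ of attempts discussed above.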
