Transforming the Lindblad Equation into a System of Linear Equations: Performance Optimization and Parallelization of an Algorithm

Published 3 Dec 2019 in physics.comp-ph, cs.DC, and quant-ph | (1912.01491v3)

Abstract: With their constantly increasing peak performance and memory capacity, modern supercomputers offer new perspectives on numerical studies of open many-body quantum systems. These systems are often modeled with Markovian quantum master equations describing the evolution of the system density operators. In this paper we address master equations of the Lindblad form, which are a popular theoretical tool in quantum optics, cavity quantum electrodynamics, and optomechanics. By using the generalized Gell-Mann matrices as a basis, any Lindblad equation can be transformed into a system of ordinary differential equations with real coefficients. This allows us to use standard high-performance parallel algorithms to integrate the equations and thus to emulate open quantum dynamics in a computationally efficient way. Recently we presented an implementation of the transform with computational complexity scaling as $O(N^5 \log N)$ for dense Lindbladians and $O(N^3 \log N)$ for sparse ones. However, infeasible memory costs remain a serious obstacle on the way to large models. Here we present a parallel cluster-based implementation of the algorithm and demonstrate that it allows us to integrate a sparse Lindbladian model of dimension $N=2000$ and a dense random Lindbladian model of dimension $N=200$ by using $25$ nodes with $64$ GB RAM per node.
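The key step the abstract describes is expanding the density operator in the basis of generalized Gell-Mann matrices, which turns a Hermitian operator into a vector of real coefficients. A minimal NumPy sketch of that basis change follows; it is an illustration of the standard construction, not the paper's cluster implementation, and the dimension `N = 3` is chosen only for the example:

```python
import numpy as np

def gell_mann_basis(N):
    """Generalized Gell-Mann matrices for dimension N (traceless, Hermitian),
    plus the scaled identity: an orthonormal basis under <A, B> = Tr(A^† B)."""
    basis = []
    # Symmetric and antisymmetric off-diagonal generators
    for j in range(N):
        for k in range(j + 1, N):
            S = np.zeros((N, N), dtype=complex)
            S[j, k] = S[k, j] = 1.0
            basis.append(S / np.sqrt(2))
            A = np.zeros((N, N), dtype=complex)
            A[j, k] = -1j
            A[k, j] = 1j
            basis.append(A / np.sqrt(2))
    # Diagonal generators
    for l in range(1, N):
        D = np.zeros((N, N), dtype=complex)
        for j in range(l):
            D[j, j] = 1.0
        D[l, l] = -l
        basis.append(D / np.sqrt(l * (l + 1)))
    # The scaled identity completes the basis of Hermitian N x N matrices
    basis.append(np.eye(N, dtype=complex) / np.sqrt(N))
    return basis

N = 3
basis = gell_mann_basis(N)          # N^2 basis matrices in total

# A random density matrix: Hermitian, positive semidefinite, unit trace
X = np.random.randn(N, N) + 1j * np.random.randn(N, N)
rho = X @ X.conj().T
rho /= np.trace(rho).real

# Expansion coefficients Tr(B^† rho) are purely real for Hermitian rho
coeffs = np.array([np.trace(b.conj().T @ rho) for b in basis])
assert np.allclose(coeffs.imag, 0.0)

# The real coefficient vector reconstructs rho exactly
rho_rec = sum(c.real * b for c, b in zip(coeffs, basis))
assert np.allclose(rho_rec, rho)
```

Because the coefficient vector is real, the Lindblad evolution of `rho` becomes a linear system of ordinary differential equations with real coefficients in this representation, which is what makes standard parallel ODE integrators applicable.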
