Transforming the Lindblad Equation into a System of Linear Equations: Performance Optimization and Parallelization of an Algorithm (1912.01491v3)

Published 3 Dec 2019 in physics.comp-ph, cs.DC, and quant-ph

Abstract: With their constantly increasing peak performance and memory capacity, modern supercomputers offer new perspectives on numerical studies of open many-body quantum systems. These systems are often modeled with Markovian quantum master equations that describe the evolution of the system density operators. In this paper we address master equations of the Lindblad form, which are a popular theoretical tool in quantum optics, cavity quantum electrodynamics, and optomechanics. By using the generalized Gell-Mann matrices as a basis, any Lindblad equation can be transformed into a system of ordinary differential equations with real coefficients. This allows us to use standard high-performance parallel algorithms to integrate the equations and thus emulate open quantum dynamics in a computationally efficient way. Recently we presented an implementation of the transform whose computational complexity scales as $O(N^5 \log N)$ for dense Lindbladians and $O(N^3 \log N)$ for sparse ones. However, infeasible memory costs remain a serious obstacle on the way to large models. Here we present a parallel cluster-based implementation of the algorithm and demonstrate that it allows us to integrate a sparse Lindbladian model of dimension $N = 2000$ and a dense random Lindbladian model of dimension $N = 200$ using $25$ nodes with $64$ GB of RAM per node.
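To make the basis-expansion idea concrete, here is a minimal single-node Python sketch (not the authors' optimized parallel implementation). It expands the density matrix in generalized Gell-Mann matrices, turns the Lindblad equation into a real linear ODE system dv/dt = A v + b, and integrates it with a standard solver. The Hamiltonian, jump operator, and the two-level example model are illustrative assumptions.

```python
# Minimal sketch of the Gell-Mann-basis transform of a Lindblad equation.
# Assumptions: the toy two-level model (H, L_ops) is invented for illustration.
import numpy as np
from scipy.integrate import solve_ivp

def gell_mann_basis(N):
    """Generalized Gell-Mann matrices for dimension N: N^2 - 1 traceless
    Hermitian matrices normalized so that Tr(G_i G_j) = 2 * delta_ij."""
    basis = []
    # symmetric and antisymmetric off-diagonal generators
    for j in range(N):
        for k in range(j + 1, N):
            S = np.zeros((N, N), dtype=complex)
            S[j, k] = S[k, j] = 1.0
            basis.append(S)
            A = np.zeros((N, N), dtype=complex)
            A[j, k] = -1j
            A[k, j] = 1j
            basis.append(A)
    # diagonal generators
    for l in range(1, N):
        D = np.zeros((N, N), dtype=complex)
        D[:l, :l] = np.eye(l)
        D[l, l] = -l
        basis.append(np.sqrt(2.0 / (l * (l + 1))) * D)
    return basis

def lindbladian(rho, H, L_ops):
    """Right-hand side of the Lindblad equation applied to a matrix rho."""
    out = -1j * (H @ rho - rho @ H)
    for L in L_ops:
        LdL = L.conj().T @ L
        out += L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return out

def real_linear_system(H, L_ops):
    """Build the real matrix A and vector b such that dv/dt = A v + b,
    where rho = I/N + sum_i v_i G_i and v_i = Tr(G_i rho) / 2."""
    N = H.shape[0]
    G = gell_mann_basis(N)
    M = len(G)                      # M = N^2 - 1 real coefficients
    A = np.empty((M, M))
    b = np.empty(M)
    for j in range(M):
        LGj = lindbladian(G[j], H, L_ops)
        for i in range(M):
            # Traces of products of Hermitian matrices are real.
            A[i, j] = np.trace(G[i] @ LGj).real / 2.0
    Lid = lindbladian(np.eye(N, dtype=complex) / N, H, L_ops)
    for i in range(M):
        b[i] = np.trace(G[i] @ Lid).real / 2.0
    return A, b, G

# Illustrative two-level example: Rabi drive with spontaneous decay.
N = 2
H = np.array([[0.0, 0.5], [0.5, 1.0]], dtype=complex)
L_ops = [np.sqrt(0.1) * np.array([[0.0, 1.0], [0.0, 0.0]], dtype=complex)]

A, b, G = real_linear_system(H, L_ops)

# Initial state |0><0| mapped to its real coefficient vector.
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)
v0 = np.array([np.trace(Gi @ rho0).real / 2.0 for Gi in G])

# Any standard (and, at scale, parallel) ODE integrator can be used here.
sol = solve_ivp(lambda t, v: A @ v + b, (0.0, 20.0), v0, rtol=1e-8)

# Reconstruct the final density matrix from the real coefficients.
rho_T = np.eye(N, dtype=complex) / N + sum(v * Gi for v, Gi in zip(sol.y[:, -1], G))
print("Tr(rho) =", np.trace(rho_T).real, " excited-state population =", rho_T[1, 1].real)
```

Because the Lindbladian preserves Hermiticity and the Gell-Mann matrices are Hermitian, all entries of A and b are real, which is what lets standard real-arithmetic parallel solvers be applied; the dense construction above costs O(N^6) and is only meant to illustrate the transform, not the paper's faster O(N^5 log N) / O(N^3 log N) scheme.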

Citations (2)
