Dictionary-based sparse block encoding with low subnormalization and circuit depth (2405.18007v4)
Abstract: We propose a novel protocol for sparse block encoding. When our sparse block-encoding protocol is compiled into the $\{{\rm U(2)}, {\rm CNOT}\}$ gate set, we show that it queries a $2^n\times2^n$-dimensional sparse matrix with $s$ non-zero elements in a circuit depth of $\mathcal{O}(\log(ns))$ with $\mathcal{O}(n^2s)$ ancillary qubits. This represents an exponential improvement over existing methods. Furthermore, our protocol achieves a subnormalization of $\sum_{l=0}^{s_0}\vert A_l\vert$, where the proposed dictionary structure contains $s_0$ data items ($s_0\leq s$) and $A_l$ denotes the value of the $l$-th data item. The dictionary data structure establishes a unified framework for various sparse block-encoding protocols, with implementations connected to linear combinations of unitaries (LCU) and the sparse access intermediate model (SAIM). To demonstrate the practical utility of our approach, we provide several applications, including Laplacian matrices in graph problems and discrete differential operators.
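To make the dictionary idea concrete, below is a minimal Python sketch (an illustrative assumption, not the paper's implementation) that groups the nonzero entries of a matrix by distinct value, i.e., the "data items" of the dictionary, and compares the resulting subnormalization $\sum_{l=0}^{s_0}\vert A_l\vert$ against the naive entrywise sum. The example matrix is the 1D discrete Laplacian, one of the applications named in the abstract; the helper names `build_dictionary` and `subnormalization` are hypothetical.

```python
import numpy as np
from collections import defaultdict

def build_dictionary(A, tol=1e-12):
    """Group the nonzero entries of A by distinct value.

    Returns a dict mapping each distinct value A_l (a "data item")
    to the list of (row, col) positions where it occurs.
    (Illustrative sketch; the paper's dictionary structure may differ.)
    """
    dictionary = defaultdict(list)
    rows, cols = np.nonzero(np.abs(A) > tol)
    for i, j in zip(rows, cols):
        dictionary[A[i, j]].append((i, j))
    return dict(dictionary)

def subnormalization(dictionary):
    """Subnormalization sum_{l} |A_l| over the s_0 data items."""
    return sum(abs(value) for value in dictionary)

# Example: 1D discrete Laplacian on 8 grid points.
n_points = 8
A = (np.diag(2.0 * np.ones(n_points))
     - np.diag(np.ones(n_points - 1), 1)
     - np.diag(np.ones(n_points - 1), -1))

d = build_dictionary(A)
s = np.count_nonzero(A)            # total nonzero entries: 22
s0 = len(d)                        # distinct data items (s_0 <= s): 2
print(f"s = {s}, s_0 = {s0}")
print(f"dictionary subnormalization = {subnormalization(d)}")  # |2| + |-1| = 3
print(f"naive entrywise sum        = {np.abs(A).sum()}")        # 30
```

In this toy case only two data items (values $2$ and $-1$) cover all $22$ nonzero entries, so the dictionary-based subnormalization is $3$ rather than the entrywise value of $30$, illustrating why grouping repeated values can reduce the subnormalization.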