
Improving Pyramid Vector Quantizer with power projection (1705.05285v1)

Published 3 May 2017 in math.OC, cs.IT, and math.IT

Abstract: Pyramid Vector Quantizer (PVQ) is a promising technique for multimedia data compression, already used in the Opus audio codec and considered for the AV1 video codec. It quantizes vectors from the Euclidean unit sphere by first projecting them onto the $L^1$ norm unit sphere, then quantizing and encoding there. This paper shows that the standard radial projection is suboptimal and proposes tuning its deformations with a parameterized power projection, $x \to x^p / \|x^p\|_1$, where the optimized power $p$ is applied coordinate-wise, typically yielding a $\geq 0.5\,\mathrm{dB}$ improvement over the radial projection.
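The two-stage pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the codec's actual implementation: `power_projection` applies the paper's coordinate-wise power map (with $p = 1$ recovering the standard radial projection), while `pvq_quantize` is a simplified greedy pulse allocation standing in for a real PVQ search; the value $p = 0.7$ and pulse count $K = 16$ are arbitrary example choices.

```python
import numpy as np

def power_projection(x, p=0.7):
    """Map a Euclidean unit vector onto the L1 unit sphere via the
    power projection x -> x^p / ||x^p||_1, applying |x_i|^p
    coordinate-wise while preserving signs. p = 1 gives the
    standard radial projection."""
    y = np.sign(x) * np.abs(x) ** p
    return y / np.abs(y).sum()

def pvq_quantize(y, K):
    """Toy PVQ step (illustrative only): scale the L1-sphere point
    to K pulses and round so that the absolute values sum to K."""
    k = np.floor(np.abs(y) * K).astype(int)
    # Hand out the leftover pulses to the largest fractional parts.
    remainder = np.abs(y) * K - k
    for i in np.argsort(-remainder)[: K - k.sum()]:
        k[i] += 1
    return np.sign(y).astype(int) * k

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
x /= np.linalg.norm(x)            # point on the Euclidean unit sphere
y = power_projection(x, p=0.7)    # point on the L1 unit sphere
k = pvq_quantize(y, K=16)         # integer vector with sum(|k_i|) = 16
```

Tuning `p` reshapes how quantization cells are distributed over the sphere, which is where the reported $\geq 0.5\,\mathrm{dB}$ gain comes from.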

Citations (4)
