Probabilistic Spherical Discriminant Analysis: An Alternative to PLDA for length-normalized embeddings (2203.14893v1)

Published 28 Mar 2022 in stat.ML and cs.LG

Abstract: In speaker recognition, where speech segments are mapped to embeddings on the unit hypersphere, two scoring backends are commonly used, namely cosine scoring or PLDA. Both have advantages and disadvantages, depending on the context. Cosine scoring follows naturally from the spherical geometry, but for PLDA the blessing is mixed -- length normalization Gaussianizes the between-speaker distribution, but violates the assumption of a speaker-independent within-speaker distribution. We propose PSDA, an analogue to PLDA that uses von Mises-Fisher distributions on the hypersphere for both within- and between-class distributions. We show how the self-conjugacy of this distribution gives closed-form likelihood-ratio scores, making it a drop-in replacement for PLDA at scoring time. All kinds of trials can be scored, including single-enroll and multi-enroll verification, as well as more complex likelihood ratios that could be used in clustering and diarization. Learning is done via an EM algorithm with closed-form updates. We explain the model and present some first experiments.
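To make the closed-form scoring concrete, below is a minimal sketch (not the authors' implementation) of a VMF-based likelihood ratio for a single-enroll verification trial. It assumes the generative model the abstract implies: a speaker factor drawn from a between-speaker VMF with mean `mu` and concentration `b`, and embeddings drawn around that factor with within-speaker concentration `w`; all variable names here are illustrative.

```python
import numpy as np
from scipy.special import ive


def log_vmf_norm(kappa, d):
    """Log normalizer log C_d(kappa) of a von Mises-Fisher density on the
    (d-1)-sphere: C_d(k) = k^(d/2-1) / ((2*pi)^(d/2) * I_{d/2-1}(k))."""
    nu = d / 2.0 - 1.0
    # ive is the exponentially scaled Bessel function, I_nu(k) = ive(nu, k) * e^k,
    # which keeps the computation stable for large concentrations.
    log_bessel = np.log(ive(nu, kappa)) + kappa
    return nu * np.log(kappa) - (d / 2.0) * np.log(2.0 * np.pi) - log_bessel


def psda_llr(x1, x2, mu, b, w):
    """Log likelihood-ratio that unit vectors x1 and x2 share a speaker.

    Assumed model: speaker factor z ~ VMF(mu, b); each embedding x ~ VMF(z, w).
    Self-conjugacy makes every marginal a ratio of VMF normalizers.
    """
    d = mu.shape[0]

    def resultant(*xs):
        # Norm of the combined natural parameter b*mu + w*sum(x_i).
        return np.linalg.norm(b * mu + w * sum(xs))

    # log p(x1, x2) - log p(x1) - log p(x2); the C_d(w) factors cancel.
    return (log_vmf_norm(resultant(x1), d) + log_vmf_norm(resultant(x2), d)
            - log_vmf_norm(b, d) - log_vmf_norm(resultant(x1, x2), d))
```

Because each marginal likelihood is a ratio of VMF normalizers, multi-enroll trials and the clustering-style ratios mentioned in the abstract follow the same pattern, just with more embeddings summed inside `resultant`.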

Authors (7)
  1. Albert Swart (8 papers)
  2. Anna Silnova (22 papers)
  3. Themos Stafylakis (35 papers)
  4. Niko Brümmer (18 papers)
  5. Ladislav Mošner (14 papers)
  6. Oldřich Plchot (16 papers)
  7. Lukáš Burget (45 papers)
Citations (4)
