
Counting of surfaces and computational complexity in column sums of symmetric group character tables (2406.17613v1)

Published 25 Jun 2024 in hep-th, math.CO, math.GR, and math.RT

Abstract: The character table of the symmetric group $S_n$, of permutations of $n$ objects, is of fundamental interest in theoretical physics, combinatorics, and computational complexity theory. We investigate the implications of an identity, which has a geometrical interpretation in combinatorial topological field theories, relating the column sum of normalised central characters of $S_n$ to a sum of structure constants of multiplication in the centre of the group algebra of $S_n$. The identity leads to the proof that a combinatorial computation of the column sum belongs to complexity class $\#P$. The sum of structure constants has an interpretation in terms of the counting of branched covers of the sphere. This allows the identification of a tractable subset of the structure constants related to genus zero covers. We use this subset to prove that the column sum for a conjugacy class labelled by partition $\lambda$ is non-vanishing if and only if the permutations in the conjugacy class are even. This leads to the result that determining whether or not the column sum vanishes is in complexity class $P$. The subset gives a positive lower bound on the column sum for any even $\lambda$. For any disjoint decomposition of $\lambda$ as $\lambda_1 \sqcup \lambda_2$ we obtain a lower bound for the column sum at $\lambda$ in terms of the product of the column sums for $\lambda_1$ and $\lambda_2$. This can be expressed as a super-additivity property for the logarithms of column sums of normalized characters.
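The column-sum quantity in the abstract can be explored numerically for small $n$. The sketch below (not the paper's algorithm) computes characters of $S_n$ via the Murnaghan–Nakayama rule and sums the normalized characters $\chi_R(\mu)/\chi_R(1)$ down the column of the character table indexed by a cycle type $\mu$. Note this is one common normalization; the paper's "normalised central characters" may carry an additional class-size factor $|C_\mu|$, which is positive and so does not affect whether the column sum vanishes, so the even/odd vanishing pattern stated in the abstract can still be checked.

```python
from fractions import Fraction

def partitions(n, max_part=None):
    """Yield integer partitions of n as weakly decreasing tuples."""
    if max_part is None or max_part > n:
        max_part = n
    if n == 0:
        yield ()
        return
    for k in range(max_part, 0, -1):
        for rest in partitions(n - k, k):
            yield (k,) + rest

def mn_character(lam, mu):
    """Character chi_lam(mu) of S_n by the Murnaghan-Nakayama rule.

    Border strips are removed in the beta-number (first-column hook
    length) encoding: removing a strip of size k replaces a beta value
    b by b - k, with sign (-1)^h where h counts the betas it jumps over.
    """
    if not mu:
        return 1 if not lam else 0
    k, rest = mu[0], mu[1:]
    l = len(lam)
    betas = [lam[i] + (l - 1 - i) for i in range(l)]  # strictly decreasing
    bset = set(betas)
    total = 0
    for b in betas:
        if b - k >= 0 and (b - k) not in bset:
            h = sum(1 for c in betas if b - k < c < b)
            new = sorted((bset - {b}) | {b - k}, reverse=True)
            m = len(new)
            # convert the new beta set back to a partition, dropping zeros
            newlam = tuple(v for v in (new[i] - (m - 1 - i) for i in range(m))
                           if v > 0)
            total += (-1) ** h * mn_character(newlam, rest)
    return total

def column_sum(mu):
    """Sum over irreps R of chi_R(mu) / chi_R(identity), exactly."""
    n = sum(mu)
    ident = (1,) * n
    return sum(Fraction(mn_character(lam, mu), mn_character(lam, ident))
               for lam in partitions(n))

# A permutation of cycle type mu is even iff n - len(mu) is even;
# the abstract asserts the column sum is non-zero exactly on even classes.
for n in range(2, 6):
    for mu in partitions(n):
        parity = "even" if (n - len(mu)) % 2 == 0 else "odd"
        print(n, mu, parity, column_sum(mu))
```

For $S_3$ this gives column sums $3$, $0$, and $3/2$ for the classes $(1,1,1)$, $(2,1)$, and $(3)$ respectively, matching the stated pattern: the sum vanishes precisely on the odd class $(2,1)$.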
