Tournaments, Johnson Graphs, and NC-Teaching (2205.02792v1)

Published 5 May 2022 in math.CO and cs.DM

Abstract: Quite recently, a teaching model called "No-Clash Teaching" (or simply "NC-Teaching") was suggested that is provably optimal in the following strong sense. First, it satisfies Goldman and Mathias' collusion-freeness condition. Second, the NC-teaching dimension (= NCTD) is smaller than or equal to the teaching dimension with respect to any other collusion-free teaching model. It has also been shown that any concept class which has NC-teaching dimension $d$ and is defined over a domain of size $n$ can have at most $2^d \binom{n}{d}$ concepts. The main results in this paper are as follows. First, we characterize the maximum concept classes of NC-teaching dimension $1$ as classes which are induced by tournaments (= complete oriented graphs) in a very natural way. Second, we show that there exists a family $(\mathcal{C}_n)_{n\ge1}$ of concept classes such that the well-known recursive teaching dimension (= RTD) of $\mathcal{C}_n$ grows logarithmically in $n = |\mathcal{C}_n|$ while, for every $n\ge1$, the NC-teaching dimension of $\mathcal{C}_n$ equals $1$. Since the recursive teaching dimension of a finite concept class $\mathcal{C}$ is generally bounded by $\log|\mathcal{C}|$, the family $(\mathcal{C}_n)_{n\ge1}$ separates RTD from NCTD in the most striking way. The proof of existence of the family $(\mathcal{C}_n)_{n\ge1}$ makes use of the probabilistic method and random tournaments. Third, we improve the aforementioned upper bound $2^d\binom{n}{d}$ by a factor of order $\sqrt{d}$. The verification of the superior bound makes use of Johnson graphs and maximum subgraphs not containing large narrow cliques.
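
To make the tournament connection concrete, here is a minimal sketch in Python (all names are illustrative). It builds a random tournament on $n$ vertices, derives $2n$ concepts from it, and checks the no-clash condition: for any two distinct concepts, at most one of them may be consistent with the union of their teaching sets. The pairing of each vertex with its closed out-neighbourhood and its in-neighbourhood is an assumption consistent with the abstract, not necessarily the paper's exact construction.

```python
from itertools import combinations
import random

def consistent(concept, examples):
    """A concept (set of positive points) is consistent with a set of
    labelled examples iff it agrees with every label."""
    return all((x in concept) == label for x, label in examples)

def non_clashing(teacher):
    """No-clash condition: no two distinct concepts may both be
    consistent with the union of their teaching sets."""
    for c1, c2 in combinations(teacher, 2):
        joint = teacher[c1] | teacher[c2]
        if consistent(c1, joint) and consistent(c2, joint):
            return False
    return True

def tournament_class(n, seed=0):
    """Build a random tournament on {0, ..., n-1} and map each vertex v
    to two concepts, each taught by a single labelled example:
      {v} | N_out(v)  taught by (v, True)
      N_in(v)         taught by (v, False)
    (This pairing is an illustrative guess consistent with the abstract.)"""
    rng = random.Random(seed)
    out = {v: set() for v in range(n)}
    for u, v in combinations(range(n), 2):   # orient each edge exactly once
        if rng.random() < 0.5:
            out[u].add(v)
        else:
            out[v].add(u)
    teacher = {}
    for v in range(n):
        pos = frozenset(out[v] | {v})                        # closed out-neighbourhood
        neg = frozenset(u for u in range(n) if v in out[u])  # in-neighbourhood
        teacher[pos] = frozenset({(v, True)})
        teacher[neg] = frozenset({(v, False)})
    return teacher

teacher = tournament_class(7)
assert len(teacher) == 2 * 7   # 2n distinct concepts over a domain of size n
assert non_clashing(teacher)   # singleton teaching sets suffice: NCTD = 1
print("2n concepts taught without clashes by singleton teaching sets")
```

For $d = 1$ the size bound $2^d\binom{n}{d}$ evaluates to $2n$, and the sketch indeed produces $2n$ distinct concepts taught by singleton teaching sets, which is why such a tournament-induced class is maximum for NC-teaching dimension $1$. Verifying the RTD growth claimed in the second result would require a separate recursive-teaching routine and is not attempted here.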

Citations (2)
