Noisy Bayesian Active Learning (1312.2315v1)

Published 9 Dec 2013 in cs.IT, math.IT, math.OC, math.ST, and stat.TH

Abstract: We consider the problem of noisy Bayesian active learning, where we are given a finite set of functions $\mathcal{H}$, a sample space $\mathcal{X}$, and a label set $\mathcal{L}$. One of the functions in $\mathcal{H}$ assigns labels to samples in $\mathcal{X}$. The goal is to identify the function that generates the labels, even though the result of a label query on a sample is corrupted by independent noise. More precisely, the objective is to declare one of the functions in $\mathcal{H}$ as the true label-generating function with high confidence using as few label queries as possible, by selecting the queries adaptively and strategically. Previous work in Bayesian active learning considers Generalized Binary Search and its variants for the noisy case, and analyzes the number of queries required by these sampling strategies. In this paper, we show that these schemes are, in general, suboptimal. Instead, we propose and analyze an alternative strategy for sample collection. Our sampling strategy is motivated by a connection between Bayesian active learning and active hypothesis testing, and is based on querying the label of the sample that maximizes the Extrinsic Jensen-Shannon divergence at each step. We provide upper and lower bounds on the performance of this sampling strategy, and show that these bounds improve on previously known ones.
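
To make the query rule in the abstract concrete, here is a minimal sketch. The standard Extrinsic Jensen-Shannon divergence of a query under belief $\rho$, with per-hypothesis observation distributions $P_i$, is $EJS(\rho) = \sum_i \rho_i \, D\big(P_i \,\|\, \sum_{j \neq i} \tfrac{\rho_j}{1-\rho_i} P_j\big)$. The code assumes a finite hypothesis set, a finite label alphabet, and known noisy observation models obs[x][i, y] = P(label y | hypothesis i, sample x); all names are illustrative rather than the paper's, and the greedy loop is a simplified rendering of the strategy, not the authors' implementation.

```python
import numpy as np

def ejs_divergence(posterior, obs_probs):
    """EJS divergence of one candidate query.

    posterior : shape (H,), current belief over the hypotheses.
    obs_probs : shape (H, L), row i is the noisy label distribution
                P_i(. | x) induced by hypothesis i at this sample.
    """
    ejs = 0.0
    for i, rho_i in enumerate(posterior):
        if rho_i <= 0.0 or rho_i >= 1.0:
            continue  # degenerate belief: term is zero or undefined
        # "Extrinsic" mixture: the other hypotheses' label
        # distributions, reweighted by the belief restricted to j != i.
        mix = (posterior @ obs_probs - rho_i * obs_probs[i]) / (1.0 - rho_i)
        support = obs_probs[i] > 0
        kl = np.sum(obs_probs[i][support] *
                    np.log(obs_probs[i][support] / mix[support]))
        ejs += rho_i * kl
    return ejs

def select_sample(posterior, obs_by_sample):
    """Greedy rule: query the sample whose label maximizes EJS."""
    scores = [ejs_divergence(posterior, p) for p in obs_by_sample]
    return int(np.argmax(scores))

def posterior_update(posterior, obs_probs, label):
    """Bayes update of the belief after observing a noisy label."""
    post = posterior * obs_probs[:, label]
    return post / post.sum()

# Toy instance: 3 distinct hypotheses on 4 samples, binary labels
# flipped with probability eps by the observation noise.
truth = np.array([[0, 0, 1, 1],
                  [0, 1, 0, 1],
                  [1, 1, 1, 0]])   # truth[i, x] = h_i(x)
H, X = truth.shape
eps = 0.1
obs = np.empty((X, H, 2))
for x in range(X):
    for i in range(H):
        obs[x, i] = [1 - eps, eps] if truth[i, x] == 0 else [eps, 1 - eps]

rng = np.random.default_rng(0)
belief = np.full(H, 1.0 / H)
for _ in range(10):                # query budget for the demo
    x = select_sample(belief, obs)
    # Hypothesis 0 plays the role of the true labeling function here.
    noisy = int(rng.random() < (eps if truth[0, x] == 0 else 1 - eps))
    belief = posterior_update(belief, obs[x], noisy)
print(belief)  # posterior mass should move toward hypothesis 0
```

In a full run one would repeat select_sample and posterior_update until some hypothesis's posterior crosses a confidence threshold and then declare it; the paper's analysis concerns how many such queries this EJS-greedy rule needs compared with Generalized Binary Search and its noisy variants.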

Citations (22)
