Teaching and learning in uncertainty (1901.07063v3)

Published 21 Jan 2019 in cs.IT, cs.SI, math.IT, and math.PR

Abstract: We investigate a simple model for social learning with two agents: a teacher and a student. The teacher's goal is to teach the student the state of the world; however, the teacher himself is not certain about the state of the world and needs to simultaneously learn this parameter and teach it to the student. We model the teacher's and student's uncertainties via noisy transmission channels, and employ two simple decoding strategies for the student. We focus on two teaching strategies: a "low-effort" strategy of simply forwarding information, and a "high-effort" strategy of communicating the teacher's current best estimate of the world at each time instant, based on his own cumulative learning. Using tools from large deviation theory, we calculate the exact learning rates for these strategies and demonstrate regimes where the low-effort strategy outperforms the high-effort strategy. Finally, we present a conjecture concerning the optimal learning rate for the student over all joint strategies between the student and the teacher.

Citations (11)
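
The setup in the abstract lends itself to a small simulation. The sketch below is an illustrative toy, not the paper's exact model: it assumes a binary state of the world, binary symmetric channels with crossover probabilities p (teacher's observation channel) and q (teacher-to-student channel), and majority-vote decoding for both teacher and student; the paper's channel models, decoders, and tie-breaking rules may differ. It lets one compare the student's error probability under the "low-effort" (forwarding) and "high-effort" (current-estimate) strategies and reports a crude empirical exponent -log(Pe)/n as a stand-in for the learning rate.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(strategy, n, p=0.3, q=0.2, trials=20000):
    """Monte-Carlo estimate of the student's error probability after n steps.

    Assumed toy setup (not necessarily the paper's): binary true state w = 1,
    binary symmetric channels with crossover p (teacher's observations) and
    q (teacher-to-student link), majority-vote decoding, ties broken toward 0.
    """
    w = 1  # true state of the world
    # Teacher's noisy observations: each bit flipped with probability p.
    obs = np.where(rng.random((trials, n)) < p, 1 - w, w)
    if strategy == "low":
        # Low-effort strategy: forward each raw observation unchanged.
        sent = obs
    else:
        # High-effort strategy: at each time t, send the teacher's running
        # majority-vote estimate of w based on observations 1..t.
        counts = np.cumsum(obs, axis=1)
        t = np.arange(1, n + 1)
        sent = (counts > t / 2).astype(int)
    # Teacher-to-student channel: each transmitted bit flipped with probability q.
    received = np.where(rng.random((trials, n)) < q, 1 - sent, sent)
    # Student's simple decoder: majority vote over everything received so far.
    student_hat = (received.sum(axis=1) > n / 2).astype(int)
    return np.mean(student_hat != w)

for n in (25, 100, 400):
    pe_low = simulate("low", n)
    pe_high = simulate("high", n)
    # Crude empirical learning rate: -log(P_error) / n.
    print(f"n={n:4d}  low-effort Pe={pe_low:.4f}  high-effort Pe={pe_high:.4f}  "
          f"rate_low={-np.log(max(pe_low, 1e-12))/n:.3f}  "
          f"rate_high={-np.log(max(pe_high, 1e-12))/n:.3f}")
```

Sweeping p and q in such a toy gives an empirical feel for the regimes the abstract mentions, where one strategy's error exponent can exceed the other's; the paper derives the exact rates via large deviation theory rather than simulation.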
