
PAC-Bayes Bounds for Meta-learning with Data-Dependent Prior (2102.03748v1)

Published 7 Feb 2021 in cs.LG and stat.ML

Abstract: By leveraging experience from previous tasks, meta-learning algorithms can achieve effective fast adaptation when encountering new tasks. However, it is unclear how the generalization property extends to new tasks. Probably approximately correct (PAC) Bayes theory provides a framework for analyzing the generalization performance of meta-learning. We derive three novel generalization error bounds for meta-learning based on the PAC-Bayes relative entropy bound. Furthermore, using the empirical risk minimization (ERM) method, we develop a PAC-Bayes bound for meta-learning with a data-dependent prior. Experiments illustrate that the three proposed PAC-Bayes bounds for meta-learning provide competitive generalization guarantees, and that the extended PAC-Bayes bound with a data-dependent prior achieves rapid convergence.
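For context (this is a standard result, not reproduced from the paper), the single-task PAC-Bayes relative entropy (kl) bound that meta-learning analyses of this kind typically build on can be sketched as follows: for a prior $P$ chosen before seeing a sample $S$ of size $n$, with probability at least $1-\delta$, simultaneously for all posteriors $Q$,

\[
\mathrm{kl}\!\left(\hat{L}_S(Q)\,\big\|\,L(Q)\right) \;\le\; \frac{\mathrm{KL}(Q\,\|\,P) + \ln\frac{2\sqrt{n}}{\delta}}{n},
\]

where $\hat{L}_S(Q)$ and $L(Q)$ denote the empirical and expected losses under $Q$, and $\mathrm{kl}$ is the binary relative entropy. The paper's bounds extend inequalities of this type to the meta-learning setting, where a hyper-posterior over priors is learned across tasks; the data-dependent-prior variant additionally allows $P$ to be selected via ERM on part of the data rather than fixed in advance.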

Authors (4)
  1. Tianyu Liu (177 papers)
  2. Jie Lu (127 papers)
  3. Zheng Yan (116 papers)
  4. Guangquan Zhang (38 papers)
Citations (10)
