
Towards Calibrated Robust Fine-Tuning of Vision-Language Models (2311.01723v7)

Published 3 Nov 2023 in cs.CV and cs.AI

Abstract: Improving out-of-distribution (OOD) generalization during in-distribution (ID) adaptation is a primary goal of robust fine-tuning of zero-shot models, beyond naive fine-tuning. However, despite the decent OOD generalization of recent robust fine-tuning methods, confidence calibration for reliable model outputs has not been fully addressed. This work proposes a robust fine-tuning method that improves both OOD accuracy and confidence calibration simultaneously in vision-language models. First, we show that OOD classification error and OOD calibration error share an upper bound consisting of two terms based on ID data: 1) the ID calibration error and 2) the smallest singular value of the ID input covariance matrix. Based on this insight, we design a novel framework that fine-tunes with a constrained multimodal contrastive loss enforcing a larger smallest singular value, further guided by self-distillation from a moving-averaged model to achieve calibrated predictions. Starting from empirical evidence supporting our theoretical statements, we provide extensive experimental results on ImageNet distribution shift benchmarks that demonstrate the effectiveness of our theorem and its practical implementation.
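To make the two ID-side quantities in the bound concrete, here is a minimal sketch (not the authors' code) of how one might compute the smallest singular value of the ID input covariance matrix and apply a moving-averaged (EMA) teacher update for self-distillation; the function names and the momentum value are illustrative assumptions.

```python
import numpy as np

def smallest_singular_value(features: np.ndarray) -> float:
    """Smallest singular value of the covariance matrix of ID features.

    features: (n_samples, dim) array of ID embeddings.
    A larger value indicates the embeddings span all directions well,
    which the paper's constrained contrastive loss encourages.
    """
    centered = features - features.mean(axis=0, keepdims=True)
    cov = centered.T @ centered / len(features)
    return float(np.linalg.svd(cov, compute_uv=False)[-1])

def ema_update(teacher: dict, student: dict, momentum: float = 0.999) -> dict:
    """One EMA step: teacher <- m * teacher + (1 - m) * student.

    The slowly moving teacher can then guide the student via
    self-distillation toward calibrated predictions.
    """
    return {k: momentum * teacher[k] + (1 - momentum) * student[k]
            for k in teacher}
```

For example, features that collapse onto a single direction yield a zero smallest singular value, signaling the degenerate geometry the constraint is meant to avoid.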

Authors (9)
  1. Changdae Oh (12 papers)
  2. Hyesu Lim (6 papers)
  3. Mijoo Kim (2 papers)
  4. Jaegul Choo (161 papers)
  5. Alexander Hauptmann (46 papers)
  6. Zhi-Qi Cheng (61 papers)
  7. Kyungwoo Song (38 papers)
  8. Dongyoon Han (49 papers)
  9. Sangdoo Yun (71 papers)
Citations (8)

