
Annotation-Free Group Robustness via Loss-Based Resampling (2312.04893v1)

Published 8 Dec 2023 in cs.CV

Abstract: It is well known that training neural networks for image classification with empirical risk minimization (ERM) makes them prone to relying on spurious attributes rather than causal ones for prediction. Deep feature re-weighting (DFR) previously addressed this by retraining the last layer of a pre-trained network on data balanced with respect to the spurious attributes, making the model robust to spurious correlation. However, spurious-attribute annotations are not always available. To provide group robustness without such annotations, we propose a new method, called loss-based feature re-weighting (LFR), which infers a grouping of the data by evaluating an ERM-pre-trained model on a small left-out split of the training data. A balanced set of samples is then chosen by selecting high-loss samples from the misclassified data points and low-loss samples from the correctly classified ones. Finally, we retrain the last layer on the selected balanced groups to make the model robust to spurious correlation. For a complete assessment, we evaluate LFR on versions of the Waterbirds and CelebA datasets with varying degrees of spurious correlation, a novel protocol for observing model performance across a wide range of spuriosity rates. While LFR is extremely fast and straightforward, it outperforms previous methods that do not assume group-label availability, as well as DFR with group annotations provided, when spurious correlation in the training data is high.
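The abstract walks through a three-step pipeline: score a small held-out split with the ERM-pretrained model, resample by loss into balanced pseudo-groups, and retrain only the last layer. Below is a minimal PyTorch sketch of that pipeline, assuming a frozen backbone whose penultimate-layer embeddings are available as `features` and a linear classification head `head`; the function names (`score_heldout`, `lfr_select_indices`, `retrain_last_layer`) and hyperparameters are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn.functional as F


@torch.no_grad()
def score_heldout(model, loader, device="cpu"):
    """Run the ERM-pretrained model on the small left-out split,
    recording per-sample cross-entropy loss and correctness."""
    model.eval()
    model.to(device)
    losses, correct = [], []
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        logits = model(x)
        losses.append(F.cross_entropy(logits, y, reduction="none"))
        correct.append(logits.argmax(dim=1).eq(y))
    return torch.cat(losses), torch.cat(correct)


def lfr_select_indices(losses, correct, n_per_group):
    """Build a balanced index set: the highest-loss misclassified
    samples (likely spuriosity-violating) and the lowest-loss
    correctly classified samples (likely spuriosity-conforming)."""
    mis = torch.where(~correct)[0]
    cor = torch.where(correct)[0]
    hard = mis[losses[mis].argsort(descending=True)[:n_per_group]]
    easy = cor[losses[cor].argsort()[:n_per_group]]
    return torch.cat([hard, easy])


def retrain_last_layer(head, features, labels, epochs=100, lr=1e-3):
    """Re-fit only the final linear layer on the balanced subset;
    the backbone stays frozen, as in DFR-style retraining."""
    opt = torch.optim.Adam(head.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(head(features), labels)
        loss.backward()
        opt.step()
    return head
```

In use, one would score the left-out split, select a balanced index set, embed the selected samples with the frozen backbone to obtain `features`, and pass those embeddings with their labels to `retrain_last_layer`.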

Authors (6)
Citations (1)
