Multi Proxy Anchor Family Loss for Several Types of Gradients (2110.03997v8)
Abstract: The objective of deep metric learning (DML) is to learn a neural network that maps data into an embedding space where similar data are near and dissimilar data are far apart. However, conventional proxy-based losses for DML suffer from two problems: a gradient problem and difficulty in handling real-world datasets whose classes have multiple local centers. Additionally, the common performance metrics of DML have issues with stability and flexibility. This paper proposes three multi-proxy anchor (MPA) family losses and a normalized discounted cumulative gain (nDCG@k) metric. This paper makes three contributions. (1) MPA-family losses can learn from real-world datasets with multiple local centers per class. (2) MPA-family losses improve the training capacity of a neural network by solving the gradient problem. (3) MPA-family losses have data-wise or class-wise characteristics with respect to gradient generation. Finally, we demonstrate the effectiveness of MPA-family losses, which achieve higher accuracy on two fine-grained image datasets.
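The nDCG@k metric mentioned in the abstract is the standard normalized discounted cumulative gain restricted to the top-k retrieved items. A minimal sketch of how it is typically computed (the function name and binary-relevance setup are illustrative assumptions, not the paper's exact formulation):

```python
import math

def ndcg_at_k(relevances, k):
    """nDCG@k: DCG over the top-k ranked items, normalized by the
    ideal DCG obtained from the best possible ordering.

    relevances: relevance scores listed in the retrieved ranking order
    (e.g. 1 if the retrieved item shares the query's class, else 0).
    """
    def dcg(rels):
        # Log-discounted gain: position i (0-based) is discounted by log2(i + 2).
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels))

    dcg_k = dcg(relevances[:k])
    ideal_k = dcg(sorted(relevances, reverse=True)[:k])
    return dcg_k / ideal_k if ideal_k > 0 else 0.0
```

A perfectly ordered ranking yields 1.0, while relevant items pushed below rank k lower the score, which is what makes the metric sensitive to the full top-k ordering rather than a single cutoff.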
- Shozo Saeki
- Minoru Kawahara
- Hirohisa Aman