STAIR: Learning Sparse Text and Image Representation in Grounded Tokens (2301.13081v2)

Published 30 Jan 2023 in cs.CV

Abstract: Image and text retrieval is one of the foundational tasks in the vision and language domain, with multiple real-world applications. State-of-the-art approaches, e.g., CLIP and ALIGN, represent images and texts as dense embeddings and calculate the similarity in the dense embedding space as the matching score. In contrast, sparse semantic features like bag-of-words models are more interpretable, but are believed to suffer from inferior accuracy compared to dense representations. In this work, we show that it is possible to build a sparse semantic representation that is as powerful as, or even better than, dense representations. We extend the CLIP model and build a sparse text and image representation (STAIR), where the image and text are mapped to a sparse token space. Each token in the space is a (sub-)word in the vocabulary, which is not only interpretable but also easy to integrate with existing information retrieval systems. The STAIR model significantly outperforms a CLIP model, with +$4.9\%$ and +$4.3\%$ absolute Recall@1 improvements on COCO-5k text$\rightarrow$image and image$\rightarrow$text retrieval, respectively. It also achieves better performance than CLIP on both ImageNet zero-shot classification and linear probing.
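The abstract describes mapping images and texts to a sparse vocabulary-token space and scoring matches by similarity there. A minimal sketch of how such sparse-token retrieval scoring can work, assuming (hypothetically) that each encoder outputs a `{token: weight}` mapping over the shared vocabulary; this is an illustration of the general idea, not the paper's implementation:

```python
def sparse_similarity(repr_a, repr_b):
    """Dot product over the vocabulary tokens shared by two sparse representations."""
    # Iterate over the smaller mapping for efficiency; only overlapping
    # (sub-)word tokens contribute to the matching score.
    if len(repr_b) < len(repr_a):
        repr_a, repr_b = repr_b, repr_a
    return sum(w * repr_b[t] for t, w in repr_a.items() if t in repr_b)


# Hypothetical interpretable activations for a caption and an image.
text_repr = {"dog": 1.2, "running": 0.8, "park": 0.5}
image_repr = {"dog": 1.0, "grass": 0.7, "park": 0.4}

score = sparse_similarity(text_repr, image_repr)  # 1.2*1.0 + 0.5*0.4 = 1.4
```

Because each dimension is a vocabulary token, such representations can be indexed by a standard inverted-index retrieval system, which is the interoperability benefit the abstract highlights.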

Authors (10)
  1. Chen Chen (753 papers)
  2. Bowen Zhang (161 papers)
  3. Liangliang Cao (52 papers)
  4. Jiguang Shen (6 papers)
  5. Tom Gunter (13 papers)
  6. Albin Madappally Jose (4 papers)
  7. Alexander Toshev (48 papers)
  8. Jonathon Shlens (58 papers)
  9. Ruoming Pang (59 papers)
  10. Yinfei Yang (73 papers)
Citations (11)