INTERN: A New Learning Paradigm Towards General Vision (2111.08687v2)

Published 16 Nov 2021 in cs.CV, cs.AI, and cs.LG

Abstract: Enormous waves of technological innovation over the past several years, driven by advances in AI, are profoundly reshaping industry and society. However, a key challenge lies ahead: our ability to meet rapidly growing, scenario-specific demands is severely limited by the cost of acquiring a commensurate amount of training data. This difficulty stems from limitations of the mainstream learning paradigm, in which a new model must be trained for each new scenario, typically from scratch and on a large quantity of well-annotated data. To tackle this fundamental problem, we develop a new learning paradigm named INTERN. By learning with supervisory signals from multiple sources in multiple stages, the trained model develops strong generalizability. We evaluate our models on 26 well-known datasets covering four categories of computer vision tasks. In most cases, our models, adapted with only 10% of the training data in the target domain, outperform counterparts trained on the full set, often by a significant margin. This is an important step toward a future in which a model with general vision capability can dramatically reduce our reliance on data, thus expediting the adoption of AI technologies. Furthermore, around this new paradigm we also introduce a new data system, a new architecture, and a new benchmark, which together form a general vision ecosystem to support its future development in an open and inclusive manner. See the project website at https://opengvlab.shlab.org.cn .
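
The abstract's central claim is quantitative: a model pretrained under INTERN and then adapted with only 10% of the target-domain training data can match or beat counterparts trained on the full set. As a rough illustration of that low-data adaptation setting (not the INTERN pipeline itself), the sketch below fine-tunes a generic pretrained backbone on a random 10% subset of a downstream dataset; the choice of backbone (ResNet-50), dataset (CIFAR-10), and all hyperparameters are assumptions made for brevity.

```python
# Hypothetical sketch of low-data adaptation: fine-tune a pretrained backbone
# on only 10% of a downstream dataset. This is NOT the INTERN method; the
# backbone, dataset, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset
import torchvision
from torchvision import transforms

# Pretrained backbone standing in for an upstream "general vision" model.
model = torchvision.models.resnet50(
    weights=torchvision.models.ResNet50_Weights.IMAGENET1K_V2
)
model.fc = nn.Linear(model.fc.in_features, 10)  # new head for the target task

# Target-domain data: keep only a random 10% subset of the training split.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
full_train = torchvision.datasets.CIFAR10(
    root="data", train=True, download=True, transform=transform
)
subset_size = len(full_train) // 10
indices = torch.randperm(len(full_train))[:subset_size]
train_loader = DataLoader(Subset(full_train, indices), batch_size=64, shuffle=True)

# Standard fine-tuning loop on the reduced data budget.
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

In the paper's setting the backbone would instead come from INTERN's multi-source, multi-stage pretraining, and evaluation would span the 26 downstream datasets mentioned above; only the "adapt with a 10% data budget" step is mirrored here.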

Authors (27)
  1. Jing Shao (109 papers)
  2. Siyu Chen (105 papers)
  3. Yangguang Li (44 papers)
  4. Kun Wang (355 papers)
  5. Zhenfei Yin (41 papers)
  6. Yinan He (34 papers)
  7. Jianing Teng (4 papers)
  8. Qinghong Sun (4 papers)
  9. Mengya Gao (8 papers)
  10. Jihao Liu (60 papers)
  11. Gengshi Huang (3 papers)
  12. Guanglu Song (45 papers)
  13. Yichao Wu (34 papers)
  14. Yuming Huang (25 papers)
  15. Fenggang Liu (8 papers)
  16. Huan Peng (4 papers)
  17. Shuo Qin (8 papers)
  18. Chengyu Wang (93 papers)
  19. Yujie Wang (103 papers)
  20. Conghui He (114 papers)
Citations (33)