PIN: A Knowledge-Intensive Dataset for Paired and Interleaved Multimodal Documents (2406.13923v1)

Published 20 Jun 2024 in cs.AI, cs.CL, cs.CV, and cs.MM

Abstract: Recent advancements in Large Multimodal Models (LMMs) have leveraged extensive multimodal datasets to enhance capabilities in complex knowledge-driven tasks. However, persistent challenges in perceptual and reasoning errors limit their efficacy, particularly in interpreting intricate visual data and deducing multimodal relationships. Addressing these issues, we introduce a novel dataset format, PIN (Paired and INterleaved multimodal documents), designed to significantly improve both the depth and breadth of multimodal training. The PIN format is built on three foundational principles: knowledge intensity, scalability, and support for diverse training modalities. This innovative format combines markdown files and comprehensive images to enrich training data with a dense knowledge structure and versatile training strategies. We present PIN-14M, an open-source dataset comprising 14 million samples derived from a diverse range of Chinese and English sources, tailored to include complex web and scientific content. This dataset is constructed meticulously to ensure data quality and ethical integrity, aiming to facilitate advanced training strategies and improve model robustness against common multimodal training pitfalls. Our initial results, forming the basis of this technical report, suggest significant potential for the PIN format in refining LMM performance, with plans for future expansions and detailed evaluations of its impact on model capabilities.
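Since the PIN format pairs markdown documents with their accompanying images, a data loader for it would plausibly iterate over a manifest that links each markdown file to a list of image paths. The sketch below illustrates that idea; the JSONL layout and the field names ("markdown", "images") are assumptions for illustration, not the dataset's actual schema.

```python
# Minimal sketch of reading PIN-style paired samples: a markdown
# document plus its associated images. The manifest format and the
# field names are hypothetical, not taken from the PIN-14M release.
import json
from pathlib import Path

from PIL import Image


def load_pin_samples(jsonl_path: str, image_root: str):
    """Yield (markdown_text, [PIL.Image]) pairs from a JSONL manifest."""
    root = Path(image_root)
    with open(jsonl_path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            text = record["markdown"]          # interleaved markdown content
            images = [Image.open(root / p) for p in record["images"]]
            yield text, images


# Usage (paths are placeholders):
# for text, imgs in load_pin_samples("pin_manifest.jsonl", "images/"):
#     train_step(text, imgs)
```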

Authors (16)
  1. Junjie Wang (164 papers)
  2. Yin Zhang (98 papers)
  3. Yatai Ji (15 papers)
  4. Yuxiang Zhang (104 papers)
  5. Chunyang Jiang (12 papers)
  6. Yubo Wang (53 papers)
  7. Kang Zhu (12 papers)
  8. Zekun Wang (50 papers)
  9. Tiezhen Wang (2 papers)
  10. Wenhao Huang (98 papers)
  11. Jie Fu (229 papers)
  12. Bei Chen (56 papers)
  13. Qunshu Lin (11 papers)
  14. Minghao Liu (44 papers)
  15. Ge Zhang (170 papers)
  16. Wenhu Chen (134 papers)
Citations (1)