Vectorization and Rasterization: Self-Supervised Learning for Sketch and Handwriting (2103.13716v1)

Published 25 Mar 2021 in cs.CV

Abstract: Self-supervised learning has gained prominence due to its efficacy at learning powerful representations from unlabelled data that achieve excellent performance on many challenging downstream tasks. However, supervision-free pre-text tasks are challenging to design and are usually modality-specific. Although there is a rich literature of self-supervised methods for either spatial (such as images) or temporal (sound or text) data modalities, a common pre-text task that benefits both modalities is largely missing. In this paper, we are interested in defining a self-supervised pre-text task for sketches and handwriting data. This data is uniquely characterised by its existence in dual modalities of rasterized images and vector coordinate sequences. We address and exploit this dual representation by proposing two novel cross-modal translation pre-text tasks for self-supervised feature learning: Vectorization and Rasterization. Vectorization learns to map image space to vector coordinates, and rasterization maps vector coordinates to image space. We show that our learned encoder modules benefit both raster-based and vector-based downstream approaches to analysing hand-drawn data. Empirical evidence shows that our novel pre-text tasks surpass existing single- and multi-modal self-supervision methods.
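
To make the rasterization direction concrete, the minimal sketch below renders a toy vector stroke sequence (absolute x, y coordinates in [0, 1] plus a pen-lift flag) into a binary raster image using plain NumPy. The stroke format, canvas size, and helper names here are illustrative assumptions, not the paper's actual pipeline; in the paper this mapping is learned by a neural network, with vectorization as the inverse translation from image space back to coordinate sequences.

```python
import numpy as np

def rasterize(strokes, canvas=64):
    """Render a vector stroke sequence as a binary raster image.

    strokes: array of (x, y, pen_up) rows; coordinates in [0, 1].
    pen_up == 1 means the pen lifts AFTER that point (a hypothetical
    toy format, not the paper's exact stroke encoding).
    """
    img = np.zeros((canvas, canvas), dtype=np.uint8)
    pts = (strokes[:, :2] * (canvas - 1)).astype(int)
    for (x0, y0), (x1, y1), pen_up in zip(pts[:-1], pts[1:], strokes[:-1, 2]):
        if pen_up:  # pen lifted: no segment connects to the next point
            continue
        # Densely interpolate the segment so no pixels are skipped.
        n = max(abs(x1 - x0), abs(y1 - y0)) + 1
        xs = np.linspace(x0, x1, n).round().astype(int)
        ys = np.linspace(y0, y1, n).round().astype(int)
        img[ys, xs] = 1  # row index is y, column index is x
    return img

# Toy "L"-shaped stroke: down the left edge, then along the bottom.
strokes = np.array([[0.1, 0.1, 0],
                    [0.1, 0.9, 0],
                    [0.9, 0.9, 1]])
print(rasterize(strokes).sum(), "pixels set")
```

Because this deterministic rendering is differentiable only in its learned neural form, the paper trains encoder-decoder pairs for both translation directions and reuses the encoders for downstream sketch and handwriting tasks.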

Authors (6)
  1. Ayan Kumar Bhunia (63 papers)
  2. Pinaki Nath Chowdhury (37 papers)
  3. Yongxin Yang (73 papers)
  4. Timothy M. Hospedales (69 papers)
  5. Tao Xiang (324 papers)
  6. Yi-Zhe Song (120 papers)
Citations (54)
