Random Network Distillation as a Diversity Metric for Both Image and Text Generation (2010.06715v1)

Published 13 Oct 2020 in cs.LG, cs.CL, and cs.CV

Abstract: Generative models are increasingly able to produce remarkably high quality images and text. The community has developed numerous evaluation metrics for comparing generative models. However, these metrics do not effectively quantify data diversity. We develop a new diversity metric that can readily be applied to data, both synthetic and natural, of any type. Our method employs random network distillation, a technique introduced in reinforcement learning. We validate and deploy this metric on both images and text. We further explore diversity in few-shot image generation, a setting which was previously difficult to evaluate.

Authors (5)
  1. Liam Fowl (25 papers)
  2. Micah Goldblum (96 papers)
  3. Arjun Gupta (24 papers)
  4. Amr Sharaf (13 papers)
  5. Tom Goldstein (226 papers)
Citations (3)