Hybrid Random Features (2110.04367v3)

Published 8 Oct 2021 in cs.LG and stat.ML

Abstract: We propose a new class of random feature methods for linearizing softmax and Gaussian kernels called hybrid random features (HRFs) that automatically adapt the quality of kernel estimation to provide the most accurate approximation in the defined regions of interest. Special instantiations of HRFs lead to well-known methods such as trigonometric random features (Rahimi and Recht, 2007) or positive random features (Choromanski et al., 2021), recently introduced in the context of linear-attention Transformers. By generalizing Bochner's Theorem for softmax/Gaussian kernels and leveraging random features for compositional kernels, the HRF mechanism provides strong theoretical guarantees: unbiased approximation and strictly smaller worst-case relative errors than its counterparts. We conduct an exhaustive empirical evaluation of HRFs, ranging from pointwise kernel estimation experiments, through tests on data admitting clustering structure, to benchmarking implicit-attention Transformers (also for downstream Robotics applications), demonstrating their quality across a wide spectrum of machine learning problems.
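The two baseline mechanisms that HRFs generalize can be illustrated with a short sketch. This is not the paper's HRF construction itself, only the two special cases the abstract names: trigonometric random features (Rahimi and Recht, 2007), derived from Bochner's Theorem, and positive random features (Choromanski et al., 2021), both used here to estimate the Gaussian kernel. The dimensions, scaling, and variable names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 8, 4096                        # input dimension, number of random features
x = rng.normal(size=d) * 0.3
y = rng.normal(size=d) * 0.3
W = rng.normal(size=(m, d))           # Gaussian projection directions w_i ~ N(0, I_d)

# Exact Gaussian kernel K(x, y) = exp(-||x - y||^2 / 2)
exact = np.exp(-np.sum((x - y) ** 2) / 2)

# Trigonometric random features (Rahimi & Recht, 2007), via Bochner's Theorem:
# phi(z) = sqrt(2/m) * cos(W z + b), with b ~ U[0, 2*pi]
b = rng.uniform(0, 2 * np.pi, size=m)
def phi_trig(z):
    return np.sqrt(2.0 / m) * np.cos(W @ z + b)
est_trig = phi_trig(x) @ phi_trig(y)

# Positive random features (Choromanski et al., 2021): the feature map is
# always nonnegative, which avoids the sign cancellations of the trig map.
# phi(z) = exp(-||z||^2) * exp(W z) / sqrt(m) gives an unbiased Gaussian
# kernel estimate, since E[exp(w.(x+y))] = exp(||x+y||^2 / 2).
def phi_pos(z):
    return np.exp(W @ z - np.dot(z, z)) / np.sqrt(m)
est_pos = phi_pos(x) @ phi_pos(y)

print(exact, est_trig, est_pos)     # all three should be close for large m
```

Both estimators are unbiased; the HRF mechanism of the paper combines such maps so that estimation accuracy adapts to the region of interest rather than being fixed by a single feature type.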

Authors (13)
  1. Krzysztof Choromanski (96 papers)
  2. Haoxian Chen (15 papers)
  3. Han Lin (53 papers)
  4. Yuanzhe Ma (4 papers)
  5. Arijit Sehanobish (20 papers)
  6. Deepali Jain (26 papers)
  7. Jake Varley (12 papers)
  8. Andy Zeng (54 papers)
  9. Valerii Likhosherstov (25 papers)
  10. Dmitry Kalashnikov (34 papers)
  11. Vikas Sindhwani (60 papers)
  12. Adrian Weller (150 papers)
  13. Michael S Ryoo (6 papers)
Citations (20)