
A new hope for network model generalization (2207.05843v2)

Published 12 Jul 2022 in cs.NI and cs.LG

Abstract: Generalizing ML models for network traffic dynamics tends to be considered a lost cause. Hence, for every new task, we design new models and train them on model-specific datasets closely mimicking the deployment environments. Yet, an ML architecture called _Transformer_ has enabled previously unimaginable generalization in other domains. Nowadays, one can download a model pre-trained on massive datasets and only fine-tune it for a specific task and context with comparatively little time and data. These fine-tuned models are now state-of-the-art for many benchmarks. We believe this progress could translate to networking and propose a Network Traffic Transformer (NTT), a transformer adapted to learn network dynamics from packet traces. Our initial results are promising: NTT seems able to generalize to new prediction tasks and environments. This study suggests there is still hope for generalization, though it calls for a lot of future research.
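The core mechanism a transformer such as NTT applies to a packet trace is self-attention: every packet's representation is updated as a weighted mix of all packets in the sequence. As a rough illustration (not the paper's actual NTT architecture, and the packet features here are invented for the example), a minimal scaled dot-product self-attention over a toy trace might look like:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(seq):
    """Scaled dot-product self-attention (queries = keys = values = seq).
    Each output row is a convex combination of the input rows."""
    d = len(seq[0])  # feature dimension
    out = []
    for q in seq:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in seq]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, seq))
                    for j in range(d)])
    return out

# Toy packet trace: each packet as [inter-arrival time, normalized size]
# (illustrative features only; the paper defines its own input encoding).
trace = [[0.0, 1.0], [0.1, 0.5], [0.3, 0.9]]
context = self_attention(trace)
```

In a real model this operation is learned (with projection matrices for queries, keys, and values) and stacked in layers; the pre-training/fine-tuning workflow the abstract describes would then train such layers once on large traces and adapt only a small prediction head per task.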

Authors (4)
  1. Alexander Dietmüller (4 papers)
  2. Siddhant Ray (6 papers)
  3. Romain Jacob (8 papers)
  4. Laurent Vanbever (23 papers)
Citations (26)
