
Convexity and Optimization in Deficit Round Robin Scheduling for Delay-Constrained Systems (2503.23366v1)

Published 30 Mar 2025 in cs.NI

Abstract: The Deficit Round Robin (DRR) scheduler is widely used in network systems for its simplicity and fairness. However, configuring its integer-valued parameters, known as quanta, to meet stringent delay constraints remains a significant challenge. This paper addresses this issue by demonstrating the convexity of the feasible parameter set for a two-flow DRR system under delay constraints. The analysis is then extended to n-flow systems, uncovering key structural properties that guide parameter selection. Additionally, we propose an optimization method to maximize the number of packets served in a round while satisfying delay constraints. The effectiveness of this approach is validated through numerical simulations, providing a practical framework for enhancing DRR scheduling. These findings offer valuable insights into resource allocation strategies for maintaining Quality of Service (QoS) standards in network slicing environments.
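To make the mechanism behind the paper concrete, below is a minimal sketch of the classic DRR dequeue loop that the quanta parametrize: each backlogged flow accrues its integer quantum as credit per round and sends head-of-line packets while the credit covers them. This is an illustrative implementation of standard DRR only, not the paper's convexity analysis or optimization method; the flow names, packet sizes, and quanta values are hypothetical examples.

```python
from collections import deque

# Minimal illustrative Deficit Round Robin (DRR) scheduler.
# Each flow has an integer quantum (the parameter the paper tunes under
# delay constraints) and a deficit counter that accumulates credit per round.
# Flow names, packet sizes, and quanta below are hypothetical.

def drr_round(queues, quanta, deficits):
    """Serve one DRR round; return the (flow, packet_size) pairs sent."""
    sent = []
    for flow, queue in queues.items():
        if not queue:
            continue
        deficits[flow] += quanta[flow]
        # Send head-of-line packets while the deficit covers their size.
        while queue and queue[0] <= deficits[flow]:
            size = queue.popleft()
            deficits[flow] -= size
            sent.append((flow, size))
        if not queue:
            deficits[flow] = 0  # classic DRR: empty flows carry no credit forward
    return sent

if __name__ == "__main__":
    # Two flows with example packet sizes (bytes) and integer quanta.
    queues = {
        "flow1": deque([300, 200, 500]),
        "flow2": deque([400, 400]),
    }
    quanta = {"flow1": 500, "flow2": 400}
    deficits = {"flow1": 0, "flow2": 0}

    round_no = 1
    while any(queues.values()):
        served = drr_round(queues, quanta, deficits)
        print(f"round {round_no}: served {served}")
        round_no += 1
```

Running the sketch shows how the choice of quanta determines how many packets each flow gets per round, which is exactly the quantity the paper's optimization maximizes subject to per-flow delay constraints.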
