Proactive Location-Based Scheduling of Delay-Constrained Traffic Over Fading Channels (1607.00329v1)

Published 1 Jul 2016 in cs.IT and math.IT

Abstract: In this paper, proactive resource allocation based on user location for point-to-point communication over fading channels is introduced, whereby the source must transmit a packet when the user requests it, within a deadline of a single time slot. We introduce a prediction model in which the source predicts the request arrival $T_p$ slots ahead, where $T_p$ denotes the prediction window (PW) size. The source allocates energy to transmit some bits proactively in each time slot of the PW with the objective of reducing the transmission energy relative to the non-predictive case. The requests are predicted based on the user's location, utilizing prior statistics about user requests at each location. We also assume that the prediction is not perfect. We propose proactive scheduling policies to minimize the expected energy consumption required to transmit the requested packets under two different assumptions on the channel state information at the source. In the first scenario, offline scheduling, we assume the channel states are known a priori at the source at the beginning of the PW. In the second scenario, online scheduling, the source is assumed to have only causal knowledge of the channel state. Numerical results are presented showing the gains achieved by proactive scheduling policies compared with classical (reactive) networks. Simulation results also show that increasing the PW size leads to a significant reduction in the consumed transmission energy even with imperfect prediction.
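To make the offline-scheduling idea concrete, the sketch below splits a packet's bits across the slots of the prediction window when the fading gains over the window are known in advance. It is a minimal illustration, not the paper's exact formulation: the per-slot energy model $E_t(b) = (2^{b} - 1)/h_t$, the function names, and the water-filling-style bisection on the Lagrange multiplier are all assumptions chosen for simplicity.

```python
import math

def offline_proactive_split(channel_gains, total_bits, iters=200):
    """Split `total_bits` across the PW slots to minimize
    sum_t (2**b_t - 1) / h_t  subject to  sum_t b_t = total_bits, b_t >= 0.

    This convex energy-per-slot model and the bisection solver are
    illustrative assumptions, not necessarily the paper's formulation.
    Known gains over the whole window correspond to offline scheduling.
    """
    ln2 = math.log(2)

    def bits_for(lmbda):
        # Stationarity: 2**b_t * ln2 / h_t = lmbda  =>  b_t = log2(lmbda*h_t/ln2),
        # clipped at zero (KKT condition for the b_t >= 0 constraint).
        return [max(0.0, math.log2(lmbda * h / ln2)) for h in channel_gains]

    # The allocated total is increasing in lmbda, so bisection finds the
    # multiplier that meets the bit budget.
    lo, hi = 1e-12, 1e12
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if sum(bits_for(mid)) < total_bits:
            lo = mid
        else:
            hi = mid

    bits = bits_for(hi)
    energy = sum((2.0 ** b - 1.0) / h for b, h in zip(bits, channel_gains))
    return bits, energy

# Hypothetical usage: a 4-slot PW with made-up fading gains.
gains = [0.3, 1.1, 0.7, 2.0]
bits, e_proactive = offline_proactive_split(gains, total_bits=6)
e_reactive = (2.0 ** 6 - 1.0) / gains[-1]  # reactive baseline: send everything in the request slot
```

Under this model the proactive scheduler pushes more bits onto the better-faded slots of the window, which is why the expected energy drops as the PW grows; the paper's online policies face the harder problem of doing this with only causal channel knowledge.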

Authors (4)
  1. Antonious M. Girgis (14 papers)
  2. Amr El-Keyi (28 papers)
  3. Mohammed Nafie (31 papers)
  4. Ramy Gohary (1 paper)
Citations (3)