Efforts to embed AI at the network edge are rapidly gaining traction, yet they bring a significant energy challenge. As applications proliferate across consumer electronics, healthcare, and manufacturing, deep neural networks (DNNs) have become the workhorse models. These models, which typically require large amounts of data, have traditionally been processed on cloud servers. However, communication latency and privacy concerns are pushing deep learning tasks closer to users on wireless edge networks.
Edge AI, particularly in support of upcoming sixth-generation (6G) networks, promises ubiquitous AI applications with stringent performance requirements. But the limited resources of wireless edge networks and the energy-intensive nature of DNNs present substantial challenges: realizing AI's transformative potential at the edge hinges on balancing tight resource budgets against intensive computation demands. An energy-conscious approach to edge AI that ensures both high performance and sustainability is therefore imperative.
The reviewed paper provides a survey on green edge AI, focusing on energy-efficient design methodologies for the three critical tasks in edge AI systems: training data acquisition, edge training, and edge inference. For centralized edge learning, it addresses efficient data acquisition, examining how data are sampled and transmitted with minimal energy expenditure. Novel strategies are proposed, including adaptive sampling rates and learning-centric communications that prioritize the most informative data and adapt to system dynamics.
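As a rough illustration of what learning-centric data selection could look like in practice, the sketch below ranks buffered samples by predictive entropy and transmits only as many as a device's transmission-energy budget allows. The function names, energy model, and numbers are illustrative assumptions for this review, not taken from the paper.

```python
import numpy as np

def prediction_entropy(probs: np.ndarray) -> np.ndarray:
    """Per-sample entropy of the model's predicted class distribution."""
    eps = 1e-12
    return -np.sum(probs * np.log(probs + eps), axis=1)

def select_for_transmission(probs: np.ndarray,
                            energy_budget_j: float,
                            energy_per_sample_j: float) -> np.ndarray:
    """Pick the most informative samples (highest predictive entropy)
    that fit within the device's transmission-energy budget.
    (Hypothetical helper; the linear energy-per-sample model is assumed.)"""
    max_samples = int(energy_budget_j // energy_per_sample_j)
    order = np.argsort(-prediction_entropy(probs))  # most uncertain first
    return order[:max_samples]

# Example: 1,000 buffered samples, a 10-class model, a 0.5 J uplink budget,
# and 5 mJ to transmit each sample (illustrative numbers only).
rng = np.random.default_rng(0)
logits = rng.normal(size=(1000, 10))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
chosen = select_for_transmission(probs, energy_budget_j=0.5,
                                 energy_per_sample_j=5e-3)
print(f"Transmitting {chosen.size} of {probs.shape[0]} samples")
```

The same selection rule could be coupled with an adaptive sampling rate, collecting data more slowly when the model is already confident on recent inputs.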
For distributed edge model training, the paper discusses methods that minimize on-device model updates and computation. Techniques such as model quantization, gradient sparsification, and knowledge distillation are suggested for conserving energy. In addition, resource management strategies such as local training adaptation, dynamic device selection, and data offloading are identified as critical for reducing the energy usage of edge AI systems.
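To make the communication savings of one such technique concrete, here is a minimal sketch of top-k gradient sparsification: a device sends only the largest-magnitude gradient entries, and the server rebuilds a sparse update. The NumPy helpers and the 1% keep ratio are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def topk_sparsify(grad: np.ndarray, keep_ratio: float = 0.01):
    """Keep only the largest-magnitude entries; send (indices, values, shape)."""
    flat = grad.ravel()
    k = max(1, int(keep_ratio * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of the top-k entries
    return idx, flat[idx], grad.shape

def densify(idx: np.ndarray, vals: np.ndarray, shape) -> np.ndarray:
    """Server-side reconstruction of the sparse gradient update."""
    flat = np.zeros(int(np.prod(shape)), dtype=vals.dtype)
    flat[idx] = vals
    return flat.reshape(shape)

# Example: a 256x128 gradient compressed to ~1% of its entries before uplink,
# shrinking the payload (and hence transmission energy) accordingly.
rng = np.random.default_rng(1)
g = rng.normal(size=(256, 128))
idx, vals, shape = topk_sparsify(g, keep_ratio=0.01)
print(f"Sent {vals.size} of {g.size} values "
      f"({vals.size / g.size:.1%} of the original payload)")
recovered = densify(idx, vals, shape)
```

In practice such schemes are usually paired with error feedback, accumulating the discarded residual locally so the update remains unbiased over time.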
Finally, the paper explores future research directions, including integrated sensing and communication (ISAC), hardware-software co-design for edge AI platforms, and neuromorphic computing with spiking neural networks and compute-in-memory techniques. It also considers harnessing green energy to power edge AI systems without incurring carbon emissions.
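As a toy illustration of why spiking neural networks are attractive for energy-constrained edge hardware, the sketch below simulates a single leaky integrate-and-fire neuron, which only emits (and hence only consumes switching energy for) sparse, event-driven spikes. The parameters and the simple current drive are assumptions for illustration, not taken from the survey.

```python
import numpy as np

def lif_spikes(input_current: np.ndarray, dt: float = 1e-3,
               tau: float = 20e-3, v_thresh: float = 1.0) -> np.ndarray:
    """Leaky integrate-and-fire neuron: emits a spike (1) only when the
    membrane potential crosses threshold, otherwise stays silent (0)."""
    v = 0.0
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        v += dt / tau * (-v + i_t)   # leaky integration of the input current
        if v >= v_thresh:
            spikes[t] = 1.0
            v = 0.0                  # reset after a spike
    return spikes

# A weak, intermittent input produces few spikes, i.e. little switching
# activity and therefore little energy on event-driven hardware.
t = np.arange(0, 0.2, 1e-3)
current = 1.5 * (np.sin(2 * np.pi * 10 * t) > 0)  # square-wave drive
out = lif_spikes(current)
print(f"{int(out.sum())} spikes over {t.size} timesteps")
```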
In summary, the paper articulates the energy challenges associated with edge AI and outlines a comprehensive set of strategies and methodologies to enhance energy efficiency. In doing so, it lays out a roadmap for the sustainable development of edge AI systems in the context of emerging 6G networks.