Edge Artificial Intelligence: Systematic Review Traces Evolution, Maps Taxonomy, and Identifies Future Horizons

Edge Artificial Intelligence represents a significant shift in computing, bringing intelligence directly to devices and enabling real-time data processing with enhanced privacy and reduced latency. Mohamad Abou Ali and Fadi Dornaika, both from the University of the Basque Country, present a comprehensive review that systematically charts the evolution of this rapidly developing field. Their work establishes a clear taxonomic framework for understanding Edge AI, categorizing approaches by deployment location, processing capabilities such as TinyML and federated learning, application areas, and underlying hardware. By tracing the field's development from early content delivery networks to modern on-device intelligence, and critically assessing current challenges and emerging opportunities, this research provides a valuable resource for both academics and industry professionals seeking to navigate the future of intelligent edge computing.

Edge AI, IoT, and Emerging Networks

Edge AI represents a significant shift in artificial intelligence, bringing computation closer to the data source at the network edge, rather than relying solely on centralized cloud resources. This convergence of AI and the Internet of Things is crucial for next-generation wireless networks like 5G and 6G. Researchers are exploring how virtual representations of physical systems, called digital twins, can be effectively monitored and controlled using Edge AI, unlocking new possibilities for real-time analysis and decision-making in a variety of applications. Key areas of development include specialized hardware to accelerate AI computations at the edge, and innovative architectures like compute-in-memory, which performs calculations directly within memory to reduce energy consumption and latency.
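The compute-in-memory idea mentioned above can be made concrete with a toy sketch. In a memristive crossbar, a matrix-vector multiply happens physically in one step: weights are stored as conductances, inputs are applied as voltages, and Ohm's and Kirchhoff's laws sum the products as column currents. The simulation below is purely illustrative (the function name and values are our own, not any specific hardware design):

```python
# Conceptual sketch of analog compute-in-memory: a crossbar computes
# I_j = sum_i G[i][j] * V[i], i.e. a matrix-vector product, without
# shuttling data between separate memory and compute units.

def crossbar_mvm(conductances, voltages):
    """Column currents of a crossbar: the analog matrix-vector multiply."""
    rows = len(conductances)
    cols = len(conductances[0])
    return [sum(conductances[i][j] * voltages[i] for i in range(rows))
            for j in range(cols)]

# A 2x3 weight matrix encoded as conductances (siemens) and an input
# voltage vector; each column current is one dot product.
G = [[0.50, 1.0, 0.0],
     [0.25, 0.0, 2.0]]
V = [2.0, 4.0]
print(crossbar_mvm(G, V))  # [2.0, 2.0, 8.0]
```

The appeal for edge inference is that the multiply-accumulate, normally the energy-dominant operation, collapses into a single analog read, which is why the paradigm targets the latency and energy costs of data movement.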

Researchers are also investigating optical computing, utilizing light for computation, and energy harvesting techniques to power edge devices using ambient energy sources, aiming for self-powered, autonomous systems. Software and algorithmic advancements focus on deploying large language models directly on edge devices, training AI with limited data through few-shot learning, and enabling continuous learning from new data without forgetting previous knowledge. Reinforcement learning is being used to create adaptive and optimized edge AI systems, while explainable AI aims to make AI decision-making processes more transparent and understandable. Addressing bias in AI models and integrating data from various edge sources are also critical areas of focus.

Effective system-level design involves optimizing task distribution between edge devices, fog nodes, and the cloud, dynamically adapting task allocation based on network conditions, and training AI models collaboratively across multiple edge devices without sharing raw data, enhancing privacy through federated learning. Protecting edge AI systems from attacks, ensuring data privacy, efficiently allocating limited resources, and strategically offloading computations to more powerful resources are also essential considerations. The potential applications of Edge AI are vast, spanning autonomous driving, intelligent wireless communication networks, predictive maintenance in industrial IoT, remote patient monitoring in healthcare, smart city infrastructure, and immersive experiences in the metaverse.

The researchers developed a novel multi-dimensional taxonomy to classify Edge AI research, considering deployment location, processing capability, application domain, and hardware architecture. This framework provides an integrated view of the field, revealing key trends and research gaps. The taxonomy categorizes deployment locations from on-device processing to regional edge computing, and processing capabilities from TinyML to federated learning.
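The privacy-preserving collaborative training described above is typically realized with federated averaging (FedAvg): each device trains on its own data and sends only model weights to a server, which aggregates them weighted by dataset size. A minimal sketch, with models simplified to flat lists of floats and all names illustrative:

```python
# Hedged sketch of the FedAvg aggregation step: the server averages
# per-client weights, weighted by local dataset size, and never sees
# any raw data from the edge devices.

def federated_average(client_weights, client_sizes):
    """Size-weighted average of per-client model weight vectors."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[k] * n for w, n in zip(client_weights, client_sizes)) / total
            for k in range(dim)]

# Three edge devices report locally trained weights; the third holds
# twice as much data, so it pulls the average toward its model.
weights = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [100, 100, 200]
print(federated_average(weights, sizes))  # [3.5, 4.5]
```

In a real deployment this aggregation runs over many rounds, interleaved with local gradient steps on each device, and is often combined with secure aggregation or differential privacy for stronger guarantees.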

It also considers application domains like healthcare and smart cities, and hardware types ranging from CPUs to neuromorphic chips. This detailed categorization allows for nuanced comparison of different approaches and identification of critical trade-offs. The review highlights the historical connection between modern Edge AI and earlier technologies like content delivery networks and fog computing, establishing a crucial continuity often missing from previous surveys. The researchers emphasize the importance of considering the interdependencies between hardware, software, and application layers, rather than treating them in isolation, and identify core enabling technologies such as specialized hardware, optimized software, and communication protocols. A critical assessment of challenges reveals interconnected issues including resource limitations, security vulnerabilities, model management complexities, power consumption constraints, and connectivity dependence. The study also highlights the trade-offs between operational reliability and energy efficiency, noting that data buffering and compression algorithms can themselves increase power consumption.
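The four-axis taxonomy can be pictured as a simple classification schema. The sketch below uses field names and example systems of our own invention (not the paper's exact schema) to show how encoding each system along the four axes makes trends and gaps in a research corpus easy to surface:

```python
# Illustrative encoding of the review's four taxonomy axes; the field
# names, categories, and example systems here are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class EdgeAISystem:
    name: str
    deployment: str   # e.g. "on-device", "gateway", "regional-edge"
    processing: str   # e.g. "TinyML", "federated-learning"
    domain: str       # e.g. "healthcare", "smart-city"
    hardware: str     # e.g. "MCU", "GPU", "neuromorphic"

corpus = [
    EdgeAISystem("wearable-ecg", "on-device", "TinyML",
                 "healthcare", "MCU"),
    EdgeAISystem("traffic-cams", "regional-edge", "federated-learning",
                 "smart-city", "GPU"),
]

# Slicing the corpus along one axis reveals clusters and gaps,
# e.g. which deployment tiers a given domain actually uses.
on_device = [s.name for s in corpus if s.deployment == "on-device"]
print(on_device)  # ['wearable-ecg']
```

Cross-tabulating two axes (say, hardware against processing capability) is how such a framework exposes the trade-offs and under-explored combinations the review discusses.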

Fragmentation within the Edge AI ecosystem, with proprietary hardware and incompatible software, hinders integration and increases development costs. While industry initiatives aim for standardization, achieving genuine interoperability remains a long-term objective crucial for seamless collaboration between devices. Future research directions promise to redefine the performance-energy frontier, particularly through hyper-specialized accelerators optimized for specific model architectures. In-memory and near-memory computing, utilizing technologies like memristors, aim to mitigate the von Neumann bottleneck, enabling ultra-low-power inference.

Post-digital computing paradigms, including analog AI and optical computing, demonstrate potential for significant energy reduction. Furthermore, advanced energy harvesting technologies combined with ultra-low-power AI processors promise perpetually operational devices in remote environments. Algorithmic research focuses on imbuing Edge AI systems with greater autonomy and efficiency through continual and lifelong learning systems. The analysis reveals a coherent progression from foundational technologies like content delivery networks and fog computing to modern paradigms such as TinyML and federated learning, demonstrating that the shift towards distributed edge intelligence has been neither accidental nor instantaneous. The research demonstrates that the contemporary Edge AI landscape is characterized by sophisticated hardware-software co-design across a spectrum of resource constraints, enabling transformative applications in sectors including healthcare, industrial automation, and smart cities. However, the authors acknowledge significant challenges that currently limit widespread adoption, including resource limitations, security vulnerabilities, and complexities in model management and power consumption. Looking forward, the team projects that future advancements will focus on next-generation hardware paradigms, advanced algorithms capable of continuous adaptation, seamless edge-cloud collaboration, and the integration of trustworthiness and explainability into fundamental design principles. Realizing the full potential of Edge AI will require continued interdisciplinary collaboration across multiple engineering and computer science domains.

👉 More information
🗞 Edge Artificial Intelligence: A Systematic Review of Evolution, Taxonomic Frameworks, and Future Horizons
🧠 ArXiv: https://arxiv.org/abs/2510.01439

Rohail T.

I am a quantum scientist exploring the frontiers of physics and technology. My work focuses on uncovering how quantum mechanics, computing, and emerging technologies are transforming our understanding of reality. I share research-driven insights that make complex ideas in quantum science clear, engaging, and relevant to the modern world.
