Link Fluctuations Reduce Data Transmission by 20 Percent

Researchers are tackling the challenge of reliable data transmission in increasingly complex multi-hop wireless Internet of Things networks. Jothi Prasanna Shanmuga Sundaram and Magzhan Gabidolla, from the University of California, Merced, working with colleagues at Mitsubishi Electric Research Laboratories, including Jianlin Guo, Toshiaki Koike-Akino, Pu Wang, Kieran Parsons, Philip V. Orlik, Takenori Sumi, Yukimasa Nagai, Miguel A. Carreira-Perpinan and Alberto E. Cerpa, have developed an Enhanced Dynamic Relay Point Protocol (EDRP) to improve data dissemination efficiency. Current contention-based protocols often suffer from performance degradation due to collisions, and while Dynamic Relay Point (DRP) offers improvements, it struggles with fluctuating real-world link quality. This new research addresses these limitations through a combination of Link-Quality Aware CSMA and a machine learning-based Block Size Selection algorithm, demonstrably increasing goodput by an average of 39.43% compared to existing methods and paving the way for more robust and scalable IoT deployments.

For years, wireless networks have struggled to reliably share data amongst many devices, particularly as those networks grow more complex. Now, a revised data relay protocol promises to overcome limitations caused by fluctuating signal strength and network congestion: in-field testing shows the enhanced system improves goodput by an average of 39.43%. Scientists are increasingly focused on the transition of Internet of Things (IoT) applications from reliance on battery power to grid-powered nodes.

This shift is driving a need for more efficient data dissemination protocols suited to these new network architectures. Traditional contention-based protocols often experience reduced performance due to the overhead of control packet exchanges. The researchers identified the core issues as stemming from fluctuating link quality and the passive acknowledgements inherent in DRP’s design, and EDRP integrates two key components, Link-Quality Aware CSMA and a machine learning-based Block Size Selection (ML-BSS) algorithm, to overcome the limitations of its predecessor.

These components work in concert to improve data transmission in challenging environments. Achieving reliable communication in dense urban or semi-urban settings presents unique difficulties, as buildings, human activity and vehicle movements all contribute to erratic signal attenuation. Instead of attempting to enforce strict timing, EDRP dynamically adjusts to the prevailing conditions, offering a more practical solution for these complex environments. Analysis showed DRP unable to effectively manage fluctuations in link quality, resulting in overlapping transmissions and reduced goodput. Observations of in-field link quality indicated erratic changes over time, prompting a design that avoids fixed block sizes for rateless coding.
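The idea of letting link quality, rather than strict timing, drive when a node relays can be sketched as a delay timer that shrinks as the link improves. This is a minimal illustrative sketch, not the paper's actual Link-Quality Aware CSMA: the function name, the `[0, 1]` quality scale, and the delay and jitter constants are all assumptions made for the example.

```python
import random

# Hypothetical sketch of a link-quality-aware delay timer: nodes with
# stronger links pick shorter delays, so they tend to relay first, and
# a small random jitter keeps similar-quality nodes from colliding.
def relay_delay(link_quality, max_delay_ms=100.0, jitter_ms=5.0):
    """link_quality in [0, 1]; returns a transmission delay in ms."""
    if not 0.0 <= link_quality <= 1.0:
        raise ValueError("link_quality must be in [0, 1]")
    base = (1.0 - link_quality) * max_delay_ms    # strong link -> short delay
    return base + random.uniform(0.0, jitter_ms)  # jitter breaks ties

# A node with quality 0.9 always schedules before one with quality 0.3,
# because their base delays differ by more than the jitter range.
fast = relay_delay(0.9)
slow = relay_delay(0.3)
assert fast < slow
```

Because the timer is recomputed from current conditions rather than fixed in advance, the schedule adapts as links fluctuate, which is the behaviour the analysis above found missing in DRP.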

ML-BSS utilises TAO-optimised ordinal regression trees to predict future link quality conditions and then optimally sets the block size for rateless coding, minimising overhead and maximising goodput. In-field testing of EDRP showed an average goodput improvement of 39.43% when compared to competing protocols, confirming the effectiveness of adapting transmission parameters to the dynamic nature of real-world wireless networks.

Understanding the precise nature of link quality fluctuations was essential to designing EDRP. Theoretical analysis characterised the design requirements and trade-offs governing collision behaviour and dissemination efficiency. DRP, a contention-based data dissemination protocol, relies on distributed delay timers to schedule transmissions, intending to prioritise nodes with stronger connections. By recognising that both good and poor link conditions impact data transmission efficiency, ML-BSS was developed to predict future link quality. This prediction then informs the optimal block size for rateless coding, a technique where data is encoded so that the original can be reconstructed even when some packets are lost, minimising overhead and maximising goodput.
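The block-size trade-off described above can be illustrated with a toy cost model: small blocks pay a fixed per-block overhead more often, while long blocks are more exposed to the link drifting away from the conditions they were sized for. The drift term, overhead counts and loss rates below are illustrative assumptions, not values from the paper.

```python
# Toy model of the rateless-coding block-size trade-off (hypothetical
# numbers): per-block overhead favours large blocks, but the link
# estimate grows stale while a long block is in flight, raising the
# effective loss rate and the number of repair symbols needed.
def expected_packets(data_packets, block_size, loss_rate,
                     per_block_overhead=2, drift_per_packet=0.005):
    """Rough count of transmissions to deliver `data_packets` packets."""
    blocks = -(-data_packets // block_size)  # ceiling division
    eff_loss = min(0.95, loss_rate + drift_per_packet * block_size)
    repair = block_size / (1.0 - eff_loss)   # symbols needed per block
    return blocks * (repair + per_block_overhead)

# Under 30% loss with drifting links, a mid-size block beats both extremes.
costs = {b: expected_packets(120, b, 0.3) for b in (4, 12, 60)}
best = min(costs, key=costs.get)  # -> 12 in this toy model
```

The point of the sketch is only that the optimum sits between the extremes and moves as the loss rate changes, which is why EDRP predicts link quality before choosing the block size rather than fixing it in advance.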

To make this prediction, TAO-optimised ordinal regression trees were employed: decision trees trained with Tree Alternating Optimisation, whose ordered outputs suit ranking link quality from poor to good. While the transition to grid power promises greater reliability and capability, it exposes weaknesses in existing data transmission protocols designed for intermittent connections. The core issue isn’t merely speed, but managing contention, the unavoidable collisions that occur when multiple devices attempt to transmit data simultaneously.
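The ordinal flavour of the prediction, that link-quality classes are ordered rather than arbitrary labels, can be shown with a deliberately tiny stand-in: a score over recent signal samples mapped to ordered classes via cumulative thresholds. This is not the paper's TAO-optimised tree; the RSSI thresholds and class names are assumptions chosen for illustration.

```python
# Hypothetical sketch of ordinal link-quality prediction: a score over
# a recent RSSI window is mapped to ordered classes ("poor" < "fair" <
# "good") by counting how many cumulative thresholds it clears, the
# same ordered-output idea that ordinal regression trees formalise.
def predict_quality(rssi_window, thresholds=(-85.0, -70.0)):
    """Map mean RSSI (dBm) of a recent window to an ordinal class index."""
    score = sum(rssi_window) / len(rssi_window)
    return sum(score >= t for t in thresholds)  # 0=poor, 1=fair, 2=good

CLASSES = ("poor", "fair", "good")
label = CLASSES[predict_quality([-72.0, -68.0, -66.0])]  # -> "good"
```

A real ordinal regression tree would learn both the splits and the thresholds from field measurements; the sketch only shows why mispredicting by one class is a smaller error than mispredicting by two, which is what makes the ordinal formulation a natural fit for link quality.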

Previous approaches relied on exchanging control packets to resolve these conflicts, a process that eats into available bandwidth and limits overall efficiency. Simply prioritising nodes with stronger connections proves insufficient when real-world signal strength fluctuates unpredictably. Instead, this effort demonstrates the need for a system that anticipates these changes and adapts accordingly.

By integrating active adjustments to transmission delays and intelligently selecting data block sizes, the team achieved a substantial increase in goodput, the actual rate of successful data delivery. This represents a move away from static optimisation towards a more responsive, adaptive network architecture. That said, while machine learning algorithms can estimate future conditions, they are not infallible, and inaccurate predictions could negate some of the gains.

Moreover, the experiments were conducted within a specific environment, so extending these findings to more complex and varied deployments will require further investigation. Once these limitations are addressed, the potential extends beyond simple data transmission: such an approach might be integrated with other network management techniques, such as improved routing protocols or energy-saving mechanisms.

Beyond the immediate benefits to smart grids and industrial monitoring, the principles of adaptive contention resolution could inform the design of future wireless communication systems, even those operating at higher frequencies. At a time when spectrum is increasingly crowded, any technology that improves efficiency and reduces interference deserves attention, and this effort offers a promising step in that direction.

👉 More information
🗞 EDRP: Enhanced Dynamic Relay Point Protocol for Data Dissemination in Multi-hop Wireless IoT Networks
🧠 ArXiv: https://arxiv.org/abs/2602.17619

Rohail T.

I am a quantum scientist exploring the frontiers of physics and technology. My work focuses on uncovering how quantum mechanics, computing, and emerging technologies are transforming our understanding of reality. I share research-driven insights that make complex ideas in quantum science clear, engaging, and relevant to the modern world.
