Deep Reinforcement Learning-Enhanced Levenberg-Marquardt Neural Network for Improved Energy Efficiency in Wireless Sensor Networks
DOI:
https://doi.org/10.48314/ceti.v1i3.33

Keywords:
Deep RL, Enhanced Levenberg-Marquardt neural network, Energy efficiency, Anomaly detection, WSN, Network lifetime

Abstract
In applications such as environmental monitoring and tracking, ensuring the reliability of Wireless Sensor Networks (WSNs) and optimizing Energy Efficiency (EE) has been a major challenge in recent years. These networks often face severe power limitations because Sensor Nodes (SNs) have limited energy capacity. This study proposes an advanced method that integrates Reinforcement Learning (RL) into the Low-Energy Adaptive Clustering Hierarchy (LEACH) protocol to improve clustering and Energy Management (EM). The model combines Deep Reinforcement Learning (DRL) with an Enhanced Levenberg-Marquardt Neural Network (ELMNN) classifier. This integration enables smart optimization of sensor communication protocols and clustering strategies, which helps reduce Energy Consumption (EC) and prolong the Network Lifetime (NL). Simulation results show that the proposed method outperforms conventional methods in terms of EE, Anomaly Detection (AD) accuracy, and network stability. These results highlight the model's effectiveness and its suitability for Real-Time (RT) applications in energy-constrained WSNs.
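The abstract describes coupling RL-driven cluster-head selection with a LEACH-style protocol. The following minimal sketch illustrates that idea only in simplified form: it uses a tabular Q-learning stand-in rather than the paper's DRL/ELMNN pipeline, and the node count, energy costs, and reward shaping are assumed values for illustration, not figures from the study.

```python
# Hypothetical toy sketch: an RL agent electing cluster heads in a
# LEACH-like round, rewarding elections that keep residual energy balanced.
# N_NODES, E_INIT, E_CH, E_MEMBER, and the reward are assumed values.
import random

N_NODES, E_INIT, ROUNDS = 20, 2.0, 200
E_CH, E_MEMBER = 0.02, 0.005            # per-round energy cost (assumed)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount, exploration

energy = [E_INIT] * N_NODES
q = [[0.0, 0.0] for _ in range(N_NODES)]  # per-node Q-values: 0 = member, 1 = cluster head

def reward_for_election(node):
    # Favour electing nodes whose residual energy is above the network average.
    avg = sum(energy) / N_NODES
    return energy[node] - avg

rounds_survived = 0
for rnd in range(ROUNDS):
    alive = [i for i in range(N_NODES) if energy[i] > 0]
    if not alive:
        break
    rounds_survived = rnd + 1
    for node in alive:
        # Epsilon-greedy role selection for each alive node.
        if random.random() < EPSILON:
            action = random.randint(0, 1)
        else:
            action = 0 if q[node][0] >= q[node][1] else 1
        reward = reward_for_election(node) if action == 1 else 0.0
        energy[node] -= E_CH if action == 1 else E_MEMBER
        # One-step Q-update toward the observed reward.
        q[node][action] += ALPHA * (reward + GAMMA * max(q[node]) - q[node][action])

print(f"Rounds survived: {rounds_survived}, nodes alive: {sum(e > 0 for e in energy)}")
```

In the full method as described, the tabular agent above would be replaced by a DRL policy, and the ELMNN classifier would handle anomaly detection on the sensed data; this sketch only conveys the cluster-head selection loop.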