Adaptive Resource Optimization for IoT-Enabled Disaster-Resilient Non-Terrestrial Networks using Deep Reinforcement Learning

dc.contributor.author: Jeribi, Fathe
dc.contributor.author: John Martin, R.
dc.coverage.issue: 2
dc.coverage.volume: 34
dc.date.accessioned: 2025-05-12T08:56:25Z
dc.date.available: 2025-05-12T08:56:25Z
dc.date.issued: 2025-06
dc.description.abstract: The increasing deployment of IoT devices across sectors such as agriculture, transportation, and infrastructure has intensified the need for connectivity in remote and non-terrestrial regions. Non-terrestrial networks (NTNs), which include maritime and space platforms, face unique challenges for IoT connectivity, including mobility and weather conditions, both of which are critical to maintaining quality of service (QoS), especially in disaster management scenarios. The dynamic nature of NTNs makes static resource allocation insufficient, necessitating adaptive strategies to address varying demands and environmental conditions during disaster management. In this paper, we propose an adaptive resource optimization approach for disaster-resilient IoT connectivity in non-terrestrial environments using deep reinforcement learning. Initially, we design the chaotic plum tree (CPT) algorithm for clustering IoT nodes to maximize the number of satisfactory connections, ensuring that all nodes meet sustainability requirements in terms of delay and QoS. Additionally, unmanned aerial vehicles (UAVs) are used to provide optimal coverage for IoT nodes in disaster areas, with coverage optimization achieved through the non-linear smooth optimization (NLSO) algorithm. Furthermore, we develop the multi-variable double deep reinforcement learning (MVD-DRL) framework for resource management, which addresses the congestion and transmission power of IoT nodes to enhance network performance by maximizing the number of successful connections. Simulation results demonstrate that our MVD-DRL approach reduces the average end-to-end delay by 50.24% compared to existing approaches. It also achieves a 13.01% improvement in throughput, a 68.71% gain in energy consumption efficiency, and a 17.51% improvement in the number of successful connections compared to current approaches.
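Purely as an illustration of the double deep reinforcement learning idea summarized in the abstract, the sketch below shows a generic double-DQN update for selecting a discrete transmission-power level from a per-node state vector. The state features, action set, network sizes, and hyperparameters are assumptions made for this example and are not taken from the paper's MVD-DRL design.

# Minimal double-DQN sketch for adaptive resource allocation (illustrative only).
# Hypothetical setup: the state is a small vector of per-node congestion and
# link-quality indicators, and each action picks a discrete transmission-power level.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.nn.functional as F

STATE_DIM = 4    # e.g. [queue load, link SNR, residual energy, UAV distance] (assumed)
N_ACTIONS = 5    # discrete transmission-power levels (assumed)
GAMMA = 0.99


class QNet(nn.Module):
    """Small MLP mapping a state vector to one Q-value per power level."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, x):
        return self.net(x)


online, target = QNet(), QNet()
target.load_state_dict(online.state_dict())  # sync periodically during training
optimizer = torch.optim.Adam(online.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)  # buffer of (state, action, reward, next_state, done)


def act(state, epsilon=0.1):
    """Epsilon-greedy selection of a power level for the current state."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return online(torch.as_tensor(state, dtype=torch.float32)).argmax().item()


def train_step(batch_size=64):
    """One double-DQN update: the online net selects the next action,
    the target net evaluates it, decoupling selection from evaluation."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    s, a, r, s2, done = (torch.as_tensor(x, dtype=torch.float32) for x in zip(*batch))
    a = a.long()

    q = online(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        next_a = online(s2).argmax(dim=1, keepdim=True)    # selection: online net
        next_q = target(s2).gather(1, next_a).squeeze(1)   # evaluation: target net
        y = r + GAMMA * (1.0 - done) * next_q

    loss = F.smooth_l1_loss(q, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

In practice such an agent would be driven by a network simulator that supplies the reward signal (e.g. successful connections, delay, and energy terms); that environment is not shown here.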
dc.format: text
dc.format.extent: 243-257
dc.format.mimetype: application/pdf
dc.identifier.citation: Radioengineering. 2025, vol. 34, no. 2, p. 243-257. ISSN 1210-2512
dc.identifier.doi: 10.13164/re.2025.0243
dc.identifier.issn: 1210-2512
dc.identifier.uri: https://hdl.handle.net/11012/250918
dc.language.iso: en
dc.publisher: Radioengineering Society
dc.relation.ispartof: Radioengineering
dc.relation.uri: https://www.radioeng.cz/fulltexts/2025/25_02_0243_0257.pdf
dc.rights: Creative Commons Attribution 4.0 International license
dc.rights.access: openAccess
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Internet of Things (IoT)
dc.subject: Disaster Management
dc.subject: Resource Optimization
dc.subject: Deep Reinforcement Learning
dc.subject: Non-Terrestrial Network
dc.title: Adaptive Resource Optimization for IoT-Enabled Disaster-Resilient Non-Terrestrial Networks using Deep Reinforcement Learning
dc.type.driver: article
dc.type.status: Peer-reviewed
dc.type.version: publishedVersion
eprints.affiliatedInstitution.faculty: Fakulta elektrotechniky a komunikačních technologií
Files
Original bundle
Name: 25_02_0243_0257.pdf
Size: 1.01 MB
Format: Adobe Portable Document Format