TY - GEN
T1 - RIS-Enabled UAV Swarm Optimization Framework for Energy Harvesting and Data Collection in Post-Disaster Recovery Management
AU - Dhuheir, Marwan
AU - Hamdaoui, Bechir
AU - Erbad, Aiman
AU - Al-Fuqaha, Ala
AU - Abdallah, Mohamed
AU - Guizani, Mohsen
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Unmanned aerial vehicles (UAVs) have proven useful for enabling wireless power transfer (WPT), resource offloading, and data collection from ground IoT devices in post-disaster scenarios where conventional communication infrastructure is compromised. As 6G networks emerge, offering ultra-reliable low-latency communication and enhanced energy efficiency, UAVs are poised to play a critical role in extending 6G features to challenging environments. The key challenges in this context include limited UAV flight duration, energy constraints, limited resources, and the reliability of data collection, all of which impact the effectiveness of UAV operations. Motivated by the need for efficient resource allocation and reliable data collection, we propose a solution using UAV swarms combined with reconfigurable intelligent surfaces (RIS) to optimize energy harvesting for IoT devices and enhance communication quality. We formulate the joint problem of resource optimization, UAV-RIS trajectory planning, and RIS configuration as a mixed-integer nonlinear programming problem and solve it under dynamic conditions by transforming it into a Markov decision process and applying a deep reinforcement learning (DRL) approach based on the proximal policy optimization (PPO) algorithm. Simulation results demonstrate that our framework outperforms traditional approaches, including the Actor-Critic (AC) algorithm and a greedy solution, achieving superior performance in energy harvesting efficiency, data collection, and communication reliability.
AB - Unmanned aerial vehicles (UAVs) have proven useful for enabling wireless power transfer (WPT), resource offloading, and data collection from ground IoT devices in post-disaster scenarios where conventional communication infrastructure is compromised. As 6G networks emerge, offering ultra-reliable low-latency communication and enhanced energy efficiency, UAVs are poised to play a critical role in extending 6G features to challenging environments. The key challenges in this context include limited UAV flight duration, energy constraints, limited resources, and the reliability of data collection, all of which impact the effectiveness of UAV operations. Motivated by the need for efficient resource allocation and reliable data collection, we propose a solution using UAV swarms combined with reconfigurable intelligent surfaces (RIS) to optimize energy harvesting for IoT devices and enhance communication quality. We formulate the joint problem of resource optimization, UAV-RIS trajectory planning, and RIS configuration as a mixed-integer nonlinear programming problem and solve it under dynamic conditions by transforming it into a Markov decision process and applying a deep reinforcement learning (DRL) approach based on the proximal policy optimization (PPO) algorithm. Simulation results demonstrate that our framework outperforms traditional approaches, including the Actor-Critic (AC) algorithm and a greedy solution, achieving superior performance in energy harvesting efficiency, data collection, and communication reliability.
KW - energy harvesting
KW - multiple RIS
KW - UAV swarm resource optimization
KW - wireless power transfer
UR - https://www.scopus.com/pages/publications/105018453022
U2 - 10.1109/ICC52391.2025.11161898
DO - 10.1109/ICC52391.2025.11161898
M3 - Conference contribution
AN - SCOPUS:105018453022
T3 - IEEE International Conference on Communications
SP - 1286
EP - 1291
BT - ICC 2025 - IEEE International Conference on Communications
A2 - Valenti, Matthew
A2 - Reed, David
A2 - Torres, Melissa
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2025 IEEE International Conference on Communications, ICC 2025
Y2 - 8 June 2025 through 12 June 2025
ER -