TY - JOUR
T1 - Reinforcement learning for resource provisioning in the vehicular cloud
AU - Salahuddin, Mohammad A.
AU - Al-Fuqaha, Ala
AU - Guizani, Mohsen
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/8
Y1 - 2016/8
N2 - This article presents a concise view of vehicular clouds that incorporates the various vehicular cloud models proposed to date. Essentially, they all extend the traditional cloud and its utility computing functionalities across the entities in the vehicular ad hoc network. These entities include fixed roadside units, onboard units embedded in vehicles, and the personal smart devices of drivers and passengers. Cumulatively, these entities yield abundant processing, storage, sensing, and communication resources. However, vehicular clouds require novel resource provisioning techniques that can address the intrinsic challenges of dynamic resource demands and stringent QoS requirements. In this article, we show the benefits of reinforcement-learning-based techniques for resource provisioning in the vehicular cloud. These learning techniques can account for long-term benefits and are ideal for minimizing the overhead of resource provisioning in vehicular clouds.
AB - This article presents a concise view of vehicular clouds that incorporates the various vehicular cloud models proposed to date. Essentially, they all extend the traditional cloud and its utility computing functionalities across the entities in the vehicular ad hoc network. These entities include fixed roadside units, onboard units embedded in vehicles, and the personal smart devices of drivers and passengers. Cumulatively, these entities yield abundant processing, storage, sensing, and communication resources. However, vehicular clouds require novel resource provisioning techniques that can address the intrinsic challenges of dynamic resource demands and stringent QoS requirements. In this article, we show the benefits of reinforcement-learning-based techniques for resource provisioning in the vehicular cloud. These learning techniques can account for long-term benefits and are ideal for minimizing the overhead of resource provisioning in vehicular clouds.
UR - https://www.scopus.com/pages/publications/84986220271
U2 - 10.1109/MWC.2016.7553036
DO - 10.1109/MWC.2016.7553036
M3 - Article
AN - SCOPUS:84986220271
SN - 1536-1284
VL - 23
SP - 128
EP - 135
JO - IEEE Wireless Communications
JF - IEEE Wireless Communications
IS - 4
M1 - 7553036
ER -