
Shinde, Swapnil Sadashiv; Tarchi, Daniele. "Multi-Time-Scale Markov Decision Process for Joint Service Placement, Network Selection, and Computation Offloading in Aerial IoV Scenarios." IEEE Transactions on Network Science and Engineering (ISSN 2327-4697, electronic), vol. 11 (2024), pp. 5364-5379. [DOI: 10.1109/tnse.2024.3445890]


Abstract

Vehicular Edge Computing (VEC) is considered a major enabler for multi-service vehicular 6G scenarios. However, limited computation, communication, and storage resources of terrestrial edge servers are becoming a bottleneck and hindering the performance of VEC-enabled Vehicular Networks (VNs). Aerial platforms are considered a viable solution allowing for extended coverage and expanding available resources. However, in such a dynamic scenario, it is important to perform a proper service placement based on the users' demands. Furthermore, with limited computing and communication resources, proper user-server assignments and offloading strategies need to be adopted. Considering their different time scales, a multi-time-scale optimization process is proposed here to address the joint service placement, network selection, and computation offloading problem effectively. With this scope in mind, we propose a multi-time-scale Markov Decision Process (MDP) based Reinforcement Learning (RL) to solve this problem and improve the latency and energy performance of VEC-enabled VNs. Given the complex nature of the joint optimization process, an advanced deep Q-learning method is considered. Comparison with various benchmark methods shows an overall improvement in latency and energy performance in different VN scenarios.
Files in this record:

ftgsfthhhxfcpydsdvrbcrsgfrhnsmwr.pdf
Open access
Type: Refereed final version (Postprint, Accepted manuscript)
License: Creative Commons
Size: 6.24 MB
Format: Adobe PDF

Multi-Time-Scale_Markov_Decision_Process_for_Joint_Service_Placement_Network_Selection_and_Computation_Offloading_in_Aerial_IoV_Scenarios.pdf
Open access
Type: Publisher's PDF (Version of record)
License: Creative Commons
Size: 4.1 MB
Format: Adobe PDF

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1381010
Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: not available