Correction To: Task Offloading for Vehicular Edge Computing with Edge‑cloud Cooperation

World Wide Web (2022)

Abstract
Vehicular edge computing (VEC) is emerging as a novel computing paradigm to meet the low-latency demands of computation-intensive vehicular applications. However, most existing offloading schemes do not take the dynamic edge-cloud computing environment into account, resulting in high processing delay. In this paper, we propose an efficient offloading scheme based on deep reinforcement learning for VEC with edge-cloud computing cooperation, where computation-intensive tasks can be executed locally, or offloaded to an edge server or a cloud server. By jointly considering i) the dynamic edge-cloud computing environment and ii) fast offloading decisions, we leverage deep reinforcement learning to minimize the average processing delay of tasks by effectively integrating the computation resources of vehicles, edge servers, and the cloud server. Specifically, a deep Q-network (DQN) is used to adaptively learn optimal offloading schemes in the dynamic environment by balancing the exploration process and the exploitation process. Furthermore, the offloading scheme can be quickly learned by speeding up the convergence of the training process of the DQN, which is beneficial for fast offloading decisions. We conduct extensive simulation experiments, and the results show that the proposed offloading scheme achieves good performance.
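To illustrate the exploration-exploitation balance the abstract attributes to the DQN agent, the following is a minimal sketch of epsilon-greedy action selection over the three offloading choices (local, edge server, cloud server). The action set, Q-value estimates, and function names are illustrative assumptions, not the paper's exact model.

```python
import random

# Hypothetical offloading targets; the paper's action space may differ.
ACTIONS = ["local", "edge", "cloud"]

def select_action(q_values, epsilon):
    """Epsilon-greedy policy: explore with probability epsilon,
    otherwise exploit the action with the highest estimated Q-value."""
    if random.random() < epsilon:
        return random.randrange(len(q_values))  # explore: random action
    # exploit: index of the maximum Q-value
    return max(range(len(q_values)), key=q_values.__getitem__)

# Illustrative Q-values: negative expected processing delay per action,
# so a higher value means a lower expected delay.
q = [-0.8, -0.3, -0.5]          # local, edge, cloud
best = select_action(q, epsilon=0.0)   # epsilon=0 always exploits
print(ACTIONS[best])
```

During training, epsilon is typically annealed from a high value toward a small one, shifting the agent from exploring the dynamic edge-cloud environment to exploiting the learned offloading policy.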
Keywords
Task offloading, Vehicular edge computing, Edge-cloud computing cooperation, Deep reinforcement learning, Deep Q-network