Resource allocation for MEC system with multi-users resource competition based on deep reinforcement learning approach

Computer Networks(2022)

Abstract
Mobile edge computing (MEC) is an effective computing paradigm for reducing the computing delay and energy consumption of mobile devices in the 5G era. However, in a multi-user resource-competition environment, the revenue-driven behavior of edge servers can increase delays or cause task failures for some users. Considering this situation, we take the success rate of computation offloading as the trust value of an edge server and build a system model from the user's perspective, treating delay and energy consumption as a jointly optimized multi-objective task. The optimization goal considers three factors: offloading delay, energy consumption, and queuing delay. Minimizing energy consumption and delay simultaneously involves conflicting objectives; we therefore solve the problem on the principle of reducing energy consumption as much as possible while prioritizing the offloading success rate (i.e., reducing delay). We further formulate the problem as a Markov decision process (MDP) with a multi-factor reward and treat the trust value as part of the system state. Finally, we solve the problem with an extended deep deterministic policy gradient (DDPG) algorithm, i.e., a DDPG algorithm with a multi-objective reward. Experimental results show that our proposed scheme reduces the delay and energy consumption of computation offloading for mobile users (MUs) significantly better than the baseline schemes, and its advantages are more pronounced when computing resources are scarce.
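The abstract describes a trust value defined as an edge server's offloading success rate, and a multi-factor reward that jointly penalizes delay and energy while prioritizing delay. The paper does not give the exact formulas here, so the following is a minimal illustrative sketch under assumed names and weights, not the authors' actual reward design:

```python
def update_trust(successes: int, attempts: int) -> float:
    """Trust value of an edge server, modeled (per the abstract) as its
    historical computation-offloading success rate. Servers with no
    history are assumed to start fully trusted."""
    return successes / attempts if attempts > 0 else 1.0


def reward(delay: float, energy: float, trust: float,
           w_delay: float = 0.6, w_energy: float = 0.4) -> float:
    """Hypothetical multi-objective reward: a weighted penalty on delay
    and energy, with delay weighted more heavily (delay is prioritized),
    scaled by the server's trust value so that unreliable servers yield
    weaker rewards. The weights 0.6/0.4 are illustrative assumptions."""
    return -trust * (w_delay * delay + w_energy * energy)
```

In an actor-critic setting such as DDPG, a scalar reward of this form (trust folded into the state, delay and energy folded into the penalty) is one common way to express a multi-objective trade-off as a single signal.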
Keywords
Mobile edge computing, Deep reinforcement learning, Computation offloading, Delay, Energy consumption