TY - CHAP
T1 - Computation Offloading in Energy Harvesting Powered MEC Network
AU - Sun, Zhenfeng
AU - Zhao, Ming
AU - Nakhai, Mohammad Reza
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/6
Y1 - 2021/6
N2 - Mobile edge computing (MEC) is a promising technique that migrates computation-intensive tasks from smart devices to edge servers, increasing the computational capacity of smart devices while saving battery energy. In this paper, we consider an MEC network in which a smart device, equipped with an energy harvesting module and electricity storage, chooses an offloading rate to offload its computational task to one of the edge servers. We formulate the offloading problem as a long-term joint minimization of energy consumption and delay, subject to the smart device's quality-of-experience constraints. The challenge is that the time-varying renewable energy generation and the energy consumed by the current action affect the next battery level. To address this problem, we use a reinforcement learning model to account for the future dynamics of the environment. To this end, we develop an algorithm based on a deep Q noisy neural network that automatically adjusts the noise level at smart devices for exploration, thereby replacing the epsilon-greedy policy traditionally used in Q-learning. Simulation results show that the proposed algorithm achieves lower energy consumption and better quality of experience than the celebrated deep Q-learning algorithm and a random scheme.
KW - Edge computing
KW - noisy neural network
KW - reinforcement learning
UR - http://www.scopus.com/inward/record.url?scp=85115718841&partnerID=8YFLogxK
U2 - 10.1109/ICC42927.2021.9500984
DO - 10.1109/ICC42927.2021.9500984
M3 - Conference paper
AN - SCOPUS:85115718841
T3 - IEEE International Conference on Communications
BT - ICC 2021 - IEEE International Conference on Communications, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 IEEE International Conference on Communications, ICC 2021
Y2 - 14 June 2021 through 23 June 2021
ER -