TY - GEN
T1 - Deep Reinforcement Learning-based Task Offloading and Resource Allocation in MEC-enabled Wireless Networks
AU - Birhanu Engidayehu, Seble
AU - Mahboob, Tahira
AU - Young Chung, Min
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Mobile edge computing (MEC) has recently become an enabling technology for mobile operators offering a diverse set of services. These services require extensive storage, energy, and computation resources, yet user devices (UDs) are resource-constrained and cannot meet such requirements on their own. To resolve this mismatch between resource-constrained UDs and computationally intensive services, MEC has been proposed. MEC servers provide task execution services for UDs: on receiving a service request from a UD, an MEC server within the network may dynamically allocate computation and memory resources for task execution. As MEC servers have limited capacity, efficient utilization of MEC resources is necessary. Moreover, finding an optimal resource allocation is challenging because of the varying task requirements of the diverse services offered to users and the dynamic nature of wireless networks. To address these problems, we propose a partial task offloading and resource allocation scheme that maximizes the number of user tasks completed within a tolerable time period while minimizing energy consumption. We convert the formulated optimization problem into a Markov decision process (MDP) and propose a solution based on the deep deterministic policy gradient (DDPG) algorithm. Performance results show that, compared with conventional schemes, the proposed method completes a greater number of tasks within the tolerable delay and reduces energy consumption in the network.
AB - Mobile edge computing (MEC) has recently become an enabling technology for mobile operators offering a diverse set of services. These services require extensive storage, energy, and computation resources, yet user devices (UDs) are resource-constrained and cannot meet such requirements on their own. To resolve this mismatch between resource-constrained UDs and computationally intensive services, MEC has been proposed. MEC servers provide task execution services for UDs: on receiving a service request from a UD, an MEC server within the network may dynamically allocate computation and memory resources for task execution. As MEC servers have limited capacity, efficient utilization of MEC resources is necessary. Moreover, finding an optimal resource allocation is challenging because of the varying task requirements of the diverse services offered to users and the dynamic nature of wireless networks. To address these problems, we propose a partial task offloading and resource allocation scheme that maximizes the number of user tasks completed within a tolerable time period while minimizing energy consumption. We convert the formulated optimization problem into a Markov decision process (MDP) and propose a solution based on the deep deterministic policy gradient (DDPG) algorithm. Performance results show that, compared with conventional schemes, the proposed method completes a greater number of tasks within the tolerable delay and reduces energy consumption in the network.
KW - Deep deterministic policy gradient (DDPG)
KW - mobile edge computing (MEC)
KW - partial offloading
KW - resource allocation
UR - https://www.scopus.com/pages/publications/85143085823
U2 - 10.1109/APCC55198.2022.9943689
DO - 10.1109/APCC55198.2022.9943689
M3 - Conference contribution
AN - SCOPUS:85143085823
T3 - APCC 2022 - 27th Asia-Pacific Conference on Communications: Creating Innovative Communication Technologies for Post-Pandemic Era
SP - 226
EP - 230
BT - APCC 2022 - 27th Asia-Pacific Conference on Communications
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 27th Asia-Pacific Conference on Communications, APCC 2022
Y2 - 19 October 2022 through 21 October 2022
ER -