TY - JOUR
AU - Samriya, Jitendra Kumar
AU - Kumar, Mohit
AU - Gill, Sukhpal Singh
AB - Mobile Internet services are developing rapidly for several computation-intensive applications such as augmented/virtual reality and vehicular networks. Mobile terminals use mobile edge computing (MEC) to offload tasks to the edge of the cellular network, but offloading remains a challenging issue due to the dynamism and uncertainty of incoming IoT requests and of the wireless channel state. Moreover, securing the offloaded data adds computational complexity and calls for a secure and efficient offloading technique. To tackle these issues, a reinforcement learning-based Markov decision process offloading model is proposed that optimizes energy efficiency and mobile users' time under the constrained computation of IoT devices, while guaranteeing efficient resource sharing among multiple users. The Advanced Encryption Standard is employed in this work to fulfil the data security requirements. Simulation results show that the proposed approach surpasses the existing baseline models on the offloading overhead and service cost QoS parameters while ensuring secure data offloading.
TI - Secured data offloading using reinforcement learning and Markov decision process in mobile edge computing
JF - International Journal of Network Management
DO - 10.1002/nem.2243
DA - 2023-09-01
UR - https://www.deepdyve.com/lp/wiley/secured-data-offloading-using-reinforcement-learning-and-markov-uSZMg0n61T
VL - 33
IS - 5
DP - DeepDyve
ER -