A Resource Allocation Scheme for Packet Delay Minimization in Multi-Tier Cellular-Based IoT Networks

Abstract

With advances in Internet of Things (IoT) technologies, billions of devices are becoming connected, enabling unprecedented sensing and control of physical environments. IoT devices have diverse quality of service (QoS) requirements, including data rate, latency, reliability, and energy consumption. Meeting these diverse requirements poses great challenges to existing fifth-generation (5G) cellular networks, especially in emerging scenarios such as connected-vehicle networks, where strict packet-latency guarantees may be required. In this paper, we propose a multi-tier cellular-based IoT network to address this challenge, with a particular focus on meeting application latency requirements. In the multi-tier network, access points (APs) can relay and forward packets from IoT devices or from other APs, supporting higher data rates over multiple hops between IoT devices and cellular base stations. However, multi-hop relaying may introduce additional delay, which is critical for delay-sensitive applications, so we develop new schemes to mitigate this adverse impact. First, we design a traffic-prioritization scheduling scheme that classifies packets into different priorities at each AP based on their age of information (AoI). We then design priority-specific channel-access protocols for packet transmission, ensuring QoS while making effective use of the limited network resources. A queuing-theory-based analytical model is proposed to characterize the delay of each packet type at each tier of the multi-tier IoT network. Finally, an optimization algorithm is developed to allocate spectrum and power resources across tiers so as to reduce the overall packet delay. Numerical results for a two-tier cellular-based IoT network show that the target packet delay for delay-sensitive applications can be achieved without a large cost in terms of traffic fairness.
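The abstract summarizes the mechanism only at a high level; the AoI-based classification step in particular can be illustrated with a short sketch. Everything below is a hedged assumption for illustration (the class and variable names, the two-class split, and the 50 ms threshold are invented here), not the authors' actual scheme:

```python
import heapq
import itertools
import time

# Minimal sketch of AoI-based traffic prioritization at an access point (AP).
# AoI of a packet = current time - its generation time; packets whose AoI
# exceeds a threshold are treated as delay-sensitive and dequeued first.

AOI_THRESHOLD_S = 0.05  # assumed 50 ms threshold (illustrative, not from the paper)

class AoIPriorityQueue:
    """Two-class queue: high-priority (stale) packets are served before
    low-priority (fresh) ones; ties break in FIFO order."""

    def __init__(self):
        self._heap = []                # entries: (priority, seq, packet)
        self._seq = itertools.count()  # preserves FIFO order within a class

    def enqueue(self, gen_time, payload, now=None):
        now = time.monotonic() if now is None else now
        aoi = now - gen_time           # age of information at enqueue time
        priority = 0 if aoi > AOI_THRESHOLD_S else 1  # 0 = high priority
        heapq.heappush(self._heap, (priority, next(self._seq), (gen_time, payload)))

    def dequeue(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

# Usage: a stale packet jumps ahead of a fresh one.
q = AoIPriorityQueue()
t0 = time.monotonic()
q.enqueue(gen_time=t0 - 0.2, payload="stale sensor reading")  # AoI ~ 200 ms -> high
q.enqueue(gen_time=t0, payload="fresh sensor reading")        # AoI ~ 0 ms  -> low
assert q.dequeue()[1] == "stale sensor reading"
```

Note that this sketch freezes a packet's class at enqueue time, whereas ages keep growing while packets wait; a real scheduler might re-evaluate priorities at each transmission opportunity. For the delay analysis itself, a standard single-queue approximation gives a flavor of the paper's queuing-theoretic model: under M/M/1 assumptions the mean per-hop sojourn time is 1/(μ − λ) for arrival rate λ and service rate μ, and the spectrum/power allocation effectively tunes μ at each tier to keep the end-to-end sum of such terms below the target delay.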

Publication DOI: https://doi.org/10.3390/math11214538
Divisions: College of Engineering & Physical Sciences > School of Computer Science and Digital Technologies > Electronics & Computer Engineering
College of Engineering & Physical Sciences > Aston Centre for Artificial Intelligence Research and Application
Additional Information: Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Uncontrolled Keywords: 5G, AoI, cellular-based IoT network, multiple access protocol, packet transmission delay, resource allocation, Computer Science (miscellaneous), Engineering (miscellaneous), General Mathematics
Publication ISSN: 2227-7390
Last Modified: 16 Dec 2024 09:00
Date Deposited: 13 Nov 2023 11:13
Related URLs: https://www.mdpi.com/2227-7390/11/21/4538 (Publisher URL)
PURE Output Type: Article
Published Date: 2023-11-03
Accepted Date: 2023-10-26
Authors: Li, Jin
Guan, Wenyang
Tang, Zuoyin (ORCID Profile 0000-0001-7094-999X)

Download


Version: Published Version

License: Creative Commons Attribution

