AI/ML for next generation wireless networks
Abstract
Next-generation wireless networks, 5G and beyond, are heterogeneous networks. They are dynamically formed, highly complex networks designed to transport abundant data efficiently at high speed while offering extremely low latency. 5G networks can support a larger number of subscribers with high reliability. These wireless networks face significant challenges in design, deployment, data storage, operation, administration, and management, and they demand automated design with minimal human intervention. They are highly intelligent networks in terms of reasoning and decision making. Resource allocation, resource usage, and network deployment chiefly determine the performance of next-generation wireless networks. 5G or 6G wireless networks may be standalone or built upon existing infrastructure. Next-generation wireless applications include smart homes, smart cities, transport and logistics, autonomous driving, drone operation, security and surveillance, satellite internet, smart farming, fleet management, healthcare, and mission-critical applications. These applications demand low power consumption, fast response times, low cost, and minimal interference and redundancy in order to maximize coverage and capacity. Artificial intelligence (AI) and machine learning (ML) techniques are employed to address these challenges. This article describes the integration of AI/ML technology with cellular wireless networks and, in particular, reviews the ML approaches that can be used with 6G wireless networks. It explains the incorporation of different machine learning algorithms; network optimization using AI principles; AI opportunities for wireless networks; the Q-learning algorithm and how it improves performance; the use of blockchain technology in wireless networks; cognitive radio for dynamic spectrum allocation; federated learning for data sharing among mobile devices; and reproducing kernel Hilbert spaces for data-rate improvement in 6G. Finally, channel impulse response (CIR) prediction with an analytical recursive least-squares (RLS) algorithm and with machine learning (linear regression) is presented and compared, and delay versus the number of MCDs is analyzed with Q-learning, deep Q-learning, and round-robin algorithms.
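To make the CIR-prediction comparison concrete, the following is a minimal sketch of a recursive least-squares one-step predictor alongside a batch linear-regression baseline. This is not the paper's implementation: the filter order, forgetting factor, regularization, and the toy fading trace are all illustrative assumptions.

```python
import numpy as np

def rls_predict(x_seq, d_seq, order=4, lam=0.98, delta=1e-2):
    """One-step RLS prediction: estimate d[n] from the last `order` inputs."""
    w = np.zeros(order)            # adaptive filter weights
    P = np.eye(order) / delta      # inverse input-correlation matrix
    preds = np.zeros(len(d_seq))
    for n in range(len(d_seq)):
        x = np.zeros(order)        # regressor: newest past sample first
        m = min(order, n)
        x[:m] = x_seq[n - m:n][::-1]
        preds[n] = w @ x                      # a-priori prediction
        k = P @ x / (lam + x @ P @ x)         # RLS gain vector
        e = d_seq[n] - preds[n]               # a-priori error
        w = w + k * e                         # weight update
        P = (P - np.outer(k, x @ P)) / lam    # update inverse correlation
    return preds, w
```

A toy usage, with the linear-regression baseline fitted in batch on the same regressors so the two predictors can be compared on identical data:

```python
# Toy example: predict a slowly varying CIR tap from its own past values
rng = np.random.default_rng(0)
h = np.cumsum(0.01 * rng.standard_normal(500))   # random-walk fading trace
preds_rls, _ = rls_predict(h, h, order=4)

# Batch linear-regression baseline (ordinary least squares) for comparison
order = 4
X = np.column_stack([h[order - 1 - i:len(h) - 1 - i] for i in range(order)])
w_lr, *_ = np.linalg.lstsq(X, h[order:], rcond=None)
preds_lr = X @ w_lr
```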
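Likewise, the delay analysis rests on tabular Q-learning. The sketch below shows the standard epsilon-greedy Q-learning update on a toy scheduling environment; the state and action encoding and the delay-based reward are assumptions for illustration only, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_states, n_actions = 8, 4         # e.g., quantized queue length x candidate channel
alpha, gamma, eps = 0.1, 0.9, 0.1  # learning rate, discount, exploration rate
Q = np.zeros((n_states, n_actions))

def step(state, action):
    """Toy environment: reward is the negative of an assumed queuing delay."""
    delay = abs(state - 2 * action) + rng.random()  # illustrative delay model
    next_state = rng.integers(n_states)             # simplified state transition
    return next_state, -delay

s = rng.integers(n_states)
for _ in range(5000):
    # epsilon-greedy action selection
    a = rng.integers(n_actions) if rng.random() < eps else int(np.argmax(Q[s]))
    s_next, r = step(s, a)
    # Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
    Q[s, a] += alpha * (r + gamma * np.max(Q[s_next]) - Q[s, a])
    s = s_next

policy = np.argmax(Q, axis=1)      # learned delay-minimizing action per state
```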
- Publication:
- American Institute of Physics Conference Series
- Pub Date:
- December 2023
- DOI:
- 10.1063/5.0178712
- Bibcode:
- 2023AIPC.2901f0029M