|Title|Performance Analysis of MIMO using Machine Learning in 5G Networks|
|Publication Type|Conference Paper|
|Year of Publication|2022|
|Authors|Bouras, C., Prokopiou, I., Gkamas, A., Kokkinos, V.|
|Conference Name|The Eighteenth International Conference on Wireless and Mobile Communications (ICWMC 2022), May 22–26, 2022, Venice, Italy|
Massive Multiple-Input Multiple-Output (MIMO) is an important radio antenna technology with high potential for mobile wireless networks such as 5th Generation (5G). Hybrid analog/digital precoding is an essential strategy for minimizing both energy consumption and the hardware complexity of mixed-signal components. Machine Learning (ML) could boost 5G technologies, given the rising difficulty of configuring cellular networks. More than ever, an ML computational framework is required that can successfully process the huge volumes of data normally generated by 5G networks with high subscriber and cell density. In the highly demanding Ultra-Dense Networks (UDNs) of 5G and beyond, paired with beamforming and massive MIMO technologies, ML struggles to characterize network traffic distinctly, especially as traffic is expected to become far more dynamic and complicated. This paper presents a state-of-the-art analysis of the combined and multiple uses of ML together with MIMO technology in 5G networks.