With its touted ultra-high bandwidth and low latency, 5th generation (5G) wireless technology is envisaged to usher in a new smart and connected world. The goals of our research on 5G are twofold. First, we want to gain empirical insights into the network and application performance of commercial 5G under several realistic settings, and compare it with its predecessor (4G/LTE). Second, we want to identify novel challenges that are 5G-specific and propose mechanisms to overcome them.
My talk consists of three parts. In the first part, I will describe our measurement study of commercial 5G networks with a special focus on millimeter-wave (mmWave) 5G. It is, to our knowledge, the first comprehensive characterization of 5G network performance on smartphones, closely examining the 5G services of three carriers (two mmWave carriers, one mid-band carrier) in three U.S. cities. This study finds that commercial mmWave 5G can achieve an impressive throughput of 2 Gbps. However, due to the known poor signal propagation characteristics of mmWave, the 5G throughput perceived by the user equipment (UE) is highly sensitive to user mobility and obstructions, resulting in a high number of 4G-5G handoffs. These characteristics of mmWave 5G can make the throughput fluctuate frequently and wildly (between 0 and 2 Gbps), which may confuse applications (e.g., video bitrate adaptation) and lead to highly inconsistent user experiences. Motivated by these insights, the second part of my talk will go beyond the basic measurements and describe Lumos5G - a novel and composable ML-based 5G throughput prediction framework that judiciously considers features and their combinations to make context-aware 5G throughput predictions. Through extensive field experiments and statistical analysis, we identify key UE-side factors affecting mmWave 5G performance. Besides geolocation, we quantitatively show that several other UE-side contextual factors (such as geometric features between the UE and the 5G panel, mobility speed/mode, etc.) impact 5G throughput -- far more sophisticated than those impacting 4G/LTE. Rather than affecting performance independently, these factors can interact in complex ways that are difficult to model analytically. We demonstrate that, compared to existing approaches, Lumos5G achieves a 1.37x to 4.84x reduction in prediction error.
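To make the idea of context-aware throughput prediction concrete, here is a minimal sketch (not the actual Lumos5G implementation) that fits a simple least-squares model on synthetic data. The feature names (distance to the 5G panel, UE-panel angle, mobility speed) echo the UE-side contextual factors discussed above, but the data-generating model and coefficients are illustrative assumptions only; Lumos5G itself uses richer ML models and real field measurements.

```python
import numpy as np

# Illustrative sketch only -- NOT the Lumos5G implementation.
# We synthesize UE-side context features and a throughput signal,
# then fit a least-squares linear predictor on them.
rng = np.random.default_rng(0)
n = 500

# Hypothetical UE-side context features (assumed for illustration):
#   distance_m: distance from the UE to the serving 5G panel (meters)
#   angle_deg:  angle between the UE and the panel's line of sight (degrees)
#   speed_mps:  UE mobility speed (meters/second)
distance_m = rng.uniform(10, 200, n)
angle_deg = rng.uniform(0, 90, n)
speed_mps = rng.uniform(0, 15, n)

# Synthetic "ground truth": throughput (Mbps) degrades with distance,
# angle, and speed, plus noise -- a stand-in for real field measurements.
throughput = (2000.0
              - 6.0 * distance_m
              - 8.0 * angle_deg
              - 20.0 * speed_mps
              + rng.normal(0, 50, n))

# Fit throughput ~ w . [1, distance, angle, speed] by least squares.
X = np.column_stack([np.ones(n), distance_m, angle_deg, speed_mps])
w, *_ = np.linalg.lstsq(X, throughput, rcond=None)

def predict(dist, angle, speed):
    """Predict throughput (Mbps) from the UE-side context features."""
    return float(w @ np.array([1.0, dist, angle, speed]))

# A nearby, stationary UE facing the panel should be predicted to see
# higher throughput than a distant, fast-moving one.
near = predict(20, 5, 0)
far = predict(180, 80, 12)
```

Even this toy linear model picks up the per-feature trends; the harder part, which motivates a learned framework like Lumos5G, is that in practice these factors interact non-linearly and cannot be captured by independent additive terms.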
This work can be viewed as a feasibility study for building what we envisage as a dynamic 5G performance map (akin to Google's traffic map). In the third part, I will draw on our 18 months of experience conducting field experiments on commercial 5G to give my thoughts on the current 5G landscape and highlight both the research opportunities and the challenges offered by the 5G ecosystem.
For more information, visit us @ https://5gophers.umn.edu
Arvind Narayanan is a Ph.D. candidate in the Department of Computer Science & Engineering at the University of Minnesota, advised by Professor Zhi-Li Zhang and Professor Feng Qian. His research interests are broadly in the areas of emerging scalable network architectures (such as NFV), 5G mobile networking, network data science, and content distribution networks (CDNs). He has published papers in several top venues, such as WWW, IMC, SIGCOMM, CoNEXT, APNET, Journal of Voice, GLOBECOM, and ICDCS. Arvind's recent work on 5G (including the publicly released datasets) has become the de facto baseline for understanding and evaluating the evolution of commercial 5G network performance. His work on DeepCache received the Best Paper Award at the SIGCOMM Workshop NetAI'18. Arvind completed his M.S. in Computer Science in the same department, and graduated with a B.E. in Computer Engineering with highest distinction from the University of Mumbai, where he was also awarded Best Overall Student in his batch (1 out of 120).