
6G: New Generations of Wireless and the Impact on Measurement

(Image source: Pixabay)

By the end of 2020, only about 2% of the world’s 8 billion mobile subscriptions will be 5G. Yet even though the vision for 5G is still far from being realized, work on 6G has begun; given the time required to develop a new generation of wireless, that work really started a few years ago.

The early work in 5G set the stage for developing a technology based on a user- and society-centric view, and those working on 6G are following that example. Whitepapers from the ITU, Samsung, Docomo, and the University of Oulu describe futuristic use cases and network attributes: tactile holographic communications; precise digital twins; industrial IoT in the cloud; social and societal IoT; and a pervasive use of AI to merge communications and computing with society. Traditional key performance indicators (KPIs) include data rates up to 1 Tbps, mobility of 1,000 km/h, and latency of 0.1 ms. New KPIs include the precision and accuracy of timing ("in time" and "on time" communications) and the ability to pinpoint location to centimeters. Since the vision and the KPI analyses are covered well in those whitepapers, I will focus on the impact of 6G on design, test, and measurement.
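To give a feel for why these KPIs stress measurement, here is a back-of-the-envelope sketch (my own, not from the whitepapers): the maximum Doppler shift seen by a receiver scales with both speed and carrier frequency, so the 1,000 km/h mobility KPI becomes a much harder frequency-tracking problem in the higher bands. The carrier frequencies below are illustrative assumptions.

```python
# Illustrative arithmetic only: Doppler shift at the 6G mobility KPI
# (1,000 km/h) for three assumed carrier frequencies.

C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(speed_kmph: float, carrier_hz: float) -> float:
    """Maximum Doppler shift f_d = (v / c) * f_c for a mobile at speed v."""
    v = speed_kmph * 1000.0 / 3600.0  # km/h -> m/s
    return v / C * carrier_hz

for f_c in (3.5e9, 28e9, 140e9):  # sub-8 GHz, mmWave, sub-THz (assumed)
    fd = doppler_shift_hz(1000.0, f_c)
    print(f"{f_c / 1e9:6.1f} GHz carrier -> max Doppler ~ {fd / 1e3:6.2f} kHz")

# Prints roughly 3.2 kHz at 3.5 GHz, 25.9 kHz at 28 GHz, and 129.7 kHz at
# 140 GHz: the same speed is a ~40x harder tracking problem at sub-THz.
```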

I am often asked what design and testing will be like for 6G, and having watched the other five generations evolve from ideas to the mainstream, I believe I can anticipate a few things:

  1. Testing will happen in both traditional and new domains.

  2. Test technology and solutions will evolve over time.

  3. Complex system-level validation will take an even bigger role than in previous generations.

With history as an indicator, it is safe to say that this will take some time. Automated mobile radio systems - those not requiring an operator or push-to-talk functionality - were conceived in the early 1970s, building upon frequency-reuse concepts patented by Bell Labs in the late 1940s. NTT launched the first commercial system in 1979, followed by the Saudi and Nordic launches of NMT in 1981, and then by AT&T's 1983 launch of AMPS in the USA. Each subsequent generation has launched at roughly one-decade intervals.

(Figure: wireless history)

Mobile Communications Measurement – Some History

Mobile wireless first enabled us to carry our phones anywhere, and it now allows our office, education, and entertainment to be anywhere. The next step is for 6G to become an integral part of society. The industry constantly pushes the state of the art in affordable technology. Some examples we take for granted: 1G was not feasible without the microprocessor; 2G and 3G required revolutions in digital radio transceivers; and 4G would not exist without the lithium-ion battery.

That same pressure also drove the evolution of test and measurement requirements. We started with a considerable focus on measuring radio physics: power, sensitivity, and interference. Each subsequent generation drove change on two axes: 1) the way these measurements had to be made, and 2) new validation requirements, often at higher layers of system performance. Signal-to-noise-based sensitivity measurements evolved to bit error rate (BER), then to block error rate (BLER), and now must consider noise plus interference. Modulation accuracy went from modulation depth error to error vector magnitude (EVM). We added the testing of voice codecs, data throughput, battery drain, and handovers. Now we measure things like scheduler efficiency and even "quality of service" (QoS). 5G will bring system-level issues related to requirements for security, reliability, latency, and system power consumption. The increasing demands from industry and society required simulation, design, measurement, and validation to evolve from radio physics to voice and data performance, and then to system performance.
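To make the EVM metric mentioned above concrete, here is a minimal sketch of how an RMS EVM figure is computed from reference and measured constellation points. The QPSK constellation and the noise level standing in for transmitter impairments are illustrative assumptions, not a description of any instrument's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ideal QPSK reference symbols, normalized to unit average power (assumed).
n = 1000
ideal = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)

# "Measured" symbols: reference plus additive noise standing in for
# real transmitter impairments (an assumption for this sketch).
noise = 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
measured = ideal + noise

# RMS EVM: RMS of the error vector relative to RMS reference power, in %.
evm_rms = 100 * np.sqrt(np.mean(np.abs(measured - ideal) ** 2)
                        / np.mean(np.abs(ideal) ** 2))
print(f"RMS EVM: {evm_rms:.2f}%")  # ~7% for this assumed noise level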

Societies and governments are paying close attention to 5G, with a special interest in public safety, information security, and national interests. This implies design and validation requirements not just for new physical attributes, like time precision and jitter, but also for system-wide attributes, including service level agreement (SLA) adherence and "quality of experience" (QoE). In 6G, we can even foresee policy-driven requirements for system-level performance. An obvious example would be a government's use of a 6G network slice. A not-so-obvious example would be 6G as an integral part of automated driving or healthcare. Either of these drives strict safety, security, and reliability requirements, the enforcement of which citizens will expect of their governments.

Some of these changes are visible to us now as we help our customers with 5G technology. Of course, they all want to measure their radios or the speed of their fiber-optic systems and data centers.  However, we also get questions like: How can I validate what I am providing in my SLA with my customer?  What is causing the problems with voice quality?  How can we ensure mobile games run properly in the network and on specific mobile devices?  What level of security can be guaranteed?
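As one hedged illustration of the SLA question above, the sketch below checks measured latency samples against a percentile bound of the kind an SLA might specify. The 99th-percentile/20 ms target and the synthetic latency distribution are hypothetical, not terms from any real contract.

```python
import numpy as np

def sla_met(latencies_ms, percentile: float = 99.0, bound_ms: float = 20.0) -> bool:
    """True if the given percentile of measured latency is within the bound."""
    return float(np.percentile(latencies_ms, percentile)) <= bound_ms

# Synthetic latency measurements (lognormal is a common rough model;
# the parameters here are arbitrary assumptions).
samples = np.random.default_rng(1).lognormal(mean=2.0, sigma=0.4, size=10_000)

print(f"p99 latency: {np.percentile(samples, 99.0):.2f} ms")
print("SLA met:", sla_met(samples))
```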

6G will drive new technical demands in five major areas:

  • Next-generation radio in all bands, plus the addition of bands above 100 GHz. This includes new technology to improve spectral and energy efficiency below 8 GHz, generational improvements in 20-70 GHz mmWave, and the addition of sub-THz (100-1000 GHz) bands for communications, sensing, and imaging (see the path-loss sketch after this list).

  • Integrated heterogeneous multi-radio access technology (RAT) systems: Seamless and intelligent use of 6G radio systems with non-terrestrial networks as well as legacy wireless systems, personal area networks, and near field communication (NFC).

  • Time engineering in networks: Further reductions in latency, plus predictable and programmable latency for precise-time applications.

  • AI-based networking: The use of artificial intelligence (AI) to optimize real-time network operations and performance.  Also, the connectivity and sharing of pervasively distributed AI data, models, and knowledge.

  • Advanced security: Pervasive application of security technology for privacy, attack prevention, attack detection, attack resilience, and recovery in a zero-trust environment.
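To illustrate the first bullet above, here is a minimal sketch of why the sub-THz bands change the radio measurement problem: free-space path loss, FSPL = 20·log10(4πdf/c), grows by 20 dB per decade of carrier frequency, so the same link distance costs far more dB at 140 GHz than below 8 GHz. The frequencies and the 100 m distance are illustrative assumptions.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB for an isotropic link."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

for f in (3.5e9, 28e9, 140e9, 300e9):  # assumed example bands
    print(f"{f / 1e9:6.1f} GHz, 100 m: FSPL = {fspl_db(100.0, f):6.1f} dB")

# Prints roughly 83 dB at 3.5 GHz versus 115 dB at 140 GHz: over 30 dB
# more loss to overcome with antennas, beamforming, and link design.
```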

All but the first of these will have to be validated from the physical level to the system level. My prediction that some system-level testing will be dictated by policy requirements is sometimes dismissed by my colleagues. But as mentioned above, governments around the world are engaged in intense dialog on 5G as it relates to security and national interests. Regional and community governments are developing local ordinances related to mobile device usage, cell siting, and electromagnetic exposure. And, earlier in the 5G lifecycle than in previous generations, departments of defense are exploring the use of 5G for their needs.

If you still have your doubts about the impact of policy, consider early radio history: the universal distress call, S-O-S (Morse code …---…), was not always the standard. These three symbols, chosen for their simplicity and ease of distinction, were standardized at the International Radio Telegraph Convention of Berlin in 1906. The Titanic disaster in 1912 led to the standardization of not only a common distress radio channel (500 kHz, λ = 600 m) but also international maritime law stipulating that all shipboard radio telegraph offices be staffed at all times. So we have early policy already dictating 1) message types, 2) radio channels, and 3) behavior. There are many additional examples since then, and with radio systems a fundamental part of society, we can expect to see more.

Roger Nichols is 5G and 6G Program Manager at Keysight Technologies.