The key to early 5G investigation – test tools for new network paradigms
The explosion of wireless data demand continues, with analysts predicting between 20 and 50 billion devices by the year 2020, ranging from M2M devices that transmit a few bytes per day to applications that stream multiple high-definition video channels. Studies into future user demands give network operators the goal of creating an infrastructure that provides the impression of limitless capacity in any situation, including venues such as sports stadia and concerts where there are dense user populations. Figure 1 shows what this might mean for network development.
To date, the biggest factors that determine a user's experience (and therefore perception) of the network are coverage, cell density and the number of channels available in any one location.
The requirements of the next-generation network differ depending on the device and service being considered. As examples, high-volume data users’ requirements center on connectivity and latency, with little consideration for efficiency and (in many instances) mobility. In contrast, IoT needs focus on reliability, cost and efficiency and public safety needs focus on immediacy and availability almost regardless of cost. Given that some of these requirements are diametrically opposed, it would appear that a ‘one size fits all’ network is unlikely.
From the telecoms operator's perspective the goal must be to deliver services at a price where the value to the customer exceeds the cost of providing them – in other words, earning a return on the investment in network installation and running costs. To this end, many operators are moving away from their historical business of mobile voice, text and data provision, and moving (through organic growth, acquisition or industry alliances) to a 'quadruple play' model, providing fixed telephony, mobile voice and data, home broadband and streaming video, with the intent of being their customer's integrated communications provider.
To support this model, the latest studies postulate the key network attributes that will be required: an integrated wireline/wireless network, where the wireless part comprises a dense network of small cells with capacity enhanced through high-order spatial multiplexing (MIMO), cell data rates of the order of 10 Gb/s, and round-trip latency of 1ms. Most studies now assume multiple air interfaces, which will include current and future cellular standards evolution, extended WiFi integration, and operation at microwave or millimeter-wave frequencies.
While the network must support the successors to today's smartphones and tablets and their data-hungry applications, some researchers are also concerned with the other end of the device spectrum – the battery-powered sensors and actuators for the IoT; simple devices that transmit very little data, and are designed to have an unattended operating life of years. Designing and developing this type of device requires specific tools that can measure battery drain under three main conditions: 'sleep mode', when the device is completely inactive; 'idle mode', when the device is active but not transmitting; and 'transmit mode', when it is sending data. Current consumption for these devices is random in nature, and ranges from nano-amps in sleep mode to milli-amps in idle mode and amps in transmit mode, with very steep rise and fall times over very short intervals (*See Figure 2).
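The three current-consumption modes above dominate an IoT device's battery life, and the duty cycle between them matters far more than the peak transmit current. A minimal sketch of the arithmetic, using assumed (not measured) figures for an hourly-reporting sensor:

```python
# Illustrative battery-life estimate for an IoT sensor.
# All currents and durations below are assumed example figures,
# not measurements; the point is the averaging across modes.

def battery_life_hours(capacity_mah, phases):
    """phases: list of (current_amps, seconds_per_cycle) tuples."""
    cycle_s = sum(s for _, s in phases)
    charge_per_cycle_c = sum(i * s for i, s in phases)  # coulombs per cycle
    avg_current_a = charge_per_cycle_c / cycle_s
    capacity_c = capacity_mah * 3.6                     # 1 mAh = 3.6 C
    return capacity_c / avg_current_a / 3600.0

# One report per hour: 1 s transmit at 100 mA, 2 s idle at 5 mA,
# the remaining 3597 s asleep at 2 uA, on a 1000 mAh cell.
phases = [(100e-3, 1.0), (5e-3, 2.0), (2e-6, 3597.0)]
print(f"{battery_life_hours(1000, phases) / 8760:.1f} years")
```

With these numbers the average current works out to a few tens of microamps, giving an operating life of several years – which is why accurately measuring the sleep-mode floor matters as much as capturing the transmit spikes.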
In the mobile world, capacity gains come essentially from three variables: more spectrum, better spectral efficiency and better frequency re-use through progressively smaller cell sizes (Figure 3). The fourth-generation networks currently being built use more frequency bands than previous generations, and can use both broader channel bandwidths and aggregated carriers – the combination of two or more channels at different frequencies – to further increase capacity. Even so, today's mobile network operators are convinced that existing spectrum is inadequate to meet capacity forecasts.
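The first two capacity levers can be pictured with the Shannon capacity formula: aggregating carriers multiplies the usable bandwidth, while better efficiency shows up as a higher achievable SNR. A rough sketch, with example numbers chosen for illustration only:

```python
import math

# Shannon capacity C = B * log2(1 + SNR) as a rough upper bound.
# The 20 MHz carrier width matches LTE; the SNR value is an
# arbitrary example, not a claim about real networks.

def shannon_mbps(bandwidth_hz, snr_db):
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

single = shannon_mbps(20e6, 20)          # one 20 MHz carrier at 20 dB SNR
aggregated = 5 * shannon_mbps(20e6, 20)  # five aggregated carriers (100 MHz)
print(f"{single:.0f} Mb/s single, {aggregated:.0f} Mb/s aggregated")
```

Doubling spectrum doubles the bound directly, whereas improving SNR only helps logarithmically – one reason operators press so hard for new spectrum rather than relying on efficiency gains alone.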
Therefore, in parallel with developments in cellular networks, new variations of the IEEE 802.11 wireless LAN standard bring new capability to shorter-range communications. 802.11ac is an extension of 802.11n, providing a minimum of 500Mb/s single-link and 1Gb/s overall throughput, running in the 5GHz band. 802.11ad provides up to 7Gb/s throughput using approximately 2GHz of spectrum at 60GHz over a short range. The goal is for all variations of the 802.11 standard to be backward compatible, and for 802.11ac and 802.11ad to be compatible at the Medium Access Control (MAC) or Data Link layer and differ only in physical layer characteristics. Devices could then have three radios – 2.4GHz for general use, but which may suffer from interference; 5GHz for more robust and higher-speed applications; and 60GHz for ultra-high-speed within a room – and support session switching amongst them. 802.11ac supports channel bandwidths up to 160MHz, denser 256QAM modulation, and up to 8×8 MIMO to maximize capacity, and both 802.11ac and 802.11ad support beamforming, allowing system resources to be focused on individual devices to improve transmission reliability. Mobile network operators worldwide are forming alliances with public WiFi providers to offer data session offload and WiFi calling; the former being a means of augmenting network capacity, and the latter being a means of extending voice network coverage to 'not-spots' where there is no cellular service.
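The 802.11ac parameters quoted above (160MHz channels, 256QAM, 8×8 MIMO) combine multiplicatively into the peak PHY rate. A back-of-envelope calculation, using standard VHT values for the subcarrier count, coding rate and symbol time (stated here from memory of the specification, so treat them as assumptions):

```python
# Rough 802.11ac peak PHY rate from the parameters in the text.
data_subcarriers = 468   # data tones in a 160 MHz VHT channel (assumed VHT value)
bits_per_symbol = 8      # 256-QAM carries 8 bits per subcarrier
coding_rate = 5 / 6      # highest VHT coding rate
symbol_time_s = 3.6e-6   # OFDM symbol duration with short guard interval
streams = 8              # 8x8 MIMO, eight spatial streams

rate_bps = (streams * data_subcarriers * bits_per_symbol
            * coding_rate / symbol_time_s)
print(f"{rate_bps / 1e9:.2f} Gb/s")  # roughly 6.9 Gb/s peak PHY rate
```

The result is close to 7Gb/s – comparable to the 802.11ad figure, but achieved through MIMO and dense modulation in 160MHz at 5GHz rather than through 2GHz of raw bandwidth at 60GHz.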
Many of the investigations into next-generation cellular technology are focused on overlaying today's mobile networks with a dense network of small cells in new spectrum at millimeter-wave frequencies, where multi-gigahertz modulation bandwidths are possible. Such a combined network would behave in a different way from the 'traditional' 1 to 6GHz RF frequencies and 10 to 20MHz bandwidths of today's cellular and WiFi systems. A number of universities and R&D departments of network equipment manufacturers have been investigating possible network topologies and corresponding signal transmission properties. Various studies have looked at frequencies from 28GHz to the lower end of E band (which covers 60 to 90GHz). They are also investigating the effects of multiple spatial streams (MIMO) and beamforming, both of which require arrays of transmit and receive antennas. In addition, a number of government and quasi-government organizations around the world are sponsoring research, with the intent of moving forward with a global approach to next-generation network infrastructure. The results of all these investigations will feed into the standards-setting process for so-called 5th generation networks – 5G – which will formally start around 2015/16.
These investigations break new ground for communications systems, and will require new test solutions. For example, an antenna array component at 60+ GHz is small, and may be bonded directly to the transmitter power amplifier output or receiver input it serves, so no metallic connection to it is possible. Creating a repeatable means of calibrating a test system's connection to the component and determining its true performance is a new challenge for test equipment suppliers. It requires network analyzers that operate at millimeter-wave frequencies, together with innovative ways of providing the known standards used in their calibration.
Investigating how wideband transmissions at these new frequencies behave in ‘real life’ requires new tools for generating and analyzing them. Some of the experience of developing components for 802.11ad Wireless LAN is feeding into this development. See Figure 4 for an example of a 60GHz test bed for analyzing 2GHz BW signals. (*See Figure 4).
Before a new air interface standard has been agreed, early investigation into transmission performance using various modulation types and densities can be done using a waveform creation software application, a wideband arbitrary waveform synthesizer and a millimeter-wave generator as an upconverter. The waveform creation application should have a clear and comprehensive user interface that allows a wide range of complex test signals to be created quickly and easily, and to be loaded into a compatible generator. Waveform creation software such as Keysight's Signal Studio can be used, along with a wideband arbitrary waveform synthesizer, to create the required baseband waveform, then the resulting signal can be up-converted to the millimeter-wave frequency required. Later in the development, there will be a need to create and transmit fully-coded signals that can be demodulated by prototype receivers. Again, multiple signals will be needed to test multiple-stream scenarios. End-to-end testing using a system simulation tool such as Keysight SystemVue provides the option to create an ideal system that includes multiple transmission paths, then to replace block diagram elements with real components and corresponding real-world measurements as components become available. Multiple sources, emulating real-world overlapping networks and interference from other spectrum users, will be needed to fully characterize performance in a hostile spectrum environment.
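The waveform-creation step described above – building an arbitrary modulated baseband for later upconversion – can be sketched in a few lines of generic DSP. This is an illustrative example only, not the Signal Studio workflow, and the constellation, symbol count and pulse shape are arbitrary choices:

```python
import numpy as np

# Minimal sketch: generate a 16-QAM baseband waveform of the kind that
# would be loaded into a wideband arbitrary waveform synthesizer and
# then upconverted to a millimeter-wave carrier.
rng = np.random.default_rng(0)
n_symbols = 1000
osr = 4  # samples per symbol (oversampling ratio)

# Random 16-QAM symbols, normalized to unit average power
levels = np.array([-3, -1, 1, 3])
symbols = (rng.choice(levels, n_symbols)
           + 1j * rng.choice(levels, n_symbols)) / np.sqrt(10)

# Upsample by zero-stuffing, then apply a simple smoothing filter
# (a placeholder pulse shape, not a true root-raised-cosine)
upsampled = np.zeros(n_symbols * osr, dtype=complex)
upsampled[::osr] = symbols
taps = np.hamming(8 * osr + 1)
taps /= taps.sum()
baseband = np.convolve(upsampled, taps, mode="same")
```

In a real setup the I and Q samples of `baseband` would be played out by the arbitrary waveform synthesizer and mixed up to the target millimeter-wave frequency; swapping the constellation or pulse shape is how different candidate modulation types and densities get compared before a standard exists.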
Microwave signal simulation, generation and analysis is not new to Keysight – the company has for many years made test equipment for millimeter-wave components for radar and communications systems that range up to over 100GHz. To ensure Keysight has the simulation and test tools for leading-edge product development, the company has representatives on the relevant standards bodies and industry forums, where it is involved in developing measurement science and test methodologies to give the greatest insight into device and system performance.