The first and foremost change is that conventional wide-beam cell sector coverage is no longer used. Concepts like cell center, cell middle and cell edge are gone to some extent. Previous-generation cellular systems typically use three-sector (alpha, beta and gamma) 120-degree wide-beam sector coverage.
The disadvantage of the wide beam is that, to transmit a signal to a mobile terminal in a particular direction, it must transmit across the entire cell sector, which degrades the link budget and introduces interference.
In 5G NR, these beams are formed by an analog beam-forming technique, but for data transmissions the 5G system dynamically uses analog, digital, or a combination of analog and digital beam-forming, called hybrid beam-forming.
As cell coverage is beam based, a mobile terminal in a 5G cell will synchronize, attach and report on a beam. The mobile terminal connects to only a single beam; multi-beam connection is not supported in 3GPP Release 15.
Beam management is a procedure consisting of a set of phases:
(a) Beam sweeping
(b) Beam measurements
(c) Beam determination
(d) Beam reporting
(e) Beam failure recovery
(a) Beam sweeping:
Beam sweeping is a technique of transmitting beams in all predefined directions in a burst at regular intervals. For example, the first step in the mobile terminal attach procedure is initial access, which is to synchronize with the system and receive the minimum system information broadcast. An "SS block" carries the PSS, the SSS and the PBCH, and it is repeated in predefined directions (beams) in the time domain within a 5 ms window; this is called an SS burst, and the SS burst is typically repeated with a 20 ms periodicity. The diagram below illustrates the concept.
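The sweep timing described above can be sketched as follows. This is an illustrative simplification, not the exact NR SSB symbol pattern: it merely assumes each burst set sweeps all beams evenly inside the 5 ms window and repeats every 20 ms.

```python
# Illustrative sketch of SS-burst beam sweeping (simplified, assumed timing).
# Each burst set sweeps all predefined beam directions inside a 5 ms window,
# and the burst set repeats every 20 ms.

def ss_burst_schedule(num_beams, num_bursts, period_ms=20.0, window_ms=5.0):
    """Return (start_time_ms, beam_index) pairs for each SS block."""
    schedule = []
    slot = window_ms / num_beams  # time share per beam inside the sweep window
    for burst in range(num_bursts):
        start = burst * period_ms
        for beam in range(num_beams):
            schedule.append((start + beam * slot, beam))
    return schedule

# Example: 8 beams (the sub-6 GHz case), two burst sets
for t, beam in ss_burst_schedule(num_beams=8, num_bursts=2):
    print(f"t = {t:6.3f} ms  ->  SS block on beam {beam}")
```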
It should be understood that the 20-beam cell sector coverage diagram above (in the previous section) does not imply fixed, always-on beams carrying reference and synchronization signals; it is just for visualization. So it is now clear that a 32-beam Nokia gNB will transmit 32 SS blocks in different predefined directions (beams) at regular intervals; the set of directions covered by the SS blocks may or may not cover the entire set of predefined directions available. The maximum number of predefined directions (beams / SS blocks) in the SS burst set is frequency dependent: up to 3 GHz it is 4, from 3 GHz to 6 GHz it is 8, and from 6 GHz to 52.6 GHz it is 64.
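The frequency-dependent maximum number of SS blocks per burst set quoted above can be captured in a small lookup, sketched here:

```python
def max_ss_blocks(carrier_ghz):
    """Maximum SS blocks (beams) per SS burst set, per the frequency
    ranges described above: 4 up to 3 GHz, 8 up to 6 GHz, 64 up to 52.6 GHz."""
    if carrier_ghz <= 3.0:
        return 4
    if carrier_ghz <= 6.0:
        return 8
    if carrier_ghz <= 52.6:
        return 64
    raise ValueError("carrier frequency outside the NR Release 15 range")

print(max_ss_blocks(2.6))   # sub-3 GHz  -> 4
print(max_ss_blocks(3.5))   # 3-6 GHz    -> 8
print(max_ss_blocks(28.0))  # mmWave     -> 64
```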
(b) Beam measurements / (c) Beam determination:
In IDLE mode the measurement is based on the SS (synchronization signal), and in connected mode it is based on CSI-RS in the DL and SRS in the UL. The CSI-RS measurement window configuration, such as periodicity and time/frequency offsets, is relative to the associated SS burst. The best beam needs to be searched for periodically, using the SS and CSI-RS measurement results. Like SS blocks, CSI-RS is also covered using the beam sweeping technique; considering the overhead of covering all the predefined directions, CSI-RS is transmitted only in a subset of those predefined directions (beams), based on the locations of the active mobile terminals.
The SRS in the UL is similar to the LTE specification: the mobile terminal transmits the SRS as directed by the gNB, and the gNB measures the SRS to determine the best UL beam.
The DL beam is determined by the mobile terminal; the criterion is that the beam should be received with the maximum signal strength above a predefined threshold.
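That selection rule can be sketched as follows. The function name, the RSRP values and the threshold are all illustrative assumptions; only the rule itself (strongest beam above a predefined threshold) comes from the text.

```python
def select_best_beam(measurements, threshold_dbm):
    """measurements: {beam_id: measured RSRP in dBm}.
    Return the strongest beam at or above the threshold, or None if no
    beam qualifies (selection rule as described in the text)."""
    qualified = {b: p for b, p in measurements.items() if p >= threshold_dbm}
    if not qualified:
        return None
    return max(qualified, key=qualified.get)

# Hypothetical per-beam measurements
rsrp = {0: -95.0, 1: -88.5, 2: -101.2, 3: -90.0}
print(select_best_beam(rsrp, threshold_dbm=-100.0))  # beam 1
```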
(d) Beam reporting:
In IDLE mode, after the mobile terminal has selected an SS block (beam), there are one or more predefined RACH opportunities for that SS block, with certain time and frequency offsets and a direction specific to that SS block, so the mobile terminal knows in which transmit (UL) beam to send the RACH preamble. This is how the mobile terminal notifies the gNB (transmit/receive point, TRP) which beam is the best. The RACH configuration is indicated to the mobile terminal in the system information, and there is a one-to-one mapping between SS blocks and RACH occasions. The UE sends the PRACH preamble in the RACH occasion corresponding to the DL SS block in which the best signal strength was detected.
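A minimal sketch of that one-to-one SS-block-to-RACH-occasion mapping follows. In a real network the offsets come from the system information broadcast; the numbers used here are purely hypothetical.

```python
# Hypothetical one-to-one mapping from a detected SS block to its RACH
# occasion. Real time/frequency offsets are signaled in system information;
# the values below are illustrative only.

def rach_occasion_for_ssb(ssb_index, t0_ms=10.0, spacing_ms=1.25,
                          freq_offset_rb=2):
    """Return (time_ms, frequency_rb) of the RACH occasion tied to an SS block."""
    return (t0_ms + ssb_index * spacing_ms, freq_offset_rb)

best_ssb = 3  # the SS block in which the best signal strength was measured
t_ms, f_rb = rach_occasion_for_ssb(best_ssb)
print(f"Send PRACH preamble at t = {t_ms} ms, RB offset {f_rb}, "
      f"using the UL beam paired with DL SS block {best_ssb}")
```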
The diagram below illustrates the Rx-beam to Tx-beam mapping during initial access in 5G NR.
In connected mode, the mobile terminal provides feedback using the control channel. In case of link failure, when no direction can be recovered using CSI-RS, the mobile terminal will try to recover the link using the SS bursts.
(e) Beam failure recovery:
When the mobile terminal suffers from poor channel conditions, it receives a beam failure indication from the lower layers. The mobile terminal then requests a recovery by indicating a new SS block or CSI-RS, which is done by starting a RACH procedure. The gNB transmits a DL assignment or UL grant on the PDCCH to end the beam failure recovery.
Beam Scheduling / gNB MAC Scheduler:
Beam-based cell sector coverage (that is, beam-formed-only transmission) enables a new way of scheduling: the same time/frequency resources can be reused and scheduled on more than one beam simultaneously within a TTI. And the massive MIMO system enables scheduling data transmission as a 3D beam-formed, MU-MIMO transmission.
The mobile terminal distribution may be consolidated in a single beam or spread across different beams, so the gNB MAC scheduler should be able to schedule more than one beam in a TTI. There is a restriction on the number of beams that can be scheduled in a TTI; this is due to the antenna design, essentially the phase shifters in the AAS (Active Antenna System) and the number of TXRUs (transceiver units). A further complexity is that the gNB must select a set of N beams per TTI from among the 32 beams; as described in the Nokia article quoted below, there are 30,000+ different combinations from which the set of N beams must be selected.
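The "30,000+ combinations" figure follows directly from counting the ways to choose N = 4 beams out of 32, which can be checked in one line:

```python
import math

beams_per_cell = 32
beams_per_tti = 4  # N beams scheduled per TTI

# Number of distinct 4-beam sets out of 32 beams: C(32, 4)
combinations = math.comb(beams_per_cell, beams_per_tti)
print(combinations)  # 35960 candidate beam sets per TTI
```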
We know that for the DL the mobile terminal reports CSI-RS based beam quality; the interesting point is that there is a good chance different mobile terminals in a single beam will report different CSI reports based on their channel quality. A further point is that data is also transmitted using the hybrid beam-forming technique, meaning it is not necessary to send the data only in the predefined analog beams; it can be transmitted in a new direction using a beam narrower than the analog beam.
In LTE, Proportional Fair (PF) scheduling is typically used; the PF scheduler considers the channel quality, the rate achieved so far, and the maximum achievable rate of a mobile terminal when scheduling. Applying a plain PF scheduler in a 5G NR system with CSI-RS based beam quality reports does not solve the problem: the scheduled mobile terminals might, for example, fall in ten different beams, which cannot all be scheduled in a 5G NR system in one TTI. So the gNB should consider which beams to schedule (based on beam quality) as well as proportional fairness to the mobile terminals, and among the selected mobile terminals at least two or three should form a good combination for MU-MIMO transmission to achieve the optimal rate. It is thus a three-dimensional problem for the gNB scheduler: the number of beams (based on beam quality), proportional fairness to the mobile terminals, and the optimal rate from MU-MIMO transmission.
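One simple way to combine PF fairness with the per-TTI beam limit is a greedy sketch like the one below. This is an illustrative assumption, not the gNB's actual algorithm (and it ignores the MU-MIMO pairing dimension): users are ranked by the classic PF metric (instantaneous rate divided by average served rate), then admitted while capping the number of distinct beams per TTI.

```python
# Sketch of a beam-aware proportional-fair selection (hypothetical, greedy):
# rank users by the PF metric inst_rate / avg_rate, then admit users in that
# order while limiting the number of distinct beams scheduled in one TTI.

def beam_aware_pf(users, max_beams):
    """users: list of (ue_id, beam_id, inst_rate, avg_rate) tuples."""
    ranked = sorted(users, key=lambda u: u[2] / u[3], reverse=True)
    scheduled, beams = [], set()
    for ue_id, beam_id, inst, avg in ranked:
        # Admit if the UE's beam is already active, or a beam slot remains.
        if beam_id in beams or len(beams) < max_beams:
            scheduled.append(ue_id)
            beams.add(beam_id)
    return scheduled, beams

users = [("ue1", 0, 10.0, 2.0), ("ue2", 5, 8.0, 1.0),
         ("ue3", 9, 6.0, 3.0), ("ue4", 0, 4.0, 4.0)]
sched, beams = beam_aware_pf(users, max_beams=2)
print(sched, beams)  # ue3 is skipped: its beam would exceed the 2-beam cap
```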
Wireless carriers around the world are pushing to bring 5G service to their customers as quickly as possible, but the new radio access networks—which will rely on emerging technologies, including millimeter waves and huge antenna arrays known as massive MIMO—will be a lot more complicated than what came before.
Nokia is applying machine learning to some of the problems that result from this complexity, hoping that artificial intelligence can boost network performance and cut costs, Rajeev Agrawal said recently during a 5G summit at the Computex trade show in Taipei, Taiwan.
Agrawal, who is in charge of Nokia’s radio access network offerings, presented three possibilities for machine learning and 5G that Nokia has studied internally but not yet published in academic research papers.
Scheduling Beamforming in Massive MIMO Networks
In a MIMO (multiple-input multiple-output) network, cellular base stations send and receive radio frequency signals in parallel through many more antennas than are normally used on a base station. This means the base station can transmit and receive more data, but these signals also interfere with one another.
Beamforming is a signal processing technology that lets base stations send targeted beams of data to users, reducing interference and making more efficient use of the radio-frequency spectrum.
One of the challenges in building these systems is figuring out how to schedule the beams. Nokia, for example, has a system with 128 antennas all working together to form 32 beams and wants to schedule up to four beams in a specified amount of time. The company also wants to schedule those beams in a sequence that will provide the highest spectral efficiency, which is a measure of how many bits per second a base station can send to a set of users.
The number of possible ways to schedule four of 32 beams mathematically adds up to more than 30,000 options. There’s simply not enough processing power on a base station to quickly find the best schedule for that many combinations.
Nokia says it was able to train neural networks offline to find the best schedule, and then quickly predict the best schedules on demand, although the company did not provide data to back up its performance or allow comparisons to other possible heuristics.
Another way to make more efficient use of spectrum in 5G networks is to install miniature base stations, or small cells, that can deliver wireless service closer to where customers are physically located. This can also help carriers solve another problem—finding the location of indoor objects, such as sensors or smart speakers in a home. GPS signals can typically identify an object’s indoor location no more accurately than within about 50 meters.
Agrawal said a small cell network’s radio-frequency data can be used to train a machine-learning algorithm to infer the positions of network users’ equipment. A slide from his presentation claimed mean positioning errors of 1 meter (m), 1.3 m, and 0.9 m using LTE eNB radio-frequency data from cells on different floors of a mall in China.
Nokia's approach is first to measure, at multiple points in a room, the received signal strength from each cell. Then the company uses these maps to train neural networks to predict the location of a device based on the strength of the signals it receives from nearby cells.
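The map-then-predict idea can be sketched with a simple nearest-neighbor fingerprint lookup. Note this is an assumption for illustration, not Nokia's method (which trains neural networks), and all positions and signal strengths below are made up.

```python
# Fingerprint positioning sketch (nearest-neighbor, hypothetical data).
# Survey phase: known (x, y) points with received signal strength (dBm)
# from three nearby small cells.
fingerprints = {
    (0.0, 0.0): (-60.0, -75.0, -80.0),
    (5.0, 0.0): (-70.0, -65.0, -78.0),
    (0.0, 5.0): (-72.0, -79.0, -62.0),
    (5.0, 5.0): (-77.0, -68.0, -66.0),
}

def locate(rss):
    """Predict position as the survey point whose RSS vector is closest."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(fingerprints, key=lambda p: dist2(fingerprints[p], rss))

print(locate((-69.0, -66.0, -77.0)))  # closest fingerprint: (5.0, 0.0)
```

A neural network generalizes the same idea: instead of snapping to the nearest surveyed point, it interpolates a continuous position from the signal-strength map.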
Configuring Uplink and Downlink Channels
In order for a smartphone to work properly on a cellular network, engineers need to effectively configure the size of that device's uplink control channel, which transmits feedback on network quality. The more spectrum the uplink control channel uses, the better the quality of data transmission from a customer's smartphone can be, but this also means there is less spectrum available for data transmission. It's a trade-off.
There are already techniques to automatically make this trade-off decision for 3G and 4G, but Agrawal said it is a “very important problem as we go to 5G” in part because the uplink control channel data will be more “rich.” For example, it could carry important information on the beams in a massive MIMO network.
Agrawal said a machine-learning system would first predict user equipment characteristics, such as mobility. Then, the system would make a prediction about what the uplink/downlink throughputs would be, against different settings, and pick the best setting.
Agrawal said he's "not trying to say all of these [applications I presented] are right," but to him, machine learning will be a key part of 5G networks.
Editor’s note: This story was updated on 13 July 2018 to correct the mean positioning errors for Nokia’s experiments.