Cellular IoT Power Saving Techniques: How Do The New Cat-M1 and Cat-NB1 Protocols Achieve Ultra-Low Energy Consumption?

As the competition over the protocols that will define the Internet of Things unfolds, 3GPP’s Cellular IoT (C-IoT) is definitely a strong contender. 3GPP Release 13 includes significant enhancements targeting C-IoT use cases. These are typically characterized by low-volume data transmissions that are not particularly sensitive to delay, for example smart grid sensors or asset trackers. Power consumption, on the other hand, is critical in these use cases. Some scenarios require battery-powered IoT devices to be deployed for ten years without maintenance, meaning that one or two AA batteries must last at least a decade without recharging. To address these scenarios, 3GPP Release 13 introduces two new User Equipment (UE) categories for IoT connectivity:

  1. Category-M1 (eMTC) – specified to provide a variable rate of up to 1 Mbps in a 1.4 MHz narrowband
  2. Category-NB1 (NB-IoT) – specified to provide less than 100 kbps in a 200 kHz narrowband

Standard-Defined Power Saving Techniques

To address these rigorous power-consumption requirements, the specifications for the new categories take advantage of some existing power saving techniques and introduce new ones. Altogether, these techniques can extend UE battery life to ten years or more, while also extending coverage by 15 to 20 dB with respect to regular cellular services. We review here the most important power saving techniques currently specified within Cat-M1 and Cat-NB1.
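To get a feel for how demanding the ten-year requirement is, a back-of-envelope budget helps. The battery figures below are assumptions for illustration (roughly 2000 mAh of usable AA capacity, self-discharge ignored), not numbers from the 3GPP specifications:

```python
# Back-of-envelope average-current budget for a decade of operation.
# Assumed, illustrative numbers: ~2000 mAh usable capacity from the
# AA cells, self-discharge and regulator losses ignored.

HOURS_PER_YEAR = 365 * 24

def average_current_budget_ua(capacity_mah, years):
    """Maximum sustainable average current, in microamps."""
    return capacity_mah * 1000 / (years * HOURS_PER_YEAR)

budget = average_current_budget_ua(capacity_mah=2000, years=10)
print(f"Average current budget: {budget:.1f} uA")  # ~22.8 uA
```

An average draw of only a few tens of microamps leaves no room for a radio that is awake more than a tiny fraction of the time, which is why the techniques below all aim to maximize deep-sleep time.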

  1. Power Saving Mode (PSM)
    PSM was introduced in Release 12. To maximize UE downtime, the device performs a periodic tracking area update (TAU), after which it remains reachable for paging during a configurable idle window. Once that window passes, the device becomes dormant and is unreachable until the next periodic TAU. This power saving method is especially important for use cases that require sparse periodic reporting, for example once a day.
  2. Extended Discontinuous Reception (eDRX)
    Discontinuous reception (DRX) specifies a sleep period of up to 10.24 s between paging cycles to reduce power consumption. The new extended version, eDRX, enables the UE to sleep for a predefined number of 10.24 s hyperframes (HFs) before becoming available to receive traffic from the network. The maximum number of HFs a device can request adds up to about 40 minutes of extended sleep for Cat-M1 and almost three hours for Cat-NB1.
Depiction of extended sleep time using eDRX
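The sleep times quoted above follow directly from the hyperframe length. As a sketch, assuming maximum eDRX cycles of 256 hyperframes for Cat-M1 and 1024 for Cat-NB1 (the values that yield the durations cited in the text):

```python
# Maximum eDRX sleep times implied by the 10.24 s hyperframe,
# assuming maximum cycle lengths of 256 HFs (Cat-M1) and
# 1024 HFs (Cat-NB1).

HYPERFRAME_S = 10.24  # seconds per hyperframe

def edrx_cycle_minutes(num_hyperframes):
    """Total eDRX cycle length in minutes."""
    return num_hyperframes * HYPERFRAME_S / 60

cat_m1 = edrx_cycle_minutes(256)    # ~43.7 minutes
cat_nb1 = edrx_cycle_minutes(1024)  # ~174.8 minutes, almost 3 hours
print(f"Cat-M1 max eDRX cycle:  {cat_m1:.1f} min")
print(f"Cat-NB1 max eDRX cycle: {cat_nb1:.1f} min")
```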



  3. A lower UE power class
    The new specification defines a 20 dBm power class, compared with 23 dBm in legacy LTE. This reduces power dramatically, since the power amplifier (PA) is by far the dominant power consumer in a Cellular IoT system. It also reduces the overall BOM by allowing the PA to be integrated into a single-chip CMOS solution.
  4. Reduced reporting based on limited mobility
    This optimization takes advantage of limited-mobility scenarios, in which the UE is assumed to be either stationary or moving at very low speed. Under these assumptions, the specification allows relaxation of the neighbor-cell measurement and reporting periods, reducing processes that occupy RF resources and consume power.
  5. Upper layer optimizations
    1. Minimize signaling overhead
      Simplified signaling reduces traffic between the UE and the base station, which saves a significant amount of power by shortening the “RF on” period.

      1. User Plane CIoT EPS optimization
        This feature allows Radio Resource Control (RRC) connections to be suspended and resumed. In legacy LTE, a new RRC context must be established whenever the UE moves from idle mode to connected mode. Re-establishing the context takes only about a hundred milliseconds, but for IoT devices that send just a few bytes, it is very significant overhead. Hence, a method has been specified to preserve the context of an RRC connection and suspend it on the device and network sides rather than releasing it.
      2. Control Plane CIoT EPS optimization
        In LTE, the signaling (control) plane is used for management tasks such as authentication and connection establishment, while data is transmitted over the user plane. For high-data-rate transmissions, the overhead of setting up a connection on the signaling plane may be negligible. But when dealing with extremely small data packets, setting up a connection for a short transmission has a significant impact on the overall communication time, and hence on power consumption. To reduce this overhead, Control Plane CIoT EPS optimization specifies a way to include user data packets within signaling messages, eliminating the connection-setup overhead.
    2. Robust Header Compression (RoHC)
      This optimization also targets small, periodic data transmissions. Since internet protocols (such as UDP or TCP) require headers, those headers significantly increase the size of a typical small IoT packet. As in the previous optimizations, the overhead also increases power consumption (sending more bits requires more power). RoHC, which is typically used by VoIP applications, is very useful in such cases: by compressing the headers, the overall transmission becomes significantly smaller, leading to a drastic reduction in power consumption.
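The effect of header compression on a small packet can be sketched numerically. The uncompressed sizes below are the standard IPv4 and UDP header lengths; the ~3-byte compressed header is a commonly cited steady-state RoHC figure, used here purely as an assumption for illustration:

```python
# Illustrative on-air size comparison for a small UDP/IPv4 report,
# with and without RoHC. The 3-byte compressed header is an assumed
# steady-state figure, not a guaranteed value.

IPV4_UDP_HEADER = 20 + 8  # bytes: standard IPv4 + UDP headers
ROHC_HEADER = 3           # bytes: assumed steady-state RoHC header

def on_air_bytes(payload, header):
    """Total bytes transmitted for one report."""
    return payload + header

payload = 50  # bytes of sensor data
plain = on_air_bytes(payload, IPV4_UDP_HEADER)  # 78 bytes
rohc = on_air_bytes(payload, ROHC_HEADER)       # 53 bytes
saving = 1 - rohc / plain
print(f"Uncompressed: {plain} B, RoHC: {rohc} B, saving {saving:.0%}")
```

For a 50-byte payload the headers are over a third of the transmission, so compressing them shrinks the “RF on” time, and hence energy, by a similar fraction.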

Additional Power Saving Optimizations for UE Designers

The above optimizations are built into the 3GPP standard to enable significant power reduction for IoT use cases. In addition to these, the implementation and algorithms deployed by the UE designer also have considerable influence on the overall power consumption. Here are some examples of optimizations that can further reduce power consumption.

  1. Repetition “early termination”
    One of the main methods of enhancing coverage (one of the targets of 3GPP) is repetition: a single transport block (TB) is transmitted several times, and the UE combines the received copies to increase the signal-to-noise ratio (SNR) and decode the TB correctly. Since the granularity of repetition, and the ability to predict the required number of repetitions, is limited, 3GPP made sure the standard supports “early termination”: the UE may attempt to decode the TB before all repetitions have arrived. Once it decodes correctly, the UE can shut off its RF and skip the remaining repetitions. By implementing superior receiver algorithms, the UE can decode sooner and shut off the RF earlier, saving power.
  2. Cell synchronization
    The UE must re-synchronize to the frame timing (using the PSS and SSS sequences) after each long sleep period. Depending on coverage conditions, this can take tens to hundreds of milliseconds, and a long sync means a long “RF on” period just to prepare for wakeup. Sophisticated synchronization algorithms can shorten this duration, making a significant difference between designs and having a large impact on overall power consumption.

Clearly, these optimizations, designed for the small, periodic data traffic that is the essence of C-IoT UE requirements, should be handled differently than high-data-rate use cases. Scenarios that stream heavy data payloads care little about small headers or short sleep times, because these become relatively insignificant. Conversely, when dealing with tiny devices deployed in huge numbers to support smart city infrastructure, smart grids, and many other uses, the little things add up and every milliwatt counts.

An Efficient Protocol Must Run on an Efficient Architecture

Of course, all these protocol and algorithm optimizations must be implemented on an architecture that can take full advantage of them to reduce power as much as possible. In much the same way that 3GPP analyzed the specific requirements of C-IoT use cases and adjusted the protocols accordingly, the device must also be designed with a deep understanding of the use cases and protocols to maximize efficiency. The architecture must be designed with ultra-low cost and ultra-low power in mind. The CEVA-X1 IoT processor, based on the new CEVA-X Architecture Framework, is the first of its kind to deliver both DSP and CPU functionality in a single core with uncompromising performance and ultra-low power consumption. Learn more about it here.

