Across Europe, electricity meters are getting smarter. Starting in 2024, every home or business consuming more than 6 megawatt-hours annually needs a meter that can communicate remotely. Sounds simple enough. But these devices are exposing a fundamental flaw in the wireless networks designed to handle them.
The problem isn't the number of devices. It's how they behave.
Narrowband Internet of Things technology—NB-IoT for short—was designed specifically to handle millions of sensors and meters scattered across cities. Standardization work began in 2014, and network operators embraced it; by 2016, installations were already underway. The technology met every requirement laid out for fifth-generation mobile networks: low power consumption, deep indoor coverage, massive device density.
Yet the requirements were written for a world that no longer exists.
Traditional IoT devices wake up, send a reading, and go back to sleep. A temperature sensor might transmit once every two hours. A parking sensor pings when a car arrives or leaves. This asynchronous behavior—sporadic, unpredictable, spread across time—was baked into the standard. NB-IoT was optimized for it.
Smart electricity meters don't work that way.
These meters use connection-oriented protocols. Think of them as requiring a formal handshake before every conversation. Breaking and reestablishing that connection creates massive overhead. A meter that reconnects every fifteen minutes generates four times more network traffic in twenty-four hours than one that stays permanently connected. So they stay connected. Always on. Always listening.
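The factor-of-four figure can be sanity-checked with back-of-the-envelope arithmetic. In this sketch, the 80-byte uplink and 30-byte downlink attach costs are the article's own figures; the 40-byte reading payload and the 15-minute polling cadence are illustrative assumptions, not values from the paper.

```python
# Back-of-the-envelope check of the reconnection overhead.
# Attach costs (80 B up, 30 B down) are the article's figures;
# the 40 B reading payload is an assumed, illustrative value.

ATTACH_UP, ATTACH_DOWN = 80, 30      # bytes of signaling per (re)attach
READING = 40                         # assumed bytes per meter reading
POLLS_PER_DAY = 24 * 4               # one query every fifteen minutes

always_on = POLLS_PER_DAY * READING                                 # data only
reconnecting = POLLS_PER_DAY * (READING + ATTACH_UP + ATTACH_DOWN)  # data + handshake

print(reconnecting / always_on)      # 3.75 -> roughly "four times more traffic"
```

With these assumed sizes, the handshake alone nearly quadruples the daily traffic, which is exactly why the meters stay permanently connected instead.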
And they're queried in sync.
Electricity distributors don't poll meters randomly. They query them on schedules—sometimes as frequently as every five to fifteen minutes during critical grid events. When a power distributor needs to manage load across a neighborhood, hundreds or thousands of meters receive requests simultaneously. They all try to respond at once.
The collision is invisible but catastrophic.
Researchers modeled what happens when these two traffic types—conventional sporadic sensors and permanently connected smart meters—share the same wireless channel. The analysis reveals a system under strain.
NB-IoT cells use a random access procedure. Devices choose from a limited set of channels called preambles to announce they want to transmit. When traffic arrives randomly, collisions are manageable. But when hundreds of meters all attempt access in the same instant, the math changes. Preambles collide. Devices retry. Delays explode.
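A quick Monte-Carlo sketch makes the collapse tangible. The 48-preamble pool is a typical NB-IoT configuration and the device counts are illustrative; this is a toy model of synchronized contention, not the researchers' analytical framework.

```python
import random

# Toy Monte-Carlo model of one random-access opportunity under a
# synchronized burst. A 48-preamble pool is a typical NB-IoT figure;
# the device counts below are illustrative, not the paper's model.

def success_fraction(devices, preambles=48, trials=1000, seed=1):
    """Fraction of devices that picked a preamble nobody else picked."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        picks = [rng.randrange(preambles) for _ in range(devices)]
        counts = {}
        for p in picks:
            counts[p] = counts.get(p, 0) + 1
        ok += sum(1 for p in picks if counts[p] == 1)
    return ok / (trials * devices)

for n in (5, 50, 500):
    print(n, round(success_fraction(n), 3))
# Success collapses as the burst grows: a handful of devices almost
# always get through; five hundred simultaneous meters almost never do.
```

Asynchronous traffic rarely puts more than a few devices into the same opportunity, so collisions stay rare; a synchronized query drops hundreds into it at once, and nearly every preamble choice collides.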
The model developed for this study tracks two populations: conventional devices generating messages according to standard patterns and smart meters queried at regular intervals. The researchers built a two-dimensional framework capturing both device types, then simplified it using state aggregation techniques to make calculations computationally feasible.
Their findings are stark.
A single NB-IoT cell remains stable for up to seven hundred twenty thousand conventional devices. Add smart meters demanding permanent connectivity and that ceiling drops to nine thousand. Not nine thousand more. Nine thousand total.
The number of permanently connected devices affects their own delays linearly. Expected. But their impact on conventional devices is exponential. A handful of smart meters barely registers. One hundred? Delays for conventional sensors start climbing. One thousand? The system buckles.
ITU specifications require delays under ten seconds for massive machine-type communications. The model shows conventional devices meet this threshold even with one thousand permanently connected meters in the cell. But the meters themselves? They violate the requirement at just one hundred units.
The querying interval doesn't matter much. Whether meters are polled every five minutes or every hour, the fundamental problem persists. The synchronous arrival creates a burst that overwhelms the access mechanism regardless of how long the quiet period lasts.
Geography compounds the issue. Deep indoor locations—basements, underground parking garages—require more transmission repetitions to punch through concrete and steel. These extended coverage levels consume more airtime per device. In field studies from the Czech Republic, researchers found coverage distributions ranging from sixty percent of meters in good conditions to just five percent in the worst category. The better the coverage, the higher the capacity. But even optimistic scenarios break under synchronized load.
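The dilution effect can be sketched numerically. The 60-percent "good" and 5-percent "worst" shares follow the article's Czech field figures; the 35-percent middle share and the per-class repetition factors are assumed purely for illustration.

```python
# Sketch of how deep-indoor coverage dilutes capacity. The 60% "good"
# and 5% "worst" shares are the article's Czech field figures; the 35%
# middle share and the repetition factors are assumed for illustration.

share   = {"good": 0.60, "medium": 0.35, "deep_indoor": 0.05}
repeats = {"good": 1,    "medium": 8,    "deep_indoor": 32}   # assumed

# Average airtime per message, relative to best-case (1x) conditions.
avg_airtime = sum(share[c] * repeats[c] for c in share)
print(round(avg_airtime, 2))   # 5.0 -> every message costs ~5x the best case

# Capacity scales inversely: the same cell serves ~1/5 as many devices.
capacity_scale = 1 / avg_airtime
```

Even a small fraction of deep-indoor meters dominates the average, because each of their messages monopolizes the channel for many repetitions.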
One solution exists: add more channels. The analysis shows that allocating three resource blocks instead of one brings permanently connected device delays back under the ten-second threshold for all tested configurations. But spectrum is expensive. Network operators designed NB-IoT to squeeze into guard bands—unused slivers at the edges of existing LTE channels. Tripling that allocation means tripling the cost.
The alternative is redesigning the access mechanism itself. Current protocols treat all traffic equally. They don't distinguish between a smoke detector that transmits once a month and a smart meter pinged every five minutes. Prioritization schemes could help. So could dedicated channels for synchronized traffic. Both require changes to standards, to infrastructure, to millions of deployed devices.
The urgency isn't hypothetical. Electric vehicle charging stations need similar connectivity patterns. Vehicle-to-grid communication for load balancing depends on real-time bidirectional links. Microgrids coordinating distributed solar generation can't tolerate fifteen-minute response times. The energy transition depends on communications infrastructure that doesn't yet reliably support it.
Field measurements confirm the model's predictions. In deployments around Hradec Králové in the Czech Republic, network operators calculated theoretical cell capacity for different coverage scenarios. Under asynchronous traffic, delivery times ranged from two hundred fifty milliseconds to ten seconds. Acceptable. Switch to synchronous queries and delays stretch into minutes. The exact duration depends on how many meters respond simultaneously.
After a blackout, the problem intensifies. Meters must reattach to the network, adding eighty bytes upstream and thirty bytes downstream for synchronization. Multiply that across every meter in a neighborhood recovering power and the network chokes on its own overhead.
The model itself offers a glimpse of the solution. By tracking the evolution of both traffic types through time, it reveals when and where the bottleneck occurs. Not in data transmission—messages are small. Not in synchronization—that's fixed overhead. The random access phase is the chokepoint. Too many devices, too few preambles, insufficient time to resolve collisions before the next batch arrives.
This isn't a problem fifth-generation networks were designed to solve. It's a problem they created by succeeding too well at connecting everything.
Credit & Disclaimer: This article is a popular science summary written to make peer-reviewed research accessible to a broad audience. All scientific facts, findings, and conclusions presented here are drawn directly and accurately from the original research paper. Readers are strongly encouraged to consult the full research article for complete data, methodologies, and scientific detail. The article can be accessed through https://doi.org/10.1109/JIOT.2024.3495698






