There’s a reason why the devices in our pockets are known as “smart” phones. These sophisticated gadgets have brought the computing and internet revolutions from our desktops to our palms, and every year they grow more powerful. But our handsets aren’t the only things in the mobile world evolving.
Just as our phones have changed from voice-centric to data-centric devices, the networks that connect them have undergone their own transformation. The most obvious change has been the rapid migration away from 2G and 3G networks to much faster 4G LTE technologies, making a 5 or 10 Mbps mobile connection the norm in the United States. But mobile operators aren’t just upgrading the radio technologies that connect your phone. They’re also changing the fundamental design of the network to meet these new data demands.
The basic building block of the network, the cell, is becoming much smaller and the end result of this will be mobile networks with much more capacity. That means not just faster speeds to your smartphone or tablet, but also mobile networks that can support many more fast connections, particularly in the dense urban areas where data demands are highest.
How can shrinking the size of the mobile network's components make it better? Well, it helps if you understand the basics of how cellular systems work.
As usage on a mobile network increases, as it has during the mobile data explosion, operators can shrink the size of each cell site to spread that usage over more cell sites.
Each cell only supports a certain amount of capacity, and that capacity must be shared by all of the customers connecting to that cell. Let’s say, for example, that a 4G cell has 100 Mbps of capacity. If that cell covers roughly the area of a square mile and there are 100 people surfing their phone browsers in that square mile, then you’re theoretically splitting that 100 Mbps one hundred different ways.
But instead of building a single big macro cell on a tower or building top, what if you built a dozen smaller cells mounted on utility poles or the sides of buildings in that same square mile area? Each small cell would have the full capacity of a macro cell, 100 Mbps, but each cell would only connect a handful of users. That means not only does the overall capacity of the network increase, but the individual speeds each user sees would be significantly faster because they’re splitting airtime with only a few other customers. By building denser networks, operators re-use their valuable airwaves much the same way we re-use unlicensed spectrum every time we set up a Wi-Fi access point.
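The arithmetic behind cell splitting is simple enough to sketch in a few lines of code. The sketch below uses the article's illustrative numbers (100 Mbps per cell, 100 users per square mile, a dozen small cells) and an idealized assumption that users are spread evenly across cells; real networks are messier, but the basic scaling holds.

```python
# Back-of-envelope model of cell splitting, using the article's example
# figures. These are illustrative assumptions, not real network measurements,
# and the model assumes a perfectly even spread of users across cells.

def per_user_throughput_mbps(cell_capacity_mbps, num_cells, num_users):
    """Ideal even split: each cell's capacity is shared equally
    by the users connected to it."""
    users_per_cell = num_users / num_cells
    return cell_capacity_mbps / users_per_cell

# One macro cell serving all 100 users in the square mile:
macro = per_user_throughput_mbps(100, 1, 100)

# Twelve small cells covering the same square mile:
small = per_user_throughput_mbps(100, 12, 100)

print(f"Single macro cell: {macro:.1f} Mbps per user")   # 1.0 Mbps
print(f"Dozen small cells: {small:.1f} Mbps per user")   # 12.0 Mbps
```

Same spectrum, same users, but a twelvefold jump in per-user speed, which is why densification is so attractive to operators despite the deployment headaches described below.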
So right about now you’re probably asking yourself if small cells are the answer, why haven’t operators slapped them on every light pole in the U.S.? If only it were that easy, but there are some technical obstacles standing in the way of mass-scale small cell networks.
First, there’s the issue of physics. Not all signals are created equal, and the strongest signal doesn’t necessarily yield the best experience. Every time you put a new cell in a network transmitting at the same frequencies as its neighbors, you introduce interference. You can think of your connection to the macro-cellular tower like a couple having a conversation in an empty room. There’s nothing preventing the pair from understanding each other clearly. But if you suddenly put dozens of couples talking over each other in the same room, it becomes much harder to pick each individual conversation out of the resulting cacophony.
The same goes for cellular networks. If you pack too many cells into the same limited space, you risk creating a murky soup of cross-interference. What’s more, as operators deploy small cells, they aren’t just tossing out their macro networks. These small cells will have to operate under the umbrellas of those big cells, which produces even more potential for interference.
The second issue is logistical. Cells don’t exist in a vacuum. They have to be connected to operators’ core networks with fiber-optic cables or high-speed radio links (your Wi-Fi router at home would be pretty useless, too, if your home broadband connection weren’t backing it up). Furthermore, operators have to negotiate with city governments, utilities and property owners to mount these small cells where they’re needed.
But these aren’t insurmountable obstacles. Operators and wireless engineers are deploying the infrastructure, closing the deals and developing the interference-mitigation technologies necessary to make small cells happen. Consequently, we’re starting to see dense clusters of indoor and outdoor cells installed in cities all over the world.
Notably, Verizon is engaged in a very ambitious small cell project in the tech corridors of San Francisco. When the rollout is completed at the end of 2015, Verizon will have 400 pint-sized cells – each covering a 250-to-500-foot radius – lining the streets of SOMA, the Financial District and North Beach. Those neighborhoods are significant because they’re home to one of the largest concentrations of startups and tech companies in the world.
Verizon is also deploying small cells in New York, Chicago and other big U.S. cities – exactly where you’d expect to find a dense concentration of smartphone users demanding bandwidth from Verizon’s networks.
When we talk about mobile service, we tend to talk in terms of bigness: expansive coverage areas, fat data pipes and – the latest trend – large-screened phones. But as the mobile networks grow to meet our ever-growing hunger for data, they aren’t going to get bigger. They’re going to get smaller.