Mobile Broadband Revisited

I am working on a piece on the “National Broadband Plan” but thought it might be useful to revisit a couple of articles I wrote for RCR Wireless. Some more thoughts in the next few days.

Defining Broadband

The Vision
Mobile broadband is the network connectivity environment where networks of different shapes and sizes collaborate to give users unfettered access to the information they seek and the content they want to engage with, connecting people in new and exciting ways, where time and distance are all but collapsed to provide access to anyone and anything faster than the speed of thought. At least, that’s the vision.

In 1991, the late Mark Weiser of XEROX PARC, considered the father of ubiquitous computing, dreamed of an always-on, always-connected world in which humans and computers are seamlessly united. In 2002, my friend and coauthor Dr. Yasuhisa Nakamura, then CTO of NTT DoCoMo, wrote in our book that his vision of mobile broadband is a world in which wireless infrastructure becomes indistinguishable from air: omnipresent, just there without us consciously searching for it. Here we are in 2009, where the FCC is engaged in the noble task of defining broadband and various players are quibbling over a few kbps in speed requirements. But as the national debate on broadband reaches a fever pitch, one has to come back to the task at hand and figure out what defines "mobile broadband."

Defining Broadband
The FCC’s current definition of broadband reads: "The term broadband commonly refers to high-speed Internet access that is always on and faster than the traditional dial-up access." Faster than dial-up doesn’t really conjure up an image of a 21st-century-ready infrastructure. So, how do we go about defining mobile broadband, what benchmarks are meaningful, and, most importantly, what factors would yield sustainable competitive advantage to service providers?

First of all, we shouldn’t mix wireless and wireline for some time. The inherent cost structures, economics, and physics of the two media are quite different. By expecting wireless to deliver wireline performance and pricing, we are setting ourselves up for disappointment.

Real speeds, coverage, and spectrum
The speed of the network has long been the main benchmark for mobile broadband, especially the peak rates possible using a given technology. For example, in the GSM family of technologies, GPRS roughly equates to 114 kbps, EDGE to 474 kbps, UMTS to 2 Mbps (stationary), HSPA to 7.2 Mbps, HSPA+ to 28 Mbps, and LTE to 100 Mbps (of course, there are differences in upload, download, peak, off-peak, min, max, etc.). However, the real-life network speeds experienced by average consumers are typically 40-60% of the peak rates. During peak traffic times, the speed drops even further.
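To make that gap concrete, here is a minimal sketch that applies the 40-60% rule of thumb to the peak rates listed above. The `real_world_range` helper is hypothetical, purely for illustration, not an industry measurement tool.

```python
# Peak rates per technology, as cited in the text (Mbps).
PEAK_RATES_MBPS = {
    "GPRS": 0.114,
    "EDGE": 0.474,
    "UMTS": 2.0,
    "HSPA": 7.2,
    "HSPA+": 28.0,
    "LTE": 100.0,
}

def real_world_range(peak_mbps, low=0.40, high=0.60):
    """Typical consumer throughput range, using the 40-60% rule of thumb."""
    return peak_mbps * low, peak_mbps * high

for tech, peak in PEAK_RATES_MBPS.items():
    lo, hi = real_world_range(peak)
    print(f"{tech:6s} peak {peak:7.3f} Mbps -> typical {lo:6.2f}-{hi:6.2f} Mbps")
```

Even before peak-hour congestion, the advertised LTE headline of 100 Mbps shrinks to a 40-60 Mbps everyday experience under this rule of thumb.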

We should be looking at bandwidth requirements through the eyes of the consumer. Someone living in Bellingham, Wash., only cares about the coverage and the average bandwidth available to them at any given moment. What ultrafast networks are deployed in Washington, D.C., is of little interest to them. So, we need to measure coverage and consistency in performance across the nation. We also need to keep a spectrum scorecard, for we can promise 100 Mbps, but the spectrum required under the current set of technologies is simply inadequate. Hence, the benchmarks for mobile broadband need to be closely correlated to the national spectrum dedicated to mobile.

As a first step, we need to shift the discussion from peak rates to average rates and measure the average throughput at any given time across various markets. Any issues with the backhaul network will also be reflected in these numbers and thus will help us understand the state of the mobile infrastructure at a more granular level. Japan, Korea, and Australia are investing heavily in upgrading their national mobile infrastructure to stay ahead of demand. Progress in these countries will clearly serve as a guiding principle for the U.S. and other economies.
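The shift from peak to average rates amounts to a simple per-market aggregation of measured samples. The sketch below is illustrative only; the market names and throughput numbers are invented, not real measurements.

```python
from statistics import mean

# Hypothetical throughput samples (Mbps) collected over time, per market.
samples_by_market = {
    "Bellingham, WA": [1.1, 0.8, 1.4, 0.6],
    "Washington, DC": [3.2, 2.7, 1.9, 3.5],
}

def market_report(samples):
    """Average and worst observed throughput per market."""
    return {
        market: {"avg_mbps": round(mean(values), 2), "min_mbps": min(values)}
        for market, values in samples.items()
    }

print(market_report(samples_by_market))
```

Reporting the minimum alongside the average is deliberate: the consumer in Bellingham experiences the trough, not the headline number, and a market-level minimum also surfaces backhaul bottlenecks that a national average would hide.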

As we move into the 3.5G and 4G mobile network arena, latency (along with jitter) will start to become an important benchmark as well. Reduction in the time to fetch content enables better user experiences. An all-IP network introduces a flatter network architecture which in turn reduces the latency in the network. Better user experience paves the way for more usage and higher content consumption which in theory yields informed citizens and higher productivity.
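As a rough illustration of the two benchmarks, latency can be taken as the mean round-trip time over a set of samples, and jitter as the mean absolute difference between consecutive samples (in the spirit of RFC 3550's interarrival jitter). The RTT values below are invented for the example.

```python
# Hypothetical round-trip-time samples in milliseconds; the 95 ms spike
# is what jitter captures and plain average latency partly hides.
rtt_ms = [48, 52, 47, 95, 50, 49]

def latency_ms(samples):
    """Mean round-trip time."""
    return sum(samples) / len(samples)

def jitter_ms(samples):
    """Mean absolute difference between consecutive RTT samples."""
    diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]
    return sum(diffs) / len(diffs)

print(f"latency {latency_ms(rtt_ms):.1f} ms, jitter {jitter_ms(rtt_ms):.1f} ms")
```

A flatter all-IP architecture removes hops from the path, which lowers both numbers; jitter matters because a network with good average latency but wild variance still delivers a poor streaming and browsing experience.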

We should also keep track of the average bandwidth being consumed by users on a monthly basis. With accurate measurement, the ecosystem can plan better. Other regulatory agencies, such as the Hong Kong Telecommunications Authority, regularly publish mobile data usage statistics. While the task is much bigger in the U.S., some measure of the pace of growth is necessary for the ecosystem to appreciate the risks and the opportunities.

Next, we need to keep track of the average price paid by consumers for mobile broadband and mobile data consumption over time and the choice of providers available to consumers on a national basis. The above also needs to be measured from a demographics point of view by looking at the numbers for a wide variety of user populations. Additionally, these measurements need to evolve over time as our understanding of what’s important to the consumer changes.

Intelligent Platforms
Finally, while the debate is focused on how to deal with data growth, little attention is being paid to how to use the terabytes of data that are being generated. In other words, there is a lot of focus on data creation but little on intelligence extraction. Most service providers are consumed by network upgrades (the move from WCDMA to HSPA+ to LTE, and so on), but little investment is going into understanding the consumer and their mobile data behavior: How are they consuming data? What are their preferences and unmet needs? How do you tailor content, value-added services, and pricing plans at a subscriber level? How do you leverage mobile as a media channel?
Don’t get me wrong, carriers absolutely need to build a robust network that can stay ahead of consumer demand, but they also need to continue to innovate on several key fronts. By focusing too much on network build-out and too little on building intelligent platforms that can harness the power of these networks, many service providers are leaving the door open for others to extract more value out of these network upgrades. Sustainable competitive advantage can only be built by understanding the consumer better, and mobile affords that opportunity. Players who focus on extracting intelligence from their networks will be able to withstand emerging business threats better than those who invest little in building out the platform. And intelligence is something the FCC can’t regulate, but consumers will see the difference.

Solutions for the Broadband World

In the last column I talked about setting the goals and defining mobile broadband. While we are still a ways away from defining what constitutes broadband, another key debate has emerged in the past few weeks: how do we go about solving the increased-capacity problem? FCC Chairman Julius Genachowski has done a masterful job of outlining the principles, holding public hearings in an open and transparent manner, creating urgency around the broadband issue, embarking on a practical national broadband plan, and winning the support of his fellow commissioners and industry leaders. The four key principles are:

1. Most importantly, he described the spectrum shortage as a looming crisis and said that additional spectrum capacity is needed to handle the demand for data traffic from data cards and smartphones (something we have illustrated in detail in the paper "Managing growth and profits in the Yottabyte era")
2. Removing red tape to allow wireless carriers to build their networks faster, for example, the work on cell towers
3. Codify and enforce net-neutrality policies
4. Open Internet

To some in the industry, the broadband capacity problem equates to a lack of spectrum. In fact, the Chairman has spoken about the "looming spectrum crisis" in great detail on several occasions. It is apparent that to achieve 50-100 Mbps, new contiguous spectrum is needed. However, it would be a mistake if the dominant solution for the broadband capacity crisis were more spectrum, for the following reasons:

1. There isn’t enough spectrum, especially the right spectrum
2. It takes 7-10 years to procure the spectrum for wireless use
3. By focusing on spectrum only, we will be just postponing the current crisis
4. By giving out spectrum too soon, the industry won’t have the opportunity to learn to thrive within its means and to let new technology and business innovation show the way to handling increased data consumption.

Like with all tough problems, to find an effective and a lasting solution, one has to break down the problem into smaller bits and find solutions that address not only those individual pieces but the problem as a whole. We know the following for a fact:

1. Broadband data cards (external or internal) account for over 73% of the data traffic (2009)
2. Smartphones, especially those with full browser and media capability, account for roughly 24% of the traffic (2009)
3. A small percentage (< 3%) of heavy users regularly account for very high data consumption
4. The majority of data usage takes place in an indoor environment (60-80%)
5. Video and browsing are the two biggest application categories for data consumption (accounting for over 70% share)
6. Consumers launch full applications (or browsers) to get minor updates because that’s the only way to access those updates on mobile devices. Alternate strategies like those implemented by the INQ Mobile series of devices and the Motorola Cliq are good examples of rethinking applications
7. There is no incentive for the user to change behavior on content consumption
8. To cope with the data congestion issue, all three major elements of the network need to be upgraded: RF, core network, and backhaul. Only the RF portion of the network is predominantly dependent on spectrum allocation (while some backhaul solutions require spectrum, the industry is moving toward laying fiber or adopting solutions that don’t require additional spectrum)
9. Competition breeds innovation, legacy spectrum allocation regimes might have an opposite impact
10. Doing broadcast video over cellular is not economically feasible
11. The number of devices per user is increasing; however, not all connections need high-speed, real-time availability
12. True 4G bandwidths (50-100 Mbps) are not possible without additional spectrum
13. Backhaul requirements for LTE will increase into the 200-500 Mbps range within the next 5 years
14. LTE is not going to have a major impact on the data consumption problem in the short-run (2010-2013)
15. LTE smartphones might not be in the market until 2012-13

To address the data consumption issue in light of the above facts, one has to figure out a set of solutions that work in concert with each other. Focusing on just one solution only gets you so far; a range of viable solutions that address each of the above problem elements will prepare the industry much better for the long haul. Some such solutions are discussed below:

1. Offloading traffic without impacting the user experience or requiring user intervention. Leverage existing WLAN footprint and invest in femtocells and WLAN expansion.
2. Congestion management through caching and intelligent buffering
3. Incentivizing users to shift consumption to fill the network troughs
4. Implementing network optimization across all media and application types, especially, video and browsing
5. Adopting a broadcast mobile video solution
6. Tightly integrating highly used applications like Facebook and Twitter into the handset
7. Introducing tiered pricing plans so that light users pay for broadband connectivity relative to their consumption. This will also bring in a new set of users into the broadband fold who have been sitting on the sidelines due to pricing
8. Upgrading backhaul capacity irrespective of LTE
9. Investing in analytics to better understand user consumption behavior at a micro level to plan appropriate strategies, solutions, and pricing plans
10. Creative bundling of data plans to bring more users into the data ecosystem.
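The tiered pricing idea in item 7 can be sketched as a simple cap-based lookup: light users pay a low rate tied to their consumption, heavy users pay for the capacity they actually use. The tier boundaries and prices below are invented for illustration, not an actual carrier plan.

```python
# Hypothetical monthly tiers: (usage cap in GB, price in USD).
# None as the cap means "everything above the previous tier".
TIERS = [
    (0.2, 10.0),   # light user: occasional email and browsing
    (2.0, 25.0),   # moderate user: regular browsing, some media
    (None, 60.0),  # heavy user: video and tethering
]

def monthly_price(usage_gb):
    """Return the price of the cheapest tier that covers the given usage."""
    for cap, price in TIERS:
        if cap is None or usage_gb <= cap:
            return price

for gb in (0.1, 1.5, 10):
    print(f"{gb:5.1f} GB/month -> ${monthly_price(gb):.2f}")
```

The point of the structure is the first tier: a sub-$10-class entry price is what brings the sideline users mentioned above into the broadband fold, while the top tier makes the < 3% of heavy users bear costs closer to the load they impose.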

By considering such solutions in parallel, the industry will be better off in the long run. It is also the only way to tackle the problem in the short term, since neither additional spectrum nor the announced LTE deployments are going to make a meaningful dent in data usage costs and margins. Wireless is one of the industries where policy can have a significant impact on its direction. By focusing too much on spectrum, we will miss the opportunity to cultivate a better network and business ecosystem and to invent new technologies and revenue models that will have a far stronger impact on the evolution of the mobile industry.

Links to the original articles

Defining Broadband

Solutions for the Broadband World