Monica Paolini is the founder and president of Senza Fili Consulting, a company that has provided advisory support on wireless data technologies and services since 2003. At Senza Fili Consulting, she helps vendors gain a better understanding of the service provider and end-user markets, and works alongside service providers in developing wireless data strategies and assessing the demand for wireless services. Independent advice, a strong quantitative approach, and an international perspective are the hallmarks of her work.
She has a PhD in Cognitive Science from the University of California, San Diego, an MBA from the University of Oxford, and a BA/MA in Philosophy from the University of Bologna (Italy). She can be contacted at email@example.com
Wireless data technologies
In two recent presentations, I have looked at the role of C-RAN in facilitating both small-cell deployments and the adoption of RAN virtualization. These are the preliminary results of my work on C-RAN, and I would love to hear your views on this. With C-RAN architectures -- ranging from DAS to V-RAN -- operators can manage interference and traffic more effectively in HetNets with small cells. Operators who have to use co-channel deployments, in which the macro and small-cell layers share the spectrum, are wary of deploying small cells unless they can keep interference under control. As the assumption that fiber fronthaul is necessary to support a C-RAN architecture is being challenged, the fortunes of C-RAN and small cells become increasingly intertwined. At the same time, C-RAN architectures -- and there is an interesting variety of them -- gradually enable operators to move towards RAN virtualization. In greenfield deployments, there is already a trend towards RAN virtualization. For existing networks, virtualization of the RAN is challenging and operators will proceed gradually.

Download the presentations:
The evolution to C-RAN and V-RAN to support indoor and outdoor HetNets (this presentation includes the results of a survey on C-RAN conducted in preparation for the RAN World conference in Dusseldorf on 20-21 January, where I will chair the C-RAN track)
Backhaul or fronthaul for small cells?
The Small Cell Summit last week in Dallas was a great occasion to catch up on the latest developments in small cells -- and small-cell backhaul. My first panel was on small cells and Wi-Fi coexistence -- luckily not about offload. I try to avoid the term "offloading" because it suggests that Wi-Fi is a second-class access technology. Some operators may still see it that way, but it is not the way users see Wi-Fi. Wi-Fi carries more traffic than cellular networks on mobile devices, and this indicates that it is more than an offload technology.

Instead of focusing on blindly diverting traffic to Wi-Fi, in the panel we talked about how the traffic load can be distributed among wireless interfaces, how deep the integration between mobile and Wi-Fi networks should be, and the business models that will make all this possible. It was great to have on the panel representatives of the different parts of the value chain -- vendors, operators, and those who actually go and install all the equipment for the operators: Mark Grayson, Cisco; Saptarshi Chaudhuri, Wipro; Kelley Carr, Goodman Networks; and Vijai Venkateswaran, Time Warner Cable. While Wi-Fi is seen as a necessary complement to cellular networks, the relationship between the two technologies is still evolving, and multiple business models may have to be tried out before we find out which ones work best.

Both in the panel and throughout the conference, the role of the enterprise became more prominent -- both as a partner and as a customer for mobile operators. Small cells and Wi-Fi bring corporate networks and mobile networks closer to each other, opening new possibilities for collaboration, but also potential areas of divergence over who controls which part of the network. On the second day, I gave an update on small-cell backhaul and moderated two panels, with Roger Kim, Time Warner Cable; Phil Bull, Amdocs; Eric Bozich, CenturyLink; Ed Gubbins, Current Analysis; and Anthony Tramontana, Deutsche Telekom.
Again, the main take-home message is that there is still a lot to learn -- and vendors, operators, and system integrators are all working on this to ensure smooth and scalable deployments. While Wi-Fi is an essential access technology, its role in providing backhaul is marginal. But the debate on which technologies will be more widely used, and whether there will be scope for some level of infrastructure or site sharing in small-cell and Wi-Fi deployments, is still open.

Downloads:
The evolving competitive landscape in small-cell backhaul. Small Cell Summit, Dallas, December 2013
Latest developments in HetNets and SON: operators' requirements and vendors update. Small Cell Congress, Berlin, November 2013
Making HetNets a reality: challenges and solutions
Small-cell backhaul: industry trends and market overview
Understandably, data traffic gets a lot of press these days – but its relative, data demand, gets much less attention, even though it is an equally interesting metric, if inherently more difficult to measure. The limited interest in the demand dimension may be due to the fact that we implicitly assume that usage reflects demand – at least for those who have the devices and can afford to pay their mobile bill. (In fact, the device is the only truly necessary requirement – with Wi-Fi there is a lot you can do with a smartphone without a cellular connection; most of us are sufficiently satisfied with our Wi-Fi-only tablets.) So in many cases the issue of demand is linked to underserved market segments. This is clearly a big issue, but not the one I am thinking about: I am interested in the demand for data from subscribers who have one or more connected devices, as opposed to their data usage. If we lived in an environment where free connections were always available with infinite bandwidth, and the battery in the handset never ran out, we could assume that usage and demand are the same for all practical purposes. But this is not the environment in which we live.

First, there is cost (see graph below). It is true that most of us use only a small part of our data allowance, but we could reasonably expect that if prices were lower (or plans more generous), we would use more data. This is one reason why traffic from many mobile users increases as they step into a free Wi-Fi hotspot, or connect from the home or office Wi-Fi network. Second, capacity and coverage are limited, and this constrains our ability to use data. We have probably all given up searching for some information because the network was slow, or arrived at a destination before the map application was able to give directions, or know of a few downtown places where trying to watch a video can only lead to frustration. Like cost, limited network availability creates a gap between our data usage and our desired data usage.
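One way to make this usage-versus-demand gap concrete is a simple price-elasticity thought experiment. The sketch below is purely illustrative: the elasticity value and the usage and price figures are hypothetical assumptions I have invented for the example, not measured market data.

```python
# Illustrative estimate of the gap between data usage and data demand,
# using a constant-elasticity demand model: usage = k * price^elasticity.
# All numbers here are hypothetical placeholders, not market data.

def implied_demand(current_usage_gb, current_price_per_gb,
                   new_price_per_gb, elasticity=-0.7):
    """Project monthly usage at a new effective per-GB price,
    assuming constant price elasticity of demand (assumed value)."""
    ratio = new_price_per_gb / current_price_per_gb
    return current_usage_gb * ratio ** elasticity

# A subscriber using 2 GB/month at an effective $10/GB...
usage_now = 2.0
# ...if the effective price fell to $2/GB (cheaper plans, free Wi-Fi):
projected = implied_demand(usage_now, 10.0, 2.0)
gap = projected - usage_now
print(f"Projected usage: {projected:.1f} GB/month; unmet demand: {gap:.1f} GB")
```

Under these made-up assumptions, desired usage comes out at roughly three times actual usage. The point is not the specific number, but that the gap can easily be large enough to matter when forecasting future traffic.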
I have written an article for FierceWireless on how to correlate data demand and data usage, but, as you may expect, there is no simple way to estimate data demand. So let me ask for your feedback (you can comment here or email me): how big a gap do you think there is between usage and demand? Is it sufficiently big to be relevant when, for instance, we want to forecast future usage (and expect it to be driven by current usage, but also by unmet demand)? If we were to look at our own experience and that of those living around us, how much disparity is there between your current usage and the traffic you'd generate if cost, network availability, and capacity (within the limits of what today's technology and spectrum can deliver) did not limit your activities?

Follow Monica Paolini on Twitter @monicapaolini. Read other blog posts by Monica by checking out her community profile.
I am not at all suggesting that ARPUs are not useful - it is data ARPUs that I find problematic. In fact, it is the combination of ARPU and per-bit metrics that together can give a more thorough assessment of performance. For instance, in many developing countries ARPUs are very low, but so is data traffic. The per-bit metrics help to reconcile the difference (in many emerging countries per-bit revenues are still lower, but the divergence is less sharp) - or to understand where the difference comes from. From the service perspective, I think that comparing per-bit revenues can be extremely useful (although it should not be the main factor) for pricing services, or even creating them. Operators use this information all the time. For services like SMS, where the amount of bandwidth is very limited, per-bit costs are not too revealing. For video or audio streaming, or large downloads, they are quite important. For example, before you offer an unlimited Pandora service, you may want to find out what the expected costs and revenues are compared to an unlimited Facebook service. To do so, you need to look at per-bit metrics that tell you the profitability of the services at the prices that the market will bear.
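To illustrate the kind of per-bit comparison described above, here is a minimal sketch. The two bundles, their prices, traffic volumes, and the per-GB cost are all invented for illustration; real figures would come from an operator's own traffic and billing data.

```python
# Compare the per-GB profitability of two hypothetical unlimited bundles.
# All figures are invented for illustration only.

def per_gb_margin(monthly_revenue, monthly_traffic_gb, cost_per_gb):
    """Per-GB revenue minus per-GB delivery cost for a service bundle."""
    return monthly_revenue / monthly_traffic_gb - cost_per_gb

# A streaming-heavy bundle: $5/month generating 6 GB/month of traffic
streaming_margin = per_gb_margin(5.0, 6.0, cost_per_gb=0.50)
# A social-networking bundle: $5/month generating 0.5 GB/month
social_margin = per_gb_margin(5.0, 0.5, cost_per_gb=0.50)

print(f"Streaming bundle: ${streaming_margin:.2f}/GB")
print(f"Social bundle:    ${social_margin:.2f}/GB")
```

The same monthly price yields very different per-GB economics, which is exactly the difference that an ARPU figure alone cannot reveal.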
In my Excel-filled days, data ARPUs are my daily companions. On one hand you have costs, on the other revenues – that is, ARPUs. And since the area of both major growth and major concern is data, data ARPU is a core metric. Until you stop and ask yourself: what do data ARPUs mean? How do they help us understand and predict how data services generate revenues and can be profitable? I have had some doubts about the usefulness of data ARPUs for a while, and prefer to use total ARPUs. But I always thought that at the very least data ARPUs give us a sense of the relevance of data for subscribers. It is big news when data ARPUs go over the 50% line – as they started to do a few years ago in Japan and, more recently, in other countries.

When you look a bit more closely at how data ARPUs are calculated – i.e., as the marginal revenues from adding data services to an existing plan – you realize that data ARPUs are not only scarcely useful, they are actually misleading, as they lead to an underestimation of data revenues. If you relate data ARPUs to the cost of providing mobile data services, you come to the conclusion that operators are squandering their voice revenues to support loss-generating data services. (I have to admit that I used to think that was indeed the case; but that was many years ago, when laptop plans could not be even close to profitable.) This is not what is happening – on the contrary, this way of allocating revenues does not assign data the relevance it has for subscribers. As an unintended consequence, it may even make operators less aggressive in pursuing mobile data services. I have presented my argument in a piece in FierceWireless, so let me move a step beyond it. If you buy my argument that data ARPUs are no longer relevant in a world where all traffic in mobile networks is becoming data, then how do we allocate revenues? Total ARPU is still a useful tool for understanding differences among markets, operators, or subscriber segments.
But another direction that I have found more and more useful in understanding the intricacies of profitability and monetization is to cast revenues – and revenues for specific services or technologies – in per-bit values, such as per-GB revenues. Wouldn't you find it useful if operators started to publish such metrics? And wouldn't that fit nicely with the VNI, as you could then relate traffic to revenues?
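If operators did publish such metrics, relating traffic to revenue would be a one-line calculation. A minimal sketch, assuming an operator reported just two figures (both numbers below are hypothetical):

```python
# Blended per-GB revenue from two figures an operator could publish:
# total service revenue and total data traffic carried.
# The figures below are hypothetical, for illustration only.

def revenue_per_gb(total_revenue_usd, total_traffic_gb):
    """Blended revenue per GB across all services and subscribers."""
    return total_revenue_usd / total_traffic_gb

# Hypothetical quarter: $1.2B in service revenue over 150M GB of traffic
blended = revenue_per_gb(1.2e9, 150e6)
print(f"Blended revenue: ${blended:.2f}/GB")  # prints "Blended revenue: $8.00/GB"
```

Tracked quarter over quarter, such a metric would show whether revenue is keeping pace with traffic growth – something a per-subscriber ARPU figure cannot show on its own.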
Above: Monica Paolini, President, Senza Fili Consulting, interviews Craig Conaway, Global Mobile Backhaul Sales Lead, at Small Cell World Summit in London

In the last few months, I have spent a lot of time researching and pondering what has changed in the small-cell backhaul market over the last year. In my first report on the topic a year ago, I mostly looked at the business case challenges and technology options for backhaul in small-cell deployments. It turns out that backhaul is a crucial element in the small-cell business model – substantially more important than it is in macro cell deployments – and that multiple technologies have to coexist side by side within the same deployment to meet mobile operator requirements. While the importance of backhaul in the business case and the need for multiple solutions have stayed the same, the vendor landscape has evolved significantly over the last year. Initially, some vendors tried to reposition their existing macro cell backhaul products for the small-cell market, but this approach delivered solutions that did not meet operators' requirements. Other vendors had solutions that were not ready for large deployments, did not scale, or were too expensive and too difficult to deploy and maintain. In the last year, there has been a significant shift in the vendor community toward addressing the specific requirements of mobile operators. Small-cell backhaul is much more challenging than macro cell backhaul. Equipment has to be smaller, cheaper, and easier to install and manage, but at the same time it has to provide high capacity and high reliability. Most vendors – tier-1s, niche vendors, startups – have tried to address these challenges, and have come up with an interesting array of solutions and novel approaches. As a result, the new report focuses on the evolution of the vendor market in small-cell backhaul.
The list of confirmed and emerging trends that we have identified is shown in the table below and discussed in the report. Here, I wanted to add some comments and get your feedback on what these trends mean for the vendor community and for mobile operators' vendor selection process. With the introduction of LTE and IP-based mobile networks, mobile operators have shown a new eagerness to have multiple vendors present side by side in their networks. Not only do mobile operators want different vendors in different markets, they also want different vendors within the same market, for the same type of equipment. This has long been the case in markets like Japan, where multiple base station vendors operate in the same footprint. Increasingly, mobile operators want to adopt this model, and small-cell backhaul is an area where this approach is needed, because no single vendor is likely to have a wide enough portfolio of solutions to meet the requirements of a large small-cell deployment. From this perspective, small-cell backhaul – alongside small-cell equipment, for similar reasons – may provide a testing ground for a new approach to vendor selection.

The coexistence of multiple vendors for small-cell backhaul poses interesting challenges. Different solutions come with different requirements – line-of-sight versus non-line-of-sight, for instance – and different performance characteristics – e.g., capacity, latency, and reliability. Not only do operators have to decide which solution to install at a given small-cell site, they also have to manage a complex backhaul system within a HetNet that requires very tight latency to coordinate transmission and manage interference. How will operators decide to move forward? Will they opt to keep complexity at a minimum, selecting only one or two vendors for backhaul even though this will somewhat limit the performance capabilities of the network?
Or will they be ready to face the additional complexity that comes with a performance-optimized backhaul system with multiple vendors? As is often the case, the answer is most likely to fall in the middle, with some operators more willing to accept the risks of complexity and others more protective of the stability of their networks. But, in addition to this, what may change operators' attitudes is the confidence that different solutions and different vendors can be managed within a single platform. This is an area in which we have seen interesting activity among vendors. Cisco's announcement of a partnership with multiple wireless small-cell backhaul vendors to strengthen the emerging small-cell ecosystem is a move in this direction. We have also seen small vendors working more closely with each other to offer mobile operators combined solutions for their backhaul needs. As expected, most of this activity comes from specialized vendors and startups, which face a tougher challenge than tier-1 vendors in getting selected by mobile operators. But tier-1 vendors have also shown a deeper interest in working with startups to widen their solution portfolios. At the same time, the number of small-cell backhaul vendors – both established ones and startups – has grown quickly over the last year, increasing the level of competition in a market that is still limited in size and will remain so for the next couple of years. Will the combination of these two factors – the trend towards multivendor networks, and too many vendors – result in more aggressive consolidation among vendors, or lead to a new approach to vendor selection, with mobile operators more willing to buy equipment from smaller and more nimble vendors?
More on small-cell backhaul:
Download the full report by Senza Fili Consulting
Conversation with Ed Chang, VP Product Management, and Eric Vallone, Senior Manager, Product Management, Cisco
Video of conversation with Craig Conaway, Global Mobile Backhaul Sales Lead, Cisco
Small-cell backhaul vendor solutions: the evolving competitive landscape. Presentation (slides and audio). Singapore, May 29, 2013.
Small-cell backhaul market update - what has changed and what has stayed the same, FierceBroadband Wireless
A white paper I just wrote on the transition to the 4G IP core for mobile operators, sponsored by Cisco, is now available for download here (and attached to this post). The paper is about how mobile broadband requires operators to change the way they manage data services. The transition to a 4G IP network will give them the tools to manage traffic actively and achieve three goals:
Increase network efficiency and capacity, lowering transport costs
Offer service plans that are more flexible, fair, and personalized
Maximize revenues from subscribers, applications and content providers, and vertical applications

Key topics covered in the paper:
More efficient use of network resources. This is required to expand network capacity to meet subscribers' demand in a cost-effective way, which will enable operators to operate profitably.
More flexible, fair, and personalized service plans. Mobile operators realize they have to move beyond flat-fee unlimited plans. To improve their subscribers' experience and differentiate their services from the competition, they can add features that allow them to move beyond capped plans.
Maximize revenues. Flat-fee unlimited plans are not effective at segmenting the market and gaining revenues from added-value services, because for many potential subscribers the available plans are too expensive, or do not offer the features they want. A wider choice of service plans can address the demand from these subscribers and raise data revenues. Furthermore, mobile broadband can create new revenue streams from advertisers and content and applications providers, and facilitate the development of new business models that make mobile data services more attractive, easier to use, and more effective. Mobile operators can also gain additional revenues from vertical applications, through partnerships with MVNOs, enterprises, and public agencies.
I hope you will find it interesting, and I look forward to your comments and feedback.
Monica
Senza Fili Consulting
I attended the mHealth Summit organized by the Foundation for NIH in DC on October 29-30 to get a sense of the progress in the use of mobile technologies to improve health care. It is a fascinating area where mobile technology can make a real difference in the lives of many people, especially in developing countries (see a paper I wrote recently on behalf of Cisco and Intel here). It was an interesting meeting for someone like me working on the m side of mHealth, as most of the attendees were from the health care community. This allowed me to get a better understanding of the perspective and requirements of health care providers. It turns out that for most of the applications presented, the requirements are very easy to meet. Most of the applications are still based on SMS, the only mobile data service that is truly ubiquitous. For SMS-based applications, the requirements in terms of network infrastructure, devices, and user training are very limited, but the functionality that these applications offer is also limited. Increasingly, rural areas in developing countries have access to wireline and wireless broadband. This will open more exciting opportunities to deliver improved health care through richer applications built on more intensive data exchanges and real-time video. An increasing number of trials are focusing on such technologies, but few were represented at the mHealth Summit. Despite the promise, mobile health care applications appear to be moving slowly from the trial stage to full deployments. The Summit showcased a large number of promising applications, but in all cases they were still at an early trial stage: available to few users, or still in development, and with very limited information on how effectively they improve care (compared to alternative, non-mobile solutions) and on what their ROI proposition is to governments, health care providers, and NGOs.
Given that SMS has been around for decades now and that cellular coverage is good in most developing countries, we should be ready to move beyond trials to wide programs that use mobile technologies, available across wide areas and covering more than just a single condition or patient group. Applications alone are not enough; they need to rely on an ecosystem that includes patients, health care workers, health care agencies, governments, NGOs, and cellular and other wireless providers. Patricia Mechael, from the Earth Institute, insisted that we need to work in concert towards a common target, just like musicians playing in a symphony. This is a great challenge. As Vodafone's Paul Davey remarked, "We need to find a common language". Mobile and broadband operators, governments, and health care agencies have only started to work together, and they still have very different views of the value chain, said Davey. The development of an ecosystem with a common language is a prerequisite for moving beyond trials to full, nationwide deployments. Within such an ecosystem, it is crucial to ensure that applications work across networks, devices, and languages. While solutions to ensure wide scalability and interoperability are available, proprietary approaches still seem to dominate, although the trend is clearly toward open-source, standards-based solutions. Joel Selanikio, from DataDyne and developer of EpiSurveyor, gave the most compelling presentation, arguing that most applications for emerging countries fail to take advantage of software approaches that can leverage existing tools and scale to large deployments. Attempts at quantifying the economic and health benefits of mobile health care applications seem to be very sparse and uneven. Suzanne Clough from WellDoc presented an application for diabetes treatment with a well-thought-out evaluation process, but most presenters did not seem to have given much thought to the topic.
The prevailing approach seemed to be to get a grant from an NGO, develop an application, showcase it in a trial, and, when done, move on to the next trial. Perhaps not surprisingly, it was a representative of the pharma industry, Scott C. Ratzan of Johnson & Johnson, who reminded the audience that we should not even get involved in a trial without developing a solid business model. NGOs and their research partners often make the case that a competitive market approach does not work in health care, and especially not in emerging markets. But why should that be the case? Health care systems in emerging markets are subject to even tighter budget constraints than those in developed markets, and have an even greater need to deploy effective solutions. How could we get this approach to change, to speed up effective, scalable deployments?

Monica Paolini
firstname.lastname@example.org
A lot of progress has been made on facilitating Wi-Fi access across multiple networks on laptops, so while it is not as seamless as on cellular networks, it is not bad. When you get to mobile devices, the quality of the experience changes a lot, I suspect because much more of the user interface is dependent on the device (and its manufacturer). As a result, much more customization is required and the results tend to be uneven. But progress has definitely been made--consider that my first UMA phone obviously had Wi-Fi, but no data! And as a result, it would not connect to any network that required login from a splash page (no browser).
Nagesh, I have been a UMA user since the day the service was launched (I used to have no cellular coverage at home--and not through any fault of the operator; I live in a challenging RF and regulatory location) and since then the service and the devices have improved dramatically. I wrote a report a long time ago on UMA, before it got deployed (it is one of the reports I am very proud of, because many of the predictions were actually realized), and initially I thought it would mostly be for voice (this is an area where I was NOT right!). With growing traffic levels, data becomes important as well, as an offload strategy on the SP side. In a way, this is a result of the improved quality of the service and of the devices. Five years ago, if you had a data plan, you would never think of using it at home as a substitute for your broadband connection--the connection was too slow and the devices had an intimidating interface. Now, many people go home and keep using their cellular connection on their mobile devices, even if they have Wi-Fi on their phones. They just forget to switch--which means that the connection is sufficiently good for the applications they are running. This is good news for the SP (i.e., a good customer experience) but also bad news for network traffic levels. I believe that's the sweet spot for both UMA and femtocells--the operators can actively manage how to get the traffic across while preserving or improving the user experience. I think there is a role for both UMA (lower cost, as long as the device supports Wi-Fi) and femtocells (in areas with a high density of Wi-Fi usage, or for subscribers without a Wi-Fi phone), and this role is going to grow as traffic levels, wireless data penetration, and congestion grow. You raise an interesting issue though--if subscribers use UMA or femtocells, they typically use their broadband connection for the additional traffic.
Granted that they pay for the connection, the broadband operator (cable or DSL operator) may not be too keen to provide what it may perceive as free backhaul for the cellular provider. I think that as soon as (or if) UMA/femtocell traffic grows, this is an issue that will have to be dealt with (possibly through usage caps or agreements with the cellular operators). What tools to manage the network do you think will be crucial to ensure a fair allocation of resources, both among SPs and among subscribers? Monica
Fights, competition, and impending defeats all make good stories. That is probably why we used to see plenty of press first on the "Wi-Fi will kill cellular" story, then on 3G-better-than-Wi-Fi, then on WiMAX killing both Wi-Fi and 3G at the same time, and now, finally, on LTE destroying all of the above (possibly with the exception of Wi-Fi--that's a variation on the theme I have not yet seen). I find all this very boring. Every time a new technology appears, the dueling match starts, until the next technology appears. And then again. This all misses the point. We do need more than one technology, and different technologies often compete with each other--which is healthy--but more often they meet different needs. Would you give up your mobile phone to have Wi-Fi? Or Wi-Fi to have a mobile phone? These are not questions that you hear people debating. Their questions are more likely: Why didn't my phone switch to Wi-Fi? How do I tell if my phone is using cellular or Wi-Fi? The key issue is not which technology is best (it usually depends on what you are trying to do, where, and on which device), but how to get them to work together. For an operator, the issue is how to integrate them within the network, which business model works best, and how to allocate traffic to different networks. For subscribers, it means managing devices with multiple interfaces. This is where things get interesting and where there is room for innovation. This is where we are all learning, because we are used to dealing with single-technology networks. Broadband operators used to deal only with DSL or cable; cellular operators only with GSM or CDMA and their upgrades. When we were at home we used the desktop; when we were out, we used our BlackBerry. Now this is all changing. Comcast is selling WiMAX. Cablevision supports Wi-Fi in public areas. Sprint has a combined WiMAX and EV-DO service plan. And AT&T is relying more and more on Wi-Fi.
The company has seen impressive growth in Wi-Fi access at its hotspots after the Wayport acquisition. Initially, AT&T Wireless tried to get into the hotspot business, but never embraced it fully. It was a somewhat independent service that was expected to generate additional revenues. It did not work out that way. It was only with the iPhone that AT&T realized that it needed Wi-Fi not to generate direct revenues, but to offload traffic and improve the subscriber experience. Wi-Fi is no longer an embellishment; it meets a need. This is a much more solid approach, one that shows a deeper understanding of the technology and related usage models, and that moves the debate squarely beyond the "Is Wi-Fi better than cellular?" level. It is great to see that this more realistic, demand-driven approach has paid off at AT&T and other carriers. It is also a positive development for subscribers, who are more likely to have Wi-Fi on their phones now that mobile operators are no longer trying to stop that--and instead encourage it. It is only the first step, though. Network congestion is not going to disappear with Wi-Fi--or with WiMAX or LTE, for that matter. The iPhone, and increasingly competing smartphones, are showing subscribers that there is a lot they can do on a mobile device--and they are learning very quickly, as usage statistics show. The brute-force approach of adding base stations or creating new networks certainly helps, but eventually it is not going to be sufficient, because there is a finite amount of spectrum available and the laws of physics limit the amount of information that can be sent over it. We are not doomed, but there is more work ahead. As mentioned above, there is a need to optimize coexistence and the sharing of available resources--multiple networks and multiple wireless interfaces. New network architectures that use picocells and femtocells will bring a more efficient use of spectrum and increased capacity.
And operators will have to find ways to manage traffic more aggressively. With subscribers demanding unlimited access to all applications at all times and worrying about privacy, managing traffic has become a very sensitive topic. Increasingly, however, this is a topic that operators have to face to improve the experience of all their subscribers--instead of protecting a few extra-heavy users. Now, this is much more interesting than talking about the latest WiMAX versus LTE battle.

Contributed by Monica Paolini, Senza Fili Consulting, email@example.com