re: Dealing with 2G Capacity Issues (Network Optimization) Before 3G
>> Making The Best Of Things
James Tulloch, Mobile Communications International, Issue 93, 01 August 2002
Mobile networks are reaching the limits of their capacity, forcing operators to look much harder at network optimisation techniques.
Remember telephone booth cramming? Troops of grinning youths, bodies improbably contorted, wedged tightly into the local phone box. Laughable, fun, and slightly naughty, the fad peaked in the 1950s. Far less amusing is wireless network cramming, which took off in the late 1990s with the prepaid boom and is still going strong as more and more people, using ever greater amounts of airtime and bandwidth, try to squeeze onto bulging mobile networks. The strain is beginning to show.
"Travel around Europe for a bit and you can experience what the lack of sufficient quality means in terms of dropped calls and bad connections," explains Patrik Regardh, an executive at Ericsson. Network degradation has become a live issue. After more than 4,700 customer complaints, Californian authorities are investigating Cingular Wireless for failing to provide adequate coverage and QoS. Last year, calls lost along the 'last mile' link cost UK operators £100 million in traffic revenue, according to research from Actix, a developer of testing and optimisation tools (Orange UK customers recently suffered a network blackout for several hours).
Previously, the routine solution was to add more infrastructure but, according to Regardh, "last year we saw 23 per cent more users in Europe and very little investment in additional capacity." Why? Operators point to prevailing market conditions. "Against the slowdown in investment we have also seen a slowdown in market growth," reasons Mike Short, vice president, technology, at mmO2. Absolute growth in terms of subscriber additions may have flattened, but that doesn't mean that network traffic is falling. And what happens if and when bandwidth-hungry data services really take off, as operators so fervently hope?
There are scarce funds available for additional hardware, not least because so much of the operators' credibility with the financial markets rests on 3G, and consequently budget resources are heavily weighted towards 3G commitments. Operators were expecting UMTS services to relieve the capacity burden but, now that commercial roll-out schedules have been pushed back, they have to squeeze every ounce of value and performance from existing 2G and 2.5G infrastructure. The addition of GPRS to the mix places further bandwidth demands on those resources and now, to top it all, operators are pitching '3G-like services' over congested 2G networks to generate interest in wireless data.
But throwing more cells, base stations and switches at the problem may not be the best technical answer. Furthermore, getting planning permission for new sites has become more awkward everywhere. Yet, if operators want to deliver new services like MMS over existing infrastructure, they will have to find ways of coaxing more capacity and efficiency out of their networks. If only there was a way to expand the telephone booth so you could fit more people in. Well, there is a way. Many ways in fact. They come under the catch-all title of 'network optimisation techniques'.
Network optimisation parameters range from site management through to RF planning, coding upgrades and data compression techniques. The array of tweaks, nips and tucks that promise to deliver enhanced network performance at reduced costs is bewildering. Rarely, if ever, are they deployed in isolation, and there are as many combinations of techniques and optimisation strategies as there are operators. Priorities depend on specific circumstances.
"If you talk to someone like SFR, a large operator with a high market share, their concern is that they have a lot of legacy infrastructure and it may not be optimised," explains Andrew Rombach, product manager for software solutions provider Anite. On the other hand, "the smaller operators, who may be third or fourth in the market, are concerned about QoS in order to win new business." So one operator may be using spectral efficiency techniques like frequency hopping and automatic cell tiering to boost capacity, another to improve quality and reduce dropped calls.
A chain is only as strong as its weakest link and in the mobile telephony chain it's the radio network. Levels of radio interference and available channel capacity are critical to service quality. Last-mile performance is where operators and their vendor suppliers are concentrating their optimisation efforts. Of course, any operator will tell you that network optimisation is business as usual: "As a network operator we are always devising more ways of getting another quart out of the pint," says John Boggis, a technical executive in the core networks division of Vodafone UK. Traditional enhancement techniques like cell splitting, discontinuous transmission and frequency hopping have been used, to a greater or lesser extent, for some time.
And they remain valid solutions for many operators. According to Hayete Gallot, GSM/UMTS system advisor at Nortel, until recently not all European operators had implemented traditional spectral efficiency techniques. "They didn't need the capacity so they were not concerned about implementing techniques like frequency hopping. But now, a lot of them are looking at frequency hopping coupled with features like dynamic cell tiering. By doing that they will free spectrum for GPRS, for example."
For CDMA operators in the US, the old standbys like cell splitting or converting spectrum to digital are "running out of gas," according to Dr. Marty Feuerstein, general manager of product development at smart antenna developer Metawave. "So we see them going to non-traditional solutions."
Smart antennas are just one of these innovative solutions, albeit the one which has had the most exposure. Others include measures like sectorisation, microcells, fractional loading and dynamic channel allocation, higher order modulation, and Adaptive Multi-Rate (AMR) vocoding.
But, before embarking on any optimisation programme, operators have to get to know their networks better. This sounds elementary but, like the car owner who can no longer afford to get a new model every couple of years, operators have been forced to become more familiar with the inner workings and capabilities of their networks.
During the period of helter-skelter growth it was all that they could do to throw up infrastructure to try and meet demand. "We built a core network like a wall," explains Boggis, "and we were hanging pictures on it, hammering the nails in to hang another picture as fast as we could. The design of the wall was pitched at about 5-6 million subs and there we were going past the 10-11 million mark."
Understandably, network efficiency took second place and operators were often basing frequency allocation plans on predicted data that was going out of date almost as soon as it reached network engineers' hands. "They don't have a tuned network," says Anite's Rombach. "They are not sure where the traffic is originating...They are not sure that they have capacity in the right places. All the customers we deal with are very reactive, all looking to be a bit more proactive but feel that they are lacking the data that can really tell them what they should be doing."
In order to plug the information gap operators are now deploying sophisticated measurement and planning tools. This explains why many of the latest optimisation techniques are software rather than hardware based. There is a lot of mileage in automation software that can combine network data, drive test data and switch statistics to produce near real-time recommendations as to how to optimise. "Perhaps the most important thing we have used to improve the capacity and the quality of our network is an optimised planning tool," says Franco Pattini, a network engineer for TIM. Key to this tool is the dynamic monitoring of measurement data collected from subscribers' handsets, which travels over the Abis interface between the BSC and the base stations. "So we gain knowledge of the current radio quality and performance. And the results are provided as a feedback to the planning tool." Pattini claims a 10-20 per cent improvement in quality in terms of bit error rate and dropped calls.
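As an illustration of the kind of feedback loop Pattini describes, and only that (the field names and thresholds below are assumptions, not TIM's tool), a per-cell quality filter might look like this:

    # Hypothetical sketch: aggregate per-cell measurement reports collected
    # over Abis and flag cells whose radio quality suggests the frequency
    # plan needs revisiting. Thresholds are illustrative assumptions.

    from collections import defaultdict
    from statistics import mean

    def flag_cells(reports, max_ber=0.02, max_drop_rate=0.02):
        """reports: iterable of (cell_id, bit_error_rate, call_dropped)."""
        ber = defaultdict(list)
        drops = defaultdict(list)
        for cell_id, bit_error_rate, dropped in reports:
            ber[cell_id].append(bit_error_rate)
            drops[cell_id].append(1.0 if dropped else 0.0)
        return [cell for cell in ber
                if mean(ber[cell]) > max_ber or mean(drops[cell]) > max_drop_rate]

    reports = [("cell_17", 0.035, False), ("cell_17", 0.041, True),
               ("cell_9", 0.004, False)]
    print(flag_cells(reports))   # -> ['cell_17']

The output of such a filter feeds back into the planning tool, closing the loop between what the network predicts and what subscribers' handsets actually measure.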
Third party suppliers like Anite and Actix supply measurement and planning tools in abundance. The major infrastructure vendors also provide their own solutions and many operators are pressing their main suppliers to help them with the optimisation process. TIM, however, developed its software in-house and Boggis is keen to point out that Vodafone would expect to specify, build or modify optimisation tools to its own requirements. There is a latent conflict of interest here. "Vendors always have useful views," concedes Boggis, "but their financial model is different. They want to sell us kit and we want to optimally use it. That causes a bit of a tug of war." Pattini agrees: "For this reason we have chosen to implement and use our planning tool. Not a tool from our vendor. For the same reason we have complete control of the design of the network."
Assuming that they have up-to-date information about their network traffic load and performance, what can operators do next? According to George Tsoulos, a wireless technology consultant with PA Consulting, they are experimenting with a variety of techniques, although he "would be surprised if they did anything more than taking small steps to evolve current networks."
One of these is increased sectorisation, moving away from the current tri-sectoral approach towards six sector coverage where each BS antenna panel covers 60 degrees rather than 120 degrees. In theory that should double capacity and Tsoulos believes that Orange, in London, and other European operators have been playing with six sector techniques. It is not a minor change, however, requiring new antennas, cables and frequency plans. Operators with considerable legacy infrastructure already in place, such as Vodafone and TIM, are reluctant to add new hardware. "We think this kind of sectorisation can perhaps be used with UMTS; for GSM we normally use three sector antennas for each cell," says Pattini.
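The arithmetic behind that "in theory" is worth spelling out. A rough Erlang-B calculation (my illustration, with a 2 per cent blocking target and channel counts chosen for convenience) shows that six sectors double per-site capacity only if the reuse plan gives each new sector a full channel group; splitting the same spectrum six ways loses trunking efficiency:

    # Back-of-envelope illustration (not from the article) of sectorisation
    # gains under the classic Erlang-B blocking model.

    def erlang_b(traffic: float, channels: int) -> float:
        """Blocking probability for offered traffic (Erlangs) on n channels."""
        b = 1.0
        for m in range(1, channels + 1):
            b = traffic * b / (m + traffic * b)
        return b

    def capacity(channels: int, target_blocking: float = 0.02) -> float:
        """Max offered Erlangs at the target blocking, found by bisection."""
        lo, hi = 0.0, 2.0 * channels
        while hi - lo > 1e-3:
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if erlang_b(mid, channels) < target_blocking else (lo, mid)
        return lo

    three_sector = 3 * capacity(16)  # 3 sectors x 16 channels each: ~29.5 E
    six_sector   = 6 * capacity(16)  # 6 sectors x 16 channels each: ~2x
    six_split    = 6 * capacity(8)   # same channels split 6 ways: well under 2x
    print(round(three_sector, 1), round(six_sector, 1), round(six_split, 1))

Hence the new frequency plans: the doubling only materialises if the tighter sectors allow reuse aggressive enough to feed every panel a full channel group.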
Related to sectorisation, and also the standard practice of cell splitting, is the deployment of microcells, particularly suited to traffic hot spots.
Tsoulos explains that, "you could see it in terms of increased sectorisation or you could see it as taking a big cell and splitting it down into three or six smaller cells and effectively increasing your capacity by three times." Unlike macro cells, planning permission is not really an issue, although the lower transmission power means that a) a lot of cells are required and b) frequency planning becomes more complex as operators have to split frequency channels between macro and microcells. Again, the capex implications have seen operators fighting shy of widespread deployment.
"They don't want to do cell splitting and microcells because they don't have the funds," asserts Gallot. But, Regardh disagrees, arguing that microcells and picocells, arranged in a hierarchical cell structure, are attractive to operators. More attractive, he believes, than sophisticated smart antennas.
American players, however, do seem to have been receptive to the attractions of smart antennas, manufactured by the likes of Metawave and Arraycomm.
Metawave supplies a product called SpotLight, which is attached to the existing CDMA BS infrastructure and can measure shifts in the traffic load and adapt itself to them. "With a three-sector configuration we can give them up to a 50 per cent capacity improvement," claims Feuerstein. "Flexible sectorisation lets them go up to six sectors. And we see more and more interest in this option where they can get even more than 90 per cent capacity improvement. Service providers like that because in many cases it is more efficient than cell splitting." SpotLight costs around $100,000 per cell site, says Feuerstein, compared to the $400,000-600,000 it costs to add a new cell site.
The antennas work by steering radiation patterns towards the user and, most importantly, steering the signal away from interference from other sources. If that can be achieved, you improve your frequency reuse and so enhance your capacity. On the other hand, incorporating smart antennas into existing networks can be expensive, complex and involve a whole chain of practical alterations to mast sites. Again, the changing of legacy infrastructure is unattractive to some operators. "We have not used smart antennas in our network," notes Pattini, "mainly because such a technology became available when the deployment of our network was very large." Perhaps smart antennas will come into their own once they come embedded into network infrastructure. Leading vendors like Ericsson and Nokia are reportedly integrating smart antennas into their products and Metawave has inked a deal with Samsung to put its antennas into CDMA base stations.
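The steering idea itself is textbook array theory. A toy model (mine, not Metawave's algorithm) of a uniform linear array shows how pointing the main lobe at the user simultaneously suppresses gain in other directions, which is what improves frequency reuse:

    # Illustrative beam steering for a uniform linear array of antenna
    # elements at half-wavelength spacing; a sketch, not a vendor design.

    import cmath, math

    def array_gain(n_elements: int, steer_deg: float, look_deg: float,
                   spacing_wavelengths: float = 0.5) -> float:
        """Power gain of an array steered to steer_deg, seen from look_deg."""
        def phase(deg):
            return 2 * math.pi * spacing_wavelengths * math.sin(math.radians(deg))
        response = sum(cmath.exp(1j * k * (phase(look_deg) - phase(steer_deg)))
                       for k in range(n_elements))
        return abs(response) ** 2 / n_elements  # normalised so the peak = n

    print(array_gain(8, steer_deg=20, look_deg=20))   # ~8x towards the user
    print(array_gain(8, steer_deg=20, look_deg=-40))  # tiny towards an interferer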
Another technique implemented in America, this time by GSM operators, is AMR, the new vocoding system designed to transition dynamically between different coding schemes to favour either error protection or voice transmission, depending on the conditions. TDMA operators going to GSM have to work within narrow bandwidth constraints, about 12.5MHz according to Gallot.
"The key objective for operators in the Americas is spectral efficiency," and she sees a combination of AMR and frequency hopping as an optimal solution to these capacity constraints (Nortel has supplied AMR to Cingular and Voicestream). "Another key thing is that AMR can operate in two modes, full rate and half rate. With half rate you can put two communications in one time slot (providing the interference conditions are good)...So on existing equipment you can double capacity." The downside is that for AMR to be efficient there needs to be a high penetration of AMR-compatible handsets. In the US, where the GSM networks are all new and all new handsets will be AMR-enabled, that is the plan, but in Europe, where operators have more spectrum and 3G to think about, AMR is highly unlikely to be a factor.
Well aware of their spectrum limitations, US GSM operators are also talking about exploring dynamic channel allocation, which would assign frequency channels to cells based on actual traffic and interference conditions for that cell rather than a predetermined model. "The problem with the planning approach that all 2G operators have used," argues Tsoulos, "is that they deploy and plan the network based on the worst case scenario and that includes quite a considerable safety margin." That translates into inefficient use of spectrum, whereas dynamic channel allocation, and its sister, fractional loading, should improve efficiency by using fewer channels.
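In its simplest form, which is a toy illustration rather than any vendor's product, dynamic channel allocation amounts to picking the cleanest measured channel that no neighbour is already using:

    # Toy dynamic channel allocation: choose the channel with the lowest
    # currently measured interference, avoiding the neighbours' channels.
    # Channel numbers and levels are illustrative.

    def allocate_channel(interference_dbm: dict, neighbour_channels: set):
        """interference_dbm maps channel -> measured level; lower is better."""
        candidates = {ch: level for ch, level in interference_dbm.items()
                      if ch not in neighbour_channels}
        if not candidates:
            return None  # block or borrow; the policy is the operator's call
        return min(candidates, key=candidates.get)

    measured = {1: -95.0, 2: -101.5, 3: -88.0, 4: -99.0}
    print(allocate_channel(measured, neighbour_channels={2}))  # -> 4

The safety margin that a worst-case plan bakes in permanently is instead spent only where, and when, the measurements say it is needed.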
Channel allocation is particularly pertinent to GPRS, where the trade-off between voice and data will become a real issue for operators. Co-channel interference levels are the key metric here, according to Actix CTO Jeff Atkins, and many operators are scrutinising, right down to the timeslot level, how voice and data will interact. "The irony is that as soon as they become successful they [data services] are immediately going to create a huge problem for voice services." But, beyond collecting more detailed information, GPRS operators have yet to get past the troubleshooting stage. New channel coding schemes such as CS-3 and CS-4 provide higher data rates but lower error protection, and data compression techniques will also be deployed. Indeed, the applications side is a whole different sphere of optimisation, including measures like TCP optimisation and IP header compression.
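For reference, the standard GPRS coding schemes trade per-timeslot rate against error protection roughly as follows; the throughput helper is my own illustration, and CS-3 and CS-4 need a much better carrier-to-interference ratio, so their headline rates only hold near the cell centre:

    # Commonly quoted per-timeslot data rates for the four GPRS coding
    # schemes (kbit/s); multiplying by allocated timeslots is illustrative.

    CODING_SCHEME_KBPS = {"CS-1": 9.05, "CS-2": 13.4, "CS-3": 15.6, "CS-4": 21.4}

    def gprs_throughput(scheme: str, timeslots: int) -> float:
        return CODING_SCHEME_KBPS[scheme] * timeslots

    print(gprs_throughput("CS-2", 4))  # 53.6 kbit/s on a typical 4-slot handset
    print(gprs_throughput("CS-4", 4))  # 85.6 kbit/s, but only in clean radio conditions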
For now, however, operators face a serious enough dilemma as to what extent to pursue optimisation in the radio network. They don't really know when 3G will roll out in earnest and how it will perform. The operators that can successfully guide their straining networks through the capacity chasm until 3G arrives will be at a distinct advantage when it comes to maintaining customer loyalty. In a sense they have become victims of their own success.
But unless they maintain QoS on their 2G networks customers could be put off wireless data, while nothing is more guaranteed to provoke churn than an unreliable network. As the 78-year-old Romanian woman who has to climb a tree to get a mobile signal would no doubt agree, there is always room for improvement. <<
- Eric -