Technology Stocks : Nokia Corp. (NOK)


To: Eric L who wrote (2173) | 4/3/2002 1:01:09 PM
From: elmatador
 
German UMTS license terms to remain unchanged - regulator
By Total Telecom staff

03 April 2002

The German regulator stays firm on UMTS license conditions as others call for changes to smooth market consolidation.

The head of Germany's regulatory body has stressed that there are no plans to change the terms of the country's UMTS licenses, despite calls to amend the conditions to allow smaller players to keep their licenses if they merge.

In an interview with German financial daily Handelsblatt, regulator Matthias Kurth also rejected suggestions that the regulations prevent market consolidation, saying that the regulatory authority would not allow the mobile operators to "pass the buck."

"We cannot alter the conditions of a closed awards process after legally binding license awards have been made," Kurth is quoted as saying. Kurth argued that if companies bid more for a license than it is currently worth, the goodwill must be written down - as is already happening with over-priced subsidiaries bought during the boom.

The head of Vodafone's German operations, Juergen von Kuczkowski, is said to have already threatened legal action should such a step be taken, Kurth told Handelsblatt.

This latest scuffle with the regulator was instigated by the CEO of France Telecom, Michel Bon, who argued that merged holders of German UMTS licenses should be able to keep both licenses, rather than having to return one as is currently the case.

Bon is said to have told Handelsblatt that the aim of France Telecom, which has a 28.5% stake in German mobile reseller Mobilcom, would be to merge with one of the smaller UMTS license holders, such as Quam, E-Plus or Viag Interkom. The new entity would then hold two licenses and sell some of the spectrum to a larger player - in exchange for much-needed cash.

That Kurth is not willing to smooth this path need not surprise the market. He has previously indicated that the office of the regulator does not wish to see spectrum become the object of market speculation.

Kurth indicated to Handelsblatt that if two companies were to merge the returned license would not automatically be awarded to another company. The first step would be to discuss the matter with all those in the market.

It is possible that a "spare" license could be auctioned among the other five license holders, thus ensuring that no player could enter the UMTS market at costs far lower than those paid by the other UMTS license holders. These all paid around €8.4 billion for their UMTS spectrum allocation.



To: Eric L who wrote (2173) | 4/3/2002 5:05:16 PM
From: Eric L
 
re: Andy Seybold on Performance of Wireless Networks

In reality, ALL of the data offerings from the voice carriers should be considered to be in beta mode at present.

Testing network speeds to notebook computers when the vast majority of customers will be using other types of wireless devices with totally different capabilities means that these tests may not be particularly useful.

>> Tuning the Wireless Networks

Andrew M. Seybold
Wireless Outlook
1 April 2002

As wireless voice networks become data capable in the U.S., early adopters are finding out that data speeds are less than what has been promised.

It doesn't matter which technology is used: early results show slower speeds, longer times to connect and not-so-robust coverage in many cases.

As we continue to get reports from those who are trying out these services, it is obvious that there is still a lot of work to be done to optimize the networks.

Rolling out data services is a tough and tall order that requires balancing over-the-air data speeds with back-end capacity, balancing data traffic against voice access (which pays the bills), making sure that data coverage is sufficient for customers, figuring out how to get data-capable devices into the field and providing different types of information access while trying to determine the right prices.

In reality, ALL of the data offerings from the voice carriers should be considered to be in beta mode at present. I base this statement on the fact that the data rates and performance of all of the available networks vary considerably from city to city, as well as from morning to night. I'm not saying that there are major problems with the delivery of data over these networks, just that there is a lot of work still to be done. We don't have a lot of experience mixing voice and data services. No matter how well a network has been tested, it can't be properly tweaked until real customers are using it. Capacity will be added where it is most needed, and data footprints will eventually match voice footprints.

Wireless network deployment is part science, part art and part magic. I know from my years of working with two-way radio and cellular systems in a hands-on capacity that no modeling program in the world will provide network planners with all of the answers. Even today on the voice side, most network operators spend a great deal of time driving their territories testing and retesting their networks and their competitors networks.

When it comes to data services, the task is even more difficult since the number of data users is still small and the systems have only been up for a short period of time. Therefore, it is more important than ever for wireless operators to be realistic about what they can offer, and where. The systems are nearly ready for prime time and are in beta testing (providing service to real customers), which is the final step in refining the systems. Of course, as networks expand and new users come online, there will be ongoing adjustments, as with voice systems.

The Issues

The press and analysts are being given data devices so they can go out and poke around the network. This is both good news and bad news.

The good news is that these folks (ourselves included) will really work the networks to find out exactly what they will do, how good their coverage is and what their data speeds are.

The bad news is that these folks will write about their findings, talk about them, and in many cases, declare to the world that both 2.5 and 3G data speeds are not as advertised.

I have talked with many of these folks and receive daily emails from them.

Their methodology for testing revolves around a simple test that only determines data speed. After installing a wireless modem in a notebook, they connect and then visit one of the many Internet sites that provide speed tests.

2Wire.com seems to be one of the most popular sites, and different sites can yield different results. They run the test and record the results, repeating them in as many different locations and at as many different times of the day as possible. From what I can tell, the results have been underwhelming. Not a single network that is up and running today is getting anywhere near passing marks using this evaluation method.
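The run-and-record routine Seybold describes is easy to picture. A minimal sketch of the logging-and-averaging part, with hypothetical sample numbers standing in for real test runs (the function and the data are illustrative, not any site's actual method):

```python
from statistics import mean

def throughput_kbps(bytes_received: int, seconds: float) -> float:
    """Effective throughput in kilobits per second for one test run."""
    return (bytes_received * 8) / seconds / 1000

# Hypothetical runs: (location, time of day, bytes received, elapsed seconds)
runs = [
    ("downtown", "09:00", 250_000, 40.0),
    ("downtown", "21:00", 250_000, 25.0),
    ("suburb",   "09:00", 250_000, 60.0),
]

# Average over locations and times of day, as the reviewers do.
per_run = [throughput_kbps(b, s) for _, _, b, s in runs]
print(f"mean throughput: {mean(per_run):.1f} kbps")  # prints "mean throughput: 54.4 kbps"
```

Note how a single slow location drags the average well below the best observed run, which is exactly why results reported this way look underwhelming.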

At issue here is that these tests are exercising the entire end-to-end connection and not the wireless link itself. Some believe that this is a valid test since it is, after all, the end-to-end performance that we want to measure. In reality, there are too many variables in the chain for these tests to be truly valid. Even so, the industry is going to have to deal with hundreds of articles over the next few months citing their results. I have yet to see any results that show any network in a favorable light. Unfortunately, the industry has made speed the issue of 2.5 and 3G data services so speed is what the press and analysts are focusing on.

Perhaps it's time for a neutral organization such as the CTIA to step up and provide standardized tests. Most of the testing web sites have been set up to prove or disprove speeds over wired links (dial-up, DSL and cable, for example) and they don't take latency in wireless networks into account. But even if we came up with a set of testing guidelines for these folks and the corporate IT folks who want to run their own tests, differences will still be encountered: Internet speeds vary depending upon loading, time of day and routing. Even so, we might have more reliable test data if we established a web site designed to measure WIRELESS data throughput and set forth guidelines for testing the networks and services.
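Latency is exactly why a web-page-sized speed test can make a wireless link look far worse than its raw rate. A minimal model, with hypothetical numbers, of how a fixed round-trip delay drags down the effective throughput of short transfers:

```python
def effective_kbps(payload_bytes: int, link_kbps: float, rtt_s: float) -> float:
    """Effective throughput when a fixed round-trip latency precedes the transfer.

    Models one request/response: total time = RTT + transmit time.
    """
    transmit_s = (payload_bytes * 8) / (link_kbps * 1000)
    return (payload_bytes * 8) / ((rtt_s + transmit_s) * 1000)

# A nominally 100 kbps link with an assumed 800 ms round trip: a small,
# web-page-sized fetch sees only half the raw rate, a long download nearly all of it.
print(f"{effective_kbps(10_000, 100.0, 0.8):.1f} kbps")     # prints "50.0 kbps"
print(f"{effective_kbps(1_000_000, 100.0, 0.8):.1f} kbps")  # prints "99.0 kbps"
```

This is the distortion wired speed-test sites miss: they were calibrated for links where the round trip is tens of milliseconds, not hundreds.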

The industry should also publish a paper on a testing methodology that takes into account the type of data being measured, suggesting that the tests be run on both wired and wireless systems in order to obtain a true comparison. Further, the various compression technologies should be taken into account as well as raw data rates.
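Compression's effect on apparent speed can be sketched the same way. A hypothetical blended-rate calculation, assuming a 2:1 ratio on compressible text and no gain on already-compressed data (the ratio and mix are illustrative, not measured):

```python
def effective_rate_kbps(raw_kbps: float, text_fraction: float,
                        text_ratio: float = 2.0) -> float:
    """Blended effective rate when only part of the payload compresses.

    text_fraction of the bytes compress at text_ratio:1 on the wire; the
    rest (images, already-compressed data) pass through at the raw rate.
    """
    # Wire bytes needed per byte of application data.
    wire_fraction = text_fraction / text_ratio + (1.0 - text_fraction)
    return raw_kbps / wire_fraction

print(f"{effective_rate_kbps(40.0, 1.0):.1f} kbps")  # prints "80.0 kbps" (all text)
print(f"{effective_rate_kbps(40.0, 0.0):.1f} kbps")  # prints "40.0 kbps" (all images)
print(f"{effective_rate_kbps(40.0, 0.5):.1f} kbps")  # prints "53.3 kbps" (mixed)
```

Two networks with identical raw rates can thus report very different "speeds" depending on whether the test payload is compressible, which is the comparison pitfall the paper would need to address.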

The computer press, especially, are novices when it comes to wireless data performance. The only way they know to measure networks is to use the tools that are provided for speed comparisons. Let's spend some time, as an industry, helping craft a new set of tools for them and make sure they understand the differences and similarities between wired and wireless communications.

Some may believe that wireless networks should be judged using the same tools as wired networks since the wireless industry is trying to deliver comparable data speeds.

But testing network speeds to notebook computers when the vast majority of customers will be using other types of wireless devices with totally different capabilities means that these tests may not be particularly useful.

However, many of the folks using these tests are preparing to write some pretty negative reviews of their experiences. The industry needs to work together to make certain that those who write reviews understand all of the issues and dont simply dismiss what the industry has accomplished thus far. <<

- Eric -



To: Eric L who wrote (2173) | 4/3/2002 5:08:23 PM
From: 49thMIMOMander
 
Well, one could put it more dramatically:

- consider the case that when your neighbor starts booting his/her PC, your PC stops working,
freezes and crashes, and additionally his/hers will not boot.

That is, IBM/DEC/SUN actually tried to make computers which communicated, while DOS-WinCrash
was more optimized for personal, isolated wxnking. (Nothing wrong with that, as long as the neighbors
are not too much disturbed and keep paying their utility bills or whatever.)

The history of telecom is littered with "point-to-point" solutions which never worked or spread
beyond those two points. This was actually the most important factor in why CCITT, now ITU, took
over from Bell-copy-me-if-you-can standards as well as "de facto" standards (typical examples:
modems, especially fax modems; T1; 56-64 kbps; etc. - roaming GPRS handsets being just the latest).

That is, it will be interesting to see how Microsoft will adapt to having to "connect", not just
with itself, locally, but also to others.

I am "forced to admit" that I had to switch from running ethernet along a common coaxial
cable to the less demanding point-to-point-plus-hub system, tired of searching for the
one ethernet card which clogged up the whole common cable - better to isolate every
user in his/her own little disconnectable cable.

Microsoft must have fun dreaming of the day they introduce a new Word feature, so
that one upgraded user will cause all non-upgraded users to crash within 20 miles - and
globally if interconnected.

Anyway, they missed out on both ethernet and internet, even "WinModems", and are still
struggling to make anything crashproof, or at least fail-safe; to handle multi-tasking and
simple audio and video; and especially to clean up the dirt of long-ago-closed
processes without an unlimited supply of memory and regular reboots.

But who knows, maybe they can re-educate all of their managers and engineers??

Ilmarinen

Well, I have always enjoyed how they have handled the BIOS, VGA, SCSI, etc. guys whom
they trust to do the tough (embedded) stuff, to interface to something else...

Btw, Win2000 and XP users should bless VMS for every fourth hour they get by without rebooting,
but it was tough to get even the simple basic ideas of VMS partially moved into WinCrash.
(The reason VMS was so popular was that it crashed less than IBM - some smart solutions.)

And all along, Intel has tried to provide (almost) all the hardware and CPU functions needed to at least
try to build a functioning system; WinCrash-and-Doze just decided not to use most of them. (That SPOX thing
was even worse - really funny for a DSP guy.)

Anyway, in terms of motherboards, the history is also littered with "strange" motherboards,
but luckily they just behave strangely in the privacy of their own little board (except if
they integrate stuff to make them connect to other motherboards).

Hmm, listening to CNBC, it's funny how they get disturbed by funny ringtones these days,
even some funny rhythmical RF-jazz salutes - the ones coming from active GSM handsets
and lousy audio-studio equipment.