Dear Saturn:
My point is that, like many overhyped wireless rollouts before it, WiMax is headed down the same tired and ultimately fruitless route. It is telling that actual attempts to build a money-making business on it fail on a revenue-versus-cost basis. For example, WiMax will supply at most 70Mb/s per covered area. At its price ($40-50/mo) it must deliver 6Mb/s down and 0.75Mb/s up to each device in the area, which allows only about 10 active users per area. The oversubscription multiplier (paying subscribers per active user) runs between 3 (at a 2Mb/s average) and 6 (at a 1Mb/s average), so each area serves only 30 to 60 subscribers. If a WiMax base station, its associated network, tower and land lease cost $100K to $200K each, the system will not make enough money to cover the interest on the infrastructure, much less any profit.
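(For the curious, here is that back-of-envelope math as a small Python sketch. Everything comes from the paragraph above except the 8% cost of capital, which is just an assumed figure to put a number on "interest".)

cell_capacity_mbps = 70.0      # shared WiMax throughput per covered area
per_user_down_mbps = 6.0       # promised downstream per device
per_user_up_mbps   = 0.75      # promised upstream per device
price_per_month    = 40.0      # low end of the $40-50/mo price
infra_cost         = 150_000.0 # midpoint of $100K-200K per base station, network, tower, lease
interest_rate      = 0.08      # ASSUMED annual cost of capital, not from the letter

active_users = int(cell_capacity_mbps / (per_user_down_mbps + per_user_up_mbps))  # ~10
for multiplier in (3, 6):      # 3x (2Mb average) to 6x (1Mb average) oversubscription
    subscribers = active_users * multiplier                  # 30 to 60
    revenue_yr  = subscribers * price_per_month * 12
    interest_yr = infra_cost * interest_rate
    print(f"{multiplier}x: {subscribers} subscribers, ${revenue_yr:,.0f}/yr revenue "
          f"vs ${interest_yr:,.0f}/yr interest alone")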
So even if it were technically feasible, it is a money loser. That is why such build-outs are failing. Sure, a WiMax cell's transmit area is measured in hundreds of square miles, but data-rate density restricts it severely. The limiting factor is not how far the signal can go, but how many microcells can be placed in dense-use areas. A free hot-spot area like Cathedral Park (100K sqft) averages 200-500 users each day, and the low average link speed of 20-50Kb/s is tolerated only because it's free. If people had to pay $40/mo for it, they would demand at least 1Mb/s, which takes 4-5 WiMax cells to provide. That means each cell should cover only about 20K sqft.
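(Again as a quick Python sketch, assuming the peak simultaneous crowd is around 300 of those 200-500 daily users; that peak figure is my guess, the rest are the numbers above.)

park_area_sqft     = 100_000
peak_online        = 300      # ASSUMED simultaneous peak out of 200-500 daily users
demanded_mbps      = 1.0      # what a $40/mo customer would expect
cell_capacity_mbps = 70.0     # per WiMax cell

cells_needed  = peak_online * demanded_mbps / cell_capacity_mbps   # ~4.3, i.e. 4-5 cells
area_per_cell = park_area_sqft / cells_needed                      # ~23K sqft, roughly 20K
print(f"{cells_needed:.1f} cells needed, ~{area_per_cell:,.0f} sqft each")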
Thus the 30-mile range of WiMax is actually a disadvantage. If WiMax were 70Gb/s rather than 70Mb/s, the larger cell would matter less: the density Cathedral Park requires would put WiMax towers just 200ft apart, while at 70Gb/s the spacing would be more like 1.25 miles, and 30K users paying $40/mo would easily pay for that cell's infrastructure.
54Mb/s WiFi only allowed 25-50 online users per cell, but since each cell covered at most 70K sqft and could be shrunk if needed, the required density is easy to achieve; essentially one microcell antenna per light pole was the average need. That only makes sense if each cell, including all network costs, comes in at $50K or less. Given that a WiFi base station costs less than $100 and has a 1GbE connection, total network cost should be below $1K per cell, not counting internet access. The trouble is that 155Mb/s (OC3) internet access runs $5.8K a month, which chews up nearly half of the $12K/mo in total revenue (12 cells of 25 users each at $40/mo). Now if the city, county or state allowed every pole to be used and rights of way were gratis (no charge), the WiFi network might be feasible in high-density areas. But given taxes, "franchise" fees, "free" web sites, web "channels" and required free services, those costs can hack another 25-35% off the revenue, leaving too little to cover interest and depreciation on the equipment, installation and service.
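(The same two claims checked in Python: cell spacing scales with the square root of per-cell capacity if demand per square foot stays fixed, and the 30% tax/fee haircut is simply the midpoint of the 25-35% range.)

import math

# (a) Cell spacing vs capacity: 1000x the capacity -> sqrt(1000) ~ 32x the spacing.
spacing_70mbps_ft = 200.0
spacing_70gbps_ft = spacing_70mbps_ft * math.sqrt(70_000 / 70)     # ~6,300 ft
print(f"70Gb/s spacing ~ {spacing_70gbps_ft / 5280:.2f} miles")    # ~1.2 miles

# (b) WiFi cluster cash flow: 12 cells, 25 paying users each, one shared OC3.
revenue_mo    = 12 * 25 * 40.0     # $12,000/mo
oc3_mo        = 5_800.0            # 155Mb/s transit
taxes_fees_mo = revenue_mo * 0.30  # ASSUMED 30%, midpoint of the 25-35% haircut
left_mo       = revenue_mo - oc3_mo - taxes_fees_mo
print(f"${left_mo:,.0f}/mo left to cover interest, depreciation and service")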
So if cheap WiFi couldn't cover its costs, WiMax certainly can't, since it isn't as far down the economy-of-scale curve. Without heavy subsidies it is going to fail, and that is exactly what the early builders concluded. They had hoped the oversubscription multiplier would be much higher (30x or more) than it turned out to be (3-6x). BTW, POTS had a 25x multiplier in the old days; it was a cash cow (made a lot of money over the years).
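(One last sketch, just to show how much that multiplier matters: the same ~10 active users per cell at $40/mo under the observed 3-6x versus the hoped-for 25-30x.)

active_users, price_mo = 10, 40.0
for mult in (3, 6, 25, 30):
    subs = active_users * mult
    print(f"{mult:>2}x -> {subs:>3} subscribers, ${subs * price_mo:>7,.0f}/mo per cell")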
Pete