To: Maurice Winn who wrote (21467) 1/17/1999 6:40:00 PM From: Clarksterh
Maurice -

"But if a slightly higher chip rate does in fact slightly improve efficiency, which QUALCOMM has said should be tested in practice and the most efficient system should be adopted, then the backward compatibility to existing cdmaOne should be ignored, because the colossal amount of WWeb equipment to be sold will make the existing cdmaOne equipment a mere drop in the ocean."

You're assuming that there is a black and white here. The fact is that differences this small are in the eye of the beholder. For instance, everything else being equal, the higher chip rate will accommodate higher data rates. Period? No. If you have a 3.6 Mbps HDTV signal, then both chip rates accommodate just one signal, and the higher chip rate will use somewhat more battery power. Is the difference significant, or is it within the uncertainty of the performance prediction? Given that the performance prediction is only good to maybe +/- 20%, I would say that this difference is inconsequential (see the rough numbers in the PPS). Essentially this is related to the maxim that there comes a point in every project when you need to shoot the engineers and start production. (Note that in a system-vs-system trade (WCDMA vs. CDMA-2000) Qualcomm would almost certainly win by a margin well outside the uncertainty, but when it comes to individual parameters there has to be some arbitrariness or you will never be done.)

As for "As Gilder's vision of freely available spectrum gains ground, the 5 MHz bandwidths, guardbands and such like will become irrelevant": while that may be true, it is also true that that particular reality isn't likely for at least another 5 to 10 years. Should we always hold off on a particular technology because we know it will actually hinder us in 10 years? If so, then Intel should never have made the x86 line, and GSM and cdmaOne, with their imperfect vocoders etc., should never have started operation, ... I am not saying that this trade should not be made, but it should be made from an economic standpoint. Knowing that your proposed technology will be obsolete in 10 years, is it still possible to make money before it becomes obsolete, and will the cost penalty to be paid in 10 years (to revamp the system with the newer technology and to suffer with a legacy system) still be economically worthwhile? (A toy version of that test is sketched in the PPPS.)

Clark

PS As for the issue of Gilder's vision being a reason to go with the higher chip rate: hardly, since 4.096 will be equally obsolescent at that point. At that point people will use whatever chip rate they need, and they aren't going to be tied to 3.68 or 3.84 or 4.096.
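PPS To put rough numbers on the "both chip rates accommodate just one signal" point, here is a back-of-the-envelope sketch in Python. It takes the 3.68 (i.e. 3.6864) and 4.096 Mcps rates from this thread; the rule that the number of high-rate channels that fit is governed by the processing gain (chip rate divided by data rate) is a standard DS-CDMA rule of thumb assumed here, not something established in this post, and the +/- 20% figure is just my estimate from above.

# Rough comparison of the two candidate chip rates for a 3.6 Mbps signal.
# Assumed (not from the thread): one high-rate user consumes roughly
# data_rate/chip_rate of the carrier, so the number of such channels that
# fit is floor(processing gain), with processing gain = chip_rate/data_rate.

CHIP_RATES_MCPS = {"cdmaOne-compatible": 3.6864, "W-CDMA proposal": 4.096}
HDTV_RATE_MBPS = 3.6            # the 3.6 Mbps HDTV signal discussed above
PREDICTION_UNCERTAINTY = 0.20   # the +/- 20% performance-prediction error

for name, rate in CHIP_RATES_MCPS.items():
    pg = rate / HDTV_RATE_MBPS      # processing gain, chips per bit
    channels = int(pg)              # whole 3.6 Mbps channels that fit
    print(f"{name}: {rate} Mcps -> processing gain {pg:.2f}, "
          f"{channels} HDTV channel(s)")

low, high = sorted(CHIP_RATES_MCPS.values())
diff = (high - low) / low           # relative chip-rate difference
print(f"chip-rate difference {diff:.1%} vs "
      f"+/-{PREDICTION_UNCERTAINTY:.0%} prediction uncertainty")

Run it and you get a processing gain of about 1.02 versus 1.14, i.e. exactly one HDTV channel either way, and a chip-rate difference of about 11%, comfortably inside the +/- 20% prediction error. That is the sense in which the difference is in the eye of the beholder.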
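PPPS And since the trade should be made from an economic standpoint, here is the same kind of sketch for that question. Every number below (annual profit, revamp cost, legacy drag, discount rate) is a made-up placeholder; only the 10-year obsolescence horizon comes from the post. The point is just the shape of the test: discounted earnings before obsolescence versus the discounted cost of revamping plus carrying a legacy system.

# Toy net-present-value test for "deploy now, revamp in 10 years".
# All figures are hypothetical placeholders, not from the thread;
# only the 10-year horizon is taken from the post above.

DISCOUNT_RATE = 0.10      # assumed cost of capital
ANNUAL_PROFIT = 100.0     # assumed profit per year while the tech is current
REVAMP_COST = 400.0       # assumed year-10 cost to move to the newer tech
LEGACY_DRAG = 30.0        # assumed per-year cost of carrying the legacy system
YEARS = 10                # the 10-year obsolescence horizon from the post

def pv(amount, year, r=DISCOUNT_RATE):
    """Present value of a cash flow `year` years out."""
    return amount / (1 + r) ** year

earnings = sum(pv(ANNUAL_PROFIT, y) for y in range(1, YEARS + 1))
penalty = pv(REVAMP_COST, YEARS) + sum(
    pv(LEGACY_DRAG, y) for y in range(YEARS + 1, YEARS + 6))  # 5 legacy years
print(f"NPV of deploying now: {earnings - penalty:.1f} "
      f"(worthwhile if positive)")

With these placeholder numbers the answer comes out positive, but the interesting part is which inputs it is sensitive to: mostly the profit you can book before obsolescence, exactly as argued above for x86, GSM and cdmaOne.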