Bux - The bursty nature of TDMA does not affect, to first order, the average power transmitted. The receiver essentially needs a certain amount of energy per bit. Thus, if you squish the bits into half the time, the peak power has to double, but the average over the frame stays the same.
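A toy calculation (my own numbers, just to illustrate the energy-per-bit point): the receiver needs a fixed energy per bit Eb = P * Tb, so compressing the same bits into a smaller duty cycle raises the peak power by exactly the factor the airtime shrinks, leaving the average power unchanged.

```python
import math

def avg_power(bits_per_frame, frame_time, duty_cycle, eb=1e-6):
    """Average transmit power, given a fixed energy-per-bit requirement."""
    bit_time = frame_time * duty_cycle / bits_per_frame
    peak_power = eb / bit_time          # power while the burst is on
    return peak_power * duty_cycle      # averaged over the whole frame

full = avg_power(1000, 1.0, duty_cycle=1.0)   # continuous transmission
burst = avg_power(1000, 1.0, duty_cycle=0.5)  # same bits squished into half the time
print(full, burst)  # both ~0.001 W: peak power doubled, average did not move
```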
As for spreading: like TDMA, it does not, to first order, change the amount of power needed. However, there are a lot of second-order effects from spreading:
1) The receiver can pick up and combine more of the bounces off of buildings (multipath), so the transmitter needs to transmit less.
2) Because you are averaging over a larger frequency range, and some bounce effects are very frequency-specific, the received signal behaves in a more predictable fashion and thus needs less excess power (fade margin) to cover the occasional dips in received signal strength.
3) In cdmaOne the handset can autonomously lower the data rate when the user isn't speaking, which lowers the necessary transmit power. GSM could duplicate this, but to my knowledge has not.
4) In a cell system there are at least two kinds of interference. The first is other users in the same or neighboring cells; the second can be random noise spikes from hair dryers etc. (actually, PCs are a more likely source). Since these spikes are mostly pretty narrow relative to the CDMA signal, they get spread out when the signal is despread at the receiver. Thus, CDMA needs to transmit less excess power to talk over these occasional effects. (Note that since I don't know how many and how big these spikes are, it is impossible to say how much power is saved here. It may be none, if the dominant impairment is not the spikes but completely random noise, either in the air or in the receiver itself (called thermal noise).)
5) ??? I'm sure I am missing some things here, but ...
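Point 4 can be sketched with a toy despreader (my own model, not a real IS-95 chain): a data symbol is spread over N pseudo-random chips, a strong narrowband tone is added on top, and the receiver multiplies by the same chip sequence. The wanted signal sums coherently (gain N), while the tone is chopped by the chips and mostly cancels.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1024                                  # chips per symbol (the processing gain)
chips = rng.choice([-1.0, 1.0], size=N)   # pseudo-random spreading sequence

symbol = 1.0                              # one data bit
tx = symbol * chips                       # spread signal, amplitude 1 per chip
tone = 5.0 * np.cos(2 * np.pi * 0.01 * np.arange(N))  # narrowband interferer, 5x stronger

rx = tx + tone                            # what arrives at the receiver
despread = rx * chips                     # multiply by the same chip sequence

signal_part = N * symbol                  # the wanted signal sums coherently to N
total = despread.sum()
print(total, signal_part)  # total is close to N; the tone leaves only a small residual
```

Before despreading, the tone is 5x the signal amplitude per chip; after, its residual grows only like sqrt(N) while the signal grows like N, which is exactly why the spikes cost CDMA so little excess power.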
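Point 2 can be illustrated the same way (toy numbers of my own): model each narrow frequency slice as independent Rayleigh fading, so the power in a slice is exponentially distributed. A narrowband signal rides on one slice; a spread signal effectively averages many slices, so its received power fluctuates far less and needs a much smaller fade margin.

```python
import numpy as np

rng = np.random.default_rng(1)
trials, slices = 100_000, 64

# Rayleigh fading: received power per slice is exponential with mean 1.
slice_power = rng.exponential(scale=1.0, size=(trials, slices))

narrowband = slice_power[:, 0]          # one frequency slice per trial
wideband = slice_power.mean(axis=1)     # average over 64 slices per trial

# Fade margin: extra power needed so the worst 1% of trials still get through.
nb_margin = 1.0 / np.percentile(narrowband, 1)
wb_margin = 1.0 / np.percentile(wideband, 1)
print(nb_margin, wb_margin)  # the narrowband link needs a far larger margin
```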
Clark |