Kash, I believe 5 volts DC is acceptable to the microcircuits. +5 Vdc has been the standard supply voltage since the days of the first logic ICs (integrated circuits). If more than 5 Vdc is supplied to an IC, it will function erratically; if more than 6 Vdc is forced into it, excess power dissipation will destroy the IC. The early design of the capacitors, transistors, and resistors inside those ICs was all done around a 5 V limit. I believe the electronics engineers already designed the chips with a lifetime long enough for any user. Running a chip below 5 V will improve its lifetime and reliability, not decrease them. For example, Intel's older CPUs operate at 5 V, while the 486 DX4 operates at 3.3 V. I don't think Yousef is going to say that this "has led to serious reliability problems in INTC's older chips."
A voltage drift from one value to another by one-tenth of a volt (3.2 V to 3.3 V, a 0.1 V difference) will not cause any problem, because our digital computers represent data with a range of signal voltages, not exact values. The nominal levels are typically 0 and +5 V.
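To make that concrete, here is a small sketch of the idea using classic TTL-style input thresholds (the 0.8 V / 2.0 V figures are the usual textbook values, used here only for illustration; actual thresholds depend on the logic family):

```python
# Illustrative TTL-style input thresholds
V_IL_MAX = 0.8   # at or below this, the input reads as logic 0
V_IH_MIN = 2.0   # at or above this, the input reads as logic 1

def logic_level(v):
    """Classify an input voltage; in between is the undefined region."""
    if v <= V_IL_MAX:
        return 0
    if v >= V_IH_MIN:
        return 1
    return None  # forbidden/undefined band

# Both 3.2 V and 3.3 V land well inside the logic-1 band:
print(logic_level(3.2), logic_level(3.3))  # 1 1
```

Any voltage comfortably above the high-level threshold reads as the same logic 1, which is why a 0.1 V shift is invisible to the logic.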
IMHO: Normally we would actually like to see a lower voltage, for example below 2.8 V. Lower voltage means lower power dissipation for each IC, so overall power consumption is reduced. It also puts less stress on each transistor, so ICs can continue to pack more functions and features into new devices without fear of overheating. Notebooks especially can see significantly better battery life and simpler heat dissipation. I don't see Yousef elaborating on this area. Like I said, "There is no big deal between 3.2 V and 3.3 V." Knock it off.
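The power argument above is quadratic, which is why dropping the supply voltage helps so much: CMOS switching power goes roughly as P = C * V^2 * f. A quick sketch with made-up capacitance and clock numbers (not any specific chip) shows the 5 V vs 3.3 V comparison:

```python
def dynamic_power(c_farads, v_volts, f_hz):
    """CMOS dynamic (switching) power estimate: P = C * V^2 * f."""
    return c_farads * v_volts ** 2 * f_hz

# Hypothetical switched capacitance and clock, same for both supplies
p_5v  = dynamic_power(1e-9, 5.0, 33e6)
p_33v = dynamic_power(1e-9, 3.3, 33e6)

print(f"3.3 V draws {p_33v / p_5v:.2f}x the power of 5 V")  # ~0.44x
```

Because of the V-squared term, moving from 5 V to 3.3 V cuts switching power by more than half, which is exactly the battery-life and heat win mentioned above. By the same math, 3.2 V vs 3.3 V is only about a 6% difference.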
Best wishes
James