Three-year depreciation cycle for PCs?
Anyone hoping for the traditional depreciation cycle is going to be pretty disappointed. I have what was a top-of-the-line Dell workstation (about $5000 in '97); it still works just fine and, for my work, is only somewhat slower than a modern machine. My work, software development, tends to be IO bound, but I suspect that relatively few people use a PC in ways that will see anything like the improvement the various publicized benchmarks suggest. My '97 vintage machine does just fine running a web browser, even watching full-screen video.
Contrast this to, say, 1995, when a five-year-old machine would have been unusable - not just a little slow, but unusable. Running VC++ 5 on NT 3.51 on a 486/66 with 16MB of memory would blue screen five times a day. In 1995 we had to buy a new machine just to be able to keep working. Today it is hard to quantify the cost of staying with older hardware for another year, and that is exactly what is causing IT departments to push back on spending.
I think that there's still a little room for the depreciation cycle to work in notebooks, but not much. I can't see anything on the horizon that is going to make me want to replace the A30p that I bought this year.
But that's not all that is wrong with the PC market:
1. No (mass) demand for more compute power.
2. No ability to supply more compute power: because pipelines are so long and fragile, extra MHz does very little for run-of-the-mill applications, which, being coded in the prevalent OO style, consist of long chains of dereference, offset, jump (method dispatch) - see the sketch after this list. Bus bandwidth (and memory and disk speeds), as usual, lags far behind the CPU, and as usual that gap is getting worse, not better.
3. Commodity margin squeeze. You can't sell a $500 chip into a $1000 PC.
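
To make item 2 concrete, here is a minimal C++ sketch (the Shape/Circle names are made up purely for illustration) of what a typical virtual call costs the CPU: a chain of dependent memory loads ending in an indirect jump, which is exactly what a long pipeline handles badly.

    // Minimal sketch of the "dereference, offset, jump" pattern from item 2.
    // Shape/Circle are illustrative names; any virtual call site in typical
    // OO code looks roughly like this to the CPU.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Shape {
        virtual ~Shape() {}
        virtual double area() const = 0;
    };

    struct Circle : Shape {
        double r;
        explicit Circle(double radius) : r(radius) {}
        virtual double area() const { return 3.14159265 * r * r; }
    };

    // Each call to shapes[i]->area() costs roughly:
    //   1. load the Shape* from the vector          (dereference)
    //   2. load the object's vtable pointer         (dereference)
    //   3. load the area() slot from the vtable     (offset)
    //   4. indirect call through that pointer       (jump)
    // A chain of dependent loads plus an indirect branch: one cache miss or
    // branch mispredict stalls a long pipeline, so extra MHz buys little here.
    double total_area(const std::vector<Shape*>& shapes) {
        double sum = 0.0;
        for (std::size_t i = 0; i < shapes.size(); ++i) {
            sum += shapes[i]->area();
        }
        return sum;
    }

    int main() {
        std::vector<Shape*> shapes;
        shapes.push_back(new Circle(1.0));
        shapes.push_back(new Circle(2.0));
        std::printf("total area: %f\n", total_area(shapes));
        for (std::size_t i = 0; i < shapes.size(); ++i) delete shapes[i];
        return 0;
    }

Compilers can sometimes devirtualize calls like this, but in typical OO code the pattern above dominates, so the CPU spends its time waiting on memory rather than doing arithmetic.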
Until there's some relief on one or more of these constraints, I don't see much room for Intel to grow at rates that would justify the historical P/E of 30-35.