To: Daniel Schuh who wrote (18674) 4/21/1998 4:17:00 PM From: Daniel Schuh
Send in the clones news.com

This one is on the subject of my favorite off-topic dead horse, Intel and sub-$1k PCs. This guy likes AMD, and has some interesting history. I can't comment too much, I wasn't following closely at the time.

When AMD announced recently that it had solved its yield problems, the stars were finally in formation for its return. AMD currently is in the midst of the steepest ramp-up of microprocessors in history, from 1.5 million units this March quarter to an estimated 11.5 million units by the end of the year. Remember that AMD had a market share of about 30 percent in microprocessors in the not-too-distant past, while the combined current non-Intel share is below 10 percent. So the upside potential for AMD is huge. Indeed, in the words of the AMD management: "We are back!"

I don't know, but I'd say that Intel and everybody else in hardware are going to continue to be squeezed by the advent of the sub-$1k PC. About time, I say. K6-233s are sub-$90 on the street, and that's plenty fast enough for most things, till NT5 anyway.

On the other hand, my old buddy Tom Pabst of Tom's Hardware page actually has nice things to say about the Celeriac. He's got a point of sorts: the architecture/cache issue works out to be basically the same as for Socket 7, and the Celeriac's P6 core is a good part. He also says it's a champion overclocker; he cranked it right up to 400MHz on a 100MHz bus. See tomshardware.com. Makes sense, since it's the same core design as the PII and the parts come off the same line. Might even be the exact same part with some strategic wires left disconnected. I always said Intel's got good engineering. But Tom's into 3D gaming, which is fine, but not exactly what drives most purchases, I imagine. And going with the Celeriac buys you a new proprietary lock, and apparently a non-upgradable one at that. Some other hardware site said that leaving out the cache on the Celeron saves a big $5 per part in manufacturing.

Cheers, Dan.