Technology Stocks : MSFT Internet Explorer vs. NSCP Navigator


To: Gerald R. Lampton who wrote (22700)2/17/1999 12:19:00 AM
From: damniseedemons  Read Replies (2) | Respond to of 24154
 
Hey Jerry,

The Rambus wave just keeps getting pushed farther and farther back... Why? Because it's a pretty worthless technology, for the most part.

RDRAM increases memory bandwidth, but the big tradeoff for that bandwidth is latency. Net net, RDRAM systems hardly perform any faster than those with SDRAM memory. Not to mention that all but heavily 3D applications use less than 7% of available memory bandwidth (be it SDRAM or EDO).
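
A rough back-of-envelope sketch of why that is (the numbers below are my own illustrative assumptions, not measured figures): time per memory access is roughly first-word latency plus transfer time, and for cache-line-sized accesses the latency term dominates, so doubling bandwidth buys very little.

/* Back-of-envelope sketch, illustrative numbers only (assumed, not measured):
 * time per cache-line fill ~= first-word latency + line_size / bandwidth.
 * Doubling bandwidth barely helps when latency dominates. */
#include <stdio.h>

int main(void) {
    double line_bytes = 32.0;  /* assumed cache-line fill size, bytes */
    double latency_ns = 60.0;  /* assumed first-word latency, ns (similar for both parts) */
    double sdram_bpns = 0.8;   /* assumed SDRAM bandwidth, bytes per ns (~0.8 GB/s) */
    double rdram_bpns = 1.6;   /* assumed RDRAM bandwidth, bytes per ns (~1.6 GB/s) */

    double t_sdram = latency_ns + line_bytes / sdram_bpns;  /* ns per line fill */
    double t_rdram = latency_ns + line_bytes / rdram_bpns;

    printf("SDRAM: %.0f ns per line fill\n", t_sdram);              /* ~100 ns */
    printf("RDRAM: %.0f ns per line fill\n", t_rdram);              /* ~80 ns  */
    printf("Gain from doubling bandwidth: %.2fx\n", t_sdram / t_rdram);  /* ~1.25x */
    return 0;
}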

So basically, instead of addressing the real issue, latency, which would noticeably increase overall system performance, Intel has an economic interest in keeping the bottleneck on the CPU (though unfortunately for them the bottleneck has increasingly become the network, but that's another topic..). So Rambus was in the right place at the right time, and Intel anointed them the vaunted "next generation memory interface." I won't go so far (publicly) as to call INTC/RMBS a charade, but well...

-Sal

BTW: I appreciate all the legal commentary you've been making. Just wish I had time to read all the posts, and especially to post here more often...



To: Gerald R. Lampton who wrote (22700)2/18/1999 2:51:00 AM
From: nommedeguerre  Respond to of 24154
 
Gerald,

>>When will Intel start shipping chipsets with Rambus technology? Does this Pentium III launch affect that in any way?

Cannot answer this effectively without a little research.

Unless Intel announces something better than a 500MHz Pentium III, they will be playing catch-up to what IBM can now offer. I doubt it will make a difference on the desktop, but it could have a decent impact on network-related systems.

Cheers,

Norm



To: Gerald R. Lampton who wrote (22700)2/18/1999 10:16:00 AM
From: Daniel Schuh  Read Replies (1) | Respond to of 24154
 
Gerald, here are a couple of articles from Tom's Hardware page a while back. Like Sal, they're not much impressed by Rambus. The main article is "Performance Impact of Rambus" www4.tomshardware.com. There are also brief takes of interest at www4.tomshardware.com and tomshardware.com . The last two are sections of broader articles.

As a cheap hardware guy, I'm a little bemused by Intel in general. Celery is the way to go, at least on the desktop. For regular Pentium IIs, it's not even clear that the 100MHz bus buys you anything in particular. High bandwidth solutions have a place in the multiprocessor market, but for mainstream desktop/productivity apps, it's all massive overkill.

It's odd that businesses still seem to be budgeting pretty much the same money for PC's, going for $2-3k systems. I'd buy $500 systems as needed, and treat them as disposable, or use 'em for network terminals if they ran out of gas. But again, I'm cheap.

Cheers, Dan.



To: Gerald R. Lampton who wrote (22700)2/18/1999 11:11:00 AM
From: Daniel Schuh  Respond to of 24154
 
Intel Unveils Controversial Chip nytimes.com

One final OT on PIII/Katmai. And one little excerpt, from my cheap hardware guy perspective:

"This isn't really a push to promote a processor. It's more a push to arrest the slide down the slippery slope toward less expensive personal computers," said Van Baker, director of market research at Dataquest, a San Jose, Calif. high-tech research firm.

That makes sense for Intel business-wise, but on the broader economic front, it's counter to the "appliance" mantra. The article goes on to state how PIII is cool for 3D graphics and other computationally intensive stuff. Which is, indeed, cool, but personally uninteresting; my hand-eye coordination was never that good, and the gaming scene is beyond me. From the broader business standpoint, I've got to wonder too, as the bread-and-butter apps just don't need this stuff. Maybe voice recognition or video compression will become essential, I don't know. I think it would be pretty short-sighted for business to invest big time in hardware for apps that aren't really ready yet, though.
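
For what it's worth, here's the kind of loop the PIII's new instructions are aimed at (a plain scalar C sketch of my own, just to illustrate; the SIMD version would do four of these multiplies per instruction). Spreadsheets and word processors just don't spend their time in loops like this.

/* Illustrative sketch only: a data-parallel single-precision loop of the sort
 * SSE-style SIMD (4 floats per instruction) is built to accelerate.
 * Plain scalar C shown; office apps rarely look like this inner loop. */
#include <stddef.h>

void scale_vertices(float *xyz, size_t n, float s) {
    size_t i;
    for (i = 0; i < 3 * n; i++) {  /* n vertices, 3 coordinates each */
        xyz[i] *= s;               /* same multiply repeated over a long float stream */
    }
}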

Then there's NT2K, which will no doubt soak up plenty of compute power, but won't be helped by the signal-processing-style horsepower in the PIII. And by the time NT2K becomes mainstream, today's hardware will be obsolete anyway. Not to mention that a big, bloated OS is not the ideal platform for stuff that has a large real-time element. It's all pretty weird. Written word guy makes no predictions.

Cheers, Dan.