Technology Stocks : Rambus (RMBS) - Eagle or Penguin


To: Estephen who wrote (54938)9/24/2000 12:41:09 PM
From: Scumbria
 
Estephen,

"AMD has been the victim of its own product missteps and
manufacturing snafus."


He must be talking about the 1.13GHz PIII, Merced at 600MHz, the i820 disaster, Timna, server chipset recalls, missing earnings....

Scumbria



To: Estephen who wrote (54938)9/24/2000 1:31:29 PM
From: mishedlo
 
An interesting question posed on the FOOL followed by a reply.
=========================================================
AMD has some good products now and may have some great products coming, but one slip-up could cripple the company or bring it down. Like betting the farm on DDR vs RDRAM. That decision alone could do serious or even irreparable harm to AMD. (Look at what the DDR/RDRAM decision is doing to Micron right now.)

I saw this article posted in the "Best Of" section of the Fool and that is how I ended up posting on your board. Frankly, I don't understand the above argument and I am looking for clarification. First, I don't understand the technical side of this and I must admit, I don't understand how this decision is going to affect AMD's bottom line. Why for example is this so critical given that AMD has its own chipset and essentially its "own" kind of motherboard - doesn't this state of affairs mitigate the above problem? In other words, is it not possible for AMD to switch to RDRAM if they want?

I always thought of the CPU as the traffic controller in a computer that stores and retrieves things in a place called RAM - so why would the CPU care if it is doing this mundane task with DDR, RDRAM or any other RAM? I am aware that there are compatibility issues that need to be addressed but are they insurmountable?

Any thoughts and comments are appreciated.
==========================================================
REPLY: thanks to JasonSmith
Why for example is this so critical given that AMD has its own chipset and essentially its "own" kind of motherboard - doesn't this state of affairs mitigate the above problem? In other words, is it not possible for AMD to switch to RDRAM if they want?

It is critical for AMD because the Athlon is clearly designed for DDR. It has a 200MHz bus, perfect for DDR200. However, no one has yet been able to field a production system using DDR with the Athlon, or with any other processor for that matter.

The critical factor is timing. If Intel gets to market with the P4/RDRAM first, AMD cannot take the lead spot. Despite all the trumpeting of AMD's marketing department, the Athlon with PC133 performs neck and neck with Intel's P3 with PC133. AMD cannot take the performance lead with a 1.2GHz Athlon if there is a 1.5GHz P4 with RDRAM on the market (note that the P4 core is designed to run at higher frequencies than the P3 on the same production process, and since the Athlon has not beaten the P3's frequency in any fundamental way, it is safe to assume the P4 is also designed to run at higher frequencies than the Athlon).

AMD cannot simply switch to RDRAM at a moment's notice. How RDRAM works is considerably different from how SDRAM or DDR works (for that matter, DDR is different from SDRAM, but that is another story). AMD would need an RDRAM chipset; it has taken them this long just to get a working DDR chipset, and we are still holding our breath.

Additionally, the Athlon core has been tweaked to be as efficient as possible using DDR, not RDRAM. You have to design the system for the memory it will use, and AMD backed DDR a long time ago. That means they are stuck with it for this design cycle. Of course, AMD could adapt a chipset to run RDRAM in maybe two quarters, and it would probably run better than DDR anyway, but Intel has been working on the P4/RDRAM design for years now, and they are way ahead.

I always thought of the CPU as the traffic controller in a computer that stores and retrieves things in a place called RAM - so why would the CPU care if it is doing this mundane task with DDR, RDRAM or any other RAM? I am aware that there are compatibility issues that need to be addressed but are they insurmountable?

The CPU cares about how much data it can push through the pipe from the CPU to the memory and back. If the pipe is clogged, not much water flows. If the pipe is clear and wide, lots of water flows.

SDRAM is like a garden hose. You can get enough water through to water part of your lawn, or wash your car. If you need a high pressure wash, you are out of luck.

RDRAM is like a fire hose. It can deliver a lot of water at high pressure. It's overpowered for a lot of the software we use today, but just imagine if your yard suddenly expanded to 100 acres and you had to water it, or if you just bought a 50 yard long SUV that you want to wash in a hurry. That garden hose isn't looking so good now, is it?

DDR is like putting two garden hoses together. You get twice as much water. You still don't get good pressure (DDR bogs down latency-wise when you use it heavily), and you don't get nearly as much water as with a fire hose, but it is a whole lot better than that single hose. You can water twice as much and clean off your old car in half the time.
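
To put rough numbers behind the hose analogy, here is a quick back-of-the-envelope calculation of peak theoretical bandwidth (bus width times transfer rate; real throughput is lower once latency and overhead enter the picture). The data rates and bus widths below are just the published figures for the parts discussed here:

# Peak theoretical bandwidth = bus width (bytes) x transfer rate (MT/s).
configs = {
    "PC133 SDRAM (1 DIMM)":                 {"bus_bits": 64, "mtps": 133},
    "DDR266 (1 DIMM)":                      {"bus_bits": 64, "mtps": 266},
    "PC800 RDRAM (1 channel)":              {"bus_bits": 16, "mtps": 800},
    "PC800 RDRAM (2 channels, P4-style)":   {"bus_bits": 32, "mtps": 800},
}

for name, c in configs.items():
    gb_per_s = c["bus_bits"] / 8 * c["mtps"] * 1e6 / 1e9
    print(f"{name:40s} {gb_per_s:4.1f} GB/s peak")

That works out to roughly 1.1 GB/s for PC133, 2.1 GB/s for DDR266, and 3.2 GB/s for dual-channel PC800 RDRAM, which is the 3.2GB/s figure cited for the P4 further down.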

So you are right in the sense that DRAM is DRAM, and stuff goes back and forth and the processor does not know the difference. But if you have something that needs data quickly, like streaming media, animations, multiprocessing, voice recognition, etc., you want a full-bore fire hose, not a garden hose.

There are a number of old benchmarks that now run almost entirely in cache memory on current CPUs. These benchmarks will run the same no matter which memory you use. Disk-intensive benchmarks won't be any different. But software that needs to move data from one place to another quickly will be blazingly fast. Certain types of software that need to perform functions on large amounts of data will be blazingly fast. In short, RDRAM is necessary to allow the CPUs to support the next generation of processor-hungry, data-intensive software that is still on the drawing board today.
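
To make the cache-bound versus memory-bound distinction concrete, here is a minimal sketch (my illustration, not a benchmark from the article) of a STREAM-style kernel. Once the arrays are far larger than the CPU caches, its runtime is set almost entirely by memory bandwidth rather than core clock speed; shrink N until everything fits in cache and the choice of memory stops mattering:

import time
import numpy as np

N = 10_000_000                       # ~80 MB per array, far beyond any cache
a = np.zeros(N)
b = np.ones(N)
c = np.ones(N)

t0 = time.perf_counter()
a[:] = b + 2.0 * c                   # STREAM "triad": read b and c, write a
dt = time.perf_counter() - t0

# Nominal traffic is two reads plus one write per element (numpy's temporary
# array adds some extra traffic in practice, so this understates the total).
bytes_moved = 3 * N * 8
print(f"Effective bandwidth: {bytes_moved / dt / 1e9:.2f} GB/s")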

The main issue with DDR is the integrity of the electrical signals running around the board. The long and short of that is that DDR can't get much faster than DDR266, if that ever even works (unless you solder the chips directly to the board, which improves the signal integrity enough to get you to maybe DDR500). It does not matter how fast the silicon is - the way you transmit signals between chips has got to change.

RDRAM can currently get you to PC1066, which is equivalent to one channel of DDR266 but uses only about a quarter of the signals to do it, and the standard is well defined. Additionally, you can implement an entire memory subsystem with one PC1066 chip; it takes 4 DDR266 chips to do the same thing. The P4 takes advantage of this by allowing for two channels of PC800 RDRAM, or 3.2GB/s. This is easy to do when you can use fewer signals, and when those signals are impedance matched and terminated so they don't go spilling high frequency radiation into other circuits. I imagine that DDR will be terribly noisy, though I don't know for sure what effect that will have on other components in the system.
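
As a rough sanity check on the "about a quarter of the signals" claim, counting only the data pins (address, control, and clock lines are left out, and exact pin counts vary by implementation, so treat this as an order-of-magnitude comparison):

# Bandwidth per data pin: RDRAM runs a narrow 16-bit channel very fast,
# DDR runs a wide 64-bit bus much slower, for about the same total.
parts = {
    "PC1066 RDRAM channel": {"data_pins": 16, "mtps": 1066},
    "DDR266 DIMM":          {"data_pins": 64, "mtps": 266},
}

for name, p in parts.items():
    gb_per_s = p["data_pins"] / 8 * p["mtps"] * 1e6 / 1e9
    mb_per_pin = gb_per_s * 1000 / p["data_pins"]
    print(f"{name:22s} {gb_per_s:.1f} GB/s total, {mb_per_pin:.0f} MB/s per data pin")

Roughly the same total bandwidth over about a quarter of the data signals, which is part of what makes routing two RDRAM channels to the P4 practical.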

We haven't heard a single manufacturer come out with a promise to provide a workstation with dual-channel DDR. We have heard from Sony that they are developing a system that will have something like 12 or 16 channels of RDRAM (this requires only 12 or 16 RDRAM chips, so it is not a big technical feat; there are 8 chips on one SDRAM DIMM, though the processor will require 12 or 16 integrated controllers).

This whole thing is about the future. AMD is thinking about maximizing returns right now. They are going with DDR, which was widely recognized as a stop-gap measure years ago. But DDR will not meet their needs for the long term, and they will be forced to go through another costly design cycle to get rid of it.

Intel is thinking several years into the future. They know the kinds of things they will need to support and what kinds of markets they are going after, 5 and 10 years out. The P4 core is designed to take them 5 years out, perhaps to the end of the 32-bit processor. Timna is the low-cost processor, and it also is designed to be used with RDRAM in value systems. AMD is taking advantage of the weakness in Intel right now, as they switch from the old product to the new, but a black eye is not the same thing as a death blow, and you can expect Intel to recover quickly.

Now Intel will recover, and Intel's focus has been on the future. AMD's has been on the near term and even the status quo. Who do you think will be better positioned 6 months from now, when the P4 is in full production, Timna is being introduced, and RDRAM is within 20% of the cost of SDRAM (while providing better granularity for small systems and better performance all around)?

The sad part is that all the information needed to make their processors compatible with RDRAM has been available to AMD all along. Intel even paved the way, using its muscle to help get the new standard ramped into full production. Instead, AMD chose to go with lesser suppliers and manufacturers who were backing a short-term, non-standard standard, and those suppliers and manufacturers - for whatever reason - have not been able to deliver.

The sadder part is that with the Athlon and DDR, AMD could have taken the performance crown from Intel for at least one full quarter, maybe two, maybe three, if DDR had been available on schedule. It would have been a major blow to Intel's market share. The P3, you may recall, cannot take advantage of the extra speed in DDR. However, AMD has settled for a virtual performance tie. You can't take market share by tying, even if you discount the price of the product. You have to make your product better. AMD has failed to do that. So they have taken a little market share, but it is likely the P4 will take that back and then some.

Yes, it is a critical time for AMD. Memory performance is critical to system performance. Intel was not derailed by the Athlon, and they are winding up for the counterattack with a Louisville Slugger.