CPU Clone-Makers Wrestle With X86 Compatibility
(01/28/98; 4:30 p.m. EST)
By Alexander Wolfe, EE Times

Verifying the compatibility of an X86 clone processor is harder than it appears, according to industry experts.
Every new clone CPU must be tested to ensure that it can actually execute the entire X86 instruction set without any bugs or unexpected side effects. But engineers working toward this goal face one big stumbling block, according to Michael Slater, principal analyst at MicroDesign Resources (Sebastopol, Calif.).
"The problem is the X86 architecture has never been a properly defined architecture," said Slater. "It's been an implementation, a de facto standard."
According to Glenn Henry, president of the CPU-design house Centaur Technology (Austin, Texas), "Over half the X86 architecture isn't written down. It's behavior."
As a result, there's no comprehensive written specification that a chip maker can test against to verify a chip's compatibility with the Intel architecture. And if a microprocessor can't be marketed with the pledge that it will run existing X86 software, it won't find many willing customers.
X86 compatibility testing is separate from, and must be performed in addition to, standard verification, which checks that a processor carries out its logical functions the way its designers intended.
Surprisingly, even Intel itself faces this problem; the company must verify that its own new chips are compatible with its own existing instruction set. In practice, however, it's more of a challenge for clone makers, which may not have Intel's historical database filled with arcane details of the architecture.
"Designing a chip with the amount of data flow and cache of a Pentium-class processor is not that difficult," said Centaur's Henry. "What's incredibly hard to do is to verify X86 compatibility."
In practice, Henry noted, the real test is whether a CPU will execute X86 machine code and, in turn, real-world PC applications. But passing this test involves a lot more than firing up a PC and seeing if a CPU will boot up Windows 95.
To pass muster, a chip must successfully navigate a host of test vectors and formal verification procedures. Yet, when Centaur began to verify its first CPU -- a low-cost Pentium-class processor called the WinChip C6 -- the company found that available off-the-shelf test software couldn't meet all its needs. Accordingly, Centaur internally developed what Henry called a "substantial complement of X86 simulation and verification tools."
To verify compatibility of the WinChip C6, Centaur ended up using a rich mix of classic tools like SpeedSim, as well as home-grown software such as an artificial-intelligence-based program generator.
"Verification to us is a serial test process," Henry said. "Every hour of the day, it's spitting out random test cases that no human could think of. For example, you're doing some operation when a snoop hits. Then an interrupt comes along, or there could be a 'stop clock.' No human thinks of such complicated test cases, but a random generator does, and when it does we turn it into a verification test case."
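The generator Henry describes can be pictured as follows: ordinary operations with asynchronous events (snoop hits, interrupts, stop-clock) injected at random points, producing interleavings no human would write by hand. This is a minimal illustrative sketch, not Centaur's actual tool; the operation and event names are invented for the example.

```python
import random

# Hypothetical operation and event vocabularies for illustration only.
OPS = ["load", "store", "add", "mul", "branch"]
EVENTS = ["snoop_hit", "interrupt", "stop_clock"]

def random_case(rng, length=8):
    """Build one test case: a stream of ops with async events
    injected at random points between them."""
    case = []
    for _ in range(length):
        case.append(rng.choice(OPS))
        if rng.random() < 0.3:  # occasionally inject an async event
            case.append(rng.choice(EVENTS))
    return case

rng = random.Random(1998)
print(random_case(rng))
```

Each interesting interleaving the generator finds would then be captured and replayed as a permanent verification test case, as Henry describes.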
Automated testing is crucial, Henry said. "With X86 [CPUs], if all you do is check the results at the end, you'll get tons of bugs because every instruction has side effects."
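Henry's point about side effects can be made concrete with a toy co-simulation check: compare the full architectural state of a device-under-test model against a golden reference after every instruction, not just the final result. The register and flag models below are deliberately simplified stand-ins, not real X86 semantics.

```python
import random

# Toy architectural state: one register plus two flags an instruction
# may silently modify as a side effect.
def new_state():
    return {"ax": 0, "zf": 0, "cf": 0}

def reference_add(state, src):
    # Golden model: ADD updates the result AND the ZF/CF side effects.
    result = state["ax"] + src
    state["cf"] = 1 if result > 0xFFFF else 0
    state["ax"] = result & 0xFFFF
    state["zf"] = 1 if state["ax"] == 0 else 0

def buggy_add(state, src):
    # Hypothetical flawed design: computes the right sum but forgets
    # to update the carry flag -- an end-result-only check would pass.
    state["ax"] = (state["ax"] + src) & 0xFFFF
    state["zf"] = 1 if state["ax"] == 0 else 0

def run_random_cases(n=1000, seed=6):
    rng = random.Random(seed)
    mismatches = []
    for i in range(n):
        ref, dut = new_state(), new_state()
        ref["ax"] = dut["ax"] = rng.randrange(0x10000)
        src = rng.randrange(0x10000)
        reference_add(ref, src)
        buggy_add(dut, src)
        if ref != dut:  # compare FULL state, not just the sum
            mismatches.append((i, ref, dut))
    return mismatches

print(f"{len(run_random_cases())} mismatches found")
```

A checker that looked only at the arithmetic result would report zero failures here; comparing complete state after each instruction is what exposes the dropped carry flag.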
Along with buzzing out the operation of a processor's instruction set, Henry said it's vital to see how a processor works in a systems setting where buses and operating systems can stress the CPU. "Testing with real-world I/O is very important," he said. "Lots of devices have accumulated over the years and ISA is a horrible bus. As a result, it's very hard to simulate a full system. The only way to go is to plug your processor into a real PC with real I/O."
This approach enables engineers to view the effects of bus interactions as well as the sometimes unexplained glitches in bus timing.
Even Intel itself is not immune from the question of X86 compatibility, and must work hard to assure both end-users and the engineering community that its chips are backward compatible.
"Compatibility is taken very seriously by Intel," said Gadi Singer, general manager of design technology for Intel (Santa Clara, Calif.). "We believe that the ability of the user to use all the software on [the] processor is very important."
Singer noted that there are many written specs for the Intel architecture, but "there's a lot of the Intel architecture that's not written down."
Ensuring compatibility consists of several components, he added. First is the idea that a CPU should run all the software handled by the previous processor generation. In addition, a new chip should conform to its own specification.
Interestingly, knowledgeable engineers note that Intel's own chips sometimes do things differently, especially when it comes to obscure functions relating to performance-monitoring or BIOS.
"There are differences between the 486 and the Pentium," Centaur's Henry said. "For example, the 486 has test registers that are not in the Pentium."
"Intel is not perfectly compatible among all their chips," said one CPU expert who requested anonymity. "For example, not all the CPU condition codes are the same. But most Pentium-class chips have pretty similar behavior."
In practice, features such as test registers are generally not used by operating systems or applications, so they don't affect end users. Rather, the features are used mostly by development and test engineers.
But for clone-makers, the inconsistencies between different Intel chips sometimes make it difficult to know how to define a "correct" operation.
"The rule [cloners] follow is, if two Intel processors do it differently, then it's a 'don't care' case," said MicroDesign Resources' Slater. "In that respect, Pentium helped them, because there were a bunch of weird things in the 486. Its weird corner cases -- if three exceptions happen at the same time, which one does the processor finish? It's mostly around protected-mode stuff, not the basic Intel instruction set."
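Slater's "don't care" rule can be sketched in a few lines: check a clone's behavior against two Intel reference parts, and treat any case where the references themselves disagree as unconstrained. The behavior tables here are invented placeholders, not real chip data.

```python
# Illustrative per-case behavior tables for two reference parts.
# Values are opaque labels standing in for observed behavior.
REF_486     = {"normal_add": "flags_v1", "corner_case": "order_a"}
REF_PENTIUM = {"normal_add": "flags_v1", "corner_case": "order_b"}

def check_clone(clone_behavior):
    """Return the cases where the clone must match but doesn't."""
    failures = []
    for case, observed in clone_behavior.items():
        a, b = REF_486[case], REF_PENTIUM[case]
        if a != b:
            continue  # references disagree: "don't care" case
        if observed != a:
            failures.append(case)
    return failures

clone = {"normal_add": "flags_v2", "corner_case": "order_c"}
print(check_clone(clone))  # → ['normal_add']; corner_case is don't-care
```

Only the case where both Intel parts agree constrains the clone, which is why, as Slater notes, the Pentium's cleanup of the 486's odd corner cases made the cloners' job easier.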
As microprocessors from both Intel and its competitors get more complex, the job of verifying designs won't get easier anytime soon. "The amount of technology and investment required to do the last 5 percent of validation is probably as much or more than what's needed to do the first 95 percent," said Intel's Singer.
Added the CPU expert who requested anonymity: "You can never do enough verification to do it perfectly."