Technology Stocks : Intel Corporation (INTC)


To: Paul Engel who wrote (40501)11/17/1997 3:14:00 AM
From: Ibexx
 
Paul and thread,

A "White Paper" from EE Times:
_____________

November 17, 1997, Issue: 981
Section: White Paper

Technology Grapples With A Time-To-Market Craze -- Re-Engineering The PC

By Rick Boyd-Merritt

Surprisingly, the story of the original IBM PC (which we relate beginning on page 95) is not so different from the tale of any machine you might see on the show floor of Comdex/Fall this year. A defining concern for the manufacturer was to hit a low price point in a short design cycle, making off-the-shelf technologies a must. That, in turn, raised all sorts of knotty issues about reliability of the system in any given configuration of parts from any given mix of suppliers. And even in 1981, would-be giants like Microsoft Corp. cast a big shadow, significantly influencing the definition of the system.

PC design was - and is - a fast job of system integration calibrated to give the computer maker a significant, albeit temporary, position in a rapidly moving market. But the forces that made the original IBM PC so great may not be the best ones to move the platform forward today. It was one thing for IBM to throw off the yoke of its own bureaucracy and its prestigious patent portfolio to define and legitimize a new concept in open, personal computing. It is quite another for dozens of companies to routinely slap together systems made by a recipe - largely written by Intel Corp. and Microsoft - in a race measured not by the product's quality, but by the quantity in which it ships.

For the PC to mature, the industry may need to step off the treadmill of time-to-volume and take a long, hard look at what innovations are needed to craft reliable systems for a rapidly fragmenting field of users. In short, it's time to get the engineer back into the PC.

Ironically, what is emerging as the No. 1 design imperative for the PC today is something any self-respecting designer should have made job one 16 years ago: quality. "Our key themes these days are simplicity, convenience and, related to all this, the overall concern with quality," said Carl Stork, general manager of PC hardware strategy at Microsoft. "By quality I'm simply talking about the thing working - how the system works, together with all its components. This isn't sexy, but it's one of the biggest challenges of the PC today."

The Need For Speed

It is at once fitting and ironic for a Microsoft executive to espouse this view: fitting because the Redmond, Wash., giant runs a hardware lab that oversees the testing of all manner of PCs and third-party add-ons. Thus, it has a first-hand understanding of just how real the quality problem is. The irony lies in the fact that it's Microsoft, as much as anyone, that keeps the industry tap dancing, defining a pace - and a "road map" - that system makers say leaves them scant resources to devote to other tasks, like figuring out how to differentiate their machines.

To pass muster in the Windows test labs next year and win a coveted "Designed for Windows" logo, manufacturers need to implement the Universal Serial Bus (USB) for low-speed peripherals and the 1394 bus for high-speed ones, push modem and audio functions off the aging ISA bus and meet a host of other baseline criteria detailed in a book co-written by Microsoft and Intel called the PC '98 Design Guide.

But that's just the bare minimum. PC makers know they must also incorporate Intel's Deschutes processors, which will run faster than 300 MHz and support 100-MHz system buses tied to 100-MHz double-data-rate synchronous DRAMs and - perhaps later in the year - Direct Rambus DRAMs. Further, they must also implement a 2X version of the 66-MHz Accelerated Graphics Port (AGP). Most designers believe they must have these features or risk being out of business in the course of a single product cycle.

"My chief concern is dealing with higher and higher processor and bus speeds," said Peter Ashkin, general manager of computer-systems engineering at Toshiba America Information Systems (Irvine, Calif.). "Next year Intel will break through the 300-MHz boundary. And we will have more product cycles each year, so the rate of change is accelerating." In its first year as a desktop-PC maker in the United States, he said, the Toshiba group has already rolled out three generations of its Infinia line.

In the face of such dizzying change, Toshiba has turned to Intel for motherboards and kept its own engineers focused on peripheral designs: a USB module that gives pushbutton access to features such as a TV or radio tuner, and soon-to-ship software add-ons to Intel's LANDesk system-management package, which ships with Toshiba's business PCs.

"Time-to-market is such a pressure in the PC business that some people pursue it over a system's robustness," said Wolfgang Baltes, R&D manager for performance desktops in Hewlett-Packard Co.'s Grenoble, France, operations. "And we fear there will be problems at that level."

A Vicious Cycle

Baltes, like Ashkin, sees his chief design challenge as dealing with Intel's processor speeds. Over the next few years, he said, they "are expected to span the 500-to-700-MHz range" and drive system buses toward 150 to 200 MHz. "We will need controls over the thickness of layers of the motherboard, the layout of fine traces and other parameters that we have never controlled before," he said. "Some of our current tools don't even allow us to handle these issues. The new area where an engineer can contribute is in how to solve integration problems, heat dissipation, controlling design parameters and lowering costs."

Leonard Tsai, a principal engineer in NEC Corp.'s consumer PC division, indicates how heavy a burden these requirements can become. "The coming of a 100-MHz system bus has meant we need to adopt new simulation tools, retrain engineers and face a lot of trial and error in motherboard layout," Tsai said. "We also face new mechanical and thermal issues as well as new support problems, and we have to qualify multiple component vendors with parts for these speeds. All these things put together are a headache."

Indeed, the must-haves add up to more than just a hard day at the office. "My biggest concern is that there are so many things as an engineering manager I have to push forward and pursue, but I don't see the return on our engineering investment or the improvement in systems reliability," said Tsai. "The business is based on how fast you can bring out new products - that's the driving engine, and it's a vicious cycle."

To be sure, quality issues have caused Microsoft to tone down its aggressive rhetoric about the PC as a digital, TV-like appliance for the living room, as articulated in its Simply Interactive PC initiative.

"Our vision about this remains intact," said Stork: "[But] we have to solve the quality, simplicity and convenience problems before we can see the PC in a consumer-electronics environment. The initial [consumer PC] products were well targeted to high-tech enthusiasts, but not to real end users."

But Satish Gupta, vice president of personal workstation products for the IBM PC Co., wonders whether end users are ready to switch from a cost-based to a quality-based buy. "To implement a design well, there are additional costs," he said, "but in this industry people aren't educated enough to want to pay for that."

If you stuff a PC with a full load of PCI-adapter cards, for example, the system is likely to fail because of signal-quality issues. "Luckily, most people don't do that, so the problems don't surface often," said Gupta. "It's easy to put out a crappy product that works most of the time, but it's much harder to put out quality products that will work all of the time."

Some designers fear that the double crunch of collapsing time-to-market windows and ever-speedier, ever-more-complex systems is leading to reliability problems that will soon become obvious to the general public. "We are designing by trial and error, misusing the public trust, and Intel is making money on that," said one engineer who asked not to be named.

The quality issue is endemic to the PC, though it is by no means insoluble, said Lewis Eggebrecht, chief architect of the original IBM PC, who now works for Philips Semiconductors (Sunnyvale, Calif.). "The PC, when used in the home or business, has not tended to be required to be a super-reliable product," he said. "Originally our goals on the IBM PC were to achieve 5,000 hours MTBF [mean time between failure] and have a product life of about five years. A mainframe terminal, by contrast, would have a 20,000-hour MTBF."

In Eggebrecht's view, "these are issues you can solve at the systems level, you don't have to do it at the chip level." For example, he said, Philips this year rolled out a living-room PC with a watchdog processor to automatically restart the system when a lockup seems imminent.

Over the past two years, Intel, like Microsoft, has begun stumping for quality in its own way, speaking out at design conferences in favor of what it calls "the balanced system." The Santa Clara, Calif., company is devoting its considerable energies to a panoply of issues, from defining system buses such as PCI and AGP (and, not incidentally, becoming a leading supplier of PC core logic and motherboards) to coordinating work on new specifications for parts as disparate as audio codecs and power supplies.

At the heart of the thrust is Intel's desire to expand the market and applications for its high-end microprocessors, an ambition that sometimes conflicts with balanced engineering. Such is the case, say several designers, with Intel's efforts to promote its high-end CPUs as decoding engines for a complex digital-videodisk data stream. "That's not the right answer when you can do that in $20 of silicon instead of an $800 CPU," said Dean Klein, chief technology officer at PC maker Micron Electronics Inc. (Nampa, Idaho). "You need to let the user balance his checkbook."

Some fear that the relentless drive for more features and more speed may be the PC's undoing. "The Mips and memory demands for commercial PC users are not going up," said another senior engineer who requested anonymity. "The question is: Does Intel's strategy [of selling at the high end] play out in the future?"

Whatever the fallout in the future, designers increasingly chafe at being tied to it today. "The big difficulty today is that the hardware architecture is primarily controlled by one company," said Eggebrecht of Philips. "Intel has done an excellent job of mastering control of the hardware architecture of the PC to make sure it fits their processor road map well. But as a systems and processor architect, I never like to be constrained by anyone else."

Winds In Washington

The circle of people seconding that view is no longer confined to system engineers faced with implementing the road maps and design guides. Consumer advocate Ralph Nader is trying to raise awareness of what he perceives as Microsoft's "aggressive and multidimensional" competitive efforts, which range from influencing the PC industry to moves into online banking and insurance. Nader was slated to host a conference in Washington on Nov. 13 geared to provide "a more coherent appraisal of where Microsoft is heading." His goal is to create a forum to discuss what he considers the taboo subject of the company's widening circle of influence.

"There's not much free speech in this area," Nader said in an interview. "You don't criticize the company that is part of your sales presentation. There's a severe imbalance of power between the likes of a Microsoft and a Compaq. If the Microsoft software seems lousy or inappropriately bundled, they can't speak out. That sends out a signal that people [had] better keep quiet, and that persuaded us there is a need for a conference to provide a more open appraisal of Microsoft's directions."

The event comes as the winds in Washington seem to be shifting against the Wintel duo. Both Microsoft and Intel are facing intense scrutiny from the U.S. Department of Justice over alleged antitrust activities. "There's no question that Intel and Microsoft are setting a vision and providing some leadership, but that doesn't mean anyone is required to follow our guidelines," counters Microsoft's Stork. "Our intent is to help everyone identify and produce better products and grow their markets, and I think our process is a very open one."

And so the race goes on. Together Intel and Microsoft set a quick pace for PC makers, who continue to see their job as one of integration on a tight schedule. At 16, the PC does not pretend to be a robust, mature system or one driven by breakthroughs from individual companies. The question is whether it will ever be.
techweb.com
________
A controversial article, no doubt.

Ibexx