Hi all, time for my Monday EE-Times reading for PC fanatics...
The industry-wide march towards systems on a chip continues, with Intel giving out hints of future integration:
Intel eyes low-cost logic process
The new peripheral process could be a sign Intel is ready to put its considerable manufacturing muscle behind integrated processors that would compete with the likes of National Semiconductor Corp.'s "PC-on-a-chip," which is expected to ship next June. Intel plans to ship a merged core-logic/graphics chip set, called Whitney, next year, and analysts have speculated that beyond the 0.18-micron generation, it would make sense for Intel to integrate its Celeron processors with north-bridge core logic and graphics.
Also an interesting note on one of the disadvantages of horizontally integrated producers: too much capacity concentrated in factories that can suddenly become obsolete. Vertically integrated manufacturers typically only have one factory go obsolete at a time:
"The problem with Intel is they have so many factories they would have to bring up in tandem that [moving to copper] would be too difficult to do, simply because of their size," Glaskowsky said. "It will take them longer to make the transition to copper than it would IBM, which has fewer fabs." techweb.com
More progress in the smaller (and eventually cheaper) display area:
Mini displays look for larger showing
Only about a dozen or so manufacturers are shipping at least engineering samples of miniature displays, but about 30 companies are known to be in active development, and the underground development activity is said to be immense. And no wonder: The market opportunity for the mini readouts is also immense. techweb.com
Funny they didn't mention MVIS, where I used to work... Currently I'm thinking of applying over at Siemens Ultrasound; there are some people over there I have a lot of respect for, and it would be nice to work with them. Plus, the technical challenges of ultrasound are hard to beat: high-speed computations, but done cheaply.
Backgammon, anyone?
Neural learning theory tested
Madison, Wis. - A computer took on a human grand master and won 99 of 100 games at the American Association of Artificial Intelligence meeting held here recently. techweb.com
My question: Did they let the computer roll the dice?
-- Carl