Technology Stocks : Bookham Technology


To: tech101 who wrote (315), 1/7/2004 11:38:52 PM
From: Tom Swift
I'm a little surprised that SiGe can be used in CMOS devices. I would have thought that the oxide quality would be too low due to the Ge. However, I did a Google on it and it appears that there is a lot of ongoing work in that area.

I agree that a CMOS line can handle SiGe without many changes - but are the devices HBTs?

BTW, years ago I saw a presentation by the guy who invented the CMOS imager (Eric Fossum?) at an OIDA meeting -- hot stuff, I'm glad the technology made it. The big deal back then was that CCDs were made on old 3" lines, while the CMOS device could be made on 6" and 8" lines with the Orbit process.



To: tech101 who wrote (315), 1/8/2004 1:22:49 PM
From: ownstock
OK, so a couple of things:

1) CMOS and CCD are different designs of imagers
2) Both CMOS and CCD imagers have been around since the 1960s. Canon was the first to commercialize them for consumers, but definitely not the pioneer. IBM, Fairchild, and Nortel all had programs. The reason CMOS imagers were not used for a long time is that old design rules made them lousy compared to CCDs. Today they are reasonable, if digitally processed (corrected) by a separate chip.
3) There is no fundamental reason from a process viewpoint why CCD and CMOS cannot be on the same chip. I expect the best image chips to use aspects of both.
4) The CMOS-only imager will never achieve the same performance (mainly areal sensitivity) as the CCD imager...but this may be irrelevant for consumer applications (they don't give a hoot).
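The "digitally processed (corrected)" step in point 2 usually means per-pixel fixed-pattern-noise correction: subtract each pixel's dark offset, then divide out its relative gain. A minimal sketch in Python/NumPy (the array names and toy values are illustrative, not from any real sensor):

```python
import numpy as np

def correct_frame(raw, dark, flat):
    """Classic two-point fixed-pattern-noise correction:
    subtract each pixel's dark offset, then divide by its
    relative gain measured from a uniformly lit flat field."""
    offset = raw.astype(np.float64) - dark   # remove per-pixel dark offset
    gain = flat - dark                       # per-pixel responsivity
    gain = gain / gain.mean()                # normalize to unity mean gain
    return offset / gain

# Toy 2x2 "sensor" with unequal offsets and gains.
dark = np.array([[10.0, 12.0], [9.0, 11.0]])
flat = np.array([[110.0, 92.0], [129.0, 111.0]])  # uniform light, uneven response
raw = dark + (flat - dark) * 0.5                  # uniform half-brightness scene
print(correct_frame(raw, dark, flat))             # -> uniform 50.0 everywhere
```

The same math works whether the correction chip sits next to a CMOS or a CCD array, which is part of why process choice and correction are separable questions.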

Today I carry a small Pentax CCD digital camera for quick pictures (last message). I also have had a Fujifilm 18Mpixel professional digital camera for almost three years, and a Nikon 35mm and a Mamiya medium-format camera for many years. Based on that, I can say digital will have a hard, expensive time replacing anything above 35mm format for professional work. By that I mean final enlargements, not proofs.

I think high-quality professional work will be shot first-to-film for a long time. After processing, the film can be digitally scanned at very high resolution, quickly and cheaply. The camera is cheaper, lighter, much lower power, and honestly much, much better in terms of resolution and color than direct-to-digital.
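To put a rough number on the resolution comparison: if one assumes 35mm film resolves on the order of 50 to 80 line pairs per mm (a common ballpark, and an assumption here, not a measured figure), the Nyquist-equivalent pixel count over a 36x24mm frame works out as:

```python
def film_megapixels(lp_per_mm, width_mm=36.0, height_mm=24.0):
    """Nyquist-equivalent pixel count for film of a given resolving power:
    each line pair needs at least two pixels to sample."""
    px_per_mm = 2 * lp_per_mm
    return (px_per_mm * width_mm) * (px_per_mm * height_mm) / 1e6

for lp in (50, 80):
    print(f"{lp} lp/mm -> {film_megapixels(lp):.1f} MP equivalent")
# 50 lp/mm -> 8.6 MP equivalent
# 80 lp/mm -> 22.1 MP equivalent
```

So at the high end of that assumed range, 35mm film already exceeds an 18Mpixel sensor, and medium format scales the frame area up several times over.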

But having said that I am not investing in film or film processing plays.

-Own