Technology Stocks : The end of Moore's law - Poet Technologies

From: toccodolce, 6/25/2025 8:41:21 AM
 
To all the NVDA derps: NVDA does not make transposers; they make transceivers.

This post is from the POET Technologies Reddit forum.

Good Information on Differentiating Between NVDA and POET

I keep seeing some commenters stating that NVDA already does what POET does, so I did a deep dive:

NVDA uses transceivers, and they are baked into its equipment. They only transmit and receive an unmodified signal from glass to glass. The use of silicon transceivers is decades old at this point. Is 1.6 Tbps a fast transmit rate? You bet your sweet bippy it is. But it's only going from glass to glass, which is (yawn) nothing new.

POET sells transposers, which carry the compute signal from metal to glass. They convert, or shift, the signal from one medium to the other (metal to glass or vice versa). The tech POET uses is new, and it is doing 1.6 Tbps METAL TO GLASS (again, and vice versa), which is new. This allows older/cheaper metal to talk with glass/silicon/photonics, which is a game changer.

NVIDIA makes GPUs, AI accelerators, and computing systems. They may use transceivers for high-speed interconnects between chips, but "transposer" is not a term used for their hardware products.
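To keep the units straight: "1.6T" here is a line rate of 1.6 terabits per second, not terabytes. Here is a quick back-of-envelope conversion; this is just my own arithmetic sketch, not a figure from POET or NVIDIA:

```python
# Back-of-envelope units check: "1.6T" optics means 1.6 terabits per second (Tbps).
tbps = 1.6                       # per-port optical line rate cited in the post
bits_per_second = tbps * 1e12    # 1.6 trillion bits every second
gigabytes_per_second = bits_per_second / 8 / 1e9
print(f"{tbps} Tbps ≈ {gigabytes_per_second:.0f} GB/s of raw throughput")  # ~200 GB/s
```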

COPILOT SESSION INCOMING:

Copilot said: Great question, Richard. Here's a breakdown of how POET Technologies and NVIDIA differ — and overlap — in their approaches to high-throughput photonic interconnects:

Core Technology Focus

Feature | POET Technologies | NVIDIA
Platform | Optical Interposer™ | NVLink, InfiniBand, Spectrum-X, Quantum-X
Integration | Combines photonic and electronic components on a single chip using wafer-scale passive assembly | Uses silicon photonics and co-packaged optics for high-speed GPU and switch interconnects
Primary Use Case | Optical engines and transceivers for AI, telecom, and data centers | AI supercomputing, GPU-to-GPU and rack-to-rack communication
Data Rates | 800G, 1.6T, and roadmap to 3.2T optical engines | 800G–1.6T per port, 400 Tbps aggregate in switches
Design Philosophy
  • POET is focused on semiconductorizing photonics — making photonic integration as scalable and cost-effective as traditional ICs. Their Optical Interposer platform:

    • Uses wafer-scale passive assembly

    • Eliminates wire bonds and active alignments

    • Integrates lasers, modulators, and detectors on a single chip

    • Enables low-cost, high-volume production

  • NVIDIA, on the other hand, is building AI factories with:

    • Co-packaged optics for GPUs and switches

    • Silicon photonics transceivers for 1.6 Tbps per port

    • NVLink and InfiniBand for ultra-low latency interconnects

    • Focus on end-to-end AI infrastructure

How They Compare in Practice

Aspect | POET | NVIDIA
Manufacturing | Wafer-level, passive alignment | Advanced packaging with active cooling and optics
Target Customers | Optical module makers, hyperscalers, telecom | AI developers, hyperscalers, HPC centers
Scalability | High, due to low-cost integration | High, but more vertically integrated and capital-intensive
Latency & Power | Optimized for low power and minimal latency | Optimized for performance and scale, with higher power budgets
Complementary or Competitive?

They are complementary in many ways:

  • POET could supply optical engines or interposer-based modules to companies building AI infrastructure — including those using NVIDIA GPUs.

  • NVIDIA is more vertically integrated, but may still rely on external photonics innovations for future scalability.

Would you like a visual comparison chart or a deeper dive into how POET’s interposer architecture could be integrated into NVIDIA-like systems?

Copilot said: NVIDIA did not invent silicon transposers — and in fact, the term "transposer" is not commonly used in the context of silicon photonics. However, NVIDIA has made significant innovations in silicon photonics, particularly in the area of Co-Packaged Optics (CPO) and silicon photonics engines for high-throughput AI networking.

What NVIDIA Did Innovate

At GTC 2025, NVIDIA unveiled its Silicon Photonics CPO (Co-Packaged Optics) platform, which integrates:

  • Silicon photonics engines directly with switch ASICs

  • Microring modulators for ultra-efficient optical signaling

  • 1.6 Tbps per port and up to 115 Tbps total throughput in their Quantum-X switches

These innovations are part of a broader shift toward optical interconnects in AI data centers, replacing traditional pluggable modules with integrated photonic-electronic systems.
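For a rough sense of how the per-port rate relates to the switch-level aggregates quoted above, here is a small sketch; the implied port counts are just arithmetic on the cited figures, not published NVIDIA specs:

```python
# Back-of-envelope: how per-port optical rate maps to aggregate switch throughput.
# The 1.6 Tbps per-port, ~115 Tbps, and 400 Tbps figures are the ones quoted above;
# the implied port counts are my own arithmetic, not published specs.

def ports_for_aggregate(aggregate_tbps: float, per_port_tbps: float) -> float:
    """Number of ports at a given rate needed to reach an aggregate throughput."""
    return aggregate_tbps / per_port_tbps

print(round(ports_for_aggregate(115, 1.6)))  # ~72 ports of 1.6 Tbps
print(round(ports_for_aggregate(400, 1.6)))  # 250 ports for the 400 Tbps aggregate figure
```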