Strategies & Market Trends : Technical analysis for shorts & longs


To: E_K_S who wrote (67185)10/25/2025 5:15:56 PM
From: Johnny Canuck
 
I think you are missing the nuance of the paper that was posted.

The value proposition of quantum computing is that it can run certain algorithms faster than traditional AI implementations on GPUs.

The articles from IBM last week, I think, said that they could simulate a quantum-centric algorithm on AMD GPUs. They did not say it could be done as quickly as an actual quantum computer implementation.

Think of it this way: a quantum computer is theoretically projected to be able to solve a 128-bit cryptographic key by 2030 or 2040 using a brute-force algorithm.

perplexity.ai

Running a similar algorithm on a GPU would essentially take longer than the universe has left.

perplexity.ai
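
As a rough back-of-envelope sketch (my numbers, not anything from the paper): assume a GPU cluster can test on the order of 10^10 keys per second. A classical brute-force search of a 128-bit keyspace then runs for something like 10^21 years, while Grover's algorithm would cut the search from about 2^128 candidate checks to roughly 2^64 oracle calls.

```python
# Back-of-envelope arithmetic behind "longer than the universe has left".
# The GPU rate below is an assumed, illustrative figure, not a measured one.

SECONDS_PER_YEAR = 3.156e7
AGE_OF_UNIVERSE_YEARS = 1.38e10

keyspace = 2 ** 128                      # possible 128-bit keys
gpu_rate = 1e10                          # assumed keys tested per second
classical_years = keyspace / (gpu_rate * SECONDS_PER_YEAR)

# Grover's algorithm needs on the order of sqrt(N) oracle calls instead of N.
grover_calls = keyspace ** 0.5           # ~2^64

print(f"Classical brute force: ~{classical_years:.2e} years, "
      f"about {classical_years / AGE_OF_UNIVERSE_YEARS:.1e}x the age of the universe")
print(f"Grover-style search: ~{grover_calls:.2e} oracle calls instead of {keyspace:.2e}")
```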

Quantum computing is about doing steps in parallel across an enormous set of states at the same time, while occupying a fraction of the space and using a fraction of the energy involved, and it is faster because the process happens at the quantum level where delays are vastly smaller.

GPUs, AMD or otherwise, are not able to do any of that.

What the paper was suggesting, I think, without knowing the details, is that they are able to do a massive number of parallel operations that are similar to, but not exactly analogous to, a small or modest quantum computer array.
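
For a sense of why any GPU stand-in stays "small or modest": simulating a quantum algorithm exactly means holding one complex amplitude per basis state, so memory grows as 2^n with the number of qubits. A minimal sketch, assuming 16 bytes per double-precision complex amplitude:

```python
# Memory needed to hold a full quantum statevector on classical hardware.
# Assumes 16 bytes per amplitude (double-precision complex); illustrative only.

BYTES_PER_AMPLITUDE = 16

for n_qubits in (30, 40, 50, 60):
    amplitudes = 2 ** n_qubits
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n_qubits} qubits -> {amplitudes:.2e} amplitudes, ~{gib:,.0f} GiB")
```

Around 45 to 50 qubits the statevector alone outgrows the memory of any GPU cluster, which is why exact simulation can only ever mimic a fairly small quantum machine.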

I would highly doubt it can do anything on the scale or speed of a real quantum computer.

Right now it is all speculation till I see the paper or someone does a critique.

Essentially, I suspect it is a step toward making something that looks like quantum computing but has a fraction of the speed and efficiency, in order to get something working sooner rather than later.

The parallel in the PC world would be that I can run Windows or Linux in a virtual environment, but it is not close to being as fast as running it in a native environment.

When I have time I will try to read the paper.

Science is trying to be sexy right now, so a lot of discoveries are being overstated. I would hold your praise until you actually see it do something useful as opposed to an academic exercise.



To: E_K_S who wrote (67185)10/25/2025 5:23:27 PM
From: Johnny Canuck
 
Why is there a debate about what quantum computing actually is and about the timeframe to achieve actual quantum computing? Provide a chain of thought.

The debate over what quantum computing actually is and when it will become fully realized arises from both conceptual and technological uncertainties surrounding the field.

At its core, quantum computing describes computation that leverages quantum mechanical principles—superposition, entanglement, and interference—to represent and manipulate data in ways that classical computers cannot. However, this definition is theoretical; translating it into a functioning technology has proven far more nuanced than with classical computation. Physicists, computer scientists, and engineers interpret “quantum computing” differently—some mean any device demonstrating quantum advantage on narrow tasks, while others reserve the term for a fault-tolerant, large-scale machine with error-corrected qubits capable of executing general algorithms like Shor’s at scale.

Why the Definition Is Contested
  1. Foundational Ambiguity: Quantum mechanics itself remains probabilistic and non-intuitive. The mechanisms of measurement, decoherence, and entanglement—central to computation—are not universally agreed upon in philosophical or physical terms.

  2. Practical Benchmarks: The community lacks a unified criterion for success. A lab demonstration showing a few-qubit system achieving “quantum supremacy” on a contrived task differs fundamentally from achieving stable logical qubits in a general-purpose quantum processor.

  3. Algorithmic Limitations: Only a handful of known algorithms (e.g., Shor’s, Grover’s, quantum simulation) offer proven theoretical advantages, and even these require error rates far below today’s thresholds.

Why the Timeline Is Disputed
Different groups project vastly different milestones because the path from laboratory prototypes to a cryptographically relevant quantum computer (CRQC) depends on unresolved scaling challenges:

  • Hardware engineering: Qubits decohere in microseconds, requiring extreme error correction and millions of physical qubits to form even a few hundred logical qubits (a rough arithmetic sketch of this overhead follows this list).

  • Theoretical complexity: There is no mathematical proof that quantum systems are universally superior to classical ones—some simulations show classical algorithms partly “catching up”.

  • Experimental pace: While companies such as IBM, Google, and Quantinuum demonstrate steady progress, predictions vary from the early 2030s to beyond 2040 for full-scale capability.
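
To put the hardware bullet above in perspective, here is the rough arithmetic. The 1,000 to 10,000 physical-qubits-per-logical-qubit overhead used below is an often-quoted ballpark for surface-code-style error correction, not a figure from any particular roadmap.

```python
# Rough physical-to-logical qubit arithmetic. The overhead factors are
# illustrative ballpark figures for error-corrected (surface-code style) qubits.

for logical in (100, 1_000):
    for overhead in (1_000, 10_000):
        print(f"{logical:>5} logical qubits x {overhead:>6} physical per logical "
              f"= {logical * overhead:>12,} physical qubits")
```

Even a few hundred logical qubits can land in the hundreds of thousands to millions of physical qubits, which is part of why the timelines stretch so far out.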

In short, the debate persists because quantum computing straddles physics and computation theory. The field is both empirically uncertain—owing to material and control limitations—and philosophically unsettled, since its defining features blur traditional boundaries between physical measurement and logical information processing.



To: E_K_S who wrote (67185)10/25/2025 5:24:06 PM
From: Johnny Canuck
 
Explain major definitions of "quantum computing" and how they differ


The term “quantum computing” has several major definitions, reflecting distinct theoretical, practical, and engineering perspectives. These definitions vary because researchers emphasize different aspects of how quantum principles are applied to computation.

1. Quantum-Mechanical Definition
From a physics standpoint, a quantum computer is a system that uses superposition, entanglement, and interference of quantum states to perform operations that are impossible or exponentially slower on classical machines.

  • Information is represented by qubits, which can exist in multiple states simultaneously (superposition).

  • Entanglement allows correlated qubits to act as a unified system.

  • Computation arises from manipulating these states and amplifying desired outcomes via interference.
    This definition centers on physical principles—quantum coherence and probabilistic measurement—as integral to the computational process.
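
A minimal numerical sketch of those ingredients, using plain NumPy rather than any quantum SDK: a Hadamard gate creates a superposition and a CNOT entangles two qubits into a Bell state whose measurement outcomes are perfectly correlated.

```python
import numpy as np

# Build the Bell state (|00> + |11>)/sqrt(2) from |00> with a Hadamard and a CNOT.
# Amplitude indices correspond to basis states |00>, |01>, |10>, |11>.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
I2 = np.eye(2)                                  # identity on the second qubit
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                  # start in |00>
state = CNOT @ np.kron(H, I2) @ state           # H on the first qubit, then CNOT

probs = np.abs(state) ** 2
for basis, p in zip(("00", "01", "10", "11"), probs):
    print(f"|{basis}>: probability {p:.2f}")
# Only |00> and |11> have nonzero probability: the two outcomes are perfectly correlated.
```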

2. Information-Theoretic Definition
In computational theory, quantum computing is defined as a paradigm of computation over complex probability amplitudes rather than discrete bits. Operations are expressed mathematically through linear algebra (unitary transformations and matrix operations), not Boolean logic. This view abstracts away hardware and focuses on how information evolves through reversible, probabilistic transformations governed by the Schrödinger equation.
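
A tiny illustration of this linear-algebra view: one Hadamard applied to |0> gives a 50/50 superposition, and a second Hadamard makes the amplitudes interfere so the qubit returns to |0> with certainty (H·H is the identity).

```python
import numpy as np

# Computation as unitary linear algebra on amplitudes rather than Boolean logic.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate (unitary)
ket0 = np.array([1.0, 0.0])                     # the |0> state

after_one = H @ ket0        # amplitudes [0.707, 0.707]: equal superposition
after_two = H @ after_one   # amplitudes [1, 0]: destructive interference on |1>

print("P(0), P(1) after one Hadamard: ", np.round(np.abs(after_one) ** 2, 3))
print("P(0), P(1) after two Hadamards:", np.round(np.abs(after_two) ** 2, 3))
```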

3. Engineering and Industry Definition
Technology companies like IBM, Microsoft, and AWS define quantum computing as a multidisciplinary technology integrating computer science, physics, and engineering that seeks to solve problems beyond the reach of classical computation.

  • Quantum processors (QPUs) operate on qubits physically implemented in superconducting circuits, trapped ions, or photonic systems.

  • The field includes hybrid approaches combining classical and quantum subsystems for optimization and simulation tasks.
    This practical definition emphasizes technological readiness and scaling challenges—error correction, decoherence, and qubit fidelity—as the defining features of progress.
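
A minimal sketch of how such a hybrid loop is structured: a classical optimizer tunes a rotation angle while the "quantum" step, simulated here with ordinary trigonometry, stands in for a call to a real QPU.

```python
import numpy as np

# Toy variational (hybrid classical-quantum) loop. The quantum evaluation is
# simulated; on real hardware expectation_z() would be executed on a QPU.

def expectation_z(theta: float) -> float:
    """Prepare Ry(theta)|0> and return the expectation value <Z> = P(0) - P(1)."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return float(state[0] ** 2 - state[1] ** 2)

theta = 0.1                  # classical parameter being optimized
lr = 0.2                     # classical optimizer step size
for _ in range(50):
    # Finite-difference gradient, computed classically from two "quantum" evaluations.
    grad = (expectation_z(theta + 1e-3) - expectation_z(theta - 1e-3)) / 2e-3
    theta -= lr * grad       # classical update toward lower <Z>

print(f"theta ~ {theta:.3f}, <Z> ~ {expectation_z(theta):.3f} (minimum is -1 at theta = pi)")
```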

4. Philosophical and Foundational Definition
Philosophers and foundational theorists describe quantum computing as a conceptual framework redefining computation itself—treating the act of measurement and physical evolution as an inseparable part of information processing.
This definition highlights the epistemic difference between quantum and classical computation: in quantum systems, information is not merely stored or processed, but embodied in probabilistic physical phenomena.

Distinctions Among Definitions

Perspective           | Core Focus               | Key Terms                                  | What Makes It Distinct
Quantum-mechanical    | Physical phenomena       | Superposition, entanglement                | Emphasizes exploiting quantum states directly for computation
Information-theoretic | Mathematical formalism   | Unitary operations, amplitude interference | Treats computation as manipulation of vectors in Hilbert space
Engineering/industry  | Hardware and scalability | Qubits, coherence, error correction        | Focuses on building functional and scalable QPUs
Philosophical         | Nature of information    | Measurement, non-determinism               | Explores how computation is redefined when measurement alters data


In essence, these definitions differ primarily in what they consider the essential feature of quantum computation—whether it is the physics, the mathematics, the technology, or the concept of computation itself.



To: E_K_S who wrote (67185)10/25/2025 9:05:05 PM
From: Johnny Canuck
 
If I read this summary right it is even less significant than what I thought.

They have come up with a real-time algorithm for stabilizing qubits. They are running it as a hardware implementation on an FPGA. An FPGA is simply an array of hardware cells where you essentially configure the connections to quickly prototype more complex chips from smaller fundamental blocks that are pre-packaged in a die. It helps you prototype something a lot less expensively than making an actual chip.
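
Without having read the paper, here is a toy illustration (a classical bit standing in for the qubit, all numbers hypothetical) of the measure-decide-correct structure such a real-time feedback loop has. The point of putting it on an FPGA is that the whole cycle is wired into logic, so each iteration finishes far faster than a software loop on a CPU or GPU could manage.

```python
import random

# Toy measure-decide-correct feedback loop. A classical bit stands in for the
# qubit and the error rate is a made-up number; this only shows the loop shape.

TARGET = 0                    # desired state
ERROR_RATE = 0.05             # hypothetical chance of a flip per cycle
CYCLES = 10_000

bit = TARGET
corrections = 0
for _ in range(CYCLES):
    if random.random() < ERROR_RATE:   # noise flips the state
        bit ^= 1
    measurement = bit                  # read out (idealized, error-free)
    if measurement != TARGET:          # decide
        bit ^= 1                       # apply the corrective flip
        corrections += 1

print(f"Applied {corrections} corrections over {CYCLES:,} cycles "
      f"(~{corrections / CYCLES:.1%} of cycles)")
```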

I think I mentioned before that the next AI advances would be on FPGAs, as they can implement faster versions of GPU algorithms once you have established the algorithm. Running hardware-implemented versions of algorithms in silicon is generally going to be faster than running them on a CPU or GPU.

Message 35309046