Quick Start — What Is Changing

Q-Memory began as an advanced non-volatile resistive memory technology — storing data as resistance states in a switching material, with no charge leakage, no refresh cycles, and no standby power.

The key capabilities were:

  • Multi-level cell storage (many distinguishable states per cell)
  • Sub-100 ns write speed
  • Zero standby power
  • CMOS backend compatibility
  • Cryogenic-safe operation

These were validated at the theoretical and simulation level through an independent technical review in early 2026.

Q-Memory has evolved into a silicon photonic quantum computing platform — a chip that uses single particles of light (photons) to perform quantum computation and AI matrix acceleration.

The core ideas carried forward:

  • Non-volatile state → now implemented as optical memory materials that lock mirror positions with zero ongoing power
  • Multi-level encoding → now implemented as phase precision in optical elements
  • CMOS integration → now implemented as co-integrated electronics that drive and read the photonic layer
  • Zero standby power → now achieved through non-volatile optical memory, not resistive switching

The underlying compute substrate changed: from resistance in a material layer to phase in an optical network.
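The multi-level analogy above can be made concrete: in both substrates, the information per element is the base-2 logarithm of the number of distinguishable levels. A minimal sketch (the 256-level phase figure is illustrative only, not a platform specification; the 48-state figure is back-derived from the 5.6 bits/cell quoted later in this page):

```python
import math

def equivalent_bits(num_levels: int) -> float:
    """Information capacity, in bits, of one element that can be
    set to `num_levels` distinguishable states."""
    return math.log2(num_levels)

# A resistive cell with ~48 distinguishable resistance states holds
# log2(48) ≈ 5.6 bits; a phase element settable to 256 distinguishable
# phase values holds log2(256) = 8 bits.
print(equivalent_bits(48))   # ≈ 5.58
print(equivalent_bits(256))  # 8.0
```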

Think of the chip as a programmable grid of tiny beam splitters — microscopic optical mirrors on a chip. Single photons travel through this grid, and by adjusting each mirror’s angle, you can:

  1. Route photons down different paths — equivalent to quantum logic
  2. Entangle photons when two meet at a beam splitter and are detected together, creating quantum correlations
  3. Multiply matrices by encoding a number vector as the brightness of light entering each port, then reading the outputs — performing AI computation at the speed of light

The same hardware does all three. That is the key architectural insight of the evolved platform.
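The mesh-as-matrix picture can be sketched numerically. The code below is illustrative only: the mode count, angles, and the beam-splitter/phase-shifter decomposition are assumptions for the sketch, not the platform's actual layout. It builds a small 4-mode mesh from programmable 2×2 blocks (each one "mirror" in the grid analogy) and shows that the detected output amplitudes are a matrix product applied to the input amplitudes:

```python
import numpy as np

def beam_splitter(theta: float) -> np.ndarray:
    """2x2 transfer matrix of a beam splitter with mixing angle theta."""
    return np.array([[np.cos(theta), 1j * np.sin(theta)],
                     [1j * np.sin(theta), np.cos(theta)]])

def phase_shift(phi: float) -> np.ndarray:
    """Phase shifter on the upper of the two arms."""
    return np.diag([np.exp(1j * phi), 1.0])

def embed(block: np.ndarray, mode: int, n_modes: int) -> np.ndarray:
    """Place a 2x2 block on adjacent modes (mode, mode+1) of an identity."""
    U = np.eye(n_modes, dtype=complex)
    U[mode:mode + 2, mode:mode + 2] = block
    return U

# A 4-mode mesh: each (mode, theta, phi) triple is one programmable element.
n = 4
settings = [(0, 0.3, 0.5), (2, 0.7, 0.1), (1, 1.1, 0.9),
            (0, 0.4, 0.2), (2, 0.8, 0.6)]
U = np.eye(n, dtype=complex)
for mode, theta, phi in settings:
    element = beam_splitter(theta) @ phase_shift(phi) @ beam_splitter(theta)
    U = embed(element, mode, n) @ U

# The mesh is unitary: light (or a single photon) is conserved.
assert np.allclose(U.conj().T @ U, np.eye(n))

# Matrix multiplication "at the speed of light": encode a vector as the
# input amplitudes; the detected outputs are U @ x.
x = np.array([1.0, 0.5, 0.0, 0.25])
y = U @ x
```

The same programming interface (setting each element's angles) serves all three operating modes; only the input encoding and detection change.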

The resistive memory architecture was designed to serve quantum computing as a peripheral — fast parameter storage for quantum computers built from other technologies (superconducting qubits, etc.).

The photonic architecture positions Q-Memory as the quantum processor itself — not a memory supporting someone else’s quantum computer, but the computing substrate.

Several factors drove this:

  1. Room-temperature operation: Photonic quantum computing does not require extreme cooling for the compute layer — only for the highest-performance detectors (a compact cryogenic unit, not a building-scale dilution refrigerator)

  2. CMOS foundry compatibility: Silicon photonics is manufactured in the same facilities as conventional chips, making it far more scalable than specialised quantum hardware

  3. Dual use: The same programmable optical mesh that runs quantum algorithms also performs neural network matrix multiplications — at potentially lower power than GPU approaches

  4. Non-volatile optical memory: The resistive memory insight — zero standby power, multi-level state — applies directly to optical phase elements, and is one of the key differentiators of the photonic platform

Phase 0 — Validate the Photonic Components (2026)

A small proof-of-concept photonic chip is being fabricated through a multi-project wafer run.

What it contains:

  • A 4-mode programmable optical mesh (the smallest useful configuration)
  • Thermal phase-shifting elements for programmability
  • Optical input/output via a fibre array
  • Passive test structures for loss and coupler characterisation

What it must prove:

  • Waveguide loss is within target
  • Phase elements shift the optical phase correctly and stably
  • Beam splitters split light accurately
  • Two photons interfere with high visibility (> 90% Hong-Ou-Mandel visibility)
  • The mesh can be programmed to arbitrary configurations
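The two-photon interference criterion can be computed directly from coincidence counts. A minimal sketch, assuming the standard definition V = (C_away − C_dip) / C_away and made-up example counts:

```python
def hom_visibility(coincidences_away: int, coincidences_at_dip: int) -> float:
    """Hong-Ou-Mandel visibility from coincidence counts:
    V = (C_away - C_dip) / C_away, where C_away is the coincidence
    count with the photons made distinguishable (delay far from zero)
    and C_dip is the count at the dip minimum."""
    return (coincidences_away - coincidences_at_dip) / coincidences_away

# Hypothetical run: 10,000 coincidences away from the dip, 600 at minimum.
v = hom_visibility(10_000, 600)
print(f"{v:.1%}")  # → 94.0%
assert v > 0.90  # the Phase 0 pass criterion quoted above
```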

What it does not include:

  • On-chip photon source (external source used)
  • Cryogenic detectors (room-temperature detectors used)
  • Non-volatile optical memory (added in Phase 1)
  • CMOS electronics on-chip (off-chip FPGA used)

If Phase 0 passes all criteria, Phase 1 proceeds.

Phase 1 — Integrated Photonic System (2027)

A 64-mode photonic chip with on-chip photon pair sources, integrated detection, and co-integrated CMOS electronics.

Key additions over Phase 0:

  • On-chip entangled photon pair generation
  • Non-volatile optical memory elements for zero-power weight storage
  • Electro-optic phase shifting for nanosecond-speed feed-forward
  • CMOS electronics co-integrated for real-time control

First demonstrations enabled:

  • Quantum key distribution
  • Quantum random number generation
  • Photonic AI matrix acceleration
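To give a flavour of the quantum random number generation demonstration: a single photon on a 50/50 beam splitter exits by one of two ports, and which detector clicks is one random bit. The sketch below is a classical Monte Carlo simulation of those statistics only, not the device, and the function name and parameters are hypothetical:

```python
import random

def qrng_bits(n_bits: int, p_reflect: float = 0.5) -> list[int]:
    """Classically simulate QRNG statistics: each bit records which of
    two detectors clicks after a single photon meets a beam splitter
    with reflection probability p_reflect (0.5 for a 50/50 splitter)."""
    return [1 if random.random() < p_reflect else 0 for _ in range(n_bits)]

bits = qrng_bits(1000)
print(sum(bits))  # roughly 500 for a balanced splitter
```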

Phase 2 — Multi-Chip Photonic Platform (2028)

A 256-mode system assembled from multiple co-packaged photonic chiplets.

Enables:

  • Fault-tolerant quantum operations (if loss budget is met)
  • Production-scale AI inference acceleration
  • Molecular simulation demonstrations

What you’ll find in each section:

  • Architecture Overview: Full description of the platform layers, how computation works, and the development roadmap
  • Memory Systems: How the optical network is organised, how non-volatile optical memory works, and how CMOS electronics integrate
  • Applications: Quantum key distribution, AI acceleration, molecular simulation, and other use cases
  • Benchmarks: Phase 0 pass criteria, projected performance metrics, and technology comparisons

What Happened to the Original Q-Memory Specs?

The original architecture’s key figures — 5.6 bits per cell, 5×10⁹ endurance cycles, <100 ns writes — applied to the resistive memory layer. That layer is no longer the primary technology direction.

The analogous figures for the photonic platform are different in nature: optical loss per photon path, phase element precision, HOM visibility, feed-forward latency, and matrix operation speed. These are documented in the Benchmarks section.

The core promise is the same — fast, dense, low-power information processing for quantum and AI — delivered by fundamentally different physics.