Quantum Computing Applications

The Q-Memory photonic platform is designed from the ground up to perform quantum computation using photons — individual particles of light. This page explains how the platform approaches quantum computing, what problems it can address, and what the current state of the technology enables.

Most quantum computers operating today use superconducting circuits or trapped ions as their quantum bits. These systems require extreme conditions: superconducting processors must be cooled to within thousandths of a degree of absolute zero — colder than outer space — using equipment costing millions of dollars.

Photons are naturally quantum mechanical at room temperature. A single photon in a superposition of paths, or entangled with another photon, behaves according to quantum mechanics without any cooling. This is the central appeal of photonic quantum computing.

The trade-off is that photons don’t interact with each other easily. Two-qubit logic operations — needed for universal quantum computation — require either nonlinear optical elements (difficult to scale) or a measurement-based approach where entanglement is generated and used as a resource. The Q-Memory platform uses the latter approach.

Single-Qubit Operations

A photon travelling through the programmable optical mesh undergoes precisely controlled interference. By setting the phase elements appropriately, the mesh can apply any single-qubit rotation to a photon-encoded qubit. This is equivalent to rotating the qubit state on the Bloch sphere.

Single-qubit operations are:

  • Deterministic — they always work when a photon is present
  • High fidelity — limited primarily by phase precision and waveguide loss
  • Reconfigurable — any gate can be programmed by changing the phase settings
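
To make the mesh-as-rotation picture concrete, here is a minimal numpy sketch (illustrative only, not the platform's control interface) of the standard mesh building block: a Mach-Zehnder interferometer made of two 50:50 beam splitters and two phase shifters, whose phase settings select the rotation applied to a path-encoded qubit.

```python
import numpy as np

def phase(alpha):
    # Phase shifter acting on the top waveguide mode only
    return np.diag([np.exp(1j * alpha), 1.0])

# 50:50 beam splitter acting on two waveguide modes
BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

def mzi(theta, phi):
    # Mach-Zehnder unit cell: beam splitter, internal phase theta,
    # beam splitter, external phase phi. Sweeping (theta, phi) gives a
    # two-parameter family of rotations; one further external phase
    # shifter completes arbitrary rotations up to a global phase.
    return phase(phi) @ BS @ phase(theta) @ BS

U = mzi(0.7, 1.3)
print(np.allclose(U.conj().T @ U, np.eye(2)))  # unitarity check -> True
```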

Two-Qubit Operations (Entanglement Generation)

Two-qubit logic between photons is achieved through a process called fusion — two photons meet at a beam splitter and are jointly detected. When the detection pattern matches a specific signature, the two photons have become entangled. This entanglement is then used as a resource for further computation.

The key characteristic of this approach is that fusion gates are probabilistic — they succeed with a certain probability per attempt, typically 25–75% depending on the configuration. The platform compensates by:

  1. Running many parallel copies of each operation
  2. Routing successful outcomes forward and discarding failures
  3. Applying feed-forward corrections that adapt the subsequent computation based on which attempts succeeded

This probabilistic nature means the chip needs more optical modes than the minimum required by a deterministic approach — but it enables room-temperature operation.
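
The arithmetic behind the parallel-copies strategy is simple: with per-attempt success probability p, the chance that at least one of n parallel attempts succeeds is 1 - (1 - p)^n. A small sketch (the 99% target below is an arbitrary illustration, not a platform spec):

```python
def success_prob(p: float, n: int) -> float:
    # Probability that at least one of n independent fusion attempts
    # succeeds, given per-attempt success probability p.
    return 1 - (1 - p) ** n

for p in (0.25, 0.50, 0.75):
    n = 1
    while success_prob(p, n) < 0.99:
        n += 1
    print(f"p = {p:.2f}: {n} parallel attempts for >= 99% success")
```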

Real-Time Feed-Forward

Real-time feed-forward is essential for adaptive quantum logic. The CMOS electronics read each detector output and, within nanoseconds, update the phase elements further along the chip to apply the appropriate correction or continuation.

This closed-loop control is what makes measurement-based quantum computation possible on a single chip, and it is one of the most technically demanding aspects of the system.
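
As a toy model of this closed loop (all names and values below are hypothetical; the real controller is on-chip CMOS acting within nanoseconds, not software), a detector outcome selects the phase programmed into the next mesh stage:

```python
import random

SUCCESS_PATTERN = (1, 0)    # hypothetical detection signature for a successful fusion
CORRECTION_PHASE = 3.14159  # hypothetical pi correction applied on a failure herald

def next_stage_phase(detector_pattern):
    # Success: continue the programmed computation unchanged.
    # Failure herald: apply a corrective phase before the next stage.
    return 0.0 if detector_pattern == SUCCESS_PATTERN else CORRECTION_PHASE

pattern = random.choice([(1, 0), (0, 1)])  # stand-in for a real detector readout
print(f"detectors={pattern} -> next-stage phase={next_stage_phase(pattern)}")
```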

Quantum Key Distribution

Protocol: Two parties establish a shared secret key using entangled photon pairs. Any eavesdropper disturbs the quantum state in a detectable way, making it physically impossible to intercept the key without being caught.

Platform role: The on-chip entangled photon pair source generates pairs at telecom wavelengths. One photon of each pair goes to each party. The measurement outcomes are correlated in a way that cannot be replicated without access to the physical channel.

Compatibility: Telecom wavelength operation makes the platform directly compatible with existing fibre infrastructure and satellite-based quantum communication networks.
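
A highly simplified sketch of the entanglement-based key exchange described above (BBM92-style sifting; the basis encoding and the ideal, noise-free correlations are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng()

def sifted_key(n_pairs):
    # Each party measures one photon of a |Phi+> pair in a random basis
    # (0 = Z, 1 = X); bits are kept only when the bases match, where the
    # ideal state gives perfectly correlated outcomes.
    key = []
    for _ in range(n_pairs):
        basis_a, basis_b = rng.integers(2, size=2)
        if basis_a == basis_b:
            bit = int(rng.integers(2))  # random, but identical for both parties
            key.append((bit, bit))      # (Alice's bit, Bob's bit)
    return key

print(sifted_key(8))  # roughly half the pairs survive sifting
```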

Quantum Random Number Generation

How it works: A photon sent into a 50:50 beam splitter has a 50% chance of exiting each output port. This is true quantum randomness, not pseudo-randomness from a deterministic algorithm: the measurement outcome cannot be predicted, even in principle.
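
In the path picture, the splitter puts the photon in an equal superposition of the two output ports, and the Born rule gives each detector a 50% firing probability. A classical simulation of those statistics (numpy's generator is of course pseudo-random; only the physical device's outcomes are not):

```python
import numpy as np

# Single photon after a 50:50 beam splitter: equal-amplitude superposition
# of the two output ports.
amplitudes = np.array([1, 1j]) / np.sqrt(2)
probs = np.abs(amplitudes) ** 2          # Born rule -> [0.5, 0.5]
bit = np.random.choice([0, 1], p=probs)  # which detector fires
print(probs, bit)
```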

Applications: Cryptographic key generation, secure protocol seeding, scientific simulation, gaming, and any application requiring certified randomness.

Platform advantage: On-chip integration means quantum random number generation can be embedded directly into computing systems at chip scale.

Boson Sampling

Boson Sampling is the task of sampling from the output distribution of a linear optical network fed with single photons, a problem believed to be classically intractable for large enough systems. It was the first class of photonic computation proposed as a route to demonstrating quantum advantage.

The Q-Memory programmable optical mesh is a direct implementation of the hardware required for Boson Sampling and related protocols. Phase 1 targets a demonstration of quantum sampling at a scale where classical simulation becomes challenging.
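
The source of the classical hardness is that each output probability involves the permanent of a submatrix of the network's unitary, a quantity with no known efficient classical algorithm. A small sketch (the mode indices are arbitrary, and the collision-free case is assumed):

```python
import itertools
import numpy as np
from scipy.stats import unitary_group

def permanent(M):
    # Naive permanent over all n! permutations; this exponential cost is
    # exactly what makes large-scale boson sampling hard to simulate.
    n = M.shape[0]
    return sum(np.prod([M[i, s[i]] for i in range(n)])
               for s in itertools.permutations(range(n)))

U = unitary_group.rvs(4)         # random 4-mode interferometer
U_S = U[np.ix_([0, 1], [2, 3])]  # photons in modes 0,1 detected in modes 2,3
print(abs(permanent(U_S)) ** 2)  # probability of that detection pattern
```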

Molecular Simulation

The problem: Classical computers struggle to simulate the quantum behaviour of molecules accurately, because the number of classical states required to represent a quantum system grows exponentially with its size.

The photonic approach: The Variational Quantum Eigensolver (VQE) is a hybrid algorithm in which a quantum processor prepares and measures a parameterised quantum state, and a classical optimiser adjusts the parameters to minimise the measured energy. The minimum approximates the molecule's ground-state energy.

Platform role: The programmable optical mesh implements the quantum circuit component of VQE. The CMOS electronics handle the classical optimisation loop. Phase 1–2 targets small molecular simulations relevant to drug discovery and materials design.
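
A toy version of that hybrid loop (entirely illustrative: a 2x2 matrix stands in for a molecular Hamiltonian, and a single rotation angle stands in for the photonic mesh circuit):

```python
import numpy as np
from scipy.optimize import minimize

# Stand-in "molecular" Hamiltonian; its lowest eigenvalue (about -1.118)
# is the ground-state energy the loop should find.
H = np.array([[1.0, 0.5], [0.5, -1.0]])

def energy(params):
    # Prepare the parameterised state Ry(theta)|0> and measure <psi|H|psi>.
    theta = params[0]
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi

# Classical optimiser closes the loop over the circuit parameter.
result = minimize(energy, x0=[0.1], method="COBYLA")
print(result.x, result.fun)  # optimal angle and estimated ground-state energy
```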

Current Status and Roadmap

Demonstrated in Phase 0:

  • Proof-of-concept photonic chip validates the optical components
  • Optical loss, beam splitter accuracy, phase drift, and Hong-Ou-Mandel visibility are characterised
  • Two-photon interference is demonstrated with external photon sources

Targeted for Phase 1:

  • On-chip photon pair generation
  • Quantum key distribution demonstrations
  • Quantum random number generation at production rates
  • Small quantum circuit demonstrations with real-time feed-forward
Current Limitations

| Limitation | Current state | Path to resolution |
| --- | --- | --- |
| Gate success probability | 25–75% per attempt | Parallel resource states; larger mode count |
| Photon source efficiency | Fraction of pulses produce usable photons | Source multiplexing with fast optical switching |
| Loss per photon | Just above fault-tolerance threshold | Optimised waveguide design; improved couplers |
| Detector cooling | Requires compact cryogenic unit | Compact closed-cycle cooler (~mini-fridge) |
| Scale | Phase 0: 4 modes | Phase 1: ~64 modes; Phase 2: ~256 modes |

The loss budget constraint is the most fundamental: fault-tolerant photonic quantum computing requires keeping total photon loss below a tight threshold per logical operation. Current demonstrated systems are close to this threshold; the gap is small but real and is the primary engineering target for Phase 1–2.
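
A sketch of the underlying loss arithmetic (every number below is an assumed placeholder, not a platform spec): per-component losses in dB add, and the photon's end-to-end survival probability falls exponentially with the total.

```python
# Hypothetical per-component losses in dB; dB values add, survival
# probabilities multiply.
components_db = {
    "source-to-chip coupling": 0.3,
    "mesh propagation": 0.5,
    "chip-to-detector coupling": 0.2,
    "detector inefficiency": 0.4,
}
total_db = sum(components_db.values())
survival = 10 ** (-total_db / 10)  # fraction of photons surviving end to end
print(f"total loss {total_db:.1f} dB -> survival probability {survival:.1%}")
```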