Quantum‑Enhanced Image Classification on Fashion‑MNIST with TensorFlow‑Quantum and Cirq
- Chaitanya Singh
- Jul 25
- 3 min read

In this project we build a hybrid quantum‑classical model that learns to tell sandals apart from ankle boots. We rely on TensorFlow‑Quantum to handle the quantum layers and Cirq to define the underlying circuits. The overall flow mirrors what you’d do in classical deep learning, but with trainable quantum gates whose measured expectation values drive the final decision.
We begin by choosing Fashion‑MNIST as our playground. This dataset mirrors classic MNIST but replaces handwritten digits with greyscale clothing items, keeping the same 28×28 format and train/test splits. To keep the task challenging yet manageable on today’s quantum hardware simulators, we focus on just two categories—sandals and ankle boots—and downscale each image from 28×28 pixels to 2×2 pixels. That gives us exactly four numerical features per example.
Next we load the full Fashion‑MNIST dataset via Keras and filter it so only labels 5 (sandal) and 9 (ankle boot) remain. We normalize all pixel intensities from the original 0–255 range down to 0–1, resize each image to 2×2, and split everything into training, validation, and test sets.
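For concreteness, a minimal sketch of that preprocessing step might look like the following; the variable names and the size of the validation split are our own choices for illustration, not fixed by the description above.

```python
import numpy as np
import tensorflow as tf

# Load Fashion-MNIST and keep only sandals (label 5) and ankle boots (label 9).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()

def filter_classes(x, y):
    keep = (y == 5) | (y == 9)
    x, y = x[keep], y[keep]
    return x, (y == 5)  # True for sandal, False for ankle boot

x_train, y_train = filter_classes(x_train, y_train)
x_test, y_test = filter_classes(x_test, y_test)

# Normalize pixel values to [0, 1] and shrink each 28x28 image down to 2x2.
x_train = tf.image.resize(x_train[..., np.newaxis] / 255.0, (2, 2)).numpy()
x_test = tf.image.resize(x_test[..., np.newaxis] / 255.0, (2, 2)).numpy()

# Carve a validation split off the training data (the split size is an assumption).
x_train, x_val = x_train[:-1000], x_train[-1000:]
y_train, y_val = y_train[:-1000], y_train[-1000:]
```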
To encode our four‑pixel images into qubit states, we apply a binary threshold at 0.5: pixels above that threshold become ones, and those at or below become zeros. We map those four bits onto four data qubits arranged in a 2×2 grid. For each bit set to one, we append an X gate to the corresponding qubit, flipping it from the |0⟩ state into |1⟩. Once all of our images are represented as Cirq circuits, we convert the batch into TensorFlow‑Quantum tensors so that Keras can train on them in bulk.
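Continuing from the previous snippet, the encoding step could be sketched as below; the helper name `image_to_circuit` and the exact qubit layout via `cirq.GridQubit.rect(2, 2)` are illustrative assumptions.

```python
import cirq
import tensorflow_quantum as tfq

THRESHOLD = 0.5

def image_to_circuit(image):
    """Encode a 2x2 image as X gates on a 2x2 grid of data qubits."""
    bits = np.squeeze(image) > THRESHOLD      # binarize each pixel at 0.5
    qubits = cirq.GridQubit.rect(2, 2)        # four data qubits in a 2x2 grid
    circuit = cirq.Circuit()
    for bit, qubit in zip(bits.flatten(), qubits):
        if bit:
            circuit.append(cirq.X(qubit))     # flip |0> to |1> for bright pixels
    return circuit

# Convert every image into a circuit, then into a TFQ tensor Keras can consume.
x_train_tfcirc = tfq.convert_to_tensor([image_to_circuit(img) for img in x_train])
x_val_tfcirc = tfq.convert_to_tensor([image_to_circuit(img) for img in x_val])
x_test_tfcirc = tfq.convert_to_tensor([image_to_circuit(img) for img in x_test])
```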
Our parametrized quantum circuit adds a fifth readout qubit. We prepare that readout qubit with an X gate followed by Hadamard, placing it in equal superposition. Then we apply two layers of trainable Ising‑type couplings: first XX gates between each data qubit and the readout qubit, each raised to its own trainable angle, then ZZ gates in the same fashion. A final Hadamard on the readout qubit finishes the circuit. Measuring ⟨Z⟩ on that qubit yields a continuous score between –1 and +1, which we interpret as our model’s output.
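One way to assemble such a circuit in Cirq is sketched below, assuming one `sympy` symbol per (gate type, data qubit) pair; the particular qubit chosen for readout and the symbol naming are illustrative.

```python
import sympy

def build_model_circuit():
    """Two trainable layers (XX then ZZ) coupling each data qubit to a readout qubit."""
    data_qubits = cirq.GridQubit.rect(2, 2)
    readout = cirq.GridQubit(-1, -1)          # fifth qubit, used only for readout
    circuit = cirq.Circuit()

    # Prepare the readout qubit in superposition: X followed by Hadamard.
    circuit.append([cirq.X(readout), cirq.H(readout)])

    # Trainable Ising-type couplings: an XX layer, then a ZZ layer.
    for gate, tag in [(cirq.XX, 'xx'), (cirq.ZZ, 'zz')]:
        for i, qubit in enumerate(data_qubits):
            symbol = sympy.Symbol(f'{tag}{i}')
            circuit.append(gate(qubit, readout) ** symbol)

    # Final Hadamard before measuring <Z> on the readout qubit.
    circuit.append(cirq.H(readout))
    return circuit, cirq.Z(readout)

model_circuit, model_readout = build_model_circuit()
```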
In Keras we wrap this parametrized quantum circuit inside a tfq.layers.PQC layer in a simple Sequential model. We convert our binary labels into +1 for sandals and –1 for ankle boots so that hinge loss applies naturally. We compile with the Adam optimizer at a modest learning rate, train for ten epochs using batch size 64, and track a custom hinge‑accuracy metric that checks whether the sign of the readout score matches the true label.
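Putting the pieces together in Keras could look roughly like this; the learning rate shown is a placeholder, since the text only calls for a "modest" value.

```python
# Hybrid model: the PQC layer simulates the circuit and outputs <Z> of the readout.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(), dtype=tf.string),   # data circuits arrive as serialized strings
    tfq.layers.PQC(model_circuit, model_readout),
])

# Map labels to +1 (sandal) / -1 (ankle boot) so hinge loss applies directly.
y_train_hinge = 2.0 * np.asarray(y_train, dtype=np.float32) - 1.0
y_val_hinge = 2.0 * np.asarray(y_val, dtype=np.float32) - 1.0
y_test_hinge = 2.0 * np.asarray(y_test, dtype=np.float32) - 1.0

def hinge_accuracy(y_true, y_pred):
    # A prediction counts as correct when the sign of the readout matches the label.
    y_true = tf.squeeze(y_true) > 0.0
    y_pred = tf.squeeze(y_pred) > 0.0
    return tf.reduce_mean(tf.cast(y_true == y_pred, tf.float32))

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.02),  # learning rate is an assumption
              loss=tf.keras.losses.Hinge(),
              metrics=[hinge_accuracy])

history = model.fit(x_train_tfcirc, y_train_hinge,
                    batch_size=64, epochs=10,
                    validation_data=(x_val_tfcirc, y_val_hinge))
```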
After training, we evaluate on the test set to obtain our final accuracy. Plotting accuracy and loss curves across epochs shows how quickly our hybrid model converges and how stably it learns.
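A short snippet along these lines produces the final number and the curves; the metric keys follow from the custom `hinge_accuracy` function defined above.

```python
import matplotlib.pyplot as plt

# Final held-out accuracy on the test set.
test_loss, test_acc = model.evaluate(x_test_tfcirc, y_test_hinge)
print(f'Test hinge accuracy: {test_acc:.3f}')

# Accuracy curves across epochs for the training and validation splits.
plt.plot(history.history['hinge_accuracy'], label='train accuracy')
plt.plot(history.history['val_hinge_accuracy'], label='validation accuracy')
plt.xlabel('Epoch')
plt.legend()
plt.show()
```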
By building this end‑to‑end pipeline on just five qubits, we demonstrate how TensorFlow‑Quantum integrates quantum circuits seamlessly into familiar Keras workflows. You can extend this template in many directions: increase image resolution to 3×3 or 4×4 (requiring 9 or 16 qubits), swap the binary threshold for continuous rotation encodings, compare against small classical CNNs or SVMs, or deepen the circuit with controlled‑Z layers. We encourage you to adapt and experiment with your own data and circuit architectures to see how small quantum circuits can enhance real‑world machine learning tasks. Have fun exploring!
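As one example of the suggested extensions, a continuous rotation encoding could replace the binary threshold roughly as follows; scaling each pixel to a rotation angle in [0, π] is just one reasonable choice.

```python
def image_to_rotation_circuit(image):
    """Alternative encoding: rotate each data qubit by an angle proportional to its pixel value."""
    values = np.squeeze(image).flatten()
    qubits = cirq.GridQubit.rect(2, 2)
    circuit = cirq.Circuit()
    for value, qubit in zip(values, qubits):
        circuit.append(cirq.rx(np.pi * value)(qubit))  # pixel in [0, 1] -> angle in [0, pi]
    return circuit
```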