
Quantum Fashion-MNIST

  • Writer: Chaitanya Singh
  • Jul 22
  • 3 min read

Updated: Jul 25

In this project, we build a simple hybrid quantum-classical classifier to tell apart two similar items: sandals vs ankle boots from the Fashion-MNIST dataset. We use TensorFlow-Quantum (TFQ) and Cirq for the quantum side, while keeping the overall flow approachable and easy to adapt for your own experiments.

  1. Why Fashion-MNIST and why two classes?

    • The full Fashion-MNIST set has ten greyscale clothing categories (t-shirts, trousers, coats, etc.) at 28x28 pixels.

    • Today’s noisy quantum processors can only handle very small inputs, so we restrict the problem to just two classes, sandals and ankle boots, which a single readout qubit can separate.

    • These two are hard to distinguish even for classical methods, so they make for a meaningful binary classification task.

  2. Downscaling and filtering

    • Select classes: keep only images labeled “sandal” or “ankle boot,” giving about 10,200 training samples, 1,800 validation samples, and 2,000 test samples.

    • Normalize pixels: divide original pixel values (0 to 255) by 255 so they range from 0 to 1.

    • Resize to 2×2 pixels: downsample each image to exactly four pixels, then threshold at 0.5 so each pixel becomes either 0 or 1.
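
A minimal sketch of this preprocessing, assuming the standard Keras Fashion-MNIST loader and an illustrative validation split of the last 1,800 training samples (the post’s own split may differ):

```python
import numpy as np
import tensorflow as tf

# Fashion-MNIST labels: 5 = sandal, 9 = ankle boot
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()

def filter_sandal_boot(x, y):
    """Keep only sandal/ankle-boot images; label sandals True, ankle boots False."""
    keep = (y == 5) | (y == 9)
    return x[keep], (y[keep] == 5)

x_train, y_train = filter_sandal_boot(x_train, y_train)
x_test, y_test = filter_sandal_boot(x_test, y_test)

# Normalize to [0, 1] and add a channel axis so tf.image.resize accepts the arrays
x_train = x_train[..., np.newaxis] / 255.0
x_test = x_test[..., np.newaxis] / 255.0

# Downsample each 28x28 image to 2x2, then binarize at 0.5
x_train_bin = np.array(tf.image.resize(x_train, (2, 2)).numpy() > 0.5, dtype=np.float32)
x_test_bin = np.array(tf.image.resize(x_test, (2, 2)).numpy() > 0.5, dtype=np.float32)

# Hold out the last 1,800 training samples for validation (illustrative split)
x_val_bin, y_val = x_train_bin[-1800:], y_train[-1800:]
x_train_bin, y_train = x_train_bin[:-1800], y_train[:-1800]
```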

  3. Encoding images into quantum circuits

    • Map each 2×2 image onto four qubits arranged in a 2×2 grid.

    • For each pixel above 0.5, apply an X gate to the corresponding qubit (flipping it from |0> to |1>). Pixels below or equal to 0.5 leave their qubits in |0>.

    • Each image becomes a tiny Cirq circuit that encodes the data in the qubit initial states.
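
In code, that encoding follows the usual Cirq pattern of one X gate per bright pixel; the function name below is illustrative, not necessarily the one used in the original notebook:

```python
import cirq
import tensorflow_quantum as tfq

def convert_to_circuit(image):
    """Encode a binarized 2x2 image as X gates on a 2x2 grid of qubits."""
    values = image.flatten()
    qubits = cirq.GridQubit.rect(2, 2)
    circuit = cirq.Circuit()
    for value, qubit in zip(values, qubits):
        if value:                        # pixel is 1: flip the qubit from |0> to |1>
            circuit.append(cirq.X(qubit))
    return circuit

x_train_circ = [convert_to_circuit(img) for img in x_train_bin]
x_val_circ = [convert_to_circuit(img) for img in x_val_bin]
x_test_circ = [convert_to_circuit(img) for img in x_test_bin]

# TFQ consumes circuits as serialized string tensors
x_train_tfcirc = tfq.convert_to_tensor(x_train_circ)
x_val_tfcirc = tfq.convert_to_tensor(x_val_circ)
x_test_tfcirc = tfq.convert_to_tensor(x_test_circ)
```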

  4. Parametrized quantum circuit (PQC) model

    • Add one readout qubit, prepared with a Hadamard gate to start in a superposition.

    • Apply two layers of trainable Ising-type interactions (XX and ZZ couplings) across the four data qubits.

    • Finish with another Hadamard on the readout qubit. Measuring this qubit gives an expectation value between –1 and +1 that the model uses for classification.
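
One way to build such a model circuit, sketched after the common TFQ tutorial pattern in which each trainable XX or ZZ gate couples a data qubit to the readout; the exact wiring and symbol names in the original model may differ:

```python
import sympy

def create_quantum_model():
    """Four data qubits, one readout qubit, two trainable Ising-type layers."""
    data_qubits = cirq.GridQubit.rect(2, 2)   # the four pixel qubits
    readout = cirq.GridQubit(-1, -1)          # separate readout qubit
    circuit = cirq.Circuit()

    # Prepare the readout qubit in superposition
    circuit.append(cirq.H(readout))

    # Layer 1: trainable XX couplings; Layer 2: trainable ZZ couplings
    for prefix, gate in [("xx1", cirq.XX), ("zz1", cirq.ZZ)]:
        for i, qubit in enumerate(data_qubits):
            symbol = sympy.Symbol(f"{prefix}-{i}")
            circuit.append(gate(qubit, readout) ** symbol)

    # Final Hadamard before reading out <Z> on the readout qubit
    circuit.append(cirq.H(readout))
    return circuit, cirq.Z(readout)

model_circuit, model_readout = create_quantum_model()
```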

  5. Hybrid training with TFQ plus Keras

    • Build a Keras Sequential model whose trainable layer is TFQ’s PQC.

    • Convert our binary labels into +1 (for sandals) and –1 (for ankle boots) so hinge loss works naturally.

    • Use the Adam optimizer at a modest learning rate and monitor a custom hinge-accuracy metric that checks whether the sign of the readout matches the true label.
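
A sketch of the Keras wiring, using tfq.layers.PQC, the built-in hinge loss, and a sign-matching accuracy metric; the learning rate here is illustrative:

```python
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(), dtype=tf.string),  # serialized circuits go in
    tfq.layers.PQC(model_circuit, model_readout),       # trainable quantum layer, outputs <Z>
])

# Hinge-loss convention: +1 for sandal, -1 for ankle boot
y_train_hinge = 2.0 * np.asarray(y_train, dtype=np.float32) - 1.0
y_val_hinge = 2.0 * np.asarray(y_val, dtype=np.float32) - 1.0
y_test_hinge = 2.0 * np.asarray(y_test, dtype=np.float32) - 1.0

def hinge_accuracy(y_true, y_pred):
    """Fraction of samples where the sign of the readout matches the true label."""
    y_true = tf.squeeze(y_true) > 0.0
    y_pred = tf.squeeze(y_pred) > 0.0
    return tf.reduce_mean(tf.cast(y_true == y_pred, tf.float32))

model.compile(
    loss=tf.keras.losses.Hinge(),
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.02),  # "modest" rate; tune as needed
    metrics=[hinge_accuracy],
)
```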

  6. Training outcomes

    • Over ten epochs with batch size 64, you’ll see the hinge loss decrease and both train and validation accuracy rise—often into the 70–80 percent range on a simulator.

    • Final evaluation on the test set yields the model’s true generalization performance.
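
Training and evaluation then look roughly like this, continuing the variable names from the sketches above:

```python
history = model.fit(
    x_train_tfcirc, y_train_hinge,
    batch_size=64,
    epochs=10,
    validation_data=(x_val_tfcirc, y_val_hinge),
)

# Final generalization check on the held-out test set
loss, acc = model.evaluate(x_test_tfcirc, y_test_hinge)
print(f"test hinge loss: {loss:.3f}, test hinge accuracy: {acc:.3f}")
```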

  7. Why this matters

    • It shows how even a tiny quantum circuit—just five qubits in total—can take part in a real machine learning task.

    • The approach is modular: you can swap in different entangling layers, add more layers, or expand to more qubits and classes as hardware improves.

    • TensorFlow-Quantum lets you embed quantum circuits in familiar Keras workflows so you can prototype without rewriting your training loop.

  8. Next steps

    • Increase image resolution to 3×3 or 4×4 (requiring 9 or 16 qubits) as your simulator or hardware allows.

    • Use continuous rotations rather than binary thresholding to encode pixel intensity more richly (see the sketch after this list).

    • Benchmark against classical models like small CNNs or SVMs on the same two-class problem.

    • Try other datasets: any two-class grayscale or sensor-readout problem you find interesting.
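
For the continuous-rotation idea, one possible angle encoding replaces the thresholded X gates with an rx rotation proportional to pixel intensity (a sketch, not code from the original post):

```python
def convert_to_circuit_angles(image):
    """Angle encoding: rotate each qubit by an amount proportional to pixel intensity."""
    values = image.flatten()            # intensities in [0, 1], no thresholding
    qubits = cirq.GridQubit.rect(2, 2)
    circuit = cirq.Circuit()
    for value, qubit in zip(values, qubits):
        circuit.append(cirq.rx(np.pi * float(value))(qubit))  # rx(0) = identity, rx(pi) ~ X
    return circuit
```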

This project gives you a clear recipe for end-to-end quantum-classical machine learning, from raw images to circuits to trained hybrid models. Have fun experimenting!
