Quantum Neural Network Classifiers
An implementation of quantum neural network (QNN) classifiers
Setup
$ git clone https://github.com/LWKJJONAK/Quantum_Neural_Network_Classifiers
$ cd Quantum_Neural_Network_Classifiers
$ julia --project=amplitude_encode -e "using Pkg; Pkg.instantiate()"
$ julia --project=block_encode -e "using Pkg; Pkg.instantiate()"
These commands build the environments for the code, which is provided in Jupyter notebook format.
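You can also activate one of these environments from an interactive Julia session. This is a minimal sketch, run from the repository root; the last line is optional and assumes IJulia is installed separately:

using Pkg
Pkg.activate("amplitude_encode")     # or "block_encode"
Pkg.instantiate()
using Yao                            # core framework used by the notebooks
# using IJulia; notebook(dir=pwd())  # optionally open the notebooks from Julia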
Note: to use PyPlot, you first need the Python Matplotlib library installed on your machine.
If you have a dark background, change the line and text colors used by YaoPlots.plot:
using YaoPlots
CircuitStyles.textcolor[] = "yellow"
CircuitStyles.linecolor[] = "yellow"
(Here, we acknowledge the valuable input from Dr. Erlebacher)
In addition, for better compatibility, we recommend using Julia version 1.7.
Contents
- Amplitude-Encoding Based QNNs: Basic Building Blocks
- Amplitude-Encoding Based QNNs: An Example Code For The Whole Training Procedure
- Block-Encoding Based QNNs: An Example Code For The Whole Training Procedure
Data
In the paper Quantum Neural Network Classifiers: A Tutorial, we provide five tables of benchmarks. In addition to the average accuracies reported there, the complete data files also record the learning rate, the batch size, the number of iterations, the sizes of the training and test sets, and the accuracy/loss curves during training. Quick links are given below. (The code for this part will not be updated, since these links already provide all 55,000 data files recording the numerical results.)
- Table 1: Different depths & Three digital entangling layers
- Table 2: Different analog layers’ Hamiltonian evolution time & Depths 1, 3, 5
- Table 3: Different depths & Three analog entangling layers
- Table 4: Different scaling factors for data encoding & Three digital entangling layers
- Table 5: Different analog layers’ Hamiltonian evolution time & Three scaling factors
Motivation
Over recent years, quantum neural network models have attracted considerable attention and exploration. One major direction for QNNs is handling classification tasks. Here, we divide QNN classifiers into two categories according to their data encoding strategies, as exhibited in the figure above (a minimal Yao sketch of both strategies follows this list):
- The amplitude encoding strategy suits situations where a quantum random access memory (QRAM) is available to fetch the data, or where the data comes directly from a quantum process (natural quantum data).
- The block encoding strategy suits situations where classical data has to be encoded into the QNN models.
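As a rough illustration, here is a minimal Yao sketch of the two strategies; the feature vectors, qubit count, and gate choices are hypothetical examples, not the circuits from the paper:

using Yao, LinearAlgebra

# Amplitude encoding: a normalized classical vector becomes the state
# amplitudes directly, so 2^n features fit into n qubits.
x = [0.1, 0.4, 0.2, 0.3]                       # hypothetical 4-dimensional sample
reg_amp = ArrayReg(ComplexF64.(x ./ norm(x)))  # 2-qubit register

# Block encoding: classical features enter through the rotation angles of
# parameterized gates, interleaved with entangling layers; no QRAM is needed.
n = 2
features = [0.5, 1.2]                          # hypothetical 2-dimensional sample
encoder = chain(n,
    put(1 => Ry(features[1])),
    put(2 => Ry(features[2])),
    control(1, 2 => X),                        # CNOT entangler
)
reg_blk = apply!(zero_state(n), encoder)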
Built With
- Yao - A framework for Quantum Algorithm Design
Detailed installation instructions and tutorials for Julia and Yao.jl can be found at julialang.org and yaoquantum.org, respectively.
Examples of using Yao in other projects
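For a flavor of what Yao code looks like, here is a minimal sketch of training a toy variational circuit by gradient descent; the circuit, observable, and learning rate are illustrative choices, not the models from this repository:

using Yao

n = 2
circuit = chain(n,
    put(1 => Ry(0.5)),
    put(2 => Ry(0.8)),
    control(1, 2 => X),
)
op = put(n, 1 => Z)   # toy observable standing in for a QNN output

for step in 1:20
    # gradients of the expectation value with respect to the circuit parameters
    _, grads = expect'(op, zero_state(n) => circuit)
    dispatch!(-, circuit, 0.1 .* grads)   # plain gradient-descent update
end

output = real(expect(op, apply!(zero_state(n), circuit)))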
To Cite
@article{Li2022Quantum,
  title = {{Quantum Neural Network Classifiers: A Tutorial}},
  author = {Weikang Li and Zhide Lu and Dong-Ling Deng},
  journal = {SciPost Phys. Lect. Notes},
  volume = {61},
  year = {2022},
  publisher = {SciPost},
  doi = {10.21468/SciPostPhysLectNotes.61},
  url = {https://scipost.org/10.21468/SciPostPhysLectNotes.61},
}
@article{Luo2020Yao,
  title = {Yao.jl: {{Extensible}}, {{Efficient Framework}} for {{Quantum Algorithm Design}}},
  author = {Luo, Xiu-Zhe and Liu, Jin-Guo and Zhang, Pan and Wang, Lei},
  year = {2020},
  journal = {Quantum},
  volume = {4},
  pages = {341},
  doi = {10.22331/q-2020-10-11-341},
  url = {https://quantum-journal.org/papers/q-2020-10-11-341/},
}
We have experimentally implemented the block-encoding based QNNs for the classification of large-scale (256-dimensional) real-life images; see also the paper "Experimental Quantum Adversarial Learning with Programmable Superconducting Qubits":
@article{Ren2022Experimental,
  title = {Experimental Quantum Adversarial Learning with Programmable Superconducting Qubits},
  author = {Ren, Wenhui and Li, Weikang and Xu, Shibo and Wang, Ke and Jiang, Wenjie and Jin, Feitong and Zhu, Xuhao and Chen, Jiachen and Song, Zixuan and Zhang, Pengfei and Dong, Hang and Zhang, Xu and Deng, Jinfeng and Gao, Yu and Zhang, Chuanyu and Wu, Yaozu and Zhang, Bing and Guo, Qiujiang and Li, Hekang and Wang, Zhen and Biamonte, Jacob and Song, Chao and Deng, Dong-Ling and Wang, H.},
  year = {2022},
  month = nov,
  journal = {Nat. Comput. Sci.},
  volume = {2},
  number = {11},
  pages = {711--717},
  publisher = {{Nature Publishing Group}},
  issn = {2662-8457},
  doi = {10.1038/s43588-022-00351-9},
}
License
Released under the MIT License.