A Novel Quantum Convolution Neural Network for Image Classification Applications
Received: 30 May 2025 | Revised: 17 July 2025 | Accepted: 23 July 2025 | Online: 6 October 2025
Corresponding author: Venkatachalapathy Madhavanna Venkatappa
Abstract
Image classification plays a vital role in large-scale data analysis, especially in object recognition tasks built on advanced Deep Learning (DL) frameworks. However, the growing complexity and computational demands of modern DL models have introduced challenges of scalability and efficiency. Quantum Computing (QC) has emerged as a promising alternative, capable of addressing these limitations through Quantum Machine Learning (QML). Yet many existing QML models require a large number of qubits, a serious constraint in the current Noisy Intermediate-Scale Quantum (NISQ) era. This work introduces a novel Adaptive Quantum Convolutional Neural Network (AQCNN) designed for efficient and scalable image classification within the limits of NISQ devices. To address the high qubit requirements of existing QML approaches, AQCNN incorporates a resource-efficient quantum convolutional layer that performs localized quantum filtering using parameterized quantum circuits. A classical preprocessing layer encodes input features to reduce the qubit load, followed by quantum embedding and hybrid quantum-classical layers that optimize feature extraction and classification performance. The model applies an adaptive quantum convolution strategy that minimizes quantum gate depth and circuit complexity while preserving the critical spatial hierarchies in image data. Evaluated on benchmark datasets, AQCNN achieved 95.88% accuracy on MNIST and 95.68% on FMNIST, outperforming comparable QML architectures. In addition, the model supports scalable execution through parallel quantum circuit arrays, enabling practical deployment on current quantum hardware. This architecture represents a significant advance in quantum-assisted image classification, balancing performance with qubit and gate efficiency. The integration of adaptive quantum convolution and hybrid processing not only improves classification accuracy but also offers a viable path toward deploying QML solutions under realistic hardware constraints.
Keywords:
image classification, quantum computing, quantum machine learning, adaptive quantum convolutional neural network, FMNIST, MNIST, noisy intermediate-scale quantum
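The abstract describes localized quantum filtering in which classically encoded image patches are processed by shallow parameterized circuits, with each patch handled by an independent circuit suitable for parallel execution. The following is a minimal sketch of that idea, not the authors' implementation: it assumes a PennyLane simulator backend, angle (RY) encoding of 2x2 patches onto four qubits, a generic entangling template as the trainable filter, and a stride of 2; all of these choices are illustrative assumptions rather than the circuit specified in the paper.

```python
# Illustrative sketch of a quantum convolutional filtering step (assumptions noted above).
import pennylane as qml
import numpy as np

N_QUBITS = 4                                  # one qubit per pixel in a 2x2 patch
dev = qml.device("default.qubit", wires=N_QUBITS)

@qml.qnode(dev)
def quantum_filter(patch, weights):
    # Classical-to-quantum embedding: pixel values become single-qubit rotation angles
    qml.AngleEmbedding(patch, wires=range(N_QUBITS), rotation="Y")
    # Shallow parameterized circuit acting as the trainable "quantum kernel"
    qml.BasicEntanglerLayers(weights, wires=range(N_QUBITS))
    # One expectation value per qubit gives a 4-channel response for the patch
    return [qml.expval(qml.PauliZ(w)) for w in range(N_QUBITS)]

def quantum_conv_layer(image, weights, patch=2, stride=2):
    """Slide the quantum filter over the image; each patch runs in its own
    circuit, so patches could be dispatched to parallel circuit arrays."""
    h, w = image.shape
    rows = []
    for i in range(0, h - patch + 1, stride):
        row = []
        for j in range(0, w - patch + 1, stride):
            window = image[i:i + patch, j:j + patch].flatten() * np.pi  # scale to [0, pi]
            row.append(quantum_filter(window, weights))
        rows.append(row)
    return np.array(rows)                     # shape: (h', w', N_QUBITS)

# Example: one 8x8 "image" and a single entangling layer of trainable angles
rng = np.random.default_rng(0)
image = rng.random((8, 8))
weights = rng.random((1, N_QUBITS))           # shape required by BasicEntanglerLayers
features = quantum_conv_layer(image, weights)
print(features.shape)                         # (4, 4, 4)
```

In a hybrid pipeline of the kind the abstract outlines, the resulting feature map would be flattened and passed to a classical classification head, with the circuit weights trained jointly with the classical parameters.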
License
Copyright (c) 2025 Venkatachalapathy Madhavanna Venkatappa, Venkateshappa

This work is licensed under a Creative Commons Attribution 4.0 International License.