Abstract
Quantum computing is rapidly reshaping image recognition by offering computational capabilities beyond those of classical hardware. Classical deep learning architectures such as Convolutional Neural Networks (CNNs) struggle with scalability and efficiency on high-dimensional quantum image data: even with dilation, they are computationally intensive on large-scale or entangled quantum image datasets, fail to exploit quantum parallelism, and often suffer from vanishing gradients and redundant parameterization. An optimized hybrid quantum-classical model is therefore needed, one that combines the enlarged receptive field of dilated convolutions with the processing power of quantum circuits. We propose the Quantum-Enhanced Dilated Convolutional Network (Q-DCN), in which dilated convolutional layers are hybridized with quantum variational circuits (QVCs). The model comprises a dilated feature extractor that feeds into a parameterized quantum layer for an entanglement-preserving transformation; the quantum circuit acts as both a regularizer and a nonlinear encoder, reducing model complexity and enhancing feature discrimination. Q-DCN was evaluated on a simulated quantum image dataset and compared against five existing methods: a classical CNN, a dilated CNN, a Quantum CNN (QCNN), a Variational Quantum Classifier (VQC), and a Quantum Kernel Estimator (QKE). Q-DCN achieved superior accuracy (94.3%), reduced inference time by 23%, and used 35% fewer parameters. These results indicate that Q-DCN offers a scalable, efficient, and accurate solution for quantum image recognition.
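As a rough illustration of the hybrid architecture described above, the sketch below combines a dilated convolutional feature extractor with a parameterized quantum layer. It assumes PennyLane with the PyTorch interface; the qubit count, layer sizes, and the AngleEmbedding / StronglyEntanglingLayers templates are illustrative choices, not the exact Q-DCN configuration reported in this work.

import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def quantum_circuit(inputs, weights):
    # Angle-encode classical features into qubit rotations, then apply an
    # entangling variational block with trainable weights.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (2, n_qubits, 3)}  # two variational layers

class QDCNSketch(nn.Module):
    # Dilated convolutional feature extractor feeding a quantum variational layer.
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, dilation=2, padding=2),   # dilated conv, rate 2
            nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, dilation=4, padding=4),  # dilated conv, rate 4
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.to_qubits = nn.Linear(16, n_qubits)  # compress to one value per qubit
        self.qlayer = qml.qnn.TorchLayer(quantum_circuit, weight_shapes)
        self.classifier = nn.Linear(n_qubits, num_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        x = torch.tanh(self.to_qubits(x))  # bound values before angle encoding
        x = self.qlayer(x)                 # quantum nonlinear encoding
        return self.classifier(x)

# Example: a batch of two 28x28 grayscale images
model = QDCNSketch()
print(model(torch.randn(2, 1, 28, 28)).shape)  # torch.Size([2, 10])

In this sketch the tanh bounds the compressed features before angle encoding so the rotation angles stay in a well-conditioned range; this is the point at which the dilated feature maps hand off to the entanglement-preserving quantum transformation.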
Authors
B. Guruprakash [1], Vaddi Karthik Reddy [2]
[1] Sethu Institute of Technology, India; [2] Jawaharlal Nehru Technological University, India
Keywords
Quantum Image Recognition, Dilated CNN, Variational Quantum Circuits, Hybrid Neural Networks, Quantum Computing