Capsule Networks have emerged as a robust alternative to traditional
convolutional neural networks, offering superior performance in
recognizing spatial hierarchies and capturing intricate relationships in
image data. However, their computational intensity and memory
demands pose significant challenges, particularly in
resource-constrained environments. To address this limitation, this
study integrates knowledge distillation and transfer learning to
improve the computational efficiency of Capsule Networks without
compromising accuracy. Knowledge distillation compresses the model
by transferring learned knowledge from a high-capacity teacher
network to a lightweight student network, effectively reducing
computational overhead.
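To make the distillation objective concrete, a minimal sketch is given below. The abstract does not state the paper's loss or hyperparameters, so this shows the standard soft-target formulation of Hinton et al. (2015) in PyTorch, with illustrative (assumed) values for the temperature and loss weighting.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.7):
    """Soft-target knowledge distillation loss (Hinton et al., 2015).

    Blends the KL divergence between temperature-softened teacher and
    student distributions with the usual cross-entropy on hard labels.
    The temperature and alpha values are illustrative, not the paper's.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2  # rescale so soft-target gradients keep their magnitude
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In training, the teacher would run in eval mode under torch.no_grad() so that only the lightweight student accumulates gradients.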
Transfer learning further minimizes resource demands by leveraging
pre-trained models, thus expediting the training process and
optimizing performance.
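The transfer learning side can be sketched in the same spirit. The abstract does not name the pre-trained source model, so an ImageNet-pretrained torchvision backbone stands in below purely to illustrate the freeze-and-fine-tune pattern; the hypothetical 10-class head matches MNIST and CIFAR-10.

```python
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical stand-in backbone: the paper's actual pre-trained
# capsule layers are not specified, so a ResNet-18 shows the pattern.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pre-trained feature extractor so only the new head trains,
# cutting per-step compute and the memory needed for gradients.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the classification head for a 10-class task (MNIST / CIFAR-10).
backbone.fc = nn.Linear(backbone.fc.in_features, 10)

# Optimize only the parameters that still require gradients.
optimizer = torch.optim.Adam(
    (p for p in backbone.parameters() if p.requires_grad), lr=1e-3
)
```

Selectively unfreezing later layers once the new head has converged is a common refinement of this pattern.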
Experiments were conducted on the MNIST and CIFAR-10 datasets,
with the optimized Capsule Network achieving classification
accuracies of 99.1% and 93.7%, respectively, while reducing
computational requirements by 45%. The proposed approach also
improved training time and memory efficiency, reducing model
parameters by 40% compared to baseline Capsule Network
implementations. These results underline
the potential of combining knowledge distillation and transfer learning
to make advanced architectures like Capsule Networks accessible for
real-time and edge applications. Future directions include extending
this framework to more complex datasets and applications such as
object detection and medical imaging.
Vince Paul1, A. Anbu Megelin Star2, A. Anto Spiritus Kingsly3, S.J. Jereesha Mary4
1Christ College of Engineering, India; 2DMI Engineering College, India; 3Oasys Institute of Technology, India; 4Annai Velankanni College of Engineering, India
Keywords: Capsule Networks, Knowledge Distillation, Transfer Learning, Computational Efficiency, Model Compression
Published By: ICTACT
Published In: ICTACT Journal on Soft Computing (Volume 15, Issue 3, Pages 3578-3588)
Date of Publication: January 2025