Improving the Efficiency of Transfer Learning Through Dynamic Determination of the Number of Layers to Freeze Based on Class Similarity

Author

DOI:

https://doi.org/10.31649/1997-9266-2026-185-2-48-54

Keywords:

transfer learning, dynamic layer freezing, cosine distance, ResNet-50, CIFAR-10, resource reduction

Abstract

This paper presents a method that automatically determines the number of neural network layers to be frozen during transfer learning. The proposed approach uses the cosine distance between vector representations of classes from the source and target datasets to estimate their semantic similarity and, accordingly, to regulate the depth of layer freezing for subsequent fine-tuning. Unlike fixed strategies, the dynamic approach exploits prior knowledge more flexibly and avoids excessive computation. The study employs the ResNet-50 architecture and class subsets from the CIFAR-10 dataset, for which mean feature vectors were computed and the cosine distance between them calculated. Based on these values, the method dynamically determined how many layers of the model should be frozen and kept unchanged, and which layers require retraining. Effectiveness was evaluated by comparing classical training from scratch with training using the proposed transfer learning approach. The results demonstrate that the method improves generalization quality and reduces training time, highlighting the advantages of dynamically determining the number of layers to freeze. The method can be applied in tasks where rapid adaptation, limited data, and efficient resource usage are essential. It effectively combines model accuracy with reduced computational cost, enabling scalability, reuse of pretrained layers, and quick integration into various application domains, making it a valuable tool for future research and practical implementations.
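The core idea described in the abstract can be illustrated with a minimal sketch. The code below is not the paper's implementation: the helper names, the synthetic 2048-dimensional "features" (matching ResNet-50's pooled output size), and in particular the linear mapping from mean cosine distance to freezing depth are all assumptions made for illustration.

```python
import numpy as np

def cosine_distance(u, v):
    """Cosine distance between two feature vectors: 1 - cos(angle)."""
    return 1.0 - float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def layers_to_freeze(source_means, target_means, total_layers):
    """Map class similarity to a freezing depth (illustrative mapping only).

    source_means / target_means: (n_classes x dim) arrays of per-class
    mean feature vectors. The more similar the class representations,
    the more layers are kept frozen.
    """
    dists = [cosine_distance(s, t) for s, t in zip(source_means, target_means)]
    mean_dist = float(np.mean(dists))        # ~0 when classes are very similar
    similarity = max(0.0, 1.0 - mean_dist)   # clamp to [0, 1]
    return int(round(similarity * total_layers))

# Toy example with synthetic vectors standing in for pooled ResNet-50 features.
rng = np.random.default_rng(0)
source = rng.normal(size=(5, 2048))
target = source + 0.1 * rng.normal(size=(5, 2048))  # near-identical classes
print(layers_to_freeze(source, target, total_layers=50))
```

With nearly identical class means the sketch freezes almost all layers; for dissimilar domains the similarity term approaches zero and most layers remain trainable, mirroring the dynamic behaviour the abstract describes.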

Author Biography

D. A. Ivanov, Zhytomyr Polytechnic State University

Post-Graduate Student of the Chair of Software Engineering

References

S. J. Pan and Q. Yang, “A Survey on Transfer Learning,” IEEE Transactions on Knowledge and Data Engineering, vol. 22, no. 10, pp. 1345-1359, 2010. https://doi.org/10.1109/TKDE.2009.191

F. Zhuang, et al., “A Comprehensive Survey on Transfer Learning,” Proceedings of the IEEE, vol. 109, no. 1, pp. 43-76, 2020. https://doi.org/10.1109/JPROC.2020.3004555

A. Huang, “Similarity measures for text document clustering,” Proceedings of the Sixth New Zealand Computer Science Research Student Conference, 2008, pp. 49-56.

J. Yosinski, et al., “How transferable are features in deep neural networks?” Advances in Neural Information Processing Systems, vol. 27, pp. 3320-3328, 2014. https://doi.org/10.48550/arXiv.1411.1792

D. A. Ivanov, “Reducing the training time of models using transfer learning,” Scientific Works of VNTU, no. 3, 2024. https://doi.org/10.31649/2307-5376-2024-3-25-30

S. Kornblith, J. Shlens, and Q. V. Le, “Do Better ImageNet Models Transfer Better?” Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019, pp. 2661-2671. https://doi.org/10.1109/CVPR.2019.00277

M. Raghu, C. Zhang, J. Kleinberg, and S. Bengio, “Transfusion: Understanding Transfer Learning for Medical Imaging,” Advances in Neural Information Processing Systems, vol. 32, 2019. https://arxiv.org/abs/1902.07208

S. Ben-David, J. Blitzer, K. Crammer, and F. Pereira, “A Theory of Learning from Different Domains,” Machine Learning, vol. 79, pp. 151-175, 2010. https://doi.org/10.1007/s10994-009-5152-4

K. He, X. Zhang, S. Ren, and J. Sun, “Deep Residual Learning for Image Recognition,” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 770-778. https://doi.org/10.1109/CVPR.2016.90

A. Krizhevsky, Learning Multiple Layers of Features from Tiny Images, University of Toronto, 2009.

Published

2026-04-08

How to Cite

[1]
D. A. Ivanov, “Improving the Efficiency of Transfer Learning Through Dynamic Determination of the Number of Layers to Freeze Based on Class Similarity”, Вісник ВПІ, no. 2, pp. 48–54, Apr. 2026.

Section

Information technologies and computer sciences
