SwAV transfer learning and knowledge distillation on chest X-ray classification
Abstract
COVID-19 has severely impacted human life. In response, researchers worldwide have focused on developing advanced computer systems capable of analyzing chest X-rays. Numerous studies have identified models with strong performance; however, these models are large and require significant computational resources. In this study, we propose combining Swapping Assignments between Views (SwAV) Transfer Learning (TL) and Knowledge Distillation (KD). We train a ResNet-50 teacher model with SwAV TL and transfer its knowledge to the smaller ResNet-18 and ResNet-34 models, improving their performance. For ResNet-34, accuracy improved by 5.31%, recall by 4.05%, precision by 5.62%, F1-score by 4.93%, and AUROC by 0.85%. For ResNet-18, accuracy improved by 1.52%, recall by 1.03%, precision by 1.87%, F1-score by 1.46%, and AUROC by 0.29%. These results demonstrate that combining SwAV TL and KD can effectively transfer knowledge from a larger model to smaller ones, yielding improved performance in the smaller models.
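As a rough illustration of the pipeline the abstract describes, the sketch below pairs a SwAV-pretrained ResNet-50 teacher with a ResNet-34 student using Hinton-style knowledge distillation. This is a minimal sketch under stated assumptions: the torch.hub checkpoint, the three-class label setup, and the temperature T and mixing weight alpha are illustrative choices, not values taken from the paper.

```python
# Minimal sketch: SwAV-pretrained teacher distilled into a smaller ResNet student.
# Assumptions (not from the paper): 3 output classes, T=4.0, alpha=0.7, and the
# public SwAV checkpoint from torch.hub rather than the authors' own weights.
import torch
import torch.nn.functional as F
from torchvision import models

# Teacher: ResNet-50 backbone with SwAV self-supervised weights, loaded from the
# official SwAV repository via torch.hub; the paper's exact checkpoint may differ.
teacher = torch.hub.load('facebookresearch/swav:main', 'resnet50')
teacher.fc = torch.nn.Linear(teacher.fc.in_features, 3)  # hypothetical 3-class head
teacher.eval()  # assumed already fine-tuned on chest X-rays

# Student: a smaller ResNet trained from the teacher's soft targets.
student = models.resnet34(weights=None)
student.fc = torch.nn.Linear(student.fc.in_features, 3)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style distillation: KL on temperature-softened logits + CE on labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction='batchmean',
    ) * (T * T)  # scale by T^2 so soft-target gradients match the hard-label term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# One training step on a dummy batch (x: chest X-ray tensors, y: class labels).
x, y = torch.randn(8, 3, 224, 224), torch.randint(0, 3, (8,))
with torch.no_grad():
    t_logits = teacher(x)  # teacher provides soft targets only; no gradient
loss = kd_loss(student(x), t_logits, y)
loss.backward()
```

In this setup the student sees both the ground-truth labels and the teacher's softened output distribution; the temperature spreads probability mass over non-target classes so the smaller network can learn the teacher's inter-class similarities.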