4th African International Conference on Industrial Engineering and Operations Management

Improving Knowledge Distillation Using Data Augmentation

DONG HO SHIN
Track: High School STEM Poster Competition
Abstract

Knowledge Distillation (KD) is a form of transfer learning in which a large, pre-trained model (the teacher) transfers its knowledge to a smaller, simpler model (the student), making the student more efficient to train. KD can effectively transfer a teacher's knowledge to a student in image classification tasks; however, its results vary with the specific method and approach used, so a standardized improvement method is needed. Data augmentation techniques are powerful performance-improvement methods for image classification. In this paper, we propose a standardized approach to improving knowledge distillation that applies various data augmentation techniques across different values of α. The results show that data augmentation provides a significant performance improvement in the distilled model: as the strength of the augmentation increases, KD performance increases proportionally, while the α value appears to have no significant impact on the improvement.
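As a concrete illustration, below is a minimal sketch of KD training with augmented inputs, assuming the common Hinton-style loss in which α weights the hard-label term against the teacher's soft targets. The augmentation pipeline, temperature T, and α usage here are illustrative assumptions, not the paper's exact implementation.

    import torch
    import torch.nn.functional as F
    from torchvision import transforms

    # Hypothetical augmentation pipeline; the paper varies augmentation
    # "strength", which we model here with RandAugment's magnitude.
    augment = transforms.Compose([
        transforms.RandomHorizontalFlip(),
        transforms.RandAugment(num_ops=2, magnitude=9),  # magnitude ~ strength
        transforms.ToTensor(),
    ])

    def kd_loss(student_logits, teacher_logits, labels, alpha=0.5, T=4.0):
        """Hinton-style KD loss (assumed formulation): alpha weights the
        hard-label cross-entropy against the soft-target KL term."""
        hard = F.cross_entropy(student_logits, labels)
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)  # rescale gradients after temperature softening
        return alpha * hard + (1.0 - alpha) * soft

In this setup, sweeping alpha while holding the augmentation fixed (and vice versa) is one straightforward way to separate the two effects the abstract reports.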

Keywords

Deep Learning, Computer Vision, Image Classification, Knowledge Distillation, Data Augmentation

Published in: 4th African International Conference on Industrial Engineering and Operations Management, Lusaka, Zambia

Publisher: IEOM Society International
Date of Conference: April 4-6, 2023

ISSN/E-ISSN: 2169-8767