*(Figure: comparison of vanilla KD vs. KD with our logit standardization.)*
Knowledge distillation transfers soft labels from a teacher to a student through a shared temperature-scaled softmax. However, sharing the temperature between teacher and student implicitly forces an exact match between their logits in range and variance. This side effect limits the student's performance, given the capacity gap between the two models and the finding that the teacher's innate logit relations are sufficient for the student to learn. To address this issue, we propose setting the temperature to the weighted standard deviation of the logits and applying a plug-and-play Z-score logit standardization before the softmax and the Kullback-Leibler divergence. This pre-process lets the student focus on the essential logit relations of the teacher rather than on matching magnitudes, and it improves existing logit-based distillation methods. We also show a typical case where the conventional shared-temperature setting cannot reliably yield an authentic distillation evaluation; our Z-score pre-process successfully alleviates this problem. We extensively evaluate our method with various teacher and student models on CIFAR-100 and ImageNet, demonstrating its significant superiority. Vanilla knowledge distillation equipped with our pre-process achieves favorable performance against state-of-the-art methods, and other distillation variants obtain considerable gains with its assistance.
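The idea above can be sketched in a few lines of PyTorch. This is a minimal illustration of Z-score logit standardization inside a plain KD loss, not the repository's actual implementation; the function names, the `eps` constant, and the uniform (unweighted) standard deviation are our simplifying assumptions:

```python
import torch
import torch.nn.functional as F

def standardize(logits, base_temp=2.0, eps=1e-7):
    """Z-score standardize logits per sample, then scale by a base temperature.

    The per-sample standard deviation acts as an adaptive temperature, so the
    student only needs to match the teacher's logit *relations*, not magnitudes.
    """
    mean = logits.mean(dim=-1, keepdim=True)
    std = logits.std(dim=-1, keepdim=True)
    return (logits - mean) / (std + eps) / base_temp

def kd_loss(student_logits, teacher_logits, base_temp=2.0, kd_weight=9.0):
    """Vanilla KD loss (KL divergence) applied to standardized logits."""
    log_p_s = F.log_softmax(standardize(student_logits, base_temp), dim=-1)
    p_t = F.softmax(standardize(teacher_logits, base_temp), dim=-1)
    # temperature^2 rescaling follows common KD practice
    return kd_weight * F.kl_div(log_p_s, p_t, reduction="batchmean") * base_temp ** 2
```

Because both sets of logits are standardized, a teacher whose logits are several times larger than the student's in magnitude incurs no penalty as long as the relative ordering and spacing of its logits are preserved.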
The code is built on mdistiller, Multi-Level-Logit-Distillation, CTKD and tiny-transformers.
This repository is still a work in progress.
Environments:

- Python 3.8
- PyTorch 1.7.0

Install the package:

```shell
sudo pip3 install -r requirements.txt
sudo python setup.py develop
```
- Download `cifar_teachers.tar` at https://github.com/megvii-research/mdistiller/releases/tag/checkpoints and untar it to `./download_ckpts` via `tar xvf cifar_teachers.tar`.
- For KD:

```shell
# KD
python tools/train.py --cfg configs/cifar100/KD/res32x4_res8x4.yaml
# KD+Ours
python tools/train.py --cfg configs/cifar100/KD/res32x4_res8x4.yaml --logit-stand --base-temp 2 --kd-weight 9
```

- For DKD:

```shell
# DKD
python tools/train.py --cfg configs/cifar100/DKD/res32x4_res8x4.yaml
# DKD+Ours
python tools/train.py --cfg configs/cifar100/DKD/res32x4_res8x4.yaml --logit-stand --base-temp 2 --kd-weight 9
```

- For MLKD:

```shell
# MLKD
python tools/train.py --cfg configs/cifar100/MLKD/res32x4_res8x4.yaml
# MLKD+Ours
python tools/train.py --cfg configs/cifar100/MLKD/res32x4_res8x4.yaml --logit-stand --base-temp 2 --kd-weight 9
```
- For CTKD, please refer to CTKD/README.md.
- Teacher and student have identical structures (CIFAR-100 top-1 accuracy, %):

| Teacher | ResNet32x4 | VGG13 | WRN-40-2 | WRN-40-2 | ResNet56 | ResNet110 | ResNet110 |
|---|---|---|---|---|---|---|---|
| **Student** | ResNet8x4 | VGG8 | WRN-40-1 | WRN-16-2 | ResNet20 | ResNet32 | ResNet20 |
| KD | 73.33 | 72.98 | 73.54 | 74.92 | 70.66 | 73.08 | 70.67 |
| KD+Ours | 76.62 | 74.36 | 74.37 | 76.11 | 71.43 | 74.17 | 71.48 |
| CTKD | 73.39 | 73.52 | 73.93 | 75.45 | 71.19 | 73.52 | 70.99 |
| CTKD+Ours | 76.67 | 74.47 | 74.58 | 76.08 | 71.34 | 74.01 | 71.39 |
| DKD | 76.32 | 74.68 | 74.81 | 76.24 | 71.97 | 74.11 | 71.06 |
| DKD+Ours | 77.01 | 74.81 | 74.89 | 76.39 | 72.32 | 74.29 | 71.85 |
| MLKD | 77.08 | 75.18 | 75.35 | 76.63 | 72.19 | 74.11 | 71.89 |
| MLKD+Ours | 78.28 | 75.22 | 75.56 | 76.95 | 72.33 | 74.32 | 72.27 |
- Teacher and student have distinct structures (CIFAR-100 top-1 accuracy, %):

| Teacher | ResNet32x4 | ResNet32x4 | ResNet32x4 | WRN-40-2 | WRN-40-2 | VGG13 | ResNet50 |
|---|---|---|---|---|---|---|---|
| **Student** | SHN-V2 | WRN-16-2 | WRN-40-2 | ResNet8x4 | MN-V2 | MN-V2 | MN-V2 |
| KD | 74.45 | 74.90 | 77.70 | 73.97 | 68.36 | 67.37 | 67.35 |
| KD+Ours | 75.56 | 75.26 | 77.92 | 77.11 | 69.23 | 68.61 | 69.02 |
| CTKD | 75.37 | 74.57 | 77.66 | 74.61 | 68.34 | 68.50 | 68.67 |
| CTKD+Ours | 76.18 | 75.16 | 77.99 | 77.03 | 69.53 | 68.98 | 69.36 |
| DKD | 77.07 | 75.70 | 78.46 | 75.56 | 69.28 | 69.71 | 70.35 |
| DKD+Ours | 77.37 | 76.19 | 78.95 | 76.75 | 70.01 | 69.98 | 70.45 |
| MLKD | 78.44 | 76.52 | 79.26 | 77.33 | 70.78 | 70.57 | 71.04 |
| MLKD+Ours | 78.76 | 77.53 | 79.66 | 77.68 | 71.61 | 70.94 | 71.19 |
- For ImageNet: download the dataset at https://image-net.org/ and put it in `./data/imagenet`, then run:

```shell
python tools/train.py --cfg configs/imagenet/r34_r18/kd_ours.yaml
```
- Please refer to tiny-transformers/README.md.
Sincere gratitude to the contributors of mdistiller, CTKD, Multi-Level-Logit-Distillation, and tiny-transformers for their distinguished efforts.