SRRL

Paper

Knowledge Distillation via Softmax Regression Representation Learning (ICLR 2021)

Jing Yang, Brais Martinez, Adrian Bulat, Georgios Tzimiropoulos

Method
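The README does not describe the method beyond the paper reference, so below is a minimal PyTorch sketch of the two distillation losses the paper proposes: an L2 feature-matching loss between the adapted student feature and the teacher's penultimate feature, and a softmax-regression loss obtained by passing both features through the teacher's frozen classifier. The SRRLLoss name, the plain linear adapter, and the L2 distance on logits are simplifying assumptions for illustration, not this repository's exact implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SRRLLoss(nn.Module):
    # Sketch of the two SRRL losses (assumed from the paper, not this
    # repository's exact code).
    def __init__(self, s_dim, t_dim, teacher_fc):
        super().__init__()
        # Adapter ("connector") mapping the pooled student feature into the
        # teacher's feature space; the paper uses a 1x1 conv + BN on feature
        # maps, a plain linear layer is used here for brevity.
        self.connector = nn.Linear(s_dim, t_dim)
        self.teacher_fc = teacher_fc  # teacher's classifier head, kept frozen
        for p in self.teacher_fc.parameters():
            p.requires_grad = False

    def forward(self, feat_s, feat_t):
        feat_s2t = self.connector(feat_s)        # adapt student feature
        loss_fm = F.mse_loss(feat_s2t, feat_t)   # feature-matching loss
        # Softmax-regression loss: both features go through the teacher's
        # classifier and the resulting logits are matched (L2 here; the
        # exact distance is an assumption).
        logits_s = self.teacher_fc(feat_s2t)
        logits_t = self.teacher_fc(feat_t)
        loss_sr = F.mse_loss(logits_s, logits_t)
        return loss_fm, loss_sr

In training, these two terms would be added to the usual cross-entropy on the labels, e.g. loss = ce + alpha * loss_fm + beta * loss_sr, with the weights tuned per teacher-student pair.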

Requirements

  • Python >= 3.6
  • PyTorch >= 1.0.1

ImageNet Training and Testing

# Distill a ResNet-34 teacher into a ResNet-18 student
python train_imagenet_distillation.py --net_s resnet18S --net_t resnet34T

# Distill a ResNet-50 teacher into a MobileNet student
python train_imagenet_distillation.py --net_s MobileNet --net_t resnet50T
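For orientation, a hypothetical single training step combining the standard cross-entropy with the two SRRL terms might look like the sketch below. The student.features / student.fc / teacher.features hooks, the srrl module (an instance of the SRRLLoss sketch above), and the alpha / beta weights are all illustrative assumptions, not the actual interface of train_imagenet_distillation.py.

import torch
import torch.nn as nn

criterion_ce = nn.CrossEntropyLoss()
alpha, beta = 1.0, 1.0  # loss weights; assumed here, tuned per setup in practice

def train_step(student, teacher, srrl, images, labels, optimizer):
    with torch.no_grad():                    # teacher stays frozen
        feat_t = teacher.features(images)    # assumed feature-extraction hook
    feat_s = student.features(images)
    logits = student.fc(feat_s)              # student's own classifier
    loss_fm, loss_sr = srrl(feat_s, feat_t)  # SRRL terms from the sketch above
    loss = criterion_ce(logits, labels) + alpha * loss_fm + beta * loss_sr
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()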

Logs

Training logs are available at:

https://drive.google.com/drive/folders/19OnwUad63-ITXL2TxguRdyP0KtKJIfgI

Citation

@inproceedings{yang2021knowledge,
  title={Knowledge distillation via softmax regression representation learning},
  author={Yang, Jing and Martinez, Brais and Bulat, Adrian and Tzimiropoulos, Georgios},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2021}
}
@article{yang2020knowledge,
  title={Knowledge distillation via adaptive instance normalization},
  author={Yang, Jing and Martinez, Brais and Bulat, Adrian and Tzimiropoulos, Georgios},
  journal={arXiv preprint arXiv:2003.04289},
  year={2020}
}

License

This project is licensed under the MIT License.
