Reading Discussion 8
Key Word(s): Distillation, Compression
Selected Readings
- Expository
- Knowledge Distillation
- An Overview of Model Compression Techniques for Deep Learning in Space
- Deep Learning - Model Optimization and Compression: Simplified
- Research
- Model Compression
- Distilling the Knowledge in a Neural Network
- Knowledge Distillation: A Survey
- An End-to-End Compression Framework Based on Convolutional Neural Networks
- DistilBERT
- Structured Knowledge Distillation for Semantic Segmentation
- Improved Knowledge Distillation via Teacher Assistant
- On the Efficacy of Knowledge Distillation
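As background for the Hinton et al. reading above ("Distilling the Knowledge in a Neural Network"), a minimal sketch of the temperature-scaled soft-target loss it introduces; the logits and temperature values here are illustrative, not from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened outputs (targets)
    # and the student's softened outputs, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q)) * temperature ** 2

# Illustrative logits for a 3-class problem.
teacher = [3.0, 1.0, 0.2]
student = [2.5, 1.2, 0.3]
loss = distillation_loss(student, teacher)
```

In practice this term is combined with the ordinary cross-entropy against the hard labels, weighted by a hyperparameter.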
* For the next presentations, select from Research or Use Case