Reading Discussion 6: Distillation and Compression
Key Word(s): Compression, Distillation
Selected Readings
Expository
Research
- An Embarrassingly Simple Approach to Zero-Shot Learning
- Distilling the Knowledge in a Neural Network
- Distilling a Neural Network into a Soft Decision Tree
- Model Compression
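As background for the readings above, a minimal sketch of the softened-softmax distillation loss described in "Distilling the Knowledge in a Neural Network": the teacher's logits are softened with a temperature T, the student is trained to match the resulting distribution, and the loss is scaled by T² so its gradient magnitude stays comparable to a hard-label loss. The function names and the NumPy-only setting are illustrative choices, not from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; larger T produces a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # multiplied by T**2 as suggested in the paper.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return float(T ** 2 * np.sum(p * (np.log(p) - np.log(q))))
```

When the student's logits equal the teacher's, the loss is zero; it grows as the softened distributions diverge, which is the signal used to compress the teacher into a smaller student.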