"MLKD-BERT: Multi-level Knowledge Distillation for Pre-trained Language Models."
Ying Zhang, Ziheng Yang, Shufan Ji (2024)
- Ying Zhang, Ziheng Yang, Shufan Ji: MLKD-BERT: Multi-level Knowledge Distillation for Pre-trained Language Models. CoRR abs/2407.02775 (2024)