"MLKD-BERT: Multi-level Knowledge Distillation for Pre-trained Language Models."

Ying Zhang, Ziheng Yang, Shufan Ji (2024)

Details and statistics

DOI: 10.48550/ARXIV.2407.02775

access: open

type: Informal or Other Publication

metadata version: 2024-08-07