"Rethinking Kullback-Leibler Divergence in Knowledge Distillation for Large ..."

Taiqiang Wu et al. (2024)

DOI: 10.48550/ARXIV.2404.02657

access: open

type: Informal or Other Publication

metadata version: 2024-05-13