"bert2BERT: Towards Reusable Pretrained Language Models."

Cheng Chen et al. (2022)

Details and statistics

DOI: 10.18653/v1/2022.acl-long.151

access: open

type: Conference or Workshop Paper

metadata version: 2023-05-04