"Compressing Pre-trained Models of Code into 3 MB."

Jieke Shi et al. (2022)

Details and statistics

DOI: 10.48550/ARXIV.2208.07120

access: open

type: Informal or Other Publication

metadata version: 2022-08-17