A curated list for Efficient Large Language Models
A beginner-friendly introduction to model compression
[NeurIPS 2024] SlimSAM: 0.1% Data Makes Segment Anything Slim
Explore concepts like Self-Correct, Self-Refine, Self-Improve, Self-Contradict, Self-Play, and Self-Knowledge, alongside o1-like reasoning elevation🍓 and hallucination alleviation🍄.
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
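The entry above covers low-bit quantization (INT8/INT4 and friends). As a minimal illustration of the core idea, here is a sketch of symmetric per-tensor INT8 quantization in plain NumPy; `quantize_int8` and `dequantize` are illustrative names, not APIs from the listed toolkit, and real libraries add per-channel scales, calibration, and zero-points.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor INT8 quantization: map floats into [-127, 127].

    Assumes a non-empty tensor with at least one non-zero element.
    """
    scale = np.abs(weights).max() / 127.0  # one scale for the whole tensor
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original floats."""
    return q.astype(np.float32) * scale
```

The round-trip error per element is bounded by half a quantization step (`scale / 2`), which is why low-bit formats trade a small accuracy loss for large memory and bandwidth savings.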
[arXiv 2024] PyTorch implementation of DCD: https://arxiv.org/abs/2407.11802
PyTorch implementation of over 30 real-time semantic segmentation models, e.g. BiSeNetv1, BiSeNetv2, CGNet, ContextNet, DABNet, DDRNet, EDANet, ENet, ERFNet, ESPNet, ESPNetv2, FastSCNN, ICNet, LEDNet, LinkNet, PP-LiteSeg, SegNet, ShelfNet, STDC, and SwiftNet, with support for knowledge distillation, distributed training, Optuna, etc.
Challenge, Rethink, Ascend
A treasure chest for visual classification and recognition powered by PaddlePaddle
Research papers, corresponding code (where available), reading notes, and other related materials about hot🔥🔥🔥 fields in deep-learning-based computer vision.
TeaPartyDev Blog is a personal knowledge base and blog designed to unify and refine insights, providing a central hub for organized and expanded learning.
[ICCV 2023] MI-GAN: A Simple Baseline for Image Inpainting on Mobile Devices
[IEEE TIV] Official PyTorch implementation of "A Cognitive-Based Trajectory Prediction Approach for Autonomous Driving."
Theory and practice of large language model (LLM) inference and deployment
TensorFlow code for our ECCV'24 Workshop paper "LightAvatar: Efficient Head Avatar as Dynamic NeLF"
Efficient computing methods developed by Huawei Noah's Ark Lab
[ECCV 2024] - Improving Zero-shot Generalization of Learned Prompts via Unsupervised Knowledge Distillation
AI book for everyone
Official code for TPAMI2024 paper: Pixel Distillation: Cost-flexible Distillation across Image Sizes and Heterogeneous Networks
Collection of AWESOME vision-language models for vision tasks
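Many of the repositories above build on knowledge distillation. As a minimal sketch of the common soft-target formulation (temperature-softened teacher probabilities matched by the student via KL divergence, with the loss scaled by T²), here is a plain-NumPy version; the function names are illustrative and not taken from any listed repository.

```python
import numpy as np

def softmax(logits: np.ndarray, temperature: float = 1.0) -> np.ndarray:
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits: np.ndarray,
                      teacher_logits: np.ndarray,
                      temperature: float = 2.0) -> float:
    """Mean KL divergence between softened teacher and student distributions.

    The T^2 factor keeps gradient magnitudes comparable across temperatures,
    as in the classic soft-target formulation.
    """
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's softened predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float((temperature ** 2) * kl.mean())
```

A higher temperature exposes more of the teacher's "dark knowledge" (relative probabilities of wrong classes), which is what gives the student a richer training signal than hard labels alone.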