
# knowledge-distillation

Here are 518 public repositories matching this topic...

Explore concepts like Self-Correct, Self-Refine, Self-Improve, Self-Contradict, Self-Play, and Self-Knowledge, alongside o1-like reasoning enhancement🍓 and hallucination mitigation🍄.

  • Updated Nov 15, 2024
  • Jupyter Notebook

PyTorch implementation of over 30 real-time semantic segmentation models, e.g. BiSeNetv1, BiSeNetv2, CGNet, ContextNet, DABNet, DDRNet, EDANet, ENet, ERFNet, ESPNet, ESPNetv2, FastSCNN, ICNet, LEDNet, LinkNet, PP-LiteSeg, SegNet, ShelfNet, STDC, and SwiftNet, with support for knowledge distillation, distributed training, Optuna, etc.

  • Updated Nov 14, 2024
  • Python
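For readers new to the topic: the knowledge distillation these repositories support typically trains a small "student" model to match the temperature-softened output distribution of a larger "teacher". A minimal sketch of that classic soft-target loss (Hinton et al., 2015) in plain Python follows; the temperature value and function names are illustrative assumptions, not any particular repository's implementation.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-softened probabilities: a higher temperature
    # flattens the distribution, exposing "dark knowledge" in
    # the teacher's small logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) between the softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude
    # across temperatures (the temperature 4.0 is an assumption).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice this term is combined with the ordinary cross-entropy on the ground-truth labels via a weighting coefficient; identical student and teacher logits yield zero distillation loss, and any mismatch yields a positive one.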

Gathers research papers, corresponding code (if available), reading notes, and other related materials about hot🔥🔥🔥 fields in Computer Vision based on Deep Learning.

  • Updated Nov 14, 2024

