Stars
Stability-AI's SV3D (ECCV 2024 oral, Voleti et al.) in the diffusers convention.
Official repository for paper "MagicMan: Generative Novel View Synthesis of Humans with 3D-Aware Diffusion and Iterative Refinement"
The AI Scientist: Towards Fully Automated Open-Ended Scientific Discovery 🧑‍🔬
A nanoGPT pipeline packed in a spreadsheet
SMPLify-X implementation (as of 2024-08-30: no errors, recent versions of dependencies)
[3DV 2024] Official Repository for "TADA! Text to Animatable Digital Avatars".
Open-source keyboard firmware for Atmel AVR and Arm USB families
An open-source framework for training large multimodal models.
🦦 Otter, a multi-modal model based on OpenFlamingo (open-sourced version of DeepMind's Flamingo), trained on MIMIC-IT and showcasing improved instruction-following and in-context learning ability.
OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
🦘 Explore multimedia datasets at scale
High-Resolution Image Synthesis with Latent Diffusion Models
yoshida-m-3 / vim-im-select
Forked from brglng/vim-im-select. Improve the Vim/Neovim experience with input methods.
Python packaging and dependency management made easy
A game theoretic approach to explain the output of any machine learning model.
Source code of the KDD19 paper "Deep anomaly detection with deviation networks", weakly/partially supervised anomaly detection, few-shot anomaly detection, semi-supervised anomaly detection
Repository for Machine Learning resources, frameworks, and projects. Managed by the DLSU Machine Learning Group.
End-to-End Object Detection with Transformers
[TPAMI 2022 & CVPR 2021 Oral] UP-DETR: Unsupervised Pre-training for Object Detection with Transformers
🐸💬 - a deep learning toolkit for Text-to-Speech, battle-tested in research and production
FLOPs counter for convolutional networks in the PyTorch framework
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.