Stars
A very simple static homepage for your server.
A highly customizable homepage (or startpage / application dashboard) with Docker and service API integrations.
🚀 A self-hostable personal dashboard built for you. Includes status-checking, widgets, themes, icon packs, a UI editor and tons more!
Customizable browser home page for interacting with your home server's Docker containers (e.g. Sonarr/Radarr)
A plugin for the Obsidian.md note-taking software
Fix .app programs installed by Nix on Mac
Add/change/delete surrounding delimiter pairs with ease. Written with ❤️ in Lua.
Azure OpenAI code resources for using gpt-4o-realtime capabilities.
Observe FastAPI app with three pillars of observability: Traces (Tempo), Metrics (Prometheus), Logs (Loki) on Grafana through OpenTelemetry and OpenMetrics.
MARS5 speech model (TTS) from CAMB.AI
Inference code for the paper "Spirit-LM: Interleaved Spoken and Written Language Model".
Faster Whisper transcription with CTranslate2
Fill-in-the-middle fine-tuning for the Code Llama model 🦙
Selenium-automated Jupyter Notebook that is synchronised with Neovim in real time.
👻 GhostText plugin for Neovim with zero dependencies 🎉 Supports neovim running inside WSL too! 🥳 Windows/Linux/macOS supported out-of-the-box! 😄 (Other OSes need python3.6+ installed)
A fully-featured batteries-included Neovim distribution for the world of Data Science. Prepared to run code and interact with Jupyter Notebooks without ever leaving your terminal.
A neovim plugin for interactively running code with the jupyter kernel. Fork of magma-nvim with improvements in image rendering, performance, and more
Multiple NVIDIA GPUs or Apple Silicon for Large Language Model Inference?
✨✨Latest Advances on Multimodal Large Language Models
A high-throughput and memory-efficient inference and serving engine for LLMs
The Triton Inference Server provides an optimized cloud and edge inferencing solution.
Effortlessly run LLM backends, APIs, frontends, and services with one command.