🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
Updated Sep 29, 2024 - Python
Open-source RAG framework for building GenAI second brains 🧠. Build a productivity assistant (RAG) ⚡️🤖: chat with your docs (PDF, CSV, ...) and apps using Langchain with GPT 3.5 / 4 Turbo, Anthropic, VertexAI, Ollama, Groq, and other private LLMs, and share it with your users! An efficient retrieval-augmented generation framework.
LlamaIndex is a data framework for your LLM applications
🙌 OpenHands: Code Less, Make More
Efficiently Fine-Tune 100+ LLMs in WebUI (ACL 2024)
A generative speech model for daily dialogue.
A chatbot built on large language models, with support for WeChat Official Accounts, WeCom (enterprise WeChat) apps, Feishu, DingTalk, and other integrations. Choose from GPT-3.5 / GPT-4o / GPT-o1 / Claude / ERNIE Bot (文心一言) / iFlytek Spark (讯飞星火) / Tongyi Qianwen (通义千问) / Gemini / GLM-4 / Kimi / LinkAI. It handles text, voice, and images, can access the operating system and the internet, and supports custom enterprise customer-service bots built on your own knowledge base.
A high-throughput and memory-efficient inference and serving engine for LLMs
The platform for building AI from enterprise data
The Memory layer for your AI apps
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Universal LLM Deployment Engine with ML Compilation
RAGFlow is an open-source RAG (Retrieval-Augmented Generation) engine based on deep document understanding.
Chinese LLaMA & Alpaca large language models with local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
🔍 AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
ChatGLM2-6B: An Open Bilingual Chat LLM (open-source bilingual dialogue language model)
<⚡️> SuperAGI - A dev-first, open-source autonomous AI agent framework, enabling developers to build, manage & run useful autonomous agents quickly and reliably.
GPT-powered chat for documentation, chat with your documents