Join our Discord Server to get the latest updates and to interact with the community.
This is the repository for the Freedom GPT application, a desktop app built with Electron and React that lets users run Alpaca models on their local machine.
To clone the repository and run the app in production mode:
git clone --recursive https://github.com/ohmplatform/FreedomGPT.git freedom-gpt
cd freedom-gpt
yarn install
yarn start:prod
To work with the repository and build everything yourself, clone it and install the dependencies:
git clone --recursive https://github.com/ohmplatform/FreedomGPT.git freedom-gpt
cd freedom-gpt
yarn install
Then build llama.cpp. On macOS or Linux:
cd llama.cpp
make
On Windows:
- Download and install CMake: https://cmake.org/download/
- Run the following commands one by one:
cd llama.cpp
cmake .
cmake --build . --config Release
- You should now have a Release folder with a main.exe file inside it. You can run this file to test the chat client (see the sketch below).
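If you prefer to script that smoke test, the sketch below drives the freshly built binary from Node, roughly the way the Electron main process would. The binary location follows the build steps above, the model path is a placeholder you must point at a model you have downloaded, and -m/-p/-n are standard llama.cpp flags; this is illustrative only, not the application's actual launcher code.

```ts
// smoke-test.ts: illustrative sketch; adjust the binary and model paths for your machine.
import { spawn } from "child_process";

// Binary produced by the build steps above (make on macOS/Linux, CMake on Windows).
const binary =
  process.platform === "win32" ? "./llama.cpp/Release/main.exe" : "./llama.cpp/main";

// Placeholder: point this at a ggml model file you have downloaded.
const model = "./models/ggml-model-q4_0.bin";

// -m selects the model, -p is the prompt, -n limits the number of generated tokens.
const chat = spawn(binary, ["-m", model, "-p", "Hello, who are you?", "-n", "64"]);

chat.stdout.on("data", (chunk) => process.stdout.write(chunk));
chat.stderr.on("data", (chunk) => process.stderr.write(chunk));
chat.on("close", (code) => console.log(`\nllama.cpp exited with code ${code}`));
```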
We are using http://localhost:8889 as the API URL; you can change it in src/index.ts.
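For reference, the setting is just a string constant; here is a minimal sketch of what it can look like (the actual variable name and surrounding code in src/index.ts may differ):

```ts
// Sketch only; the real name and structure inside src/index.ts may differ.
// The locally running API is assumed to listen on this address:
const API_URL = "http://localhost:8889";

// If port 8889 is taken on your machine, change it here and keep any other
// hard-coded references (renderer, Docker port mapping) in sync.
const PORT = Number(new URL(API_URL).port); // 8889
```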
To run the application, run the following command in your terminal:
yarn start
Note: Make sure you are in the root directory of the project.
To run the docker image, run the following command in your terminal:
docker pull freedomgpt/freedomgpt
docker run -d -p 8889:8889 freedomgpt/freedomgpt
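Once the container is up, you can verify that something is listening on the mapped port before opening the client. The snippet below is a minimal sketch using Node's net module; it only checks the TCP port and assumes nothing about the API's routes.

```ts
// check-port.ts: sketch that confirms the container accepts connections on 8889.
import * as net from "net";

const socket = net.createConnection({ host: "127.0.0.1", port: 8889 }, () => {
  console.log("Port 8889 is accepting connections; the FreedomGPT API should be reachable.");
  socket.end();
});

socket.on("error", (err) => {
  console.error("Nothing is listening on port 8889 yet:", err.message);
});
```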
If you want to build the docker image yourself, run the following command in your terminal:
docker build -t freedomgpt/freedomgpt .
OR
yarn docker
Demo video: Screen.Recording.2023-04-22.at.8.41.01.AM.mov
This project utilizes several open-source packages and libraries, without which it would not have been possible:
"llama.cpp" - C++ library. https://github.com/ggerganov/llama.cpp
"LLAMA" by Facebook Research - a low-latency, large-scale approximate nearest neighbor search algorithm. https://github.com/facebookresearch/llama
"Alpaca" by Stanford CRFM - a framework for understanding and improving the efficiency and robustness of algorithms. https://crfm.stanford.edu/2023/03/13/alpaca.html
"alpaca-lora" by tloen - a Python library for working with LoRa radios and the Alpaca protocol. https://github.com/tloen/alpaca-lora
We would like to express our gratitude to the developers of these packages and their contributors for making their work available to the public under open source licenses. Their contributions have enabled us to build a more robust and efficient project.
See the LICENSE file.