Replies: 2 comments 4 replies
- It should be, it's just not configured.
- Not sure if you solved this, but I just added a guide to the wiki: https://github.com/neph1/LlamaTale/wiki/Configuring-llama.cpp-server
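  For reference, a minimal sketch of what starting llama.cpp's HTTP server on port 5001 can look like. The model path and context size below are placeholders (not from this thread), and the flag names reflect llama.cpp's server binary at the time of writing:

  ```shell
  # Launch llama.cpp's HTTP server, listening on port 5001.
  # ./models/model.gguf is a placeholder path; -c sets the context size.
  ./server -m ./models/model.gguf --host 127.0.0.1 --port 5001 -c 2048

  # Quick sanity check from another terminal: the server exposes a /health endpoint.
  curl http://127.0.0.1:5001/health
  ```

  LlamaTale would then need its backend configuration pointed at that host and port; the wiki guide linked above covers the details.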
- So I would like to know if it is possible to use it with llama.cpp, running ./server and listening on port 5001. I tried and it didn't work.