
Search results

16 packages found

Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema on the model output on the generation level

published version 3.3.0, 18 hours ago • 14 dependents • licensed under MIT
30,694 weekly downloads

React Native binding of llama.cpp

published version 0.4.3, 12 days ago • 1 dependent • licensed under MIT
5,086 weekly downloads

llama.cpp GGUF file parser for JavaScript

published version 0.2.2, 7 months ago • 1 dependent • licensed under MIT
1,393 weekly downloads

Fork of llama.rn for ChatterUI

published version 1.3.0, 16 days ago • 0 dependents • licensed under MIT
735 weekly downloads

Libraries and a server for building AI applications. Adapters to various native bindings allow local inference. Integrate it with your application, or run it as a microservice.

published version 1.0.0-beta.26, 3 days ago • 0 dependents • licensed under MIT
442 weekly downloads

React Native binding of llama.cpp

published version 0.3.12, a month ago • 0 dependents • licensed under MIT
149 weekly downloads

llama.cpp LLM Provider

published version 0.3.14, 2 months ago • 1 dependent
112 weekly downloads

llama.cpp LLM Provider

published version 0.1.15, 2 months ago • 0 dependents
108 weekly downloads

React Native binding of llama.cpp

published version 0.3.12, a month ago • 0 dependents • licensed under MIT
80 weekly downloads

Run AI models locally on your machine with node.js bindings for llama.cpp. Force a JSON schema on the model output on the generation level

published version 1.3.1, 8 months ago • 0 dependents • licensed under MIT
42 weekly downloads

llama.cpp LLM Provider - OpenAI Compatible

published version 0.1.3, 2 months ago • 2 dependents
50 weekly downloads

Run AI models locally on your machine with node.js bindings for llama.cpp. Force a JSON schema on the model output on the generation level

published version 0.1.0, 4 months ago • 0 dependents • licensed under MIT
34 weekly downloads

Node.js bindings for LlamaCPP, a C++ library for running language models.

published version 1.2.0, 2 months ago • 0 dependents
17 weekly downloads

Use `npm i --save llama.native.js` to run llama.cpp models on your local machine. Features a socket.io server and client that can run inference with the host of the model.

published version 1.1.0, a year ago • 0 dependents • licensed under IDGASHIT
17 weekly downloads

A simple grammar builder compatible with GBNF (llama.cpp)

published version 0.0.5, 7 months ago • 0 dependents • licensed under MIT
6 weekly downloads
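For context on the entry above: GBNF is llama.cpp's BNF-like grammar format for constraining token generation. A minimal illustrative grammar (not taken from any package listed here) that restricts output to a tiny JSON object looks like:

```
# Only allow output of the form: {"name": "<letters and spaces>"}
root   ::= "{" ws "\"name\":" ws string ws "}"
string ::= "\"" [a-zA-Z ]* "\""
ws     ::= [ \t\n]*
```

A grammar-builder library generates rules like these programmatically instead of requiring them to be written by hand.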

Serve WebSocket GGML 4/5-bit quantized LLMs based on Meta's LLaMA model with llama.cpp

published version 0.1.0, 2 years ago • 0 dependents • licensed under ISC
1 weekly download