```mermaid
flowchart LR
    o4j[Ollama4j]
    o[Ollama Server]
    o4j -->|Communicates with| o;
    m[Models]
    subgraph Ollama Deployment
        direction TB
        o -->|Manages| m
    end
```
Install on Linux:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

Run with Docker (CPU only):

```shell
docker run -d -p 11434:11434 \
  -v ollama:/root/.ollama \
  --name ollama \
  ollama/ollama
```

Run with Docker (NVIDIA GPU):

```shell
docker run -d -p 11434:11434 \
  --gpus=all \
  -v ollama:/root/.ollama \
  --name ollama \
  ollama/ollama
```
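Once the service or container is running, you can sanity-check that the server is reachable before adding any dependencies. Here is a minimal sketch using the JDK's built-in HTTP client, assuming the default port 11434 from the commands above:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaHealthCheck {
    public static void main(String[] args) throws Exception {
        // Default Ollama endpoint, matching the port mapping used above.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/"))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // A running server answers the root path with a short plain-text status.
        System.out.println(response.statusCode() + ": " + response.body());
    }
}
```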
**Note:** We are now publishing the artifacts to both Maven Central and GitHub package repositories.
Track the releases here and update the dependency version according to your requirements.
Using Maven Central
In your Maven project, add this dependency:
```xml
<dependency>
    <groupId>io.github.ollama4j</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0.79</version>
</dependency>
```
Using GitHub Packages
- Add the GitHub Maven Packages repository to your project's `pom.xml` or your `settings.xml`:
```xml
<repositories>
    <repository>
        <id>github</id>
        <name>GitHub Apache Maven Packages</name>
        <url>https://maven.pkg.github.com/ollama4j/ollama4j</url>
        <releases>
            <enabled>true</enabled>
        </releases>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>
```
- Add a `github` server entry to your `settings.xml` (usually available at `~/.m2/settings.xml`). GitHub Packages requires authentication even for public artifacts, so use your GitHub username and a personal access token with the `read:packages` scope:
```xml
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                              http://maven.apache.org/xsd/settings-1.0.0.xsd">
    <servers>
        <server>
            <id>github</id>
            <username>YOUR-USERNAME</username>
            <password>YOUR-TOKEN</password>
        </server>
    </servers>
</settings>
```
- In your Maven project, add this dependency:
```xml
<dependency>
    <groupId>io.github.ollama4j</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0.79</version>
</dependency>
```
Using Gradle
- In your Gradle project, add the dependency to your `build.gradle`:

```groovy
dependencies {
    implementation 'io.github.ollama4j:ollama4j:1.0.79'
}
```
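With the dependency in place, a minimal sketch of talking to the server might look like the following. It assumes ollama4j's `OllamaAPI` client with its `ping()` and `listModels()` methods and an Ollama server on the default port; exact method names can vary between releases, so check the API specifications (see the tip below) for your version:

```java
import io.github.ollama4j.OllamaAPI;

public class OllamaQuickstart {
    public static void main(String[] args) throws Exception {
        // Point the client at the local Ollama server started earlier.
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");

        // ping() returns true when the server is reachable.
        System.out.println("Server reachable: " + ollamaAPI.ping());

        // List the models the server currently manages.
        ollamaAPI.listModels().forEach(model -> System.out.println(model.getName()));
    }
}
```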
**Tip:** Find the full API specifications on the website.
Build:

```shell
make build
```

Run unit tests:

```shell
make unit-tests
```

Run integration tests:

```shell
make integration-tests
```
Newer artifacts are published via the GitHub Actions CI workflow when a new release is created from the main branch.
If you like this project or are using it to build your own, please give us a star. It's a free way to show your support.
- Datafaker: a library to generate fake data
- Vaadin Web UI: a UI tester for interactions with Ollama via ollama4j
- ollama-translator: a Minecraft 1.20.6 Spigot plugin that easily breaks language barriers by using Ollama on the server to translate all messages into a specific target language
- AI Player: a Minecraft mod that aims to add an intelligent "second player" to the game
- Ollama4j Web UI: a web UI for Ollama, written in Java using Spring Boot, the Vaadin framework, and Ollama4j
- JnsCLI: a command-line tool for Jenkins that manages jobs, builds, and configurations directly from the terminal, with AI-powered error analysis for quick troubleshooting
- Katie Backend: an open-source AI-based question-answering platform that helps companies and organizations make their private domain knowledge accessible and useful to their employees and customers
- TeleLlama3 Bot: a question-answering Telegram bot
- moqui-wechat: a WeChat plugin
Contributions are most welcome! Whether it's reporting a bug, proposing an enhancement, or helping with code - any sort of contribution is much appreciated.
The code is available under the MIT License.
If you find this project helpful in your research, please cite this work:
```bibtex
@misc{ollama4j2024,
    author = {Amith Koujalgi},
    title  = {Ollama4j: A Java Library (Wrapper/Binding) for Ollama Server},
    year   = {2024},
    month  = {January},
    url    = {https://github.com/ollama4j/ollama4j}
}
```
The nomenclature and the icon have been adopted from the incredible Ollama project.
Thanks to the amazing contributors.