# Ollama4j
<p align="center"> <img src='https://raw.githubusercontent.com/ollama4j/ollama4j/65a9d526150da8fcd98e2af6a164f055572bf722/ollama4j.jpeg' width='100' alt="ollama4j-icon"> </p> <div align="center"> A Java library (wrapper/binding) for the Ollama server. Find more details on the website.
</div>
## How does it work?
```mermaid
flowchart LR
    o4j[Ollama4j]
    o[Ollama Server]
    o4j -->|Communicates with| o;
    m[Models]
    subgraph Ollama Deployment
        direction TB
        o -->|Manages| m
    end
```
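In code, the flow above boils down to pointing an `OllamaAPI` client at a running Ollama server. A minimal sketch, assuming a server on the default local endpoint (see the API spec on the website for the full client surface):

```java
import io.github.ollama4j.OllamaAPI;

public class QuickCheck {
    public static void main(String[] args) throws Exception {
        // Ollama4j communicates with the Ollama server over its REST endpoint.
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");

        // Verify the server is reachable before doing anything else.
        boolean reachable = ollamaAPI.ping();
        System.out.println("Ollama server reachable: " + reachable);
    }
}
```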
## Requirements
<a href="https://ollama.com/" target="_blank"> <img src="https://img.shields.io/badge/v0.3.0-green.svg?style=for-the-badge&labelColor=gray&label=Ollama&color=blue" alt=""/> </a> <table> <tr> <td><a href="https://ollama.ai/" target="_blank">Local Installation</a>
</td> <td><a href="https://hub.docker.com/r/ollama/ollama" target="_blank">Docker Installation</a>
</td> </tr> <tr> <td><a href="https://ollama.com/download/Ollama-darwin.zip" target="_blank">Download for macOS</a>
<a href="https://ollama.com/download/OllamaSetup.exe" target="_blank">Download for Windows</a>
Install on Linux

```shell
curl -fsSL https://ollama.com/install.sh | sh
```
</td>
<td>
CPU only

```shell
docker run -d -p 11434:11434 \
  -v ollama:/root/.ollama \
  --name ollama \
  ollama/ollama
```

NVIDIA GPU

```shell
docker run -d -p 11434:11434 \
  --gpus=all \
  -v ollama:/root/.ollama \
  --name ollama \
  ollama/ollama
```
</td>
</tr>
</table>
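With the server running (locally or in Docker), Ollama4j can drive it to pull and list models. A hedged sketch, assuming the `pullModel`/`listModels` methods described in the API spec and using an example model name:

```java
import io.github.ollama4j.OllamaAPI;

public class ModelSetup {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");

        // Model downloads can take a while; widen the default request timeout.
        ollamaAPI.setRequestTimeoutSeconds(120);

        // Ask the server to pull a model ("llama3" is just an example name).
        ollamaAPI.pullModel("llama3");

        // List the models the server now manages.
        ollamaAPI.listModels().forEach(System.out::println);
    }
}
```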
## Installation
> [!NOTE]
> We are now publishing the artifacts to both Maven Central and the GitHub Packages repository.
> Track the [releases](https://github.com/ollama4j/ollama4j/releases) and update the dependency version according to your requirements.
### For Maven

#### Using Maven Central
In your Maven project, add this dependency:
```xml
<dependency>
    <groupId>io.github.ollama4j</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0.89</version>
</dependency>
```
#### Using GitHub's Maven Package Repository
- Add the `GitHub Maven Packages` repository to your project's `pom.xml` or your `settings.xml`:
```xml
<repositories>
    <repository>
        <id>github</id>
        <name>GitHub Apache Maven Packages</name>
        <url>https://maven.pkg.github.com/ollama4j/ollama4j</url>
        <releases>
            <enabled>true</enabled>
        </releases>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>
```
- Add the GitHub server entry to your `settings.xml` (usually found at `~/.m2/settings.xml`):
```xml
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                              http://maven.apache.org/xsd/settings-1.0.0.xsd">
    <servers>
        <server>
            <id>github</id>
            <username>YOUR-USERNAME</username>
            <password>YOUR-TOKEN</password>
        </server>
    </servers>
</settings>
```
- In your Maven project, add this dependency:
```xml
<dependency>
    <groupId>io.github.ollama4j</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0.89</version>
</dependency>
```
### For Gradle

- Add the dependency to your `build.gradle`:
```groovy
dependencies {
    implementation 'io.github.ollama4j:ollama4j:1.0.89'
}
```
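Once the dependency is on the classpath (via Maven or Gradle), a first completion looks roughly like the sketch below. The `generate(...)` overload and the `OptionsBuilder` package path are assumptions based on the published API spec and may differ between versions, so treat this as a starting point rather than the definitive call:

```java
import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.utils.OptionsBuilder;

public class FirstCompletion {
    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        ollamaAPI.setRequestTimeoutSeconds(60);

        // Ask a locally available model for a completion.
        // NOTE: exact overloads vary across versions; check the API spec.
        var result = ollamaAPI.generate("llama3", "Why is the sky blue?",
                new OptionsBuilder().build());

        System.out.println(result.getResponse());
    }
}
```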
## API Spec

> [!TIP]
> Find the full API specifications on the website.
## Development

Build:

```shell
make build
```

Run unit tests:

```shell
make unit-tests
```

Run integration tests:

```shell
make integration-tests
```
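Integration tests assume a reachable Ollama instance. A purely illustrative JUnit 5 sketch of the kind of check they perform (not one of the project's actual test classes):

```java
import io.github.ollama4j.OllamaAPI;
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.assertTrue;

// Illustrative only; the real tests live under the project's test sources.
class OllamaServerReachabilityIT {

    @Test
    void serverShouldRespondToPing() throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");
        assertTrue(ollamaAPI.ping(), "Expected a running Ollama server on localhost:11434");
    }
}
```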
## Releases

Newer artifacts are published via the GitHub Actions CI workflow when a new release is created from the `main` branch.
## ⭐ Give us a Star!
If you like or are using this project to build your own, please give us a star. It's a free way to show your support.
## Who's using Ollama4j?
# | Project Name | Description | Link |
---|---|---|---|
1 | Datafaker | A library to generate fake data | GitHub |
2 | Vaadin Web UI | UI-Tester for interactions with Ollama via ollama4j | GitHub |
3 | ollama-translator | A Minecraft 1.20.6 Spigot plugin that translates all messages into a specific target language via Ollama | GitHub |
4 | AI Player | A Minecraft mod that adds an intelligent "second player" to the game | GitHub, <br/> Reddit Thread |
5 | Ollama4j Web UI | A web UI for Ollama written in Java using Spring Boot, Vaadin, and Ollama4j | GitHub |
6 | JnsCLI | A command-line tool for Jenkins that manages jobs, builds, and configurations, with AI-powered error analysis | GitHub |
7 | Katie Backend | An open-source AI-based question-answering platform for accessing private domain knowledge | GitHub |
8 | TeleLlama3 Bot | A question-answering Telegram bot | Repo |
9 | moqui-wechat | A moqui-wechat component | GitHub |
## Traction
## Get Involved
Contributions are most welcome! Whether it's reporting a bug, proposing an enhancement, or helping with code - any sort of contribution is much appreciated.
## 🏷️ License and Citation
The code is available under MIT License.
If you find this project helpful in your research, please cite this work:
```bibtex
@misc{ollama4j2024,
    author = {Amith Koujalgi},
    title  = {Ollama4j: A Java Library (Wrapper/Binding) for Ollama Server},
    year   = {2024},
    month  = {January},
    url    = {https://github.com/ollama4j/ollama4j}
}
```
## Credits
The nomenclature and the icon have been adopted from the incredible Ollama project.
Thanks to the amazing contributors
<p align="center"> <a href="https://github.com/ollama4j/ollama4j/graphs/contributors"> <img src="https://contrib.rocks/image?repo=ollama4j/ollama4j" alt=""/> </a> </p>