AI Runner Nexus

Run Mistral LLM offline on your computer using a socket server.


Limitations

Data between server and client is not encrypted

This matters only if someone wants to build a production-ready version of this server hosted on the internet, which is not what it is designed for. It was designed with a single use case in mind: the ability to run Stable Diffusion (and other AI models) locally. It was built for the Krita Stable Diffusion plugin, but it can work with any interface provided someone writes a client for it.

Only works with Mistral

This library was designed around the Mistral model, but it can be extended to work with any LLM.


Installation

pip install airunner-nexus
cp src/airunner_nexus/default.settings.py src/airunner_nexus/settings.py

Modify settings.py as you see fit.
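A settings file for a socket server like this typically covers the bind address, port, and model selection. The snippet below is a hypothetical sketch only; the real option names live in the `default.settings.py` you just copied, so check that file for what is actually configurable.

```python
# Hypothetical settings.py sketch -- option names are illustrative,
# not necessarily those used by airunner-nexus.
HOST = "0.0.0.0"          # interface the socket server binds to
PORT = 5000               # port the server listens on / the client dials
MODEL_PATH = "mistralai/Mistral-7B-Instruct-v0.1"  # model to load
PACKET_SIZE = 1024        # bytes per socket read
```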


Run server and client

See src/airunner_nexus/server.py for an example of how to run the server and src/airunner_nexus/client.py for an example of how to run the client. Both of these files can be run directly from the command line.

The socket client will continuously attempt to connect to the server until it is successful. The server will accept connections from any client on the given port.