# kg-chat

LLM-based chatbot that queries and visualizes KGX nodes and edges TSV files loaded into either a DuckDB (default) or a Neo4j database backend.
## LLMs Supported
| LLM Provider | Models |
|---|---|
| OpenAI | - gpt-4o-2024-08-06 <br>- gpt-4o-mini <br>- gpt-4o-mini-2024-07-18 <br>- gpt-4o-2024-05-13 <br>- gpt-4o <br>- gpt-4-turbo-2024-04-09 <br>- gpt-4-turbo <br>- gpt-4-turbo-preview |
| Anthropic | - claude-3-5-sonnet-20240620 <br>- claude-3-opus-20240229 <br>- claude-3-sonnet-20240229 <br>- claude-3-haiku-20240307 |
| Ollama | - llama3.1 |
| LBNL-hosted models via CBORG | - lbl/cborg-chat:latest <br>- lbl/cborg-chat-nano:latest <br>- lbl/cborg-coder:latest <br>- openai/chatgpt:latest <br>- anthropic/claude:latest <br>- google/gemini:latest |
:warning:

- **OpenAI**: Ensure `OPENAI_API_KEY` is set as an environment variable.
- **Anthropic**: Ensure `ANTHROPIC_API_KEY` is set as an environment variable.
- **Ollama**: No API key required. Better results if the `llama 3.1 405b` model is used (needs a GPU).
  - Download the application from here and install it locally.
  - Get any model of your choice, but make sure the model has the `Tools` badge for it to work. Here's an example: `ollama run llama3.1:405b`
- **Models hosted by Lawrence Berkeley National Laboratory via CBORG**: Ensure `CBORG_API_KEY` is set as an environment variable.
  - The list of models can be found here, listed under "LBNL-Hosted Models".
## How to set the API key as an environment variable?

One quick way is:

```bash
export OPENAI_API_KEY=XXXXXX
export ANTHROPIC_API_KEY=XXXXX
export CBORG_API_KEY=XXXX
```

But if you want these to persist permanently, open your shell profile:

```bash
vi ~/.bash_profile
```

OR

```bash
vi ~/.bashrc
```

Add the lines exporting the variables above, and then:

```bash
source ~/.bash_profile
```

OR

```bash
source ~/.bashrc
```
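If you prefer not to open an editor, appending the export line and reloading the profile also works; a minimal sketch, assuming bash and `~/.bashrc` (substitute the key and file you actually use):

```bash
# Append the export to the shell profile, then reload it in the current shell
echo 'export OPENAI_API_KEY=XXXXXX' >> ~/.bashrc
source ~/.bashrc
```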
## Setup

### For Neo4j Backend (Optional)

- Install Neo4j Desktop from here.
- Create a new project and database, then start it.
- Install the APOC plugin in Neo4j Desktop.
- Update the settings to match `neo4j_db_settings.conf`.
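For orientation only: APOC procedures generally need to be allow-listed in the database settings. The line below is a generic illustration of that kind of entry, not the contents of `neo4j_db_settings.conf`; copy the actual values from that file.

```properties
# Illustration only; use the real values from neo4j_db_settings.conf
dbms.security.procedures.unrestricted=apoc.*
```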
### General Setup

#### For Developers

- Clone this repository.
- Create a virtual environment and install dependencies:

  ```bash
  cd kg-chat
  pip install poetry
  poetry install
  ```

- Replace `data/nodes.tsv` and `data/edges.tsv` with desired KGX files if needed (see the format sketch below).
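For reference, KGX nodes and edges files are tab-separated tables. The sketch below shows only the core columns and uses made-up identifiers; real KGX files typically carry additional columns.

`nodes.tsv`:

```tsv
id	category	name
EX:001	biolink:Gene	example gene
EX:002	biolink:Disease	example disease
```

`edges.tsv`:

```tsv
subject	predicate	object
EX:001	biolink:related_to	EX:002
```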
#### For using kg-chat as a dependency

```bash
pip install kg-chat
```

OR

```bash
poetry add kg-chat@latest
```
## Supported Backends

- DuckDB [default]
- Neo4j
## Commands

- **Import KG**: Load nodes and edges into a database (default: duckdb).

  ```bash
  poetry run kg import --data-dir data
  ```

- **List LLM models**: List the supported LLM models.

  ```bash
  poetry run kg list-models
  ```

- **Test Query**: Run a test query.

  :warning: `--data-dir` is a required parameter for all commands. It is the path to the directory containing the `nodes.tsv` and `edges.tsv` files; the filenames are expected to be exactly those.

  ```bash
  poetry run kg test-query --data-dir data
  ```

- **QnA**: Ask questions about the data.

  ```bash
  poetry run kg qna "how many nodes do we have here?" --data-dir data
  ```

- **Chat**: Start an interactive chat session.

  ```bash
  poetry run kg chat --data-dir data
  ```

- **App**: Deploy a local web application.

  ```bash
  poetry run kg app --data-dir data
  ```
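Putting the commands above together, a typical first run looks like this (assuming `data/` contains your `nodes.tsv` and `edges.tsv`):

```bash
# Load the KG into the default DuckDB backend, then start an interactive session
poetry run kg import --data-dir data
poetry run kg chat --data-dir data
```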
## Visualization

Include `show me` in a prompt to get a KG visualization.
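For example, during a `kg chat` session (the node label is a placeholder; phrase the rest of the question however you like):

```text
show me the edges connected to <node of interest>
```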
## Acknowledgements

This cookiecutter project was developed from the monarch-project-template and will be kept up-to-date using cruft.