# Auto-GPT Package
"It's like AutoGPT got a brew install
", made possible by Kurtosis.
NOTE: This package now runs AutoGPT 0.4.0, which drops support for Milvus, Weaviate and Pinecone. You can still run against 0.3.1 by using `kurtosis run github.com/kurtosis-tech/autogpt-package@0.3.1` with the desired arguments.
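For example, assuming the 0.3.1 release accepts the same `OPENAI_API_KEY` argument used throughout this README, the pinned invocation would look something like:

```bash
kurtosis run github.com/kurtosis-tech/autogpt-package@0.3.1 --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE"}'
```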
## Run AutoGPT in the browser (no installation needed)
- If you don't have an OpenAI API key, get one here
- Click this link to open a Gitpod, selecting "Continue" to use the default resources
- Wait for the Gitpod to boot up and the terminal to finish installing Kurtosis (should take ~30 seconds)
- Run the following in the terminal (replacing `YOUR_API_KEY_HERE` with your OpenAI API key):

  ```bash
  kurtosis run github.com/kurtosis-tech/autogpt-package --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE"}'
  ```
- When installing & starting AutoGPT has finished, run the following in the terminal to open the AutoGPT prompt:
kurtosis service shell autogpt autogpt --exec "python -m autogpt"
- Use AutoGPT as you please!
## Run AutoGPT on your machine
- If you don't have an OpenAI API key, get one here
- Install Kurtosis using these instructions
- Run the following in your terminal (replacing `YOUR_API_KEY_HERE` with your OpenAI API key):

  ```bash
  kurtosis run github.com/kurtosis-tech/autogpt-package --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE"}'
  ```
- When installing & starting AutoGPT has finished, run the following in your terminal to open the AutoGPT prompt:
kurtosis service shell autogpt autogpt
and then within the prompt:
> python -m autogpt
- Use AutoGPT as you please! To destroy the AutoGPT instance, run the following (a quick way to confirm the removal is shown after this list):

  ```bash
  kurtosis enclave rm -f autogpt
  ```
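If you want to confirm the instance is gone, listing enclaves with the Kurtosis CLI (not part of the original steps, but a standard Kurtosis command) should no longer show `autogpt`:

```bash
# List all enclaves; the autogpt enclave should be absent after removal
kurtosis enclave ls
```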
## Configuring AutoGPT (including memory backend)
To pass any of the AutoGPT configuration values listed here, pass the argument as a property of the JSON object you're passing to Kurtosis, just like you passed in `OPENAI_API_KEY`.
For example, this is how you'd pass the `RESTRICT_TO_WORKSPACE` flag:

```bash
kurtosis run github.com/kurtosis-tech/autogpt-package --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE", "RESTRICT_TO_WORKSPACE": "False"}'
```
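Multiple configuration values can be combined in the same JSON object. As an illustrative sketch using only flags that appear elsewhere in this README (`ALLOWLISTED_PLUGINS` is covered in the plugins section below):

```bash
kurtosis run github.com/kurtosis-tech/autogpt-package --enclave autogpt \
  '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE", "RESTRICT_TO_WORKSPACE": "False", "ALLOWLISTED_PLUGINS": "AutoGPTTwitter"}'
```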
NOTE: this package spins up AutoGPT using the `local` backend by default. Other backends are available by setting the `MEMORY_BACKEND` parameter in the JSON object you pass in when you run the `kurtosis run` command above.
For example, to set the `redis` memory backend:

```bash
kurtosis run github.com/kurtosis-tech/autogpt-package --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE", "MEMORY_BACKEND": "redis"}'
```
NOTE: Redis isn't working with 0.4.0 for now.
To run with a different image than the one hardcoded in `main.star`, use:

```bash
kurtosis run github.com/kurtosis-tech/autogpt-package --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE", "AUTOGPT_IMAGE": "significantgravitas/auto-gpt:v0.4.0"}'
```
## Using AutoGPT plugins
Kurtosis supports the `ALLOWLISTED_PLUGINS` configuration flag that AutoGPT ships with. For example, to run the `AutoGPTTwitter` plugin, do the following:

```bash
kurtosis run github.com/kurtosis-tech/autogpt-package --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE", "ALLOWLISTED_PLUGINS": "AutoGPTTwitter"}'
```
To get multiple plugins running at the same time, separate them with commas (no spaces), like so:

```bash
kurtosis run github.com/kurtosis-tech/autogpt-package --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE", "ALLOWLISTED_PLUGINS": "AutoGPTTwitter,AutoGPTEmailPlugin"}'
```
Under the hood, Kurtosis will download and install the package for you.
As of now, the following plugins are supported:
### First Party
- AutoGPTTwitter
- AutoGPTEmailPlugin
- AutoGPTSceneXPlugin
- AutoGPTBingSearch
- AutoGPTNewsSearch
- AutoGPTWikipediaSearch
- AutoGPTApiTools
- AutoGPTRandomValues
- AutoGPTSpacePlugin
- AutoGPTBaiduSearch
- AutoGPTBluesky
- AutoGPTAlpacaTraderPlugin
- AutoGPTUserInput
- BingAI
- AutoGPTCryptoPlugin
- AutoGPTDiscord
- AutoGPTDollyPlugin
### Third Party
- AutoGPTGoogleAnalyticsPlugin
- AutoGPT_IFTTT
- AutoGPT_Zapier
- AutoGPT_YouTube
- AutoGPTPMPlugin
- AutoGPTWolframAlpha
- AutoGPTTodoistPlugin
- AutoGPTMessagesPlugin
- AutoGPTWebInteraction
- AutoGPTNotion
- SystemInformationPlugin
To add support for more plugins, simply create an issue or open a PR adding an entry to `plugins.star`.
## Run without OpenAI
We understand OpenAI can be expensive for some people; moreover, some people might want to use this with their own models. AutoGPT-Package supports running AutoGPT against a GPT4All model that runs via LocalAI. To use a local model:

```bash
kurtosis run github.com/kurtosis-tech/autogpt-package '{"GPT_4ALL": true}'
```
This uses the `https://gpt4all.io/models/ggml-gpt4all-j.bin` model by default.
To use a different model, try the `MODEL_URL` parameter, like so:

```bash
kurtosis run github.com/kurtosis-tech/autogpt-package '{"GPT_4ALL": true, "MODEL_URL": "https://gpt4all.io/models/ggml-gpt4all-l13b-snoozy.bin"}'
```
## Development
To develop on this package, clone this repo and run the following:

```bash
kurtosis run . --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE"}'
```
Note the `.` - this tells Kurtosis to use the version of the package on your local machine (rather than the version on GitHub).
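As a sketch of a typical local iteration loop (assuming the same `autogpt` enclave name used above), you can tear down the previous run and re-run the package from your local checkout after each change:

```bash
# Remove the enclave from the previous run, then re-run the package from the local checkout
kurtosis enclave rm -f autogpt
kurtosis run . --enclave autogpt '{"OPENAI_API_KEY": "YOUR_API_KEY_HERE"}'
```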
Kurtosis also has an extension available on the VSCode marketplace that provides syntax highlighting and autocompletion for the Starlark that this package is composed of.
## Feedback or Questions?
Let us know in our Discord or on Twitter @KurtosisTech!
Feel free to create an issue on GitHub if you have any bugs or feature requests.