<div align="center"> <h1 align="center">🎵 Serving Bark with BentoML</h1> </div>

Bark is a transformer-based text-to-audio model created by Suno. It can generate highly realistic, multilingual speech as well as other audio, including music, background noise, and simple sound effects.
This is a BentoML example project, demonstrating how to build an audio generation API server using Bark. See here for a full list of BentoML example projects.
## Install dependencies
```bash
git clone https://github.com/bentoml/BentoBark.git
cd BentoBark

# Recommend Python 3.11
pip install -r requirements.txt
```
## Run the BentoML Service
We have defined a BentoML Service in `service.py`. Run `bentoml serve` in your project directory to start the Service.
```bash
$ bentoml serve service:SunoBark

2024-04-03T04:21:20+0000 [WARNING] [cli] Converting 'SunoBark' to lowercase: 'sunobark'.
2024-04-03T04:21:20+0000 [INFO] [cli] Starting production HTTP BentoServer from "service:SunoBark" listening on http://localhost:3000 (Press CTRL+C to quit)
```
The server is now active at http://localhost:3000. You can interact with it using the Swagger UI or in other ways. Note that you can set `voice_preset` to simulate your desired voice. See the Bark Speaker Library for details (use the value in the Prompt Name column).
### CURL
```bash
curl -X 'POST' \
  'http://localhost:3000/generate' \
  -H 'accept: audio/*' \
  -H 'Content-Type: application/json' \
  -d '{
  "text": "♪ In the jungle, the mighty jungle, the lion barks tonight ♪",
  "voice_preset": null
}'
```
### Python client
```python
import bentoml

with bentoml.SyncHTTPClient("http://localhost:3000") as client:
    result = client.generate(
        text="♪ In the jungle, the mighty jungle, the lion barks tonight ♪",
        voice_preset="",
    )
```
## Deploy to BentoCloud
After the Service is ready, you can deploy the application to BentoCloud for better management and scalability. Sign up if you haven't got a BentoCloud account.
Make sure you have logged in to BentoCloud, then run the following command to deploy the project.
```bash
bentoml deploy .
```
Once the application is up and running on BentoCloud, you can access it via the exposed URL.
Note: For custom deployment in your own infrastructure, use BentoML to generate an OCI-compliant image.