
batchai - A supplement to Copilot and Cursor - uses AI for batch processing of project code

中文

Demonstration: https://example.batchai.kailash.cloud:8443

<div align="center" width="100%"> <img src="doc/example-site-en.png" width="100%"> </div>

batchai has a simple goal: run a single command to scan and process an entire codebase, letting AI perform bulk tasks such as automatically finding and fixing common bugs or generating unit tests. Think of it as an AI-driven SonarQube for error checking. Essentially, batchai complements tools like Copilot and Cursor by removing the need to copy-paste between chat windows and open files, or to manually add files to the AI's context, which makes the process more efficient.

To demonstrate with the spring-petclinic project (cloned from https://github.com/spring-projects/spring-petclinic), I ran the following batchai command in the cloned directory:

batchai check --fix

And here are the results I got:

<p align="center"> <img src="doc/batchai-demo-1.png" width="800"> </p> <p align="center"> <img src="doc/batchai-demo-2.png" width="800"> </p>

The full results are as follows:

The results above were generated by running batchai with the OpenAI gpt-4o-mini model.

Additionally, I’ve just launched a demo website (https://example.batchai.kailash.cloud:8443), where you can submit your own GitHub repositories for batchai to perform bulk code checks and generate unit tests. Given the high cost of using OpenAI, the demo site uses the open-source model qwen2.5-coder:7b-instruct-fp16 (which doesn't perform as well as gpt-4o-mini), running on my own Ollama instance. Note that due to resource limitations, tasks are executed in a queue.

Here are some interesting findings from dogfooding batchai on my own projects over the past few weeks:

Also, here’s an example of where the AI didn’t quite get it right:

<p align="center"> <img src="doc/batchai-demo-3.png" width="800"> </p>

The issue of LLM hallucinations is unavoidable, so to prevent AI mistakes from overwriting our own changes, I designed batchai to work only in a clean Git repository. If there are any unstaged files, batchai will refuse to execute. This way, we can use git diff to review the changes batchai made, and if there are mistakes, we can simply revert them. This step is still essential.
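
A typical review loop looks roughly like this (a minimal sketch using only standard git commands plus the batchai command shown above; adjust it to your own workflow):

```bash
# start from a clean working tree, otherwise batchai refuses to run
git status

# let batchai check and fix files in bulk
batchai check --fix .

# review every change the AI made
git diff

# revert everything if the AI got it wrong...
git restore .

# ...or keep the changes
git add -A && git commit -m "Apply batchai fixes"
```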

Features

Planned features

Currently, batchai only supports bulk code checks and generating unit test code, but planned features include code explanation and comment generation, refactoring, and more — all to be handled in bulk. Another goal is to give batchai a broader understanding of the project by building a cross-file code symbol index, which should help the AI work more effectively.

Of course, suggestions and feature requests are always welcome at the Issues page. Feel free to contribute, and we can discuss them together.

Getting Started

  1. Download the latest executable binary from here and add it to your $PATH. On Linux and macOS, remember to run chmod +x ... to make the binary executable.

  2. Clone the demo project. The following steps assume the cloned project directory is /data/spring-petclinic:

    cd /data
    git clone https://github.com/spring-projects/spring-petclinic
    cd spring-petclinic
    

    In this directory, create a .env file and set OPENAI_API_KEY in it. For example:

    OPENAI_API_KEY=change-it
    

    For Ollama, you can refer to my example docker-compose.yml; a minimal Docker-based sketch is also shown after this list.

  3. CLI Examples:

    • Report issues to the console (also saved to build/batchai):
    cd /data/spring-petclinic
    batchai check . src/main/java/org/springframework/samples/petclinic/vet/Vets.java
    
    • Directly fix the target files via option --fix:
    cd /data/spring-petclinic
    batchai check --fix . src/main/java/org/springframework/samples/petclinic/vet/Vets.java
    
    • Run batchai on the main Java code only:
    cd /data/spring-petclinic
    batchai check . src/main/java/
    
    • Run batchai on the entire project:
    cd /data/spring-petclinic
    batchai check .
    
    • Generate unit test code for the entire project:
    cd /data/spring-petclinic
    batchai test .
    

Supported LLMs

Tested and supported models:

To add more LLMs, simply follow the configuration in res/static/batchai.yaml, as long as the LLM exposes an OpenAI-compatible API.
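
Before wiring a new model into that configuration, you can sanity-check that the endpoint really speaks the OpenAI-compatible chat completions API, for example with curl (shown here against a local Ollama instance; the URL and model name are just examples):

```bash
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen2.5-coder:7b-instruct-fp16",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```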

Configuration

License

MIT
