<h1>AutoWebGLM: Bootstrap And Reinforce A Large Language Model-based Web Navigating Agent</h1>

This is the official implementation of AutoWebGLM. If you find our open-sourced efforts useful, please 🌟 the repo to encourage our continued development!

## Overview

📄 [Paper](https://arxiv.org/abs/2404.03648)

AutoWebGLM is a project aimed at building a more efficient language-model-driven automated web navigation agent. It is built on top of the ChatGLM3-6B model, extending its capabilities to navigate the web more effectively and to better handle real-world browsing challenges.

## Features

## Evaluation

We have publicly released our evaluation code, data, and environments. You can reproduce the experiments as described below.

### AutoWebBench & Mind2Web

You can find our evaluation datasets at <a href="./autowebbench/">AutoWebBench</a> and <a href="./mind2web/">Mind2Web</a>. For the model inference code, please refer to <a href="https://huggingface.co/THUDM/chatglm3-6b">ChatGLM3-6B</a>. After obtaining the output file, compute the score with `python eval.py [result_path]`.
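For reference, the snippet below is a minimal inference sketch, not the repository's official pipeline: it loads ChatGLM3-6B through the Transformers `chat` API, runs it over a benchmark file, and writes a prediction file for `eval.py`. The input/output paths and the `prompt`/`prediction` field names are illustrative assumptions; adapt them to the actual dataset schema.

```python
# Minimal sketch: run ChatGLM3-6B over a benchmark file and dump predictions.
# File paths and the "prompt"/"prediction" field names are illustrative
# assumptions, not the repository's actual schema.
import json

from transformers import AutoModel, AutoTokenizer

MODEL_ID = "THUDM/chatglm3-6b"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID, trust_remote_code=True).half().cuda().eval()

with open("autowebbench/test.json", encoding="utf-8") as f:  # illustrative input path
    samples = json.load(f)

results = []
for sample in samples:
    # ChatGLM3's dialogue API returns (response, updated_history).
    response, _ = model.chat(tokenizer, sample["prompt"], history=[])
    results.append({**sample, "prediction": response})

with open("predictions.json", "w", encoding="utf-8") as f:  # illustrative output path
    json.dump(results, f, ensure_ascii=False, indent=2)
```

The resulting file can then be scored with `python eval.py predictions.json`.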

### WebArena

We have modified the WebArena environment to fit our system's interaction format; see <a href="./webarena/">WebArena</a>. The modifications and execution instructions can be found in the <a href="./webarena/README.md">README</a>.

### MiniWob++

We have also modified the MiniWob++ environment; see <a href="./miniwob++/">MiniWob++</a>. The modifications and execution instructions can be found in the <a href="./miniwob++/README.md">README</a>.

## License

This repository is licensed under the Apache-2.0 License. All open-sourced data is for research purposes only.

## Citation

If you use this code for your research, please cite our paper.

```
@misc{lai2024autowebglm,
    title={AutoWebGLM: Bootstrap And Reinforce A Large Language Model-based Web Navigating Agent},
    author={Lai, Hanyu and Liu, Xiao and Iong, Iat Long and Yao, Shuntian and Chen, Yuxuan and Shen, Pengbo and Yu, Hao and Zhang, Hanchen and Zhang, Xiaohan and Dong, Yuxiao and Tang, Jie},
    year={2024},
    eprint={2404.03648},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```