# Binder🔗: Binding Language Models in Symbolic Languages
<p align="left"> <a href="https://img.shields.io/badge/PRs-Welcome-red"> <img src="https://img.shields.io/badge/PRs-Welcome-red"> </a> <a href="https://img.shields.io/github/last-commit/HKUNLP/Binder?color=green"> <img src="https://img.shields.io/github/last-commit/HKUNLP/Binder?color=green"> </a> <br/> </p>

Code for the paper [Binding Language Models in Symbolic Languages](https://arxiv.org/abs/2210.02875). Please refer to our project page for more demonstrations and up-to-date related resources. Check out our demo page for an instant experience of Binder, which achieves SOTA or comparable performance with only dozens (~10) of program annotations.
<img src="pics/binder.png" align="middle" width="100%">

## Updates
- 2023-08-25: 🔥 Updated to support OpenAI chat-series models like `gpt-3.5-xxx` and `gpt-4-xxx`; the code will be further refactored later to support more!
- 2023-03-23: Since OpenAI no longer supports the Codex series models, we will soon test and update the engine from `code-davinci-002` to `gpt-3.5-turbo`.
- 2023-01-22: Accepted by ICLR 2023 (Spotlight)
- 2022-12-04: Due to OpenAI's new policy on request limits, `n`-sampling can no longer be done as before; we will soon add a feature that calls the API multiple times to achieve the same usage!
- 2022-10-06: We released our code, Hugging Face Spaces demo, and project page. Check them out!
## Dependencies
To set up the environment, run the following in a shell:

```shell
conda env create -f py3.7binder.yaml
pip install records==0.5.3
```

This creates the `binder` environment we used.
## Usage
### Environment setup

Activate the environment by running:

```shell
conda activate binder
```
### Add key

Apply for and obtain an API key (of the form `sk-xxxx`) from the OpenAI API, and save it in a `key.txt` file. Make sure you have the rights to access the model you need (`code-davinci-002` in this repo's implementation).
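A minimal sketch of this step, assuming the key is read back from `key.txt` as plain text. The `load_api_key` helper and its `sk-` validation are illustrative assumptions, not the repo's actual loading code:

```python
from pathlib import Path

KEY_FILE = Path("key.txt")

def load_api_key(path: Path = KEY_FILE) -> str:
    """Read an OpenAI API key (sk-xxxx form) from a text file.

    Illustrative helper only; the repo's own loading code may differ.
    """
    key = path.read_text(encoding="utf-8").strip()
    if not key.startswith("sk-"):
        raise ValueError("key.txt should hold an OpenAI key starting with 'sk-'")
    return key

if __name__ == "__main__":
    # Write a placeholder key, then read it back.
    KEY_FILE.write_text("sk-xxxx\n", encoding="utf-8")
    print(load_api_key())
```

Stripping whitespace guards against a trailing newline that most editors add when saving `key.txt`.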
### Run

Check out the commands in `run.py`.
## Citation

If you find our work helpful, please cite it as:

```bibtex
@article{Binder,
  title={Binding Language Models in Symbolic Languages},
  author={Zhoujun Cheng and Tianbao Xie and Peng Shi and Chengzu Li and Rahul Nadkarni and Yushi Hu and Caiming Xiong and Dragomir Radev and Mari Ostendorf and Luke Zettlemoyer and Noah A. Smith and Tao Yu},
  journal={ICLR},
  year={2023},
  volume={abs/2210.02875}
}
```
## Contributors
<a href="https://github.com/BlankCheng"> <img src="https://avatars.githubusercontent.com/u/34505296?v=4" width="50" /></a> <a href="https://github.com/Timothyxxx"> <img src="https://avatars.githubusercontent.com/u/47296835?v=4" width="50" /></a> <a href="https://github.com/chengzu-li"><img src="https://avatars.githubusercontent.com/u/69832207?v=4" width="50" /></a> <a href="https://github.com/Impavidity"> <img src="https://avatars.githubusercontent.com/u/9245607?v=4" width="50" /></a> <a href="https://github.com/Yushi-Hu"><img src="https://avatars.githubusercontent.com/u/65428713?v=4" width="50" /></a> <a href="https://github.com/taoyds"><img src="https://avatars.githubusercontent.com/u/14208639?v=4" width="50" /></a>