Auto-CoT: Automatic Chain of Thought Prompting in Large Language Models (ICLR 2023)
Cheer AI up with the "let's think step by step" prompt? More plz. Let’s think not just step by step, but also one by one.
Auto-CoT uses more cheers & diversity to SAVE huge manual efforts in chain of thought prompt design, matching or even exceeding performance of manual design on GPT-3.
Check out our 25-page paper (arXiv:2210.03493) for more information.
Requirements
Python>=3.8

```
pip install torch==1.8.2+cu111 torchtext==0.9.2 -f https://download.pytorch.org/whl/lts/1.8/torch_lts.html
pip install -r requirements.txt
```

The pinned torch==1.8.2+cu111 wheel targets CUDA 11.1; if your driver requires a different CUDA version, pick a matching build from the same PyTorch LTS index.
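If the install succeeds, a quick sanity check should run cleanly (a minimal sketch; the expected versions follow the pins above):

```python
# Minimal environment check (sketch): versions should match the pins above.
import torch
import torchtext

print(torch.__version__)          # expected: 1.8.2+cu111
print(torchtext.__version__)      # expected: 0.9.2
print(torch.cuda.is_available())  # True only if the CUDA 11.1 wheel matches your driver
```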
Datasets
Download the benchmark datasets and the zero-shot-CoT prediction logs (the logs are consumed as --pred_file when constructing demos) from:

https://github.com/kojima-takeshi188/zero_shot_cot/tree/main/dataset

https://github.com/kojima-takeshi188/zero_shot_cot/tree/main/log
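The scripts below read from local dataset/ and log/ directories. A quick way to confirm the layout (a sketch; the log path mirrors the example commands in the next section, and other tasks follow the upstream repository's naming):

```python
# Confirm the expected local layout (sketch; paths mirror the example commands below).
from pathlib import Path

expected = [Path("dataset"), Path("log/multiarith_zero_shot_cot.log")]
for p in expected:
    print(p, "->", "found" if p.exists() else "missing")
```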
Quick Start
See try_cot.ipynb for a runnable end-to-end example.
Instructions
Construct demos:

```
python run_demo.py \
    --task multiarith \
    --pred_file log/multiarith_zero_shot_cot.log \
    --demo_save_dir demos/multiarith
```
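run_demo.py implements the paper's two-stage recipe: partition the task's questions into diverse clusters, then elicit one zero-shot rationale ("Let's think step by step") per cluster to build the demos. Below is a minimal sketch of the selection step, not the repository's exact code (sentence-transformers, scikit-learn, and the MiniLM model name are assumptions here; the paper uses Sentence-BERT encodings with k-means):

```python
# Sketch of Auto-CoT demo selection: cluster questions for diversity, then
# take the question closest to each cluster centroid as a demo candidate.
# Not the repository's exact implementation; library choices are assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

questions = [
    "There are 3 apples and 2 oranges. How many fruits are there?",
    "Tom had 5 pencils and gave away 2. How many pencils are left?",
    # ... the full question set loaded from the dataset
]
num_demos = 8  # one demo per cluster; 8 is a typical few-shot budget

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in Sentence-BERT model
embeddings = encoder.encode(questions)

kmeans = KMeans(n_clusters=min(num_demos, len(questions)), random_state=0).fit(embeddings)

demo_questions = []
for c in range(kmeans.n_clusters):
    idx = np.where(kmeans.labels_ == c)[0]
    dists = np.linalg.norm(embeddings[idx] - kmeans.cluster_centers_[c], axis=1)
    demo_questions.append(questions[idx[np.argmin(dists)]])

# Each selected question is then answered zero-shot with "Let's think step by
# step", and the (question, rationale) pairs become the saved demos.
print(demo_questions)
```

Clustering for diversity is what separates Auto-CoT from sampling questions at random: each cluster contributes one demo, so a flawed generated rationale from any one region of the task is less likely to dominate the prompt.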
Run inference:

```
python run_inference.py \
    --dataset multiarith \
    --demo_path demos/multiarith \
    --output_dir experiment/multiarith
```
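At inference time, the saved demos are prepended to each test question to form a few-shot chain-of-thought prompt. A minimal sketch of that assembly (illustrative only; the field names and exact formatting are assumptions, not the repository's demo format):

```python
# Sketch of few-shot CoT prompt assembly from constructed demos.
# Field names and layout are illustrative, not the repo's exact format.
demos = [
    {"question": "Tom had 5 pencils and gave away 2. How many pencils are left?",
     "rationale": "Let's think step by step. Tom starts with 5 pencils and gives away 2, so 5 - 2 = 3.",
     "answer": "3"},
]

def build_prompt(test_question: str) -> str:
    parts = []
    for d in demos:
        parts.append(f"Q: {d['question']}\nA: {d['rationale']} The answer is {d['answer']}.")
    # The test question reuses the same trigger phrase to elicit a rationale.
    parts.append(f"Q: {test_question}\nA: Let's think step by step.")
    return "\n\n".join(parts)

print(build_prompt("There are 4 birds and 3 more join them. How many birds are there?"))
```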
Citing Auto-CoT
```
@inproceedings{zhang2023automatic,
  title={Automatic Chain of Thought Prompting in Large Language Models},
  author={Zhang, Zhuosheng and Zhang, Aston and Li, Mu and Smola, Alex},
  booktitle={The Eleventh International Conference on Learning Representations (ICLR 2023)},
  year={2023}
}
```
Security
See CONTRIBUTING for more information.
License
This project is licensed under the Apache-2.0 License.