Investigating the Effectiveness of Task-Agnostic Prefix Prompt for Instruction Following (TAPP)

This is the official GitHub repository for Task-Agnostic Prefix Prompt (TAPP).

Overview of Task-Agnostic Prefix Prompt (TAPP)

<p align="center"> <img src="./TAPP.png" width="80%" height="80%"> </p>

Dataset

For the evaluation dataset, we use SuperNatural-Instructions, which can be accessed from the official repo. Simply clone that repo under this directory and rename it to data. We run our main experiments on the 119 test tasks designated by the official repository, and we randomly sample at most 100 instances for each of these tasks.
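As a minimal sketch, the setup could look like the following, assuming the dataset lives in the allenai/natural-instructions repository (an assumption; check the official SuperNatural-Instructions repo for the exact URL):

git clone https://github.com/allenai/natural-instructions.git
mv natural-instructions data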

Setting

The following command will clone the project:

git clone https://github.com/seonghyeonye/TAPP.git

Before experimenting, you can create a virtual environment for the project:

conda create -n zeroshotlm python=3.8
conda activate zeroshotlm
pip install -r requirements.txt

Run

You can run inference with various prompting schemes using the scripts under scripts/gpt3 or scripts/decoder. For instance, to test TAPP on GPT-3 (davinci) over the 119 test tasks from SuperNatural-Instructions, run scripts/gpt3/run_ICIL.sh as shown below.
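For example, from the repository root (assuming the API key has already been set inside the script, as described in the OpenAI API section):

bash scripts/gpt3/run_ICIL.sh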

Result of Task-Agnostic Prefix Prompt (TAPP)

<p align="center"> <img src="./result.png" width="80%" height="80%"> </p>

Irrelevant TAPP

For Irrelevant TAPP, we randomly corrupt input sentences of the prompts with sentences from cc_news. To access this file, please visit our Google Drive. We downloaded the Hugging Face cc_news dataset from here, parsed it with en_core_web_sm from spaCy, and turned it into a .txt file consisting of its sentences.
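As a rough sketch of this preprocessing (not our exact script; the output filename and the number of articles processed below are illustrative assumptions):

# Sketch: split cc_news articles into sentences with spaCy and write them to a .txt file.
# Requires: pip install datasets spacy && python -m spacy download en_core_web_sm
import spacy
from datasets import load_dataset

nlp = spacy.load("en_core_web_sm")
dataset = load_dataset("cc_news", split="train")

with open("cc_news_sentences.txt", "w") as f:
    for article in dataset.select(range(10000)):  # subset size is an illustrative assumption
        doc = nlp(article["text"])
        for sent in doc.sents:
            sentence = sent.text.strip()
            if sentence:
                f.write(sentence + "\n")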

However, the demo directory already contains all of the extracted irr_ICIL demos, so you can instead run run_ICIL.sh (rather than run_irr_ICIL.sh) after changing the --demo_path option, as sketched below.
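For example, inside scripts/gpt3/run_ICIL.sh, point the option at the extracted demos (the path demo/irr_ICIL is an assumption; check the demo directory for the actual name):

--demo_path demo/irr_ICIL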

Experiments

To replicate our experiments in the paper, please refer to the following resources:

OpenAI API

For experiments with GPT-3 models (curie, davinci, text-curie, text-davinci), you need an OpenAI API key, which can be obtained here. After acquiring the key, insert it into export openai_key="" in each script file in the scripts directory.
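For example, the line in each script would look like the following (the value shown is a placeholder, not a real key):

export openai_key="YOUR_OPENAI_API_KEY"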

Acknowledgements

Our code repository is mainly based on Tk-Instruct. Special thanks to the contributors of the repository!

Citations

@article{ye2023context,
  title={In-Context Instruction Learning},
  author={Ye, Seonghyeon and Hwang, Hyeonbin and Yang, Sohee and Yun, Hyeongu and Kim, Yireun and Seo, Minjoon},
  journal={arXiv preprint arXiv:2302.14691},
  year={2023}
}