Visual Prompting

This is the official implementation of the paper Exploring Visual Prompts for Adapting Large-Scale Models.

Installation

Clone this repo:

git clone https://github.com/hjbahng/visual_prompting.git
cd visual_prompting

This code requires Python 3+. Install the dependencies with:

pip install -r requirements.txt

Prepare the pre-trained models:

bash models/download_models.sh

Train/Test for CLIP

python main_clip.py --dataset cifar100 --root [path_to_cifar100] 
python main_clip.py --evaluate --resume /path/to/checkpoints/model_best.pth.tar --dataset cifar100 --root [path_to_cifar100]
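To give a feel for what is being trained, here is a minimal NumPy sketch of a padding-style visual prompt: a set of learnable pixel values on a fixed-width border around the image, added to every input before it is fed to the frozen model. The function names, sizes, and initialization here are illustrative assumptions, not the repo's actual API.

```python
import numpy as np

def make_pad_prompt(image_size=224, pad=30, channels=3, rng=None):
    """Build a padding-style visual prompt (a sketch): learnable values on a
    border of width `pad`, zeros everywhere in the interior."""
    rng = rng or np.random.default_rng(0)
    delta = rng.normal(scale=0.02, size=(channels, image_size, image_size))
    mask = np.zeros((1, image_size, image_size))
    mask[:, :pad, :] = 1    # top border
    mask[:, -pad:, :] = 1   # bottom border
    mask[:, :, :pad] = 1    # left border
    mask[:, :, -pad:] = 1   # right border
    return delta * mask     # prompt is zero except on the border

def apply_prompt(image, prompt):
    """Add the prompt to an image (pixel values assumed in [0, 1]) and clip."""
    return np.clip(image + prompt, 0.0, 1.0)
```

In training, only the border values are optimized; the image content and the model weights are untouched, so a single learned prompt is shared across the whole dataset.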

Train/Test for Vision Models

python main_vision.py --model bit_m_rn50 --dataset cifar100 --root [path_to_cifar100]
python main_vision.py --evaluate --resume /path/to/checkpoints/model_best.pth.tar --model bit_m_rn50 --dataset cifar100 --root [path_to_cifar100]
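The training loop in both scripts follows the same idea: gradients flow through the frozen model back to the input, and only the prompt is updated. The toy example below sketches this with a fixed linear classifier standing in for a pre-trained vision model; the shapes, learning rate, and stand-in model are assumptions for illustration only.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def train_prompt(W, x, y, lr=0.5, steps=100):
    """Gradient descent on an input-space prompt only; W stays frozen."""
    prompt = np.zeros_like(x)
    losses = []
    for _ in range(steps):
        p = softmax(W @ (x + prompt))      # forward through the frozen model
        losses.append(-np.log(p[y]))       # cross-entropy loss
        grad_logits = p.copy()
        grad_logits[y] -= 1.0              # dCE/dlogits
        prompt -= lr * (W.T @ grad_logits) # chain rule back to the input
    return prompt, losses

rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(10, 48))  # "frozen" stand-in for a pre-trained model
x = rng.random(48)                   # one flattened toy "image"
prompt, losses = train_prompt(W, x, y=3)
```

Because `W` is never modified, the same recipe applies to any pre-trained backbone (CLIP or the BiT/vision models above); only the per-pixel prompt is the trainable parameter.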

Citation

If you use this code for your research, please cite our paper:

@article{bahng2022visual,
  title={Exploring Visual Prompts for Adapting Large-Scale Models},
  author={Hyojin Bahng and Ali Jahanian and Swami Sankaranarayanan and Phillip Isola},
  journal={arXiv preprint arXiv:2203.17274},
  year={2022}
}