
Official source code for PD-Net

Polysemy Deciphering Network for Human-Object Interaction Detection ([ECCV2020 paper] [Code])

Polysemy Deciphering Network for Robust Human-Object Interaction Detection ([IJCV2021 paper])

<img src="https://github.com/MuchHair/PD-Net-Extended-Version/blob/master/Paper_Images/overview.png" width="999" >

Train, Test, and Evaluate the Model on HICO-DET

Preprocess data

  1. Please prepare these files (pwd:1111) and put them in the data/hico/hico_processed directory.
  2. Prepare faster_rcnn_fc7.hdf5 (Step 1 in No-frills) and put it in the data/hico/hico_processed directory.
  3. Please follow No-frills to obtain the "hoi_candidates_subset.hdf5", "hoi_candidates_box_feats_subset.hdf5", and "hoi_candidate_labels_subset.hdf5" files, and put them in the data/hico/hoi_candidates directory.
  4. Prepare the pose features:
# prepare input file for AlphaPose
python -m lib.data_process.prepare_for_pose

Use AlphaPose to obtain the pose results, then convert the results and cache the features:

# convert and generate features
python -m lib.data_process_hico.convert_pose_result
python -m lib.data_process_hico.cache_alphapose_features

Put the final .hdf5 pose file in the data/hico/hoi_candidates directory.

  5. If evaluating PD-Net with INet, download the pre-trained INet predictions (pwd:1111) and put this .hdf5 file in the output/hico-det/INet/ directory. A quick sanity check of the prepared files is sketched below.
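Before training, it can help to confirm that the files from steps 1–4 are in place and readable. The snippet below is only a sketch: the pose-file name (alphapose_features.hdf5) is a placeholder, and "subset" stands for whichever split names the No-frills scripts produced, so adjust the paths to the files you actually generated.

# check_preprocessing.py -- sanity check for the prepared .hdf5 files (not part of the repo)
import os
import h5py  # pip install h5py

PROCESSED = 'data/hico/hico_processed'
CANDIDATES = 'data/hico/hoi_candidates'

required = [
    os.path.join(PROCESSED, 'faster_rcnn_fc7.hdf5'),
    os.path.join(CANDIDATES, 'hoi_candidates_subset.hdf5'),
    os.path.join(CANDIDATES, 'hoi_candidates_box_feats_subset.hdf5'),
    os.path.join(CANDIDATES, 'hoi_candidate_labels_subset.hdf5'),
    os.path.join(CANDIDATES, 'alphapose_features.hdf5'),  # placeholder name for the final pose file from step 4
]

for path in required:
    if not os.path.exists(path):
        print('MISSING:', path)
        continue
    # open read-only and report the number of top-level groups/datasets
    with h5py.File(path, 'r') as f:
        print('OK: %s (%d top-level keys)' % (path, len(f.keys())))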

# train
CUDA_VISIBLE_DEVICES=0 python tools/vcoco/train_net_pd.py

# test (choose a model MODEL_NUM to test; the process will generate a .hdf5 file used for eval)
CUDA_VISIBLE_DEVICES=0 python tools/vcoco/test_net_pd.py --model_num MODEL_NUM --eval_with_INet True

# eval (use the .hdf5 generated above to eval)
bash eval/compute_mAP.sh
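
For convenience, the test and eval steps can also be chained from a small script. This is only a sketch (run_test_eval.py is not part of the repository); it simply shells out to the same commands shown above.

# run_test_eval.py -- hypothetical wrapper around the test and eval commands above
import os
import subprocess
import sys

if len(sys.argv) != 2:
    sys.exit('usage: python run_test_eval.py MODEL_NUM')
model_num = sys.argv[1]  # checkpoint index to score

env = {**os.environ, 'CUDA_VISIBLE_DEVICES': '0'}

# test: writes the prediction .hdf5 file for the chosen checkpoint
subprocess.run(
    ['python', 'tools/vcoco/test_net_pd.py',
     '--model_num', model_num, '--eval_with_INet', 'True'],
    env=env, check=True)

# eval: computes HICO-DET mAP from the prediction file generated above
subprocess.run(['bash', 'eval/compute_mAP.sh'], env=env, check=True)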

Pretrained model (22.37 mAP on HICO-DET)
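
A quick way to verify the download is to inspect the checkpoint before testing. The sketch below assumes the file is a standard PyTorch checkpoint (either a bare state dict or a dict wrapping one under a 'state_dict' key); the path is a placeholder for wherever you saved the file.

# inspect_checkpoint.py -- minimal sketch for inspecting the downloaded checkpoint (not part of the repo)
import torch

# placeholder path: point this at the downloaded checkpoint file
ckpt = torch.load('path/to/pd_net_hico_pretrained.pth', map_location='cpu')

# some checkpoints wrap the weights under a 'state_dict' key; unwrap if present
state_dict = ckpt.get('state_dict', ckpt) if isinstance(ckpt, dict) else ckpt.state_dict()

# print the first few parameter names and shapes as a basic integrity check
for name, tensor in list(state_dict.items())[:10]:
    print(name, tuple(tensor.shape))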

HOI-VP Dataset

The images are provided by VG, and the annotations (based on HCVRD) can be obtained from this link (pwd:1111).

Citation

Please consider citing this paper in your publications if it helps your research. The following is a BibTeX reference.

@article{zhong2021polysemy,
  title={Polysemy Deciphering Network for Robust Human-Object Interaction Detection},
  author={Zhong, Xubin and Ding, Changxing and Qu, Xian and Tao, Dacheng},
  journal={IJCV},
  year={2021}
}