<img src="assets/logo.png" width="350">

Research @ TUGraz & BlackShark.ai (CVPR 2022)

# PolyWorld Inference and Evaluation Code
PolyWorld is a research project conducted by the Institute of Computer Graphics and Vision of TUGraz, in collaboration with BlackShark.ai. PolyWorld is a neural network that extracts polygonal objects from an image in an end-to-end fashion. The model detects vertex candidates and predicts the connection strength between each pair of vertices using a Graph Neural Network. This repo includes inference code and pretrained weights for evaluating PolyWorld on the CrowdAI Mapping Challenge dataset.
<p align="center"> <img src="assets/teaser.png" width="450"> </p>
- Paper PDF: PolyWorld: Polygonal Building Extraction with Graph Neural Networks in Satellite Images
- Authors: Stefano Zorzi, Shabab Bazrafkan, Stefan Habenschuss, Friedrich Fraundorfer
- Video: YouTube link
- Poster: Seafile link
## Dependencies
- pycocotools
- pyshp
- torch
## Getting started
After cloning the repo, download the polyworld_backbone pre-trained weights from here, and place the file in the trained_weights folder.
The CrowdAI Mapping Challenge dataset can be downloaded here.
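Optionally, you can sanity-check the downloaded files before running the evaluation. The snippet below is a minimal sketch: the weight filename and annotation path are assumptions and should be adjusted to your local layout.

```python
# Minimal sanity check of the downloaded files (paths below are assumptions).
import torch
from pycocotools.coco import COCO

# Load the pretrained backbone weights on CPU just to confirm the file is intact.
state_dict = torch.load("trained_weights/polyworld_backbone", map_location="cpu")
print(f"Loaded {len(state_dict)} entries from the checkpoint")

# Load the CrowdAI validation annotations and report basic statistics.
coco = COCO("data/val/annotation.json")  # hypothetical path to the CrowdAI val annotations
print(f"{len(coco.getImgIds())} images, {len(coco.getAnnIds())} building annotations")
```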
## Run the evaluation on the CrowdAI Mapping Challenge dataset
To run the evaluation, specify batch size, image folder, and annotation file of the CrowdAI dataset in the main function of the prediction.py script. Then simply run:
python prediction.py
The code was tested on an NVIDIA RTX 3090 with batch_size = 6.
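For reference, the configuration set in the main function boils down to something like the following (the variable names here are hypothetical; the actual script may name them differently):

```python
# Hypothetical example of the values configured in the main function of prediction.py.
batch_size = 6                                  # tested on an RTX 3090
images_directory = "data/val/images/"           # folder with the CrowdAI val images
annotations_path = "data/val/annotation.json"   # CrowdAI val annotation file
```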
During inference, the script converts the predicted polygons to COCO format and saves a json file (predictions.json).
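Each entry of predictions.json follows the standard COCO results convention for polygon segmentations; the sketch below shows the expected structure with purely illustrative values:

```python
# One entry of predictions.json in the standard COCO results format:
# polygon vertices are stored as a flat [x1, y1, x2, y2, ...] list under
# "segmentation". The numbers below are illustrative only.
example_prediction = {
    "image_id": 0,
    "category_id": 100,        # building category id used by the CrowdAI annotations
    "score": 0.99,
    "segmentation": [[10.0, 12.5, 85.0, 12.5, 85.0, 90.0, 10.0, 90.0]],
}
```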
If you wish to visualize the results in QGIS, we suggest converting the predictions from COCO json format to shapefile using the coco_to_shp.py script. To run the conversion, specify the json file and the output folder in the main function, and then type:
python coco_to_shp.py
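For reference, the conversion boils down to writing one polygon record per prediction with pyshp. The sketch below only illustrates the idea; the file paths, field names, and y-axis flip are assumptions, not the exact behavior of coco_to_shp.py.

```python
# Illustrative COCO-to-shapefile conversion sketch using pyshp (not the exact script).
import json
import shapefile  # pyshp

with open("predictions.json") as f:
    predictions = json.load(f)

w = shapefile.Writer("shp_output/buildings", shapeType=shapefile.POLYGON)
w.field("IMAGE_ID", "N")
w.field("SCORE", "F", decimal=4)

for ann in predictions:
    xy = ann["segmentation"][0]
    # Re-pair the flat [x1, y1, x2, y2, ...] list into (x, y) vertices,
    # negating y so image-space polygons display upright in QGIS (an assumption).
    ring = [(xy[i], -xy[i + 1]) for i in range(0, len(xy), 2)]
    ring.append(ring[0])  # close the ring explicitly
    w.poly([ring])
    w.record(ann["image_id"], ann.get("score", 1.0))

w.close()
```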
In order to compute AP and AR metrics with the COCO API, or the MTA metric, please use the script provided by the Frame Field Learning repo.
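For the COCO AP and AR numbers specifically, a minimal pycocotools sketch looks like the following (file paths are assumptions; the MTA metric still requires the Frame Field Learning script):

```python
# Minimal AP/AR evaluation sketch with the COCO API (paths are assumptions).
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("data/val/annotation.json")      # ground-truth annotations
coco_dt = coco_gt.loadRes("predictions.json")   # PolyWorld predictions

coco_eval = COCOeval(coco_gt, coco_dt, iouType="segm")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()   # prints AP/AR at the standard COCO thresholds
```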
If you want to compute the IoU and C-IoU metrics, use the coco_IoU_cIoU.py script. To run the evaluation, specify the json file and the ground-truth annotations in the main function, then run:
python coco_IoU_cIoU.py
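For reference, the two metrics reduce to the following computations on rasterized masks. This is a sketch of the definitions as described in the paper, not the exact coco_IoU_cIoU.py implementation.

```python
# Sketch of the IoU and C-IoU metric definitions on binary masks (illustrative only).
import numpy as np

def iou(mask_pred: np.ndarray, mask_gt: np.ndarray) -> float:
    """Intersection over union of two binary masks."""
    inter = np.logical_and(mask_pred, mask_gt).sum()
    union = np.logical_or(mask_pred, mask_gt).sum()
    return inter / union if union > 0 else 0.0

def c_iou(mask_pred, mask_gt, n_vertices_pred: int, n_vertices_gt: int) -> float:
    """Complexity-aware IoU: the mask IoU weighted by the relative difference
    between the predicted and ground-truth vertex counts."""
    rel_diff = abs(n_vertices_pred - n_vertices_gt) / (n_vertices_pred + n_vertices_gt)
    return iou(mask_pred, mask_gt) * (1.0 - rel_diff)
```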
## Download results
Download links for the PolyWorld predictions on the val-set of the CrowdAI dataset are also provided:
- json results: here you can find the output annotations in json format, with and without the refinement vertex offsets.
- shp results: here you can find archives containing the shapefile annotations ready to be visualized in QGIS.
## BibTeX citation
If you use any ideas from the paper or code from this repo, please consider citing:
@inproceedings{zorzi2022polyworld,
  title={PolyWorld: Polygonal Building Extraction with Graph Neural Networks in Satellite Images},
  author={Zorzi, Stefano and Bazrafkan, Shabab and Habenschuss, Stefan and Fraundorfer, Friedrich},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={1848--1857},
  year={2022}
}