Home

Introduction

CheetahInfer is a pure C++ inference SDK based on TensorRT, which supports fast inference of CNN-based computer vision models.

Features

Prerequisites

CheetahInfer has several dependencies: CUDA, TensorRT, and OpenCV.

After installing the above dependencies, modify TENSORRT_INSTALL_DIR and OPENCV_INSTALL_DIR in Makefile.config, and update the LD_LIBRARY_PATH and PATH environment variables in your .bashrc accordingly, for example:

export LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:/home/yichaoxiong/opt/lib/tensorrt:/home/yichaoxiong/opt/lib/opencv"
export PATH="${PATH}:/usr/local/cuda-10.2/bin"
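The corresponding entries in Makefile.config should point to your own installation directories. A minimal sketch, assuming Makefile.config uses plain Makefile variable assignments (the paths below are placeholders, not actual values from the repository):

# Paths to local TensorRT and OpenCV installations (adjust to your setup)
TENSORRT_INSTALL_DIR = /path/to/tensorrt
OPENCV_INSTALL_DIR = /path/to/opencv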

Preparation for model and data

Compilation and running

cd classifier
make -j4
./build/main --imgfp /path/to/image

If you want to specify which GPU to use, you can do so by setting the CUDA_VISIBLE_DEVICES environment variable.
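For example, to restrict the program to GPU 0:

# Only GPU 0 is visible to the process
CUDA_VISIBLE_DEVICES=0 ./build/main --imgfp /path/to/image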

Contact

This repository is currently maintained by Hongxiang Cai (@hxcai) and Yichao Xiong (@mileistone).

Credits

Some code is adapted from TensorRT and retinanet-examples.