pytorch-0.4-yolov3 : Yet Another Implementation of YOLOv3 for PyTorch 0.4.1 or later

This repository implements yolov3 with PyTorch 0.4, derived from marvis/pytorch-yolo2.

This repository is forked from the great pytorch-yolo2 work of @marvis, but I couldn't upload my changes directly to the marvis source files because many files were changed, even their filenames.

There are a number of differences between this repository and the original marvis version.

If you want to know the training and detection procedures in detail, please refer to https://github.com/marvis/pytorch-yolo2.

Train on your own data, or on the COCO or VOC datasets, as follows:

python train.py -d cfg/coco.data -c cfg/yolo_v3.cfg -w yolov3.weights
python train.py -d cfg/my.data -c cfg/my.cfg -w yolov3.weights -r
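
The -d argument points at a darknet-style .data file that describes your dataset. Below is a minimal sketch of cfg/my.data; the field names follow the darknet convention, and every path is a placeholder for your own files:

classes = 20
train   = data/my_train.txt
valid   = data/my_test.txt
names   = data/my.names
backup  = backup

Here train and valid each list one image path per line, names holds one class name per line, and checkpoints are typically written into the backup directory.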

I recorded yolov2 and yolov3 training runs on my own data.

Detect objects in the dog image using pretrained weights:

yolov2 models

wget http://pjreddie.com/media/files/yolo.weights
python detect.py cfg/yolo.cfg yolo.weights data/dog.jpg data/coco.names 

Loading weights from yolo.weights... Done!
data\dog.jpg: Predicted in 0.832918 seconds.
3 box(es) is(are) found
truck: 0.934710
bicycle: 0.998012
dog: 0.990524
save plot results to predictions.jpg

yolov3 models

wget https://pjreddie.com/media/files/yolov3.weights
python detect.py cfg/yolo_v3.cfg yolov3.weights data/dog.jpg data/coco.names  

Loading weights from yolov3.weights... Done!

data\dog.jpg: Predicted in 0.837523 seconds.
3 box(es) is(are) found
dog: 0.999996
truck: 0.995232
bicycle: 0.999973
save plot results to predictions.jpg
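
Both detect.py invocations above follow the same code path. If you want to call the detector from your own script, the flow is roughly the following. This is a minimal sketch modeled on the marvis API, so the module and helper names (Darknet, do_detect, load_class_names, plot_boxes) are assumptions that may differ slightly in this fork:

from PIL import Image
from darknet import Darknet                                # network built from a darknet .cfg (assumed module)
from utils import do_detect, load_class_names, plot_boxes  # assumed helper functions

model = Darknet('cfg/yolo_v3.cfg')                # parse the layer definitions from the cfg
model.load_weights('yolov3.weights')              # load pretrained darknet weights
model.eval()

img = Image.open('data/dog.jpg').convert('RGB')
sized = img.resize((model.width, model.height))   # resize to the network input resolution

# conf_thresh drops weak detections; nms_thresh merges overlapping boxes
boxes = do_detect(model, sized, 0.5, 0.4, use_cuda=True)

class_names = load_class_names('data/coco.names')
plot_boxes(img, boxes, 'predictions.jpg', class_names)     # draw boxes and save the image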

Validate and get evaluation results:

python valid.py data/yourown.data cfg/yourown.cfg yourown_weights
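
valid.py writes per-class detection results, which are then scored against the ground-truth labels with the VOC-style mAP protocol. The core of that scoring is IoU matching between predicted and ground-truth boxes; the helper below is a hypothetical, self-contained illustration of that matching, not the repo's own code:

def bbox_iou(box1, box2):
    # Boxes are (x1, y1, x2, y2) corner coordinates.
    ix1, iy1 = max(box1[0], box2[0]), max(box1[1], box2[1])
    ix2, iy2 = min(box1[2], box2[2]), min(box1[3], box2[3])
    inter = max(ix2 - ix1, 0.0) * max(iy2 - iy1, 0.0)   # intersection area
    area1 = (box1[2] - box1[0]) * (box1[3] - box1[1])
    area2 = (box2[2] - box2[0]) * (box2[3] - box2[1])
    union = area1 + area2 - inter
    return inter / union if union > 0 else 0.0

# A detection counts as a true positive when its IoU with an unmatched
# ground-truth box of the same class exceeds the threshold (0.5 for VOC mAP).
print(bbox_iou((0, 0, 10, 10), (5, 5, 15, 15)))  # -> 25 / 175 ~= 0.143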

Performance on the VOC dataset using yolov2 (trained for 100 epochs):

coord_scale  object_scale  class_scale   mAP
     1            5             1        73.1
     1            5             2        72.7
     1            3             1        73.4
     1            3             2        72.8
     1            1             1        50.4

With recomputed anchors (1.1468, 1.5021, 2.7780, 3.4751, 4.3845, 7.0162, 8.2523, 4.2100, 9.7340, 8.682):

coord_scale  object_scale  class_scale   mAP
     1            3             1        74.4
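
Each of these scales is a hyperparameter of the YOLOv2 region loss: it multiplies one group of loss terms, so raising object_scale, for instance, penalizes confidence errors on cells that contain an object more heavily. Schematically, the weighting looks like this (a sketch of the structure only, not this fork's exact loss code; the loss_* names are illustrative, and noobject_scale is the companion scale for cells without objects):

total_loss = coord_scale    * (loss_x + loss_y + loss_w + loss_h)   # box geometry
           + object_scale   * loss_conf_obj                         # confidence where an object exists
           + noobject_scale * loss_conf_noobj                       # confidence where none exists
           + class_scale    * loss_cls                              # class probabilities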

You may therefore need to run many experiments to find the settings that perform best.

License

MIT License (see LICENSE file).