chuanenlin / drone-net

146 stars · 6 watchers · 66 forks · 92.08 MB

Tutorial: https://towardsdatascience.com/tutorial-build-an-object-detection-system-using-yolo-9a930513643a

Languages: Python 100.00%
Topics: object-detection, yolov3, darknet, pretrained-weights, drones, dataset

drone-net's People

Contributors

andrewkinsman, chuanenlin, iibrahimli

drone-net's Issues

Label file format

Quick question about the label files.

The YOLO documentation says:

Where x, y, width, and height are relative to the image's width and height.

In your label files, however, the values appear to be absolute pixel coordinates (x, y of the box center) and width/height. Is that correct?
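
For reference, the Darknet/YOLO convention is one line per object, <class_id> <x_center> <y_center> <width> <height>, with all four values normalized by the image width and height. A minimal Python sketch of the conversion from absolute pixel values (the function name and the single class id 0 are illustrative, not from the repository):

def to_yolo_label(x_center_px, y_center_px, box_w_px, box_h_px, img_w, img_h, class_id=0):
    # Normalize an absolute-pixel box (center + size) into a YOLO label line.
    x = x_center_px / img_w
    y = y_center_px / img_h
    w = box_w_px / img_w
    h = box_h_px / img_h
    return f"{class_id} {x:.6f} {y:.6f} {w:.6f} {h:.6f}"

# Example: a 100x60 px box centered at (320, 240) in a 640x480 image
print(to_yolo_label(320, 240, 100, 60, 640, 480))  # 0 0.500000 0.500000 0.156250 0.125000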

Weight file and cfg file type

Hi, I'm wondering which pre-trained weights you used for training to generate the yolo-drone weights: was it yolov3-tiny, yolov2, or voc?

Also, did you do any pre-processing of the images before training?

Weight Files

Sorry to bother you, but I have a question: how did you convert the cpkg files into a standalone weights file?

Wrong detection results

Hi, I'm using the default cfg and weights for drone detection. However, I'm not getting accurate results. In fact, multiple drones are being detected inside a single YOLO bounding box.

(Screenshots from 2021-11-01 and 2021-10-31 showing the incorrect detections are attached.)

Here are some of the results on the training images (prediction images attached).

I even trained the network from scratch (MSE loss ~ 0.18), but still no improvement. I'm using the YOLOv4 branch of AlexeyAB/darknet, which expects normalised coordinates, and I'm also using normalised coordinates for training. What could be the reason for my error? Thanks!
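
One quick way to rule out a label-format mismatch is to check whether the values in the label files actually fall in the normalized [0, 1] range that AlexeyAB/darknet expects. A minimal Python sketch (the labels/*.txt glob is an assumed directory layout, not the repository's):

import glob

# Flag label lines whose coordinates are outside [0, 1], which would
# suggest absolute pixel values were written instead of normalized ones.
for path in glob.glob("labels/*.txt"):
    with open(path) as f:
        for line_no, line in enumerate(f, start=1):
            parts = line.split()
            if not parts:
                continue
            if len(parts) != 5:
                print(f"{path}:{line_no}: expected 5 fields, got {len(parts)}")
                continue
            coords = [float(v) for v in parts[1:]]
            if any(c < 0.0 or c > 1.0 for c in coords):
                print(f"{path}:{line_no}: out-of-range values {coords}")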

batch size

What do you think is the minimum training batch size?

Image labels are wrong

You should specify the labeling format. If it is the standard <class_id> <x_center> <y_center> <width> <height> format, the bounding boxes are completely broken.
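
A quick way to check whether the labels actually line up with the images is to draw them back onto a sample image, assuming the normalized <class_id> <x_center> <y_center> <width> <height> format (the file names below are placeholders):

import cv2  # OpenCV, used only to read the image, draw boxes, and save the result

image = cv2.imread("example.jpg")
img_h, img_w = image.shape[:2]

with open("example.txt") as f:  # the matching YOLO label file
    for line in f:
        _, x, y, w, h = map(float, line.split())
        # Convert normalized center/size back to pixel corner coordinates
        x1 = int((x - w / 2) * img_w)
        y1 = int((y - h / 2) * img_h)
        x2 = int((x + w / 2) * img_w)
        y2 = int((y + h / 2) * img_h)
        cv2.rectangle(image, (x1, y1), (x2, y2), (0, 255, 0), 2)

cv2.imwrite("example_labeled.jpg", image)

If the drawn boxes land far from the drones, the labels are in a different format (for example, absolute pixels) than the one assumed above.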

Train Darknet

When I run darknet to train my custom set of images, I get the following error:

darknet: ./src/parser.c:315: parse_yolo: Assertion `l.outputs == params.inputs' failed.
Aborted (core dumped)
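
This assertion is commonly triggered when the filters= value of the convolutional layer directly before a [yolo] layer does not match the number of classes set in the cfg. A minimal Python sketch of the expected value, assuming the usual YOLOv3 setup of 3 anchor masks per [yolo] layer:

def expected_filters(num_classes, masks_per_yolo_layer=3):
    # Each [yolo] layer needs (classes + 5) outputs per anchor mask
    # from the conv layer that feeds it.
    return (num_classes + 5) * masks_per_yolo_layer

# For a single "drone" class:
print(expected_filters(1))  # 18, matching the 18-filter conv layers in the tiny-YOLOv3 printout below

If the cfg sets classes=1 but leaves the stock filters=255 (for 80 COCO classes), Darknet typically aborts with exactly this parse error.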

Drones not detected after training of Tiny YOLOv3

Hi, I've followed your tutorial to train Tiny YOLOv3 on drone images. After ~1900 iterations I'm still not getting any detections, even on training images, which is strange. I've tested with your weights and everything works, so there must be some problem in my training procedure.

Darknet has been compiled with GPU support and the training was done on Google Colab. Here is the output of the training run with the command ./darknet detector train drone.data cfg/yolov3-tiny-drone.cfg darknet53.conv.74:

layer     filters    size              input                output
    0 conv     16  3 x 3 / 1   416 x 416 x   3   ->   416 x 416 x  16  0.150 BFLOPs
    1 max          2 x 2 / 2   416 x 416 x  16   ->   208 x 208 x  16
    2 conv     32  3 x 3 / 1   208 x 208 x  16   ->   208 x 208 x  32  0.399 BFLOPs
    3 max          2 x 2 / 2   208 x 208 x  32   ->   104 x 104 x  32
    4 conv     64  3 x 3 / 1   104 x 104 x  32   ->   104 x 104 x  64  0.399 BFLOPs
    5 max          2 x 2 / 2   104 x 104 x  64   ->    52 x  52 x  64
    6 conv    128  3 x 3 / 1    52 x  52 x  64   ->    52 x  52 x 128  0.399 BFLOPs
    7 max          2 x 2 / 2    52 x  52 x 128   ->    26 x  26 x 128
    8 conv    256  3 x 3 / 1    26 x  26 x 128   ->    26 x  26 x 256  0.399 BFLOPs
    9 max          2 x 2 / 2    26 x  26 x 256   ->    13 x  13 x 256
   10 conv    512  3 x 3 / 1    13 x  13 x 256   ->    13 x  13 x 512  0.399 BFLOPs
   11 max          2 x 2 / 1    13 x  13 x 512   ->    13 x  13 x 512
   12 conv   1024  3 x 3 / 1    13 x  13 x 512   ->    13 x  13 x1024  1.595 BFLOPs
   13 conv    256  1 x 1 / 1    13 x  13 x1024   ->    13 x  13 x 256  0.089 BFLOPs
   14 conv    512  3 x 3 / 1    13 x  13 x 256   ->    13 x  13 x 512  0.399 BFLOPs
   15 conv     18  1 x 1 / 1    13 x  13 x 512   ->    13 x  13 x  18  0.003 BFLOPs
   16 yolo
   17 route  13
   18 conv    128  1 x 1 / 1    13 x  13 x 256   ->    13 x  13 x 128  0.011 BFLOPs
   19 upsample            2x    13 x  13 x 128   ->    26 x  26 x 128
   20 route  19 8
   21 conv    256  3 x 3 / 1    26 x  26 x 384   ->    26 x  26 x 256  1.196 BFLOPs
   22 conv     18  1 x 1 / 1    26 x  26 x 256   ->    26 x  26 x  18  0.006 BFLOPs
   23 yolo
Loading weights from darknet53.conv.74...Done!

yolov3-tiny-drone
Learning Rate: 0.001, Momentum: 0.9, Decay: 0.0005
Resizing
416
Loaded: 0.000073 seconds
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499833, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498678, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499832, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498684, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499832, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498682, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499833, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498673, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499832, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498681, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499833, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498680, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499832, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498676, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499833, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498681, .5R: -nan, .75R: -nan,  count: 0
1: 315.492737, 315.492737 avg, 0.000000 rate, 1.219311 seconds, 24 images
Loaded: 0.000067 seconds
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499833, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498674, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499833, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498679, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499832, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498677, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499833, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498676, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499833, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498684, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499832, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498684, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499833, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498679, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.499832, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.498675, .5R: -nan, .75R: -nan,  count: 0
2: 315.492767, 315.492737 avg, 0.000000 rate, 0.481109 seconds, 48 images

................

Loaded: 0.000072 seconds
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.235015, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.036456, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.235015, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.036457, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.235015, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.036457, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.235015, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.036458, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.235015, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.036456, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.235015, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.036454, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.235015, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.036455, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.235015, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.036453, .5R: -nan, .75R: -nan,  count: 0
379: 26.248299, 33.163815 avg, 0.000021 rate, 0.740130 seconds, 9096 images
Loaded: 0.000068 seconds
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.232247, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.034958, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.232247, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.034960, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.232247, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.034961, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.232246, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.034962, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.232248, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.034957, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.232247, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.034961, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.232247, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.034960, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.232248, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.034960, .5R: -nan, .75R: -nan,  count: 0
380: 25.502501, 32.397682 avg, 0.000021 rate, 0.755673 seconds, 9120 images
Resizing
544

..................

Loaded: 0.000052 seconds
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000022, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000003, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000022, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000003, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000022, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000003, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000022, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000003, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000022, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000003, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000022, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000003, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000022, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000003, .5R: -nan, .75R: -nan,  count: 0
Region 16 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000022, .5R: -nan, .75R: -nan,  count: 0
Region 23 Avg IOU: -nan, Class: -nan, Obj: -nan, No Obj: 0.000003, .5R: -nan, .75R: -nan,  count: 0
1930: 0.000000, 0.002069 avg, 0.001000 rate, 0.583884 seconds, 46320 images

Do you notice anything strange that might be related to the problem I'm facing?
Thank you so much for your help and for sharing the tutorial.

Edit: Here is a link to the avg loss plot from another training run that shows the same problem: https://imgur.com/8IG3yO4
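
For what it's worth, every count: 0 line in the log above means Darknet loaded no ground-truth boxes for that batch, which usually points to the label .txt files not being found for the images listed in the train list of drone.data. A minimal Python sketch of that check, assuming the common setup where each label sits next to its image with the same base name (stock Darknet also rewrites an images directory to labels when deriving label paths); train.txt here is a placeholder:

import os

with open("train.txt") as f:
    image_paths = [line.strip() for line in f if line.strip()]

missing, empty = [], []
for img_path in image_paths:
    # Same base name, .txt extension, next to the image
    label_path = os.path.splitext(img_path)[0] + ".txt"
    if not os.path.isfile(label_path):
        missing.append(label_path)
    elif os.path.getsize(label_path) == 0:
        empty.append(label_path)

print(f"{len(image_paths)} images, {len(missing)} missing labels, {len(empty)} empty labels")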
