lkevinzc / dance

64 stars · 14 forks · 14.9 MB

Codes for "DANCE: A Deep Attentive Contour Model for Efficient Instance Segmentation", WACV2021

Python 95.86% C 0.28% Cuda 3.59% C++ 0.26%
coco computer-vision deep-learning instance-segmentation pytorch wacv2021

dance's People

Contributors

lkevinzc


dance's Issues

RuntimeError: stack expects a non-empty TensorList

Hi, I preprocessed the COCO 2017 dataset with python datasets/register_coco_edge.py. But when I trained the network with python train_net.py --num-gpus 1 --config-file configs/Dance_R_50_3x.yaml, I got the following error:
ERROR [05/08 10:49:58 d2.engine.train_loop]: Exception during training:
Traceback (most recent call last):
File "/home/caoyang/detectron2/detectron2/engine/train_loop.py", line 132, in train
self.run_step()
File "/home/caoyang/detectron2/detectron2/engine/train_loop.py", line 214, in run_step
loss_dict = self.model(data)
File "/home/caoyang/anaconda3/envs/dance/lib/python3.6/site-packages/torch/nn/modules/module.py", line 532, in __call__
result = self.forward(*input, **kwargs)
File "/home/caoyang/dance/core/modeling/edge_snake/dance.py", line 140, in forward
features, proposals, (gt_sem_seg, [gt_instances, images.image_sizes])
File "/home/caoyang/anaconda3/envs/dance/lib/python3.6/site-packages/torch/nn/modules/module.py", line 532, in __call__
result = self.forward(*input, **kwargs)
File "/home/caoyang/dance/core/modeling/edge_snake/edge_det.py", line 270, in forward
_, poly_loss = self.refine_head(snake_input, None, targets[1])
File "/home/caoyang/anaconda3/envs/dance/lib/python3.6/site-packages/torch/nn/modules/module.py", line 532, in __call__
result = self.forward(*input, **kwargs)
File "/home/caoyang/dance/core/modeling/edge_snake/snake_head.py", line 1881, in forward
training_targets = self.compute_targets_for_polys(gt_instances, image_sizes)
File "/home/caoyang/dance/core/modeling/edge_snake/snake_head.py", line 1232, in compute_targets_for_polys
init_ex_targets = torch.stack(init_ex_targets, dim=0)
RuntimeError: stack expects a non-empty TensorList
I guess the reason is that some images contain no targets, or the targets are not annotated. Could you tell me how to solve this problem?
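
A minimal diagnostic sketch in that direction, assuming the training annotations are a standard COCO-style json (the helper name and path below are hypothetical, not part of the repo): it lists the ids of images that have no usable polygon annotation, which is exactly the situation guessed at above.

import json

def find_images_without_valid_polygons(coco_json_path):
    """Return ids of images that have no annotation with a usable polygon."""
    with open(coco_json_path) as f:
        coco = json.load(f)
    valid = set()
    for ann in coco.get("annotations", []):
        seg = ann.get("segmentation")
        # A polygon needs at least 3 points (6 coordinates) to be usable.
        if isinstance(seg, list) and any(len(poly) >= 6 for poly in seg):
            valid.add(ann["image_id"])
    all_ids = {img["id"] for img in coco.get("images", [])}
    return sorted(all_ids - valid)

# Example: find_images_without_valid_polygons("annotations/instances_train2017.json")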

mmdet version?

Could you please offer the source code for an mmdet version of DANCE?

train or apply to a different dataset

Hi,

I really like your work!
I would like to apply DANCE to another dataset, so I need to train from scratch. Is this possible?

Also, do I need to apply any filters or normalization techniques to the images before training?

Alternatively, I would like to use your algorithm to refine another segmentation model, using the masks generated by the first algorithm as seeds for DANCE so that it can make refinements by finding the "real contours".

Is option 1 or 2 (or both) possible, and how would I do so?

Thanks a lot,
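
A minimal sketch of option 1, assuming the new dataset is exported in COCO format and that detectron2's standard registration API applies here (the dataset names and paths are hypothetical; the repo's own datasets/register_coco_edge.py, mentioned in an issue above, suggests extra edge-target preprocessing may also be needed):

from detectron2.data.datasets import register_coco_instances

register_coco_instances("my_dataset_train", {},
                        "path/to/annotations_train.json", "path/to/train_images")
register_coco_instances("my_dataset_val", {},
                        "path/to/annotations_val.json", "path/to/val_images")

The config would then point DATASETS.TRAIN and DATASETS.TEST at these names. Detectron2 normalizes images inside the model using the pixel mean/std from the config, so extra normalization is usually not required beyond matching those values.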

Config of model

Hi,

Thank you for your great work on DANCE! I am trying to better understand this work, so I want to train the model. I noticed that you have answered an issue about this, and I want to know whether the config in https://github.com/lkevinzc/dance/blob/master/core/config/defaults.py is the same as the config used for the pre-trained model you provided. I noticed you set _C.MODEL.SNAKE_HEAD.ATTENTION = False and _C.MODEL.SNAKE_HEAD.NEW_MATCHING = False.

Sorry to bother you if I have misunderstood the model.
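
For reference, a minimal sketch of how such flags can be toggled without editing defaults.py, assuming the standard detectron2-style command-line overrides that train_net.py scripts usually accept (whether the released weights were trained with them enabled is exactly the open question here):

python train_net.py --num-gpus 1 --config-file configs/Dance_R_50_3x.yaml MODEL.SNAKE_HEAD.ATTENTION True MODEL.SNAKE_HEAD.NEW_MATCHING True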

How to get edge_map of SBD?

Hi, so sorry to bother you. Is there any code for generating the edge maps from the SBD data? I can only find coco and cityscapes in the datasets folder. Thanks.
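
A generic sketch of one way to build such an edge map, assuming it is simply the one-pixel boundary of each binary instance mask (this is not the repo's preprocessing code, only an illustration using OpenCV and NumPy):

import cv2
import numpy as np

def mask_to_edge(mask, thickness=1):
    """mask: HxW uint8 binary instance mask -> HxW uint8 boundary map."""
    kernel = np.ones((3, 3), np.uint8)
    eroded = cv2.erode(mask, kernel, iterations=thickness)
    # The boundary is the set of foreground pixels removed by the erosion.
    return (mask.astype(bool) ^ eroded.astype(bool)).astype(np.uint8)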

Error when running the author's code

Traceback (most recent call last):
File "train_net.py", line 14, in <module>
import core.data # noqa
File "/public/home/archer/dance/core/__init__.py", line 1, in <module>
from core import modeling
File "/public/home/archer/dance/core/modeling/__init__.py", line 1, in <module>
from .fcos import FCOS
File "/public/home/archer/dance/core/modeling/fcos/__init__.py", line 1, in <module>
from .fcos import FCOS
File "/public/home/archer/dance/core/modeling/fcos/fcos.py", line 10, in <module>
from core.layers import DFConv2d, IOULoss
File "/public/home/archer/dance/core/layers/__init__.py", line 4, in <module>
from .extreme_utils import _ext as extreme_utils
ImportError: cannot import name '_ext' from 'core.layers.extreme_utils' (/public/home/archer/dance/core/layers/extreme_utils/__init__.py)
And I find that there is nothing in __init__.py.
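
A hedged suggestion, assuming the layout mirrors the DeepSnake code this project builds on: _ext is a compiled CUDA extension, so it normally has to be built in place before training, e.g. by running python setup.py build_ext --inplace inside core/layers/extreme_utils (only if a setup.py is shipped there). An empty __init__.py is expected in that case; the compiled _ext module only appears after the build step.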

the results of Dance

Hi, so sorry to bother you. I ran the DANCE code, but I get incorrect results from the method. I think the ml_nms module is what causes this; however, I did not change the code. Have you ever encountered this kind of situation?

[image attachment]

Pre-trained model with CenterNet detector

Hi,

Sorry for bothering you, and thank you for the excellent work on DANCE! I am trying to better understand DANCE's improvements over DeepSnake, and I would like to reproduce more of DANCE's results. Do you have the COCO training script and pre-trained model for DANCE with CenterNet as the detector?

Thank you so much.

About the training usage and some missing files

Hi, so sorry to bother you, but where are the training instructions? I also noticed that some necessary files are missing, such as core/modeling/backbone/mobilenet.py & vovnet.py and the panopticapi package. Where can I find them?
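
A hedged note on one of these: panopticapi is the COCO panoptic API published by the COCO team, so it can usually be installed directly with pip install git+https://github.com/cocodataset/panopticapi.git. The mobilenet.py and vovnet.py backbones, by contrast, would have to come from the authors or from a compatible detectron2 backbone implementation.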

Please help! AssertionError: Attribute 'thing_classes' in the metadata of 'coco_2017_train_edge' cannot be set to a different value!

When I train on my own dataset, I have the following problem:
AssertionError: Attribute 'thing_classes' in the metadata of 'coco_2017_train_edge' cannot be set to a different value!

['person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'traffic light', 'fire hydrant', 'stop sign', 'parking meter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sports ball', 'kite', 'baseball bat', 'baseball glove', 'skateboard', 'surfboard', 'tennis racket', 'bottle', 'wine glass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hot dog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'potted plant', 'bed', 'dining table', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cell phone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddy bear', 'hair drier', 'toothbrush'] != ['building']

Please give a suggestion.
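
A hedged note, in the same spirit as the registration sketch earlier in this page: the assertion fires because 'coco_2017_train_edge' is already registered with the 80 COCO class names in its metadata, so a single-class 'building' dataset is better registered under a fresh name (e.g. a hypothetical building_train_edge) and the config pointed at it, for instance with DATASETS.TRAIN '("building_train_edge",)' DATASETS.TEST '("building_val_edge",)' on the train_net.py command line.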

About code on coco

Thanks for your great work. I have a question: why did you reimplement your code on detectron2 for training on COCO, instead of directly using the 'snake' code? The COCO dataset files (dataset and evaluation for COCO) already exist in 'snake' (though snake does not use them).

Error when running baseline snake

Thanks for your work and code! An error occurs when I am running the baseline snake model using the command:
CUDA_VISIBLE_DEVICES=0,1 python train_net.py --num-gpus 2 --config-file configs/Dsnake_R_50_1x.yaml

File "/data/yinyf/dance/core/modeling/dsnake_baseline/dsnake_head.py", line 190, in forward
_, losses = self.refine_head(features["p2"], None, targets)
File "/home/fengh/miniconda3/envs/dance/lib/python3.6/site-packages/torch/nn/modules/module.py", line 532, in __call__
result = self.forward(*input, **kwargs)
File "/data/yinyf/dance/core/modeling/dsnake_baseline/dsnake_head.py", line 771, in forward
training_targets = self.compute_targets_for_polys(targets)
File "/data/yinyf/dance/core/modeling/dsnake_baseline/dsnake_head.py", line 638, in compute_targets_for_polys
init_sample_locations = torch.stack(init_sample_locations, dim=0)
RuntimeError: stack expects a non-empty TensorList

How can I deal with it?
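
A minimal defensive sketch, assuming the cause is again an image whose per-image target list ends up empty (the helper below is hypothetical, not code from dsnake_head.py); filtering out images without valid annotations beforehand, as in the json check suggested for the first issue above, avoids hitting this path at all:

import torch

def stack_or_none(tensors, dim=0):
    """Stack a list of tensors, or return None when the list is empty so the
    caller can skip the snake loss for that image or batch."""
    if len(tensors) == 0:
        return None
    return torch.stack(tensors, dim=dim)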

How to visualize the prediction

Hi

When I run the following command,

python train_net.py --config-file configs/Dance_R_101_3x.yaml --eval-only MODEL.WEIGHTS ./output/r101_3x_model_final.pth 

The DANCE model predicts box coordinates and classes but does not produce a segmentation mask, as shown below.
[image attachment: output_image_2]

How do I get the segmentation?
Also, which code did you use to visualize the predictions, like the figures in your DANCE paper?

Thank you for your help in advance.
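
A minimal sketch using detectron2's stock visualizer, assuming a predictor built from the same config (cfg here is assumed, and masks will only be drawn if the output Instances actually carry pred_masks; DANCE's raw outputs are contours, so they may first need to be rasterized into masks):

import cv2
from detectron2.engine import DefaultPredictor
from detectron2.utils.visualizer import Visualizer
from detectron2.data import MetadataCatalog

predictor = DefaultPredictor(cfg)                  # cfg built from Dance_R_101_3x.yaml
img = cv2.imread("input.jpg")                      # hypothetical image path
outputs = predictor(img)
v = Visualizer(img[:, :, ::-1], MetadataCatalog.get("coco_2017_val"), scale=1.0)
vis = v.draw_instance_predictions(outputs["instances"].to("cpu"))
cv2.imwrite("prediction.jpg", vis.get_image()[:, :, ::-1])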

Can I train on a single graphics card?

First of all, thank you for your excellent work. I have successfully tested your model so far, but when I tried to train the model so that I could observe the flow of data, I ran into a problem: I only have a single 1080 Ti (11 GB), and when I ran training, the GPU memory was exceeded. What parameters should I adjust if I want to train on a single card? Could you give me some advice? Thank you very much!
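
A minimal sketch of the usual detectron2-style memory knobs, assuming they apply to this codebase as well (the values are only illustrative): lower the number of images per batch, scale the base learning rate down roughly in proportion, and optionally reduce the training resolution, e.g.

python train_net.py --num-gpus 1 --config-file configs/Dance_R_50_3x.yaml SOLVER.IMS_PER_BATCH 4 SOLVER.BASE_LR 0.0025 INPUT.MIN_SIZE_TRAIN '(512,)'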

Some questions about training strategies

Hi, I have noticed that DeepSnake adopts a two-stage training strategy on the Cityscapes dataset, which first trains the detector alone and then trains the detector and snake branches simultaneously. I also noticed that DANCE is trained using SGD. So I want to know whether DANCE adopts a two-stage training strategy or is trained end-to-end directly.
Thanks for your reply!

Training the DANCE network in the snake branch on a COCO-format dataset

Hello! Thanks for your great work. Can you give some help with using a COCO-format dataset in the snake branch? I want to use my own dataset, which is in COCO format, to train the DANCE network in the snake branch. Thank you!
Hello, I mainly want to ask how the DANCE model can be trained on COCO-format data under the snake branch. In particular, how should the related dance.py files under lib/datsets/coco and lib/evaluators/coco be written?
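
A generic pycocotools sketch (not the dance.py dataset/evaluator files asked about above), showing the calls a COCO-format dataset class for the snake branch would typically wrap; the annotation path is hypothetical:

from pycocotools.coco import COCO

coco = COCO("path/to/annotations_train.json")
for img_id in coco.getImgIds():
    img_info = coco.loadImgs(img_id)[0]                    # file_name, height, width
    ann_ids = coco.getAnnIds(imgIds=img_id, iscrowd=False)
    anns = coco.loadAnns(ann_ids)                          # each has 'segmentation', 'bbox', 'category_id'
    # ...convert the 'segmentation' polygons into the contour targets the snake head expects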

about activation function

Thank you for your great work. I have a question:
In the paper, you add a "tanh" activation function to the snake prediction before multiplying it by the object scale. Why did you choose this activation function?
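
A minimal illustration of the effect (tensor names are hypothetical): tanh squashes the raw per-vertex offsets into (-1, 1), so after multiplying by the object scale a single deformation step cannot push a vertex much farther than about one object size, unlike an unbounded linear output:

import torch

raw_offsets = torch.randn(128, 2)          # unbounded regression output, one row per contour vertex
object_scale = 50.0                        # e.g. roughly the box size of the instance
bounded = torch.tanh(raw_offsets) * object_scale   # every coordinate stays within (-50, 50)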
