Comments (13)
Hi! Thank you for your interest! This project is built on the Caffe framework (https://github.com/BVLC/caffe) with only small changes to the code. We only change parts of the network structure (e.g., adding attention modules), which can be done simply by modifying the prototxt files. The details of the framework can be found in the paper; if you have any questions about the structure, please feel free to ask.
from hydraplus-net.
Hi!
In the paper, it says that HP-net was trained in a stage-wise fashion. Which loss is used when training the M-net and fine-tuning the AF-net? Could you share the Caffe prototxt files?
Thank you!
Hi! In each stage of training, we always use a weighted sigmoid cross-entropy loss, as described in the paper. The weights for positive and negative examples aim to balance the loss between positive and negative samples. We will release detailed code and prototxt files later. Thank you!
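For readers re-implementing this loss, here is a minimal NumPy sketch of a weighted sigmoid cross-entropy of this kind. The function name and the weighting scheme (scalar weights for positive and negative labels) are assumptions for illustration, not the authors' Caffe implementation:

```python
import numpy as np

def weighted_sigmoid_ce(logits, labels, pos_weight, neg_weight):
    """Weighted sigmoid cross-entropy for multi-label attribute outputs.

    pos_weight / neg_weight re-balance the loss contributions of positive
    and negative samples (the exact scheme used by HydraPlus-Net is an
    assumption here).
    """
    p = 1.0 / (1.0 + np.exp(-logits))   # sigmoid probabilities
    eps = 1e-12                         # numerical stability
    loss = -(pos_weight * labels * np.log(p + eps)
             + neg_weight * (1.0 - labels) * np.log(1.0 - p + eps))
    return loss.mean()

# Toy example: 2 samples x 3 attributes
logits = np.array([[2.0, -1.0, 0.5], [-0.5, 3.0, -2.0]])
labels = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
print(weighted_sigmoid_ce(logits, labels, pos_weight=2.0, neg_weight=1.0))
```

With `pos_weight > neg_weight`, errors on rare positive attributes are penalized more, which is the stated motivation for the weighting.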
@xh-liu Thanks!
Hi, I'm very interested in your paper. Could you share your code with me?
@xh-liu
Could you share the detailed code and prototxt files? What I have done so far cannot reproduce your results. Thank you very much! @xh-liu
@Li1991 How do you combine the results?
We will release the detailed code later. Thank you for your interest!
Hi @xh-liu, will the model or the prototxt files be released soon?
I have added some example prototxts in the folder prototxt_example. a0 and a3 are two of the nine branches in total; you can re-implement the other branches based on them. fusion denotes the whole net that fuses the features from the nine branches and the main branch. For computational simplicity, we extract the features of the nine branches offline and use the extracted features to train the final fusion layer and classifiers.
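The offline fusion pipeline described above can be sketched as follows. All feature dimensions, the number of attributes, and the simple gradient-descent classifier are placeholders for illustration; the real training uses the Caffe solver and the prototxts in prototxt_example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend we extracted features offline for the nine branches plus the
# main branch (dimensions are hypothetical; the real sizes come from
# the prototxts).
n_samples, feat_dim, n_attrs = 64, 32, 26
branch_feats = [rng.standard_normal((n_samples, feat_dim)) for _ in range(10)]
labels = (rng.random((n_samples, n_attrs)) > 0.5).astype(float)

# Fusion: concatenate all branch features into one vector per sample.
fused = np.concatenate(branch_feats, axis=1)   # shape (64, 320)

# Train per-attribute linear classifiers with a few steps of gradient
# descent on a sigmoid cross-entropy loss (a stand-in for the Caffe solver).
W = np.zeros((fused.shape[1], n_attrs))
lr = 0.1
for _ in range(100):
    p = 1.0 / (1.0 + np.exp(-fused @ W))
    W -= lr * fused.T @ (p - labels) / n_samples

print(fused.shape)
```

Caching the branch features once and training only the fusion/classifier stage keeps the final stage cheap, since the nine branch networks never have to be re-run per epoch.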
Hi, after looking at your example prototxts, I found that you use a layer called NNInterp, but I cannot find it. Could you please provide the original code of this layer? Thank you! @xh-liu
@Li1991 I have added the code for the nninterp layer in the folder 'layers'.
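NNInterp presumably performs nearest-neighbor interpolation (e.g., to upsample attention maps to a target resolution). Here is a minimal NumPy sketch of such an operation, as an illustration only; the actual Caffe layer is the one in the 'layers' folder:

```python
import numpy as np

def nn_interp(x, out_h, out_w):
    """Nearest-neighbor interpolation of a (C, H, W) feature map.

    A guess at what the NNInterp layer does; not the repository's
    Caffe implementation.
    """
    c, h, w = x.shape
    rows = np.arange(out_h) * h // out_h   # source row for each output row
    cols = np.arange(out_w) * w // out_w   # source col for each output col
    return x[:, rows[:, None], cols[None, :]]

# Upsample a 2x2 map to 4x4: each source pixel becomes a 2x2 block.
x = np.arange(4, dtype=float).reshape(1, 2, 2)
y = nn_interp(x, 4, 4)
print(y[0])
```

Nearest-neighbor upsampling is a common choice for attention maps because it is parameter-free and preserves the map's values exactly.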
Thank you for your kindness. @xh-liu And where is your Python layer, FeatureConcatDataLayer?