Comments (4)
Hi @zhunzhong07,
Yes, you're correct. We sample uniformly from the K nearest neighbors during training, so it is highly likely that an anchor sees a different neighbor in the next epoch. If you train long enough, this should have the same effect as Eq. 2. After all, it is not practical to include every neighbor of every sample in a single forward pass, since this does not scale well with the number of neighbors K.
Hope this helps.
from unsupervised-classification.
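The sampling scheme described above can be sketched as follows. This is a minimal illustration of the idea (the array layout and function name are assumptions, not the repository's API): given the neighbor indices precomputed after the pretext task, each epoch draws one neighbor per anchor uniformly at random.

```python
import numpy as np

def sample_neighbors(knn_indices, rng=None):
    """For each anchor, pick one of its K nearest neighbors uniformly at random.

    knn_indices: (N, K) array; assumption: row i holds the indices of the
    K nearest neighbors of sample i, precomputed after the pretext task.
    Returns an (N,) array with one sampled neighbor index per anchor.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, k = knn_indices.shape
    choice = rng.integers(0, k, size=n)  # uniform draw over the K neighbors
    return knn_indices[np.arange(n), choice]

# Toy example: 4 anchors, 3 neighbors each.
knn = np.array([[1, 2, 3],
                [0, 2, 3],
                [0, 1, 3],
                [0, 1, 2]])
sampled = sample_neighbors(knn)
```

Because the draw is repeated every epoch, each anchor eventually sees all of its K neighbors, which is why training long enough approximates summing over the full neighborhood as in Eq. 2.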
Hi @wvangansbeke,
Thanks for your quick reply. I have another question.
In your code, I find that the indices of the neighbors are only computed once, after the self-supervised learning step. Why not re-compute the neighbors after each epoch of SCAN? Would this improve the results?
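The re-computation being proposed could look roughly like this. This is a sketch of the idea only, assuming L2-normalized features and cosine similarity (the function name and a brute-force similarity matrix are illustrative; a real implementation over a large dataset would use an approximate-nearest-neighbor index instead):

```python
import numpy as np

def compute_knn(features, k=20):
    """Recompute each sample's K nearest neighbors from the current features.

    features: (N, D) array of embeddings (e.g. from a memory bank).
    Uses cosine similarity on L2-normalized features; brute force O(N^2),
    fine for small N, illustrative only for large datasets.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    np.fill_diagonal(sim, -np.inf)   # a sample is never its own neighbor
    return np.argsort(-sim, axis=1)[:, :k]
```

Calling this at the end of every SCAN epoch, on features extracted with the current backbone, would give the per-epoch neighbor refresh asked about above.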
Yes, a good point. I never tried it exactly like that (although I tried something similar). It makes sense, actually. However, I'm not sure the representations are going to be much better at that point. I just think it will be difficult to exploit the self-labeling as we currently do. That step basically readjusts the decision boundary between classes and updates the representations based on the prototypes of each class.
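The self-labeling step mentioned above can be illustrated with a small sketch: keep only samples whose predicted class probability exceeds a confidence threshold and treat their argmax as a pseudo-label for further training. The function name and threshold value here are illustrative assumptions, not the repository's code:

```python
import numpy as np

def select_pseudo_labels(probs, threshold=0.99):
    """Select confidently predicted samples for self-labeling.

    probs: (N, C) array of softmax outputs.
    Returns (indices, labels): the row indices whose maximum class
    probability exceeds the threshold, and their argmax pseudo-labels.
    """
    confidence = probs.max(axis=1)
    mask = confidence > threshold
    return np.nonzero(mask)[0], probs.argmax(axis=1)[mask]

probs = np.array([[0.995, 0.005],
                  [0.600, 0.400],
                  [0.010, 0.990]])
idx, labels = select_pseudo_labels(probs)
# With the 0.99 threshold, only row 0 qualifies: idx -> [0], labels -> [0]
```

Training with a cross-entropy loss on these confident samples is what pulls the prototypes of each class apart and readjusts the decision boundary.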
OK. Thanks for your reply!
Related Issues (20)
- scan.py Error
- eval can't run the result HOT 1
- How to find labels as clustered results (ex: clustered images....)? HOT 3
- Why do you need target (label) on fill_memory_bank?? HOT 1
- Is knn for validation?? (memory bank, target(label)) HOT 2
- If an unlabeled dataset is used how should the target variable be handled? HOT 2
- How to interpret confusion matrix in scan?
- Is it Unsupervised Learning or Semi-supervised Learning?
- The scan result is not ideal
- Multi-GPU support HOT 1
- About "Pretext + Kmeans" HOT 7
- Accuracy becomes lower after self-labeling
- Overcoming uncertainty after scan phase by dynamically lowering accuracy threshold for self-labeling
- Normalization on custom dataset HOT 1
- Learned representations after pretext task HOT 2
- Solve the bad results caused by data imbalance? HOT 1
- Help with installation
- SimCLR loss and nearest neighbors computed on hidden features?
- negative loss from SCANLoss HOT 1
- Is it normal that i got a loss which is negative with the entropy_weight=5.0? HOT 2