eelxpeng / unsuperviseddeeplearning-pytorch


License: MIT License

unsupervised deep-learning pytorch autoencoder denoising-autoencoders variational-autoencoders generative-adversarial-network

unsuperviseddeeplearning-pytorch's Introduction

Unsupervised Deep Learning with PyTorch

This repository provides unsupervised deep learning models implemented in PyTorch for convenient use.

Denoising Autoencoder

A single-layer autoencoder: the input is corrupted with masking noise, and the network is trained to reconstruct the original, uncorrupted input.
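A minimal sketch of this idea in PyTorch (the layer sizes and corruption rate here are illustrative, not the repository's actual defaults):

```python
import torch
import torch.nn as nn

class DenoisingAutoencoder(nn.Module):
    """One-layer denoising autoencoder: corrupt input with masking noise,
    then reconstruct the clean input."""
    def __init__(self, in_dim, hidden_dim, corrupt_prob=0.3):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.decoder = nn.Linear(hidden_dim, in_dim)
        self.corrupt_prob = corrupt_prob

    def forward(self, x):
        # Masking corruption: randomly zero out a fraction of the inputs.
        mask = (torch.rand_like(x) > self.corrupt_prob).float()
        h = self.encoder(x * mask)
        return self.decoder(h)

model = DenoisingAutoencoder(784, 256)
x = torch.rand(8, 784)
recon = model(x)
# The loss compares the reconstruction against the *clean* input.
loss = nn.functional.mse_loss(recon, x)
```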

Stacked Denoising Autoencoder

Layerwise pretraining with denoising autoencoders: each layer is pretrained as a denoising autoencoder, then all layers are stacked and the full network is fine-tuned.
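The greedy layerwise scheme can be sketched as follows, assuming illustrative layer sizes and a few training steps per layer (a real run would train each layer for many epochs):

```python
import torch
import torch.nn as nn

dims = [784, 500, 2000, 10]   # illustrative encoder dimensions
x = torch.rand(64, 784)

# Each layer is trained as a denoising autoencoder on the hidden codes
# produced by the previously trained layers.
encoders = []
inputs = x
for d_in, d_out in zip(dims[:-1], dims[1:]):
    enc = nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU())
    dec = nn.Linear(d_out, d_in)
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()))
    for _ in range(5):  # a few steps per layer, for illustration only
        noisy = inputs * (torch.rand_like(inputs) > 0.3).float()  # masking noise
        loss = nn.functional.mse_loss(dec(enc(noisy)), inputs)
        opt.zero_grad()
        loss.backward()
        opt.step()
    encoders.append(enc)
    inputs = enc(inputs).detach()  # feed hidden codes to the next layer

# Stack all pretrained encoder layers for end-to-end fine-tuning.
stacked = nn.Sequential(*encoders)
codes = stacked(x)
```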

VAE

The famous Variational Autoencoder from the paper:

Kingma, Diederik P., and Max Welling. "Auto-encoding variational bayes." ICLR (2014).
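The core of the method can be sketched as a Gaussian encoder with the reparameterization trick and a Bernoulli decoder; the layer sizes below are illustrative, not the repository's actual configuration:

```python
import torch
import torch.nn as nn

class VAE(nn.Module):
    """Minimal VAE: Gaussian encoder with reparameterization, Bernoulli decoder."""
    def __init__(self, in_dim=784, hidden=400, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(in_dim, hidden)
        self.mu = nn.Linear(hidden, z_dim)
        self.logvar = nn.Linear(hidden, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, in_dim), nn.Sigmoid())

    def forward(self, x):
        h = torch.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Negative ELBO = reconstruction term + KL(q(z|x) || N(0, I)).
    bce = nn.functional.binary_cross_entropy(recon, x, reduction='sum')
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

model = VAE()
x = torch.rand(8, 784)
recon, mu, logvar = model(x)
loss = vae_loss(recon, x, mu, logvar)
```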

Convolutional VAE

A VAE built from convolutional and deconvolutional networks, demonstrated on the SVHN dataset. Example code is in test_convvae.py. The reconstructions and generated samples are shown below:

[Figure: SVHN reconstructions and generated samples]
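The encoder/decoder pair can be sketched with strided convolutions and transposed convolutions for 32x32x3 SVHN images; the channel counts and kernel sizes here are illustrative, not those used in test_convvae.py:

```python
import torch
import torch.nn as nn

# Strided convolutions halve the spatial size at each step; transposed
# convolutions mirror them to recover the original resolution.
enc = nn.Sequential(
    nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 32x32 -> 16x16
    nn.ReLU(),
    nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 16x16 -> 8x8
    nn.ReLU(),
)
dec = nn.Sequential(
    nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 8x8 -> 16x16
    nn.ReLU(),
    nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 16x16 -> 32x32
    nn.Sigmoid(),
)

x = torch.rand(4, 3, 32, 32)
recon = dec(enc(x))  # shapes round-trip back to (4, 3, 32, 32)
```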

Variational Deep Embedding

Implementation of Variational Deep Embedding from the IJCAI2017 paper:

Jiang, Zhuxi, et al. "Variational deep embedding: An unsupervised and generative approach to clustering." International Joint Conference on Artificial Intelligence. 2017.

The original code is written in Keras, but its loss-function computation is incorrect; this implementation corrects the loss function. Example usage can be found in test/test_vade-3layer.py, which uses pretrained autoencoder weights from test/model/pretrained_vade-3layer.pt.

Note:

  • The pretrained weights are important for initializing the weights of VaDE.
  • Unlike the original code, which combines the training and test data for both training and evaluation, this implementation splits them: only the training data is used for training, and only the test data for evaluation. This is a more appropriate way to evaluate the method's generalization.
  • With this evaluation scheme and 3000 training epochs, the clustering accuracy reaches 94%.
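Initializing VaDE from pretrained autoencoder weights can be sketched as follows. The class layout and layer sizes below are assumptions for illustration only, not the repository's actual interface (the real checkpoint lives at test/model/pretrained_vade-3layer.pt):

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained stacked-autoencoder encoder (illustrative sizes).
encoder = nn.Sequential(nn.Linear(784, 500), nn.ReLU(),
                        nn.Linear(500, 10))

class VaDE(nn.Module):
    """Hypothetical VaDE skeleton: VAE encoder plus GMM prior parameters."""
    def __init__(self, n_clusters=10, z_dim=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(784, 500), nn.ReLU(),
                                     nn.Linear(500, z_dim))
        # GMM prior parameters, typically initialized by fitting a GMM
        # to the pretrained encoder's latent codes.
        self.pi = nn.Parameter(torch.ones(n_clusters) / n_clusters)
        self.mu_c = nn.Parameter(torch.zeros(n_clusters, z_dim))
        self.logvar_c = nn.Parameter(torch.zeros(n_clusters, z_dim))

vade = VaDE()
# Copy the pretrained encoder weights into the matching VaDE layers.
vade.encoder.load_state_dict(encoder.state_dict())
```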

unsuperviseddeeplearning-pytorch's People

Contributors

eelxpeng

unsuperviseddeeplearning-pytorch's Issues

The results of different runs differ a lot.

I am applying this code to my biological datasets. When I run it multiple times, it produces quite different results: the ARIs vary from 0.4 to 0.8. I found that the pretrained model matters a lot to the results. How can I achieve robust results by getting a good pretrained model? Is the pretrained denoising autoencoder important to the stacked autoencoder when the structure of the DAE is not exactly the same as that of the SDAE?

Achieving paper results on alternate data sets

Thanks for posting this code. It's very helpful, and much more workable at this point than the original code.

Did you try to use any of the other data sets?

I'm specifically interested in the HAR data set. For some reason I can't achieve the same results as the paper using the pytorch code. Do you know why?
