savnani5 / zero-shot-emotion-recognition-with-aae

Learning Unseen Emotions from Gestures via Semantically-Conditioned Zero-Shot Perception with Adversarial Autoencoders

stgcn adverserial-autoencoder zeroshot-learning word-embeddings word2vec gan vae

zero-shot-emotion-recognition-with-aae's Introduction

This repository is adapted from the implementation of STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits. Please use the following citation when referencing the code:

@inproceedings{bhattacharya2020step,
  title={STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits.},
  author={Bhattacharya, Uttaran and Mittal, Trisha and Chandra, Rohan and Randhavane, Tanmay and Bera, Aniket and Manocha, Dinesh},
  booktitle={AAAI},
  pages={1342--1350},
  year={2020}
}

Instructions for use:

  1. Download the data by running python3 download_ebmdb.py inside the 'generate_data' folder and save it in the 'data' folder inside the repo.
  2. Navigate to the 'generate_data' folder and run python3 load_data.py to generate all gait data to be used as input to the STEP pipeline. These files will be saved inside the 'feature_data' folder.
  3. Navigate to the 'classifier_stgcn_real_only' folder and run python3 main.py to begin training. After training, the final feature vectors for all the inputs will be saved in an 'output.h5' file (see the sketch after this list for a quick way to inspect it).
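
The snippet below is a minimal sketch for sanity-checking the trained features; the file path and the assumption that 'output.h5' stores one dataset per key are mine, not taken from the repo's code.

    # Minimal sketch (not part of the repo): inspect the feature vectors written
    # to output.h5 after STGCN training. The path and the assumption that the
    # file stores plain datasets at the top level are hypothetical.
    import h5py
    import numpy as np

    with h5py.File('feature_data/output.h5', 'r') as f:
        for name, item in f.items():
            if isinstance(item, h5py.Dataset):
                print(name, np.asarray(item).shape)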

Instructions to run ZSL:

  1. Navigate to the 'feature_data' folder and store the 'output.h5' file generated by the STEP pipeline there.
  2. Install the transformers module with pip3 install transformers, then download the 'bert-base-uncased' pretrained BERT model and the 'NRC-VAD-Lexicon.txt' file for the VAD lexicon, and store them in 'feature_data' as well (a sketch of loading the model follows this list).
  3. Run python3 check.py to generate the mat files 'featuresT.mat' and 'labelsT.mat'.
  4. Copy these two mat files into the 'Generalized_Zero_Shot/data' folder.
  5. Navigate to 'Generalized_Zero_Shot' and run python3 linear_classifier.py to begin training.
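
As a quick check that the prerequisites from steps 2-4 are in place, the hedged sketch below loads 'bert-base-uncased' via transformers and reads back the two mat files; the mean-pooled word embedding and the variable names inside the .mat files are assumptions, not the repo's exact check.py code.

    # Hedged sketch, not the repo's check.py: load the pretrained BERT model
    # and read back the generated .mat files.
    import scipy.io as sio
    import torch
    from transformers import BertModel, BertTokenizer

    tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
    model = BertModel.from_pretrained('bert-base-uncased')

    # Example: mean-pooled embedding of one emotion word for the semantic space
    # (the pooling choice here is an assumption).
    inputs = tokenizer('happy', return_tensors='pt')
    with torch.no_grad():
        embedding = model(**inputs).last_hidden_state.mean(dim=1)  # shape (1, 768)

    # The variable names stored inside the .mat files are unknown here;
    # printing the keys shows what check.py actually wrote.
    features = sio.loadmat('Generalized_Zero_Shot/data/featuresT.mat')
    labels = sio.loadmat('Generalized_Zero_Shot/data/labelsT.mat')
    print(list(features.keys()), list(labels.keys()))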

Instructions to run AAE:

  1. Navigate to the 'AdversarialAutoencoder' folder.
  2. Run python3 aae.py to start evaluation, passing --dataset_path <location of the mat files> and --word_vec_loc <location of the word2vec Google News binary file>. A sketch of loading that binary follows below.
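
For reference, the sketch below loads a word2vec binary of the kind expected by --word_vec_loc using gensim; the file name and the example emotion words are assumptions, and aae.py may load the vectors differently.

    # Hedged sketch: load the Google News word2vec binary with gensim and look
    # up a few example emotion words. File name and word list are assumptions;
    # this is not the repo's aae.py logic.
    from gensim.models import KeyedVectors

    word_vecs = KeyedVectors.load_word2vec_format(
        'GoogleNews-vectors-negative300.bin', binary=True)

    for word in ['happy', 'sad', 'angry', 'neutral']:
        if word in word_vecs:
            print(word, word_vecs[word].shape)  # 300-dimensional vectors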


zero-shot-emotion-recognition-with-aae's Issues

About KeyError

Hello, I have a question. In load_data.py, I get a KeyError: 'Intended emotion'. It processes the first 1446 lines successfully, but fails when the loop reaches line 1447. What could be the reason? Thank you for your answer!
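
One hedged possibility (not a confirmed fix) is that the entry at that position simply lacks the 'Intended emotion' annotation; a defensive guard like the sketch below, with hypothetical variable names, would skip such entries instead of crashing.

    # Hypothetical sketch of a guard against the missing key; the loop and
    # variable names are illustrative, not the actual load_data.py code.
    for idx, entry in enumerate(annotations):  # 'annotations' is hypothetical
        emotion = entry.get('Intended emotion')
        if emotion is None:
            print(f'Skipping entry {idx}: no "Intended emotion" annotation')
            continue
        # ... build the gait sample from `entry` as before ...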
