
T-LSTM's Introduction

Time-Aware Long Short-Term Memory

The durations between consecutive elements of a sequence are not always regular. An architecture that can handle this irregularity is necessary to improve prediction performance.

Time-Aware LSTM (T-LSTM) was designed to handle irregular elapsed times: it incorporates the elapsed-time information into the standard LSTM architecture so that it can capture the temporal dynamics of sequential data with time irregularities. T-LSTM decomposes the memory cell into short-term and long-term components, discounts the short-term content using a non-increasing function of the elapsed time, and then combines it with the long-term memory.
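For intuition, the decomposition can be sketched in a few lines of NumPy. This is a minimal illustration of the discounting step only, not the full cell; the decay g(Δt) = 1 / log(e + Δt) is one of the heuristic decay functions suggested in the paper, and the weight names here are placeholders.

import numpy as np

def discount_memory(C_prev, delta_t, W_d, b_d):
    # Short-term memory: a learned projection of the previous cell state.
    C_short = np.tanh(C_prev @ W_d + b_d)
    # Long-term memory: what remains after removing the short-term part.
    C_long = C_prev - C_short
    # Discount the short-term memory; g is non-increasing in delta_t.
    g = 1.0 / np.log(np.e + delta_t)
    C_short_hat = C_short * g
    # Adjusted memory, which the standard LSTM gate updates then consume.
    return C_long + C_short_hat

rng = np.random.default_rng(0)
C = rng.normal(size=(1, 4))
W = rng.normal(size=(4, 4))
b = np.zeros(4)
print(discount_memory(C, delta_t=30.0, W_d=W, b_d=b))

The larger delta_t is, the smaller g becomes, so older short-term information contributes less to the adjusted memory.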

Compatibility

The code is compatible with TensorFlow 1.2.1 and Python 2.7.13.

Input Format

An example data format is given where the data are stored as a list of 3-dimensional tensors of shape [number of samples x sequence length x dimensionality].
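As a concrete illustration only (the shapes below are made up, and the separate elapsed-time array mirrors the elapsed_train.pkl input mentioned in the issues below):

import numpy as np

# A toy batch: 4 samples, sequences of 10 visits, 5 features per visit.
data = np.random.rand(4, 10, 5)                   # [number of samples x sequence length x dimensionality]
elapsed = np.random.randint(1, 30, size=(4, 10))  # elapsed time between consecutive visits
print(data.shape, elapsed.shape)                  # (4, 10, 5) (4, 10)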

Reference

Inci M. Baytas, Cao Xiao, Xi Zhang, Fei Wang, Anil K. Jain, Jiayu Zhou, "Patient Subtyping via Time-Aware LSTM Networks", KDD, 2017.

T-LSTM's People

Contributors

baytasin, jiayuzhou


T-LSTM's Issues

Dataset

Hi, I am interested in your work on Patient Subtyping via Time-Aware LSTM Networks, specifically the experiment done with the Parkinson's data, and I am trying to reproduce the results you mentioned in the paper. I could run your code successfully, but I am not sure about the data you mentioned containing 319 features and 82 symptoms. Could you tell me how I can get this data? Thanks for your help.

Split0 Data Format

Hi illidanlab,

Thank you for sharing the T-LSTM code! I really need this model to finish my master's thesis.

  1. I want to know why the number of samples and the dimensionality of elapsed_train.pkl and data_train.pkl in Split0 change.

  2. Is the data normalized? If so, what ranges were used to normalize elapsed_train.pkl and data_train.pkl?

Dataset

Hi, is the Clustering_Data_1D.mat dataset the one used for the PPMI experiment in the paper Patient Subtyping via Time-Aware LSTM Networks?

Bug may lead to a memory leak

Bug Code

c_test, y_pred_test, y_test, logits_test, labels_test = sess.run(lstm.get_cost_acc(), feed_dict={lstm.input: batch_xs, lstm.labels: batch_ys, lstm.time: batch_ts, lstm.keep_prob: test_dropout_prob})

Right Code

c_test, y_pred_test, y_test, logits_test, labels_test = sess.run([cross_entropy, y_pred, y, logits, labels], feed_dict={lstm.input: batch_xs, lstm.labels: batch_ys, lstm.time: batch_ts, lstm.keep_prob: test_dropout_prob})

Explanation

lstm.get_cost_acc() adds new ops to the TensorFlow graph every time it is called. In my test, this program consumed all 30 GB of my server's memory by about the 40th epoch, and it slows down the test procedure.
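The standard TF1 pattern that avoids the leak is to build every op once, before the loop, and fetch only the pre-built tensors inside it. A minimal sketch (lstm, sess, and the feed values are the objects used above; the loop variables are placeholders):

# Build the ops ONCE, outside the epoch/batch loop.
fetches = lstm.get_cost_acc()

for step in range(num_steps):
    # Only pre-built tensors are fetched here, so the graph does not grow.
    c_test, y_pred_test, y_test, logits_test, labels_test = sess.run(
        fetches,
        feed_dict={lstm.input: batch_xs, lstm.labels: batch_ys,
                   lstm.time: batch_ts, lstm.keep_prob: test_dropout_prob})

Calling sess.graph.finalize() after graph construction makes TensorFlow raise an error on any later op creation, which surfaces this class of bug immediately.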

What is the final_model?

python main.py 1 "/research/prip-baytasin/Synth_EHR/Split0" 1e-3 50 1.0 128 64 "/research/prip-baytasin/T-LSTM-master/TLSTM/final_model"
When I run this command, what is the final_model? Thanks. Could you give me an example?
Thank you very much!

Licence file

Can you update the licence file so that I can know which licence it is under?

Embedding and Padding

Hello team, I am curious whether you used an embedding layer in your model (I haven't found it in your paper yet). Did you also pad the data?

Input Data Format

Hi illidanlab,

Congrats on the great paper, and thank you for sharing the T-LSTM code!

I tried to run the code using the synthetic data provided, ‘Clustering_Data_1D.mat.tar.gz’, and the code works perfectly, but I am struggling to understand the input data format required for the T-LSTM AE. My aim is to run the code on different EHR data (rather than EMRBot or PPMI) to obtain patient representations using your method.

• In issue #10, the data format is described as (mini-batch size x sequence length x dimensionality).
o What is the intuition behind choosing the number of mini-batches? Do you group patients with a similar temporal pattern (e.g., number of visits) and dimensionality (e.g., lab values)?
o The dimensionality, in the paper and the synthetic data, is 5. I assume this refers to the number of lab values available for a patient. If so, what if a patient doesn't have exactly 5 lab values? Does padding this patient's missing data with zeros make sense? (See the padding sketch after this issue.)
• I am also interested in including diagnostic codes and medications in the input data. My understanding is that the current synthetic data only has lab values.
o Do you have any intuition about how I could also incorporate diagnostic or medication codes?

Thank you for your time in advance.
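For readers with the same padding question, here is a minimal zero-padding sketch. Whether zero padding is appropriate for T-LSTM inputs is an assumption here, not something the authors confirm above.

import numpy as np

def pad_sequences(seqs, dim):
    # seqs: list of arrays, each of shape [num_visits_i x dim].
    # Returns a [num_samples x max_len x dim] array, zero-padded at the end.
    max_len = max(len(s) for s in seqs)
    out = np.zeros((len(seqs), max_len, dim))
    for i, s in enumerate(seqs):
        out[i, :len(s), :] = s
    return out

# Example: two patients with 3 and 5 visits, 5 features each.
batch = pad_sequences([np.random.rand(3, 5), np.random.rand(5, 5)], dim=5)
print(batch.shape)  # (2, 5, 5)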

Dataset format

Hi,
I read your article; it is very good! I am trying to apply your code to another dataset, but I can't understand the format of the data in Clustering_Data_1D.mat. When using the generate_batches function you obtain x, t, and a, which are respectively 3D, 2D, and 1D arrays, but I can't understand what information they carry. Could you detail these three elements so that I can construct the same structure from my dataset and then apply your model?

Thank you very much.

Paul L
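Based on the input-format section above and the feed_dict in the memory-leak issue (lstm.input, lstm.time, lstm.labels), one plausible reading, an assumption rather than a confirmed answer from the authors, is:

import numpy as np

# Assumed interpretation of one generated batch (shapes illustrative only):
x = np.random.rand(32, 10, 5)         # 3D: [batch x sequence length x features], fed to lstm.input
t = np.random.rand(32, 10)            # 2D: [batch x sequence length] elapsed times, fed to lstm.time
a = np.random.randint(0, 2, size=32)  # 1D: [batch] one label per sample, fed to lstm.labels (assumed)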

Compatibility with TensorFlow 2

Hi, is this implementation compatible with TensorFlow 2?
Are you aware of any project reproducing the same kind of work with TensorFlow 2?
