Comments (6)
Sorry for the late reply.
I didn't test any other embedding method because it isn't the focus of this paper.
Did you try shortening the embedding dimension to reduce the number of trainable parameters?
from disan.
Thank you for your answer.
- What do you mean by shortening the embedding dimension?
- Where do you train the embedding? I can't find it in the code. I know that you fine-tune GloVe, but I can't see where that happens.
- Finally (you can skip this if you think it's out of scope): should we fine-tune all the embeddings, or is fine-tuning GloVe alone enough?
By the way, I know this drop was caused by the dimensionality of the embedding: I tested the same thing with GloVe, but built emb_mat so that it contains all tokens rather than only the unique ones, and the performance dropped sharply.
Thank you again, Tao. I would appreciate a quick response to these questions.
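For context, the embedding matrix described above is normally indexed by the unique vocabulary, so repeated tokens share one row instead of duplicating it. A minimal NumPy sketch of this (the helper name `build_emb_mat` and the toy data are my own, not from the repo):

```python
import numpy as np

def build_emb_mat(vocab, glove, dim, seed=0):
    """Build an embedding matrix with one row per *unique* token.

    vocab : list of unique tokens (list index = row in the matrix)
    glove : dict mapping token -> pretrained vector of shape (dim,)
    Tokens missing from GloVe get a small random initialization.
    """
    rng = np.random.RandomState(seed)
    emb_mat = np.empty((len(vocab), dim), dtype=np.float32)
    for i, tok in enumerate(vocab):
        emb_mat[i] = glove.get(tok, rng.uniform(-0.05, 0.05, dim))
    return emb_mat

# Corpus tokens are mapped to row indices, so a token that occurs
# many times still contributes only one row to emb_mat.
token_to_id = {"the": 0, "cat": 1, "sat": 2}
glove = {"the": np.ones(4, dtype=np.float32)}  # toy pretrained vector
emb_mat = build_emb_mat(list(token_to_id), glove, dim=4)
```

Building the matrix over every token occurrence instead (as described in the comment above) blows up the number of rows, and therefore the number of trainable parameters.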
from disan.
- I mean that an overly long embedding dimension leads to too many trainable parameters, which makes it difficult to reach convergence.
- I build a trainable word embedding matrix that is initialized from GloVe.
- ... I didn't understand that question.
from disan.
I was asking where in the code the word representations are tuned. And in general, if I use another word embedding, is it possible to skip this tuning?
from disan.
Got it. Over the past few days I have learned some basics about transferred/pretrained embeddings such as ELMo and CoVe.
Other embeddings can also be fine-tuned in this framework. In the current GloVe setup, I create a trainable TensorFlow variable that is initialized from the pretrained GloVe vectors; the variable is then fine-tuned automatically by backpropagation.
Therefore, in your setup, just mark the variables in the word embedding layer or the pretrained LSTM as trainable.
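The setup described above can be sketched roughly as follows. This is a minimal TF 1.x-style illustration (variable and placeholder names are my own, and the GloVe matrix is faked with random numbers), not the repo's actual code:

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF 1.x graph-mode style

vocab_size, emb_dim = 100, 50
# Stand-in for the real pretrained GloVe matrix.
pretrained_glove = np.random.randn(vocab_size, emb_dim).astype(np.float32)

# A variable initialized from the pretrained matrix; trainable=True
# means backpropagation will fine-tune the embeddings automatically.
word_emb = tf.compat.v1.get_variable(
    "word_emb",
    initializer=pretrained_glove,
    trainable=True,  # set False to freeze the pretrained vectors
)

token_ids = tf.compat.v1.placeholder(tf.int32, shape=[None, None])
embedded = tf.nn.embedding_lookup(word_emb, token_ids)
```

Setting `trainable=False` instead would keep the GloVe vectors fixed, which is the choice the question above is really about.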
from disan.
Thank you!
from disan.