
Bidirectional LSTM for dependency parsing

Disjoint predictions and complete classification accuracy in automated dependency parsing

A common approach to the dependency parsing problem is to apply the SHIFT-REDUCE algorithm in combination with neural networks that output the desired transition and/or label at each iteration [1][2]. This study compares the performance of different models for labeled dependency parsing.
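For concreteness, below is a minimal sketch of an arc-standard SHIFT-REDUCE loop, where the `predict_transition` callback stands in for the neural classifier. This is an illustrative helper under simplifying assumptions (no explicit ROOT token, unlabeled arcs), not code from this repository.

```python
def parse(sentence, predict_transition):
    """Sketch of an arc-standard transition loop.

    sentence: list of token indices.
    predict_transition: callable (stack, buffer) -> "SHIFT" | "LEFT-ARC" | "RIGHT-ARC".
    Returns a list of (head, dependent) arcs.
    """
    stack, buffer, arcs = [], list(range(len(sentence))), []
    while buffer or len(stack) > 1:
        action = predict_transition(stack, buffer)
        if action == "SHIFT" and buffer:
            stack.append(buffer.pop(0))              # move next word onto the stack
        elif action == "LEFT-ARC" and len(stack) >= 2:
            dependent = stack.pop(-2)                # second-to-top takes the top as head
            arcs.append((stack[-1], dependent))
        elif action == "RIGHT-ARC" and len(stack) >= 2:
            dependent = stack.pop()                  # top takes the second-to-top as head
            arcs.append((stack[-1], dependent))
        else:
            break                                    # no valid transition left
    return arcs
```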

First, an unlabeled dependency parser was implemented, consisting of a bi-LSTM with an MLP on top that outputs the selected transition. This model was then extended with a two-hidden-layer MLP that takes the representations of the head and the tail of the transition and outputs one of 49 labels. This MLP was then modified to additionally accept the parent of the current head as input. Another version was built that accepts the corresponding GloVe [3] word embeddings instead of the LSTM output vectors. Finally, an architecture was created with a bi-LSTM followed by a single MLP that directly predicts one of 99 possible labeled transitions.
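The sketch below illustrates the labeled variant described above: a bi-LSTM encodes the sentence, one MLP maps the concatenated head and tail states to a transition, and a second two-hidden-layer MLP maps them to one of the 49 labels. The use of `tf.keras`, the layer sizes, and the three-way transition output are illustrative assumptions, not the repository's actual code.

```python
import tensorflow as tf

# Illustrative hyperparameters (not taken from the repository).
VOCAB, EMB, HIDDEN, N_TRANSITIONS, N_LABELS = 20000, 100, 128, 3, 49

# Sentence encoder: token ids -> contextual bi-LSTM states.
tokens = tf.keras.Input(shape=(None,), dtype="int32", name="tokens")
embedded = tf.keras.layers.Embedding(VOCAB, EMB)(tokens)
states = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(HIDDEN, return_sequences=True))(embedded)
encoder = tf.keras.Model(tokens, states, name="bilstm_encoder")

# Classifier heads over the encoder states of the transition's head and tail words.
head_state = tf.keras.Input(shape=(2 * HIDDEN,), name="head_state")
tail_state = tf.keras.Input(shape=(2 * HIDDEN,), name="tail_state")
pair = tf.keras.layers.Concatenate()([head_state, tail_state])

# MLP predicting the transition (assumed 3-way: SHIFT / LEFT-ARC / RIGHT-ARC).
transition_hidden = tf.keras.layers.Dense(HIDDEN, activation="relu")(pair)
transition_out = tf.keras.layers.Dense(
    N_TRANSITIONS, activation="softmax", name="transition")(transition_hidden)

# Two-hidden-layer MLP predicting one of the 49 dependency labels.
label_hidden = tf.keras.layers.Dense(HIDDEN, activation="relu")(pair)
label_hidden = tf.keras.layers.Dense(HIDDEN, activation="relu")(label_hidden)
label_out = tf.keras.layers.Dense(
    N_LABELS, activation="softmax", name="label")(label_hidden)

classifier = tf.keras.Model([head_state, tail_state],
                            [transition_out, label_out],
                            name="transition_and_label_mlps")
```

In such a setup the encoder runs once per sentence, while the two MLPs are applied at every transition step on the states of the words currently involved.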

The purpose of this work is to evaluate these different architectures.

See the report for further details.

Notes

Additional files are needed to run the project:

To run the project, simply run `python main.py`.

References

[1] James Cross and Liang Huang. 2016. Incremental parsing with minimal features using bi-directional LSTM. CoRR, abs/1606.06406.

[2] Eliyahu Kiperwasser and Yoav Goldberg. 2016. Simple and accurate dependency parsing using bidirectional LSTM feature representations. CoRR, abs/1603.04351.

[3] Jeffrey Pennington, Richard Socher, and Christopher D. Manning. 2014. GloVe: Global vectors for word representation. In EMNLP.
