intellabs / nlp-architect

A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks

Home Page: https://intellabs.github.io/nlp-architect

License: Apache License 2.0

Languages: Python 81.42% Perl 3.50% HTML 0.02% JavaScript 1.16% CSS 1.12% Jupyter Notebook 11.93% Shell 0.16% Makefile 0.61% Dockerfile 0.06%
Topics: deeplearning nlp nlu tensorflow dynet deep-learning pytorch bert transformers quantization

nlp-architect's Introduction

⚠️ DISCONTINUATION OF PROJECT - This project will no longer be maintained by Intel. This project has been identified as having known security escapes. Intel has ceased development and contributions including, but not limited to, maintenance, bug fixes, new releases, or updates, to this project. Intel no longer accepts patches to this project.



A Deep Learning NLP/NLU library by Intel® AI Lab


NLP Architect is an open source Python library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing and Natural Language Understanding Neural Networks.

Overview

NLP Architect is an NLP library designed to be flexible and easy to extend, to allow easy and rapid integration of NLP models in applications, and to showcase optimized models.

Features:

  • Core NLP models used in many NLP tasks and useful in many NLP applications

  • Novel NLU models showcasing novel topologies and techniques

  • Optimized NLP/NLU models showcasing different optimization algorithms on neural NLP/NLU models

  • Model-oriented design:

    • Train and run models from the command line.
    • API for using models for inference in Python.
    • Procedures to define custom processes for training, inference, or anything related to processing.
    • CLI sub-system for running procedures.
  • Based on optimized Deep Learning frameworks:

    • TensorFlow
    • PyTorch
    • Dynet

  • Essential utilities for working with NLP models - text/string pre-processing, I/O, data manipulation, metrics, embeddings.

Installing NLP Architect

We recommend installing NLP Architect in a new Python environment, using Python 3.6+ with up-to-date pip, setuptools and h5py.
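
For example, a minimal setup along these lines (the environment name is illustrative, borrowed from the set-expansion issue further below):

python3 -m venv .nlp_architect_env
source .nlp_architect_env/bin/activate
pip install -U pip setuptools h5py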

Install using pip

Install core library only

pip install nlp-architect

Install from source (Github)

Includes core library, examples, solutions and tutorials:

git clone https://github.com/IntelLabs/nlp-architect.git
cd nlp-architect
pip install -e .  # install in developer mode

Running Examples and Solutions

To run the provided examples and solutions, install the library with the [all] flag, which installs the extra packages required (this requires installation from source):

pip install .[all]

Models

NLP models that provide best-in-class (or near best-in-class) performance:

Natural Language Understanding (NLU) models that address semantic understanding:

Optimizing NLP/NLU models and misc. optimization techniques:

Solutions (End-to-end applications) using one or more models:

Documentation

Full library documentation of NLP models, algorithms, solutions and instructions on how to run each model can be found on our website.

NLP Architect library design philosophy

NLP Architect is a model-oriented library designed to showcase novel and different neural network optimizations. The library contains NLP/NLU models per task, different neural network topologies (which are used in models), procedures for simplifying workflows in the library, pre-defined data processors and dataset loaders, and miscellaneous utilities. The library is designed to be a tool for model development: data pre-processing, model building, training, validating, inferring, and saving or loading a model.

The main design guidelines are:

  • Deep Learning framework agnostic
  • NLP/NLU models per task
  • Different topologies used in models
  • Showcase end-to-end applications (Solutions) utilizing one or more NLP Architect models
  • Generic dataset loaders, textual data processing utilities, and miscellaneous utilities that support NLP model development (loaders, text processors, io, metrics, etc.)
  • Procedures for defining processes for training, inference, optimization or any kind of elaborate script.
  • Pythonic API for using models for inference
  • Extensive model documentation and tutorials

Note

NLP Architect is an active space of research and development; throughout future releases, new models, solutions, topologies, and framework additions and changes will be made. We aim to ensure all models run with Python 3.6+. We encourage researchers and developers to contribute their work to the library.

Citing

If you use NLP Architect in your research, please use the following citation:

@misc{izsak_peter_2018_1477518,
  title        = {NLP Architect by Intel AI Lab},
  month        = nov,
  year         = 2018,
  doi          = {10.5281/zenodo.1477518},
  url          = {https://doi.org/10.5281/zenodo.1477518}
}

Disclaimer

NLP Architect is released as reference code for research purposes. It is not an official Intel product, and the level of quality and support may not be as expected from an official product. NLP Architect is intended to be used locally and has not been designed, developed or evaluated for production usage or web deployment. Additional algorithms and environments are planned to be added to the framework. Feedback and contributions from the open source and NLP research communities are more than welcome.

Contact

Contact the NLP Architect development team through Github issues or email: [email protected]

nlp-architect's People

Contributors

8key, aloneirew, andy-nervana, ardenma-intel, bethke, brettkoonce, chinnikrishna2231, danielkorat, deeptig7, dependabot[bot], heavani, hkvision, indie, jmamou, maheshwarigagan, maneesht, michaelbeale-il, nlsapp, ofirzaf, orenpereg, peteriz, piesposito, s1113950, several27, sharathns93, shira-g, stevesolun, vasudevl, yinyinl, yoptar


nlp-architect's Issues

pip install nlp-architect fails on Azure notebook service

Target objective:

Trying to install nlp-architect on an Azure Notebooks VM using pip install nlp-architect, and it throws the following error.

Installing collected packages: tensorboard, mock, tensorflow-estimator, tensorflow, docopt, num2words, cymem, plac, wasabi, blis, preshed, murmurhash, srsly, thinc, spacy, sklearn, feedparser, jieba3k, tinysegmenter, requests-file, tldextract, cssselect, feedfinder2, newspaper3k, ftfy, marisa-trie, langcodes, regex, wordfreq, elasticsearch, python-mimeparse, falcon, pymongo, hyperopt, dynet, seqeval, tensorflow-hub, falcon-multipart, pywikibot, hug, nlp-architect
  Found existing installation: tensorboard 1.14.0
    Uninstalling tensorboard-1.14.0:
      Successfully uninstalled tensorboard-1.14.0
  Found existing installation: tensorflow-estimator 1.14.0
    Uninstalling tensorflow-estimator-1.14.0:
      Successfully uninstalled tensorflow-estimator-1.14.0
Exception:
Traceback (most recent call last):
  File "/anaconda/envs/azureml_py36/lib/python3.6/site-packages/pip/basecommand.py", line 215, in main
    status = self.run(options, args)
  File "/anaconda/envs/azureml_py36/lib/python3.6/site-packages/pip/commands/install.py", line 342, in run
    prefix=options.prefix_path,
  File "/anaconda/envs/azureml_py36/lib/python3.6/site-packages/pip/req/req_set.py", line 784, in install
    **kwargs
  File "/anaconda/envs/azureml_py36/lib/python3.6/site-packages/pip/req/req_install.py", line 851, in install
    self.move_wheel_files(self.source_dir, root=root, prefix=prefix)
  File "/anaconda/envs/azureml_py36/lib/python3.6/site-packages/pip/req/req_install.py", line 1064, in move_wheel_files
    isolated=self.isolated,
  File "/anaconda/envs/azureml_py36/lib/python3.6/site-packages/pip/wheel.py", line 377, in move_wheel_files
    clobber(source, dest, False, fixer=fixer, filter=filter)
  File "/anaconda/envs/azureml_py36/lib/python3.6/site-packages/pip/wheel.py", line 329, in clobber
    os.utime(destfile, (st.st_atime, st.st_mtime))
PermissionError: [Errno 1] Operation not permitted
You are using pip version 9.0.1, however version 19.2.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.

Steps to objective:

  • pip install nlp-architect

Pull-Request related:

N/A

Problem with downloading pre-trained models

Description:

Hello everyone. I tried to launch the REST server as in the following tutorial: Link. However, while the server was starting, some pre-trained models were downloaded, and it seems some of the model files contain only an error message. Starting the server crashes with a KeyError when trying to unpickle model_info.dat in ner_api.py. Contents of model_info.dat after download:
<Error><Code>NoSuchKey</Code><Message>The specified key does not exist.</Message><Key>NLP/NER/model_info.dat</Key><RequestId>7113734D2CFD0F64</RequestId><HostId>L8Szl6PSef3HNZ7nKMF20Ji2xjDfSpxPiAo6CnlBvrOJyffPYxHtMJTf+GWAwuw+mOlLO+vH6uw=</HostId></Error>

Steps to reproduce:

  • Clone nlp-architect
  • Make environment (make in root)
  • Launch server (hug -p 8080 -f server/serve.py)
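
Since the downloaded model_info.dat contains an S3 XML error page rather than a pickle, a minimal sanity-check sketch along these lines (the file path is illustrative) can catch the bad download before unpickling:

# Hedged sketch: detect an S3 error page saved in place of a model file.
with open('model_info.dat', 'rb') as fp:
    head = fp.read(7)
if head.startswith(b'<Error>') or head.startswith(b'<?xml'):
    raise SystemExit('Downloaded file is an S3 error page, not a model; delete it and re-download')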

Error in set expansion

Installed using the installation guide, with these commands:
python3 -m venv .nlp_architect_env
source .nlp_architect_env/bin/activate
export NLP_ARCHITECT_BE=GPU
pip install nlp-architect

then running the command:
(.nlp_architect_env) (base) [moranb@nlp01 ~]$ python -m nlp_architect.solutions.set_expansion.prepare_data --corpus all_sentences_commits

produces this error:
Traceback (most recent call last):
  File "/home/nlp/moranb/anaconda3/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/nlp/moranb/anaconda3/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/nlp/moranb/anaconda3/lib/python3.7/site-packages/nlp_architect/solutions/set_expansion/prepare_data.py", line 31, in <module>
    from nlp_architect.pipelines.spacy_np_annotator import NPAnnotator, get_noun_phrases
  File "/home/nlp/moranb/anaconda3/lib/python3.7/site-packages/nlp_architect/pipelines/spacy_np_annotator.py", line 24, in <module>
    from nlp_architect.models.chunker import SequenceChunker
  File "/home/nlp/moranb/anaconda3/lib/python3.7/site-packages/nlp_architect/models/chunker.py", line 19, in <module>
    import tensorflow as tf
  File "/home/nlp/moranb/anaconda3/lib/python3.7/site-packages/tensorflow/__init__.py", line 24, in <module>
    from tensorflow.python import pywrap_tensorflow  # pylint: disable=unused-import
  File "/home/nlp/moranb/anaconda3/lib/python3.7/site-packages/tensorflow/python/__init__.py", line 49, in <module>
    from tensorflow.python import pywrap_tensorflow
  File "/home/nlp/moranb/anaconda3/lib/python3.7/site-packages/tensorflow/python/pywrap_tensorflow.py", line 74, in <module>
    raise ImportError(msg)
ImportError: Traceback (most recent call last):
  File "/home/nlp/moranb/anaconda3/lib/python3.7/site-packages/tensorflow/python/pywrap_tensorflow.py", line 58, in <module>
    from tensorflow.python.pywrap_tensorflow_internal import *
  File "/home/nlp/moranb/anaconda3/lib/python3.7/site-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 28, in <module>
    _pywrap_tensorflow_internal = swig_import_helper()
  File "/home/nlp/moranb/anaconda3/lib/python3.7/site-packages/tensorflow/python/pywrap_tensorflow_internal.py", line 24, in swig_import_helper
    _mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
  File "/home/nlp/moranb/anaconda3/lib/python3.7/imp.py", line 242, in load_module
    return load_dynamic(name, filename, file)
  File "/home/nlp/moranb/anaconda3/lib/python3.7/imp.py", line 342, in load_dynamic
    return _load(spec)
ImportError: /lib64/libm.so.6: version `GLIBC_2.23' not found (required by /home/nlp/moranb/anaconda3/lib/python3.7/site-packages/tensorflow/python/_pywrap_tensorflow_internal.so)

Failed to load the native TensorFlow runtime.

See https://www.tensorflow.org/install/errors

Do you know what could be the problem here?

Thanks,
Moran

How does it support Chinese?

I saw in the paper for BIST that it works well in both Chinese and English, but I didn't find how to use a Chinese model in any of the functionalities. Does it come with a Chinese model, or do we have to train our own?

GPU training support

This is more of a question than an issue, but I wasn't able to find a forum to ask it.

I see installation instructions for enabling GPU support. Yet many algorithms (e.g. NER) do not have a parameter specifically for enabling the GPU. Is this easy to work around, or is it going to be a feature supported in the future?
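
As a side note, the installation docs select the backend through the NLP_ARCHITECT_BE environment variable (it appears in other issues here, e.g. export NLP_ARCHITECT_BE=GPU), and whether TensorFlow itself can see a GPU can be checked with a generic TF 1.x call; this is not an nlp-architect API:

import tensorflow as tf              # TF 1.x, the version pinned by nlp-architect at the time
print(tf.test.is_gpu_available())    # True only if the TF build and drivers expose a GPU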

How to use MLP classifier on multi-context features ?

Hello, I read your published paper "Term Set Expansion based NLP Architect by Intel AI Lab" and am quite impressed by your multi-context-based method. However, the code associated with your method in the repository does not seem to implement it. In your paper, you mention that

we train a Multilayer Perceptron (MLP) classifier that predicts whether a candidate term should be part of the expanded set based on ten similarity scores (considered as input features) obtained by the five different context types and two different similarity-scoring methods

but in the code https://github.com/NervanaSystems/nlp-architect/blob/master/nlp_architect/solutions/set_expansion/set_expand.py, as I understand it, you used another, simpler method based on the similarity between a single embedding of the seeds and the candidates, and I cannot find an implementation of the algorithm from the paper.
Is there any code in the repository that implements the algorithm from the paper? Thank you.
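
For readers of this issue, here is a hedged sketch of the MLP re-ranker as described in the quoted paper text — not the repository's implementation; the features and labels below are synthetic placeholders:

# Ten similarity scores per candidate (5 context types x 2 scoring methods)
# feed an MLP that predicts membership in the expanded set.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.RandomState(0)
X = rng.rand(1000, 10)                   # placeholder similarity features
y = (X.mean(axis=1) > 0.5).astype(int)   # placeholder labels
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X, y)
print(clf.predict(X[:5]))                # 1 = candidate belongs in the expanded set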

Problem running bist FileNotFoundError: [Errno 2] No such file or directory: (...) params.json

I tried to run bist in your interface using the command:

python3 server/serve.py --name bist

I get the following error:

25MB [00:28,  1.16s/MB]                                                                                                                                                                                     
Download Complete
Unzipping...
Done.
Traceback (most recent call last):
  File "serve.py", line 306, in <module>
    set_server_properties(app, args.name)
  File "serve.py", line 287, in set_server_properties
    service = Service(service_name)
  File "serve.py", line 143, in __init__
    self.service = self.load_service(service_name)
  File "serve.py", line 259, in load_service
    upload_service.load_model()
  File "/home/catalina/nlp-architect/nlp_architect/api/bist_parser_api.py", line 32, in load_model
    self.model = SpacyBISTParser()
  File "/home/catalina/nlp-architect/nlp_architect/pipelines/spacy_bist.py", line 48, in __init__
    self.bist_parser.load(bist_model if bist_model else SpacyBISTParser.pretrained)
  File "/home/catalina/nlp-architect/nlp_architect/models/bist_parser.py", line 117, in load
    with open(os.path.join(os.path.dirname(path), 'params.json'), 'r') as file:
FileNotFoundError: [Errno 2] No such file or directory: '/home/catalina/nlp-architect/nlp_architect/pipelines/bist-pretrained/params.json'
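
Until the archive issue is resolved, a hedged workaround sketch is to verify that the pretrained directory actually unpacked before constructing the parser (the path is copied from the trace above):

import os

params = '/home/catalina/nlp-architect/nlp_architect/pipelines/bist-pretrained/params.json'
if not os.path.exists(params):
    raise SystemExit('bist-pretrained looks incomplete; delete the directory and re-download')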

ner example throws ImportError

Target objective:

It would be great if the ner example did not throw an ImportError: cannot import name 'saving' error, which happens because of a broken keras-contrib package.

Steps to objective:

  • modify requirements.txt to reference a specific working commit of the keras-contrib package (a hedged example of such a pin is sketched below)

Pull-Request related:

#3 could do it
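
A hedged example of such a pin in requirements.txt (the commit SHA is a placeholder, not a verified working commit):

git+https://github.com/keras-team/keras-contrib.git@<commit-sha>#egg=keras_contrib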

NP2Vec model is not working

Target objective:

Train an np2vec model using the corpus provided for the pre-trained model.

Steps to objective:

Running the following command:

python3 examples/np2vec/train.py --size 200 --min_count 2 --window 10 --hs 0 --corpus ../enwiki-20171201_pretrained_set_expansion.txt --corpus_format txt --word_embedding_type fasttext

Trains a model for a few minutes and then throws a gensim-related error:

Traceback (most recent call last):
  File "examples/np2vec/train.py", line 216, in <module>
    args.word_ngrams)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/nlp_architect/models/np2vec.py", line 177, in __init__
    self._train()
  File "/home/ubuntu/.local/lib/python3.6/site-packages/nlp_architect/models/np2vec.py", line 216, in _train
    word_ngrams=self.word_ngrams)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/gensim/models/fasttext.py", line 388, in __init__
    seed=seed, hs=hs, negative=negative, cbow_mean=cbow_mean, min_alpha=min_alpha, fast_version=FAST_VERSION)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/gensim/models/base_any2vec.py", line 763, in __init__
    end_alpha=self.min_alpha, compute_loss=compute_loss)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/gensim/models/fasttext.py", line 666, in train
    queue_factor=queue_factor, report_delay=report_delay, callbacks=callbacks)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/gensim/models/base_any2vec.py", line 1081, in train
    **kwargs)
  File "/home/ubuntu/.local/lib/python3.6/site-packages/gensim/models/base_any2vec.py", line 546, in train
    for cur_epoch in range(self.epochs):
TypeError: 'builtin_function_or_method' object cannot be interpreted as an integer
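
For reference, this TypeError arises whenever range() receives a built-in function instead of an int; a minimal reproduction, independent of gensim internals:

# If an attribute that should be an int (like epochs) is accidentally bound
# to a built-in function, range() fails with exactly this message.
epochs = iter
for _ in range(epochs):  # TypeError: 'builtin_function_or_method' object cannot be interpreted as an integer
    pass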

Pull-Request related:

So far I've been able to fix another smaller issue, related to simple array assignment: #18 . I don't think that those two are related, however.

I'd appreciate any help or guidance, thanks!

ABSA ImportError: cannot import name 'keras'

Target objective:

Trying to run the ABSA train script from a Jupyter notebook.

import argparse
from os import path
from pathlib import Path

from nlp_architect.models.absa.train.train import TrainSentiment
from nlp_architect.utils.io import validate_existing_filepath, validate_existing_path, \
    validate_existing_directory


def main() -> None:
    lib_root = Path(path.realpath(__file__)).parent.parent.parent.parent
    tripadvisor_train = lib_root / 'datasets' / 'absa' / \
        'tripadvisor_co_uk-travel_restaurant_reviews_sample_2000_train.csv'

    parser = argparse.ArgumentParser(description='ABSA Train')
    parser.add_argument('--rerank-model', type=validate_existing_filepath,
                        default=None, help='Path to rerank model .h5 file')

    group = parser.add_mutually_exclusive_group()
    group.add_argument('--data', type=validate_existing_path,
                       default=tripadvisor_train,
                       help='Path to raw data (directory or txt/csv file)')
    group.add_argument('--parsed-data', type=validate_existing_directory, default=None,
                       help='Path to parsed data directory')
    args = parser.parse_args()

    train = TrainSentiment(parse=not args.parsed_data, rerank_model=args.rerank_model)
    opinion_lex, aspect_lex = train.run(data=args.data, parsed_data=args.parsed_data)

    print('Aspect Lexicon: {}\n'.format(aspect_lex) + '=' * 40 + '\n')
    print('Opinion Lexicon: {}'.format(opinion_lex))

Steps to objective:

  1. Install nlp-architect
  2. Run cell above
  3. Get following error with keras
ImportError                               Traceback (most recent call last)
<ipython-input-5-f7ccc30f95ae> in <module>()
      3 from pathlib import Path
      4 
----> 5 from nlp_architect.models.absa.train.train import TrainSentiment
      6 from nlp_architect.utils.io import validate_existing_filepath, validate_existing_path,     validate_existing_directory
      7 

/mnt/azmnt/code/Users/nlp-architect/nlp_architect/models/absa/train/train.py in <module>()
     19 from nlp_architect.models.absa import TRAIN_OUT
     20 from nlp_architect.models.absa.train.acquire_terms import AcquireTerms
---> 21 from nlp_architect.models.absa.train.rerank_terms import RerankTerms
     22 from nlp_architect.models.absa.utils import parse_docs, _download_pretrained_rerank_model, \
     23     _write_aspect_lex, _write_opinion_lex

/mnt/azmnt/code/Users/nlp-architect/nlp_architect/models/absa/train/rerank_terms.py in <module>()
     17 import pickle
     18 import numpy as np
---> 19 import tensorflow
     20 from os import PathLike
     21 

/anaconda/envs/azureml_py36/lib/python3.6/site-packages/tensorflow/__init__.py in <module>()
     41 from tensorflow._api.v1 import initializers
     42 from tensorflow._api.v1 import io
---> 43 from tensorflow._api.v1 import keras
     44 from tensorflow._api.v1 import layers
     45 from tensorflow._api.v1 import linalg

ImportError: cannot import name 'keras'

Pull-Request related:

Logger logs 'done loading mention files' before it actually does

Target objective:

Steps to objective:

nlp_architect/models/cross_doc_coref/system/cdc_utils.py

def load_mentions_vocab(mentions_files, filter_stop_words=False):
    logger.info('Loading mentions files...')
    mentions = []
    logger.info('Done loading mentions files, starting local dump creation...')
    for _file in mentions_files:
        mentions.extend(MentionData.read_mentions_json_to_mentions_data_list(_file))

    return extract_vocab(mentions, filter_stop_words)
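
A minimal fix sketch for the snippet above: emit the "done" message only after the loop completes.

def load_mentions_vocab(mentions_files, filter_stop_words=False):
    logger.info('Loading mentions files...')
    mentions = []
    for _file in mentions_files:
        mentions.extend(MentionData.read_mentions_json_to_mentions_data_list(_file))
    logger.info('Done loading mentions files, starting local dump creation...')
    return extract_vocab(mentions, filter_stop_words)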

Pull-Request related:

Error downloading datasets

Target objective:

REST server initial download

Steps to objective:

  • clone repository and install
  • nlp_architect server -p 8080
  • answer YES to download archive files

Pull-Request related:

Trace:

mrc_model can be downloaded from http://nervana-modelzoo.s3.amazonaws.com/NLP/mrc/mrc_model.zip
The terms and conditions of the data set license apply. Intel does not grant any rights to the data files or database

To download 'mrc_model' from http://nervana-modelzoo.s3.amazonaws.com/NLP/mrc/mrc_model.zip, please enter YES: YES
Downloading mrc_model...
Unable to determine total file size.
Downloading file to: /root/nlp-architect/mrc-pretrained/mrc_data/mrc_data.zip
1MB [00:00, 1045.70MB/s]
Download Complete
Unable to determine total file size.
Downloading file to: /root/nlp-architect/mrc-pretrained/mrc_trained_model/mrc_model.zip
1MB [00:00, 1985.94MB/s]
Download Complete
Traceback (most recent call last):
  File "/usr/local/bin/hug", line 10, in <module>
    sys.exit(development_runner.hug.interface.cli())
  File "/usr/local/lib/python3.6/site-packages/hug/interface.py", line 551, in __call__
    raise exception
  File "/usr/local/lib/python3.6/site-packages/hug/interface.py", line 547, in __call__
    result = self.output(self.interface(**pass_to_function), context)
  File "/usr/local/lib/python3.6/site-packages/hug/interface.py", line 100, in __call__
    return __hug_internal_self._function(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/hug/development_runner.py", line 59, in hug
    api_module = importlib.machinery.SourceFileLoader(file.split(".")[0], file).load_module()
  File "<frozen importlib._bootstrap_external>", line 399, in _check_name_wrapper
  File "<frozen importlib._bootstrap_external>", line 823, in load_module
  File "<frozen importlib._bootstrap_external>", line 682, in load_module
  File "<frozen importlib._bootstrap>", line 265, in _load_module_shim
  File "<frozen importlib._bootstrap>", line 684, in _load
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/src/nlp-architect/nlp_architect/server/serve.py", line 103, in <module>
    prefetch_models()
  File "/src/nlp-architect/nlp_architect/server/serve.py", line 36, in prefetch_models
    services[model] = Service(model)
  File "/usr/local/lib/python3.6/site-packages/nlp_architect/server/service.py", line 117, in __init__
    self.service = self.load_service(service_name)
  File "/usr/local/lib/python3.6/site-packages/nlp_architect/server/service.py", line 181, in load_service
    upload_service.load_model()
  File "/usr/local/lib/python3.6/site-packages/nlp_architect/api/machine_comprehension_api.py", line 107, in load_model
    self.download_model()
  File "/usr/local/lib/python3.6/site-packages/nlp_architect/api/machine_comprehension_api.py", line 96, in download_model
    data_zip_ref = zipfile.ZipFile(data_zipfile, 'r')
  File "/usr/local/lib/python3.6/zipfile.py", line 1131, in __init__
    self._RealGetContents()
  File "/usr/local/lib/python3.6/zipfile.py", line 1198, in _RealGetContents
    raise BadZipFile("File is not a zip file")
zipfile.BadZipFile: File is not a zip file
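
A quick hedged check (the path is taken from the trace above) to confirm that the download is not a valid archive, e.g. an error page like the NoSuchKey XML seen in another issue here:

import zipfile
# False means the download is not a zip at all (e.g. an XML error page)
print(zipfile.is_zipfile('/root/nlp-architect/mrc-pretrained/mrc_data/mrc_data.zip'))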

val_loss is too high while training loss is small in intent extraction

Target objective:

When using the file [train_mlt_model.py] for intent extraction, the val_loss is very high whereas the training loss is relatively small. However, the conll eval results are all very good.

Steps to objective:

  • Download the Dataset [nlu-benchmark] and run the file [train_mlt_model.py] with command:
    python train_mlt_model.py --dataset_path nlu-benchmark/2017-06-custom-intent-engines -b 100 -e 5
  • After 5 epochs, the console shows:
    13784/13784 [==============================] - 11s 817us/step - loss: 3.1504 - intent_classifier_output_loss: 0.0971 - intent_slot_crf_loss: 3.0533 - intent_classifier_output_categorical_accuracy: 0.9710 - intent_slot_crf_accuracy: 0.9671 - val_loss: 95.6078 - val_intent_classifier_output_loss: 0.0791 - val_intent_slot_crf_loss: 95.5287 - val_intent_classifier_output_categorical_accuracy: 0.9771 - val_intent_slot_crf_accuracy: 0.9740
  • Here, the val_loss is 95.6 whereas the training loss is only 3.15.

Pull-Request related:

Validate python version

Target objective:

Validate the supported Python version when the library is used.
Related to the error in #60.

Pull-Request related:

NP semantic segmentation related errors

Target objective:

I am trying to run the script data.py from np_semantic_segmentation and getting an error that it cannot allocate memory.

Steps to objective:

step 1:
run the command: python data.py --data input_data_path.csv --output output_prepared_path.csv --w2v_path <path_to_w2v>/GoogleNews-vectors-negative300.bin.gz

step 2:
on the terminal I get
start loading wordvec model
...
finish loading feature extraction services.
...
and then an error such as

**file "data.py ", line 354
prepare_data(data_path, output_path, word2vec_path, http_proxy, https_proxy)

file "data.py ", line 168, in prepare_data
p = Pool(10)

self.pid = os.fork()
OSError: [Errno 12] Cannot allocate memory**

My system configuration is :
Dell Inspiron,
memory : 3.8 GiB
processor : Intel® Core™ i5-5200U CPU @ 2.20GHz × 4
OS type : 64-bit
Disk : 980.2 GB

Kindly address my problem and give me suggestions.
Thanking You,
Nayan choudhari.
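
On a 3.8 GiB machine, forking ten workers that each inherit the parent's loaded word2vec model can plausibly exhaust memory. A hedged workaround sketch is to size the pool to the machine instead of hard-coding Pool(10):

import os
from multiprocessing import Pool

# Fewer forked workers means fewer copies of the parent's memory pages.
workers = max(1, (os.cpu_count() or 2) // 2)
p = Pool(workers)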

nlp_architect command fails if dev req aren't installed

Target objective:

The nlp_architect command fails if the dev requirements aren't installed.
Detect whether the required libraries are installed and print a message that states the error and how to install them.

Steps to objective:

  • fix cmd.py

Pull-Request related:

Fix logging

Target objective:

Fix logging to a standard format

Pull-Request related:

trend analysis pickle bug

Hello,

I would like to test the project, especially the topic extraction, but when I run the command:
python -m nlp_architect.solutions.trend_analysis.topic_extraction myfolder myfolder
I have this error:

INFO:__main__:extracting NPs
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "/home/nlp-architect/nlp_architect/solutions/trend_analysis/topic_extraction.py", line 52, in noun_phrase_extraction
    np_app = NPScorer(parser=parser)
  File "/home/nlp-architect/nlp_architect/solutions/trend_analysis/np_scorer.py", line 56, in __init__
    self.nlp.add_pipe(NPAnnotator.load(_path_to_model, _path_to_params), last=True)
  File "/home/nlp-architect/nlp_architect/pipelines/spacy_np_annotator.py", line 72, in load
    model.load(_model_path)
  File "/home/nlp-architect/nlp_architect/models/chunker.py", line 213, in load
    load_model(filepath, self)
  File "/home/nlp-architect/nlp_architect/contrib/tensorflow/python/keras/utils/layer_utils.py", line 49, in load_model
    model_data = pickle.load(fp)
_pickle.UnpicklingError: invalid load key, '<'.
"""
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/nlp-architect/nlp_architect/solutions/trend_analysis/topic_extraction.py", line 331, in <module>
    main(args.target_corpus, args.ref_corpus, args.single_thread, args.no_train, args.url)
  File "/home/nlp-architect/nlp_architect/solutions/trend_analysis/topic_extraction.py", line 301, in main
    result_t = run_np_t.get()
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 608, in get
    raise self._value
_pickle.UnpicklingError: invalid load key, '<'.

I am on Google Compute Engine, Ubuntu 16.04.

Any idea how to fix this?

Thank you!

kvmemn2n training time - help

Target objective:

I am running train_kvmemn2n.py on a MacBook Pro with epochs=2000. It is completing about 10 epochs per day, and at this rate I estimate 200 days to complete the training. Any ideas on how to speed it up? Does it need to train for 2000 epochs to create the model? Help appreciated. Is there a pre-built model I can leverage and add to?

Steps to objective:


Pull-Request related:

Intent Extraction: 'GetWeather' intent: Web viewer fails to show the annotation/label for "timeRange"

Target objective:

I'm using the intent extraction demo with the pre-trained model that is downloaded when you start the embedded server. This is my example sentence:
"What's the forecast for today in Chicago?"

Below is the annotated result:

{'model_name': 'intent_extraction', 'docs': [{'id': 1, 'doc': "What's the forecast for today in Chicago?"}]}
Detected intent type: GetWeather
What	O	
's	O	
the	O	
forecast	O	
for	O	
today	B-timeRange	
in	O	
Chicago	B-city	
?	O	

As you can see, "today" is correctly annotated (in the terminal and in the JSON response) as a timeRange. The problem is that the visualizer on the web view only shows the annotation for the city. The timeRange is not visualized with its annotation.

Pull-Request related:

nlp_architect v. 0.4 Post 2

Question about Seq2SeqIntentModel

This is used for slot tagging, and in the example you provide, the dataset is SNIPS and it is used for NER. I am wondering why it is placed under intent_extraction and comes with the name IntentModel?

Consolidate public URLs

Target objective:

Consolidate public URLs and models into a single API (and possibly a single file).
The file should contain all the public URLs used in the library and a caching mechanism for managing downloaded files.
Use the caching code from here

Pull-Request related:

Question: Can we do nested NER?

Can the NER module do the following?
Tag A B C, where A is an entity and ABC as a whole is another entity, a little bit like multi-label classification.

What is the meaning of the tag_field_no argument of the SequentialTaggingDataset() function

I'm training my BIST-ner model; I currently have about 19 different NER classes. The problem is when I set

dataset = SequentialTaggingDataset(train, test,
                                   max_sentence_length=sentence_length,
                                   max_word_length=word_length,
                                   tag_field_no=19)

It throws an error, some kind of limit exceeded.

I wanted to know whether tag_field_no is the total number of tags in the dataset or the number of tags associated with each word in a single line of data.
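
If tag_field_no behaves like the usual CoNLL-style loaders (an assumption worth verifying against the docs), it is the index of the column holding the tag in each data line, not the number of tag classes, so setting it to 19 would point past the last column:

# Illustrative CoNLL-style line (columns: token, POS, chunk, NER tag):
# West    NNP    B-NP    B-MISC
# tag_field_no selects which column is read as the label for each token.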

Pip install failing from master

Target objective:

pip install .

Steps to objective:

Successfully built nlp-architect
spacy 2.0.18 has requirement numpy>=1.15.0, but you'll have numpy 1.14.5 which is incompatible.
Installing collected packages: spacy, nlp-architect
Killed

Pull-Request related:

Not able to install nlp_architect v0.2

Target objective:

nlp_architect v0.2 installation using pip

Steps to objective:

$ pip install nlp-architect==0.2.1
Collecting nlp-architect==0.2.1
Could not find a version that satisfies the requirement nlp-architect==0.2.1 (from versions: 0.3.1)
No matching distribution found for nlp-architect==0.2.1
You are using pip version 9.0.1, however version 18.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.

Tried with v0.2.1, 0.2, v0.2 and different combinations.

Since v0.3 has issues running the intent extraction demo, I want to install v0.2 and try.

Pull-Request related:

intent extraction interactive mode results are totally incorrect when testing with train data

Target objective:

interactive mode should predict well on sentences from the train data or validation data

Steps to objective:

  • python train_joint_model.py --model_path model_v1.h5 --dataset_path ~/git/nlu-benchmark/2017-06-custom-intent-engines
  • python interactive.py --model_path model_v1.h5 --dataset_path ~/git/nlu-benchmark/2017-06-custom-intent-engines
  • input: Weather next year in Dominica
  • expected result: intent as GetWeather
  • actual result: PlayMusic

the model's final performance from training was as below:
intent_classifier_loss: 0.0617 - slot_tag_classifier_loss: 0.0473 - intent_classifier_categorical_accuracy: 0.9805 - slot_tag_classifier_categorical_accuracy: 0.9861 - val_loss: 0.1243 - val_intent_classifier_loss: 0.0677 - val_slot_tag_classifier_loss: 0.0566 - val_intent_classifier_categorical_accuracy: 0.9771 - val_slot_tag_classifier_categorical_accuracy: 0.9845

How to prepare Data for Custom NER Model Training

Target objective:

Train a custom NER Model

Steps to objective:

  • [ Prepare data in required format for NERCRF Model]
  • [Train NERCRF Model ]

Apologies if this is too naive a question; forgive my ignorance. I want to train a custom NER model for a project. While going through this tutorial https://github.com/NervanaSystems/nlp-architect/blob/master/tutorials/ner/ner_demo.ipynb, I realized that the format the train and test files need to be in was not clear to me from the tutorial. I couldn't find the NamedEntityDataset dataset loader in the project directories, and I can't seem to understand how each entity is to be tagged and separated. I tried to google the required format but couldn't find any useful links. Let me know how to go about it.

Thanks
@peteriz
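
Not an authoritative answer, but the common CoNLL-style format that sequence taggers typically consume looks like this: one token per line, the tag in a later column, and a blank line separating sentences (the exact column layout should be checked against the dataset loader's docs):

John     B-PER
Smith    I-PER
visited  O
Intel    B-ORG
.        O

West     B-MISC
Bank     I-MISC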

Problems with model loading in intent extraction

Target objective:

After running [train_mlt_model.py] according to the Readme.md, a model is obtained. I save this model with the 'model.save' method and reload it as model1. The predictions of these two models on the validation data are then different.

Steps to objective:

  • Run [train_mlt_model.py] and a model is obtained. This model is stored in the memory.

  • Save this model on the disk by command: model.save(model_path)

  • Reload this model from the disk by command:
    model1 = MultiTaskIntentModel();
    model1.load(model_path)

  • Use the commands below to evaluate these two models:
    predictions = model.predict(test_inputs, batch_size=args.b)
    eval = get_conll_scores(predictions, test_y,
    {v: k for k, v in dataset.tags_vocab.vocab.items()})
    print(eval)

    predictions = model1.predict(test_inputs, batch_size=args.b)
    eval = get_conll_scores(predictions, test_y,
    {v: k for k, v in dataset.tags_vocab.vocab.items()})
    print(eval)

  • The results are different:
    model: avg / total 0.820 0.833 0.821 1794
    model1:avg / total 0.817 0.826 0.814 1794

So why are the predictions of model and model1 different?

nlp_architect command does not exist after fresh install

Hi,
I just performed a fresh install of nlp_architect and the command does not exist after the installation process is complete. Here's my environment:

  • Ubuntu 18.04.2 LTS
  • 16GB RAM
  • Python 3.6.8

Below is the installation process, followed by the failure of the command not being found.

(base) nlp@nlp16-70GB:~$ conda create -n intelnlp python=3.6
WARNING: The conda.compat module is deprecated and will be removed in a future release.
Collecting package metadata: done
Solving environment: done


==> WARNING: A newer version of conda exists. <==
  current version: 4.6.11
  latest version: 4.7.5

Please update conda by running

    $ conda update -n base -c defaults conda



## Package Plan ##

  environment location: /home/nlp/anaconda3/envs/intelnlp

  added / updated specs:
    - python=3.6


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    ca-certificates-2019.5.15  |                0         133 KB
    certifi-2019.6.16          |           py36_0         154 KB
    libgcc-ng-9.1.0            |       hdf63c60_0         8.1 MB
    libstdcxx-ng-9.1.0         |       hdf63c60_0         4.0 MB
    openssl-1.1.1c             |       h7b6447c_1         3.8 MB
    pip-19.1.1                 |           py36_0         1.9 MB
    python-3.6.8               |       h0371630_0        34.4 MB
    setuptools-41.0.1          |           py36_0         656 KB
    sqlite-3.28.0              |       h7b6447c_0         1.9 MB
    wheel-0.33.4               |           py36_0          40 KB
    ------------------------------------------------------------
                                           Total:        55.1 MB

The following NEW packages will be INSTALLED:

  ca-certificates    pkgs/main/linux-64::ca-certificates-2019.5.15-0
  certifi            pkgs/main/linux-64::certifi-2019.6.16-py36_0
  libedit            pkgs/main/linux-64::libedit-3.1.20181209-hc058e9b_0
  libffi             pkgs/main/linux-64::libffi-3.2.1-hd88cf55_4
  libgcc-ng          pkgs/main/linux-64::libgcc-ng-9.1.0-hdf63c60_0
  libstdcxx-ng       pkgs/main/linux-64::libstdcxx-ng-9.1.0-hdf63c60_0
  ncurses            pkgs/main/linux-64::ncurses-6.1-he6710b0_1
  openssl            pkgs/main/linux-64::openssl-1.1.1c-h7b6447c_1
  pip                pkgs/main/linux-64::pip-19.1.1-py36_0
  python             pkgs/main/linux-64::python-3.6.8-h0371630_0
  readline           pkgs/main/linux-64::readline-7.0-h7b6447c_5
  setuptools         pkgs/main/linux-64::setuptools-41.0.1-py36_0
  sqlite             pkgs/main/linux-64::sqlite-3.28.0-h7b6447c_0
  tk                 pkgs/main/linux-64::tk-8.6.8-hbc83047_0
  wheel              pkgs/main/linux-64::wheel-0.33.4-py36_0
  xz                 pkgs/main/linux-64::xz-5.2.4-h14c3975_4
  zlib               pkgs/main/linux-64::zlib-1.2.11-h7b6447c_3


Proceed ([y]/n)? y


Downloading and Extracting Packages
libstdcxx-ng-9.1.0   | 4.0 MB    | ############################################################################# | 100% 
openssl-1.1.1c       | 3.8 MB    | ############################################################################# | 100% 
ca-certificates-2019 | 133 KB    | ############################################################################# | 100% 
wheel-0.33.4         | 40 KB     | ############################################################################# | 100% 
sqlite-3.28.0        | 1.9 MB    | ############################################################################# | 100% 
python-3.6.8         | 34.4 MB   | ############################################################################# | 100% 
pip-19.1.1           | 1.9 MB    | ############################################################################# | 100% 
libgcc-ng-9.1.0      | 8.1 MB    | ############################################################################# | 100% 
setuptools-41.0.1    | 656 KB    | ############################################################################# | 100% 
certifi-2019.6.16    | 154 KB    | ############################################################################# | 100% 
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate intelnlp
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base) nlp@nlp16-70GB:~$ conda activate intelnlp
(intelnlp) nlp@nlp16-70GB:~$ export NLP_ARCHITECT_BE=CPU
(intelnlp) nlp@nlp16-70GB:~$ cd Desktop/
(intelnlp) nlp@nlp16-70GB:~/Desktop$ git clone https://github.com/NervanaSystems/nlp-architect.git
Cloning into 'nlp-architect'...
remote: Enumerating objects: 171, done.
remote: Counting objects: 100% (171/171), done.
remote: Compressing objects: 100% (123/123), done.
remote: Total 6163 (delta 82), reused 96 (delta 47), pack-reused 5992
Receiving objects: 100% (6163/6163), 140.06 MiB | 16.23 MiB/s, done.
Resolving deltas: 100% (3800/3800), done.
(intelnlp) nlp@nlp16-70GB:~/Desktop$ cd nlp-architect/
(intelnlp) nlp@nlp16-70GB:~/Desktop/nlp-architect$ pip3 install -e .
Obtaining file:///home/nlp/Desktop/nlp-architect
Collecting bokeh (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/7d/2f/de96d3f6f43cec07efc6f8f24fddf176e9a119f23aab8fe6153f2e96c6d3/bokeh-1.2.0.tar.gz (17.6MB)
    100% |████████████████████████████████| 17.6MB 37kB/s 
Collecting dynet==2.1 (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/88/f0/01a561a301a8ea9aea1c28f82e108c38cd103964c7a46286ab01757a4092/dyNET-2.1-cp36-cp36m-manylinux1_x86_64.whl (28.1MB)
    100% |████████████████████████████████| 28.1MB 22kB/s 
Collecting elasticsearch (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/ae/43/38329621bcca6f0b97e1cc36fb3cef889414a1960fcdc83a41e26b496634/elasticsearch-7.0.2-py2.py3-none-any.whl (83kB)
    100% |████████████████████████████████| 92kB 4.0MB/s 
Collecting falcon==1.4.1 (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/e8/d0/20bb807dee65f1f163754670557b128eafce1710f6c9c363a38e357f3783/falcon-1.4.1-py2.py3-none-any.whl (159kB)
    100% |████████████████████████████████| 163kB 2.6MB/s 
Collecting falcon_multipart (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/e1/a2/e50ffc3101ed6b91d1edc63b3586c424ef8071e1ef0ef7dcb8745e65fc14/falcon_multipart-0.2.0-py3-none-any.whl
Collecting ftfy (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/8f/86/df789c5834f15ae1ca53a8d4c1fc4788676c2e32112f6a786f2625d9c6e6/ftfy-5.5.1-py3-none-any.whl (43kB)
    100% |████████████████████████████████| 51kB 6.4MB/s 
Collecting future (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/90/52/e20466b85000a181e1e144fd8305caf2cf475e2f9674e797b222f8105f5f/future-0.17.1.tar.gz (829kB)
    100% |████████████████████████████████| 829kB 604kB/s 
Collecting gensim (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/d3/4b/19eecdf07d614665fa889857dc56ac965631c7bd816c3476d2f0cac6ea3b/gensim-3.7.3-cp36-cp36m-manylinux1_x86_64.whl (24.2MB)
    100% |████████████████████████████████| 24.2MB 15kB/s 
Collecting h5py (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/30/99/d7d4fbf2d02bb30fb76179911a250074b55b852d34e98dd452a9f394ac06/h5py-2.9.0-cp36-cp36m-manylinux1_x86_64.whl (2.8MB)
    100% |████████████████████████████████| 2.8MB 225kB/s 
Collecting hug (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/5d/b4/a4b916691f439b7f25bb78868cd00708ee3c7c1f41a03f9c9a88fa45f57c/hug-2.5.6-py2.py3-none-any.whl (74kB)
    100% |████████████████████████████████| 81kB 4.2MB/s 
Collecting hyperopt (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/63/12/704382c3081df3ae3f9d96fe6afb62efa2fa9749be20c301cd2797fb0b52/hyperopt-0.1.2-py3-none-any.whl (115kB)
    100% |████████████████████████████████| 122kB 3.8MB/s 
Collecting newspaper3k (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/d7/b9/51afecb35bb61b188a4b44868001de348a0e8134b4dfa00ffc191567c4b9/newspaper3k-0.2.8-py3-none-any.whl (211kB)
    100% |████████████████████████████████| 215kB 1.6MB/s 
Collecting nltk (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/8d/5d/825889810b85c303c8559a3fd74d451d80cf3585a851f2103e69576bf583/nltk-3.4.3.zip (1.4MB)
    100% |████████████████████████████████| 1.5MB 396kB/s 
Collecting num2words (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/eb/a2/ea800689730732e27711c41beed4b2a129b34974435bdc450377ec407738/num2words-0.5.10-py3-none-any.whl (101kB)
    100% |████████████████████████████████| 102kB 2.2MB/s 
Collecting numpy (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/87/2d/e4656149cbadd3a8a0369fcd1a9c7d61cc7b87b3903b85389c70c989a696/numpy-1.16.4-cp36-cp36m-manylinux1_x86_64.whl (17.3MB)
    100% |████████████████████████████████| 17.3MB 38kB/s 
Collecting pandas (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/19/74/e50234bc82c553fecdbd566d8650801e3fe2d6d8c8d940638e3d8a7c5522/pandas-0.24.2-cp36-cp36m-manylinux1_x86_64.whl (10.1MB)
    100% |████████████████████████████████| 10.1MB 62kB/s 
Collecting pillow (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/d2/c2/f84b1e57416755e967236468dcfb0fad7fd911f707185efc4ba8834a1a94/Pillow-6.0.0-cp36-cp36m-manylinux1_x86_64.whl (2.0MB)
    100% |████████████████████████████████| 2.0MB 341kB/s 
Collecting pywikibot (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/90/36/1956b5fe6b8130d61b60a8f5d56d27912f80e2f003d03468041ba319314e/pywikibot-3.0.20190430.tar.gz (515kB)
    100% |████████████████████████████████| 522kB 698kB/s 
Collecting requests (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl (57kB)
    100% |████████████████████████████████| 61kB 6.2MB/s 
Collecting scipy (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/72/4c/5f81e7264b0a7a8bd570810f48cd346ba36faedbd2ba255c873ad556de76/scipy-1.3.0-cp36-cp36m-manylinux1_x86_64.whl (25.2MB)
    100% |████████████████████████████████| 25.2MB 26kB/s 
Collecting seqeval (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/34/91/068aca8d60ce56dd9ba4506850e876aba5e66a6f2f29aa223224b50df0de/seqeval-0.0.12.tar.gz
Collecting six (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting sklearn (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/1e/7a/dbb3be0ce9bd5c8b7e3d87328e79063f8b263b2b1bfa4774cb1147bfcd3f/sklearn-0.0.tar.gz
Collecting spacy (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/a1/5b/0fab3fa533229436533fb504bb62f4cf7ea29541a487a9d1a0749876fc23/spacy-2.1.4-cp36-cp36m-manylinux1_x86_64.whl (29.8MB)
    100% |████████████████████████████████| 29.8MB 18kB/s 
Collecting tensorflow==1.13.1 (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/77/63/a9fa76de8dffe7455304c4ed635be4aa9c0bacef6e0633d87d5f54530c5c/tensorflow-1.13.1-cp36-cp36m-manylinux1_x86_64.whl (92.5MB)
    100% |████████████████████████████████| 92.5MB 6.5kB/s 
Collecting tensorflow_hub (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/b5/be/f18c352d84382d9c795a0f37eaf16d42ace7d161fbb0ad20bdcd5e550015/tensorflow_hub-0.5.0-py2.py3-none-any.whl (78kB)
    100% |████████████████████████████████| 81kB 3.4MB/s 
Collecting termcolor (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz
Collecting tqdm (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/9f/3d/7a6b68b631d2ab54975f3a4863f3c4e9b26445353264ef01f465dc9b0208/tqdm-4.32.2-py2.py3-none-any.whl (50kB)
    100% |████████████████████████████████| 51kB 5.6MB/s 
Collecting wordfreq (from nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/d0/82/233c39f350ac66c740266dafb348f9e67ba3a5e5dad6a949a2c3715c34f5/wordfreq-2.2.1-py3-none-any.whl (32.8MB)
    100% |████████████████████████████████| 32.8MB 21kB/s 
Collecting Jinja2>=2.7 (from bokeh->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl (124kB)
    100% |████████████████████████████████| 133kB 2.7MB/s 
Collecting PyYAML>=3.10 (from bokeh->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/a3/65/837fefac7475963d1eccf4aa684c23b95aa6c1d033a2c5965ccb11e22623/PyYAML-5.1.1.tar.gz (274kB)
    100% |████████████████████████████████| 276kB 1.2MB/s 
Collecting packaging>=16.8 (from bokeh->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/91/32/58bc30e646e55eab8b21abf89e353f59c0cc02c417e42929f4a9546e1b1d/packaging-19.0-py2.py3-none-any.whl
Collecting python-dateutil>=2.1 (from bokeh->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl (226kB)
    100% |████████████████████████████████| 235kB 1.2MB/s 
Collecting tornado>=4.3 (from bokeh->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/30/78/2d2823598496127b21423baffaa186b668f73cd91887fcef78b6eade136b/tornado-6.0.3.tar.gz (482kB)
    100% |████████████████████████████████| 491kB 738kB/s 
Collecting cython (from dynet==2.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/5d/7e/d2a81d821193c88113e3613f2df456a4d7b1b15bd4551e97978e8107e3ef/Cython-0.29.10-cp36-cp36m-manylinux1_x86_64.whl (2.1MB)
    100% |████████████████████████████████| 2.1MB 347kB/s 
Collecting urllib3>=1.21.1 (from elasticsearch->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl (150kB)
    100% |████████████████████████████████| 153kB 2.0MB/s 
Collecting python-mimeparse>=1.5.2 (from falcon==1.4.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/26/2e/03bce213a9bf02a2750dcb04e761785e9c763fc11071edc4b447eacbb842/python_mimeparse-1.6.0-py2.py3-none-any.whl
Collecting wcwidth (from ftfy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl
Collecting smart-open>=1.7.0 (from gensim->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/37/c0/25d19badc495428dec6a4bf7782de617ee0246a9211af75b302a2681dea7/smart_open-1.8.4.tar.gz (63kB)
    100% |████████████████████████████████| 71kB 4.3MB/s 
Collecting networkx (from hyperopt->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/85/08/f20aef11d4c343b557e5de6b9548761811eb16e438cee3d32b1c66c8566b/networkx-2.3.zip (1.7MB)
    100% |████████████████████████████████| 1.8MB 321kB/s 
Collecting pymongo (from hyperopt->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/fb/4a/586826433281ca285f0201235fccf63cc29a30fa78bcd72b6a34e365972d/pymongo-3.8.0-cp36-cp36m-manylinux1_x86_64.whl (416kB)
    100% |████████████████████████████████| 419kB 799kB/s 
Collecting feedfinder2>=0.0.4 (from newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/35/82/1251fefec3bb4b03fd966c7e7f7a41c9fc2bb00d823a34c13f847fd61406/feedfinder2-0.0.4.tar.gz
Collecting jieba3k>=0.35.1 (from newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/a9/cb/2c8332bcdc14d33b0bedd18ae0a4981a069c3513e445120da3c3f23a8aaa/jieba3k-0.35.1.zip (7.4MB)
    100% |████████████████████████████████| 7.4MB 89kB/s 
Collecting tldextract>=2.0.1 (from newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/1e/90/18ac0e5340b6228c25cc8e79835c3811e7553b2b9ae87296dfeb62b7866d/tldextract-2.2.1-py2.py3-none-any.whl (48kB)
    100% |████████████████████████████████| 51kB 2.2MB/s 
Collecting feedparser>=5.2.1 (from newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/91/d8/7d37fec71ff7c9dbcdd80d2b48bcdd86d6af502156fc93846fb0102cb2c4/feedparser-5.2.1.tar.bz2 (192kB)
    100% |████████████████████████████████| 194kB 1.7MB/s 
Collecting cssselect>=0.9.2 (from newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/7b/44/25b7283e50585f0b4156960691d951b05d061abf4a714078393e51929b30/cssselect-1.0.3-py2.py3-none-any.whl
Collecting tinysegmenter==0.3 (from newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/17/82/86982e4b6d16e4febc79c2a1d68ee3b707e8a020c5d2bc4af8052d0f136a/tinysegmenter-0.3.tar.gz
Collecting lxml>=3.6.0 (from newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/2d/53/34a9f0c79c548e430148837892b6ae91adee571a0e8b6c17bd7ff9c2d12e/lxml-4.3.4-cp36-cp36m-manylinux1_x86_64.whl (5.7MB)
    100% |████████████████████████████████| 5.7MB 92kB/s 
Collecting beautifulsoup4>=4.4.1 (from newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/1d/5d/3260694a59df0ec52f8b4883f5d23b130bc237602a1411fa670eae12351e/beautifulsoup4-4.7.1-py3-none-any.whl (94kB)
    100% |████████████████████████████████| 102kB 4.9MB/s 
Collecting docopt>=0.6.2 (from num2words->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/a2/55/8f8cab2afd404cf578136ef2cc5dfb50baa1761b68c9da1fb1e4eed343c9/docopt-0.6.2.tar.gz
Collecting pytz>=2011k (from pandas->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/3d/73/fe30c2daaaa0713420d0382b16fbb761409f532c56bdcc514bf7b6262bb6/pytz-2019.1-py2.py3-none-any.whl (510kB)
    100% |████████████████████████████████| 512kB 670kB/s 
Collecting certifi>=2017.4.17 (from requests->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl (157kB)
    100% |████████████████████████████████| 163kB 1.7MB/s 
Collecting chardet<3.1.0,>=3.0.2 (from requests->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
    100% |████████████████████████████████| 143kB 2.1MB/s 
Collecting idna<2.9,>=2.5 (from requests->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)
    100% |████████████████████████████████| 61kB 6.2MB/s 
Collecting Keras>=2.2.4 (from seqeval->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/5e/10/aa32dad071ce52b5502266b5c659451cfd6ffcbf14e6c8c4f16c0ff5aaab/Keras-2.2.4-py2.py3-none-any.whl (312kB)
    100% |████████████████████████████████| 317kB 1.2MB/s 
Collecting scikit-learn (from sklearn->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/85/04/49633f490f726da6e454fddc8e938bbb5bfed2001681118d3814c219b723/scikit_learn-0.21.2-cp36-cp36m-manylinux1_x86_64.whl (6.7MB)
    100% |████████████████████████████████| 6.7MB 105kB/s 
Collecting cymem<2.1.0,>=2.0.2 (from spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/3d/61/9b0520c28eb199a4b1ca667d96dd625bba003c14c75230195f9691975f85/cymem-2.0.2-cp36-cp36m-manylinux1_x86_64.whl
Collecting srsly<1.1.0,>=0.0.5 (from spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/aa/6c/2ef2d6f4c63a197981f4ac01bb17560c857c6721213c7c99998e48cdda2a/srsly-0.0.7-cp36-cp36m-manylinux1_x86_64.whl (180kB)
    100% |████████████████████████████████| 184kB 1.9MB/s 
Collecting thinc<7.1.0,>=7.0.2 (from spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/a9/f1/3df317939a07b2fc81be1a92ac10bf836a1d87b4016346b25f8b63dee321/thinc-7.0.4-cp36-cp36m-manylinux1_x86_64.whl (2.1MB)
    100% |████████████████████████████████| 2.1MB 357kB/s 
Collecting blis<0.3.0,>=0.2.2 (from spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/34/46/b1d0bb71d308e820ed30316c5f0a017cb5ef5f4324bcbc7da3cf9d3b075c/blis-0.2.4-cp36-cp36m-manylinux1_x86_64.whl (3.2MB)
    100% |████████████████████████████████| 3.2MB 215kB/s 
Collecting murmurhash<1.1.0,>=0.28.0 (from spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/a6/e6/63f160a4fdf0e875d16b28f972083606d8d54f56cd30cb8929f9a1ee700e/murmurhash-1.0.2-cp36-cp36m-manylinux1_x86_64.whl
Collecting jsonschema<3.1.0,>=2.6.0 (from spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/aa/69/df679dfbdd051568b53c38ec8152a3ab6bc533434fc7ed11ab034bf5e82f/jsonschema-3.0.1-py2.py3-none-any.whl (54kB)
    100% |████████████████████████████████| 61kB 6.5MB/s 
Collecting wasabi<1.1.0,>=0.2.0 (from spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/f4/c1/d76ccdd12c716be79162d934fe7de4ac8a318b9302864716dde940641a79/wasabi-0.2.2-py3-none-any.whl
Collecting plac<1.0.0,>=0.9.6 (from spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/9e/9b/62c60d2f5bc135d2aa1d8c8a86aaf84edb719a59c7f11a4316259e61a298/plac-0.9.6-py2.py3-none-any.whl
Collecting preshed<2.1.0,>=2.0.1 (from spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/20/93/f222fb957764a283203525ef20e62008675fd0a14ffff8cc1b1490147c63/preshed-2.0.1-cp36-cp36m-manylinux1_x86_64.whl (83kB)
    100% |████████████████████████████████| 92kB 4.6MB/s 
Collecting grpcio>=1.8.6 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/99/83/18f374294bf34128a448ee2fae37651f943b0b5fa473b5b3aff262c15bf8/grpcio-1.21.1-cp36-cp36m-manylinux1_x86_64.whl (2.2MB)
    100% |████████████████████████████████| 2.2MB 323kB/s 
Collecting absl-py>=0.1.6 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/da/3f/9b0355080b81b15ba6a9ffcf1f5ea39e307a2778b2f2dc8694724e8abd5b/absl-py-0.7.1.tar.gz (99kB)
    100% |████████████████████████████████| 102kB 4.6MB/s 
Collecting gast>=0.2.0 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/4e/35/11749bf99b2d4e3cceb4d55ca22590b0d7c2c62b9de38ac4a4a7f4687421/gast-0.2.2.tar.gz
Collecting wheel>=0.26 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/bb/10/44230dd6bf3563b8f227dbf344c908d412ad2ff48066476672f3a72e174e/wheel-0.33.4-py2.py3-none-any.whl
Collecting astor>=0.6.0 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/d1/4f/950dfae467b384fc96bc6469de25d832534f6b4441033c39f914efd13418/astor-0.8.0-py2.py3-none-any.whl
Collecting tensorboard<1.14.0,>=1.13.0 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/0f/39/bdd75b08a6fba41f098b6cb091b9e8c7a80e1b4d679a581a0ccd17b10373/tensorboard-1.13.1-py3-none-any.whl (3.2MB)
    100% |████████████████████████████████| 3.2MB 227kB/s 
Collecting keras-preprocessing>=1.0.5 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/28/6a/8c1f62c37212d9fc441a7e26736df51ce6f0e38455816445471f10da4f0a/Keras_Preprocessing-1.1.0-py2.py3-none-any.whl (41kB)
    100% |████████████████████████████████| 51kB 7.3MB/s 
Collecting tensorflow-estimator<1.14.0rc0,>=1.13.0 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/bb/48/13f49fc3fa0fdf916aa1419013bb8f2ad09674c275b4046d5ee669a46873/tensorflow_estimator-1.13.0-py2.py3-none-any.whl (367kB)
    100% |████████████████████████████████| 368kB 854kB/s 
Collecting protobuf>=3.6.1 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/d2/fb/29de8d08967f0cce1bb10b39846d836b0f3bf6776ddc36aed7c73498ca7e/protobuf-3.8.0-cp36-cp36m-manylinux1_x86_64.whl (1.2MB)
    100% |████████████████████████████████| 1.2MB 561kB/s 
Collecting keras-applications>=1.0.6 (from tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/71/e3/19762fdfc62877ae9102edf6342d71b28fbfd9dea3d2f96a882ce099b03f/Keras_Applications-1.0.8-py3-none-any.whl (50kB)
    100% |████████████████████████████████| 51kB 1.5MB/s 
Collecting langcodes>=1.4.1 (from wordfreq->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/fa/9a/e05169c2c00b11b21fb0af039644fa07210470a125aa508a460786c2e63f/langcodes-1.4.1.tar.gz (4.0MB)
    100% |████████████████████████████████| 4.0MB 175kB/s 
Collecting regex<=2018.02.21,>=2017.07.11 (from wordfreq->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/a2/51/c39562cfed3272592c60cfd229e5464d715b78537e332eac2b695422dc49/regex-2018.02.21.tar.gz (620kB)
    100% |████████████████████████████████| 624kB 693kB/s 
Collecting msgpack (from wordfreq->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/92/7e/ae9e91c1bb8d846efafd1f353476e3fd7309778b582d2fb4cea4cc15b9a2/msgpack-0.6.1-cp36-cp36m-manylinux1_x86_64.whl (248kB)
    100% |████████████████████████████████| 256kB 1.3MB/s 
Collecting MarkupSafe>=0.23 (from Jinja2>=2.7->bokeh->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/b2/5f/23e0023be6bb885d00ffbefad2942bc51a620328ee910f64abe5a8d18dd1/MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting pyparsing>=2.0.2 (from packaging>=16.8->bokeh->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/dd/d9/3ec19e966301a6e25769976999bd7bbe552016f0d32b577dc9d63d2e0c49/pyparsing-2.4.0-py2.py3-none-any.whl (62kB)
    100% |████████████████████████████████| 71kB 3.1MB/s 
Collecting boto3 (from smart-open>=1.7.0->gensim->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/f9/9e/26312043904b4e5f808025b259145025b5712459677b9852a116deb0799c/boto3-1.9.176-py2.py3-none-any.whl (128kB)
    100% |████████████████████████████████| 133kB 2.2MB/s 
Collecting boto>=2.32 (from smart-open>=1.7.0->gensim->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/23/10/c0b78c27298029e4454a472a1919bde20cb182dab1662cec7f2ca1dcc523/boto-2.49.0-py2.py3-none-any.whl (1.4MB)
    100% |████████████████████████████████| 1.4MB 427kB/s 
Collecting decorator>=4.3.0 (from networkx->hyperopt->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/5f/88/0075e461560a1e750a0dcbf77f1d9de775028c37a19a346a6c565a257399/decorator-4.4.0-py2.py3-none-any.whl
Collecting setuptools (from tldextract>=2.0.1->newspaper3k->nlp-architect==0.4.post2)
  Using cached https://files.pythonhosted.org/packages/ec/51/f45cea425fd5cb0b0380f5b0f048ebc1da5b417e48d304838c02d6288a1e/setuptools-41.0.1-py2.py3-none-any.whl
Collecting requests-file>=1.4 (from tldextract>=2.0.1->newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/23/9c/6e63c23c39e53d3df41c77a3d05a49a42c4e1383a6d2a5e3233161b89dbf/requests_file-1.4.3-py2.py3-none-any.whl
Collecting soupsieve>=1.2 (from beautifulsoup4>=4.4.1->newspaper3k->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/35/e3/25079e8911085ab76a6f2facae0771078260c930216ab0b0c44dc5c9bf31/soupsieve-1.9.2-py2.py3-none-any.whl
Collecting joblib>=0.11 (from scikit-learn->sklearn->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/cd/c1/50a758e8247561e58cb87305b1e90b171b8c767b15b12a1734001f41d356/joblib-0.13.2-py2.py3-none-any.whl (278kB)
    100% |████████████████████████████████| 286kB 1.3MB/s 
Collecting attrs>=17.4.0 (from jsonschema<3.1.0,>=2.6.0->spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/23/96/d828354fa2dbdf216eaa7b7de0db692f12c234f7ef888cc14980ef40d1d2/attrs-19.1.0-py2.py3-none-any.whl
Collecting pyrsistent>=0.14.0 (from jsonschema<3.1.0,>=2.6.0->spacy->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/68/0b/f514e76b4e074386b60cfc6c8c2d75ca615b81e415417ccf3fac80ae0bf6/pyrsistent-0.15.2.tar.gz (106kB)
    100% |████████████████████████████████| 112kB 4.3MB/s 
Collecting markdown>=2.6.8 (from tensorboard<1.14.0,>=1.13.0->tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/c0/4e/fd492e91abdc2d2fcb70ef453064d980688762079397f779758e055f6575/Markdown-3.1.1-py2.py3-none-any.whl (87kB)
    100% |████████████████████████████████| 92kB 4.6MB/s 
Collecting werkzeug>=0.11.15 (from tensorboard<1.14.0,>=1.13.0->tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/9f/57/92a497e38161ce40606c27a86759c6b92dd34fcdb33f64171ec559257c02/Werkzeug-0.15.4-py2.py3-none-any.whl (327kB)
    100% |████████████████████████████████| 327kB 974kB/s 
Collecting mock>=2.0.0 (from tensorflow-estimator<1.14.0rc0,>=1.13.0->tensorflow==1.13.1->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/05/d2/f94e68be6b17f46d2c353564da56e6fb89ef09faeeff3313a046cb810ca9/mock-3.0.5-py2.py3-none-any.whl
Collecting marisa-trie (from langcodes>=1.4.1->wordfreq->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/20/95/d23071d0992dabcb61c948fb118a90683193befc88c23e745b050a29e7db/marisa-trie-0.7.5.tar.gz (270kB)
    100% |████████████████████████████████| 276kB 1.2MB/s 
Collecting jmespath<1.0.0,>=0.7.1 (from boto3->smart-open>=1.7.0->gensim->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/83/94/7179c3832a6d45b266ddb2aac329e101367fbdb11f425f13771d27f225bb/jmespath-0.9.4-py2.py3-none-any.whl
Collecting s3transfer<0.3.0,>=0.2.0 (from boto3->smart-open>=1.7.0->gensim->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/16/8a/1fc3dba0c4923c2a76e1ff0d52b305c44606da63f718d14d3231e21c51b0/s3transfer-0.2.1-py2.py3-none-any.whl (70kB)
    100% |████████████████████████████████| 71kB 4.0MB/s 
Collecting botocore<1.13.0,>=1.12.176 (from boto3->smart-open>=1.7.0->gensim->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/0f/f6/5b67d7c222d6e032e60fd3001db1531c97a39334aa7d6e92d6622bcfa55a/botocore-1.12.176-py2.py3-none-any.whl (5.6MB)
    100% |████████████████████████████████| 5.6MB 137kB/s 
Collecting docutils>=0.10 (from botocore<1.13.0,>=1.12.176->boto3->smart-open>=1.7.0->gensim->nlp-architect==0.4.post2)
  Downloading https://files.pythonhosted.org/packages/36/fa/08e9e6e0e3cbd1d362c3bbee8d01d0aedb2155c4ac112b19ef3cae8eed8d/docutils-0.14-py3-none-any.whl (543kB)
    100% |████████████████████████████████| 552kB 876kB/s 
Building wheels for collected packages: bokeh, future, nltk, pywikibot, seqeval, sklearn, termcolor, PyYAML, tornado, smart-open, networkx, feedfinder2, jieba3k, feedparser, tinysegmenter, docopt, absl-py, gast, langcodes, regex, pyrsistent, marisa-trie
  Running setup.py bdist_wheel for bokeh ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/96/8c/18/ab51f7028839c79738fc7b21c7d660f3d59e7748eb903fbe15
  Running setup.py bdist_wheel for future ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/0c/61/d2/d6b7317325828fbb39ee6ad559dbe4664d0896da4721bf379e
  Running setup.py bdist_wheel for nltk ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/54/40/b7/c56ad418e6cd4d9e1e594b5e138d1ca6eec11a6ee3d464e5bb
  Running setup.py bdist_wheel for pywikibot ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/6c/07/84/ba6645f74fe204adce4ec9a49365afdc5c526f1250691c5e41
  Running setup.py bdist_wheel for seqeval ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/4f/32/0a/df3b340a82583566975377d65e724895b3fad101a3fb729f68
  Running setup.py bdist_wheel for sklearn ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/76/03/bb/589d421d27431bcd2c6da284d5f2286c8e3b2ea3cf1594c074
  Running setup.py bdist_wheel for termcolor ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/7c/06/54/bc84598ba1daf8f970247f550b175aaaee85f68b4b0c5ab2c6
  Running setup.py bdist_wheel for PyYAML ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/16/27/a1/775c62ddea7bfa62324fd1f65847ed31c55dadb6051481ba3f
  Running setup.py bdist_wheel for tornado ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/84/bf/40/2f6ef700f48401ca40e5e3dd7d0e3c0a90e064897b7fe5fc08
  Running setup.py bdist_wheel for smart-open ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/5f/ea/fb/5b1a947b369724063b2617011f1540c44eb00e28c3d2ca8692
  Running setup.py bdist_wheel for networkx ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/de/63/64/3699be2a9d0ccdb37c7f16329acf3863fd76eda58c39c737af
  Running setup.py bdist_wheel for feedfinder2 ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/de/03/ca/778e3a7a627e3d98836cc890e7cb40c7575424cfd3340f40ed
  Running setup.py bdist_wheel for jieba3k ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/83/15/9c/a3f1f67e7f7181170ad37d32e503c35da20627c013f438ed34
  Running setup.py bdist_wheel for feedparser ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/8c/69/b7/f52763c41c5471df57703a0ef718a32a5e81ee35dcf6d4f97f
  Running setup.py bdist_wheel for tinysegmenter ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/81/2b/43/a02ede72324dd40cdd7ca53aad718c7710628e91b8b0dc0f02
  Running setup.py bdist_wheel for docopt ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e
  Running setup.py bdist_wheel for absl-py ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/ee/98/38/46cbcc5a93cfea5492d19c38562691ddb23b940176c14f7b48
  Running setup.py bdist_wheel for gast ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/5c/2e/7e/a1d4d4fcebe6c381f378ce7743a3ced3699feb89bcfbdadadd
  Running setup.py bdist_wheel for langcodes ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/84/20/3d/dc2010b4f7c0b786a06947530a962972caead0c58898f25a02
  Running setup.py bdist_wheel for regex ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/6b/c9/cf/230425cdd343d6b98e8da5a5841c3dab1e0c8aaa134e29edb0
  Running setup.py bdist_wheel for pyrsistent ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/6b/b9/15/c8c6a1e095a370e8c3273e65a5c982e5cf355dde16d77502f5
  Running setup.py bdist_wheel for marisa-trie ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/45/24/79/022624fc914f0e559fe8a1141aaff1f9df810905a13fc75d57
Successfully built bokeh future nltk pywikibot seqeval sklearn termcolor PyYAML tornado smart-open networkx feedfinder2 jieba3k feedparser tinysegmenter docopt absl-py gast langcodes regex pyrsistent marisa-trie
Installing collected packages: MarkupSafe, Jinja2, PyYAML, numpy, six, pyparsing, packaging, pillow, python-dateutil, tornado, bokeh, cython, dynet, urllib3, elasticsearch, python-mimeparse, falcon, falcon-multipart, wcwidth, ftfy, future, jmespath, docutils, botocore, s3transfer, boto3, boto, certifi, chardet, idna, requests, smart-open, scipy, gensim, h5py, hug, decorator, networkx, pymongo, tqdm, hyperopt, soupsieve, beautifulsoup4, feedfinder2, nltk, jieba3k, setuptools, requests-file, tldextract, feedparser, cssselect, tinysegmenter, lxml, newspaper3k, docopt, num2words, pytz, pandas, pywikibot, keras-preprocessing, keras-applications, Keras, seqeval, joblib, scikit-learn, sklearn, cymem, srsly, blis, preshed, wasabi, murmurhash, plac, thinc, attrs, pyrsistent, jsonschema, spacy, termcolor, grpcio, absl-py, gast, wheel, astor, markdown, werkzeug, protobuf, tensorboard, mock, tensorflow-estimator, tensorflow, tensorflow-hub, marisa-trie, langcodes, regex, msgpack, wordfreq, nlp-architect
  Running setup.py develop for nlp-architect
Successfully installed Jinja2-2.10.1 Keras-2.2.4 MarkupSafe-1.1.1 PyYAML-5.1.1 absl-py-0.7.1 astor-0.8.0 attrs-19.1.0 beautifulsoup4-4.7.1 blis-0.2.4 bokeh-1.2.0 boto-2.49.0 boto3-1.9.176 botocore-1.12.176 certifi-2019.6.16 chardet-3.0.4 cssselect-1.0.3 cymem-2.0.2 cython-0.29.10 decorator-4.4.0 docopt-0.6.2 docutils-0.14 dynet-2.1 elasticsearch-7.0.2 falcon-1.4.1 falcon-multipart-0.2.0 feedfinder2-0.0.4 feedparser-5.2.1 ftfy-5.5.1 future-0.17.1 gast-0.2.2 gensim-3.7.3 grpcio-1.21.1 h5py-2.9.0 hug-2.5.6 hyperopt-0.1.2 idna-2.8 jieba3k-0.35.1 jmespath-0.9.4 joblib-0.13.2 jsonschema-3.0.1 keras-applications-1.0.8 keras-preprocessing-1.1.0 langcodes-1.4.1 lxml-4.3.4 marisa-trie-0.7.5 markdown-3.1.1 mock-3.0.5 msgpack-0.6.1 murmurhash-1.0.2 networkx-2.3 newspaper3k-0.2.8 nlp-architect nltk-3.4.3 num2words-0.5.10 numpy-1.16.4 packaging-19.0 pandas-0.24.2 pillow-6.0.0 plac-0.9.6 preshed-2.0.1 protobuf-3.8.0 pymongo-3.8.0 pyparsing-2.4.0 pyrsistent-0.15.2 python-dateutil-2.8.0 python-mimeparse-1.6.0 pytz-2019.1 pywikibot-3.0.dev0 regex-2018.2.21 requests-2.22.0 requests-file-1.4.3 s3transfer-0.2.1 scikit-learn-0.21.2 scipy-1.3.0 seqeval-0.0.12 setuptools-41.0.1 six-1.12.0 sklearn-0.0 smart-open-1.8.4 soupsieve-1.9.2 spacy-2.1.4 srsly-0.0.7 tensorboard-1.13.1 tensorflow-1.13.1 tensorflow-estimator-1.13.0 tensorflow-hub-0.5.0 termcolor-1.1.0 thinc-7.0.4 tinysegmenter-0.3 tldextract-2.2.1 tornado-6.0.3 tqdm-4.32.2 urllib3-1.25.3 wasabi-0.2.2 wcwidth-0.1.7 werkzeug-0.15.4 wheel-0.33.4 wordfreq-2.2.1
(intelnlp) nlp@nlp16-70GB:~/Desktop/nlp-architect$ 
(intelnlp) nlp@nlp16-70GB:~/Desktop/nlp-architect$ nlp_architect -h
nlp_architect: command not found
(intelnlp) nlp@nlp16-70GB:~/Desktop/nlp-architect$ nlp-architect -h
nlp-architect: command not found
(intelnlp) nlp@nlp16-70GB:~/Desktop/nlp-architect$ pip install -r dev-requirements.txt
Collecting sphinx (from -r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/6e/e5/c9ba68935cd2d72c553d49bc156bfb15ddb40e734ea7e3f238d8bd6ca6f1/Sphinx-2.1.2-py3-none-any.whl (3.2MB)
     |████████████████████████████████| 3.3MB 3.0MB/s 
Collecting sphinx_rtd_theme (from -r dev-requirements.txt (line 2))
  Downloading https://files.pythonhosted.org/packages/60/b4/4df37087a1d36755e3a3bfd2a30263f358d2dea21938240fa02313d45f51/sphinx_rtd_theme-0.4.3-py2.py3-none-any.whl (6.4MB)
     |████████████████████████████████| 6.4MB 11.5MB/s 
Collecting flake8-html (from -r dev-requirements.txt (line 3))
  Downloading https://files.pythonhosted.org/packages/59/8e/ddd22716fe3c5c38f33120252417bf36cd7b32ee110155290465a7393d6e/flake8_html-0.4.0-py2.py3-none-any.whl
Collecting pep8 (from -r dev-requirements.txt (line 4))
  Downloading https://files.pythonhosted.org/packages/42/3f/669429ce58de2c22d8d2c542752e137ec4b9885fff398d3eceb1a7f5acb4/pep8-1.7.1-py2.py3-none-any.whl (41kB)
     |████████████████████████████████| 51kB 11.6MB/s 
Collecting flake8 (from -r dev-requirements.txt (line 5))
  Downloading https://files.pythonhosted.org/packages/e9/76/b915bd28976068a9843bf836b789794aa4a8eb13338b23581005cd9177c0/flake8-3.7.7-py2.py3-none-any.whl (68kB)
     |████████████████████████████████| 71kB 13.9MB/s 
Collecting pytest (from -r dev-requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/b3/eb/df264c0b1ff4aaf263375dc09aabd9093364f66060be9b26f3a2c166d558/pytest-4.6.3-py2.py3-none-any.whl (229kB)
     |████████████████████████████████| 235kB 16.8MB/s 
Collecting pytest-cov (from -r dev-requirements.txt (line 7))
  Downloading https://files.pythonhosted.org/packages/84/7b/73f8522619d1cbb22b9a36f9c54bc5ce5e24648e53cc1bf566477d2d1f2b/pytest_cov-2.7.1-py2.py3-none-any.whl
Collecting pytest-mock (from -r dev-requirements.txt (line 8))
  Downloading https://files.pythonhosted.org/packages/30/43/8deecb4c123bbc16d25666f1a6d241109c97aeb2e50806b952661c8e4b95/pytest_mock-1.10.4-py2.py3-none-any.whl
Collecting pylint (from -r dev-requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/60/c2/b3f73f4ac008bef6e75bca4992f3963b3f85942e0277237721ef1c151f0d/pylint-2.3.1-py3-none-any.whl (765kB)
     |████████████████████████████████| 768kB 13.3MB/s 
Requirement already satisfied: setuptools in /home/nlp/.local/lib/python3.6/site-packages (from sphinx->-r dev-requirements.txt (line 1)) (41.0.1)
Collecting sphinxcontrib-applehelp (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/13/9a/4428b3114d654cb1cd34d90d5e6fab938d5436f94a571155187ea1dd78b4/sphinxcontrib_applehelp-1.0.1-py2.py3-none-any.whl (121kB)
     |████████████████████████████████| 122kB 12.8MB/s 
Collecting sphinxcontrib-devhelp (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/b0/a3/fea98741f0b2f2902fbf6c35c8e91b22cd0dd13387291e81d457f9a93066/sphinxcontrib_devhelp-1.0.1-py2.py3-none-any.whl (84kB)
     |████████████████████████████████| 92kB 6.1MB/s 
Collecting alabaster<0.8,>=0.7 (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/10/ad/00b090d23a222943eb0eda509720a404f531a439e803f6538f35136cae9e/alabaster-0.7.12-py2.py3-none-any.whl
Requirement already satisfied: packaging in /home/nlp/.local/lib/python3.6/site-packages (from sphinx->-r dev-requirements.txt (line 1)) (19.0)
Collecting imagesize (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/fc/b6/aef66b4c52a6ad6ac18cf6ebc5731ed06d8c9ae4d3b2d9951f261150be67/imagesize-1.1.0-py2.py3-none-any.whl
Collecting sphinxcontrib-jsmath (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/c2/42/4c8646762ee83602e3fb3fbe774c2fac12f317deb0b5dbeeedd2d3ba4b77/sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl
Requirement already satisfied: requests>=2.5.0 in /home/nlp/.local/lib/python3.6/site-packages (from sphinx->-r dev-requirements.txt (line 1)) (2.22.0)
Collecting snowballstemmer>=1.1 (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/a0/5e/d9ead2d57d39b3e1c1868ce84212319e5543a19c4185dce7e42a9dd968b0/snowballstemmer-1.9.0.tar.gz (76kB)
     |████████████████████████████████| 81kB 7.5MB/s 
Collecting Pygments>=2.0 (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/5c/73/1dfa428150e3ccb0fa3e68db406e5be48698f2a979ccbcec795f28f44048/Pygments-2.4.2-py2.py3-none-any.whl (883kB)
     |████████████████████████████████| 890kB 11.6MB/s 
Collecting sphinxcontrib-qthelp (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/ce/5b/4747c3ba98b3a3e21a66faa183d8f79b9ded70e74212a7988d236a6eb78a/sphinxcontrib_qthelp-1.0.2-py2.py3-none-any.whl (90kB)
     |████████████████████████████████| 92kB 4.9MB/s 
Requirement already satisfied: docutils>=0.12 in /home/nlp/.local/lib/python3.6/site-packages (from sphinx->-r dev-requirements.txt (line 1)) (0.14)
Collecting sphinxcontrib-htmlhelp (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/e4/35/80a67cc493f4a8a9634ab203a77aaa1b84d79ccb1c02eca72cb084d2c7f7/sphinxcontrib_htmlhelp-1.0.2-py2.py3-none-any.whl (96kB)
     |████████████████████████████████| 102kB 9.2MB/s 
Collecting babel!=2.0,>=1.3 (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/2c/60/f2af68eb046c5de5b1fe6dd4743bf42c074f7141fe7b2737d3061533b093/Babel-2.7.0-py2.py3-none-any.whl (8.4MB)
     |████████████████████████████████| 8.4MB 9.9MB/s 
Collecting sphinxcontrib-serializinghtml (from sphinx->-r dev-requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/57/b3/3648e48fa5682e61e9839d62de4e23af1795ceb738d68d73bd974257a95c/sphinxcontrib_serializinghtml-1.1.3-py2.py3-none-any.whl (89kB)
     |████████████████████████████████| 92kB 5.5MB/s 
Requirement already satisfied: Jinja2>=2.3 in /home/nlp/.local/lib/python3.6/site-packages (from sphinx->-r dev-requirements.txt (line 1)) (2.10.1)
Collecting mccabe<0.7.0,>=0.6.0 (from flake8->-r dev-requirements.txt (line 5))
  Downloading https://files.pythonhosted.org/packages/87/89/479dc97e18549e21354893e4ee4ef36db1d237534982482c3681ee6e7b57/mccabe-0.6.1-py2.py3-none-any.whl
Collecting entrypoints<0.4.0,>=0.3.0 (from flake8->-r dev-requirements.txt (line 5))
  Downloading https://files.pythonhosted.org/packages/ac/c6/44694103f8c221443ee6b0041f69e2740d89a25641e62fb4f2ee568f2f9c/entrypoints-0.3-py2.py3-none-any.whl
Collecting pycodestyle<2.6.0,>=2.5.0 (from flake8->-r dev-requirements.txt (line 5))
  Downloading https://files.pythonhosted.org/packages/0e/0c/04a353e104d2f324f8ee5f4b32012618c1c86dd79e52a433b64fceed511b/pycodestyle-2.5.0-py2.py3-none-any.whl (51kB)
     |████████████████████████████████| 51kB 5.4MB/s 
Collecting pyflakes<2.2.0,>=2.1.0 (from flake8->-r dev-requirements.txt (line 5))
  Downloading https://files.pythonhosted.org/packages/84/f2/ed0ffb887f8138a8fe5a621b8c0bb9598bfb3989e029f6c6a85ee66628ee/pyflakes-2.1.1-py2.py3-none-any.whl (59kB)
     |████████████████████████████████| 61kB 14.1MB/s 
Collecting pluggy<1.0,>=0.12 (from pytest->-r dev-requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/06/ee/de89e0582276e3551df3110088bf20844de2b0e7df2748406876cc78e021/pluggy-0.12.0-py2.py3-none-any.whl
Collecting importlib-metadata>=0.12 (from pytest->-r dev-requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/bd/23/dce4879ec58acf3959580bfe769926ed8198727250c5e395e6785c764a02/importlib_metadata-0.18-py2.py3-none-any.whl
Collecting more-itertools>=4.0.0; python_version > "2.7" (from pytest->-r dev-requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/b3/73/64fb5922b745fc1daee8a2880d907d2a70d9c7bb71eea86fcb9445daab5e/more_itertools-7.0.0-py3-none-any.whl (53kB)
     |████████████████████████████████| 61kB 13.9MB/s 
Requirement already satisfied: attrs>=17.4.0 in /home/nlp/.local/lib/python3.6/site-packages (from pytest->-r dev-requirements.txt (line 6)) (19.1.0)
Requirement already satisfied: six>=1.10.0 in /home/nlp/.local/lib/python3.6/site-packages (from pytest->-r dev-requirements.txt (line 6)) (1.12.0)
Collecting atomicwrites>=1.0 (from pytest->-r dev-requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/52/90/6155aa926f43f2b2a22b01be7241be3bfd1ceaf7d0b3267213e8127d41f4/atomicwrites-1.3.0-py2.py3-none-any.whl
Requirement already satisfied: wcwidth in /home/nlp/.local/lib/python3.6/site-packages (from pytest->-r dev-requirements.txt (line 6)) (0.1.7)
Collecting py>=1.5.0 (from pytest->-r dev-requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/76/bc/394ad449851729244a97857ee14d7cba61ddb268dce3db538ba2f2ba1f0f/py-1.8.0-py2.py3-none-any.whl (83kB)
     |████████████████████████████████| 92kB 14.3MB/s 
Collecting coverage>=4.4 (from pytest-cov->-r dev-requirements.txt (line 7))
  Downloading https://files.pythonhosted.org/packages/f8/4e/f28fc04019bac97d301512d904992791569234a06826cd420f78fba9a361/coverage-4.5.3-cp36-cp36m-manylinux1_x86_64.whl (205kB)
     |████████████████████████████████| 215kB 22.7MB/s 
Collecting isort<5,>=4.2.5 (from pylint->-r dev-requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/e5/b0/c121fd1fa3419ea9bfd55c7f9c4fedfec5143208d8c7ad3ce3db6c623c21/isort-4.3.21-py2.py3-none-any.whl (42kB)
     |████████████████████████████████| 51kB 10.7MB/s 
Collecting astroid<3,>=2.2.0 (from pylint->-r dev-requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/d5/ad/7221a62a2dbce5c3b8c57fd18e1052c7331adc19b3f27f1561aa6e620db2/astroid-2.2.5-py3-none-any.whl (193kB)
     |████████████████████████████████| 194kB 23.9MB/s 
Requirement already satisfied: pyparsing>=2.0.2 in /home/nlp/.local/lib/python3.6/site-packages (from packaging->sphinx->-r dev-requirements.txt (line 1)) (2.4.0)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /home/nlp/.local/lib/python3.6/site-packages (from requests>=2.5.0->sphinx->-r dev-requirements.txt (line 1)) (1.25.3)
Requirement already satisfied: certifi>=2017.4.17 in /home/nlp/.local/lib/python3.6/site-packages (from requests>=2.5.0->sphinx->-r dev-requirements.txt (line 1)) (2019.6.16)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /home/nlp/.local/lib/python3.6/site-packages (from requests>=2.5.0->sphinx->-r dev-requirements.txt (line 1)) (3.0.4)
Requirement already satisfied: idna<2.9,>=2.5 in /home/nlp/.local/lib/python3.6/site-packages (from requests>=2.5.0->sphinx->-r dev-requirements.txt (line 1)) (2.8)
Requirement already satisfied: pytz>=2015.7 in /home/nlp/.local/lib/python3.6/site-packages (from babel!=2.0,>=1.3->sphinx->-r dev-requirements.txt (line 1)) (2019.1)
Requirement already satisfied: MarkupSafe>=0.23 in /home/nlp/.local/lib/python3.6/site-packages (from Jinja2>=2.3->sphinx->-r dev-requirements.txt (line 1)) (1.1.1)
Collecting zipp>=0.5 (from importlib-metadata>=0.12->pytest->-r dev-requirements.txt (line 6))
  Downloading https://files.pythonhosted.org/packages/a0/0f/9bf71d438d2e9d5fd0e4569ea4d1a2b6f5a524c234c6d221b494298bb4d1/zipp-0.5.1-py2.py3-none-any.whl
Collecting lazy-object-proxy (from astroid<3,>=2.2.0->pylint->-r dev-requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/1a/2a/d73b99e9407be3acd7c0328fcc44bcf6f5c42e6d03d1fb192032c0057d13/lazy_object_proxy-1.4.1-cp36-cp36m-manylinux1_x86_64.whl (49kB)
     |████████████████████████████████| 51kB 11.0MB/s 
Collecting typed-ast>=1.3.0; implementation_name == "cpython" (from astroid<3,>=2.2.0->pylint->-r dev-requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/31/d3/9d1802c161626d0278bafb1ffb32f76b9d01e123881bbf9d91e8ccf28e18/typed_ast-1.4.0-cp36-cp36m-manylinux1_x86_64.whl (736kB)
     |████████████████████████████████| 737kB 12.2MB/s 
Collecting wrapt (from astroid<3,>=2.2.0->pylint->-r dev-requirements.txt (line 9))
  Downloading https://files.pythonhosted.org/packages/23/84/323c2415280bc4fc880ac5050dddfb3c8062c2552b34c2e512eb4aa68f79/wrapt-1.11.2.tar.gz
Building wheels for collected packages: snowballstemmer, wrapt
  Building wheel for snowballstemmer (setup.py) ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/93/db/97/496f1d6bbcad1fecbc58fe45363540414be519312eded82bab
  Building wheel for wrapt (setup.py) ... done
  Stored in directory: /home/nlp/.cache/pip/wheels/d7/de/2e/efa132238792efb6459a96e85916ef8597fcb3d2ae51590dfd
Successfully built snowballstemmer wrapt
Installing collected packages: sphinxcontrib-applehelp, sphinxcontrib-devhelp, alabaster, imagesize, sphinxcontrib-jsmath, snowballstemmer, Pygments, sphinxcontrib-qthelp, sphinxcontrib-htmlhelp, babel, sphinxcontrib-serializinghtml, sphinx, sphinx-rtd-theme, mccabe, entrypoints, pycodestyle, pyflakes, flake8, flake8-html, pep8, zipp, importlib-metadata, pluggy, more-itertools, atomicwrites, py, pytest, coverage, pytest-cov, pytest-mock, isort, lazy-object-proxy, typed-ast, wrapt, astroid, pylint
Successfully installed Pygments-2.4.2 alabaster-0.7.12 astroid-2.2.5 atomicwrites-1.3.0 babel-2.7.0 coverage-4.5.3 entrypoints-0.3 flake8-3.7.7 flake8-html-0.4.0 imagesize-1.1.0 importlib-metadata-0.18 isort-4.3.21 lazy-object-proxy-1.4.1 mccabe-0.6.1 more-itertools-7.0.0 pep8-1.7.1 pluggy-0.12.0 py-1.8.0 pycodestyle-2.5.0 pyflakes-2.1.1 pylint-2.3.1 pytest-4.6.3 pytest-cov-2.7.1 pytest-mock-1.10.4 snowballstemmer-1.9.0 sphinx-2.1.2 sphinx-rtd-theme-0.4.3 sphinxcontrib-applehelp-1.0.1 sphinxcontrib-devhelp-1.0.1 sphinxcontrib-htmlhelp-1.0.2 sphinxcontrib-jsmath-1.0.1 sphinxcontrib-qthelp-1.0.2 sphinxcontrib-serializinghtml-1.1.3 typed-ast-1.4.0 wrapt-1.11.2 zipp-0.5.1
(intelnlp) nlp@nlp16-70GB:~/Desktop/nlp-architect$ nlp_architect -h
nlp_architect: command not found
(intelnlp) nlp@nlp16-70GB:~/Desktop/nlp-architect$ nlp-architect -h
nlp-architect: command not found
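
A possible first check for the missing command, sketched here as an assumption rather than a confirmed fix: list the console_scripts entry points that the installed nlp-architect distribution registered in this environment. An empty mapping would explain the "command not found" above (the version installed here may simply not ship a CLI entry point).

import pkg_resources

dist = pkg_resources.get_distribution("nlp-architect")
print(dist.version, dist.location)
# The console_scripts group holds any installed CLI entry points.
print(dist.get_entry_map().get("console_scripts", {}))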

BISTModel error while loading the pretrained model

I have downloaded the pretrained BIST model but it is not loading when I use the BISTModel.load() function.

What I'm doing is:

from nlp_architect.models.bist_parser import BISTModel
parser = BISTModel()
parser.load('/home/zeus/nlp-architect/cache/bist-pretrained/bist.model')

It gives the following error:

[dynet] random seed: 1090240239
[dynet] allocating memory: 512MB
[dynet] memory allocation done.
Traceback (most recent call last):
File "/home/zeus/Assistance/be/ServiceDispatcher/buildDependencies.py", line 5, in <module>
parser.load(''.__setattr__('parent','/home/zeus/nlp-architect/cache/bist-pretrained/bist.model'))
AttributeError: 'str' object has no attribute 'parent'

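Judging from the traceback, load() reads a .parent attribute from its argument, which pathlib.Path objects provide but plain strings do not. A minimal sketch of a possible workaround, assuming the model file really is at that path (this is a guess from the error, not a confirmed fix):

from pathlib import Path
from nlp_architect.models.bist_parser import BISTModel

parser = BISTModel()
# Pass a pathlib.Path instead of a str, since load() appears to access
# .parent on its argument.
parser.load(Path('/home/zeus/nlp-architect/cache/bist-pretrained/bist.model'))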

nlp-architect fails to build on macOS 10.14 Mojave #195

During pip install -e ., it gives the following error. It seems like clang/gcc is pretty broken.

Last login: Wed May 22 10:51:01 on console
Prabhats-MacBook-Air:~ pksingh$ cd nlp-architect/
Prabhats-MacBook-Air:nlp-architect pksingh$ pip install -e .
Obtaining file:///Users/pksingh/nlp-architect
Requirement already satisfied: tensorflow==1.13.1 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from nlp-architect==0.4.post1) (1.13.1)
Collecting dynet==2.0.2 (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/01/10/cfc00197733dd7cf52d9e00c42017ebe7e818653980bccfe6f241ea5b79a/dyNET-2.0.2.tar.gz
Collecting spacy (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/80/d2/61774b69cd79abbf5de91af0093f8cc919f4b28849787982daaa449fbf5f/spacy-2.1.4-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting nltk (from nlp-architect==0.4.post1)
Collecting gensim (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/82/bb/56f295a604dfafdef746cc81081ff4c6e825690de95963000300a1cd3d80/gensim-3.7.3-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting sklearn (from nlp-architect==0.4.post1)
Requirement already satisfied: scipy in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from nlp-architect==0.4.post1) (1.2.0)
Requirement already satisfied: numpy in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from nlp-architect==0.4.post1) (1.16.0)
Collecting tensorflow_hub (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/10/5c/6f3698513cf1cd730a5ea66aec665d213adf9de59b34f362f270e0bd126f/tensorflow_hub-0.4.0-py2.py3-none-any.whl
Collecting elasticsearch (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/82/c9/3920effd37e555f670bc522483dc940eb4197b9a4d3d95dd2a05842be849/elasticsearch-7.0.1-py2.py3-none-any.whl
Collecting fasttextmirror (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/fb/78/cf79876cfbb92bf7baae65472b19c680f6e20eaf55ca41721a53ea2014bb/fasttextmirror-0.8.22.tar.gz
Collecting newspaper3k (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/d7/b9/51afecb35bb61b188a4b44868001de348a0e8134b4dfa00ffc191567c4b9/newspaper3k-0.2.8-py3-none-any.whl
Collecting wordfreq (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/d0/82/233c39f350ac66c740266dafb348f9e67ba3a5e5dad6a949a2c3715c34f5/wordfreq-2.2.1-py3-none-any.whl
Collecting seqeval (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/55/dd/3bf1c646c310daabae47fceb84ea9ab66df7f518a31a89955290d82b8100/seqeval-0.0.10-py3-none-any.whl
Collecting pywikibot (from nlp-architect==0.4.post1)
Collecting num2words (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/eb/a2/ea800689730732e27711c41beed4b2a129b34974435bdc450377ec407738/num2words-0.5.10-py3-none-any.whl
Collecting hyperopt (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/63/12/704382c3081df3ae3f9d96fe6afb62efa2fa9749be20c301cd2797fb0b52/hyperopt-0.1.2-py3-none-any.whl
Requirement already satisfied: h5py in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from nlp-architect==0.4.post1) (2.9.0)
Collecting pandas (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/fc/43/fd867e3347559845c8f993059d410c50a1e18709f1c4d4b3b47323a06a37/pandas-0.24.2-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Requirement already satisfied: tqdm in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from nlp-architect==0.4.post1) (4.29.1)
Collecting ftfy (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/8f/86/df789c5834f15ae1ca53a8d4c1fc4788676c2e32112f6a786f2625d9c6e6/ftfy-5.5.1-py3-none-any.whl
Collecting bokeh (from nlp-architect==0.4.post1)
Requirement already satisfied: six in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from nlp-architect==0.4.post1) (1.12.0)
Requirement already satisfied: future in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from nlp-architect==0.4.post1) (0.17.1)
Requirement already satisfied: requests in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from nlp-architect==0.4.post1) (2.20.0)
Requirement already satisfied: termcolor in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from nlp-architect==0.4.post1) (1.1.0)
Collecting pillow (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/22/55/2ce41fa510f131c776112a1d24ee90cddffc96f1bf0311efb14fdd8ae877/Pillow-6.0.0-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting hug (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/04/d9/1b0cf6bd3cd1cb8b33246c48105bc1cbf87108c35d1e444a325460a32d56/hug-2.5.4-py2.py3-none-any.whl
Collecting falcon==1.4.1 (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/e8/d0/20bb807dee65f1f163754670557b128eafce1710f6c9c363a38e357f3783/falcon-1.4.1-py2.py3-none-any.whl
Collecting falcon_multipart (from nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/e1/a2/e50ffc3101ed6b91d1edc63b3586c424ef8071e1ef0ef7dcb8745e65fc14/falcon_multipart-0.2.0-py3-none-any.whl
Requirement already satisfied: grpcio>=1.8.6 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (1.20.0)
Requirement already satisfied: gast>=0.2.0 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (0.2.2)
Requirement already satisfied: wheel>=0.26 in /Users/pksingh/Library/Python/3.7/lib/python/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (0.33.1)
Requirement already satisfied: tensorflow-estimator<1.14.0rc0,>=1.13.0 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (1.13.0)
Requirement already satisfied: astor>=0.6.0 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (0.7.1)
Requirement already satisfied: protobuf>=3.6.1 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (3.7.1)
Requirement already satisfied: keras-preprocessing>=1.0.5 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (1.0.9)
Requirement already satisfied: tensorboard<1.14.0,>=1.13.0 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (1.13.1)
Requirement already satisfied: keras-applications>=1.0.6 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (1.0.7)
Requirement already satisfied: absl-py>=0.1.6 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow==1.13.1->nlp-architect==0.4.post1) (0.7.1)
Requirement already satisfied: cython in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from dynet==2.0.2->nlp-architect==0.4.post1) (0.29.7)
Collecting preshed<2.1.0,>=2.0.1 (from spacy->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/48/fe/2f2e8c91541785f2abe0d51f37eb00356513b9ff3d24fb27fd5b59e18264/preshed-2.0.1-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting plac<1.0.0,>=0.9.6 (from spacy->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/9e/9b/62c60d2f5bc135d2aa1d8c8a86aaf84edb719a59c7f11a4316259e61a298/plac-0.9.6-py2.py3-none-any.whl
Requirement already satisfied: jsonschema<3.1.0,>=2.6.0 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from spacy->nlp-architect==0.4.post1) (2.6.0)
Collecting wasabi<1.1.0,>=0.2.0 (from spacy->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/f4/c1/d76ccdd12c716be79162d934fe7de4ac8a318b9302864716dde940641a79/wasabi-0.2.2-py3-none-any.whl
Collecting murmurhash<1.1.0,>=0.28.0 (from spacy->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/b9/bd/faace403086ee922afc74e5615cb8c21020fcf5d5667314e943c08f71fde/murmurhash-1.0.2-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting srsly<1.1.0,>=0.0.5 (from spacy->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/c8/b2/d2cc9f5aa5a458aca45d8689279b41d4b42ba4e8c63b0cb6a9f009340fd0/srsly-0.0.5-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting blis<0.3.0,>=0.2.2 (from spacy->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/36/41/9e934e2b8a2cdae447ed1923a94f98c2d70c898b65af6635f5fe55f7ed4d/blis-0.2.4-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting thinc<7.1.0,>=7.0.2 (from spacy->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/92/b1/d7df83813ee3c42d46e6ddf4d4f1d9bd35a4735827d35b7f02539bea3136/thinc-7.0.4-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting cymem<2.1.0,>=2.0.2 (from spacy->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/d7/11/37da628920bf2999bd8c4ffc40908413622486d5dbc4e60d87a58c428367/cymem-2.0.2-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting smart-open>=1.7.0 (from gensim->nlp-architect==0.4.post1)
Requirement already satisfied: scikit-learn in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from sklearn->nlp-architect==0.4.post1) (0.20.2)
Requirement already satisfied: urllib3>=1.21.1 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from elasticsearch->nlp-architect==0.4.post1) (1.24.2)
Collecting pybind11>=2.2 (from fasttextmirror->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/f2/7c/e71995e59e108799800cb0fce6c4b4927914d7eada0723dd20bae3b51786/pybind11-2.2.4-py2.py3-none-any.whl
Requirement already satisfied: setuptools>=0.7.0 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from fasttextmirror->nlp-architect==0.4.post1) (40.8.0)
Collecting beautifulsoup4>=4.4.1 (from newspaper3k->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/1d/5d/3260694a59df0ec52f8b4883f5d23b130bc237602a1411fa670eae12351e/beautifulsoup4-4.7.1-py3-none-any.whl
Collecting feedparser>=5.2.1 (from newspaper3k->nlp-architect==0.4.post1)
Requirement already satisfied: PyYAML>=3.11 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from newspaper3k->nlp-architect==0.4.post1) (5.1)
Collecting tldextract>=2.0.1 (from newspaper3k->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/1e/90/18ac0e5340b6228c25cc8e79835c3811e7553b2b9ae87296dfeb62b7866d/tldextract-2.2.1-py2.py3-none-any.whl
Collecting jieba3k>=0.35.1 (from newspaper3k->nlp-architect==0.4.post1)
Collecting tinysegmenter==0.3 (from newspaper3k->nlp-architect==0.4.post1)
Collecting feedfinder2>=0.0.4 (from newspaper3k->nlp-architect==0.4.post1)
Collecting lxml>=3.6.0 (from newspaper3k->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/45/6c/436a534dca42f7982ba793983353035d117ab70541266704974efa323ade/lxml-4.3.3-cp37-cp37m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting cssselect>=0.9.2 (from newspaper3k->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/7b/44/25b7283e50585f0b4156960691d951b05d061abf4a714078393e51929b30/cssselect-1.0.3-py2.py3-none-any.whl
Requirement already satisfied: python-dateutil>=2.5.3 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from newspaper3k->nlp-architect==0.4.post1) (2.7.5)
Collecting regex<=2018.02.21,>=2017.07.11 (from wordfreq->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/a2/51/c39562cfed3272592c60cfd229e5464d715b78537e332eac2b695422dc49/regex-2018.02.21.tar.gz
Collecting langcodes>=1.4.1 (from wordfreq->nlp-architect==0.4.post1)
Collecting msgpack (from wordfreq->nlp-architect==0.4.post1)
Collecting Keras>=2.2.4 (from seqeval->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/5e/10/aa32dad071ce52b5502266b5c659451cfd6ffcbf14e6c8c4f16c0ff5aaab/Keras-2.2.4-py2.py3-none-any.whl
Requirement already satisfied: docopt>=0.6.2 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from num2words->nlp-architect==0.4.post1) (0.6.2)
Requirement already satisfied: pymongo in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from hyperopt->nlp-architect==0.4.post1) (3.7.2)
Requirement already satisfied: networkx in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from hyperopt->nlp-architect==0.4.post1) (2.2)
Requirement already satisfied: pytz>=2011k in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from pandas->nlp-architect==0.4.post1) (2018.9)
Requirement already satisfied: wcwidth in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from ftfy->nlp-architect==0.4.post1) (0.1.7)
Collecting tornado>=4.3 (from bokeh->nlp-architect==0.4.post1)
Requirement already satisfied: packaging>=16.8 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from bokeh->nlp-architect==0.4.post1) (18.0)
Requirement already satisfied: Jinja2>=2.7 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from bokeh->nlp-architect==0.4.post1) (2.10.1)
Requirement already satisfied: certifi>=2017.4.17 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from requests->nlp-architect==0.4.post1) (2019.3.9)
Requirement already satisfied: idna<2.8,>=2.5 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from requests->nlp-architect==0.4.post1) (2.7)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from requests->nlp-architect==0.4.post1) (3.0.4)
Collecting python-mimeparse>=1.5.2 (from falcon==1.4.1->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/26/2e/03bce213a9bf02a2750dcb04e761785e9c763fc11071edc4b447eacbb842/python_mimeparse-1.6.0-py2.py3-none-any.whl
Requirement already satisfied: mock>=2.0.0 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorflow-estimator<1.14.0rc0,>=1.13.0->tensorflow==1.13.1->nlp-architect==0.4.post1) (2.0.0)
Requirement already satisfied: markdown>=2.6.8 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorboard<1.14.0,>=1.13.0->tensorflow==1.13.1->nlp-architect==0.4.post1) (3.1)
Requirement already satisfied: werkzeug>=0.11.15 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from tensorboard<1.14.0,>=1.13.0->tensorflow==1.13.1->nlp-architect==0.4.post1) (0.15.2)
Collecting boto>=2.32 (from smart-open>=1.7.0->gensim->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/23/10/c0b78c27298029e4454a472a1919bde20cb182dab1662cec7f2ca1dcc523/boto-2.49.0-py2.py3-none-any.whl
Requirement already satisfied: boto3 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from smart-open>=1.7.0->gensim->nlp-architect==0.4.post1) (1.9.96)
Collecting soupsieve>=1.2 (from beautifulsoup4>=4.4.1->newspaper3k->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/b9/a5/7ea40d0f8676bde6e464a6435a48bc5db09b1a8f4f06d41dd997b8f3c616/soupsieve-1.9.1-py2.py3-none-any.whl
Collecting requests-file>=1.4 (from tldextract>=2.0.1->newspaper3k->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/23/9c/6e63c23c39e53d3df41c77a3d05a49a42c4e1383a6d2a5e3233161b89dbf/requests_file-1.4.3-py2.py3-none-any.whl
Collecting marisa-trie (from langcodes>=1.4.1->wordfreq->nlp-architect==0.4.post1)
Using cached https://files.pythonhosted.org/packages/20/95/d23071d0992dabcb61c948fb118a90683193befc88c23e745b050a29e7db/marisa-trie-0.7.5.tar.gz
Requirement already satisfied: decorator>=4.3.0 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from networkx->hyperopt->nlp-architect==0.4.post1) (4.4.0)
Requirement already satisfied: pyparsing>=2.0.2 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from packaging>=16.8->bokeh->nlp-architect==0.4.post1) (2.4.0)
Requirement already satisfied: MarkupSafe>=0.23 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from Jinja2>=2.7->bokeh->nlp-architect==0.4.post1) (1.1.1)
Requirement already satisfied: pbr>=0.11 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from mock>=2.0.0->tensorflow-estimator<1.14.0rc0,>=1.13.0->tensorflow==1.13.1->nlp-architect==0.4.post1) (5.1.3)
Requirement already satisfied: jmespath<1.0.0,>=0.7.1 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from boto3->smart-open>=1.7.0->gensim->nlp-architect==0.4.post1) (0.9.4)
Requirement already satisfied: s3transfer<0.3.0,>=0.2.0 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from boto3->smart-open>=1.7.0->gensim->nlp-architect==0.4.post1) (0.2.0)
Requirement already satisfied: botocore<1.13.0,>=1.12.96 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from boto3->smart-open>=1.7.0->gensim->nlp-architect==0.4.post1) (1.12.134)
Requirement already satisfied: docutils>=0.10 in /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages (from botocore<1.13.0,>=1.12.96->boto3->smart-open>=1.7.0->gensim->nlp-architect==0.4.post1) (0.14)
Building wheels for collected packages: dynet, fasttextmirror, regex, marisa-trie
Building wheel for dynet (setup.py) ... error
ERROR: Complete output from command /Library/Frameworks/Python.framework/Versions/3.7/bin/python3.7 -u -c 'import setuptools, tokenize;__file__='"'"'/private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-wheel-v0wl9pe8 --python-tag cp37:
ERROR: running bdist_wheel
running build
INFO:root:CMAKE_PATH='/usr/local/bin/cmake'
INFO:root:MAKE_PATH='/usr/bin/make'
INFO:root:MAKE_FLAGS='-j 4'
INFO:root:EIGEN3_INCLUDE_DIR='/private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet/build/py3.7-64bit/eigen'
INFO:root:EIGEN3_DOWNLOAD_URL='https://bitbucket.org/eigen/eigen/get/699b6595fc47.zip'
INFO:root:CC_PATH='/usr/bin/gcc'
INFO:root:CXX_PATH='/usr/bin/g++'
INFO:root:SCRIPT_DIR='/private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet'
INFO:root:BUILD_DIR='/private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet/build/py3.7-64bit'
INFO:root:INSTALL_PREFIX='/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/../../..'
INFO:root:PYTHON='/Library/Frameworks/Python.framework/Versions/3.7/bin/python3.7'
cmake version 3.14.4

CMake suite maintained and supported by Kitware (kitware.com/cmake).
Configured with: --prefix=/Applications/Xcode.app/Contents/Developer/usr --with-gxx-include-dir=/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk/usr/include/c++/4.2.1
Apple LLVM version 10.0.1 (clang-1001.0.46.4)
Target: x86_64-apple-darwin18.6.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin
INFO:root:Creating build directory /private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet/build/py3.7-64bit
INFO:root:Fetching Eigen...
INFO:root:Unpacking Eigen...
INFO:root:Configuring...
-- The C compiler identification is AppleClang 10.0.1.10010046
-- The CXX compiler identification is AppleClang 10.0.1.10010046
-- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/usr/bin/gcc
-- Check for working C compiler: /Applications/Xcode.app/Contents/Developer/usr/bin/gcc -- broken
CMake Error at /usr/local/Cellar/cmake/3.14.4/share/cmake/Modules/CMakeTestCCompiler.cmake:60 (message):
The C compiler

  "/Applications/Xcode.app/Contents/Developer/usr/bin/gcc"

is not able to compile a simple test program.

It fails with the following output:

  Change Dir: /private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet/build/py3.7-64bit/CMakeFiles/CMakeTmp

  Run Build Command(s):/usr/bin/make cmTC_9f760/fast
  /Applications/Xcode.app/Contents/Developer/usr/bin/make -f CMakeFiles/cmTC_9f760.dir/build.make CMakeFiles/cmTC_9f760.dir/build
  Building C object CMakeFiles/cmTC_9f760.dir/testCCompiler.c.o
  /Applications/Xcode.app/Contents/Developer/usr/bin/gcc   -g -O2 -mmacosx-version-min=10.13 -isysroot /SDKs/MacOSX.platform/MacOSX10.13.sdk  -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk   -o CMakeFiles/cmTC_9f760.dir/testCCompiler.c.o   -c /private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet/build/py3.7-64bit/CMakeFiles/CMakeTmp/testCCompiler.c
  Linking C executable cmTC_9f760
  /usr/local/Cellar/cmake/3.14.4/bin/cmake -E cmake_link_script CMakeFiles/cmTC_9f760.dir/link.txt --verbose=1
  /Applications/Xcode.app/Contents/Developer/usr/bin/gcc -g -O2 -mmacosx-version-min=10.13 -isysroot /SDKs/MacOSX.platform/MacOSX10.13.sdk  -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.14.sdk -Wl,-search_paths_first -Wl,-headerpad_max_install_names -mmacosx-version-min=10.13 -isysroot /SDKs/MacOSX.platform/MacOSX10.13.sdk  CMakeFiles/cmTC_9f760.dir/testCCompiler.c.o  -o cmTC_9f760
  clang: warning: no such sysroot directory: '/SDKs/MacOSX.platform/MacOSX10.13.sdk' [-Wmissing-sysroot]
  ld: library not found for -lSystem
  clang: error: linker command failed with exit code 1 (use -v to see invocation)
  make[1]: *** [cmTC_9f760] Error 1
  make: *** [cmTC_9f760/fast] Error 2




CMake will not be able to correctly generate this project.

Call Stack (most recent call first):
CMakeLists.txt:1 (project)

-- Configuring incomplete, errors occurred!
See also "/private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet/build/py3.7-64bit/CMakeFiles/CMakeOutput.log".
See also "/private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet/build/py3.7-64bit/CMakeFiles/CMakeError.log".
error: /usr/local/bin/cmake /private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet -DCMAKE_INSTALL_PREFIX='/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/../../..' -DEIGEN3_INCLUDE_DIR='/private/var/folders/d5/_y4xg4k91b5gllsppplwsd5m0000gn/T/pip-install-rkn7ci4c/dynet/build/py3.7-64bit/eigen' -DPYTHON='/Library/Frameworks/Python.framework/Versions/3.7/bin/python3.7'
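
The linker failure at the end of the log points at the actual problem: a stale -isysroot /SDKs/MacOSX.platform/MacOSX10.13.sdk flag (an SDK path that does not exist on this machine) is passed alongside the correct 10.14 SDK, so ld cannot find -lSystem. The stale flag most likely leaks in from the sysconfig of the Python build itself. A possible workaround, untested here, is to point the build at the SDK that is actually installed and retry:

export SDKROOT="$(xcrun --show-sdk-path)"
export MACOSX_DEPLOYMENT_TARGET=10.14
pip install --no-cache-dir dynet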

Question: Is anyone using Intel Python with NLP Architect?

I want to make sure that I can get the results that everyone else is getting with NLP Architect, and I'm curious whether anyone is using Intel Python.

I see that Intel Python is NOT in the installation instructions for NLP Architect, and I also see that Intel Python downgrades at least 30 packages during its installation on Ubuntu 16.04 LTS.

The process that I'm using to install Intel Python is:

conda install -c intel intelpython3_full
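
Intel Python is not part of the documented setup, so results there are not guaranteed. One way to keep its package downgrades from disturbing an existing environment (a suggestion, not from the official instructions) is to confine Intel Python to its own conda environment and install NLP Architect inside it:

conda create -n nlparchitect -c intel intelpython3_full
conda activate nlparchitect
pip install nlp-architect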

Error running absa train.py example

I am able to successfully install NLP Architect, but I get an error when I try to run:

python nlp-architect/examples/absa/train/train.py

The error I get is:

(screenshot of the error traceback; not reproduced here)

Is there something wrong with the provided dataset?

I am using the following Python version:

3.6.8 |Anaconda, Inc.| (default, Dec 30 2018, 01:22:34)
[GCC 7.3.0]

BIST raises an error about multiple roots in a sentence

I found these two datasets in English and Chinese:
https://github.com/UniversalDependencies/UD_English-EWT
https://github.com/UniversalDependencies/UD_Chinese-GSD
and tested them with BIST.

However, this error occurs after a few thousand batches of training for both:
nlp_architect.models.bist.eval.conllu.conll17_ud_eval.UDError: There are multiple roots in a sentence
I have checked the training data, and the number of roots is the same as the number of sentences in each train/dev/test set. Do you know what might cause the problem and how to fix it? Or do you have a dataset that works that we could use?

Thank you very much!
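
One way to narrow this down: conll17_ud_eval validates the system output as well as the gold files, so the duplicate roots may come from the parser's predictions rather than the data. A minimal sketch (an illustration, assuming the standard 10-column CoNLL-U layout) for checking a file directly:

    # Minimal check: every CoNLL-U sentence should have exactly one token whose
    # HEAD column (index 6) is "0". Multiword-token lines (IDs like "1-2") and
    # empty nodes (IDs like "1.1") carry no tree HEAD and are skipped.
    def count_roots(sentence):
        return sum(1 for cols in sentence if cols[6] == "0")

    def check_roots(path):
        sentence = []
        with open(path, encoding="utf-8") as f:
            for line_no, line in enumerate(f, 1):
                line = line.rstrip("\n")
                if not line:  # blank line terminates a sentence
                    if sentence and count_roots(sentence) != 1:
                        print(f"sentence ending at line {line_no}: {count_roots(sentence)} roots")
                    sentence = []
                elif not line.startswith("#"):
                    cols = line.split("\t")
                    if "-" not in cols[0] and "." not in cols[0]:
                        sentence.append(cols)
        if sentence and count_roots(sentence) != 1:
            print(f"final sentence: {count_roots(sentence)} roots")

    check_roots("en_ewt-ud-train.conllu")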

Installation and set-expansion data processing errors

Hi, I am trying to install NLP Architect and run the set-expansion solution. However, I ran into some problems:

  1. Trying to run the command given in the set-expansion documentation on my desktop:

python -m nlp_architect.solutions.set_expansion.prepare_data.py --corpus TRAINING_CORPUS --marked_corpus MARKED_TRAINING_CORPUS

I would get the following error:

/usr/bin/python3: Error while finding module specification for 'nlp_architect.solutions.set_expansion.prepare_data.py' (AttributeError: module 'nlp_architect.solutions.set_expansion.prepare_data' has no attribute '__path__')

What could be the problem here? (See the note after this list.)

  2. When I try to run the exact same thing on a server, I get the following error:

[1] 13988 illegal hardware instruction python -m nlp_architect.solutions.set_expansion.prepare_data.py --corpus

What could be the problem here? For reference, this is the CPU info for my server (see the note that follows it):

processor : 0
vendor_id : GenuineIntel
cpu family : 15
model : 6
model name : Intel(R) Xeon(TM) CPU 3.00GHz
stepping : 4
microcode : 0x2
cpu MHz : 2992.630
cache size : 2048 KB
physical id : 0
siblings : 2
core id : 0
cpu cores : 2
apicid : 0
initial apicid : 0
fpu : yes
fpu_exception : yes
cpuid level : 6
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx lm constant_tsc pebs bts nopl pni dtes64 monitor ds_cpl vmx est cid cx16 xtpr pdcm lahf_lm tpr_shadow
bogomips : 5985.26
clflush size : 64
cache_alignment : 128
address sizes : 36 bits physical, 48 bits virtual
power management:
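
On the first question: python -m takes a dotted module name, not a filename. The trailing .py makes Python look for a submodule named py inside prepare_data, which requires prepare_data.__path__, and a plain module has no __path__; that is exactly the AttributeError shown. Dropping the suffix should fix it:

python -m nlp_architect.solutions.set_expansion.prepare_data --corpus TRAINING_CORPUS --marked_corpus MARKED_TRAINING_CORPUS

On the server crash: the CPU flags above show no AVX support (this is a NetBurst-era Xeon), and prebuilt wheels for some dependencies, TensorFlow from 1.6 onward in particular, are compiled with AVX instructions, so an illegal hardware instruction on this machine is plausible. Building the offending package from source without AVX, or moving to a newer CPU, are the usual ways around that.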

intent_extraction_demo.ipynb is not working with nlp_architect v0.3.1

Target objective:

Trying to run the intent classification demo "intent_extraction_demo.ipynb" using the default dataset mentioned in the notebook itself.

Steps to objective:

  • Intents are not one-hot encoded, so I got the error below:

“ValueError: Error when checking target: expected intent_classifier_output to have shape (7,) but got array with shape (1,)”

  • I tried two approaches: explicit one-hot encoding on my own, and importing one_hot from v0.2. After the import, the intent array is converted to one-hot format, but then I get the error below.

Train on 13784 samples, validate on 700 samples
Epoch 1/1

InvalidArgumentError Traceback (most recent call last)
<ipython-input> in <module>()
4 # train the model
5 model.fit([train_x, train_c], [train_i, train_y],batch_size=batch_size,epochs=no_epochs,
----> 6 validation=([test_x, test_c], [test_i, test_y]))
7
8 #model.fit([train_x, train_c], [intent_labels,train_y],batch_size=batch_size,epochs=no_epochs)

~/.local/lib/python3.6/site-packages/nlp_architect/models/intent_extraction.py in fit(self, x, y, epochs, batch_size, callbacks, validation)
52 self.model.fit(x, y, epochs=epochs, batch_size=batch_size, shuffle=True,
53 validation_data=validation,
---> 54 callbacks=callbacks)
55
56 def predict(self, x, batch_size=1):

/glob/intel-python/python3/lib/python3.6/site-packages/tensorflow/python/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, **kwargs)
1361 initial_epoch=initial_epoch,
1362 steps_per_epoch=steps_per_epoch,
-> 1363 validation_steps=validation_steps)
1364
1365 def evaluate(self,

/glob/intel-python/python3/lib/python3.6/site-packages/tensorflow/python/keras/engine/training_arrays.py in fit_loop(model, inputs, targets, sample_weights, batch_size, epochs, verbose, callbacks, val_inputs, val_targets, val_sample_weights, shuffle, callback_metrics, initial_epoch, steps_per_epoch, validation_steps)
262 ins_batch[i] = ins_batch[i].toarray()
263
--> 264 outs = f(ins_batch)
265 if not isinstance(outs, list):
266 outs = [outs]

/glob/intel-python/python3/lib/python3.6/site-packages/tensorflow/python/keras/backend.py in __call__(self, inputs)
2912 self._make_callable(feed_arrays, feed_symbols, symbol_vals, session)
2913
-> 2914 fetched = self._callable_fn(*array_vals)
2915 self._call_fetch_callbacks(fetched[-len(self._fetches):])
2916 return fetched[:len(self.outputs)]

/glob/intel-python/python3/lib/python3.6/site-packages/tensorflow/python/client/session.py in __call__(self, *args, **kwargs)
1380 ret = tf_session.TF_SessionRunCallable(
1381 self._session._session, self._handle, args, status,
-> 1382 run_metadata_ptr)
1383 if run_metadata:
1384 proto_data = tf_session.TF_GetBuffer(run_metadata_ptr)

/glob/intel-python/python3/lib/python3.6/site-packages/tensorflow/python/framework/errors_impl.py in __exit__(self, type_arg, value_arg, traceback_arg)
517 None, None,
518 compat.as_text(c_api.TF_Message(self.status.status)),
--> 519 c_api.TF_GetCode(self.status.status))
520 # Delete the underlying status object from memory otherwise it stays alive
521 # as there is a reference to status from this from the traceback due to

InvalidArgumentError: slice index 1 of dimension 0 out of bounds.
[[Node: loss/intent_slot_crf_loss/cond/strided_slice_4 = StridedSlice[Index=DT_INT32, T=DT_INT32, begin_mask=0, ellipsis_mask=0, end_mask=0, new_axis_mask=0, shrink_axis_mask=1, _device="/job:localhost/replica:0/task:0/device:CPU:0"](loss/intent_slot_crf_loss/cond/Shape_4, metrics/accuracy/cond/rnn_1/strided_slice_2/stack, metrics/accuracy/cond/rnn_1/strided_slice_2/stack_1, metrics/accuracy/cond/rnn_1/strided_slice_2/stack)]]

It looks like the code is broken in v0.3; please let me know the solution for the above issue.
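
For the first error: the intent head expects one-hot targets of shape (7,), so the labels do indeed need encoding. A minimal sketch of that step (the array name and the class count of 7 are taken from the messages above; the sample ids are illustrative):

    import numpy as np
    from tensorflow.keras.utils import to_categorical

    # Hypothetical fix: one-hot encode the integer intent labels so they match
    # the (7,)-shaped intent_classifier_output. 'train_i' stands in for the
    # notebook's intent label array.
    train_i = np.array([0, 3, 6, 2])                  # integer class ids
    train_i = to_categorical(train_i, num_classes=7)  # now shaped (4, 7)

The second traceback originates inside the CRF loss, so it may be a separate incompatibility between the v0.2 notebook and the v0.3 model code rather than a labeling problem.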


Question: CRF can't be loaded by tf.keras.models.load_model()

I notice that in the models of this repo, only weights are saved; during model loading, a new model is constructed from the model topology and the weights are set on it.

If I construct a model using tf.keras.layers and CRF, save the whole model with model.save(path), and then call tf.keras.models.load_model(path, custom_objects={"CRF": CRF}), it throws the following error:

cls = <class 'nlp_architect.contrib.tensorflow.python.keras.layers.crf.CRF'>
config = {'dtype': 'float32', 'mode': 'reg', 'name': 'intent_slot_crf', 'output_dim': 5, ...}

    @classmethod
    def from_config(cls, config):
      """Creates a layer from its config.
    
      This method is the reverse of `get_config`,
      capable of instantiating the same layer from the config
      dictionary. It does not handle layer connectivity
      (handled by Network), nor weights (handled by `set_weights`).
    
      Arguments:
          config: A Python dictionary, typically the
              output of get_config.
    
      Returns:
          A layer instance.
      """
>     return cls(**config)
E     TypeError: __init__() missing 1 required positional argument: 'num_classes'

It seems that num_classes is not included in get_config?
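
That is indeed the cause: from_config simply calls cls(**config), as the docstring above says, so every required __init__ argument must round-trip through get_config. A sketch of the kind of override that fixes it (a hypothetical illustration; the attribute names are inferred from the config dict printed above, not taken from the library's code):

    import tensorflow as tf

    class CRF(tf.keras.layers.Layer):
        def __init__(self, num_classes, mode="reg", **kwargs):
            super().__init__(**kwargs)
            self.num_classes = num_classes
            self.mode = mode

        def get_config(self):
            # Serialize the required constructor argument that the stock
            # config is missing, on top of the base layer config.
            config = super().get_config()
            config["num_classes"] = self.num_classes
            config["mode"] = self.mode
            return config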

If I modify get_config to include it, loading throws another error:

identifier = 'loss'
module_objects = {'K': <module 'tensorflow.python.keras.backend' from '/home/kai/.local/lib/python3.5/site-packages/tensorflow/python/k...ction mean_absolute_error at 0x7fd2cb6926a8>, 'MAPE': <function mean_absolute_percentage_error at 0x7fd2cb692730>, ...}
custom_objects = None, printable_module_name = 'loss function'

    @tf_export('keras.utils.deserialize_keras_object')
    def deserialize_keras_object(identifier,
                                 module_objects=None,
                                 custom_objects=None,
                                 printable_module_name='object'):
      if isinstance(identifier, dict):
        # In this case we are dealing with a Keras config dictionary.
        config = identifier
        if 'class_name' not in config or 'config' not in config:
          raise ValueError('Improper config format: ' + str(config))
        class_name = config['class_name']
        if custom_objects and class_name in custom_objects:
          cls = custom_objects[class_name]
        elif class_name in _GLOBAL_CUSTOM_OBJECTS:
          cls = _GLOBAL_CUSTOM_OBJECTS[class_name]
        else:
          module_objects = module_objects or {}
          cls = module_objects.get(class_name)
          if cls is None:
            raise ValueError('Unknown ' + printable_module_name + ': ' + class_name)
        if hasattr(cls, 'from_config'):
          arg_spec = tf_inspect.getargspec(cls.from_config)
          custom_objects = custom_objects or {}
    
          if 'custom_objects' in arg_spec.args:
            return cls.from_config(
                config['config'],
                custom_objects=dict(
                    list(_GLOBAL_CUSTOM_OBJECTS.items()) +
                    list(custom_objects.items())))
          with CustomObjectScope(custom_objects):
            return cls.from_config(config['config'])
        else:
          # Then `cls` may be a function returning a class.
          # in this case by convention `config` holds
          # the kwargs of the function.
          custom_objects = custom_objects or {}
          with CustomObjectScope(custom_objects):
            return cls(**config['config'])
      elif isinstance(identifier, six.string_types):
        function_name = identifier
        if custom_objects and function_name in custom_objects:
          fn = custom_objects.get(function_name)
        elif function_name in _GLOBAL_CUSTOM_OBJECTS:
          fn = _GLOBAL_CUSTOM_OBJECTS[function_name]
        else:
          fn = module_objects.get(function_name)
          if fn is None:
            raise ValueError('Unknown ' + printable_module_name + ':' +
>                            function_name)
E           ValueError: Unknown loss function:loss
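
This second failure is about the loss, not the layer: if the model was compiled with the CRF layer's bound loss method, Keras records it only under the bare name 'loss', which cannot be resolved at load time. A workaround sketch (hypothetical; it assumes the layer is named intent_slot_crf as in the config above and exposes a loss method) is to load without compiling and re-compile by hand:

    # Skip loss deserialization entirely, then re-attach the CRF loss. The
    # layer name comes from the config above; the optimizer choice is an
    # assumption for illustration.
    model = tf.keras.models.load_model(path, custom_objects={"CRF": CRF},
                                       compile=False)
    crf_layer = model.get_layer("intent_slot_crf")
    model.compile(optimizer="adam", loss=crf_layer.loss)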
