Visualizations and helpers to improve and debug machine learning models for Rasa Open Source

License: Apache License 2.0


Note! The code for this project is meant for Rasa Open Source 2.x.

RasaLit

A collection of helpful viewers for understanding Rasa NLU components. Some of these views are made using streamlit, hence the wink in the name.

Feedback is welcome.

Installation

You can install it with pip directly from this GitHub repository.

python -m pip install git+https://github.com/RasaHQ/rasalit

Compatibility

The focus is on supporting the most recent version of Rasa; currently we target 2.x. We keep older versions around, though. You can find rasalit for Rasa 1.10 here.

Usage

You can directly access the command line app.

> python -m rasalit --help
Usage: rasalit [OPTIONS] COMMAND [ARGS]...

  Helper Views for Rasa NLU

Options:
  --help  Show this message and exit.

Commands:
  diet-explorer  Allows you to explore the DIET settings.
  live-nlu       Select a trained Rasa model and interact with it.
  nlu-cluster    Cluster a text file to look for clusters of intents.
  overview       Gives an overview of all `rasa train nlu` results.
  spelling       Check the effect of spelling on NLU predictions.
  version        Prints the current version of rasalit.

Features

The app contains a collection of viewers that each specialize in a separate task.

nlu-cluster

This command allows you to cluster similar utterances in a text file.

Note that this app has some extra dependencies. You can install them via:

python -m pip install "whatlies[umap]"

Example Usage:

python -m rasalit nlu-cluster --port 8501

This will start a server locally. Internally it uses the whatlies package to handle the embeddings. This means that while the demo is only in English, you can extend the code to work for non-English scenarios too! For more details, as well as a labelling tool, check out the notebook found here.
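As a rough, dependency-free illustration of what clustering utterances involves (rasalit itself uses whatlies embeddings; the helper names below are hypothetical and not rasalit's API), here is a minimal sketch that greedily groups utterances by the cosine similarity of their bag-of-words vectors:

```python
from collections import Counter
import math

def bow(text):
    """Bag-of-words vector for a sentence, as a Counter of lowercase tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two Counter vectors."""
    dot = sum(a[token] * b[token] for token in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def cluster(utterances, threshold=0.3):
    """Greedy clustering: join the first cluster whose seed utterance is
    similar enough, otherwise start a new cluster."""
    clusters = []  # list of (seed_vector, member_texts) pairs
    for text in utterances:
        vec = bow(text)
        for seed_vec, members in clusters:
            if cosine(vec, seed_vec) >= threshold:
                members.append(text)
                break
        else:
            clusters.append((vec, [text]))
    return [members for _, members in clusters]

texts = ["book a flight", "book me a flight please",
         "what is the weather", "weather today"]
print(cluster(texts))
# → [['book a flight', 'book me a flight please'], ['what is the weather', 'weather today']]
```

Real embeddings from whatlies would replace the bag-of-words step here; the greedy grouping is just a stand-in for the dimensionality reduction and clustering that the app performs.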

overview

This command shows a summary of the intent/entity scores from a `rasa train nlu` run.

Example Usage:

> python -m rasalit overview --folder gridresults --port 8501

This will start a local server that displays an interactive dashboard of all your NLU gridsearch data.

To fully benefit from this feature you'll need to run some models first. You can run cross validation of models in Rasa via the command line:

rasa test nlu --config configs/config-light.yml \
              --cross-validation --runs 1 --folds 2 \
              --out gridresults/config-light
rasa test nlu --config configs/config-heavy.yml \
              --cross-validation --runs 1 --folds 2 \
              --out gridresults/config-heavy

Rasa will then save the results in gridresults/config-light and gridresults/config-heavy respectively.

To get an overview of all the results in subfolders of gridresults, you can run the rasalit overview --folder gridresults command from the same folder where you ran the rasa test command. You'll get some simple charts that summarise the intent/entity performance.
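The reports that `rasa test nlu` writes are sklearn-style classification reports in JSON. A small stdlib-only sketch of how such per-config reports could be collected into one table (assuming the usual intent_report.json layout; `read_intent_reports` is a hypothetical helper, not rasalit's API):

```python
import json
import tempfile
from pathlib import Path

def read_intent_reports(root):
    """Collect per-intent F1 scores from <root>/<config>/intent_report.json files.
    Assumes the sklearn-style classification_report layout that `rasa test nlu` writes."""
    rows = []
    for report_file in sorted(Path(root).glob("*/intent_report.json")):
        report = json.loads(report_file.read_text())
        for intent, scores in report.items():
            if not isinstance(scores, dict):  # skip the top-level "accuracy" float
                continue
            rows.append({"config": report_file.parent.name,
                         "intent": intent,
                         "f1": scores["f1-score"]})
    return rows

# Tiny demo with a fabricated report file:
root = Path(tempfile.mkdtemp())
(root / "config-light").mkdir()
(root / "config-light" / "intent_report.json").write_text(json.dumps({
    "greet": {"precision": 1.0, "recall": 0.9, "f1-score": 0.95, "support": 10},
    "accuracy": 0.9,
}))
print(read_intent_reports(root))
# → [{'config': 'config-light', 'intent': 'greet', 'f1': 0.95}]
```

In rasalit the collected scores feed the interactive charts; here the same idea is reduced to a flat list of rows that you could hand to pandas.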

spelling

This command lets you predict text with augmented spelling errors to check for robustness.

> python -m rasalit spelling --help
> python -m rasalit spelling --port 8501

This will start a server locally on port 8501 that will display an interactive playground for your trained Rasa NLU model. You can watch the confidence levels change as you allow for more or fewer spelling errors.

It's assumed that you run this command from the root of your Rasa project but you can also make it point to other projects via the command line settings.
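Under the hood this boils down to generating perturbed copies of a sentence and re-scoring them. A minimal, hedged sketch of the idea (`add_typos` is a hypothetical helper; rasalit's own augmentation may differ):

```python
import random

def add_typos(text, n_errors=1, seed=0):
    """Return `text` with n_errors random adjacent-character swaps applied."""
    rng = random.Random(seed)  # seeded for reproducible augmentations
    chars = list(text)
    for _ in range(n_errors):
        i = rng.randrange(len(chars) - 1)
        chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)

sentence = "i want a ticket"
variants = [add_typos(sentence, n_errors=k, seed=42) for k in range(3)]
# Feed each variant to your trained model and compare confidences, e.g. with
# interpreter.parse(variant)["intent"]["confidence"] in Rasa 2.x.
print(variants)
```

More realistic augmenters (keyboard-neighbour substitutions, deletions) follow the same pattern: perturb, re-predict, compare.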

live-nlu

This command gives you an interactive GUI that lets you see the output of a trained modelling pipeline.

Example Usage:

> python -m rasalit live-nlu --help
> python -m rasalit live-nlu --port 8501

This will start a server locally on port 8501 that will display an interactive playground for your trained Rasa NLU model. You can see the confidence levels as well as the detected entities. We also show the shapes of some internal featurization steps.

It's assumed that you run this command from the root of your Rasa project but you can also make it point to other projects via the command line settings.
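The playground's output follows Rasa NLU's standard parse format: a dict with the text, the winning intent, an intent ranking, and the detected entities. A small sketch of pulling a summary out of such a payload (the payload below is illustrative, not real model output):

```python
def summarize(parsed):
    """Pull the winning intent and the entity values out of a Rasa NLU parse result."""
    intent = parsed["intent"]
    entities = [(e["entity"], e["value"]) for e in parsed.get("entities", [])]
    return intent["name"], round(intent["confidence"], 2), entities

# Illustrative payload, shaped like the dict interpreter.parse(...) returns in Rasa 2.x:
parsed = {
    "text": "book a flight to Berlin",
    "intent": {"name": "book_flight", "confidence": 0.987},
    "intent_ranking": [
        {"name": "book_flight", "confidence": 0.987},
        {"name": "greet", "confidence": 0.013},
    ],
    "entities": [{"entity": "city", "value": "Berlin", "start": 17, "end": 23}],
}
print(summarize(parsed))
# → ('book_flight', 0.99, [('city', 'Berlin')])
```

The intent_ranking list holds the runner-up intents; the confidences and entities shown there are what the app visualizes.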

Attention Charts

If you're using the DIETClassifier, you can also use this app to debug its internals. It lets you inspect all the pipeline settings as well as the internal attention mechanism.

diet-explorer

This command gives you an interactive visualisation of DIET that allows you to see the available hyperparameters from all the layers in the algorithm.

Example Usage:

> rasalit diet-explorer --port 8501

This will start a server locally on port 8501 that will display an interactive visualisation of the DIET architecture.

Notebooks

This project also hosts a few jupyter notebooks that contain interactive tools.

Bulk Labelling

The bulk labelling demo found in this video and this video can be found here.

This notebook allows you to use embeddings and a drawing tool to do some bulk-labelling.

Contribute

There are many ways you can contribute to this project.

  • You can suggest new features.
  • You can help review new features.
  • You can submit new components.
  • You can let us know if there are bugs.
  • You can let us know if the components in this library help you.

Feel free to start the discussion by opening an issue on this repository. Before submitting code, it would help if you first create an issue so that we can discuss the changes you would like to contribute. You can ping the maintainer (GitHub alias: koaning) both in the issues here and on the Rasa forum if you have any questions.

rasalit's People

Contributors

dependabot[bot], dingusagar, koaning, tabergma


rasalit's Issues

Thesaurus

It might be cool if we could connect with a small thesaurus API that could help people write better nlu.md files.

KeyError: "['support'] not found in axis"

I followed the readme instructions to test the NLU model on multiple config.yml files and stored the results in the gridresults folder.

After that, on running
python -m rasalit overview --folder gridresults
I get the following error in the browser:

KeyError: "['support'] not found in axis"

Traceback:
  File "/home/dingusagar/PycharmProjects/rasa-demo/venv/lib/python3.6/site-packages/streamlit/script_runner.py", line 324, in _run_script
    exec(code, module.__dict__)
  File "/home/dingusagar/PycharmProjects/rasa-demo/venv/lib/python3.6/site-packages/rasalit/apps/overview/app.py", line 21, in <module>
    df_response = read_reports(root_folder, report="response")
  File "/home/dingusagar/PycharmProjects/rasa-demo/venv/lib/python3.6/site-packages/rasalit/apps/overview/common.py", line 16, in read_reports
    return pd.DataFrame(data).drop(columns=["support"]).melt("config")
  File "/home/dingusagar/PycharmProjects/rasa-demo/venv/lib/python3.6/site-packages/pandas/core/frame.py", line 4169, in drop
    errors=errors,
  File "/home/dingusagar/PycharmProjects/rasa-demo/venv/lib/python3.6/site-packages/pandas/core/generic.py", line 3884, in drop
    obj = obj._drop_axis(labels, axis, level=level, errors=errors)
  File "/home/dingusagar/PycharmProjects/rasa-demo/venv/lib/python3.6/site-packages/pandas/core/generic.py", line 3918, in _drop_axis
    new_axis = axis.drop(labels, errors=errors)
  File "/home/dingusagar/PycharmProjects/rasa-demo/venv/lib/python3.6/site-packages/pandas/core/indexes/base.py", line 5278, in drop
    raise KeyError(f"{labels[mask]} not found in axis")

Does anybody know what's going on here?

Adding labeled utterances for reference in Bulk Labeling Tool

Thanks for the amazing work!! Loved it.

The Bulk Labeling Tool currently only takes unlabeled utterances. Would it be possible to also visualize a few already-labeled utterances (possibly in a different color) in the same semantic space? This would let users see the separation of clusters more clearly. For example, there might be confusion between picking a sub-cluster or a bigger super-cluster while labeling; colored dots (a few labeled samples) could help guide the manual cluster selection process.

Also, a drop-down menu of all the intents already present in the currently labeled version would go a long way, I feel.

Update Attention Viewer

At some point this PR will be merged, and once it is, our views will break. Once the new Rasa version is live on PyPI, this repo should be updated.

Error while pip installing rasalit from github: "TypeError: a bytes-like object is required, not 'str'"

I installed Rasa 2.8.3 on Python 3.8.11. In my case I only use Rasa NLU, for the German language.

This is the first time this error appears while collecting astor:

INFO: pip is looking at multiple versions of astor to determine which version is compatible with other requirements. This could take a while.
Collecting astor
  Using cached astor-0.8.0-py2.py3-none-any.whl (27 kB)
  Using cached astor-0.7.1-py2.py3-none-any.whl (27 kB)
  Using cached astor-0.7.0-py2.py3-none-any.whl (27 kB)
  Using cached astor-0.6.2-py2.py3-none-any.whl (26 kB)
  Using cached astor-0.6.1-py2.py3-none-any.whl (26 kB)
  Using cached astor-0.6-py2.py3-none-any.whl (26 kB)
  Using cached astor-0.5-py2.py3-none-any.whl (12 kB)
  Using cached astor-0.4.1-py2.py3-none-any.whl (12 kB)
  Using cached astor-0.4-py2.py3-none-any.whl (13 kB)
  Using cached astor-0.3.tar.gz (10 kB)
  Using cached astor-0.2.1.tar.gz (10 kB)
    ERROR: Command errored out with exit status 1:
     command: 'C:\Users\User\anaconda3\envs\rasa_env\python.exe' -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\User\\AppData\\Local\\Temp\\pip-install-6h22mgjd\\astor_2ad0d30b2b3c48f2b8cc7c51089444f4\\setup.py'
"'"'; __file__='"'"'C:\\Users\\User\\AppData\\Local\\Temp\\pip-install-6h22mgjd\\astor_2ad0d30b2b3c48f2b8cc7c51089444f4\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n
'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base 'C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-gnwugxmp'
         cwd: C:\Users\User\AppData\Local\Temp\pip-install-6h22mgjd\astor_2ad0d30b2b3c48f2b8cc7c51089444f4\
    Complete output (24 lines):
    running egg_info
    creating C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-gnwugxmp\astor.egg-info
    writing C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-gnwugxmp\astor.egg-info\PKG-INFO
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "C:\Users\User\AppData\Local\Temp\pip-install-6h22mgjd\astor_2ad0d30b2b3c48f2b8cc7c51089444f4\setup.py", line 11, in <module>
        setup(
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\core.py", line 148, in setup
        dist.run_commands()
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\dist.py", line 966, in run_commands
        self.run_command(cmd)
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\dist.py", line 985, in run_command
        cmd_obj.run()
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\site-packages\setuptools\command\egg_info.py", line 292, in run
        writer(self, ep.name, os.path.join(self.egg_info, ep.name))
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\site-packages\setuptools\command\egg_info.py", line 628, in write_pkg_info
        metadata.write_pkg_info(cmd.egg_info)
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\dist.py", line 1117, in write_pkg_info
        self.write_pkg_file(pkg_info)
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\site-packages\setuptools\dist.py", line 168, in write_pkg_file
        long_desc = rfc822_escape(self.get_long_description())
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\util.py", line 475, in rfc822_escape
        lines = header.split('\n')
    TypeError: a bytes-like object is required, not 'str'

Then it goes further and gets another error:

WARNING: Discarding https://files.pythonhosted.org/packages/65/aa/a0fe2c7165f5d27ace1850639089ff178d9b0a9b48b491e5301ee269ccc0/astor-0.2.1.tar.gz#sha256=ee4ee1252e355eaa4bbc5e91c51e55fcd2327d1a94d7ce3b0c6fb96154f118be (from https://
pypi.org/simple/astor/). Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
  Using cached astor-0.2.tar.gz (9.6 kB)
    ERROR: Command errored out with exit status 1:
     command: 'C:\Users\User\anaconda3\envs\rasa_env\python.exe' -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\User\\AppData\\Local\\Temp\\pip-install-6h22mgjd\\astor_6d4f99e4ab544643bf331a2ce5cbbc7a\\setup.py'
"'"'; __file__='"'"'C:\\Users\\User\\AppData\\Local\\Temp\\pip-install-6h22mgjd\\astor_6d4f99e4ab544643bf331a2ce5cbbc7a\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n
'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base 'C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-1sad69bh'
         cwd: C:\Users\User\AppData\Local\Temp\pip-install-6h22mgjd\astor_6d4f99e4ab544643bf331a2ce5cbbc7a\
    Complete output (24 lines):
    running egg_info
    creating C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-1sad69bh\astor.egg-info
    writing C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-1sad69bh\astor.egg-info\PKG-INFO
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "C:\Users\User\AppData\Local\Temp\pip-install-6h22mgjd\astor_6d4f99e4ab544643bf331a2ce5cbbc7a\setup.py", line 11, in <module>
        setup(
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\core.py", line 148, in setup
        dist.run_commands()
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\dist.py", line 966, in run_commands
        self.run_command(cmd)
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\dist.py", line 985, in run_command
        cmd_obj.run()
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\site-packages\setuptools\command\egg_info.py", line 292, in run
        writer(self, ep.name, os.path.join(self.egg_info, ep.name))
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\site-packages\setuptools\command\egg_info.py", line 628, in write_pkg_info
        metadata.write_pkg_info(cmd.egg_info)
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\dist.py", line 1117, in write_pkg_info
        self.write_pkg_file(pkg_info)
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\site-packages\setuptools\dist.py", line 168, in write_pkg_file
        long_desc = rfc822_escape(self.get_long_description())
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\util.py", line 475, in rfc822_escape
        lines = header.split('\n')
    TypeError: a bytes-like object is required, not 'str'
    ----------------------------------------
WARNING: Discarding https://files.pythonhosted.org/packages/d1/b4/b1d8da20de5fdd62e596ebe9e3820a454160ff89d7226a50708b7c6c8eeb/astor-0.2.tar.gz#sha256=5f0e6ad42e958170ba916b302248f367dd51abdd4153f84d89c474d4960cab17 (from https://py
pi.org/simple/astor/). Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
  Using cached astor-0.1.tar.gz (8.9 kB)
    ERROR: Command errored out with exit status 1:
     command: 'C:\Users\User\anaconda3\envs\rasa_env\python.exe' -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\User\\AppData\\Local\\Temp\\pip-install-6h22mgjd\\astor_b22bd0c67c2e4beda33f963164caaafc\\setup.py'
"'"'; __file__='"'"'C:\\Users\\User\\AppData\\Local\\Temp\\pip-install-6h22mgjd\\astor_b22bd0c67c2e4beda33f963164caaafc\\setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n
'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base 'C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-xphe1rk6'
         cwd: C:\Users\User\AppData\Local\Temp\pip-install-6h22mgjd\astor_b22bd0c67c2e4beda33f963164caaafc\
    Complete output (24 lines):
    running egg_info
    creating C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-xphe1rk6\astor.egg-info
    writing C:\Users\User\AppData\Local\Temp\pip-pip-egg-info-xphe1rk6\astor.egg-info\PKG-INFO
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "C:\Users\User\AppData\Local\Temp\pip-install-6h22mgjd\astor_b22bd0c67c2e4beda33f963164caaafc\setup.py", line 11, in <module>
        setup(
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\core.py", line 148, in setup
        dist.run_commands()
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\dist.py", line 966, in run_commands
        self.run_command(cmd)
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\dist.py", line 985, in run_command
        cmd_obj.run()
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\site-packages\setuptools\command\egg_info.py", line 292, in run
        writer(self, ep.name, os.path.join(self.egg_info, ep.name))
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\site-packages\setuptools\command\egg_info.py", line 628, in write_pkg_info
        metadata.write_pkg_info(cmd.egg_info)
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\dist.py", line 1117, in write_pkg_info
        self.write_pkg_file(pkg_info)
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\site-packages\setuptools\dist.py", line 168, in write_pkg_file
        long_desc = rfc822_escape(self.get_long_description())
      File "C:\Users\User\anaconda3\envs\rasa_env\lib\distutils\util.py", line 475, in rfc822_escape
        lines = header.split('\n')
    TypeError: a bytes-like object is required, not 'str'
    ----------------------------------------

And after this one it will do nothing, until I click Control+C to abort:

WARNING: Discarding https://files.pythonhosted.org/packages/bc/f3/0d2cac649c45157f0386ec516a4d5fc310307f0d7cea8194b0eb246c22a7/astor-0.1.tar.gz#sha256=ae7e98ba6e868920bc2a47d58fcbcb388f908e9a9533246e6ad53b8105bd297c (from https://py
pi.org/simple/astor/). Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
INFO: pip is looking at multiple versions of backcall to determine which version is compatible with other requirements. This could take a while.
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell
 us what happened here: https://pip.pypa.io/surveys/backtracking
INFO: pip is looking at multiple versions of astor to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of argon2-cffi-bindings to determine which version is compatible with other requirements. This could take a while.
Collecting argon2-cffi-bindings
  Using cached argon2_cffi_bindings-21.1.0-cp36-abi3-win_amd64.whl (30 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell
 us what happened here: https://pip.pypa.io/surveys/backtracking
INFO: pip is looking at multiple versions of argon2-cffi to determine which version is compatible with other requirements. This could take a while.
Collecting argon2-cffi
  Using cached argon2_cffi-21.2.0-py3-none-any.whl (14 kB)
  Using cached argon2_cffi-21.1.0-cp35-abi3-win_amd64.whl (40 kB)
  Using cached argon2_cffi-20.1.0-cp38-cp38-win_amd64.whl (42 kB)
  Using cached argon2_cffi-19.2.0-cp38-cp38-win_amd64.whl (41 kB)
  Using cached argon2_cffi-19.1.0.tar.gz (1.8 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
    Preparing wheel metadata ... done
  Using cached argon2_cffi-18.3.0.tar.gz (1.8 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
    Preparing wheel metadata ... done
  Using cached argon2_cffi-18.2.0.tar.gz (1.8 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
    Preparing wheel metadata ... done
  Using cached argon2_cffi-18.1.0.tar.gz (1.8 MB)
  Using cached argon2_cffi-16.3.0.tar.gz (1.5 MB)
  Using cached argon2_cffi-16.2.0.tar.gz (1.2 MB)
  Using cached argon2_cffi-16.1.0.tar.gz (1.2 MB)
  Using cached argon2_cffi-16.0.0.tar.gz (822 kB)
  Using cached argon2_cffi-15.0.1.tar.gz (818 kB)
  Using cached argon2_cffi-15.0.0.tar.gz (818 kB)
INFO: pip is looking at multiple versions of argon2-cffi to determine which version is compatible with other requirements. This could take a while.
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell
 us what happened here: https://pip.pypa.io/surveys/backtracking
INFO: pip is looking at multiple versions of yarl to determine which version is compatible with other requirements. This could take a while.
Collecting yarl
  Using cached yarl-1.7.2-cp38-cp38-win_amd64.whl (122 kB)
  Using cached yarl-1.7.0-cp38-cp38-win_amd64.whl (122 kB)
  Using cached yarl-1.6.2-cp38-cp38-win_amd64.whl (127 kB)
  Using cached yarl-1.6.1-cp38-cp38-win_amd64.whl (127 kB)
  Using cached yarl-1.6.0-cp38-cp38-win_amd64.whl (129 kB)
  Using cached yarl-1.5.1-cp38-cp38-win_amd64.whl (128 kB)
  Using cached yarl-1.4.2-cp38-cp38-win_amd64.whl (126 kB)
  Using cached yarl-1.4.1-cp38-cp38-win_amd64.whl (126 kB)
  Using cached yarl-1.4.0-cp38-cp38-win_amd64.whl (126 kB)
  Using cached yarl-1.3.0.tar.gz (159 kB)
  Using cached yarl-1.2.6.tar.gz (159 kB)
  Using cached yarl-1.2.5.tar.gz (160 kB)
  Using cached yarl-1.2.4.tar.gz (159 kB)
  Using cached yarl-1.2.3.tar.gz (159 kB)
  Using cached yarl-1.2.2.tar.gz (158 kB)
  Using cached yarl-1.2.1.tar.gz (158 kB)
  Using cached yarl-1.2.0.tar.gz (157 kB)
  Using cached yarl-1.1.1.tar.gz (156 kB)
  Using cached yarl-1.1.0.tar.gz (156 kB)
  Using cached yarl-1.0.0.tar.gz (156 kB)
INFO: pip is looking at multiple versions of argon2-cffi-bindings to determine which version is compatible with other requirements. This could take a while.
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell
 us what happened here: https://pip.pypa.io/surveys/backtracking

[BUG] - Rasa Spelling PLAYGROUND

Hi guys & @koaning. Your work is amazing.

I have an issue with the Rasa Spelling playground.

My model was trained with Rasa v2.1.3, using RasaLit v0.1.2.

File "/home/robin/askMona_Projets/1_Projet_NomDeLieux/ArDf/Divers_Tests/TEST_RASA/env/lib/python3.6/site-packages/streamlit/script_runner.py", line 332, in _run_script
    exec(code, module.__dict__)
File "/home/robin/askMona_Projets/1_Projet_NomDeLieux/ArDf/Divers_Tests/TEST_RASA/env/lib/python3.6/site-packages/rasalit/apps/spelling/app.py", line 50, in <module>
    preds = clf.predict_proba(augs)
File "/home/robin/askMona_Projets/1_Projet_NomDeLieux/ArDf/Divers_Tests/TEST_RASA/env/lib/python3.6/site-packages/rasalit/apps/spelling/classifier.py", line 93, in predict_proba
    result.append([ranking_dict[n] for n in self.class_names_])
File "/home/robin/askMona_Projets/1_Projet_NomDeLieux/ArDf/Divers_Tests/TEST_RASA/env/lib/python3.6/site-packages/rasalit/apps/spelling/classifier.py", line 93, in <listcomp>
    result.append([ranking_dict[n] for n in self.class_names_])

KeyError: 'smalltalk residence'
Looks like it doesn't like how I named my intent?

Clustered Views on Training Data

I can imagine a file-selector that allows you to select your nlu.md file as well as a selector for all the intents that you're interested in. From here we could perhaps leverage whatlies to show all sorts of charts on how the intents are perhaps clustered together. It'd be especially awesome if we could apply this clustering on the inputs just before it enters DIET.

Interactive View for Model Output

I can imagine that it is very useful to be able to "play" around with a trained model. I can imagine a streamlit app with a file-selector as well as a text bar. It'd be grand if the user can fill in text and see the output from DIET. Both the intents as well as the entities.

Add quick summary and confidence to nlu visual

@JEM-Mosig made a good point that we should add some standard properties from DIET, such as MODEL_CONFIDENCE and SCALE, to our tool. We currently let you find all the info, but it might be nice to also have some quick info nearby.

config.yml interactive visualisation

I'm thinking that an interactive visualisation of a config.yml file might be a nice next feature for this little library.


The idea is that when you hover over the config.yml different descriptions will light up. We may also zoom in on the features that are being generated. This can be especially useful with the new configuration settings.


Windows and Click>=7.1.0

Somebody on Slack mentioned that Windows users require click>=7.1.0 to get this working. It would be good to investigate with some CI that also checks Windows.

Using live-nlu with models with Custom Components

I'm using Rasa 2.1.2, with the following pipeline in config:

pipeline:
  - name: SpacyNLP
    model: "en_core_web_sm"
  - name: SpacyTokenizer
  - name: custom_components.SimpleNameExtractor
  - name: SpacyEntityExtractor
    dimensions: ["PERSON"] #https://spacy.io/api/annotation#section-named-entities
  - name: SpacyFeaturizer
    pooling: mean
  - name: CountVectorsFeaturizer
    analyzer: "char_wb"
    min_ngram: 1
    max_ngram: 4
  - name: DIETClassifier
    epochs: 100
  - name: EntitySynonymMapper
  - name: ResponseSelector
    epochs: 100

Running rasalit live-nlu doesn't work since it doesn't know where to find custom_components.SimpleNameExtractor. I get the following error:

ComponentNotFoundException: Failed to load the component 'custom_components.SimpleNameExtractor'. Failed to find module 'custom_components'. Either your pipeline configuration contains an error or the module you are trying to import is broken (e.g. the module is trying to import a package that is not installed).

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/rasa/nlu/registry.py", line 121, in get_component_class
    return rasa.shared.utils.common.class_from_module_path(component_name)
  File "/usr/local/lib/python3.6/dist-packages/rasa/shared/utils/common.py", line 19, in class_from_module_path
    m = importlib.import_module(module_name)
  File "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'custom_components'

Is there a way right now for me to point rasalit at my config or my custom class?
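One possible workaround, suggested by the importlib calls in the traceback above, is to make sure the project root is on sys.path before the model is loaded, so that `import custom_components` can resolve. A hedged, stdlib-only sketch of the mechanism (`make_importable` is a hypothetical helper, not part of rasalit):

```python
import importlib
import sys
import tempfile
from pathlib import Path

def make_importable(project_root):
    """Prepend the project root to sys.path so modules referenced in the
    pipeline config (e.g. custom_components.SimpleNameExtractor) can resolve."""
    root = str(Path(project_root).resolve())
    if root not in sys.path:
        sys.path.insert(0, root)

# Demo with a throwaway module standing in for a project's custom components:
project = Path(tempfile.mkdtemp())
(project / "custom_components.py").write_text(
    "class SimpleNameExtractor:\n    pass\n"
)
make_importable(project)
module = importlib.import_module("custom_components")
print(module.SimpleNameExtractor.__name__)
# → SimpleNameExtractor
```

Running the app with the project root on PYTHONPATH achieves the same effect.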

Missing Rasa logo

Expected: I should be able to install into a clean virtual environment and run python -m rasalit nlu-cluster and see the streamlit app

Actual: I was able to load the streamlit app (after pip-installing whatlies into the same env), but I got an error about the missing Rasa logo.

Feedback on `live-nlu` command

This is a place where we might collect feedback for the live-nlu command. Let us know if it helps you or if there's features missing!

Friendlier error message

When you currently point the nlu-demo at a folder with no models, you get a raw error trace.

We can make this experience friendlier. Let's add a proper message there.

missing whatlies dependency

Expected: In a clean virtual environment, install rasalit via instructions from README, and once installed, I could run python -m rasalit nlu-cluster

Actual: Streamlit runs as expected, but the app shows an error about the missing whatlies dependency.

[Spelling/NLU-playground] Support Custom NLU components

My NLU pipeline contains a custom spellchecking component, which is not supported by the rasalit spelling command.

I launched RasaLit from the same project directory from which I normally run my bot with rasa run. The customizations module/directory is located in the project root and is normally picked up by rasa run without any issue. See the project structure below.

cd <project_root>

tree <project_root>
bot ❯ tree
.
├── Dockerfile
├── Makefile
├── actions
│   ├── README.md
| ...
├── config
│   ├── config.yml
│   ├── credentials.yml
│   ├── credentials.yml.tpl
│   ├── domain.yml
│   └── endpoints.yml
├── customizations
│   ├── __init__.py
│   ├── botframework_utils.py
│   └── spellchecker.py
├── data
│   ├── _generated_
| ...
│   ├── core
| ...
│   └── nlu
│     ...
├── evaluation
│   ├── failed_test_stories.yml
| ...

├── models
│   └── bot-nlu-core-model.tar.gz


python -m rasalit spelling --folder models --port 8501

My config.yaml for reference:

language: de

pipeline:
  - name: customizations.spellchecker.SpellChecker
  - name: WhitespaceTokenizer
    intent_tokenization_flag: True
  - name: LanguageModelFeaturizer
    model_name: bert
    model_weights: dbmdz/bert-base-german-uncased
  - name: RegexFeaturizer
  - name: LexicalSyntacticFeaturizer
  ...


Updating to RASA 3.x

It would be nice to update this project to support the newest Rasa version (if it is still being developed).

Make utilities available for Jupyter Users too

Some of the tools we provide are really swell for streamlit, but some of us like to hack. We should also make some of the charts/tools available from Jupyter.

Something like:

from rasalit import load_interpreter
from rasalit.charts import nlu_chart

Feedback on `overview` command

This is a place where we might collect feedback for the overview command. Let us know if it helps you or if there's features missing!
