This project is based on Rasa NLU. It is used to predict intents for our core server based on natural language input. For more details on the core project, see KFZ-Core or Core. The main part of this project are the intents: training data used to train the NLU so that it can understand and predict user intents.
- intents contains the training data; each intent has its own file to keep them separate.
- config.yaml describes the configuration of the NLU, including information about data processing and entity extraction.
- models contains the trained models; these are generated automatically at runtime.
- tests contains example test files for some intents.
- train_nlu.py is the script that trains a model from all given intents.
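As a rough sketch, a file in the intents directory might look like the following (this assumes Rasa NLU's markdown training data format; the intent name and example phrases are made up for illustration):

```md
## intent:greet
- hallo
- guten tag
- hi, wie geht's?
```

Each `##` section names one intent, and each list item is one example utterance the model is trained on.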
First, install the requirements from the provided 'requirements.txt' by running the following command. If you want more information about the installation, please read NLU Installation.
pip install -r requirements.txt
Furthermore, you need to install the language-specific spaCy package, which cannot be installed via 'requirements.txt'. You can use the following command to install this dependency:
python -m spacy download de
In order to run the NLU, you need to fill the intents directory with some training data. After that you can train (generate) a model with the following command:
python src/train_nlu.py
After generating a model, you need to start the HTTP server so the core can interact with the NLU. This is done with the following command:
python -m rasa_nlu.server --path models/ -c config.yaml
To deploy and run this project, Docker is mandatory; you need to install Docker as well as Docker Stack or Docker Compose. You can build the image with the following command, tagging it with our Docker registry URL and the project's name.
docker build -t docker.nexus.gpchatbot.archi-lab.io/chatbot/nlu .
In order to run the NLU, you need to create a Docker network called 'chatbot'. You can do this by running the following command:
docker network create chatbot
The project can then be started with the following command:
docker-compose -f docker/docker-compose.yaml up -d
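For orientation, a compose file wired up for this setup might look roughly like the sketch below. This is an assumption, not the actual contents of docker/docker-compose.yaml: the service name, the exposed port, and the external network are illustrative.

```yaml
# Hypothetical sketch of docker/docker-compose.yaml; consult the real file.
version: "3"
services:
  nlu:
    # Image built and tagged in the build step above.
    image: docker.nexus.gpchatbot.archi-lab.io/chatbot/nlu
    ports:
      - "5000:5000"
    networks:
      - chatbot
networks:
  chatbot:
    # Refers to the 'chatbot' network created beforehand.
    external: true
```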
If you want to test the NLU automatically, add test scripts to the tests directory and run the following command:
py.test
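A minimal test sketch is shown below. The helper `top_intent` is hypothetical, and the sample response assumes the server's prediction data contains an "intent" object with "name" and "confidence" fields; adapt it to the actual API output.

```python
# Sketch of a pytest-style test for the tests directory.
# top_intent and the sample response shape are assumptions for illustration.

def top_intent(response: dict) -> str:
    # Extract the predicted intent name from a parsed server response.
    return response["intent"]["name"]

def test_top_intent():
    # Made-up example response for the German greeting "hallo".
    sample = {"text": "hallo", "intent": {"name": "greet", "confidence": 0.95}}
    assert top_intent(sample) == "greet"
```

Running `py.test` will pick up any function whose name starts with `test_`.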
If you want to test the NLU yourself, you can send an HTTP request to the NLU server's API. It will respond with its prediction data.
curl -XPOST localhost:5000/parse -d '{"q":"hallo"}'
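The response is JSON; as a rough sketch it might look like the following (the intent name and confidence value are illustrative, not actual server output):

```json
{
  "intent": {"name": "greet", "confidence": 0.95},
  "entities": [],
  "text": "hallo"
}
```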
Further information about the status and configuration of the NLU server can be retrieved by querying /status or /config.