This project is forked from alessandrojcm/covid19-helper-bot.


A helper WhatsApp bot for COVID-19 related information

License: GNU General Public License v3.0



COVID19 Helper Bot


A WhatsApp bot to help get information about COVID-19, built to participate in the Twilio & Dev 2020 Hackathon.

Using Twilio Autopilot and Python 3.8

Stack

  • Python 3.8.2
  • Poetry
  • FastAPI
    • Pydantic
  • FaunaDB
  • Hosted on Heroku
  • Pytest
  • Loguru
  • Requests
  • Sentry
  • Click

Kickstarted with: https://github.com/Dectinc/cookiecutter-fastapi

Project structure

Files related to the application are in the app and tests directories. The application parts are:

.
├── app
│   ├── api - API routes
│   │   └── routes
│   │       └── autopilot - Twilio Autopilot Dynamic actions
│   ├── core - Critical configuration (sessions, logging, etc)
│   ├── custom_routers - FastAPI custom routes
│   ├── error_handlers - Custom error handlers
│   ├── middlewares - FastAPI/Starlette middleware configuration
│   ├── models - Pydantic models
│   ├── scripts - Helper scripts (mainly cli stuff)
│   ├── services - Utilities for interacting with external APIs and logic too heavy for the routes
│   └── utils - Helper functions
├── assistant - Autopilot schema
├── design_docs - Some diagrams (not too comprehensive)
└── tests - Pytest

How to run

First, install dependencies with Poetry: poetry install.

Integrated CLI

This app comes with an integrated CLI. Activate the environment (with poetry shell) and run python main.py. The options are:

Usage: main.py [OPTIONS] COMMAND [ARGS]...

  COVID19 Whatsapp Bot CLI Helper

Options:
  --help  Show this message and exit.

Commands:
  create-collections  Creates all collections defined in the models folder
  generate-env        Generates a .env file with the default configuration values
  prepare-schema      Replaces the variables in the assistant schema with the values from the environment
  run                 Runs the development server
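The CLI is built with Click (listed in the stack above). A minimal sketch of how such a command group could be wired; the command bodies are placeholders, not the project's actual implementation:

import click

@click.group(help="COVID19 Whatsapp Bot CLI Helper")
def cli():
    """Entry point for the helper commands."""

@cli.command("generate-env")
def generate_env():
    """Write a .env file with the default configuration values."""
    click.echo("Writing .env ...")

@cli.command("run")
def run():
    """Run the development server."""
    click.echo("Starting the dev server ...")

if __name__ == "__main__":
    cli()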

FaunaDB

Next, we need to set up FaunaDB; there are two options:

Use the provided Docker file

  • Spin up the Docker image: docker-compose up.
  • Run the FaunaDB shell: docker-compose exec --user root shell /bin/bash
  • Run: fauna create-database myapp
  • Then, fauna create-key myapp
  • Copy the key that gets generated
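If you want to sanity-check the key, a quick query from Python could look like this (a sketch assuming the faunadb driver; not part of the project's setup steps):

from faunadb import query as q
from faunadb.client import FaunaClient

# Point the client at the local Docker instance and use the key you just copied
client = FaunaClient(
    secret="paste_the_generated_key_here",
    domain="localhost",
    scheme="http",
    port=8443,
)
print(client.query(q.paginate(q.collections())))  # an empty page on a fresh database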

Use FaunaDB from the cloud

Easier, but not really recommended for development.

Environment variables

The default environment variables are defined in app/models/config and are as follows:

API_PREFIX = "/api"
AUTOPILOT_ENDPOINT_PREFIX = "/autopilot" # API Prefix for autopilot endpoints
VERSION = "0.1.0"
DEBUG = False
TESTING = False
PROJECT_NAME = "COVID19 Helper Bot"
LOGGING_LEVEL = LoggingLevels.INFO
ENVIRONMENT = Environments.DEV # STAGING and PRODUCTION are also possible
FAUNA_DB_URL = "http://localhost:8443"
FAUNA_SERVER_KEY = "your_server_key_here"
TWILIO_ENDPOINT = "http://localhost:5000" # Your API's endpoint that will be called by Twilio
NOVELCOVID_JHUCSSE_API_URL = "https://corona.lmao.ninja/v2"
ENDLESS_MEDICAL_API_URL = "https://api.endlessmedical.com/v1/dx"
OUTCOME_THRESHOLD = 0.45  # Confidence level for recommending seeking medical help
FAKE_NUMBER = '+15555555' # Fake number to be used in dev
TWILIO_AUTH_TOKEN # Not needed in dev, can be left as is
SENTRY_DSN # Not needed in dev, can be left as is

The order in which the variables take precedence is:

  1. The ones defined in the OS environment
  2. The ones defined in the .env file (can only be used in dev mode)
  3. The defaults on the config schema
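That precedence matches Pydantic's BaseSettings behaviour; a minimal sketch (pydantic v1 style, field names taken from the list above, class name illustrative rather than the project's actual one):

from pydantic import BaseSettings

class Settings(BaseSettings):
    API_PREFIX: str = "/api"                        # defaults from the config schema
    DEBUG: bool = False
    FAUNA_DB_URL: str = "http://localhost:8443"
    FAUNA_SERVER_KEY: str = "your_server_key_here"

    class Config:
        env_file = ".env"  # read in dev; real environment variables still win

settings = Settings()  # OS environment > .env file > the defaults above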

For development purposes, just generate the .env file with python main.py generate-env.

You need to replace FAUNA_SERVER_KEY with the key obtained when you set up FaunaDB. Also, if you are using the cloud version, you need to replace FAUNA_DB_URL with the URL provided in that step.

Create schema

Once the envs are set up, run python main.py create-collections to generate the FaunaDB collections and indexes.
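Under the hood this boils down to create_collection/create_index calls through the FaunaDB driver; a rough illustration (the collection and index names here are hypothetical, not the ones the command actually creates):

from faunadb import query as q
from faunadb.client import FaunaClient

client = FaunaClient(secret="your_server_key_here",
                     domain="localhost", scheme="http", port=8443)

client.query(q.create_collection({"name": "users"}))
client.query(q.create_index({
    "name": "users_by_phone",
    "source": q.collection("users"),
    "terms": [{"field": ["data", "phone"]}],
    "unique": True,
}))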

Running

Finally, you can run the server with python main.py run. Docs available under /docs.
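The run command is essentially a wrapper around uvicorn; a plausible sketch (the app.main:app module path and the port are assumptions, not confirmed by this README):

import uvicorn

if __name__ == "__main__":
    # Reload on changes; port 5000 matches the default TWILIO_ENDPOINT above
    uvicorn.run("app.main:app", host="0.0.0.0", port=5000, reload=True)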

Generate the Autopilot Assistant

The Autopilot Assistant Schema for this bot lives in assistant/schema.json.

Refer to Twilio docs for how to create an assistant with the CLI.

Once that's done, replace TWILIO_ENDPOINT with your URL and run python main.py prepare-schema. This command looks for occurrences of %TWILIO_ENDPOINT%%API_PREFIX%%AUTOPILOT_ENDPOINT_PREFIX% in the schema and replaces them with the values defined in the environment (or in the .env file if in dev mode).
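The substitution itself is a simple text replacement; a hedged sketch of what prepare-schema does (the real command may handle files and defaults differently):

import os
from pathlib import Path

schema_path = Path("assistant/schema.json")
endpoint = (
    os.environ["TWILIO_ENDPOINT"]
    + os.environ.get("API_PREFIX", "/api")
    + os.environ.get("AUTOPILOT_ENDPOINT_PREFIX", "/autopilot")
)
schema = schema_path.read_text()
schema = schema.replace("%TWILIO_ENDPOINT%%API_PREFIX%%AUTOPILOT_ENDPOINT_PREFIX%", endpoint)
schema_path.write_text(schema)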

Then, open the after_deploy.sh script and replace the --unique-name flag's value with your assistant's unique name. Run the script with bash to update your assistant with the schema (needs yarn installed). This script is a bit rough around the edges, since it was written ad hoc for CI/CD. In summary, the script prepares the schema with the prepare-schema command described above (so you don't need to run that yourself if you use after_deploy) and uploads the schema to Twilio using its CLI.

Running locally

In dev mode, the API injects the FAKE_NUMBER env into all incoming Autopilot requests; this way you can test your API locally from the Twilio Simulator using a tunnel such as ngrok.
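As an illustration only (the real project wires this through its middlewares; the route, dependency and flag below are made up), the substitution could look like this:

from fastapi import Depends, FastAPI, Form

app = FastAPI()
FAKE_NUMBER = "+15555555"
DEV_MODE = True  # would come from the ENVIRONMENT setting in the real app

def sender_number(UserIdentifier: str = Form("")) -> str:
    # In dev, ignore whatever the Simulator sends and use the fake number
    return FAKE_NUMBER if DEV_MODE else UserIdentifier

@app.post("/api/autopilot/greeting")
async def greeting(number: str = Depends(sender_number)):
    # Minimal Autopilot-style JSON response; requires python-multipart for Form parsing
    return {"actions": [{"say": f"Hello {number}"}]}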

Deployment

This API comes with a Procfile ready to be deployed on Heroku; the only extra configuration you need is to add this buildpack so that Heroku supports Poetry. To run, it will require the following envs to be present:

  • FAUNA_SERVER_KEY
  • TWILIO_AUTH_TOKEN
  • SENTRY_DSN

Tests

The tests are by no means complete, but you can run them anyway with pytest. The caveat is that, FaunaDB being so new, there is not much info on how to mock it, so FAUNA_SERVER_KEY needs to be present (use Docker for this).
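A hypothetical smoke test showing how the suite can be exercised with FastAPI's TestClient (the import path and route are assumptions; FAUNA_SERVER_KEY must point at a running FaunaDB, e.g. the Docker one):

from fastapi.testclient import TestClient
from app.main import app  # assumed application entry point

client = TestClient(app)

def test_docs_are_served():
    response = client.get("/docs")
    assert response.status_code == 200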

Information

All statistical information comes from the Novelcovid API, which in turn takes its data from the Johns Hopkins University repository.

The Endless Medical API is used to analyse the user's symptoms. This analysis only serves as guidance and does not replace a doctor's diagnosis.

Acknowledgments

Huge thanks to the team at Endless Medical for answering my inquiries and being so helpful overall. And thanks to all the friends and family who put up with my annoying "could you please text the bot?"
