==================================
isales
==================================
This project provides an automated, near-real-time service that predicts a HubSpot contact's probability of buying exchange programs. At its core, the project uses an XGBoost model trained on HubSpot records of lead buying patterns.
To receive data from HubSpot, this project relies on a Google Cloud Function that works as a producer, sending events to a Redis queue. The function is subscribed to HubSpot through a webhook service, which triggers it every time a contact is created or modified in HubSpot. The project's collector module then reads the Redis subscription queue, fetches the contact's main information from the HubSpot database, and sends this data to the preprocessor module through another Redis queue. Next, the preprocessor module enriches the fetched data for the prediction stage. After that, the model module predicts the contact's probability of buying exchange programs using an XGBoost machine learning model.
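The hand-off between the webhook producer and the collector can be sketched as a small helper that turns a HubSpot webhook event into the message placed on the Redis queue. The queue key and the function names here are illustrative assumptions, not the project's actual API; the payload fields (`objectId`, `subscriptionType`, `occurredAt`) follow HubSpot's webhook event format.

```python
import json

# Hypothetical queue key -- the real name used by the Cloud Function
# and the collector module may differ.
QUEUE_KEY = "isales:contacts"


def build_queue_message(webhook_event: dict) -> str:
    """Reduce a HubSpot webhook event to the fields the collector needs
    in order to fetch the full contact record later."""
    return json.dumps({
        "contact_id": webhook_event["objectId"],
        "change_type": webhook_event["subscriptionType"],
        "occurred_at": webhook_event["occurredAt"],
    })


def publish(redis_client, webhook_event: dict) -> None:
    """Push the event onto the Redis list the collector reads from.

    `redis_client` is expected to expose redis-py's `rpush` interface.
    """
    redis_client.rpush(QUEUE_KEY, build_queue_message(webhook_event))
```

The collector on the other side would block on the same list (e.g. with `blpop`) and use `contact_id` to query the HubSpot API for the contact's properties.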
Finally, the predictions generated by the ML model are sent to the HubSpot database by the sender module. At the end of the day, these predictions help sales representatives decide which contacts to invest their time in.
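The sender's write-back step can be sketched as building the body of a HubSpot CRM v3 batch contact-update request, storing each probability in a custom contact property. The property name `buy_probability` and the helper itself are assumptions for illustration; the project's real property and code may differ.

```python
def build_batch_update(predictions: dict) -> dict:
    """Map {contact_id: probability} to the body of a HubSpot CRM v3
    batch-update request (POST /crm/v3/objects/contacts/batch/update).

    `buy_probability` is a hypothetical custom contact property that
    would hold the model's output.
    """
    return {
        "inputs": [
            {
                "id": str(contact_id),
                "properties": {"buy_probability": round(probability, 4)},
            }
            for contact_id, probability in predictions.items()
        ]
    }
```

The sender would POST this body to the HubSpot API with the current access token; sales representatives can then sort or filter contacts by the property inside HubSpot.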
- Optimize HubSpot sales representatives' time.
- Ensure the machine learning model's ability to generate useful, real-life insights.
- HubSpot webhook API subscription.
- A Google Cloud Function or an AWS Lambda function. The function code can be found here.
- A Redis server.
- A cron job to refresh the HubSpot API token every 6 hours.
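The 6-hour refresh in that cron job boils down to a HubSpot OAuth refresh-token request. A minimal sketch of building that request with the standard library (credential values are placeholders; the endpoint is HubSpot's documented `/oauth/v1/token`):

```python
import urllib.parse
import urllib.request


def build_refresh_request(client_id: str, client_secret: str,
                          refresh_token: str) -> urllib.request.Request:
    """Build the POST request that exchanges a refresh token for a new
    HubSpot access token."""
    body = urllib.parse.urlencode({
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }).encode()
    return urllib.request.Request(
        "https://api.hubapi.com/oauth/v1/token",
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
```

The cron job would send this request, parse the `access_token` from the JSON response, and store it wherever the services read their token from (e.g. the `env` file used below).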
- Clone this repository:
$ git clone [email protected]:gPass0s/isales.git
- Access the project root folder:
$ cd isales
- Rename the environment file:
$ mv .env.template env
- Dockerize:
$ docker build --no-cache -t isales:latest -f Dockerfile .
- Run collector service:
$ docker run -d -it --name collector --network host --env-file env isales:latest python isales run collector
- Run preprocessor service:
$ docker run -d -it --name preprocessor --network host --env-file env isales:latest python isales run preprocessor
- Run model service:
$ docker run -d -it --name model --network host --env-file env isales:latest python isales run model
- Run sender service:
$ docker run -d -it --name sender --network host --env-file env isales:latest python isales run sender
- Install dependencies and activate the virtual environment:
$ pipenv install && pipenv shell
- Run tests:
$ pipenv run python -m pytest -vv tests/