Inference Repository for the bitou model - deployment
- Ensure that you have Docker installed on your computer
- Only once: build the image defined in the `Dockerfile`
    - Either through `docker compose build flask-deploy`. This will build the image with the name `bitouflask`
    - Or through running `docker build --target prod -t bitouflask .` in the command line
- Deploy the image with the name `bitouflask` (which should show up as built when running `docker images`)
    - Either through `docker compose up flask-deploy`
    - Or through `docker run -p 5000:5000 bitouflask`
- Open a private browser window (otherwise the HTML content will be cached and updates will not be displayed correctly) and enter `127.0.0.1:5000`
- Select an image and click "upload" to show inference results
- Building the image (steps 1 and 2) is only necessary when definitions have changed; otherwise start at step 3 to use the local image
- Building the image will take a significant amount of time (~45 minutes) and hard drive space (~8 GB)
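The `--target prod` and `--target dev` flags above imply a multi-stage `Dockerfile`. A hypothetical sketch of what such a file could look like is given below — the base image, file layout, and commands are assumptions, not the repository's actual `Dockerfile`:

```dockerfile
# Shared base stage: install dependencies and copy the application.
FROM pytorch/pytorch:latest AS base
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .

# Production target (`--target prod`): run the Flask app directly.
FROM base AS prod
EXPOSE 5000
CMD ["python", "app.py"]

# Development target (`--target dev`): wait for a debugger to attach
# before starting the app (see the debugging section below).
FROM base AS dev
RUN pip install debugpy
EXPOSE 5000 5678
CMD ["python", "-m", "debugpy", "--listen", "0.0.0.0:5678", "--wait-for-client", "app.py"]
```

With stages named like this, `docker build --target prod -t bitouflask .` builds only the `base` and `prod` stages, skipping `dev`.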
The package can also be installed using conda on Ubuntu Linux 22.04.

- Run `conda env create -f csuinf.txt` to create the environment
- Run `conda activate csuinf` to activate the environment. This has Flask and all necessary packages installed
- Download / symlink the files `model.py` and `utils.py` from the bitou segmentation repository
    - Download `model.py` and place it in `src/csuinf`
    - Download `utils.py` and place it in `src/csuinf`
- Install the local files into the conda environment by running `pip install -e .`
- Download the `colour_code.json` and place it in `config`
- Run `app.py` by running `python app.py`. Alternatively, use the default flask config that is targeted
- Access `127.0.0.1:5000` in a local private browser window and upload an image. The uploaded images can be found in the `images` directory and are persistent!
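The upload flow described above can be sketched as a minimal Flask app. This is a hypothetical skeleton, not the repository's actual `app.py`: the route names, form field name, and the inference step are assumptions — the real app additionally runs the model and renders the results:

```python
import os
from flask import Flask, request

# Uploaded images are stored on disk, which is why they persist
# in the images/ directory between runs.
UPLOAD_DIR = "images"
os.makedirs(UPLOAD_DIR, exist_ok=True)

app = Flask(__name__)

@app.route("/", methods=["GET"])
def index():
    # The real app serves an HTML template; a bare form suffices here.
    return (
        '<form method="post" enctype="multipart/form-data" action="/upload">'
        '<input type="file" name="image"><input type="submit" value="upload">'
        "</form>"
    )

@app.route("/upload", methods=["POST"])
def upload():
    f = request.files["image"]
    f.save(os.path.join(UPLOAD_DIR, f.filename))
    # The real app would run model inference on the saved image here
    # and render the segmentation result.
    return f"saved {f.filename}"
```

Starting it with `python app.py` (or `flask run`) serves the form on port 5000, matching the `127.0.0.1:5000` address used throughout.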
A debugging configuration is provided that allows running the model inside a Docker container and attaching to it through a VS Code debugger.

- Build the `dev` configuration of the container
    - Either through `docker compose build flask-debug`
    - Or through `docker build --target dev -t dev/pytorchflask .`
- Deploy the container through running `docker compose up flask-debug` - this will ensure correct port forwarding
- Place a breakpoint in `app.py` in VS Code
- Use the launch configuration `Python: Remote Attach` defined in `launch.json` to run `app.py`
- Open a private browser window, enter `127.0.0.1:5000` and follow the desired commands - VS Code should stop at the breakpoint
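A `Python: Remote Attach` entry in `launch.json` typically looks like the sketch below. The debug port (5678, debugpy's default) and the in-container path `/app` are assumptions and must match what the repository's actual container and `launch.json` use:

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: Remote Attach",
            "type": "python",
            "request": "attach",
            "connect": { "host": "localhost", "port": 5678 },
            "pathMappings": [
                { "localRoot": "${workspaceFolder}", "remoteRoot": "/app" }
            ]
        }
    ]
}
```

The `pathMappings` entry is what lets breakpoints set in local files bind to the corresponding files running inside the container.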
The following resources were used:

- This basic one from the PyTorch development team, on their website
- This one from Docker itself on how to run a Flask application inside Docker, on git
- With a tutorial to go alongside in the Docker blog
- This one from an AI engineer on Medium
- This one from another engineer, which also includes Docker, on his website
- This one from some guy who already wrote a git for this stuff, on GitHub
- This better GitHub version of a Dockerfile, on GitHub
- Something about Docker port mapping and exposing here
- This one about uploading images with Flask to a website, from some guy on YouTube
- Image upload and display without saving to storage - on GitHub
- Image upload and display with saving to storage - a blog post
- Debugging Flask and Python with VS Code
- Displaying an image in Flask with Python, on Stack Overflow
- Debugging Docker and Flask inside VS Code in this Medium post
- Dockerizing and debugging Flask and VS Code with Docker in this blog post