
AWS ML@Edge with NVIDIA Jetson Nano/Xavier/TX2

The goal of this workshop is to demonstrate how to deploy an image classification model (ResNet-18) onto NVIDIA Jetson devices using AWS IoT Greengrass. The deployed edge application (an IoT Greengrass Lambda function) on the Jetson will classify images into ImageNet classes.

Link to the Jetson image used for this repo: https://drive.google.com/file/d/1xCARf2FUwZ2hrzIYLfHc_SdtPabxs7VS/view?usp=sharing or http://d2izi96gg1mdrw.cloudfront.net/jetson/nano/awsnv_nano.img.zip

OR, if you don't want to flash a fresh image, you can manually install the prerequisites by following: https://public-ryan.s3.amazonaws.com/jetson/nano/AWS_NVIDIA_Training_Jetson_Setup_Guide.pdf

Here are the steps we will follow:

Step 1: Download trained model

Step 2: Deploy model on Jetson using AWS IoT Greengrass

Step 3: Test the model inference on AWS IoT console

Let's start with Step 1:

Step 1: Download trained model

In this lab we will use a sample model provided by SageMaker Neo. The model is ResNet-18 and performs general-purpose image classification (based on ImageNet). The model has already been compiled with SageMaker Neo for the Jetson platform.

Download the model from the link provided, then upload the model to your own S3 bucket. This bucket will be used in step 2.4 below.
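If you prefer to script the upload instead of using the S3 console, a minimal boto3 sketch is shown below; the bucket and file names are placeholders, and your AWS credentials are assumed to be configured already:

# upload_model.py - minimal sketch; bucket and file names below are placeholders
import boto3

BUCKET = "your-model-bucket"                 # placeholder: your own S3 bucket
MODEL_FILE = "resnet18-neo-jetson.tar.gz"    # placeholder: the downloaded Neo-compiled model archive

s3 = boto3.client("s3")
s3.upload_file(MODEL_FILE, BUCKET, MODEL_FILE)
print("Uploaded s3://{}/{}".format(BUCKET, MODEL_FILE))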

Step 2: Deploy model on Jetson using AWS IoT Greengrass

This step covers:

  • 2.0 Python 3.7 setup
  • 2.1 Installing SageMaker Neo runtime
  • 2.2 Installing AWS IoT Greengrass
  • 2.3 Setup and configure Inference code using AWS Lambda
  • 2.4 Set machine learning at edge deployment
  • 2.5 Deploy machine learning at edge on NVIDIA Jetson
  • 2.6 Run model, check inference

2.0 Python 3.7 setup

AWS IoT Greengrass currently supports Lambda functions with Python 2.7 or 3.7, but not 3.6. The Jetson image ships with Python 3.6, so you can do one of two things:

  1. Install Python 3.7 in parallel to 3.6
sudo apt-get install -y python3.7
sudo apt-get install -y python3.7-dev
sudo apt-get install -y python3.7-venv

# the following commands need to be run as root


python3.7 -m pip install setuptools==3.7.1
python3.7 -m pip install setuptools --upgrade
python3.7 -m pip install Cython
python3.7 -m pip install numpy

# the numpy install may take 5-10 minutes; while waiting you can open another terminal and move on to step 2.2
  2. OR create a symlink so that python3.7 points to the existing python3.6
sudo ln -s /usr/bin/python3.6 /usr/bin/python3.7

2.1 Installing SageMaker Neo runtime

The SageMaker Neo runtime, aka SageMaker Neo DLR, is a runtime library for running models compiled with SageMaker Neo on the target hardware. In the model training workflow, the last step is to compile the model using SageMaker Neo. In the following steps we will install the SageMaker Neo runtime. Because there is no official wheel for JetPack 4.4 just yet, we will install an unofficial pre-built wheel to save the time of compiling it yourself.

sudo su

tar xzvf neo-prebuilt.tgz
python3.7 -m pip install neo-ai-dlr/python/dist/dlr-1.2.0-py3-none-any.whl

If you created the symlink from python3.7 to python3.6, you can run this instead:

sudo pip3 install neo-ai-dlr/python/dist/dlr-1.2.0-py3-none-any.whl
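To sanity-check the install, you can verify that the runtime imports under the interpreter you installed it for. A minimal sketch (run it with python3.7, or python3 if you used the symlink); the model path and input name in the commented part are assumptions:

# dlr_check.py - quick check that the SageMaker Neo runtime (DLR) installed correctly
import dlr
import numpy as np

print("DLR imported from:", dlr.__file__)

# Optional: once a Neo-compiled model is extracted locally, it can be loaded like this:
# from dlr import DLRModel
# model = DLRModel("/path/to/extracted/model", "gpu")     # placeholder path
# out = model.run({"data": np.zeros((1, 3, 224, 224), dtype="float32")})  # input name "data" is an assumption
# print("Output shape:", out[0].shape)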

2.2 Installing AWS IoT Greengrass

Open terminal on your Jetson device

Run the following commands on your Jetson to create the Greengrass user and group:

sudo adduser --system ggc_user
sudo addgroup --system ggc_group
sudo usermod -a -G video ggc_user

Set up your AWS account and Greengrass group (more info here: https://docs.aws.amazon.com/greengrass/latest/developerguide/gg-config.html). In the AWS IoT Greengrass console:

  • Set up Greengrass: select the default options
  • Set up a new Greengrass group
  • Attach role: create a new basic AWS Lambda role or use an existing role
  • Disable streaming
  • Create a new Greengrass core
  • Download the certs

After downloading the unique security keys created in this step to your Jetson, proceed to the step below. If you created and downloaded these keys on a machine other than the Jetson, you will need to copy them to the Jetson; you can use SCP to transfer the files from your local machine.

Download the AWS IoT Greengrass Core Software (1.10.2 or latest) for ARMv8 (aarch64):

(please see latest version here https://docs.aws.amazon.com/greengrass/latest/developerguide/what-is-gg.html#gg-core-download-tab)

wget https://d1onfpft10uf5o.cloudfront.net/greengrass-core/downloads/1.10.2/greengrass-linux-aarch64-1.10.2.tar.gz

Extract Greengrass core and your unique security keys on your Jetson:

sudo tar -xzvf greengrass-linux-aarch64-1.10.2.tar.gz -C /
sudo tar -xzvf <hash>-setup.tar.gz -C /greengrass   # these are the security keys downloaded while setting up greengrass

Download AWS ATS endpoint root certificate (CA):

cd /greengrass/certs/
sudo wget -O root.ca.pem https://www.amazontrust.com/repository/AmazonRootCA1.pem

Start greengrass core on your Jetson:

cd /greengrass/ggc/core/
sudo ./greengrassd start

You should get a message in your terminal "Greengrass successfully started with PID: xxx"

2.3 Setup and configure inference code using AWS Lambda

(optional: more info on AWS Greengrass Lambda https://docs.aws.amazon.com/greengrass/latest/developerguide/create-lambda.html)

Go to AWS Management console and search for Lambda

Click 'Create function'

Choose 'Author from scratch'

Name the function, e.g. jetson-workshop. Role: choose an existing role. (Note: you may need to create a new role with basic execution permissions; the default is fine.)

Please choose the Python 3.7 runtime.

Click 'Create function' with the default code. Once the AWS Lambda function is created, open it again and upload lambda.zip from this repo (you will need to download lambda.zip to your local machine first).

In Basic settings, change the handler to "inference-lambda.lambda_handler".

Publish the AWS Lambda function.

[optional] You can open inference-lambda.py and get familiar with the code. It uses test.jpg at line #59; this test image is used by the AWS Lambda function as input for the ML model.
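For orientation, here is a simplified sketch of what such a Greengrass inference Lambda looks like. This is not the repo's exact code: the model path, input tensor name, and preprocessing are assumptions; the topic "neo-detect" and test.jpg come from the steps in this guide.

# simplified sketch of a Greengrass inference Lambda (not the repo's exact inference-lambda.py)
import json
import numpy as np
from PIL import Image
from dlr import DLRModel
import greengrasssdk

iot_client = greengrasssdk.client("iot-data")
TOPIC = "neo-detect"                    # topic routed to IoT Cloud by the subscription in step 2.5
MODEL_PATH = "/trained_model"           # assumption: local destination path of the ML resource from step 2.4

model = DLRModel(MODEL_PATH, "gpu")     # load the Neo-compiled ResNet-18 once, at container start

def lambda_handler(event, context):
    # preprocess the bundled test image: resize to 224x224, convert to NCHW float32
    img = Image.open("test.jpg").resize((224, 224))
    x = np.asarray(img).astype("float32").transpose((2, 0, 1))[np.newaxis, :]

    # run inference and publish the top-1 ImageNet class index to AWS IoT
    scores = model.run({"data": x})[0]  # input name "data" is an assumption
    top1 = int(np.argmax(scores))
    iot_client.publish(topic=TOPIC, payload=json.dumps({"top1_class": top1}))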

2.4 Set up machine learning at edge deployment

  • In Memory, set it to 700 MB+
  • In Resources, add the ML model as below: select the S3 bucket where the optimized (i.e. SageMaker Neo compiled) model is located. Select the bucket first from the dropdown box and then the model file.

  • Set up the Greengrass role: go to the "Settings" menu in the left menu items; this opens the Greengrass settings. In the top section that says "Group Role", select the Greengrass service role. Then go to the AWS IAM console, go to Roles, select the Greengrass role and add "AmazonS3FullAccess", "CloudWatchFullAccess" and "AWSGreengrassResourceAccessRolePolicy" (see the screenshot below for reference, and the scripted alternative sketched after this list).

  • Set up Greengrass logs: under "Settings", scroll down and you will see the option to set the log level. Set Greengrass and Lambda logs to Info-level logging, per the screenshot below.
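If you prefer to attach the managed policies from a script rather than clicking through the IAM console, a hedged boto3 sketch follows; the role name is a placeholder, and the policy ARNs are the standard AWS managed policies named above (verify the exact ARNs in the IAM console if in doubt):

# attach_greengrass_policies.py - minimal sketch; the role name below is a placeholder
import boto3

GREENGRASS_ROLE_NAME = "my-greengrass-group-role"    # placeholder: the role selected as the group role

POLICY_ARNS = [
    "arn:aws:iam::aws:policy/AmazonS3FullAccess",
    "arn:aws:iam::aws:policy/CloudWatchFullAccess",
    "arn:aws:iam::aws:policy/service-role/AWSGreengrassResourceAccessRolePolicy",
]

iam = boto3.client("iam")
for arn in POLICY_ARNS:
    iam.attach_role_policy(RoleName=GREENGRASS_ROLE_NAME, PolicyArn=arn)
    print("Attached", arn)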

2.5 Deploy machine learning at edge on NVIDIA Jetson

  • Make sure Greengrass is started (see step 2.2)

  • Go back to AWS IoT Greengrass console
  • We need to send messages from the NVIDIA Jetson to the cloud, so we set up a message subscription per the screenshot below. Choose the "Subscriptions" menu from the left menu items, choose your Lambda function as the source and "IoT Cloud" as the destination, with the topic used in the Lambda code, i.e. "neo-detect". This routes messages from the Lambda to IoT Cloud, i.e. AWS IoT.

  • Now we are ready to deploy the model, Lambda and configuration.
  • From the Actions menu at the top right, select "Deploy".
  • It will take a few minutes to download and deploy the model.

2.6 Check inference

  • Select Test from the left menu

  • Enter "#" as the subscription topic and click Subscribe. This subscribes to all IoT topics coming from the Jetson.

  • In the subscription box you will start seeing IoT messages coming from your Jetson device.
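If you would rather watch the messages from a script instead of the AWS IoT console, here is a hedged sketch using the AWS IoT Device SDK for Python (AWSIoTPythonSDK). The endpoint, certificate/key file names, and client ID are placeholders, and the certificate must belong to an IoT identity whose policy allows subscribing:

# subscribe_test.py - minimal sketch; endpoint, cert/key paths and client ID are placeholders
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient

ENDPOINT = "xxxxxxxxxxxxxx-ats.iot.us-east-1.amazonaws.com"   # placeholder: your AWS IoT endpoint
ROOT_CA = "root.ca.pem"
PRIVATE_KEY = "device.private.key"                            # placeholder key/cert file names
CERTIFICATE = "device.cert.pem"

def on_message(client, userdata, message):
    # print every message arriving on the subscribed topic
    print(message.topic, ":", message.payload)

mqtt = AWSIoTMQTTClient("jetson-workshop-monitor")
mqtt.configureEndpoint(ENDPOINT, 8883)
mqtt.configureCredentials(ROOT_CA, PRIVATE_KEY, CERTIFICATE)
mqtt.connect()
mqtt.subscribe("neo-detect", 1, on_message)                   # or "#" to see all topics

input("Listening for messages from the Jetson... press Enter to exit\n")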

2.7 Troubleshooting

  • Error logs are recorded in CloudWatch; you can log into AWS CloudWatch and check for Greengrass errors.
  • Lambda user error logs on the device are located under /greengrass/ggc/var/log/user, then your region and account number; there you will see a log file named after your Lambda, e.g. inference-lambda.log.
  • Greengrass system logs are on the device under /greengrass/ggc/var/log/system. There are many logs; runtime.log is the most important.
  • If you get any error related to the camera buffer, run "sudo systemctl restart nvargus-daemon" to restart the related process.
  • To start and stop Greengrass, cd to /greengrass/ggc/core, then run ./greengrassd start to start and ./greengrassd stop to stop.

Summary

In this lab we have completed:

  1. Set up and configured AWS IoT Greengrass
  2. Deployed the inference Lambda function and model on the NVIDIA Jetson
  3. Tested the model inference data using the AWS IoT dashboard
