
Implementing a GPU container on Jetson Nano, an example: AI/DL Hand pose recognition problem

Updated: Jun 18, 2023

Lab Setup

To show the steps, we will use an NVIDIA GPU Cloud (NGC) container. NGC is the hub for optimized GPU containers for developing AI and HPC applications. The case study we have chosen is hand gesture detection; here we will only be doing inference. Two sources were very helpful for this project, and they are listed in References.

What are the Benefits of a GPU Container?

  • Saves time and effort as there are no version incompatibility issues amongst tools

  • Pre-tested and hence easy to deploy

  • Optimized, so inference runs smoothly even on a resource-constrained platform like the Jetson

NVIDIA Jetson Nano

The configuration used in this lab is as follows:

  • Device 0: "NVIDIA Tegra X1" - Maxwell GPU

  • CUDA Driver Version / Runtime Version is 10.2 / 10.2

  • CUDA Compute Capability is 5.3

  • Total amount of global memory is 1972 MBytes

  • 1 Multiprocessor, 128 CUDA Cores/MP, totalling 128 CUDA Cores

To know more about the Jetson Nano, refer to NVIDIA's Jetson Nano documentation.
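The figures above can be reproduced on the device itself. A minimal check, assuming the CUDA samples shipped with JetPack are present at their default location (the path may differ across JetPack versions):

cd /usr/local/cuda/samples/1_Utilities/deviceQuery
sudo make
./deviceQuery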

Use the following commands to install Docker (they follow the official Docker installation guide for Ubuntu):

sudo apt-get install ca-certificates curl gnupg
sudo install -m 0755 -d /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
sudo chmod a+r /etc/apt/keyrings/docker.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
sudo docker run hello-world
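On the Jetson, the NVIDIA container runtime normally comes pre-installed with JetPack. A quick, hedged check that Docker can see it (the exact output format varies between Docker versions):

sudo docker info | grep -i runtime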

Some useful Docker commands

To list running containers:

sudo docker ps

To stop a running container:

sudo docker stop <container-ID>

(the container ID is taken from the output of the previous command)
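For example, if docker ps reports a (hypothetical) container ID of a1b2c3d4e5f6, that container is stopped with:

sudo docker stop a1b2c3d4e5f6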

Setting up the GPU Container

Prepare the Jetson Nano with the latest JetPack SD card image and let the Jetson run in standalone, headless mode.

  • Pull the l4t-ml GPU container.

  • Run the container and log in to the Jupyter Notebook server running on the Jetson Nano. From any system on the network, point the browser to https://IP-address-of-Jetson:8888/ ; this also gives you terminal access to the Jetson. A sketch of the pull and run commands follows this list.
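A minimal sketch of these two steps, assuming a JetPack 4.x / L4T r32.x image; the exact l4t-ml tag must match your L4T release (check the NGC catalogue), and the bind-mounted /home/l4t-data directory and the /dev/video0 camera device are assumptions that should be adjusted to your setup:

sudo docker pull nvcr.io/nvidia/l4t-ml:r32.7.1-py3
sudo docker run -it --rm --runtime nvidia --network host --device /dev/video0 -v /home/l4t-data:/home/l4t-data nvcr.io/nvidia/l4t-ml:r32.7.1-py3

The --runtime nvidia flag gives the container access to the GPU, and --network host lets the Jupyter server on port 8888 be reached from other machines on the network.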

To list all installed packages, type the following from within a container shell:

pip3 list

Install the TRT Pose related dependencies on the Jetson using the terminal

Follow the steps in the GETTING STARTED section of the trt_pose_hand repository. The most important step is detecting the camera; a quick check is shown below.
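A quick way to confirm that a USB camera is visible before installing anything (assuming it enumerates as a /dev/video* device):

ls /dev/video*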

git clone https://github.com/NVIDIA-AI-IOT/jetcam 
cd jetcam
python3 setup.py install
[Finished processing dependencies for jetcam==0.0.0]
git clone https://github.com/NVIDIA-AI-IOT/torch2trt 
cd torch2trt
python3 setup.py install
[Finished processing dependencies for torch2trt==0.4.0]
pip3 install tqdm cython pycocotools
[Successfully installed importlib-resources-5.4.0 pycocotools-2.0.6 tqdm-4.64.1]
cd /home/l4t-data
git clone https://github.com/NVIDIA-AI-IOT/torch2trt 
cd torch2trt
python3 setup.py install --plugins
[Finished processing dependencies for torch2trt==0.4.0]
cd /home/l4t-data
git clone https://github.com/NVIDIA-AI-IOT/trt_pose
cd /home/l4t-data/trt_pose
python3 setup.py install
[Finished processing dependencies for trt-pose==0.0.1]
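A quick, hedged sanity check that the three packages installed correctly (module and class names as used in their respective repositories):

python3 -c "import torch2trt, trt_pose; from jetcam.usb_camera import USBCamera; print('OK')"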

Go to https://github.com/NVIDIA-AI-IOT/trt_pose_hand; the README also has the download link for the pre-trained model used below. Clone the project:

cd /home/l4t-data
git clone https://github.com/NVIDIA-AI-IOT/trt_pose_hand.git 
[project files will get downloaded]

Download the pre-trained ResNet18 model (81 MB), 'hand_pose_resnet18_att_244_244.pth', for hand-pose estimation and upload it into /home/l4t-data/trt_pose_hand/model.
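For example, assuming the .pth file was downloaded to the host's Downloads folder and /home/l4t-data is the bind-mounted directory from the run command above, it can be copied into place with:

cp ~/Downloads/hand_pose_resnet18_att_244_244.pth /home/l4t-data/trt_pose_hand/model/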

Before starting the hand-pose demo, first check whether the USB camera is working.

Run the /home/l4t-data/jetcam/notebooks/usb_camera/usb_camera.ipynb file in Jupyter Notebook to check the camera.
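A minimal sketch of what that notebook does, using the jetcam USBCamera class (device index 0 and the 224x224 capture size are assumptions; adjust them to your camera):

from jetcam.usb_camera import USBCamera

# open the first USB camera (capture_device=0 is an assumption)
camera = USBCamera(width=224, height=224, capture_device=0)

image = camera.read()   # grab one frame as a numpy array
print(image.shape)      # a valid shape confirms the camera is working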

Execute this code in the Notebook before moving to the next task. The USB camera application only requires that the notebook kernel be reset, whereas the CSI camera application requires that the 'camera' object be explicitly released.

import os
os._exit(00)  # restart the notebook kernel so the camera device is released

Open the live hand-pose .ipynb file from the folder /home/l4t-data/trt_pose_hand/

Run each cell in that notebook one by one

We have completed the hand-pose modelling. Now, using the Jupyter Notebook, open the file /home/l4t-data/trt_pose_hand/gesture_classification_live_demo.ipynb. As mentioned earlier, we are not doing training; we will only do inference.
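The notebook handles the details itself; as a rough, non-authoritative sketch, its first cells follow the standard trt_pose workflow: load the hand topology, build the ResNet18 backbone, load the downloaded weights, and optionally optimize the model with torch2trt. The file paths below follow the trt_pose_hand repository layout and the model folder used above; adjust them if yours differ.

import json
import torch
import trt_pose.coco
import trt_pose.models
from torch2trt import torch2trt

# hand topology (21 keypoints) from the trt_pose_hand repository
with open('preprocess/hand_pose.json', 'r') as f:
    hand_pose = json.load(f)
topology = trt_pose.coco.coco_category_to_topology(hand_pose)

num_parts = len(hand_pose['keypoints'])
num_links = len(hand_pose['skeleton'])

# build the backbone and load the pre-trained weights downloaded earlier
model = trt_pose.models.resnet18_baseline_att(num_parts, 2 * num_links).cuda().eval()
model.load_state_dict(torch.load('model/hand_pose_resnet18_att_244_244.pth'))

# optional: optimize with torch2trt for faster inference on the Nano
data = torch.zeros((1, 3, 224, 224)).cuda()   # dummy input; use the resolution set in the notebook
model_trt = torch2trt(model, [data], fp16_mode=True, max_workspace_size=1 << 25)

The remaining cells attach the USB camera, run the (optimized) model on each frame, and classify the detected keypoints into hand gestures for the live demo.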

References
