This repository contains the code and guides that have been produced for the bachelor thesis "Machine Learning Based Autonomous Robot with NVIDIA Jetson Nano".
The goal was to build a JetBot according to the instructions provided by NVIDIA on GitHub, evaluate the JetBot platform, and implement a demo case. In addition, how-to guides had to be written that allow the conducted steps to be repeated easily.
This guide provides instructions on how to set up the demo case and the model training.
For instructions on how to use the individual packages, please have a look at the corresponding guides.
If you just want to try out the demonstration case, you can skip chapters 5-8 and go directly to chapter 9.
- Prerequisites
- Order JetBot Parts
- Build JetBot
- Setup Software
  - Setup this Repository
  - Camera Tint Fix
  - RAM Fix
- Data Preprocessing
- Setup Detection Training
- Setup Classification Training
- Evaluation
- Setup Demo Case
- Team
To follow the guides in this repository, the following prerequisites have to be satisfied.
We strongly recommend using Ubuntu and cannot guarantee that everything will run on Windows.
On the host computer:
- Ubuntu 18.04 LTS installed (recommended)
- Jupyter Notebook installed
- CUDA 10.0 or higher installed
- cuDNN 7.6.5 installed
- CUDA-capable NVIDIA GPU
On the Jetson Nano:
- JetPack 4.3 installed (comes with JetCard image)
- tensorflow-gpu 2.0.0+nv20.1.tf2 installed
- RAM fix executed (see below)
- Camera tint fix installed (see below)
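As a rough sanity check for the host prerequisites, a small shell helper can compare the CUDA release against the required minimum. This is a sketch with a hard-coded sample version string; on a real host you would parse the output of `nvcc --version` instead (the `sed` pattern shown in the comment is an assumption about the usual `release X.Y` output format):

```shell
# Check that a CUDA version string is at least major.minor (sketch).
cuda_at_least() {
  major="${1%%.*}"; rest="${1#*.}"; minor="${rest%%.*}"
  [ "$major" -gt "$2" ] || { [ "$major" -eq "$2" ] && [ "$minor" -ge "$3" ]; }
}

# Sample value; on a real host, e.g.:
#   ver=$(nvcc --version | sed -n 's/.*release \([0-9.]*\).*/\1/p')
ver="10.0"
cuda_at_least "$ver" 10 0 && echo "CUDA $ver OK"
```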
We provide a customized version of the NVIDIA bill of materials. This order list is better suited to the Swiss market and can be found here.
Build the JetBot according to the NVIDIA hardware guide.
Setup the JetBot software according to the NVIDIA software guide.
Connect a display and a keyboard and log in with the following credentials:
Username: jetbot
Password: jetbot
Then execute the fixes below.
Some files had to be compressed in order to upload them to GitHub. You will need to unpack them before use.
- Unpack the archive named events.out.tfevents.1583264662.7z.001 in howto/2_detection/output/eval_0 into the same folder
- Unpack the archive named model_collision_avoidance.7z.001 in howto/5_demo_case/model_alexnet into the same folder
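The two unpack steps above can be sketched as a small shell loop. This is a dry run that only prints the extraction commands (remove the `echo` to execute them) and assumes the p7zip tool is installed (`sudo apt-get install p7zip-full`):

```shell
# Print the extraction command for each split archive (dry run).
# 7z resolves the remaining .002, .003, ... parts automatically.
for part in howto/2_detection/output/eval_0/events.out.tfevents.1583264662.7z.001 \
            howto/5_demo_case/model_alexnet/model_collision_avoidance.7z.001; do
  echo "7z x -o$(dirname "$part") $part"   # extract into the same folder
done
```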
You will have to install a new camera profile to fix the red tint of the Jetson Nano camera image.
1. Download the new camera profile:
wget https://www.waveshare.com/w/upload/e/eb/Camera_overrides.tar.gz
2. Unpack the archive:
tar -xzvf Camera_overrides.tar.gz
3. Copy the new profile to the target directory:
sudo cp camera_overrides.isp /var/nvidia/nvcam/settings/
4. Set the file permissions and ownership:
sudo chmod 664 /var/nvidia/nvcam/settings/camera_overrides.isp
sudo chown root:root /var/nvidia/nvcam/settings/camera_overrides.isp
5. Reboot the Jetson Nano.
To free up additional RAM on the Jetson Nano, the GUI can be disabled:
sudo systemctl set-default multi-user.target
To enable it again execute:
sudo systemctl set-default graphical.target
Note that this will deactivate the auto-login. You will then have to connect at least a keyboard and possibly a display to log in to the JetBot.
This package contains the data preparation instructions if you would like to train your own models.
Please note that merge_datasets_detection.ipynb must be executed only after the other notebooks have been completed.
- preprocess_data_egohands.ipynb - Prepare the Egohands dataset for detection
- preprocess_data_tinyhands.ipynb - Prepare the TinyHands dataset for detection
- preprocess_data_lared.ipynb - Prepare the laRED dataset for classification and detection
- merge_datasets_detection.ipynb - Merge the other datasets for detection
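The required execution order can be sketched as a shell loop that runs the notebooks headlessly with nbconvert. This is a dry run that only prints the commands (remove the `echo` to execute them) and assumes Jupyter is installed on the host:

```shell
# Run the preprocessing notebooks in order; merge_datasets_detection must be last.
for nb in preprocess_data_egohands preprocess_data_tinyhands \
          preprocess_data_lared merge_datasets_detection; do
  echo "jupyter nbconvert --to notebook --execute ${nb}.ipynb"
done
```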
If you wish to train your own hand detection model, you can follow the guide in this package.
Please see detection_training_guide.md for the setup details.
You can follow the guide in this package if you wish to train your own gesture classification model.
Please see classification_training_guide.ipynb for the setup details.
To evaluate the trained model you can follow the notebooks in this package.
Please see evaluation_classification.ipynb for the walkthrough details to evaluate the classification model.
Please see evaluation_end_to_end.ipynb for the walkthrough details to evaluate the pipeline with the classification and the detection model.
Please see evaluation_webcam_demo.ipynb for the walkthrough details to run the pipeline with the classification and the detection model using a webcam.
To set up the demo case, perform the following steps.
- Download this repository.
- Copy the folder howto/5_demo_case/ to the /home/jetbot/notebooks/ directory on the JetBot.
- Connect to the JetBot JupyterLab, open gesture_demo_case.ipynb and follow the instructions.
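The copy step can be done over the network with scp. A sketch: the IP address below is a placeholder for your JetBot's actual address, and the `echo` makes it a dry run (remove it to execute the copy):

```shell
# Copy the demo case folder to the JetBot (dry run; remove 'echo' to execute).
JETBOT_IP="${JETBOT_IP:-192.168.0.10}"   # assumption: replace with your JetBot's IP
echo "scp -r howto/5_demo_case jetbot@${JETBOT_IP}:/home/jetbot/notebooks/"
```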
- Dimitri Muralt
  - GitHub: @dimitrimuralt
- Christoph Wenk
  - GitHub: @ChristophWenk