While I was working at Open Zeka, I built this car. Before building it, I had worked with Jetson Inference, the DeepStream SDK, Hello AI World, the Transfer Learning Toolkit, and Jetbot.
Jetson Nano: https://openzeka.com/urun/nvidia-jetson-nano-developer-kit/
CSI Camera: https://openzeka.com/urun/raspberry-pi-kamera-v2/
First of all, I started this work by drawing a simple car in Fusion360. Then I mounted the Jetson Nano, an L298N motor driver, and the DC motors.
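An L298N takes a direction pair and a PWM duty cycle per DC motor, so the drive logic boils down to mixing one (throttle, steering) command into two duty cycles. The sketch below shows only that mixing step; the pin numbers and the Jetson.GPIO wiring mentioned in the comments are assumptions for illustration, not the exact setup used here.

```python
# Minimal sketch, assuming a differential-drive layout: one DC motor per side,
# each driven by one channel of the L298N. On a real Jetson Nano you would
# feed the returned duty cycles to Jetson.GPIO PWM outputs on the L298N's
# ENA/ENB pins, and set IN1/IN2 (IN3/IN4) high/low for direction.

def mix_to_duty(throttle, steering):
    """Map throttle/steering in [-1, 1] to (left, right) duty cycles in [-100, 100].

    Positive steering turns right (left motor runs faster). A negative duty
    cycle means reverse: swap that motor's INx pins on the L298N.
    """
    left = max(-1.0, min(1.0, throttle + steering))
    right = max(-1.0, min(1.0, throttle - steering))
    return left * 100.0, right * 100.0

if __name__ == "__main__":
    print(mix_to_duty(0.5, 0.0))  # straight ahead: both motors at 50%
    print(mix_to_duty(0.5, 0.5))  # right turn: left motor faster than right
```

Clamping before scaling keeps a hard turn at full throttle from requesting more than 100% duty on the inner channel.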
The following text explains how I built the car, created the dataset, and made the car move.
Firstly, I used Isaac Sim to create a scene like the one in the figure below.
Secondly, I used the Synthetic Data Recorder to capture images together with the objects' coordinates, so the dataset was saved with its labels in npy format. Since the labels were recorded as npy files, I wrote a Python script that converts them to the txt format YOLO expects for training; the figure below shows what it looks like. I also used Domain Randomization (texture, light, and movement components, etc.) while collecting data (you can see how in this video: https://youtu.be/1a3Q7aID_Ag). Thanks to Isaac Sim, collecting the labeled data took only about 30 minutes.
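The npy-to-txt conversion step can be sketched as below. The array layout (one row per box: class id, then pixel-space x_min, y_min, x_max, y_max) and the file names are assumptions; adjust the indices to match what the Synthetic Data Recorder actually writes. YOLO wants one line per box: class id plus box center, width, and height, all normalized to [0, 1].

```python
# Hedged sketch of converting recorded npy labels to YOLO txt labels.
# Assumed row layout: [class_id, x_min, y_min, x_max, y_max] in pixels.
import numpy as np

def npy_to_yolo(boxes, img_w, img_h):
    """Convert pixel-space corner boxes to YOLO 'cls xc yc w h' lines."""
    lines = []
    for cls, x_min, y_min, x_max, y_max in boxes:
        xc = (x_min + x_max) / 2.0 / img_w  # normalized box center x
        yc = (y_min + y_max) / 2.0 / img_h  # normalized box center y
        w = (x_max - x_min) / img_w         # normalized box width
        h = (y_max - y_min) / img_h         # normalized box height
        lines.append(f"{int(cls)} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}")
    return lines

if __name__ == "__main__":
    demo = np.array([[0, 0, 0, 320, 240]])  # one box covering a quarter frame
    print(npy_to_yolo(demo, img_w=640, img_h=480))
    # → ['0 0.250000 0.250000 0.500000 0.500000']
```

In practice you would load each recorded file with `np.load("labels.npy")` (hypothetical name) and write the returned lines to a txt file next to its image.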
Also, you can see the rest of my dataset here: https://drive.google.com/drive/folders/1Kev6fsuPnR0Kro1vbTEKbixf1MmVFES5
Thirdly, I trained YOLOv4-tiny on Google Colab, then converted the model to TensorRT using this repository (just use Demo 5): https://github.com/SamedHira626/tensorrt_demos
Finally, clone this repository and run stalker.py with Python 3:
python3 stalker.py
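The core of a follower script like this is turning each detection into a drive command: steer toward the bounding-box center and stop once the target fills enough of the frame. The sketch below is a hypothetical illustration of that idea; the function name, thresholds, and box format are my assumptions, not taken from stalker.py.

```python
# Hypothetical sketch of the follow logic: steer toward the detected box's
# center, and stop when the box area suggests the car is close enough.
# All thresholds here are illustrative guesses, not values from stalker.py.

def follow_command(box, frame_w, frame_h, stop_area=0.4):
    """Return (throttle, steering) in [-1, 1] for a (x_min, y_min, x_max, y_max) box."""
    x_min, y_min, x_max, y_max = box
    center_x = (x_min + x_max) / 2.0
    # Map the box center to [-1, 1]: -1 = far left of frame, +1 = far right.
    steering = (center_x / frame_w) * 2.0 - 1.0
    # Fraction of the frame the box covers; large means the target is near.
    area = (x_max - x_min) * (y_max - y_min) / (frame_w * frame_h)
    throttle = 0.0 if area >= stop_area else 0.5  # stop when close enough
    return throttle, steering

if __name__ == "__main__":
    # Target centered in a 640x480 frame: drive straight at half throttle.
    print(follow_command((300, 200, 340, 280), 640, 480))  # → (0.5, 0.0)
```

In the real loop this would run once per TensorRT detection, with the resulting command sent to the L298N motor driver.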