This repository has been archived by the owner on Jun 24, 2022. It is now read-only.

wuxiaohua1011/ROAR_Jetson


ROAR_Jetson

Intro

This is the code for the ROAR platform that runs on a Jetson Nano. It is extensible on the control side. By default, the car is driven by an analog controller with the help of an Arduino: the Arduino receives signals from the analog controller and controls the car. We call this Analog-Control mode. When no analog controller is present, the system switches to Jetson-Control mode, where the Arduino receives signals from the Jetson instead. In this mode you can implement your own physical (e.g. joystick) or algorithmic (e.g. autonomous pilot) controller and easily plug it into our system.

While running, the system continuously streams video and control signals that other devices can receive for display. We created the ROAR_VR platform, which provides an immersive driving experience.

Setup

Jetson

Clone this repository onto your Jetson. The code runs on Python 3. Prerequisite libraries include numpy, opencv-python, statistics, prettytable, pyserial, and pyrealsense2; use pip to install them. If you have a MIPI camera connected or want to use VR, you also need to install NVIDIA Accelerated GStreamer; you can find the setup guide here.

Arduino

Next, upload the Arduino code to the Arduino board. This is done from the Jetson: launch the Arduino IDE, select Arduino Nano under Tools -> Board: "...", and select the serial port that connects your Jetson and the Arduino (for example /dev/ttyUSB0) under Tools -> Port: "...". Then click the "Upload" button to flash the code onto the board.

VR

This part is optional. If you want to include VR in your setup, refer to ROAR_VR.

GStreamer

Our video streaming relies on GStreamer, a pipeline-based, cross-platform multimedia framework for handling streams of different formats.

GStreamer works on pipelines composed of modular elements. A data stream is produced by sources, transformed and encoded by codecs and filters, and then passed to sinks such as a display window or an external API. GStreamer provides a wide range of plugins; all you need to do is assemble these elements into a pipeline. We stream H.264-encoded video over RTP to a port using 'udpsink', and the video can be received through 'udpsrc' on another device. The sending pipeline can be found in camera.py.
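To make the pipeline idea concrete, here is a hedged sketch of how an H.264-over-RTP sending pipeline string could be assembled. This is not the exact pipeline from camera.py; the element choices (videoconvert, x264enc, rtph264pay) are assumptions standing in for whatever camera.py actually uses, so consult that file for the real pipeline.

```python
# Illustrative sketch only: element names and parameters are assumptions,
# not the exact sending pipeline used by camera.py.
def build_send_pipeline(host: str, port: int,
                        width: int = 1280, height: int = 720) -> str:
    """Assemble an H.264-over-RTP sending pipeline ending in udpsink."""
    return (
        f"appsrc ! videoconvert ! "
        f"video/x-raw,width={width},height={height} ! "
        f"x264enc tune=zerolatency ! rtph264pay pt=96 ! "
        f"udpsink host={host} port={port}"
    )

print(build_send_pipeline("192.168.1.50", 5000))
```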

On the receiver side, you can decode the video from the command line. Here is an example command: 'gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink'. If the connection is successful, the video appears in a pop-up window.

If you want to edit and display the video in another application, GStreamer provides a sink plugin called 'appsink'. We recommend using OpenCV's VideoCapture: pass the pipeline string to its constructor, then treat it like any other OpenCV video source and read the frames. Here is our sample code.
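As a rough sketch of this approach, the receive pipeline from the gst-launch example can be terminated in appsink and handed to OpenCV. The pipeline string mirrors the command above; the cv2 calls are standard OpenCV, but actually reading frames requires OpenCV built with GStreamer support and an active stream, so the capture lines are shown as comments.

```python
# Sketch: receiving the RTP/H.264 stream in OpenCV via appsink.
# Mirrors the gst-launch receive command, swapping autovideosink for appsink.
def build_receive_pipeline(port: int = 5000) -> str:
    """RTP/H.264 receive pipeline terminating in appsink for OpenCV."""
    return (
        f"udpsrc port={port} ! "
        "application/x-rtp,encoding-name=H264,payload=96 ! "
        "rtph264depay ! h264parse ! avdec_h264 ! "
        "videoconvert ! appsink"
    )

# Usage (requires OpenCV built with GStreamer support and a live stream):
# import cv2
# cap = cv2.VideoCapture(build_receive_pipeline(5000), cv2.CAP_GSTREAMER)
# ok, frame = cap.read()  # frame is a BGR image array when ok is True
print(build_receive_pipeline(5000))
```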

Streaming performance relies heavily on network speed. We take advantage of Wi-Fi 6, the next-generation Wi-Fi standard.

Currently we can stream two 1280×720 videos in real time.

Run

Before you run the main program, you may want to modify the configuration in myconfig.py.

If you involve VR or another receiver, set CLIENT_IP to the IP address of your PC in the format "192.168.1.50". You can also specify the IP address on the command line. You can change IMAGE_W and IMAGE_H to get a different resolution, but you will also need to change the corresponding parameters on the receiver side.

If you want to drive in Jetson-Control mode, you can change THROTTLE_MAX and STEERING_MAX to automatically scale the values sent to the Arduino, limiting the maximum speed and steering angle. To use your own custom Controller (explained later) instead of NaiveController, set the CONTROLLER parameter to the name of the Controller class you define.
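Putting these settings together, a myconfig.py might look like the fragment below. The parameter names (CLIENT_IP, IMAGE_W, IMAGE_H, THROTTLE_MAX, STEERING_MAX, CONTROLLER) come from this README, but the exact layout and the sample values are assumptions; check myconfig.py itself for the real defaults.

```python
# Illustrative myconfig.py fragment; values are example assumptions.
CLIENT_IP = "192.168.1.50"      # receiver (PC/VR) address for video streaming
IMAGE_W = 1280                  # stream width; receiver must match
IMAGE_H = 720                   # stream height; receiver must match
THROTTLE_MAX = 0.5              # scales throttle sent to Arduino (limits top speed)
STEERING_MAX = 0.8              # scales steering sent to Arduino (limits max angle)
CONTROLLER = "NaiveController"  # Controller class used in Jetson-Control mode
```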

Once the configuration is set, execute ./roar-vr.py on the command line to run in Analog-Control mode, or ./roar-vr.py -c to run in Jetson-Control mode. When playing with VR, you can specify the PC's IP address here by adding --ip "192.168.1.50" to the command; this overrides the CLIENT_IP setting in myconfig.py.

Some examples

$ ./roar-vr.py                      # Analog-Control mode. No video streaming.
$ ./roar-vr.py --ip 192.168.1.50    # Analog-Control mode. Stream video to 192.168.1.50.
$ ./roar-vr.py -c                   # Jetson-Control mode. By default you will enter commandline controller.

By default, running in Jetson-Control mode uses NaiveController, which you can try out directly. Once the program has launched and initialization is done, type two floating-point numbers for the throttle and steering values and hit Enter. Both are in the range [-1, 1]. You can repeat this as many times as you want before pressing Ctrl-C to stop the program.
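The input format above can be illustrated with a small helper. This parser is hypothetical (NaiveController's actual parsing may differ); it just shows a "throttle steering" line being split and clamped to the documented [-1, 1] range.

```python
# Hypothetical helper illustrating how a "throttle steering" line typed at
# the prompt could be parsed and clamped to the documented [-1, 1] range.
def parse_command(line: str) -> tuple:
    throttle_s, steering_s = line.split()
    clamp = lambda v: max(-1.0, min(1.0, float(v)))
    return clamp(throttle_s), clamp(steering_s)

print(parse_command("0.5 -0.2"))   # → (0.5, -0.2)
print(parse_command("2.0 -3.0"))   # out-of-range values clamped → (1.0, -1.0)
```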

Customize

To customize your own Controller, create your own XXXController class inside controller.py, inheriting from the Controller class and overriding the update and run_threaded methods. Refer to controller.py for the template.

The most important method is run_threaded. On each loop iteration, the main vehicle thread calls this function to get the throttle and steering values for the next time step, then controls the car accordingly. The function takes a sequence of data fetched from the camera as input and returns throttle and steering values as output.

If you have logic that should not run synchronously with (and block) the main vehicle thread, put it in the update method, which runs in a separate thread. NaiveController is an example: it naively reads new throttle and steering data from command-line input. Reading from the command line is blocking, so we implement it in update to avoid stalling the car.
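The split between run_threaded and update can be sketched as follows. The Controller base class here is a minimal stand-in for the real one in controller.py (whose signatures may differ), and ConstantController is a hypothetical example that just drives straight at a fixed throttle.

```python
import threading
import time

# Minimal stand-in for the Controller base class in controller.py;
# the real base class and its method signatures may differ.
class Controller:
    def update(self):                      # runs in a background thread
        pass
    def run_threaded(self, *camera_data):  # called by the main vehicle loop
        return 0.0, 0.0

class ConstantController(Controller):
    """Hypothetical controller: drives straight at a fixed throttle."""
    def __init__(self, throttle=0.3):
        self.throttle = throttle
        self.steering = 0.0

    def update(self):
        # Long-running or blocking work (e.g. reading input) belongs here,
        # so it never stalls the main vehicle thread.
        while True:
            time.sleep(1.0)

    def run_threaded(self, *camera_data):
        # Must return quickly: the main loop calls this every time step.
        return self.throttle, self.steering

ctrl = ConstantController(throttle=0.3)
threading.Thread(target=ctrl.update, daemon=True).start()
print(ctrl.run_threaded())  # → (0.3, 0.0)
```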
