Towards better athletic intelligence
Install Catkin tools using the steps provided here
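For reference, a typical catkin_tools installation looks like this (one of the routes from the linked guide; the apt route assumes the ROS repositories are configured):
sudo apt install python3-catkin-tools
# or, without the ROS apt repositories:
pip3 install catkin-tools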
# Install dependencies
sudo apt install libmpfr-dev
# Download project
mkdir -p <your-workspace>/src && cd <your-workspace> && catkin init && cd src
git clone git@github.com:asblab-research/tbai.git --recursive
# Install other dependencies using rosdep
cd .. && rosdep install --from-paths src --ignore-src -r -y && cd src/tbai
# !! Now install libtorch by following the installation guideline below
There are two steps to installing libtorch. First, you need to download a suitable libtorch version. Once the library is downloaded, it's necessary to create a symlink to it in the dependencies folder. Here's how to do it:
Get your download link from the official PyTorch website. Note that opting for the cxx11 ABI version is paramount; if you download the Pre-cxx11 ABI version, things won't work as expected. Now that you have your URL, you can download the library, unzip it, and create a symlink in the dependencies folder.
wget <your-url> # Latest URL (JUL-29-2024) - https://download.pytorch.org/libtorch/cpu/libtorch-cxx11-abi-shared-with-deps-2.4.0%2Bcpu.zip
unzip <downloaded-zip> -d <your-workspace>/src/tbai/dependencies # can be `dependencies` folder of the package
ln -s <your-folder>/libtorch dependencies # Only necessary if, in the previous step, you did not unzip in `dependencies`
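Before building, it is worth sanity-checking the ABI. In a cxx11 ABI build, libtorch's TorchConfig.cmake pins the flag to 1; the check below assumes the standard libtorch layout:
# Should print a line containing -D_GLIBCXX_USE_CXX11_ABI=1
grep GLIBCXX_USE_CXX11_ABI dependencies/libtorch/share/cmake/Torch/TorchConfig.cmake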
Your dependencies folder should now look as follows (a sketch of the typical libtorch layout; exact contents vary by version):
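📂dependencies
 ┗ 📂libtorch
   ┣ 📂bin
   ┣ 📂include
   ┣ 📂lib
   ┗ 📂share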
cd <your-workspace>/src
git clone git@github.com:asblab-research/aws-robomaker-hospital-world.git
If you are using a venv, these additional packages will be needed:
pip install empy==3.3.4
pip install catkin_pkg
sudo apt-get install doxygen
pip install rospkg
pip install defusedxml
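If you don't already have a venv, one can be created and activated with standard Python tooling (the path below is just an example):
python3 -m venv <your-venv>
source <your-venv>/bin/activate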
# Build aws-robomaker-hospital-world
cd <your-workspace> && catkin build aws_robomaker_hospital_world
# Build tbai
cd src/tbai
catkin config -DCMAKE_BUILD_TYPE=Release
bash ./tbai.bash --build # This will only build the necessary packages
# Source
cd ../.. && source devel/setup.bash
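Optionally, the workspace can be sourced automatically in every new shell (a common ROS convenience, not required by this guide):
echo "source <your-workspace>/devel/setup.bash" >> ~/.bashrc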
Currently only tbai_mpc_perceptive and tbai_rl_perceptive are modified to use aws-robomaker-hospital-world.
# Start ROS and relevant nodes
roslaunch tbai_mpc_perceptive simple_hospital_two_floor.launch # single floor world can be launched using simple_hospital.launch
# Change controllers (in a different terminal)
rostopic pub /anymal_d/change_controller std_msgs/String "data: 'STAND'"
rostopic pub /anymal_d/change_controller std_msgs/String "data: 'WBC'"
#rostopic pub /anymal_d/change_controller std_msgs/String "data: 'SIT'"
# In the MPC controller terminal (started with roslaunch) change gait to trot
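The exact gait interface depends on the launch file; assuming it exposes an ocs2-style gait command prompt (the NMPC stack tbai builds on), switching gaits amounts to typing the gait name there:
# At the gait command prompt in the roslaunch terminal, type:
trot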
# Start ROS and relevant nodes
roslaunch tbai_rl_perceptive simple_hospital_two_floor.launch # single floor world can be launched using simple_hospital.launch
# Change controllers (in a different terminal)
rostopic pub /anymal_d/change_controller std_msgs/String "data: 'RL'"
rostopic pub /anymal_d/change_controller std_msgs/String "data: 'BOB'" # RL and BOB are the same controllers
rostopic pub /anymal_d/change_controller std_msgs/String "data: 'STAND'"
#rostopic pub /anymal_d/change_controller std_msgs/String "data: 'SIT'"
A joystick is used to control the robot; joy_node is started through the launch file.
left analog stick = translation ; right analog stick = rotation
Using a physical controller is recommended, but if you do not have access to one, a joystick emulator is provided.
rosrun tbai_rl_perceptive joy_emulator.py
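Alternatively, a sensor_msgs/Joy message can be published by hand. This is only a sketch: it assumes the stack listens on joy_node's default /joy topic, and the axis indices below are placeholders; check joy_emulator.py for the actual mapping:
# Hypothetical manual command at 10 Hz (unspecified fields default to zero)
rostopic pub /joy sensor_msgs/Joy "{axes: [0.0, 0.5, 0.0, 0.0, 0.0, 0.0], buttons: [0, 0, 0, 0]}" -r 10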
📦tbai
┣ 📂tbai_static # Static (high gain PD) controller
┣ 📂tbai_mpc_perceptive # Perceptive NMPC controller [1]
┣ 📂tbai_mpc_blind # Blind NMPC controller [1]
┣ 📂tbai_rl_perceptive # Perceptive RL controller [2]
┣ 📂tbai_rl_blind # Blind RL controller [2]
┣ 📂tbai_dtc # DTC controller (perceptive) [3]
┗ 📂tbai_joe # Perceptive NMPC controller with NN-based tracking controller [1],[3]
[1] Perceptive Locomotion through Nonlinear Model Predictive Control
https://arxiv.org/abs/2208.08373
[2] Learning robust perceptive locomotion for quadrupedal robots in the wild
https://arxiv.org/abs/2201.08117
[3] DTC: Deep Tracking Control
https://arxiv.org/abs/2309.15462
Demo videos for each controller are embedded in the repository README: mpc_perceptive_f.mp4, mpc_blind_f.mp4, rl_perceptive_fe.mp4, rl_blind_fe.mp4, dtc_f.mp4, joe_f.mp4
If any of the steps throws an error for you, please let us know and we will try to extend this guide with a fix as soon as possible. Thanks 🤗
This project stands on the shoulders of giants. None of this would have been possible without many amazing open-source projects. Here are a few from which the most inspiration was drawn and that were instrumental during development:
- https://github.com/leggedrobotics/ocs2
- https://github.com/qiayuanl/legged_control
- https://github.com/leggedrobotics/legged_gym
- https://github.com/leggedrobotics/rsl_rl
- https://github.com/ANYbotics/elevation_mapping
- https://github.com/leggedrobotics/elevation_mapping_cupy
- https://github.com/bernhardpg/quadruped_locomotion
- https://github.com/stack-of-tasks/pinocchio
- https://x-io.co.uk/open-source-imu-and-ahrs-algorithms/
- https://github.com/mayataka/robotoc
- https://github.com/mayataka/legged_state_estimator
- hundreds of others ...
Thank you all 🤗