This is an implementation of a mobile robot in ROS 2. The software includes the following functionalities:
- Teleoperation through WebSockets with a live video feed using WebRTC (aiortc).
- Integration of the Intel RealSense D435 and T265 cameras for depth estimation and localization, respectively.
- 2D SLAM with Cartographer.
- 3D SLAM with RTAB-Map.
I used the XiaoR Geek JetBot as a base platform and modified it to include a wide-angle camera, as well as the Intel RealSense D435 and T265.
- ROS 2 Eloquent or Foxy.
- librealsense2.
- Motor drivers (in this case, installed from the JetBot's repository).
- Clone this repo and its submodules:

```bash
git clone --recurse-submodules https://github.com/cameronmcnz/surface.git
```
- Install aiortc for WebRTC support:

```bash
pip3 install crc32c==2.0
pip3 install aiortc
pip3 install aiohttp
```
- Install pyfakewebcam so that camera frames can be modified inside a ROS 2 node and then shared through WebRTC:

```bash
apt-get install v4l2loopback-utils
pip3 install pyfakewebcam
```
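Once a loopback device exists, a node can write processed frames into it. A minimal sketch of that hand-off, assuming `/dev/video1` is one of the v4l2loopback devices (the path and frame size are illustrative, not taken from this repo):

```python
def bgr_to_rgb(frame):
    """Swap the channel order of [B, G, R] pixels: OpenCV delivers BGR,
    while pyfakewebcam expects RGB."""
    return [[px[::-1] for px in row] for row in frame]

def main():
    import numpy as np
    import pyfakewebcam  # pip3 install pyfakewebcam
    # /dev/video1 must be one of the devices created by v4l2loopback.
    cam = pyfakewebcam.FakeWebcam('/dev/video1', 640, 480)
    frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a real camera frame
    rgb = np.asarray(bgr_to_rgb(frame.tolist()), dtype=np.uint8)
    cam.schedule_frame(rgb)  # the frame now appears on /dev/video1

if __name__ == '__main__':
    main()
```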
- Set up a startup script that creates fake webcam devices for sharing camera frames:

```bash
gedit /etc/rc.local
```

Copy and paste into the file:

```bash
#!/bin/sh -e
modprobe v4l2loopback devices=2  # will create two fake webcam devices
exit 0
```

Save the file and make it executable with this command:

```bash
chmod +x /etc/rc.local
```
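After a reboot you can verify that the two loopback devices were created. A small sketch with a hypothetical helper (`loopback_candidates` is not part of this repo):

```python
import glob
import re

def loopback_candidates(device_paths):
    """Return the paths that look like /dev/videoN, sorted. Helper name is
    illustrative; it only filters, it cannot tell loopback from real cameras."""
    return sorted(p for p in device_paths if re.fullmatch(r"/dev/video\d+", p))

if __name__ == "__main__":
    # v4l2loopback devices=2 should have added two /dev/video* entries.
    print(loopback_candidates(glob.glob("/dev/video*")))
```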
- Install Cartographer:

```bash
sudo apt-get install ros-<distro>-cartographer
```
- Install rtabmap and rtabmap_ros following these instructions in the ros2 branch, then build the workspace:

```bash
cd dev_ws
source /opt/ros/<distro>/setup.bash
colcon build
. install/setup.bash
```
- Run the local teleoperation server:

```bash
python3 local_server/webcam.py
```

In a browser, open the teleoperation interface by going to `<jetson_nano's ip-address>:8080`.
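On the robot side, each WebSocket message has to be turned into a safe velocity command. A sketch of that parsing step, assuming a simple JSON message format (the field names and speed cap are illustrative; the repo's actual protocol may differ):

```python
import json

MAX_SPEED = 1.0  # assumed cap, not a value from this repo

def parse_drive_command(message):
    """Parse a teleop message like '{"linear": 0.4, "angular": -0.2}' and
    clamp both components so a bad client cannot command excessive speed."""
    data = json.loads(message)

    def clamp(value):
        return max(-MAX_SPEED, min(MAX_SPEED, float(value)))

    return clamp(data.get("linear", 0.0)), clamp(data.get("angular", 0.0))
```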
- In a new terminal, run the motion control launch file to start streaming video and receiving motion commands:

```bash
ros2 launch motion_control jetbot_launch.py
```
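Inside the motion control node, a Twist-style command is typically mixed down to per-wheel speeds for the differential drive. A minimal sketch of that mixing (the wheel separation is a guessed JetBot value, not measured from this platform):

```python
def twist_to_wheel_speeds(linear, angular, wheel_separation=0.12):
    """Differential-drive mixing: convert a Twist-style command
    (linear m/s, angular rad/s) into left/right wheel speeds in m/s.
    wheel_separation=0.12 is an assumed chassis width."""
    left = linear - angular * wheel_separation / 2.0
    right = linear + angular * wheel_separation / 2.0
    return left, right
```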
- Run SLAM.
  - For 2D SLAM, in another terminal:

    ```bash
    ros2 launch realsense_ros2 slam_cartogrepher_launch.py
    ```

  - For 3D SLAM, in another terminal:

    ```bash
    ros2 launch realsense_ros2 slam_rtabmap_launch.py
    ```
3D dense SLAM is too resource-intensive for the Jetson Nano, so in this case it is recommended to run it on a remote host. To do this, set the same `ROS_DOMAIN_ID` on both the Jetson Nano and the remote host (e.g. `export ROS_DOMAIN_ID=0`) and run the cameras on the Jetson Nano:

```bash
ros2 launch realsense_ros2 realsense_launch.py
```

Comment out the nodes corresponding to the cameras on the host and run the rtabmap launch:

```bash
ros2 launch realsense_ros2 slam_rtabmap_launch.py
```
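Discovery between the two machines only works if both resolve the same domain id. A small sketch for checking this on each host (the helper name is hypothetical; the 0-101 bound is the range commonly recommended for Linux hosts):

```python
import os

def effective_domain_id(env=None):
    """Return the ROS 2 domain id a node on this machine will use.
    ROS 2 defaults to domain 0 when ROS_DOMAIN_ID is unset; both the
    Jetson Nano and the remote host must report the same value."""
    env = os.environ if env is None else env
    domain = int(env.get("ROS_DOMAIN_ID", "0"))
    if not 0 <= domain <= 101:  # range commonly recommended for Linux hosts
        raise ValueError(f"ROS_DOMAIN_ID out of safe range: {domain}")
    return domain

if __name__ == "__main__":
    print(effective_domain_id())
```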