Setting up Realsense R200 on Odroid XU4 with ROS

Update, 1st May 2018: added a note to use the realsense nodelet tag 1.8.1, plus some patches to use ROS built from source or GCC 7.

This post will guide you through the configuration of a RealSense R200 on an Odroid XU4. Nothing here is strictly tied to this platform, so the steps should apply to any armhf/x86 Ubuntu 16.04.xx system.

Let’s start from the requirements:

Preparing the workspace

Skip this section if you already have a workspace.

If you don’t, let’s create one now (remember to ‘source /opt/ros/kinetic/setup.bash’ in case you have not added it to your .bashrc):
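A minimal sketch of the usual catkin workspace setup; the workspace name ros_ws matches the one used later in this post, and the path assumes a standard ROS Kinetic install:

```shell
# Standard catkin workspace setup (assumes ROS Kinetic under /opt/ros/kinetic)
source /opt/ros/kinetic/setup.bash
mkdir -p ~/ros_ws/src
cd ~/ros_ws/src
catkin_init_workspace        # creates the top-level CMakeLists.txt symlink
cd ~/ros_ws
catkin_make                  # first build, creates build/ and devel/
source devel/setup.bash
```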


Librealsense requires some prerequisites to be built:
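The exact package list depends on what your image ships with; a typical set for building librealsense v1.x on Ubuntu 16.04 looks something like this (the list is an assumption, so check librealsense’s own installation notes if the build complains):

```shell
# Typical build prerequisites for librealsense v1.x on Ubuntu 16.04
sudo apt-get update
sudo apt-get install -y build-essential cmake pkg-config \
    libusb-1.0-0-dev libglfw3-dev libssl-dev
```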

Cloning the repositories

Now it’s time to clone the librealsense and the RealSense nodelet repos:
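Assuming the ros_ws workspace from the previous section, something like the following should do; the GitHub URLs are the upstream Intel repos, and the tags match the R200-compatible versions noted below:

```shell
# Clone both repos into the workspace's src directory and pin the tags
cd ~/ros_ws/src
git clone https://github.com/IntelRealSense/librealsense.git
(cd librealsense && git checkout v1.12.1)
git clone https://github.com/intel-ros/realsense.git
(cd realsense && git checkout 1.8.1)
```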

NOTE: if you are using ROS built from source (and not installed under /opt/ros/*), apply the following patch to librealsense:

Also, if you are using GCC 7 you will probably need the next patch as well:

Please note that you need to use librealsense v1.12.1 and realsense 1.8.1 with the R200, as support for this camera was dropped in later versions.

Building and testing

You may move to the root of your workspace (ros_ws) and build it:
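A sketch of the build step, assuming the ros_ws workspace used throughout this post:

```shell
# Build the whole workspace, limiting parallel jobs to 4
cd ~/ros_ws
catkin_make -j4
```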

(-j4 is required to prevent compilation failures due to RAM constraints)

In case of failure, check whether it is asking for some dependencies (every distribution ships with a slightly different set of packages preinstalled).

Now you can test if everything is working fine.

Ensure that ROS Kinetic is installed on your PC and that the Odroid XU4 NetBIOS name is properly configured inside your PC’s /etc/hosts file (all details for network setup here).
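For reference, the /etc/hosts entry on the PC looks something like this (the IP and the hostname odroid are just examples; use your actual values):

```
# /etc/hosts on the PC
192.168.1.50    odroid
```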

Then you can view the image topics from the camera using:
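A sketch of the two sides, assuming the realsense_camera package built above; the launch file name comes from the 1.x nodelet package and may differ in your version, and the IP is an example:

```shell
# On the Odroid: start the camera nodelet
roslaunch realsense_camera r200_nodelet_default.launch

# On the PC: point ROS at the master running on the Odroid, then view the topics
export ROS_MASTER_URI=http://192.168.1.50:11311
rosrun rqt_image_view rqt_image_view
```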

Replace the address with the IP of your embedded Linux box (the Odroid XU4).

“Padre Balistico” a (hopefully) fast line follower robot


Spending time with the guys at Officine Robotiche (a huge thanks to Stefano Artigianato Robotico), I’ve discovered the world of line followers: apparently simple robots whose ultimate goal is to, guess what, follow a line 🙂

This simple task involves several issues:

  • reading sensors to determine the line displacement;
  • employing a control algorithm to determine the movement vector;
  • motor control;
  • several other ancillary tasks, like telemetry, etc.

Obviously mine will run LibrePilot 🙂 and that simplifies a lot of things, as most of the components needed are either already in place or need only minor rework.

Unfortunately I didn’t have enough time to make a custom board, so I went with some ready-made parts for the sensor board and motor driver. Thanks to Stefano’s suggestions, and after fiddling with their datasheets, I ended up with the following component list:

It will be based on the OpenPilot Revolution board, given it has plenty of I/O and an RF module onboard, useful for tuning and telemetry.

I’ve also left some mounting holes that may host a NanoPi NEO. One day it could be used for optical recognition, e.g. for better line speed planning.

This is the frame I made for this robot:



It is made of two parts, the main frame structure and the sensor housing. It is available for download at Thingiverse.

I used PETG for the main part and PLA, which is hard and has very low friction, for the sensor housing.



And here is a short video of the thing moving for the first time, using a standard RC transmitter (and its receiver) for control. It is already using the gyro for yaw/trajectory stabilization.

There is still a big to-do list ahead, including (but not limited to) reversible motor handling and sensor reading.

LibrePilot already had almost everything needed. I added a specific target (I called it roborevolution; it is based on the Revolution target, with several changes needed to manage brushed motors and all the required sensors). To get this first test working, I only had to tweak the servo motor drivers to handle higher frequencies and to increase the available resolution.


Edit: here is the (still very hacky) source code I’m working on.

Edit 2: and it works 🙂 Until I find some time for a new update post, here are two videos of its first tests…

…after a bit of tuning 🙂