Having spent a couple of weeks playing with ROS on and off, getting robot vision, speech and a couple of other robot basics working, a basic thought struck me.
Why don’t I look at the basic robots provided by ROS and see if it’s easier to reverse engineer them?
This probably should have struck me earlier, but I’d got involved with getting face detection working and was really quite enjoying it.
It turns out that the ROS TurtleBot used in the basic tutorials has a physical counterpart, also called TurtleBot, which is for sale (and will soon be available in Europe). This platform is based on the iRobot Roomba (500 series?) vacuum cleaners.
Now, I don’t have a Roomba either, and I’m not going to spend £200-odd (well, let’s face it, the wife wouldn’t sign it off) on a robotic vacuum cleaner only to hack it apart for an experiment.
However, the Roomba and the Create both use the Open Interface protocol, which handily has a PDF protocol document available. I did a bit of searching online and could find a few Arduino-based libraries to control Roombas, but none to emulate them.
So off I went and wrote this Open Interface Arduino Library.
Today it is a pretty complete implementation: it handles communication with a controller that thinks it’s talking to an iRobot device, using callbacks to let a developer handle the driving, song playing and sensor interaction in their own way.
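To give a feel for how it fits together, here is a rough sketch of an Arduino sketch built around the library. Treat the exact names (the header, the constructor and the registerDriveCallback()/handle() calls) as illustrative placeholders rather than the library’s definitive API; check the headers in the repository for the real signatures.

// Sketch only: header, constructor and callback-registration names below are
// placeholders, not necessarily the exact API the library exposes.
#include <OpenInterface.h>

OpenInterface oi(&Serial);   // talk OI over the Arduino's hardware serial port

// Called when the controller (e.g. the ROS TurtleBot stack) sends a Drive
// command: velocity in mm/s and turn radius in mm, per the OI spec.
void handleDrive(int16_t velocity, int16_t radius)
{
  // Translate the Roomba-style command into whatever your own motor
  // driver needs (PWM, H-bridge pins, a serial motor controller, ...).
}

void setup()
{
  oi.registerDriveCallback(handleDrive);  // placeholder registration call
  oi.init();                              // placeholder: start listening for OI packets
}

void loop()
{
  oi.handle();  // placeholder: parse incoming bytes and fire the callbacks
}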
I’ve tested this with my own bot, which is based on a 1984 Tomy/Tandy Omnibot/Robbie Snr toy robot. I have it driving around under the control of the vanilla ROS TurtleBot implementation, with no problems at all 🙂
Now I’m not a C++ coder by any standard, so if someone would like to comment/code review/tear to shreds, please do so. I won’t be offended (much).
We have a turtlebot_simulator stack that allows you to control a virtual TurtleBot if you want to play with a robot but not buy one 🙂
Hi there. Have you made any further progress on your ROS robot project? I know how life tends to get in the way of interesting projects like this one.
Yeah, I bought a Kinect to add to it and it’s now running on an Acer Aspire One notebook running ROS. I had some issues with the Kinect pulling too much power from my DC-DC converter (my 1980s bot runs from a 6 V lead-acid battery, and the Kinect needs a 12 V supply as well as the USB connection).
I’ve not had him running since November. I’ll try to go back through my work and document a bit more.
Bob
Hello,
I am new to Arduino and ROS. I’ve tested your OpenInterface implementation on my own bot (which is also an Omnibot 2000 :). It works great for driving the bot directly. My question now is: how can I add custom sensors or bumpers? Do I have to extend the code, or can I just connect a bumper switch to a specific port on the Arduino?
I also want to add custom wheel encoders and servos… How do I get the sensor messages to ROS?
Kind regards from Germany.
Sascha Loos
Hi Sascha,
No, the interface doesn’t have any default configuration for any sensors.
You only need to register callbacks for functions that the Bot would be told to perform by ROS, e.g. move forward, activate an output/actuator.
There is a method available on the OpenInterface class, setSensorValue(), to set the value of any sensor.
There are some constants available to allow you to set the values.
OI_SENSOR_WHEELDROPS_BUMPS
and
OI_MASK_BUMP_RIGHT
OI_MASK_BUMP_LEFT
OI_MASK_DROP_RIGHT
OI_MASK_DROP_LEFT
OI_MASK_DROP_CASTER
The masks allow you to address a single bit in the OI_SENSOR_WHEELDROPS_BUMPS sensor packet.
By calling OpenInterface::setSensorValue() the OpenInterface object maintains the state to send back to ROS when it is requested.
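As a rough example (a sketch only; I’m assuming here that setSensorValue() takes the sensor packet ID and the byte value, and that your bump switch pulls the pin low when pressed, so double-check against the header file):

const int LEFT_BUMP_PIN = 7;   // whichever Arduino pin your switch is wired to

void setupBumpers()
{
  pinMode(LEFT_BUMP_PIN, INPUT);
  digitalWrite(LEFT_BUMP_PIN, HIGH);   // enable the internal pull-up
}

void updateBumpers(OpenInterface &oi)
{
  uint8_t packet = 0;
  if (digitalRead(LEFT_BUMP_PIN) == LOW)   // switch closed = bumper pressed
    packet |= OI_MASK_BUMP_LEFT;           // set the left-bump bit
  oi.setSensorValue(OI_SENSOR_WHEELDROPS_BUMPS, packet);  // assumed (id, value) signature
}

Call updateBumpers() from your main loop and the library will hand back the latest value whenever ROS polls that sensor packet.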
Hope that helps.
If you have any further requests, log it as a ticket in GitHub, and it may kick me into doing a bit more work on this, as I’ve lapsed recently.
Bob.
Hello Bob,
Thank you for your answer.
Because I’m not a programmer, I’m having trouble finding the right place to start.
At the moment I’m a bit confused by the different file types like .h, .cpp etc…
(Until now I have only written small Bash scripts :))
If I understand it right, I have to extend the OpenInterface class (the .cpp file?) for my needs. At the moment I don’t understand where to begin: where does the code for the hardware interaction go, like defining the pins, reading the state of the pins or triggering a pin, and so on…
Even a short example of how to add a simple sensor by defining pins and writing a little routine to get pin states or set a pin would be great 🙂
Greets
Sascha Loos
Hi Sascha,
I hope all is well with you.
I have a question for you. I have recently joined the Open Interface/Arduino/iRobot Create movement and would like to know whether you have found an answer to your question by any chance. That would help me a lot.
My greetings from Oman
Sultan