Cloud Robotics Hackathon: Friday Update

What? There’s only 24 hours in a day now?

So my original aim of posting daily updates hasn’t quite worked out. @tobypinder and I have been fitting this project in around ‘real’ work, so every spare minute has been spent working on the robot, not leaving much time to actually write about it!

Software Progress

Since my last update we’ve succeeded in getting the robot to follow walls reliably. I tweaked the software I had working by Sunday: the robot not only avoids walls when it encounters them, it now also attempts to re-find a wall if it loses sight of it, for example around external corners.

I was getting to the point with the obstacle avoidance/tracking routine where I was tripping over loads of different ifs and flags. In the end I took a step back and re-wrote it as a state machine, with a switch()..case controlling the current state of the robot:
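Something along these lines (a minimal sketch only: the state names, thresholds and motor/sensor helpers are stand-ins I’ve invented for illustration; the real routine is in the repo linked below):

```c
/* Sketch of the state-machine approach. State names, thresholds and the
 * motor/sensor helpers are invented for illustration; the real routine
 * is in object_detection.c, linked below. */
#define WALL_FOUND_MM  300   /* side reading closer than this: wall acquired */
#define WALL_LOST_MM   450   /* side reading further than this: wall lost    */
#define WALL_AHEAD_MM  200   /* front reading closer than this: turn away    */
#define TARGET_MM      250   /* distance to hold while following             */

extern void drive_forward(void);
extern void steer_to_hold_distance(int current_mm, int target_mm);
extern void turn_away_from_wall(void);
extern void turn_towards_wall(void);
extern void stop_motors(void);

typedef enum {
    SEEKING,      /* no wall in sight: drive forward until one appears */
    FOLLOWING,    /* wall on one side: hold a set distance from it     */
    AVOIDING,     /* wall ahead: turn away from it                     */
    REACQUIRING,  /* wall lost (e.g. external corner): turn back to it */
    STOPPED       /* halt                                              */
} robot_state_t;

static robot_state_t state = SEEKING;

/* Called periodically with the latest IR range readings (mm). */
void object_detection_update(int ir_front_mm, int ir_side_mm)
{
    switch (state) {
    case SEEKING:
        drive_forward();
        if (ir_side_mm < WALL_FOUND_MM)
            state = FOLLOWING;
        break;

    case FOLLOWING:
        steer_to_hold_distance(ir_side_mm, TARGET_MM);
        if (ir_front_mm < WALL_AHEAD_MM)
            state = AVOIDING;
        else if (ir_side_mm > WALL_LOST_MM)
            state = REACQUIRING;
        break;

    case AVOIDING:
        turn_away_from_wall();
        if (ir_front_mm >= WALL_AHEAD_MM)
            state = FOLLOWING;
        break;

    case REACQUIRING:
        turn_towards_wall();
        if (ir_side_mm < WALL_FOUND_MM)
            state = FOLLOWING;
        break;

    case STOPPED:
    default:
        stop_motors();
        break;
    }
}
```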

Taking this approach makes the software much more predictable, as the task can only be in one of five states at any one time.

https://github.com/chrispbarlow/cloudrobotics2013/blob/Day5/Source/Tasks/object_detection/object_detection.c

How are we getting the data, and what are we doing with it?

At this point it’s probably a good time to explain how we are connecting to the ‘cloud’. The company I work for, Smith Electric Vehicles, runs a large telematics program that enables them to monitor vehicle performance and aid future development.

For this project, we are using the same hardware that is fitted to the vehicles to log the CAN bus data and transmit it to the SmithLink servers, where the data is translated and stored in a MySQL database.
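On the robot’s side, that just means putting the sensor readings onto the CAN bus in frames that the telematics unit can log. As a rough sketch (the frame ID, byte layout and can_transmit() driver call are all invented for this example; the real layout depends on the hardware):

```c
#include <stdint.h>

/* Illustrative only: the CAN ID and byte layout are made up, and
 * can_transmit() stands in for whatever driver call the target
 * microcontroller actually provides. */
#define SENSOR_FRAME_ID  0x123u

extern void can_transmit(uint32_t id, const uint8_t *data, uint8_t len);

void send_sensor_frame(uint16_t left_encoder, uint16_t right_encoder,
                       uint16_t ir_front_mm, uint16_t ir_side_mm)
{
    uint8_t frame[8];

    /* Little-endian, two bytes per signal: four 16-bit readings fill
     * the 8 data bytes of a classic CAN frame. */
    frame[0] = (uint8_t)(left_encoder  & 0xFF);
    frame[1] = (uint8_t)(left_encoder  >> 8);
    frame[2] = (uint8_t)(right_encoder & 0xFF);
    frame[3] = (uint8_t)(right_encoder >> 8);
    frame[4] = (uint8_t)(ir_front_mm   & 0xFF);
    frame[5] = (uint8_t)(ir_front_mm   >> 8);
    frame[6] = (uint8_t)(ir_side_mm    & 0xFF);
    frame[7] = (uint8_t)(ir_side_mm    >> 8);

    can_transmit(SENSOR_FRAME_ID, frame, sizeof(frame));
}
```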

Toby’s web front-end for this project pulls the data from this database and uses the encoder readings and IR sensor measurements to plot, on a canvas, the outlines of the objects that the robot sees. In other words, we see what the robot sees, gaining an insight into the robot’s awareness of its surroundings.
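The geometry behind that plotting is straightforward dead reckoning plus a range projection. The sketch below is my summary of the idea in C, not Toby’s actual front-end code (which runs in the browser), and the constants are made up:

```c
#include <math.h>

/* Sketch of the geometry behind the map drawing; constants are invented. */
#define TICKS_PER_MM   2.0    /* encoder ticks per mm of wheel travel */
#define WHEELBASE_MM 120.0    /* distance between the two wheels      */

typedef struct { double x, y, heading; } pose_t;  /* mm, mm, radians */

/* Differential-drive dead reckoning: update the pose from the change
 * in the left/right wheel encoder counts since the last update. */
void update_pose(pose_t *p, int dleft_ticks, int dright_ticks)
{
    double dl   = dleft_ticks  / TICKS_PER_MM;
    double dr   = dright_ticks / TICKS_PER_MM;
    double dist = (dl + dr) / 2.0;

    p->heading += (dr - dl) / WHEELBASE_MM;
    p->x       += dist * cos(p->heading);
    p->y       += dist * sin(p->heading);
}

/* Project an IR range reading into a world-frame point to plot.
 * ir_angle is the sensor's mounting angle relative to the robot's
 * heading, e.g. +M_PI / 2 for a left-facing sensor. */
void ir_hit_point(const pose_t *p, double ir_range_mm, double ir_angle,
                  double *hit_x, double *hit_y)
{
    *hit_x = p->x + ir_range_mm * cos(p->heading + ir_angle);
    *hit_y = p->y + ir_range_mm * sin(p->heading + ir_angle);
}
```

Each IR hit point gets drawn onto the canvas, and over time the accumulated points trace out the outlines of the walls and obstacles the robot has passed.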

In the spirit of the Hackathon, we will also use the MyRobots API to plot graphs of other data that the robot is collecting, such as motor speed and temperature.

As all the computation for drawing the environment and monitoring the robot is carried out by Toby’s application, it would be possible to have more than one robot working together on the same environment map, or even contributing to a larger one.

What’s it look like now?

It’s starting to look a bit mental with all the hardware bolted on!

[Photo: Tallbot]

Web-side progress

We are now at a stage where we can play recorded data through the system:

[Screenshot: frontend2]

What’s next?

We are very close to joining the two parts of the system so that we can see a live stream from the robot! Very exciting times!
