So the results of the hackathon were released yesterday, and I must say, we’re very happy with how things turned out. We were placed 4th against some very tough competition, and were given an honourable mention by the organisers:
Before naming the three winners, we would like to give an honourable mention to team Telemetron. They created an outstanding robotic behaviour and actively participated and documented their project throughout the week. They were also very active on Twitter. As a result they submitted by far the best documentation. – Cloud Robotics Hackathon 2013 Continue reading
So today is the day of the deadline and everything has actually come together quite nicely.
@tobypinder and I have been working on this separately since Friday, using only the Internet to communicate (in the spirit of the Hackathon and all that). He did a great job with the web front end, allowing us to carry out a test run with the robot transmitting live data this afternoon. There were a few hiccups with turn angles and things, but once they were ironed out, the results were relatively good! Continue reading
What? There’s only 24 hours in a day now?
So my original aim to post daily updates hasn’t quite worked out. @tobypinder and I have been fitting this project in around ‘real’ work, so every spare minute has been spent working on the robot, not leaving much time to actually write about it! Continue reading
Today, in short, I made the robot work as a roamer. I’ve been chasing bugs all day, but finally got it working (almost) exactly how I want it. The video below shows how it was this afternoon, I’ve since tweaked some parameters and ironed out some bugs in the CAN transmission side of things so it now doesn’t get confused when it enters a wide open space with all the sensors reading maximum range. Continue reading
So this year it seemed fate wouldn’t allow me to get my usual annual robotics fix through Eurobot; I’m at uni the week of the UK competition studying for my master’s.
Instead, a couple of colleagues and I have decided to put a few days’ work into the Cloud Robotics Hackathon. The theme this year surrounds remote robot monitoring, which, given our involvement in a telematics project at work, suits us down to the ground.
Project: Remote Simultaneous Localis(z)ation and Mapping (SLAM) using cloud services
The plan is to convert the robot we entered into Eurobot last year to transmit data to the ‘cloud’ web service, myrobots.com. It will roam around a room, scanning its path with infrared sensors, and transmit these readings along with encoder readings. This information will then be used to construct a map of the robot’s environment. Myrobots will be used to display robot status information, and alerts about changes in the environment. Continue reading
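The mapping idea above boils down to two steps: dead-reckon the robot’s pose from the encoder readings, then use each infrared range reading to mark an obstacle cell in a grid map. A minimal sketch of that, in Python — the tick/wheel/cell constants and function names are my own illustrative assumptions, not values from our actual robot:

```python
import math

# Hypothetical parameters -- assumptions for illustration, not the robot's real calibration.
TICKS_PER_MM = 2.0     # encoder ticks per millimetre of wheel travel
WHEEL_BASE_MM = 150.0  # distance between the two drive wheels
CELL_MM = 50.0         # occupancy-grid cell size

def update_pose(pose, left_ticks, right_ticks):
    """Differential-drive dead reckoning: pose is (x_mm, y_mm, theta_rad)."""
    x, y, theta = pose
    dl = left_ticks / TICKS_PER_MM
    dr = right_ticks / TICKS_PER_MM
    d = (dl + dr) / 2.0                  # distance travelled by robot centre
    theta += (dr - dl) / WHEEL_BASE_MM   # heading change from wheel difference
    return (x + d * math.cos(theta), y + d * math.sin(theta), theta)

def mark_obstacle(grid, pose, sensor_angle, range_mm):
    """Mark the grid cell hit by an IR reading taken at the given pose."""
    x, y, theta = pose
    ox = x + range_mm * math.cos(theta + sensor_angle)
    oy = y + range_mm * math.sin(theta + sensor_angle)
    grid[(round(ox / CELL_MM), round(oy / CELL_MM))] = 1  # occupied cell
    return grid
```

In the cloud version, the (pose, reading) pairs would be what gets transmitted, with the grid assembled server-side rather than on the robot.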
After an intensive week of testing on the competition table at Middlesex University (our only time with the table), our team, “R.Me.R.T”, placed 5th overall, which, although not taking us to the world finals in France, was an excellent result considering the timescale of the build. The robot passed the approval phase in time to take part in all four qualification matches. Continue reading
‘An Accident Waiting to Happen’ was an art installation created by the Autonomous Systems Lab, and debuted at Kinetica Art Fair 2011. The piece consists of an organic table, on top of which sits a ceramic vase. As observers move closer to the table, the vase comes to life and slides around the table, reacting to the positions of people standing around it. The vase approaches the edge of the table, and hangs on as it teeters over the side, occasionally breaking its ‘spiritual’ bond with the table and crashing to the floor. Continue reading
The proximity sensing in ‘An Accident Waiting to Happen’ was done by a PICAXE 28X2 microcontroller, reading 6 Devantech SRF-10 ultrasonic sensors mounted around the perimeter of the table. The PICAXE board was programmed in PBASIC using their proprietary IDE. The detection cones of the 6 sensors overlapped slightly, so there were areas where an observer would only be detected by one sensor, and also areas in between these where they would be detected by 2 sensors. This gave us 12 ‘zones’ in which to detect objects. Proximity readings were taken by the 6 sensors in sequence, and the result was output to the PLC as a 4-bit parallel signal representing which zone had the closest object. Continue reading
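The zone logic described above can be sketched as follows. This is a Python reconstruction of the idea for illustration only — the actual installation ran PBASIC on the PICAXE, and the range threshold and zone numbering here are my assumptions. Even-numbered zones are “seen by sensor i only”; odd zones are the overlap between adjacent sensors:

```python
# Hypothetical reconstruction -- threshold and zone numbering are
# assumptions, not taken from the installation's actual PBASIC code.
MAX_RANGE_CM = 200  # readings at or above this mean "nothing detected"

def closest_zone(readings):
    """Map 6 ultrasonic readings (cm) to one of 12 zones, or None.

    Zone 2*i means the object is seen by sensor i alone; zone 2*i + 1
    means it sits in the overlap between sensor i and sensor (i+1) % 6
    (the sensors ring the table, so the zones wrap around).
    """
    n = len(readings)
    active = [i for i, r in enumerate(readings) if r < MAX_RANGE_CM]
    if not active:
        return None  # nobody near the table
    i = min(active, key=lambda i: readings[i])  # sensor with closest object
    if readings[(i + 1) % n] < MAX_RANGE_CM:
        return 2 * i + 1                 # overlap with the next sensor
    if readings[(i - 1) % n] < MAX_RANGE_CM:
        return (2 * i - 1) % (2 * n)     # overlap with the previous sensor
    return 2 * i                         # single-sensor zone
```

With 12 possible zones the result fits comfortably in the 4-bit parallel signal sent to the PLC.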
Heineken Robot is an autonomous robotic bartender developed by ASL. The robot was developed from a ‘fun’ installation created for Kinetica Art Fair 2010. The project brief was to build, to a tight budget, a robot that could serve a cup of beer from an off-the-shelf mini beer keg. Continue reading