Department of Computing: Robotics Lecture Course (Course code 60019)

I teach the Robotics Course in the Department of Computing, attended by third years and MSc students. This is a one-term course which focuses on mobile robotics, and aims to cover the basic issues in this dynamic field via lectures and a large practical element where students work in groups. For the 2020-2021 academic year we are running the course in fully remote mode, but will follow the "learn by doing" and "build robots and algorithms from scratch" philosophy of the course in its usual hardware form as closely as possible.

We will implement mobile robotics algorithms within the powerful CoppeliaSim robotics simulator, which is available free for educational purposes from Coppelia Robotics and runs on Linux, Windows or macOS.

The course always finishes with a competition where the groups compete to build and program the robot which can most effectively complete a certain challenge against the clock. See the bottom of this page for pictures and videos from previous years' competitions.

Huge thanks to Stephen James, Tristan Laidlow, Zoe Landgraf, Shikun Liu, Hide Matsuki, Riku Murai, Joe Ortiz, Raluca Scona, Edgar Sucar, Kentaro Wada and Shuaifeng Zhi, the current lab assistants, who have helped a lot with the development of the practicals and are another point of contact for any problems.

Spring 2021 Important Information

Course Schedule

The timetabled slots for Robotics are four hours a week: Wednesdays from 10am to 12pm and Fridays from 1pm to 3pm. All lectures and practical sessions will take place live on Microsoft Teams.

All lectures will be given live, but will also be recorded and uploaded to Panopto soon afterwards to watch again.

In the first week of the course only, we will have a two-hour introductory live lecture on Wednesday. The first practical will be on Friday of the first week.

Most weeks we will have a one-hour live lecture at 10am on Wednesday, and three hours of compulsory live practical sessions: from 11am to 12pm on Wednesday and from 1pm to 3pm on Friday.

The course runs for 7 weeks from week 2 to week 8 of college term. The full plan for lectures and tutorials will be kept up to date below, and I will announce any changes in lectures.

Lecture slides and practical sheets for the course will be available from the links below (the links will come alive gradually throughout term).

Week 1 (Jan 20 and 22)
Lecture, Wed 10am-12pm: 1. Introduction to Robotics
Practical, Fri 1pm-3pm: 1. Introduction to CoppeliaSim

Week 2 (Jan 27 and 29)
Lecture, Wed 10am: 2. Robot Motion
Practical, Wed 11am and Fri 1pm-3pm: 2. Accurate Robot Motion (assessed)
practical2_fix.ttt (fixes Windows problem)
practical2_fix2.ttt (fixes Windows problem in a better way!)

Week 3 (Feb 3 and 5)
Lecture, Wed 10am: 3. Sensors
Practical, Wed 11am and Fri 1pm-3pm: 3. Sensors
Assessment of Practical 2 during the Friday session

Week 4 (Feb 10 and 12)
Lecture, Wed 10am: 4. Probabilistic Robotics
Practical, Wed 11am and Fri 1pm-3pm: 4. Probabilistic Motion (assessed)

Week 5 (Feb 17 and 19)
Lecture, Wed 10am: 5. Monte Carlo Localisation
Practical, Wed 11am and Fri 1pm-3pm: 5. Monte Carlo Localisation (assessed)
Assessment of Practical 4 during the Friday session

Week 6 (Feb 24 and 26)
Lecture, Wed 10am: 6. Advanced Sensing
Practical, Wed 11am and Fri 1pm-3pm: 6. Navigation Challenge
Assessment of Practical 5 during the Friday session

Week 7 (Mar 3 and 5)
Lecture, Wed 10am: Guest Lecture: Dr. Rob Deaves, Dyson, on Robotics in Industry
Practical, Wed 11am and Fri 1pm-3pm: Navigation Challenge Competition


For practicals, you will work in fixed groups of 2-4 members throughout term. Please organise yourselves into groups, and use this Wiki to record the members of your practical group. You can see the names of the other students registered for the course via the MS Teams channel. If you need help to find group members, there is a "Search For Teammates" section on Piazza you can use, or contact me if you need more help. It is not crucial to be in a fixed team for the week 1 introductory practical, but please settle on your group by the start of week 2.

Live practical sessions will take place on MS Teams. Please join the channel assigned to your group number to work in a video chat with your group members during the session. Please note that these channels are not private and any student doing Robotics will be able to join any of the channels, so please keep to your own group's channel. You will probably want to set up your own separate way of communicating and sharing files within your group using your own MS Teams chat or otherwise.

The team of TAs and I will be in the live session and we will "visit" different groups to see how you are getting on. If you have a question, post a short message with your group number in the Lab Question Queue channel and a TA will join your channel to talk to you.

There will be a new practical exercise set every week. These will be announced in lectures and a detailed practical sheet explaining what to do will be made available from the links in the schedule. You will be able to work on these exercises during the live practical sessions with teaching assistants supporting via MS Teams, and use your own time outside of practical sessions to complete the exercises. General support outside of live sessions is available via Piazza.

Three practicals during term (the ones set in weeks 2, 4 and 5) will be assessed. Practicals are assessed via a face-to-face MS Teams discussion and a live demonstration of your work via screenshare with me or one of the teaching assistants. You will have one week to work on each of the assessed exercises, and the sessions when the assessments will happen are marked in the schedule above. The exact goals of each exercise and what you will have to demonstrate will be explained clearly in the practical sheets. These assessed practicals form the only coursework element of the Robotics course: no additional assessed exercises will be set, and you will not need to submit any coursework documents or code. We will add up all marks from the assessed practicals throughout term to form a final coursework mark for each group. All members of a group will receive the same mark by default.

Additional Handouts

Although the course will be self-contained and you will not need any additional resources to complete it, I will link here some additional materials which some students might find interesting, related both to what we do in the course and the wider area of robotics and Embodied AI.

Robot Floor Cleaner Case Study: a tutorial we have used in previous years of Robotics to get students thinking about and discussing some of the capabilities of mobile robotic products. Questions and Answers.

Monte Carlo Localization: Efficient Position Estimation for Mobile Robots, Dieter Fox, Wolfram Burgard, Frank Dellaert, Sebastian Thrun. The original paper on MCL.

Rearrangement: A Challenge for Embodied AI, Dhruv Batra, Angel X. Chang, Sonia Chernova, Andrew J. Davison, Jia Deng, Vladlen Koltun, Sergey Levine, Jitendra Malik, Igor Mordatch, Roozbeh Mottaghi, Manolis Savva, Hao Su. An up-to-date discussion of the current challenges in robotics and embodied AI research, with a focus on research using simulation platforms (including Imperial's own RLBench, which is based on CoppeliaSim).


Thank you very much to Stefan Leutenegger, Duncan White, Geoff Bruce, Lloyd Kamara, Maria Valera-Espina, Jan Czarnowski, Charlie Houseago, Binbin Xu, Andrea Nicastro, Robert Lukierski, Lukas Platinsky, Jan Jachnik, Jacek Zienkiewicz, Jindong Liu, Adrien Angeli, Ankur Handa and Simon Overell who helped enormously with the course in previous years; and to Ian Harries and Keith Clark who developed earlier material from which the current course has evolved. In particular Ian Harries deserves the credit for making this the practically-driven course it still is today.

Localisation Challenge in 2018

The 2018 challenge was about accurate and fast localisation against a map, with robots starting in a randomly placed location and having to visit other waypoints as quickly as possible. The winning team of Qiulin Wang, Rui Zhou, Jianqiao Cheng and Chengzhi Shi had an extremely fast and efficient robot!

Path Planning Challenge in 2017

This year we had a path planning challenge, where the groups had to program their robots to use their sonar sensor to detect obstacles in a crowded area and then use a local planning approach to find a smooth but fast route across. Most groups followed the Dynamic Window Approach that I had shown in the lectures (and which you can try in simulation using this python code). This video shows the incredibly fast and smooth robot from the winning team of Alessandro Bonardi, Alberto Spina and Riku Murai. Thanks to Charlie Houseago for the photo, video and commentary!
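The core idea of the Dynamic Window Approach fits in a short sketch. The version below is a minimal illustration, not the course's linked python code: the robot limits, sample counts and scoring weights are invented for the example, and real implementations also check that the robot could stop before the nearest obstacle.

```python
import math

def simulate(x, y, theta, v, w, dt=0.1, steps=10):
    """Roll a constant (v, w) command forward to predict a short trajectory."""
    traj = []
    for _ in range(steps):
        theta += w * dt
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        traj.append((x, y))
    return traj

def dwa_step(pose, v_cur, w_cur, goal, obstacles,
             v_max=0.5, w_max=2.0, a_v=0.5, a_w=2.0, dt=0.1):
    """Pick the best (v, w) from the dynamic window: the velocities reachable
    from (v_cur, w_cur) within one control period, scored on goal progress,
    obstacle clearance and speed."""
    best, best_score = (0.0, 0.0), -float("inf")
    for i in range(5):        # sample linear velocities across the window
        v = max(0.0, min(v_max, v_cur - a_v * dt + i * (2 * a_v * dt / 4)))
        for j in range(11):   # sample angular velocities across the window
            w = max(-w_max, min(w_max, w_cur - a_w * dt + j * (2 * a_w * dt / 10)))
            traj = simulate(pose[0], pose[1], pose[2], v, w)
            clearance = min((math.dist(p, ob) for p in traj for ob in obstacles),
                            default=10.0)
            if clearance < 0.2:          # trajectory passes too close: reject
                continue
            progress = -math.dist(traj[-1], goal)   # closer to goal is better
            score = progress + 0.1 * min(clearance, 1.0) + 0.05 * v
            if score > best_score:
                best_score, best = score, (v, w)
    return best
```

Each call returns one velocity command; running it in a loop while updating the pose from odometry gives the reactive local planning behaviour seen in the competition.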

Object Finding Challenge in 2016

This year we tried quite a different challenge, but as ever with the emphasis on sensing, accurate localisation and speed. Three bottles were randomly placed within the course area which we had previously used for Monte Carlo Localisation, and the goal was to find them, proving this by bumping into each in turn, and then returning to the starting point within the course to prove that the robot was still localised. Several teams did very well, and the challenge was won by the team of Saurav Mitra, Andrew Li, Daniel Grumberg and Mohammad Karamlou who had a very fast and innovative strategy.

Localisation Challenge in 2015

This year we again had a large class (150) and introduced new custom motor controller software by Stefan Leutenegger and Jan Jachnik to run on the Raspberry Pi/BrickPi kits. The groups had to tune their own PID controllers to make their robots move smoothly and repeatably. The final challenge was localisation in a course next to a wall, where three locations had to be visited as quickly as possible. The challenge was won by the team of Arnas Kaminskas, Baisheng Song, Jiayun Ding, Siqi Wei and Yiming Lin, who overcame various hardware difficulties during the course to build a very well calibrated and precise robot.
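A PID speed controller of the kind the groups tuned can be sketched in a few lines. The gains and the first-order motor model below are illustrative values, not those of the course software:

```python
class PID:
    """Textbook PID controller: output = kp*e + ki*integral(e) + kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt                    # accumulate for the I term
        derivative = (error - self.prev_error) / self.dt    # slope for the D term
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# Illustrative closed loop against a simple first-order motor model:
pid = PID(kp=2.0, ki=1.0, kd=0.05, dt=0.05)
speed = 0.0
for _ in range(400):
    command = pid.update(1.0, speed)     # target speed 1.0
    speed += 0.05 * (command - speed)    # motor lags behind the command
```

Tuning then amounts to raising kp until the response is brisk, adding ki to remove steady-state error, and a little kd to damp overshoot.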

Localisation Challenge in 2014

In 2014, with more students than ever on the course (140+), for the first time we used a new robotics platform with the combination of Raspberry Pi/BrickPi/Python/Lego NXT motors and sensors. For this year's challenge the groups had to build robots which used odometry and sonar to solve a tricky localisation and navigation problem as fast as possible. Within a walled area for which the teams were given a map in advance, the robot was placed in one of five known locations chosen randomly and at a random orientation. It had to recognise its location by taking a sonar scan and matching this against saved templates, then set off to navigate to each of the other four locations in turn before returning to its start point, using continuous Monte Carlo Localisation or other techniques of the group's choice. Accurate motor control proved to be tricky via BrickPi (something we have hopefully improved substantially for 2015 with new low level control software), but some groups still achieved very good solutions. This is the very impressive winning robot from the team of Christos Karamanos, Panos Evagorou, Pamela Cruz and Andrew Jack which used the sonar to full advantage during navigation by deliberately turning it to point orthogonally at walls for high quality measurements.

Localisation Challenge in 2013

We returned to a localisation challenge similar to the one in 2011, with the groups' robots racing against the clock to reach three waypoints in a course laid out close to the wall of the lab. A wall-following approach was generally the most successful. This year there were a very large number of students doing the course, and we had 29 groups competing, of which around 10 managed to complete the course. Several groups got very good times, but with a fast time in their second run the group of Udara Gamalath, Alejandro Garcia Ochoa, Tristan Pollitt, Shailesh Ghimire and Maxwell Popescu clinched victory in 51 seconds. Sorry that I don't have a video of their robot's winning run!

Mapping and Navigation Challenge in 2012

This year we had a completely new challenge where the teams had to build robots able to cross an area filled with obstacles without bumping into any of them. The robots had to rely on good odometry for localisation, and then use sonar to detect obstacles and plan a path between them. Marks were given depending on how far the robots managed to advance, with extra time points for those teams that made it all the way across. The starting point was the techniques for occupancy grid mapping we had learnt in the course, but the desire for speed meant that many teams switched to simpler methods due to the high computational cost of occupancy mapping.
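An occupancy grid update for a single sonar return is short to write down. This sketch uses a log-odds grid with invented evidence values; real sonar models also spread the update over the sensor's wide beam cone, which is part of what makes full occupancy mapping computationally expensive:

```python
import math

def sonar_update(grid, pose, bearing, rng, cell=0.1, l_occ=0.9, l_free=-0.4):
    """Update a log-odds occupancy grid with one sonar return: cells along
    the beam gain free-space evidence, and the cell at the range reading
    gains occupied evidence. grid[row][col] indexes y then x."""
    x, y, theta = pose
    steps = int(rng / cell)
    for i in range(steps + 1):
        d = i * cell
        cx = int((x + d * math.cos(theta + bearing)) / cell)
        cy = int((y + d * math.sin(theta + bearing)) / cell)
        if 0 <= cy < len(grid) and 0 <= cx < len(grid[0]):
            grid[cy][cx] += l_occ if i == steps else l_free
```

Thresholding the accumulated log-odds values then classifies cells as free, unknown or occupied for path planning.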

The winning robot was developed by Nicolas Paglieri, Clemens Lutz, Antonio Azevedo and Francesco Giovannini and completed the challenge in a remarkably fast 21 seconds; impressively, around half of the teams completed the whole course, and a couple came close to this time. The winning team's robot used the Lego light sensors cleverly as proximity sensors, giving the robot an extra obstacle sensor which was particularly useful at high speed, and this together with fast planning gave them the best time.


Localisation Challenge in 2011

We used quite a different layout for this year's localisation challenge, with the robots needing to recognise their location and orientation in one of three randomly chosen places and then navigate fast along a long corridor to pre-determined waypoints. Marks were given both for accuracy and time.

The winners' robot was remarkably precise, and its motion included particularly nice curved entries into the waypoint spaces, all while maintaining very good speed such that it beat its nearest competitor by 8 seconds. The members of the winning group were Alexandre Vicente, Ajay Lakshmanan, Garance Bruneau, Kevin Keraudren, Axel Bonnet and Zae Kim (video courtesy of Jindong Liu).


Localisation Challenge in 2010

The challenge this year was similar to 2009, but in a new course and with a marking scheme which emphasized accuracy over speed. Again, the robots were placed randomly by me at one of five pre-learned waypoints and had to determine their locations and navigate autonomously to all of the other waypoints --- within 5cm accuracy to gain full marks, which was a challenging problem.

The winning team this time consisted of Jim Li, Daniel Abebe, Robert Kopaczyk, Nicholas Heung and Cheuk Tam, and their robot's successful completion of the course in under 40 seconds is shown below (video courtesy of the team).

Localisation Challenge in 2009

This year's competition challenge required the robots to first localise from scratch at one of five pre-learned locations, and then to localise continuously to navigate around a route of waypoints. The whole process was timed so there was some need to trade off accuracy for speed in order to win.

Again there were several teams which achieved the challenge impressively within the target time of 30 seconds, but the winners by a narrow margin were Ivan Dryanovski, Tingting Li, Wenbin Li, Edmund Noon and Ke Wang whose winning run is shown in the video below (video courtesy of the team).


Monte Carlo Localisation in 2008

The challenge at the end of the course in 2008 was to implement a probabilistic localisation algorithm based on particle filtering, using the Lego Mindstorms NXT kits with motor odometry, sonar and compass sensors. The groups were given a "map" of a small enclosed area, indicating the measured locations of walls relative to a fixed coordinate system. The goal was to use Monte Carlo Localisation to keep a continuous estimate of the moving robot's position which was good enough to accurately follow a pre-defined path through a sequence of waypoints.
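The core predict-weight-resample loop of Monte Carlo Localisation fits in a short sketch. The version below is illustrative only: it assumes a toy map consisting of a single wall at x = wall_x with the sonar pointing forwards, and the odometry noise scales are invented rather than calibrated values.

```python
import math
import random

def mcl_step(particles, control, z_sonar, wall_x, sigma=0.05):
    """One Monte Carlo Localisation cycle for particles (x, y, theta)."""
    d, dtheta = control
    moved = []
    for x, y, theta in particles:
        nd = d + random.gauss(0.0, 0.02 * abs(d) + 1e-4)   # noisy drive distance
        nt = theta + dtheta + random.gauss(0.0, 0.01)      # noisy heading
        moved.append((x + nd * math.cos(nt), y + nd * math.sin(nt), nt))
    # Weight each particle by how well the sonar reading matches the
    # distance that particle would expect to measure to the wall.
    weights = []
    for x, y, theta in moved:
        if math.cos(theta) > 0.1:
            expected = (wall_x - x) / math.cos(theta)
        else:
            expected = 1e6              # facing away: reading explains nothing
        weights.append(math.exp(-((z_sonar - expected) ** 2) / (2 * sigma ** 2)))
    if sum(weights) <= 0.0:             # all particles inconsistent: keep them all
        weights = [1.0] * len(moved)
    # Resample with probability proportional to weight.
    return random.choices(moved, weights=weights, k=len(particles))
```

Repeated as the robot moves, the particle set concentrates around the true pose, and a pose estimate such as the mean of the particles can then be used for waypoint following.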

Several of the teams achieved good results, and one or two even made promising progress on the more difficult problem of global localisation (the "kidnapped robot problem"), where the robot had to initialise a localisation estimate from scratch when dropped at an arbitrary position in the course. This video shows the robot of the team of David Passingham, Vincent Dedoyard, John Payce and Mengru Li in action (video courtesy of the team).

Robot Obstacle Course in 2007

At the end of the course in 2007 we had a robot obstacle course challenge where the students in groups of three or four designed Lego robots to reach a light beacon while bouncing off and avoiding obstacles. The robots used light sensors to detect the direction of the bright light and bump sensors to detect collisions. Each of the robots was timed over three runs from different start points on a course constructed by Simon Overell.

The winning robot was from the team of Philip Stennett, Nicholas Ball, Maurice Adler and Wei Chieh Soon, which completed the course all three times with a total time of 36.9 seconds --- this is a (very dark) video of their robot in action. The robots from the team of Si Yu Li, Henry Arnold, Shobhit Srivastava and Jonathan Dorling, and the team of Ricky Shrestha, Hussein Elgridly and Maxim Zverev also successfully completed the course three times.

Robot Racing in 2006

Last year we finished the course with a time-trial competition between Lego line-following robots designed and programmed by the students in groups of three. The robots used downward-looking light sensors to observe the line, and ran programs written in the C-like NQC language. The robots were timed over three laps of this course, which the groups saw for the first time on the day of the races:

The easy winner of the time-trial was the simple but very slick robot from Jiefei Ma, Winnie Xu Zheng and Nan Wang which you can see in action in this movie. Another interesting robot featuring articulation from Steven Lovegrove, Alex Lamaison and Folabi Ogunkoya performed very well in practice but didn't quite make it round three laps of the race track. Also see this movie of several of the robots on track at once. (Thanks to Jiefei Ma and Steven Lovegrove for providing the videos).
Andrew Davison