From 1991 to 1994 I studied for a BA degree in physics in Oxford (First Class Honours).
Since 1994, I have worked almost continuously on vision-based localisation and mapping. Moving to the Department of Engineering Science in Oxford, from October 1994 to February 1998 I completed a D.Phil. (PhD) in the Active Vision Lab under the supervision of Professor David Murray. My thesis, entitled "Mobile Robot Navigation using Active Vision" (available from my Publications page), presented one of the very first working robot SLAM systems.
Straight after submitting my PhD thesis, in February 1998 I went to Japan, where I spent two years on a European Union Science and Technology Fellowship. This consisted of four months in Tokyo learning Japanese, followed by research at the Intelligent Systems Institute of AIST (previously known as ETL) in Tsukuba, working with Nobuyuki Kita; I still have strong links with the institute. I had a fantastic time in Japan and would recommend a visit to anyone.
My post-doctoral work at AIST focused on vision-based navigation for industrial inspection robots, and included research on localisation for multiple collaborating robots (IROS 2000). One of the main advances was extending my work on robot SLAM into full 3D, modelling the motion of a wheeled robot driving on an undulating surface. In this movie the robot performs simultaneous localisation and mapping in full 3D while navigating on an undulating surface in an industrial inspection scenario (CVPR 2001).
Returning to Oxford, from May 2000 to September 2001 I spent some time working on Markerless Human Motion Capture, in cooperation with Ian Reid in Oxford. In parallel, I continued to work on SLAM, moving towards increasingly general vision-based systems (RAS 2001).
From 2002 to 2007 I held an EPSRC Advanced Research Fellowship. I moved to Imperial to take up a lectureship, founded the Robot Vision Group in May 2005, and have since worked full-time on visual SLAM and its many applications.
I made a major breakthrough in 2003 with MonoSLAM, a real-time localisation and mapping system based on a single computer-connected camera, which enabled robust, drift-free localisation in full 3D. Because it needed no additional sensors, infrastructure or special computing resources, it was a very flexible system with a wide range of possible applications. The MonoSLAM system is widely acknowledged to be one of the key prototypes for recent commercial projects and products in low-cost mobile robotics (e.g. Dyson) and mobile phone/tablet/wearable 3D localisation and sensing (e.g. Google Project Tango).
Among many other published advances since that time, I have collaborated with long-term colleagues and PhD students at Imperial College, Oxford, Zaragoza and elsewhere on important algorithms, studies and systems such as Inverse Depth Features (2006), Active Matching (2008), "Why Filter?" (2010), DTAM and KinectFusion (2011), SLAM++ (2013) and Event Camera SLAM (2014). Please see the publications and videos on the Robot Vision Research Group site for much more information.
Alongside this academic research, I have a longstanding relationship with Dyson Ltd. in the UK, having worked for them as a consultant on robot vision technology since 2005. This collaboration led to the creation in 2014 of the Dyson Robotics Laboratory at Imperial College, of which I am the founder and Director.

Andrew Davison