Biologically inspired vision systems for guidance of autonomous aerial vehicles:


Our work with honeybees has shown that insects use cues derived from image motion (optic flow) to guide and control a number of aspects of flight, such as the speed and height of flight, obstacle avoidance and landing. Work in other laboratories has shown that flying insects monitor the profile of the horizon to sense and stabilise their attitude. We are using these principles to design and test vision systems for the guidance of model aircraft. Our aims are twofold: (a) to test our hypotheses about the strategies of visual guidance of insect flight in real, natural environments, and (b) to explore the possibility that biology may offer alternative solutions to the problem of flight guidance that are simpler and computationally less expensive. Several examples are given below. A number of additional and more recent examples can be viewed at http://www.youtube.com/user/qbibiorobotics/videos


Visual Homing


This study describes a novel view-based method that uses panoramic images to perform local homing in outdoor environments. A holistic algorithm makes use of difference images (computed relative to a reference snapshot) to build an image reference frame centred at the home position. The view currently experienced at any local position is then projected onto this reference frame to determine the image coordinates and the homing vector. The biologically inspired algorithm described in this study is a feasible alternative to local homing schemes that rely strongly on odometry or landmark extraction, which makes it well suited for implementation onboard unmanned aerial vehicles (UAVs). The adjacent figure shows example home paths, obtained in a simulated environment with this view-based method, demonstrating smooth homing trajectories suitable for a UAV controller.
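The core idea, stripped of the reference-frame projection, can be illustrated as descent in image difference: at each position, move in whichever direction most reduces the pixel-wise difference between the current panoramic view and the home snapshot. The sketch below is a toy illustration of that principle only, not the algorithm of the paper; the two-pixel `world` panorama and all function names are invented for this example.

```python
import numpy as np

def image_difference(view, snapshot):
    """Root-mean-square pixel difference between a view and the home snapshot."""
    return np.sqrt(np.mean((np.asarray(view, float) - np.asarray(snapshot, float)) ** 2))

def world(x, y):
    """Toy stand-in for a panoramic camera: the 'view' at (x, y) is a
    two-pixel image that varies smoothly with position."""
    return np.array([x, y], dtype=float)

def homing_step(view_at, pos, snapshot, step=1):
    """Take one grid step in whichever direction most reduces the image
    difference to the snapshot (descent in image distance)."""
    candidates = [(step, 0), (-step, 0), (0, step), (0, -step)]
    dx, dy = min(candidates,
                 key=lambda d: image_difference(view_at(pos[0] + d[0], pos[1] + d[1]),
                                               snapshot))
    return (pos[0] + dx, pos[1] + dy)
```

Starting at (5, -3) with the home snapshot taken at the origin, repeated calls to `homing_step` walk the agent back to (0, 0), because the image difference in this toy world decreases monotonically toward home.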

A. Denuelle, S. Thurrowgood, F. Kendoul and M.V. Srinivasan (2015)
A view-based method for local homing of unmanned rotorcraft.
International Conference on Automation, Robotics and Applications (ICARA 2015), Queenstown, 17-19 February 2015.


Visual odometry using optic flow and stereo


This test demonstrates a fully automatic flight of a quadrotor aircraft that uses purely visual information to take off, fly a prescribed trajectory at a specified altitude (in this case a 10 m square trajectory at a height of 3 m above the ground), and land at the starting location. The system performs odometry and path integration using optic flow and stereo information derived from an on-board vision system. There is no use of GPS, and no use of ranging devices based on sonar, radar, laser or barometric information. The landing location is not identified visually, but is inferred from the path-integrated trajectory of the aircraft. The small discrepancy between the take-off and landing positions represents the cumulative error in the odometry and path integration over the 40 m circuit.
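A minimal sketch of the path-integration step, under simplifying assumptions: the translational optic flow of the ground (in rad/frame) has already been isolated, stereo supplies the metric range to the ground, and the heading is known per frame. Metric displacement is then flow times range, rotated into the world frame and summed. All names here are illustrative, not the actual interface of the system.

```python
import numpy as np

def integrate_path(flows, ranges, headings):
    """Path integration: per-frame metric displacement is the translational
    optic flow (rad/frame) scaled by the stereo-derived range (m), rotated
    into the world frame by the current heading (rad). Returns the 2-D path."""
    pos = np.zeros(2)
    path = [pos.copy()]
    for flow, rng, heading in zip(flows, ranges, headings):
        step = flow * rng                              # metres moved this frame
        pos += step * np.array([np.cos(heading), np.sin(heading)])
        path.append(pos.copy())
    return np.array(path)
```

Integrating a 10 m square (0.5 rad/frame of flow at 2 m range, ten frames per side, heading turned 90 degrees between sides) returns the estimated position to the origin up to numerical error, mirroring the small take-off/landing discrepancy described above.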

More information on this ongoing study is available here:
R. Strydom, S. Thurrowgood and M.V. Srinivasan (2014)
Visual Odometry: Autonomous UAV Navigation using Optic Flow and Stereo.
In: Proceedings of the Australasian Conference on Robotics and Automation (ACRA 2014), Melbourne, 2-4 December 2014.


Airborne Vision System for the Detection of Moving Objects


Here is an example of a vision-based technique for the detection of moving objects by a moving airborne vehicle. A pilot manually flies the rotorcraft while the vision system detects the motion of a red ball. The technique, which is based on measurement of optic flow, first computes the egomotion of the aircraft from the pattern of optic flow in a panoramic image, then determines the component of this optic flow pattern that is generated by the aircraft’s translation, and finally detects the moving object by testing whether the direction of the flow generated by the object differs from that expected for translation in a stationary environment.
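The final test can be sketched as a direction-consistency check: for a purely translating camera in a static scene, image flow is directed radially away from the focus of expansion (FOE), so any point whose measured flow deviates from that radial direction by more than a threshold is a candidate moving object. The sketch below assumes the FOE and the translational flow components are already available; the function name and threshold are illustrative, not those of the paper.

```python
import numpy as np

def moving_object_mask(points, flows, foe, angle_thresh_deg=20.0):
    """Flag image points whose flow direction is inconsistent with pure
    camera translation: in a static scene, flow at each image point is
    directed radially away from the focus of expansion (FOE)."""
    points = np.asarray(points, float)
    flows = np.asarray(flows, float)
    radial = points - np.asarray(foe, float)       # expected flow direction
    cosang = np.sum(radial * flows, axis=1) / (
        np.linalg.norm(radial, axis=1) * np.linalg.norm(flows, axis=1) + 1e-12)
    deviation = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return deviation > angle_thresh_deg
```

In the usage below, two points have flow exactly radial from the FOE (consistent with egomotion) while the third moves perpendicular to the radial direction and is flagged.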

More information on this ongoing study is available here:
R. Strydom, S. Thurrowgood and M.V. Srinivasan (2013)
Airborne Vision System for the Detection of Moving Objects.
In: Proceedings of the Australasian Conference on Robotics and Automation (ACRA 2013), Sydney, 2-4 December 2013.


Control of height above ground and attitude using optic-flow cues


In this example the pattern of optic flow generated by the ground is used to monitor and stabilize the aircraft's attitude, and to regulate its height above the ground. The right-hand panel shows the view acquired by the visual system, and the left-hand panel shows an animation of the aircraft's attitude and height above ground, reconstructed from the flight data. In the "Manual" mode the aircraft is being flown by a pilot via radio control; in the "Auto" mode the plane is in autonomous flight, guided by the on-board vision system.
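The height-regulation principle can be written in one line: for level flight at ground speed v, the angular velocity of the ground image directly below the aircraft is omega = v / h, so height follows as h = v / omega, and holding the flow rate constant holds height constant. A minimal sketch of this relationship and a proportional flow-hold rule follows; the gain and function names are assumptions for illustration, not the controller used in the paper.

```python
def height_from_flow(ground_speed, flow_rate):
    """omega = v / h  =>  h = v / omega (flow_rate in rad/s, speed in m/s)."""
    return ground_speed / flow_rate

def climb_command(flow_rate, flow_setpoint, gain=0.5):
    """Proportional height regulator: flow faster than the setpoint means
    the ground is too close, so command a climb (positive output)."""
    return gain * (flow_rate - flow_setpoint)
```

For example, at 10 m/s ground speed a ventral flow rate of 2 rad/s implies a height of 5 m, and a flow rate above the setpoint produces a positive (climb) command.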

More information on this ongoing study is available here:
R.J.D. Moore, S. Thurrowgood, D. Bland, D. Soccol and M.V. Srinivasan (2010)
UAV altitude and attitude stabilization using a coaxial stereo vision system.
In: Proceedings, IEEE International Conference on Robotics and Automation. Anchorage, Alaska, 3-8 May 2010. IEEE Press.


Control of aircraft attitude using the visual horizon


In this example the attitude of the aircraft in roll and pitch is sensed by detecting and monitoring the shape of the horizon profile. The horizon is sensed by using an adaptive algorithm that classifies each pixel in the image as belonging to the ground or the sky, depending upon its spectral composition.
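A simplified version of the two stages can be sketched as follows: a fixed spectral rule stands in for the adaptive sky/ground classifier, and a straight-line fit to the sky/ground boundary yields roll from the line's slope and pitch from its offset from the image centre via the focal length. The fixed blue-dominance rule, the sign conventions, and the assumption that sky fills each column contiguously from the top are all simplifications for illustration, not the adaptive algorithm described above.

```python
import numpy as np

def sky_mask_from_rgb(image):
    """Crude spectral classifier: call a pixel 'sky' when its blue channel
    clearly dominates its red channel (a stand-in for the adaptive
    classifier described in the text)."""
    return image[..., 2].astype(float) > 1.2 * image[..., 0].astype(float)

def roll_pitch_from_horizon(sky_mask, focal_px):
    """Fit a line to the sky/ground boundary and read off attitude.
    Assumes sky occupies each column contiguously from the top, so the
    per-column count of sky pixels equals the boundary row. Roll is the
    slope angle of the fitted line; pitch follows from the line's offset
    from the image centre and the focal length in pixels."""
    h, w = sky_mask.shape
    boundary = sky_mask.sum(axis=0)                 # boundary row per column
    slope, intercept = np.polyfit(np.arange(w), boundary, 1)
    roll = np.degrees(np.arctan(slope))
    centre_offset = slope * (w - 1) / 2 + intercept - h / 2
    pitch = np.degrees(np.arctan(centre_offset / focal_px))
    return roll, pitch
```

A level horizon across the middle of the image yields zero roll and zero pitch; a boundary that slopes across the image yields a nonzero roll angle.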

More information on this ongoing study is available here:
S. Thurrowgood, R.J.D. Moore, D. Bland, D. Soccol and M.V. Srinivasan (2010)
UAV attitude control using the visual horizon.
In: Proceedings of the Australasian Conference on Robotics and Automation (ACRA 2010), Brisbane, 1-3 December 2010.