Richard Moore; photographed at NERC 2013.

Richard J. D. Moore

Research Scientist
Optical Measurement Systems & Data Analysis
SINTEF ICT
Address:
Forskningsveien 1
Oslo, Norway
Email:
rjdmoore@uqconnect.edu.au
Links:
Personal
Resume
Facebook
Google+
LinkedIn
Commons
YouTube
JMT hike
Research
Scholar
Thesis
Harvard SSR Lab and videos
QBI Biorobotics Lab and videos
Optical Measurement Systems & Data Analysis

FicTrac

About me

Originally from Melbourne, Australia, I completed my bachelor's degrees (Sci/Eng) at Swinburne University of Technology before moving to Brisbane to undertake my PhD on visual guidance for autonomous aircraft at QBI. After receiving my doctorate from the University of Queensland in 2012, I moved to Cambridge, MA, where I completed a Postdoctoral Research Fellowship on the RoboBees project at the Harvard School of Engineering and Applied Sciences. I then moved to industry, designing and implementing algorithms for smart vision systems as a Function Owner at Valeo's research facility in Co. Galway, Ireland. I now work as a computer vision and autonomous systems specialist with various industrial partners at SINTEF ICT in Oslo, Norway.

Broadly, my research interests lie in visual guidance, aerial robotics, and biologically inspired systems. I love designing novel guidance algorithms for autonomous robots, and I am particularly interested in applying principles of biological vision-based guidance to aerial robotics.

Outside of work I enjoy hiking, skiing, and running, and I have won several national beach sprinting titles.


Vision-based navigation / collision avoidance for MAVs

  • (In prep.) RJD Moore, K Dantu, GL Barrows, R Nagpal. Omnidirectional vision-based guidance for an ultra-lightweight MAV.

  • (2014) RJD Moore, K Dantu, GL Barrows, R Nagpal. Autonomous MAV guidance with a lightweight omnidirectional vision sensor [ICRA] [preprint].

Visual motion tracking

  • (2014) GJ Taylor, RJD Moore, AC Paulk, T Pearson, B van Swinderen, MV Srinivasan. Using FicTrac to accurately measure the motion of animals walking in virtual reality [ICN].

  • (2014) RJD Moore, GJ Taylor, AC Paulk, T Pearson, B van Swinderen, MV Srinivasan. FicTrac: a visual method for tracking spherical motion and generating fictive animal paths [J. Neuroscience Methods] [preprint].

Visual guidance for autonomous aircraft

  • (2011) RJD Moore, S Thurrowgood, D Soccol, D Bland, MV Srinivasan. A method for the visual estimation and control of 3-DOF attitude for UAVs [ACRA].

  • (2011) RJD Moore, S Thurrowgood, D Bland, D Soccol, MV Srinivasan. A fast and adaptive method for estimating UAV attitude from the visual horizon [IROS] [preprint].

  • (2010) S Thurrowgood, RJD Moore, D Bland, D Soccol, MV Srinivasan. UAV attitude control using the visual horizon [ACRA].

  • (2009) S Thurrowgood, D Soccol, RJD Moore, D Bland, MV Srinivasan. A vision based system for attitude estimation of UAVs [IROS] [preprint].
    
  • (2010) S Thurrowgood, RJD Moore, D Soccol, M Knight, MV Srinivasan. A biologically inspired, vision-based guidance system for automatic landing of a fixed-wing aircraft [J. Field Robotics] [preprint].
    
  • (In prep.) S Thurrowgood, RJD Moore, MV Srinivasan. Optic flow for aircraft control.

Vision-based tracking and interception

Visual wind estimation

  • (2012) RJD Moore, S Thurrowgood, MV Srinivasan. Vision-only estimation of wind field strength and direction from an aerial platform [IROS] [preprint].

Stereo vision-based terrain following

  • (2011) RJD Moore, S Thurrowgood, D Soccol, D Bland, MV Srinivasan. A bio-inspired stereo vision system for guidance of autonomous aircraft [InTech].

  • (2010) RJD Moore, S Thurrowgood, D Bland, D Soccol, MV Srinivasan. UAV altitude and attitude stabilisation using a coaxial stereo vision system [ICRA] [preprint].

  • (2009) RJD Moore, S Thurrowgood, D Bland, D Soccol, MV Srinivasan. A stereo vision system for UAV guidance [IROS] [preprint].


© 2016 Richard Moore.