Winter Guerra, Ezra Tal, Varun Murali, Gilhyun Ryou, Sertac Karaman
Laboratory for Information and Decision Systems (LIDS)
Massachusetts Institute of Technology (MIT)
Abstract: FlightGoggles is a photorealistic sensor simulator for perception-driven robotic vehicles. The key contributions of FlightGoggles are twofold. First, FlightGoggles provides photorealistic exteroceptive sensor simulation using graphics assets generated with photogrammetry. Second, it provides the ability to combine (i) synthetic exteroceptive measurements generated in silico in real time and (ii) vehicle dynamics and proprioceptive measurements generated in vivo by vehicle(s) in a motion-capture facility. FlightGoggles is capable of simulating a virtual-reality environment around autonomous vehicle(s). While a vehicle is in flight in the FlightGoggles virtual-reality environment, exteroceptive sensors are rendered synthetically in real time, while all complex extrinsic dynamics are generated organically through the natural interactions of the vehicle. The FlightGoggles framework allows researchers to accelerate development by circumventing the need to estimate complex and hard-to-model interactions such as aerodynamics, motor mechanics, battery electrochemistry, and the behavior of other agents. The ability to perform vehicle-in-the-loop experiments with photorealistic exteroceptive sensor simulation facilitates novel research directions involving, e.g., fast and agile autonomous flight in obstacle-rich environments, safe human interaction, and flexible sensor selection. FlightGoggles has been utilized as the main test for selecting nine teams that will advance in the AlphaPilot autonomous drone racing challenge. We survey approaches and results from the top AlphaPilot teams, which may be of independent interest.
Fig 1. FlightGoggles renderings of the Abandoned Factory environment, designed for autonomous drone racing. Note the size of the environment and the high level of detail.
Fig 4. Object photographs that were used for photogrammetry, and corresponding rendered assets in FlightGoggles.
Fig 5. Object photographs that were used for photogrammetry, and corresponding rendered assets in the FlightGoggles Stata Center Environment.
Fig 6. Additional object photographs that were used for photogrammetry, and corresponding rendered assets in the FlightGoggles Stata Center Environment.
Table III. Sensor usage, algorithm choices, and the five highest scores in the AlphaPilot Simulation Challenge.
Winter Guerra, Ezra Tal, Varun Murali, Gilhyun Ryou, and Sertac Karaman, "FlightGoggles: Photorealistic Sensor Simulation for Perception-driven Robotics using Photogrammetry and Virtual Reality," arXiv preprint, 2019. URL: https://arxiv.org/abs/1905.11377