TRIMIS

Low-latency Perception and Action for Agile Vision-based Flight

Funding
European
European Union
Duration
-
Status
Ongoing
Geo-spatial type
Other
Total project cost
€2 000 000
EU Contribution
€2 000 000
Project Acronym
AGILEFLIGHT
STRIA Roadmaps
Connected and automated transport (CAT)
Transport mode
Airborne
Transport policies
Safety/Security,
Deployment planning/Financing/Market roll-out

Overview

Call for proposal
ERC-2019-COG
Link to CORDIS
Background & Policy context

Drones are disrupting industries such as agriculture, package delivery, inspection, and search and rescue. However, they are still either controlled by a human pilot or rely heavily on GPS for navigating autonomously. The alternative to GPS is onboard sensors, such as cameras: from the raw data, a local 3D map of the environment is built, which is then used to plan a safe trajectory to the goal. While the underlying algorithms are well understood, we are still far from having autonomous drones that can navigate through complex environments as well as human pilots. State-of-the-art perception and control algorithms are mature but not robust: unreliable state estimation, low-latency perception, real-time planning in dynamic environments, and the tight coupling of perception and action under severe resource constraints are all still unsolved research problems. Another issue is that, because battery energy density is increasing only slowly, drones need to navigate faster in order to accomplish more within their limited flight time. To obtain more agile robots, we need faster sensors and low-latency processing.
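The map-then-plan pipeline described above can be illustrated with a minimal, hypothetical sketch: a 2D occupancy grid stands in for the local 3D map, and breadth-first search finds a safe path to the goal. This is an illustration of the general idea only, not the project's actual algorithms.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.

    grid: list of lists, 0 = free cell, 1 = obstacle.
    start, goal: (row, col) tuples.
    Returns a list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # parent pointers for path reconstruction
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk parent pointers back to start, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable

# Toy 4x4 map with walls; the planner must route around them.
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
path = plan_path(grid, (0, 0), (3, 0))
```

In a real drone stack the grid would be a 3D voxel map built from sensor data and the search would use a cost-aware planner, but the map-then-plan structure is the same.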

Objectives

Drones are still nowhere near capable of navigating through complex environments as well as human pilots. Obtaining more agile robots calls for faster sensors and low-latency processing. The EU-funded AGILEFLIGHT project intends to develop innovative scientific methods to demonstrate autonomous, vision-based, agile quadrotor navigation in unknown, GPS-denied and cluttered environments with possibly moving obstacles. The aim is to achieve navigation that matches professional drone pilots in manoeuvrability and agility. To this end, the project will develop algorithms that combine the advantages of standard cameras and event cameras. AGILEFLIGHT will also develop novel methods that enable agile manoeuvres through cluttered, unknown and dynamic environments. This work could benefit disaster response, aerial delivery and inspection.

The goal of this project is to develop novel scientific methods that would allow me to demonstrate autonomous, vision-based, agile quadrotor navigation in unknown, GPS-denied, and cluttered environments with possibly moving obstacles, navigation that is as effective in terms of maneuverability and agility as that of professional drone pilots. The outcome would benefit not only disaster response scenarios but also others, such as aerial delivery or inspection. To achieve this ambitious goal, I will first develop robust, low-latency, multimodal perception algorithms that combine the advantages of standard cameras with event cameras. Then, I will develop novel methods that unify perception and state estimation together with planning and control to enable agile maneuvers through cluttered, unknown, and dynamic environments.
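As a rough illustration of how event cameras complement standard cameras, the sketch below accumulates a window of asynchronous events (timestamp, pixel coordinates, polarity) into a signed image, a common intermediate representation that can be paired with standard frames. The event format and function here are hypothetical stand-ins for illustration, not the project's actual perception methods.

```python
import numpy as np

def accumulate_events(events, shape, t_start, t_end):
    """Accumulate a time window of events into a signed polarity image.

    events:  iterable of (t, x, y, polarity), polarity in {-1, +1};
             this tuple format is an assumed toy encoding.
    shape:   (height, width) of the sensor.
    Returns a signed image: positive where brightness increased,
    negative where it decreased, zero where nothing happened.
    """
    frame = np.zeros(shape, dtype=np.int32)
    for t, x, y, p in events:
        if t_start <= t < t_end:
            frame[y, x] += p
    return frame

# Three events inside the [0, 10) ms window, one outside it.
events = [(1.0, 2, 3, +1), (2.5, 2, 3, +1), (4.0, 0, 0, -1), (12.0, 1, 1, +1)]
frame = accumulate_events(events, (4, 4), 0.0, 10.0)
```

Because events are reported with microsecond-level timestamps rather than at a fixed frame rate, such accumulated images can be formed over arbitrarily short windows, which is what makes event cameras attractive for low-latency perception.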

Funding

Specific funding programme
H2020-EU.1.1. - EXCELLENT SCIENCE - European Research Council (ERC)
Other Programme
ERC-2019-COG ERC Consolidator Grant

Partners

Lead Organisation
EU Contribution
€2 000 000

Technologies

Technology Theme
Aircraft design and manufacturing
Technology
AI-based autonomous flight control system
Development phase
Demonstration/prototyping/Pilot Production
