Airbus has completed two years of flight testing for its Autonomous Taxi, Take-Off, and Landing (ATTOL) project. The work culminated in a fully automatic, vision-based flight of an A350-1000 widebody airliner, achieved through the use of on-board image recognition technology, which the manufacturer described as a world-first in a June 29 press statement.
More than 500 test flights were made as part of the project, and around 450 of these were used to gather the raw video data that Airbus engineers needed to support and fine-tune the algorithms for the autonomous technology. Six flights were dedicated to evaluating the autonomous flight capabilities, each including five takeoffs and landings.
Under the direction of the Airbus UpNext team, ATTOL tapped technical expertise from several Airbus divisions, including Acubed’s Project Wayfinder team which developed the Vahana eVTOL technology demonstrator, as well as Airbus Defence and Space, and Airbus China. French national aerospace laboratory Onera was also involved in the work.
According to Airbus, ATTOL was conceived to explore how autonomous technologies, including machine learning algorithms and automated tools for data labeling, processing, and model generation, could help pilots focus more on strategic decision making and mission management during flights, rather than on aircraft operations. The goal is not only to boost the operational safety of existing airliners but also potentially to enable autonomous flights by new-generation eVTOL urban air mobility aircraft.
“Many aircraft are already able to land automatically,” explained ATTOL project lead Sebastien Giuliano. “But they’re reliant on external infrastructure like instrument landing systems or GPS signals. ATTOL aims to make this possible solely using on-board technology to maximize efficiency and to reduce infrastructure cost.”
Acubed’s Wayfinder team developed the software that, based on computer vision and machine learning, allows an aircraft to detect its surrounding environment and calculate how best to navigate within it. This is achieved using a combination of sensors, including cameras, radar, and laser-based lidar, with the data processed by powerful onboard computers.
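Wayfinder's actual software is proprietary, but the basic idea of turning camera pixels into a guidance cue can be sketched in a few lines. The example below is purely illustrative and is not Airbus code: it estimates how far a bright runway centerline sits from the center of a synthetic grayscale camera frame, the kind of lateral-offset signal a vision-based landing system could feed to its control loop. The function name, threshold value, and toy frame are all assumptions made for the sketch.

```python
# Hypothetical sketch of a vision-based guidance cue (not Airbus/Wayfinder code):
# find bright runway-marking pixels in a grayscale frame and report their mean
# horizontal offset from the image center, in pixels.

def centerline_offset(frame, threshold=200):
    """Mean horizontal offset of bright pixels from the image center.

    frame: list of rows, each a list of grayscale values (0-255).
    Positive result means the marking lies right of center.
    """
    width = len(frame[0])
    center = (width - 1) / 2.0
    offsets = []
    for row in frame:
        # Columns whose brightness suggests painted runway markings.
        bright = [x for x, v in enumerate(row) if v >= threshold]
        if bright:
            offsets.append(sum(bright) / len(bright) - center)
    return sum(offsets) / len(offsets) if offsets else 0.0

# Synthetic 5x9 frame: a bright line painted two pixels right of center.
frame = [[255 if x == 6 else 30 for x in range(9)] for _ in range(5)]
print(centerline_offset(frame))  # prints 2.0 (line is right of center)
```

A real system would of course use learned models, multiple sensor modalities, and extensive filtering rather than a single brightness threshold, but the shape of the problem is the same: derive a navigation signal entirely from onboard perception, with no dependence on ground infrastructure.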
“The key challenge for self-piloting capabilities is how the system reacts to unforeseen events,” said Wayfinder project executive Arne Stoschek. “That’s the big jump from automated to autonomous.”
Read more at: www.ainonline.com