NATS, the air navigation service provider (ANSP) for the U.K., is evaluating the use of an artificial intelligence and machine learning framework that has demonstrated the potential to significantly reduce delays caused by low visibility conditions for aircraft arriving into London Heathrow Airport.
The ANSP’s goal is to continually evaluate how Aimee, an artificial intelligence and machine learning platform supplied by Searidge, could use ultra-high-fidelity cameras positioned along the runway to monitor approaches and landings and alert controllers when the runway is safely cleared for the next arrival. The tower’s view of the airfield is commonly lost to cloud or fog, a driving factor behind the project: it would allow NATS to remove the restrictions placed on the airport’s capacity in those low visibility conditions.
Andy Taylor, the chief digital solutions officer for NATS, told Avionics International that his team completed analysis of more than 40,000 arrivals into Heathrow’s northern runway.
“The results are good, and so pleasing in fact that we want to push it further. The next step will be reviewing how it operates in low visibility procedures, where due to fog you’re actually protecting the localizer signal of the instrument landing system (ILS), so that we can improve the fidelity and the automatic monitoring of Aimee in all conditions, including low visibility procedures,” Taylor said.
Controllers are able to use Aimee because of the trial that NATS started in January 2019 by installing 18 Ultra-HD (UHD) 4K cameras on the tower and another 20 cameras along the airfield. The images from those cameras are fed live into Aimee, which interprets the images, tracks the aircraft, and then informs the controller when the arrival has successfully cleared the runway. The controller then makes the decision to clear the next arrival.
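The final step in that pipeline — checking the network's per-pixel output against the runway area before alerting the controller — can be sketched roughly as follows. This is an illustrative Python sketch only, not Searidge code; the class labels and the `runway_clear` helper are assumptions made for demonstration.

```python
# Illustrative sketch of a runway-occupancy check on a segmentation mask.
# Labels and function names are hypothetical, not Searidge's actual API.
AIRCRAFT = 1    # pixel classified as aircraft by the segmentation network
BACKGROUND = 0  # everything else

def runway_clear(mask, runway_region):
    """Return True if no aircraft pixels fall inside the runway region.

    mask          -- 2D list of per-pixel class labels from the network
    runway_region -- iterable of (row, col) pixel coordinates covering the runway
    """
    return all(mask[r][c] != AIRCRAFT for r, c in runway_region)

# A 4x6 toy frame: one aircraft still on the runway (rows 1-2, cols 2-3).
frame = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
runway = [(r, c) for r in (1, 2) for c in range(6)]

print(runway_clear(frame, runway))   # False: aircraft still on runway
frame[1][2] = frame[1][3] = frame[2][2] = frame[2][3] = BACKGROUND
print(runway_clear(frame, runway))   # True: runway now clear
```

In a real system the controller is only alerted, never bypassed — as the article notes, the clearance decision stays with the controller.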
Marco Rueckert, head of innovation for Searidge Technologies, told Avionics that Aimee uses a “convolutional neural network for image segmentation,” which was trained using archived video footage from the cameras installed along the airfield. Searidge engineers used that recorded footage to allow Aimee to learn the different types of aircraft that arrive and depart on a 24-hour basis.
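At the heart of such a network is the convolution operation: a small learned kernel slid across the image to produce a feature map, with stacks of these kernels feeding a per-pixel classifier. The toy example below (pure Python, no relation to Aimee's actual architecture) shows a single hand-chosen edge-detecting kernel applied to a tiny image:

```python
# Toy 2D convolution (valid padding, stride 1) in pure Python,
# illustrating the basic operation inside a convolutional neural network.
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

# A vertical-edge kernel responds strongly where the toy image
# transitions from dark (0) to bright (1) columns.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
edge_kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]
print(conv2d(image, edge_kernel))  # → [[3, 3]]
```

In a trained segmentation network the kernel weights are learned from labeled footage rather than hand-picked, which is what the archived airfield video described above was used for.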
“Aimee is a collection of software components that run on various hardware components, such as servers and workstations. At Heathrow, Aimee runs on [graphics processing unit] GPU workstations to accelerate the image segmentation neural network. The outputs of the Aimee sub-systems can then either be displayed on Searidge’s controller working position, as is the case at London Heathrow, or fed into external systems,” Rueckert said.
The ongoing evaluation is part of a $3 million investment NATS has made in a ‘digital tower laboratory’ located inside the Heathrow air traffic control tower. There, it is working with the airport to understand how technology could support the air traffic operation now and in the future.
NATS’ Taylor said a goal for the evaluation is to understand whether Aimee is capable of automatically monitoring aircraft on final approach when fog reduces the runway visual range to 1,800 feet or less.
“The business case for us is to remove the restrictions applied when the controllers can’t see the runway from the tower because the tower is obscured by clouds,” Taylor said. “I am aiming to test the system in different fog conditions and see how low we can go. 75 meters runway visual range [RVR] is the nirvana; that’s the lowest runway visual range that aircraft will land in. While aircraft can land in zero visibility, they generally require 75 meters so that they can taxi clear of the runway on a visual basis following the illuminated centerline. Ideally, if we can get the system to show a similar level of performance at an RVR as low as that, that gives significant benefit, because I can then look at refining our low visibility procedures (LVPs) and tower-in-cloud procedures.”
NATS controllers have been working alongside software engineers from Searidge to provide feedback on the human machine interface they use to monitor the AI-powered alerts they receive on displays showing the arrival paths of aircraft into Heathrow. On the airline side, Taylor said the operation runs seamlessly, with pilots receiving the exact same communications to be cleared for final approaches whether Aimee is in use or not.
Although controllers at Heathrow are only using Aimee for video image pattern detection, the platform does have other AI airport use cases.
“Searidge has used the AI technologies contained within Aimee on different data sources, such as ATC RT, en-route and surface surveillance data with different partners around the world. For example, Searidge has MOUs with Singapore’s CAAS and UAE’s GCAA to use Aimee to recognize patterns in surface and en-route surveillance data to increase the safety and efficiency of the operation of our partners,” Rueckert said.
The next step for NATS will be submitting its analysis of the safety benefits demonstrated by Aimee to the U.K.’s Civil Aviation Authority (CAA) later this year, with the eventual goal of using the system as part of everyday operations at Heathrow.