IMBALS project, powered by ScioTeq, makes commercial aviation robust against GPS disruption
The European-funded research project IMBALS, led by the Belgian company ScioTeq, promises to deliver a novel technology for determining aircraft position during approach and landing, independent of vulnerable satellite navigation signals. The current situation following the Russian invasion of Ukraine demonstrates how the susceptibility of satellite navigation signals can disrupt commercial aviation operations. The IMBALS project will demonstrate an autonomous landing guidance technique based on computer vision rather than GPS satellites, making the critical approach and landing phases independent of vulnerable external signals. As such, IMBALS will help ensure continued safe operations near conflict areas.
Need for autonomy
The European Union Aviation Safety Agency (EASA), responsible for safety and regulations in European aviation, issued a warning on March 17th about observed disturbances of satellite navigation signals across large areas stretching from eastern Finland over Eastern Europe down to northern Iraq. Signal disturbances have reportedly led to erroneous determination of aircraft position, which may mislead the flight crew and other aircraft systems that use the position information. Satellite navigation (e.g. GPS) is frequently used as a navigation aid during approach and landing, still the most critical phase of flight, where little room for error exists. Signal disturbance during this phase may have catastrophic consequences.
Fortunately, we can count on many safety provisions in systems and procedures, together with well-trained flight crews, to prevent disaster. At the same time, we have to acknowledge that safety margins are compromised by such vulnerability and that, when relying on human intervention, there is always a possibility of human error. Beyond the safety impact, these signal disturbances have an economic and ecological impact as well, since affected flights have had to abort their approach and make a go-around. In some cases, flights are diverted to other airports, with all the inconvenience this entails for passengers and airlines. Hence, the industry sees high value in making commercial aviation less dependent on satellite navigation signals or, by extension, on any external aids. Such independence from external means translates into increased aircraft autonomy. Computer vision technology is one of the most promising means to reach that goal.
Motivation for disturbing signals
The satellite navigation system most commonly used in civil aviation (the US-operated GPS) was originally conceived for military purposes. Satellite navigation has high value in a military context: it is very precise, it is available everywhere at any time (unless it is jammed) and it is very convenient and easy to use with the adequate equipment. Many modern weapon systems owe their high precision and autonomy to satellite navigation. It is therefore a high priority in a conflict to deny the enemy the use of satellite navigation, and this will be attempted from both sides of the front line. On one side, the US may want to prevent Russian forces from taking advantage of the US satellite navigation signals (GPS). To that end, it could deny the accurate GPS signal to all users except its own and allied military users, who hold a specific decryption key. A related threat is spoofing, in which counterfeit signals are broadcast so that receivers compute a false position.
On the other side, Russia may want to prevent Ukrainian forces from taking advantage of any satellite navigation system. To that end, Russia may disturb the signal for all users by overpowering it with noise. This is called jamming. In both cases, any civil user within the affected areas no longer has a reliable signal, which can affect commercial aircraft navigation systems. The disturbance does not stop at geopolitical borders, but extends to large areas that are not directly involved in the conflict. Jamming or spoofing may corrupt the signals such that the receiver still produces a position, without notifying its users that this position is outside the accuracy requirements for a given application. This is exactly what cannot be tolerated in the approach and landing phase.
The IMage BAsed Landing Solution (the wording behind the acronym IMBALS) consists of an on-board camera system and an Image Processing Platform (IPP). The IPP receives the images from the camera and extracts information from them. During approach and landing, the IPP detects the runway in the image and interprets the perspective under which that runway appears. This interpretation performs approximately the same task as the pilot's vision when he or she continuously assesses the aircraft's position relative to the runway.
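To illustrate the geometry at play (this is a simplified pinhole-camera sketch, not the IMBALS algorithm, whose details are not public), the apparent width of a runway of known true width in the image directly yields a range estimate, and the pixel offset of the runway centerline yields a lateral deviation. All parameters below are hypothetical:

```python
def estimate_range_m(focal_px: float, runway_width_m: float,
                     apparent_width_px: float) -> float:
    """Pinhole-camera range estimate: distance = f * W / w.

    focal_px          -- camera focal length expressed in pixels (assumed)
    runway_width_m    -- true runway width in metres (e.g. 45 m for a large runway)
    apparent_width_px -- width of the runway threshold in the image, in pixels
    """
    return focal_px * runway_width_m / apparent_width_px


def lateral_offset_m(focal_px: float, centre_offset_px: float,
                     range_m: float) -> float:
    """Lateral deviation from the extended centerline, derived from the
    horizontal pixel offset of the runway centerline (small-angle model)."""
    return range_m * centre_offset_px / focal_px


# Hypothetical values: 1500 px focal length, 45 m wide runway seen 15 px wide.
rng = estimate_range_m(1500.0, 45.0, 15.0)   # -> 4500.0 m from the threshold
lat = lateral_offset_m(1500.0, 10.0, rng)    # -> 30.0 m off the centerline
```

A real system must of course also estimate the angular part of the pose and handle lens distortion, noise and partial occlusion; the point here is only that runway perspective encodes position.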
The IPP is able to determine the full six-degree-of-freedom pose of the aircraft relative to the runway: distance from the runway, height above the runway, lateral deviation from the extended runway centerline, and the roll, pitch and aspect angles versus the extended runway centerline. There are several ways to exploit the IPP output data. In any case, the independence of this data from any other system or signal is tremendously important, particularly in view of the signal vulnerability described above. The output of the IPP could, for example, be used to validate the position information obtained from satellite navigation, enabling the system to reject erroneous position data provided by the satellite navigation receiver.
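One way such a cross-check could work is sketched below. This is a minimal illustration under stated assumptions, not the IMBALS design: the `Position` frame, the `gnss_fix_valid` interface and the 20 m tolerance are all hypothetical. The idea is simply to reject the GNSS fix when it disagrees with the independent vision-derived position by more than a tolerance:

```python
import math
from typing import NamedTuple


class Position(NamedTuple):
    """Hypothetical runway-frame position in metres
    (along-track x, cross-track y, height z)."""
    x: float
    y: float
    z: float


def gnss_fix_valid(gnss: Position, vision: Position,
                   tolerance_m: float = 20.0) -> bool:
    """Accept the GNSS fix only if it agrees with the independent
    vision-derived position to within `tolerance_m` (assumed value)."""
    deviation = math.dist(gnss[:2], vision[:2])  # horizontal deviation only
    return deviation <= tolerance_m


gnss = Position(x=-4000.0, y=5.0, z=210.0)
vision = Position(x=-4010.0, y=4.0, z=208.0)
print(gnss_fix_valid(gnss, vision))      # deviation ~10 m -> True, fix accepted

spoofed = Position(x=-4300.0, y=60.0, z=210.0)
print(gnss_fix_valid(spoofed, vision))   # deviation ~295 m -> False, fix rejected
```

A certified monitor would be far more elaborate (statistical consistency over time, sensor uncertainty models), but the principle of an independent cross-check is the same.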
When the satellite navigation data is rejected, the data from the image based landing solution could be used to continue the guidance down to touchdown and roll-out, whereas today the approach and landing would more likely need to be aborted. Obviously, atmospheric visual obstructions such as fog, rain and snow are key limitations of the image based landing solution today, as they are for human vision. In the longer term, however, the use of infrared cameras and other sensors promises to work around this limitation.
IMBALS project status
The IMBALS project is in its fourth and final year of execution. The project partners, Sol.One and the University of Leuven, are working closely together to validate the technologies and verify the IPP against its requirements. They work together with Airbus to integrate and test the IPP prototypes in the Disruptive Cockpit avionics system within the framework of the Clean Sky 2 Large Passenger Aircraft demonstrator. They are testing the IPP with both simulated and real video, given the complementary advantages and limitations of the two approaches.
By the end of 2022, the IMBALS project will deliver a strong technology base together with valuable insights into the challenges and potential of computer vision for aircraft approach and landing guidance. The partners have started exploring the next steps to advance the state of the art of this technology towards a product and its certification beyond the IMBALS project. However, the extent to which this system is deployed will depend not only on advancements in the technology, but also on advancements in the certification regulations.
The current generally accepted industry standards used for the certification of complex avionics systems do not provide a good basis for addressing with quantitative and qualitative methods the perception risks that are associated with computer vision systems. Therefore, the IMBALS partners suggest a broader collaborative effort between industry and certification authorities to enable the certification of computer vision systems for safety critical applications. Unfortunately, it will still take several years before image based landing can be certified and deployed. Until then, projects like IMBALS will pave the path to get there and airline operators will mitigate the risks as suggested in the EASA bulletin.
The IMBALS project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No 785459.