The Final Word

A Personal View expressed by Ken Dunlap

“Doing more with less” combined with the never-ending quest to increase shareholder value (or emerging from bankruptcy reorganisation) will force AVSEC leaders in the post-COVID-19 world to tackle some difficult decisions. The most important may be deciding between those security functions that must continue to be performed by people and those that can be outsourced to machine learning. Most AVSEC organisations do not have the fundamentals in place to make these decisions, and bad outcomes will result until they do.

To be clear, machine learning, or what we more casually refer to as ‘artificial intelligence’ (AI), excels at evaluating large amounts of data with a precision that allows for a significant reduction in personnel such as analysts, control centre employees, inspectors, and government oversight personnel. I’ll leave it to the global vendor community to fill in the other potential uses. Regardless of the capability, before you make an acquisition you need to understand machine learning, your team, your adversary, and your vendor with a clarity not often brought to AVSEC purchases. With this in mind, here are some guidelines:

AI/Machine Learning Ethics

Long before a procurement decision, empanel an AI ethics board for your organisation. Bias can be intentionally or accidentally built into an AI system from the first line of code and the first set of data used to train the AI algorithms. This may result in overt bias: if your dataset over-represents people of certain ethnicities, your algorithms will develop a corresponding bias towards (or against) people of those ethnicities. An equally insidious yet more subtle bias is associated with assigning weights to the various factors that will be used to identify items of interest in your data. The AVSEC professionals on your ethics team – not the system vendor – need to be the final authority in labelling a factor as being of greater or lesser importance. Your ethics board will help ensure that the vendor has sufficient expertise in the AVSEC field and uses training data that reflects the aviation world. Don’t skip this step.
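To make that concrete, here is a minimal sketch of the kind of audit an ethics board might commission: it flags any group that is over- or under-represented in a training set, and it keeps the human-assigned factor weights in one reviewable place. The attribute names, weights, and tolerance are illustrative assumptions, not features of any real AVSEC system.

```python
# A sketch of an ethics-board audit: check whether a training set
# over-represents any group, and surface human-assigned factor weights
# for review. All names and thresholds are illustrative assumptions.
from collections import Counter

def audit_representation(records, attribute="ethnicity", tolerance=0.10):
    """Flag groups whose share of the training data deviates from parity."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    parity = 1.0 / len(counts)  # naive equal-share baseline
    return {group: n / total for group, n in counts.items()
            if abs(n / total - parity) > tolerance}

# Factor weights set by humans deserve the same scrutiny as the data:
# the ethics board, not the vendor, signs off on each of these.
FACTOR_WEIGHTS = {
    "travel_pattern_anomaly": 0.5,   # assumption: board-approved
    "document_inconsistency": 0.3,
    "ticket_purchase_method": 0.2,
}

if __name__ == "__main__":
    sample = [{"ethnicity": "A"}] * 70 + [{"ethnicity": "B"}] * 30
    print(audit_representation(sample))  # {'A': 0.7, 'B': 0.3} - both flagged
```

The point of the sketch is less the arithmetic than the ownership: the weights and tolerances live where the board, not the vendor, controls them.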

AVSEC Team

Aviation personifies the world of ‘can-do’ attitudes. Books on innovation and creativity call these personality types ‘the implementers’. A team consisting solely of implementers will fail at deploying AI-powered AVSEC systems. A properly balanced team will consist of implementers and innovators. Don’t fall for the person who claims to be both. At the start of an AI rollout, your team needs to be driven by the innovators but, at a defined point – some project milestone or even a date – the innovators need to give way to the implementers. An AI project rolled out without role definition and a properly balanced team will fail.

The Adversary

Here’s an uncomfortable fact: most universities teach machine learning for the perfect world. This is because it’s complicated enough to teach basic machine learning concepts to students, let alone concurrently teach them how to make AI systems resilient to adversarial exploitation. Think of it like this: a camera-based recognition system will identify a picture of a rifle on a T-shirt as readily as a rifle held in a person’s hand. Adversaries know that many vision systems can’t detect the difference between a 3D object and a 2D picture and are working to exploit this vulnerability. Understand the limits of your AI and continually test it in an adversarial manner.
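One way to make such testing routine is to encode the T-shirt scenario as a repeatable check: show the detector the real object and a flat printout of it, and pass only if it scores them differently. In the sketch below, the `detect` callable and the toy ‘depth’ cue are hypothetical stand-ins for whatever vision system and liveness signal you are actually evaluating.

```python
# A sketch of one adversarial check: a detector that cannot tell a real
# rifle from a printed picture of one should fail this test.
def adversarial_picture_test(detect, real_image, printed_image,
                             threshold=0.5):
    """Return True if the detector distinguishes object from picture."""
    real_score = detect(real_image)        # confidence a rifle is present
    printed_score = detect(printed_image)  # same scene, but a flat printout
    # A robust system scores the 3D object high and the 2D copy low.
    return real_score >= threshold and printed_score < threshold

if __name__ == "__main__":
    # Toy detectors: one looks only at texture (fooled by printouts),
    # the other also requires a depth cue, standing in for liveness checks.
    naive = lambda img: img["texture_match"]
    robust = lambda img: img["texture_match"] * img["depth_present"]

    real = {"texture_match": 0.9, "depth_present": 1.0}
    printout = {"texture_match": 0.9, "depth_present": 0.0}

    print(adversarial_picture_test(naive, real, printout))   # False - fails
    print(adversarial_picture_test(robust, real, printout))  # True - passes
```

Checks like this belong in your acceptance tests and in your continuing operations, not just in the vendor demo.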

Vendors

Insist that your vendor programs the system using experts in AVSEC. If your vendor’s team isn’t as good as your AVSEC team, you have a problem. Very powerful AI fuelled by AVSEC novices will give you powerfully faulty answers. The typical vendor response will be, “we have the finest AI programmers in the world”. That may be so, but theoretical knowledge can never be a functional replacement for operational experience. Finally, don’t be smitten by talk of neural networks and the expertise they bring. Vendors like touting this. Neural networks are excellent for making associations in unstructured data where none are obvious, but those associations can be notoriously inaccurate. Neural nets are a tool; make sure they’re the right tool for your job. The best screwdriver in the world is useless against a nail.
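The screwdriver point can be illustrated in a few lines. In the sketch below, assuming a small, structured, linearly separable dataset (entirely made up for illustration), a plain logistic regression matches a neural network while yielding coefficients an oversight body can actually read and challenge.

```python
# A sketch of the "right tool" question on small, structured data:
# a logistic regression gives auditable coefficients, while a neural
# network's learned weights are far harder to explain to an oversight
# body. The two-feature dataset is purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                  # two hypothetical risk factors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # simple, linear ground truth

linear = LogisticRegression().fit(X, y)
neural = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                       random_state=0).fit(X, y)

print("linear accuracy:", linear.score(X, y))
print("neural accuracy:", neural.score(X, y))
# The linear model's coefficients can be read and challenged directly;
# that auditability may matter more than a marginal accuracy gain.
print("auditable coefficients:", linear.coef_)
```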

If you’re like any other organisation, you’ve likely thrown up your hands and thought that these guidelines are a huge burden on procurement decisions. In fact, they are, and they should be. AVSEC is a unique industry where decisions can have life-or-death implications, lead to imprisonment, prevent people from finding work, or subject them to grave inconvenience. Important, high-impact decisions should be based on reliable, valid data processed by powerful tools. AI, therefore, is the future, but we must implement and use it wisely to ensure we don’t introduce new problems in attempting to solve others.


Ken Dunlap is managing partner at Catalyst-Go. His firm provides strategy development and enterprise architecture for procuring and deploying cutting-edge transportation technologies. He hosts the podcast and blog ThinkingThroughAutonomy. He is a co-founder and member of the board of the Partnership to Advance Responsible Technology (PART.AI). Before establishing Catalyst-Go, Ken was the International Air Transport Association’s (IATA) Director of Government Affairs and, prior to that, was IATA’s Global Director, Security and Travel Facilitation.
