AI Robustness (AIR)

One definition of robustness is the ability of a system to maintain the same high level of performance regardless of the conditions under which it is used.

In the context of AI systems, robustness introduces a new set of challenges. Neural network architectures are particularly difficult: they are hard to explain, and their nonlinear nature means they often exhibit unexpected behaviour. Characterizing the robustness of neural networks remains an open research problem, and both testing and verification methods have their drawbacks.
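Although characterizing robustness fully remains open, it can be probed empirically. The sketch below is a minimal, hypothetical illustration of one informal approach: measuring how often a model's predictions stay unchanged when inputs are perturbed by random noise. The toy linear classifier, the `robustness_score` metric, and all names here are illustrative assumptions, not methods prescribed by this document.

```python
import random

# Hypothetical toy "model": classifies a 2-D point by a fixed linear
# boundary. An illustrative stand-in for a real neural network.
def toy_model(x):
    return 1 if x[0] + x[1] > 0 else 0

def robustness_score(model, inputs, noise_scale, trials=100, seed=0):
    """Fraction of noisy predictions that agree with the clean prediction.

    A score of 1.0 means no sampled perturbation of the given scale
    changed any prediction; lower scores indicate brittleness to noise.
    """
    rng = random.Random(seed)
    agree = total = 0
    for x in inputs:
        clean = model(x)
        for _ in range(trials):
            noisy = [v + rng.gauss(0, noise_scale) for v in x]
            agree += (model(noisy) == clean)
            total += 1
    return agree / total

points = [(2.0, 2.0), (-2.0, -2.0), (0.1, 0.1)]
# Small perturbations rarely cross the decision boundary; large ones
# flip predictions far more often, yielding a lower score.
print(robustness_score(toy_model, points, noise_scale=0.01))
print(robustness_score(toy_model, points, noise_scale=5.0))
```

A real resilience evaluation would apply the same idea to held-out data with domain-appropriate perturbations (adversarial, distributional, or sensor noise) rather than isotropic Gaussian noise on a toy model.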

Controls related to this risk category are listed below:

  • AIR 01 - Evaluation of Resilience

  • AIR 02 - System Performance

  • AIR 03 - Measurable Performance Improvements