UBC Theses and Dissertations

Security analysis of deep neural network-based cyber-physical systems
Kashyap, Aarti

Abstract

Cyber-Physical Systems (CPS) are deployed in many mission-critical applications, such as medical devices (e.g., an Artificial Pancreas System (APS)), autonomous vehicular systems (e.g., self-driving cars and unmanned aerial vehicles), and aircraft control management systems (e.g., the Horizontal Collision Avoidance System (HCAS) and the Airborne Collision Avoidance System-Xu (ACAS-XU)). Ensuring correctness is becoming more difficult as these systems adopt new technologies, such as Deep Neural Networks (DNNs), for control. DNNs are black-box algorithms whose inner workings are complex and difficult to discern, so understanding their vulnerabilities is equally complex and difficult. We identify a new vulnerability in these systems and demonstrate how to synthesize a new category of attacks, Ripple False Data Injection Attacks (RFDIA), against them by perturbing specific inputs by minimal amounts to stealthily change the DNN’s output. These perturbations propagate as ripples through multiple DNN layers and can lead to corruptions that can be fatal. We demonstrate that such attacks can be constructed efficiently by identifying the DNN’s critical inputs: those that, when perturbed, affect the final outputs the most. Understanding this new class of attacks sets the stage for developing methods to mitigate the underlying vulnerabilities.

Our attack synthesis technique models the attack as an optimization problem using Mixed Integer Linear Programming (MILP). We define an abstraction for DNN-based CPS that allows us to automatically 1) identify the critical inputs, and 2) find the smallest perturbations that produce output changes. We demonstrate our technique on three practical CPS spanning two mission-critical application domains, in increasing order of complexity: medical systems (APS) and aircraft control management systems (HCAS and ACAS-XU). Our key observations for scaling the technique to complex systems such as ACAS-XU were to define 1) appropriate intervals for their inputs and outputs, and 2) attack-specific objective (cost) functions in the abstraction.
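The abstract’s notion of critical inputs — those whose small perturbations ripple most strongly through the layers to the output — can be illustrated with a toy sensitivity check. Everything below (the network shape, weights, and function names) is a hypothetical sketch for illustration, not the thesis’s actual model or method.

```python
def relu(x):
    """Rectified linear unit."""
    return x if x > 0.0 else 0.0

def forward(inputs, w1, w2):
    """Toy two-layer network: hidden = ReLU(W1·x), output = W2·hidden (scalar)."""
    hidden = [relu(sum(w * x for w, x in zip(row, inputs))) for row in w1]
    return sum(w * h for w, h in zip(w2, hidden))

# Hypothetical weights: 3 inputs -> 2 hidden units -> 1 output.
W1 = [[0.5, -1.2, 0.3],
      [1.0, 0.4, -0.7]]
W2 = [2.0, -1.5]

def rank_critical_inputs(inputs, eps=0.01):
    """Perturb each input by eps and rank inputs by how much the output moves."""
    base = forward(inputs, W1, W2)
    deltas = []
    for i in range(len(inputs)):
        perturbed = list(inputs)
        perturbed[i] += eps
        deltas.append((abs(forward(perturbed, W1, W2) - base), i))
    # Most output-sensitive (most "critical") input index first.
    return [i for _, i in sorted(deltas, reverse=True)]

print(rank_critical_inputs([1.0, 0.5, -0.2]))  # → [0, 2, 1]
```

Note that with a saturated (inactive) ReLU, a large weight can still yield zero sensitivity — which is why ranking must be done on the end-to-end output change, not on weight magnitudes alone.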
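The thesis casts attack synthesis as a MILP (ReLU units are commonly encoded with big-M constraints and a binary activation variable); that formulation is not reproduced here. As a stand-in, the sketch below brute-forces a discretized perturbation grid on a small hypothetical two-class network, returning the smallest-L1 perturbation that flips the decision — the same objective a MILP solver would optimize exactly. All weights and names are illustrative assumptions.

```python
import itertools

def relu(x):
    return max(x, 0.0)

def classify(inputs, w1, w2):
    """Tiny two-class ReLU network: argmax over two output scores."""
    hidden = [relu(sum(w * x for w, x in zip(row, inputs))) for row in w1]
    scores = [sum(w * h for w, h in zip(row, hidden)) for row in w2]
    return 0 if scores[0] >= scores[1] else 1

# Hypothetical weights: 2 inputs -> 2 hidden units -> 2 classes.
W1 = [[1.0, -0.5],
      [-0.3, 0.8]]
W2 = [[1.0, 0.2],
      [0.1, 1.0]]

def smallest_flip(inputs, step=0.05, max_steps=20):
    """Enumerate grid perturbations in increasing L1 cost and return the
    first (cost, delta) whose addition flips the classification."""
    base = classify(inputs, W1, W2)
    grid = range(-max_steps, max_steps + 1)
    candidates = []
    for steps in itertools.product(grid, repeat=len(inputs)):
        delta = [s * step for s in steps]
        candidates.append((sum(abs(d) for d in delta), delta))
    for cost, delta in sorted(candidates):
        if cost == 0.0:
            continue  # the unperturbed input is not an attack
        perturbed = [x + d for x, d in zip(inputs, delta)]
        if classify(perturbed, W1, W2) != base:
            return cost, delta
    return None  # no flip within the searched budget

cost, delta = smallest_flip([1.0, 0.2])
print(round(cost, 2))  # rounds to 0.85 for these toy weights
```

A real MILP encoding scales far better than this exhaustive grid: the binary activation variables let the solver reason over all ReLU on/off patterns at once instead of enumerating perturbations.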

Rights

Attribution-NonCommercial-NoDerivatives 4.0 International