Files in this item

File: HUANG-DISSERTATION-2016.pdf (1MB)
Description: (no description provided)
Format: application/pdf

Description

Title:Compositional analysis of networked cyber-physical systems: safety and privacy
Author(s):Huang, Zhenqi
Director of Research:Mitra, Sayan
Doctoral Committee Chair(s):Mitra, Sayan
Doctoral Committee Member(s):Dullerud, Geir; Kwiatkowska, Marta; Vaidya, Nitin
Department / Program:Electrical & Computer Eng
Discipline:Electrical & Computer Engr
Degree Granting Institution:University of Illinois at Urbana-Champaign
Degree:Ph.D.
Genre:Dissertation
Subject(s):Invariant Verification
Partial Order Reduction
Differential Privacy
Sensitivity Analysis
Cyber-Physical System
Abstract:Cyber-physical systems (CPS) are now commonplace in power grids, manufacturing, and embedded medical devices. Failures and attacks on these systems have caused significant social, environmental, and financial losses. In this thesis, we develop techniques for proving invariance and privacy properties of cyber-physical systems that could aid the development of more robust and reliable systems. The thesis uses three different modeling formalisms capturing different aspects of CPS. Networked dynamical systems are used for modeling (possibly time-delayed) interaction of ordinary differential equations, such as in power systems and biological networks. Labeled transition systems are used for modeling discrete communications and updates, such as in sampled-data control systems. Finally, Markov chains are used for describing distributed cyber-physical systems that rely on randomized algorithms for communication, such as in a crowd-sourced traffic monitoring and routing system. Despite the differences in these formalisms, any model of a CPS can be viewed as a mapping from a parameter space (for example, the set of initial states) to a space of behaviors (also called trajectories or executions). In each formalism, we define a notion of sensitivity that captures the change in trajectories as a function of the change in the parameters. We develop approaches for approximating these sensitivity functions, which in turn are used for analysis of invariance and privacy. For proving invariance, we compute an over-approximation of the reach set, which is the set of states visited by any trajectory. We introduce a notion of input-to-state (IS) discrepancy functions for components of large CPS, which roughly captures the sensitivity of a component to its initial state and input. We develop a method for constructing a reduced model of the entire system using the IS discrepancy functions.
Then, we show that the trajectory of the reduced model over-approximates the sensitivity of the entire system with respect to the initial states. Using the above results, we develop a sound and relatively complete algorithm for compositional invariant verification. In systems where distributed components take actions concurrently, there is a combinatorial explosion in the number of different action sequences (or traces). We develop a partial order reduction method for computing the reach set of these systems. Our approach uses the observation that some action pairs are approximately independent, in that executing these actions in either order results in states that are close to each other. Hence, a (large) set of traces can be partitioned into a (small) set of equivalence classes, where equivalent traces are derived by swapping approximately independent action pairs. We quantify the sensitivity of the system with respect to swapping approximately independent action pairs, which upper-bounds the distance between executions with equivalent traces. Finally, we develop an algorithm for precisely over-approximating the reach set of these systems that explores only a reduced set of traces. In many modern systems that allow users to share data, there is a tension between improving global performance and preserving user privacy. We propose a mechanism that guarantees ε-differential privacy for the participants, where each participant adds noise to its private data before sharing. The noise distributions are determined by the sensitivity of the agents' trajectories to the private data. We analyze the trade-off between ε-differential privacy and performance, and show that the cost of differential privacy scales quadratically with the privacy level. The thesis illustrates that quantitative bounds on sensitivity can be used for effective reachability analysis, partial order reduction, and in the design of privacy-preserving distributed cyber-physical systems.
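The abstract's core idea of bloating a simulated trajectory by a sensitivity (discrepancy) bound to over-approximate the reach set can be illustrated with a minimal sketch. This is not the thesis's algorithm: it assumes a hypothetical one-dimensional linear system x' = -x with a closed-form solution and an exponential discrepancy bound, chosen only to make the soundness of the bloating step concrete.

```python
import math

# Hypothetical 1-D contractive system x' = -x. A discrepancy function
# beta(delta, t) = delta * exp(L * t) with L = -1 bounds how far two
# trajectories starting delta apart can be at time t.
L_CONST = -1.0

def trajectory(x0, t):
    # Closed-form solution of x' = -x from initial state x0.
    return x0 * math.exp(-t)

def discrepancy(delta, t):
    # Sensitivity bound: |x(t; x0) - x(t; x0')| <= delta * e^(L t)
    # whenever |x0 - x0'| <= delta.
    return delta * math.exp(L_CONST * t)

def reach_interval(center_x0, delta, t):
    # Over-approximate the reach set at time t from the initial ball
    # [center_x0 - delta, center_x0 + delta]: simulate one trajectory
    # from the center and bloat it by the discrepancy bound.
    c = trajectory(center_x0, t)
    b = discrepancy(delta, t)
    return (c - b, c + b)

# Every trajectory from the initial ball [0.9, 1.1] lies in [lo, hi]
# at t = 2, so checking an invariant on [lo, hi] is sound.
lo, hi = reach_interval(1.0, 0.1, 2.0)
```

The same simulate-and-bloat pattern underlies simulation-driven reachability tools, with the discrepancy function supplying the bloating radius.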
Issue Date:2016-11-28
Type:Thesis
URI:http://hdl.handle.net/2142/95351
Rights Information:Copyright 2016 Zhenqi Huang
Date Available in IDEALS:2017-03-01
Date Deposited:2016-12