POPL 2019
Sun 13 - Sat 19 January 2019 Cascais, Portugal
Wed 16 Jan 2019 15:43 - 16:05 at Sala I - Machine Learning and Linear Algebra Chair(s): Aws Albarghouthi

We present a novel method for scalable and precise certification of deep neural networks. The key technical insight behind our approach is a new abstract domain which combines floating-point polyhedra with intervals and is equipped with abstract transformers specifically tailored to the domain of neural networks. Concretely, we introduce new transformers for the affine transform, the rectified linear unit (ReLU) activation, sigmoid, tanh, and maxpool functions.
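To illustrate the idea of propagating abstract elements through a network with per-layer transformers, here is a deliberately simplified sketch using intervals only. It is not the full domain from the paper (which additionally keeps symbolic lower and upper linear bounds per neuron to track dependencies between variables); the function names and the toy two-layer network are illustrative assumptions.

```python
import numpy as np

def affine_bounds(l, u, W, b):
    """Interval transformer for an affine layer y = W x + b.
    Positive weights map lower bounds to lower bounds;
    negative weights swap the roles of l and u."""
    W_pos = np.maximum(W, 0.0)
    W_neg = np.minimum(W, 0.0)
    l_out = W_pos @ l + W_neg @ u + b
    u_out = W_pos @ u + W_neg @ l + b
    return l_out, u_out

def relu_bounds(l, u):
    """Interval transformer for ReLU: clamp both bounds at zero."""
    return np.maximum(l, 0.0), np.maximum(u, 0.0)

# Propagate the input box [-1, 1] x [-1, 1] through a tiny
# affine + ReLU layer (weights chosen arbitrarily for the example).
l, u = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
W1 = np.array([[1.0, 1.0], [1.0, -1.0]])
b1 = np.zeros(2)
l, u = relu_bounds(*affine_bounds(l, u, W1, b1))
```

The interval domain alone loses the correlation between the two outputs of the affine layer; retaining linear constraints per neuron, as the paper's domain does, is what recovers that precision while staying far cheaper than full polyhedra.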

We implemented our method in a system called DeepPoly and evaluated it extensively on a range of datasets, neural architectures, and specifications. Our experimental results indicate that DeepPoly is more precise than prior work while scaling to large networks. We also show how to combine DeepPoly with abstraction refinement in order to prove, for the first time, robustness of a given input (e.g., an image) to complex perturbations such as rotations.

Wed 16 Jan

Research Papers: Machine Learning and Linear Algebra
15:21 - 16:27 at Sala I
Chair: Aws Albarghouthi (University of Wisconsin-Madison)

15:21 - 15:43 Talk
Uri Alon (Technion), Meital Zilberstein (Technion), Omer Levy (University of Washington, USA), Eran Yahav (Technion)

15:43 - 16:05 Talk (this talk)

16:05 - 16:27 Talk
Zachary Kincaid (Princeton University), Jason Breck (University of Wisconsin - Madison), John Cyphert (University of Wisconsin - Madison), Thomas Reps (University of Wisconsin - Madison and GrammaTech, Inc.)