POPL 2019
Sun 13 - Sat 19 January 2019, Cascais, Portugal
Wed 16 Jan 2019 15:43 - 16:05 at Sala I - Machine Learning and Linear Algebra Chair(s): Aws Albarghouthi

We present a novel method for scalable and precise certification of deep neural networks. The key technical insight behind our approach is a new abstract domain that combines floating-point polyhedra with intervals and is equipped with abstract transformers specifically tailored to the setting of neural networks. Concretely, we introduce new transformers for affine transforms, the rectified linear unit (ReLU) activation, and the sigmoid, tanh, and maxpool functions.
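
The domain described in the paper keeps richer information than plain intervals (symbolic affine lower and upper bounds per neuron), but the way its transformers compose layer by layer can be illustrated with a much coarser interval-only sketch. The code below is a hedged illustration, not the authors' implementation or API: all names (`affine`, `relu`, `certify_linf`) are made up for this example, and the analysis it performs is strictly less precise than the domain in the paper.

```python
import numpy as np

def affine(lo, hi, W, b):
    """Interval transformer for y = W @ x + b: splitting W into its
    positive and negative parts gives sound elementwise output bounds."""
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return W_pos @ lo + W_neg @ hi + b, W_pos @ hi + W_neg @ lo + b

def relu(lo, hi):
    """ReLU transformer: max(x, 0) is monotone, so bounds map through."""
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

def certify_linf(x0, eps, layers, target):
    """Certify an L-infinity ball of radius eps around input x0: the
    property holds if the target logit's certified lower bound exceeds
    every other logit's certified upper bound."""
    lo, hi = x0 - eps, x0 + eps
    for i, (W, b) in enumerate(layers):
        lo, hi = affine(lo, hi, W, b)
        if i < len(layers) - 1:  # no activation on the output layer
            lo, hi = relu(lo, hi)
    return all(lo[target] > hi[j] for j in range(len(lo)) if j != target)
```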

We implemented our method in a system called DeepPoly and evaluated it extensively on a range of datasets, neural architectures, and specifications. Our experimental results indicate that DeepPoly is more precise than prior work while scaling to large networks. We also show how to combine DeepPoly with abstraction refinement to prove, for the first time, the robustness of a network on a given input (e.g., an image) under complex perturbations such as rotations.
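
The refinement step is only described abstractly here, but one simple instance of the idea is case splitting over the perturbation parameter: certifying many small rotation-angle sub-ranges is more precise than certifying one coarse over-approximation of the whole range. The sketch below reuses `affine` and `relu` from the previous snippet and is again an illustrative assumption, not the paper's algorithm; in particular, `envelope` is a hypothetical callback that must return sound per-pixel input bounds covering every image reachable for parameter values in a given sub-range.

```python
import numpy as np

def certify_split(envelope, theta_lo, theta_hi, layers, target, pieces=16):
    """Case splitting over a perturbation parameter, a simple form of
    abstraction refinement (hypothetical interface).

    envelope(a, b) -> (lo, hi): sound per-pixel bounds on every input
    reachable for parameter values in [a, b]."""
    edges = np.linspace(theta_lo, theta_hi, pieces + 1)
    for a, b in zip(edges[:-1], edges[1:]):
        lo, hi = envelope(a, b)
        for i, (W, bias) in enumerate(layers):
            lo, hi = affine(lo, hi, W, bias)
            if i < len(layers) - 1:  # no activation on the output layer
                lo, hi = relu(lo, hi)
        # Every sub-range must keep the target logit strictly on top.
        if not all(lo[target] > hi[j] for j in range(len(lo)) if j != target):
            return False
    return True
```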

Wed 16 Jan

Displayed time zone: Belfast

15:21 - 16:27
Machine Learning and Linear Algebra (Research Papers) at Sala I
Chair(s): Aws Albarghouthi University of Wisconsin-Madison
15:21
22m
Talk
code2vec: Learning Distributed Representations of Code
Research Papers
Uri Alon Technion, Meital Zilberstein Technion, Omer Levy University of Washington, USA, Eran Yahav Technion
15:43
22m
Talk
An Abstract Domain for Certifying Neural Networks
Research Papers
Gagandeep Singh ETH Zurich, Timo Gehr ETH Zurich, Markus Püschel ETH Zurich, Martin Vechev ETH Zurich
16:05
22m
Talk
Closed Forms for Numerical Loops
Research Papers
Zachary Kincaid Princeton University, Jason Breck University of Wisconsin - Madison, John Cyphert University of Wisconsin - Madison, Thomas Reps University of Wisconsin - Madison and GrammaTech, Inc.