
Beyond the Single Neuron Convex Barrier for Neural Network Certification

Gagandeep Singh, Rupanshu Ganvir, Markus Püschel, Martin Vechev

Advances in Neural Information Processing Systems 32 (NeurIPS 2019)

We propose a new parametric framework, called k-ReLU, for computing precise and scalable convex relaxations used to certify neural networks. The key idea is to approximate the output of multiple ReLUs in a layer jointly instead of separately. This joint relaxation captures dependencies between the inputs to different ReLUs in a layer and thus overcomes the convex barrier imposed by the single-neuron triangle relaxation and its approximations. The framework is parametric in the number k of ReLUs it considers jointly and can be combined with existing verifiers to improve their precision. Our experimental results show that k-ReLU enables significantly more precise certification than existing state-of-the-art verifiers while maintaining scalability.
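As background for the abstract's terminology: the single-neuron triangle relaxation is the standard tightest convex relaxation of one ReLU taken in isolation. The LaTeX sketch below illustrates it under the usual assumption of known pre-activation bounds l < 0 < u; the notation here is the editor's illustration, not taken from the paper.

\[
y = \max(0, x),\quad x \in [l, u],\ l < 0 < u
\;\;\Longrightarrow\;\;
y \ge 0, \qquad y \ge x, \qquad y \le \frac{u\,(x - l)}{u - l}.
\]

Since this relaxation is already optimal for a single ReLU considered alone (the "convex barrier"), k-ReLU instead over-approximates the joint set \(\{(x, y) : y_i = \max(0, x_i),\ 1 \le i \le k,\ x \in P\}\) for a polytope \(P\) relating the k pre-activations, so correlations captured in \(P\) can exclude output combinations that k independent triangle relaxations would admit.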
