ReLU Hull Approximation
File version
Version of Record (VoR)
Author(s)
Ma, Z
Li, J
Bai, G
Abstract
Convex hulls are commonly used to tackle the non-linearity of activation functions in the verification of neural networks, but computing the exact convex hull is costly. In this work, we propose a fast and precise approach to over-approximating the convex hull of the ReLU function (referred to as the ReLU hull), one of the most widely used activation functions. Our key insight is to formulate a convex polytope that "wraps" the ReLU hull by reusing the linear pieces of the ReLU function as the lower faces and constructing upper faces adjacent to them. The upper faces can be constructed efficiently from the edges and vertices of the lower faces, since an n-dimensional (or simply nd hereafter) hyperplane is determined by an (n-1)d hyperplane and a point outside of it. We implement our approach as WraLU and evaluate its precision, efficiency, constraint complexity, and scalability. WraLU outperforms existing advanced methods, generating fewer constraints to achieve a tighter approximation in less time. It also handles arbitrary input polytopes and higher-dimensional cases, which are beyond the capabilities of existing methods. We integrate WraLU into PRIMA, a state-of-the-art neural network verifier, and apply it to verify large-scale ReLU-based neural networks. Our experimental results demonstrate that WraLU achieves high efficiency without compromising precision: it reduces the number of constraints passed to the linear programming solver by up to half while delivering results comparable or superior to those of state-of-the-art verifiers.
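The geometric insight in the abstract can be sketched in code. The snippet below is a hypothetical illustration, not the authors' WraLU implementation: in the 3D case, an upper face of the wrapping polytope passes through an edge of a lower face (two points) together with one additional vertex off that edge, and those three points determine the plane.

```python
# Hypothetical 3D sketch of the abstract's key insight (not WraLU code):
# a plane (a 2d hyperplane in 3d space) is fixed by a 1d edge of a lower
# face -- i.e. two of its points -- plus one point outside that edge.

def cross(u, v):
    """Cross product of two 3-vectors; its result is normal to both."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def plane_through(p, q, r):
    """Return (a, b) such that a . x = b is the plane through p, q, r."""
    u = tuple(qi - pi for qi, pi in zip(q, p))  # edge direction
    v = tuple(ri - pi for ri, pi in zip(r, p))  # toward the outside point
    a = cross(u, v)                             # plane normal
    b = sum(ai * pi for ai, pi in zip(a, p))    # offset so p lies on it
    return a, b

# An edge of the flat lower piece of ReLU (z = 0) plus one upper vertex:
a, b = plane_through((0, 0, 0), (1, 0, 0), (0, 1, 1))
# The resulting face satisfies -y + z = 0, i.e. z = y.
```

This three-points-determine-a-plane construction is the low-dimensional instance of the paper's observation that an nd hyperplane is determined by an (n-1)d hyperplane and one point outside it; WraLU exploits it to build upper faces directly from the lower faces' edges and vertices instead of computing the exact hull.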
Journal Title
Proceedings of the ACM on Programming Languages
Volume
8
Issue
POPL
Rights Statement
© 2024 Copyright held by the owner/author(s). This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Subject
Neural networks
Software engineering
Theory of computation
Numerical and computational mathematics
Citation
Ma, Z; Li, J; Bai, G, ReLU Hull Approximation, Proceedings of the ACM on Programming Languages, 2024, 8 (POPL), pp. 2260-2287