
ExLipBaB: Exact Lipschitz Constant Computation for Piecewise Linear Neural Networks

Tom A. Splittgerber

0 citations · 32 references · arXiv (Cornell University)


Published on arXiv · 2602.15499

Input Manipulation Attack

OWASP ML Top 10 — ML01

Key Finding

Extends exact Lipschitz constant computation from ReLU-only networks to all continuous piecewise linear activations, enabling robustness certification for architectures previously unsupported by exact methods.

ExLipBaB

Novel technique introduced


It has been shown that a neural network's Lipschitz constant can be leveraged to derive robustness guarantees, to improve generalizability via regularization, or even to construct invertible networks. Therefore, a number of methods varying in the tightness of their bounds and their computational cost have been developed to approximate the Lipschitz constant for different classes of networks. However, comparatively little research exists on methods for exact computation, which has been shown to be NP-hard. Nonetheless, there are applications where one might readily accept the computational cost of an exact method, such as benchmarking new approximate methods or computing robustness guarantees for small models on sensitive data. Unfortunately, existing exact algorithms restrict themselves to ReLU-activated networks, which are known to come with severe downsides in the context of Lipschitz-constrained networks. We therefore propose a generalization of the LipBaB algorithm to compute exact Lipschitz constants for arbitrary piecewise linear neural networks and $p$-norms. With our method, networks may contain traditional activations like ReLU or LeakyReLU; activations like GroupSort or the related MinMax and FullSort, which have been of increasing interest in the context of Lipschitz-constrained networks; or even other piecewise linear functions like MaxPool.
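To make the activation classes concrete: GroupSort sorts the entries of its input within fixed-size groups, which makes it continuous piecewise linear (each linear piece is a permutation) and gradient-norm preserving, unlike ReLU, which can only zero out coordinates. A minimal sketch (this illustrative helper is not from the paper):

```python
import numpy as np

def group_sort(x, group_size=2):
    """GroupSort activation: sort entries within consecutive groups.

    Piecewise linear: on each region it acts as a fixed permutation
    matrix, so it preserves every p-norm of the input and of gradients.
    With group_size=2 this is the MinMax activation; with
    group_size=len(x) it is FullSort.
    """
    x = np.asarray(x, dtype=float)
    assert x.size % group_size == 0, "input length must divide into groups"
    return np.sort(x.reshape(-1, group_size), axis=1).reshape(x.shape)
```

For example, `group_sort([3, 1, 2, 5, 4, 0])` sorts the pairs `(3,1)`, `(2,5)`, `(4,0)` independently, and the output always has the same Euclidean norm as the input.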


Key Contributions

  • Generalizes the LipBaB Branch-and-Bound algorithm to compute exact Lipschitz constants for arbitrary continuous piecewise linear neural networks (not just ReLU), including GroupSort, MinMax, FullSort, LeakyReLU, and MaxPool
  • Supports arbitrary p-norms, enabling exact local and global Lipschitz constant computation for Lipschitz-constrained network architectures
  • Provides hard upper and lower bounds when stopped early, making it useful for benchmarking approximate methods and computing robustness certificates for small, sensitive-data models

🛡️ Threat Analysis

Input Manipulation Attack

Exact Lipschitz constant computation directly enables certified robustness guarantees against adversarial input perturbations: the paper explicitly frames robustness quantification against adversarial examples as the core application.
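The standard way a Lipschitz constant turns into a certificate: if every logit is L-Lipschitz, a perturbation of norm ε changes each logit by at most L·ε, so the prediction cannot flip while 2·L·ε is below the logit margin. A minimal sketch of this textbook bound (the helper name and signature are assumptions, not from the paper):

```python
import numpy as np

def certified_radius(logits, lipschitz_const, true_class):
    """Certified robustness radius from a per-logit Lipschitz constant L.

    Each logit moves by at most L * ||delta|| under perturbation delta,
    so the top class is preserved whenever 2 * L * ||delta|| < margin,
    giving the certified radius margin / (2 * L).
    """
    logits = np.asarray(logits, dtype=float)
    margin = logits[true_class] - np.max(np.delete(logits, true_class))
    return max(margin, 0.0) / (2.0 * lipschitz_const)
```

A tighter constant directly enlarges the certified radius, which is why exact (rather than over-approximated) Lipschitz constants yield stronger certificates for the same model.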


Details

Domains
vision
Model Types
cnn, transformer
Threat Tags
digital, inference_time
Applications
certified robustness, neural network verification, lipschitz-constrained networks