Published on arXiv

2603.21876

Input Manipulation Attack

OWASP ML Top 10 — ML01

Key Finding

Achieves an outstanding physical attack success rate with strong cross-domain generalization and black-box transferability, without any online computational overhead

UPPA (Universal Physical Patch Attack)

Novel technique introduced


Although infrared pedestrian detectors have been widely deployed in visual perception tasks, their vulnerability to physical adversarial attacks is becoming increasingly apparent. Existing physical attack methods predominantly rely on instance-specific online optimization and rigid pattern design, leading to high deployment costs and insufficient physical robustness. To address these limitations, this work proposes the Universal Physical Patch Attack (UPPA), the first universal physical attack method in the infrared domain. The method models perturbations with geometrically constrained, parameterized Bézier blocks and uses the Particle Swarm Optimization (PSO) algorithm to perform unified optimization across the global data distribution, maintaining topological stability under dynamic deformations. In the physical deployment phase, the optimized digital perturbations are materialized as physical cold patches, yielding a continuous, smooth low-temperature distribution that naturally aligns with the thermal radiation characteristics of infrared imaging. Extensive experiments demonstrate that UPPA achieves an outstanding physical attack success rate without any online computational overhead, while also exhibiting strong cross-domain generalization and reliable black-box transferability.
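As a rough illustration of the Bézier-based perturbation modeling described above, the sketch below renders a smooth, closed patch outline from cubic Bézier segments. The control-point layout, segment count, and sampling resolution are illustrative assumptions, not the paper's actual parameterization.

```python
# Hedged sketch: a smooth patch outline from parameterized cubic Bézier
# segments, in the spirit of UPPA's geometrically constrained Bézier blocks.
# Control points and resolution below are illustrative assumptions.
import numpy as np

def bezier_points(ctrl, n=100):
    """Evaluate a cubic Bézier curve at n parameter values in [0, 1]."""
    p0, p1, p2, p3 = (np.asarray(p, dtype=float) for p in ctrl)
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# A closed blob built from two mirrored cubic segments (hypothetical shape).
top = bezier_points([(0, 0), (0.3, 0.6), (0.7, 0.6), (1, 0)])
bottom = bezier_points([(1, 0), (0.7, -0.6), (0.3, -0.6), (0, 0)])
outline = np.vstack([top, bottom])
```

Because the shape is fully described by a handful of control points, the optimizer searches a low-dimensional, smoothness-preserving space, which is what keeps the patch physically realizable as a cold patch.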


Key Contributions

  • First universal physical patch attack in the infrared domain using parameterized Bézier curves for smooth perturbation modeling
  • Combines Particle Swarm Optimization with TPS and EOT to achieve topological stability under dynamic deformations
  • Achieves zero online computational overhead through offline universal perturbation generation
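The offline universal optimization step can be sketched as a standard Particle Swarm Optimization loop over the patch parameters. The fitness function here is a toy stand-in; in UPPA it would aggregate the detector's confidence on patched images across the whole training distribution (with TPS/EOT augmentation). Swarm size, inertia, and acceleration coefficients are conventional defaults, not values from the paper.

```python
# Hedged sketch: minimal PSO over a low-dimensional patch parameterization.
# fitness() is a toy surrogate; in the real attack it would be the average
# detection score of patched images over the global data distribution.
import numpy as np

rng = np.random.default_rng(0)

def fitness(params):
    # Toy convex surrogate standing in for the detector-based objective.
    return float(np.sum((params - 0.5) ** 2))

def pso(dim=8, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(0.0, 1.0, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.uniform(size=(2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)  # keep parameters in valid range
        vals = np.array([fitness(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, fitness(gbest)

best, val = pso()
```

Because this search runs entirely offline against the global distribution, the resulting universal patch is deployed as-is at inference time, which is the source of the zero online overhead claimed above.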

🛡️ Threat Analysis

Input Manipulation Attack

Creates adversarial physical patches that cause infrared pedestrian detectors to miss targets at inference time. Parameterized Bézier curves and PSO optimization generate a single universal perturbation that transfers across instances, so the same physical patch suppresses detections wherever it is deployed. This is a physical, untargeted evasion attack.
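A key ingredient in making such a patch survive real-world deformation is Expectation over Transformation (EOT): candidate patches are scored as an average over randomly transformed copies rather than a single rendering. The sketch below is a minimal, hypothetical version; the transform family (intensity jitter) and the `detector_score` stand-in are illustrative assumptions, not the paper's TPS-based pipeline.

```python
# Hedged sketch of EOT-style scoring: average a (stand-in) detector score
# over random transforms so the optimized patch stays effective under
# physical variation. Transform set and detector are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def apply_transform(patch, scale, shift):
    # Toy stand-in for TPS warps / viewpoint and thermal variation.
    return np.clip(patch * scale + shift, 0.0, 1.0)

def detector_score(img):
    # Hypothetical stand-in for a detector's pedestrian confidence in [0, 1].
    return float(img.mean())

def eot_score(patch, n_samples=32):
    scores = []
    for _ in range(n_samples):
        scale = rng.uniform(0.8, 1.2)
        shift = rng.uniform(-0.1, 0.1)
        scores.append(detector_score(apply_transform(patch, scale, shift)))
    return float(np.mean(scores))

patch = rng.uniform(0.0, 1.0, (32, 32))
avg = eot_score(patch)
```

Minimizing this expectation, rather than the score of one fixed rendering, is what gives the patch robustness when it is finally fabricated as a cold patch and worn in the physical world.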


Details

Domains
vision
Model Types
cnn
Threat Tags
black_box, inference_time, untargeted, physical
Applications
infrared pedestrian detection, thermal imaging