attack 2026

R-PGA: Robust Physical Adversarial Camouflage Generation via Relightable 3D Gaussian Splatting

Tianrui Lou 1, Siyuan Liang 2, Jiawei Liang 1, Yuze Gao 1, Xiaochun Cao 1

0 citations


Published on arXiv

2603.26067

Input Manipulation Attack

OWASP ML Top 10 — ML01

Key Finding

Achieves robust adversarial effectiveness across diverse viewing angles and dynamic illumination conditions by mining worst-case configurations and using physically disentangled 3D Gaussian splatting

R-PGA

Novel technique introduced


Physical adversarial camouflage poses a severe security threat to autonomous driving systems by mapping adversarial textures onto 3D objects. Nevertheless, current methods remain brittle in complex dynamic scenarios, failing to generalize across diverse geometric (e.g., viewing configurations) and radiometric (e.g., dynamic illumination, atmospheric scattering) variations. We attribute this deficiency to two fundamental limitations in simulation and optimization. First, the reliance on coarse, oversimplified simulations (e.g., via CARLA) induces a significant domain gap, confining optimization to a biased feature space. Second, standard strategies targeting average performance result in a rugged loss landscape, leaving the camouflage vulnerable to configuration shifts.

To bridge these gaps, we propose the Relightable Physical 3D Gaussian Splatting (3DGS) based Attack framework (R-PGA). Technically, to address the simulation fidelity issue, we leverage 3DGS to ensure photo-realistic reconstruction and augment it with physically disentangled attributes that decouple intrinsic material from lighting. Furthermore, we design a hybrid rendering pipeline that uses precise Relightable 3DGS for foreground rendering while employing a pre-trained image translation model to synthesize plausible relit backgrounds that align with the relit foreground.

To address the optimization robustness issue, we propose the Hard Physical Configuration Mining (HPCM) module, designed to actively mine worst-case physical configurations and suppress their corresponding loss peaks. This strategy not only diminishes the overall loss magnitude but also effectively flattens the rugged loss landscape, ensuring consistent adversarial effectiveness and robustness across varying physical configurations.
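The core of the HPCM idea, optimizing against the worst-case physical configurations rather than the average, can be sketched as a selection step inside a min-max loop. This is an illustrative sketch only: the function names, the (azimuth, light) parameterization, and the toy loss are assumptions, not the paper's implementation, and the real pipeline would backpropagate through a differentiable renderer instead.

```python
import random

def mine_hard_configs(loss_fn, texture, sample_config, n_samples=32, k_hard=4):
    """Sample candidate physical configurations (e.g. viewpoint, illumination)
    and return the k_hard with the highest adversarial loss: the worst cases
    an HPCM-style min-max step would then optimize the texture against."""
    configs = [sample_config() for _ in range(n_samples)]
    return sorted(configs, key=lambda c: loss_fn(texture, c), reverse=True)[:k_hard]

# Toy stand-in for the adversarial loss: peaks at grazing viewing angles
# (far from 90 degrees) and under dim illumination.
def toy_loss(texture, cfg):
    azimuth_deg, light = cfg
    return abs(azimuth_deg - 90.0) / 90.0 + (1.0 - light)

random.seed(0)
sample = lambda: (random.uniform(0.0, 180.0), random.uniform(0.2, 1.0))
hard = mine_hard_configs(toy_loss, texture=None, sample_config=sample)
```

Descending only on the mined `hard` subset each iteration suppresses the loss peaks, which is what flattens the loss landscape relative to averaging over all sampled configurations.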


Key Contributions

  • Relightable 3DGS framework for photo-realistic physical adversarial camouflage that generalizes across lighting conditions
  • Hybrid rendering pipeline decoupling intrinsic material from lighting for improved domain transfer
  • Hard Physical Configuration Mining (HPCM) module that flattens loss landscape by targeting worst-case physical configurations
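The hybrid rendering contribution amounts to compositing a foreground rendered by the relightable 3DGS model (RGB plus alpha) over a background relit by a separate image-translation model. A minimal sketch of that merge step, assuming (H, W, 3) RGB arrays and an (H, W, 1) alpha map (the function name is hypothetical):

```python
import numpy as np

def hybrid_composite(fg_rgb, fg_alpha, bg_rgb):
    """Standard "over" alpha compositing of the relit 3DGS foreground onto
    the relit, image-translation-generated background."""
    return fg_alpha * fg_rgb + (1.0 - fg_alpha) * bg_rgb

# Toy 1x1 "image": half-opaque white foreground over a black background.
out = hybrid_composite(np.ones((1, 1, 3)), np.full((1, 1, 1), 0.5),
                       np.zeros((1, 1, 3)))
```

Keeping the foreground differentiable while the background comes from a frozen translation model is what lets gradients reach the camouflage texture without simulating the whole scene.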

🛡️ Threat Analysis

Input Manipulation Attack

Creates adversarial textures/patterns mapped onto physical 3D objects to cause misclassification in autonomous driving vision systems at inference time; this is a physical adversarial attack targeting object detectors.


Details

Domains
vision
Model Types
cnn
Threat Tags
white_box, inference_time, untargeted, physical
Applications
autonomous driving, object detection