
Dynamic Parameter Optimization for Highly Transferable Transformation-Based Attacks

Jiaming Liang , Chi-Man Pun

0 citations · 43 references · arXiv

Published on arXiv · 2511.11993

Input Manipulation Attack

OWASP ML Top 10 — ML01

Key Finding

Applying DPO to BSR with ResNet-50 as the surrogate improves the average non-targeted attack success rate by 3.2% (to 90.7%) at Epoch 100 across eight diverse target models

Dynamic Parameter Optimization (DPO)

Novel technique introduced


Despite their wide application, the vulnerabilities of deep neural networks raise societal concerns. Among them, transformation-based attacks have demonstrated notable success in transfer attacks. However, existing attacks suffer from blind spots in parameter optimization, limiting their full potential. Specifically, (1) prior work generally considers low-iteration settings, yet attacks perform quite differently at higher iterations, so characterizing overall performance based only on low-iteration results is misleading. (2) Existing attacks use uniform parameters across different surrogate models, iterations, and tasks, which greatly impairs transferability. (3) Traditional transformation parameter optimization relies on grid search: for n parameters with m steps each, the complexity is O(mⁿ). This large computational overhead limits further optimization of the parameters. To address these limitations, we conduct an empirical study with various transformations as baselines, revealing three dynamic patterns of transferability with respect to parameter strength. We further propose a novel Concentric Decay Model (CDM) that effectively explains these patterns. Building on these insights, we propose an efficient Dynamic Parameter Optimization (DPO) based on the rise-then-fall pattern, reducing the complexity to O(n log m). Comprehensive experiments on existing transformation-based attacks across different surrogate models, iterations, and tasks demonstrate that DPO significantly improves transferability.
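The complexity reduction follows from the rise-then-fall pattern: if transferability is unimodal in each parameter's strength, a ternary search over a parameter's m candidate values needs only O(log m) evaluations, and tuning n parameters coordinate-wise costs O(n log m) rather than the O(mⁿ) of a full grid search. The sketch below is an illustration of that search principle only, not the paper's actual DPO algorithm; the parameter names and the `evaluate` callback (a stand-in for measuring transfer success at a given setting) are hypothetical.

```python
# Illustrative sketch (not the paper's implementation): exploit a unimodal
# rise-then-fall score curve to replace exhaustive grid search with a
# per-parameter ternary search, tuned coordinate-wise across n parameters.

def ternary_search_max(values, score):
    """Return the value in a sorted candidate list maximizing a unimodal `score`.

    Uses O(log m) score evaluations instead of the O(m) of a linear sweep.
    """
    lo, hi = 0, len(values) - 1
    while hi - lo > 2:
        m1 = lo + (hi - lo) // 3
        m2 = hi - (hi - lo) // 3
        if score(values[m1]) < score(values[m2]):
            lo = m1 + 1  # peak lies to the right of m1
        else:
            hi = m2      # peak lies at or to the left of m2
    return max(values[lo:hi + 1], key=score)

def optimize_parameters(param_grids, evaluate):
    """Coordinate-wise tuning: n parameters at O(log m) evaluations each,
    O(n log m) total, versus O(m^n) for a full grid search."""
    # Start each parameter at the middle of its grid.
    current = {name: grid[len(grid) // 2] for name, grid in param_grids.items()}
    for name, grid in param_grids.items():
        def score(v, name=name):
            trial = dict(current, **{name: v})
            return evaluate(trial)
        current[name] = ternary_search_max(grid, score)
    return current
```

For example, with a synthetic unimodal score peaking at `blocks=4`, `scale=0.7`, `optimize_parameters` recovers that setting with far fewer evaluations than the 320-point grid. Coordinate-wise search is exact here because the toy score is separable; for coupled parameters it is a heuristic.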


Key Contributions

  • Empirical study revealing three dynamic patterns of adversarial transferability with respect to transformation parameter strength across iterations, surrogate models, and tasks — showing low-iteration evaluations are misleading
  • Concentric Decay Model (CDM) that theoretically explains observed transferability patterns using KL-divergence-based plausible model density around the surrogate
  • Dynamic Parameter Optimization (DPO) that reduces parameter search complexity from O(mⁿ) to O(n log₂ m) while substantially improving transfer attack success rates across existing attacks (Admix, SSIM, STM, BSR)

🛡️ Threat Analysis

Input Manipulation Attack

Proposes Dynamic Parameter Optimization (DPO) to improve transformation-based adversarial attacks. These are input manipulation attacks that craft adversarial examples which transfer across black-box target models at inference time, directly improving evasion attack success rates.


Details

Domains
vision
Model Types
cnn, transformer
Threat Tags
black_box, grey_box, inference_time, untargeted, targeted, digital
Datasets
NeurIPS'17 adversarial dataset
Applications
image classification