Influence of Parallelism in Vector-Multiplication Units on Correlation Power Analysis

Manuel Brosch 1, Matthias Probst 1, Stefan Kögler 1, Georg Sigl 1,2

0 citations · 32 references · ACM Transactions on Embedded C...

Published on arXiv

2601.05828

Model Theft

OWASP ML Top 10 — ML05

Key Finding

Derived equations accurately predict that CPA correlation decreases with increasing parallelism in MACs, validated empirically on an FPGA vector-multiplication unit processing fully-connected layer neurons.

Correlation Power Analysis (CPA) on parallel VMUs

Novel technique introduced


The use of neural networks in edge devices is increasing, which introduces new security challenges related to the neural networks' confidentiality. As edge devices often offer physical access, attacks targeting the hardware, such as side-channel analysis, must be considered. To enhance the performance of neural network inference, hardware accelerators are commonly employed. This work investigates the influence of parallel processing within such accelerators on correlation-based side-channel attacks that exploit power consumption. The focus is on neurons that are part of the same fully-connected layer, which run in parallel and simultaneously process the same input value. The theoretical impact of concurrent multiply-and-accumulate operations on overall power consumption is evaluated, as well as the success rate of correlation power analysis. Based on the observed behavior, equations are derived that describe how the correlation decreases with increasing levels of parallelism. The applicability of these equations is validated using a vector-multiplication unit implemented on an FPGA.


Key Contributions

  • Theoretical analysis of how concurrent multiply-and-accumulate operations in parallel hardware units affect overall power consumption and CPA distinguishability
  • Derivation of closed-form equations describing how correlation decreases as a function of parallelism level in vector-multiplication units
  • Experimental validation of the derived equations using an FPGA-implemented vector-multiplication unit
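The qualitative trend behind the derived equations can be reproduced in simulation: when n neurons multiply the same input by different weights in parallel, the measured power sums all n product leakages, so the n-1 untargeted MACs act as algorithmic noise and the single-neuron CPA correlation shrinks roughly like 1/sqrt(n). This is a minimal sketch of that effect under a Hamming-weight leakage model with 8-bit operands; it is not the paper's exact model or its closed-form equations.

```python
import numpy as np

rng = np.random.default_rng(1)

def hw(x):
    # Popcount of 8-bit values.
    return np.unpackbits(np.asarray(x, dtype=np.uint8)[:, None], axis=1).sum(axis=1)

def target_correlation(n_parallel, n_traces=5000):
    # n_parallel neurons process the same input with different (nonzero) weights;
    # total power is modeled as the sum of all product Hamming weights.
    inputs = rng.integers(0, 256, n_traces)
    weights = rng.integers(1, 256, n_parallel)
    power = np.zeros(n_traces)
    for w in weights:
        power += hw((w * inputs) & 0xFF)
    # The CPA hypothesis targets only the first neuron's weight.
    hyp = hw((weights[0] * inputs) & 0xFF)
    return np.corrcoef(hyp, power)[0, 1]

corr_1 = target_correlation(1)    # no algorithmic noise: correlation is 1
corr_16 = target_correlation(16)  # 15 concurrent MACs dilute the correlation
```

In this noiseless model, a single MAC gives a correlation of exactly 1, while 16 parallel MACs reduce it substantially; a real attack would additionally fight measurement noise, so more traces are needed as parallelism grows.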

🛡️ Threat Analysis

Model Theft

The paper studies side-channel attacks (CPA) as a means to extract confidential neural network model information from hardware accelerators — a recognized model theft vector. The analysis quantifies how parallelism in vector-multiplication units degrades CPA effectiveness, directly informing the feasibility of side-channel-based model extraction.


Details

Model Types
traditional_ml
Threat Tags
physical, inference_time, black_box
Applications
neural network inference on edge devices, FPGA-based ML accelerators