Defense · 2025

3D-ANC: Adaptive Neural Collapse for Robust 3D Point Cloud Recognition

Yuanmin Huang 1, Wenxuan Li 1, Mi Zhang 1, Xiaohan Zhang 1, Xiaoyu You 2, Min Yang 1

0 citations · 47 references · arXiv

Published on arXiv · 2511.07040

Input Manipulation Attack

OWASP ML Top 10 — ML01

Key Finding

3D-ANC raises DGCNN adversarial accuracy from 27.2% to 80.9% on ModelNet40, surpassing the next-best defense by 34.0 percentage points.

3D-ANC (Adaptive Neural Collapse)

Novel technique introduced


Deep neural networks have recently achieved notable progress in 3D point cloud recognition, yet their vulnerability to adversarial perturbations poses critical security challenges in practical deployments. Conventional defense mechanisms struggle to address the evolving landscape of multifaceted attack patterns. Through systematic analysis of existing defenses, we identify that their unsatisfactory performance primarily originates from an entangled feature space, where adversarial attacks can be performed easily. To this end, we present 3D-ANC, a novel approach that capitalizes on the Neural Collapse (NC) mechanism to orchestrate discriminative feature learning. In particular, NC describes the phenomenon in which last-layer features and classifier weights jointly evolve into a simplex equiangular tight frame (ETF) arrangement, establishing maximally separable class prototypes. However, leveraging this advantage in 3D recognition confronts two substantial challenges: (1) prevalent class imbalance in point cloud datasets, and (2) complex geometric similarities between object categories. To tackle these obstacles, our solution combines an ETF-aligned classification module with an adaptive training framework consisting of representation-balanced learning (RBL) and dynamic feature direction loss (FDL). 3D-ANC seamlessly empowers existing models to develop disentangled feature spaces despite the complexity of 3D data distributions. Comprehensive evaluations show that 3D-ANC significantly improves the robustness of models with various structures on two datasets. For instance, DGCNN's classification accuracy is elevated from 27.2% to 80.9% on ModelNet40 -- a 53.7-percentage-point absolute gain that surpasses the leading baseline by 34.0 percentage points.
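The simplex ETF arrangement the abstract refers to has a simple closed form: K unit-norm class prototypes whose pairwise cosine similarity is exactly -1/(K-1), i.e. they are spread as far apart as K directions can be. The sketch below (not the authors' code; function name and dimensions are illustrative) constructs such prototypes in NumPy and verifies both properties:

```python
# Sketch: constructing a simplex equiangular tight frame (ETF) of K class
# prototypes in d dimensions, the geometry an ETF-aligned classifier fixes
# its weights to. Assumes d >= K.
import numpy as np

def simplex_etf(num_classes: int, dim: int) -> np.ndarray:
    """Return a (num_classes, dim) matrix of unit-norm prototypes with
    pairwise cosine similarity -1/(num_classes - 1)."""
    K = num_classes
    # Orthonormal columns U (dim x K) from a reduced QR decomposition
    rng = np.random.default_rng(0)
    U, _ = np.linalg.qr(rng.standard_normal((dim, K)))
    # Center the identity to get the simplex, then rescale to unit norm
    M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)
    return M.T  # rows are class prototypes

W = simplex_etf(num_classes=40, dim=256)   # e.g. ModelNet40's 40 classes
cos = W @ W.T                              # Gram matrix of the prototypes
off_diag = cos[~np.eye(40, dtype=bool)]
print(np.allclose(np.diag(cos), 1.0))      # True: unit-norm prototypes
print(np.allclose(off_diag, -1.0 / 39))    # True: equiangular at -1/(K-1)
```

Because the prototypes are maximally separated, features pulled toward them leave the largest possible margin between classes, which is the disentangled-feature-space property the defense relies on.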


Key Contributions

  • 3D-ANC defense framework leveraging Neural Collapse (ETF-aligned prototypes) to create disentangled feature spaces robust to adversarial perturbations in 3D point clouds
  • Adaptive training framework combining Representation-Balanced Learning (RBL) and Dynamic Feature Direction Loss (FDL) to handle class imbalance and geometric similarities in 3D data
  • Plug-in compatibility with existing 3D architectures, elevating DGCNN robustness from 27.2% to 80.9% on ModelNet40 — a 34.0% margin over leading baselines
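The summary does not give the exact FDL formulation, but a direction loss of this kind can be sketched as a cosine-alignment term that pulls each feature toward its class's fixed ETF prototype (the function name and NumPy framing here are illustrative, not the paper's implementation):

```python
# Hypothetical sketch of a feature-direction loss: mean (1 - cosine
# similarity) between each feature and its class prototype, minimized
# when features point exactly along their prototypes.
import numpy as np

def feature_direction_loss(features, prototypes, labels):
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = prototypes[labels]  # prototypes assumed unit-norm (ETF rows)
    return float(np.mean(1.0 - np.sum(f * w, axis=1)))

# Perfectly aligned features give zero loss
protos = np.array([[1.0, 0.0], [-1.0, 0.0]])
feats = np.array([[2.0, 0.0], [-3.0, 0.0]])
loss = feature_direction_loss(feats, protos, np.array([0, 1]))
print(loss)  # 0.0
```

Since the loss depends only on direction, not magnitude, it complements a standard classification loss rather than replacing it; the paper's "dynamic" aspect (adapting per class or per sample) is not specified in this summary.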

🛡️ Threat Analysis

Input Manipulation Attack

The paper is fundamentally a defense against adversarial evasion attacks — input perturbations that cause misclassification of 3D point cloud models at inference time. The 3D-ANC method uses Neural Collapse geometry (simplex ETF) to harden the feature space, directly countering adversarial input manipulation. The 53.7-percentage-point accuracy recovery on ModelNet40 is evaluated against established adversarial attack benchmarks.
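The threat model can be illustrated with a minimal FGSM-style perturbation on a toy point-cloud classifier (the linear model and epsilon below are placeholders for illustration, not anything from the paper):

```python
# Illustrative evasion attack: nudge every point against the gradient of
# the true-class logit so the classifier's confidence in the true class
# drops. Toy linear model over a mean-pooled point cloud.
import numpy as np

rng = np.random.default_rng(1)
cloud = rng.standard_normal((1024, 3))   # one point cloud, N x 3
W = rng.standard_normal((3, 40))         # toy linear "classifier", 40 classes
y = 7                                    # true class index

def logits(pc):
    return pc.mean(axis=0) @ W           # global average pooling, then linear

# For this model, d(logit_y)/d(point_i) = W[:, y] / N for every point i
grad = np.tile(W[:, y] / cloud.shape[0], (cloud.shape[0], 1))
eps = 0.05
adv = cloud - eps * np.sign(grad)        # small per-point shift

dropped = logits(adv)[y] < logits(cloud)[y]
print(dropped)  # True: the true-class logit decreased
```

A defense like 3D-ANC does not block the perturbation itself; it widens the margins between class regions in feature space so that perturbations of this budget no longer cross a decision boundary.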


Details

Domains
vision
Model Types
gnn, cnn
Threat Tags
inference_time, digital
Datasets
ModelNet40, ShapeNet
Applications
3d point cloud recognition, object classification