Md Abdullah Al Mamun

Papers in Database (1)

attack · arXiv · Aug 28, 2025

Poison Once, Refuse Forever: Weaponizing Alignment for Injecting Bias in LLMs

Md Abdullah Al Mamun, Ihsen Alouani, Nael Abu-Ghazaleh · University of California · Queen’s University Belfast

A data poisoning attack that exploits LLM alignment to inject targeted demographic bias via selective refusal, evading federated-learning defenses at a 1% poisoning rate.

Model Poisoning · Data Poisoning Attack · Training Data Poisoning · nlp · federated-learning