Paul Ungermann

h-index: 1 · 3 citations · 2 papers (total)

Papers in Database (1)

attack · arXiv · Oct 31, 2025

Diffusion LLMs are Natural Adversaries for any LLM

David Lüdke, Tom Wollschläger, Paul Ungermann et al. · Technical University of Munich

Uses Diffusion LLMs as amortized jailbreak generators, producing low-perplexity, transferable harmful prompts against black-box and proprietary LLMs.

Prompt Injection · nlp · generative
3 citations