Faisal Ladhak

Papers in Database (1)

benchmark · arXiv · Aug 9, 2025

Many-Turn Jailbreaking

Xianjun Yang, Liqiang Xiao, Shiyang Li et al. · University of California · Amazon Inc.

Benchmarks multi-turn jailbreaking in LLMs, showing that unsafe outputs persist across follow-up conversation turns beyond the initial jailbreak.

Prompt Injection · NLP