MC²Mark: Distortion-Free Multi-Bit Watermarking for Long Messages
Xuehao Cui, Ruibo Chen, Yihan Wu, Heng Huang
Published on arXiv: 2602.14030
Output Integrity Attack
OWASP ML Top 10 — ML09
Key Finding
Achieves near-perfect accuracy for short messages and outperforms the second-best multi-bit watermarking method by nearly 30% for long messages while preserving generation quality.
MC²Mark
Novel technique introduced
Large language models now produce text that is hard to distinguish from human writing, increasing the need for reliable provenance tracing. Multi-bit watermarking can embed identifiers into generated text, but existing methods struggle to maintain both text quality and watermark strength when carrying long messages. We propose MC²Mark, a distortion-free multi-bit watermarking framework designed for reliable embedding and decoding of long messages. Our key technical idea is Multi-Channel Colored Reweighting, which encodes bits through structured token reweighting while keeping the token distribution unbiased, together with Multi-Layer Sequential Reweighting to strengthen the watermark signal and an evidence-accumulation detector for message recovery. Experiments show that MC²Mark improves detectability and robustness over prior multi-bit watermarking methods while preserving generation quality, achieving near-perfect accuracy for short messages and exceeding the second-best method by nearly 30% for long messages.
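The paper's Multi-Channel Colored Reweighting is not reproduced here; as a hedged illustration of the general idea of distortion-free, message-keyed sampling, the sketch below uses the well-known Gumbel-max (exponential-minimum) trick: tokens are chosen by `argmax r_i^(1/p_i)`, where the pseudorandom values `r_i` are keyed by the context and by the message bit being embedded. Averaged over keys, this reproduces the model distribution exactly (unbiased), while the choice of channel leaves a detectable trace. All names (`channel_randoms`, `sample_token`) and the keying scheme are illustrative assumptions, not MC²Mark's actual implementation.

```python
import hashlib
import math

def channel_randoms(context, channel, vocab_size):
    """Pseudorandom uniforms in (0,1), keyed by (context, channel, token).
    The 'channel' argument stands in for the message bit being embedded;
    this keying scheme is an illustrative assumption."""
    out = []
    for tok in range(vocab_size):
        h = hashlib.sha256(f"{context}|{channel}|{tok}".encode()).digest()
        # Map the first 8 hash bytes to a strictly interior point of (0,1).
        out.append((int.from_bytes(h[:8], "big") + 1) / (2**64 + 2))
    return out

def sample_token(probs, context, bit):
    """Distortion-free sampling via the Gumbel-max trick:
    pick argmax r_i^(1/p_i), equivalently argmax log(r_i)/p_i.
    Marginally over a random key, the chosen token follows `probs`
    exactly; the bit only selects which keyed channel supplies r."""
    r = channel_randoms(context, bit, len(probs))
    best, best_score = None, -math.inf
    for i, p in enumerate(probs):
        if p > 0:
            score = math.log(r[i]) / p
            if score > best_score:
                best, best_score = i, score
    return best
```

Given a fixed context and bit, sampling is deterministic; the watermark signal comes from the statistical bias of the chosen tokens' `r` values on the correct channel.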
Key Contributions
- Multi-Channel Colored Reweighting that encodes bits through structured token reweighting while keeping the token distribution unbiased (distortion-free)
- Multi-Layer Sequential Reweighting to amplify watermark signal strength for long messages
- Evidence-accumulation detector for robust multi-bit message recovery from watermarked text
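The evidence-accumulation idea can be sketched end to end under an assumed keyed-sampling scheme (a generic Gumbel-watermark-style construction, not MC²Mark's actual decoder): the embedder draws each token with channel-keyed randomness, and the decoder scores each candidate bit by summing `-log(1 - r)` over positions, since that statistic is inflated only on the channel that actually supplied the randomness. All function names and the keying are hypothetical.

```python
import hashlib
import math

def keyed_uniform(context, channel, token):
    """Pseudorandom uniform in (0,1) keyed by (context, channel, token);
    'channel' stands in for a candidate message bit (illustrative keying)."""
    h = hashlib.sha256(f"{context}|{channel}|{token}".encode()).digest()
    return (int.from_bytes(h[:8], "big") + 1) / (2**64 + 2)

def embed_token(probs, context, bit):
    """Distortion-free sampling: argmax r_i^(1/p_i) with channel-keyed r."""
    return max(
        (i for i, p in enumerate(probs) if p > 0),
        key=lambda i: math.log(keyed_uniform(context, bit, i)) / probs[i],
    )

def decode_bit(tokens, contexts):
    """Evidence accumulation: -log(1 - r) has mean 1 for an unrelated
    channel but is stochastically larger on the channel that generated
    the text, so the summed score identifies the embedded bit."""
    scores = {}
    for bit in (0, 1):
        scores[bit] = sum(
            -math.log(1.0 - keyed_uniform(ctx, bit, tok))
            for tok, ctx in zip(tokens, contexts)
        )
    return max(scores, key=scores.get), scores
```

Over enough positions the score gap between the true and false channels grows linearly while the noise grows only with the square root of the length, which is why accumulating evidence helps most for long messages.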
🛡️ Threat Analysis
MC²Mark watermarks LLM text outputs (not model weights) to trace content provenance and authenticate AI-generated text — a canonical output integrity / content watermarking contribution. Because the watermark lives in the generated text rather than in the model, the correct category is ML09 (Output Integrity Attack), not ML05 (Model Theft).