{"ID":689675,"CreatedAt":"2026-03-04T20:59:41Z","UpdatedAt":"2026-03-04T20:59:41Z","DeletedAt":null,"paper_url":"https://paperswithcode.com/paper/iterative-prompting-with-persuasion-skills-in","arxiv_id":"2503.20320","title":"Iterative Prompting with Persuasion Skills in Jailbreaking Large Language Models","abstract":"Large language models (LLMs) are designed to align with human values in their responses. This study exploits LLMs with an iterative prompting technique in which each prompt is systematically modified and refined across multiple iterations to progressively enhance its effectiveness in jailbreaking attacks. This technique involves analyzing the response patterns of LLMs, including GPT-3.5, GPT-4, LLaMa2, Vicuna, and ChatGLM, allowing us to adjust and optimize prompts to evade the LLMs' ethical and security constraints. Persuasion strategies enhance prompt effectiveness while maintaining consistency with malicious intent. Our results show that the attack success rate (ASR) increases as the attacking prompts become more refined, with the highest ASR of 90% for GPT-4 and ChatGLM and the lowest ASR of 68% for LLaMa2. Our technique outperforms the baseline techniques PAIR and PAP in ASR and shows comparable performance with GCG and ArtPrompt.","url_abs":"https://arxiv.org/abs/2503.20320v1","url_pdf":"https://arxiv.org/pdf/2503.20320v1.pdf","authors":"[\"Shih-Wen Ke\", \"Guan-Yu Lai\", \"Guo-Lin Fang\", \"Hsi-Yuan Kao\"]","published":"2025-03-26T00:00:00Z","tasks":"[\"Persuasion Strategies\"]","methods":"[\"Attention\", \"Absolute Position Encodings\", \"Linear Layer\", \"Label Smoothing\", \"Attention Dropout\", \"Cosine Annealing\", \"Softmax\", \"Weight Decay\", \"Dropout\", \"Position-Wise Feed-Forward Layer\", \"BPE\", \"Transformer\", \"Multi-Head Attention\", \"Dense Connections\", \"Residual Connection\", \"GPT-4\", \"Linear Warmup With Cosine Annealing\", \"Adam\", \"Layer Normalization\", \"ALIGN\", \"GPT-3\"]","has_code":false}
