Ryan A. Gibson
Injected Approval: A Low Effort Local LLM Jailbreak
20 December 2024 · 4 mins
Large-Language-Models
Jailbreaking
Cybersecurity
A quick look into one of the simplest attacks on LLM safety mitigations, revealing large gaps in current approaches from major tech companies.