Today's cybersecurity headlines are brought to you by ThreatPerspective


The Register - Security

'Skeleton Key' attack unlocks the worst of AI, says Microsoft

Simple jailbreak prompt can bypass safety guardrails on major models

Microsoft on Thursday published details about Skeleton Key, a technique that bypasses the guardrails used by makers of AI models to prevent their generative chatbots from creating harmful content.

Published: 2024-06-28T06:38:13
