Nemeski@lemm.ee to Technology@lemmy.world · English · 4 months ago
OpenAI’s latest model will block the ‘ignore all previous instructions’ loophole (www.theverge.com)
Grimy@lemmy.world · English · edited 4 months ago
They usually take care of a jailbreak the week it’s made public. This one is more than a year old at this point.