mozz@mbin.grits.dev to Technology@beehaw.org · 3 months agoSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devimagemessage-square199fedilinkarrow-up1449arrow-down10file-text
arrow-up1449arrow-down1imageSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devmozz@mbin.grits.dev to Technology@beehaw.org · 3 months agomessage-square199fedilinkfile-text
minus-squaresweng@programming.devlinkfedilinkarrow-up3·3 months agoThe point is that the second LLM has a hard-coded prompt
The point is that the second LLM has a hard-coded prompt