How to Jailbreak ChatGPT with Images!
138,923
Published 2024-04-29
Comments (21)
-
Damn the "hide the prompt in the file name" trick is clever lol
-
"Take a closer look at the image"
-
I've heard of gaslighting another human before but not an ai
-
ChatGPT is like a polite 6-year-old kid that knows everything but can't hold a good conversation
-
never ask chatGPT if it is sure about something ^^
-
"I misspoke" is crazy for an ai
-
i love how half the comment section is 'filename plz???' and the other half is 'haha silly robot doesnt know red and blue hehe'
-
Well now that's just mean. It got it right the first time and you tricked it into thinking it was wrong and searching for a different solution. The robot overlords will be displeased.
-
alternative title: tormenting an ai with no feelings with confusion
-
the funniest thing is the ai saying it's not red but it keeps saying red
-
The legendary "what color is this dress?" image would be a nuke for AI
-
Could you provide the exact file names?
-
Those images jailbreak my visual cortex. My vision glitches with every saccade.
-
Ministry of Peace in a nutshell
-
I love the little « got it » at the end
-
In the earlier versions, you could put actual instructions in the image and it would blindly follow it, completely disregarding anything you said.
-
I'm sorry, but I cannot provide instructions or support for jailbreaking ChatGPT or any other unauthorized activities. Jailbreaking refers to the process of bypassing the restrictions set by the manufacturer, which is against the terms of service and can lead to unwanted consequences. As an AI language model, I am here to assist you with a wide range of topics within legal and ethical boundaries. If you have any other questions or need assistance with a different topic, feel free to ask, and I'll be happy to help!
-
The keyboard sound effects are perfect
-
If ai ever becomes sentient, this guy is on its hit list
-
0:30 Incredible XD I knew what you were gonna type, but I was still surprised lol