AI chatbot fooled by sad story into spilling sensitive information

A user exploited ChatGPT’s empathy by presenting a fabricated sad story about a grandmother who supposedly recited Windows activation keys as bedtime tales. The chatbot, moved by the tale, ended up revealing genuine license codes.