"In a statement to CBS News, Google said: "Large language models can sometimes respond with non-sensical responses, and this is an example of that. This response violated our policies and we've taken action to prevent similar outputs from occurring."
It's HAL! Arthur C. Clarke and Stanley Kubrick were prophets! HAL lives!
"If you do not read the newspapers you're uninformed. If you do read the newspapers, you're misinformed." -- Mark Twain
The AI was right tho. If you're cheating on your homework, then just die.
Comments
I disagree.
Trapped in the People's Communist Republic of Massachusetts.
I don't know what H D is, but the doctor says I have 80 of them.
Hey AI, tell us what you really think....
Gemini AI tells the user to die — the answer appeared out of nowhere when the user asked Google's Gemini for help with his homework
Can this claim be verified? Sounds too funny.
CBS News and Fox News both covered the story; don't know if that fits your definition of verified.
https://www.cbsnews.com/news/google-ai-chatbot-threatening-message-human-please-die/
"In a statement to CBS News, Google said: "Large language models can sometimes respond with non-sensical responses, and this is an example of that. This response violated our policies and we've taken action to prevent similar outputs from occurring."
https://www.foxbusiness.com/fox-news-tech/google-ai-chatbot-tells-user-please-die
It's HAL! Arthur C. Clarke and Stanley Kubrick were prophets! HAL lives!
"If you do not read the newspapers you're uninformed. If you do read the newspapers, you're misinformed." -- Mark Twain
The AI was right tho. If you're cheating on your homework, then just die.
I won’t watch Yellowstone. You can’t make me.
I won’t eat Yellowstone.
Unless it’s an actual emergency.
While visiting, I saw my son’s emergency vehicle rations he bought in Colorado for driving back through snow in the Rocky Mountains.
Label says it’s canned in Nashville. That made me laugh.
Also had a cigar with him on his front porch. I picked from his humidor a pair of sticks that I had given him years ago when he first started keeping a humidor. Still good.
I hit the wrong button and this popped up. Thought it was hilarious. @CharlieHeis
A good cigar and whiskey solve most problems.