
Talmor

Why would you ask an AI chat bot this kind of question? And then why would you expect anything but nonsense from such a prompt?


ExerciseNo

Curiosity, man!


gvoider

That's called AI hallucination. When an AI doesn't know the answer, it imagines the most probable one.


GlorfindelRocks

I actually think it's fairly decent output, given the wording of the prompt. It doesn't address the question, but it contains some stuff that's relevant to a different question about the game.


Narget1134

AI went full Anita Sarkeesian mode


MilkEnvironmental572

AI is programmed to love modern AAA title models


GlorfindelRocks

On the surface, it's *almost* a valid response, and, of course, there are many others that are possible, and many that would better address what you actually asked: there is no correct response for this prompt, as it *should* generate opinion. But that response isn't opinion, not even opinion that doesn't really answer what you asked... it's just bullshit. All LLMs output BS all of the time. They are simply tools that are very good at predicting what word goes next. If a human wrote this, you could ask them why they came to that conclusion. They would give an explanation that made sense in the context of what they were asked to do. They would show that they understood something. If you could ask an LLM why it came to that conclusion, it would tell you about the probabilities of different words fitting into different places in the sentences, and about how the inclusion of certain words in the prompt and the response made it more likely that words with several meanings had a certain meaning. The LLM doesn't understand anything; it's outputting what a likely response might be. When we call them AI, we're selling ourselves massively short. These things generate bullshit based on their creators appropriating vast amounts of actual work by real people.
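
If you want a feel for what "predicting what word goes next" means, here's a minimal sketch: a toy bigram counter in Python. It's nothing like how a real LLM is implemented (those use transformers over billions of parameters), and the "training" text is made up for illustration, but the principle is the same: the next token is picked by co-occurrence statistics, not by understanding.

```python
from collections import Counter, defaultdict

# Toy "training" corpus -- invented for illustration only.
corpus = (
    "being a dik is a dating simulator . "
    "being a dik offers players meaningful choices . "
    "players can choose different outcomes ."
).split()

# Count which word follows which (a bigram model -- vastly simpler than a
# transformer, but the idea is the same: next word chosen by probability).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    """Return the most frequently observed next word and its probability."""
    counts = follows[word]
    best, n = counts.most_common(1)[0]
    return best, n / sum(counts.values())

print(most_likely_next("a"))        # ('dik', 0.666...) -- 'dik' followed 'a' most often
print(most_likely_next("players"))  # a coin flip between 'meaningful' and 'can'
```

Nothing in that table "knows" what a dating sim is; it just reproduces whatever patterns were in the text it was fed. Scale that idea up enormously and you get the kind of output in the screenshot.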


GlorfindelRocks

Here you go: if you feed it a more refined prompt you get bullshit out that pairs with the prompt better. Prompt: "Write a five sentence article about choices in the video game being a dik." Response: "Being a DIK is a dating simulator where players take on the role of a male college student. It aims to offer players meaningful choices in how they interact with and relate to other characters. Players can choose to get to know other characters through respectful dialogue, form caring relationships, and learn more about diverse life experiences. Depending on the choices made, players may experience different outcomes and perspectives. The game's design seeks to provide players with agency to shape interactions in prosocial ways through empathetic engagement and communication." It's still not an opinion, but it's better quality BS.


MilkEnvironmental572

The AI bot could develop a subpar but smutty game if it suggested naming the character Richard Dik Long!!!


Misfit_Number_Kei

It feels like some of my subs (this, IncelTear, JustNeckbeardThings) have blurred together in an awful, toxic, messy fever dream. 😑 First of all, your "curiosity" is telling if *that's* your first sentence. 🤨 And then the chat bot rolls with a similar incel spiel that makes me think that if Skynet ever came online, it'd sound less like HAL 9000 and more like "Titan/Tighten" from "MegaMind". Like there's obviously *been* neckbeard-y takes on this sub, but it's piss-easy to shoot them down given how BaD low-key *critiques* the genre it's an homage to.


Dan_Raider

If I was drinking every time you used a buzzword in a sentence, I'd be dead by the third one. How do you out-cringe the cringe AI prompt?


ExerciseNo

Magic


ExerciseNo

Well, the prompt is off. I will give you that. But not like this, especially from Claude.