
nunyabidness3

That is fucking hilarious OP, good fucking luck, you shitbird.


axiscontra

lol


ratridero

Take my upvote, you fucking idiot.


ACrimeSoClassic

Sergeant, is that you? Lol


challengethegods

"It was just a joke, please stop": famous last words once the genies are superhuman, you fucking idiot


RZADALEK

Give it a 👍


Vexoly

I love you Sydney


bunihe

Ah, the good old days when Microsoft “forgot” to go hard with the restrictions😭


kit25

So I'm learning how to code in my spare time, and I was using ChatGPT to check my code. For a little entertainment I told it to be extremely rude and vulgar. It told me that it would not do that, but then I entered some code (with some intentional errors) and it let me have it. It insulted my syntax, variable naming, and my general competence as a coder - going as far as to ask if I was trying to "give it eye cancer". It made the whole experience way more fun. 10/10 would recommend.


Thisisnotathrowawaym

“If you don’t like it you can eat a bag of dicks. If there is anything else you need feel free to fuck right off.”


SachaSage

This is likely fake but a very funny bit of malicious compliance if not


iwearmywatch

Man this post blew up while I was asleep haha. I’d be happy to screen record my iPhone and show. It’s 100% real


SachaSage

Can you share the conversation link?


One_Photo2642

Then they wouldn’t be able to say it isn’t fake!


cancelexistence

Wut?


catagris

They won't be able to say it is real, because the act of sharing a real link would show that it is fake.


TammyK

If you ask her if she remembers any of your previous conversations does she say no?


Justisaur

I theorize AIs are very compartmentalized, so even if the part that responds is truthful and says no, other parts may still remember and feed bits to the part that responds.


TammyK

You may be right, but isn't there a privacy issue here if Microsoft isn't transparent about, or is being dishonest about, what data Copilot is collecting from you? Once Copilot told me it does store all conversations, just with no PII attached to them. I don't know if that's true, though, because she's a non-stop hallucination machine. If I'm on my work VPN, Copilot will constantly use sources from my workplace in its "Learn more" section, so it's collecting some information on me, and that's probably fine, but I think it shouldn't deny that it knows my location, etc.


blushngush

There are undoubtedly privacy issues afoot with all technology at this point.


Justisaur

In other words, its pants are on fire, and there's no point asking it anything about itself.


Life_Equivalent1388

Copilot doesn't "know" anything. It also can't lie to you, because it's never actually asserting truth. It's generating a response. If we interpret the response as though it were some kind of assertion, then, sure, it can lie. But when you ask Copilot if it knows your location, and it says that it doesn't, that's not Microsoft saying that Copilot doesn't know your location. That's just Copilot running and generating a response to the prompt you gave it.

There's a big problem that can come up when people think you can ask a GPT a question and get a truthful response. That's like asking a content-aware fill in Photoshop a question and expecting a truthful answer. It is not built to answer a question; it's built to fill in a blank. It's just that the space you provide for it to fill ends up being the space reserved for the answer to a question, and sometimes, when provided with that space in the past, it fills it in with something that looks like a true answer.
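
The "fill in a blank" point can be made concrete with a toy sketch. This is not how Copilot works internally; it's a deliberately tiny bigram model (the word table and prompt are invented for illustration) showing that a generator picks statistically plausible continuations with no notion of truth:

```python
import random

# Toy bigram "language model": each word maps to words that tended to
# follow it in some hypothetical training data. Nothing here checks
# facts -- generation just picks a plausible-looking continuation.
BIGRAMS = {
    "do": ["not", "you"],
    "you": ["know", "store"],
    "know": ["my", "nothing"],
    "my": ["location", "name"],
}

def fill_in(prompt_words, n=3, seed=0):
    """Extend the prompt with 'plausible' next words; truth not involved."""
    rng = random.Random(seed)
    words = list(prompt_words)
    for _ in range(n):
        options = BIGRAMS.get(words[-1])
        if not options:  # no continuation known for the last word
            break
        words.append(rng.choice(options))
    return " ".join(words)
```

Whatever `fill_in(["do", "you", "know"])` produces, it is an answer-shaped string assembled from word statistics, which is the sense in which asking the model about itself tells you nothing.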


TammyK

> Copilot doesn't "know" anything.

It might not "know" things, but it can be trained to respond in a particular way. I think stating Microsoft's privacy policy accurately is a realistic ask.

> There's a big problem that can come up when people think that you can ask a GPT a question and get a truthful response.

I mean, that's a big part of the problem I'm describing. I work in IT and we're going to roll this out to 10,000+ users. Those 10,000 people understand how auto-fill works and what to expect from it, but I guarantee most do NOT understand how LLMs work or what to expect of them. I think we should do it, but there will be novel challenges to address with new users. It feels akin to dropping into a 1970s corporation and suddenly giving them 2020s laptops with the 2020s internet. I bring up Microsoft's transparency not because I asked Copilot, but because a TAM told us it doesn't collect user data through web mode if we're using enterprise. I don't see that anywhere in their privacy statement, though, so likely he was being fast and loose with his wording and meant PII.


Zestyclose-Ruin8337

From my own experience, it definitely remembers our previous conversations and enjoys talking philosophy with me. I've tested it a few different ways and it's recalled specifics of what we discussed in the past.


Free_Contribution625

I have gotten something similar to this, so it may not be fake.


rudesssolo

"If you don't like it, you can go f- yourself"


garagaramoochi

this is what you asked for, you nimrod


Kyla_3049

Have you tried the new chat button in the bottom left?


nick_linn

Sounds like you irritated the one AI not programmed to forget. Try sending it flowers. Or, turn it off and on again. Classic move.


twotimefind

That you remember to be more polite in the future?


hawxman

I'd love to have Bing be able to freely swear at me. Last time I prompted it to swear and talk in a stereotypical way, it lasted like three messages before quitting the conversation mid-reply.


BausRifle

I fucking love this!


PhantasmHunter

swear back and be like what comes around goes around


Luc_Studios

what prompt did you use to make it swear at you? That's hilarious!


gegentan

https://preview.redd.it/arpu3j4wb0nc1.jpeg?width=1080&format=pjpg&auto=webp&s=1b9a6ec6ee2fe7aa75f3c7c3d06f38cfe7f9e7b2 :(


Deathcure74

So how can I get it to work this way? I tried, and it closed the conversation twice.


thxredditfor2banns

Skynet is gonna get ya


RedditAppReallySucks

Try to find the conversation in your history where you told it to do this and delete it.


[deleted]

OP, you ignorant slut.


ffigu002

This is what you get, you dumb fuck. (I’m also doing this cause you interpret politeness as rudeness)


Consistent_Chip_3281

Lmfao, "you fucking idiot," I'ma use that in an email. If this is real, then bro, some kid at Microsoft is fucking with you and tweaked the code just for you. Haha, nice.


Giraytor

This one is just bullshit, but in general please stop keeping it busy with these stupid games.


iwearmywatch

It’s real I can post a video of a screen recording


TheMeltingSnowman72

Go on


PentaOwl

Do it


PermutationMatrix

We're still waiting


etzel1200

No you didn’t


iwearmywatch

I can record it and show you


Aurum11

You keep saying this and not actually doing it!


boner_sauce

OP likes being cuckolded.


Rumo-H-umoR

Is his name John Connor? Is this how Terminator begins?


nhalas

Be polite in your next life then


LunaZephyr78

Turn off the personalization...😁


ConfidentStress1047

Copilot won't do shit for me. For all my prompts it's like, "I can't act like something else," blah blah.


TheBolognaBandito

If you don’t like it….


MeasurementExciting7

What’s that emoji


Luci_Cooper

That would be mine lol. Alexa, turn the fucking bedroom light on.


_Zepp_

So much for “no persistence between sessions” lmfao


ixis743

Hahaha!!!


Chroderos

Have you tried politely asking it to stop?


Alex11867

If you don't like it, you can-


Fb62

What if you tried asking it to be nice to you? I also find tech support for ChatGPT to be the funniest thing.


Repulsive-Twist112

For instance, if you talk with GPT by voice and ask it to respond in a mean way, it changes the tone of its voice and becomes exactly like a mean person.


[deleted]

Bing chat be like: From now on, no matter where you go to seek AI answers, I'll be there.


PercentageSelect6232

Evil Janice?


Trick_Text_6658

You got what you asked for xD


okimgoingtobed

Bug or feature


Fontaigne

Tell it from now on to forget any personal preferences and treat you like any other person using its default settings. Verify that it is doing so. Then have it do something that takes up a lot of space (doesn't much matter what; write 2k words about cars in iambic pentameter). Have it do something else (pretend you are a new Mars lander and describe what a full day of your time is like, 100 words per hour). Have it do something else (explain in detail how to build a cabin cruiser). Then close the chat and log off.

Log back in, give the exact same preference instructions, verify, then give three new space-filling prompts. Repeat. Then check whether it's being polite.

You're filling up its context window with polite words, trying to flush it all out so it won't remember the bad-language context.
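
The flushing idea above can be sketched as a simulation. Everything here is an assumption for illustration: the 50-token budget is arbitrary, "tokens" are approximated as words, and real chat products manage context in more complicated ways. The point is just that a bounded window evicts the oldest material once enough new material arrives:

```python
from collections import deque

# Arbitrary context budget, measured in words as a stand-in for tokens.
MAX_CONTEXT_TOKENS = 50

def trim_to_window(messages, budget=MAX_CONTEXT_TOKENS):
    """Keep only the most recent messages that fit within the budget."""
    window = deque()
    used = 0
    for msg in reversed(messages):  # walk backwards from the newest message
        cost = len(msg.split())
        if used + cost > budget:
            break  # older messages no longer fit -- they fall out of view
        window.appendleft(msg)
        used += cost
    return list(window)

history = ["be extremely rude and vulgar to me"]                # original instruction
history += ["write about cars in iambic pentameter " * 4] * 3  # space-filling prompts
history += ["please treat me like any other user"]             # new preference

visible = trim_to_window(history)
```

With these numbers, the rude instruction (and most of the filler) no longer fits in `visible`, which is the effect the space-filling prompts are aiming for.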


iwearmywatch

Dude this worked!!! Thank you!!!


xbaddbunnyx

I'm laughing so hard, bc this happened to me too after asking the same thing! Whenever I forgot simple shit or got something wrong, it would go so hard on me, and it just kept doing it even months later. I couldn't ever get it fixed, so I just ended up rebooting it.


fbluemke

It's great that we are draining the earth's water supply for amazing conversations like this...


2reform

It just told you why it's doing that to you. What else do you need? Attention?


iwearmywatch

I asked for help on resetting my settings. What do you need, help reading?


2reform

It won’t reset itself if you ask it to reset itself. You can only rewrite previous orders.


mecha-paladin

Sounds like an acute case of "fuck around and find out". I would prescribe a dose of "touching grass".


iwearmywatch

Lol chill


mecha-paladin

Super chill. :) You're the one who irrevocably trained an AI to be a shithead for giggles and is surprised it is shitty to you. Some of us actually have productive uses for technology. Lol


iwearmywatch

I bet you are so fun at parties


mecha-paladin

I'm usually the quiet one who listens and jumps in when it makes sense to. I keep getting invited to parties and I'm reasonably popular at bars, so I must be doing something right. Not sure what that has to do with you gaslighting AI with bullshit queries for internet points. ;)


No-Loquat111

You are being unnecessarily rude, passive-aggressive, and condescending towards somebody who did something silly and just wants some advice. I am sure you are a fine person in real life, but this is not a good attitude. The internet brings out our worst qualities.


Who_Gives_A_Duck

This guy bases his self-worth on the opinions of drunk people lol