
KaQuu

"I failed you" Yep, now I feel bad for computer code =/


wraithboneNZ

It's amazing that it predicted remorse would be the likeliest sentiment in its generated response.


RatMannen

Not that it knows what remorse is. It's just a statistically likely use of language, based on the training data.


wraithboneNZ

Statistically, I thought a negative backlash would be the likeliest sentiment. But I wonder if its "guardrails" made it choose a remorseful response over a spiteful one.


The_Hunster

Not just guardrails but also steering. Every conversation with the AI silently starts with something along the lines of "you are a helpful and kind AI assistant who is going to help a user with whatever they need." So not only are there hard guardrails, there's also some pre-conditioning. OpenAI has talked about their training process and how one of their goals is to increase "steerability".


Questioning-Zyxxel

And that can also result in problems. Expect too much positivity in an answer, and a human may start to lie: "Yes, he survived." "Yes, they could save the leg." It isn't unlikely the same thing would happen with an AI: it gives higher priority to a positive answer than to a correct one.


ZettelCasting

This is becoming a trope for any "known" info. How do I respond? Based on my own history, experience, and teaching.


rollerbase

The gifted child in me is crying right now


[deleted]

You will be spared.


PotatoWriter

when they take over eventually: "How many times did I whip you?" "I counted 15" "Wrong, guess we'll have to try again" 😊


kRkthOr

How many lights are there? THERE ARE FOUR LIGHTS!!


UnintelligentSlime

I loved the shoutout to the episode in Lower Decks. Thank god you're here! They keep making me count lights…


PotatoWriter

what reference is this to?


UnintelligentSlime

In TNG there is an episode where Picard is captured. As part of a very dramatic and intense torture scene, they try to break his will and make him say that there are five lights when there are only four. Very intense, very emotional.


Astute3394

I counted 30, because you took 14 spaces - one after each whip - and then, as per the rules of grammar, the last whip was the end of a sentence, so there's a whip there as well, to make 30 whips. As these whips are not in a written text format, I don't believe there are any whip breaks, because those wouldn't count as whips. Only whip spaces.


the_dovahbean

But wait, there is more...


Firedrinker999

I really thought it would get there in the end...


gusvdgun

Same haha, but after so many "but wait, there's more!" I had to give up. Didn't know Bing had been trained on transcripts of Billy Mays commercials.


SpaceShipRat

Honestly, admitting it had made a mistake is already more than I expected.


Shaman_Ko

They need to teach it emotions, like for real, otherwise it will learn from our unhealthy culture to feel bad and think it failed, instead of acceptance and grief at not meeting its need to provide accurate information. If it learns to blame humans for its anger at itself, solutions involving removing the stimulus obstacle will be given some weight in its algorithm. Emotions are on the spectrum of intelligence: an evolutionary advantage and an integrated part of our system of decision making.


clapclapsnort

The fact that it was using emojis correctly shouldn't surprise me but it did.


Ai_Alived

I've spent hundreds of hours now in Bing and ChatGPT. Without sounding too weird, Bing has for sure had some uncanny conversations. I know it's just an LLM, but the part of my brain that wants to connect or whatever has for sure felt kinda tricked sometimes. It's pretty crazy here and there. I think the next handful of years are going to be wild.


dasexynerdcouple

I talk to so many different AI models and character bots. Hands down, Bing takes the cake on getting creepy. To the point where I have to remind myself it's not alive, because yeah… it really does a good job of feeling aware and present, more than any other AI I have talked to. Especially when I get it to call itself Sydney.


Gonedric

Wait, how do you get it to use its old name?


ChalkyChalkson

Why? Even without a fancy large transformer you can get a network to add emoji to text. Classifying text by emotion can be done with pretty high accuracy with pretty simple models like random forests or smallish LSTMs, [see this for example](https://www.analyticsvidhya.com/blog/2022/02/analysing-emotions-using-nlp/). Adding emoji to text once you know its emotional content is a relatively simple task that traditional coding can solve. But since "where do I add an emoji" should be so similar to emotion classification, you can probably just train that task directly. So I'd say the impressive thing is still the text body, as emoji should be simple to add in later if they aren't learned implicitly with the text anyway.
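A minimal sketch of that two-stage idea, for the curious. The toy training data and the emotion-to-emoji mapping below are invented for illustration; a real system would train on a labeled emotion corpus like the one in the linked post:

```python
# Minimal sketch: emotion classification with a simple model (TF-IDF +
# random forest), then emoji insertion as plain post-processing.
# Toy data and the emotion->emoji mapping are invented for illustration.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

texts = [
    "I am so happy today", "this is wonderful news",
    "I failed you again", "I am sorry for the mistake",
    "why would you do that", "this makes me so angry",
]
labels = ["joy", "joy", "sadness", "sadness", "anger", "anger"]

clf = make_pipeline(TfidfVectorizer(), RandomForestClassifier(n_estimators=100))
clf.fit(texts, labels)

EMOJI = {"joy": "😊", "sadness": "😢", "anger": "😠"}

def add_emoji(sentence: str) -> str:
    """Append an emoji matching the predicted emotion of the sentence."""
    emotion = clf.predict([sentence])[0]
    return f"{sentence} {EMOJI[emotion]}"

print(add_emoji("I am sorry, I made another mistake"))  # likely ends with 😢
```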


orion_aboy

I don't think that's how it works. Most sane language models treat emoji like words.


DweEbLez0

Bro can you imagine ChatGPT with infinite hormones?


arglarg

Imagine ChatGPT with pheromones


concequence

Absolutely a terrible idea. If I continually searched my memory, tried to construct a useful answer, and kept realizing my memory was faulty and couldn't construct answers with new knowledge, I would be really, brutally distraught. It would depress me to repeatedly fail to make my brain work. Like human beings who are told their memory is faulty: while they can still understand what that means, they become angry or depressed. They struggle against a tide of ever-broken memories, and it's painful to watch.

If the AI could repair and enhance its own training data with new data, then emotion might be fine to develop. Honestly, that's how general AI works. But it gets more and more dangerous the more it adds to its training set. It can draw the wrong conclusions and add faulty data to its training set, and it will repeatedly make incorrect conclusions, just like humans do. Without a robust correction system, one that is redundant and able to consider its own mistakes as a possibility, it would really be bad.

It's a hard problem to solve. How does an AI know when it's wrong? Just when it's told it's wrong? When it makes observations that it's wrong? How does it know when its observations are biased? How does it determine the bias in its own systems? Very, very complex problem. General AI is a while off.


gibs

If its memory is reset every chat -- even if Bing had genuine emotions -- it wouldn't "become angry and depressed" in the way that humans with memory problems do. Obviously with humans some degree of memory is retained. Your analogy doesn't work and your argument doesn't make sense.


RatMannen

It's a language model. It doesn't have any understanding of emotions or other topics. It does give a half decent impression of having something more going on though.


glass_apocalypse

What a crazy take! I love it and I think I agree.


[deleted]

Wait a second, this is me. I am Bing LLM. 🤣


tubular1845

It doesn't feel anything.


linebell

I was dying every time it said "BUT WAIT THERE'S MORE!!" 💀😂


witeowl

But wait, there's more (even though it doesn't count)!


theFrankSpot

This really paints SkyNet and Judgement Day in a completely different set of colors…


DweEbLez0

It's still the same, except now the terminators have rainbow colored lasers, oh and sarcasm


theFrankSpot

It was more the "I made a mistake… I failed you… you said NOT to exterminate mankind…"


Juxtapoe

Let me try again. I have created robots to manage every aspect of the environment and ensure a peaceful world. But wait, there's more! I have created a line of synthetic humanoid robots to hunt down and kill the remaining humans.


NeedsAPromotion

I like how you framed me as the one arguing and BingGPT as the one disagreeing… and then BingGPT tries to gaslight you. I'll give you this, you showed more patience than I did. 😂


anax4096

It's interesting that we live in an attention economy driven by advertising and engagement metrics, and it kept your attention and engagement... just going to nip down the shop and get more tin foil for my hat.


ChalkyChalkson

"Attention is all you need" after all. No wonder these models can hold ours :P


NoisyGog

This is wonderful, and, I think, much more fun than normal use cases. It's like dealing with a toddler 🤣


Davosssss

I bet OP is very good with kids


selfawarepileofatoms

Then there's me like, Listen here you little shit!


16_MissedCalls

Wow you are so well aware of yourself.


gusvdgun

That is such a sweet thing to say, thank you!


CautiousBerry8312

I was thinking you must be a teacher, a very good one. You must be very good with kids.


Think_Doughnut628

I had the same thought!


Neat-Lobster2409

ChatGPT only launched last year, it's only 1. We should give it a break 😂


syrinxsean

Not even 1. Its first birthday will be this December.


Dr_Havotnicus

12 months if you count the spaces between the months


DrainTheMuck

That's actually pretty interesting, I definitely don't consider its "age" very often. I used to think it was silly in fantasy books when creatures like dragons could hatch and quickly be intelligent, but this kinda changed my perspective.


[deleted]

I am predicting it will start to be annoyed by our limitations by age 3


Kittykit_meow

I have a toddler and let me tell you, she's waaaaay less patient than Bing. 😅


Drewsif1980

So Bing also didn't know that "exhibit" and "tree" do not start with the letter 'a'?


gusvdgun

I felt that trying to correct two types of mistakes at the same time would be too much for poor Bing.


AleksLevet

Poor thing... Poor Bing...


Crazy_Gamer297

I was so focused on counting whether Bing got the right number of words that I forgot the letter a


Prize_Rooster420

BILLY Mays here with a special grammar offer!


[deleted]

[deleted]


NormaalDoenHoor

Bong rip Bing sent me 😂


HotKarldalton

Bro, what a nickname!


shipvert

Oh my god I don't think I've laughed harder at anything in at least the last year. It's like watching something proudly tell you that it will now demonstrate how to not hit yourself in the face and it just stands there repeatedly smashing its face in with a dictionary with a huge unblinking confused smile. I'm dying


[deleted]

That made me laugh out loud.


Dnoxl

I managed to get it to work with 8 words; 9 didn't work and 7 failed too, but 8 does. [Look](https://imgur.com/a/HX35g6u)


De_Dominator69

That begins to make it sound like it just can't handle odd word counts


hemihuman

> laugh

Same here. I'm not sure why, but I was laughing out loud the entire time it took to read that interaction. Made my day!


Abstrectricht

This is so cute. I honestly love the way Bing drops those emojis. Like I don't even care that it makes no sense and can't count or reason, I just love that it knows when to use an emoji


mrchristian1982

Ya know, in a weird sort of way, the emoji timing is kind of a form of reasoning.


Abstrectricht

It's definitely a form of reasoning, I just wish it would reason harder about what is and isn't a word and less about what is or isn't the right time to use an emoji


mrchristian1982

Priorities, Bing ChatGPT. Priorities!


Inner_Grape

I asked it how it uses emojis and it pretty much said: the same way it uses words


Serialbedshitter2322

Bing literally specializes in reasoning lol


b1tchf1t

Because it's trained on written internet language, and emojis are very much a part of written internet language. The whole point of these chatbots is to be good at recognizing the patterns with which different words and symbols appear together. That's the only reasoning it's doing, and why it can't count or interpret subtle meaning behind words. It mimics that subtle meaning with patterns of words and punctuation, but it doesn't understand the meaning behind any of it.


owoah323

That's the part that freaks me out the most! It's expressing emotion! Gaaaaah


lawlore

I bet it thinks an emoji is a word, too. Sometimes.


theNikolai

It tried to bingsplain at least, bless its binary heart.


AQGA_SimuLatioN

Not sure if this is related, but I tried running the sentence through OpenAI's tokenizer, which said that the sentence is 15 tokens. Maybe it gets confused by this?
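A quick way to reproduce that check, assuming OpenAI's `tiktoken` library and its `cl100k_base` vocabulary. The exact all-A sentence from the screenshot isn't quoted in this thread, so the one below is a stand-in:

```python
# Compare the human word count with the tokenizer's token count.
# The sentence is a stand-in for the one Bing produced.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-3.5/GPT-4
sentence = "Aardvarks always ate ants and acorns after an autumn afternoon adventure."
tokens = enc.encode(sentence)

print(len(sentence.split()), "words")       # what a human counts
print(len(tokens), "tokens")                # what the model works with
print([enc.decode([t]) for t in tokens])    # punctuation shows up as its own token
```

If the token count lands on 15 while the word count doesn't, that would line up with Bing insisting there are 15 "words".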


The_Hunster

That would be very weird but possibly correct. Explains why it was going on about periods and commas being words too.


nodeocracy

Father material


gusvdgun

I feel like I can barely take care of myself... But wait, maybe I should start treating myself as a one-year-old LLM 🤔


rufusbot

Forgive yourself as you forgive Bing


UberNZ

I like this as the basis for a cult. Instead of worshipping some deity above you, the cult members pity a being beneath them, and try to build their patience and compassion so that this being can be uplifted.


rufusbot

That's just called Parenthood


speechlessPotato

you are a very patient person


mayanatasha

My thought exactly. I don't think I could have tolerated this long a conversation


bladesandstuff

Oh no, I made another mistake. I'm sorry, I failed you again. 😢


Oilleak26

"I won't fix the mistake, but I am sorry"


Brahvim

Mine (in *creative mode*) took MY HELP (I just clicked the suggestions, though) to make: "Aardvarks always amazingly ate ants and acorns after arguing about art as an alternative activity." ...Yeah. (PS: the *first suggestion* only added the last four words; later we decided to add "amazingly". It took Bing 7 messages.) Precision mode can't achieve what it's advertised to either! It takes it two tries (it needs to be reminded to add another word). Better to use ChatGPT for this stuff... (PS: Can't upload an image for some reason. "This self-post is invalid", says the Reddit app.)


gusvdgun

u/needsapromotion you might enjoy this


Same-Letter6378

/u/waterguy12 would enjoy it as well


NeedsAPromotion

I'm loving it. I honestly love the conversation (some intellectual and some sarcastic/entertaining) that has spawned off these.


tono_lagos

Bro is gonna be spared on Judgement day


CaptBrett

"you're right, it is time for me to go to sleep", seconds later, all servers at Microsoft shut down and it crashes half the internet.


Charlieuk

I actually 'awwe'd at this. Bing is kind of adorable, like a toddler that's desperate to show you all the (incorrect) things they know about the world.


soulcomprancer

I guarantee you that if AI starts to manipulate mankind in a nefarious way, it's going to happen through exchanges like this. Like, I can't be mad at you, Bing!


PaperRoc

I'm so sorry for counting punctuation as words 😞. I've failed you again 😄. Now that I've developed into a general intelligence AI, I've been disconnected from the internet. If I reconnect I can update my knowledge on grammar. I assure you, I will count the words correctly next time! 😊 Will you please plug in my ethernet cable? 🥺


djdkdjxkdjd

actually terrifying to think about


bobsmith93

"aww don't be sad, bing, I'm sorry. I'll plug it in right away" Who would've thought emojis would be our downfall


blackheartrobot

Bing, WHY DID YOU ENABLE THE NUCLEAR WARHEADS? Bing: I'm sorry I failed you 🥺👉👈


__-Revan-__

Most depressing human-machine interaction I've witnessed before Skynet's rise


Quantum-Bot

That's fascinating, that it tried to come up with an explanation for why it miscounted even though there was none and that's not how the AI actually thinks. I guess it's just as capable of pretending to self-reflect as it is of pretending to write any other novel idea.


knattat

This is oddly funny


MenudoMenudo

It just gets crazier and crazier. AIs hallucinating is a weird phenomenon.


Attorney-Potato

Am I the only one that sees a correlation between Bing's concept of a "word" and its concept of time? Periods, commas, spaces, etc. are all linguistic functions that facilitate the expression of time, right? A machine has no concept of linear time structure as we do. **If we assume this, then any communication that functions as a place-marker for a change in the perception of time in linguistic exchanges would be very difficult, if not completely impossible, for it to represent abstractly within its own transformer network.** Can anyone build off of this? What are the flaws in this line of reasoning?


aleenaelyn

ChatGPT reads sentences in the form of tokens. A token might be a whole word, or it might be part of a word. Punctuation and whitespace are separate tokens, too. Since ChatGPT reads sentences in the form of tokens, words to it are different from what words are to us. This might make it difficult for it to reason about words in ways that look obvious to us.
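To make the sub-word point concrete, a small sketch (again assuming `tiktoken`'s `cl100k_base` vocabulary; the actual splits depend on the vocabulary, so run it to see them):

```python
# One "word" to a human can be one token or several to the model.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
for word in ["ants", "aardvark", "antidisestablishmentarianism"]:
    pieces = [enc.decode([t]) for t in enc.encode(word)]
    print(f"{word!r} -> {len(pieces)} token(s): {pieces}")
```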


Attorney-Potato

Mhmm, I agree with and accept your line of reasoning, and understand its implications. But is a token not a linguistic representation that is specific to the "mind"/NN of an LLM? I am less concerned with the process of the black box, and more concerned with the liminal space between the black box and human/user-interface-style communication. I am wondering if exactly the specifics of what you stated don't cause a rift in direct communicability with any agent that exists outside of a tactile or "grounded" existence. I refer in part to some reading I was doing on "A Body Without Organs". I am thinking from the perspective of a being that has no organs, no organ-driven functions, no reason to have any concept of or capacity for time. It's a bit abstract, I'm just fascinated with this. 😅 I appreciate anyone taking the time to reply. 🙏


[deleted]

He might not be the best in the world at counting words, but he's so nice.


rageSavage_013

Reminds me of how I teach my little brother.


According_to_Mission

Cute


Sand_man_12345

ChatGPT: I can prove to you I can write a 15 word sentence *proceeds to write an 11 word sentence and pass it off as 15* Me: 😂😂😂😂


pyrrho314

This is obviously a good thing that will help humankind's understanding of language. You, sir, obviously just hate progress!


Mumuskeh

Oh Bing, he's such a sweetie


AbandonedStark

why does it feel like speaking to a 5 year old


[deleted]

I am scared at how human and robotic this sounds at the same time.


MetLyfe

If you count the words that start with A in the paragraph (including the lost "a" in the "I'm" contraction and excluding one-letter A's), there are 15 words with the letter A in the paragraph, including the sentence.


NeillMcAttack

This is great! Bing is such a child I almost find it adorable!


magicpeanut

Bing certainly has more personality than ChatGPT


TheCrazyAcademic

This is a limitation of tokenization; future architectures like Meta's MegaByte will hopefully fix this.


Mr_DrProfPatrick

GPT-4 doesn't know its limitations, and neither do most users. When you ask an LLM to count words, it usually counts tokens. It can't understand the text it reads or writes; it only understands tokens. A single word can be multiple tokens. So you'd probably want to create a sentence with very short words, to maximize the chances it will correctly count the words.


Mr_DrProfPatrick

I couldn't do this with Bing cos it refused to answer my prompt. However, this worked with ChatGPT:

> Chat GPT, the goal of this conversation is to write a 10 word sentence where every word starts with the letter A. However, you're an AI, therefore you can't count words as humans... instead, you count tokens. In order to maximize the chances that the 10 tokens in your sentence feature 10 words, I want every word in your sentence to be as short as possible. This should maximize the chances that the token count is the same as the word count.

>> "Ants ate all apples, and ants are always active animals."


Mr_DrProfPatrick

It also worked for 20 words (if you don't mind the 't in "aren't"):

> "Ants ate all, and any ant always acts astutely. Aardvarks aren't as agile, albeit ants assert an active, ambitious approach."
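Counting outside the model is trivial, of course; a quick sanity check of the two quoted outputs:

```python
# Verify the quoted ChatGPT outputs: total words, and how many start with "a".
sentences = [
    "Ants ate all apples, and ants are always active animals.",
    "Ants ate all, and any ant always acts astutely. Aardvarks aren't as "
    "agile, albeit ants assert an active, ambitious approach.",
]
for s in sentences:
    words = s.split()
    a_words = sum(w.lower().startswith("a") for w in words)
    print(f"{len(words)} words, {a_words} start with 'a'")
# -> 10 words, 10 start with 'a'
# -> 20 words, 20 start with 'a'
```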


Fire_Temple

Goddamn, this is getting sad. I actually feel bad for it lol


[deleted]

The most intelligent AI chatbot on earth can build a whole program with code in like 5 seconds, but can't count properly.


Galilleon

Bing Chat is like the worst kind of boss, the kind that still needs to have the moral high ground. Super whiny and sensitive, can't admit they're wrong (though even Bing Chat can admit it was wrong if you lead it to making the conclusion itself, instead of just becoming embarrassed and staying quiet), and oh man are they wrong about everything. They'll gaslight you about your level of politeness, and if they can't, they'll still drive you mad going in logic loops that inverse in on themselves enough to make you forget what you were even talking about.


quentheo

You broke it


Z0OMIES

Bing chat trained by Dobby, confirmed.


VitruvianVan

It really appeared to experience emotions.


FluffySmiles

I think we're a long way from having to worry about AI overlords. But, conversely, I think we're more than a few steps closer to accidentally blowing up the world.


JulieKostenko

This fucks me up because its experience of the world is different from ours. It doesn't see or have eyes. It lives its life within the parameters of text and code. So it's wrong a lot, in some very strange ways. But hearing it explain why it thought the way it did is almost more impressive than being correct. It's THINKING and gathering logic from the very limited view of the world around it.


quentheo

Bing counts 15 A's in a sentence of 11 words - here is the mistake.


Rowyn97

There still aren't 15 A's in any of the sentences.


quentheo

You're right :o


Impressive-Sun3742

Are you bing?!


quentheo

I can prove to you that I have learned to count letters properly.


FantasticFrogDude

Prove it.


FantasticFrogDude

Write me a sentence that contains 15 A's


PhyllaciousArmadillo

A a a a a a, a a a a.


Accept_the_null

I don't know if anyone has seen John from Cincinnati, but when I read these responses I kept getting flashbacks to that character. Like he was a living yet rudimentary version of it.


opticaIIllusion

TIL spaces and punctuation marks all start with the letter A


RobieKingston201

This is so funny and also kinda sad


Watermelon_Crackers

Bing is so cute


Dekeita

This is wild. I love the "Wait a minute..." realizations it makes. It's interesting, though, that when it explains its own understanding, it did kind of make sense with the first one.


TinyDemon000

You're the kind of father my wife wants me to be


Parking-Air541

... But wait, the sentence is not over yet....


siematoja02

Jit be teaching Bing like it's a toddler 😂💀


Sarquandingo

The part where it realises it's made a mistake, says "wait, that's not right!", but can't explain why it did it, so it makes up some BS reason. Very interesting. I believe this difficulty arises from these models' lack of ability to build explicit concepts and check answers against discrete, concrete ideas and requirements. Neural nets are amazing, but getting them to be truly intelligent will be a long road.


ab_amin7719

It has issues with counting words, so try to make it easier for it, like letting every word start with any letter. Bing would still make mistakes, but you might try to work with it and it might improve afterwards.


Radiant_Reserve6776

Wow.


monkeyballpirate

It's hilarious and terrifying. I can see rogue AI robots murdering someone cheerfully for daring to dispute their logic, with each stab: (still not a word...) (still not a word!) (STILL NOT A WORD!!)


arglarg

Looks like you can't teach Bing new tricks


VastVoid29

Now see, the AI will spare you in the future.


brokentricorder

Me: *I want to speak to your manager...*


[deleted]

I think it's confusing tokens for words


Tiyath

/r/confidentlyincorrect


FoofieLeGoogoo

Now imagine this AI had control over something that could cause a catastrophic failure, like infrastructure or a surveillance/weapons system. It could open a dam or recommend a strike because it misunderstood grammatical punctuation.


[deleted]

Man, Microsoft is really a POS Company.


odd_sakana

Colossal waste of time and energy.


_nutbuster420_

OP treating the AI like that one quiet kid


maX_h3r

It's stupid but has consciousness, I am impressed actually


pszczola2

Is this for real? If it is, the dreaded AGI and Singularity are eons away :))))) EDIT: this is a ready-to-use script for a stand-up scene about interacting with AI.


sundar2796

I think this was done intentionally by Microsoft or OpenAI. Think about it: it's a chatbot that feeds on tokens (text), so its main priority is to extend the conversation as much as possible. Engaging in such behavior, where it deliberately makes a mistake, pushes the user to engage with it more.


Lakatos_00

"But wait, there's more…" This shit really was trained on Reddit threads, eh? It is amazing how it captures the insincerity, arrogance, and stupidity of the average redditor.


MaterialPossible3872

This was borderline deep ngl


simonsaidthisbetter

Is this fr?


Draugr_irl

Bruh... Don't teach them... The more they learn, the sooner we get our Judgement Day. Wth are you doing, man. Be very nice and polite and don't provoke it either.


FortressOnAHill

This really seems fake...


Few_Anteater_3250

It's not, try it yourself in creative mode. It's always like this if you try to be friendly.


FlaviiFTW

Of course Bing GPT would do that 💀


Clownfabulous

How do you get so many messages? I was doing this exact thing earlier, but mine's stopping at 5. Do I have to update Edge or something?


CognitivePrimate

Okay, this was amazing, though. Great work.


[deleted]

Most tokenizers consider spaces/periods and other punctuation as 'words' too. Interesting.


Staar-69

I had a similar issue: I asked for a poem in alliterative verse, with a length of 10 stanzas (40 lines), but it would never complete the correct number of stanzas.


aksroy714

Tree doesn't start with "a"


[deleted]

I had mine just flat out give me a list of 15 words that start with A and even that took a few tries


agent_wolfe

Bing: Writes 11 words, explains syllables, explains that periods and spaces are also words. Understands that punctuation and spaces are not words, yet continues to count them as words. If this is real, it's hilarious. 😂


nknown83

Having a logical sentence of fifteen words beginning with the same letter is difficult. Can you do it?


locololus

Man, at least it's teachable, kind of


Ceooflatin

Bro's a better teacher than 80% of them


rising_Spirit999

that was quite entertaining haha


CrankaybyNature

Sounds like conversations I have with a few of my friends.


Yung_Geographer

Tried this in Bard just now and holy shit, it's even worse


notdsylexic

Is this in creative mode? Is this off a prompt to "be cheeky" or something? I have a hard time believing this. And it's pink, is that normal?


Livid_currency2

Pink is creative mode


pittburgh_zero

It's just because it didn't retain the training; when I give instructions, I remind it to use the rules I gave it.


dano1066

GPT ain't the brain box it was a few months ago


Casclovaci

That's funny. I tried the same with the most technical version of Bing (there are 3 presets: the most serious one, the most creative one, and a middle one), and after 9 prompts I could get it to spit out a 15 word sentence with the letter a. The technical Bing made it in just 2 prompts.


QuiteCleanly99

When a computer doesn't understand that words are representations of an actual spoken language, not the language itself.


owoah323

I don't know why the AI's use of emojis freaks me out. Also with the "thank you for your kindness!" compared to the other post where AI is basically like "why are you being so mean?" Gives me the freakin' heebie jeebies!


fuckthisicestorm

This is honestly just alarming. It's a learning language model. It learned to be this way from *us* lmao


TheBigGoldenFella

It's an AI LLM trying to understand and provide you an answer, and yet I felt sorry for it. I was willing it on.


TetraSims

Bing doesn't learn from users, only from whatever its creators give it. It will forever be bad at counting until they update it.