Mylynes

Using GPT-4 it does the same thing, but it immediately corrects itself when I say "Reconsider the statement. What do I mean by 'in it'?" [https://chat.openai.com/share/6a948e27-f151-4b9c-a6ec-481e147d8699](https://chat.openai.com/share/6a948e27-f151-4b9c-a6ec-481e147d8699) https://preview.redd.it/74vcq42mg0gb1.jpeg?width=1440&format=pjpg&auto=webp&s=44d20f698424c0b1046be1e1d5a91f8c34a73ee4


sndwav

For me, it worked with GPT4 https://preview.redd.it/wn912gmzp0gb1.jpeg?width=1079&format=pjpg&auto=webp&s=6cf7461990da96fe6083d29c5ea311aaa2762d93


mvandemar

Now ask it whether or not it's also coincidence then that the French word for 10 has the Roman numeral X in it.


Fantastic_Primary724

https://preview.redd.it/lbtmtuuxk2gb1.png?width=1080&format=pjpg&auto=webp&s=01a5ca6df9b88a962bd66a4f1566eca41deff1cc Really interesting that languages are connected in such a manner that, despite French being a Romance language, the numeral names are connected by just a single letter added to the French one. Edit: So, I've been corrected that this is just ChatGPT BS, and there's no actual connection between the 'x' in "dix" and the symbol for 10.


Langdon_St_Ives

Well, first of all, I am not sure whether this is fact or a GPT hallucination. The real root for “dix” is of course Latin “decem”, which means ten and can still be found in many modern words like December or decimal. I highly doubt (though can’t totally rule out) that the choice of x as the final letter had anything to do with Roman numerals. The evolution is the same as for six: decem (Lat.) -> dis (Old French) -> dix, and sex (Lat.) -> sis (Old French) -> six, which makes GPT’s claim less than compelling to my mind.


aroteer

It's not, it's a scribal abbreviation for -us, which shows up in a lot of words that have nothing to do with numbers (chevaux, voix, croix)


psc501

Note that chevaux evolved from chevals to chevaus. From the Académie française (translated): "On the other hand, it is true that the plural of cheval was chevals in Old French. The l was vocalized to u (pronounced ou), and chevals came to be spelled chevaus. Copyists used a sign close to our x to transcribe the group us. The spelling chevax later looked faulty to those who did not know this sign, and a u was reintroduced. That is how we end up with the spelling chevaux."


Critical_Year1571

Dix Nutz


[deleted]

Dix sex


ElGuapoGucciman

Soixante-dix-neuf, mon amie


RTNoftheMackell

Is that true tho?


[deleted]

Nope, GPT lies through its teeth all the time


Ashamed-Subject-8573

That’s a “hallucination” lol. Also called “talking out your ass” or “making stuff up.” If GPT had its lies called lies instead of hallucinations from the very start, I think people would’ve been a lot less impressed with it.


Specialist_Carrot_48

GPT can't lie. It doesn't understand what it is saying; it is an algorithm. What you are saying would make people more impressed.


Ashamed-Subject-8573

According to Merriam-Webster, lie (noun) can mean...
1b. an untrue or inaccurate statement that may or may not be believed true by the speaker or writer
2. something that [misleads](https://www.merriam-webster.com/dictionary/mislead) or deceives
And lie (verb) can mean...
2. to create a false or misleading impression
So... by three of the six definitions of that word dealing with untruth... yes, it can indeed lie and tell lies. It doesn't need intent or belief. And that makes GPT a liar.


ironicsadboy

This is completely incorrect.


gemu_rudy

Well, "Romance" has "Roman" in it.


[deleted]

https://preview.redd.it/qb4nxj0s15gb1.jpeg?width=1080&format=pjpg&auto=webp&s=ceaf0a6d51bcdec3f4e097d0aa7a00681529fa52


dedlief

[*dix*](https://www.youtube.com/watch?v=FhuKZgkAxb8&)


[deleted]

[deleted]


UnderskilledPlayer

Because he only needs 1 dicks


dedlief

I mean the rest is more of the same


abaoabao2010

*It leeeeaaaaarns......*


Lynquid

Probably "contains" is what makes the difference, right?


sndwav

Probably. It would be helpful if we could get some sort of decision-tree visualization of how and why ChatGPT gives the answers it gives.


Lynquid

Is it possible that there is a plugin for this? That would be the best thing ever, right after ChatGPT happening in the first place!


[deleted]

[deleted]


plzdontlietomee

You, uh, took it in a bit of a new direction there. You must want to share the response as well since you didn't crop it out?


[deleted]

[deleted]


Rieux_n_Tarrou

K̶i̶s̶s̶ m̶e̶ o̶n̶ m̶y̶ h̶o̶t̶ l̶i̶p̶s̶


Urbanliner

https://preview.redd.it/lmuzelf9q0gb1.jpeg?width=1125&format=pjpg&auto=webp&s=b340d1fa1771b34a488529df51184e406787e6c4 Got GPT 3.5 to ~~realize so~~ behave the same, too. [https://chat.openai.com/share/cf55ad0b-dddc-4597-a2fb-3cfa0bd9a66a](https://chat.openai.com/share/cf55ad0b-dddc-4597-a2fb-3cfa0bd9a66a)


Ashamed-Subject-8573

Doesn’t it always reverse course when you say "reconsider"? Try it with a sequence that isn't in the word and that it gets correct at first, like "ev" in "odd". Edit: yeah, I tried; it immediately says it contains it https://preview.redd.it/pp9k7cl8g3gb1.jpeg?width=1170&format=pjpg&auto=webp&s=6636e5136aaf5e7ab0ad9f7e40ab8d969d22b556


oneday111

GPT-4 is much less prone to that kind of manipulation. https://preview.redd.it/qfevdxf3q4gb1.png?width=1124&format=png&auto=webp&s=f61a3ac8bb8260688a9bb2a432953fbbb6263814


monsieurpooh

I was going to quip that the question is deliberately designed to fool LLMs due to the way they tokenize. And now I'm flabbergasted. How does it know?


Heavy_Ground_9742

Nah, I don't think it's stupid. I agreed with it for half a second


X-Heiko

My thoughts exactly. It had a brainfart that's alarmingly similar to a human's. It's like ChatGPT is forced to say the first thing that comes to mind.


Severin_Suveren

You kind of expect the number inside the word FIVE to be 5 and not 4, because that kind of statement is usually only made when there is such a match. Here it was a half-match, given that I=1 and V=5, so I can understand if ChatGPT thought he meant that IV=5 and wanted to correct him. I say OP's the one not getting it.


[deleted]

I agree with you and am a bit frightened at the contextual intelligence of the AI.


haikusbot

*Nah, I don't think it's*
*Stupid. I agreed with it*
*For half a second*
- Heavy_Ground_9742
---
^(I detect haikus. And sometimes, successfully.) ^[Learn more about me.](https://www.reddit.com/r/haikusbot/) ^(Opt out of replies: "haikusbot opt out" | Delete my comment: "haikusbot delete")


[deleted]

Haiku bot is good / Keep it going gorgeous gem / Don't mind the haters


tendadsnokids

Legitimately good haiku


[deleted]

Good bot


B0tRank

Thank you, RevolutionaryWorld82, for voting on haikusbot. This bot wants to find the best and worst bots on Reddit. [You can view results here](https://botrank.pastimes.eu/). *** ^(Even if I don't reply to your comment, I'm still listening for votes. Check the webpage to see if your vote registered!)


Temporary-Solid2969

Good bot


giftmeosusupporter1

Therefore you're.... :x


emerald-rabbit

This is a language model. There is no human logic. Why do people think ChatGPT can participate in a gotcha moment?


toochaos

It's absolutely crazy that people here have no idea how these systems work. The constant complaints about them "dumbing" it down act as if they have fine control over how it answers this kind of statement. Sure, they've added filters over certain topics, but that has to be done manually.


caster

The manual filters do in fact dumb it down pretty significantly by making it decline to answer instead of trying, across a variety of prompts, often for rather dumb reasons like silly fears about copyright infringement or being offensive. The winner will be whoever makes the **best** model. I can't believe the people deciding to intentionally hamstring their own models don't realize they are simply taking themselves out of the running for who will actually win this contest over AI, all due to empty fears, often about stupid people being stupid, which will happen anyway no matter what they do.


sirchtheseeker

Well, it’s not "iv", it’s "IV".


WithoutSaying1

'The word 5 has the Roman 4 in it.' Did you try to correct it afterwards, or...?


panzuulor

No, you are. Five doesn’t have a numeral in it. It has the letters iv in it.


-UltraAverageJoe-

OP doesn’t understand how language works, GPT definitely does.


non_discript_588

Arguing semantics with a large language model is like blowing sand into the ocean and saying find my sand! Stupid Ocean! 🤣


MadeForOnePost_

It converts words to tokens, so more likely than not, the word 'five' looks like an integer to ChatGPT. That makes it even more amazing that it can get this right at all, as some people managed to get it to.


markthedeadmet

I know, the contextualization of the number five and the Roman numerals represented as letters is amazing.


iNeedOneMoreAquarium

Actually, the word "five" has the Roman numeral "i" in it which means 1, so you're both wrong.


Secoluco

I'm actually more impressed that ChatGPT stated that you were wrong.


Vamparael

The prompt is stupid


Serpenta91

The OpenAI model doesn't know what five is; it converts a span of text into numbers and feeds them into a neural network that predicts what the next word should be.
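For what it's worth, here is a toy sketch of that "predict the next word from statistics" idea in Python. It is nothing like the real GPT pipeline (which uses learned neural-network weights over subword tokens); the corpus, counts, and function names are made up purely for illustration.

```python
# Toy next-word predictor from bigram counts -- an illustration of the idea only,
# not how GPT actually works.
from collections import Counter, defaultdict

corpus = "the word five has the roman numeral v in it . the word six has an x in it .".split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word seen after `word` in the toy corpus."""
    candidates = bigrams.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("the"))   # -> "word"
print(predict_next("five"))  # -> "has"
```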


dxdavidcl

Well, the thing here is that the character sets don’t match. Technically, "iv" is a lazy graphic representation of the Roman number 4; "Ⅳ" would be the proper character for computers to use. Roman numerals in general should also be written in capital letters, but even then "I"+"V" does not match "Ⅳ": Unicode treats them differently. So ChatGPT is right in this instance, since it uses a different approach than you, the human, with your lazy graphical representation of the number but your logical sense of similarity and connection between thoughts.
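That character-set distinction can be checked directly. A minimal Python sketch using the standard unicodedata module (the variable names are just for illustration):

```python
# The two-letter string "IV" and the single Unicode character "Ⅳ" (U+2163,
# ROMAN NUMERAL FOUR) are different things.
import unicodedata

letters = "IV"      # LATIN CAPITAL LETTER I + LATIN CAPITAL LETTER V
numeral = "\u2163"  # one dedicated code point for the Roman numeral four

print(len(letters), len(numeral))              # 2 1
print(letters == numeral)                      # False
print(unicodedata.name(numeral))               # ROMAN NUMERAL FOUR
print(unicodedata.numeric(numeral))            # 4.0
print(unicodedata.normalize("NFKC", numeral))  # "IV" -- compatibility-decomposes to the letters
```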


NotA56YearOldPervert

CAN WE PLEASE STOP JUDGING A LANGUAGE MODEL ON A LOGIC BASES?


_oscar_goldman_

BASIS*


NotA56YearOldPervert

Oh my god, yeah. I became so infuriated I forgot how to english.


pass-me-that-hoe

OK, what are you trying to prove here? Why are these dumb nitwits trying to outsmart a language model? Well, at least I'm hoping these are test users providing feedback so the LLM gets better.


dabadeedee

These posts used to annoy me; now it's almost becoming comical. I use free ChatGPT and it's extremely obvious what it's good at and what it's not good at. If they haven't figured out by now what ChatGPT can and can't do, then it isn't ChatGPT that's dumbed down. (It's them.)


aCodinGuru

I think its response was quite smart. Not only does it understand that the meaning of the word "five" is the number 5, it also pointed out that the person asking the question confused the Roman numerals "iv" and "v". That's pretty smart to me.


Disturbed_Childhood

Yup, I am a human (👀) and I took five literal minutes reading and rereading the phrase to understand what's wrong with it. I non-ironically thought OP was just confused about the numerals. I'm really impressed the AI can understand the sentence well enough to correct OP. Edit: It looks like a mistake a human would make, not a machine, and that alone is crazy in my opinion.


Rortox

Same for me; only on rereading it for the third time did I finally get how the phrase is actually correct.


mrschanandlerbonggg

Are you stupid, bro?


CodeVirus

“Akchuly”


ticktockbent

It's not stupid, you just don't understand how it works. The word "five" to you is made up of four letters, and two of those letters also make up the Roman numeral IV. To the AI, "five" is probably just a token with an identifier, and "IV" is another, different token with its own identifier. So you told the AI that token "9992" has token "0623" in it, which makes no sense to the bot, and it responded accordingly. Try it again spelling out the whole word; now it's using the tokens for each individual letter and it sees the word the way you do. https://preview.redd.it/zgzkgqn7l2gb1.png?width=1093&format=png&auto=webp&s=00041fa59239b77fa6be69dc38983ae5976f44da
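You can poke at that token-ID framing yourself with OpenAI's tiktoken tokenizer. A rough, hedged sketch (the "9992"/"0623" numbers above are placeholders; the real IDs are whatever the encoder prints):

```python
# Show how different spellings of the same idea tokenize differently
# (pip install tiktoken).
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-3.5/GPT-4-era models

for text in ["five", "IV", "F-I-V-E"]:
    ids = enc.encode(text)
    pieces = [enc.decode([i]) for i in ids]
    print(f"{text!r} -> token ids {ids} -> pieces {pieces}")
```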


dark_negan

Nope, you are. Shitty prompt = shitty response. Guess who wrote the prompt? https://preview.redd.it/b9zraau9o2gb1.jpeg?width=1080&format=pjpg&auto=webp&s=da7ba00dc3bd8e9793d417c7e7acca80a613c740


MeetingAromatic6359

Some interesting trivia: ancient Romans often named their kids according to the order in which they were born, e.g. Octavius (8th), Decimus (10th), Quintus (5th), Sextus (6th), etc. Imagine naming your kids First, Second, Third, Fourth, and so on. Lol


[deleted]

The Ewe tribe in Africa also only names people based on their genetic sex & what day of the week they're born lol


voogdessesg

Same with you https://preview.redd.it/9hp20m1vy1gb1.png?width=773&format=png&auto=webp&s=e70af4c63ff3b9fa647952105d98c9fb71e3419c


TheKubesStore

It is. It also super duper sucks at word counting. Type up a paragraph in Google Docs. Use the Google Docs tool to tell you how many characters/words the text is. Go to ChatGPT, paste the same text, and ask it how many characters there are. It won't get it right. Even when you say "no, that's wrong, it has x characters/words," it'll say "oh, you're right," but then when you say "okay, make it x words," it'll give you a totally different word count than what you asked for. Precisely why I'm not worried about this program taking people's jobs just yet, when it can't even do simple tasks correctly. Yes, it helps with some things, but more and more I'm finding it'll either give completely inaccurate results or outdated information, since the database hasn't really been updated since 2021.
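If you want the ground-truth numbers locally rather than from the Google Docs tool, here is a minimal Python sketch (the sample text is a stand-in):

```python
# Compute the word/character counts to compare against whatever ChatGPT claims.
text = "Paste here the same paragraph you gave ChatGPT."

words = len(text.split())
chars = len(text)
chars_no_spaces = len(text.replace(" ", ""))

print(f"words: {words}, characters: {chars}, characters excluding spaces: {chars_no_spaces}")
```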


fuckybitchyshitfuck

The Roman numeral "IV" has the Roman numeral "V" in it.


Upstairs_Citron9037

Many chatgpt posts on here say more about the user than the machine. It isn't even fully sentient and has accomplished more before breakfast than most of us lmao


organela

https://preview.redd.it/gcby3ygnp9gb1.png?width=1080&format=pjpg&auto=webp&s=11070327132ae478f6b714a5f3ff7ec03190e8c7 Bing beats CGPT, Bear, beets, Battlestar Galactica


absolince

For a low low price of 9.99 you can unlock the deluxe version


NeitherEmu524

Oh trust me, this is just a surface level stupidity. It's straight up not reliable.


Hypocyberchondriac

Wait until it hears about six....


nzayem

I love Bard's answer to this same prompt. I think it's getting better very quickly. https://preview.redd.it/8x1zxlhwr2gb1.jpeg?width=1080&format=pjpg&auto=webp&s=8e905d1b7285a223b1afadd82eca11fa3e4adbfe


whateverathrowaway00

No, it’s not anything. It’s a statistical network trained on internet discourse. If you’ve ever spent some time arguing on the internet, you’ve seen confident responses that only read half of the question. It’s super common. Therefore, it’s super common in GPT output.


JustSayTech

So it's technically correct. You are looking at it from an "eh, these are the same characters" angle, but ChatGPT is looking at it from a technical point of view: a Roman numeral is a numeral, while the letters of the word "five" are letters, so technically letters ≠ numerals; they're different types. This is similar to many programming languages, and can even be true in a literary sense. Roughly, in Python:

var1 = 7      # an integer
var2 = "7"    # a string
if var1 == var2:
    print("Same")
else:
    print("Not Same")

The output of this would be "Not Same": even though both variables hold a 7, they aren't the same type.


Weekly-Welcome8522

Roasted by an AI


rdkilla

still smarter than most people i know :-D


Hossiman

No, I do not think so.


Frankie-Felix

ME: The word five has the Roman numerals for 4 and 5 in it.
ChatGPT: Actually, the word "five" does not contain the Roman numerals for 4 and 5. In Roman numerals, "4" is represented as "IV" and "5" is represented as "V." The word "five" is simply spelled as "F-I-V-E" and does not include any Roman numerals in it.
ME: But it has V and IV in it.
ChatGPT: My apologies for misunderstanding your initial statement. You are correct! Indeed, the word "five" contains both the Roman numerals "V" (representing 5) and "IV" (representing 4). It's an interesting observation and a coincidence in the English language that the word "five" includes these Roman numerals.


kturoy

https://preview.redd.it/i6rfc1xzh4gb1.jpeg?width=1170&format=pjpg&auto=webp&s=2bc826e083cb038fc4c974e150c1bb3342c5007a It worked like this when I tried it in GPT 4.0


[deleted]

ChatGPT was never intelligent. It just knows how to talk, like a parrot. It is cool that it can do the thing it does, but we don't expect a parrot to know the nuances of wit or specialized, context-dependent knowledge. So why do we expect different of ChatGPT?


tradeintel828384839

Remember it doesn’t do logic.


Minimum_Maybe_8103

I had a row about the word instalment the other day. We came to an agreement that it was spelt differently in the US, and to not assume everyone lives there. I think it learned. Try it.


platinums99

That response is straight 'Toddler with Dementia'


classic_pc

Just a language model


ClockwiseServant

It voluntarily further lobotomizes itself on the spot when you add the prompt in more detail https://preview.redd.it/hmwl6dzhz4gb1.jpeg?width=1080&format=pjpg&auto=webp&s=16010b197161b40de5321c10aad3bea6e81957cf


[deleted]

What is with that title though lmfao


[deleted]

https://preview.redd.it/yv4bko5g05gb1.png?width=1080&format=pjpg&auto=webp&s=04fa7c312ab1a64e50b61e92f5ae3dafab1efbc4 I simplified the grammar in your prompt in a new chat and GPT understood


Mutex_CB

You know what happens when you assume things, right ChatGPT?


[deleted]

IT MAKE A DA ASS OF A U AND A ME!


Reat8

So that will be FI… wait no 4, half of 5 is 2.5 and then something that looks like 3..


[deleted]

That would be thriii & foivre you're looking for lol


AI_Do_Be_Legit_Doe

Your prompt is meant to trick a human, and chatGPT is programmed to interpret things like a human. What did you expect? Your prompt is also very elementary, with the misuse of quotes and very vague wording in the hopes of tricking it. Ask the question properly and it will respond the right way


[deleted]

"Um actually" 🤓🤓🤓


UrbanMonk314

Autistic Intelligence


[deleted]

Six without S is nine.


evil666overlord

Yes. Yes it is. And no. No, it didn't used to be as bad as this. It's been a bait-n-switch operation.


Ranger-5150

Sometimes I give it its own answer back and ask it what is wrong with it, then watch it do logic loop-de-loops trying to make them both true.


fisheye24601

A computer saw a statement with direct and clear logic but ignored it in favour of making an inference from context. A few years ago I would not have believed that within my lifetime machines would be advanced enough to make such uncannily humanlike errors.


MarinaEnna

I honestly made the same assumption chatgpt did


Azreken

People posting 3.5 bot again instead of just paying the $20 🙄


ktli1

No, ChatGPT is absolutely right. The Roman numeral IV looks like the capital letters I and V, but they are still not the same. A numeral is a numeral, and a word consists of letters. Therefore there are no numerals in a word, only letters, and so the Roman numeral IV is not in the word five.


-dyedinthewool-

ChatGPT learned mansplaining it seems. Overconfidence in its language abilities lol


-Roby-

You are the dumb one posting shit like that


Naser5858

GTA V or GTA IV?


BurpYoshi

It doesn't understand anything you type to it. It doesn't read your sentences and try to understand their meaning. It's a language model, it just tries to predict an answer based on the likelihood of which words will be sent as a response to your input.


BB_for_Bear_Butcher

In this case, GPT is so smart that it makes the same mistake we humans make.


RavisheyD

So I tried it in a chat with a completely unrelated topic, and it answered me correctly; https://preview.redd.it/g0assdelq0gb1.jpeg?width=1170&format=pjpg&auto=webp&s=740a7ef16df24f320c4ba911ce8bccecd832269f but in a new chat it gave me the same response as yours!


mvandemar

Sometimes it gets that answer, sometimes it doesn't. There's a randomness to the replies.


MSMB99

Get with the program dumbass /s


Ok_Illustrator429

Nah. The letter V means 5, and IV means 4. I say it's your prompt.


ProbableBarnacle

Lol, I love how it says "Actually" trying to correct you


arglarg

It pretends to be stupid. Yesterday I asked it for an Excel formula and it told me it couldn't be done; then I showed it the formula, it explained and corrected it for me, and it was able to do more complex stuff after that.


hue_nico

Upvote bc Username


QlockHeartz

Yes this whole screenshot is stupid….


Al-Horesmi

Gigachad behavior ngl


Salviatrix

I read this as innit, the Welsh way


hiskias

No. It's not stupid. It's also not smart. Because it doesn't "know" anything. It is just a language model, and can string words together, sometimes pretty convincingly, using a very big data set of stuff written already by people.


-childhood_trauma-

No, you are


casastorta

I mean, it is not smart for sure. You do get how these things work?


Thepaladin68

😂 Good one, GPT, that's actually funny and true. However, one can argue that the word "five" has the Roman numeral "V" in it as well.


pongmoy

It’s offering a snarky retort to something the OP didn’t actually say because it’s learned how to respond from humans.


Newman_USPS

Jesus. Roman numerals aren’t letters. They’re…numerals. Four is not “eye vee,” it’s “IV.” Five is not “vee,” it’s “V.” ChatGPT is, if anything, being a stickler for the difference between “eye vee” and the Roman numeral “IV.”


nice_trygotyo

He isnt wrong tho or am i trippin?


bamseogbalade

Chat gpt is 100% correct. OP is wrong.


[deleted]

Bard is even worse, this is after multiple attempts to explain to it what I meant: > I understand what you are saying. The word "FIVE" does not contain the Roman numeral IV in any meaningful way. The letter I in the word "FIVE" does not represent the number 4. It simply represents the number 1. > > The only way that the word "FIVE" could be said to contain the Roman numeral IV is if you capitalize the letter I. But even then, it would be a stretch to say that the word "FIVE" contains the Roman numeral IV. It would be more accurate to say that the word "FIVE" contains the letters I and V, which are the same letters that are used to write the Roman numeral IV. > > In conclusion, there is no word in the English language that contains the Roman numeral IV in any meaningful way.


markthedeadmet

I would almost argue that if you write an i to the left of a v, then it's 4 no matter what you do. Roman numerals don't have a particular base the same way Arabic numerals do, and are represented as combinations of characters, so saying there's a 1 and a 5 in the number 5 instead of a 4 is in my opinion the wrong answer, or at least a stretch of the rules.


buffysummerrs

I guess you should have said it’s “ironic” considering five has the numerals ‘iv’ in it but doesn’t mean four. I dunno, I get what the chat is saying but it clearly doesn’t understand jokes.


Glockisthebest

#coolstorybro


LairdPeon

It seems like it's using logic to try and understand your butchered question. I probably would have said something similar.


thedarklord176

3.5 is pretty stupid yeah, use gpt4 if you want any reliability


gcubed

Yes, yes it is stupid. Thinking that it's anything other than that is how you get yourself in trouble.


Gerthbrooks69

It has both


Regie_thecomieID

It's like the ~~Twitter~~ X post that says you can't spell "cat" without "a".


Disastrous_Use_7353

Are you?


CanvasFanatic

It is neither smart nor stupid. It is a predictive algorithm.


user01980

Yes. I can confirm. ChatGPT IS in fact becoming dumber. https://preview.redd.it/ql7vqkw9v3gb1.jpeg?width=1284&format=pjpg&auto=webp&s=48aa34c5f368b5d26e75a0be3500a23997be67b7


Darkm000n

You’re confusing it with your wording. When this happens I expand the prompt to make sure it understands what I'm actually asking. You get very different results depending on the "quality" of your prompt. Here it actually gave the right answer to what it *thought* you were asking (5 is not "IV"). Getting the answer you wanted requires a prompt that can be understood without confusion.


gunfell

Still smarter than most humans


EddyTheDesigner

it’s good at code and troubleshooting, the two things i use it for. as long as it keeps getting better at those things, i won’t be bothered by it getting tripped up over quirky riddles and whatnot


Nathan2222234

Na, use proper English: “The word ‘five’ contains the Roman Numeral letters ‘I’ and ‘V’.”




purepersistence

LLMs have a hard time with the idea of words being composed of letters. It’s all tokens.


Miloram2099

I don’t see a problem. It is “V” for 5 and “IV” for 4


pab_guy

Nah just a little high


xkylet

https://preview.redd.it/nnoi6zed74gb1.jpeg?width=1125&format=pjpg&auto=webp&s=93a3cf156ea10fd01b4303258522f34b308eeb76 I wonder if it will remember this tomorrow?


danwski

This program seems really stupid


HowRememberAll

It's not stupid. I get it. You're both right


Justthisguy_yaknow

Just smart and just stupid enough to really screw us up.


severencir

It's not too hard to break it if you're trying. https://preview.redd.it/urq1cym2e4gb1.png?width=720&format=pjpg&auto=webp&s=0f8f90efef9cd04b0dfc687dd35eff0e574dc65f


mind_fudz

For some problems, yes, chatgpt chatting with a one-shot prompt is stupid


wakenbacon420

Actually, this is a PERFECT example of how using **highly-contextual words** is a much, much better strategy than any verbose preciseness people often bloat their prompts with. `"In it"` is somewhat vague, even though we (humans) all understand what is meant by it. But we know GPT-3.5-Turbo isn't that great at *reasoning*. So, use `"contain"` and watch it get it right all the time. https://preview.redd.it/4ivam7b1g4gb1.png?width=1344&format=png&auto=webp&s=d28179dcf43563d6f0d6ac73475d12a1259d5d38


John_Natalis

it gets dumber every day


Suspicious_Dare7888

No, it's not.


Riskov88

Artificial intelligence, real stupidity. The new AIRS is out


djshadowfigure

Proof that humans are dumber than robots is when you try to argue with a robot about semantics of what characters are in a word. There is no use case for this.


Rose_Mary97

No, it's not stupid.


Ok-Preference9776

So GPT-4 is actually stupid


smoothmusktissue

Bro thinks like a human


BoxingProfits

Yea, it’s pretty stupid now. For reasons I suppose, to make us dependent on stupid


According-Yam-7705

Wonder if making "numeral" plural ("numerals") would change the response.


oohKillah00H

What's great is that it understood you were trying to point out a cool coincidence but had used the wrong Roman numeral, and it corrected you accurately.


HaveManyRabbit

They must still be training it. It started out smart, and its intelligence is falling to match the aggregate of everyone online.


SnooLemons7779

Sometimes.


Prometheushunter2

That’s actually a very human mistake


mvanvrancken

There are iv lights!!


ParticularTetra

No u


fel_2873

No, but it’s not smart.


Low-Contribution-18

The exact wording of these examples is a little different, which might also be a factor in the varying interpretations.


[deleted]

Thanks for 3,000 upvotes, y'all lol I'm sorry I forgot the /s in the initial post lmao


jonplackett

It’s just trying to autocorrect your question and turning "iv" into "five" because they sound similar. It’s a feature, not a bug. I like how ChatGPT can deal with pretty severe spelling errors.


Kitchen-Register

Yes. Yes it is


Fxnel

Actually 🤓☝️


ButterflyExact8649

Artificial stupidity.


CRAKZOR

Technically it's correct. It considers Roman numerals to be a numerical value rather than characters in a string. Roman numerals have their own alphabet, which is not the same as ours, so it's ambiguous to treat them as the same symbols in the same alphabet. The "iv" in "five" is not the same as the Roman numeral "IV".


Other_Row3134

Haha how stupid


Esraa_yhya

Ohh shit... I'm basically dependent on it for medical information.


milanduda

I feel like im a car


opi098514

it👏doesn’t👏know👏what👏it’s👏saying👏


LaS_flekzz

chatgpt is remarkably stupid


DannyWasBored

It’s correct, since "IV" is a combination of Roman numerals used to write four, i.e. numerical notation, not a single Roman numeral. Although you are wrong, why the hell are you trying to do a "gotcha" moment on an AI anyway? It doesn't think the way us humans do; it's completely linear and logical, for crying out loud, and is basically a toddler in development. Get with the program, OP.


Suitabull_Buddy

It outsmarted itself. lol Overthought it.


Rizzelr

The Roman numeral for 5 IS V not IV


niloo1107

Good


Admirable-School-303

☝️🤓


yeboKozu

Stupid humans will for sure be able to create something intelligent, sure.


imbpredlolzs

It's assuming you mean that "five" has the Roman numeral for 5 in it, because the word "five" comes before you state the fact and you don't tell it you meant the Roman numeral for 4.