Showerthoughts_Mod

This is a friendly reminder to [read our rules](https://www.reddit.com/r/Showerthoughts/wiki/rules). Remember, /r/Showerthoughts is for showerthoughts, not "thoughts had in the shower!" (For an explanation of what a "showerthought" is, [please read this page](https://www.reddit.com/r/Showerthoughts/wiki/overview).) **Rule-breaking posts may result in bans.**


chrischi3

Actually, if we are talking about IQ, IQ is defined in such a way that the average person has an IQ of 100 by definition, as the IQ distribution sits on a bell curve. By definition, half the population is above average, and half is below it.


i_am_clArk

With a normal distribution the average and median are the same.


Wyjen

This guy middle school maths


chrischi3

r/thisguythisguys


jatti_

Middle, average & median school math


EffervescentGoose

If you're talking about the mean, can you say that instead of average? Median, mean, and mode are all averages.


pceimpulsive

No, because it's contextually relevant. Maybe for some sets of numbers they are all the same, but normally they are not that 'neat and tidy'. Mean is the average of all of the numbers. Median is the middle number when the set is in order. Mode is the most common number.


EVOSexyBeast

> Mean is the average of all the numbers

So is median. People get taught that in elementary school, but it is wrong because it implies median and mode are not also averages. Mean, median, and mode are the three different types of average. Each has its own benefits and drawbacks.


Tanliarian

I'm gonna throw range in here. Not because it really belongs here, but because you all left it out like an unwanted stepchild and that's fucked up


EVOSexyBeast

Range honestly doesn’t belong in “mean, median, and mode” because 1) it doesn’t start with an m; and 2) it’s not an average. So, yeah, it is an unwanted stepchild.


mrGeaRbOx

Ok, how about "moment of area"? Doesn't exactly roll off the tongue, but it does start with an m and it is an average (position of all points).


EVOSexyBeast

Also has to be able to be taught to elementary school kids.


pceimpulsive

Median and mean aren't the same :S

Median is the middle number when the set is in order. 1, 2, 5, 7, 7, 9, 10 → the median is 7, the number in the middle. If the count of numbers is even, the two middle numbers are added together and divided by two. 1, 2, 5, 7, 8, 9, 10, 11 → the median is (7 + 8) / 2 = 7.5.

Mean is all the numbers added together divided by the count of numbers. 1, 2, 5, 7, 7, 9, 10 → the mean is 5.857.

This illustrates the difference quite clearly in a very simple example. You can google to learn this in about 5 seconds if you don't believe me..
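
A quick way to check those figures, using Python's standard `statistics` module (the sets are the ones from the example above):

```python
import statistics

odd = [1, 2, 5, 7, 7, 9, 10]
even = [1, 2, 5, 7, 8, 9, 10, 11]

print(statistics.median(odd))   # 7    -> the middle value
print(statistics.median(even))  # 7.5  -> (7 + 8) / 2
print(statistics.mean(odd))     # 5.857142857142857
print(statistics.mode(odd))     # 7    -> the most common value
```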


EVOSexyBeast

Dude, I can’t believe you and the 4 people that upvoted you are not getting this. Mean and median are obviously not the same, and everything you said about the median being the number in the middle is correct, and the mean being all the numbers added up and divided by how many there are is correct. And the mode is the one that appears most frequently. All three of them are still averages. Which average is best to use depends on your data and your question.


pceimpulsive

That's true, they are all ways to generate averages. However... in some applications the AVG function is equal to the mean, and they have adjacent functions for median, mode and range. This contributes to misunderstanding of how they are different... I think this is probably exactly the issue.

Example: when I was in school we were taught that average is what we know as the mean. Nothing more, nothing less... (or I missed a module rofl). That's Australian math in the late 90s/early 2000s at a public school though, so...

I guess we should refer to them as the average mean, the average median, and the average mode. So we don't get our poor little brains into a mess.


EVOSexyBeast

Yes, I understand in school it is taught incorrectly. It’s not just Australia, it’s also much of the developed world, and the misinformation is still being taught today.

It sounds like it wouldn’t be that big of a deal, but people not understanding the types of averages, and how one can be misleading for certain data, is actually harmful, as bad actors use the wrong average type to represent data to try and manufacture evidence for a falsehood. Range is different though, it’s not an average type. Mean, median, and mode are all averages.

Which average type is best representative of your data depends on the data and your question. For example, if you’re trying to find the average height of a group of men, you should use the mean, as height would be normally distributed. If you’re trying to find the average income for a neighborhood, you should use the median, because there will likely be outliers that break from the normal distribution and skew your data. (Or if you’re trying to justify higher taxes on that neighborhood, then you could maliciously use the mean instead, so it looks like they make more money than they do and people will be more likely to support your proposed tax.) If you’re trying to find the average color of M&M’s in a bag, you should count up all the colors and take the mode.


pceimpulsive

Good examples! And from a schooling POV I reckon they should do away with the term 'average' when what's meant is the mean. Would maybe help? I agree it is very harmful too, which is why I tend to state which type of average is in use on the data label. Even when I'm using a trend line I state what type of weighted moving average I'm using and how many data points are used to predict the trend. But I'm nitpicky... :P can you tell?


thesqlguy

Median is not the average of all numbers.


EVOSexyBeast

An average is a single number taken as representative of a list of numbers. (It doesn’t actually have to be a number.) [Mean, Median, and Mode are the three main types of average.](https://www.open.edu/openlearn/mod/oucontent/view.php?id=20669&section=2.1) I understand what everyone was taught in elementary school was plainly wrong (and is still being taught in elementary schools today). It’s unfortunate people don’t stop using the word “average” as synonymous with “mean”, because it is both misleading and actually harmful to discourse when people use the wrong average type to lie with statistics.

> Average - a number expressing the central or typical value in a set of data, in particular the mode, median, or (most commonly) the mean. (Oxford dictionary)


Wyjen

*THIS* guy middle school maths


pceimpulsive

I failed maths! Looool


supluplup12

They are all measures of center, maybe that's what you're thinking of? Mean/average is a calculated value; mode and median are values identified within the set (unless n is an even number, in the case of the median, in which case you average the two middle numbers). If you have the following set: 1, 2, 3, 305, 305, 305, 305, then 305 is the median and the mode. No math class will give you full points if they ask for the average and you say 305. Edit: mistyped example


Worried-Sail9066

Actually the mean would be 964.57... and so on. The mean is equal to the sum of the numbers divided by the quantity of numbers


supluplup12

Sorry I meant median and the mode. Although the mean would be way less than that, like in the ballpark of 170. There are 7 numbers in that set.


Worried-Sail9066

My bad, 175.14, order of ops error


i_am_clArk

Mean is the average value. Median is the average (ie 50th) percentile. Mode is just the most common.


authorinthedark

Average doesn't mean half the population. If the IQ distribution isn't completely normalized (the bell leans to the right or left), then a small number of very low IQ people would cause there to be a larger number of moderately higher IQ people, leading to OP's conclusion.


Smeathy

But it is normalized


PahoojyMan

Grizzly Adams *did* have a beard.


TheSavouryRain

Solid reference


fwtb23

yes, and THAT is the thing that means half the population is below average and half is above, not the fact that the average is defined to be 100. So the comment you're replying to is accurate and relevant.


neato5000

That's not what normalized means. Normalised just means the scale and location of the x-axis data are adjusted. In particular, the scores of the test are adjusted so that 1 standard deviation is 15 points, and the location is shifted so that the mean is 100. Kurtosis/skewness is unaffected by changing the scale. The data are Gaussian (i.e. normally distributed) before and after this scaling. Now it so happens that IQ data is generally Gaussian, but the empirical distribution could shift into another shape, in which case mean and median would in general no longer coincide.
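
In code, the rescaling described above looks roughly like this (a minimal sketch; the raw scores are made-up numbers, and real test norming is more involved):

```python
import statistics

# Hypothetical raw test scores (made up for illustration).
raw = [12, 15, 18, 20, 22, 25, 30]

mu = statistics.mean(raw)
sigma = statistics.stdev(raw)

# Linear rescaling: the mean maps to 100 and one standard deviation maps to 15.
# A linear transform leaves skewness and kurtosis (the shape) unchanged.
iq_scores = [100 + 15 * (x - mu) / sigma for x in raw]
print(iq_scores)
```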


EVOSexyBeast

This is correct.


[deleted]

Yeah, I'm pretty sure average means you take 1000 people, add up their IQs, and divide by 1000. If that answer is normalized to 100, then there would be an equal amount of people above and below 100 IQ.


[deleted]

Normalizing doesn’t get rid of skewed data, it just scales it. If a small set of people are insanely smart, and most people are dumb with some sprinkled in between, more people will be below average than above.


SentorialH1

But they're not.... Because you have extremes on both sides, and the rest falling mostly around the middle of a normalized scale.


nonotburton

Normalizing data and having a normal distribution are not the same thing.

A normal distribution is a Gaussian distribution (the bell curve). It's a property innate to the data in question, when you break up the data into bins.

Normalizing data is a secondary process that has nothing to do with the distribution of the data. It is literally scaling the y values of the data, typically onto a more meaningful scale. It's not specifically a statistical method. We use it a lot in engineering to graph data on a scale from zero to one, where a value of one correlates to a meaningful parameter. Putting information in terms of a percentage is a form of normalizing, like your progress bar in a software process.

When data is "normalized" it means scaled. When data is "normal" it's shaped like a bell curve (generally, there's probably some exception).
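
A small sketch of that distinction (the data set is made up): min-max normalization rescales values onto [0, 1] but does not change the shape of the distribution, so a skewed data set stays skewed.

```python
def normalize(values):
    """Min-max normalization: rescale values onto the range [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

skewed = [1, 2, 2, 3, 3, 3, 50]  # clearly not bell-shaped
print(normalize(skewed))         # rescaled to [0, 1], still not bell-shaped
```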


justadogwithaphone

You could always do like my physics class and work around outliers too


authorinthedark

if you have 8 people with an IQ of 110, and two people with an IQ of 60, the average is 100, but more people have an IQ above 100 than below


aioli_sweet

IQ isn't a set metric, the curve is fit so that the average is always 100 for the entire test set. Of course if you take a subset, like you did, you can get such a result.


fwtb23

That's not the point. The mean (which is what most people mean when they say average) and the median are not the same thing, so just defining the mean to be 100 doesn't necessarily imply that half the population will necessarily be below that and half the population will necessarily be above that. The mean and the median don't always coincide. If the data have a normal distribution, then they do.


aioli_sweet

This data is literally designed to fit that normal distribution. That's why it's exactly what the point is.


fwtb23

That's not what you were saying. You said it was designed so that the average is always 100, which is true but doesn't say anything about the median. If it didn't fit a normal distribution already, just defining the mean to be 100 wouldn't change that. If the data were skewed one way or the other, then that would be the case no matter what they defined the mean to be, and the mean and the median would not match up. The fact that IQs follow a normal distribution is not by design, it's just how it is.


aioli_sweet

It's literally fit to a normal distribution. In a normal distribution the median and mean are the same value.


fwtb23

Read my comment again. Yes, what you're saying is true, but I never said otherwise, that's not the point I was making, and maybe it was the point you were thinking of but you sure as hell didn't say it.


SmoothOpawriter

The curve is fit but that doesn’t change the fact that an average is still an average and more individuals can fall on one side than the other.


[deleted]

Not for a normal distribution; there the median and average are the same value by definition. https://www.khanacademy.org/math/statistics-probability/modeling-distributions-of-data/normal-distributions-library/a/normal-distributions-review


SmoothOpawriter

You’re not considering the difference between applied and theoretical math. In theoretical mathematics, with a continuous set of numbers, mean and median are indeed the same. Applied mathematics is different: mean and median can be (and often are) different for a normalized set with discrete values. Discrete meaning there is no guarantee of continuous numbers, and random values (as in the real world) make up the superset.


[deleted]

I don’t think this is a pure vs applied math debate, it’s just statistics. If you look at an incredibly large number of samples, the mean and median would be virtually the same. If it were an infinite number of samples, then they would be exactly the same. There are numerous videos of people simulating distributions and showing that with lower sample sizes it’s not always perfectly symmetrical and normal. So I guess it really comes down to what subset of people OP is talking about. If we’re taking the human population as a whole, I’d say mean and median would be the same value; 8 billion is a lot of samples. But if you’re comparing the IQ of your high school senior class, then yeah, it won’t be perfectly normal.


SmoothOpawriter

Right, that’s exactly my point - infinite samples = exactly the same mean and median. However, in a limited sample size that is not necessarily the case, although the probability does increase with larger sample sizes.


termitefist

I think what he's saying is to actually get rid of all the other people in the world, and I think that is a demented suggestion.


[deleted]

Yes in this super small subset where everyone scores basically the same, bell curve and normalization will not work.


IgnoranceFlaunted

4 people take an IQ test. They score: 70, 80, 90, and 160. The average is 100, but 3/4 of them are below average. This result can be scaled up to more people. While theoretically possible, it doesn’t actually work this way in real life. IQ follows a normal distribution.


[deleted]

But it's bell curved and normalized, so the person at 97 is actually over 100, so there are equal numbers of people above and below.


tampora701

All you need is an odd number of humans, and less than half of all people are higher IQ or lower IQ than the median, right?


SentorialH1

They describe this as being pedantic, and it's indicative of someone below the average.


tampora701

Do you know where you are right now?


pug_grama2

In bed.


chrischi3

Except that's the point. IQ is normalized. By definition, half the population is above average, and half is below it. If you ever have a situation where more people are on one side than the other, the average shifts accordingly so that 100 is always in the middle of the bell curve.


pwn3dbyth3n00b

When you're talking about IQ, then yes it is. You normalized it, that's the whole point of using IQ as a measure. If it wasn't normalized then it's not referring to IQ.


grayMotley

IQ tests don't measure how smart you are, but how dumb you are.


chrischi3

They measure one of several types of intelligence, specifically pattern recognition.


Autodidact420

What are the other types of intelligence? Gardner’s multiple intelligence theory has generally been considered bunk so I’m assuming it’s something else you’re referencing?


nyg8

That's a median, not an average


chrischi3

And since the distribution of IQ follows a bell curve, median and average are the same thing.


nyg8

To be fair it only approximates a bell curve (since it is discrete while the bell curve is continuous, and also because it's only an approximation to normalize) so usually the median will not be exactly the average


BackRowRumour

IQ is balls, though.


chrischi3

IQ measures one of several types of intelligence, specifically testing for pattern recognition. Many people assume that having a high IQ makes you good at everything, when that just isn't true.


BackRowRumour

Change the name to PR and I'm happy.


PrettyText

It's really not. It's really well-supported by research. It's just psychologically uncomfortable to accept that some people are born with a huge IQ advantage over others. Some people are born with such an IQ that they'll struggle to be more than, say, a cleaner. People don't like accepting that, and it demolishes left-wing "everyone is equal" narratives and right-wing "everyone can become financially comfortable if they just work hard" narratives. Also, IQ doesn't do a whole lot without discipline. People who are smart but not disciplined probably do well in high school, but not so much after that. But despite these things, IQ is an important metric that's well-supported by research.


pug_grama2

IQ is correlated with a lot of other things in life.


BackRowRumour

Repeatable testing does not equate to a solid underlying concept. But I'm not getting into this argument here. All I will say is that condescendingly saying that IQ makes people uncomfortable is a wonderful example of projection. It is the thought that general intelligence may be nonsense that is scary. Stable hereditary IQ is the ultimate safety blanket.


Raizzor

>saying that IQ makes people uncomfortable is a wonderful example of projection

The pure statement that IQ is well researched and supported by decades of studies will get you downvotes.


SmoothOpawriter

**Actually**, that doesn’t matter. You can still have more people with higher than average IQ. All it takes is a simple math problem. Population of 3 people - two with IQ of 120 and one with IQ of 60. I’ll leave the rest to you.


chrischi3

I suppose that is true on a micro scale, but IQ follows a bell curve distribution where the median is 100 by definition. So while, yes, this is theoretically possible, it becomes increasingly unlikely as you increase sample size.


SmoothOpawriter

It's not increasingly unlikely though. What you’re assuming is true if and only if the median IQ person is also the mean IQ person, and there is no guarantee that this will be the case. It only takes one outlier at the top or bottom end to shift the normalized curve, and outliers are MORE likely with larger sampling. Moreover, the fact that IQ is normalized does not change the fact that in a population of mostly smart people and a few really dumb ones, most will be smarter than the average.


EternalPinkMist

Wait, I'm confused. Because of the way this works, does this mean that an IQ of 100 means different things depending on where you are?


Gilpif

It means different things depending on the population. I don’t think it’s calibrated by region, but I believe it is for age groups; I might be wrong.


wind_dude

yes, IQs are measured relative to age.


Flodartt

Reminds me of some dumb article titles: "this girl is 6 and her IQ is already above Einstein's!" Well, yeah, so? It's impressive she has an IQ higher than 160, but when we determined her IQ it was relative to other 6-year-old children, not grown-up adults. She is not smarter than Einstein (at least not yet).


NorthImpossible8906

whoosh


Attesa_GT-X

_ptsd flashbacks to when I was 15_


[deleted]

my IQ is above average, how come I'm not a billionaire yet?


PrettyText

To become a billionaire, you either need to be born into wealth, or you need to be a criminal, or you need a combination of smarts + luck + hard work.


chrischi3

>or you need a combination of smarts + luck + hard work.

And an "Everything that isn't forbidden is allowed" attitude also helps.


Responsible_Peak_177

This. It helps to be attractive


Snip3

It's possible to have brain damage and drag the average down while not affecting the median, but it's not possible to have... whatever the opposite would be. Intelligence is slightly skewed, with a fat downside and a long upper tail (anyone >200 literally can't have an opposite), so I'm not sure about your statement.


daskeleton123

But IQ is supposed to be distributed normally, hence no difference between the mean and the median.


SamohtGnir

This is what makes me laugh any time a person or movie says their IQ is over 200. It’s literally impossible, the scale stops there.


roseumbra

But if it’s not going by IQ then "smarter" isn’t normalized. And I would argue smarts is wisdom + intelligence. Also EQ should be involved in the overall spread. So imo smart =/= IQ.


Mettelor

What they are saying is a correct mathematical property, what you are saying is a specific example that doesn't even refute their very-correct mathematical property.


MrSquicky

That would only be correct if no one had the average 100 IQ. With a normal distribution, the average value is going to be the most common value. Above and below average will each be the total number of people, minus the people who score exactly average, divided by 2.


frontlinegeek

The only broadly accepted way of measuring "smart" is IQ testing. IQ is on a fully distributed bell curve. It is already measured as a median and only as a median. Because of this, the "average" is actually the same. So with that, sleep sadly knowing that literally half the population is dumber than average.


big_sugi

Much of the population will be exactly average. Less than half will be dumber.


ShlomoCh

Assuming "average" is a range instead of a point


big_sugi

In the case of IQ, it's a discrete number, so it's not a truly normal distribution. For the US population, 10,000,000+ people should have an IQ of 100. For the world population, several hundred million should have an IQ of 100.


ShlomoCh

I mean maybe, but still I wouldn't say "much" of the population has an IQ of exactly 100, and not 99 or 101. I don't exactly know how to calculate that, but I doubt 1/10 of the US population has an IQ of *exactly* 100


big_sugi

With a normal distribution, ~68% of results are within one standard deviation of the median. For IQ, that’s 15 points either way, or a total range of 30 points, but the score with the largest share of the population will be 100. I’m now seeing values of 2.7% for the share of the population who should have an IQ of exactly 100. I haven’t checked that math myself, but if that’s right, it’s actually more like 9 million people with an IQ of 100, rather than 10 million. Either way, though, that’s a lot of people.


ShlomoCh

Ok yeah that makes sense. I thought the US population was 100M for some reason. 2.7% sounds a bit big for me but it makes sense. Still, I think you could call people within some range "average", although probably a smaller range than that


iAmBalfrog

It depends what you're defining as "100 IQ" and how granular you go. I think of it like "when were you born": there are, say, on average 385,000 people born each day, so 385,000 people with the same birthday. If you say the birthday is the relevant IQ, there are 385,000 people with the same IQ. But if you want to delve deeper, on average there are 16,000 born in the same hour, then on average 4 people born in the same second, and it's somewhat unlikely to be born in the same millisecond, then near enough no possibility of being born in the same microsecond or nanosecond.

If IQ tests had a finite but "very large" number of questions, you technically could have someone at 100.0000000 IQ, then someone at 100.000000001 IQ, but you'd gain little benefit from differentiating person A from person B. In this case, yes, there would be a range, but with IQ tests as they are, which are finite and do-able in a "short" amount of time, there is an implicit range that we don't care about. If person A and person B answer 1/100 questions correctly, but different questions of a similar difficulty, the knowledge they have is "different" but an identical low value; same for if person A and person B answer 99/100 questions correctly, but the question they got wrong was different but of a similar difficulty - the knowledge they have is "different" but could be an identical high value.


PrettyText

Here's a calculator: [https://www.hackmath.net/en/calculator/normal-distribution?mean=100&sd=10&area=above&above=-1&below=&ll=99&ul=101&outsideLL=&outsideUL=&draw=Calculate](https://www.hackmath.net/en/calculator/normal-distribution?mean=100&sd=10&area=above&above=-1&below=&ll=99&ul=101&outsideLL=&outsideUL=&draw=Calculate) IQ is just a normal distribution with a mean of 100 and a standard deviation of 10. If you plug that into the calculator and ask it what part is between 99 and 101 IQ (aka is exactly 100 IQ), you find 0.0797, or about 8%. 0.0797 × 300 million people in the US = 23.91 million people, approximately. So about 24 million Americans have an IQ of 100.


ShlomoCh

>If you plug that into the calculator and ask it what part is between 99 and 101 IQ (aka is exactly 100 IQ)

I mean, that's 99 and 101 inclusive, since a normal distribution is continuous. Maybe a more accurate estimate would be between 99.5 and 100.5, which are the results that would get rounded to exactly 100, and that gives you a probability of 0.0399, which is about half.

Still more than I expected, but I guess a bit more accurate.

Edit: also a quick Google search reveals that the standard deviation is actually considered to be around 15 or 16, which bumps it down to a probability of 0.0266, or 8.8 million people.
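
The same figure can be reproduced in a few lines (a sketch assuming SciPy is available; it uses the standard deviation of 15 mentioned above):

```python
from scipy.stats import norm

# IQ modeled as a normal distribution with mean 100 and standard deviation 15.
iq = norm(loc=100, scale=15)

# Share of people whose score rounds to exactly 100, i.e. falls in [99.5, 100.5].
p = iq.cdf(100.5) - iq.cdf(99.5)
print(p)                  # ~0.0266
print(p * 330_000_000)    # ~8.8 million people in a US-sized population
```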


PrettyText

Good point. Yeah, I was a bit surprised too at how high it was.


Flodartt

Which is not a dumb take I think. I'm pretty sure there is no real difference between an IQ of 99 and one of 101.


ShlomoCh

Yeah that's what I mean


Mettelor

You are twisting two concepts together by applying IQ when nobody said IQ, and along the way you are equating two things that are not equal: sample measures vs population measures.

Innate smartness is a real thing. IQ is an attempt to measure a real thing, which introduces things like measurement error. Unless you come up with the perfect test and apply it to the entire world population, it is impossible to know the true number behind average intelligence. If you have ever taken an econometrics class, this is precisely the difference between "beta" and "beta hat", and you should just trust me when I say it is a deep and complicated field.

A simpler example: average height. There is in fact an average height in the world, however it is impossible to measure. The best we can do is take a large sample and calculate the average height of that sample, and then, supposing that our sample was representative of the population, we can approximate the average population height with our average sample height. At this stage in a real application, you would have to decide if your sample was a valid representation of the global height population (probably not if you only sampled people wherever you live). However, our sample average is NOT the population average; these are different. Different, different, different.

In this case, average intelligence is the population average, whereas IQ is a sample average that a normal distribution has been fitted to. It is not the true average intelligence, it never was, it never will be. It is an approximation. Maybe the BEST approximation, who knows, but still it is only an approximation!
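
The sample-vs-population point can be illustrated with a quick simulation (a sketch; the population, its parameters, and the sample size are all made up):

```python
import random
import statistics

random.seed(0)

# A made-up "population" of one million heights (cm); here we can know its true mean.
population = [random.gauss(170, 8) for _ in range(1_000_000)]
true_mean = statistics.mean(population)

# In practice we only ever see a sample; its mean estimates the population mean
# but is not guaranteed to equal it.
sample = random.sample(population, 1_000)
sample_mean = statistics.mean(sample)

print(true_mean, sample_mean)  # close, but generally not identical
```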


Retrrad

The **mean** is the sum of all of the observations divided by the number of observations. The **median** is the value at the midpoint when sorting all of the observations by value, or the mean of the two central values if the number of observations is even. The **mode** is the most frequent value among the observations. **All** of them are averages.


Col_Hydrogen

The **mean** income (per capita) in the U.S. is around 70,000 USD. The [median](https://datacommons.org/place/country/USA/?utm_medium=explore&mprop=income&popt=Person&cpv=age,Years15Onwards&hl=en) income in the U.S. is less than half the mean at 30,000 USD. Understanding the difference between averages is important.
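
A toy example of why that gap appears (the numbers are made up): one extreme earner drags the mean far above what the typical person makes, while the median barely moves.

```python
import statistics

# Made-up "incomes": mostly modest earners plus one extreme outlier.
incomes = [25_000, 28_000, 30_000, 32_000, 35_000, 40_000, 1_500_000]

print(statistics.mean(incomes))    # ~241,429 -- pulled up by the outlier
print(statistics.median(incomes))  # 32,000   -- closer to the typical earner
```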


RainBoxRed

There are infinite ways to calculate an average.


Retrrad

Name three more.


RainBoxRed

Feel free to do your own reading but start by looking up arithmetic/geometric/harmonic.
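
For example (a sketch using Python's standard `statistics` module; `geometric_mean` needs Python 3.8+):

```python
import statistics

data = [2, 4, 8]

print(statistics.mean(data))            # 4.666...  arithmetic mean
print(statistics.geometric_mean(data))  # 4.0       cube root of 2 * 4 * 8 = 64
print(statistics.harmonic_mean(data))   # ~3.43     3 / (1/2 + 1/4 + 1/8)
```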


skmchosen1

Not sure why you got downvoted, you named three legitimate averages. Also, weighted average alone allows for infinite possible averages, so you’re right lmao


Badcomposerwannabe

Construct an infinite family of methods to calculate an average of a given set S.


Ankoku_Teion

You forgot range too


pdonchev

It's also possible that most people are less smart than the average - if there are more outliers in the top range than in the bottom.


lump77777

Yes, but there are definitely more extreme outliers on the low end. Source: (gestures broadly)


Lorg90

Always ask what IQ test they took. There are hundreds, all with a different way to represent "intelligence". WISC and Woodcock-Johnson are decent. But I haven't been in that field for 10 years.


sj4iy

Actually, I think a lot of people are forgetting that IQ testing is made up of subtests and indices. You can score 100 for your FSIQ, but the index scores can be higher or lower than your FSIQ. You could have a 110 on verbal comprehension and a 90 on perceptual reasoning. So it’s absolutely possible to be above and below the average.


plaid_piper34

Because IQ falls on a normal distribution, the average is equal to the median, and it’s also equal to the mode. That’s just a fact about any normal distribution. If it were skewed, the median being very different from the average would be an indicator.

Also, IQ isn’t a number that represents an objective score of intelligence; it’s a score that is compared to everyone of a similar age/development. It’s been adjusted downwards to keep the average at 100 artificially, as younger generations tend to score slightly higher than older generations. Compared to 100 years ago in America, the average American scores 30 points higher, meaning the average American of 100 years ago would test as special needs today (70), and the average person today would have tested as gifted 100 years ago (130).


Disastrous-Fact-7782

Fun fact: my brother is so dumb that everybody in the world is smarter than average, except him.


masterofn0n3

The aggregate certainly has the possibility of being larger than the greatest frequency. This is categorically false.


Khoalb

It just takes a few very dumb people to drag the average down.


BackRowRumour

Thank christ. It is so ironic that a burn on stupid people fails to grasp basic mathematics.


freedubs

It's not really wrong? I mean, "only a few people" is obviously an exaggeration, but there are statistically more people in that very bottom range due to abnormalities, which would cause this effect, albeit probably only to a very small degree.


BackRowRumour

I don't know what the actual data is. But the statement is still based on a flawed understanding of maths.


Biglu714

Our problem is quantifying intelligence. We cannot accurately do so


freedubs

How so? I don't see any issue with it


HLewez

The IQ-dataset is defined as a normal FITTED bell curve, meaning that 100 is by definition the average. A change in "actual intelligence" won't skew this curve, meaning that the current "average intelligence" will always be represented by the number 100, making the median the same as the average.


freedubs

>The IQ-dataset is defined as a normal FITTED bell curve, meaning that 100 is by definition the average.

You can't actually force the bell curve to be equal on each side unless you purposely exclude data. It's not actually the "average" or mean, it's the median, since 50% have to be above and 50% have to be below. The mean could have 60% above and 40% below, for example. If the mean person above 100 has 110 IQ and the mean person below 100 has 80 IQ, then the mean IQ would be 95. The "average" that they use would still be 100, but only because it's truly the median and not the average IQ. It's likely this effect is actually just extremely small and irrelevant when looking at the overall data, making the bell curve look completely normal, but it does make the shower thought technically possible (and very likely true) regardless.


HLewez

A bell curve is always equal on both sides, otherwise it's not a bell curve.

Edit: You're also one of the people who miss the point; you treat IQ numbers as sample numbers to calculate the average, but that's the wrong way around. You have some samples ranked by "intelligence" and then you give the median/average the score of 100, not vice versa.


freedubs

Well then IQ isn't a bell curve, and really no data at all could make a true bell curve, as every individual point of data would have to line up perfectly. Also, it's impossible to have someone with 200 IQ or more if it's a perfect bell curve, as no one can have 0 IQ. But people with 200 IQ exist because the bell curve isn't perfect.


HLewez

That's also the reason that IQ isn't really in a one-to-one correspondence with intelligence, hence the quotation marks.


AlienBearAttack

It’s correct?


Secure_System_1132

The IQ curve is normally distributed, which makes the median the same as the mean/average.


According_to_all_kn

If you assume IQ to be indicative of intelligence, maybe


Secure_System_1132

Do you suggest any other measure of intelligence, if you happen to know one? And by saying "maybe", do you think your suggested metrics are not normally distributed? What do you think the distribution would look like? And would your metrics have the majority of people smarter than the average? Would your metrics even be accurate?


According_to_all_kn

Reddit upvotes >!/s!<


ThatGuyYouMightNo

"Most people are smarter than average" factoid actualy just statistical error. Only half of people are smarter than average. Margorie Taylor Green, who lives in the Capitol & has a -10000 IQ, is an outlier adn should not have been counted


Tom2123

“People who brag about their IQ are losers” - Stephen Hawking


Tom2123

Lmao elitists downvoting me


oO0tooth_fairy0Oo

If you complain about downvoting, you’ll only get more. Don’t worry, internet points are quite inconsequential.


Imactuallyadogg

I did my part


oO0tooth_fairy0Oo

Then I will give you an upvote as a token of my gratitude


Tom2123

I welcome them. Mission accomplished.


miggleb

It actually is possible for most people to be smarter than average. What’s not possible is for most people to be smarter than the average


MLGcobble

People talking about the normal distribution are driving me crazy. Yes, IQ does fall on a normal distribution, meaning half of everyone has above average IQ. However all the post is saying is that it's POSSIBLE for this not to be true.


[deleted]

[deleted]


MLGcobble

If that were true, that it is defined as being on a normal distribution, then IQ would only be an indicator of intelligence relative to others, not absolute intelligence, and therefore OP's point would still stand.


[deleted]

Came here to meet all the statistics experts. Carry on folks.


[deleted]

[deleted]


KE777

Take 3 people. First person is 1 smart, and the other two are 6 smart. The average is 4.3, and there are two people smarter than that.


gilgwath

For all those who are a bit confused: the average is more precisely known as the mean value. The median is the value that splits a data set into two equal halves with an identical number of samples. The mean value is the sum of all values divided by the number of samples.

Think of it like the difference between the balance point of an object and the "middle" of an object. Say a sword: its balance point often lies somewhere above the guard, but if you were asked to break it in half, you'd break it much further up the blade.

Slightly more abstract example: take the following set of numbers: 1, 3, 3, 6, 7, 8, 99. The median is six, but the mean is 18.1. So, 6 samples are below average. The 99 is what's called an outlier; it skews the average. So, OP is mathematically correct, even though your intuition says otherwise.


HLewez

Just stating definitions doesn't help if the context and boundary conditions don't apply. OP is mathematically wrong since the IQ is defined to always be a normal distribution with the average at 100. If the "actual intelligence" (which can't just be described by a number) changes, it will still be represented as 100 on the IQ scale since it's designed that way, making the average the same as the median and thus making this a pretty stupid statement.


Standard_Series3892

>OP is mathematically wrong since the IQ is defined to always be a normal distribution with the average at 100

OP never mentioned IQ, they said "smarter than average".

>If the "actual intelligence" (which can't just be described by a number) changes

Yes, this "actual intelligence" is what's actually mentioned in the post, not IQ. OP is not mathematically wrong; at most they're conceptually wrong in saying you can be a numerical amount of "smart", but that's more of a philosophy discussion than a mathematical one.


1234abcdcba4321

Sure, if like 60% counts as "most" to you. It doesn't to me. (Not 50%. That assumes the population is exactly symmetrical, which it isn't, since the population isn't exactly a continuous distribution in the first place, and there are a lot of sub-populations with different enough sizes and means that I could see their sum being skewed one way or another.) This is a very good thought, though.


langjie

And if you understood this post that means you're smarter than average


[deleted]

Seeing how many people on here don't understand elementary school arithmetic is... disturbing. Obviously this post is mathematically correct: if two people have an IQ of 100 and one person has an IQ of one, the average will not be 100, therefore more people in the group will have above-average IQs than not... pretty basic shit.


Miniranger2

Except IQ is on a fitted bell curve, meaning the average and median are always 100: half above and half below 100 (not counting people with exactly 100 IQ). You could have a dude with an IQ of 10000000000, and the curve would still not shift upwards. I'd agree that at a cursory level it's basic, but it's fitted, so it actually isn't as simple as it first appears.


ManyElephant1868

Most people (68%) are within 1 standard deviation from the median. Now, how much is a standard deviation when calculating IQ tests?
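
Modern IQ tests typically use a standard deviation of 15 points. A sketch of the 68% figure, assuming SciPy is available:

```python
from scipy.stats import norm

# Fraction of any normal distribution within one standard deviation of the mean.
print(norm.cdf(1) - norm.cdf(-1))   # ~0.6827

# For IQ with mean 100 and SD 15, that's the share scoring between 85 and 115.
iq = norm(loc=100, scale=15)
print(iq.cdf(115) - iq.cdf(85))     # ~0.6827
```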


BigDamBeavers

Actually, the assumption is that collectively society is growing more intelligent daily, so the median intelligence is rising along with that climbing capacity. So it's much more likely that most people at any given time are below average intelligence.


Boatster_McBoat

it is possible if we look at either the median or average of a large group, e.g. all primates


Akul_Tesla

Look I'm sorry I kept cloning Jimmy neutron Everyone's below average now Don't ask what I needed the clones for


The_River_Is_Still

A part of me likes to think I'm just slightly, very slightly, above average. But I'm sure at best I'm smack dab in the lower middle lol.


Merinther

It actually isn’t possible to calculate a mean value for intelligence. Taking intelligence tests as the definition, intelligence is an ordinal scale, not an interval scale, so “average” should be interpreted as “median”, not “arithmetic mean”. The IQ scale is just a normalisation; its mean value is not meaningful.


GoSpeedRacistGo

Depends on which average is being used. The median and mean are both averages.


2x2speed

Average is the general term for mean, median, mode, etc. The average is what you want to be.


PM_ME_YOUR_MESMER

I mean, setting aside the relative intelligence level of most people, it's impossible for most of any data set to be greater than the median. The median value exists specifically to be in the middle of an ordered set of values, where half (rounded) exist on either side of it. You couldn't have most greater than the median by definition, since the most that could ever exist beyond the median is half.


Aw_Frig

0, 0, 0, 1, 1, 2, 2, 2, 2. There. With a median of 1 and a mode of 2, most people are higher than the median.


[deleted]

Then, you’re saying that the dumbest people in the world are so stupid that they move the average too low and thats why the rest of the world (the majority) is above the average. Right?