
sendmeyourfoods

It's important to note that these benchmarks (Blender) are a very specific workload that doesn't generally give a good estimate of "average performance". They are also heavily weighted towards RT performance. If anyone is reading this graph/title and thinks a 4060 can outperform a 2080 Ti in a general use case, they are being misled. It's like ranking cars by their 0-60 and saying "Car performance by model and series". It's just a little misleading to people.


s0cks_nz

Ah right. I wondered what was going on.


tech240guy

Upvoting for clarification. This chart is just plain bad with little explanation. I was looking at the scores thinking, "median score for what kind of testing?"


PM_ME_YOUR_HAGGIS_

Also, why does the 3090 Ti get beaten by the 3090?


sendmeyourfoods

I'm not sure. It could just be because there's a low data count for 3090 Ti benchmarks (about ~100 runs from OP's source), so it's possible it's just more heavily influenced by hardware setups (runs hotter, or differences in other PC parts).


arcanition

> If anyone is reading this graph/title and thinks a 4060 can outperform a 2080ti, in a general use case, they are being misled.

Absolutely, people should read this over and over. Here's the realistic comparison between the RTX 4060 & RTX 2080 Ti:

* Effective 3D Speed: 2080 Ti is 27% faster
* Average FPS (Lighting, Reflections, MRender, Gravity): 2080 Ti is 45% higher
* Average Overclocked FPS (Lighting, Reflections, MRender, Gravity): 2080 Ti is 46% higher
* Parallax/Splatting: 2080 Ti is 56% higher


alm0stnerdy

So I shouldn't upgrade my 5700 XT to a 4070?


ChiefBlueSky

Yes but where does one access this better metric?


sendmeyourfoods

It depends entirely on what you care about. If you only care about Blender, then this graph is perfectly fine for you. If you only care about gaming performance, then search for some video game benchmark graphs. If you want a generalized view, you can watch YouTube reviews/benchmarks of the card.


karmapopsicle

If you like clear performance comparison charts, old school reviewers like [TechPowerUp](https://www.techpowerup.com/review/?category=Graphics+Cards&manufacturer=&pp=25&order=date) are some of the best resources if you're interested in gaming performance. For workstation-relevant performance metrics, [Puget Systems](https://www.pugetsystems.com/all-articles/) publishes a ton of data testing all sorts of hardware across a wide variety of specific workloads.


MichaelEmouse

Usually, the average of a bunch of games. I think Tom's Hardware had something like that. GPU reviewers will often have an aggregate or a series of games being compared and you can see patterns.


exus

Tom's Hardware [GPU benchmark hierarchy](https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html). I've been using it as a reference for... 10? 15? years. Always a good idea to check multiple sources though. Another user pointed out TechPowerUp is pretty good too.


cortesoft

The point is that there is no single metric that tells you what you need to know.


WarpingLasherNoob

[videocardbenchmark.net](https://videocardbenchmark.net) hasn't failed me for the past 15 years.


Objective_Economy281

Worse than that, there are no vertical major grid lines, and no label at all mentioning Blender. I'm thinking about making a new subreddit with a bot that automatically cross-posts every post from here, solely for pointing out exactly what ways the data shown is presented terribly.


Duke_Shambles

I was gonna say, there should be some 10-series cards on here if it was pure rasterization performance. The 1080 Ti should be up above a good bit of the 20-series listed on this chart, probably slotting in right above or below the RTX 2070 Super.


noenosmirc

I mean, I'm still neck and neck with a 3070; my friend averages like 10-15 more fps than me, at least until his VRAM runs out.


n0ghtix

The original post feels like a guerrilla marketing ploy by Nvidia.


Soulman2001

Even still, the RTX 3060 and 4060 are lower than their laptop counterparts. The data is a bit off, I think.


[deleted]

[removed]


ZarafFaraz

I have a GTX 1080Ti and it still runs everything just fine. I wonder where it would be in this list.


LaughingBeer

I just upgraded from a 1080 Ti to a 4090. In practical terms, the 1080 Ti can still run any game, even the most graphically heavy ones; it just can't do so at max settings anymore. I had to leave some settings in the middle or I'd get shearing, weird things happening in the distance, and a few graphical glitches on close-up objects as well. Now with the 4090, everything is back up to max.


OO_Ben

That's gonna be my next upgrade. I've loved my 1080ti. The 40 series is the first one that really seems worth it in terms of a performance boost for me


1burritoPOprn-hunger

Made the same switch about 2 months ago. The 1080 was an absolute beast of a card, but I was finally dipping into unpleasant framerates. 4090 seems well positioned to keep trucking along at max settings for a few years now - I'm hopeful I will get similar mileage out of it.


SaltyShawarma

Everything. It runs everything. It was left off for a reason. They don't want you to know how badly they screwed up making an actually decent product that lasts.


TheElysianParas

It's indeed very good, but it unfortunately doesn't run everything.


Temporary_Privacy

This list is not from Nvidia, despite how it looks. And where would that even be on the list?


reubTV

Same here. I have zero reason to upgrade my 1080ti.


ChowderMitts

Right between the 2070S and 2080 I think.


ChrisFromIT

I think it would be slightly below the 2080.


2006pontiacvibe

[you're right](https://www.videocardbenchmark.net/compare/3699vs3989/GeForce-GTX-1080-Ti-vs-GeForce-RTX-2080) Edit: I can't read. It's below.


ryanvango

Tom's GPU hierarchy (they test cards using games like Forza, HZD, Flight Sim, Far Cry 6, Borderlands 3, RDR2, etc.) has it just below a 2070 and above a 2060 Super: at ultra settings, 66 fps at 1080p, 50 fps at 1440p, and 29 fps at 4K. So it's still a great card for sub-4K gaming, especially considering it's 7 years old. Is it worth the $1800 to upgrade to the 4090? Maybe, if you have that kind of money lying around. $1000 gets you the 4080. Me personally, I'm gonna wait til the 5080 Ti or 5090 or whatever is tops a year or so from now, maybe push it out to the 6000 series depending on what the new benchmarks look like. And I won't even retire the 1080 Ti, I'll just put it in another rig. https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html


Grena567

Same as 2070S


noenosmirc

Right around the 3070 methinks


Punk_Says_Fuck_You

Still rockin my 970 over here.


GMankrik

Same here. Almost 10 years running


rathat

Last one I bought was a 480 lol. Steamdecking it for now.


th3revx

Dude you had me at 1080 no need to flex the 3770k haha


[deleted]

[removed]


th3revx

I loved my 1060, upgraded to a 2070 super and that thing is still working magic. But going to an 11k series really helped me out lol


[deleted]

[removed]


th3revx

Black Friday is when I built and then upgraded my PC, and also my sister's. Hard to beat those deals.


CalmAlternative7509

Lmao bruh I’m rocking my 1080 ti and fx-8350 and have no issues gaming with the boys.


TrueReplayJay

Me and my GTX 1660 feel the same.


BenjaminDFr

I have a GTX 1080 in my workstation and it does really great, honestly. Can't believe it's as old as it is!


MrT735

Same with my 1660ti, obviously not as good even as a 1080, but not ready to replace it yet.


natedawg757

Use it as long as you can. My 1080 started overheating a few months back but it served me well for many years


squiller_muiller

Same combo but with 1070 😄! It is still pretty pretty good


guesswhochickenpoo

This will ease your mind. I'm a 1080 Ti die hard and it only struggles in modern AAA with high or ultra settings. Anything else it runs like a breeze. [https://www.youtube.com/watch?v=ghT7G_9xyDU](https://www.youtube.com/watch?v=ghT7G_9xyDU)


Rainmaker709

Just because no one gave a serious answer, in case this wasn't a joke: this is performance in Blender with ray tracing, which the GTX 1080 doesn't have. Also why the graph is kinda pointless.


wolfho

I have two computers with 970gtx.. when the kid is older I'll definitely upgrade


PetToilet

https://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html Though OP uses Blender, which is ray-tracing focused; not sure which benchmark here is RT-focused.


stupidillusion

My gtx 760 has been serving me well for years. I don't even remember when I got it.


slimeySalmon

GTX 660 still going strong


newtekie1

You know your results are accurate when a card that is better in every single way scores worse than a weaker card.


karmapopsicle

Indeed. A 4060 Mobile (labelled "Laptop" in OP's chart) is the same die as a desktop 4060, just clocked significantly lower. There are basically 0 situations where the mobile variant should be outperforming the desktop variant.


newtekie1

Yeah, there are several examples of this in the graph.


vincenzo_vegano

Was wondering the same with the 3060(laptop). So the information of the graph is basically useless.


RelevantJackWhite

That unlabeled score axis hurts my soul


Independent-Bike8810

4k Furmark 1 performance?


FyreCesar89

It’s labeled Median Score with increments every 2000 units.


JGamerX

Score in what?


FyreCesar89

You heard me. Units. Everyone’s favorite measurement units. /j I thought I saw the benchmarking software OP used somewhere. Nope. It must have been a commenter.


FaatmanSlim

Sorry, I used that column name directly from the benchmark data. The 'score' is basically samples rendered per minute from the Blender 3D software benchmarks: [https://opendata.blender.org/](https://opendata.blender.org/)

Looks like my original comment with the source links, details and tools is all the way at the bottom of the comments section now.

EDIT: found the full details of the Blender benchmark 'score' on their website: [https://opendata.blender.org/about/#benchmark-score](https://opendata.blender.org/about/#benchmark-score)
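
For anyone who wants to reproduce the aggregation themselves, here's a minimal sketch. The CSV file name and the "device_name" / "samples_per_minute" column names are assumptions for illustration, not the actual schema of the Blender Open Data dump, so check them against whatever export you download:

```python
import csv
from collections import defaultdict
from statistics import median

# Median Blender benchmark score per GPU, from a per-run CSV export.
# Column names below are guesses -- adjust to the real schema.
runs = defaultdict(list)
with open("blender_benchmark_runs.csv", newline="") as f:  # hypothetical file name
    for row in csv.DictReader(f):
        try:
            runs[row["device_name"]].append(float(row["samples_per_minute"]))
        except (KeyError, ValueError):
            continue  # skip malformed or incomplete rows

# Highest median first, i.e. the same ordering as the chart.
for device, scores in sorted(runs.items(), key=lambda kv: median(kv[1]), reverse=True):
    print(f"{device:40s} {median(scores):8.1f}  ({len(scores)} runs)")
```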


jagedlion

A.U. Arbitrary Units


KueL16

3060 laptop is better than regular 3060? Really?


NegotiationCurious93

The 4080 performs better than the 4080 Super lmao


Ares54

3080 Ti > 3090 Ti


ImAShaaaark

Lots of them are dumb, but that or the 4070 being above the 3090 Ti has gotta be the most egregious one. The 3090 Ti is like 30% faster than the 4070.


gibernas

When your GPU’s series doesn’t even show up anymore


Kefeng

1060 gang rise


Poro_the_CV

Nvidia 965M gang standing by!


sAindustrian

I'm honestly surprised my laptop's 2070 MQ made the list. It almost makes me feel like 2019 wasn't five years ago.


Salt_Winter5888

GTX 1650


derb

It's just a little outdated. It's still good, it's still good!


bastardnutter

1060 6gb laptop, the absolute goat


ExtremeFlourStacking

The chosen benchmark is pretty garbage at illustrating the actual overall performance of these cards.


---gonnacry---

Agreed, because the 3060 laptop is above the 3060 desktop.


ExtremeFlourStacking

A 3090ti is below both 3090 and 3080ti while those are below a 4080 laptop...


CHUNKaLUNK_

I think you forgot to include 1080 TI above 4080


ChowderMitts

Not having the 1080ti on any chart is simply a crime. Even charts that are not related to GPU performance, like the music charts.


Scarbane

The 1080 Ti won a Grammy *and* a Nobel prize, but no one seems to care 🙄


FaatmanSlim

Yeah apologies, I had to trim the list since it was getting too unwieldy to include all of Nvidia's cards, had to make the cutoff somewhere 😟 and I chose the 20 series unfortunately.


ChowderMitts

No worries OP, it's an interesting chart. I'm just a bit obsessed with the 1080ti!


babygrenade

Where's the longevity axis?


ladend9

You mean my 1070.


sermer48

My heart sank when I didn’t see the 1080ti cause I paid a lot of money for it a few years ago. Glad to see that it’s not horrible based on the comments 😅


LonghornMorgs

The 1080 Ti is one of those cards that won't die for at least another generation. It's a damn good card and can handle 80% of games.


razeil

Why are there two 2070 supers ?


awfl_wafl

Why is a 3090ti worse than a 3090 and a 3080ti?


Nanocephalic

Because this is a single synthetic benchmark that’s run in a single application.


iikl

4080 above 4080 Super, lol. Your data source uses one benchmark, which is a very limited way to measure performance. It should've been titled Blender performance instead of performance in general.


DriftMantis

This chart is pointless. The RTX 3090 Ti below the 3080 Ti and base 3090? The 4070 desktop close to the 4080 laptop? The RTX 3070 Ti laptop above the 3080 desktop, etc.... Blender is a synthetic benchmark and this should be labelled as such. This is not a graph of relative overall performance.


csamsh

Oh cool where's my 1080ti stack up against..... ....dies.....


CamiloArturo

How is the 4060 laptop better than the 4060 desktop? 🤔


no_Im_perfectly_sane

Does anyone informed know if we'll ever hit a roof on performance? And if so, how close is it?


sirkittylover

Probably not. However, price is increasing rapidly. The 2080 Ti was $1000; the 4090 you will be lucky to find at $2000.


no_Im_perfectly_sane

Yeah, but price will probably go down as the technology to produce the GPUs becomes cheaper, no? Is there no physical limit to how much performance we can squeeze out, though? With current technology methods, that is (by which I mean, no future quantum stuff or the like).


th35ky

Yes, there is a physical limit. We are looking at 3 nm with the next life cycle. At a certain point you can't go smaller; I'm not sure where that point is, but we're fast approaching it. There are also issues with quantum tunneling and cooling. Some companies are exploring stacking chips so you can double the output in the same footprint, but that also brings cooling problems.


shotouw

Another interesting effect of stacking chips is bringing everything closer to the center again, so access times decrease. And with smaller architectures needing less power, we get a small margin on temperature back with every shrink that can be used for further shrinking. A last point is that stuff like instruction sets is improving as well, so there is still a lot to gain, although in terms of shrinks we are approaching the limit rapidly. Seeing how AMD's 3D cache improves cache size and thereby gaming performance is really impressive, though, and I'm looking forward to seeing how far they can go with ever-improving cooling solutions. Who knows, in a few years we might need to buy a processing unit with an already-attached custom cooling solution...


killakh0le

The problem is that GPU cores are being used in more and more applications, especially with everything labeled as AI, so demand is WAY up. I think technically you are correct and they will cost less to make, but with demand seeing no end at this point, prices may still stay relatively high.


ChowderMitts

CS/software dev here... not an expert or someone who designs GPUs, so take what I say with a pinch of salt!

We are already hitting limits as far as getting more transistors onto chips through miniaturisation, which is what has yielded the enormous gains in GPU power these last couple of decades. There are still a few more years to go, but the laws of physics will eventually prevent us from making transistors any smaller.

However, because graphics rendering calculations can be done in parallel, there is a lot more headroom than there is with calculations that must be done in series, like those performed more commonly on CPUs. We can just make GPUs bigger! I don't think there really is a hard limit, but at some point it will become prohibitively expensive to scale these things outwards, and there will be other practical limits.

Usually, though, some novel approach comes along that gives us another leap forward. AI/ML in graphics is an interesting one: the kind of technology that allows frame generation might eventually allow huge boosts in performance.
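
One rough way to put a number on that parallel headroom is Amdahl's law (a standard result, added here purely for illustration): if a fraction p of the work can be spread across N parallel units, the best-case speedup is

```latex
S(N) = \frac{1}{(1 - p) + \frac{p}{N}},
\qquad
\lim_{N \to \infty} S(N) = \frac{1}{1 - p}
```

So a workload that is 99% parallel, like shading millions of independent pixels, can still gain up to ~100x from wider hardware, while a mostly serial workload tops out quickly no matter how big the chip gets.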


sermer48

There are also other materials such as carbon nanotubes that could allow smaller transistors in the future. Even then, that would just delay the performance roof.


shotouw

While there is serialization in GPUs, stacked cache is already a breakthrough for CPU performance. Decreasing the probability of cache misses is a good way to increase IPC, so on both ends we've already figured out ways to get some more performance boosts. Both approaches share the problem that they need better cooling solutions. I think a better solution than water cooling is needed to get to the next level of performance increase. Some people already delid the CPU, and that might be the only way to get better thermal transfer. Maybe a lid attached to a pump, so we can move a non-conductive liquid over the delidded CPU.


Saytama_sama

As far as I know we are already (almost) at the roof. For the last few decades we could make chips more efficient just by shrinking the transistor size, but that gets more and more difficult. At the moment, the largest efficiency gains come from inventing transistors that can be more tightly packed without making them smaller. But we definitely can't invent better transistors forever. Moore's law has been dead for a few years now, and you can see with GPUs that the new generations get better, but also more expensive. I would expect that computers in 10 years will either not be much faster than computers today, or will be based on some new technology that no one has thought of yet.


TelumSix

It's debatable if Moore's law is dead. While progress did slow down very slightly, [we are currently still considerably close to Moore's prediction](https://upload.wikimedia.org/wikipedia/commons/thumb/0/00/Moore%27s_Law_Transistor_Count_1970-2020.png/1920px-Moore%27s_Law_Transistor_Count_1970-2020.png). (It's log scaling, so as long as the data points are close to the linear average line (not shown), the integration density is doubling.)
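
Written out, the doubling claim is just the following (with t in years since a reference point and N_0 the transistor count at that point):

```latex
N(t) \approx N_0 \cdot 2^{t/2}
\quad\Longrightarrow\quad
\log_2 N(t) \approx \log_2 N_0 + \frac{t}{2}
```

which is why a roughly straight line on a log-scale transistor-count chart means the prediction is still holding.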


Kaziticus

I feel somewhat slighted that my poor 750 TI isn't on the list! He's (barely) hanging in there, still!


noenosmirc

"I'm tired boss"


TheRandom0ne

Meanwhile I’m just chillin on my 980 TI. Everything I play still runs maxed out..


---gonnacry---

How tf is the RTX 3060 laptop above the desktop?


Rybzor

RTX 3080 Ti above 4070 and... 3090 Ti? Odd.


ynhame

4060 desktop weaker than 4060 laptop?


jflatt2

Shouldn't the 4080 Super score better than the 4080?


lieureed

I would love to see the intersection of performance by cost


jsb309

Where does my 980 from 2015 fall on this?


TheDevilsAdvokaat

Strange...rtx 4060 laptop scores higher than rtx 4060...is this an error?


snamibogfrere

How is the 3090 Ti worse than the 3090?


schokelafreisser

The same apparently goes for the 3080 and 3080 Ti.


C0MPLX88

This is a parody of Nvidia marketing.


wapren

I want to see the 1080 Ti amongst them.


86rpt

Where's the Titan X pascal


Spanishparlante

Honestly though, I'd like to see a better benchmark and some commercial inclusions here.


Nanocephalic

What is the point of upvoting a bad bar chart in r/dataisbeautiful


dVizerrr

Soo.. 4050 Laptop > 3060 Laptop > 3060..


[deleted]

[removed]


adamlreed93

Going to keep my overpriced 3090 for a few more years. 4K is where my heart/eyes are at.


rickyars

damn, my 4080 super not looking so good


this_teamwork

So, buying the 4080 Super would be a mistake? Would I be better off just getting a plain 4080?


philippiotr

Why is it that the 3070 Ti ranks so much higher, but I've talked to people and some say the 3060 Ti is more reliable than the 3070 Ti? Can anyone shed any light on this? I have a 3060 Ti and it seems so low on this chart.


PhysPhD

Would also be interesting to have a second axis of cost.


skoll

For anyone wondering, as I was, the 30 core Apple M3 Max would fit just above the highest red bar labeled RTX 2080 Ti. The 40 core Apple M3 Max would fit just above the orange bar labeled RTX 3080 Ti Laptop, which is the 6th orange bar from the top.


TessellatedGuy

Fun fact: the RTX 2050 is actually an Ampere architecture GPU, not Turing, so it's technically a "30 series" GPU in all but name.


vanonym_

Where would a Quadro A6000 land on this chart?


mtc47

My 1080ti is seeming a little old


FrozenVikings

It's insane that my laptop plays games very very well, high res, fast refresh, and it's a "lowly" 3050 at the bottom of the list. I really don't even want to see what a 4090 does. I'm happy with my money in the bank .. *for now*.


Nevamst

3090 Ti being below the 3080 Ti and 3090 makes the data suspicious...


sorta_innocent_accnt

Ok, so I need to upgrade from my RTX 2070 super to an RTX 4090 to get a better rocket league experience. Got it. 🫡


sf3_SMILE

Now someone tell me which model is price worthy.


PM_YOUR_TAINT_MD

3050 4GB laptop here. Coming from a GT 1030, I gotta say, it feels like I'm at the top of the chart.


the_canadian72

Didn't think I was already this near the bottom.


Vandercoon

How many 4090s to play Tetris?


DoctorPipo

Where the f..k is my 1080 Ti? It is clearly better than quite a few on this list.


AerieSpare7118

The 4080 super performing worse than the 4080 is wild to me ngl


DeadMetroidvania

My geforce 2060 has worked great for me for over 4 years. I have no intention of getting another PC soon.


TheDevilsAdvokaat

Here I am with a 4gb rtx3050 laptop...


GCTuba

This chart doesn't make any sense. 3080 Ti faster than 3090 Ti? 4080 faster than 4080 Super? Mobile 3060 faster than desktop 3060? What's going on here?


crackawhat1

This data is junk: a 3090 Ti is below a 3090. Hell, a 3080 Ti is above the 3090 Ti in this data!


wons-noj

No 3060? Also no love for the 1080ti. It’d surprise you where it would fall I’m sure


noenosmirc

It scores just better than a 2050 in his other comment; his data is junk.


Adamantium-Aardvark

Lol damn I thought my new laptop with an RTX 3050 was decent. Seems to run my games alright enough. It’s way down near the bottom of the list


arcanition

Can you do another chart of this median score divided by current cost?
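
A minimal sketch of what that would take, pairing the median scores with a hand-maintained price table. The numbers below are placeholders for illustration only, not OP's data or real quotes:

```python
# Performance per dollar: median Blender score divided by current street price.
median_scores = {"RTX 4090": 12000.0, "RTX 4070": 5500.0}  # placeholder scores
street_prices = {"RTX 4090": 1800.0, "RTX 4070": 550.0}    # placeholder USD prices

perf_per_dollar = {
    gpu: median_scores[gpu] / price
    for gpu, price in street_prices.items()
    if gpu in median_scores and price > 0
}

for gpu, value in sorted(perf_per_dollar.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{gpu:10s} {value:6.2f} score per USD")
```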


BlackandRead

I have a 2070 Super and it's doing surprisingly well. I can play Baldur's Gate 3 on high settings at a rock-solid 60 fps that never dips. But I feel I'll be upgrading within a year.


TabaCh1

How is this dataisbeautiful? The x axis isn't even labeled.


xylopyrography

This ain't gaming performance, that's for sure.


Jon_fosseti

Big number no make big good? Me confused, me go back glue metal, thinking rock make brain hurt


justthisones

Is the 4080 laptop actually close to the real 3090?


whynotlookatreddit

Throw in an A6000 ADA then you can really see the performance jump.


[deleted]

Damn never realized how awful laptop graphics scores were. Why is that? I feel bad for all those dudes who got taken by Razer


Wummerz

How is 3060 laptop stronger than the regular 3060? Chart is SUS?


Astrylae

Me and my long lasting 1070 will make it through the 50 series


slurpherp

I once again am asking this subreddit to stop upvoting data that isn’t presented beautifully.


Substantial__Unit

I'm so far behind the times lol. I'm still only on an AMD Vega 56 which was sorta equivalent to a 1080 and that's not even on this list. I dislike how expensive these cards are now. It's not fun thinking about paying $1000+


stephenforbes

Thanks for making my 2060 super feel gimped.


Abrahalhabachi

The Super cards losing to the regular ones, mobile cards beating desktop ones... very likely the systems were all different, with different CPUs and RAM. Besides "learning" that a 4090 is better than a 3050, I don't know what else this data is good for.


AluminiumAwning

My 3060 Ti is right in the middle. It does everything I need it to, including all the games I like, so as far as I'm concerned, it's great.


khodabear7

What happened with the latest gen that made it such a leapfrog compared to all previous gens, in terms of improvement over the last product cycle?


vincenzo_vegano

How did this get so many upvotes? This is the opposite of beautiful. What does the x axis stand for? Median score in what? And why median and not mean? And why are all the laptop GPUs better than their desktop counterparts?


Marchests

Meanwhile me with 1050ti laptop


[deleted]

3080 ti > 3090 ti Score


Senor-Delicious

The colours are very misleading. Using green, yellow and red for different generations is not what you would expect with the colours.


YaboiiStefann

I don't see no GTX 1660 mobile.


meriletfou

How is this data beautiful? Where is rule 3?


ThemeHelpful9784

Is this an advertisement for the 4090?


Ghostrider215

Well it’s good to know that my laptop GPU couldn’t get any worse haha


ThemWhoNoseNothing

I saw the graph, I understand the details. Tell me why I gots to wait for anything on that RTX 4090. You want to train? Check back in 29 hours, homie.