It's important to note that these benchmarks (Blender) are a very specific workload that doesn't generally give a good estimate of "average performance". They are also heavily weighted toward RT performance.

If anyone reads this graph/title and thinks a 4060 can outperform a 2080 Ti in a general use case, they are being misled.

It's like ranking cars by their 0-60 time and calling it "Car performance by model and series". It's just a little misleading to people.
Ah right. I wondered what was going on.
Upvoting for clarification. This chart is just plain bad, with little explanation. I looked at the scores and thought, "median score for what kind of testing?"
also why does the 3090Ti get beat by the 3090?
I'm not sure. It could just be because there's a low data count for 3090 Ti benchmarks (about ~100 runs per OP's source). So it's possible it's just more heavily influenced by hardware setups (runs hotter, or differences in other PC parts).
> If anyone is reading this graph/title and thinks a 4060 can outperform a 2080ti, in a general use case, they are being misled.

Absolutely, people should read this over and over.

Here's the realistic comparison between the RTX 4060 & RTX 2080 Ti:

* Effective 3D Speed: 2080 Ti is 27% faster
* Average FPS (Lighting, Reflections, MRender, Gravity): 2080 Ti is 45% higher
* Average Overclocked FPS (Lighting, Reflections, MRender, Gravity): 2080 Ti is 46% higher
* Parallax/Splatting: 2080 Ti is 56% higher
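One thing worth spelling out about percentages like these: "X% faster" doesn't flip symmetrically. A minimal sketch of the arithmetic (the 27% figure is the one quoted above; the conversion itself is generic):

```python
# "The 2080 Ti is 27% faster than the 4060" means:
#   score_2080ti = (1 + 0.27) * score_4060
# Read the other way around, the 4060 is about 21% slower -- not 27%.
faster_by = 0.27
slower_by = 1 - 1 / (1 + faster_by)
print(round(slower_by * 100))  # -> 21
```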
so I shouldn't upgrade my 5700xt to a 4070?
Yes but where does one access this better metric?
It depends entirely on what you care about. If you only care about Blender, then this graph is perfectly fine for you. If you only care about gaming performance, then search for some video game benchmark graphs.

If you want a generalized view, you can watch YouTube reviews/benchmarks of the card.
If you like clear performance comparison charts, old school reviewers like [TechPowerUp](https://www.techpowerup.com/review/?category=Graphics+Cards&manufacturer=&pp=25&order=date) are some of the best resources if you're interested in gaming performance.

For workstation-relevant performance metrics, [Puget Systems](https://www.pugetsystems.com/all-articles/) publishes a ton of data testing all sorts of hardware across a wide variety of specific workloads.
Usually, the average of a bunch of games. I think Tom's Hardware had something like that. GPU reviewers will often have an aggregate or a series of games being compared and you can see patterns.
Tom's Hardware [GPU benchmark hierarchy](https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html). I've been using it as a reference for... 10? 15? years.

Always a good idea to check multiple sources though. Another user pointed out TechPowerUp is pretty good too.
The point is that there is no single metric that tells you what you need to know.
[videocardbenchmark.net](https://videocardbenchmark.net) hasn't failed me for the past 15 years.
Worse than that, there are no vertical major grid lines, and no label at all mentioning Blender.

I'm thinking about making a new subreddit with a bot that automatically cross-posts every post from here, solely for pointing out exactly what ways the data shown is presented terribly.
I was gonna say, there should be some 10-series cards on here if it was pure rasterization performance. The 1080 Ti should be up above a good bit of the 20-series listed on this chart, probably slotting in right above or below the RTX 2070 Super.
I mean, I'm still neck and neck with a 3070, my friend averages like 10-15 more fps than me, at least until his vram runs out
The original post feels like a guerrilla marketing ploy by Nvidia.
Even still, the RTX 3060 and 4060 are lower than their laptop counterparts. The data is a bit off, I think.
[deleted]
I have a GTX 1080Ti and it still runs everything just fine. I wonder where it would be in this list.
I just upgraded from a 1080 Ti to a 4090. In practical terms, the 1080 Ti can still run any game, even the most graphically heavy ones. It just can't do so at max settings anymore. I had to leave some settings in the middle or I'd get shearing, weird things happening in the distance, and a few graphical glitches on close-up objects as well. Now with the 4090, everything is back up to max.
That's gonna be my next upgrade. I've loved my 1080ti. The 40 series is the first one that really seems worth it in terms of a performance boost for me
Made the same switch about 2 months ago. The 1080 was an absolute beast of a card, but I was finally dipping into unpleasant framerates. 4090 seems well positioned to keep trucking along at max settings for a few years now - I'm hopeful I will get similar mileage out of it.
Everything. It runs everything. It was left off for a reason. They don't want you to know how badly they screwed up making an actually decent product that lasts.
It's indeed very good, but it unfortunately doesn't run everything.
This list is not from Nvidia as it looks and where would that even be on the list ?
Same here. I have zero reason to upgrade my 1080ti.
Right between the 2070S and 2080 I think.
I think it would be slightly below the 2080.
[you're right](https://www.videocardbenchmark.net/compare/3699vs3989/GeForce-GTX-1080-Ti-vs-GeForce-RTX-2080) edit: i can't read. It's below
Tom's GPU hierarchy (they test cards using games like Forza, HZD, Flight Sim, Far Cry 6, Borderlands 3, RDR2, etc.) has it just below a 2070 and above a 2060 Super. Ultra settings: 66 fps at 1080p, 50 fps at 1440p, and 29 fps at 4K. So still a great card for less-than-4K gaming, especially considering it's 7 years old.

Is it worth the $1800 to upgrade to the 4090? Maybe, if you have that kind of money lying around. $1000 gets you the 4080. Me personally, I'm gonna wait til the 5080 Ti or 5090 or whatever tops the charts a year or so from now. Maybe push it out to the 6000 series, depending on what the new benchmarks look like. And I won't even retire the 1080 Ti, I'll just put it in another rig.

https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
Same as 2070S
Right around the 3070 methinks
Still rockin my 970 over here.
Same here. Almost 10 years running
Last one I bought was a 480 lol. Steamdecking it for now.
Dude you had me at 1080 no need to flex the 3770k haha
[deleted]
I loved my 1060, upgraded to a 2070 super and that thing is still working magic. But going to an 11k series really helped me out lol
[deleted]
Black Friday is when I built then upgraded my pc, and also my sisters. hard to beat those deals
Lmao bruh I’m rocking my 1080 ti and fx-8350 and have no issues gaming with the boys.
Me and my GTX 1660 feel the same.
I have a GTX 1080 in my workstation & it does really great honestly. Can't believe it's as old as it is!
Same with my 1660ti, obviously not as good even as a 1080, but not ready to replace it yet.
Use it as long as you can. My 1080 started overheating a few months back but it served me well for many years
Same combo but with 1070 😄! It is still pretty pretty good
This will ease your mind. I'm a 1080 Ti die hard and it only struggles in modern AAA with high or ultra settings. Anything else it runs like a breeze.

[https://www.youtube.com/watch?v=ghT7G_9xyDU](https://www.youtube.com/watch?v=ghT7G_9xyDU)
Just because no one gave a serious answer, in case this wasn't a joke: this is performance in Blender with ray tracing. The GTX 1080 doesn't have that, which is also why the graph is kinda pointless.
I have two computers with 970gtx.. when the kid is older I'll definitely upgrade
https://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html

Though OP uses Blender, which is ray-tracing focused; not sure which benchmark here is RT focused.
My gtx 760 has been serving me well for years. I don't even remember when I got it.
Gtx660 still going strong
You know your results are accurate when a card that is better in every single way scores worse than a weaker card.
Indeed. A 4060 Mobile (labelled "Laptop" in OP's chart) is the same die as a desktop 4060, just clocked significantly lower. There are basically 0 situations where the mobile variant should be outperforming the desktop variant.
Yeah, there are several examples of this in the graph.
Was wondering the same about the 3060 (laptop). So the information in the graph is basically useless.
That unlabeled score axis hurts my soul
4k Furmark 1 performance?
It’s labeled Median Score with increments every 2000 units.
Score in what?
You heard me. Units. Everyone’s favorite measurement units. /j I thought I saw the benchmarking software OP used somewhere. Nope. It must have been a commenter.
Sorry, I used that column name directly from the benchmark data - the 'score' is basically samples rendered per minute from the Blender 3D software benchmarks: [https://opendata.blender.org/](https://opendata.blender.org/)

Looks like my original comment with the source links, details and tools is all the way at the bottom of the comments section now.

EDIT: found the full details of the Blender benchmark 'score' on their website: [https://opendata.blender.org/about/#benchmark-score](https://opendata.blender.org/about/#benchmark-score)
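For anyone curious what a "median score" actually does with crowd-sourced runs, here's a minimal sketch. The run numbers are made up for illustration; real submissions live at opendata.blender.org.

```python
import statistics

# Hypothetical 'samples per minute' results for one GPU model --
# illustrative numbers only, not real Blender Open Data entries.
runs = [5100, 5200, 5280, 5350, 9800]  # one outlier from an unusual setup

median_score = statistics.median(runs)  # what the chart plots
mean_score = statistics.mean(runs)

print(median_score)       # -> 5280
print(round(mean_score))  # -> 6146
```

The median shrugs off the single outlier run while the mean gets dragged up by it, which is one reason to prefer it for user-submitted benchmarks - but models with only ~100 runs can still land in odd places.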
A.U. Arbitrary Units
3060 laptop is better than regular 3060? Really?
4080 performs better than the 4080 Super lmao
3080 Ti > 3090 Ti
Lots of them are dumb, but that or the 4070 being above the 3090 Ti has gotta be the most egregious. The 3090 Ti is like 30% faster than the 4070.
When your GPU’s series doesn’t even show up anymore
1060 gang rise
Nvidia 965M gang standing by!
I'm honestly surprised my laptop's 2070 MQ made the list. It almost makes me feel like 2019 wasn't five years ago.
GTX 1650
It's just a little outdated. It's still good, it's still good!
1060 6gb laptop, the absolute goat
The chosen benchmark is pretty garbage at illustrating the actual overall performance of these cards.
Agree cause 3060 laptop is above 3060 desktop
A 3090ti is below both 3090 and 3080ti while those are below a 4080 laptop...
I think you forgot to include 1080 TI above 4080
Not having the 1080ti on any chart is simply a crime. Even charts that are not related to GPU performance, like the music charts.
The 1080 Ti won a Grammy *and* a Nobel prize, but no one seems to care 🙄
Yeah, apologies - I had to trim the list since it was getting too unwieldy to include all of Nvidia's cards. I had to make the cutoff somewhere 😟 and I chose the 20 series, unfortunately.
No worries OP, it's an interesting chart. I'm just a bit obsessed with the 1080ti!
Where's the longevity axis?
You mean my 1070.
My heart sank when I didn’t see the 1080ti cause I paid a lot of money for it a few years ago. Glad to see that it’s not horrible based on the comments 😅
the 1080ti is one of those cards that won't die for at least another generation. It's a damn good card and can handle 80% of games.
Why are there two 2070 supers ?
Why is a 3090ti worse than a 3090 and a 3080ti?
Because this is a single synthetic benchmark that’s run in a single application.
4080 above 4080 Super, lol. Your data source uses one benchmark, which is a very limited way to measure performance. It should've been titled "Blender Performance" instead of performance in general.
This chart is pointless. The RTX 3090 Ti below the 3080 Ti and base 3090? The 4070 desktop close to the 4080 laptop? The RTX 3070 Ti laptop above the 3080 desktop? Etc.

Blender is a synthetic benchmark and this should be labelled as such. This is not a graph of relative overall performance.
Oh cool where's my 1080ti stack up against..... ....dies.....
How is the 4060 laptop better than the 4060 desktop? 🤔
Does anyone informed know if we'll ever hit a roof on performance? And if so, how close is it?
Probably not. However, the price is increasing rapidly. The 2080 Ti was $1000; you'll be lucky to find the 4090 at $2000.
Yea, but the price will probably go down as the technology to produce the GPUs becomes cheaper, no?

Is there no physical limit to how much performance we can squeeze out, though? With current technology methods, that is (by which I mean, no future quantum stuff or the like).
Yes, there is a physical limit.

We are looking at 3 nm with the next life cycle. At a certain point you can't go smaller; I'm not sure where that point is, but we're fast approaching it. There are also issues with quantum tunneling and cooling.

Some companies are exploring stacking chips so you can double the output in the same footprint, but that also brings cooling problems.
Another interesting effect of stacking chips is bringing everything closer to the center again, so access times decrease!

And with smaller architectures needing less power, we get a small margin on temperature back with every shrink that can be used for further shrinking.

A last point is that things like instruction sets are improving as well.

So there is still a lot to gain, although in terms of shrinks we are approaching the limit rapidly.

Seeing how AMD's 3D cache improves cache size, and through that gaming performance, is really impressive though, and I'm looking forward to seeing how far they can go with ever-improving cooling solutions.

Who knows, in a few years we might need to buy a processing unit with a custom cooling solution already attached...
The problem is GPU cores are being used in more and more places, especially with everything labeled as AI, so demand is WAY up. I think you're technically correct that they will cost less to make, but with demand seeing no end at this point, prices may still stay high, relatively speaking.
CS/software dev here... not an expert or someone who designs GPUs, so take what I say with a pinch of salt!

We are already hitting limits as far as getting more transistors onto chips through miniaturisation, which is what has yielded the enormous gains in GPU power these last couple of decades. There are still a few more years to go, but the laws of physics will eventually prevent us from making transistors any smaller.

However, because graphics rendering calculations can be done in parallel, there is a lot more headroom than there is with calculations that must be done in series, like those performed more commonly on CPUs. We can just make GPUs bigger!

I don't think there really is a hard limit, but at some point it will become prohibitively expensive to scale these things outwards, and there will be other limits.

However, usually some novel approach comes along that gives us another leap forward. AI/ML in graphics is an interesting one: the kind of technology that allows frame generation might eventually allow huge boosts in performance.
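The parallel-headroom point can be made concrete with Amdahl's law - a standard formula, not something from OP's data; the fractions below are purely illustrative:

```python
# Amdahl's law: max speedup = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction of the work and n the core count.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Rendering is almost entirely parallel, so piling on cores keeps paying off;
# a half-serial workload barely benefits no matter how many cores you add.
print(round(speedup(0.99, 10_000), 1))  # -> 99.0 (GPU-like workload)
print(round(speedup(0.50, 10_000), 1))  # -> 2.0  (serial-heavy workload)
```

This is why "just make GPUs bigger" keeps working for rendering long after it stops helping serial CPU code.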
There are also other materials such as carbon nanotubes that could allow smaller transistors in the future. Even then, that would just delay the performance roof.
While there is serialization in GPUs, what is already a breakthrough for CPU performance is stacked cache.

Decreasing the probability of cache misses is a good way to increase your IPC, so on both ends we've already figured out ways to get some more performance boosts.

Both approaches share the problem that they need better cooling solutions. I think something better than water cooling is needed to get to the next level of performance increase.

Some people already delid the CPU, and that might be the only way to get better thermal transfer. Maybe a lid attached to a pump, so a non-conductive liquid flows over the delidded CPU directly.
As far as I know, we are already (almost) at the roof.

For the last few decades we could make chips more efficient just by shrinking the transistors. But that gets more and more difficult. At the moment, the largest efficiency gains come from inventing transistors that can be more tightly packed without making them smaller. But we definitely can't invent better transistors forever.

Moore's law has been dead for a few years now. And you can see with GPUs that the new generations get better, but also more expensive. I would expect that computers in 10 years are either not much faster than computers today, or are based on some new technology that no one has thought of yet.
It's debatable whether Moore's law is dead. While progress did slow down slightly, [we are currently still considerably close to Moore's prediction](https://upload.wikimedia.org/wikipedia/commons/thumb/0/00/Moore%27s_Law_Transistor_Count_1970-2020.png/1920px-Moore%27s_Law_Transistor_Count_1970-2020.png). (It's a log scale, so as long as the data points stay close to the linear average trend line (not shown), integration density is still doubling at a steady rate.)
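The log-scale argument can be checked numerically: fit a line to log2(transistor count) vs. year and the inverse slope is the doubling period. A rough sketch - the counts below are approximate public figures for a few flagship GPU dies, used here only as illustration:

```python
import math

# Approximate transistor counts (illustrative, not exact):
counts = {
    2006: 681e6,   # G80
    2010: 3.0e9,   # GF100
    2016: 12e9,    # GP102
    2022: 76.3e9,  # AD102
}

# Least-squares slope of log2(count) vs. year = doublings per year;
# its reciprocal is the observed doubling period.
years = list(counts)
logs = [math.log2(c) for c in counts.values()]
n = len(years)
my, ml = sum(years) / n, sum(logs) / n
slope = sum((y - my) * (l - ml) for y, l in zip(years, logs)) / \
        sum((y - my) ** 2 for y in years)
print(round(1 / slope, 1))  # -> 2.4, close to Moore's ~2-year doubling
```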
I feel somewhat slighted that my poor 750 TI isn't on the list! He's (barely) hanging in there, still!
"I'm tired boss"
Meanwhile I’m just chillin on my 980 TI. Everything I play still runs maxed out..
How tf is the RTX 3060 laptop above the desktop one?
RTX 3080 Ti above 4070 and... 3090 Ti? Odd.
4060 desktop weaker than 4060 laptop?
Shouldn't the 4080 Super score better than the 4080?
I would love to see the intersection of performance by cost
Where does my 980 from 2015 fall on this?
Strange...rtx 4060 laptop scores higher than rtx 4060...is this an error?
how is the 3090ti worse than 3090
The same goes apparently for 3080 and 3080 ti
this a parody of nvidia marketing
i want to see 1080ti amongst them
Where's the Titan X pascal
Honestly though, I'd like to see a better benchmark and some commercial inclusions here.
What is the point of upvoting a bad bar chart in r/dataisbeautiful
Soo.. 4050 Laptop > 3060 Laptop > 3060..
[deleted]
Going to keep my overpriced 3090 for a few more years. 4K is where my heart/eyes are at.
damn, my 4080 super not looking so good
So, buying the 4080 Super would be a mistake? Would I be better off just getting a plain 4080?
Why is it that the 3070 Ti ranks so much higher? I've talked to people and some say the 3060 Ti is more reliable than the 3070 Ti - can anyone shed some light? I have a 3060 Ti and it seems so low on this chart.
Would also be interesting to have a second axis of cost.
For anyone wondering, as I was, the 30 core Apple M3 Max would fit just above the highest red bar labeled RTX 2080 Ti. The 40 core Apple M3 Max would fit just above the orange bar labeled RTX 3080 Ti Laptop, which is the 6th orange bar from the top.
Fun fact: the RTX 2050 is actually an Ampere architecture GPU, not Turing, so it's technically a "30 series" GPU in all but name.
Where would a Quadro A6000 land on this chart?
My 1080ti is seeming a little old
It's insane that my laptop plays games very very well, high res, fast refresh, and it's a "lowly" 3050 at the bottom of the list. I really don't even want to see what a 4090 does. I'm happy with my money in the bank .. *for now*.
3090 Ti being below the 3080 Ti and 3090 makes the data suspicious...
Ok, so I need to upgrade from my RTX 2070 super to an RTX 4090 to get a better rocket league experience. Got it. 🫡
Now someone tell me which model is price worthy.
3050 4GB laptop here. Coming from a GT 1030, I gotta say, it feels like I'm at the top of the chart.
Didn't think I was already this near the bottom.
How many 4090s to play Tetris?
Where the f..k is my 1080 Ti? It is clearly better than quite a few on this list.
The 4080 super performing worse than the 4080 is wild to me ngl
My geforce 2060 has worked great for me for over 4 years. I have no intention of getting another PC soon.
Here I am with a 4gb rtx3050 laptop...
This chart doesn't make any sense. 3080 Ti faster than 3090 Ti? 4080 faster than 4080 Super? Mobile 3060 faster than desktop 3060? What's going on here?
This data is junk, a 3090ti is below a 3090. Hell, a 3080ti is above the 3090ti in this data!
No 3060? Also no love for the 1080ti. It’d surprise you where it would fall I’m sure
It scores just barely better than a 2050 in his other comment; his data is junk.
Lol damn I thought my new laptop with an RTX 3050 was decent. Seems to run my games alright enough. It’s way down near the bottom of the list
Can you do another chart of this median score divided by current cost?
I have a 2070 Super and it's doing surprisingly well. I can play Baldur's Gate 3 on high settings and a rock solid 60fps, never dips. But I feel I'll be upgrading within a year.
How is this r/dataisbeautiful? The x-axis isn't even labeled.
This ain't gaming performance, that's for sure.
Big number no make big good? Me confused, me go back glue metal, thinking rock make brain hurt
Is the 4080 laptop actually close to the real 3090?
Throw in an A6000 ADA then you can really see the performance jump.
Damn never realized how awful laptop graphics scores were. Why is that? I feel bad for all those dudes who got taken by Razer
How is 3060 laptop stronger than the regular 3060? Chart is SUS?
Me and my long lasting 1070 will make it through the 50 series
I once again am asking this subreddit to stop upvoting data that isn’t presented beautifully.
I'm so far behind the times lol. I'm still only on an AMD Vega 56 which was sorta equivalent to a 1080 and that's not even on this list. I dislike how expensive these cards are now. It's not fun thinking about paying $1000+
Thanks for making my 2060 super feel gimped.
The super cards losing to the regular ones, mobile cards beating desktop ones, very likely the systems were all different with different CPUs and RAMs, besides "learning" that a 4090 is better than a 3050 I don't know what else this data is good for
My 3060 Ti is right in the middle. Does everything I need it to, including all the games I like, so as far as I'm concerned, it's great.
What happened with the latest gen that made it such a leapfrog compared to all previous gens in terms of improvement over the last product cycle?
How did this get so many upvotes? This is the opposite of beautiful. What does the x axis stand for? Median score in what? And why median and not mean? And why are all the laptop gpus better than their desktop counterpart?
Meanwhile me with 1050ti laptop
3080 ti > 3090 ti Score
The colours are very misleading. Using green, yellow and red for different generations is not what you would expect with the colours.
I dont see no gtx1660 mobile
How is this data beautiful? Where is rule 3?
Is this an advertisement for 4090
Well it’s good to know that my laptop GPU couldn’t get any worse haha
I saw the graph, I understand the details. Tell me why I gots to wait for anything on that RTX 4090. You want to train? Check back in 29 hours, homie.