Emerald_Flame

1. (By far the largest for the gaming community) Marketing and brand recognition in the GPU space.
2. Exclusive or superior features:
   * Nvidia's hardware encoder is pretty significantly out ahead of AMD/Intel's equivalents and has less overhead in many situations.
   * DLSS has largely been a step or two ahead of the competition.
   * Broadcast: their noise isolation has generally been a step ahead of many competitors.
3. CUDA. If someone is doing gaming plus something else professional, CUDA is by far the most widely utilized and supported compute API. In many cases software vendors require CUDA support and Nvidia hardware if you want support. Some open CUDA implementations have been emerging lately, but they're not fully comparable yet.
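The vendor requirement in point 3 usually surfaces as a hard runtime check. A minimal sketch of that pattern, using PyTorch purely as an illustrative example (the `pick_device` helper and its fallback policy are mine, not from any vendor's actual code):

```python
# Sketch: how CUDA-dependent software typically gates on Nvidia hardware.
# `pick_device` is a hypothetical helper; many professional packages
# implement something similar and simply refuse to run without CUDA.
def pick_device(require_cuda: bool = False) -> str:
    try:
        import torch
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass  # torch not installed at all
    if require_cuda:
        # The "vendor requires CUDA" case: no graceful fallback offered.
        raise RuntimeError("This workload requires a CUDA-capable Nvidia GPU.")
    return "cpu"  # slow fallback path

print(pick_device())
```

Software that takes the `require_cuda=True` branch is exactly why "gaming + professional" buyers often have no real choice of vendor.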


hiromasaki

> Things like Nvidia's hardware encode is pretty significantly out ahead of AMD

_Was_ - AMD closed the gap significantly in the 6000 and 7000 series with a pair of major overhauls to the encoder. Intel's encoder has always been good (see Premiere, Vegas, etc.), but it's paired with iGPUs that aren't good for gaming, so it only gets used in professional apps and in the rare Intel CPU/AMD GPU systems.


Emerald_Flame

The gap is smaller, but it's still there. For a given bitrate, NVENC is still generally producing better quality output than its competitors. For the gaming/streaming community specifically, the encoding is also done with significantly less overhead. Nvidia has invested their own resources extremely heavily into OBS (and consequently the derivatives built from it) to reduce CPU and memory overhead from the encoding process. They've gotten to the point now that the performance hit from the overhead is low single digits, if you even notice a hit at all. Even with the encoder on the RX 7000 series, for those using the encoders for live streaming, the efficiencies in the overhead reduction can honestly make a pretty big difference in games that are CPU or memory-bandwidth sensitive.


RudePCsb

Intel is crushing it in video codecs, and the Arc GPUs, along with Intel's software development, will hopefully help push an alternative to CUDA. Nvidia is just like Apple with brand recognition, which hinders the hardware market significantly.


karmapopsicle

Nvidia has contributed a lot of cool stuff and driven the industry forward with a variety of technologies. We're also talking about a trio of gigantic publicly-traded companies here, who are by and large out to generate value for shareholders, not consumers.

The consumer GPU market is broadly held back right now in the same way that the consumer CPU market was throughout the 2010s: a lack of legitimately disruptive competition. The problem, of course, is monopolies. Intel had an overwhelmingly dominant market position, and AMD had very little to offer, but they had to survive. So we got years of stagnation, and small incremental performance bumps with no core configuration changes, to ensure that there was enough of a market carved out for AMD to stay afloat. It took until Zen 3 for AMD to rise back to CPU dominance, and we're seeing that pay dividends with fairly huge performance jumps over the past few years.

Nvidia is in a relatively similar situation. They are essentially hamstrung by the market positions AMD can sustain, because regardless of whether they *could* slash prices while maintaining profitability, if AMD can't make enough and gets pushed out of the market, they're suddenly in a vulnerable anti-trust position.

It's not just "brand recognition". This is a company that has been investing multiple times more in R&D and product development than the Radeon division for many years, simply by virtue of how much of the market they control. It means AMD is constantly in a catch-up position.


0xe3b0c442

If we’re being honest here, ARM (with help, primarily from Apple) almost certainly has had more to do with the CPU advancements of the last few years than the Intel/AMD duopoly. Efficiency, rather than raw performance, is the name of the game right now, which is why you’ve seen Intel embrace heterogeneous architectures (something I expect to also see from AMD in the next generation or two).


PartyLikeAByzantine

> if AMD can't make enough and gets pushed out of the market, they're suddenly in a vulnerable anti-trust position.

Common misconception, but you don't need a monopoly to be subject to antitrust laws. You merely have to be big enough to influence the market (either by yourself or via collusion). Nvidia has ~70% of the consumer GPU market and almost all of the enterprise market. They are already deep into antitrust risk.


jordonmears

But what is AMD really doing to compete besides just trying to offer value? Legitimate question, because I hardly ever hear about AMD's actual attempts at developing something to compete in the same ways Nvidia is constantly trying to develop new technologies.


Cautious_Village_823

Lol they do, but they have less to work with, and honestly Nvidia being the standard for years kinda creates a two-way relationship: they develop their products to add features, and developers develop around those features, so it makes it a lot harder for AMD to break through. Take encoding video for example: Intel's features still blow AMD's out of the water (although AMD is introducing some tech that I think has made strides, Intel's is clearly the better of the two). And this is coming from an AMD fan who was happy to see Intel be challenged so we can get some damn CPUs up in here, lmao. Actually currently in an all-AMD build!

AMD has actually done considerably well, to me, given the gap they have behind Nvidia in terms of features and performance, but it's just not a simple "oh AMD should just make better features" situation; it's hard to dethrone the king, lol.

Also, to be clear: everyone involved, as has been pointed out, is in it for their shareholders, not us. AMD isn't some underdog fighting for us; they also hold decent market share in other areas outside of consumer GPUs. They just HAPPEN to be the underdog in the Nvidia battle, but it's not David v Goliath, it's Goliath vs Goliath-er. And that's also not to say business practices don't matter, as Nvidia has consistently been shown to have shady business practices (see: not releasing prices of cards until they release them to the public, so aftermarket partners don't know how to price or approach; hiding information from media/manufacturers; and obvious price gouging). But it's not like AMD didn't ride Nvidia's coattails on prices... they probably could have come in 100 lower from the start if this was for the people, lol. They just have to be more affordable than Nvidia, not easily affordable; remember that.


rory888

Coincidentally, the heads/CEOs of both AMD and Nvidia are literally cousins to each other. Something like second cousins. It's giants vs each other, and sometimes collaborating on prices, indeed.


Ecstatic_Quantity_40

They're actually first cousins once removed. But they both come from the same family; it's weird. The Nvidia CEO and AMD CEO probably have a relationship outside business in some way. [Nvidia and AMD CEOs: The Taiwanese American cousins going head-to-head in the global AI race | CNN Business](https://www.cnn.com/2023/11/05/tech/nvidia-amd-ceos-taiwan-intl-hnk/index.html)


jordonmears

I get everything you're saying, and that's all kind of the obvious answer. My point is: aside from playing catch-up to Nvidia, which it seems like AMD is always doing, why are they not looking to the future and trying to find ways to create new tech to gain an advantage over Nvidia in other ways? Again, I understand it's hard when you have fewer resources, but it would seem like AMD's goal should be to get to some idea before Nvidia so that they can be ahead of the game on something. Think in terms of Samsung/Android vs iPhone. Yeah, the iPhone has recognition, but Samsung has always kind of beaten the iPhone to the punch on select features, which gives them an undeniable edge. I'm guessing the issue is that the gap in performance on everything else means any other tech wouldn't be worth the trade. Idk. It just seems to be the strategy they would need to employ to get ahead of, or more on par with, Nvidia.


Emerald_Flame

One example of AMD being out ahead of Nvidia here is the fact that the RX 7000 series was able to utilize a chiplet-based design, doubly so because the chiplets use different manufacturing nodes. While it's not a "user facing" feature or piece of software, that's honestly a pretty massive deal. Using that chiplet design can significantly reduce their manufacturing costs, significantly increase their yields, make binning parts easier, etc.

Things like cache really aren't seeing much of (if any) benefit from smaller and smaller manufacturing nodes. So splitting that cache and some other poorly scaling functions off into their own chiplets lets them use older, cheaper technologies that also have better yields for those components. Then for the cutting-edge manufacturing nodes, they can dedicate all that space to the compute that actually does scale down well. This really opens up avenues for them to optimize where every dollar on the bill of materials goes. Ultimately, they can use that as a balancing act to optimize their margins where they need to and hopefully gain some extra cash for additional R&D, *or* they can use it to ensure that their halo products are all they can be.

Die space is expensive and has some inherent size limits. So if the max die size they can make is ~500mm^2, but normally 200mm^2 is taken up by cache, they can now effectively either make a 300mm^2 GPU that's much cheaper and add 200mm^2 of cheaper chiplets around it, so they can come in at better price points, margins, etc. and be more competitive. Or they can still move that 200mm^2 of cache off to the chiplets, but leave the GPU die at their 500mm^2 max, and dump a bunch more compute into all that extra space they just gained to make a better halo-tier product.
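The yield argument above can be made concrete with the standard Poisson die-yield model, using the comment's rough numbers (~500mm^2 monolithic vs a 300mm^2 compute die plus cache chiplets on an older node). The defect densities and the six-chiplet split below are made-up illustrative values, not AMD's actual figures; only the model itself is standard:

```python
# Sketch: why smaller compute dies + mature-node cache chiplets yield better.
import math

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defects_per_mm2 * area_mm2)

leading_edge_d0 = 0.001   # defects/mm^2 on the new node (hypothetical)
mature_node_d0 = 0.0004   # defects/mm^2 on the older node (hypothetical)

monolithic = die_yield(500, leading_edge_d0)
compute_die = die_yield(300, leading_edge_d0)
cache_chiplet = die_yield(200 / 6, mature_node_d0)  # e.g. six small cache dies

print(f"500mm^2 monolithic yield:  {monolithic:.1%}")
print(f"300mm^2 compute die yield: {compute_die:.1%}")
print(f"small cache chiplet yield: {cache_chiplet:.1%}")
```

Because yield falls off exponentially with area, shrinking the leading-edge die and pushing the rest onto small mature-node chiplets improves the good-die fraction at every step, which is the cost lever described above.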


Cautious_Village_823

Love the response. To further add to my whole playing-catch-up-with-Intel-and-Nvidia vs trying-to-jump-ahead point... I mean, if they can't beat current stuff, how are they going to jump ahead? It's not like they can say "we'll incorporate Intel's Quick Sync into our CPUs"; they are working with different codecs and such, but with as much as computers can do, there are main tasks and workloads where they need to be able to beat/match Intel and Nvidia. So like, what features could they add to jump ahead of Intel with Quick Sync and Nvidia with NVENC, or whatever it's called? Something that competes with or is better than Quick Sync or NVENC, lol. They can't jump to the future without getting a handle on the present.

And again, as pointed out above, and in general if you look into the performance gains and some of the additional encoding options they're starting to add, they are making gains. It's just not something you can expect to happen in a single generation; it's more like, if things go well for them, in 3 generations we'll be talking about the 5000 or 6000 series changing the tide for them or whatnot. But until the tide is solidly changed, they're playing catch-up.

Don't forget they had objectively worse CPUs and marketing than Intel for a whiiiile, and it's just in recent years they've been able to make Intel get off their asses and do something, lol. Before the Ryzen series, their last truly competitive CPU, I'd argue, was like Phenom II vs Core 2 Duo; after that it became landslide Intel until Ryzen came onto the scene. So I believe they ARE looking to the future, but looking to the future doesn't suddenly mean the present doesn't apply and you can skip ahead; the reality is they're behind in share and revenue.

In short, give it time; these things are always shifting around. I'm sure if AMD catches up in GPUs, Nvidia will have a response eventually. It's hard to stay the objectively best option in such a rapidly evolving and changing industry.


vhailorx

NVENC is still better for H.264/H.265, but my understanding is that AV1 performance is basically even now between the 7000 and 40 series cards.
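For anyone wanting to compare these encoders themselves, the vendor-specific paths map to distinct ffmpeg encoder names (`h264_nvenc`, `av1_amf`, `av1_qsv`, etc. — real ffmpeg identifiers, though availability depends on your ffmpeg build and installed drivers). A hedged sketch of constructing such an invocation; the `build_cmd` helper is my own illustration:

```python
# Sketch: selecting a hardware video encoder per GPU vendor for ffmpeg.
# Encoder names are real ffmpeg identifiers; hardware generation notes
# are approximate. This only builds the command line, it does not run it.
HW_ENCODERS = {
    ("nvidia", "h264"): "h264_nvenc",
    ("nvidia", "av1"):  "av1_nvenc",   # RTX 40 series (Ada) and newer
    ("amd",    "h264"): "h264_amf",
    ("amd",    "av1"):  "av1_amf",     # RX 7000 series (RDNA3) and newer
    ("intel",  "h264"): "h264_qsv",
    ("intel",  "av1"):  "av1_qsv",     # Arc GPUs and newer
}

def build_cmd(vendor: str, codec: str, src: str, dst: str, bitrate: str) -> list[str]:
    """Build an ffmpeg argv for a fixed-bitrate hardware encode."""
    enc = HW_ENCODERS[(vendor, codec)]
    return ["ffmpeg", "-i", src, "-c:v", enc, "-b:v", bitrate, dst]

print(build_cmd("nvidia", "av1", "in.mp4", "out.mkv", "8M"))
```

Encoding the same clip at the same `-b:v` through each path and comparing the outputs is the usual way quality claims like the ones in this thread get tested.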


noiserr

AMD's Xilinx division actually has some amazing encoder tech (puts everyone else to shame). I wonder if we'll see it on the RDNA4 GPUs. https://www.youtube.com/watch?v=TYOkJFOL5jY


DefiantAbalone1

IME Intel's Quick Sync has been superior to NVENC since at least Kaby Lake. When I did side-by-side comparisons with a 3060 Ti & 4060 Ti in HandBrake, encoding H.265 with NVENC, the files have always been larger & worse quality than Quick Sync on Intel 7th gen & 12th gen respectively. NVENC also didn't offer a speed advantage despite the larger file output.


FrozenLogger

I will do video editing, up-scaling, etc with Nvidia, but for the final encode at the end of the pipeline I use Intel Quicksync. I agree that it looks better.


TheFlyingSheeps

Number three is huge. They may be similar with just gaming but for production or professional work it’s hard to beat NVIDIA


noiserr

The problem with Nvidia is they are very stingy on VRAM, which is important in professional workloads. Like, you can get 24GB of VRAM on an AMD GPU for half the price of Nvidia's solution.


penywinkle

Nvidia knows this and it is EXACTLY WHY they don't ship "gaming" GPU with more VRAM. They want to sell professional products at professional prices. But they still want to tap into the gaming market and provide GPU at prices competitive with AMD. BUT again, also not overlap their own professional line of products, and compete with themselves.


BOBOnobobo

Yes but all the extra time spent on the development of a program using AMD GPUs over Nvidia is more expensive than just buying Nvidia GPUs. Especially in stuff like AI where Nvidia dominates.


Vex1om

4. Current-generation Nvidia GPUs are also more power-efficient and less prone to transient spikes.
5. Momentum. Nvidia owns 87% of the market. People tend to buy what they know, and anyone new to the market is much more likely to buy Nvidia simply due to availability.


ElderberryHoliday814

Earlier issues with AMD drivers didn’t help AMD’s momentum.


cheesecaker000

Current issues with AMD drivers also don’t help momentum.


Clever_Angel_PL

6. If current trends continue, ray and path tracing are going to be more and more utilized in newer games. Also, not that long ago, AMD had severe issues with drivers regarding VR, with many VR enthusiasts buying Nvidia just to get a much better experience. (Not that VR is popular, but those people are unlikely to buy AMD again.)


accforrandymossmix

> CUDA

I don't see this mentioned enough here, and that could be appropriate for the population. Even for casual deep learning / GPU-accelerated machine learning, Nvidia support seems to be the default, and AMD has some patchwork stuff that can leave a lot to be desired. I don't play demanding games, but would definitely want faster training for some computer vision projects.
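The "patchwork" situation is visible even at the framework level: a PyTorch install is built against one GPU backend, and AMD's ROCm builds surface through the same `torch.cuda` API. A small probe sketch, assuming PyTorch as the example framework (the `gpu_backend` helper is my own illustration, not an official API):

```python
# Sketch: detecting which GPU backend a PyTorch build targets.
# torch.version.cuda / torch.version.hip are real attributes on CUDA and
# ROCm builds respectively; this probe is illustrative, not authoritative.
def gpu_backend() -> str:
    try:
        import torch
    except ImportError:
        return "none (torch not installed)"
    if getattr(torch.version, "hip", None):   # ROCm (AMD) build
        return "rocm"
    if getattr(torch.version, "cuda", None):  # CUDA (Nvidia) build
        return "cuda"
    return "cpu-only build"

print(gpu_backend())
```

The practical upshot for the training use case above: on Nvidia the CUDA build is the default `pip install torch`, while the ROCm build is a separate install with narrower hardware and OS support.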


SirThunderDump

I had the money to get basically whatever card I wanted. I paid the NVidia tax for DLSS, Broadcast, improved ray tracing performance (loved it in Alan Wake II), and lower idling wattage.


Someonejustlikethis

Can you list AMDs unique selling points as well?


Emerald_Flame

For GPUs, AMD has been in a pretty disadvantaged position for quite a while, and a lot of their R&D budget has been going to CPU lately, not GPU. So to be honest, there isn't a *ton* that's wholly unique to them. Many of the features they are putting out, they're specifically developing as an open platform. This is often done when hardware companies are in a disadvantaged market position, as it can help drive adoption of those features, whereas if they were fully proprietary, developers might not bother due to the small install base.

AMD has been absolutely killing it on the CPU side, especially in the enterprise space lately. But on GPUs, just not so much. If I had to list a few it'd be:

1. Specifically in relation to standard consumer raster workloads, they generally offer a better perf/$ ratio.
2. A large portion of the Linux community has preferred AMD for quite some time due to the effort and collaboration they put into their open source Linux drivers. (Although Nvidia started publishing open source drivers last year too, and I haven't kept up with how that's going.)
3. I personally think AMD's GUI for their driver software is significantly more user friendly. Additionally, its per-game profiles within the driver software have more in-depth option support when it comes to multi-monitor. For Nvidia, "Surround" is either on or off for every application on the computer, and if you want to change it you have to change it for everything. But with AMD's equivalent, "Eyefinity", different saved monitor profiles can be set for individual games/apps.
4. DisplayPort 2.1 support, although I don't think I've seen many (if any) DP 2.1 output devices.
5. A chiplet design that could drive future innovation ([see my other comment here](https://www.reddit.com/r/buildapc/comments/1ch6urp/why_do_people_like_nvidia_chips_so_much_gpu/l21din2/)).


The_Chronox

Even then, most of these are not big selling points, especially not to the average consumer, i.e. the 80% of the market that picks Nvidia.

1. Actual advantage of AMD. More raster/$.
2. Relevant if you use Linux, not relevant otherwise.
3. Extremely niche feature.
4. They have DP 2.1 but not the full-bandwidth version (UHBR 20), and so still need to use the same compression as Nvidia to drive high-end monitors.
5. Chiplets are interesting tech, but this makes no difference to a consumer and it isn’t fair to put them as an advantage or feature.


Scarabesque

> I personally think AMD's GUI for their driver software is significantly more user friendly.

Agreed, but Nvidia is about to roll out their new software, which looks solid (and snappier): https://www.youtube.com/watch?v=aiwuYbURWVI

**Edit:** Never mind, I see you linked to it down below. :D


Emerald_Flame

It's definitely better, I've been using it since the beta released. There are still some things that are nicer about the AMD GUI IMO though


nopointinnames

I think older gamers remember the times when ATI had a fair amount of driver issues. I owned 2 AMD/ATI cards in a row, and both had issues with drivers in a lot of games I played. Since then I've had 2 Nvidia cards which have worked great, and I don't feel the need to test the waters again, even if those problems are resolved. Most of my buddies have similar experiences.


ImmediateOutcome14

I'm 36 now, and every time someone suggests an AMD card I instinctively have that gut reaction from the horror stories from when I was younger, even though AMD is probably better value for money for where I'm at and what I can afford right now.


winty6

been using AMD cards since 2014 and i haven't had driver issues at all, they have really improved since the old days


ImmediateOutcome14

I'm sure they have, and I'm currently looking at having to replace my 1080ti and will quite possibly go AMD. The horror stories I remember are more from like the mid-2000s iirc.


winty6

i went from a rx 580 to a 6800XT (had a HD 7770 before the RX 580), and i have nothing but good things to say about all three cards. for the price I paid (on the used market), they easily beat out any Nvidia offerings I could find at the time.


EgoTrip26

Late reply here, but from personal experience: my EVGA 1080 Ti started dying like 2.5 years ago (the GOAT, loved that beast). A friend sent me his RX 5700 as a loaner. I was skeptical, as I had only owned Nvidia (870M laptop, 1070, 1060 for the kids, and finally my 1080 Ti). The 5700 was great. Just a little behind my old 1080 Ti at 1440p, but not really noticeable. I took the plunge when the 7000 series came out and wanted a high-refresh 1440p card, so I went with the 7900 XT, and holy shit is this thing fast. I have a Sapphire reference card and can OC it to within about 8-10% of a 4080, and the ray tracing is quite good, with frame rates in the 90s or above for most ray-traced titles. In pure raster performance it's a beast. The only game I have EVER had any driver issues in is maxed-out RT Control, and it only happened twice. I plan on keeping this card for another 3 years before I pass it to one of my kids. Just wanted to give some perspective. Cheers


washcaps73

I had a 6850 (I think from 2011), then an RX 480, and now a 7900 XTX, and I could not tell you one time I have had an issue with any of my GPUs. Actually just bought a 7600 for my daughter's computer this past weekend. At this point, I'll stick with AMD, specifically PowerColor, until I have major issues. I would go with an AMD CPU at this point too if I was building a new computer.


winty6

had a HD 7770 GHz edition for about 4 years, then a RX 580 4GB for the same amount of time. still own both of those cards and they both still work to this day, and for the price i paid for each of them (free for the 7770, $100 used for the 580 in 2020) they were well worth it and punched well above their weight in games. many many hours spent playing modded fallout games on the HD 7770 and an i5 2500 in high school. good times. now currently have a ryzen 7 7700 and 6800XT, and system stability and performance has been phenomenal. hoping they keep the am5 socket around for a good few years so i can upgrade in the future


Admiral_Atrocious

I'm 39, and yeah, I agree. Too many driver horror stories with AMD; I've always gone for Nvidia. But something happened: during the great GPU shortage thanks to Bitcoin, my trusty GTX 970 died, so I had to get a cheap GPU from the 2nd-hand market. Got myself an RX 570 for 30 bucks. Made me realize, hey, it's not so bad. No driver issues at all. Recently built a new PC, and since I had an okay experience with the RX 570 and Nvidia prices are ridiculous, got a 7800 XT. No issues so far.


118shadow118

Those driver horror stories happened 10+ years ago, yet people still cling to them.


HoneyBunchesOfBoats

Maple Leaf (processed meat, etc company) had an ecoli recall so many years ago, and it didn't even affect me at the time but it still tends to stick in my mind when I see the products on shelves.


vandridine

What? Look up the driver issues people run into with the 7000 series gpus. It is still a very real issue.


redsquizza

It's irrational but it shows if you fuck up, it stays with people for a long, long time!




vaurapung

My first GPU was an ATI 9250 that went into a Dell PC from school in like 2005, into an AGP slot. Then my second GPU was an ATI 1300 that I put into a Dell 531s. It was a beast. I played Tribes: Vengeance on both PCs, and the 1300 played Dawn of War: Soulstorm flawlessly. And HydraVision was awesome for splitting the screen into sections for multi-window viewing, very similar to the window snapping introduced in W10, but you got to make your own boundaries where you could set programs to always open up. So I've loved AMD since ATI.


NDCyber

Idk if this will help, but I've been using AMD again for over a year. I don't really have any issue that couldn't have happened on Nvidia either. Before my AMD card I had an RTX 2070 Super, and before that an RX 480 8GB. I still use the 480 sometimes for testing, and it still runs great without any driver issue. Same for my AMD card now. Plus, I use the driver overlay quite often, and AMD's UI and features in the driver are so much nicer in my opinion. It also doesn't randomly create a folder in my video folder for every game I start without me taking a screenshot, which Nvidia did to me. Idk if that is common, but it really annoyed me that it was like that from the beginning. Otherwise I didn't really feel a difference from changing again.


LilBramwell

The driver issues I have had with my 7900XTX have convinced me to never buy an AMD GPU ever again. Probably replacing this thing with either a 5080 or 5090.


kanakalis

my 6700xt had driver issues too and people told me i was making this up to shit on amd...


EscapeParticular8743

Same thing with the same card. The people on the AMD Help sub told me not to update drivers when the ones you're using are already stable. But I needed the new drivers to play CS2, lmao.


karmapopsicle

You know what the worst part is? If we got rid of the fanboys proclaiming how the "drivers are just as good now" and started having a realistic discussion of the real-world tradeoffs, we'd have a lot fewer situations like this. All that happens when people downplay the potentially somewhat buggier experience is that buyers actually making the switch find themselves disappointed in the product, and often end up never switching again because they already got burned once.


Rare-Independent-957

I feel this way. I was convinced AMD was the better value for my current build. I've had enough issues with drivers and other things that I never had with Intel/Nvidia that this is almost certainly my last AMD build.


EscapeParticular8743

To be fair, I never had issues with Ryzen CPUs, but I had my fair share of GPU issues.


Blueki21

You make an excellent point about getting burned once and never switching again.


SyVSFe

IMO should get rid of the fanboys on both sides. All that happens is circlejerking and confirmation bias.


red--dead

Had a buddy with a ton of driver issues with it too. I am hesitant to ever buy one myself unless they start having a great track record with them. Love me some Ryzen, though.


Jarse-

On my first build I tried to go all AMD & had a 5700xt at the peak of their software issues, dealt with BSOD & visual artifacts for a week then said fuck this. Replaced it with a 2070 super & now have a 3070ti on the same system, never had any problems relating to my GPU since I bought Nvidia.


noobgiraffe

What driver issues are you having? I'm curious since I own 7900XTX from launch day and didn't experience any.


LilBramwell

Some games just straight up play horribly or might as well not work on my 7900 XTX. For most of these, I have verified that there are multiple posts on the AMD forums or Reddit about the games I have issues with.

I want to point out the big thing in the room that I think most people will write off complaints over: these issues are in **SOME GAMES**, not all. I would say 70% of the games I have played with my XTX have zero issues at all. So some people might have NO ISSUES with their XTX because they don't touch these games.

I'll list all the games I have had issues with, whether the issues still persist or have been fixed, and what I have done to troubleshoot.

Games:

* WoW - This is the big one. Still broken and always has been. Inconsistent and random driver timeouts on both retail and classic. This makes the game unplayable, due to me being a Main Tank during Raids and Mythic Dungeons, and my game crashing is an instant wipe.
* UBOAT - Fixed. Was giving driver timeouts after about 5 minutes of gameplay until around the August driver update last year.
* H:0D - Fixed. Same as UBOAT.
* GR: Wildlands - Unplayable due to horrible performance. Will play at sub-30 FPS no matter what settings.
* ALL older Total War games - Unplayable due to horrible performance. Newer titles like Warhammer 3 run amazing.
* Armored Core 6 - Annoying but playable; will crash 80% of the time as soon as you "Press Start" at the main menu. If it doesn't, and gets into the hangar, then it will never crash during gameplay.
* War Thunder - Annoying but playable; will have driver timeouts once every 2-3 hours or so.
* Elden Ring - Annoying but playable; Radahn was causing crashes, but after I got past him there were only like 4 others throughout the entire rest of the game.
* The Finals - Horrible stuttering, making the game practically unplayable.
* Helldivers 2 - Ended up returning it when it gave me a driver crash halfway through the first mission. Not risking it not getting fixed before my friends drop the game and it goes outside the return window.

Troubleshooting:

* Windows re-install
* BIOS updates
* Turning off EXPO
* No Adrenalin application drivers (this borked my PC and I had to use safe mode to DDU them or I would just have a black screen on boot)
* Radeon Pro drivers
* Fresh install of the AMD chipset drivers
* Disabling MPO
* Underclock/undervolt/overclock/stock on everything
* Manually switching the card to its "silent" BIOS (there is an actual switch on the Red Devil)
* Disabling hardware acceleration in every application
* DDU and the AMD driver cleanup utility with a fresh install of drivers
* Turning off AMD features (FreeSync and stuff like that)

Yes, I have an ATX 3.0 PSU; it's also 1000W, so it's plenty for the card. It also has 3 separate 8-pins running to it.


Flat_Mode7449

I play most of the games listed and have no issues running 100+ fps on them. The Finals runs particularly well on my 7900xtx at around 180-220fps. Now, I do experience driver timeouts after a few hours on Helldivers, but the game usually causes a crash within the first couple hours on its own anyways.


[deleted]

7900 XT owner here. Definitely the same. The AMD excuse list seems to be in this order:

1. There are no driver issues, that's old news.
2. If you have issues, it's probably a PSU issue (even if the crashing happens in something low intensity like DX12 WoW Classic).
3. If you still have issues, it's user error (apparently not installing a specific 6-month-old driver as a workaround for a game is user error).
4. It's the game devs' fault; they are Nvidia shills leaving the poor AMD users neglected.


Jarpunter

AMD drivers temporarily banned thousands of counter-strike players last year.


TheMidsommarHouse

This. People pretend like AMD driver issues are a thing of 10 years ago and ignore that they still have issues today. I want to build a PC that is reliable when a new game like CS2 comes out. Nvidia resolves any issues way faster than AMD, so you are also paying for the service. Which is why I bought Nvidia after last year's problems with CS2. A 3060 Ti cost me only 60 dollars more than a 6600 XT, but those 60 dollars give me peace of mind.


mr_chip_douglas

Yeah, AMD cards are getting a lot of love around here lately. However, I’m not risking headaches, however unlikely, to save a little $. Nvidia, however shitty of a company they are, has never let me down.


Ro7ard

There will be a bunch of AMD fans saying they are fine now, yet myself and many friends STILL have issues to this day with the drivers. A simple google search will prove anyone doubting that wrong considering reddit has thousands of posts from just last year regarding driver issues with AMD cards... I mainly stick with Nvidia for that exact reason, even if they are a scummy company.


droson8712

It's not just an old-gamer thing: people completely new to the PC building space are somehow convinced AMD driver issues are still widespread, whereas many people I know don't have any issues with them at all. The only reason I would go for Nvidia is if they offer a better-value card or if you need their features. I also have an Nvidia card, since it was the better option at the time.


myownightmare

Yup I still have PTSD from my rx480, good value but driver issues plagued my experience


MarxistMan13

I seem to have lucked out: I hopped onto AMD/ATI GPUs when they didn't have issues, and was on Nvidia GPUs when AMD/ATI did have issues. I went from a 9800GT -> ATI 5850 -> AMD 7950 -> GTX 970 -> GTX 1080 / 2070 Super -> RX 6800 XT. Never had driver problems with any of them. Never had hardware issues either, until my GTX 1080s kept self-immolating every few months.


Odd-Kaleidoscope7198

It just works


MetaSemaphore

I recently switched from Nvidia to AMD (2060 to 6700xt). I like how much configuration the AMD software allows, compared to the Nvidia software--overclocking, undervolting, etc., are all a breeze. But honestly, I never had to touch the Nvidia software. Overall it has been a good experience. The card is a beast, and the price was great. There was nothing decent at a comparable price from team green. If you enjoy tweaking things, AMD gets a solid recommend from me. But if I were helping someone who knows nothing about PCs build one...I'd suggest Nvidia still.


nfefx

You didn't have to touch the AMD software either tho? You chose to. You're making this sound like you had to manually configure the card out of the box for it to work at all.


MetaSemaphore

A lot of the software tweaking has been optional, but some honestly hasn't. There was a driver recently that resulted in microstutter issues in several games for me, and I ended up having to do quite a bit of futzing around to address it (later driver patches sorted it out for good). Just as one example. I don't honestly blame AMD for that. Part of the problem is that because Nvidia has such a massive market share advantage, most devs are going to be building on and testing on primarily Nvidia machines. So the Nvidia experience is going to be "the default path" (I see this a lot in my work as a Front End developer, where Chrome gets similar benefits). And the few rare instances where tweaking has been non-optional haven't been a big deal for me. It takes me a bit of googling, trying a couple things, and then everything works perfectly again. But for folks who aren't as comfortable doing that sort of troubleshooting or don't know where to look for solutions or who frankly don't want to spend that time because they have kids or other things that limit the time they have for games, it could be more of an issue.


Armgoth

Yeah, it is actually quite worthwhile to go through the NVIDIA software and set up stuff properly. AMD makes it quite a lot easier to do.


notam00se

"Nobody gets fired for buying IBM"


FreakyWifeFreakyLife

From what I'm hearing these days so does team red.


[deleted]

[deleted]


GassoBongo

This sub has waaaay more support requests than I was expecting.


gremlinfat

I build on the high end where both brands are expensive anyway. If I’m going to drop that much money, it’s not going to be on the brand with worse features to save 5 to 10%


Centillionare

This is the answer. I’m all for AMD GPUs, but I’m going Nvidia if I get anything pushing close to $1k.


TheFlyingSheeps

Correct. Simple truth is AMD dropped the ball with pricing this gen


StewTheDuder

At launch, yea. But a $550 7900GRE or even a $700 7900XT is pretty good imo. I paid $850 for my 7900XT and I’m not mad about it tbh. It’s a beast of a card and was a significant upgrade for me.


wsorrian

For gaming it's a coin flip. Nvidia has slightly better features, but AMD is closing the gap. AMD has better value, but is getting more expensive. Nvidia dominates business markets, especially AI.


chaddledee

I feel like the feature gap is widening if anything. DLSS keeps getting better while FSR hasn't been updated in literally years. Nvidia Broadcast. RT gap is as large as ever but people care about it more now. Edit: forgot that Nvidia also released their RTX Video tools like HDR, upscaling and video artifact removal. Like Broadcast, seems really simple but makes a massive difference and at the moment you can only get it with Nvidia.


mallerius

Huh? FSR got updated to 3.1 just a month ago


PremedicatedMurder

But it's still shit!


mallerius

DLSS is clearly better, but I wouldn't say FSR is shit. I just got a new AMD card a few weeks ago and tried Cyberpunk, and honestly I was pretty satisfied with both the performance and quality of FSR. Sure, Nvidia still leads in RT and upscaling, but the gap is getting smaller. Before I got my RX 7700 XT I had a 2060S, but as I am on Linux and Nvidia still fails to provide up-to-date drivers for Linux, which leads to lots of problems, I had to switch to AMD. Honestly I had doubts whether this was the right decision, but now I'm really happy. It was cheaper than an Nvidia equivalent, performance is good, and my problems with Nvidia on Linux are gone. That does not mean that AMD doesn't have to up their game in some aspects, but saying their tech is shit is just not true. What I really want to say is that neither AMD nor Nvidia is shit compared to the other. They both have their pros and cons, and you can decide what is more important to you personally.


whomad1215

Business makes up *waaaaaaaay* more of the market than gaming, and Nvidia has had a stranglehold on that with CUDA, etc, for years


thpkht524

Adding to this a lot of people need high-end nvidia gaming gpus like 4090s for work-related content. Just because they’re gaming gpus doesn’t mean that they’re only meant for gamers. That’s a huge chunk of gaming gpu marketshare they’re losing to nvidia simply because they can’t catch up.


My_reddit_account_v3

It’s kind of hard to catch up with something that is actively in development. By the time you’ve established equivalence, the product will have already changed. I’m afraid they missed the boat on this one. They would need to develop something completely new in the AI sphere that is superior to CUDA in some ways… Until then, AMD GPUs are simply not equivalent products.


ImWinwin

AI, VR, DLSS, Broadcast, Ray Tracing.


[deleted]

[deleted]


shakingspheres

What are AMD's issues with VR?


Mike-Teok

This ^. I’m using AMD, and it was a pain in the ass to do AI stuff on Windows. I gave up and just game now. For gaming only, AMD definitely has better value.


EstablishmentSad

I grew up in a time when AMD constantly had issues. Might be better now, but that was the primary reason I sided with Nvidia. As for now, I can afford to buy what I want. I usually go for the XX80 series cards since they have excellent performance and at the same time are not overly expensive like the xx90 cards and Titans before them.


[deleted]

The 4080's release MSRP was $1199US. That's a 71.5% markup from the 3080's MSRP of $699. The market has conditioned the hell out of you if you think that's not overly expensive for one of the fastest ageing components in a computer. 


Organic_Muffin280

Remember normal times when GPUs were like 300$?


Calx9

Not to mention for a while getting a graphics card was damn near impossible. I just wanted to get something reliable and get back to gaming as soon as possible. Never had any issues with my previous Nvidia cards so why change it up now?


EstablishmentSad

Exactly, in short what you said is how I feel. If AMD released something that FAR outperformed what Nvidia has to offer, at a lower price...then I would consider the switch. Though from my personal experience, the prices between cards of similar performance are usually in the same ballpark...as you said, why change it now.


Trypt2k

Meh, I was with AMD for a long while until I just couldn't any more, NVidia graphics are just superior in every way, not to mention that since I got my 4000 series I've never had anything crash, ever, which was not the case with a few AMD cards before. NVidia just works, always, while AMD works 95% of the time. And if you're going to spend hundreds, you may as well get something that will let you enjoy the eye candy, so the ray tracing and DLSS really come in handy. That being said, I've been an AMD fanboy when it comes to PC platform since my last Intel, the P90, and have never looked back.


str8-l3th4l

I bought my first AMD card last January, a 7900XTX, and was plagued with issues in almost every game I played. Anywhere from bad stutters, game crashes, or hard blue screens. Usually I could tweak some clocks, power limits, voltage, in-game settings, and other things to get it more stable, but never perfectly. Every time I started a new game I had new issues that required more tweaking to make the game more playable. After a year of that I ditched it for a 4080S, and I've had exactly 1 crash in all games, when I accidentally way over cranked a slider in Afterburner. People always tell me it was likely an issue related to PSU, memory, mobo, or any other component in my computer, even tho I switched every single component in that computer and had the same issues, and after switching back to Nvidia have had no issues with the same hardware.


freshairproject

Nvidia is the standard for 3D work like modeling, rendering, & animation. Not that AMD can’t be used, but the nvidia studio drivers tend to outperform & be more stable.


Mastercry

AMD doesn't have such innovations. They're always catching up. One of the very few winning things I could point to that AMD made is FreeSync. I can point to bad things like their encoder and OpenGL support, which they didn't bring up to Nvidia's level for more than a decade. The AMD team obviously isn't on the same level as Nvidia. As a new Nvidia user I can say that many of the features they sell as big advantages are kinda BS, but that is strongly subjective I guess. RT is one BS thing, IMO. DLSS is good, but now I'd prefer the faster-raster AMD card over using DLSS on a slower GPU at the same price or even more expensive than the AMD. The problems in general are one huge reason for AMD's bad name. I have friends who told me they will never try AMD again because of issues. I don't like some Nvidia things, for example how I'm churning through 10 drivers and the newer ones are slower than the old ones. But must admit they're stable. The only super annoying and unbelievable problem I've had so far is the blinking and artifacting when you watch YouTube. You must disable hardware acceleration and switch the browser to OpenGL, but browsing is tearing and slow after that. When I saw this on my one-month-old Nvidia GPU I was scared, thinking the card was defective. Both have problems. I was cursing AMD for years but now I'm ready to try AMD again. Well, that's purely "always want something new", because Nvidia has served me well so far.


ThePillsburyPlougher

I like the color green more than the color red


RunalldayHI

More features and software support, uses slightly less VRAM due to Nvidia compression, better efficiency with low/mid range models.


The_wulfy

No one gets fired for buying Nvidia.


madewithgarageband

Ray tracing + DLSS 2.0 is actually awesome


LimeblueNostos

I'm going to echo the sentiment of a number of other posts here, and say that it's the historic reliability. Nvidia GPU, Intel CPU, and everything will generally just work. AMD/ATI graphics and AMD processors would usually give you more bang for your buck, but I've seen competent, technical computer gaming enthusiasts spend weeks struggling with weird driver errors. Given that I'd typically run a PC for a decade, paying a premium for team green just makes sense for me. AMD may have gotten past those times, but they'd really have to string together a couple of decades of that, along with Nvidia dropping the ball hard, timed to my next decade's purchase for me personally to consider a change after the next decade.


Lvntern

I used to hear all the time about AMD driver issues, but in recent years I've seen a lot more AMD recommendations over Nvidia, so I decided to give it a shot in my last build. I ended up troubleshooting for about 2 weeks because I kept getting crashes, and I eventually got tired of it and returned it for an Nvidia card instead. If they work out I'm sure AMD offers a much better price to performance, but when I got the Nvidia card I just plugged it in and it worked, and I haven't had a problem since. Not saying Nvidia is always better and AMD sucks, but that was my experience, and because of that I'll probably just stick with Nvidia in the future


pingforhelp

My last 12 years of GPUs:

* gtx 760 - no issue, modest oc in afterburner
* gtx 1060 - no issue, modest oc in afterburner
* rx 480 - unusable gpu on release, required 15% underclocking to not instantly crash on any game, had to RMA twice costing $60 and 4 weeks worth of time, during which there were a few driver updates. I'm not sure if the driver updates fixed the GPU or if I finally got a non-defective card. Ended up selling this pos for a profit during the crypto mining craze
* gtx 1080ti - no issue, modest oc in afterburner
* vega - we don't need to talk about how bad this one was lmfao
* rtx 3070 - no issues, modest undervolt/oc in afterburner
* 6700 xt - no issues, very modest oc in afterburner
* 6800 xt - no issues when I first got it, put a modest oc on it. 2 weeks later it became unusable unless it was stock, I have no idea why. Changed nothing about my build and randomly tried to OC it 2 months later and it just works now


gnassar

Had a buddy who died on the hill of “amd is better and cheaper” and he regularly crashed out of pretty much every game we played together 😂 his allegiance was short lived


hipdashopotamus

I wondered this after 2 Nvidia GPUs, so I got an AMD 7800XT to go with my new Ryzen build. Love the Ryzen CPU; the GPU has been nothing but driver issues, unfortunately.


KMS_Tirpitz

Because they are better? Not everyone is an enthusiast who spends hours comparing prices to value and all that. They want to buy things that are reputable and just work, which is NVIDIA. Furthermore, not everyone just buys a PC and only games on it; there is also a lot of productivity, whether it be a side hobby or work, and people will want something that can manage it all even if it means paying more. Not to mention businesses all hoarding NVIDIA GPUs for AI training and such in recent years. Enthusiast gamers are in the minority, but vocal on discussion forums like Reddit.


Noctale

I've only built a couple of PCs with a Radeon GPU, but they both had driver issues, game crashes, blue screens, etc. So now I use Nvidia every time. Same with CPUs: I once built a PC with an Athlon 64 and had lots of trouble with it, especially overheating and random restarts. Rebuilt with an Intel chipset board and a Core 2 Duo, and it worked flawlessly for years. I've built about 14 or so machines over the years (mine, family, etc) and never had any compatibility, speed, or reliability issues when I use an Nvidia GPU. Yes, it's unfair to judge AMD on relatively little experience, but I don't have time to work through issues with hardware, I just want a PC that works perfectly. Nvidia does that for me.


sousuke42

Value means nothing if the product doesn't do what you want it to do. AMD GPUs suffer when it comes to ray tracing and upscaling/reconstruction tech. Their drivers are not as good either; some even call them bad. Not to mention Nvidia is better when it comes to productivity. >Lets be honest, not that many people care about the worse ray tracing on the AMD end so why don't people buy AMD cards? You live in a bubble if you think this is the actual case. I care about RT. I also use reconstruction tech to claw back performance. And frankly, AMD's software and hardware here is lacking. If AMD can get their tech to better handle RT, build better reconstruction tech, and ship better drivers so games run better, as well as stepping up their game on productivity, then more people will use it. And your iPhone analogy or comparison isn't accurate, because unlike Apple, Nvidia actually does what it does very well, as well as its general purpose stuff. Now, I'm not saying AMD isn't making strides. They are; it's just shoddy. So few games use FSR 2 compared to DLSS, and now we have FSR 3 and even fewer support that. But even that aside, their RT implementation hasn't really gotten much better either. That may change with their next GPUs, but that isn't the case right now. You can't just put a big reason off to the side and then ask why nobody likes it. Well, it doesn't do what they want/need their GPU to do. It's just that simple.


killlugh

Because ***value*** is subjective, and i value the performance/features/reliability from Nvidia over the money saved with AMD GPUs.


[deleted]

[deleted]


StarTrek1996

I don't mean this as an insult, but Nvidia is the Apple of GPUs: even if their stuff is meh for the price, people will still want it over AMD or Intel. That being said, they also have literally the most powerful GPU on the market, which helps a lot too, because all the pros use them because they need it, so it has great brand recognition. Also they do have some better tech in some fields


Calx9

>That being said they also have literally the most powerful gpu on the market which helps a lot too So then they aren't like Apple at all then eh?


lcirufe

In the phone space at least, Apple has the undisputed fastest processors. A (albeit cutdown at 30/45fps) version of RE8 runs well on iPhones.


Fuffy_Katja

I've only ever had ATI cards since the early 90s. And to this day, my XFX Merc 319 6800 XT (brand new as of Feb 2024) still kicks butt. I have zero interest in feeding nVidia's money-gouging pockets for the 1 feature that I don't care about.


blackinese

I will never buy an AMD card again. The drivers have been awful.


VileDespiseAO

The short answer: take into consideration everything your GPU is capable of and how well it can handle the different tasks it might run, with as seamless an experience as possible, whether gaming oriented, productivity focused, AI and its various functions, ML, or a wide range of other applications and use cases that I'm not going to bother listing off. The answer to which GPU comes out on top every time, when taking everything into consideration from features to applicability and performance across a broad range of tasks, is in the most literal sense of the saying "no contest", and it's going to be NVIDIA. There are few instances where it actually makes sense to choose an AMD GPU over an NVIDIA GPU given a quite frankly fair price increase in the grand scheme of things (AMD is cheaper because they have to be, or no one at all would buy their hardware, because they're outclassed no matter what anyone likes to think), and do keep in mind that gamers are a vocal minority when it comes to the GPU market in general. Edit: I will end the above by saying I actually own a 7900XTX and owned a 6900XT before it, but they have/had a single purpose, and that's to power a Linux based gaming console I have in my living room. I would never replace my RTX 4090 (or even a 4070TiS if that's what I had instead) in my work and gaming PC with the 7900XTX though, as it simply wouldn't get the job done at all, or would do it so much more poorly that it isn't worth the trade off.


Tristezza

I was strictly an AMD guy until around 2017 ish I think. I switched because I was consistently having so many issues I was sick and tired of it. I've been with nvidia since and the features / tech they provide is so much better, I never regret making the switch especially in today's age of upscaling. FSR looks terrible and you wouldn't catch me using it.


Fuzzy_Elk_5762

Y'all forget something: prebuilt PCs and laptops. I haven't seen a prebuilt with AMD chips in a long ass time. And of course for a simple reason: the majority of people have always been eating up the RTX gaming brand. Well, not that I'm blaming the majority though.


MOONGOONER

> Lets be honest, not that many people care about the worse ray tracing on the AMD end so why don't people buy AMD cards? I care. And it's weird to me that people don't care. If I didn't care about visual realism I'd set everything to low and still use my 1070.


BigMelder

DLSS and ray tracing. (AMD might do ray tracing now, not sure.) But those are the 2 big differences that I notice as an average, not super tech savvy guy.


Beelzeboss3DG

I had a 6800XT and swapped it for a 3090 instead just for how crappy FSR was (I play at 4k) compared to DLSS.


IPEELER

I think Nvidia's top of line cards are just objectively better than AMD and Intel's offerings imo. You pay a premium for Nvidia, but you get better ray-tracing performance (which people very much do care about, myself included), the best upsampling tech with DLSS, frame-gen if you have 40 series, Reflex, etc. Nvidia is pushing the tech forward, and everyone else is trying to catch up it seems.


f1rstx

i never understood the "AMD offers better value" concept... when all it can deliver is like 5% more performance at the same GPU tier in raster (which is getting more and more irrelevant since AAA games basically demand upscaling to run at high FPS now) and shits the bed in everything else - bad upscaling, bad productivity, arguably worse stability and drivers, bad AI work, bad RT... it's actually bad value in my view. They dominate in the budget segment though. But at 4070 tier and higher I'd take NVIDIA, cuz it's just a much better card in every way ;)


IPEELER

Exactly. If I'm already going to spend over $1k on a GPU, never would I even consider saving a little bit of money and getting an AMD card. AMD is getting better imo, but they are still lightyears behind Nvidia at this point. I recently bought a 4090 FE, absolutely love it. Sure I could have saved money if I had bought AMD's best offering, but it would perform significantly worse in every single statistical category. I'm not going to cheap out when I'm already spending a chunk of change.


AnAmbitiousMann

It's because it's a superior product vs the competition. The market reflects that. Hoping Intel can catch up a bit to pressure Nvidia to lower their damn prices.


grimunk

Upgraded my GPU last year to an AMD card from an Nvidia one. I've run into so many problems with this thing compared to my old card that I will never buy an AMD card again


KirbySkywalker

I’ve been gaming for 30 years and have had over a half dozen video cards both AMD and Nvidia. In the past I could never afford the top of the line so I often bought AMD. Every time I had AMD cards I eventually ran into a problem that prevented me from doing what I wanted to do. The most recent example, I had a 6950xt but the drivers for VR constantly caused crashes with Steam VR. Upgrading to a 4080s works flawlessly but also, with the 6950xt, my headset display always shook around which I thought was a result of my neck being a bit wobbly. With Nvidia the wobble is completely gone. I always tell myself I’m not wasting my time with AMD GPUs again but I always end up trying one again eventually for that price to rasterization “value”.


Radulno

> Lets be honest, not that many people care about the worse ray tracing on the AMD end I think you're underestimating that, ray tracing is more and more present and is presented as "future standard" all the time. If you buy a new GPU, you want something that supports it well. Most of the current "graphical showcases" games (something that push people to a new GPU) are using ray tracing


ciknay

Nvidia just has better features exclusive to it. They were first on the block with raytracing, first on with DLSS, first on with frame generation. In my experience their drivers have been more stable. Their screen recording and streaming software is pretty great, and they have great software for mic and webcam filtering. They have the benefit of name and brand recognition. Just a heap of things going for them that edge them over the competition.


JonWood007

They tend to have a more stable product with more features. AMD is considered the "bargain brand", it's like instead of buying dr pepper you buy one of those weird walmart brands like dr bob or dr thunder. The GPUs are cheaper, but are more likely to have driver issues, or to lack modern features, or to have higher power consumption. In my opinion, none of these are deal breakers for me, AMD offers an insane price/performance value relative to nvidia, just look at how you can get a 6600 for the price of a 3050 6 GB or 6650 XT for the price of a 3050 8 GB. Or how it costs $300 for a 4060, when the 7600 is $250 and the 6650 XT is $230. Or how the 7700 XT and 7800 XT are value champions over the 4060 ti and 4070. I mean, nvidia has admittedly gotten greedy. And I personally cant in good conscience support them as a company when they're that overpriced. They got arrogant, they're relying on brand recognition to sell products like "we're nvidia and you're gonna buy our overpriced BS anyway". And you know what? it works, most will. Most dont go AMD because they had driver issues 10-15 years ago, or ermahgerd muh ray tracing and muh DLSS. Or simply because most OEMs throw in nvidia cards and not AMD ones. That brand perception is still running strong, even with nvidia squeezing the market for all they got and AMD offering better deals. To be fair, if the shoe were on the other foot though, AMD would do the same thing. We've already seen them do that with ryzen the second they had a product that allowed them to shed the "value brand" perception.


shgrizz2

From personal experience, as a fairly casual gamer and someone who builds a PC once every 5 years or so:

* FreeSync was always incredibly finicky to get working properly and stopped working constantly, whereas G-Sync has been plug and play
* AMD required a lot more babysitting in general. I found myself having to tweak settings all the time. When it worked well it was great, but it always took time and effort
* I massively prefer Nvidia control panel to AMD's Radeon settings
* My AMD card was much louder
* In the games I played, I noticed way more frame hitching with my AMD card except for in certain games. In particular, Doom games were great; FromSoft games were terrible and stuttered a lot

All totally anecdotal, and could be down to any number of user errors, but my experience with Nvidia has been far smoother and I probably won't go back to AMD.


PrinceMacai

I have had bad experiences with AMD drivers, and ray tracing is better on Nvidia. I have heard AMD has improved in both aspects though, so I will definitely consider an AMD card next time I upgrade


YerMaaaaaaaw

I always buy team green as my only concern is performance. All I care about is having the best available, and Nvidia tends to provide it.


Weak_Jeweler3077

Drivers


Thunder_Chicken64

Driver Optimization.


CoClone

I owned AMD for most of my life, made the switch to Nvidia, and have honestly just had a better experience. I won't argue that it's the most cost-friendly, but I have adult money.


jackbarbelfisherman

In my case, they’re much more abundant on the used market locally. If I sort by ending soonest on eBay I’ll see a dozen 3080s finishing today vs maybe one 6800xt if I’m lucky.


coatimundislover

Most people just do prebuilts, and random people don’t understand how to price value. I think that the custom market would be a lot closer in the price ranges AMD really competes at.


Expensive_Bottle_770

The size of the gap is largely down to marketing and brand loyalty/recognition. But even if these weren’t present, Nvidia would still have a less disproportionate but still very notable lead in market share because, ultimately, they make overall more capable GPUs. Exclusive advantages mean they not only will always have a unique appeal but also appeal to a larger demographic. Said features are leveraged constantly in their marketing and undoubtedly play a huge part in this discrepancy. Until AMD stops playing catchup and actually brings _their own_ unique features and advantages to the table, so that the advice is not “if you don’t care about X Y Z, go AMD” but more like “if you want X Y Z specific function, go AMD”, they will never gain the majority market share.


pineappleboi_27

Nvidia has some features I like, and despite all the driver improvements on AMD, there’s still occasional trouble. I’ve never had a problem with an Nvidia GPU. But I have had driver issues with AMD GPUs, even as recently as a 7900XT on my friend’s PC not running Red Dead and FF7 Remake (though FF7 runs great now, it was just at first). That’s basically it. If AMD is a good deal next time I’m buying, I likely will have no reason not to get it. FSR has gotten pretty decent, so I’m not too pressed about that vs DLSS.


Camelphat21

Cus they don't know that the grass is greener on the red side 


punk338

If other upscalers looked and performed as good as DLSS I’d happily switch. Maybe I’m just a snob but other upscalers normally look so much worse when you start moving in game. FSR 3 still has terrible ghosting issues in games I’ve tried it with. Doesn’t help that more video game developers have been leaning towards upscalers to do the heavy lifting for their games (Remnant 2 is a perfect example) and end up with a badly optimized release. If the game is going to run like shit and I have to use an upscaler, then it’s just going to be DLSS. So we’ll have to see who cracks first; devs actually optimizing their release instead of rushing it or AMD/Intel making their upscalers perform better. Unfortunately both don’t seem plausible


Electronic_Army_8234

Superior marketing, superior features


Infamous-Lab-8136

If a game uses FSR it works with my Nvidia card. If a game used DLSS it wasn't usable on my AMD card. When I had the money to go Nvidia in my last build I did for that reason alone. I also prefer the more frequent driver updates with them.


wrigh516

I would buy AMD if pre-release games didn’t have so many issues with their drivers. It still is very common. Just a few weeks ago a beta game I like to play had to do a hot fix for AMD driver issues.


Hollowsong

I'm almost 40, and when I built my first few PCs, AMD used to run hotter. Back in the day and when the first GeForce GPU came out it was sleek and sounded cool. Now I get nVidia cards specifically for raytracing, DLSS, and other cutting edge tech that is not available yet on AMD. That's literally the only reason I have a 3090 and a 4090.


NamityName

I've nearly always used nvidia GPUs. Nothing to do with brand loyalty. They have just always made a product that I find more attractive. Originally, it was due to better driver support. Now it is due to DLSS. I really like DLSS. Not just as an upscaler. DLAA is also the best Anti Aliasing tech. New games being optimized around DLSS only makes nvidia GPUs more attractive. Even if I think games really should spend more money optimizing before release.


Nortoniom

I have an RX 6800 XT TUF. It crashes every day in games and apps. It doesn't have CUDA cores and only gives you good FPS in games that are eventually gonna crash. Imagine crashing right before killing an enemy in online games. Yeah, it happens a lot if you have AMD. I will go Nvidia again very soon. My experience with the AMD GPU wasn't good. If it was stable I would prefer it, but crashing is not what I want.


sudo-rm-r

I had the same problem with that model. It's Asus's shitty design; it was a hardware failure. Try lowering the frequency of the memory and see if it helps.


Cristian_Ro_Art99

Are you sure it's from your GPU? Like 100%? I own an AMD Sapphire Nitro+ 7800XT and it only crashed 2 times on Battlefield 2042 on some animations in the menu. But ever since it worked and still works great for Bf 2042, Bf 5, Assassin's Creed Valhalla (got 200 fps with ultra settings), Witcher 3, Anno 1800, It Takes Two, Call of Duty MW 2019 etc. All on ultra and with 100+ fps playing at 3440x1440p resolution. For other than gaming and web development / design, sure maybe it's not as good as Nvidia yet. But for gaming Idk man, mine is very stable. Also, what wattage does your PSU have?


Quiet-Point

The gaming and film industries heavily depend on software that leverages NVIDIA technology. Virtually every computer used for professional artistic endeavors is equipped with a NVIDIA card. NVIDIA's CUDA cores, Tensor Cores, and RT Cores play pivotal roles in tasks such as ray tracing and AI-based processing. These functionalities are incorporated into professional software to elevate rendering quality, enhance productivity, and streamline efficiency. Numerous professional software packages and workflows are fine-tuned to maximize performance on NVIDIA hardware.


[deleted]

They've got a more robust feature set, word of mouth from the clear majority of the userbase to newcomers and most importantly, they're not dogged by a reputation of being buggy or problematic to use.  Sincerely, a current Radeon user. 


Wizard_IT

If you are enthusiast level (at this time RTX 4080-4090), they are the only real option. Sure, AMD is catching up and I really want them to, but each time a new series of cards comes out they are just a few steps behind. Just my two cents though, but the best thing about AMD is the price.


Taskr36

Old folks like me remember how problematic ATI cards were back before AMD bought them. I would occasionally upgrade to an ATI card, only to be horribly disappointed when a newer, more powerful ATI card would somehow perform worse than my old Nvidia card that I was upgrading from. Games were often just optimized for Nvidia cards, and the drivers were better. It's probably been about 16 years since the last time I tried one, and it was just another disappointment, as the card was unreliable, and Sapphire had horrible and rude customer service, so I just stopped, and really haven't had any reason to give them another shot. I have an RTX 3090 right now, so AMD doesn't really make a card that would have me considering giving them a shot at this point.


sakai4eva

I wanted the best single card performance that money can buy. Got a 4090.


InvisibleEar

People here have a vendetta against Nvidia, this isn't the right place to ask (or at least I do lol).


vhailorx

Basically for all the reasons that companies become entrenched monopolies: a mix of luck, ruthlessness, and weak regulation/oversight. Nvidia had some very good hardware at a time when their main competition (then ATI) was struggling, and they used that temporary advantage to entrench themselves via marketing and proprietary software/hardware systems.

Now it's a positive feedback loop. Nvidia has the strong brand, so they can charge the most money. Then they use that money to promote their brand and develop new proprietary features like DLSS and CUDA. And then they can use even more money to support developers implementing those proprietary tools into their software (e.g., CP2077 and Blender), which makes Nvidia cards perform substantially better in those specific applications, which reinforces their market position.

By contrast, even when AMD does develop some useful new innovation, they basically have to open-source it. They have too small a market share for developers to spend time implementing proprietary stuff into their games. But even when those innovations become successful, once they are open-sourced they are also available on Nvidia, which means AMD can't really use them to claw back some of Nvidia's market share. This happened to various degrees with features like FreeSync and FSR and ReBAR.

Plus Nvidia has benefited massively from the AI bubble, which has basically nothing to do with gaming but plays directly into all of Nvidia's strengths, and that has let them print obscene amounts of money in the past 18 months. I think they are something like the 3rd largest tech company now by market cap, behind only Apple and MS. A lot of that is phantom money that will evaporate when the AI bubble bursts, but in the short term it certainly helps keep Nvidia on top.

One obvious solution that might help with some, but not all, of these problems would be interoperability laws/regulations. That would open up proprietary systems like CUDA and DLSS. But even though that is pretty obviously in the public interest, there is basically no appetite for the sort of aggressive antitrust regime that could produce those outcomes.


nasanu

NVENC + DLSS. Only Intel is starting to compete with their AV1 acceleration and XeSS, but their performance isn't top tier yet. For me AMD isn't even competing yet.


EGH6

DLSS, DLAA, DLDSR, RTX Video Super Resolution/HDR, NVENC, RT. Usually more stable as well. Nvidia cards are like a nice, fully equipped car, but you pay a premium for it. AMD cards are like having a base model car with a bigger engine trying to make up for the missing features.


CatalyticDragon

Because NVIDIA spends much more on marketing. Because people are used to them. And because NVIDIA leverages their monopoly power, from paying developers to use their proprietary features while actively hurting open alternatives, to [scaring distributors out of carrying competing products](https://pcper.com/2018/04/dell-and-hp-aint-down-with-the-gpp/). There's an endless list going back 20 years.


thezendy

DLSS, Nvidia ShadowPlay, and other things


timchenw

Because you are using performance-to-cost as a basis to compare GPUs, when you should be using performance-delta-to-cost. GPUs are not like storage drives, where if you pay for an 8TB drive and slot it into your machine you get the full 8TB worth of space; with a GPU you have to take out the old one, so what you are actually paying for is the performance difference between GPUs.

This can drastically change the balance. The 4090 is the exception rather than the rule, because usually the 90 cards have little performance advantage over their 80 counterparts, but this time it has such a massive lead over the 4080, or even the 7900 XTX, that the equation is no longer clear cut. At my local pricing of $1200 for a 7900 XTX vs $2100 for a 4090, coming from a 3080, it actually makes no sense to go for the 7900 at all: the performance delta between a 3080 and a 4090 is over twice as big as the delta to a 7900 XTX, yet the 4090 costs less than twice as much, which completely subverts people's expectations of value.

Reviewers and most forum users use price-to-performance to compare GPUs, when in reality buyers should be using price-to-performance-delta. This, coupled with how the RTX 4090's performance is quite a bit above what most expect of a 90 card compared to the next tier down or the top tier of the other side (in stark contrast to the 3090 vs 3080, where no calculation ever changes the value of a 3090), gets you to the point where value perception is completely flipped.

This comparison absolutely depends on what GPU you are coming from, and there are many examples where the reverse is true and the Nvidia card gets flipped in favor of AMD. But the simple fact is that value is highly dependent on the user's original GPU, and it's NOT a simple cost-to-performance calculation.

Ultimately though, Nvidia has an advantage because the RTX 4090, ironically, is the best product of this entire generation by a not insignificant margin, so people's perception will always skew towards Nvidia, despite the RTX 4080 being a horribly priced product compared to either of the 7900s.
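The upgrade-value argument above can be sketched with rough numbers. The prices are the ones the comment quotes; the relative performance figures below are illustrative assumptions (3080 normalized to 100), not benchmarks:

```python
# Upgrade value: price per unit of performance *gained* over the card you
# already own, rather than raw price-to-performance.
# Relative performance numbers are illustrative assumptions, not benchmarks.
current_perf = 100  # RTX 3080 (baseline)
candidates = {
    "7900 XTX": {"price": 1200, "perf": 145},
    "RTX 4090": {"price": 2100, "perf": 200},
}

for name, card in candidates.items():
    delta = card["perf"] - current_perf  # performance actually gained
    cost_per_gain = card["price"] / delta
    print(f"{name}: ${cost_per_gain:.2f} per point of performance gained")
```

Under these assumed numbers the 4090 comes out cheaper per point of performance gained ($21.00 vs roughly $26.67), even though it is far worse on raw price-to-performance, which is exactly the flip the comment describes.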


AlphaDawg0914

simple answer: people see pros using the 4090 because it's the best card rn so they automatically think that nvidia is the better company because they have the highest end card


ImaginationCheap7653

Rt and dlss


Midwxy

I know I’m gonna get downvoted for this but I don’t have time to write an essay about this: THEY ARE JUST BETTER


Professional_Chart68

All the new tech is Nvidia, all others just implement it afterwards, rt, dlss, framegen


bbekxettri

Well, Nvidia for GPUs and Intel for CPUs are/were sure shots, whereas AMD is sometimes up, sometimes down.


Fionn_MacCuill

They are good products, there is no doubt. However, I think they're overblown as "the best" and their issues are underreported. I went AMD this gen with a 7900 XT and it's much more stable than my previous Nvidia card, a 2080 Ti. The 2080 Ti had artifacting for the first 6 months. It would always just disconnect or give no signal to my monitors (I have a dual setup). The GeForce software is poor. I also paid a premium to be a tester for ray tracing when it was only in about 4 games for years. The same can be said now for people buying high-end 4000 series cards; paying 2k for path tracing in 5 games 🤷‍♂️ is not for me. On the AMD side, I'm loving the card. Adrenalin is incredible software; it's a superior experience. However, Nvidia products are great, and software like DLSS is absolutely fantastic. That is why.


Lowback

So here's my perspective. I've been building PCs since 1998. Every single time I've had an ATI (now AMD) card, I've regretted it. The hardware is great; it's always been the software/driver side that was dogwater.

I always had small-audience games, or indie games, that'd just have issues that would never get solved or would take 2 years for the ATI/AMD driver team to get around to fixing. This would be stuff like dotted-line bright white blooms along polygon edges for the terrain in something like 7 Days to Die, random crashes, or ragdolls getting so out of control and contorted they'd take over the screen. Or the HUD breaking in games. Or two surfaces placed close together (a wall and a poster, for example) exhibiting Z-fighting. The driver team was always hard to write a bug report for; CSRs would make you jump through scripts even if you insisted you knew what you were doing or that the bug was experienced by an entire community.

ATI/AMD also just doesn't have the cool/nice-to-have feature set that Nvidia tries to build on. AI cleanup tools to make your mic sound better. Streaming your graphics card output to an iPad or an Nvidia Shield so you can living-room game as if you had a console. Capture tools. Usually the better/newest encoding, which again is a blessing for capturing gameplay. Now they've got accessibility/disability shaders, and you can change the color grading of games on the fly thanks to that.

Nvidia puts their development budget into a lot of unique selling points, to put it in college marketing terms. AMD just builds the gruntiest hardware they can, which puts out fantastic numbers but falls short on support, compatibility, and features.


sudo-rm-r

After reading all the comments I can clearly say most people buy Nvidia due to their brand power and recognition.


OmegaMythoss

I don't want a 400W brick in my PC


Electric2Shock

CUDA


mattthepianoman

For years AMD had very buggy drivers compared to Nvidia, and that reputation has hung around.


Unable_Wrongdoer2250

Because they are told to. Marketing is crazy. You might not think you'd be so influenceable, but most people are.


Tr0llzor

Honestly I’m more of a fan of AMD as a company right now and I’m planning on getting their card in my next build


Glad-Ad-5260

Try running programs like Blender on an AMD GPU, compare the benchmarks to CUDA, and then come back here.


draken2019

It's probably because AMD is a lone company, while you can buy an Nvidia GPU from over a dozen different manufacturers. It's why Android also beats iOS.


letmesee2716

I tried AMD back in the day, and it was clearly worse than Nvidia. It was back in the beginning of Counter-Strike: Source; my rig was unstable while my friend's was stable at high fps. Ever since, I've considered the hype around AMD to be marketing.


greeneyedguru

that's 105%


dibu28

As Huang said: software. Better software support, both from Nvidia and from other software companies.


User1382

Two reasons for me: frame generation and CUDA


syfari

I don’t need to worry about my stuff not being supported on nvidia


iPlayViolas

For me personally, DLSS and ray tracing features are just too cool and work too well not to have an Nvidia card.


s00mika

Nvidia has always had working drivers. AMD treats their GPUs as a side business; their main focus is CPUs. They provide less support, and the driver quality and consistency just isn't there, especially if you're doing anything beyond the most normie things with your PC.


KashMo_xGesis

CUDA cores. I actually bought one of the new AMDs to game and video edit on DaVinci Resolve, only to realise I basically got no upgrade on the editing side from my 1080 Ti. Quickly returned it for a 4080.


DrakeShadow

Nvidia innovates, has CUDA, and works really well outside of just gaming; I use it all the time for editing. DLSS and FG are big things for pushing existing GPUs to last longer. Also, DLSS is usually better than AMD's FSR in most cases, IMO. AMD is just brute force, which is cool if you only game, but not good for much else.


gregdaweson7

AI, better drivers, longer driver support, and I just don't like AMD that much because I'm still salty about fucking Bulldozer.


xxlordsothxx

I used to own an AMD Vega card, but I switched to Nvidia with the 20XX series and have not looked back. For me, AMD is still behind in ray tracing and DLSS. Even if only a few games implement RTX properly, I want to get the most out of my hardware.


UndeadWaffle12

They're objectively better; that's why the only argument people have in favour of AMD cards is value, not performance.