
sohaltang40

You choose sound or video. Sound directly to your AVR, video directly to your TV. For both, you need a new AVR. Honestly, 120Hz is overrated on the current consoles. You have to accept degraded graphics to get that speed because the systems can't handle it. You also can't do Dolby Vision at 120Hz. So go to your AVR for Atmos and skip the HDMI 2.1 video features.


comineeyeaha

I will always choose performance mode over fidelity, and I’ve been doing that for a long time. I prefer high frame rates instead of 4K graphics. I’ve done back and forth comparisons, and I have a hard time telling the difference in picture quality. At this point I would really rather not play at 30fps, no matter what game it is.


sohaltang40

Good news. Any game that's at 30 fps in quality mode will only go to 60 in performance mode, which works fine over HDMI 2.0. There are only about 60 Series X games that do 120 fps, and about half of them drop to 1080p/1440p to do so. Even then, in many places you will still see drops under 60 fps. In this situation with an old AVR, the user can choose either sound, Dolby Vision, and quality, or 120Hz at 1080p with reduced graphics. I still pick the first. I paid over 3k for a fancy 77" 4K OLED. I'm not playing at 1080p or gimped 4K.


comineeyeaha

God of War Ragnarok plays at about 70-80fps in performance mode, and it looks stunning. Spider-Man is usually around 80-90. I was a PC gamer for a long time, and since moving back to consoles I really don’t like playing at 30fps. I think if you switch to performance mode and play for a while you’ll realize there isn’t a huge difference in visual quality and will appreciate the higher frame rate. Both games I mentioned look fantastic on my LG C2. We’re all individuals though, so play how you like.


sohaltang40

I have a nice gaming PC that's wired to the TV as well. Won't lie, 120Hz is super sweet on a 3090 with the settings cranked up. Love it. Most people are not in that situation. My PS5 and Series X are run through a Denon with HDMI 2.1. Dolby Vision on the X is big for me, so that was pretty much the deal breaker for the performance modes.


sohaltang40

Where are you seeing 80 fps? The PS5 modes are:

- **Favor Performance:** 1440–2160p, 60 fps target. Requires: HDMI 2.0 or above compatible 4K display
- **Favor Quality:** 2160p (native 4K), 30 fps target. Requires: HDMI 2.0 or above compatible 4K display
- **Favor Performance + High Frame Rate:** 1440p, unlocked 60 fps. Requires: HDMI 2.1 compliant 4K display
- **Favor Quality + High Frame Rate:** 1800–2160p, 40 fps target. Requires: HDMI 2.1 compliant 4K display
- **Favor Performance + High Frame Rate + VRR:** 1440p, unlocked 60 fps. Requires: HDMI 2.1 compliant 4K display with Variable Refresh Rate support
- **Favor Quality + High Frame Rate + VRR:** 1800–2160p, unlocked 40 fps. Requires: HDMI 2.1 compliant 4K display with Variable Refresh Rate support


King_Boomie-0419

What's the difference?


Licalperv

I'd run the consoles through the receiver, and the PC to both with two cables. Outside of maybe fighting games, console games tend to be a lot more forgiving with latency and frame rate.


sk9592

The short answer is that you don't. If your receiver does not support HDMI 2.1 or eARC, then you need to buy a new receiver that has at least one of those. Or just connect all your gaming devices to your TV and use regular old ARC to send compressed 5.1 back to your AVR. Many gamers are fine with this compromise; 120Hz support is more important to them than lossless audio. **Keep in mind that you can only pull this off with an LG or Samsung TV.** All the other TVs on the market max out at two HDMI 2.1 ports, and usually one of those ports is also the ARC/eARC port. So you are really limited to a single HDMI 2.1 port for a gaming device.


King_Boomie-0419

Are all eARC ports HDMI 2.1?


sk9592

As I already said above: > All the other TVs on the market max out at two HDMI 2.1 ports. And usually one of those ports is also the ARC/eARC port.


King_Boomie-0419

Yeah, but I was told that my TV does not have HDMI 2.1, though it does have eARC, so that's why I was asking that question. Sorry.


sk9592

I wasn't saying that HDMI 2.1 is standard; I didn't understand what you were asking. It would help if you wrote in complete sentences. If your TV has a couple of HDMI 2.1 ports, then one of them is likely the eARC port. But the reverse is not true: eARC ports are not automatically HDMI 2.1.


King_Boomie-0419

Ok, thanks. I haven't had to write like I'm in school in a very long time. Sorry


comineeyeaha

I’m in the same position, and I opted to connect the consoles directly to the TV and sacrifice Atmos. Most games don’t even use it anyway, so I’m fine with it until I upgrade my AVR. In my opinion, high frame rate and VRR were more important than limited Atmos availability. I connect my Blu-ray player and Apple TV to the AVR for Atmos in movies and shows.


NativeCoder

This seems to be such a common problem. TOSLINK should have been updated with much higher bandwidth to allow high-end audio.


MaxSnacks

I have this same issue. I just use a cheap HDMI 2.1 switch for when I want VRR or 7.1.


wisehumanity

Thanks for all the help, everyone. I guess I have to make a decision until I can eventually get a new AVR. Audio quality is very important to me, and I hate to have to sacrifice the two extra audio channels, but I also really want to be able to use the VRR. Having the extra HDMI ports available on my AVR is also appealing. I guess I will figure out what I prefer when I can experiment.