Aside from FPS, is there any difference in ray tracing quality between Nvidia and AMD, or is it the same? (Like how they say DLSS is better than FSR.)
I have a Freesync monitor (MSI) with an Nvidia RTX 3060, and Nvidia control panel gives me the option to “enable support for unverified displays” or something. Works just fine for me?
That’s FreeSync, although Nvidia, confusingly, calls it G-Sync, just like their other frame sync tech.
G-Sync required an expensive module in the display; FreeSync doesn't.
Nvidia lost the G-Sync vs FreeSync battle, but because of their marketing chops, they managed to get away with just slapping their name on it and going with the open solution.
DLSS has been much more successful, but the equivalent would be if they started using FSR and just rebranded it as DLSS.
Oh that’s really gross… but of course they get away with something like that.
That just means your Nvidia card can make use of a FreeSync monitor; there's no "real" G-Sync happening there.
Actual G-Sync comes with a dedicated hardware module in the monitor. It used to be compatible only with Nvidia cards, but that's no longer the case either.
So how does it work? Is it a fake software level that mimics G-Sync behavior? Something like V-Sync?
It's pretty much just leveraging the open VESA Adaptive-Sync standard, of which AMD FreeSync is, practically speaking, a rebrand. It's indeed purely software, which makes it vendor-agnostic.
Well, unless the vendor deliberately locks it down or blocks it, which is what Nvidia did up until "G-Sync Compatible" became a thing.
(Misleading name imo, because as said before, there's no actual G-Sync running.)