PC hardware is incredibly diverse. Some gamers have brand new, top-of-the-line gaming PCs with the latest graphics cards, CPUs, and fast SSDs. For gamers with older PCs, the suggested solutions to poor performance are nearly always the same in every game: upgrade your hardware, change your settings, or drop your screen resolution. It's that last point I want to discuss today.
Monitor’s native resolution
In a perfect world, you would run all of your games at your monitor's native resolution. When I started gaming, we played at 320x200; these days I have multiple 4K and ultrawide monitors, and the difference in image quality is remarkable.
However, there are some games where even the fastest current hardware simply isn't capable of running at 4K, maximum quality, and 60 fps. Borderlands 3 and Gears of War 5 fall into this category.
How much this matters depends on the game. You can play at 4K with lower quality settings, and the difference between ultra and high is often more placebo than something you would notice without comparing screenshots or benchmarks. Dropping from ultra to medium is a bigger compromise and usually shows a clear difference, and in games like Rage 2, going from maximum to minimum quality produces an obvious drop in visual fidelity.
Upgrading the hardware
Short of upgrading your hardware, dropping the resolution is the easiest way to boost performance. 1440p runs significantly faster than 4K at the same settings, and 1080p is usually around 30 to 40 percent faster than 1440p.
Playing games at 1080p instead of 4K will roughly double your framerate on everything short of a 2080 Ti. That's why many gamers suggest 1440p as the sweet spot for gaming: performance improves substantially, and the image still looks sharp enough that you won't feel you've compromised.
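The rough math behind those framerate differences comes down to pixel counts. Performance doesn't scale perfectly linearly with resolution, but as a loose sketch:

```python
# Pixel counts for common gaming resolutions. GPU workload scales roughly
# with the number of pixels rendered, which is why dropping resolution
# gives such a large performance boost.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4k"] / pixels["1440p"])    # 4K renders 2.25x the pixels of 1440p
print(pixels["1440p"] / pixels["1080p"]) # 1440p renders ~1.78x the pixels of 1080p
print(pixels["4k"] / pixels["1080p"])    # 4K is exactly 4x the pixels of 1080p
```

That exact 4:1 ratio between 4K and 1080p is what makes the integer-scaling discussion later in this article possible.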
But what if you want a 4K or 1440p monitor for general use, for work, and for watching movies, and you also want to play games on it? Does running at 1080p on a 4K or 1440p display look worse than simply using a 1080p monitor? The answer is yes, it does look worse, but not as much as you might expect.
Back in 2004, when CRTs were still common, running below the monitor's maximum resolution was commonplace. CRTs are inherently less precise and always have a bit of blurriness, even at their highest resolutions, so the difference was hard to notice. When we shifted from CRTs to LCDs, displays suddenly had a fixed grid of physical pixels: the native resolution looks pixel-perfect, but every other resolution has to be scaled to fit that grid.
Take the simple example of a 160x90 image with a diagonal black stripe running through it. Now try to stretch that image to 256x144. Since 256/160 = 1.6, each source pixel has to cover 1.6 destination pixels, and we run into the problem of not being able to scale the image cleanly.
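You can see the problem with a short sketch of how nearest-neighbor scaling maps one 160-pixel row onto 256 destination pixels (the 160x90 and 256x144 figures are the ones from the example above):

```python
# Sketch of the non-integer scaling problem: 256/160 = 1.6, so under
# nearest-neighbor scaling some source pixels get copied once and others
# twice. On a diagonal stripe, this uneven duplication shows up as a
# jagged, irregular staircase.
src_width, dst_width = 160, 256

# For each destination pixel, pick the nearest source pixel.
mapping = [int(x * src_width / dst_width) for x in range(dst_width)]

# Count how many destination pixels each source pixel produces.
copies = [mapping.count(s) for s in range(src_width)]

print(sorted(set(copies)))  # [1, 2] -- some pixels doubled, some not
print(copies[:10])          # [2, 2, 1, 2, 1, 2, 2, 1, 2, 1] -- irregular pattern
```

There is no way around this: 256 is not a whole multiple of 160, so any scaler must either duplicate pixels unevenly (nearest neighbor) or blend neighboring pixels together (bicubic and friends).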
LCDs and video drivers
LCDs and video drivers have to take care of the interpolation, and while the results can look decent, there's no denying the deficiencies of both nearest-neighbor and bicubic scaling. Nearest neighbor duplicates some pixels but not others, producing uneven edges, while bicubic scaling causes a loss of sharpness. Running 1080p on a 4K display is a special case: it's exactly one quarter of the native pixel count, so if your graphics card supports integer scaling, it can simply double the width and height and produce a much sharper picture.
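Here's a minimal sketch of what integer scaling does, in contrast to the uneven nearest-neighbor case above. When the target resolution is an exact multiple of the source (1080p to 4K is exactly 2x in each dimension), every source pixel becomes a clean block of identical pixels, with no interpolation and no blur:

```python
# Integer scaling: duplicate each pixel into a factor x factor block.
# With an exact 2x ratio (1080p -> 4K), every pixel is treated identically,
# so edges stay perfectly crisp -- just chunkier.
def integer_scale(image, factor):
    """Scale a 2D image (a list of rows) by an integer factor."""
    out = []
    for row in image:
        scaled_row = [px for px in row for _ in range(factor)]
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

# A tiny 2x2 "image": a diagonal of dark (1) pixels on a light (0) background.
tiny = [[1, 0],
        [0, 1]]

for row in integer_scale(tiny, 2):
    print(row)
# [1, 1, 0, 0]
# [1, 1, 0, 0]
# [0, 0, 1, 1]
# [0, 0, 1, 1]
```

The diagonal stays a clean staircase, which is exactly why integer scaling is so popular for pixel-art games.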
Intel and Nvidia now support integer scaling, though it requires an Ice Lake 10th-generation CPU for Intel (laptops) or a Turing GPU for Nvidia.
High-resolution effects
It's easier to show what this looks like with concrete comparisons, such as 1080p scaled to 1440p and 4K using integer scaling versus bicubic filtering. Integer scaling is an excellent feature for pixel-art games, though it matters less for other titles. So why am I talking about integer scaling and various driver filtering techniques when the topic was playing games below native resolution?
Because the two topics are connected. With the right hardware, integer scaling gives you a sharp image at a lower resolution; without it, most displays and drivers fall back to bicubic scaling.
So while running at your display's native resolution is ideal, it's often not practical in games. Many people want a higher-resolution monitor for tasks beyond gaming anyway. If you own a 4K monitor and play games at 1080p or 1440p, the softness is noticeable up close, but from a normal viewing distance the image still looks good.