You could do that, but at that point, wouldn't good anti-aliasing be computationally cheaper, and look just as good?
AA is essentially blurring, so it will never look as good.
You could do that, but at that point, wouldn't good anti-aliasing be computationally cheaper, and look just as good?
The only good anti-aliasing is supersampling. Everything else is inferior to upping the resolution.
You could do that, but at that point, wouldn't good anti-aliasing be computationally cheaper, and look just as good?
And once you hit a certain DPI, using DSR instead of increasing the resolution should provide similar results.
Personally I like 100 DPI for the desktop. I hate any kind of desktop scaling and don't see any need to go above this. A desktop screen is not a phone; we don't hold it mere inches from our eyes (at least most people don't).
The only good anti-aliasing is supersampling. Everything else is inferior to upping the resolution.
DSR is just a fancy name for supersampling. The only difference compared to SSAA is that it supersamples everything, not just apparent edges.
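To make that concrete, here is a minimal toy sketch of the downscale step that SSAA/DSR-style supersampling boils down to: render at a higher resolution, then filter the result down to the display resolution. This uses a plain 2x2 box filter of my own choosing, not NVIDIA's actual DSR filter (which is Gaussian-like), so treat it as illustrative only.

```python
import numpy as np

# Toy sketch of supersampling's downscale step: render at 2x the target
# resolution in each axis, then average every 2x2 block down to one output
# pixel. A simple box filter, purely for illustration of the idea.
def downsample_2x(frame: np.ndarray) -> np.ndarray:
    """frame: (2*H, 2*W, C) high-res render -> (H, W, C) output image."""
    h2, w2, c = frame.shape
    blocks = frame.reshape(h2 // 2, 2, w2 // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# Example: a fake 2160p "render" averaged down to a 1080p output frame.
hi_res = np.random.rand(2160, 3840, 3)
print(downsample_2x(hi_res).shape)  # (1080, 1920, 3)
```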
Scaling is not the devil. Bad scaling, which unfortunately every Windows version has had, is the problem. There is absolutely no problem with UI scaling on an Android phone, and the crisp, clear fonts and menus look great. We're currently missing out on that on the desktop due to the absolutely shit implementation of scaling in Windows.
So while I agree that scaling currently looks bad for desktop use, the topic here is about gaming.
Saying scaling is shit, therefore we should never up the resolution again, is not a solution.
MS should be pressured to make it better, or die. And let's hope, for the sake of everything that's holy, that Windows 10 is not the last Windows.
Pretty quickly we are going to reach the point where the overwhelming majority of users will think that the marginal improvement they get is not worth having to buy a $1,200 GPU.
Then they'll find other reasons to buy new GPUs. Or they'll come around.
In 1994 most people thought 320x200 was enough for gaming - why would I want 640x480?
In 1999 most people thought 640x480 was enough for gaming - why would I want 1024x768?
In 2004 most people thought 1024x768 was enough for gaming - why would I want 1440x900?
In 2009 most people thought 1680x1050 was enough for gaming - why would I want 1920x1080?
In 2014 most people thought 1920x1080 was enough for gaming - why would I want 3840x2160?
Do you see the pattern there?
And now, in 2017, two years early, most people are asking: why do we need 8K?
Like it or not, resolution is doubling roughly every 5 years, so by 2019 I expect 8K monitors to be widely available and used - see the quick arithmetic below.
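As a rough sanity check on that claim, here is the simple arithmetic behind the list above, using the "new" resolution being argued for at each date; the 2019 row is my reading of the post's 8K prediction, not an established milestone.

```python
# Total pixels per step and the step-over-step growth factor for the
# resolution milestones listed above (illustrative arithmetic only).
milestones = [
    (1994, 640, 480),
    (1999, 1024, 768),
    (2004, 1440, 900),
    (2009, 1920, 1080),
    (2014, 3840, 2160),
    (2019, 7680, 4320),  # the post's 8K prediction
]

prev = None
for year, w, h in milestones:
    pixels = w * h
    growth = f"{pixels / prev:.1f}x" if prev else "-"
    print(f"{year}: {w}x{h} = {pixels / 1e6:.2f} MP ({growth} over the previous step)")
    prev = pixels
```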
I do, but screen size has also grown, keeping desktop DPI at roughly 100.
I have a 48" 4K TV on my desk that I use at about a 2.5 ft viewing distance. I don't think I either can or want to go any bigger. Even the 48" screen is a tad big. (I think 4K would be perfect at ~42-43".)
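For what it's worth, a quick pixels-per-inch calculation (assuming a standard 16:9 3840x2160 panel) backs that up:

```python
import math

# Back-of-the-envelope PPI check for a 16:9 4K (3840x2160) panel at the
# screen sizes mentioned above, against the ~100 DPI desktop preference.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: pixel diagonal divided by the physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for size in (42, 43, 48):
    print(f'4K at {size}": {ppi(3840, 2160, size):.1f} PPI')
# 4K at 42": ~104.9 PPI, at 43": ~102.5 PPI, at 48": ~91.8 PPI, which is
# roughly why ~42-43" sits closest to the 100 DPI sweet spot.
```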
So I just bought some GTX 1050s for my family's HTPCs, which have the HDMI 2.0b standard... hope it gets updated for 2.1.
Wow, those are some beefy GPUs for HTPC use.
I currently have GT 720s in my three HTPCs. I've been waiting for and hoping that Nvidia will launch follow-up low-end GPUs so I can add newer versions with hardware HEVC decoding, because 1050s seem like total overkill, are too expensive, and use too much power - IMHO - for my movie-watching needs.
I'm starting to think that Nvidia has decided to permanently cede the low end to onboard graphics, which is a shame, as none of the onboard solutions from either AMD or Intel have been fully stable in my implementations.
You sure we're talking about the same card? https://www.techpowerup.com/reviews/MSI/GTX_1050_Ti_Gaming_X/25.html
Idle power draw is 3 watts, which IMO is incredibly efficient. TDP is only 75 watts. Noise is non-existent. I did a lot of research, and considered it probably the ultimate HTPC video card for a budget of ~$100.
All the problems I have ever had with HTPCs have been due to Intel drivers. Screen flicker, copy-protection crap, and whatnot were always limited to my integrated-graphics systems and were resolved with a dedicated video card, which offers far better overall performance anyway, since HTPCs are used for just about everything these days. Heck, on my own HTPC I'm running a 290X, and when I retire my 295X2 from my primary rig, it will go into my HTPC.
Anyone know if the newly increased HDMI 2.1 bandwidth is more or less than the USB 3 standard? As much as I appreciate HDMI as a backwards-compatible standard, I'd still like to see USB-C connectors start appearing on monitors/televisions/GPUs.
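Not an authoritative answer, but going by the nominal figures in the published specs, HDMI 2.1's raw link rate is well above any current USB 3.x data rate; note also that video over USB-C today normally rides DisplayPort Alt Mode rather than the USB 3 data lanes. A ballpark comparison:

```python
# Nominal raw link rates from the published specs (ballpark comparison only).
link_gbps = {
    "HDMI 2.0": 18,
    "HDMI 2.1": 48,
    "USB 3.0 / 3.1 Gen 1": 5,
    "USB 3.1 Gen 2": 10,
}
for link, gbps in sorted(link_gbps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{link:>20}: {gbps} Gbit/s")
```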
Stop buying $299 TVs at Wal-Mart.
Every LG OLED that BB has listed has at least 3 HDMI ports, and the majority have 4.
Samsung KS8000: 4
Samsung KU7000: 3
Sony X850D: 4
Vizio D3: 4
Sharp 7000U: 4
Toshiba 621I: 3