Going back to non-G-Sync is literally painful.
Really?! Where does it hurt??
Seriously, I get that you aren't interested in Vega, but is it necessary to post the exact same negative drivel over and over? It's a GPU that is stronger than other cards in its own stack. It doesn't best Nvidia's lineup. If you need Nvidia's performance level, then buy Nvidia. If you hate paying for Nvidia, well, tough. Pick one.
If you give any shit about performance in an FPS, you run at whatever frame rate gives the lowest input lag - which is normally uncapped, or capped based on the monitor technology (like maximum refresh rate minus 2 on a G-Sync monitor).
If you can do that at maximum in-game settings, so much the better. There should be no variable frame rate, which makes G-Sync/FreeSync pointless. The smoothness debate for multiplayer FPS games is pretty moot.
I know that isn't how HardOCP works or tests, but it is how to do it.
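To make that cap rule concrete, here is a minimal sketch of the heuristic described above (the 2 fps offset and the uncapped fallback are just the rule of thumb from this post, not a universal recommendation):

```python
# Rough sketch of the cap rule above: run uncapped on a plain fixed-refresh
# monitor, or cap a couple of frames under the maximum refresh when VRR
# (G-Sync/FreeSync) is in use. The offset of 2 is the heuristic quoted here.
def fps_cap(refresh_hz: int, vrr: bool, offset: int = 2):
    """Return the frame-rate cap to use, or None for uncapped."""
    if vrr:
        return refresh_hz - offset   # stay inside the VRR window
    return None                      # no VRR: uncapped for lowest input lag

print(fps_cap(100, vrr=True))    # 98 on a 100 Hz G-Sync/FreeSync ultrawide
print(fps_cap(144, vrr=False))   # None, i.e. uncapped
```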
I have only one question about this test, and that question will remain properly unanswered until we know the official MSRP:
Why the hell did they use monitors with vastly different panels? I know, I know, these Samsung VA panels are the shit and Samsung are magicians for making VA work at high refresh rates, but why the hell use an LG panel on the G-Sync monitor? Just to make sure you are using the most expensive G-Sync monitor on the market?
In fairness, you have just been proven wrong by Kyle's test. IMO.
Nice, these are the type of videos that make me love this site.
Exactly the opposite!
I'm a person who uses numbers and stats. I cannot do anything without them!
This video had neither. At the beginning I thought that at least in this video we would see some FPS measurements in DOOM. Instead, I found myself in shock when I realised that not only did this video have nothing but subjective opinions, we also didn't get any info about the drivers on AMD's setup!
Can someone please tell me: what kind of conclusions can anyone draw if we don't even know the driver version?!
While it was entertaining to watch Kyle and the other guys express their opinions, this video didn't provide me with anything really useful.
It's like, if they don't like the result, attack the process. If you can't attack the process, attack the source. What next?
As for the AMD drivers, what about them? They are most likely an unreleased version of the drivers. How does that make a difference in regard to the conclusions we can make concerning Vega?
I will answer with a question: if the driver version is of no importance, then why does AMD feel the need to keep it secret?
Things I took away from this video:
NVIDIA needs to REALLY examine the premium cost of G-Sync monitors.
NVIDIA needs to rethink the price of their card lineup, at least somewhat.
AMD's gotten pretty secretive over the last 10 years or so. I'm not surprised they did this. It might be a modified version of an existing driver, or a driver that might never see the light of day. I get what you are implying: the performance we saw with this driver may not be reproducible with whatever driver hits when the card officially launches, the implication being that this driver was massaged to put on a show in which image quality was somehow compromised in order to improve the speed of Vega during testing. Does that sound like it's in line with what you are thinking?
I'm not going to say that's impossible. We've seen NVIDIA and AMD pull shit like that in the past. I will say that we all discussed Vega and asked the AMD rep questions about Vega which can't be discussed at this time. None of us saw any image quality issues or differences between the setups. The systems were right next to each other for easy visual comparison. I do think that Vega's amazing performance, which seemed to exceed the GeForce GTX 1080 Ti, is limited to Doom, and more specifically to Doom under the Vulkan API.
The last thing I'll say, which Kyle alluded to in the video, is that AMD had a different playbook for this, which was thrown out as usual. Kyle said in the video that he pulled the AMD-supplied NVIDIA card and blew both OS installations away to ensure that the test couldn't be skewed by AMD. The only thing he didn't do was install the AMD Vega drivers.
My personal take is that AMD knows the card is slower, and they want to prove that the performance deficit doesn't matter because you can't really tell the difference in a Pepsi Challenge type of scenario. Moreover, a lower-priced setup on the AMD side will serve you just as well and put money in your pocket. As for the AMD drivers, what about them? They are most likely an unreleased version of the drivers. How does that make a difference in regard to the conclusions we can make concerning Vega?
As for the panel differences, it's easy to say "why didn't you use X panel vs. Y panel?" The reality is, sometimes you have to work with whatever is on hand. Sometimes you can't line up everything you want to, and if you want to get an article out on time, you need to run with what you have. Lastly, the FreeSync and G-Sync panels can't be the same because monitors are either one technology or the other, which prevents you from doing a perfect apples-to-apples comparison.
Totally agree on the drivers. What does it matter what driver AMD installed? They surely weren't going to install duds, and no magic driver is going to push hardware beyond what it can do.
This is the perfect example of how real-world test results show that numbers and figures don't tell you shit. The fact that the majority can't see the difference tells you that paying $300 for "numbers and figures" means nothing if you can't see a difference. So if you want to get that extra 0.05% and pay $300, that's your prerogative. This video tells me that the "on paper" results don't show the real-world results.
No, I wasn't thinking of something as complicated as what you described (I was impressed by your thoughts, to be honest!).
I was thinking more that AMD could have used a newer driver version (compared to the Frontier Edition; EDIT: Frontier, not Founders as I wrote by mistake), but perhaps they were afraid of its performance compared to the 1080 Ti (don't forget that at first Kyle was thinking of testing 5 games instead of just DOOM), and they didn't want to reveal to the public that their newest driver can't compete with the competition (just a thought, but it's AMD's fault because they chose to keep the drivers secret).
The perfect blind test would have removed the FreeSync vs. G-Sync BS and used the exact same panel on both machines, with the exact same settings. That would have truly been a blind test of GPU/system performance, not a monitor A vs. monitor B test. All the rest of the hardware was actually irrelevant.
You mean they should not have done the test the way they did it at all and thrown it out in favor of regular testing? The irony.
I like the video style and the testing being done in a blind manner, but as has been mentioned before, Doom is never going to dip below 100 fps on at least a 1080 Ti. For me it stays closer to 200 fps. So even if Vega were half as powerful, both systems would display the equivalent of a locked 100 Hz, as if V-Sync were on, with the only difference being possible input lag between FreeSync and G-Sync, not visual differences. If the experience was the same, that's because it was the same, not because one card was faster. Without a frame-time graph we only know that the limiting factor was the slow panels. I do enjoy the format regardless of this and like hearing from real gamers. Hopefully a full review will be out soon, if AMD doesn't delay Vega until 2019.
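For reference, this is roughly what a frame-time capture would add to a test like this; a minimal sketch assuming a plain text log with one frame time in milliseconds per line (the capture tool and exact format are left open):

```python
# Minimal sketch: given per-frame render times in milliseconds (one value per
# line in a plain text file; the capture tool and format are assumptions),
# report how often the card actually drops below the panel's refresh ceiling.
import sys

REFRESH_HZ = 100.0                 # the ultrawide panels discussed above run at 100 Hz
CEILING_MS = 1000.0 / REFRESH_HZ   # 10 ms per frame at 100 Hz

def analyze(path: str) -> None:
    frame_times_ms = [float(line) for line in open(path) if line.strip()]
    fps = [1000.0 / t for t in frame_times_ms]

    under_refresh = sum(1 for t in frame_times_ms if t > CEILING_MS)
    worst_1pct = sorted(fps)[: max(1, len(fps) // 100)]

    print(f"frames captured     : {len(fps)}")
    print(f"average fps         : {sum(fps) / len(fps):.1f}")
    print(f"1% low fps          : {sum(worst_1pct) / len(worst_1pct):.1f}")
    print(f"frames under {REFRESH_HZ:.0f} Hz : {100.0 * under_refresh / len(fps):.2f}%")

if __name__ == "__main__":
    analyze(sys.argv[1])
```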
So that was with default settings on both machines.
AFAIK*, on NVIDIA the default is still V-Sync OFF (and probably the same with AMD), so when you go over the VRR range, smoothness suffers (and there is tearing too, but it's hard for most people to see it at high fps/Hz, especially in a game like Doom). I'm pretty sure the 1080 Ti must have been going past 100 fps a lot of the time there, much more often than Vega anyway. I would have preferred Vega too.
*happens to me whenever I upgrade my drivers using the "clean install" button or when I set up a new system with G-Sync
V-Sync is on by default when you install the drivers with a G-Sync display attached, and G-Sync is enabled for full screen applications. I always install the drivers with the clean install option and this is the behavior for me.
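For what the exchange above is describing, a rough model of the commonly cited G-Sync/V-Sync interaction looks like this (illustrative only; nothing here was measured on the test systems, and the defaults clearly differ between setups):

```python
# Sketch of how frame delivery is usually described on a G-Sync/FreeSync panel,
# depending on whether the frame rate stays inside the VRR window and whether
# V-Sync is enabled. Illustrative thresholds, not measurements from the video.
def display_behavior(fps: float, refresh_hz: float, vrr_on: bool, vsync_on: bool) -> str:
    if vrr_on and fps <= refresh_hz:
        return "VRR active: panel refresh follows the frame rate (smooth, low lag)"
    if fps > refresh_hz:
        # Above the VRR window the panel falls back to fixed-refresh behavior.
        return "V-Sync pacing (added lag)" if vsync_on else "tearing (hard to spot at high fps)"
    return "fixed refresh below the cap" if vsync_on else "possible tearing"

# The scenario being argued about: a 1080 Ti pushing past 100 Hz panels at defaults.
print(display_behavior(fps=160, refresh_hz=100, vrr_on=True, vsync_on=False))  # tearing
print(display_behavior(fps=160, refresh_hz=100, vrr_on=True, vsync_on=True))   # V-Sync pacing
```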
The extra 300 bucks can make a difference, a difference that can't be seen in a subjective test based on one game. That is why you need both the subjective results and the hard data to make an informed decision.
On-paper results don't give you the subjective results, but if there were other games in the test suite the results could be very different in a subjective test. That is why the hard data is extremely valuable. From the hard data you can extrapolate, to a certain degree, how other apps might perform based on raw performance, although it's not wise to rely solely on that.
Also, we are talking about 2K here; wasn't Vega marketed as a 4K card?
Many things are NOT aligned with AMD's early marketing efforts. Has something changed in their view of Vega? Could be.
Totally disagree. Exactly the opposite!
Check out JosiahBradley's post #34 for why:
OK, not sure why it doesn't for me then. It also doesn't for some other people; remember on the nvidia forums when people went up in arms because they were seeing a bit of tearing with G-Sync monitors? The issue was exactly this: V-Sync off with G-Sync on and high-framerate games (and of course those people had not read the changelog of the drivers).
Could be the .inf isn't consistent among the varying hardware, but I believe you. I'm sure that if there was tearing on one machine and not the other it would have been brought up by the people testing them.
Well, you would also have to factor in other variables, like the cost of the actual panel being used. VA panels are typically cheaper than IPS to begin with.
Correct me if I am wrong here:
The $200 G-Sync tax we have all been assuming doesn't seem correct for ultra-wides. In fact, the ultra-wide G-Sync tax is actually closer to $500...
That changes the entire argument.