cageymaru
Fully [H]
- Joined: Apr 10, 2003
- Messages: 21,912
HotHardware has conducted an interview with Nvidia Director of Technical Marketing Tom Petersen, in which they discuss various topics, such as how Turing compares to Pascal. Tom Petersen had this to say:
"The fact is that Turing, just a just the traditional Turing where you think about shaders and you think about memory bandwidth and all that kind of stuff. Turing is a beast and Turing is going to significantly improve the gaming experience on old games and it's going to just rocket when you adopt new technology. Right so we didn't do Turing just for Deep Learning and just for AI. Turing is our next step in that normal you know progression of Nvidia GPUs getting better every couple of years. Turing is our next big step so you know we did share some data that showed a bunch of games; looking at things like Final Fantasy, PUBG and you'll see the perf roughly somewhere between 35 - 45% better and roughly the same generation. So 2080 Ti to 2080 Ti and of course that is going to vary based on the game and based on the setting."
We think Tom misspoke when he said "2080 Ti to 2080 Ti" and actually meant "1080 Ti to 2080 Ti," which would make sense in the context of the generational comparison he was drawing.
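For a sense of scale, here is a rough back-of-envelope sketch of what a 35-45% generational uplift would mean for frame rates. The baseline figure below is an invented example, not a benchmark result:

```python
# Illustrative only: what a 35-45% generational uplift implies for frame rates.
# The baseline frame rate is a made-up example, not benchmark data.

def projected_fps(baseline_fps: float, uplift: float) -> float:
    """Scale a baseline frame rate by a fractional performance uplift."""
    return baseline_fps * (1.0 + uplift)

baseline = 60.0  # hypothetical 1080 Ti frame rate in some GPU-bound game
for uplift in (0.35, 0.45):
    print(f"{uplift:.0%} uplift: {baseline:.0f} fps -> "
          f"{projected_fps(baseline, uplift):.0f} fps")
# 35% uplift: 60 fps -> 81 fps
# 45% uplift: 60 fps -> 87 fps
```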
"So if you are CPU limited, like you're running an older CPU and you're running at lower resolutions, then really the GPU can't do anything because you're already... You know you're not giving the GPU enough work. And in those cases well you know we don't actually you know we don't get more work to do so it runs the same. But in most cases if you're paying you know if you're buying a high end GPU, you have a higher end resolution monitor or higher refresh rates, you're turning on more of the eye candy. And in the places where you are in fact GPU limited where you're seeing performance impacted by the GPU; Turing is gonna crush it! And it's it's a combination of faster shaders, you know faster clocks, wider memory bandwidths and architecture improvements. So even running existing games you're gonna see a nice bump."
"The fact is that Turing, just a just the traditional Turing where you think about shaders and you think about memory bandwidth and all that kind of stuff. Turing is a beast and Turing is going to significantly improve the gaming experience on old games and it's going to just rocket when you adopt new technology. Right so we didn't do Turing just for Deep Learning and just for AI. Turing is our next step in that normal you know progression of Nvidia GPUs getting better every couple of years. Turing is our next big step so you know we did share some data that showed a bunch of games; looking at things like Final Fantasy, PUBG and you'll see the perf roughly somewhere between 35 - 45% better and roughly the same generation. So 2080 Ti to 2080 Ti and of course that is going to vary based on the game and based on the setting."
We think that Tom misspoke when he said "2080 Ti to 2080 Ti," and actually meant "1080 Ti to 2080 Ti," which of course would make sense in the context in which he was speaking.
"So if you are CPU limited, like you're running an older CPU and you're running at lower resolutions, then really the GPU can't do anything because you're already... You know you're not giving the GPU enough work. And in those cases well you know we don't actually you know we don't get more work to do so it runs the same. But in most cases if you're paying you know if you're buying a high end GPU, you have a higher end resolution monitor or higher refresh rates, you're turning on more of the eye candy. And in the places where you are in fact GPU limited where you're seeing performance impacted by the GPU; Turing is gonna crush it! And it's it's a combination of faster shaders, you know faster clocks, wider memory bandwidths and architecture improvements. So even running existing games you're gonna see a nice bump."