First RTX 2080 Benchmarks Hit the Web with DLSS

Meh, we'll see. AMD is far enough behind at the moment that they would need decent IPC gains on top of the node shrink to beat the 2080 Ti in performance. Not saying it won't happen, just that the node shrink alone probably won't get the job done for AMD.

True, but come on. Why do you think they released the 2080 Ti alongside the 2080 for the first time? They have never done that before. They know there wouldn't be enough time to wait on it because 7nm is coming next year, so they are cashing out on it. I can pretty much bet we see 7nm cards from Nvidia and AMD next year. Heck, even if it's a Turing refresh it will be pretty damn fast.
 
So, if you want the new fancy AA, you cannot use Ray Tracing

View attachment 98143

WTF, hahaha. It's funny how they are going about it. Almost like they are controlling the developers. Nvidia does have enough cash to throw around, I guess. So games that support DLSS don't support RTX? They know it can't do both, so let's make sure no games support both? lol. Nvidia seriously has some serious pull with game developers.

That chart makes it look like it's either one or the other.
 
Meh, I prefer to wait for reviews; I trust those internal benchmark charts about as much as I trust a hungry rottweiler with my balls in its mouth.
 
What about those games like MechWarrior 5 where it's yes across the board?

I think for those no one knows yet, however.....all the other ones it's one or the other. I really hope I can run both as I bought 2 cards...
Maybe in the new SLI I can have one do RT and the other card do AA?
 
I already dropped the coin for an MSI 1080 Gaming X and sold my 1070 Gaming X. I'm not paying $650 or more for a mid-range card like the 2070 just to have ray tracing and the ability to run at a resolution I don't have; I just bought a 2K monitor after being at 1080p.

1440p 144hz here and it’s really nice. Also no 1080ti to sell me?! ;)
 
I think for those no one knows yet, however.....all the other ones it's one or the other. I really hope I can run both as I bought 2 cards...
Maybe in the new SLI I can have one do RT and the other card do AA?

If that’s the case then my wallet is not just gonna get raped. It’s gonna get tossed out in the dumpster like a used condom
 
If that’s the case then my wallet is not just gonna get raped. It’s gonna get tossed out in the dumpster like a used condom
 
True, but come on. Why do you think they released the 2080 Ti alongside the 2080 for the first time? They have never done that before. They know there wouldn't be enough time to wait on it because 7nm is coming next year, so they are cashing out on it. I can pretty much bet we see 7nm cards from Nvidia and AMD next year. Heck, even if it's a Turing refresh it will be pretty damn fast.

It's not much, if any, different than normal, really. The GPU they are calling the 2080 Ti now is what was called a Titan for Pascal, Maxwell, and Kepler. Heck, even the price of the 2080 Ti matches that of the previous Titan cards.

Don't be surprised if the 2070 Ti is released in 9 months, slightly cut down from the 2080 Ti, possibly with higher clock speeds making them nearly identical, all at a substantially lower price. Likely timed to coincide with the release of Navi, to make sure they have a GPU in the same price/performance tier as whatever the AMD flagship will be.
 
People were complaining that NVIDIA didn't provide any performance numbers, and they still are now that they did...
Yeah, well, people were complaining that Nvidia was sitting on its laurels and not releasing new cards, too. The bandwagon is in full swing, though.

You'd swear nobody has been through an Nvidia release before.
 
People were complaining that NVIDIA didn't provide any performance numbers and they still are now that they did...

These aren't real performance numbers. Look at that axis. If they want to be believed, they need to provide enough information that the results are reproducible, so that we can hold them accountable later if it doesn't line up with third-party testing.

I just drew a chart that's about equally useful:

[attached: hand-drawn chart]
 
These slides raise more questions than they answer. Nvidia should just man up and let reviews drop in the next week rather than at launch.
 
These slides raise more questions than they answer. Nvidia should just man up and let reviews drop in the next week rather than at launch.

And it's like this every time with a new GPU launch. Cards at this level of performance, with really no competition from AMD anytime soon, are going to sell early on even at these prices without detailed reviews.
 
It's not much, if any, different than normal, really. The GPU they are calling the 2080 Ti now is what was called a Titan for Pascal, Maxwell, and Kepler. Heck, even the price of the 2080 Ti matches that of the previous Titan cards.

Don't be surprised if the 2070 Ti is released in 9 months, slightly cut down from the 2080 Ti, possibly with higher clock speeds making them nearly identical, all at a substantially lower price. Likely timed to coincide with the release of Navi, to make sure they have a GPU in the same price/performance tier as whatever the AMD flagship will be.

Why would the 2070 Ti be slightly cut down from the 2080 Ti? Wouldn't that make the 2080 obsolete? I think what you mean is a 2080+, if that were ever to happen.
 
And it's like this every time with a new GPU launch. Cards at this level of performance, with really no competition from AMD anytime soon, are going to sell early on even at these prices without detailed reviews.
Which is why it is even more perplexing. We know nothing is going to touch them or slow them down, so what is there to hide now that is going to change on the 20th?
 
Which is why it is even more perplexing. We know nothing is going to touch them or slow them down, so what is there to hide now that is going to change on the 20th?

Nvidia is just doing really basic hype stuff here that's somewhere in the ballpark of real. Nothing new. Unless these cards are just garbage, there's no need to get reviews out there that early.
 
If they're comparing SSAA to DLSS, of course you're going to see a mad boost; it's literally 4x resolution with all its headaches versus hardware-accelerated upscaling. This is like those 4K "ready" Blu-ray players, a marketing joke.
 

Damn, that is a long wait. By then initial orders will have shipped already, I think. Also, it just seems that Nvidia is marketing ray tracing and DLSS, so you will see a big boost if you enable those features. When it comes to pure raw performance, I don't think we'll see huge gains like we did with Pascal. The only ones to benefit from this are those with high-end monitors, it seems. But when we get to 7nm I think these cards will be much cheaper, and AMD will have something based on 7nm tech. So I think next year will be a much better time to buy.
 
Damn, that is a long wait. By then initial orders will have shipped already, I think. Also, it just seems that Nvidia is marketing ray tracing and DLSS, so you will see a big boost if you enable those features. When it comes to pure raw performance, I don't think we'll see huge gains like we did with Pascal. The only ones to benefit from this are those with high-end monitors, it seems. But when we get to 7nm I think these cards will be much cheaper, and AMD will have something based on 7nm tech. So I think next year will be a much better time to buy.

Initial batches of RTX cards won't ship until September 20th, so it should be no problem canceling a preorder before reviews come out if the 14th is the end of the embargo and the reviews are ready to go. My two FE 2080 Tis won't ship until 10/08/2018, and I pre-ordered about an hour after they went on sale Monday.

You can always play the waiting game. I'm looking at the 2080 Tis to smooth out 4K; the 1080 Ti is pretty solid there but quite a ways from a reliable 60 FPS at 4K maxed.
 
For those that are wondering more about DLSS:

Watch for about 2-3 minutes

What we can infer from this segment is that DLSS is not so much an anti-aliasing method as it is an image upscaler:
render at sub-4K, then get the tensor cores to fill in the gaps.

You can decide for yourself whether the hit to texture quality is worth the speed boost. I'd say a 40% boost is probably worth it (according to those charts). But the fact is DLSS is not available as a general setting; it requires per-game support.

And you might cringe and say, well that's not native 4K, and you'd be right. But we're also now getting engines that are capable of dynamically changing resolution or changing resolution for certain effects, such as explosions. All these different tricks could be combined into something absolutely fearsome.

*Actually it looks like on the chart the boost is about 30% compared to native rendering, but you get the point.
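The "render at sub-4K, fill in the gaps" pipeline can be sketched roughly as below. This is only a toy: real DLSS runs a trained neural network on the tensor cores to reconstruct detail, while this stand-in just does fixed bilinear upscaling with NumPy to show the shape of the idea (the resolutions are illustrative assumptions).

```python
import numpy as np

def bilinear_upscale(frame, out_h, out_w):
    """Upscale an (H, W) frame with bilinear interpolation.

    Stand-in for the learned upscaler: DLSS would instead run a
    trained network on the tensor cores to hallucinate detail.
    """
    in_h, in_w = frame.shape
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]   # vertical blend weights, shape (out_h, 1)
    wx = (xs - x0)[None, :]   # horizontal blend weights, shape (1, out_w)
    top = frame[y0][:, x0] * (1 - wx) + frame[y0][:, x1] * wx
    bot = frame[y1][:, x0] * (1 - wx) + frame[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Render internally at a lower resolution, present at 4K.
low_res = np.random.rand(1440, 2560)          # pretend rendered frame
out = bilinear_upscale(low_res, 2160, 3840)   # "filled in" to 4K
print(out.shape)  # (2160, 3840)
```

The speedup comes entirely from shading ~44% fewer pixels per frame; the quality question is how well the upscaler hides that.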
 
So that looks like around a ~45% average FPS increase, not too bad. The 2080 Ti over the 1080 Ti should be about the same, which would put the 2080 Ti around 10% faster than the Titan V if it scales linearly.

If this is truly the case, the price makes sense now.

Not really! The 1080 Ti launch price was about the same as the 980 Ti launch price. We aren't really getting any improvement in performance per dollar.
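The performance-per-dollar point is easy to check with back-of-envelope numbers. A quick sketch, treating the launch prices and the ~45% uplift claimed above as assumptions rather than measurements:

```python
# Rough perf-per-dollar check using launch prices and the ~45%
# generational uplift claimed above (assumptions, not measurements).
p_1080ti = 699            # 1080 Ti launch MSRP ($)
p_2080ti = 1199           # 2080 Ti FE launch price ($)
perf_gain = 1.45          # claimed 2080 Ti vs 1080 Ti

# Performance went up 1.45x, but price went up ~1.72x.
perf_per_dollar_ratio = perf_gain / (p_2080ti / p_1080ti)
print(f"{perf_per_dollar_ratio:.2f}")  # ~0.85: perf/$ actually drops
```

By these numbers performance per dollar falls about 15% generation over generation, which is the opposite of what a node-and-architecture refresh usually delivers.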
 
Initial batches RTXs won't ship until September 20th so it should be no problem canceling a preorder before reviews come out if the 14th is end of embargo date and the reviews are ready to go. My two FE 2080 Tis won't ship until 10/08/2018 and I pre-ordered about an hour after they went on sale Monday.

You can always play the waiting game. I'm looking at the 2080 Tis to smooth out 4k, the 1080 Tis a pretty solid there but quite a ways from a reliable 60 FPS at 4k maxed.

Nope, I have never played the waiting game. Ever! But I am smart with my money. I would have busted $700 in an instant for a Ti like last time. Nvidia is money-grabbing at this point. 7nm is too close, and I wouldn't drop $1200 on a card that might have a short life cycle; plus I am not on 4K, I like widescreen displays better. G-Sync HDR is going to cost an arm and a leg. So no way am I dropping $3-3.5k on both when in a year they will be slashed in half. Sometimes it's okay to play the waiting game. I like being on the bleeding edge, but this time there is too much bleeding, and this will have half the lifespan of Pascal, I think. Brute force will come with 7nm, and this year is already almost over. Like I have been saying, early adopters can fund my discount for 7nm cards. More power to them, lol!
 
I'll believe this fluff when I see the [H]ard review.

also Kyle, your wife is going to kill me for this, but you could use this https://www.amazon.com/Acer-Predato...=UTF8&qid=1534984261&sr=1-1&keywords=4k+144hz

maybe Patreon raffles (pay per entry), or special Patreon content after you're done with the hardware and don't want to keep it :D


I think next year you could buy that at a 50% discount, lol! $2k for a 27-inch monitor is nuts. I think Nvidia has made high-end gaming a luxury now. Did you see the review showing the IPS glow on that monitor? $2k for that much IPS glow, lol.
 
For those that are wondering more about DLSS:

Watch for about 2-3 minutes

What we can infer from this segment is that DLSS is not so much an anti-aliasing method as it is an image upscaler:
render at sub-4K, then get the tensor cores to fill in the gaps.

You can decide for yourself whether the hit to texture quality is worth the speed boost. I'd say a 40% boost is probably worth it (according to those charts). But the fact is DLSS is not available as a general setting; it requires per-game support.

And you might cringe and say, well that's not native 4K, and you'd be right. But we're also now getting engines that are capable of dynamically changing resolution or changing resolution for certain effects, such as explosions. All these different tricks could be combined into something absolutely fearsome.

*Actually it looks like on the chart the boost is about 30% compared to native rendering, but you get the point.



Yeah, I think I remember him saying it's doing something like dynamically managing what to apply anti-aliasing to, plus dynamic resolution as well. So I think once reviews are out we will find out what's going on. He also said in that video that the tensor cores were trained with millions of images, or something of that sort. Hard to grasp from a short clip; an in-depth write-up on it will help.
 
I heard a rumor that they were using a box chiller to OC the shit out of a Titan V...

:p
 
For all intents and purposes I consider the 2080 Ti a Titan. Its die is 50% larger than a Titan Xp's and is less than 10% from the maximum die size.

Exactly. Jensen is just pissing on the consumers with this pricing/naming scheme. They could have EASILY gone with:

$999/$1199 = Titan RTX

$699/$799 = 2080 Ti

$499/$599 = 2080
 
What do you guys think about this analysis?



Around 12:xx it gets interesting. Looks like game devs are going to have a lot of work to do, juggling two GPU families to optimize for.

Barring such optimizations, expect something closer to a 20 percent performance gain over last gen (2080 vs. 1080 non-Ti, etc.) in current titles, give or take, per the analysis.
 
Thanks, Kyle, for the continual RTX coverage. It's really hard to fight the temptation to pre-order one of these right now. Can't wait for the real benches to start coming in; one of the best things about [H]ard is you don't just pick the hyped feature as your only point of review.
 
I think if the price was actually AT the MSRP of $700 flat I'd go ahead and pre-order a 2080, like an idiot.

The cheapest price I've seen on the 2080 is $780, which is just more than I'm willing to bite on. I paid $500 for the 1080 nine months ago. I don't care about the depreciation, that's life. But for the replacement of the 1080 to be $280 MORE? Uh... that's more than my impulse-buy meter can handle.

But with all of the RTX cards being over MSRP? Yeah, I realize we've spent years with Nvidia cards always being over MSRP, but this time the price is so high I'm willing to see if it settles after launch day. In the meantime I'll look for a buyer for my 1080.

To be honest though, I'm thinking this tech is worth $600 tops. And that's already crazy. We've gotten too used to crazy. It was only a few years ago that I considered $350 to be stupid expensive for a video card.
 
It's hilarious how much free press coverage NVIDIA is getting by creating controversy. Well played, NVIDIA!

Any developer with any sense will not optimize their game for this new tech unless they can benefit financially from it. I bet the only ones doing it right now are getting GREEN money from NVIDIA. At these stupidly high prices these GPUs will be a niche for the foreseeable future. I wouldn't buy one. The most I ever paid for a GPU was $850, for the EVGA GeForce GTX 1080 Ti FTW3 Liquid Cooled, and that was crazy.
 
Exactly. Jensen is just pissing on the consumers with this pricing/naming scheme. They could have EASILY gone with:

$999/$1199 = Titan RTX

$699/$799 = 2080 Ti

$499/$599 = 2080

Stop making sense; they have to make the 50% performance boost not a lie. Or a 100-160% boost if you throw in DLSS!...
(100-160% boost in performance based on the extrapolated difference between 30-something frames and 78 frames in the Infiltrator claim.)

/s
 