We know that 1080p is 2.07 megapixels, 1440p is 3.7 MP, and 4K is 8.3 MP.

So if a game has a fixed overhead y of 3 GB and a per-megapixel cost x of 0.1 GB, VRAM usage (x*megapixels + y) would be:

3.207 GB at 1080p (2.07*0.1+3), 3.37 GB at 1440p (3.7*0.1+3), and 3.83 GB at 4K (8.3*0.1+3)
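The linear model above can be sketched in a few lines; the x = 0.1 GB/MP and y = 3 GB figures are just the hypothetical values from the example:

```python
# Linear VRAM model: usage = x * megapixels + y
# x = per-megapixel cost (GB/MP), y = fixed overhead (GB).
def vram_usage(megapixels, x, y):
    return x * megapixels + y

# Hypothetical example values from the text: x = 0.1 GB/MP, y = 3 GB
for name, mp in [("1080p", 2.07), ("1440p", 3.7), ("4K", 8.3)]:
    print(f"{name}: {vram_usage(mp, x=0.1, y=3.0):.3f} GB")
# prints 3.207, 3.370, and 3.830 GB respectively
```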

If a game has a high "y value" in relation to the x value, it seems inefficient imho.

Let's go way back to Crysis 1 in 2007, with much lower resolutions:

http://hardforum.com/showthread.php?t=1456645

defaultluser reports .31 GB @ .48 MP, .36 GB @ .79 MP, .45 GB @ 1.31 MP,

and .575 GB at 1.9 megapixels

Using the highest and lowest res:

1.9x + y = .575 GB and 0.48x + y = .31 GB. We can substitute for y and solve for x:

i.e., y = .31 - .48x, which means...

1.9x + (.31 - .48x) = .575 or...

1.42x = .265 GB so

**x = .187 GB** per megapixel

We can then solve for y in the equation above.

y = .575 - 1.9(.187) so

**y = .22 GB**

Here, the x and y factors were VERY close.

This can be checked using the middle resolution:

.79(.187) + .22 = .368 GB vs the .36 GB that was recorded.
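The two-point solve above generalizes to any pair of data points; a minimal sketch using the Crysis figures from the thread:

```python
# Solve x (GB per megapixel) and y (fixed overhead, GB) from two
# resolution/VRAM data points, where mp * x + y = vram at each point.
def solve_xy(mp1, vram1, mp2, vram2):
    x = (vram1 - vram2) / (mp1 - mp2)  # subtracting the equations eliminates y
    y = vram1 - mp1 * x                # back-substitute to recover y
    return x, y

# Crysis data points from the thread: (1.9 MP, .575 GB) and (.48 MP, .31 GB)
x, y = solve_xy(1.9, 0.575, 0.48, 0.31)
print(f"x = {x:.3f} GB/MP, y = {y:.3f} GB")
# Check against the middle data point (.79 MP, .36 GB recorded):
print(f"predicted at .79 MP: {0.79 * x + y:.3f} GB")
```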

Moving to 2015, when the Fury X looked like it had a very bleak future with only 4 GB of VRAM.

Starting with Tomb Raider in 2015, which uses 1.5 GB of VRAM at 1080p and 3.1 GB at 4K.

Again we use 2.07 megapixels for 1080p and 8.3 for 4K.

Using the same formulas above...

8.3x + y = 3.1 and 2.07x + y = 1.5. Using substitution,

8.3x + 1.5 - 2.07x = 3.1, which reduces to 6.23x = 1.6 and

**x = .257**, or in other words .257 GB needed for each megapixel.

Now we can solve for y:

2.07x + y = 1.5

y = 1.5 - 2.07x; replacing x with .257,

**y = .968**, or about 1 GB of overhead.

Now let's test with 1440p: 3.7(.257) + .968 = 1.92 GB vs the 1.94 GB actual.
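The same arithmetic, worked out in a few lines (resolution figures as in the text):

```python
# Tomb Raider (2015): 8.3x + y = 3.1 (4K) and 2.07x + y = 1.5 (1080p)
x = (3.1 - 1.5) / (8.3 - 2.07)  # per-megapixel cost, GB/MP
y = 1.5 - 2.07 * x              # fixed overhead, GB
print(f"x = {x:.3f} GB/MP, y = {y:.3f} GB")
print(f"1440p prediction: {3.7 * x + y:.2f} GB (1.94 GB recorded)")
```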

But other games were far worse:

Take, for example, Middle Earth: SoM as recorded by TweakTown. Using the above formulas...

**x = .1 GB and y = 4.56 GB**

I tested at 1440p and got 4.93 GB vs. the 4.97 GB actual from TweakTown.

Metro Last Light was also tested:

**x = .115 and y = 1.06**

Testing with 1440p gave me 1.49 GB vs. the 1.46 GB that they recorded.

And also Far Cry 4.

**x = .43 GB**/megapixel (the highest so far!) with

**y being 2.17 GB**

Again testing at 1440p: 3.7(.43) + 2.17 = 3.76 GB vs the 3.77 GB actual from TweakTown.
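Plugging the three solved (x, y) pairs back in confirms the 1440p spot checks above in one pass:

```python
# (x GB/MP, y GB, recorded 1440p GB) for the three 2015 fits above
games = {
    "Middle Earth: SoM": (0.10, 4.56, 4.97),
    "Metro Last Light": (0.115, 1.06, 1.46),
    "Far Cry 4": (0.43, 2.17, 3.77),
}
for name, (x, y, recorded) in games.items():
    predicted = 3.7 * x + y  # 1440p is 3.7 megapixels
    print(f"{name}: {predicted:.2f} GB predicted vs {recorded} GB recorded")
```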

So even in 2015, you had "y overheads" ranging from 1 GB to a whopping 4.5 GB.

Which brings us to 2018 with Battlefield V:

Both TPU and Guru3D published VRAM numbers.

Starting with TPU: https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V/4.html They recorded 4.83 GB at 1080p and 6.74 GB at 4K, so substituting as before:

8.3x + 4.83 - 2.07x = 6.74

8.3x - 2.07x = 6.74 - 4.83

6.23x = 1.91 or

**x = .307**. Now solving for y...

2.07x + y = 4.83

.635 + y = 4.83, so y =

**4.195 GB**

Testing with 1440p: 3.7(.307) + 4.195 = 5.33 GB vs the 5.36 GB actual.

And then Guru3D: https://www.guru3d.com/articles_pages/battlefield_v_pc_performance_benchmarks,7.html They recorded 4.94 GB at 1080p and 6.99 GB at 4K:

8.3x + 4.94 - 2.07x = 6.99, so

6.23x = 2.05 or

**x = .329**. Again solving for y...

2.07x + y = 4.94

.681 + y = 4.94, or

**y = 4.259**

Testing with 1440p: 3.7(.329) + 4.259 = 5.48 GB vs the 5.49 GB actual.
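Both Battlefield V datasets run through the same two-point solve; a sketch using the figures from the two reviews:

```python
def solve_xy(mp1, vram1, mp2, vram2):
    # Two-point linear fit: vram = x * megapixels + y
    x = (vram1 - vram2) / (mp1 - mp2)
    return x, vram1 - mp1 * x

# (source, 1080p GB, 1440p GB, 4K GB) as reported by the two reviews
for src, v1080, v1440, v4k in [("TPU", 4.83, 5.36, 6.74),
                               ("Guru3D", 4.94, 5.49, 6.99)]:
    x, y = solve_xy(8.3, v4k, 2.07, v1080)
    predicted = 3.7 * x + y  # 1440p spot check
    print(f"{src}: x = {x:.3f}, y = {y:.3f}, "
          f"1440p {predicted:.2f} GB predicted vs {v1440} recorded")
```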

Both reviews produced very close values.

The point here is that these numbers are no worse than what 2015 was giving us, despite the roughly tenfold increase in VRAM usage from 2007 to 2015. Also, let's not forget that VRAM requested is not the same as VRAM required, with Vulkan perhaps being the exception.

** It is interesting to note that the Fury X was just behind the GTX 1070 in TPU's results and WAY behind in Guru3D's.

Perhaps the slightly higher settings and VRAM requirements of Guru3D were JUST enough to put the Fury X on its face.

So, would I buy a mid to high end card with 4GB in 2018 to play the newest games? Hell no, but I am willing to wager that 8 GB will be sufficient for FAR longer than most anticipate.