Nvidia have made it no secret that they plan to release something “super” in the RTX line of graphics cards, with a reveal expected at E3 2019. According to recent reports, Nvidia have not one but two graphics cards planned for release.
The main contender I posited was, of course, the RTX 2070Ti – or perhaps it will be called the RTX 2070 Super. That still leaves the question of the other graphics card announcement, but rumoured specifications for the RTX 2070Ti have now surfaced online.
So, what do the specs show? According to the leaks, the GeForce RTX 2070Ti is said to feature 2560 CUDA cores, a modest step up from the 2304 CUDA cores of the vanilla 2070. As a refresher, that puts it just behind the 2080, which has 2944 CUDA cores. The rumours also point to 8GB of GDDR6 VRAM running at 14 Gbps, with a boost clock of 1770 MHz.
If – and I do stress these are rumoured specifications – these specs are accurate, the RTX 2070Ti would offer a 10-15% performance boost over the vanilla RTX 2070. Based on these numbers and Turing's layout of 64 CUDA cores per Streaming Multiprocessor (SM), we can reasonably expect 40 SMs, 40 ray tracing cores, 320 Tensor cores, and 160 Texture Mapping Units (TMUs).
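Those derived counts fall straight out of Turing's fixed per-SM ratios. A quick, purely illustrative Python sketch of the arithmetic (the per-SM ratios are taken from Nvidia's published TU10x layout, not from the leak itself):

```python
# Turing (TU10x) GPUs ship fixed ratios of execution units per SM:
# 64 CUDA cores, 1 RT core, 8 Tensor cores, and 4 TMUs.
CUDA_PER_SM = 64
RT_PER_SM = 1
TENSOR_PER_SM = 8
TMU_PER_SM = 4

def turing_unit_counts(cuda_cores: int) -> dict:
    """Derive the other unit counts from a rumoured CUDA core total."""
    sm = cuda_cores // CUDA_PER_SM
    return {
        "SMs": sm,
        "RT cores": sm * RT_PER_SM,
        "Tensor cores": sm * TENSOR_PER_SM,
        "TMUs": sm * TMU_PER_SM,
    }

# Rumoured 2070Ti: 2560 CUDA cores -> 40 SMs, 40 RT cores,
# 320 Tensor cores, 160 TMUs.
print(turing_unit_counts(2560))
```

Running the same maths on the vanilla 2070's 2304 CUDA cores gives 36 SMs, which matches its shipping configuration, so the leaked figure at least fits the architecture cleanly.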
Now, while this does put the RTX 2070Ti in a similar ballpark to the GTX 1070Ti (in terms of the gap between them and their xx80 counterparts), there are some things that don't make complete sense. We have heard from several sources that we can expect a 16 Gbps memory upgrade for several of the Turing cards, such as the 2080 and 2070. If that is true, it would make no sense to release a Ti variant with less memory bandwidth than its younger sibling.
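To put numbers on that bandwidth argument: assuming the 2070Ti keeps the 256-bit memory bus used by the RTX 2070 and 2080 (an assumption on my part, not part of the leak), the gap between 14 Gbps and 16 Gbps modules works out like this:

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times
    bus width in bits, divided by 8 to convert bits to bytes."""
    return data_rate_gbps * bus_width_bits / 8

BUS_WIDTH = 256  # bits, as on the RTX 2070 and RTX 2080

print(memory_bandwidth_gbs(14, BUS_WIDTH))  # 448.0 GB/s (rumoured 2070Ti spec)
print(memory_bandwidth_gbs(16, BUS_WIDTH))  # 512.0 GB/s (rumoured refresh speed)
```

That is a 64 GB/s shortfall – a Ti card would be leaving roughly 14% of the refreshed cards' memory bandwidth on the table, which is why the 14 Gbps figure looks suspect.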
So what does this mean for the leaked specs? Well, the CUDA core count is probably correct, but the rumoured memory speed is most likely false or inaccurate. Again, Nvidia are likely to release variants of the top-end Turing GPUs with improved memory bandwidth, so it would make zero sense to lumber the Ti with 14 Gbps memory. I am skeptical about these leaked specs, but we should all wait and see what Nvidia reveal one way or the other.