
Nvidia's GeForce RTX 5070 at $549 — How does it stack up to the ...

Same price, different architectures, and some big tech changes. (Updated)

Nvidia made a big splash with the official announcement of its upcoming GeForce RTX 50-series Blackwell GPUs during the CES 2025 keynote. And while the halo RTX 5090 certainly looks like an absolute monster, for a lot of people, it's the mainstream (-ish) RTX 5070 at $549 that will be the star of the show. The RTX 4070 has been one of the best graphics cards since it launched, and now its replacement is on the way.

Nvidia claims the 5070 will offer "RTX 4090" levels of performance, at about one third the price and a bit over half the power. But how do the two really stack up, and how does the 5070 compare to the existing RTX 4070? Let's find out. Note that we've filled in a few bits and pieces with best-guess estimates for now, but most of the specifications are correct.

| Graphics Card | RTX 5070 | RTX 4090 | RTX 4070 |
| --- | --- | --- | --- |
| Architecture | GB205 | AD102 | AD104 |
| Process Node | TSMC 4NP | TSMC 4N | TSMC 4N |
| Transistors (Billion) | ? | 76.3 | 35.8 |
| Die Size (mm^2) | ? | 608.4 | 294.5 |
| SMs | 48 | 128 | 46 |
| GPU Shaders | 6144 | 16384 | 5888 |
| Tensor Cores | 192 | 512 | 184 |
| RT Cores | 48 | 128 | 46 |
| Boost Clock (MHz) | 2512 | 2520 | 2475 |
| VRAM Speed (Gbps) | 28 | 21 | 21 |
| VRAM (GB) | 12 | 24 | 12 |
| VRAM Bus Width (bits) | 192 | 384 | 192 |
| L2 Cache (MB) | 48? | 72 | 36 |
| Render Output Units | 64? | 176 | 64 |
| Texture Mapping Units | 192 | 512 | 184 |
| TFLOPS FP32 (Boost) | 30.9 | 82.6 | 29.1 |
| TFLOPS FP16 (INT8 TOPS) | 494 (988) | 661 (1321) | 233 (466) |
| Bandwidth (GB/s) | 672 | 1008 | 504 |
| TBP (watts) | 250 | 450 | 200 |
| Launch Date | Feb 2025? | Oct 2022 | Apr 2023 |
| Launch Price | $549 | $1,599 | $599 |

First, let's be perfectly clear: The idea that the RTX 5070 will match the RTX 4090 in all workloads looks like a case of very rose-tinted glasses. It's obvious that Nvidia is going big on AI with Blackwell, and it's counting on DLSS 4 and other neural rendering techniques to make up the difference. But raw specs still matter in a lot of existing games — barring a driver-side solution that enables higher performance without requiring patches and updates.

The RTX 5070 will have 48 SMs compared to the 46 SMs on the 4070. That's not a very big change at all, and it's a far cry from the 128 SMs in the 4090. The overall FP32 graphics compute works out to 31 TFLOPS for the 5070, 29 TFLOPS on the 4070, and 83 TFLOPS for the 4090. It's extremely hard to believe that, in general, the 5070 will come anywhere near the 4090 in performance without leveraging DLSS 4 and related technologies.
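
For reference, those theoretical figures come from the standard peak-throughput math: shader count, times two FP32 operations per clock (an FMA counts as two), times boost clock. A quick sketch using the specs from the table above:

```python
# Peak FP32 throughput = shaders * 2 FLOPS per clock (FMA) * boost clock.
# Theoretical peak only -- real game performance depends on much more than TFLOPS.
gpus = {
    "RTX 5070": (6144, 2512),    # (shader count, boost clock in MHz)
    "RTX 4090": (16384, 2520),
    "RTX 4070": (5888, 2475),
}

for name, (shaders, boost_mhz) in gpus.items():
    tflops = shaders * 2 * boost_mhz * 1e6 / 1e12
    print(f"{name}: {tflops:.1f} TFLOPS FP32")
# RTX 5070: 30.9 TFLOPS FP32
# RTX 4090: 82.6 TFLOPS FP32
# RTX 4070: 29.1 TFLOPS FP32
```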

There's also the VRAM to consider. The 4090 has 24GB, compared to half that amount on the 4070 and 5070. There aren't too many games where 12GB is insufficient, but Indiana Jones and the Great Circle, with full RT and without upscaling, definitely exceeds 12GB at 4K. More games are likely coming that could push beyond 12GB of VRAM use at higher resolutions and settings.

This is where "RTX Neural Materials" could come into play. It appears to be a full in-game implementation of Neural Texture Compression, something Nvidia first discussed back in 2023. Will it work with any game? According to Nvidia CEO Jensen Huang in a Q&A session, the answer is no — it will need work by content creators to enable the feature in future games (or game patches). Without NTC or RTX Neural Materials, the 12GB of VRAM will definitely keep the 5070 from matching a 4090.

There's also bandwidth to consider. RTX 4090 has 21 Gbps GDDR6X on a 384-bit interface, compared to the 5070's 28 Gbps GDDR7 on a 192-bit interface. So that's 1008 GB/s of bandwidth on the 4090 versus 672 GB/s on the 5070. Again, without NTC or neural materials, it's not going to keep up at higher resolutions.
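
Those bandwidth figures are straightforward arithmetic: per-pin speed in Gbps, times the bus width in bits, divided by eight bits per byte. A quick check against the table's numbers:

```python
# Memory bandwidth (GB/s) = per-pin speed (Gbps) * bus width (bits) / 8 bits per byte.
def bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    return pin_speed_gbps * bus_width_bits / 8

print(bandwidth_gb_s(28, 192))   # RTX 5070, GDDR7:  672.0 GB/s
print(bandwidth_gb_s(21, 384))   # RTX 4090, GDDR6X: 1008.0 GB/s
print(bandwidth_gb_s(21, 192))   # RTX 4070, GDDR6X: 504.0 GB/s
```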

AI workloads like LLMs also like having lots of VRAM capacity. Quantization only gets you so far, and neural compression of LLMs isn't a thing (as far as we're aware). The RTX 4090 with 24GB of VRAM can simply load larger LLMs than the 5070, which will only match the 4070 in terms of AI model sizes.
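
To put that capacity gap in concrete terms, here's a rough, weights-only estimate of VRAM use for a few common LLM sizes. The parameter counts are generic examples of ours, and the math deliberately ignores the KV cache and runtime overhead, so treat it as a floor:

```python
# Rough weights-only VRAM footprint: parameters * bits per weight / 8.
# Ignores KV cache, activations, and framework overhead, so real usage is higher.
def model_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * bits_per_weight / 8  # billions of params -> GB directly

for params in (7, 13, 34):
    fp16 = model_vram_gb(params, 16)
    int4 = model_vram_gb(params, 4)
    print(f"{params}B params: ~{fp16:.0f} GB at FP16, ~{int4:.1f} GB at 4-bit")
# 7B params: ~14 GB at FP16, ~3.5 GB at 4-bit
# 13B params: ~26 GB at FP16, ~6.5 GB at 4-bit
# 34B params: ~68 GB at FP16, ~17.0 GB at 4-bit
```

In other words, a 4-bit 34B-class model can squeeze into the 4090's 24GB but not into 12GB, which is exactly the sort of gap the extra VRAM buys you.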

It's a different story when we look at AI computational performance. We know the RTX 50-series will have FP4 number format support, but just as important, it appears to have twice the compute per tensor core of the RTX 40-series. That's not enough for the 5070 to surpass the 4090, but it's 'only' about 25% slower in theoretical performance. And if a workload can leverage FP4 on the 5070 where the 4090 needs to use FP8, it might actually run better on the 5070. Even so, the raw INT8 TOPS figure still favors the 4090.
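
Working from the FP16 figures in the table above, that theoretical deficit is easy to quantify (assuming the quoted numbers are directly comparable sparse rates):

```python
# Theoretical tensor throughput gap, straight from the table's FP16 figures.
rtx_5070_fp16_tflops = 494
rtx_4090_fp16_tflops = 661

deficit = 1 - rtx_5070_fp16_tflops / rtx_4090_fp16_tflops
print(f"RTX 5070 is ~{deficit:.0%} slower than the RTX 4090 at the same precision")
# RTX 5070 is ~25% slower than the RTX 4090 at the same precision
# FP4 on the 5070 would roughly double its rate in cases where the 4090 is limited to FP8.
```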

The real kicker is, of course, the pricing. There are a lot of gamers who simply can't afford a $1,599 graphics card — never mind the scarcity-induced $2,000+ prices we're currently seeing on the 4090. A $549 GPU, even if it's slower in most games, is another matter entirely. Nvidia's xx70-class GPUs have traditionally been the sweet spot for mainstream gamers, and the 5070 looks like it will continue that pattern. Even if it doesn't beat the 4090, if it can consistently deliver performance close to the level of the RTX 4080, it should end up being extremely successful.

Video: DLSS 4 | New Multi Frame Gen & Everything Enhanced (YouTube)

But really, it all comes down to AI features and DLSS 4. We haven't tried multi-frame generation yet, and after our experiences with DLSS 3 frame generation, we're skeptical at best. It will generate up to three additional frames from a single rendered frame, plus motion vectors and other data. DLSS 4 will generate those frames more quickly than DLSS 3 framegen, and according to Jensen, again from our Q&A, DLSS 4 will "predict the future" — meaning the net result should be no worse latency than DLSS 3 framegen, with additional frames and a smoother appearance.
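
As a simple illustration of the smoothness-versus-latency tradeoff, here's some back-of-the-napkin math. The 30 fps base rate is a hypothetical example of ours, not an Nvidia figure:

```python
# Multi frame generation multiplies the presented frame rate, but new player input
# is still only sampled on the traditionally rendered frames.
rendered_fps = 30            # hypothetical base render rate (our assumption)
generated_per_rendered = 3   # DLSS 4 MFG: up to three generated frames per rendered frame

presented_fps = rendered_fps * (1 + generated_per_rendered)
input_interval_ms = 1000 / rendered_fps

print(f"Presented: {presented_fps} fps; input sampled roughly every {input_interval_ms:.0f} ms")
# Presented: 120 fps; input sampled roughly every 33 ms
```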

Far more promising than multi-frame generation, in our view, are the enhancements and upgrades to DLSS upscaling and ray reconstruction. Until now, DLSS has used a CNN (Convolutional Neural Network) for the AI training and inference. Now there's a new transformer-based model, which can apparently be used in any existing DLSS 2/3 game.

Transformer models have revolutionized many areas of AI development, and the sample sequences in the above video showing CNN vs transformer DLSS look extremely promising. Nvidia has been claiming "better than native" rendering from DLSS for a while now, but the DLSS transformer model may finally deliver on those claims. If it does, that could be the killer feature that makes the 50-series worth the price of admission. Except, the transformer model also works on existing GPUs, so maybe not.

As we've noted in the past, while the RTX GPUs launched with ray tracing as their headline technology, over time it's really been the AI features that have come to the fore as the most important aspect of the RTX series. With the RTX 50-series, Nvidia yet again doubles down on AI, and the supporting DLSS software continues to outpace the RT aspect. Whether or not multi-frame generation proves to be a killer feature, if you don't already have a 40-series GPU, the 50-series, including the RTX 5070, could entice you to upgrade.
