When Nvidia released their Maxwell architecture, AMD responded by aggressively slashing the prices of their Radeon 200 series graphics cards. It’s an effective strategy, and combined with a few of Nvidia’s own PR blunders (namely, the GeForce GTX 970) AMD are gaining traction with gamers.
For this review we’re going to be putting the Sapphire Tri-X Radeon R9 290X 4GB through its paces, the R9 290X being AMD’s current single-chip flagship card. As many will know, a problem with the original reference design of the R9 290X was the noise and heat of the card. AMD released driver updates (adjusting fan curves and such) to help fix the issues, but many review websites were eagerly awaiting third-party cooler designs. Sapphire’s spin on the Radeon features a custom cooler, which does a pretty great job of eliminating these early issues. Despite the fans generally spinning at a low RPM, heat doesn’t appear to be much of an issue, and the GPU’s operation remains virtually silent. If you manually set the fan speed higher (particularly past 75 percent) you’ll start hearing the fans intrude on gaming (we’re benching with an open rig too).
To give you an indication – with 100 percent GPU usage during our BF4 tests, we left the GPU’s fan profile on “auto” and kept an eye on it. The temps hit the high 60s Celsius (at worst) during testing, and the GPU fan kept to the ‘auto’ speed of only 25 percent.
The basic R9 290X spec features 2816 Stream Processors (split across 44 Compute Units, or CUs), 176 Texture Units and 64 ROPs. The Tri-X kicks things up a notch over the reference R9 290X, with a factory overclock providing a modest boost to the clock speeds, raising the core to 1040MHz from 1000MHz and adding 200MHz to the effective memory clock, bringing it to 5.2GHz; as a result the card boasts an astounding 332.8GB/s of memory bandwidth.
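If you want to sanity-check that bandwidth figure, it falls straight out of the bus width and effective memory clock quoted above – here’s a minimal sketch of the arithmetic in Python:

```python
# Sanity check of the quoted memory bandwidth figure.
# GDDR5 effective clock (data rate) and bus width are taken from the spec table below.
effective_clock_gts = 5.2   # giga-transfers per second (the Tri-X's 5.2GHz effective memory clock)
bus_width_bits = 512        # Hawaii's 512-bit memory interface

# Bandwidth (GB/s) = transfers per second * bytes moved per transfer
bandwidth_gbs = effective_clock_gts * (bus_width_bits / 8)
print(bandwidth_gbs)        # 332.8 GB/s, matching the quoted figure
```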
The R9 290X features eight Asynchronous Compute Engines (ACEs), each of which handles eight compute queues. Naturally this is extremely important for compute workloads, and the card scores rather well in compute-oriented gaming benchmarks, or if you’re planning on bitcoin mining or doing a lot of video editing (or other similar work).
In terms of power connections, it has the standard 6-pin and 8-pin PCIe requirements, and inside the retail box we were given a couple of 4-pin Molex to PCIe converters, just in case you’re shy a connection.
For display outputs the Tri-X has two DVI outputs, a DisplayPort and finally a standard HDMI interface. All in all, business as usual for this type of card, and it should give you ample connections for virtually any display configuration.
| R9 290X | Sapphire Tri-X | GTX 780 Ti | GTX 980 | GTX 970 |
Stream Processors | 2816 | 2816 | 2880 | 2048 | 1664 |
Texture Units | 176 | 176 | 240 | 128 | 104 |
ROPs | 64 | 64 | 48 | 64 | 56 |
Core Clock | 727MHz | 727MHz | 876MHz | 1126MHz | 1050MHz |
Boost Clock | 1000MHz | 1040MHz | 928MHz | 1216MHz | 1178MHz |
Memory Clock | 5GHz GDDR5 | 5.2GHz GDDR5 | 7GHz GDDR5 | 7GHz GDDR5 | 7GHz GDDR5 |
Memory Bus Width | 512-bit | 512-bit | 384-bit | 256-bit | 256-bit |
VRAM | 4GB | 4GB | 3GB | 4GB | 4GB |
FP64 | 1/8 FP32 | 1/8 FP32 | 1/24 FP32 | 1/32 FP32 | 1/32 FP32 |
TDP | 290W | 290W | 250W | 165W | 145W |
Naturally the R9 290X supports AMD’s TrueAudio, FreeSync and the Mantle API. TrueAudio is a nice piece of technology, using dedicated audio processors on the GPU to offload audio processing from the CPU, and is supported by a handful of games, including Star Citizen, Thief and Murdered: Soul Suspect. Mantle is a low-level API created by AMD, and works on a similar principle to the much-touted DirectX 12 (which the GPU also supports). More games support Mantle than TrueAudio; Thief, Star Citizen, Battlefield 4 and Hardline, and Dragon Age: Inquisition are among the bigger titles to enjoy Mantle support.
FreeSync is an alternative to Nvidia’s G-Sync, and while it’s still early days, support looks solid, with a nice array of monitors supporting the technology being released from various manufacturers. The premise of FreeSync (much like G-Sync) is to avoid both the latency of having vertical sync enabled and the screen tearing you get with it disabled. The advantage FreeSync has over G-Sync is that it doesn’t require special circuitry inside the monitor, thus cutting the cost of the screen rather heftily.
Test Setup
The test rig we’re using for the benchmarks is an Intel Haswell 4770K running at 4.4GHz, with 16GB of DDR3 RAM clocked at 1750MHz and paired with an SSD. The latest Nvidia and AMD drivers were downloaded and used in benchmarking. For the operating system, we’re running Windows 8.1 64-bit, and all games are their latest patched version from either Steam, Uplay or Origin. All benchmarks were run on their highest settings at either 1920×1080 or 2560×1440, and V-Sync was of course disabled for each benchmark. We’re looking at the average frame rate throughout.
1080P Benchmarks
About 95 percent of PC gamers run at the 1080P resolution, at least according to the Steam Hardware Survey. It shouldn’t come as a shock that any £250 GPU today can rip through virtually any title at this common resolution at 60FPS or above, unless you’re running with SSAA enabled.
Crysis 3, for instance, hits an average of around 70FPS for all three cards we’re benching today, should you opt to use the shader-based FXAA for your anti-aliasing. The action does occasionally dip below 60FPS when things become particularly hectic (for instance, not being sneaky and causing a lot of explosions in the Welcome to the Jungle level), but V-Sync at 60FPS is certainly more than possible.
Thief’s results might surprise you – with not just a large difference between the Mantle and DX11 results, but the DX11 result of the R9 290X coming in considerably lower than either Nvidia card. We re-ran the tests a few times, but 65FPS seemed to be about the average; it’s possible Thief’s DX11 performance hasn’t been optimized because of Mantle. If you’re running a Radeon card on Thief (or any other Mantle-supported title) you’ve no reason to select DX11. Perhaps the biggest benefit of Mantle isn’t the average or peak frame rate, but the minimum, which is 43.9FPS for Direct3D 11 compared to Mantle’s 69FPS.
1920×1080 (average) | Radeon R9 290X | GTX 780 Ti | GTX 970 |
Crysis 3 + FXAA | 69.1 FPS | 71.2 FPS | 69.8 FPS |
Metro: Last Light (SSAA) | 46.65 FPS | 52 FPS | 46.21 FPS |
Tomb Raider (FXAA) | 77.9 FPS | 58 FPS | 80.1 FPS |
Bioshock Infinite | 96.4 FPS | 111.6 FPS | 109 FPS |
Thief | 97 FPS (Mantle) / 65.0 FPS (DX11) | 73.1 FPS (DX11) | 72.1 FPS (DX11) |
1440P Benchmarks
Since 1440P screens have fallen in price, they’re considerably more common to run than 4K… not least of all because you’re able to achieve playable frame rates with a single, cheaper card. The pixel count increases by 77 percent over 1080P, which means the GPUs have to do a lot more work to render each frame of animation. Some gamers will also be tempted to use this resolution for “downsampling” on their 1080P screen, instead of, or combined with, another anti-aliasing method.
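For the curious, that percentage comes straight from the raw pixel counts – a quick check in Python:

```python
# Quick check of the pixel-count jump from 1080p to 1440p.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_1440p = 2560 * 1440   # 3,686,400 pixels per frame

increase_pct = (pixels_1440p / pixels_1080p - 1) * 100
print(f"{increase_pct:.1f}% more pixels per frame")   # ~77.8%, hence the 77 percent quoted above
```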
If you’re unwilling to compromise on either image quality or resolution and are a ’60FPS or nothing’ kind of gamer, then you’ll possibly want to invest in a CrossFire or SLI rig. While each of the GPUs being tested can certainly reach the 60FPS mark, when things get particularly busy (for example in the infamous Crysis 3 level ‘Welcome to the Jungle’) you’ll notice the frame rate naturally taking a hit. It’s certainly very playable, but not ideal.
Metro: Last Light running at 2560×1440 with SSAA achieves a virtually locked 30FPS on all of the GPUs, even in the punishing built-in benchmark. Overclocking the card (covered in further detail below) gives the extra performance required. While 30FPS isn’t ideal, it’s still the frame rate of a heck of a lot of console FPS titles, and of course you’re running at a far higher resolution and with better graphics to boot.
2560×1440 (average) | Radeon R9 290X | GTX 780 Ti | GTX 970 |
Crysis 3 + FXAA | 45.1 FPS | 45 FPS | 43.3 FPS |
Metro: Last Light (SSAA) | 29.3 FPS | 30.2 FPS | 29.1 FPS |
Tomb Raider (FXAA) | 58.2 FPS | 61.8 FPS | 55.1 FPS |
Bioshock Infinite | 67.44 FPS | 74.1 FPS | 74.2 FPS |
Thief | 76 FPS (Mantle) | 52.3 FPS (DX11) | 51.7 FPS (DX11) |
Overclocking Sapphire’s R9 290X
Overclocking any piece of hardware is luck of the draw – the silicon lottery, if you will. The Tri-X already features a small factory overclock, so the question is – how much more can we get out of the GPU? Because of time constraints we haven’t had much time to play around with overclocking, but we managed to push the core to 1160MHz without any issue, and raised the memory clock to 1340MHz. We simply bumped up the core voltage and raised the power limit by 10 percent using MSI Afterburner.
The main point of the exercise was to see whether the results scaled with the clock speeds – and we’re pleased to report that they broadly do. Thief’s 1080P Mantle result went up to a rather nice 113FPS, from 97FPS at default clocks. Metro climbed to 32FPS at 1440P with SSAA enabled, compared to 29.3FPS at standard clocks. Finally, Tomb Raider running at 1440P went to 61.5FPS, up from 58.2FPS.
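As a rough back-of-the-envelope check on how those gains stack up against the clock increase (using only the figures quoted above), something like this:

```python
# Back-of-the-envelope check: how the overclocked results compare with
# the core clock increase, using only the figures quoted in this review.
core_gain_pct = (1160 / 1040 - 1) * 100   # ~11.5% higher core clock than the Tri-X default

results = {
    "Thief 1080P (Mantle)":           (97.0, 113.0),
    "Metro: Last Light 1440P (SSAA)": (29.3, 32.0),
    "Tomb Raider 1440P":              (58.2, 61.5),
}

print(f"Core clock increase: +{core_gain_pct:.1f}%")
for game, (stock_fps, overclocked_fps) in results.items():
    gain_pct = (overclocked_fps / stock_fps - 1) * 100
    print(f"{game}: +{gain_pct:.1f}%")
# Thief: +16.5%, Metro: +9.2%, Tomb Raider: +5.7% -- broadly tracking the core bump.
```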
The real bonus was that the cooler managed this without us needing to mess about with fan speeds. We simply left it all on auto, and it stayed within the same 25 to 30 percent range it had at the stock speeds.
AMD vs Nvidia Technology
The GPU market has changed significantly in the past 12 months, and not just because of the imminent release of DirectX 12. You’ll likely know Nvidia and AMD are both pushing their own technologies. Nvidia have G-Sync and of course hardware PhysX. AMD meanwhile are pushing their Mantle API and TrueAudio technology, plus FreeSync, which is their counter to Nvidia’s G-Sync.
If you’re interested in virtual reality, AMD are also releasing their LiquidVR SDK, which claims to reduce the latency of VR hardware… then there’s Nvidia’s GameWorks of course, allowing certain games to have Nvidia-only graphics options.
It’s tricky to ‘predict and then buy’ for the future, and despite Mantle’s very impressive performance numbers with BF4, Hardline, Dragon Age: Inquisition and Thief, it’s still a work in progress. There’s also the question of how much support it’ll have once DX12 is fully launched. There’s currently a reasonable smattering of Mantle titles, so it’s still a very nice bonus if nothing else. TrueAudio’s future is somewhat less certain. Thief (for example) uses it and it does reduce CPU usage, but not enough games currently have the feature to sway a decision over a graphics card.
Meanwhile Nvidia’s G-Sync has been released and does a great job of both eliminating screen tearing and reducing the latency associated with V-Sync. Those who have used the technology are reluctant to go back to their old screens, but currently the selection of screens it’s available for is limited. In addition to this, it adds to the price of a new monitor. With that said, FreeSync is gaining a lot of traction in the market, and does a pretty damn similar job – and cheaper too. It’s hard to argue with “cheaper” – and it’s not as though the monitor range is poor either. As much as G-Sync is awesome, FreeSync is possibly better for the average customer, simply due to the monitors costing less.
I actually really do like Nvidia’s hardware PhysX technology; in certain games it adds a lot to the atmosphere, such as the wisps of smoke in Assassin’s Creed 4: Black Flag, or the debris and dust effects in Metro: Last Light. It’s a nice extra, particularly for someone like myself who regularly makes graphics comparisons. Most gamers, though, are willing to do without it.
AMD’s TressFX technology (at least currently) works on Nvidia’s cards along with AMD’s own. Unfortunately TressFX hasn’t been widely adopted yet, likely because it’s so GPU-intensive. Even TressFX 2 takes a huge toll on the consoles in Tomb Raider: Definitive Edition.
It’s worth taking a moment to discuss downsampling – the act of running your GPU at a higher resolution than your monitor can handle, and then having the GPU scale the image down to your monitor’s native resolution. It provides greater visual detail, and it was certainly a feather in Nvidia’s cap – as it was extremely tricky to do with AMD cards. Nvidia adopted the feature and called it ‘Dynamic Super Resolution’. AMD have now added the much-requested feature, and have labeled it ‘Virtual Super Resolution’.
While AMD have their Raptr application, and Nvidia their GeForce Experience, tweaking your settings using these applications is a roll of the dice. Gamers who have a reasonable understanding of the basic settings are better served handling things themselves. If you do want to record your gameplay, GeForce Experience does have ShadowPlay, which I personally find quite useful. While AMD do have alternatives built into Raptr, the technology isn’t quite so far along as ShadowPlay, so I have to give the subtle nod to Nvidia on this point.
All in all, there’s certainly an argument over whose cards offer the best exclusive technology… but if the DX12 rumors are true (and they’re unconfirmed at the time of my writing this) you’ll be able to pair an Nvidia and an AMD card in the same machine and gain the benefits of both GPUs.
AMD Radeon R9 290X Sapphire Tri-X Conclusions and Verdict
It’s hard to argue with the raw performance value of AMD’s R9 290X; for gamers who’re looking at playing at 1440P or below, it’ll simply tear through titles without too much trouble. While it’s slower than Nvidia’s GTX 980, it’s priced to compete against the GTX 970, which it trades blows with in a variety of different benchmarks.
As I’m writing this, Nvidia’s offering is a little more expensive than AMD’s, but there’s certainly not a huge amount of pennies in it. The GTX 970 also received quite a lot of negative attention from gamers after the specification scandal, and that’s certainly buying AMD a lot of goodwill at the moment. I’ve little doubt some gamers will avoid Nvidia for a bit, at least until Nvidia win back customer confidence.
Sapphire’s GPU is well built, quiet, and I’ve not run into any thermal problems at all. If you’re considering running two of these cards in CrossFire, remember the higher TDP requirements and ensure your PSU is up to the task. While most gamers’ rigs are likely outfitted with a good enough PSU to deal with one card (a 700 – 750W PSU is ideal, but you might be okay with a lower-spec one assuming it has excellent rails), two cards will require something a little beefier.
So, the big question is – should you buy the R9 290X? Well, if your budget is around the £250 – 300 range, then it’s a very interesting purchase. While we do know AMD are waiting in the wings with the R9 300 range, it’ll likely cost considerably more than the £250 a 290X will… and for a lot of gamers (particularly those who’re only running at 1440P or below) this GPU is more than you’ll need for some time.
While Mantle’s future is far from certain, there are titles which benefit from it now – including the upcoming Battlefield Hardline. Until DX12 is released, it’s a good solid option for a low-level API. The frame rate increases with Mantle over DX11 are obvious in any test (should your CPU be up to the task… a quad-core or better is highly recommended).
It really comes down to the GTX 970 or the R9 290X in this particular price range, but currently (as of time of writing) AMD’s offering is slightly cheaper. Performance is about even across the board, so really it’s hard not to recommend the cheaper option, particularly if you’re interested in FreeSync.