
AMD R9 390 CrossFire vs. SLI GTX 970 Benchmark, Ft. Devil 13 Dual-Core 390

Posted on January 21, 2016

Our last head-to-head GPU comparison benchmarked the performance of a single GTX 980 Ti versus two GTX 970s in SLI. Following some astute reader suggestions, we've acquired a PowerColor Devil 13 dual-core R9 390 – two GPUs on one card – to test as a CrossFire stand-in against SLI GTX 970s. Performance analysis is accompanied by power draw and thermal tests, though a proper, full review on the Devil 13 card will follow this content in short order.

For today, the focus is on this head-to-head comparison. FPS benchmarks look at performance of 2x CrossFire R9 390s vs. 2x SLI GTX 970s, including supporting data from a GTX 980 Ti, 980, and R9 390X. We'll also work toward answering the question of whether CrossFire and SLI are worth it in this particular scenario, as opposed to investing in a single, more expensive GPU.

CrossFire R9 390 vs. SLI GTX 970 Benchmarks [Video]

PowerColor Devil 13 Dual-Core R9 390 Specs

Model | AXR9 390 II 16GBD5
GPU | R9 390 x2
Core Clock (GPU) | 1000MHz
Stream Processors | 5120 (total)
Memory Config | 2x 8GB GDDR5 banks
Memory Interface | 2x 512-bit
Memory Speed | 1350MHz

PowerColor's Devil 13 video card is a three-slot, five-pound behemoth. It's a single-card solution branded as “dual-core,” pursuant to the existence of two R9 390 GPUs on the PCB. The weight primarily comes from a rigid backplate for structural support, opposed by a thick alloy heatsink under the faceplate. PowerColor goes heavy on the metals for this card, and that's something that we'll look at more closely in the forthcoming, standalone review – sag is definitely a concern. The card runs for $600 – similarly priced to dual-GTX 970s (ranging from ~$620-$660) – but has a pretty massive $100 MIR right now.

The Devil 13 uses a three-fan array for cooling, each fan using a five-fin scoop design. The power block is the most impressive element – 32 pins of power (4x8-pin headers). Granted, that's the same as you'd find for most dual-card configurations; it just looks impressive on one card.

PowerColor's Devil 13 is equipped with 16GB of GDDR5 memory, but that's not “stackable” memory for any of the games we tested; each GPU has access to its own 8GB GDDR5 pool, and they cannot share memory pools to exceed 8GB in DirectX 11.

At its stock speeds, the Devil 13 runs a core clock of 1000MHz and memory clock of 1350MHz. Overclocking won't be tested until the review.

EVGA GTX 970 SuperSC Specs

Spec | EVGA 970 Hybrid | EVGA 970 SSC | MSI GTX 970 Gaming | GTX 970 Stock
Base Clock (GPU) | 1140MHz | 1190MHz | 1140MHz | 1050MHz
Boost Clock (GPU) | 1279MHz | 1342MHz | 1279MHz | 1178MHz
Memory Clock | 7010MHz | 7010MHz | 7010MHz | 7000MHz
Mem Spec | 4GB GDDR5 | 4GB GDDR5 | 4GB GDDR5 | 4GB GDDR5

From our previous article:

Considerations of SLI

There are two primary scenarios where SLI or CrossFire is used: a later upgrade, when half of the configuration is already owned, and a day-one, brand-new build. In the first scenario – where the user already owns one GTX 970 and is considering a second – the value considerations are different and will be discussed in the conclusion. In that scenario, we'd also recommend buying the second card (if truly desired) before it exits production. Almost every time a card exits official production, prices spike at retailers and on second-hand markets; once that happens, it's almost always better to buy a newer, single card, as the spiked prices are hardly sane or good value.

SLI and CrossFire are also historically prone to micro-stuttering as a result of their dual-processing technique (normally AFR, or alternate frame rendering). With AFR, the GPUs render alternating frames; more explicitly, GPU A renders all odd frames (1, 3, 5, 7) while GPU B renders all even frames (0, 2, 4, 6). Micro-stutter can be so extreme in some games and driver sets that SLI becomes undesirable, even if average FPS is improved over single-card configurations. In these situations, disabling one of the two GPUs (while leaving it physically installed) will reduce or eliminate micro-stutter, but then only half of the investment is actually rendering – certainly an unwanted situation.
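As a minimal sketch of how AFR distributes work across two GPUs (illustrative only – not driver code):

```python
# Minimal sketch of alternate frame rendering (AFR) dispatch: with two
# GPUs, frames are assigned round-robin, so one GPU takes even-numbered
# frames and the other takes odd-numbered frames.

def afr_assign(frame_index: int, gpu_count: int = 2) -> int:
    """Return which GPU renders a given frame under AFR."""
    return frame_index % gpu_count

# Frames 0, 2, 4, 6 land on GPU 0; frames 1, 3, 5, 7 on GPU 1.
assignments = [afr_assign(i) for i in range(8)]
print(assignments)  # [0, 1, 0, 1, 0, 1, 0, 1]
```

Because each GPU only ever works on its own frames, a slow frame on either GPU disturbs the cadence of the whole output stream – the root of the micro-stutter discussed above.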

Micro-stutter is observable as a result of disparate frame-time gaps, where the time between frame renders is inconsistent enough that the user perceives a jarring difference – e.g., jumping from a 16ms render time to a 30ms or 40ms render time (or worse). Adaptive synchronization technologies have helped to mitigate this phenomenon, so monitors supporting G-Sync and FreeSync are of particular importance when running SLI or CrossFire. The GPU connected to the display manages the sync technology. NVidia SLI setups fully support G-Sync; AMD CrossFire setups, as of driver version 15.7 from July 2015, also fully support FreeSync.
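The frame-time gaps described above can be flagged programmatically. The sketch below is illustrative only; the 1.75x-median threshold is an arbitrary assumption for the example, not a perceptual standard:

```python
# Sketch: flag perceptible micro-stutter in a frame-time log (milliseconds).
# A frame is flagged when its render time jumps well above the running pace,
# e.g. a 16ms cadence interrupted by 30-40ms frames.

def stutter_frames(frame_times_ms, ratio=1.75):
    """Return indices of frames that took >= `ratio` times the median frame time."""
    ordered = sorted(frame_times_ms)
    median = ordered[len(ordered) // 2]
    return [i for i, t in enumerate(frame_times_ms) if t >= ratio * median]

# A mostly-16ms trace with two long frames (30ms and 41ms):
trace = [16, 16, 17, 30, 16, 15, 16, 41, 16]
print(stutter_frames(trace))  # [3, 7]
```

Note that a trace like this can still average a healthy FPS; it is the inconsistency, not the mean, that the user perceives.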

Another performance consideration for multi-GPU configurations – in a similar vein to micro-stutter – is that 1% and 0.1% low frame performance can sometimes be worse than on single-card setups. This is another point that could potentially favor a single, higher-end GPU, but recent optimizations to drivers and games may reduce the impact to manageable territory – we'll look at that below.

No overclocking was applied during these tests, which does mean that the slower of the two cards (the 970 Hybrid, at 1140MHz) will marginally impact overall performance. SLI overclocking has been reserved for one of our upcoming tests. We've already tested this 1140MHz vs. 1190MHz differential, if you're curious about what kind of delta it produces; we mostly saw differences in the 1.87% to 2.5% range.

Test Methodology

We tested using our 2015 multi-GPU test bench. Our thanks to supporting hardware vendors for supplying some of the test components.

The latest AMD drivers (15.12) were used for testing. NVidia's 361.43 drivers were used for testing the latest games. Game settings were manually controlled for the DUT. All games were run at presets defined in their respective charts. We disable brand-supported technologies in games, like The Witcher 3's HairWorks and HBAO. All other game settings are defined in respective game benchmarks, which we publish separately from GPU reviews. Our test courses, in the event manual testing is executed, are also uploaded within that content. This allows others to replicate our results by studying our bench courses.

Each game was tested for 30 seconds in an identical scenario, then repeated three times for parity. The results in the tables are averages of these three runs.

Z97 Bench:

GN Test Bench 2015 | Name | Courtesy Of | Cost
Video Card | This is what we're testing! | – | –
CPU | Intel i7-4790K | CyberPower | –
Memory | 32GB 2133MHz HyperX Savage RAM | Kingston Tech. | $300
Motherboard | Gigabyte Z97X Gaming G1 | GamersNexus | $285
Power Supply | NZXT 1200W HALE90 V2 | NZXT | $300
SSD | HyperX Predator PCI-e SSD | Kingston Tech. | TBD
Case | Top Deck Tech Station | GamersNexus | $250
CPU Cooler | Be Quiet! Dark Rock 3 | Be Quiet! | ~$60

X99 Bench:

GN Test Bench 2015 | Name | Courtesy Of | Cost
Video Card | This is what we're testing! | – | –
Memory | Kingston 16GB DDR4 Predator | Kingston Tech. | $245
Motherboard | EVGA X99 Classified | GamersNexus | $365
Power Supply | NZXT 1200W HALE90 V2 | NZXT | $300
SSD | HyperX Savage SSD | Kingston Tech. | $130
Case | Top Deck Tech Station | GamersNexus | $250
CPU Cooler | NZXT Kraken X41 CLC | NZXT | $110

Average FPS, 1% low, and 0.1% low framerates are measured. We do not report maximum or minimum FPS results, as we consider those numbers pure outliers. Instead, we take an average of the lowest 1% of results (1% low) to show real-world, noticeable dips; we then take an average of the lowest 0.1% of results to show severe dips.
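The low-percentile averaging described above can be sketched as follows (an illustrative sketch, not our actual analysis tooling):

```python
# Sketch of the 1% / 0.1% low metric: average the slowest slice of
# samples rather than reporting a single min-FPS outlier.

def percentile_low(fps_samples, fraction):
    """Average the lowest `fraction` of FPS samples (at least one sample)."""
    ordered = sorted(fps_samples)
    count = max(1, int(len(ordered) * fraction))
    slowest = ordered[:count]
    return sum(slowest) / len(slowest)

# 1000 samples at 60 FPS with a handful of dips to 30 FPS and one to 12 FPS:
samples = [60.0] * 990 + [30.0] * 9 + [12.0]
print(round(percentile_low(samples, 0.01), 1))   # 1% low   -> 28.2
print(round(percentile_low(samples, 0.001), 1))  # 0.1% low -> 12.0
```

Note how a single bad frame barely moves the 1% low but dominates the 0.1% low – which is why the two metrics separate "noticeable dips" from "severe spikes."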

Overclocking was performed incrementally using MSI Afterburner. Parity of overclocks was checked using GPU-Z. Overclocks were applied and tested for five minutes at a time and, if the test passed, would be incremented to the next step. Once a failure was provoked or instability found – either through flickering / artifacts or through a driver failure – we stepped down the OC and ran a 30-minute endurance test using 3DMark's FireStrike Extreme on loop (GFX test 2).
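The stepping procedure above can be sketched roughly as follows. `is_stable_at` is a hypothetical stand-in for the five-minute Afterburner/FireStrike stability pass, purely for illustration:

```python
# Hypothetical sketch of incremental overclock stepping: raise the offset
# one step at a time, stress-test each step, and back off to the last
# passing step on the first failure (artifacts / driver failure).

def find_stable_offset(is_stable_at, step_mhz=25, max_offset_mhz=300):
    """Step the core offset upward until instability, then keep the last stable step."""
    offset = 0
    while offset + step_mhz <= max_offset_mhz:
        candidate = offset + step_mhz
        if not is_stable_at(candidate):  # failure provoked; stop stepping
            break
        offset = candidate
    return offset  # highest passing offset; run the endurance loop on this next

# Simulated card that artifacts above a +150MHz offset:
print(find_stable_offset(lambda mhz: mhz <= 150))  # 150
```

The 25MHz step and 300MHz ceiling are arbitrary example values, not the increments used in our testing.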

Thermals and power draw were both measured using our secondary test bench, which we reserve for this purpose. Thermals are measured using AIDA64. We execute an in-house automated script to ensure identical start and end times for each test. 3DMark FireStrike Extreme (GFX test 2) is executed on loop for 25 minutes and logged. Parity is checked with GPU-Z.

Thermals, power, and overclocking were all conducted on the Z97 bench above.

Fallout 4 Benchmark – 970 SLI vs. R9 390 CrossFire, 980 Ti, 980, 390X


Fallout 4 was unexpectedly brutal in this particular benchmark. The SLI 970 configuration handily beats the CF 390s by 36.5%, which seems to be a result of non-existent CrossFire scaling in Fallout 4. This issue presents itself to some degree in a few other games, but is most notable in Fallout 4. Until scaling is properly supported on this configuration (Devil 13 with 2x R9 390s), it appears that Fallout 4 runs better on just about any other configuration.

Metro: Last Light Benchmark – 970 SLI vs. R9 390 CrossFire, 980 Ti, 980, 390X



At the lower two resolutions – 1080p and 1440p – the SLI GTX 970s marginally outperform the dual R9 390 configuration. 1080p shows a ~1FPS difference (0.87% delta); 1440p is, again, 1FPS different (1.13% delta). These measurements are outside of margin of error and were validated across multiple test passes. At 4K, the R9 390 setup pulls ahead by 3.51% (58FPS vs. 56FPS AVG on the 970s). This is similar to what we've seen on most of AMD's cards, which generally trend upward in relative performance as resolution increases.

At all three resolutions, performance is close enough to be inconsequential and unnoticeable to the end-user. There is effectively no perceptible difference in AVG FPS metrics. Even the 1% low and 0.1% low frametimes are respectable across the board here – respectable, but inconsequential for comparative purposes.

Shadow of Mordor Benchmark – 970 SLI vs. R9 390 CrossFire, 980 Ti, 980, 390X



The CF 390s finally pull ahead in Shadow of Mordor, leading the SLI 970s by 4.7% at 1080p (130 vs. 124FPS AVG) and 4.1% at 1440p (100FPS vs. 96FPS AVG). As with the previous test, 4K produces a slightly larger lead for the AMD solution (8% – 61FPS vs. 56.3FPS AVG). Again, not hugely noticeable: the 8% delta begins to enter the realm of possible detection by users, but isn't quite there yet. The average user will not detect the FPS difference between these two solutions.

As for frametimes, 0.1% low metrics run poorly on both the SLI and CrossFire configurations when compared against neighboring single-card alternatives; the 2x R9 390s run a bit worse. A 51.3FPS 0.1% low against an average throughput of 130FPS (1080p) will occasionally be perceived as a 'dip,' 'stutter,' or 'lag' (or another colloquialism).

Assassin's Creed Syndicate Benchmark – 970 SLI vs. 980 Ti, 970, 980, 390X


We technically have a line for the PCS Devil 13 above – but in reality, the card was more of a “DNF.” We were only able to get Assassin's Creed: Syndicate to survive for long enough to execute one test pass per game launch. The dual R9 390 setup experienced such frequent crashes in ACS that we could not operate the 1440p/ultra settings for longer than one minute in-game.

The data was included above because it was confidently collected, but from a user standpoint, ACS would have been unplayable in our tested configuration for the Devil 13. We have reached out to the appropriate parties and are researching this issue. It seems most likely to be some sort of driver or game optimization issue – not particularly surprising, as ACS has experienced SLI / CF scaling issues several times in the past.

COD: Black Ops III Benchmark – 970 SLI vs. 980 Ti, 970, 980, 390X



We've done extensive testing with Black Ops III, including a graphics optimization guide for performance tuning. The game has since received major patches which affected performance on NVidia and AMD devices, but most heavily on the AMD side. Our chart above reflects benchmarking on the most up-to-date patch.

At 1440p, the CrossFire R9 390s win out over the SLI GTX 970s by a measurable 10.24% (154FPS vs. 139FPS). At such framerates, it becomes debatable how useful an extra 15 frames realistically is, but users hoping to saturate a 144Hz display would benefit from the 2x R9 390s in this case. To be fair, performance tuning on the 970s could produce similar FPS – but at maxed settings, the 390s punch a bit higher. The GTX 970s have superior 1% lows, but the gap between 65.3 and 72FPS is effectively imperceptible at the end of the day.

4K punishes the GTX 970s in the 0.1% low department, something we saw reflected across two re-tests (nine total test passes) for validation.

The Witcher 3 Benchmark – 970 SLI vs. 980 Ti, 970, 980, 390X



The Witcher 3 favors the SLI GTX 970s, which hold a large 22.2% advantage at 1440p and a 40% advantage at 4K. The CF 390s exhibit significantly worse 0.1% and 1% low frametimes (more than 100% worse than the 970s), something we observed 'clinically' as framedrops while testing.

GTA V Benchmark – 970 SLI vs. 980 Ti, 970, 980, 390X



GTA V had some anomalous issues with the CrossFire configuration during testing. Very rarely – about once per four-minute test period – we observed a significant freeze in performance which endured for about 1-1.5 seconds. This is reflected in abysmal low frametimes for some tests, depending on whether or not the issue reproduced during the test period.

At 1080p, the SLI 970s are advantaged by a noticeable 17.86% (122FPS vs. 102FPS AVG). More noticeable is the 0.1% low metric at 77.3FPS vs. 34.7FPS for the 2x R9 390s. 1440p saw a 6.1% AVG FPS delta that favored the 970s, but severely low 0.1% output (8FPS) on the 2x 390s prohibited smooth play.

Just Cause 3 Benchmark – SLI 970 vs. 980 Ti, 980, 970, 390X



We've already ruled that Just Cause 3 has poor multi-GPU scaling – twice, actually – but there's no harm in re-testing. Our experience with Just Cause 3 has been disappointing for multi-card setups. Even as SLI scaling has improved, the delta against a single card is small enough that it'd be a waste to run two cards rather than a single, more powerful one. This is especially true for the 390s, where performance seems to indicate that there's no CF scaling at all with the Devil 13.

Performance Data Recap

Game | AVG FPS Winner | Delta | Notes | Noticeable?
MLL | 1080: SLI 970; 1440: SLI 970; 4K: CF 390 | – | 980 Ti loses ground as resolution increases. | No
FO4 | 1080: SLI 970 | 36.50% | CF scaling does not seem to work with the 2x 390 Devil 13. | Yes!
Mordor | 1080: CF 390; 1440: CF 390; 4K: CF 390 | – | Poor 0.1% low frametimes on CF 390 (and SLI, though not as bad). | No, though 4K begins to enter this territory.
W3 | 1440: SLI 970; 4K: SLI 970 | – | Bad low frametimes on CF 390 (102% worse than SLI 970). | Yes!
BLOPS3 | 1440: CF 390; 4K: CF 390 | – | 970 has better low frametimes at lower resolutions, but falls hard at 4K. | Yes – pushes into 144Hz range, desirable for competitive FPS.
ACS | 1440: SLI 970 | N/A | DNF. CF 390 crashed ACS. | Yes – CF 390 DNF.
JC3 | Both SLI & CF are losers here. | N/A | No reasonable SLI or CF scaling in JC3 at this time. Very bad value to buy multi-card for this game. | N/A
GTA | 1080: SLI 970; 1440: SLI 970; 4K: CF 390 | – | SLI 970 noticeably better at 1080. Disparity vanishes as resolution increases. | Yes at 1080; no at 1440/4K.

Power Draw Benchmark – SLI GTX 970s vs. CrossFire R9 390s


Here's where we start seeing differentiating stats. The PCS dual R9 390 card puts peak system load at 582.73W, compared against the SLI GTX 970s' 411.72W. A ~171W gap is certainly noticeable and will impact PSU selection. The performance gain – when there is one – does not seem to correlate with the 34.4% difference in power.
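For reference, the ~171W gap and the 34.4% figure are consistent with a percent difference taken against the midpoint of the two readings – a sketch of the arithmetic, not our measurement tooling:

```python
# Reproduce the power-draw delta from the two peak system readings.
cf_390_watts = 582.73   # peak system draw, CrossFire R9 390s (Devil 13)
sli_970_watts = 411.72  # peak system draw, SLI GTX 970s

delta_watts = cf_390_watts - sli_970_watts
midpoint = (cf_390_watts + sli_970_watts) / 2
percent_difference = delta_watts / midpoint * 100  # difference vs. midpoint

print(round(delta_watts, 2))         # 171.01
print(round(percent_difference, 1))  # 34.4
```

Relative to the 970s' draw alone, the increase is larger still (~41.5%), which is why PSU headroom deserves real attention here.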

Thermal Benchmark – Devil 13 R9 390 & GTX 970 SSC


This chart and data will be discussed more heavily in our future review of the card, since thermals are more heavily impacted by coolers and AIB partners in multi-GPU situations. The cards operate about 1C apart – but that's pretty useless for shoppers of CrossFire configurations, since the thermals will hinge entirely upon which two cards you buy. The thermal measurement is only useful in scenarios where the Devil 13 is considered. We don't objectively measure dBA output (though may begin soon), but can subjectively state that the Devil 13 is the loudest card we've ever tested. It's within 'unpleasant' territory when operating at full-tilt.

Conclusion: SLI GTX 970s or CrossFire R9 390s?


The short answer, as we declared in the “SLI 970s or 980 Ti?” article, is “it depends.” It is still safer to trend towards similarly-priced single-GPU configurations; such a purchase avoids buyer's remorse in situations like those generated by Just Cause 3, where scalability may as well not exist. This stated, existing PC builds which already contain one of the two GPUs could benefit from an upgrade – and it'd be in the ~$300-$350 range (GTX 970 or R9 390), not in the $600+ range for a new build.

For a brand new system build, we'd generally advise in favor of a single-GPU for its versatility. Some specific games are so heavily advantaged by multi-GPU – Black Ops III in particular, for both the 970s and CF 390s – that it'd be worth doing, but only if that game is going to be a primary source of entertainment or competition.

Our advice is to look over the charts relevant to your gaming, read the accompanying text, and determine what's best for you.

Do note that running two R9 390s will draw substantially more power than two 970s (and more still than a single Fury X or 980 Ti). Adjust PSU purchases appropriately.

Editorial & Testing: Steve “Lelldorianx” Burke
Video & Video Editing: Keegan “HornetSting” Gallick