Another retail card reveals the results
It looks like AMD might still have some issues on their hands with the R9 290 series of cards
Since the release of the new AMD Radeon R9 290X and R9 290 graphics cards, we have been very curious about the latest implementation of AMD's PowerTune technology and how it scales clock frequency based on the thermal state of each graphics card. In the first article covering this topic, I addressed the question from AMD's point of view: is this really a "configurable" GPU as AMD claims, or are there issues that need to be addressed by the company?
The biggest problems I found were in the highly variable clock speeds from game to game and from a "cold" GPU to a "hot" GPU. This affects the way many people in the industry test and benchmark graphics cards, since running a game for just a couple of minutes could produce average and reported frame rates much higher than what you see 10-20 minutes into gameplay. This was rarely something that had to be dealt with before (especially on AMD graphics cards), so it caught many off guard.
Because of the new PowerTune technology, as I have discussed several times before, clock speeds on the R9 290X start off quite high (at or near the quoted 1000 MHz) and then slowly drift down over time.
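To picture why a cold card clocks higher than a hot one, here is a deliberately oversimplified sketch of temperature-target throttling. To be clear, this is not AMD's actual PowerTune algorithm (the real thing also models power draw, voltage, and workload activity); the constants and step sizes are purely illustrative.

```python
# Hypothetical, simplified sketch of temperature-target clock throttling.
# Illustrates why clocks start high on a cold GPU and drift down as it heats.

TEMP_TARGET_C = 95       # Hawaii's advertised thermal target
CLOCK_MAX_MHZ = 1000     # the "up to" clock quoted by AMD
CLOCK_STEP_MHZ = 5       # made-up step size per control tick

def adjust_clock(current_clock_mhz: int, gpu_temp_c: float) -> int:
    """Step the core clock toward whatever the thermal target allows."""
    if gpu_temp_c > TEMP_TARGET_C:
        # Too hot: shed clock speed (and with it, power and heat).
        return max(current_clock_mhz - CLOCK_STEP_MHZ, 300)
    if current_clock_mhz < CLOCK_MAX_MHZ:
        # Thermal headroom available: climb back toward the maximum.
        return current_clock_mhz + CLOCK_STEP_MHZ
    return current_clock_mhz
```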
Another wrinkle emerged when Tom's Hardware reported that retail graphics cards they had seen were showing markedly lower performance than the reference samples sent to reviewers. As a result, AMD quickly released a new driver that attempted to address the problem by normalizing fan behavior to a target speed (RPM) rather than a percentage of maximum fan voltage. The result was consistent fan speeds across different cards and thus much closer performance.
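The distinction matters because the same percentage setting can spin physically different fans at noticeably different speeds. A hedged, hypothetical sketch of the two control styles (the gain and limits here are made up for illustration) looks something like this:

```python
# Hypothetical illustration of the driver change: instead of commanding a
# fixed duty-cycle percentage (open loop, where the same 40% can spin two
# fans at different speeds), the controller measures actual RPM and nudges
# the duty cycle until the target RPM is reached (closed loop).

def open_loop_duty(percent: float) -> float:
    """Old behavior: same percentage on every card, varying real RPM."""
    return percent

def closed_loop_duty(duty: float, measured_rpm: float,
                     target_rpm: float, gain: float = 0.00002) -> float:
    """New behavior: correct the duty cycle based on the RPM error."""
    error = target_rpm - measured_rpm
    return min(max(duty + gain * error, 0.0), 1.0)

# Run every control tick, e.g.: duty = closed_loop_duty(duty, rpm_now, 2200)
```

Run continuously, the closed loop converges on the same measured RPM no matter which fan is attached, which is presumably why the updated driver produced such consistent fan speeds across cards.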
However, with all that said, I have continued testing retail AMD Radeon R9 290X and R9 290 cards that were PURCHASED rather than sampled, to keep tabs on the situation.
After picking up a retail, off-the-shelf Sapphire-branded Radeon R9 290X, I set out to do more testing. This time, though, rather than simply gaming for a 5 minute window, I decided to loop gameplay in Metro: Last Light for 25 minutes at a resolution of 2560×1440 with Very High quality settings. The results, as you'll see, are pretty interesting. The "reference" card labeled here is the original R9 290X sampled to me directly by AMD.
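For anyone who wants to reproduce this kind of run, the logging itself is simple. The sketch below is a rough outline of the process rather than the exact tooling used here; read_core_clock_mhz() is a placeholder you would wire to whatever monitoring utility you prefer (GPU-Z and MSI Afterburner both expose this data).

```python
import csv
import time

def read_core_clock_mhz() -> float:
    # Placeholder: hook this up to GPU-Z, Afterburner, or a vendor API.
    raise NotImplementedError("replace with your monitoring tool's output")

def log_clocks(path: str, duration_s: int = 1500, interval_s: float = 1.0) -> None:
    """Sample the GPU core clock once per second across a 25 minute game loop."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "core_clock_mhz"])
        start = time.time()
        elapsed = 0.0
        while elapsed < duration_s:
            writer.writerow([round(elapsed, 1), read_core_clock_mhz()])
            time.sleep(interval_s)
            elapsed = time.time() - start
```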
Our first set of tests shows the default Quiet mode on the R9 290X.
For nearly the first 3 minutes of gameplay, both cards perform identically and are able to stick near the 1.0 GHz clock speed advertised by AMD and its partners. At that point, though, the blue line, representing the Sapphire R9 290X retail card, starts to drop its clock speed, settling around the 860 MHz mark.
The green line lasts a bit longer at 1000 MHz, until around 250 seconds (just over 4 minutes) have elapsed, then it too starts to drop in clock speed. But the decrease is not nearly as dramatic: clocks seem to hover in the mid-930 MHz range.
In fact, over the entire 25 minute period (1500 seconds) shown here, the retail R9 290X card averaged 869 MHz (including the time at 1.0 GHz at the beginning) while the reference card sent to us by AMD averaged 930 MHz. That works out to a 6.5% clock speed delta, which should almost perfectly match the performance difference in games that are GPU limited (most of them).
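As a quick sanity check on that figure, the only inputs are the two averages quoted above:

```python
retail_avg_mhz = 869       # Sapphire retail card, 25 minute average
reference_avg_mhz = 930    # AMD press sample, 25 minute average

delta = (reference_avg_mhz - retail_avg_mhz) / reference_avg_mhz
print(f"Clock speed deficit: {delta:.2%}")  # -> 6.56%, the ~6.5% quoted above
```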
The fan speed adjustment made by AMD with the 13.11 V9.2 driver was functioning as planned, though – both cards ran at the expected 2200 RPM level and ramped up nearly identically as well.
But what changes if we switch over to Uber mode on the R9 290X, the setting that enables a 55% fan speed and, with it, more noise?
You only see the blue line of the Sapphire results here because it is drawn over the green line of the reference card – both are running at essentially the same performance levels and nearly maintain the 1000 MHz frequency across the entirety of the 25 minute gaming period. The retail card averages 996 MHz while the reference card averages 999 MHz – pretty damn close.
However, what I found very interesting is that the cards did this at different fan speeds. It would appear that the 13.11 V9.2 driver did NOT normalize fan speeds for Uber mode, as the 55% setting reported on both cards results in fan speeds that differ by about 200 RPM. That means the blue line, representing our retail card, is going to run louder than the reference card, and not by a tiny margin.
The Saga Continues…
As we approach the holiday season, I am once again left with information that casts the retail R9 290X cards in a bad light relative to the sampled ones, but without enough data to draw any kind of definitive conclusion. In reality, I would need dozens of R9 290X or R9 290 cards to make a concrete statement on the methods AMD is employing, but unfortunately my credit card wouldn't appreciate that.
Even though we are only showing a single retail card against a single sampled R9 290X from AMD directly, these reports continue to pile up. The 6.5% clock speed difference we are seeing seems large enough to warrant concern, but not large enough to start a full-on battle over it.
My stance on the Hawaii architecture and the new PowerTune technology remains the same even after this new data: AMD needs to define a "base" clock and a "typical" clock that users can expect. Otherwise, we will continue to see reports on the variance that exists between retail units. The quick fix AMD's driver team implemented, normalizing fan speed on RPM rather than percentage, clearly helped, but it has not addressed the issue entirely.
Here's hoping AMD comes back from the holiday with some new ideas in mind!
- AMD Radeon R9 290X 4GB – $549 (Newegg.com)
- AMD Radeon R9 290 4GB – $399 (Newegg.com)
- AMD Radeon R9 280X 3GB – $299 (Newegg.com)
- NVIDIA GeForce GTX TITAN 6GB – $999 (Newegg.com)
- NVIDIA GeForce GTX 780 Ti 3GB – $699 (Newegg.com)
- NVIDIA GeForce GTX 780 3GB – $499 (Newegg.com)
- NVIDIA GeForce GTX 770 2GB – $329 (Newegg.com)
AMD always has quality issues with their flagship: Crossfire, weird anomalies in retail cards, noise, heat, and power draw all way too high, and when they get caught with their pants down, it takes forever for them to fix it. Pros buy Nvidia 80% of the time because they have to. They really don't have a choice.
True that!
Well thanks for that bout of pointless speculation and hearsay. Well done.
Those are the same issues Nvidia had with past cards.
True that!
I have one explanation: Ryan Shrout = Fanboy
It's obvious.
^ Troll, ignore.
^ Troll, ignore.
^ TROLL, IGNORE
Ryan, if you check out Legit's article, you'll see the problem seems to be caused by the BIOS: when LR flashed the review card's BIOS onto a shop-bought one, they performed identically.
http://www.legitreviews.com/amd-radeon-r9-290x-press-sample-versus-retail_129583
Any chance of you trying this and updating your article with the results?
I'm tired of reading about the fanboys on both sides of this issue in the comments. My god, people, grow up; you act like 10 year olds. For one, stop posting anonymously and use a name if you wish to bash one side or the other. My guess is you're too chicken shit to do so; please prove me wrong. The key is to do your research by looking at many different sites and coming to your own conclusion about what works best for YOU. Screw off, you fanboys on both sides; you're just showing that you're stuck at the age of 10 and cannot grow up and have an adult discussion. Thanks for all the reviews, ignore the fanboy asses, and keep up the good work for all of us.
the 6er
Says the anonymous guy… hypocrite.
What voltages were the cards at? It could be that Sapphire was being generous and gave their card 1.2V as the stock voltage, which is much too high for 1 GHz on GCN. 1.15V is enough for my terrible (1.285V gives 1190 MHz) 7970 to do 1150 MHz, so this one should only need 1.1V to get 1 GHz.
It's not rocket science to say what's going to fix these cards: it's going to take a series of BIOS updates along with new coolers. I personally saw a lot of issues with the 7000 series cards after I got my 7850s earlier this year.
Case in point: I had issues with my 2GB ASUS Radeon 7850s in various games, including Diablo 3, Mass Effect 3, and Skyrim, until I flashed them one by one to the GOP UEFI BIOS from ASUS. That fixed a lot of problems, including the driver crashing when alt-tabbing out of games. I'm assuming ASUS got the BIOS from AMD and then fine-tuned it for their cards.
I don’t hate either company, but I have seen products from both companies that were rushed out the door before they were truly ready, all in the name of profit and worshipping the God Of Money, Payola!!!
I have yet to see a review of any of the ASUS R200 series cards anywhere on the internet. Most of the mid-range cards, not just the high-end ones, have dual-fan HSFs on them; it would be nice to see a review of the mid-range ASUS R200 series cards.
As much as dyed-in-the-wool ATI fans would like to deny the poor QC from ATI add-in-board partners, mixed with half-baked or completely ill-thought-out engineering decisions, in both hardware and software, from ATI itself, I think anyone rational, without skin in the game or a purchase dragging at their mind, has noticed a pattern in things like this.
I myself made the mistake of purchasing 3x 5870s when they were widely lauded as the best cards at the time, praised for incredible multi-GPU scaling, quiet performance, and good build quality.
Perhaps this was true of the 4XXX series, but it certainly wasn't true of the 5870 family of cards. Constant grey screens at 2560×1600, awful driver problems, terrible multi-GPU scaling, with, at that point in time, NO OPTION to force AFR behaviors on a per-application basis. What a joke those cards were. Even today they have issues with hardware-accelerated Flash. They performed so poorly I GAVE THEM AWAY.
Since then, I've had 580 SLI and a GTX 780. Each laptop I've purchased has been Nvidia, and any card I buy as a present will be Nvidia. Fact of the matter is, when we stop hearing stories like this, I may believe ATI has turned around. But it is JUST like them to sweep a problem like this under the rug right after release, and persist in doing so for months, before finally copping to the idea that there is a problem, and slowly working to fix it, before giving up, like they did with the 5870 class of card.
AMD needs to show and prove, not tell and talk, like it is doing now.
Good, we even it out, because after Tom Petersen's condescending, half-truth, regurgitated-bile answer of "we may in the future" to Ryan's question of "will G-Sync be made available cross-platform," I will never buy another Nvidia product.
The answer was and is "no." Had he said, "Nope, let them make their own!" I could have respected that… they made it, after all.
nite nite