At its suite at CES this year, AMD was showing off a couple of new technologies. First, we got to see the upcoming Polaris GPU architecture in action running Star Wars Battlefront with some power meters hooked up. This is a similar demo to what I saw in Sonoma back in December, and it compares an upcoming Polaris GPU against the NVIDIA GTX 950. The result: total system power of just 86 watts on the AMD GPU and over 150 watts on the NVIDIA GPU.
Another new development from AMD on the FreeSync side of things was HDMI integration. The company took time at CES to showcase a pair of new HDMI-enabled monitors working with FreeSync variable refresh rate technology.
PC Perspective's CES 2016 coverage is sponsored by Logitech.
Follow all of our coverage of the show at https://pcper.com/ces!
Working 14nm silicon! Go AMD! I hope they are first to market, since they have GlobalFoundries/Samsung while Nvidia is only on TSMC.
If they are the first to market, then it is actually even better for Nvidia.
I wonder if a G-Sync monitor's HDMI port would be FreeSync compatible with the right drivers. If that's the case, it would ruin Nvidia's ecosystem, and people would be able to switch vendors more freely without feeling trapped by the extra investment in the monitor.
I don't think it's going to happen. First, Nvidia will not let any future G-Sync monitors also support FreeSync of any kind, either through a second DP or through HDMI. As for older monitors, I doubt they have the capability of supporting FreeSync via a firmware update.
What will happen in the future is that someday Intel will support Adaptive Sync, and then Nvidia will be left alone to either come up with new innovations that give G-Sync value again, or finally support Adaptive Sync. Of course, with 80% of the discrete card market, Nvidia can continue its greedy practices and keep taking $200 from every Nvidia card owner who wants adaptive sync in his system.
Well, if a monitor supports G-Sync, then its panel can support FreeSync, and Nvidia cannot release a monitor without HDMI. AMD didn't say HDMI 2.0 only, so apparently it works over 1.4 if I'm not wrong. So technically a G-Sync monitor could be made FreeSync compatible through a driver update, and I don't see why a monitor vendor wouldn't add it; that is still quite a bump in potential sales, unless Nvidia pays them for exclusivity.
That's about already-released monitors; even if I am wrong on this, future G-Sync monitors will most likely support FreeSync too.
Anyhow, I think by the end of this year Nvidia will either release some unique features for G-Sync or throw it away and get on the Adaptive Sync train.
Why would anyone buy a G-Sync + FreeSync monitor that will cost $200 more than just the HDMI FreeSync monitor?
It might be expensive, but using such a monitor will not lock you to AMD or Nvidia. You can say it is Nvidia's fault for not supporting Adaptive Sync, but the simple truth remains that using an Adaptive Sync monitor forces you to use an AMD GPU. It doesn't matter whether Intel supports Adaptive Sync or not, because Intel does not sell discrete GPUs.
Unfortunately, FreeSync over HDMI changes nothing. What you see in the market today, with G-Sync-only and FreeSync-only monitors, will continue. Only Nvidia can change that. Why am I saying that HDMI changes nothing? Because it would have been extremely easy for monitor vendors to add a second DP for FreeSync support. Why haven't they done it, especially these last months? Because Nvidia forbids it. Also, on those monitors the Adaptive Sync capability is probably on the G-Sync module itself, so by using the HDMI port with a card that supports FreeSync over HDMI, I don't think the monitor will report any Adaptive Sync support. Nvidia holds all the keys here, and if there is any way to hack it, you can bet the next version of the G-Sync module will prevent it.
The G-Sync module polls the GPU for its vendor ID and immediately locks out variable refresh rate if anything other than whatever the Nvidia equivalent of "GenuineIntel" gets returned. There's also that.
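Nothing about the module's firmware is public, so purely as a hypothetical sketch of the kind of vendor-ID gate described above, the logic might look like the snippet below; the function names and the "NVIDIA Corporation" string are placeholders I made up, not documented behavior.

```python
# Hypothetical illustration only: placeholder names, not Nvidia's actual firmware logic.

EXPECTED_VENDOR = "NVIDIA Corporation"  # assumed analogue of CPUID's "GenuineIntel"


def query_gpu_vendor() -> str:
    """Stand-in for whatever handshake the module performs with the attached GPU."""
    return "NVIDIA Corporation"  # stubbed response so the sketch runs


def variable_refresh_allowed() -> bool:
    # Lock out VRR unless the reported vendor matches the expected string.
    return query_gpu_vendor() == EXPECTED_VENDOR


if __name__ == "__main__":
    print("VRR enabled" if variable_refresh_allowed() else "VRR locked out")
```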
I doubt it.
The G-Sync module replaces the scaler unit, which is also part of the issue with supporting more than one input.
I assume HDMI via G-Sync as a separate input will be "G-Sync v2.0" and higher only, thus still using the same module, perhaps with a switch between inputs.
FreeSync assumes the scaler is there, and of course G-Sync drivers are needed to support the G-Sync module.
We'll have an open standard once it makes financial sense for Nvidia to do this, and that's probably a few years off. Though if we could get HDTV adoption and enable FreeSync on the new consoles (PS4, XB1), that would be amazing and help push an open standard, as SteamOS would adopt it too (though it would only work on AMD cards at first).
PCPer covered a lot of cool things this week, but still missed one from AMD: HDR.
https://www.youtube.com/watch?v=MnvctltAKLE
And yes, all the HDR monitors/TVs at CES were running R9 Fury Nano cards, which was confirmed by AMD on their Twitter feed this week.
Which is even more tech that Nvidia is just going to blow off.
HDR (with or without 10bpp) is not exactly a new feature. Lost Coast was over a decade ago, every GPU supports switching between PC and TV levels (a small dynamic range change) and multiple colourspaces, including arbitrary colourspaces with user-entered lookup tables.
For the last few decades the limitation has not been GPU or driver support, but software actually supporting esoteric video standards. Games would need to support larger dynamic ranges, different colourspaces and/or increased bit depth to make a difference.
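As a concrete illustration of the "PC vs TV levels" remapping mentioned above, here is a minimal sketch of the standard 8-bit full-range to limited-range luma conversion; it is a generic example of that small dynamic-range change, not anything specific to AMD's HDR demo.

```python
# Standard 8-bit luma remapping between "PC levels" (full range, 0-255)
# and "TV levels" (limited range, 16-235); chroma would map to 16-240.

def full_to_limited(value: int) -> int:
    return 16 + round(value * 219 / 255)

def limited_to_full(value: int) -> int:
    # Clamp so "blacker than black" / "whiter than white" codes stay in range.
    return max(0, min(255, round((value - 16) * 255 / 219)))

assert full_to_limited(0) == 16 and full_to_limited(255) == 235
assert limited_to_full(16) == 0 and limited_to_full(235) == 255
```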
I think it is likely that Sony will adopt FreeSync for the PS4 as well as for their 2016 TVs. Considering that some of their existing 4K TVs accept 1080p/120Hz, this would have many of us interested.
I have to admit that the power consumption numbers from that Polaris demo blow my mind. Is my mental math completely off, or could we be talking ~30 W of GPU power in that demo?
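A rough way to sanity-check that guess, with the "rest of system" range below being an assumption rather than anything AMD disclosed at the demo:

```python
# Back-of-the-envelope estimate; the rest-of-system figure is assumed, not measured.

TOTAL_SYSTEM_W = 86            # AMD's quoted wall power for the Polaris system
REST_OF_SYSTEM_W = (50, 60)    # assumed CPU/board/RAM/PSU-loss draw during the demo

low = TOTAL_SYSTEM_W - max(REST_OF_SYSTEM_W)   # 26 W
high = TOTAL_SYSTEM_W - min(REST_OF_SYSTEM_W)  # 36 W
print(f"Estimated Polaris GPU power: {low}-{high} W")
```

Under those assumptions the card would indeed land somewhere in the 25-35 W neighbourhood.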
I am less enthusiastic, as there are already Intel/Nvidia notebooks capable of the same under 100 watts.
Really? Are you sure? Isn’t the GTX 950 one of Nvidia’s most efficient cards? And after all, all current Nvidia cards are based on the same microarchitecture, so other than some mobile designs possibly going “slow and wide” for added efficiency, as well as lacking a few components a desktop card would have, they are the same.
What you said really makes a difference. The game chosen was also not an accident. IIRC AMD worked with EA on Mantle support for Battlefield, so they know it pretty well, don't you agree?
The chosen game is listed on Nvidia's list of optimised games; what game would you have them use, Fallout 4?
I am just saying we should take it with a grain of salt.
My estimate is that AMD Polaris perf/watt (as is) is ~30% better than Nvidia Maxwell in this particular game. I think it can give AMD a noticeable lead only if Nvidia is late to 14/16nm and we take drivers out of the equation.
Personally I hope for quick adoption of DX12/Vulkan, as it can be a good opportunity for other GPU vendors to compete in the PC graphics card market.