UPDATE: If you missed our live stream today that covered NVIDIA G-Sync technology, you can watch the replay embedded below. NVIDIA's Tom Petersen stops by to talk about G-Sync in both high level and granular detail while showing off some demonstrations of why G-Sync is so important. Enjoy!!
Last week NVIDIA hosted press and developers in Montreal to discuss a couple of new technologies, the most impressive of which was NVIDIA G-Sync, a new monitor solution that looks to solve the eternal debate of smoothness against latency. If you haven't read about G-Sync and how impressive it was when first tested on Friday, you should check out my initial write-up, NVIDIA G-Sync: Death of the Refresh Rate, which not only covers those first impressions but also dives into why the technology shift was necessary in the first place.
G-Sync essentially functions by altering and controlling the vBlank signal sent to the monitor. In a normal configuration, vBlank is a combination of the vertical front and back porch and the necessary sync time. That timing is set at a fixed stepping that determines the effective refresh rate of the monitor: 60 Hz, 120 Hz, etc. What NVIDIA will now do in the driver and firmware is lengthen or shorten the vBlank signal as desired, sending it when one of two criteria is met (sketched in code after the list below):
- A new frame has completed rendering and has been copied to the front buffer. Sending vBlank at this time tells the screen to grab data from the card and display it immediately.
- A substantial amount of time has passed and the currently displayed image needs to be refreshed to avoid brightness variation.
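To make those two criteria concrete, here is a minimal sketch of the decision logic in C. Every name in it (gsync_update, frame_ready_in_front_buffer, time_since_last_vblank_us, send_vblank) and the ~33 ms hold ceiling are hypothetical illustrations only; NVIDIA has not published the actual driver or firmware interfaces.

```c
#include <stdbool.h>
#include <stdint.h>

/* All names below are hypothetical; NVIDIA's real driver/firmware
 * interfaces are not public. The 33 ms ceiling (a ~30 Hz floor) is an
 * assumption for illustration. */
#define MAX_HOLD_US 33333u

extern bool     frame_ready_in_front_buffer(void); /* render complete, frame in front buffer */
extern uint64_t time_since_last_vblank_us(void);   /* microseconds since last refresh */
extern void     send_vblank(void);                 /* trigger a panel scan-out */

/* Polled by the driver/firmware: issue vBlank when either criterion is met. */
void gsync_update(void)
{
    if (frame_ready_in_front_buffer()) {
        /* Criterion 1: a new frame is ready, so display it immediately. */
        send_vblank();
    } else if (time_since_last_vblank_us() >= MAX_HOLD_US) {
        /* Criterion 2: the panel has held the last frame too long, so
         * refresh it again to avoid brightness variation. */
        send_vblank();
    }
}
```

The key design point is that vBlank becomes an event driven by the renderer rather than a fixed clock, which is exactly the shift described next.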
In current display timing setups, the submission of the vBlank signal has been completely independent of the rendering pipeline. The result was varying frame latency and either horizontal tearing or fixed refresh frame rates. With NVIDIA G-Sync creating an intelligent connection between rendering and frame updating, the display of PC games is fundamentally changed.
Every person who saw the technology, including other media members and even developers like John Carmack, Johan Andersson and Tim Sweeney, came away knowing that this was the future of PC gaming. (If you didn't see the panel that featured those three developers on stage, you are missing out.)
But it is definitely a complicated technology and I have already seen a lot of confusion about it in our comment threads on PC Perspective. To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Monday afternoon where he will run through some demonstrations and take questions from the live streaming audience.
Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we discuss G-Sync, how it was developed and the various ramifications the technology will have in PC gaming. You'll find it all on our PC Perspective Live! page on Monday, but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA G-Sync Live Stream
11am PT / 2pm ET – October 21st
PC Perspective Live! Page
We also want your questions!! The easiest way to get them answered is to leave them for us here in the comments of this post. That will give us time to filter through the questions and get the answers you need from Tom. We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event, but often there is a lot of noise to deal with.
So be sure to join us on Monday afternoon!
I saw on the NeoGAF forum that one rep from Nvidia stated that G-Sync will give 3D Vision superior performance, so no worry boys, Nvidia will not give up on 3D 🙂
But please ask more about how G-Sync will work, and tell him that the 3D Vision community feels very cheated, as Nvidia doesn't answer even one reply on their own forum, so we feel left out in the cold.
So what is the future for Nvidia 3D and 3D Vision + G-Sync? Ask him that for us 3D gamers.
Dear Tom,
G-Sync seems great.
Any updates on dual DVI tiled monitor support on Geforce?
Thank you
Oskar
Can G-Sync be used when LightBoost is enabled in 2D mode?
If not, do you have any plans for this?
I’ve been using LightBoost lately to play games in 2D (not stereoscopic 3D). It makes such a huge difference in the perception of blur in the image that I simply cannot play without it. But it does make anomalies like stutter and tearing even more noticeable (at least to my eyes), making it necessary to use V-sync to lock in the fps to get the best effect. So my question is this: how will G-Sync work in conjunction with LB? Will they be separate features, or will they work in harmony with one another (i.e., the strobing rate of LB dynamically changing with the framerate, similar to the behavior of G-Sync with the refresh rate)? I personally am hoping for the latter, of course. Oh, and just when will LB for 2D gameplay become a native feature in the control panel software?
Hi! I’m a flightsimmer and use Microsoft's Flight Simulator (FS2004 and FSX). I have the option to lock the fps in-game to anything between 0-90fps. How will this play along with G-Sync? FSX especially is a demanding piece of software and usually gives fps between 30-60, so using 1/2 refresh rate vsync (30Hz) and limiting to 30fps in-game has been the best solution so far.
How labor intensive will it be to install the G-Sync module in current Asus VG248QE monitors?
Can a 4K monitor have 3D Vision support and a G-Sync module too?
Is the DIY G-SYNC kit usable on other monitor types? If not, are there any plans to release upgrade kits for the most common higher-end brands (Dell UltraSharp, etc.)?
Monitors are infrequently replaced, so I would think it would be in NVIDIA’s best interest to release upgrade kits for the most popular monitor brands to encourage adoption, unless this is cost-prohibitive or the display companies are planning to gouge customers with new G-SYNC monitors.
I second this:
“But please ask more about how G-Sync will work, and tell him that the 3D Vision community feels very cheated, as Nvidia doesn't answer even one reply on their own forum, so we feel left out in the cold.
So what is the future for Nvidia 3D and 3D Vision + G-Sync? Ask him that for us 3D gamers.”
Please tell the boys at Nvidia that 3D Vision gaming is alive and kicking. Having good 3D support is a major factor in my decision whether or not to buy a given game.
Why do they not even mention 3D Vision in their presentations at all?
So please ask them about the future of 3D vision.
thanks
I’d like to know more about nVidia’s plans for ongoing 3D Vision support, as the number of 3D-ready titles has been in sharp decline. Why is it that the community is able to fix a lot of games themselves, but the experts at nVidia can't improve support?
Nvidia, please stop making and putting more resources into Nvidia Shield. Nobody wants that thing; it was a complete waste of time and money. What the high-end enthusiast wants is a completely bezel-less monitor. A monitor of this type would push enthusiasts to purchase more than one monitor and most likely more than one video card $$$
I’m curious to know whether Nvidia has its sights on just revising the TN panels favoured by gamers, or if, down the line, this would be implemented in newer generations of IPS panels, 1440- and 1600-line monitors, emerging 4K monitors, etc. How “ready” is G-Sync to be implemented in, say, larger IPS panels? Is this a “Qx 2014” eventuality, or something more vaguely down the line? Makes me wonder if, eventually, this innovation might expand beyond Nvidia proprietary technology.
official launch dates?
I’d like to know where 3D Vision is going as well. Does G-Sync’s mentioned improvement for 3D Vision mean they’ll keep supporting/improving it?
1) Will we be able to see the difference through the stream?
Since my monitor and the recording technology still sample at 30 or 60 fps, will we really be able to see the benefits? Even the videos posted online were not entirely representative.
2) If I use this in Surround, can I have only the main central monitor using G-Sync and the other two as regular monitors, since they are really only in my peripheral vision anyway?
This is the second coming of PhysX.
Don't ever expect this to work with AMD cards.
I now swear to Allah or someone like him:
I will never buy Nvidia…ever.
Although I am excited for this tech, I don't like that they use double buffering as an example, because a superior solution – triple buffering – has existed for some time.
Sorry, wrote this before I saw this is actually addressed. Thanks Tom.
Great to see them trying to implement change… but two points:
1. The need for a module would be removed if scanning were done instantaneously: no tearing, so no need for vsync or its associated lag or stutter. But I guess this is something out of Nvidia's control, so they are trying to effect change the only way they can.
2. Vsync solves tearing but introduces stutter, and the size of the stutter is at most one frame refresh interval, i.e. part of the 16ms for a 60Hz panel.
So using a high refresh rate panel would reduce this stutter to a maximum of about 7ms on a 144Hz panel. This is surely getting into the realm of being unnoticeable anyway. Just bring out panels with even higher refresh rates and use vsync??
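The commenter's numbers are easy to verify: with vsync, a missed deadline holds the previous frame for up to one extra refresh interval, so the worst-case stutter step is simply 1000/rate milliseconds. A minimal check in C (nothing here is NVIDIA-specific):

```c
#include <stdio.h>

/* Worst-case vsync stutter step: one full refresh interval, since a
 * missed deadline holds the previous frame for one extra period. */
int main(void)
{
    const double rates_hz[] = { 60.0, 120.0, 144.0 };
    for (size_t i = 0; i < sizeof rates_hz / sizeof rates_hz[0]; i++)
        printf("%5.0f Hz -> %4.1f ms worst-case stutter step\n",
               rates_hz[i], 1000.0 / rates_hz[i]);
    return 0; /* prints 16.7, 8.3, and 6.9 ms */
}
```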
Please answer this question: when did the NVIDIA engineers first have the idea to invent G-Sync? It is important to me.