
Nvidia Corp. on Friday unveiled its new G-Sync technology, which solves the decades-old problem of onscreen tearing, stuttering and lag by enabling synchronization between the GPU and the display. The technology will require special monitors with an integrated G-Sync module, which will be available from companies like Asustek, BenQ and ViewSonic.

Since their earliest days, displays have had fixed refresh rates – typically 60 times per second (60Hz). But, due to the dynamic nature of video games, GPUs render frames at varying rates. Because the GPU's output is not synchronized with the monitor's fixed refresh, persistent tearing occurs. Turning on V-Sync (vertical synchronization) eliminates tearing but increases lag and stutter, because the GPU has to wait for the monitor's next refresh. Nvidia's G-Sync technology instead synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. As a result, scenes appear instantly, objects are sharper, game play gets smoother.
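
As a rough illustration of the timing problem described above (a back-of-the-envelope sketch with assumed numbers, not Nvidia's actual implementation): at 60Hz a refresh slot opens every ~16.7ms, so with V-Sync a frame that misses a slot waits for the next one, while a G-Sync-style display simply refreshes when the frame is ready.

```python
# Sketch: when a finished frame actually reaches the screen under fixed 60Hz
# V-Sync versus a variable-refresh (G-Sync-style) display. Illustrative only.
import math

REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ           # ~16.7 ms between refresh slots

def vsync_display_time(render_ms):
    """Frame is held until the next fixed refresh boundary after it finishes."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def gsync_display_time(render_ms):
    """Display refreshes as soon as the frame is rendered (within panel limits)."""
    return render_ms

for render_ms in (12.0, 20.0, 25.0):
    v = vsync_display_time(render_ms)
    g = gsync_display_time(render_ms)
    print(f"render {render_ms:4.1f} ms -> V-Sync shows it at {v:4.1f} ms "
          f"(+{v - render_ms:4.1f} ms wait), G-Sync at {g:4.1f} ms")
```

The extra wait is the added lag, and when render times hover around the refresh interval, frames bouncing between the 16.7ms and 33.3ms slots is what shows up as stutter.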

"Our commitment to create a pure gaming experience led us to G-Sync. This revolutionary technology eliminates artifacts that have long stood between gamers and the game. Once you play on a G-Sync monitor, you will never want to go back," said Jeff Fisher, senior vice president of the GeForce business unit at Nvidia.

G-Sync technology includes a G-Sync module designed by Nvidia and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs (GK110, GK104 and GK106). Each G-Sync module resembles an MXM card and features a custom ASIC with its own memory buffer.

Many of the industry's leading monitor manufacturers have already included G-Sync technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.

Leading game developers are impressed with Nvidia G-Sync technology.

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. Nvidia G-Sync elegantly solves this longstanding problem. Images on a G-Sync display are stunningly stable and lifelike. G-Sync literally makes everything look better," said Tim Sweeney, founder or Epic Games.

"Nvidia's G-Sync technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed," said Johan Andersson, technical director of DICE, a subsidiary of Electronic Arts.

G-Sync is yet another attempt by Nvidia to create a proprietary ecosystem within the PC platform with the aim of improving PC gaming and gaining loyal customers. In the past, only a few display manufacturers supported Nvidia's 3D Vision stereo-3D technology, which relied on proprietary glasses. This time Nvidia is asking display makers to build a piece of hardware directly into their products in a bid to create an additional selling point and, naturally, increase pricing (and/or use previous-generation panels and technologies to offset the cost of the G-Sync module).

While the advantages that Nvidia G-Sync provides are just what the doctor ordered for many PC video games, business-related complexities may not allow the technology to really take off and reach the mass market.

Tags: Nvidia, G-Sync, Geforce, Kepler, GK110, GK104, GK106, ASUS, BenQ, Viewsonic, TSV, Philips

Discussion

Comments currently: 23
Discussion started: 10/19/13 07:14:58 AM
Latest comment: 11/01/13 09:35:43 AM


1. 
Interesting...........
1 0 [Posted by: tks  | Date: 10/19/13 07:14:58 AM]

 
I interpret that as sarcasm.
0 1 [Posted by: linuxlowdown  | Date: 10/19/13 12:56:32 PM]

2. 
Nvidia HQ leakers now report that Mr Huang is pooping his pants daily due to the extreme competition from AMD's console chips, newly announced cards, Mantle API and upcoming ARM HSA collaboration. This is just another Nvidia lock-in proprietary technology without much worth and full of hype. Smart gamers will realise this and not opt for the cheap trick lock-in. Give Huang the middle finger today.
1 3 [Posted by: linuxlowdown  | Date: 10/19/13 01:03:04 PM]

 
This is fanboyism.
3 0 [Posted by: tks  | Date: 10/19/13 02:31:31 PM]

3. 
" As a result, scenes appear instantly, objects are sharper, game play gets smoother." lol better than 16x AA?

They should just name it "frameskip" technology. I mean, most graphics cards are rendering too fast and some of the frames aren't needed. By limiting it to x amount of renders they can increase frame response times. Nvidia is already light years ahead of AMD on this tech. Why don't they focus on running modern games at 60fps on 4K monitors instead?
2 1 [Posted by: Sum Ting Wong  | Date: 10/19/13 04:57:46 PM]

 
It's not frameskip technology.
Also, Nvidia already got G-Sync working experimentally on a 4K display too, from what I heard. It really benefits 4K, since 45fps actually looks beautiful. (A randomly varying 45-60fps framerate now looks like a locked 60fps@60Hz -- no stutters.)
2 0 [Posted by: Mark Rejhon  | Date: 10/20/13 03:13:26 AM]
 
G-Sync is a major improvement. Having seen it in action, I think it will be a game changer.
0 0 [Posted by: beck2448  | Date: 11/01/13 09:35:43 AM]

4. 
DOA, no doubt about it. Most true gamers are packing 120-144Hz monitors with V-Sync disabled. A good gaming 144Hz monitor is around $299 and a G-Sync-equipped equivalent is $399, so why not put the extra $100 towards a better video card?
3 1 [Posted by: redeemer  | Date: 10/19/13 07:57:32 PM]

 
It's that impressive:
G-Sync produces a bigger image-quality upgrade than adding a $400 additional graphics card. This is because G-Sync'd 40-55fps (fluctuating framerate) looks much better than VSYNC OFF 70-90fps (fluctuating framerate). Adding additional graphics cards in SLI does not reduce GPU lag, since per-card frame rendertimes are not usually lowered. G-Sync, by contrast, lowers input lag far more than doubling up your cards in SLI does.

$100 extra is very cheap for this quality improvement. It's as stunning a display upgrade as HDD->SSD, or the upgrade from 60Hz->120Hz. It looks better than LightBoost during variable, fluctuating-framerate situations. (And G-Sync boards will still also have a low-persistence, LightBoost-like strobing mode option, too! -- http://www.blurbusters.co...strobe-backlight-upgrade/ )

(confirmed)
1 1 [Posted by: Mark Rejhon  | Date: 10/20/13 03:06:52 AM]
 
Why? Because >60fps with massive overdrive and occasional stutter or lower quality (for >60fps on a 2560x1440 screen you need to drop quality or use two cards) is not as good as 60fps V-Sync without the sync-lock delays.

Basically you can use a GTX 760 + G-Sync at $350 rather than a GTX 780 at $650 and still get better quality.
0 1 [Posted by: basroil  | Date: 10/20/13 01:55:27 PM]
 
Bang!!! Irrefutable argument.
0 0 [Posted by: MHudon  | Date: 10/21/13 11:25:46 PM]

5. 
So, this is Nvidia's play since they didn't win any consoles and are getting squeezed by embedded GPUs: stick their chips in every other kind of device they can think of.
2 1 [Posted by: KeyBoardG  | Date: 10/20/13 03:49:23 AM]

6. 
It is very likely that a 120+Hz monitor will not need it as much, since at such high refresh rates tearing is very hard to notice; for the same reason V-Sync can be left off, so you avoid the associated pauses as well.
2 0 [Posted by: Curzon Dax  | Date: 10/20/13 01:31:26 PM]

7. 
Why do we need this? If I'm not mistaken, Nvidia GPUs already have frame pacing built into the GPU, so why not just control the frame pacing from the GPU? Surely they can do this without having another chip.
2 1 [Posted by: Amir Anuar  | Date: 10/21/13 01:12:57 AM]

 
Monitors currently don't allow it because they use VESA specifications. It's not like an internet packet where you can send data at random times: DVI, HDMI and HDMI-like DisplayPort modes require frames to be sent at exact intervals. Tearing occurs when the image being sent is overwritten partway through the transmission sequence. The reason G-Sync only works with higher-end Kepler cards is that Nvidia is rewriting the protocols used in DisplayPort to allow frames to be sent when they are done rather than at fixed intervals. Perhaps other cards that support DP can use G-Sync too, but only time will tell if Nvidia will release the tech to other companies.

Interestingly, Intel is actually pushing a similar tech for laptops and tablets, since there's no reason to keep rendering and transmitting the same frame a few hundred times when the display is still. They went one step further and allow sections of the screen to refresh at different times, but that's pointless in games where every inch of the screen is moving in some way.
0 2 [Posted by: basroil  | Date: 10/21/13 11:11:31 AM]
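
To picture the "overwritten partway through the transmission" point in the comment above, here is a toy model (assumed numbers, not the real DVI/HDMI/DisplayPort signalling) of where a tear line lands when the frame buffer is swapped mid-scanout:

```python
# Toy model of tearing: the display scans out rows top to bottom over one
# refresh interval; if the source image is swapped partway through, rows above
# the swap point come from the old frame and rows below it from the new one.
SCREEN_ROWS = 1080
REFRESH_MS = 1000.0 / 60.0    # one full scanout takes ~16.7 ms at 60Hz

def tear_row(swap_time_ms):
    """Approximate row of the visible tear for a buffer swap at swap_time_ms
    into the current scanout (clamped to one refresh interval)."""
    t = max(0.0, min(swap_time_ms, REFRESH_MS))
    return int(SCREEN_ROWS * t / REFRESH_MS)

print(tear_row(4.0))    # swap early in the scanout -> tear near the top (~259)
print(tear_row(12.0))   # swap late in the scanout  -> tear near the bottom (~777)
```

Only sending complete frames at the fixed VESA cadence avoids the mid-scanout swap, but that brings back the waiting described earlier; letting the GPU push a frame the moment it is done is the constraint the G-Sync hardware is meant to remove.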
 
I'm questioning the need for an extra chip on the monitor, which increases the monitor's price and power consumption. There is already a frame-pacing mechanism on Nvidia GPUs; I bet it could do the same job as G-Sync.

I don't know about Intel, but I bet they will push it onto the same silicon as their CPU and GPU. This dual-chip solution is not an efficient way to implement it.
0 1 [Posted by: Amir Anuar  | Date: 10/21/13 08:48:13 PM]
 
As I said, YOU CANNOT DO THAT ON A MONITOR IF IT USES VESA TIMINGS! The issue is that DVI and HDMI are impossible to override legally (licensing issues), and both the computer and the monitor need to be modified to prevent the timing issues (the V-Sync lag), which works only over DisplayPort, using packet information rather than regular timing-based data transfer.
0 0 [Posted by: basroil  | Date: 10/22/13 09:54:59 AM]

8. 
I seriously do not understand why so many people do not comprehend what G-SYNC does. Do you not see judder or stutter? Do you not experience input lag with Vsync and triple buffering?
This is the biggest breakthrough since God knows when, and I cannot wait to get my hands on that tech.
0 1 [Posted by: Harry Lloyd  | Date: 10/21/13 05:22:53 PM]

 
Speaking truly... no, I don't experience those input lags, judder, stutter and V-Sync troubles. I'm just a happy AMD Radeon user; maybe that explains it?

G-Sync looks great, it really does in a lot of ways, but it's not a game changer.

Set your video parameters better, or if you feel the need to spend money, buy a 120Hz monitor to solve most of your issues.
0 0 [Posted by: MHudon  | Date: 10/21/13 11:40:48 PM]
 
120Hz doesn't solve the issue, it just mitigates it, assuming your card buffers the entire last frame and keeps sending it until a new one is finished. For a pair of 1920x1080 frames (new and old) that's a "reasonable" 16MB chunk of memory that could have been put to better use, while a 4K screen will end up eating away 150MB! But of course the lag will still be there; it'll just be 8ms rather than the 16ms (or more) it is with V-Sync.

And the 7790 is known to suffer massive tearing in BioShock: Infinite when run at 1080p, so AMD isn't going to save you.
0 1 [Posted by: basroil  | Date: 10/22/13 10:04:28 AM]
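
For reference, the per-frame arithmetic behind figures like these is straightforward (a sketch assuming 32-bit colour; totals scale with the number of buffers and the bytes per pixel):

```python
# Framebuffer size arithmetic: width * height * bytes per pixel.
def frame_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(f"one 1920x1080 frame: {frame_mb(1920, 1080):5.1f} MB")      # ~ 7.9 MB
print(f"old+new 1080p pair:  {2 * frame_mb(1920, 1080):5.1f} MB")  # ~15.8 MB
print(f"one 3840x2160 frame: {frame_mb(3840, 2160):5.1f} MB")      # ~31.6 MB
print(f"old+new 4K pair:     {2 * frame_mb(3840, 2160):5.1f} MB")  # ~63.3 MB
```

Larger totals, like the 150MB cited above, would follow from keeping extra buffers around or using deeper colour formats.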
 
120 Hz makes judder and stutter even worse. You ALWAYS get judder when the framerate does not match the refresh rate. At 120 Hz, you need constant 30, 40 or 60 FPS in order not to get judder. Anything between 60 and 120 results in judder, as does anything between 40 and 60. Maybe you cannot see it, but it is there.
It is like watching 24 FPS movies at 60 Hz, which is awful.
0 0 [Posted by: Harry Lloyd  | Date: 10/22/13 05:22:52 PM]
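
The "constant 30, 40 or 60 FPS" point is a divisibility argument: on a fixed-refresh display a constant frame rate only avoids judder if each frame is held for a whole number of refreshes. A quick illustrative check:

```python
# Constant frame rates that divide a fixed refresh rate evenly, so every frame
# is shown for a whole number of refreshes (no uneven 1- vs 2-refresh holds).
def judder_free_rates(refresh_hz, min_fps=20):
    return [refresh_hz // n for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0]

print(judder_free_rates(60))    # [60, 30, 20]
print(judder_free_rates(120))   # [120, 60, 40, 30, 24, 20]
```

Anything in between gets a mix of single- and double-refresh holds, which is the judder being described; a variable-refresh display sidesteps the divisibility requirement altogether.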
 
You're right, 120Hz might not solve everything.

Still, I've built many gaming rigs in the past year and took some time to ask the owners on my mailing list if they've experienced any judder or anything at all. Nothing! That's about 15 PCs built around both AMD and Intel platforms, and no problems.

Before changing one's graphics solution and replacing a monitor, I would first recommend upgrading to a full SSD, better quantity/quality RAM and separate audio.

G-Sync... IMO it is another niche gadget serving nothing but Nvidia's "We do it better but charge you big time for it" suicidal marketing strategy.
0 0 [Posted by: MHudon  | Date: 10/22/13 08:29:05 PM]
