
Ahead of CES 2014, Nvidia (NVDA) has been giving demonstrations of its new G-Sync technology (I, II).

In this article, I will briefly explain G-Sync, discuss the problems it alleviates, and give my interpretation of the technology's impact on the desktop GPU market.

So What Is G-Sync?

Images taken from AnandTech, linked to above:

G-Sync is essentially a hardware module, including some video memory, mounted inside the monitor. It allows a G-Sync-enabled video card to coordinate with the monitor as to when the monitor should paint frames to the screen.


When things are less than perfect, you get the condition shown above, known as "tearing." Essentially, the video card and monitor fall out of sync, so for a split second you see parts of two or three different frames on the screen at once, which makes the picture look less than ideal.
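To make the mechanics concrete, here is a minimal back-of-the-envelope sketch (my own illustration, not anything from Nvidia's materials) of where a tear line lands when the video card swaps frames partway through the monitor's top-to-bottom repaint. The panel timings are assumptions for a 60 Hz, 1080-line screen:

```python
# Illustrative sketch only: locate the tear line if the GPU swaps
# buffers partway through the monitor's scanout.
SCANOUT_MS = 1000 / 60   # one top-to-bottom repaint takes ~16.7 ms at 60 Hz
LINES = 1080             # assumed vertical resolution of the panel

def tear_line(swap_time_ms: float) -> int:
    """Row at which old and new frames meet if the buffer swap
    happens swap_time_ms into the current scanout."""
    return int((swap_time_ms % SCANOUT_MS) / SCANOUT_MS * LINES)

print(tear_line(8.0))  # a swap 8 ms into the scan -> tear near row 518
```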

One fix is enabling something called V-Sync, which essentially puts the video card and monitor in lock step. However, this in turn introduces input lag, and possibly stutter. I am not particularly sensitive to stutter, but the input lag with V-Sync enabled is annoying, so I (and many gamers) prefer not to use it.
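For a rough sense of where that input lag comes from, here is a simple model I put together (the timings are illustrative assumptions, not measurements): with V-Sync, a finished frame has to wait for the monitor's next repaint slot, so the delay snaps up in whole refresh intervals:

```python
# Illustrative model of V-Sync latency: input sampled at the start of a
# frame cannot appear on screen until the next vertical blank after
# rendering finishes.
import math

def vsync_lag_ms(render_ms: float, refresh_hz: float = 60.0) -> float:
    interval = 1000.0 / refresh_hz
    # The finished frame waits for the next blank after rendering completes.
    intervals_waited = math.ceil(render_ms / interval)
    return intervals_waited * interval

print(vsync_lag_ms(12.0))  # 12 ms render on 60 Hz -> shown ~16.7 ms after input
print(vsync_lag_ms(20.0))  # 20 ms render -> waits two intervals, ~33.3 ms
```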

One final term important to this puzzle is "refresh rate." A monitor or TV (think of a 120 Hz/240 Hz LED-LCD panel) paints an image on the screen at its refresh rate. V-Sync locks the screen and video card together: the monitor will not paint the next picture until the video card has finished it, so frames are displayed in step with the monitor's refresh. This causes frames to be displayed at the refresh rate or a whole-number fraction of it, typically 60 or 30 FPS (frames per second) on a 60 Hz panel.
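A quick worked example of that quantization, assuming classic double-buffered V-Sync on a 60 Hz panel (again my own sketch):

```python
# Why V-Sync'd frame rates snap to 60, 30, 20...: each frame occupies a
# whole number of refresh intervals, so displayed FPS is the refresh
# rate divided by that whole number.
import math

def vsync_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    interval = 1000.0 / refresh_hz
    return refresh_hz / math.ceil(render_ms / interval)

for ms in (10, 17, 25, 40):
    print(f"{ms} ms/frame -> {vsync_fps(ms):.1f} FPS displayed")
# 10 ms -> 60.0, 17 ms -> 30.0, 25 ms -> 30.0, 40 ms -> 20.0
```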

G-Sync instead allows the monitor to repaint at variable intervals driven by the video card, minimizing tearing without locking the frame rate to a fraction of the refresh rate.
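Here is the same model with a G-Sync-style variable refresh substituted in - a sketch of the idea as I understand it from the demos, not Nvidia's actual protocol, and the 144 Hz ceiling is an assumption: the monitor simply waits for each finished frame, so the displayed rate tracks the render rate instead of snapping to 60/30/20.

```python
# Illustrative variable-refresh model: the display interval follows the
# render interval, bounded only by the panel's fastest refresh
# (assumed 144 Hz here).
def gsync_display_interval_ms(render_ms: float, max_hz: float = 144.0) -> float:
    min_interval = 1000.0 / max_hz
    return max(render_ms, min_interval)

for ms in (10, 17, 25, 40):
    fps = 1000.0 / gsync_display_interval_ms(ms)
    print(f"{ms} ms/frame -> {fps:.1f} FPS displayed")
# 10 ms -> 100.0, 17 ms -> 58.8, 25 ms -> 40.0, 40 ms -> 25.0
```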

The Issues With Usefulness

G-Sync appears to be proprietary hardware at this point. The first monitor to use the technology looks to be a ~$280 Asus panel that will require a DIY installation of roughly $100 worth of additional hardware (note that I have not seen official pricing yet, so the $100 may be off at release). In 2014, Asus will ship a pre-configured $400 monitor.

So other than the fact that it is the wrong color, the mounting screws seem awkward to install, it may or may not come in packaging that is frustrating to open, and once opened it will have that "new" smell to it (a sarcastic aside my regular readers will enjoy), it looks to solve some real problems with gaming.

As for genuine issues, you have to look at what G-Sync actually accomplishes. The feature is most useful at lower frame rates, which goes back to the monitor and video card getting out of step. Say the video card is sending 70 FPS to a 60 Hz monitor: new frames are arriving faster than the monitor can paint them, so any given torn frame is on screen only briefly, which limits how noticeable the tearing is.
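The arithmetic behind that claim (illustrative numbers only):

```python
# At 70 FPS into a 60 Hz panel, frames arrive slightly faster than the
# panel repaints, so a torn image is replaced within about one refresh.
frame_interval = 1000 / 70    # ~14.3 ms between rendered frames
refresh_interval = 1000 / 60  # ~16.7 ms between repaints
print(f"new frame every {frame_interval:.1f} ms vs repaint every {refresh_interval:.1f} ms")
print(f"a tear persists at most ~{refresh_interval:.1f} ms before the next repaint")
```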

If you look at any benchmarks, you'll see that once you get into the $300 pricing segment or so for video cards, most games will run at a high enough frame rate to mitigate most of this advantage, the exceptions being demanding titles like Metro: Last Light or Crysis. As you move down the product stack to the $150-$250 market, this technology becomes more important.

So herein lies the first issue: G-Sync is most useful in those situations in which a less powerful (read: cheaper) video card cannot push frames to the monitor fast enough. The solution? Buy a $400 monitor, or buy a $300 monitor and mod it yourself.

Second, looking at a calendar, I see that next year is 2014. The first monitors announced for G-Sync are 1080p panels, as far as I'm aware. I have written before that most gamers play at 1080p, which is the sweet spot in resolution. However, more and more gamers are moving to higher resolutions - hence all the reviewers starting to focus on 1440p/4K gaming. G-Sync, as of right now, requires a substantial outlay of cash for a monitor at the low end of the resolution spectrum.

Going to higher GPU price points: show me a gamer who spent $650 on a GTX 780 Ti to game at 1080p, and I'll show you a gamer who spent $250 too much. Meaning, for those with GTX Titans or R9 290Xs, tearing problems on 1080p monitors should be pretty minimal. G-Sync could actually cannibalize some high-end GPU sales, as any Nvidia graphics card that can maintain a minimum of roughly 35 FPS should look pretty good on a 1080p G-Sync-enabled monitor.

Also, as integrated graphics become stronger, more and more designs opt for single-silicon solutions - meaning no discrete GPU. G-Sync, meanwhile, is best suited to desktop gaming on discrete cards.

Lastly, I am not sure how much of a barrier this would represent for Advanced Micro Devices (AMD). From what I have read so far (I haven't seen many deeply technical details), all of the work seems to be done by the GPU and the monitor. If Nvidia chooses not to license this technology, it looks like AMD would need to spend some R&D funds to create a similar solution, and some SG&A funds to partner with monitor manufacturers to bring it to market.

What G-Sync Looks To Do and Conclusion

Unless Nvidia makes this technology licensable, from an outsider looking in this appears to be an attempt to lock consumers into Nvidia hardware - a moat around the desktop space meant to keep AMD out. It requires a substantial investment by the gamer in a setup compatible only with Nvidia GPUs. GPUs are typically refreshed on a yearly cadence, and most gamers I know replace GPUs more often than monitors. If a consumer spends $400 on a gaming monitor that works only with Nvidia cards, that consumer will be less likely to switch to AMD in the future.

This looks to be a good technology with only a few downsides, and it could give Nvidia a desktop moat if it catches on. It looks set to officially debut at CES 2014 in January, with supporting monitors arriving throughout 2014. So next year will likely be when we watch this story unfold, with the real impact showing during 2015 and beyond.

Consider this the first part of a series, focused on the implications for Nvidia from an AMD bull's perspective. The next article will focus on the implications for AMD, and I will delve into why I feel Nvidia is now releasing technology designed to create a moat around desktop gaming.


Additional disclosure: I own both shares and options in AMD, and actively trade my position. I may add/liquidate shares/options or add a small hedge via puts at any time.