Review: NVIDIA GeForce GTX 680

For some of us, upgrading our PCs with the best hardware and enjoying the most jaw-dropping games in all their glory is a not-so-distant memory. Even though I’ve consistently owned beefy rigs my whole life, I’ve spent less and less time utilizing their full power, thanks to a slowly dwindling and fluctuating PC market. For several years, my PC gaming life was akin to Brooks Hatlen from The Shawshank Redemption — every day, continuing to feed the pigeons in the park, hoping his old crow buddy would return.

Lately, the PC gods have arisen from their slumber and decided to bless us with some stellar (and demanding) PC-focused games. Now high-end graphics technology is starting to look more and more worth its market value. Over the past week, I’ve had the opportunity to pilot NVIDIA’s upcoming beast of a doomsday machine, the GTX 680, and make obedient subjects of the powerful games that once laughed at my PC’s inadequacies.

NVIDIA set out to make not only a more powerful graphics card, but a more efficient one that doesn’t require a lot of extra tech (such as a whole extra card) to get the best experience possible.

NVIDIA GeForce GTX 680
Manufacturer: NVIDIA

Release: March 22, 2012
MSRP: $499

Important GeForce GTX 680 specs:

  • CUDA Cores: 1,536 (three times as many as the 580’s 512)
  • Base Clock: 1006 MHz
  • Boost Clock: 1058 MHz
  • Memory Clock: 6,008 MHz (effective)
  • Interface: 256-bit
  • Total Memory: 2048MB GDDR5
  • Total Memory Bandwidth: 192.26 GB/s (see the quick math after this list)
  • Texture Filtering Rate (Bilinear): 128.8 Giga Texels/sec
  • Connectors: 2 x Dual-Link DVI, 1 x HDMI, 1 x DisplayPort
  • Recommended Power Supply: 550 Watts
  • Thermal Design Power: 195 Watts (244 Watts for 580)
  • Power Connectors: 2 x 6-pin (One 6-pin and one 8-pin for 580) 
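
Incidentally, that bandwidth figure isn’t magic; it falls straight out of the memory clock and bus width. A quick back-of-the-napkin check (the variable names are mine, the numbers are NVIDIA’s):

```python
# Sanity check on the 192.26 GB/s bandwidth spec above. GDDR5's quoted
# "memory clock" is the effective data rate, so peak bandwidth is simply:
#   effective clock (transfers/sec) x bus width (bytes/transfer)

memory_clock_mhz = 6008      # effective GDDR5 data rate from the spec sheet
bus_width_bits = 256         # memory interface width

bytes_per_transfer = bus_width_bits / 8                       # 32 bytes
bandwidth_gb_s = memory_clock_mhz * 1e6 * bytes_per_transfer / 1e9

print(f"{bandwidth_gb_s:.2f} GB/s")  # -> 192.26 GB/s, matching NVIDIA's number
```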

My general PC specs:

  • Windows 7 x64 (cards are already compatible with Windows 8)
  • Intel i7 2.80 GHz
  • 8GB DDR3 SDRAM

I’m just going to get straight to what most of you want to know: how it handles games. The reason I posted my rig’s not-so-uber specs above is to point out that the 680 has been shouldering the load in every graphically heavy game I’ve thrown at it. The credit goes to the card. Believe me, I would have gladly neglected to mention my specs, as mentioning your lackluster PC to techies is like showing your four-incher to a porn star.

The first game I tested — and the most obvious — was Battlefield 3. An important note about its Frostbite 2 engine is that it’s built to run efficiently on an adequately powered rig. My old 560 Ti (always overclocked) could handle ultra settings at around 40 to 50 FPS, with VSync and anti-aliasing off. So it’s obvious that a 680 would make short work of this game’s demands. However, even on powerful rigs, framerate drops are common during heated battles, when an abundance of particle effects (smoke and explosions) and game models crowd the screen.

To test this as best I could, I played through several “Conquest Large” matches on BF3’s biggest available maps, all on completely maxed-out settings. Even when a team held only one capture point and every player converged on that single area, I didn’t witness a single drop in smoothness. I literally kept my eyes on the framerate counter as the sh*t was hitting the fan, and noticed no fluctuation at all.

I then tested the 680 on an engine that isn’t very efficiently built. The Witcher 2‘s RED Engine has turned quite a few heads due to its looks, though it’s no secret that maxed-out settings require an overall top-end PC. More specifically, the “Uber Sampling” feature is typically what kills the game’s performance, as it renders each scene several times over to provide a smoother image quality. Most people turn this feature off, as the slight visual improvement doesn’t justify the hardware demands.
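
CD Projekt hasn’t published the RED Engine’s internals, but Uber Sampling behaves like classic supersampling: render the whole frame multiple times at slightly offset sample positions, then blend the results. A toy sketch of the idea, purely to show why the cost scales so brutally (none of this is the engine’s actual code):

```python
import numpy as np

def render_scene(jitter, size=(720, 1280)):
    # Stand-in for a full render pass -- the expensive part. In the real
    # game this is a complete scene render, not random noise.
    rng = np.random.default_rng(abs(hash(jitter)) % 2**32)
    return rng.random((*size, 3), dtype=np.float32)

def uber_sample(passes=4):
    # Render the frame 'passes' times with sub-pixel offsets, then average.
    # Image quality improves a little; render cost grows linearly, which is
    # why flipping this one switch can quarter your framerate.
    jitters = [(i / passes, i / passes) for i in range(passes)]
    return np.mean([render_scene(j) for j in jitters], axis=0)

frame = uber_sample(4)   # roughly 4x the work of a single render pass
```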

On my 560 Ti, you’d swear I was playing some game sent back from ten years in the future; a whopping 15 FPS was the best I could get out of it. With the 680, though, it ran at a very stable 40 to 50 FPS, with almost no drops below 40 (even during combat).

Mainly due to the RED Engine’s cumbersome features, The Witcher 2 was one of the most technically demanding games I could test on the card. A close second was Crysis 2 with its DX11 upgrades, which, surprisingly, ran better than The Witcher 2. Out of all the games I tested on the 680 (others include Skyrim, Rage, and Just Cause 2), the most rewarding was undoubtedly Crysis 2.

The Frostbite 2 engine looks beautiful because of its versatility, but CryEngine 3 looks incredible because of its cutting-edge features, and the 680 handles them all brilliantly. Displacement maps, high-quality HDR, real-time reflections, and particle motion blur all look absolutely fantastic. My 560 Ti could barely handle Crysis 2 on max settings at around 30 FPS. My 680 laughs at it, holding a strong 60 FPS and only dropping to near 50 during moments of extreme action (lots of explosions and particle effects).

After all of this, I can’t say I’m surprised that the 680 performed the way it did. Many people may remember the Unreal Engine 3 Samaritan demo from last year’s GDC. Well, that demo, mind-bogglingly beautiful as it was, originally required three GTX 580s and a power supply the size of a small child. When I was first shown the 680 at NVIDIA’s Editor’s Day event during GDC 2012, the same tech demo was shown … running on a single 680 and nothing else.

NVIDIA wants this card to really mean something to the gaming community, not only by being ultra powerful and providing longevity, but also through cutting-edge features exclusive to NVIDIA cards.

A lot of you might be wondering how I got such tight framerate numbers with VSync presumably on (it was). Without getting too technical: a big issue people have with VSync is that it forces the framerate to drop to whole-number divisors of your monitor’s refresh rate (on a 60Hz monitor: 60 FPS, then 30, 20, 15, and so on), all for the sake of preventing “screen tearing.” We gamers can see the obvious problem with this, as those drastic drops in framerate result in “jittering.” To combat this, NVIDIA has developed what’s known as “Adaptive VSync,” which automatically turns VSync off whenever the framerate falls below your monitor’s maximum refresh rate. No more jitter and no more screen tearing.
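
Here’s a toy model of the difference, based on NVIDIA’s own description of the feature (this is my simplification, not driver code):

```python
REFRESH_HZ = 60   # a typical monitor

def vsync_fps(raw_fps, refresh=REFRESH_HZ):
    # Classic VSync: the displayed rate snaps down to the largest
    # whole-number divisor of the refresh rate the GPU can sustain
    # (60, 30, 20, 15...), which is where the jitter comes from.
    n = 1
    while refresh / n > raw_fps:
        n += 1
    return refresh / n

def adaptive_vsync_fps(raw_fps, refresh=REFRESH_HZ):
    # Adaptive VSync: VSync stays on at or above the refresh rate (no
    # tearing), and switches off below it so the rate can fall smoothly.
    return min(raw_fps, refresh)

for fps in (75, 59, 45, 31):
    print(fps, "->", vsync_fps(fps), "vs", adaptive_vsync_fps(fps))
# 75 -> 60 for both; but 59, 45, and 31 all collapse to 30 under classic
# VSync, while Adaptive VSync leaves them untouched.
```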

As another means of providing a smoother gaming experience, NVIDIA is aspiring to do away with MSAA (multisample anti-aliasing) by providing its own FXAA, which can be enabled from the driver alone and applied to any game. They’re also introducing the upcoming TXAA, a new film-style AA that NVIDIA claims is at least four times more effective than MSAA. It’s a welcome addition, as we’ve been long overdue for an upgrade in this area.
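
The reason driver-level FXAA can be bolted onto any game, where MSAA can’t, is that FXAA is a post-process: it runs on the finished frame, hunting for high-contrast edges by luminance and softening them, while MSAA needs extra samples taken during rendering itself. A greatly simplified illustration of the idea (the real FXAA shader is far smarter about edge direction):

```python
import numpy as np

def luma(img):
    # Cheap perceptual brightness; FXAA finds edges in luma, not color
    return img @ np.array([0.299, 0.587, 0.114], dtype=np.float32)

def fxaa_like(img, threshold=0.1):
    l = luma(img)
    # Luma of each interior pixel and its four neighbors
    lc = l[1:-1, 1:-1]
    ln, ls = l[:-2, 1:-1], l[2:, 1:-1]
    lw, le = l[1:-1, :-2], l[1:-1, 2:]
    stack = [lc, ln, ls, lw, le]
    contrast = np.maximum.reduce(stack) - np.minimum.reduce(stack)
    edge = contrast > threshold          # only touch high-contrast pixels
    # Soften flagged pixels with a small neighbor blend
    blended = (img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
               + img[1:-1, :-2] + img[1:-1, 2:]) / 5.0
    out = img.copy()
    out[1:-1, 1:-1][edge] = blended[edge]
    return out

frame = np.random.rand(90, 160, 3).astype(np.float32)   # stand-in frame
smoothed = fxaa_like(frame)
```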

Another great feature that we’re all becoming acquainted with is PhysX, NVIDIA’s proprietary physics engine. PhysX has been steadily appearing in a lot of high-quality titles, providing great rigid- and soft-body dynamics, as well as fluid and cloth simulations. At NVIDIA’s Editor’s Day, Gearbox Software CEO Randy Pitchford showed off Borderlands 2 and how it implements PhysX. Fluids pooled and flowed entirely in real time, and even reacted to explosions — splashing apart into numerous smaller puddles. Cloth reacted naturally to foreign objects, and could even be torn and shredded when fired at. It was quite incredible to see these effects handled with such relative ease in real time, when just five years ago it took me several hours to render them for 3D animations on a high-end PC.
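
For a sense of why real-time cloth stopped being exotic, the textbook recipe is a grid of particles linked by distance constraints and integrated with Verlet. PhysX’s actual implementation isn’t public, so take this as the classic approach in miniature, not NVIDIA’s code:

```python
import numpy as np

W, H = 20, 15                  # cloth grid resolution
REST = 0.05                    # rest length between neighboring particles
GRAVITY = np.array([0.0, -9.8, 0.0])
DT = 1.0 / 60.0                # one 60 FPS frame

pos = np.array([[x * REST, -y * REST, 0.0]
                for y in range(H) for x in range(W)])
prev = pos.copy()              # Verlet stores the previous position
pinned = set(range(W))         # anchor the whole top row

# Structural constraints: link each particle to its right and lower neighbor
links = [(y * W + x, y * W + x + 1) for y in range(H) for x in range(W - 1)]
links += [(y * W + x, (y + 1) * W + x) for y in range(H - 1) for x in range(W)]

def step():
    global pos, prev
    # Verlet integration: velocity is implicit in (pos - prev)
    pos, prev = pos + (pos - prev) + GRAVITY * DT * DT, pos
    for i in pinned:
        pos[i] = prev[i]       # anchored particles never move
    for _ in range(5):         # a few constraint-relaxation passes per frame
        for i, j in links:
            delta = pos[j] - pos[i]
            dist = np.linalg.norm(delta)
            if dist == 0.0:
                continue
            corr = delta * (dist - REST) / dist * 0.5
            if i not in pinned:
                pos[i] += corr
            if j not in pinned:
                pos[j] -= corr

for _ in range(60):            # simulate one second of cloth settling
    step()
```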

The last upgrade I’m going to mention is, in a lot of ways, more of a downgrade, but sold me on the card merely due to my living situation. As stated before, the 680 is a very efficient card, and that applies more than anything to its power consumption. The 680 is so streamlined that it actually draws less power than its predecessor, the 580 (see the specs above).

What does this mean for me? Well, as a city that desperately tries to retain some sort of bullsh*t identity, San Francisco is adamant about holding on to its Victorian architecture from the late 1800s. That includes the f*cked up power distribution systems that came with it. With that said, I can only have about two appliances on at any one time before I trip a breaker and my place goes completely dark. When it comes to PC gaming, this presents a problem. I actually used to run two 560s in SLI, but had to get rid of one if I wanted to game with my heater on — enduring cold San Francisco nights is definitely not worth an extra 560 Ti.

So you can imagine that a card like the 680 fares well for someone in my situation, not to mention people who dig the environment or like saving money on bills. Not only does it consume less power than the best of last generation, but its TDP is only 25 watts more than my freakin’ 560 Ti’s. After seeing the Samaritan demo and what it took to run it last year, I don’t know how they accomplished what they have with the 680. It’s like someone sold their ass to the Devil to make this thing.

To really explain every notable change and addition in the GeForce GTX 680 would take far more space than I have here. This new line of graphics cards is leaps and bounds beyond the 500 series. This review alone is obviously not going to convince you to throw down $499 on a new card, but I do hope it drives you to do a little more digging into the fine details of the 680 … especially if you plan on upgrading.

PC gaming is slowly but surely making a comeback, and the GeForce GTX 680 is the card to welcome it with open arms. Several games are in development right now with this very card in mind (others shown at the NVIDIA event were Max Payne 3 and The Secret World). If you yearn for the days when you filled your PC with the best of the best in preparation for the hottest-looking games to come, then the time is certainly now, and the tech is certainly this card.
