GeForce GTX 690 announced – twin Kepler GPUs, 915 MHz base clock

Monday, 30th April 2012 00:48 GMT By Brenna Hillier

The latest nVidia graphics card has an even bigger number than the last one, which makes it at least twice as fancy.

The GTX 690 has some big shoes to fill, but nVidia claims the new card’s performance is “almost identical” to two GTX 680s harnessed in SLI – but quieter, cooler and more energy efficient, as the heat and noise of two Kepler GPUs on one card is minimised by vapor-chamber heat sinks and specially designed fans.

Check out nVidia’s specifications below, and visit the company’s announcement for graphs and comparison shots showing how it stacks up against the GTX 680.

    GeForce GTX 690 Specifications

  • CUDA Cores – 3072
  • Base Clock – 915 MHz
  • Boost Clock – 1019 MHz
  • Memory Configuration – 4GB / 512-bit GDDR5
  • Memory Speed – 6.0 Gbps
  • Power Connectors – 8-pin + 8-pin
  • TDP – 300W
  • Outputs – 3x DL-DVI, Mini-DisplayPort 1.2
  • Bus Interface – PCI Express 3.0

Thanks, Talkar.



  1. Charlie Sheen


    #1 3 years ago
  2. mad1723

    Unless you’re running ludicrous resolutions, this card is not really worth it – at the moment, mind you. I own a GTX 680 and it lets me play any game on the market full bore. The 690 has one big opponent: the GTX 680 in SLI. Same price, higher clocks, higher performance in theory.

    And honestly, $999… that’s a bit much. You can buy two GTX 680s for SLI at that price. As usual, it’s a card for a real niche market, made to show that they have the most powerful single card on the market.

    #2 3 years ago
  3. GwynbleiddiuM

    My Radeon 5850 runs almost everything at 1080p, 60 fps, highest settings. There are exceptions like The Witcher 2, but overall I’m still as happy as the day I got it.

    #3 3 years ago
  4. albo88

    The same bullshit expensive cards that become useless after a year.
    An example? The GTX 295 – no patch or BIOS update to support DX11, even though that card was a monster in terms of specs. So yeah, I replaced it with a GTX 560, and now I’m set till the new generation of consoles comes out.

    #4 3 years ago

    20% better visuals for 400% more money.

    #5 3 years ago
  6. xxJPRACERxx

    @4 It’s not because a new card comes out that older ones are useless!

    #6 3 years ago
  7. endgame

    Wow! What a beast! :) And the TDP is quite reasonable for what this card offers. The price, on the other hand, is not. :)) Because, just as the others said, you don’t really need this. My 560 Ti can run BF3 on High with a minimum of 45 fps. If I ran it on High, that is. Which I don’t, because I like it better on Low. :) It’s easier to see targets on that setting.

    #7 3 years ago
  8. jacobvandy

    @2 If you read the description, they explain how they tried really hard to make this almost exactly the same as two GTX 680s in SLI. Only the maximum (boosted) clock speed is slightly lowered, by 39 MHz, which is pretty much insignificant.

    I guess if you wanted to overclock, you could get better results with two separate cards because then you’d have a cooler for each chip, but why bother with that at this level of performance? I would go for the single card over two of them any day, as it’ll be less power, heat, and noise.

    Plus the GTX 690 looks friggen sweet, with the LEDs and little windows! :p

    #8 3 years ago
  9. silkvg247

    I’m skipping the 6* generation, sticking with my trusty 580. It isn’t breaking a sweat on anything I throw at it so meh.

    I’ll probably get a 780. The 680 doesn’t really offer anything new, and I’m not overly fussed about a slightly higher framerate.

    #9 3 years ago
  10. Maximum Payne

    Before, we had tons of software that maxed out your rig – just remember Doom 3, Far Cry, Half-Life… Now we have tons of power in both GPU and CPU, and we use it for 3D, multiple monitors, or my favourite gimmick, DX11…

    #10 3 years ago
  11. Erthazus

    It’s not 20% better visuals.

    It’s 2560×1600 RESOLUTION, and you can run StarCraft II, for example, at 160 frames per second.
    Battlefield 3 with 690 QUAD SLI can run at 120 frames per second at that resolution.

    The only thing I don’t like about it is the power consumption. It’s pretty big.

    1080p and 2560×1600 are night and day if you have a monitor that supports that resolution.

    Even games like Call of Duty look decent at that resolution.

    #11 3 years ago
  12. silkvg247

    I’m waiting for super high res monitors to drop in price before I even consider going above 1080.

    #12 3 years ago
  13. Fin


    I could buy a Vita, PS3, Xbox and a load of games for that price. Graphics are hardly that much better than console anyway.

    #13 3 years ago
  14. Maximum Payne

    Is there ever going to be a resolution bigger than 1080p on a 22-inch monitor?

    #14 3 years ago
  15. Erthazus

    “Graphics are hardly that much better than console anyway.”

    Yeah, because 2560×1600 resolution with 4xAA/Max Settings/16xAF at 60 frames per second for every game (at minimum) is not a big difference from 720p with low-end visuals and without PhysX.

    Retard much?

    @14, 22-inch monitors don’t support 2560×1600. Even if you buy this card or a GTX 580, there’s no way you’re going to experience that resolution. But expect every game to run at 100 frames per second at 1080p :D

    A year ago I bought a 27-inch Dell monitor that supports 2560×1600 just for my GTX 580, and I’m not going back from that res.

    #15 3 years ago
  16. Da Man

    You miserable piece of bolshevik genital warp,

    Start a blog.

    #16 3 years ago
  17. viralshag

    I might have to get one of these. Minecraft will clearly be an even better game with it. I’d better get a huge monitor too, so I can fully appreciate the 2560×1600 resolution with 4xAA/Max Settings/16xAF at 60 frames per second…

    #17 3 years ago
  18. manamana

    @16 Tourette syndrome much?

    #18 3 years ago
  19. Da Man

    Why, I just don’t like his comments, see.

    #19 3 years ago
  20. manamana

    I thought so… btw, it’s not the worst idea for Erthazus to start his own tech blog.

    #20 3 years ago
  21. Da Man


    #21 3 years ago
  22. OrbitMonkey

    ^ A sock or gel? That is the question…

    #22 3 years ago
  23. silkvg247

    I don’t see anything wrong with being an enthusiast, as I am one myself (just got an AW M17xR3; good price, though), so I’m a little disappointed at the sarky comments about AA and 1600 res and so on. It does make a massive difference; it looks gorgeous. My GF has a monitor that supports the res, and games look fabulous. She’s also getting a free upgrade to a GTX 680 from her 580 via EVGA’s scheme, lucky sod.

    Erthazus might be a bit… overly passionate… about how much “better” PCs are, but underneath all that he does have a point. Consoles are massively inferior now, and even the next gen will be. I agree that Fin’s comment was a bit silly; the gap between PC and console isn’t slight – it’s a chasm – and if you don’t realise that, you haven’t been on a high-end PC lately.

    #23 3 years ago
  24. Fin

    Man have you SEEN the latest Call of Duty on Xbox? Those graphics are INSANE.

    #24 3 years ago
  25. silkvg247

    I don’t dispute that they’ve come on a fair bit with console trickery – you only need to compare Infamous 1 to Infamous 2 to see just how hard they’re pushing the hardware now. But that’s just it: it’s now being pushed to its limits.

    Lower resolution, less or no AA, a smaller FOV (I always set mine to 80-90 on PC if possible), clipping (buildings/trees pop in, or the textures on them do), lower viewing distance, and so on. They sacrifice a lot to get the “in your face” visuals almost on par with today’s PC games; it’s mainly the stuff in the background that they sacrifice.

    Granted not that big a deal to some, but I like my shinies to be shiny all around.

    #25 3 years ago
  26. elmander99

    Forever Amd

    #26 3 years ago
  27. Fin


    It’s not as fun when people post a calm and measured response to flamebait :(

    #27 3 years ago
  28. Len

    I am so close to buying the Dell 30″ UltraSharp and a 680 – if only I could find a 680 to buy! As tempting as this is, it’s just too expensive.

    #28 3 years ago

Comments are now closed on this article.