Nvidia Titan

Discussion in 'Video Cards and Displays' started by DarkStarr, Feb 19, 2013.

  1. DarkStarr

    DarkStarr Tech Monkey

So what do we think of the Titan? I doubt it will stay within that 250W envelope; in actual testing I think it will end up much closer to 300W+. I've gotta say, I don't know how well it's going to sell at that premium, because a lot of people who bought 680s at launch price aren't going to want to sell for hundreds less just to buy new, especially if what they have works fine. That's not to say some won't, but I imagine the resale value on the 680s is going to drop quite a bit. I also wonder how well it will live up to expectations: NVIDIA is known for artificially limiting GPU compute, so will this card show its true power, or will it be more of an "oh... it's basically on par with where the 680's compute should have been" ([email protected] for example)? Personally I think it's too late for sales, and in most situations it's not going to be enough to convince users to upgrade. I do hope it puts some pressure on AMD to step up the Radeons, even if it means my 7970 is outdated :(

    EDIT: WOW! Only an 8+6-pin setup! That doesn't bode too well for overclocking... I mean, if it goes above that 250W, there isn't much headroom.
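
    For context, the connector concern can be put in numbers. Per the PCIe specification, the slot supplies up to 75W, a 6-pin connector 75W, and an 8-pin connector 150W, so an 8+6-pin card has a 300W in-spec budget. A quick sketch of the headroom over the rated 250W TDP:

    ```python
    # In-spec power budget for an 8+6-pin PCIe card (PCIe spec limits).
    SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
    TDP_W = 250  # Titan's rated TDP

    budget_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W total
    headroom_w = budget_w - TDP_W                 # 50 W over TDP
    print(budget_w, headroom_w)  # 300 50
    ```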

    EDIT 2: Neat little graph, which if true is pretty damn impressive.
     
    Last edited: Feb 19, 2013
  2. Rob Williams

    Rob Williams Editor-in-Chief Staff Member Moderator

I think it's an amazing card, both technologically and performance-wise (based on the specs, that is... I don't currently have one). As for TDP, that's generally measured when cards are used normally, not stress-tested, so I think it'll definitely go far beyond that as well. Heck, my own PC with a six-core CPU and a 680 can reach 1kW when fully stressed ([email protected]).

    I do think sales are going to be rough, but the fact that AMD isn't following up with a launch of its own right away is going to help things. It's also an expensive-as-hell card, so its customer base is already very limited. For what it offers, though, I'd say it's priced right if you want to go the 3x1 route. Even for a single 30", this card would be overkill for any game. But then again, it's more of a "future-proofed" card than the others.

    As for overclocking, did you even read our article? NVIDIA hit 1,100MHz in the lab, which is something like 270MHz over stock.
     
  3. DarkStarr

    DarkStarr Tech Monkey

You say 3x1, except when did NVIDIA allow a single card to run 3 monitors in Surround? Last I knew, it took SLI to run Surround. As for the 1,100MHz, that could be without load, or depend on any number of variables. It looks pretty good if true, though. It also seems they changed direction damn fast on boost: from no overvolting to overvolting and pushing it further. Not allowing voltage control probably hurt their sales a bit. That, or the outrage, though I doubt that since they don't listen any other time.

    EDIT: Quoted from the Nvidia site:
     
    Last edited: Feb 19, 2013
  4. Rob Williams

    Rob Williams Editor-in-Chief Staff Member Moderator

  5. Doomsday

    Doomsday Tech Junkie

    Whoaa Beastly! :D
     
  6. DarkStarr

    DarkStarr Tech Monkey

Hope you're right. I don't have any NVIDIA hardware anymore, definitely not any 600-series cards.
     
  7. Rob Williams

    Rob Williams Editor-in-Chief Staff Member Moderator

    Something I wrote from our quick look at the 680 on release:

    "With its Kepler architecture, NVIDIA “fixes” something that has bugged me for a while: the inability to run three monitors off of a single GPU. In fact, the company has gone one further by offering support for up to 4 monitors (a typical configuration might be 3×1 and then another monitor up top, center). Given just how powerful today’s GPUs are, it’s nice to have the option to stick to just one for multi-monitor gaming."

    As far as I was aware at the time, that spanned the entire Kepler line - as long as the GPU has that many video connectors, of course. That pretty much rules out most of the low end, but a 660 and up should handle it no problem.
     
  8. DarkStarr

    DarkStarr Tech Monkey

Sorry, I meant more that I could run 2 screens on an 8600 GT but could only game on one. I was unsure whether NVIDIA had allowed gaming across multiple screens on a single card. Also, last I knew, it was either one screen or 3+. My SLI 480 drivers wouldn't allow 2, which is a huge part of why I sold them for a single 7970.
     
  9. Rob Williams

    Rob Williams Editor-in-Chief Staff Member Moderator

Well... I sometimes run an emulator full-screen on a second screen, but I can't really "use" my primary one, because doing so boots the emulator out to windowed mode. I think this is more of a game issue than an NVIDIA issue, though. I'll be playing a regular game on a second monitor next week, so I'll see then what's possible.
     
  10. RainMotorsports

    RainMotorsports Partition Master

Of course you can. It's pretty much a matter of game support. I could run Sins of a Solar Empire on 2 screens with my laptop's 9800M. Many games that support windowed mode can also be forced across as many screens as you can drive.

    Unless you had an epic card like this one, I wouldn't recommend gaming like that, but you can run 3 screens indirectly via many tricks, and some cards are offered with 4 or 5 ports via an extra chip. You can also use USB adapters, which are basically a pathway - the GPU is still doing the rendering. All of these solutions are generally fine for software workflow purposes, but generally not for gaming across.

    I accidentally ran 3 screens thanks to the on-CPU Intel graphics. I forget which way I had it configured, but the special drivers for multi-GPU support with the Intel weren't installed. I had 2 screens on the NVIDIA and one on the Intel, all 3 showing up in NVIDIA Control Panel.
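
    For what it's worth, spanning a game window across several displays mostly comes down to computing the union of the monitor rectangles in desktop coordinates. A minimal sketch, with hypothetical monitor geometries standing in for what a display-enumeration API would report:

    ```python
    # Compute the bounding rectangle that spans a set of monitors.
    # Each monitor is (left, top, right, bottom) in desktop coordinates;
    # the values below are hypothetical, not from a real enumeration.
    def spanning_rect(monitors):
        left   = min(m[0] for m in monitors)
        top    = min(m[1] for m in monitors)
        right  = max(m[2] for m in monitors)
        bottom = max(m[3] for m in monitors)
        return (left, top, right, bottom)

    # Three 1920x1080 screens side by side (a 3x1 Surround layout):
    screens = [(-1920, 0, 0, 1080), (0, 0, 1920, 1080), (1920, 0, 3840, 1080)]
    print(spanning_rect(screens))  # (-1920, 0, 3840, 1080)
    ```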
     
  11. DarkStarr

    DarkStarr Tech Monkey

Windowed doesn't count. For one, it slows down the FPS, and no true gamer would run a game windowed across multiple screens.
     
  12. Kougar

    Kougar Techgage Staff Staff Member

For what I think... I think it's awesome. Shorter length, lower noise, and less power draw than a 690, yet significantly better performance in HPC workloads. Still waiting to see how it compares in single-precision workloads like [email protected], though... It clocks up to 1GHz on air, which is impressive given that the 8800 GTX's G80 core wasn't even close, and was smaller in die area. Can't wait to see what an EVGA Titan Hydrocopper can do, as it will cut the temps in half. EVGA is even going to launch two versions of the watercooled model this time around.

    I am very, very happy to hear that NVIDIA is completely removing the artificial double-precision performance cap as well. It needed to go, and now, in DP-capable programs, Titan actually has a performance justification for existing beyond being a simple, expensive luxury product.

    The monitor overdrive option is also extremely awesome. With those Korean panels easily able to clock higher, users needed a simple exposed setting rather than having to hack drivers to test it out. I can only hope this will encourage people to start asking for better refresh-rate displays, and then maybe manufacturers will start making them. Because if A- panels are capable of hitting 120Hz, I'm sure better-quality ones, or at least tweaked electronics designs, will easily be able to do it with IPS panels.
     
  13. marfig

    marfig No ROM battery

    I'm not convinced by this card.

    I think it completely eclipses the 690, while asking too much money per FPS when compared to the 400 USD 680. This is a card that managed both to make their top offering in the 600 line obsolete and to not add anything that would make 680 users look twice. It's simply a card with nothing to offer, except for anyone wishing to pay 2,000 or 3,000 USD for a dual or 3x solution.

    2x 680 = ~800 USD
    2x Titan = 2000 USD
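
    The rough cost comparison above can be sketched as dollars per average FPS. The FPS figures here are hypothetical placeholders, not benchmark results:

    ```python
    # Price-per-frame comparison in USD per average FPS.
    # The avg_fps values are hypothetical, for illustration only.
    cards = {
        "GTX 680 (2-way SLI)": {"price_usd": 800,  "avg_fps": 110},
        "Titan (2-way SLI)":   {"price_usd": 2000, "avg_fps": 140},
    }

    for name, c in cards.items():
        dollars_per_fps = c["price_usd"] / c["avg_fps"]
        print(f"{name}: ${dollars_per_fps:.2f} per FPS")
    # → GTX 680 (2-way SLI): $7.27 per FPS
    # → Titan (2-way SLI): $14.29 per FPS
    ```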

    Meanwhile, the 7870 is even faster than the 680, making the Titan an even less appealing solution to AMD users. This card simply has no place in the vast majority of the market.

Sure, we can go wide-eyed and be impressed by the specs. But specs need to translate into a desire to buy, and with the Titan they don't. When that happens, it's simply a bad product. What's more, the Titan is apparently going to remain an isolated product: the 700 and 800 lines of cards will quickly eclipse it at much smaller prices.

    Frankly, if I just want to gape at specs, I can imagine a monster card in my dreams. I don't need anyone to actually make one.
     
  14. DarkStarr

    DarkStarr Tech Monkey

My thought on this is that NVIDIA wanted nothing more than the TITLE. They wanted to say they have the fastest single-GPU card in the world. That's why it's priced like this, and that's also why it won't sell. It's a pretty good card, but around the net I'm seeing people say "Titan? Nah, let me get a second 680."

    Anyways, AMD is making a dual-GPU card with dual 1GHz chips on it. Sounds really epic, TBH.
     
  15. marfig

    marfig No ROM battery

Tomorrow or the day after, the embargo on the Titan will end and we'll get to see a whole bunch of benchmark data. That will put any doubts to rest. My guess is that it's going to be generally disappointing, particularly when comparing it to the 7970. (No wonder AMD reacted by saying they won't be replying to the Titan.)

    BTW, I did mean the 7970, not the 7870, in my previous post. But I guess everyone got it :)
     
  16. TerrellStanley

    TerrellStanley Obliviot

You seem to be a very experienced person. Running two to three screens is not easy... I have tried many times but failed.
     