Tesla M2090

Discussion in 'Video Cards and Displays' started by Psi*, Oct 19, 2011.

  1. Psi*

    Psi* Tech Monkey

    751
    0
    Jun 17, 2009
    Westport, CT
    I have a Tesla C2070 for double precision number crunching. Paid ~$2,200 for it from a legit company and it is at least 5X faster than the i7-990X OC-ed at 4.5GHz.:eek: Nice. I am happy. :)

    But in my rare-to-never ventures onto eBay I am starting to see these new Tesla M2090s show up, and for well under $2K. This card has 512 CUDA cores versus 448 in the C2070, and my software guys tell me to expect a 20% improvement.

    Background in a nutshell ... did I forget to mention that I OC the C2070 by 34%? I am using an MSI utility for their Nvidia cards to handle the OC & fan control. When the C2070 was first installed (virgin, as it were), the card's temp was around 83 deg C at idle. Confused, concerned, & frustrated, I pulled the card out & parked it for a few weeks until I could get more info. Nvidia does offer a utility for setting the Core, Shader, & Memory clocks as well as controlling the on-board fan ... but I could not get it to work at all, or at least to "remember" the settings. I have no recollection of why I decided to even try the MSI utility, but it was trivial to use. So easy that I haven't even tried to find an alternative, much less go back to the Nvidia utility. At idle the fan now keeps it around 58 deg C, and when crunching it speeds up as necessary per the card temp. Just like you would want.

    Back to the M2090. This thing does not have an integrated fan. If you follow that link above, the pic of the card with the large heat-pipe heatsink is the only current offering. So while I wait for it to show up, I am thinking about what could be done. I cannot find any kind of picture or board layout (w/o the heatsink) to get an idea before it arrives. Since most people are probably buying these things for >$4K, and probably not with their own money, they may be a bit hesitant to dig into them much. Me, any way to get things done faster, I am all over it.

    I will be posting pics of the card when it shows up. I suspect that I will pull the heatsink pretty quickly & am hoping that some GeForce GTX 580 cooling solution would fit right on. Maybe someone has WC-ed a GTX 580 & would sell the stock cooler!

    Stay tuned ... pics sometime soon.
     
  2. DarkStarr

    DarkStarr Tech Monkey

    585
    0
    Apr 9, 2010
    Well, it depends. The board layout COULD possibly be identical to the 580 3GB, just with higher-density RAM on it ... if so, any aftermarket cooler or waterblock SHOULD fit, but there is the possibility that they changed VRM locations or other stuff, so you will have to check it.
     
  3. TheCrimsonStar

    TheCrimsonStar Tech Monkey

    763
    0
    Apr 12, 2010
    Strawberry Plains, TN
    Alright, nub question. What's the difference between a desktop graphics card and a workstation card? Can you use the workstation cards for gaming?
     
  4. Psi*

    Psi* Tech Monkey

    @TheCrimsonStar ... not such a nooby sort of question. There is a lot of discussion about this on many web sites. The short answer is that these cards have superior double precision math capability versus the GTX 580, for instance. I run software that uses a great deal of double precision math & it runs for hours even on this card. With single precision math the GTX 580 runs over the Teslas ... not sure that is still true against the M2090 tho.
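    To make the single vs double precision distinction concrete, here is a quick sketch (plain NumPy on the CPU, nothing GPU-specific) of why number-crunching code cares about FP64 in the first place: float32 carries only ~7 significant digits, float64 ~16, so small increments a double can see simply vanish in single precision.

```python
import numpy as np

# A tiny increment, well below float32 resolution near 1.0
# (the float32 ulp at 1.0 is ~1.19e-7) but visible to float64.
eps = 1e-8

sp = np.float32(1.0) + np.float32(eps)   # single precision add
dp = np.float64(1.0) + np.float64(eps)   # double precision add

print(sp - np.float32(1.0))  # 0.0 -- the increment was lost entirely
print(dp - np.float64(1.0))  # ~1e-8 -- still resolved
```

    On the cards, the gap the posts above describe is about throughput rather than representation (Tesla parts run FP64 at a much better ratio to FP32 than GeForce parts), but the accuracy issue is why the workload demands FP64 at all.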

    There are lengthy geeky posts by people who have written CUDA programs, Perl scripts, etc. that do enable some of the Tesla features on cards like the GTX 580. There have even been some firmware changes done, but to the best of my scouring of the net they never quite get to the capability of the actual Tesla cards. I also chose the C2070 over the C2050 for the 6GB of on-board RAM (I wish there was more!).

    In playing around with the card (the C2070 does have a dual DVI video output) I got blazing speed in the normal video benchmarks ... FurMark, FluidMark, Heaven Benchmark, probably some others. Sorry, I did not keep the results as I was anxious to start number crunching, but I will revisit & post.

    Interestingly, it is reported that using the video port slows the math throughput. So I also have the recommended accompanying NVS 300 card, which has only 8 (I think) CUDA cores, for video support. On my ASUS P6X58D m/b the C2070 is the primary card & I cannot find a way of changing this. But this only matters when needing to go into the BIOS ... in other words, the BIOS is only accessible via the C2070, & the NVS 300 doesn't output video until Windows 7 actually boots. So as long as there is nothing to change in the BIOS, the monitor stays on the NVS 300.

    Wonder what fun it will be to get the machine going with the M2090? The M2090 has no video output at all ... ma-a-aybe it won't even be an issue?!?! :confused:
     
  5. Rob Williams

    Rob Williams Editor-in-Chief Staff Member Moderator

    12,080
    1
    Jan 12, 2005
    Atlantic Canada
    You overclocked a $2,500 workstation card? That is... just awesome.

    But 34%? That seems rather high... the desktop cards can't even OC that much. What were the before and after clocks, if you don't mind me asking?

    Did you ever get to the bottom of this? 83C is too damn high at idle... and at that point I am guessing the card wouldn't even get much hotter. The max temperature is something like 95-100C.

    Does it need to be actively cooled, though? If it's not sold with an active cooler (which blows me away), it might be designed to run cooler somehow. You might just want to test it as is first. The lack of a fan concerns me though... dust build-up will only be amplified.

    When crunching away, have you ever monitored the GDDR5 usage using a tool like GPU-Z? I'd be interested in knowing if that 6GB is utilized, and how often.

    I'm intrigued by all this though. You have some serious kit right there.

    What Psi* said, but I believe it can get even more complicated than that. Workstation cards will be slower for gaming than regular desktop cards due to their tweaked architecture and also the drivers. Workstation cards are meant for game designers, movie creators and then people like Psi* who can take full advantage of the super-fast mathematical performance and parallelism of the GPU.

    One of the reasons workstation cards cost so much also ties into the support. Contrary to the best feature of its cards, NVIDIA gives unparalleled (what a horrible joke) support to its workstation customers. These customers aren't just trying to get a game to work; they're often in businesses where downtime is not an option.
     
  6. DarkStarr

    DarkStarr Tech Monkey

    I do suppose you have an epic option: turn the new card into a turbine, lmao. Strap a couple of Deltas (or at least some decent fans) to it and run it like that. It has a massive heatsink on it, so with some fans it would cool extremely well.
     
  7. Psi*

    Psi* Tech Monkey

    I have a screen capture with the card running a model. Unfortunately I am a dunce at getting this uploaded to the forum & I have to leave right now ... later
     
  8. Optix

    Optix Basket Chassis Staff Member

    1,514
    0
    Dec 15, 2009
    New Brunswick, Canada
    You can overclock a $2k card but can't upload? Hehehehe.

    Just giving you a hard time, Psi*. Some Scythe Gentle Typhoons would be money!
     
  9. RainMotorsports

    RainMotorsports Partition Master

    352
    0
    Jul 1, 2011
    While it does seem high, I have pulled 22% out of a laptop GPU: a 9800M GS, from 530/1325/800 to 650/1625/900. Dunno if I ever posted it here, but I burned it in for a good 2 hours.
     
  10. Psi*

    Psi* Tech Monkey

    @Optix ... that was your opening & someone was expected to take it! :)

    If the attachment does not fit well on ppl's monitors let me know & I'll split future ones.

    So this shows the number cruncher in the background behind GPU-Z, half of the MSI Afterburner display in the top right, CPU-Z in the lower right, & good ole Perf Mon left of center ... too much in one pic? There are soooo many views to be had.

    Default clock rates for the GPU are: GPU clock ... 574, Memory ... 747, Shader ... 1147. So those OCs are 30%, 24%, & 30% respectively. I was sure that some benchmark program reported 34%. I have looked at too many at odd hours, so maybe I did dream that. But I am a liar by only 4%. :eek:
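    For anyone checking the arithmetic, the OC percentages work out like this. Note the overclocked values below are hypothetical, back-computed from the quoted 30%/24%/30% gains; the post itself only gives the stock clocks.

```python
def oc_percent(stock_mhz, oc_mhz):
    """Percent overclock relative to the stock clock, rounded."""
    return round((oc_mhz - stock_mhz) / stock_mhz * 100)

# Stock C2070 clocks from the post; OC'd clocks are illustrative only.
clocks = [("GPU", 574, 746), ("Memory", 747, 926), ("Shader", 1147, 1491)]
for name, stock, oc in clocks:
    print(f"{name}: {stock} -> {oc} MHz = +{oc_percent(stock, oc)}%")
```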

    Note this is on the i7-920 @ 4.2GHz. This is the machine that has the slow SATA 3 SSD response; we talked about this in some other thread. I have discovered that this is because of the OC. I'll find that thread & link it here. This is sort of a "when is an overclock not an overclock" situation.
     

    Attached Files:

  11. Psi*

    Psi* Tech Monkey

    A couple of useful comments about the MSI Afterburner plots:
    GPU1 is the C2070 & GPU2 is the NVS 300 used for display.

    So the noise on the GPU2 usage % is due to screen activity ... program updates & mouse moving around.

    The top graph is GPU1 temperature & is in the low 70s C. So, yes, Rob, I figured out a solution to the high idle temps of the C2070. The MSI utility lets you simply set up a fan control curve where the higher the GPU temp rises (as the input), the faster the fan spins ... so cool. :cool:
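    The idea behind such a fan curve is just piecewise-linear interpolation between a few (temperature, fan %) control points. A minimal sketch, with made-up control points (MSI Afterburner's actual defaults and internals are not documented here):

```python
def fan_speed(temp_c, curve=((40, 30), (60, 50), (80, 85), (90, 100))):
    """Map GPU temp (C) to fan duty (%) via piecewise-linear
    interpolation over (temp, fan%) control points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]               # below the curve: minimum fan
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:                 # interpolate within this segment
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]                  # above the curve: full blast

print(fan_speed(70))  # 67.5 -- halfway between (60C, 50%) and (80C, 85%)
```

    The utility presumably polls the temperature sensor on a timer and re-applies the mapped duty cycle, which is why the fan ramps on its own while crunching.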

    Lower is the GPU1 usage at ~80%. This particular software never maxes CPUs, or apparently GPUs, to 100%.

    The bar graph part of GPU-Z shows the card's memory usage at 3709 MB of the 6 GB total. The number cruncher does give an indication that 75% of the GPU memory is allocated for this problem. Yes, problems sent to the GPU must fit in that memory space. When they don't, the number cruncher just sends them to the i7-920. So on problems that get kicked to the CPU I'll try to reduce the problem size to make it fit ... sometimes it makes sense, sometimes not.
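    That GPU-or-CPU fallback logic amounts to a simple fit check before dispatch. A hedged sketch, with a hypothetical `choose_device` helper and an arbitrary 5% reserve (the actual number cruncher's rule isn't documented here):

```python
GIB = 1024**3  # bytes in a gibibyte

def choose_device(problem_bytes, gpu_free_bytes, reserve=0.05):
    """Send the job to the GPU only if it fits in free card memory,
    keeping a small reserve for driver/runtime overhead; otherwise
    fall back to the CPU."""
    usable = gpu_free_bytes * (1 - reserve)
    return "gpu" if problem_bytes <= usable else "cpu"

print(choose_device(4.5 * GIB, 6 * GIB))  # gpu -- fits in the C2070's 6GB
print(choose_device(7.0 * GIB, 6 * GIB))  # cpu -- too big, kicked to the i7
```

    In real CUDA code the free-memory figure would come from something like `cudaMemGetInfo`; here it is just passed in.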
     
  12. Psi*

    Psi* Tech Monkey

    I'm getting a 2nd M2090! Now I need a dual CPU system board.
     
  13. Kougar

    Kougar Techgage Staff Staff Member

    2,588
    0
    Mar 6, 2008
    Texas
    Plenty of dual SBE boards coming out in a week... :D
     
  14. Psi*

    Psi* Tech Monkey

    THAT is good news. I am all about getting a quality m/b with the current crop of CPUs (more modest price) & then upgrading the CPUs in the future if it makes sense.

    As I understand it so far, there will be more PCIe lanes, now Gen 3, as well as a few legacy Gen 2. I forget the mix, but that doesn't matter so much. This next build may be more about how many PCIe slots there are & having a case large enough to accommodate double-wide cards. Having 2 Tesla M2090s is serious computing, but 4 would be perfect & the ultimate. Each of the Teslas of current manufacture is PCIe Gen 2 x16. And there still needs to be at least a single-slot video card for actual video.
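    For context on what those slots deliver, per-direction x16 bandwidth follows from the transfer rate and line encoding: Gen 2 runs 5 GT/s with 8b/10b encoding, Gen 3 runs 8 GT/s with the leaner 128b/130b encoding. A quick sketch of the arithmetic:

```python
def pcie_x16_gbps(gen):
    """Approximate per-direction bandwidth of a x16 slot in GB/s:
    rate (GT/s) x encoding efficiency x 16 lanes / 8 bits-per-byte."""
    gens = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    rate, efficiency = gens[gen]
    return rate * efficiency * 16 / 8

print(f"Gen 2 x16: {pcie_x16_gbps(2):.2f} GB/s")  # 8.00
print(f"Gen 3 x16: {pcie_x16_gbps(3):.2f} GB/s")  # 15.75
```

    So a Gen 2 x16 Tesla gets ~8 GB/s each way to host memory, and a Gen 3 board nearly doubles the ceiling for future cards.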

    Mark me as excited & anxiously awaiting the news & reviews! And hoping there isn't a double dip recession. :(
     
  15. tentonine

    tentonine Guest

    I am curious about how the cooling turned out with the M2090s. Did it work out without modification, or did you have to go with a GTX 580 cooler or at least some fans?
    I am thinking of buying an M2090 or another M-series card (there are some really cheap M2050s on eBay), but I'm concerned about this.

    Thanks for any comments! I'd be particularly interested in hearing if you found a cooler for a GTX 580 (or some other card) to fit the M2090.
     
  16. Psi*

    Psi* Tech Monkey

    I haven't started it yet. :(

    I have searched the net trying to find someone as adventurous, but there doesn't seem to be anyone. At the first break I have toward the end of the month I will take the heatsink off 1 card & figure out how to make a 1:1 image/drawing of the board & mounting holes. I'll send that out to see if someone will give an educated guess as to whether a GTX 580 water block will fit. Maybe to FrozenCPU, for instance.
     
  17. tentonine

    tentonine Guest

    Great. Can you please let me know how it turns out?

    I ended up here because I couldn't find anyone else this adventurous either...
     
  18. DarkStarr

    DarkStarr Tech Monkey

    Just post the bare card on here and we can compare it to a reference 580. If it's the same, it should fit.
     
  19. atypicalguy

    atypicalguy Obliviot

    36
    0
    Jul 28, 2012
    LA
    I'm also taking the M2090 plunge. Thanks for the details; very informative. Think I will stick with an air cooler: fan in the middle, blowing straight in, with vents on either end. I think one would be bold to discard the nice stock vent setup unless space is an issue, as one should be able to achieve airflow equal to or exceeding the server setup quite easily. See you on the other side.
     
  20. Rob Williams

    Rob Williams Editor-in-Chief Staff Member Moderator

    I'm at a loss as to why NVIDIA doesn't bundle a more efficient cooler than it does on these things, if many people are willing to go to extra lengths just to keep it cooler.

    I guess they expect all of these to be used in jet-engine server rooms.

    Welcome to the forums, atypicalguy! Be sure not to miss this thread:

    http://forums.techgage.com/showthread.php?t=11514
     