Galaxy GTX 570 MDT X4 PC Hardware Review

I've been getting back into PC gaming.  Yeah, I know, just when I thought I was out they pull me right back in.  At least there's consistency in my PC gaming > console gaming > back to PC gaming cycle, right?

Anyway, in lieu of trying to overclock my now-old graphics hardware to acceptable performance standards, I've started looking into the current GPU landscape.  Generally speaking, I tend to focus on upper-mid-range hardware since it can usually be had for half the cost of the high end while performing within 15-20% of it.  In the current nVidia hierarchy, that puts my sights on the GTX 560-to-570 range.

Long story short, 'Santa' (aka UPS) brought me a Galaxy GTX 570 MDT X4 for testing, so I thought I'd share a couple of the numbers with you guys in case you're looking at similar hardware.

For those inquiring minds out there, MDT stands for Multi-Display Technology, and in the case of the Galaxy GTX 570 MDT X4 it allows you to connect up to four displays simultaneously off of one card via its four DVI ports and/or single mini-HDMI port.  The MDT series is the only nVidia-based graphics hardware that can do this on a single card; most others are limited to a maximum of two digital connections at a time.  It's important to note, however, that because the additional DVI connections are driven by a separate display processor (not the GPU), the ports are assigned fixed positions relative to the primary GPU-powered connectors (e.g., DVI-3 is left, DVI-2 is right, etc.).

A generation or two ago, the only gaming benefit of powering this many displays would have been the increased resolution and desktop real estate.  Lately, however, both ATi and nVidia have been pimping multiscreen technology that actually mimics wide fields of view and peripheral vision along with insanely high resolutions.  ATi calls it EyeFinity, while nVidia brands it SurroundVision.  Ideally, you would use a multi-card SLI or Crossfire configuration with an entire card dedicated to each display (thereby keeping performance scaling roughly linear).  However, for mere mortals who don't have thousands of dollars to spend solely on multiple graphics cards, the MDT does a pretty spectacular job of powering multi-monitor gaming as long as you're content with dropping the performance (or settings) down a peg or two.

To put it into perspective, if you're coming off of console gaming you'll probably have no problem 'settling' for a resolution of 5760×1080 at 30fps across three 1080p displays instead of, for example, 90fps at 1920×1080 on a single monitor.  That said, I'd only recommend such a setup if you had a couple of extra monitors lying around, as it doesn't work with every game and seems geared more towards FPSes and racing/flight sims (don't get me wrong, other types of games work, but the 'surround' illusion isn't as effective).  Enabling nVidia's 3Dvision active shutter glasses on top of SurroundVision would add another level of immersion, but presumably reduce performance even further (unfortunately I wasn't able to test it since I don't have 3Dvision).
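
A bit of back-of-the-envelope math on why the frame rate drops like that (my arithmetic, not anything Galaxy publishes): a single 1080p screen is 1920 × 1080 ≈ 2.07 million pixels, while 5760×1080 across three screens is 5760 × 1080 ≈ 6.22 million, i.e. exactly three times the pixels per frame.  Since fill rate and shading work scale roughly with pixel count, a game that manages 90fps on one monitor landing in the ballpark of 30fps across three is about what you should expect before touching any settings.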

To help power multiple monitors, Galaxy has clocked the GTX570 MDT X4's GPU at 800MHz (67MHz over nVidia's stock 733MHz) and paired it with the standard 1.28GB of GDDR5.  This won't increase performance massively, but it should boost throughput by roughly 9% over stock, so I'm not complaining.  These added benefits aren't free, however, as the card's MSRP is $380.  While that puts it about $50 above the non-MDT version, it's still roughly $150 less than GTX580-based cards.
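
Quick napkin math on that factory overclock, so the "roughly 9%" above isn't hand-waving: (800 − 733) / 733 ≈ 9.1%.  Treat that as a best-case figure for purely GPU-bound scenarios; since the memory appears to be left at the standard configuration, real-world gains will often be a bit smaller.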

Other than that, the card is packed in a fairly minimalistic manner: the card itself, 6-pin and 8-pin power cable adapters (the card requires both), a DVI-to-VGA adapter, a manual, and install discs (the drivers are out of date, but the MDT software is useful if you're going to take advantage of 4x displays), and that's it.  For me, I just plugged it in, downloaded the latest drivers, and was ready to rock.

For testing I fired up 3DMark 11 Basic, UniGine Heaven Benchmark DX11 v2.5, Crysis Warhead, and the Resident Evil 5 Benchmark.  I know that's a pretty small roster (I'm not even gonna make graphs), but I was hoping to bridge the gap between PC and consoles by offering benches that both types of players are familiar with.  My theory here is that if the card can handle RE5 – or better yet Crysis Warhead – maxed out at a decent enough clip, it's safe to say both console gamers and PC gamers will have nothing to complain about (and something to relate to) in terms of performance.

Similarly, with driver optimizations being what they are, it's probably better to treat these as ballpark figures anyway rather than nickel-and-diming over 0.3fps.  And that's regardless of how many benchmarks are used or whether they're synthetic or not.  Not only that, but there are so many system-level tweaks available in the nVidia (or ATi) control panel that you could positively or negatively affect scores fairly substantially depending on how you personally set up the card at the OS level (making the numbers less objective).  In that respect, I had a 'set it and forget it' mentality: I installed the drivers (285.62 WHQL) on a fresh install of Windows 7 and let everything ride at default settings, which should make it easier for you if you decide to follow along at home!

Test System:
Intel Core i7 920 @ 4.2GHz (200 × 21)
24GB of DDR3-1600 (9-9-9-24-1T)
WD 500GB 7200rpm SATA2 HDD

For the synthetic benchmarks, 3DMark 11 Basic used the default 'Performance' preset (locked at 1280×720), while UniGine Heaven ran at 1920×1080 with 'extreme' tessellation and all other settings maxed (including AF and AA).  All the in-game graphics settings in Crysis Warhead and RE5 were likewise maxed out at 1920×1080 ('High' for RE5 and 'Enthusiast' for Warhead).  Warhead ran with no antialiasing, while RE5 was set to the highest level (C16xQ).  For testing I try to use the most easily repeatable settings with as little ambiguity as possible, so no custom/console tweaks.  Setting the games to their in-game 'max' is the easiest way to do that.

And the Results:

RE5 (variable benchmark): 116fps avg. (Grade 'S')
Crysis Warhead (?Cargo? timedemo): 45fps avg.
UniGine Heaven DX11: 658 (27fps avg.)
3DMark 11 Basic (Performance): P6278

So what is to be gathered from these hodge-podge numbers?  Two things: 1.) RE5 as a PC gaming benchmark isn't nearly demanding enough (but it should run really well in SurroundVision), and 2.) the GTX570 MDT X4 can handle most anything maxed out at 1080p while still maintaining the 30fps mark (a conclusion admittedly deduced more from the synthetics).  Anything less demanding – older PC games or console ports – will most likely run at 60fps, even with all the bling turned up.  In other words: exceptional!