NVIDIA GeForce GTX 295: the new speed benchmark? An overview of dual-chip video cards.

The trend of releasing multi-chip hi-end solutions only intensifies each year, and both GPU developers have managed to introduce a second generation of video cards carrying two GPUs of a unified architecture. AMD's flagship is essentially a pair of Radeon HD 4850s, except that it operates at higher frequencies and is equipped with GDDR5 memory, just like the Radeon HD 4870. For technological reasons it is easier to build a high-performance solution from two mid-range chips than from one more powerful chip, but this approach drives up power consumption and, accordingly, heat output. Besides, the application must support SLI or CrossFire mode properly, otherwise the accelerator performs no better than a single card.

Initially, AMD and NVIDIA each followed their own path in developing dual-chip solutions: the former built compact devices on a single PCB, while the latter produced tandems in the form of a "sandwich" of two separate boards cooled by a shared heatsink. The latter option is naturally less attractive, since using two half-boards raises the final cost of the product and requires efficient heat removal from graphics processors facing each other. Recently, however, the Californians switched their flagship GeForce GTX 295 to a new design that uses just one printed circuit board. Rumors even circulated on the Web about the new product's allegedly greater length and higher power consumption compared to the old version of the card. Whether that is so, we will try to find out in this review.


Inno3D GeForce GTX 295 Platinum Edition

Representing the new revision of the GeForce GTX 295 accelerator in our tests is a card from Inno3D carrying the rather grand name Platinum Edition. The card comes in a small box picturing a warrior girl; the box also lists the adapter's main characteristics and the two games included in the bundle.


Inno3D GeForce GTX 295 Platinum package contents:
  • DVI/D-Sub adapter;
  • DVI/HDMI adapter;
  • audio cable (S/PDIF);
  • disc with the game Warmonger;
  • CD with Far Cry 2 game;
  • disk with drivers;
  • 10% discount coupon for the purchase of five games.



The design of the GeForce GTX 295 Rev. B differs fundamentally from its predecessor's, both because all components are now laid out on a single board and because of the 85 mm fan located in the center, which exhausts air both outside the system unit and back into it through the end of the card. Something similar was used back in the days of the GeForce 7900 GTX.





On the reverse side, two heatsink plates are screwed on to cool some of the memory chips. A similar, though one-piece, solution is used on the Radeon HD 4870 X2.

In size, the new product does not differ at all from the old revision, despite reports on the Web that the new accelerator would be slightly longer than the previously released GeForce GTX 295.



In addition, the card has lost its HDMI connector, so modern monitors and TVs now have to be connected the old-fashioned way, through an adapter. The remaining peripheral connectors, namely two Dual Link DVI ports, are still in place. Naturally, there is no HDTV connector. The bracket also carries an indicator showing whether the power cables are connected correctly; nothing has changed in this regard.



Like its predecessor, the new revision of the GTX 295 is equipped with only one MIO interface, which allows two identical cards to be combined in Quad-SLI mode. NVIDIA's technical department, with some hesitation, confirmed that a pair of different accelerators can work in this mode, but all our attempts to boot with two video cards of different revisions were unsuccessful.

Structurally, the cooling system has much in common with that of its competitor, the Radeon HD 4870 X2, but, as noted above, the adapter in question is equipped with a large fan instead of a blower; part of the air, after cooling one GPU's heatsink, enters the system unit through a hole in the rear casing.



The cooler shroud is easy to remove, unlike on the first cards based on the 65 nm G200, and small springs are installed on the guides around the perimeter to keep the plastic from rattling during operation.



The cooler consists of an aluminum base that cools the memory, the NF200 switch chip, the two NVIO2 chips and the power circuitry, while two separate heatsinks with heat pipes are used to cool the GPUs.





The heatsinks are small, even smaller than those on single-chip GeForce GTX 200 series cards; each has a copper insert and two heat pipes that transfer heat to thin aluminum fins.



The heatsinks are cooled by a fan with a maximum speed of about 3400 rpm. In normal operation the fan speed does not rise above 3100 rpm, and even so the system runs much quieter than blower-type coolers.

The disassembled cooling system looks like this:



NVIDIA's engineers had to work hard on the printed circuit board design, because compared to the Radeon HD 4870 X2, the Californian tandem carries a larger number of memory chips plus extra support chips.





In the end, though, the board layout turned out quite compact, thanks both to moving some of the memory chips to the back of the card and to grouping the support chips and power circuitry in the center of the board. The power subsystem, as on the old revision of the GTX 295, is separate: three phases per GPU and one for the memory. The card is powered through one 8-pin and one 6-pin connector.



The card carries G200-400 and G200-401 revision B3 GPUs. There are no protective frames around the dies, and none are needed, given that the cooler base acts as a stiffener for the entire board.



Twenty-eight GDDR3 Hynix H5RS5223CFR-N0C chips (fourteen for each processor) designed for an effective frequency of 2000 MHz are used as memory.



The total amount is 1792 MB, but the effective amount is only half that: because of how SLI works (just like CrossFire), the data in each GPU's memory is duplicated. The memory bus is 448 bits per "half" of the card.
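The arithmetic can be checked with a quick sketch (Python; the figures are taken from the card's specifications quoted in this review):

```python
# Effective memory and per-GPU memory bandwidth of the GTX 295 (Rev. B),
# using the specifications quoted above.
total_memory_mb = 1792                        # 28 GDDR3 chips in total
effective_memory_mb = total_memory_mb // 2    # SLI duplicates data per GPU
bus_width_bits = 448                          # per "half" of the card
effective_clock_mhz = 2016                    # Inno3D card's effective memory clock

# Bandwidth per GPU in GB/s: bus width in bytes times transfers per second
bandwidth_gbs = bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

print(effective_memory_mb)       # 896
print(round(bandwidth_gbs, 1))   # 112.9
```

So each GPU addresses 896 MB over its own 448-bit bus, for roughly 112.9 GB/s of bandwidth per "half".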

The operating frequencies of the Inno3D GeForce GTX 295 Platinum almost match the reference ones: 576/1242 MHz (core and shader domain) and 2016 MHz (memory). The memory frequency is thus 18 MHz above reference.



When switching to 2D mode, to reduce power consumption, the card first drops its frequencies to 400/800 and 600 MHz (core/shader and memory), then to 300/600 and 200 MHz.

The card overclocked to 684/1476 MHz (core and shader domains) and 2376 MHz (memory), which is very good for a dual-chip solution. A GeForce GTX 295 Rev. A sample, for example, failed to reach such frequencies.



Besides the overclocking potential, the adapter's thermals were also pleasing: no more than 82 °C on the GPUs when testing in games and no more than 86 °C when running FurMark.
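For reference, that overclock works out to roughly 18-19% on the chips and about 18% on the memory, as a quick back-of-the-envelope calculation from the figures above shows (Python):

```python
# Overclocking gains of the Inno3D GTX 295 Platinum relative to its
# stock clocks (all values taken from the review above).
stock = {"core": 576, "shader": 1242, "memory": 2016}
overclocked = {"core": 684, "shader": 1476, "memory": 2376}

gains = {domain: round((overclocked[domain] / stock[domain] - 1) * 100, 1)
         for domain in stock}
print(gains)  # {'core': 18.8, 'shader': 18.8, 'memory': 17.9}
```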

For comparison with the new product, we used an old-revision card from XFX, the XFX GF GTX295 576M 1792MB DDR3. It comes in a large green-and-black box that lists only the product's features and the bundled Far Cry 2 game.



Package contents:
  • DVI/D-Sub adapter;
  • audio cable (S/PDIF);
  • HDTV adapter;
  • Molex/PCI-E power adapter;
  • disk with drivers;
  • CD with Far Cry 2 game;
  • instructions.



The design of the card fully matches the reference: the same "sandwich" as the GeForce 9800 GX2, but to cut costs it has no back cover and is equipped with a simpler front one.





The card offers two Dual Link DVI ports and HDMI, with a power-cable connection indicator next to them on the bracket. There is one MIO interface for combining similar cards in SLI mode, but our two test samples of different revisions refused to work together.



The cooling system design differs slightly from the GeForce 9800 GX2's, but, as before, part of the air is dumped into the system unit and part is exhausted outside.



To power the adapter, the half-boards carry two connectors: one 6-pin and one 8-pin. Disconnecting the cables is easier now, though not as convenient as it would be if the connectors were placed along the edge of the board.

The graphics processors operate at 576/1242 MHz, and the memory, with a total capacity of 1792 MB, at 1998 MHz, which fully matches the reference characteristics. The 2D-mode transitions are the same as on the card discussed above.



The card overclocked to only 612/1332 MHz on the chips and 2232 MHz on the memory, which is significantly lower than the result of the GeForce GTX 295 Rev. B.
During overclocking, the temperature of one of the cores reached 98 °C in gaming applications, with the blower running at 85% of maximum speed. In FurMark the GPU warmed up to 100 °C and the blower spun at a full 100%. It is not hard to imagine the roar such a system produces.

Force3D Radeon HD 4870 X2

To compete in the upper price bracket, AMD last year released a dual-chip solution based on the RV770, which managed to take the performance crown first from the GeForce GTX 280 and then from the GTX 285 on the updated G200. At first the 65 nm process did not let NVIDIA combine two such graphics cores in one video card, but the move to a thinner process made it possible, yielding a competitor for the Radeon HD 4870 X2: the GeForce GTX 295.

A solution based on a pair of RV770s came to us for testing in OEM trim, i.e. with no accessories at all, so let's go straight to the product description. The Force3D Radeon HD 4870 X2 video card follows the reference design and differs from similar adapters only in the branded sticker on its cooling system. Compared to the single-chip Radeon HD 4870, this adapter is longer, matching GeForce GTX 200 series products in size.





The set of peripheral connectors is standard: two Dual Link DVI with the ability to output digital audio via a DVI/HDMI adapter from the built-in multi-channel audio codec, HDTV output and one interface for connecting a bridge in CrossFire mode.



The cooling system consists of an aluminum base that cools the memory, the switch chip and power components, a pair of copper heatsinks for the GPUs, and a memory heatsink plate on the back of the card.



The blower, installed at the edge of the adapter, pushes air through one heatsink first; once heated, the air removes heat from the second heatsink less efficiently, so the difference between the cores can reach 15 °C (75 versus 90). The cooler of the new-revision GeForce GTX 295 looks much more appealing in this respect.



Unlike the new revision of the GeForce GTX 295, the printed circuit board has plenty of free space on the right side; all the main components are grouped in the center and near the peripheral connectors: the two graphics processors, the PLX PEX8647 switch chip, half of the memory (the rest is moved to the back of the card) and the power circuitry.



The RV770 processor is equipped with a protective frame that prevents the core from chipping when installing the cooling system.



The switch chip connecting the two GPUs was previously used on the Radeon HD 3870 X2, but it now supports the PCI Express 2.0 bus, unlike the PCI-E 1.1 of the old version.



Sixteen Hynix H5GQ1H24MJR-T0C GDDR5 memory chips totaling 2048 MB (1024 MB per GPU) are rated for an effective frequency of 4000 MHz. The memory bus is 256 bits per GPU.
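As a rough check of what that bus delivers per GPU, using the card's operating effective memory frequency quoted in this review (3600 MHz):

```python
# Per-GPU memory bandwidth of the Radeon HD 4870 X2,
# from the specs above (GDDR5 on a 256-bit bus per GPU).
bus_bits = 256
effective_mhz = 3600   # operating effective frequency (the chips are rated for 4000)

bandwidth_gbs = bus_bits / 8 * effective_mhz * 1e6 / 1e9
print(round(bandwidth_gbs, 1))  # 115.2
```

The narrower 256-bit bus thus still yields slightly more bandwidth per GPU than the GTX 295's 448-bit GDDR3 subsystem, thanks to GDDR5's higher effective data rate.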



The operating frequencies of the Force3D Radeon HD 4870 X2 fully correspond to the reference ones and are equal to 750/3600 MHz, chip and memory respectively. When switching to 2D mode, the frequencies are reduced to 507/2000 MHz.



The card overclocked to 792 MHz on the GPUs and 3800 MHz on the memory, which is not particularly impressive, but considering how hot the accelerator runs, it can be considered quite worthy.
As for heating, the temperature of one of the processors reached 90 °C, while the other hit only 75 °C. In FurMark the temperature rose to 93 °C, and the card sounded more like a vacuum cleaner than a gaming solution, so if possible it is better to replace the reference cooler, for example with Arctic Cooling's Accelero XTREME 4870X2.

BFG GeForce GTX 285 OC

The American company's products have only recently appeared on the domestic market, and although they are not widely available, they have won over a certain category of users.

The BFG GeForce GTX 285 OC comes in a small box with a painted "bodybuilding magician" on it; the box also shows the card's operating frequencies alongside the reference specifications, so you can see the difference right on the store shelf.



Package contents:
  • DVI/D-Sub adapter;
  • DVI/HDMI adapter;
  • HDTV adapter;
  • audio cable (S/PDIF);
  • Molex/PCI-E power adapter;
  • instructions;
  • warranty memo;
  • advertising insert;
  • disk with drivers;
  • stickers.



The BFG GeForce GTX 285 OC fully matches the reference products and differs only in the branded sticker on its cooling system.





The set of peripheral connectors is the same: two Dual Link DVI and HDTV. There are a couple of MIO interfaces for combining such cards in SLI mode. The card is powered by two six-pin connectors.



The accelerator's operating frequencies differ slightly from the reference GeForce GTX 285: 666/1512 MHz on the chip and 2484 MHz on the memory, which is 18/36 MHz higher for the GPU. The transition to 2D mode is the same as on the XFX GF GTX295 576M 1792MB DDR3.



The card's overclocking was not particularly impressive, amounting to 678/1548 MHz on the chip and 2646 MHz on the memory.



During overclocking the chip temperature reached 90 °C, with the blower spinning at 100%, which is about 3160 rpm. The cooling system can hardly be called quiet, but for hi-end products this has become the norm.
Characteristics of the cards in question
| Video adapter | Inno3D GeForce GTX 295 Platinum | XFX GF GTX295 576M 1792MB DDR3 | Force3D Radeon HD 4870 X2 | BFG GeForce GTX 285 OC |
| Core | 2x G200b | 2x G200b | 2x RV770 | G200b |
| Number of transistors | 2x 1.4 billion | 2x 1.4 billion | 2x 956 million | 1.4 billion |
| Process technology, nm | 55 | 55 | 55 | 55 |
| Stream processors | 2x 240* | 2x 240* | 2x 800* | 240 |
| Texture units | 2x 80 | 2x 80 | 2x 40 | 80 |
| Render back-ends (ROPs) | 2x 28 | 2x 28 | 2x 16 | 32 |
| Core frequency (nominal), MHz | 576 | 576 | 750 | 666 (648) |
| Shader domain frequency (nominal), MHz | 1242 | 1242 | 750 | 1512 (1476) |
| Memory bus, bits | 2x 448 | 2x 448 | 2x 256 | 512 |
| Memory type | GDDR3 | GDDR3 | GDDR5 | GDDR3 |
| Memory size, MB | 2x 896 | 2x 896 | 2x 1024 | 1024 |
| Memory frequency (nominal), MHz | 2016 | 1998 | 3600 | 2484 |
| DirectX version supported | 10 | 10 | 10.1 | 10 |
| Interface | PCI Express 2.0 | PCI Express 2.0 | PCI Express 2.0 | PCI Express 2.0 |

* AMD and NVIDIA GPU architectures differ, so stream-processor counts are not directly comparable.

Test stand and test conditions

Testing was carried out on the following stand:

  • Processor: Intel Core 2 Duo E8500 (3.16 GHz @ 4.27 GHz, FSB 450 MHz);
  • Motherboard: ASUS Rampage Formula (Intel X48);
  • Cooler: Noctua NH-U12P;
  • RAM: G.Skill F2-8800CL5D-4GBPI (2x2048 MB DDR2-1100, 4-4-4-12-2T, dual channel);
  • Hard disk: Samsung HD252HJ (250 GB, SATA2);
  • Power supply: Silver Power SP-S850 (850 W);
  • Case: Chieftec BA-01B-B-SL;
  • Operating system: Microsoft Windows Vista x86 SP1;
  • Drivers: NVIDIA GeForce 185.85 and ATI Catalyst 9.4.
The video drivers were set for maximum quality. PhysX version 2.9.09.2 was installed, and GPU PhysX acceleration was left enabled everywhere except in 3DMark Vantage.

The following applications were used for testing:

  • 3DMark'06 - v1.1.0, default settings;
  • 3DMark Vantage - v1.0.1, "Performance" and "High" profile, basic tests;
  • The Last Remnant Benchmark - no settings other than resolution;
  • Cryostasis TechDemo - "High" preset;
  • Far Cry 2 - built-in benchmark, "Ultra High" profile;
  • S.T.A.L.K.E.R.: Clear Sky Benchmark - "Ultra" quality, Day test, DirectX 10.1 support selected for Radeon HD 4870 X2;
  • H.A.W.X. - maximum quality, SSAO - Very High, support for DirectX 10.1 was selected for the Radeon HD 4870 X2 card;
  • Crysis - v1.2, used Crysis Benchmark Tool 1.05, standard demo, quality settings - "Very High";
All measurements were carried out at a screen resolution of 1680x1050 (except 3DMark'06 and 3DMark Vantage with the "Performance" profile) to create a reasonably serious load on the accelerators. In addition, image-quality enhancement modes were used in all applications that allowed it: 4x or 8x anti-aliasing and/or 16x anisotropic filtering. AA and AF were not forced in the drivers.

On graphs where there is no legend, the average and minimum fps are indicated.

Test results



In the old version of the 3DMark synthetic test package, the leader is the Radeon HD 4870 X2. However, the Inno3D card easily compensates for the lag due to its good overclocking potential. The single-core NVIDIA flagship is not far behind the top cards in simple mode, but with anti-aliasing it is up to 40% behind the GeForce GTX 295.





In 3DMark Vantage, the lead passes to the NVIDIA dual-chip cards; AMD's flagship lags far behind them, and the gap only widens with resolution (up to 28%). The single-chip GeForce GTX 285 cannot compete with its older brothers, and about 20% separates it from the Radeon HD 4870 X2, a gap that no amount of overclocking helps it close.

The Last Remnant



The meager difference between NVIDIA's dual-chip cards and the GeForce GTX 285 immediately catches the eye, and the Radeon HD 4870 X2's result is very low: looking at our previous reviews, the regular Radeon HD 4870 posts about the same numbers. So CrossFire does not work in this game, and SLI does not seem to work either.

Cryostasis TechDemo



Because of the NVIDIA PhysX physics engine, Radeon accelerators do not look good in this test, though in the actual game PhysX can be disabled to play comfortably on Radeon cards. As for the difference between the GeForce adapters, the dual-chip cards lead, but their advantage is no more than 20%, and by minimum fps the difference is even smaller.

Far Cry 2



Unusual results appear in Far Cry 2 in easy mode. All the dual-chip cards trail the GeForce GTX 285, and overclocking brings no performance gain, even causing a slight drop in the final result, which suggests the cause lies in the drivers. Let's see how things look with anti-aliasing enabled.



In heavy mode everything looks more or less adequate for the NVIDIA cards, but not for the Radeon, which loses about 25% of its performance and becomes the outsider. The dual-chip GeForce cards, meanwhile, show roughly the same results as in the simple mode, outperforming the GeForce GTX 285 by 14%, though the cards barely differ in minimum fps; that small difference can be explained by this test's high CPU dependence.

S.T.A.L.K.E.R.: Clear Sky



Dual-GPU cards from NVIDIA and AMD demonstrate almost identical results, but the Radeon HD 4870 X2 still has a minimal advantage in terms of minimum fps. High overclocking potential easily brings Inno3D GeForce GTX 295 Platinum to the lead. The gap between the single-chip GeForce GTX 285 and more expensive accelerators is about 45%.



Slightly mixed results are seen with both GeForce GTX 295s when anti-aliasing is enabled in this game. Their minimum fps sags a lot, and in this indicator they are inferior even to the GeForce GTX 285. It seems that the drivers for dual-chip NVIDIA still need significant improvement.

H.A.W.X.



In this application, the undisputed leader is the Radeon HD 4870 X2, especially in heavy mode with anti-aliasing. It has a 70% advantage in this mode over the GeForce GTX 295 and more than a twofold advantage over the GeForce GTX 285!



In Crysis, NVIDIA's two-chip flagships again confidently occupy the leader's place. The good overclocking potential of the updated GeForce GTX 295 helps the Inno3D card confidently outperform all rivals. The GeForce GTX 285, although significantly inferior to older cards, is not so far behind the Radeon HD 4870 X2 in terms of minimum fps.



When anti-aliasing is enabled, the Radeon HD 4870 X2 loses much less performance than any of the GeForce cards, so at stock clocks it even overtakes the GeForce GTX 295, though once again it cannot compete in minimum fps. In such a heavy mode the GeForce GTX 285 runs out of computing power, and its gap behind the leaders only grows.

Power consumption

A Seasonic Power Angel was used to measure the total power consumption of the system. As the load on the video adapters we used a 10-minute run of the FurMark benchmark, three flights over the island in Crysis at Very High quality with 8x AA, and the first three tests in Devil May Cry 4 at Super High quality with 8x MSAA. The screen resolution was always 1680x1050. Idle power consumption was measured 10 minutes after testing completed.







So, the new revision GeForce GTX 295 card was less voracious under load, and the peak power consumption in the first two tests was 10 W lower than that of the old GTX 295. In the third application, the system power consumption was equal, and the difference was only 2 watts. In idle, the system based on the novelty also consumed less - by 5 watts. The Radeon HD 4870 X2 card broke all records for “economy” - a system based on it under load already consumed 100 W more in FurMark, 50 W more in Crysis and 10 W more in DMC 4 than with the GeForce GTX 295, and at idle, the difference reached almost 20 watts. The result is clearly not in favor of AMD's solution.

Conclusions

No sooner had the flagship based on a pair of G200s been released than a simplified design appeared with a more efficient and quieter cooling system, lower power consumption (by our measurements) and better overclocking potential. The question is why such a solution could not have been released right away, and why the convoluted two-board design was chosen in the first place. The question remains open.

But be that as it may, the new revision of the GeForce GTX 295 turned out to be extremely successful, probably the best representative of the two-chip video cards ever produced. And given the low power consumption and higher performance than the Radeon HD 4870 X2, the new product becomes the best choice in the hi-end class.

As for the Radeon HD 4870 X2, this accelerator, despite very high power consumption, does not always beat the GeForce GTX 295. But if a game supports DirectX 10.1, the card gets a second wind and can quite easily deliver more frames per second, especially with anti-aliasing enabled.

We thank the following companies for providing test equipment:

  • Max Point for the Silver Power SP-S850 PSU;
  • Noctua for the Noctua NH-C12P cooler and Noctua NT-H1 thermal paste.

    No one likes being in second place, least of all a company as ambitious as Nvidia. Not long ago the company introduced the GeForce GTX 260 and 280 graphics cards, and at the time AMD could only counter with the unfortunate Radeon HD 3870, so it was no surprise that the GT200 cards extended nVidia's leadership. But no one expected AMD to strike back with the Radeon HD 4850 and 4870 (neither card faster than Nvidia's flagship, by the way) and still manage to divert attention from the massive monolithic chip.

    Having gained plenty of experience with its smaller, more flexible GPUs, AMD quickly introduced the Radeon HD 4870 X2, featuring two RV770 GPUs, 2 GB of GDDR5 memory and a PCI Express (PCIe) bridge for chip-to-chip communication. Suddenly the tide had turned and AMD had the fastest discrete graphics card on the market. Six months after the announcement, the full line of RV770-based cards is attacking Nvidia's position in every market segment the G92 once conquered.

    Even hardcore gamers who can spend thousands of dollars to get the best gaming performance are confused. Want to beat the 4870 X2 with nVidia's solution? Buy a pair of GeForce GTX 280s. Want to beat it with an AMD solution? Buy a pair of 4870 X2 and run it in 4-way CrossFireX mode. Do you need something faster? Throw in a third GTX 280. But you'll need an expensive motherboard and a very powerful processor to handle that kind of graphics performance. Is there a limit to this?

    Therefore, limiting the contest to a single discrete video card seems quite reasonable for determining today's winner. Of course, with 4-way CrossFire and 4-way SLI configurations in existence, there is always room to spend more money, raise power consumption and squeeze out an extra 10-20 percent of performance... But we decided to pick a winner among single video cards.



    In 2008, nVidia did not push back very hard against the Radeon threat, only updating the GeForce GTX 260 to compete more aggressively with AMD's 1 GB HD 4870. According to the company, all GTX 260 graphics cards will move to the 216-shader-processor configuration in the future; cards with 192 shader processors will disappear as partners sell through them. Given the current price level, nVidia believes the new GTX 260s can beat AMD's current offerings.

    Beyond that, nVidia wants to take the single-card performance crown, and the company is spending heavily to do so. This article previews the video card nVidia plans to release during CES. The GeForce GTX 295 is NVIDIA's answer to the AMD Radeon HD 4870 X2 and uses the same design as the GeForce 9800 GX2.

    GeForce GTX 295 in detail




    Initially, there were rumors that the GeForce GTX 295 would consist of two GT200s in a configuration that simulates a pair of GTX 260s. In fact, the video card consists of two full-fledged GT200s, each with 240 processing cores, but with a memory configuration similar to the GTX 260.

    The original GT200 contained 1.4 billion transistors and was manufactured on TSMC's 65 nm process. The version of the chip used in the GeForce GTX 295 benefits from a shrink to 55 nm. In addition, according to nVidia's Jason Paul, the company has tuned on-chip latencies to improve the performance-per-watt ratio, which should show up in the power-consumption tests.

    Like the GeForce GTX 280, each GPU on the GTX 295 uses 240 stream processors and 80 texture filtering/addressing units. But, like the GTX 260, each GPU has seven ROP/framebuffer partitions, giving 28 ROPs per chip, with a 448-bit interface to 896 MB of GDDR3 memory. The key clock speeds also match the GeForce GTX 260: the core, including texture units and ROPs, runs at 576 MHz, the stream processors at 1242 MHz, and the memory at 999 MHz (1998 MHz effective). As you can see, each chip's configuration sits between nVidia's fastest and second-fastest GPUs.
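    From those per-GPU figures, the theoretical throughput of each half of the GTX 295 is easy to derive (a rough sketch; real-world performance depends on drivers and SLI scaling):

```python
# Theoretical per-GPU throughput of the GeForce GTX 295,
# from the specs quoted above (GTX 260-style memory subsystem).
core_mhz = 576
rops = 28                 # 7 ROP partitions x 4 ROPs
tmus = 80
bus_bits = 448
mem_effective_mhz = 1998

pixel_fillrate_gpix = rops * core_mhz * 1e6 / 1e9      # GPixel/s
texture_fillrate_gtex = tmus * core_mhz * 1e6 / 1e9    # GTexel/s
bandwidth_gbs = bus_bits / 8 * mem_effective_mhz * 1e6 / 1e9

print(round(pixel_fillrate_gpix, 1))    # 16.1
print(round(texture_fillrate_gtex, 1))  # 46.1
print(round(bandwidth_gbs, 1))          # 111.9
```

    Per GPU, that is roughly 16.1 GPixel/s, 46.1 GTexel/s and 111.9 GB/s, exactly matching a GTX 260 with the full 240 shader processors enabled.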

    | | GeForce GTX 295 | GeForce GTX 280 | GeForce GTX 260 | GeForce 9800 GX2 | Radeon HD 4870 X2 |
    | Process technology | 55 nm TSMC | 65 nm TSMC | 65 nm TSMC | 65 nm TSMC | 55 nm TSMC |
    | Stream processors | 480 | 240 | 216 | 256 | 1600 |
    | Core frequency | 576 MHz | 602 MHz | 576 MHz | 600 MHz | 750 MHz |
    | Shader frequency | 1242 MHz | 1296 MHz | 1242 MHz | 1500 MHz | 750 MHz |
    | Memory frequency (effective) | 1998 MHz | 2214 MHz | 1998 MHz | 2000 MHz | 3600 MHz |
    | Memory | 1792 MB | 1 GB | 896 MB | 1 GB | 2 GB |
    | Memory bus width | 448 bit x 2 | 512 bit | 448 bit | 256 bit x 2 | 256 bit x 2 |
    | ROPs | 56 | 32 | 28 | 32 | 32 |
    | Price | $499 (rec.) | ~$380 (from 12 thousand rubles) | ~$230 (from 8 thousand rubles) | from 9 thousand rubles | ~$500 (from 14 thousand rubles) |

    Two GPUs, one graphics card

    At first glance, the GeForce GTX 295 looks like a GTX 280 or 260. Turn the card over and you will see only one GPU on the back. Like the earlier 9800 GX2 and 7950 GX2, the card actually consists of two printed circuit boards arranged as a "sandwich" with a special fan-equipped heatsink between them that handles the cooling. The two boards are connected by an SLI cable and enclosed in a protective casing.




    It is only natural that the design of the cooler has been specially adapted to cool two printed circuit boards. You can see that both boards have holes through which air is sucked in. The board takes up two slots, so it's no wider than other single-chip nVidia offerings. In addition, the board is the same length as the GeForce GTX 280 (and AMD Radeon HD 4870 X2).

    Nvidia is not yet ready to commit to final GTX 295 power-consumption figures against AMD's solution, so we were not allowed to publish absolute numbers. However, the card draws less power (both idle and under load) than the Radeon HD 4870 X2. The following graph shows the system's power draw from the outlet.




    On paper, the GeForce GTX 295 uses a TDP of 289W. The 4870 X2 has a TDP of 286W. When we measured system power draw from a wall outlet, the GTX 295 was drawing 10W less than the AMD board when idle. When running the Far Cry 2 test in a loop at 2560x1600 with anti-aliasing (AA) and anisotropic filtering (AF), the nVidia board consumed, on average, a whopping 50 watts less.

    Of course, we should wait until early January when fan speeds are determined and power consumption is balanced. But already now we can say that the transition to the 55-nm process technology had a good effect on the GeForce GTX 295, despite the massive size of the GT200 GPU.

    Test configuration

    Our test configuration matches an enthusiast system worthy of a $500 video card: either a system with a top i7 965 processor or an overclocked i7 920. Either way, the platform handled the load of all four of our test video card configurations well. Understandably, if you plan to move up to a 4-way SLI configuration (which the GTX 295 supports) or 4-way CrossFireX, you need to pay close attention to CPU performance.

    But today we are interested in the fastest single video card prepared for 2009.

    Hardware configuration
    CPU Intel Core i7 965 Extreme (3.2 GHz)
    Motherboard Asus Rampage II Extreme
    Memory 6 GB DDR3-1333 7-7-7 (three channels)
    Storage device Seagate 250 GB Barracuda 7200.10 7200 rpm
    Optical drive Lite-On DH-4O1S BD-ROM
    Power Supply Cooler Master UCP 1100W
    Video cards Nvidia GeForce GTX 295 1.8 GB
    Nvidia GeForce GTX 280 1 GB
    AMD Radeon HD 4870 X2 2 GB
    AMD Radeon HD 4870 512 MB

    We again used the 64-bit version of Vista, which supports a full 6 GB of system memory.

    For tests, we used the following games.

    Tests and settings

    Very High Quality Settings, No AA / No AF, vsync off, 1900x1200 / 2560x1600, Patch 1.2.1, DirectX 10, 64-bit executable, benchmark tool
    Very High Quality Settings, 4x AA / 8x AF, vsync off, 1900x1200 / 2560x1600, Patch 1.2.1, DirectX 10, 64-bit executable, benchmark tool
    Highest Quality Settings, No AA / No AF, vsync off, 1920x1200 / 2560x1600, Patch 1.1, FRAPS/saved game
    Highest Quality Settings, 4x AA / 8x AF, vsync off, 1920x1200 / 2560x1600, Patch 1.1, FRAPS/saved game
    Highest Quality Settings, No AA / No AF forced in drivers, vsync off, 1920x1200 / 2560x1600, FRAPS/saved game
    Highest Quality Settings, 4x AA / 8x AF forced in drivers, vsync off, 1920x1200 / 2560x1600, FRAPS/saved game
    Ultra Quality Settings, No AA / No AF, vsync off, 1920x1200 / 2560x1600, Patch 1.0.0.15, FRAPS/saved game
    Ultra Quality Settings, 4x AA / 8x AF, vsync off, 1920x1200 / 2560x1600, Patch 1.0.0.15, FRAPS/saved game
    Very High Quality Settings, No AA / No AF, vsync off, 1920x1200 / 2560x1600, DirectX 10, Steam Version, in-game benchmark
    Very High Quality Settings, 4x AA / 8x AF, vsync off, 1920x1200 / 2560x1600, DirectX 10, Steam Version, in-game benchmark
    Highest Quality Settings, No AA / No AF, vsync off, 1920x1200 / 2560x1600, timedemo
    Highest Quality Settings, 4x AA / 8x AF, vsync off, 1920x1200 / 2560x1600, timedemo

    Test results



    We started our tests with Crysis, which supports 64-bit mode and is one of the most demanding games on the market. In all but one case the GTX 295 outperformed the AMD Radeon HD 4870 X2; the exception was 2560x1600 with anti-aliasing (AA) and anisotropic filtering (AF), where nVidia acknowledges a performance bug in the beta driver.

    Interestingly, the 512 MB Radeon HD 4870 comes pretty close to the GTX 280 with AA and AF turned off. As soon as the image-quality options are enabled, however, the halved memory capacity starts to drag performance down.



    Once again, the GeForce GTX 295 shows a solid lead, which is most evident at 2560x1600 with AA and AF active. If the final version of the card performs the same (and at a price of $499 you cannot expect otherwise), nVidia has a serious advantage over the AMD graphics cards in this first-person shooter with a World War II theme.

    In this game, unlike Crysis, the GTX 280 was noticeably faster than the 512 MB Radeon HD 4870. However, the 1 GB version would certainly perform better, although it will cost $40 more.



    Dead Space is a fresh horror title ported from the console world to the PC - and, unfortunately, the port did not do the graphics quality any favors.

    Although we set the indicated full-screen anti-aliasing modes in the driver (which led to a corresponding drop in performance, meaning the AA and AF passes were actually being executed), we still noticed obvious "ladders" even at 1920x1200 and 2560x1600. Additionally, the Radeon HD 4870 X2 exhibited strange but repeatable behavior at 1920x1200, where its results were lower than at 2560x1600.

    This game clearly prefers nVidia graphics cards: even the GTX 280 outperforms AMD's flagship. However, this is not a game we would add to our regular test suite, if only because it responds so dubiously to image-enhancement options.

    We discussed the results with Nvidia. Apparently, Dead Space has its own anti-aliasing algorithm, so enabling AA in the driver panel did not affect the game. You can enable AA in the game itself, but this does not affect the performance of the test in any way.



    Fallout 3 is a very interesting game, but performance here appears to be platform-limited. With or without anti-aliasing and anisotropic filtering, the GeForce GTX 295 and Radeon HD 4870 X2 performed almost identically at 1920x1200 and 2560x1600. Clearly there is plenty of graphics power on tap that the platform cannot yet expose. Only when we enabled anti-aliasing and filtering (AA/AF) on the GTX 280 at 2560x1600 did we notice a drop in performance, and the HD 4870 runs into its memory limitation.



    AMD has been struggling with Far Cry 2 for the past month, and the Catalyst 8.12 driver seems to have improved the situation. Without anti-aliasing, the 4870 X2 wins at 1920x1200 and 2560x1600; enabling the image-quality options hands nVidia the lead at both resolutions.



    Since the game is based on the Valve Source engine, we were almost certain that Left 4 Dead would give an advantage to AMD graphics cards. But at every resolution, with AA/AF on or off, the GeForce GTX 295 came out ahead of the AMD Radeon HD 4870 X2. Considering the price difference between the GTX 280 and the 4870 512 MB, the results for this pair are not surprising. However, these graphics cards are still a good starting point for those looking to upgrade.

    Looking ahead to 2009

    Regarding the tests carried out, I would like to say the following. Let's start with the fact that the review and the results are preliminary. We will release a full review in 2009 when the final version of the GeForce GTX 295 hits the market and boxes with this video card appear on sale.

    Here we have presented the results of six games. Five of them were put forward by nVidia as examples of the most anticipated titles of the 2008 holiday season, and four of those five are part of nVidia's "The Way It's Meant to Be Played" program. Two of the games are already in our traditional test suite, and we added Crysis to the mix. But the graphics card is still an engineering sample, and according to nVidia, final fan speeds have not yet been set.

    Why did we decide to release such a test? All of the selected games are, in fact, popular, and we understand how tired our readers are of seeing three-year-old games being used because they've become recognized benchmarks of performance. Therefore, we decided to take a breath of fresh air instead of the boring Supreme Commander or World in Conflict.

    Of course, TWIMTBP is nothing but a competitive advantage for nVidia. Working with developers so that they can optimize their software for your hardware is quite logical and understandable, as long as the relationship does not lead to deliberately crippled performance on competitors' video cards. Considering how many times AMD graphics cards beat nVidia models in 2008, the developer-relations program clearly doesn't pursue such nefarious goals. In fact, we were recently able to take a closer look at how nVidia contributed to the development of Far Cry 2 as part of TWIMTBP: the company spent roughly three man-months on on-site consultations and performance testing, including regular driver updates. As a result, GeForce GTX graphics cards did not experience as many problems as Radeon models (we described these problems during gaming tests on a Core i7 system with configurations up to 4-way CrossFire and 3-way SLI).

    Conclusion

    At the beginning of the article, we asked whether nVidia could debunk AMD's claim to the fastest single graphics card on the market. All in all, the GeForce GTX 295 was able to do just that, as these preliminary performance tests based on some of the hottest games of the holiday season suggest.

    According to nVidia, the GeForce GTX 295 will be released at CES in 2009, a few weeks from now. It will retail for a suggested price of $499 - roughly equivalent to the Radeon HD 4870 X2 - and will be available at the time of the announcement (we can't say for Russia). When we get our hands on a final retail sample of a graphics card, rather than an early engineering sample, we will be able to conduct thorough tests and more accurately evaluate performance.

    What do we know so far? The nVidia GeForce GTX 295 is a really fast graphics card. We know the company has moved to a 55nm process that has reduced power consumption - and the GTX 295 draws less power at idle and under load than the AMD Radeon HD 4870 X2.

    However, the GeForce GTX 295, like all other upcoming 55nm nVidia GPUs, has other tricks up its sleeve, not just performance numbers. This includes all the talk about CUDA, PhysX and the soon-to-be-released 3D-stereo technology, which we have already been able to get acquainted with at NVISION.

    In our article, we did not pay due attention to PhysX support, since the list of supported games does not overlap with the list of the best games. We didn't talk about CUDA either, because apart from Badaboom, there are too few applications today that would be worth buying a graphics card with this technology. Of course, we were impressed by the stories of five or six developers about how CUDA could revolutionize development in their niche areas with Tesla cards, but transferring this advantage to desktop applications is problematic.

    But in this regard, the situation is slowly but surely changing. EA should catch up, and in early 2009 there should be a couple of games with PhysX support, which will allow us to assess how this feature justifies itself. We tried playing Left 4 Dead (remember it's not part of the TWIMTBP program) with the upcoming nVidia 3D glasses, and the gameplay was absolutely amazing. But this topic is for another article, which will also be published soon.

    About six months ago, AMD released its most powerful video card, the Radeon HD 4870 X2, which held the lead among single video cards for a long time. It combined two RV770 chips, very high temperatures, a rather high noise level and, most importantly, the highest gaming performance of any single video card. Only now, at the beginning of 2009, has NVIDIA finally responded with a new video card intended to become the new leader among single solutions. The new card is based on the 55 nm GT200 chip and carries 2x240 stream processors, but there are a few "buts": a narrower memory bus (448 bits vs. 512 for the GTX 280), fewer ROPs (28 vs. 32) and a lower memory frequency (2 GHz effective versus 2.21 GHz for the GTX 280). Today I want to introduce you, dear readers, to the speed of the new TOP, its advantages and disadvantages, and, of course, compare it with its main competitor, the Radeon HD 4870 X2.

    Review of the Palit GeForce GTX295 video card

    The Palit GeForce GTX295 came into our test lab as an OEM product that only included a Windows XP/Vista driver disk. Let's look at the video card itself:


    The video card is 270 mm long, which is standard for top-level video cards, and is equipped with two dual-link DVI-I ports and an HDMI port - just like its predecessor, the GeForce 9800GX2:

    But there are some changes: the DVI-I ports are now located on the right, and the hot-air exhaust grille has become narrower and longer, which should clearly improve the cooling of the video card. Note that not all of the hot air is expelled outside the case - as with the 9800GX2, most of it remains inside; you can see the fins of the cooling-system radiator in the upper part of the card.

    In order to “climb the mountain again”, NVIDIA moved its “top” GT200 GPU to a finer manufacturing process - 55 nm. The new GT200b chip remained architecturally the same but runs cooler, which made it possible to release an accelerated version of the single-chip GeForce GTX 280 - the GeForce GTX 285 (discussed in our ZOTAC GeForce GTX 285 AMP! Edition review). But this was not enough to overthrow the Radeon HD 4870 X2, so simultaneously with the GeForce GTX 285, the dual-chip GeForce GTX 295 was released, based on two slightly cut-down GT200b chips. The resulting accelerator, whose name carries no hint of its "double-headedness", should provide unsurpassed performance.

    To begin with, let's give a comparative table showing the history of the development of two-chip NVIDIA video cards and analogues from the direct competitor AMD-ATI.

    Specification                            GeForce 9800 GX2   GeForce GTX 295   Radeon HD 3870 X2   Radeon HD 4870 X2
    Graphics chip                            2x G92             2x GT200-400-B3   2x RV670            2x RV770
    Core frequency, MHz                      600                576               825                 750
    Frequency of unified processors, MHz     1500               1242              825                 750
    Number of unified processors             2x128              2x240             2x320               2x800
    Texture addressing/filtering units       2x64               2x80              2x16                2x40
    Number of ROPs                           2x16               2x28              2x16                2x16
    Memory size, MB                          2x512              2x896             2x512               2x1024
    Effective memory frequency, MHz          2000 (2x1000)      2000 (2x1000)     2250 (2x1125)       3600
    Memory type                              GDDR3              GDDR3             GDDR3/GDDR4         GDDR5
    Memory bus width, bit                    2x256              2x448             2x256               2x256
    Technical process of production          65 nm              55 nm             55 nm               55 nm
    Power consumption, W                     197                up to 289         196                 286

    Judging by the characteristics of the GeForce GTX 295, the architecture of the new graphics accelerator is at the junction of the SLI configurations of the GeForce GTX 280 and GeForce GTX 260 with the transition to a 55 nm process technology, which made it possible to obtain power consumption at the level of a competitor's solution. How effective such an "alloy" is, only testing can show, so let's get acquainted with one of the new options.

    Video card ZOTAC GeForce GTX 295

    Let's start by summarizing and clarifying the ZOTAC GeForce GTX 295 specification.

    Manufacturer                          ZOTAC
    Model                                 ZT-295E3MA-FSP (GeForce GTX 295)
    Graphics core                         NVIDIA GeForce GTX 295 (GT200-400-B3)
    Pipelines                             480 unified streaming processors
    Supported APIs                        DirectX 10.0 (Shader Model 4.0), OpenGL 2.1
    Core (shader domain) frequency, MHz   576 (1242)
    Memory size (type), MB                1792 (GDDR3)
    Memory frequency (effective), MHz     1000 (2000)
    Memory bus, bit                       896
    Bus standard                          PCI Express 2.0 x16
    Maximum resolution                    Up to 2560x1600 in dual-link DVI mode
                                          Up to 2048x1536 @ 85 Hz over analog VGA
                                          Up to 1920x1080 (1080i) via HDMI
    Video outputs                         2x DVI-I (VGA via adapter), 1x HDMI
    HDCP support / HD video decoding      Yes / H.264, VC-1, MPEG2 and WMV9
    Drivers                               The latest drivers can be downloaded from the support site or the GPU manufacturer's website
    Products webpage

    The video card comes in a cardboard box which is exactly the same size as the one for the ZOTAC GeForce GTX 285 AMP! Edition, but with a slightly different design in the same black and gold colors. On the front side, in addition to the name of the graphics processor, the presence of 1792 MB of GDDR3 video memory and an 896-bit memory bus, hardware support for the NVIDIA PhysX physics API and a built-in HDMI video interface are noted.

    The back of the package lists the general features of the graphics card with a brief explanation of the benefits of NVIDIA PhysX, NVIDIA SLI, NVIDIA CUDA, and GeForce 3D Vision.

    All this information is briefly duplicated on one of the sides of the box.

    The other side of the package lists the exact frequency response and the operating systems supported by the drivers.

    A little lower, the minimum requirements for a system hosting the "gluttonous" ZOTAC GeForce GTX 295 are carefully listed. The future owner of such a powerful graphics accelerator will need a power supply unit rated at no less than 680 W and capable of providing up to 46 A on the 12 V line. The PSU must also have the required PCI Express power connectors: one 8-pin and one 6-pin.
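
    The stated requirement is easy to sanity-check with basic P = U x I arithmetic; the 46 A figure comes from the box, the rest is just unit bookkeeping.

```python
# How much power the requested +12 V line can actually deliver.
amps_12v = 46           # required +12 V current, from the box
volts = 12
watts_12v = amps_12v * volts
print(watts_12v)        # power available on the +12 V rail alone, in W
```

    So of the 680 W total, roughly 550 W must be deliverable on the +12 V rail; as the measurements later in this article show, the real draw is far lower.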

    By the way, the new packaging is much more convenient to use than the previously used ones. It allows you to quickly access the contents, which is convenient both when you first build the system and when you change the configuration, for example, if you need some kind of adapter or maybe a driver disk when you reinstall the system.

    The delivery set is quite sufficient for installing and using the accelerator, and in addition to the video adapter itself, it includes:

    • disk with drivers and utilities;
    • CD with bonus game "Race Driver: GRID";
    • disc with 3DMARK VANTAGE;
    • paper manuals for quick installation and use of the video card;
    • branded sticker on the PC case;
    • adapter from 2x peripheral power connectors to 6-pin PCI-Express;
    • adapter from 2x peripheral power connectors to 8-pin PCI-Express;
    • adapter from DVI to VGA;
    • HDMI cable;
    • cable for connecting the SPDIF output of a sound card to a video card.

    The GeForce GTX 295 owes its unusual appearance to its design. This is striking at first glance: a virtually rectangular accelerator completely enclosed by a casing, with a slot through which you can make out the cooling-system turbine hidden somewhere in the depths.

    The reverse side of the video card looks a little simpler, because, unlike the GeForce 9800 GX2, there is no second half of the casing here. This would seem to suggest that the GeForce GTX 295 is more amenable to disassembly, but as it turned out later, things are not so simple or safe.

    On top of the video card, almost at the very edge, there are additional power connectors, and due to the rather high power consumption, you need to use one 6-pin and one 8-pin PCI Express.

    Next to the power connectors there is a SPDIF digital audio input, which should provide mixing of the audio stream with video data when using the HDMI interface.

    Further, a cutout is made in the casing, through which heated air is ejected directly into the system unit. This will clearly worsen the climate inside it, because, given the power consumption of the video card, the cooling system will blow out a fairly hot stream.

    And already near the panel of external interfaces there is an NV-Link connector, which will allow you to combine two such powerful accelerators in an uncompromising Quad SLI configuration.

    Two DVIs are responsible for image output, which can be converted to VGA with the help of adapters, as well as a high-definition multimedia output HDMI. There are two LEDs next to the top video outputs. The first one displays the current power status - green if the level is sufficient and red if one of the connectors is not connected or an attempt was made to power the card from two 6-pin PCI Express. The second one indicates the DVI output to which the master monitor should be connected. A little lower is a ventilation grill through which part of the heated air is blown out.

    Disassembling the video card, however, turned out to be not as difficult as it was in the case of the GeForce 9800 GX2. First you need to remove the top casing and ventilation grill from the interface panel, and then just unscrew a dozen and a half spring-loaded screws on each side of the video card.

    Inside this "sandwich" there is a single cooling system, consisting of several copper heat sinks and copper heat pipes, on which a lot of aluminum plates are strung. All this is blown by a sufficiently powerful and, accordingly, not very quiet turbine.

    There are printed circuit boards on both sides of the cooling system, each representing one half of the GeForce GTX 295: it carries one graphics processor with its own video memory and power system, as well as auxiliary chips. Note that the 6-pin auxiliary power connector sits on the more chip-laden half, which is logical, because that half also receives up to 75 W through the PCI Express slot.

    The boards are interconnected using special flexible SLI bridges. Both the bridges and their connectors are quite delicate, so we do not recommend opening your expensive video card without special need.

    But bridges alone are not enough to coordinate the two halves of the GeForce GTX 295 with the rest of the system. In an ordinary SLI pair of video cards, this role is played by the chipset or, increasingly, by the NVIDIA nForce 200 bridge with PCI Express 2.0 support. It is this NF200-P-SLI PCIe switch that is used in the GeForce GTX 295.

    The boards also have two NVIO2 chips that are responsible for the video outputs: the first one provides support for a pair of DVI, and the second one for one HDMI.

    It is thanks to the presence of the second chip that multi-channel audio is mixed from the SPDIF input to a convenient and promising HDMI output.

    The video card is based on two NVIDIA GT200-400-B3 chips. Unfortunately, there was not enough space on the printed circuit boards to accommodate more memory chips, so compared with the full-fledged GT200-350 the memory bus was reduced from 512 to 448 bits, which in turn cut the number of rasterization channels to 28, as in the GeForce GTX 260. On the other hand, all 240 unified processors remain, as in a full-fledged GeForce GTX 285. The graphics processor itself operates at 576 MHz, and the shader pipelines at 1242 MHz.
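
    The cost of the narrower bus is easy to quantify. A minimal sketch, using the bus widths quoted in this article and the effective memory clocks of 2000 MHz (GTX 295, per GPU) and 2214 MHz (reference GTX 280):

```python
def bandwidth_gb_s(bus_width_bits: int, effective_mhz: float) -> float:
    # bytes per transfer (bits / 8) x transfers per second, expressed in GB/s
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(448, 2000))  # per-GPU bandwidth of the GTX 295
print(bandwidth_gb_s(512, 2214))  # a single GTX 280, for comparison
```

    Each GPU of the GTX 295 therefore gets about 112 GB/s, versus roughly 142 GB/s on a GTX 280.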

    The total 1792 MB of video memory is made up of Hynix H5RS5223CFR-N0C chips which, at an operating voltage of 2.05 V, have an access time of 1.0 ns, i.e. are rated for an effective frequency of 2000 MHz. They run at exactly that frequency, leaving no factory headroom for overclockers.
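
    The link between the chips' rated access time and their effective frequency can be sketched as follows: double-data-rate memory transfers on both clock edges, so the effective rate is twice the clock implied by the access time.

```python
def gddr3_effective_mhz(access_time_ns: float) -> float:
    # f_clock = 1 / t_access; DDR doubles it: f_eff = 2 / t_access.
    # The 1/ns -> GHz result is multiplied by 1000 to get MHz.
    return 2.0 / access_time_ns * 1000.0

print(gddr3_effective_mhz(1.0))  # rating of the 1.0 ns Hynix chips, in MHz
```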

    Cooling System Efficiency

    In a closed but well-ventilated test case, we were unable to get the turbine to automatically spin up to maximum speed. True, even with 75% of the cooler's capabilities, it was far from quiet. In such conditions, one of the GPUs, located closer to the turbine, heated up to 79ºC, and the second - up to 89ºC. Given the overall power consumption, these are still far from critical overheating values. Thus, we can note the good efficiency of the cooler used. If only it were a little quieter...

    At the same time, acoustic comfort was disturbed not only by the cooling system - the power stabilizer on the card also did not work quietly, and its high-frequency whistle cannot be called pleasant. True, in a closed case, the video card became quieter, and if you play with sound, then good speakers will completely hide its noise. It will be worse if you decide to play in the evening with headphones, and next to you someone is already going to rest. But that's exactly the price you pay for a very high performance.

    When testing, we used video card test stand No. 2:

    CPU: Intel Core 2 Quad Q9550 (LGA775, 2.83 GHz, L2 12 MB) @ 3.8 GHz
    Motherboards: ZOTAC nForce 790i-Supreme (LGA775, nForce 790i Ultra SLI, DDR3, ATX); GIGABYTE GA-EP45T-DS3R (LGA775, Intel P45, DDR3, ATX)
    Coolers: Noctua NH-U12P (LGA775, 54.33 CFM, 12.6-19.8 dB); Thermalright SI-128 (LGA775) + VIZO Starlet UVLED120 (62.7 CFM, 31.1 dB)
    Additional cooling: VIZO Propeller PCL-201 (+1 slot, 16.0-28.3 CFM, 20 dB)
    RAM: 2x DDR3-1333 1024 MB Kingston PC3-10600 (KVR1333D3N9/1G)
    Hard drive: Hitachi Deskstar HDS721616PLA380 (160 GB, 16 MB, SATA-300)
    Power supplies: Seasonic M12D-850 (850 W, 120 mm, 20 dB); Seasonic SS-650JT (650 W, 120 mm, 39.1 dB)
    Case: Spire SwordFin SP9007B (Full Tower) + Coolink SWiF 1202 (120x120x25, 53 CFM, 24 dB)
    Monitor: Samsung SyncMaster 757MB (DynaFlat, MPR II, TCO'99)


    Having the same number of stream processors as a pair of GeForce GTX 280s, but in other respects closer to a 2-way SLI bundle of GeForce GTX 260s, the dual-chip GeForce GTX 295 sits right between those pairs and only occasionally overtakes every possible competitor. But since it does not require SLI support from the motherboard and should cost less than two GeForce GTX 280s or GTX 285s, it can be considered a genuinely promising high-performance solution for true gamers.

    Power consumption

    Having repeatedly seen during our power supply tests that the system requirements quoted for high-performance components are often overstated, we decided to check whether the GeForce GTX 295 really requires 680 watts. At the same time, we compared the power consumption of this graphics accelerator with other video cards and bundles, paired with a quad-core Intel Core 2 Quad Q9550 overclocked to 3.8 GHz.

    Power consumption, W

    Test mode                                             GeForce GTX 295   GeForce GTX 260 SLI   GeForce GTX 260 3-Way SLI
    Crysis Warhead, Max Quality, 2048x1536, AA4x/AF16x
    Call of Juarez, Max Quality, 2048x1536, AA4x/AF16x
    FurMark, stability test, 1600x1200, AA8x
    Idle

    The results obtained allow us to refute the recommendation to use extremely powerful power supplies with the GeForce GTX 295. In other words, manufacturers are clearly hedging against low-quality power supplies with inflated nameplate ratings. In practice, a good 500 W unit from a reliable manufacturer will provide enough power for even such a powerful video card.

    Overclocking

    To overclock the video card, we used the RivaTuner utility, while the case was opened to improve the flow of fresh air to the video card. In addition, the VIZO Propeller cooler provided an improved supply of fresh air directly to the video card turbine, and a household fan standing next to the system unit “blew” the air heated by the video card to the side, preventing it from getting inside the case again.

    As a result of overclocking, the frequency of the raster and shader domains was raised to 670 and 1440 MHz, which is 94 MHz (+16.3%) and 198 MHz (+15.9%) respectively higher than the default values. The video memory worked at an effective frequency of 2214 MHz, which is 216 MHz (+10.8%) higher than the nominal one. Let's see how overclocking a video card affects performance.
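
    The quoted percentages can be reproduced with a one-line calculation. Note that the memory figures only add up if the nominal effective clock is taken as 1998 MHz (2214 - 216), which elsewhere in the article is rounded to 2000 MHz:

```python
def gain_pct(stock_mhz: float, overclocked_mhz: float) -> float:
    # Relative overclocking gain in percent.
    return (overclocked_mhz - stock_mhz) / stock_mhz * 100.0

print(round(gain_pct(576, 670), 1))    # raster domain: 16.3
print(round(gain_pct(1242, 1440), 1))  # shader domain: 15.9
print(round(gain_pct(1998, 2214), 1))  # memory: 10.8
```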

    Test package                                      Standard frequencies   Overclocked card   Performance gain, %
    Crysis Warhead, Maximum Quality, No AA/AF, fps
    Far Cry 2, Maximum Quality, No AA/AF, fps
    Far Cry 2, Maximum Quality, AA4x/AF16x, fps

    Overclocking an already fast video card yielded up to 13% more performance. But even at very high resolutions and maximum image-quality settings this will be hard to notice, because even without overclocking the GeForce GTX 295 delivers sufficient frame rates in all games.

    Conclusions

    The GeForce GTX 295 graphics card is a qualitative step forward in dual-chip accelerators for true connoisseurs of computer games. It really does deliver a very high level of performance, sometimes earning the title of fastest 3D accelerator available today, while keeping power consumption moderate and the cooling system not overly noisy. On the other hand, there are still practically no games in which, at maximum image-quality settings and very high resolutions, "top-end" single-chip video cards fail to provide playable frame rates. So you should think twice before buying such an expensive graphics accelerator that does not always deliver a noticeable performance jump. It may make more sense to get a GeForce GTX 285 and a more powerful processor, or a more capable motherboard and faster RAM, or simply a more spacious, well-ventilated case and another large hard drive. In any case, you will have to put up with the noisy power regulator of the new GeForce GTX 285 and GeForce GTX 295, although this defect may be corrected in future revisions of the cards.

    As for the ZOTAC GeForce GTX 295, this video card completely repeats the reference accelerator, and its distinctive positive features include only a very good bundle and, perhaps, a slightly lower cost than its competitors.

    Advantages:

    • very high performance in gaming applications;
    • support for DirectX 10.0 (Shader Model 4.0) and OpenGL 2.1;
    • support for NVIDIA CUDA and NVIDIA PhysX technologies;
    • support for Quad SLI technology;
    • excellent kit.

    Disadvantages:

    • perceptible squeak of the power stabilizer.

    Peculiarities:

    • medium-noise and relatively efficient cooling system;
    • increased requirements for the power supply;
    • relatively high cost.

    We are grateful to ELKO Kyiv, the official distributor of Zotac International in Ukraine, for the video card provided for testing.


    Temperature and overclocking level of Inno3D GeForce GTX 295 Platinum

    The card's temperatures stayed at an acceptable level - the GPUs did not warm above 80-82 degrees Celsius - and the noise level, even at 94% of the maximum fan speed, was much lower than that of the turbine on the old-revision card.

    Testing

    Test configuration

    Testing was carried out on the following stand:

    • CPU:
    • Motherboard:
      • ASUS Rampage Formula (Intel X48)
    • RAM:
    • HDD:
      • Samsung HD252HJ (250 GB, SATA2)
    • Power Supply:
      • Silver Power SP-S850
    • Operating system:
      • Windows Vista Ultimate x86 SP1

    For adapters based on GeForce GPUs, the ForceWare 185.85 driver was used; for Radeon cards, Catalyst 9.4. GPU acceleration of physics effects was disabled except for testing in 3DMark Vantage.

    As test applications we used the synthetic 3DMark'06 and 3DMark Vantage packages from Futuremark Corp., Crytek's gaming masterpiece Crysis (run with the CrysisBenchmarkTool 1.05 utility, standard level), last year's Far Cry 2 (built-in benchmark), the benchmark based on S.T.A.L.K.E.R.: Clear Sky (Day mode), and the Cryostasis TechDemo benchmark, which demonstrates the capabilities of the game Cryostasis: Sleep of Reason.

    All measurements were carried out at a screen resolution of 1680x1050 (except for 3DMark'06 and 3DMark Vantage) to create a serious load on the video accelerators. Default settings were used for 3DMark; in CrysisBenchmarkTool 1.05 tests were run with the Very High profile; for Far Cry 2 the maximum settings were used; in the S.T.A.L.K.E.R. benchmark, ultra quality with DirectX 10 support was set; and the High preset was used in Cryostasis TechDemo. Anti-aliasing and anisotropic filtering were not forced in the drivers but were selected in the applications themselves.

    Both the average fps and the minimum fps are shown on the result graphs.
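
    For readers curious how those two numbers relate: a minimal sketch of deriving average and minimum fps from per-frame render times, the kind of data a frame-time logger such as FRAPS records. The sample frame times are invented for illustration.

```python
# Per-frame render times in milliseconds (invented sample data).
frame_times_ms = [16.7, 16.9, 40.0, 17.1, 16.8]

# Average fps: total frames divided by total elapsed time.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Minimum fps: the single slowest frame sets the floor.
min_fps = 1000.0 / max(frame_times_ms)

print(round(avg_fps, 1), round(min_fps, 1))
```

    A single 40 ms hiccup barely moves the average but drags the minimum down to 25 fps, which is why the minimum often tells more about perceived smoothness.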