NVIDIA GeForce drivers for the GeForce GTX 260: release notes and driver version comparison

This is a WHQL release from the Release 260 family of drivers. This driver package supports GeForce 6, 7, 8, 9, 100, 200, 300, and 400-series desktop GPUs as well as ION desktop GPUs.

New in Release 260.89

New GPU Support

  • Adds support for the newly released GeForce GPU.

Performance

  • Increases performance for GeForce GTX 400 Series GPUs in several PC games vs. v258.96 WHQL drivers. The following are examples of some of the most significant improvements measured on Windows 7. Results will vary depending on your GPU and system configuration:
  • GeForce GTX 480:

      • Up to 10% in StarCraft II (2560x1600 4xAA/16xAF Ultra)
      • Up to 14% in S.T.A.L.K.E.R.: Call of Pripyat (1920x1200 4xAA/16xAF)
      • Up to 16% in S.T.A.L.K.E.R.: Call of Pripyat (SLI – 2560x1600 4xAA/16xAF)
      • Up to 6% in Aliens vs. Predator (SLI – 1920x1200 noAA – Tessellation on)

    GeForce GTX 460:

      • Up to 19% in StarCraft II (SLI – 1920x1200 4xAA/16xAF Ultra)
      • Up to 15% in Battlefield Bad Company 2 (SLI – 2560x1600 4xAA/16xAF)
      • Up to 12% in S.T.A.L.K.E.R.: Call of Pripyat (2560x1600 noAA)
      • Up to 9% in Aliens vs. Predator (1680x1050 4xAA/16xAF – Tessellation on)
      • Up to 7% in Metro 2033 (1680x1050 noAA – Tessellation on)
      • Up to 11% in Dirt 2 (SLI – 2560x1600 4xAA/16xAF)
      • Up to 12% in Crysis:Warhead (SLI – 1920x1200 4xAA/16xAF Gamer)
      • Up to 13% in Far Cry 2 (2560x1600 4xAA/16xAF)
      • Up to 12% in H.A.W.X (SLI – 1920x1200 4xAA/16xAF SSAO Very High)
      • Up to 5% in Just Cause 2 (1920x1200 4xAA/16xAF)
      • Up to 22% in Riddick: Assault on Dark Athena (1920x1200 noAA)
      • Up to 5% in 3DMark Vantage (Extreme Preset)

Blu-ray 3D

  • Adds support for playing back Blu-ray 3D discs when connecting your GPU to an HDMI 1.4 3D TV. Playback requires a compatible software application from CyberLink, ArcSoft, Roxio, or Corel.

HD Audio

  • Adds lossless DTS-HD Master Audio and Dolby TrueHD audio bitstreaming support for compatible Blu-ray movies with GeForce GTX 460 GPUs*.
  • Adds high-definition 24-bit, 96 and 192 kHz multi-channel audio sampling rate support for compatible Blu-ray movies with GeForce GTX 400 Series, GT 240, GT 220, and 210 GPUs*.
  • Upgrades HD Audio driver to version 1.1.9.0.

*Note: A Blu-ray movie player update may be required to enable these new features; check with your movie player's software manufacturer for more details. These features are only supported on Windows 7.

Installation

  • New driver installer with enhanced user interface and new Express and Custom installation options.
    • Express – fast and easy one-click installation
    • Custom – customized installation
      • Option to perform a clean installation (completely removes older drivers from your system prior to installing the new driver).
      • Option to choose which driver components (e.g. PhysX or 3D Vision) to install.
    • Improved installation time for multi-GPU PCs.

NVIDIA Surround

  • Updated NVIDIA Surround setup wizard
    • After first setup, the wizard allows users to jump to any setup step.
    • Improved display connection diagrams and tooltips.
    • Improved UI for setup and arrangement of displays.
    • Improved bezel correction setup experience.
    • Adds a help page to highlight which in-game resolution to select (e.g. how to pick bezel-corrected resolutions).
    • Option to dedicate an extra GPU to PhysX or to drive an additional display.
    • Allows for portrait or landscape setup directly from the setup wizard.
  • Updated 3D Vision Surround and NVIDIA Surround game support list. Please visit NVIDIA's website for a full list of supported games.

NVIDIA 3D Vision

  • WHQL Certified driver
  • With Release 260 drivers, the installation process for 3D Vision has changed. Please view this knowledgebase article for more information on the changes.
  • Fixed an issue where the glasses would lose sync with the 3D Vision IR emitter, causing flicker and loss of the 3D effect.
  • Adds NVIDIA 3D Vision streaming support for Firefox 4 and Google Chrome web browsers.
  • Adds support for Sony's 3D Sweep Panorama picture format added to NVIDIA 3D Photo Viewer (Sony digital cameras that can capture 3D Sweep Panorama pictures include NEX-5/NEX-3, Alpha a560/a580 and Cyber-shot DSC-WX5/DSC-TX9 /DSC-T99 models).
  • Adds support for new 3D Vision Desktop LCD monitors: BenQ XL2410T and NEC F23W2A
  • Adds support for new 3D Vision projectors: Sanyo PDG-DWL2500 and ViewSonic PJD6251
  • Added 3D Vision game profiles for the following titles:
    • Arcania Gothic 4
    • Fallout: New Vegas
    • Ferrari Virtual Academy 2010 (new in 260.89)
    • Ferrari Virtual Race (new in 260.89)
    • FIFA 11
    • Formula 1 Racing
    • Final Fantasy XIV Benchmark
    • Guild Wars 2
    • Kane & Lynch 2 – Dog Days
    • Lead and Gold
    • Lego Harry Potter
    • Live For Speed
    • Lost Planet 2
    • Moonbase Alpha
    • Serious Sam HD – The Second Encounter
    • Shrek Forever After
    • Singularity
    • Virtua Tennis 2009
    • Virtua Tennis 3
  • Updated the following game profiles:
    • Civilization V – updated from v260.63 to 3D Vision Ready rating
    • Dead Rising 2 – updated from v260.63 to 3D Vision Ready rating
    • Drakensang: The Dark Eye – updated in-game compatibility text
    • Mafia II – updated profile to properly reflect its 3D Vision rating
    • StarCraft II – fixed profile to properly recognize the retail game executable name and match the 3D Vision rating of “Good”
    • Super Commander – fixed HUD elements
    • TRINE – new profile fixes that allow the game to be rated “3D Vision-Ready” when used with the TRINE patch v1.08, available via Steam.

NVIDIA SLI

  • Adds or enhances SLI profiles for the following PC games:
    • City of Heroes: Going Rogue
    • Alien Swarm
    • Dead Rising 2
    • Front Mission Evolved
    • Kane and Lynch 2: Dog Days
    • LEGO: Harry Potter

Other Improvements

  • Adds support for OpenGL 4.1 for GeForce 400 series GPUs.
  • Upgrades System Software to version 9.10.0514.
  • Improves compatibility for older PC games (DirectX 7 to DirectX 9) running on Windows 7 (examples: Gothic, Gothic II, Falcon 4.0: Allied Force, Links 2003, Independence War II - Edge of Chaos, and X2: Wolverine's Revenge).
  • Adds drag and drop display arrangement support to the “Set up multiple displays” page.
  • Includes numerous bug fixes. Refer to the release notes on the documentation tab for information about the key bug fixes in this release.
  • Users without US English operating systems can select their language and download the International driver.

Additional Information

  • Supports the new GPU-accelerated features in .
  • Supports GPU acceleration for smoother online HD videos with Adobe Flash 10.1.
  • Supports the new version of MotionDSP's video enhancement software, vReveal, which adds support for HD output. NVIDIA customers can download a free version of vReveal that supports up to SD output.
    GeForce 7 series:
    7950 GX2, 7950 GT, 7900 GTX, 7900 GT/GTO, 7900 GS, 7800 SLI, 7800 GTX, 7800 GS, 7650 GS, 7600 LE, 7600 GT, 7600 GS, 7550 LE, 7500 LE, 7350 LE, 7300 SE / 7200 GS, 7300 LE, 7300 GT, 7300 GS, 7150 / NVIDIA nForce 630i, 7100 GS, 7100 / NVIDIA nForce 630i, 7100 / NVIDIA nForce 620i, 7050 PV / NVIDIA nForce 630a, 7050 / NVIDIA nForce 630i, 7050 / NVIDIA nForce 610i, 7025 / NVIDIA nForce 630a

    GeForce 6 series:
    6800 XT, 6800 XE, 6800 Ultra, 6800 LE, 6800 GT, 6800 GS/XT, 6800 GS, 6800, 6700 XL, 6610 XL, 6600 VE, 6600 LE, 6600 GT, 6600, 6500, 6250, 6200 TurboCache, 6200SE TurboCache, 6200 LE, 6200 A-LE, 6200, 6150SE nForce 430, 6150LE / Quadro NVS 210S, 6150 LE, 6150, 6100 nForce 420, 6100 nForce 405, 6100 nForce 400, 6100

In their race to outpace each other, NVIDIA and ATI often sacrifice driver quality when releasing new graphics processors and the video cards based on them. Yet it is no secret that the NVIDIA GeForce ("ForceWare") and ATI Catalyst drivers are one of the determining factors in the success of new graphics solutions. As an example, recall that at the time the G200 and RV770 GPUs and their video cards launched, there were no official driver versions at all, and reviewers had to test the cards on various beta versions that were far from ideal - and often far from basic stability. As a result, the test results could hardly be called fully objective, and the spread in results obtained by different authors was significant even within a single game or test application.

The release of official certified driver versions does not always increase video card speed in all applications - most often it is aimed at fixing errors and supporting new games. Nevertheless, both NVIDIA and ATI routinely announce performance gains in certain games and tests, and in practice these claims are not confirmed as often as we would like. To try to identify the speed gains across different driver versions, I decided to prepare two separate articles comparing video card performance on several versions of the NVIDIA GeForce and ATI Catalyst drivers. In the next article, after the release of the promising Catalyst 8.12, I will test the Radeon HD 4870 on various driver versions. In today's article, we will check how much the NVIDIA GeForce drivers have evolved in this regard using the GeForce GTX 260 (the new version, with 216 unified shader processors) as an example. ZOTAC will help us with this, having kindly provided its GeForce GTX 260 AMP2! Edition video card for testing.

ZOTAC GeForce GTX 260 AMP2! Edition 896 MB video card review

ZOTAC International (MCO) Limited can still be considered a newcomer to the video card market, having entered this segment only last year. The ZOTAC GeForce GTX 260 AMP2! Edition 896 MB comes in a large, vertically oriented cardboard box. The front side depicts a formidable dragon with outstretched wings:



There you can also find some of the card's specifications, a label about the extended warranty period, and a sticker noting that the Race Driver GRID game comes bundled with the video card. The back of the box carries a description of the key features of the NVIDIA GPU, which mentions 192 unified shader processors, while the card tested today is equipped with 216. Apparently the box is left over from the older "AMP!" series of ZOTAC cards; the company simply applied an "AMP2!" sticker to the front.

Inside, the video card is securely fixed in a plastic shell in the central compartment, and around and under it you can find the following components:



I will list them:

adapter-splitter from S-Video output to component cable;
one DVI → HDMI adapter;
one DVI → D-Sub adapter;
instructions for installing a video card and drivers;
two cables for connecting additional power to the video card (6-pin connectors);
audio cable (S/PDIF, for audio output via HDMI interface);
CD with video card drivers;
DVD with the full version of the game Race Driver GRID.

The bundle sets this video card apart from competitors' products in two ways: the two auxiliary power cables (most cards come with just one) and the current, graphically rich game Race Driver GRID.

The external difference between the new video card and the reference GeForce GTX 260 is only in the sticker on the plastic casing of the cooling system, which depicts the same dragon as on the box:


The two-slot cooling system covers the card on three sides:






The dimensions of the video card are standard at 270 x 100 x 32 mm. The ZOTAC GeForce GTX 260 AMP2! Edition offers no frills in terms of connectors: a PCI-Express x16 2.0 interface, two dual-link DVI-I ports (with support for high resolutions), and an S-Video output next to the grille that exhausts air from the system unit case:



At the top of the video card there are two 6-pin connectors for connecting additional power, a connector for connecting an audio S/PDIF cable, as well as two MIO connectors for building SLI and 3-Way SLI systems from two or three identical video cards on NVIDIA GPUs:


Let me remind you that, according to the specifications, the peak power consumption of the GeForce GTX 260 (192 SP) is 182 W, so a 500 W power supply is recommended for a system with one such card, and 600 W or more for SLI configurations. For a card with 216 unified shader processors, the power requirements are not significantly higher. Of course, these official recommendations assume your system may contain a power-hungry processor and several hard drives; otherwise, the card will run on less powerful units - as our measurements show, a GeForce GTX 260 (192 SP) card by itself consumes about 136 W.
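As a rough illustration of how such PSU recommendations arise, here is a back-of-the-envelope sketch. Only the 182 W GPU figure comes from the article; the other component wattages and the 30% headroom are assumptions for illustration, not NVIDIA's methodology.

```python
# Hypothetical back-of-the-envelope PSU sizing. Only the 182 W GPU figure
# comes from the article; the other component draws are illustrative.
def recommended_psu_watts(component_watts, headroom=0.3):
    """Sum component draw and add a safety margin (30% by default)."""
    return sum(component_watts) * (1.0 + headroom)

system = [182, 130, 40, 30, 25]  # GPU, CPU, board+RAM, drives, fans (assumed)
print(round(recommended_psu_watts(system)))  # prints 529
```

With a lighter CPU and fewer drives, the sum lands well below 500 W, which matches the article's observation that the card runs fine on less powerful units.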

Let's see what a video card looks like without a cooling system:


Its power part:



The GPU of the ZOTAC GeForce GTX 260 AMP2! Edition was produced in Taiwan in week 27 of 2008 and is marked G200-103-A2:



The chip belongs to the second revision (A2). The number of unified shader processors is 216, texture units – 72, rasterization units – 28. The GPU frequencies, in contrast to the official 575/1242 MHz of the GeForce GTX 260, are raised to 648/1404 MHz – +12.7/+13.0%! Quite good, even if these are not record frequencies for factory-overclocked cards. The GPU voltage is 1.06 V. I will add that, to save energy and reduce heat, the GPU frequencies drop to 300/600 MHz in 2D mode.
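The factory-overclock percentages are simple ratio arithmetic; a minimal sketch (the helper name is ours, not NVIDIA's):

```python
# A quick check of the factory-overclock figures quoted above.
def overclock_percent(stock_mhz: float, factory_mhz: float) -> float:
    """Return the clock increase over stock as a percentage."""
    return (factory_mhz / stock_mhz - 1.0) * 100.0

core = overclock_percent(575, 648)      # core clock: 575 -> 648 MHz
shader = overclock_percent(1242, 1404)  # shader domain: 1242 -> 1404 MHz
print(f"core +{core:.1f}%, shader +{shader:.1f}%")  # prints: core +12.7%, shader +13.0%
```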

The 896 MB of GDDR3 video memory comprises 14 chips located on the front and back sides of the board. The memory bus is 448 bits wide. Like all GeForce GTX 260 cards we have previously examined, the ZOTAC card uses Hynix chips with a rated access time of 1.0 ns:


The rated frequency of chips labeled H5RS5223CFR NOC is 2000 MHz. According to the GeForce GTX 260 specifications, the memory should operate at an effective 1998 MHz, while on the ZOTAC GeForce GTX 260 AMP2! Edition it runs at 2106 MHz – only 5.4% above nominal. A modest bump, since some retail GeForce GTX 260 cards ship with even higher memory frequencies.
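The memory figures can be sanity-checked the same way: a 1.0 ns access time corresponds to a 1000 MHz real clock, i.e. 2000 MHz effective for double-data-rate GDDR3. A sketch (function names are ours):

```python
# Sketch of the memory-frequency arithmetic from the paragraph above.
def rated_effective_mhz(access_time_ns: float) -> float:
    """GDDR3 rated effective (DDR) frequency from chip access time."""
    real_clock_mhz = 1000.0 / access_time_ns  # 1.0 ns -> 1000 MHz real clock
    return real_clock_mhz * 2.0               # double data rate

def increase_percent(spec_mhz: float, actual_mhz: float) -> float:
    return (actual_mhz / spec_mhz - 1.0) * 100.0

print(rated_effective_mhz(1.0))                # prints 2000.0
print(round(increase_percent(1998, 2106), 1))  # prints 5.4
```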

Be that as it may, the specifications of the new video card are as follows:


Well, more detailed and complete characteristics can be seen in the table:


The cooling system has not changed compared to conventional cards based on GTX 280/260 chips:



I have always been surprised by the thick layer of thermal paste on the GPU. It is as if factory workers are paid piece-rate for the amount of thermal paste they use, just as Soviet collective-farm tractor drivers were once paid for the amount of diesel fuel consumed. Only our "smart guys" poured it into the ground at the edge of the field, while the hardworking Chinese diligently apply all the "proper" thermal interface onto the GPU heat spreader. The problem is that thermal paste should ideally not form a continuous layer between chip and heatsink; it should only smooth out the roughness of the metal by filling the irregularities - its thermal conductivity is high compared to air, but low compared to the metal of the heatsink.

Let's check the card's temperatures. Here and below, the load on the GPU and the card as a whole was created by ten cycles of the Firefly Forest test from the 3DMark 2006 synthetic graphics suite at 1920 x 1200 with 16x anisotropic filtering. Full-screen anti-aliasing was not enabled, since turning it on actually lowers GPU load and temperature. All tests were carried out in a closed Ascot 6AR2-B case (the fan configuration is given below in the testing methodology section). The room temperature during testing was 23.5 °C. Video card frequencies and temperatures were monitored with RivaTuner v2.20 (by Alexey Nikolaychuk). Because the card was disassembled before testing, the stock thermal interface on the GPU was replaced with high-performance Gelid GC1 thermal paste, applied in the thinnest possible layer.

Let's look at the test results in automatic turbine operation mode:


The GPU temperature did not exceed 80 °C, despite the card's increased frequencies and the low noise level of the cooler's fan, which spun up to only 1,860 rpm during testing out of a possible 3,200 rpm.

And here’s how the reference cooler cools the video card at 100% turbine power:


Here the temperatures are completely atypically low for a top-end video card. Interestingly, the voltage regulator current at peak load at lower video card temperatures turned out to be 5 A less than at higher temperatures.

Given the high efficiency of the reference cooler, overclocking potential was tested without replacing the cooling system, but the fan speed was manually fixed at 2,050 rpm - the highest speed that remained subjectively comfortable in terms of noise. As a result, the video memory overclocked to 2448 MHz (+22.5%) without loss of stability or image quality, but the GPU would not overclock at all:


Alas, the core of the ZOTAC card provided for testing already runs at the limit of its capabilities. Moreover, raising the core voltage from 1.06 V to 1.15 V in the card's BIOS did not improve the GPU's overclocking potential. What can I say? Probably just a trait of this particular sample, nothing more.

The temperatures of the card with the additional memory overclock are as follows:


At the time this material was completed, the recommended price of the ZOTAC GeForce GTX 260 AMP2! Edition 896 MB was 320-350 US dollars, but price cuts across NVIDIA's new lineup are expected soon, which will certainly affect ZOTAC products as well. At the time of writing, the street price of this card in Moscow stores was slightly above 10 thousand rubles.

Stages of evolution of GeForce drivers (ForceWare)

First of all, I must draw your attention to the fact that NVIDIA changed the name of its drivers this year from "ForceWare" to "GeForce", which, in my opinion, now only creates confusion between the names of the video cards and the names of the drivers. Well, we'll get used to it - it's not the first time. As of November 19, 2008 (the testing start date), NVIDIA had released countless beta drivers and five official certified versions for the GeForce GTX 260 and GTX 280 line, four of which we will examine today.

GeForce 177.41 (06/26/2008) – the second WHQL-certified driver for the GTX 280/260 line. Why not the first official version, 177.35? Because 177.41 was released just nine days after 177.35 and made no significant changes. Here, for example, is the official list of changes:

added support for GeForce GTX 280 and GeForce GTX 260 GPU;
supports one GPU and NVIDIA SLI technology on DirectX 9, DirectX 10 and OpenGL, including 3-way SLI technology with GeForce GTX 280 and GeForce GTX 260 GPUs;
added support for CUDA technology;
added support for the folding@home distributed computing system;
HybridPower (Hybrid SLI technology) support on the following GPUs and motherboards:

GeForce GTX 280 GPU;
GeForce GTX 260 GPU;
nForce 780a SLI;
nForce 750a SLI;


support for GPU overclocking and temperature monitoring when installing NVIDIA System Tools (minor bugs fixed).

As you can see, no changes were announced in terms of increased performance of video cards in the new (at that time) driver. In addition to the above, a couple of bugs were fixed under Windows Vista x64 - that's all.

GeForce 178.13 (09/25/2008) – the third WHQL-certified driver for the GeForce GTX 280/260, released after an unusually long break for NVIDIA (almost three months). The more interesting changes include the following:

added support for NVIDIA PhysX acceleration for all GeForce 8, 9 and 200 series GPUs with at least 256 MB of video memory (PhysX version 8.09.04 is now packaged with the driver);
added support for 2-way and 3-way NVIDIA SLI technology for GeForce GTX 200 series GPUs on Intel D5400 XS motherboards;


improved performance in the following 3D applications on a single GPU:

up to 11% in 3DMark Vantage;
up to 15% in BioShock (DX10);
up to 15% in Call of Duty 4;

improved performance in the following 3D applications with 2-way SLI:

up to 7% in BioShock (DX10);
up to 10% in World in Conflict (DX10);

eliminated various compatibility problems with 3D applications.

In addition, the notes for this version talk about fixing some errors in games for a single video card and SLI connections. By the way, this version of the GeForce driver was characterized by many users and testers as the most stable and problem-free.

GeForce 178.24 (10/15/2008) – released just 20 days after version 178.13, also WHQL-certified. Despite the short interval, there are more than enough changes. I will list the key ones related to game performance:

Improved performance in the following 3D applications on a single GPU:

up to 11% in 3DMark Vantage;
up to 11% in Assassin's Creed (DX10);
up to 15% in BioShock (DX10);
up to 15% in Call of Duty 4;
up to 8% in Enemy Territory: Quake Wars;


Improved performance in the following 3D applications with 2-way SLI:

up to 7% in BioShock (DX10);
up to 10% in Company of Heroes: Opposing Fronts (DX10);
up to 12% in Enemy Territory: Quake Wars;
up to 10% in World in Conflict (DX10).

Attentive readers will surely notice that all these optimizations are identical to those in driver 178.13. That is not my mistake: the official site states that all the gains are measured against beta driver 178.19, which came out after 178.13. A curious point - it implies that NVIDIA, across two drivers, raised the speed of its cards twice by an equal amount in the games listed above. Or rather, should have raised it...

Beyond the speed optimizations, the new driver brought fixes for DVI-HDMI devices and Hybrid SLI mode, and corrected errors in the World in Conflict game menu when full-screen anti-aliasing was enabled on a GeForce 6600 (NVIDIA must be joking here - that card could never run such a demanding game at acceptable performance anyway). As with 178.13, PhysX libraries version 8.09.04 are bundled with this driver.

GeForce 180.48 (11/19/2008) – the latest certified GeForce driver for NVIDIA-based video cards. This long-awaited and pre-advertised driver, which opens the new 180 series, was expected to bring not only the traditional bug fixes but also significant changes and a noticeable performance increase. In more detail:

new features:

certification of NVIDIA SLI technology for motherboards based on Intel X58 chipsets (Intel Core i7 processors) for GeForce GTX 280, GeForce GTX 260, GeForce 9800 GX2, GeForce 9800 GTX+ and GeForce 9800 GTX graphics processors;
added the ability to connect multiple monitors to video cards combined in an SLI connection, both in desktop and 3D modes;
PhysX technology now allows physics acceleration on GeForce 8, 9 and 200 series video cards;


performance gains are claimed for the following 3D applications:

up to 10% in 3DMark Vantage;
up to 13% in Assassin's Creed;
up to 13% in BioShock;
up to 15% in Company of Heroes: Opposing Fronts;
up to 10% in Crysis Warhead;
up to 25% in Devil May Cry 4;
up to 38% in Far Cry 2;
up to 18% in Race Driver: GRID;
up to 80% in Lost Planet: Colonies;
up to 18% in World in Conflict.

Particularly impressive is the claimed 80% speed increase in the fresh game Lost Planet: Colonies - in other words, where there were 100 fps before the 180.48 driver, there should now be 180 fps. That said, every claimed gain on the official page carries the postscript "results may vary on different configurations" - that is, nothing is guaranteed. Still, the changes announced in the new driver are very interesting. Will they hold up in practice? Let's check.

Test configuration, tools and testing methodology

Testing of the video card and different driver versions was carried out on a computer with the following configuration:

Motherboard: DFI LANParty DK X48-T2RS (Intel X48, LGA 775, BIOS 03/10/2008);
Processor: Intel Core 2 Extreme QX9650 (3.0 GHz, 1.25 V, L2 2 x 6 MB, FSB 333 MHz x 4, Yorkfield, C0);
CPU cooling system: Thermalright SI-128 SE with Scythe Ultra Kaze fan (1320 rpm);
Thermal interface: Gelid GC1;
RAM:

2 x 1024 MB DDR2 Corsair Dominator TWIN2X2048-9136C5D (Spec: 1142 MHz / 5-5-5-18 / 2.1 V);
2 x 1024 MB DDR2 CSX DIABLO CSXO-XAC-1200-2GB-KIT (Spec: 1200 MHz / 5-5-5-16 / 2.4 V);


Disk subsystem: SATA-II 300 GB, Western Digital VelociRaptor, 10,000 rpm, 16 MB, NCQ;
HDD cooling and sound insulation system: Scythe Quiet Drive;
Drive: SATA DVD RAM & DVD±R/RW & CD±RW Samsung SH-S183L;
Case: Ascot 6AR2-B (120 mm Scythe Slip Stream case fans at 960 rpm on silicone studs are installed for intake and exhaust; the same fan at 960 rpm is installed on the side wall);
Control and monitoring panel: Zalman ZM-MFC2;
Power supply: Thermaltake Toughpower 1500 W, W0218 (standard 140 mm fan);
Monitor: 24" BenQ FP241W (1920 x 1200 / 60 Hz).

To reduce the tested video card's dependence on platform speed, the quad-core processor was overclocked to 4.0 GHz at 1.575 V:


During the tests, the RAM operated with timings tightened to 5-4-4-12 at Performance Level = 6 and a voltage of 2.175 V.

All tests were carried out on the Windows Vista Ultimate Edition x86 SP1 operating system (plus all critical updates as of November 10, 2008). The start date of the tests was November 19, 2008, so the following drivers available at that time were used:

motherboard chipset: Intel Chipset Drivers 9.1.0.1007;
DirectX libraries: November 2008.

Each GeForce/ForceWare driver had its own PhysX package installed, available at the time the driver was released, or included directly in the driver. The drivers were tested in the same sequence in which they were released. Each of the drivers and the PhysX package with them were installed only after removing the drivers of the previous version and additionally cleaning the system using the Driver Sweeper v1.5.5 program.

After installing each driver, the following control panel settings were changed: graphics quality from "Quality" to "High Quality", Transparency antialiasing from "Disable" to "Multisampling", and vertical sync was forced off ("Force off"). No other changes were made. Anisotropic filtering and full-screen anti-aliasing were enabled directly in the game settings; where a game did not expose these settings, they were adjusted in the GeForce driver control panel.

Testing of video cards was carried out in two resolutions: 1280 x 1024/960 and widescreen 1920 x 1200. The following set of applications was used for testing, consisting of two synthetic tests, one techno demo and twelve games of different genres:

3DMark 2006(Direct3D 9/10) – build 1.1.0, default settings and 1920 x 1200 + AF16x + AA4x;
3DMark Vantage(Direct3D 10) – v1.0.1, “Performance” and “Extreme” profile (only basic tests were tested);
Unigine Tropics Demo(Direct3D 10) – v1.1, built-in demo test, maximum quality settings, resolution 1280 x 1024 without techniques and resolution 1920 x 1200 with AF16x and AA4x;
World in Conflict(Direct3D 10) – game version 1.0.0.9(b89), graphics quality profile “Very High”, but UI texture quality = Compressed; Water reflection size = 512; DirectX 10 rendering activated;
Enemy Territory: Quake Wars(OpenGL 2.0) – game version 1.5, maximum graphics settings, demo at the “Salvage” level, Finland;
Call of Duty 4: Modern Warfare (Direct3D 9) – game version 1.7.568, graphics and texture settings at the "Extra" level, demo "d3" on the "Bog" level;
Unreal Tournament 3 (Direct3D 9) – game version 1.3, maximum in-game graphics settings (level 5), Motion Blur and Hardware Physics activated, the "Fly By" scene tested on the "DM-ShangriLa" level (two consecutive cycles), using HardwareOC UT3 Bench v1.3.0.0;
Devil May Cry 4 (Direct3D 10) – game version 1.0, maximum graphics quality settings ("Super High"), the result taken as the average of two consecutive runs of the second test scene;
S.T.A.L.K.E.R.: Clear Sky(Direct3D 10) – game version 1.5.07, quality settings profile “Enhanced full illumination DX10” plus anisotropic filtering at 16x level and other maximum graphics quality settings, used our own demo recording “s04” (triple test cycle);
Crysis WARHEAD(Direct3D 10) – game version 1.1.1.690, “Very High” settings profile, double test cycle of the video card at the “Frost” level from the test HardwareOC Crysis WARHEAD Bench v1.1.1.0;
Far Cry 2(Direct3D 10) – game version 1.00, settings profile “Ultra High”, double cycle of the “Ranch Small” test from the Far Cry 2 Benchmark Tool (v1.0.0.1);
X3: Terran Conflict (Direct3D 10) – game version 1.2.0.0, maximum texture and shadow quality, fog enabled, the "More Dynamic Light Sources" and "Ship Color Variations" options enabled, the result taken as the average speed over one run of all four demos;
Left 4 Dead (Direct3D 9) – game version 1.0.0.5, maximum quality, the "meat" demo tested (two passes) on the third chapter of the "No Mercy" campaign;
Lost Planet: Colonies(Direct3D 10) – game version 1.0, graphics level “Maximum quality”, HDR Rendering DX10, built-in test consisting of two scenes.

The last game is of little interest nowadays, since it is quite old (by gaming-industry standards). However, I added it to the list because the NVIDIA GeForce release notes claim a performance increase of up to 80% in Lost Planet: Colonies with driver 180.48! I wonder how realistic that figure is.

Testing in each application was carried out twice (not to be confused with the two passes of a demo recording within a run!). The better speed result (or score) of the two testing cycles was taken as final, and that is what you will see in the charts. Let's move on to the test results and their analysis.
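The scoring rule described above can be sketched in a couple of lines (function name is ours):

```python
# Keep the best result from the repeated full testing cycles, as described above.
def best_of_runs(results):
    """Return the best (highest) fps or score from repeated test cycles."""
    return max(results)

print(best_of_runs([47.3, 48.1]))  # prints 48.1
```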

Test results

The diagrams list the drivers in order of their release. The first driver - version 177.41 - is highlighted in yellow, the two drivers of the 178.xx series are indicated in blue-green, and the new 180.48 is marked in dark blue. Thus, the list of drivers in the tests looks like this:

GeForce 177.41;
GeForce 178.13;
GeForce 178.24;
GeForce 180.48.

Before moving on to the results, one very important point needs attention. Driver version 180.48 presented an unpleasant "surprise": the still very popular (if not the most popular) resolution of 1280 x 1024 disappeared, and games and tests offered 1280 x 960 instead. It also proved impossible to add 1280 x 1024 as a custom resolution in the driver control panel - every attempt to test it failed. There was no time to wait for a response from NVIDIA support, to which a request was sent. Alternatively, I could simply have replaced 1280 x 1024 with 1280 x 960 for all the other driver versions, but, as you remember, 180.48 was tested last, and the prospect of repeating the full test cycle on the three previous driver versions did not appeal to me at all. However, 1280 x 960 contains only 6.25% fewer pixels than 1280 x 1024, and video card performance does not scale linearly with resolution anyway. In addition, we have the identical and less processor-dependent 1920 x 1200 results for all drivers, which I suggest focusing on.
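The pixel-count comparison is easy to verify (a minimal sketch):

```python
# How much smaller 1280 x 960 is than 1280 x 1024 in pixel count.
def pixel_deficit_percent(w1: int, h1: int, w2: int, h2: int) -> float:
    """Percentage by which resolution 2 has fewer pixels than resolution 1."""
    return (1.0 - (w2 * h2) / (w1 * h1)) * 100.0

print(pixel_deficit_percent(1280, 1024, 1280, 960))  # prints 6.25
```

Since the width is unchanged, the deficit reduces to 1 - 960/1024 = 64/1024 = 6.25%.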

First, the results of testing video card drivers in two synthetic tests:

3DMark 2006



In 3DMark 2006 there is no meaningful performance increase when moving to newer driver versions. The 178-series drivers brought a slight improvement over version 177.41, while 180.48 leads only thanks to its slightly lower 1280 x 960 resolution and is equal to the others at 1920 x 1200.

3DMark Vantage



The three versions preceding 180.48 perform identically in 3DMark Vantage, but the latest official driver, 180.48, shows a slight speed increase at both resolutions. 3DMark Vantage does not calculate an overall score at 1280 x 960 pixels, so only the GPU Score data is included.

Unigine Tropics Demo

Note that this article uses the new version 1.1 of this beautiful demo. The settings were as follows (resolution, AF, and AA were varied):



Let's look at the results:



In the test from the new version of the Unigine Tropics demo, driver version 180.48 also allows the video card to run a little faster.

So, in two of the three synthetic tests we observed an increase in video card performance with the new 180.48 drivers, albeit a very small one. How will things look in real games?

World in Conflict






In the mode without anti-aliasing and anisotropic filtering, all three drivers released after 177.41 show a performance increase. The latest driver, version 180.48, also gains in the modes with full-screen anti-aliasing and anisotropic filtering, coming quite close to the promised 18 percent.

Enemy Territory: Quake Wars







Call of Duty 4: Modern Warfare MP







Unreal Tournament 3






In the three games tested above - Enemy Territory: Quake Wars, Call of Duty 4: Modern Warfare MP, and Unreal Tournament 3 - no difference in GeForce GTX 260 performance across driver versions was observed (recall that 180.48 gets a slight head start from the 1280 x 960 resolution). Although performance gains were promised in Enemy Territory: Quake Wars and Call of Duty 4: Modern Warfare MP, they apparently apply only to certain other configurations.

Devil May Cry 4






In Devil May Cry 4, however, a significant jump in performance with driver 180.48 is obvious. Both with full-screen anti-aliasing and anisotropic filtering enabled and without them, the GeForce 180.48 driver lets the card demonstrate a higher average speed than its predecessors.

S.T.A.L.K.E.R.: Clear Sky



There is a subtle change in performance depending on driver versions. It's hardly worth paying attention to.

Crysis WARHEAD

The Crysis WARHEAD game settings look like this:


Results:






GeForce 180.48 is 1-2 frames per second faster in Crysis WARHEAD, which in practice is, of course, completely unnoticeable in the game.

Far Cry 2






The results without graphics quality enhancements are of no particular interest, but with full-screen anti-aliasing and anisotropic filtering enabled, GeForce 180.48 leaps ahead. This is nowhere near the promised 38 percent increase, but it is still interesting. Fortunately, no difference in picture quality between driver versions could be detected.

X3: Terran Conflict

In the new game X3: Terran Conflict, testing was carried out with the following settings (resolution, AF and AA were changed):


The test results did not bring any surprises:







Left 4 Dead

Since this new game is being tested for the first time, here are its detailed settings:


Screen resolution, anisotropic filtering, and full-screen anti-aliasing were varied depending on the testing mode. Since the game engine is not too demanding for modern video cards, 8x MSAA was used instead of the more common 4x full-screen anti-aliasing.

The test was carried out on the first stage of the “No Mercy” campaign, with its numerous explosions, rapidly dying infected, and other effects:



Let's look at the results:






As we remember, the GeForce driver release notes make no mention of performance gains in Left 4 Dead. Moreover, at the time all driver versions up to and including 178.24 were released, the game was not yet on the market. Nevertheless, the increase is quite noticeable, occurring in steps from the 177.xx series to 178.xx and on to 180.xx. No changes in picture quality were found, either in motion or when closely studying screenshots.

Lost Planet: Colonies

Since testing in this game was carried out after the main block of tests was completed, the 1280 x 960 resolution was used not only with driver 180.48 but also with the three previous GeForce versions. Results:






Indeed, both test scenes of Lost Planet: Colonies show a slight performance increase with newer driver versions, but it falls far short of the 80% claimed for 180.48.

Graphics quality and speed using 3DMark 2006 as an example

First of all, a caveat: this section of the article is intended as a supplement and does not pretend to be a full-fledged study. Here I propose to evaluate how GeForce GTX 260 (216 SP) performance in the synthetic 3DMark 2006 suite degrades depending on the graphics quality mode, and to evaluate the quality itself. To do this, 3DMark 2006 was run sixteen times: twice in each of eight configurations, with the quality setting in the GeForce driver control panel stepped from “High Performance” to “High Quality”, and then, in 3DMark 2006 itself, anisotropic filtering and three levels of full-screen anti-aliasing (2x, 4x, and 8x) enabled in sequence. Let me remind you that 3DMark 2006 does not respond to anti-aliasing enabled from the GeForce driver control panel, so other multisampling types were not tested. This particular synthetic suite was chosen for its almost unique ability to capture a screenshot of a given frame, which is essential for this kind of comparison.

Tests were carried out using GeForce driver version 180.48. First, let's look at the diagram showing the speed change depending on the quality mode:



Obviously, the quality mode selected in the driver itself does not affect speed at all: 3DMark 2006 scores vary only within the margin of error, in one direction or the other, at both resolutions. Enabling anisotropic filtering and, especially, full-screen anti-aliasing does reduce speed, which is entirely predictable.

Now let's look at how the picture quality changes with different settings in the GeForce driver control panel. For this, frame number 1350 from the “Canyon Flight” scene was selected in the maximum monitor resolution of 1920 x 1200:


High Performance / Performance


Quality / High Quality


To examine the details, it is more convenient to download all the screenshots at once and flip between them in an image viewer (ACDSee, for example). The first two screenshots, in the “High Performance” and “Performance” modes, do not differ in quality at all; at least I could not find any differences. The second pair, “Quality” and “High Quality”, are likewise identical to each other, but do differ from the first pair: at the “Quality” and “High Quality” settings the picture is a little darker, and the background does not look as white as in the first pair of screenshots.

Now let's see how the picture changes when various techniques for improving graphics quality are activated, be it anti-aliasing or anisotropic filtering:


HQ + AF16x / HQ + AF16x + AA2x


HQ + AF16x + AA4x / HQ + AF16x + AA8x


Enabling 16x anisotropic filtering literally transforms the image! Textures appear on the body of the sea monster and on the rocks, the planks on the airship's rudder and keel become visible, and the surface of the balloon and its fins begin to look more natural and detailed. In short, turning anisotropic filtering off is strongly discouraged. In the remaining three screenshots, full-screen anti-aliasing gradually raises image quality to its maximum: the smoothed cables and edges of the ship, the propeller guards, and the fins of the balloon from which the ship is suspended are all the work of multisampling. As for the difference between anti-aliasing levels, the step from 2x AA to 4x AA is very noticeable, while the further transition to 8x AA is far less obvious.

Initially, I planned to conduct similar testing in the new game Crysis WARHEAD, but the attempt failed. Although the HardwareOC Crysis WARHEAD Bench utility can capture a screenshot of a given frame of a demo, the screenshots came out slightly blurred at the edges and could be compared for graphics quality only with a stretch. I went through all 13 demos built into the test, but the captured frames were always offset. Alternatively, screenshots could be taken immediately after loading a save, but then the image was not static, so an accurate quality comparison was impossible. This did not, however, prevent tracking how the card's performance in the game changes with the quality mode:



Among the peculiarities of testing in Crysis WARHEAD, note that anisotropic filtering does not affect the already low frame rate on the GeForce GTX 260, and that the results in the various full-screen anti-aliasing modes are nearly equal.

Conclusion

Although the performance gains promised by NVIDIA for the gradual transition to newer GeForce drivers were not confirmed under my conditions, today's testing did show speed increases in World in Conflict, Devil May Cry 4, Crysis WARHEAD (a very small increase), Far Cry 2, Left 4 Dead, and Lost Planet: Colonies, as well as in the 3DMark Vantage and Unigine Tropics Demo test applications. So installing new video card drivers undoubtedly makes sense. Note also that some old games mentioned in the driver release notes were not tested here, and some of the new games listed there simply lack sufficiently accurate means of measuring performance. On other configurations and operating systems the gains may differ, both up and down. Remember, too, that new drivers are meant not only to improve performance but also, frequently, to fix bugs in new games, which is another argument in favor of installing new versions.

Finally, a few words about the tested card. The ZOTAC GeForce GTX 260 AMP2! Edition turned out to be an interesting product with expressive packaging, a very complete bundle, factory-raised frequencies, and a highly efficient, quiet cooling system. Unfortunately, the GPU showed no overclocking headroom at all, although the video memory overclocked quite successfully. The GeForce GTX 260 AMP2! Edition is priced comparably to competing products. The range of GeForce GTX 260 cards on the market is quite extensive, so choosing a model that suits you will not be difficult.

In our next article on this topic, I plan to tell you about the evolution of the speed of ATI Catalyst drivers using the example of the Radeon HD 4870 1024 MB video card.

P.S. We thank the Russian representative office of ZOTAC and the 3Logic company, as well as Nadezhda Dymova personally for providing the video card for testing.

Other materials on this topic


Fallout 3 and modern video cards: expectations and reality
Inexpensive gaming video card ATI Radeon HD 4830
Choosing a video card for games: fall-winter 2008

The NVIDIA GeForce GTX 260 graphics adapter first appeared on the market in 2008 and has long since become outdated. It is still ahead of almost all integrated graphics, however, and can run even some modern games, albeit at minimum settings. The model was not the flagship at the time of its release, but it compared favorably with other offerings, nearly matching the specifications of the older, flagship GTX 280.

Having received from the manufacturer not the worst characteristics at that time, the GTX 260 allowed the computer on which it was installed to be used for the following purposes:

  • to run games and other graphic applications that support DirectX 10.0 or OpenGL 2.1 technology;
  • to watch video in Full HD format via the TV-Out output;
  • to output images at 2048x1536 and 2560x1600 when connected to monitors via VGA and HDMI adapters, respectively (even though the card itself carries only DVI-I outputs).

The card's 896 MB of GDDR3 video memory and 999 (1998) MHz memory frequency were not impressive even against its competitors in 2008. However, the 448-bit memory bus made the adapter more powerful than some more recent 1 or 2 GB office models such as the GT 610 or GT 720. In tests it performs roughly on par with integrated Intel HD 630 graphics.

The parameters, quite good at the time of the model's release, meant a high price: the GTX 260 started at around $400. Now, ten years later, the adapter can be had far more cheaply; on the secondary market, an NVIDIA GeForce GTX 260 starts at about one and a half thousand rubles.

GTX 260 review

The appearance of the graphics adapter, regardless of manufacturer, remains impressive and is reminiscent of modern cards. Among the card's features are:

  • support for PhysX game effects, CUDA general-purpose computing, DirectX, and OpenCL, which also makes the card suitable for video encoding and editing;
  • the ability to pair it with one or two identical adapters when installed on an SLI-compatible motherboard;
  • a certain margin of capability for working with 3D images if you have a compatible monitor.

On the other hand, the card's capabilities largely depend on the computer's central processor and the amount of RAM. Good performance will be achieved on a machine with an Intel Core i5 processor and 4-8 GB of RAM.

The disadvantages include the GTX 260's power consumption of up to 182 W: because of it, the cooling system is quite noisy, and normal operation requires a powerful power supply.

Another serious drawback is the lack of support for DirectX 11, which is why some new games may simply not run, even if the card meets their minimum requirements.

What kind of power supply is needed for GTX 260

Providing almost 200 W just for the video card calls for a fairly powerful power supply. The recommended PSU for a basic gaming computer is 500 W; the minimum acceptable is 450 W.

Such requirements rule out installing the card in a typical office PC equipped with a 300-400 W unit. So even when assembling an inexpensive system from used components, it is worth providing a sufficiently powerful supply.

How to overclock an NVIDIA GeForce GTX 260 video card

To improve graphics performance, the GTX 260 can be overclocked using utilities such as FireStorm or RivaTuner. Manufacturers' own tools (for example, from Zotac or Palit) can also be used.

After overclocking, the NVIDIA GeForce GTX 260's operating frequency increases by 8-16%. In-game performance rises accordingly, making gameplay more comfortable.
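To put the 8-16% range in absolute terms, here is a tiny sketch assuming the 576 MHz reference core clock of the GTX 260 (factory-overclocked cards ship higher):

```python
# Rough illustration of what an 8-16% overclock means in absolute terms.
# 576 MHz is the GTX 260 reference core clock (an assumption for this
# sketch; individual cards may ship at higher factory clocks).
stock_core_mhz = 576

for gain in (0.08, 0.16):
    print(f"+{gain:.0%}: {stock_core_mhz * (1 + gain):.0f} MHz")
# +8%: 622 MHz
# +16%: 668 MHz
```

Whether a given sample reaches the upper end depends on cooling and silicon quality, as the AMP2! Edition tested above demonstrates.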

Given its high power consumption and relatively low speed, the adapter is not worth using for cryptocurrency mining: even after overclocking, mining on a GTX 260 earns about as much as the electricity it burns.

It is therefore not advisable to build a mining farm from such GPUs. More modern cards with lower power consumption and larger, faster memory are better suited for that.
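A back-of-the-envelope estimate shows why the electricity bill dominates; the tariff below is an assumed figure for illustration only:

```python
# Daily electricity cost of running a GTX 260 flat out.
# The tariff is an assumption for illustration; real prices vary widely.
card_watts = 182       # peak power draw quoted above
tariff_usd_per_kwh = 0.10

kwh_per_day = card_watts * 24 / 1000
cost_per_day = kwh_per_day * tariff_usd_per_kwh
print(f"{kwh_per_day:.2f} kWh/day -> ${cost_per_day:.2f}/day")
# 4.37 kWh/day -> $0.44/day
```

Any card whose daily mining revenue sits near this figure is, at best, breaking even.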

Game testing

Testing the GeForce GTX 260 in games yields the following figures:

  • in the shooter Call of Duty 4, frame rates range from 60 to 130 FPS depending on the settings;
  • in the first Assassin's Creed the card delivers 46 to 102 FPS;
  • in Unreal Tournament 3 the figure rises to 170 FPS at a resolution of 1280x1024;
  • in Crysis the frame rate can be 30, 60, or 80 FPS, depending on settings and format.

In more modern games the performance is less impressive. Most current shooters and action games will not run acceptably on it; at best, strategies or MMORPGs will.

The GTX 260 is considered the minimum acceptable video card for the online game Total War: Arena; it also easily runs Skyrim, the popular fifth installment of The Elder Scrolls series, and even Fallout 4.

However, it is not recommended to try The Witcher 3 or GTA V: they will launch, but the gameplay can hardly be called comfortable.

How to update NVIDIA GeForce GTX 260 drivers

The need to download drivers for the NVIDIA GeForce GTX 260 may arise when problems occur with software already installed on the computer. You will also have to find drivers for the card when reinstalling Windows (or another supported operating system).

To download a driver, it is recommended to use the official NVIDIA website. There you can also get the GeForce Experience utility, which lets you not only optimize the card's settings but also enhance its functionality in games.