This page contains general information about [[NVIDIA]]'s [[Graphics processing unit|GPUs]] and video cards based on official NVIDIA specifications. ==DirectX version note== The [[DirectX]] version indicates which graphics acceleration operations the card supports. * DirectX 6.0 - [[Texture mapping|Multitexturing]] * DirectX 7.0 - Hardware [[Transform and lighting|Transformation, Clipping and Lighting]] (TCL/T&L) * DirectX 8.0 - [[Shader]] Model 1.1 * DirectX 8.1 - Pixel Shader 1.3/1.4 & Vertex Shader 1.1 * DirectX 9.0/9.0a - Shader Model 2.0 or Shader Model 2.0 extended * DirectX 9.0b - Pixel Shader 2.0a/b & Vertex Shader 2.0a/b * DirectX 9.0c - Shader Model 3.0, [[GPGPU]] * DirectX 9.0L - Windows Vista only; Vista version of DirectX 9.0c, Shader Model 3.0, Windows Graphics Foundation 1.0, DXVA 1.0, [[GPGPU]] * Direct3D 10 - Windows Vista only; Shader Model 4.0, Windows Graphics Foundation 2.0, DXVA 2.0, [[GPGPU]] * Direct3D 10.1 - Windows Vista only; Shader Model 4.1, Windows Graphics Foundation 2.1, DXVA 2.1, [[GPGPU]] ==OpenGL version note== The [[OpenGL]] version indicates which graphics acceleration operations the card supports. * OpenGL 1.1 - texture objects * OpenGL 1.2 - 3D textures, BGRA and [[packed pixel]] formats<ref name='Gamedev1929'>{{cite news | first=Dave | last=Astle | title=Moving Beyond OpenGL 1.1 for Windows | date=2003-04-01 | url=http://www.gamedev.net/reference/articles/article1929.asp | work=gamedev.net | accessdate=2007-11-15}}</ref> * OpenGL 1.3 - [[Texture mapping|Multitexturing]], multisampling, texture compression * OpenGL 1.4 - NV_register_combiners2<!-- unverified; ARB_vertex_program remained an extension rather than a core feature --> * OpenGL 1.5<!-- GLSL is an OpenGL 2.0 feature, not a 1.5 feature --> * OpenGL 2.0 - GLSL * OpenGL 2.1 - GLSL with improved capabilities<!-- geometry shaders are still EXT/NV extensions, not core --> * OpenGL 3.0 ==Field Explanations== The fields in the tables below describe the following: * '''Model''' - The marketing name for the processor, assigned by NVIDIA. * '''Year''' - Year of release of the processor. * '''Code Name''' - The internal engineering codename for the processor (typically designated NVXY, and later GXY, where X is the series number and Y is the schedule of the project within that generation). * '''Fab''' - Fabrication process: the average feature size of the components of the processor. * '''Bus interface''' - Bus by which the graphics processor is attached to the system (typically an expansion slot, such as PCI, AGP, or PCI Express). * '''Memory max''' - The maximum amount of graphics memory supported by the processor. * '''Core Clock max''' - The maximum factory core clock frequency. Because board manufacturers may ship cards clocked higher or lower, this is always the reference clock specified by NVIDIA. * '''Memory Clock max''' - The maximum factory memory clock frequency. As with the core clock, this is always the reference clock specified by NVIDIA. * '''Config Core''' - The layout of the graphics pipeline, in terms of functional units.
Over time the number, type, and variety of functional units in the GPU core have changed significantly; before each table there is an explanation of which functional units are present in that generation of processors. In later models, shaders are integrated into a unified shader architecture, where any one shader can perform any of the functions listed. * '''Fillrate''' - Maximum theoretical fillrate in textured pixels per second. This number is generally quoted as a maximum throughput figure for the GPU; a higher fillrate usually corresponds to a more powerful (and faster) GPU. * '''Memory Subsection''' ** '''Bandwidth max''' - Maximum theoretical bandwidth for the processor at the factory memory clock and factory bus width. GB = 10^9 bytes. ** '''Bus Type''' - Type of memory bus or buses utilized. ** '''Bus Width''' - Maximum bit width of the memory bus or buses utilized. This is always the factory bus width. * '''Graphics Library Support Section''' ** '''DirectX''' - Maximum version of Direct3D fully supported. ** '''OpenGL''' - Maximum version of OpenGL fully supported. * '''Features''' - Additional features that are not a standard part of the two graphics libraries. ==Comparison Table: Desktop GPUs== ===Pre-GeForce=== *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] {| class ="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | [[NV1|NV1 / STG-2000]] | Sep 1995 | NV1 | ? | [[Peripheral Component Interconnect|PCI]] | 2/4 | ? | 75 | ? | 12 | 0.6 | [[Dynamic random access memory|EDO]]/[[Dynamic random access memory|VRAM]] | 64 | n/a | n/a | 2D, 3D (NURBS), Game Port, AV playback |- ! style="text-align:left;" | [[RIVA 128]] | Apr 1997 | NV3 | 350 | [[Accelerated Graphics Port|AGP]] 1x, PCI | 4 | 100 | 100 | 0:1:1:1 | 100 | 1.6 | [[SDRAM|SDR]] | 128 | 5 | ? | First DirectX compatible |- ! style="text-align:left;" | RIVA 128 ZX | Mar 1998 | NV3 | 350 | AGP 2x, PCI | 8 | 100 | 100 | 0:1:1:1 | 100 | 1.6 | SDR | 128 | 5 | 1.0 | |- ! style="text-align:left;" | [[RIVA TNT]] | 1998? | NV4 | 350 | AGP 2x, PCI | 8/16 | 90 | 110 | 0:2:2:2 | 180 | 1.7 | SDR/SG | 128 | 6 | 1.1 | AGP sideband (rev.4) |- ! style="text-align:left;" | Vanta | 1999? | NV6 | 250 | AGP 4x, PCI | 16 | 100 | 110 | 0:2:2:2 | 200 | 1.0 | SDR | 64 | 6 | 1.1 | [[RAMDAC]] 250&nbsp;MHz |- ! style="text-align:left;" | Vanta LT | 1999? | NV6 | 250 | AGP 4x | 8 | 80 | 100 | 0:2:2:2 | 160 | 0.8 | SDR | 64 | 6 | 1.1 | ? |- ! style="text-align:left;" | [[RIVA TNT2]] M64 | Jul 1999 | NV6 | 220 | AGP 4x, PCI | 32 | 125 | 135 | 0:2:2:2 | 250 | 1.2 | SDR | 64 | 6 | 1.1 | RAMDAC 300&nbsp;MHz |- ! style="text-align:left;" | RIVA TNT2 | Mar 1999 | NV5 | 250 | AGP 4x, PCI | 32 | 125 | 150 | 0:2:2:2 | 250 | 2.4 | SDR | 128 | 6 | 1.1 | |- ! style="text-align:left;" | RIVA TNT2 Pro | 1999?
| NV5 | 250 | AGP 4x, PCI | 32 | 143 | 166 | 0:2:2:2 | 286 | 2.7 | SDR | 128 | 6 | 1.1 | |- ! style="text-align:left;" | RIVA TNT2 Ultra | May 1999 | NV5 | 250 | AGP 4x, PCI | 32 | 150 | 183 | 0:2:2:2 | 300 | 2.9 | SDR | 128 | 6 | 1.1 |} ===GeForce series=== {{main|GeForce 256 }} *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] {| class ="wikitable" style="font-size: 85%; text-align: center; width: auto;" ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | GeForce 256 (SDR) | Oct 1999 | NV10 | 220 | AGP 4x | 32/64 | 120 | 166 | 0:4:4:4 | 480 | 2.7 | SDR | 128 | 7 | 1.2 | Hardware Transform & Lighting |- ! style="text-align:left;" | GeForce 256 (DDR) | Jan 2000 | NV10 | 220 | AGP 4x | 32/64 | 120 | 300 | 0:4:4:4 | 480 | 4.8 | DDR | 128 | 7 | 1.2 | Hardware Transform & Lighting |} ===GeForce 2 series=== {{main|GeForce 2 Series}} *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] *<sup>2</sup> GeForce2 Ti VX has never been presented officially, though it is nothing more than a frequency-reduced GeForce2 Ti. {| class ="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | GeForce2 MX 100 | 2001? | NV11 | 180 | AGP 4x | 32 | 143 | 166 | 0:2:4:2 | 572 | 0.6 | SDR | 32 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce2 MX 200 | Mar 2001 | NV11 | 180 | AGP 4x | 64 | 175 | 166 | 0:2:4:2 | 700 | 1.2 | SDR/ DDR | 64 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce2 MX | Jun 2000 | NV11 | 180 | AGP 4x | 64 | 175 | 166 | 0:2:4:2 | 700 | 2.7 | SDR | 128 | 7 | 1.2 | +TwinView +Shaders (NVIDIA Shading Rasterizer, [[OpenGL]] only) |- ! style="text-align:left;" | GeForce2 MX 400 | Mar 2001 | NV11 | 180 | AGP 4x, PCI | 32/64 | 200 | 183 | 0:2:4:2 | 800 | 2.9 | SDR/ DDR | 128/64 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce2 GTS | Apr 2000 | NV15 | 180 | AGP 4x | 32/64 | 200 | 333 | 0:4:8:4 | 1600 | 5.3 | DDR | 128 | 7 | 1.2 | -TwinView |- ! style="text-align:left;" | GeForce2 Pro | 2000? | NV15 | 180 | AGP 4x | 32/64 | 200 | 400 | 0:4:8:4 | 1600 | 6.4 | DDR | 128 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce2 Ti VX <sup>2</sup> | 2001? 
| NV15 | 150 | AGP 4x | 64 | 225 | 400 | 0:4:8:4 | 1800 | 6.4 | DDR | 128 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce2 Ti | Oct 2001 | NV15 | 150 | AGP 4x | 64 | 250 | 400 | 0:4:8:4 | 2000 | 6.4 | DDR | 128 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce2 Ultra | Aug 2000 | NV16 | 180 | AGP 4x | 64 | 250 | 460 | 0:4:8:4 | 2000 | 7.4 | DDR | 128 | 7 | 1.2 |} ===GeForce 3 series=== {{main|GeForce 3 Series}} *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] {| class ="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | GeForce3 | Feb 2001 | NV20 | 180 | AGP 4x | 64/128 | 200 | 460 | 1:4:8:4 | 1600 | 7.4 | DDR | 128 | 8.0 | 1.2 | +Shaders (v1.1) +LMA -TwinView |- ! style="text-align:left;" | GeForce3 Ti 200 | Oct 2001 | NV20 | 150 | AGP 4x | 64/128 | 175 | 400 | 1:4:8:4 | 1400 | 6.4 | DDR | 128 | 8.1 | 1.3 | |- ! style="text-align:left;" | GeForce3 Ti 500 | Oct 2001 | NV20 | 150 | AGP 4x | 64/128 | 240 | 500 | 1:4:8:4 | 1920 | 8.0 | DDR | 128 | 8.1 | 1.3 |} ===GeForce 4 series=== {{main|GeForce 4 Series}} *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] AGP 3.0 denotes a voltage reduction and increased maximum theoretical bandwidth (available speeds are 4X and 8X, instead of 2X and 4X); the cores themselves saw little to no changes (from NV17 to NV18 or NV25 to NV28). Apart from an increased clock speed and a reduced signal swing voltage (from 1.5V to 0.8V) , AGP 3.0 still supports sideband addressing (added late to AGP 1.0 specifications, increased practical data throughput) and fast writes (AGP 2.0 specifications, direct data writes to graphic card's memory). Strangely, the 4200's name change doesn't reflect the increased core clock, while the 4800 still has the same clock rates as the 4600. {| class ="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | GeForce4 MX 420 | Feb 2002 | NV17 | 150 | AGP 4x | 64 | 250 | 166 | 0:2:4:2 | 1000 | 2.7 | SDR | 64 | 7 | 1.2 | +TwinView +LMA2 |- ! 
style="text-align:left;" | GeForce4 MX 440SE | 2002 | NV18 | 150 | AGP 4x, PCI | 64 | 250 | 333 | 0:2:4:2 | 1100 | 2.7,5.3 | SDR, DDR | 64,128 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce4 MX 440 | Feb 2002 | NV17 | 150 | AGP 4x, PCI | 64 | 270 | 400 | 0:2:4:2 | 1100 | 6.4 | DDR | 128 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce4 MX 440 8x | Oct 2002 | NV18 | 150 | AGP 8x | 128 | 275 | 513 | 0:2:4:2 | 1100 | 8.2 | DDR | 128 | 7 | 1.2 | AGP 3.0 |- ! style="text-align:left;" | GeForce4 MX 460 | Feb 2002 | NV17 | 150 | AGP 4x | 64 | 300 | 550 | 0:2:4:2 | 1200 | 8.8 | DDR | 128 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce4 MX 4000 | 2003? | NV18B | 150 | AGP 4x/8x, PCI, | 64, 128 | 250, 275 | 266, 333, 400 | 0:2:4:2 | 1100 | 1.1, 3.2, 5.3 | DDR | 32, 64, 128 | 7 | 1.2 | |- ! style="text-align:left;" | GeForce4 Ti 4200 | Apr 2002 | NV25 | 150 | AGP 4x | 64, 128 | 250 | 500/ 444 | 2:4:8:4 | 2000 | 8.0/ 7.1 | DDR | 128 | 8.1 | 1.4 | +Shaders (v1.1 vertex, v1.3 pixel) +1 Vertex unit |- ! style="text-align:left;" | GeForce4 Ti 4200 8x | Oct 2002 | NV28 | 150 | AGP 8x | 128 | 250 | 513 | 2:4:8:4 | 2000 | 8.2 | DDR | 128 | 8.1 | 1.4 | AGP 3.0 |- ! style="text-align:left;" | GeForce4 Ti 4400 | Feb 2002 | NV25 | 150 | AGP 4x | 128 | 275 | 550 | 2:4:8:4 | 2200 | 8.8 | DDR | 128 | 8.1 | 1.4 | |- ! style="text-align:left;" | GeForce4 Ti 4800 SE | Feb 2003 | NV28 | 150 | AGP 8x | 128 | 275 | 550 | 2:4:8:4 | 2200 | 8.8 | DDR | 128 | 8.1 | 1.4 | AGP 3.0 |- ! style="text-align:left;" | GeForce4 Ti 4600 | Feb 2002 | NV25 | 150 | AGP 4x | 128 | 300 | 650 | 2:4:8:4 | 2400 | 10.4 | DDR | 128 | 8.1 | 1.4 | |- ! style="text-align:left;" | GeForce4 Ti 4800 | Feb 2003 | NV28 | 150 | AGP 8x | 128 | 300 | 650 | 2:4:8:4 | 2400 | 10.4 | DDR | 128 | 8.1 | 1.4 | AGP 3.0 |} ===GeForce FX series=== {{main|GeForce FX Series}} *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] * NV31, NV34 and NV36 are 2x2 pipeline designs if running vertex shader, otherwise they are 4x1 pipeline designs. * GeForce FX series has limited OpenGL 2.0 support. {| class ="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | GeForce FX 5200 | Mar 2003 | NV34 | 150 | AGP 8x,PCI | 128, 256 | 250 | 333,400 | 1:2:2:2 *:4:4:4 | 1000 | 2.7,6.4 | DDR | 64,128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5200 Ultra | Mar 2003 | NV34 | 150 | AGP 8x | 256 | 325 | 650 | 1:2:2:2 *:4:4:4 | 1300 | 10.4 | DDR | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce PCX 5300 | 2003 | NV34 | 150 | PCI-e | 256 | 250 | 400 | *:2:4:4 | 1000 | 3.2/6.4 | DDR | 64/128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5500 | Mar 2004 | NV34 | 150 | AGP 8x,PCI | 128, 256 | 270 | 400 | 1:2:2:2 *:4:4:4 | 1080 | 3.2, 6.4 | DDR | 64, 128 | 9.0 | 1.5/2.0** | |- ! 
style="text-align:left;" | GeForce FX 5600 XT | 2003 | NV31 | 130 | AGP 8x | 128, 256 | 235 | 400 | 1:2:2:2 *:4:4:4 | 940 | 6.4 | DDR | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5600 | 2003 | NV31 | 130 | AGP 8x | 256 | 325 | 550 | 1:2:2:2 *:4:4:4 | 1300 | 8.8 | DDR | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5600 Ultra | Mar 2003 | NV31 | 130 | AGP 8x | 256 | 350 | 700 | 1:2:2:2 *:4:4:4 | 1400 | 11.2 | DDR | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5600 Ultra Rev.2 | Aug 2003 | NV31 | 130 | AGP 8x | 256 | 400 | 800 | 1:2:2:2 *:4:4:4 | 1600 | 12.8 | DDR | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5700 VE | 2003 | NV36 | 130 | AGP 8x | 128, 256 | 300 | 500 | 3:2:2:2 *:4:4:4 | 1200 | 8.0 | DDR | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5700 LE | 2003 | NV36 | 130 | AGP 8x, PCI | 256 | 250 | 400 | 3:2:2:2 *:4:4:4 | 1000 | 6.4 | DDR | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5700 | 2003 | NV36 | 130 | AGP 8x | 256 | 425 | 550 | 3:2:2:2 *:4:4:4 | 1700 | 8.8 | DDR | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5700 Ultra | Oct 2003 | NV36 | 130 | AGP 8x | 256 | 475 | 900 | 3:2:2:2 *:4:4:4 | 1900 | 14.4 | GDDR2 | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5700 Ultra GDDR3 | Mar 2004 | NV36 | 130 | AGP 8x | 256 | 475 | 950 | 3:2:2:2 *:4:4:4 | 1900 | 15.2 | [[GDDR3]] | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce PCX 5750 | 2004 | NV36 | 130 | PCI-E | 256 | 425 | 500 | *:4:4:4 | 1700 | 8.0 | [[GDDR3]] | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5800 | Jan 2003 | NV30 | 130 | AGP 8x | 256 | 400 | 800 | 2:4:8:8 2:8:8:8 (no Z) | 3200 | 12.8 | GDDR2 | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5800 Ultra | Jan 2003 | NV30 | 130 | AGP 8x | 256 | 500 | 1000 | 2:4:8:8 2:8:8:8 (no Z) | 4000 | 16.0 | GDDR2 | 128 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5900 XT | Dec 2003 | NV35 | 130 | AGP 8x | 256 | 400 | 700 | 3:4:8:8 3:8:8:8 (no Z) | 3200 | 22.4 | DDR | 256 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5900 | May 2003 | NV35 | 130 | AGP 8x | 256 | 400 | 850 | 3:4:8:8 3:8:8:8 (no Z) | 3200 | 27.2 | DDR | 256 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5900 Ultra | May 2003 | NV35 | 130 | AGP 8x | 256 | 450 | 850 | 3:4:8:8 3:8:8:8 (no Z) | 3600 | 27.2 | DDR | 256 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce PCX 5900 | 2004 | NV35 | 130 | PCI-E | 256 | 425 | 550 | 3:4:8:8 3:8:8:8 (no Z) | 3400 | 17.6 | DDR | 256 | 9.0 | 1.5/2.0** | |- ! style="text-align:left;" | GeForce FX 5950 Ultra | Oct 2003 | NV38 | 130 | AGP 8x | 256 | 475 | 950 | 3:4:8:8 3:8:8:8 (no Z) | 3800 | 30.4 | DDR | 256 | 9.0 | 1.5/2.0** |- ! style="text-align:left;" | GeForce PCX 5950 | 2004 | NV38 | 130 | PCI-E | 256 | 350 | 950 | 3:4:8:8 3:8:8:8 (no Z) | 3800 | 30.4 | DDR | 256 | 9.0 | 1.5/2.0** |- |} ===GeForce 6 series=== {{main|GeForce 6 Series}} *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] *<sup>2</sup> Graphics card supports [[TurboCache]], memory size entries in bold indicate total memory (VRAM + System RAM), otherwise entries are VRAM only {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! 
rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! colspan=3 style="text-align:center;" | [[Fillrate]] ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) |- ! Pixel ([[Pixel|MP]]/s) ! Vertex ([[Vertex (geometry)#Vertices in computer graphics|MV]]/s) ! Texture ([[Texel (graphics)|MT]]/s) ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | GeForce 6100 | Nov 2005 | C51G | 90 | [[HyperTransport|Hyper Transport]] | 256 (Shared) | 425 | System Memory | 1:2:2:1 | 425 | 106.3 | 850 | System Memory (up to system HT limit) | System Memory | 64, 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6150 LE | Jun 2006 | C51PVG | 90 | [[HyperTransport|Hyper Transport]] | 128 (Shared), 256 | 425 | System Memory | 1:2:2:1 | 425 | 106.3 | 850 | System Memory (up to system HT limit) | System Memory | 64, 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6150 | Nov 2005 | C51PV | 90 | [[HyperTransport|Hyper Transport]] | 256 (Shared) | 475 | System Memory | 1:2:2:1 | 475 | 118.8 | 950 | System Memory (up to system HT limit) | System Memory | 64, 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6150 SE | | MCP61P | | [[HyperTransport|Hyper Transport]] | 256 (Shared) | 425 | System Memory | 1:2:2:1 | 425 | 106.3 | 850 | System Memory (up to system HT limit) | System Memory | 64,128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6200 | 2004 | NV43 | 110 | PCIe x16, AGP 8X, PCI | 128, 256, | 300 | 400 | 3:4:4:2 | 600 | 225 | 1200 | 3.2, 6.4 | DDR | 64, 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6200 TC<sup>2</sup> | 2005 | NV44 | 110 | PCIe x16 | 16, 32, 64, '''128''', '''256''' | 350 | 700, 550 | 3:4:4:2 | 700 | 262.5 | 1400 | 2.8, 5.6, 4.4 | DDR | 32, 64 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6200 | 2005 | NV44 | 110 | PCI | 128, 256 | 350 | 400 | 3:4:4:2 | 700 | 262.5 | 1400 | 3.2 | DDR | 64 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6200 | 2005 | NV43, NV44a | 110 | AGP 8X | 128, 256, 512 | 350 | 533 | 3:8:4:4 | 1400 | 262.5 | 1400 | 4.3, 8.5 | DDR, DDR2 | 64, 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6500 | 2005 | NV43, NV44 | 110 | PCIe x16 | 128, 256 | 350 | 550 | 3:4:4:4 | 1400 | 262.5 | 1400 | 4.2 | DDR2 | 64 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6600LE | 2005 | NV43 | 110 | PCIe x16, AGP-8x | 128, 256 | 300 | 500 | 3:4:4:4 | 1200 | 225 | 1200 | 8.8 | DDR | 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6600 | 2004 | NV43 | 110 | PCIe x16, AGP 8x | 128, 256, 512 | 300 | 500 | 3:8:8:4 | 1200 | 225 | 2400 | 8.8 | DDR | 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6600 DDR2 | Nov 2005 | NV43 | 110 | PCIe x16, AGP 8x | 256, 512 | 350 | 800 | 3:8:8:4 | 1400 | 262.5 | 2800 | 12.8 | DDR2 | 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6600 GT | Nov 2004 | NV43 | 110 | AGP 8X | 128 | 500 | 900 | 3:8:8:4 | 2000 | 375 | 4000 | 14.4 | GDDR3 | 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6600 GT | Aug 2004, 2005 | NV43 | 110 | PCIe x16 | 128, 256 | 500 | 1000 | 3:8:8:4 | 2000 | 375 | 4000 | 16.0 | GDDR3 | 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6600 XL | ? 
| NV43 | 110 | PCIe x16 | 128, 256 | 525 | 1100 | 3:8:8:4 | 2100 | 375 | 4200 | 17.6 | GDDR3 | 128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 LE | 2004 | NV40 | 130 | AGP 8X | 256 | 325 | 700 | 4:8:8:8 | 2600 | 325 | 2600 | 22.4 | DDR | 256 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 XT | 2005 | NV40, NV41, NV42 | 110 | PCIe x16, AGP 8x | 256 | 325 | 700 | 4:8:8:8 | 2600 | 325 | 2600 | 22.4 | DDR | 256/128 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 | 2004 | NV40 | 130 | AGP 8X | 128, 256 | 325 | 700 | 5:12:12:8 | 2600 | 406.3 | 3900 | 22.4 | DDR | 256 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 | 2004 | NV41, NV42 | 130, 110 | PCIe x16 | 128, 256 | 325 | 600 | 5:12:12:12 | 3900 | 406.3 | 3900 | 19.2 | DDR | 256 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 GTO | 2004 ([[OEM]] only) | NV41, NV45 | 130 | PCIe x16 | 256 | 350 | 900 | 5:12:12:12 5:12:12:12 | 4200 | 437.5 | 4200 | 28.8 | GDDR3 | 256 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 GS | Dec 2005 | NV40 | 130 | AGP 8X | 256 | 350 | 1000 | 5:12:12:12 | 4200 | 437.5 | 4200 | 32.0 | GDDR3 | 256 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 GS | Nov 2005 | NV42 | 110 | PCIe x16 | 256 | 425 | 1000 | 5:12:12:8 | 3400 | 531.25 | 5100 | 32.0 | GDDR3 | 256 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 GT | May 2004 | NV40 | 130 | PCIe x16, AGP 8X, | 256, (512 PCI-e only) | 350 | 1000 | 6:16:16:16 | 5600 | 525 | 5600 | 32.0 | GDDR3 | 256 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 Ultra | Apr 2004 | NV40 | 130 | AGP 8X | 256 | 400 | 1100 | 6:16:16:16 | 6400 | 600 | 6400 | 35.2 | GDDR3 | 256 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 Ultra Extreme | Jun 2004 | NV45 | 130 | PCIe x16 | 256 | 450 | 1100 | 6:16:16:16 | 7200 | 618.75 | 7200 | 35.2 | GDDR3 | 256 | 9.0c | 1.5 |- ! style="text-align:left;" | GeForce 6800 Ultra | Mar 2005 | NV45 | 130 | PCIe x16 | 512 | 400 | 1050 | 6:16:16:16 | 6400 | 600 | 6400 | 33.6 | GDDR3 | 256 | 9.0c | 1.5 |} ==== Features ==== {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! colspan=6 style="text-align:center;" | Features |- ! Transparency Anti-Aliasing ! OpenEXR HDR ! [[Scalable Link Interface]] ([[SLI]]) ! TurboCache Technology ! PureVideo w/WMV9 Decode ! PureVideo wo/WMV9 Decode |- ! style="text-align:left;"| GeForce 6100 | {{yes}} (FW 91.47+) | {{no}} | {{no}} | {{no}} | {{yes}} (limited) | {{no}} |- ! style="text-align:left;"| GeForce 6150 SE | {{yes}} (FW 91.47+) | {{no}} | {{no}} | {{no}} | {{yes}} (limited) | {{no}} |- ! style="text-align:left;"| GeForce 6150 | {{yes}} (FW 91.47+) | {{no}} | {{no}} | {{no}} | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 6150 LE | {{yes}} (FW 91.47+) | {{no}} | {{no}} | {{no}} | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 6200 | {{yes}} (FW 91.47+) | {{no}} | {{no}} | {{yes}} (PCI-E only) | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 6500 | {{yes}} (FW 91.47+) | {{no}} | {{yes}} | {{yes}} | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 6600 LE | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} (No SLI Connector) | {{no}} | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 6600 | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} (SLI Connector or PCI-E Interface) | {{no}} | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 6600 DDR2 | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} (SLI Connector or PCI-E Interface) | {{no}} | {{yes}} | {{no}} |- ! 
style="text-align:left;"| GeForce 6600 GT | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} | {{no}} | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 6800 LE | {{yes}} (FW 91.47+) | {{yes}} | {{no}} | {{no}} | {{no}} | {{yes}} |- ! style="text-align:left;"| GeForce 6800 XT | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} (PCI-E only) | {{no}} | {{yes}} (NV42 only) | {{no}} |- ! style="text-align:left;"| GeForce 6800 | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} (PCI-E only) | {{no}} | {{yes}} (NV41, NV42 only) | {{no}} |- ! style="text-align:left;"| GeForce 6800 GTO | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} | {{no}} | {{no}} | {{yes}} |- ! style="text-align:left;"| GeForce 6800 GS | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} (PCI-E only) | {{no}} | {{yes}} (NV42 only) | {{yes}} (NV40 only) |- ! style="text-align:left;"| GeForce 6800 GT | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} (PCI-E only) | {{no}} | {{no}} | {{yes}} |- ! style="text-align:left;"| GeForce 6800 Ultra | {{yes}} (FW 91.47+) | {{yes}} | {{yes}} (PCI-E only) | {{no}} | {{no}} | {{yes}} |} ===GeForce 7 series=== {{main|GeForce 7 Series}} *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] *<sup>2</sup> Graphics card supports [[TurboCache]], memory size entries in bold indicate total memory (VRAM + System RAM), otherwise entries are VRAM only {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! colspan=3 style="text-align:center;" | [[Fillrate]] ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) |- ! Pixel ([[Pixel|MP]]/s) ! Vertex ([[Vertex (geometry)#Vertices in computer graphics|MV]]/s) ! Texture ([[Texel (graphics)|MT]]/s) ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- !style="text-align:left;" | GeForce 7025 | [[July 2007|Jul 2007]] | C86S | | [[HyperTransport|Hyper Transport]] | 256 (Shared) | 425 | System Memory | 1:4:2:2 | 850 | 106.3 | 850 | System Memory | System Memory | 64, 128 | 9.0c | 2.0 |- !style="text-align:left;" | GeForce 7050 PV/SE | [[July 2007|Jul 2007]] | C86PV | | [[HyperTransport|Hyper Transport]] | 256 (Shared) | 425 | System Memory | 1:4:2:2 | 850 | 106.3 | 850 | System Memory | System Memory | 64, 128 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7100 GS<sup>2</sup> | Sep 2006 | NV44 | 110 | PCIe x16 | 128, '''512''' | 350 | 667 | 3:4:4:2 | 700 | 262.5 | 1400 | 5.33 | DDR2 | 64 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7200 GS<sup>2</sup> | Apr 2007 | G72 | 90 | PCIe x16 | 64, 128, 256, '''512''' | 450 | 667, 800 | 2:2:2:2 | 900 | 225 | 900 | 5.33, 6.4 | DDR2 | 64 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7300 SE<sup>2</sup> | Mar 2006 | G72 | 90 | PCIe x16 | 128, 256, '''256''', '''512''' | 450 | 667 | 2:2:2:2 | 900 | 225 | 900 | 5.33 | DDR | 64 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7300 LE<sup>2</sup> | Mar 2006 | G72 | 90 | PCIe x16 | 128, 256, '''512''' | 450 | 667 | 3:4:4:2 | 900 | 337.5 | 1800 | 5.33 | DDR2 | 64 | 9.0c | 2.0 |- ! 
style="text-align:left;" | GeForce 7300 GS<sup>2</sup> | Jan 2006 | G72 | 90 | PCIe x16, AGP 8x | 128, 256, '''256''', '''512''' | 550 | 810 | 3:4:4:2 | 1100 | 412.5 | 2200 | 6.48 | DDR2 | 64 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7300 GT | May 2006 | G73 | 90 | PCIe x16, AGP 8x | 256, 512 | 350, 400, 500 | 533, 667, 800, 1400 | 4:8:8:8 | 2800 | 350 | 2800 | 8.53-10.67-22.4 | DDR2, DDR3 | 128 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7600 GS | Mar 2006 | G73 | 90 | PCIe x16, AGP 8x | 256, 512 | 400, 560 | 800, 540 | 5:12:12:8 | 3200 | 500 | 4800 | 12.8, 8.64 | DDR2, DDR3 | 128 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7600 GT | Mar 2006 | G73 | 90 | PCIe x16, AGP 8x | 256, 512 | 560 | 1400 | 5:12:12:8 | 4480 | 700 | 6720 | 22.4 | GDDR3 | 128 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7600 GT 80nm | Jan 2007 | G73-B1 (80nm) | 80 | PCIe x16, AGP 8x | 256, 512 | 650 | 1600 | 5:12:12:8 | 5200 | 812.5 | 7800 | 25.6 | GDDR3 | 128 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7800 GS | Feb 2006 | G70 | 110 | AGP 8x | 256 | 375 | 1200 | 6:16:16:8 | 3000 | 562.5 | 6000 | 38.4 | GDDR3 | 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7800 GT | Aug 2005 | G70 | 110 | PCIe x16 | 256 | 400 | 1000 | 7:20:20:16 | 6400 | 700 | 8000 | 32.0 | GDDR3 | 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7800 GTX | Jun 2005 | G70 | 110 | PCIe x16 | 256 | 430 | 1200 | 8:24:24:16 | 6880 | 940 | 10320 | 38.4 | GDDR3 | 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7800 GTX 512 | Nov 2005 | G70 | 110 | PCIe x16 | 512 | 550 | 1700 | 8:24:24:16 | 8800 | 1100 | 13200 | 54.4 | GDDR3 | 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7900 GS | May 2006 ([[OEM]] only) Sept 2006 (Retail) | G71 | 90 | PCIe x16, AGP 8x | 256 | 450 | 1320 | 7:20:20:16 | 7200 | 822.5 | 9000 | 42.2 | GDDR3 | 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7900 GT | Mar 2006 | G71 | 90 | PCIe x16 | 256 | 450 | 1320 | 8:24:24:16 | 7200 | 940 | 10800 | 42.2 | GDDR3 | 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7900 GTO | Sept 2006 | G71 | 90 | PCIe x16 | 512 | 650 | 1320 | 8:24:24:16 | 10400 | 1300 | 15600 | 42.2 | GDDR3 | 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7900 GTX | Mar 2006 | G71 | 90 | PCIe x16 | 512 | 650 | 1600 | 8:24:24:16 | 10400 | 1400 | 15600 | 51.2 | GDDR3 | 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7900 GX2 | Mar 2006 ([[OEM]] only) | '''2x''' G71 | 90 | PCIe x16 | '''2x''' 512 | 500 | 1200 | '''2x''' 8:24:24:16 | 16000 | 2000 | 24000 | 76.8 | GDDR3 | '''2x''' 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7950 GT | Q1 2007 (256) Sept 2006 (512) | G71 | 90 | PCIe x16, AGP 8x | 256, 512 | 550 | 1400 | 8:24:24:16 | 8800 | 1100 | 13200 | 44.8 | GDDR3 | 256 | 9.0c | 2.0 |- ! style="text-align:left;" | GeForce 7950 GX2 | June 2006 (Retail) | '''2x''' G71 | 90 | PCIe x16 | '''2x''' 512 | 500 | 1200 | '''2x''' 8:24:24:16 | 16000 | 2000 | 24000 | 76.8 | GDDR3 | '''2x''' 256 | 9.0c | 2.0 |} ==== Features ==== {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! colspan=6 style="text-align:center;" | Features |- ! Transparency Anti-Aliasing ! Gamma Correct Anti-Aliasing ! 64Bit OpenEXR HDR ! Scalable Link Interface (SLI) ! TurboCache Technology ! Dual Link DVI |- ! style="text-align:left;"| GeForce 7100 GS | {{yes}} | {{no}} | {{no}} | {{no}} | {{yes}} | {{no}} |- ! 
style="text-align:left;"| GeForce 7200 GS | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 7300 SE | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 7300 LE | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 7300 GS | {{yes}} | {{yes}} | {{yes}} | {{yes}} (PCI-E only) (No SLI bridge) | {{yes}} | {{no}} |- ! style="text-align:left;"| GeForce 7300 GT | {{yes}} | {{yes}} | {{yes}} | {{yes}} (PCI-E only) (No SLI bridge) | {{no}} | {{yes}} 1 |- ! style="text-align:left;"| GeForce 7600 GS | {{yes}} | {{yes}} | {{yes}} | {{yes}} (PCI-E only) | {{no}} | {{yes}} 1 |- ! style="text-align:left;"| GeForce 7600 GT | {{yes}} | {{yes}} | {{yes}} | {{yes}} (PCI-E only) | {{no}} | {{yes}} 1 |- ! style="text-align:left;"| GeForce 7600 GT (80 nm) | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} 1 |- ! style="text-align:left;"| GeForce 7800 GS | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no}} | {{yes}} 1 |- ! style="text-align:left;"| GeForce 7800 GT | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} 1 |- ! style="text-align:left;"| GeForce 7800 GTX | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} 1 |- ! style="text-align:left;"| GeForce 7800 GTX 512 | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} 1 |- ! style="text-align:left;"| GeForce 7900 GS | {{yes}} | {{yes}} | {{yes}} | {{yes}} (PCI-E only) | {{no}} | {{yes}} 2 |- ! style="text-align:left;"| GeForce 7900 GT | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} 2 |- ! style="text-align:left;"| GeForce 7900 GTO | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} 2 |- ! style="text-align:left;"| GeForce 7900 GTX | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} 2 |- ! style="text-align:left;"| GeForce 7900 GX2 | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} 2 |- ! style="text-align:left;"| GeForce 7950 GT | {{yes}} | {{yes}} | {{yes}} | {{yes}} (PCI-E only) | {{no}} | {{yes}} 2 |- ! style="text-align:left;"| GeForce 7950 GX2 | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} 2 |} ===GeForce 8 series=== {{main|GeForce 8 Series}} *<sup>1</sup> [[Unified shader model|Unified Shaders]] ([[Vertex shader]]/[[Geometry shader]]/[[Pixel shader]]) : [[Texture mapping unit]] : [[Render Output unit]] *<sup>2</sup> G80 has half the texture address units as it has texture filtering units, but with a wider (384/320bit) memory bus and more rops (24/20) than the G92. {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | Transistors (Million) ! rowspan=2 | Die Size (mm<sup>2</sup>) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory min ([[Mebibyte|MiB]]) ! rowspan=2 | Config core<sup>1</sup> ! colspan=3 style="text-align:center;" | Reference clock rate ! colspan=2 style="text-align:center;" | [[Fillrate]] ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | GFLOPs (MADD/MUL) ! rowspan=2 | TDP (Watts) |- ! Core ([[Hertz|MHz]]) ! Shader ([[Hertz|MHz]]) ! Memory ([[Hertz|MHz]]) ! Pixel ([[Pixel|MP]]/s) ! Texture ([[Texel (graphics)|MT]]/s) ! Bandwidth reference ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- |- valign="top" ! 
style="text-align:left;" | GeForce 8300 GS (OEM)<ref name=G84_G86_Shader_Specs> [http://www.theinquirer.net/default.aspx?article=38884], Theinquirer.net, accessed April 12, 2007.</ref> | July 2007 | G86 | 80 | ? | ? | PCI Express x16 | 128, 256 | 8:4:2 | 450 | 900 | 800 | 900 | 1800 | 6.4 | DDR2 | 64 | 10 | 2.1 | 22 | 40 |- ! style="text-align:left;" | GeForce 8400 GS <ref name="nvidia-071112">{{cite web | url=http://www.nvidia.com/object/geforce_8400.html | title=GeForce 8400 | date=2008 | author=Nvidia Corporation}}</ref> | June 2007 | G86 | 80 | 210 | 115 | PCIe x16 | 128, 256 | 16:8:4 | 450 | 900 | 800 | 1800 | 3600 | 6.4 | DDR2 | 64 | 10 | 2.0 | 43 |40 |- ! style="text-align:left;"| GeForce 8500 GT <ref name="nvidia-071112">{{cite web | url=http://www.nvidia.com/object/geforce_8500.html | title=GeForce 8500 | date=2008 | author=Nvidia Corporation}}</ref> | April 2007 | G86 | 80 | 210 | 115 | PCIe x16 | 256, 512 | 16:8:4 | 450 | 900 | 800 | 1800 | 3600 | 12.8 | DDR2 | 128 | 10 | 2.0 | 43 |45 |- ! style="text-align:left;"| GeForce 8600 GS (OEM) | April 2007 | G84 | 80 | 289 | 169 | PCIe x16 | 256, 512 | 32:16:8 | 540 | 1180 | 800 | 4320 | 8640 | 12.8 | DDR2 | 128 | 10 | 2.0 | 113 | 47 |- ! style="text-align:left;"| GeForce 8600 GT <ref name="nvidia-071111">{{cite web | url=http://www.nvidia.com/object/geforce_8600.html | title=GeForce 8600 | date=2008 | author=Nvidia Corporation}}</ref> | April 2007 | G84 | 80 | 289 | 169 | PCIe x16 | 256, 512 | 32:16:8 | 540 | 1180 | 1400 | 4320 | 8640 | 22.4 | GDDR3 | 128 | 10 | 2.0 | 113 | 47 |- ! style="text-align:left;"| GeForce 8600 GTS <ref name="nvidia-071111">{{cite web | url=http://www.nvidia.com/object/geforce_8600.html | title=GeForce 8600 | date=2008 | author=Nvidia Corporation}}</ref> | April 2007 | G84 | 80 | 289 | 169 | PCIe x16 | 256, 512 | 32:16:8 | 675 | 1450 | 2000 | 5400 | 10800 | 32.0 | GDDR3 | 128 | 10 | 2.0 | 139 | 75 |- ! style="text-align:left;"| GeForce 8800 GS <ref name="nvidia-071118">{{cite web | url=http://www.nvidia.com/page/geforce_8800.html | title=GeForce 8800 | date=2008 | author=Nvidia Corporation}}</ref> | Jan 2008 | G92 | 65 | 754 | 324 | PCIe x16 2.0 | 384 | 96:48:12 | 550 | 1375 | 1600 | 6600 | 26400 | 38.4 | GDDR3 | 192 | 10 | 2.0 | 396 | 92 |- ! style="text-align:left;"| GeForce 8800 GTS <ref name="nvidia-071118">{{cite web | url=http://www.nvidia.com/page/geforce_8800.html | title=GeForce 8800 | date=2008 | author=Nvidia Corporation}}</ref> | Feb 2007 (320) <br>Nov 2006 (640) | G80 | 90 | 681 | 484 | PCIe x16 | 320, 640 | 96:48<sup>2</sup>:20 | 500 | 1200 | 1600 | 10000 | 24000 | 64.0 | GDDR3 | 320 | 10 | 2.0 | 346 | 143 |- ! style="text-align:left;"| GeForce 8800 GT <ref name="nvidia-071116">{{cite web | url=http://www.nvidia.com/object/geforce_8800gt.html | title=NVIDIA GeForce 8800 GT | date=2008 | author=Nvidia Corporation}}</ref> | Oct 2007 (512)<br>Dec 2007 (256, 1024) | G92 | 65 | 754 | 324 | PCIe x16 2.0 | 256, 512, 1024 | 112:56:16 | 600 | 1500 | 1400 (256)<br>1800 (512, 1024) | 9600 | 33600 | 44.8 (256)<br>57.6 (512, 1024) | GDDR3 | 256 | 10 | 2.1<ref name="nvidia-071117>{{cite web | url=http://img98.imageshack.us/img98/2943/geforce8800gtcv7.jpg | title=8800 GT OpenGL 2.1 support | date=2008 | author=OpenGL Extensions Viewer 3.0}}</ref> | 504 | 102 |- ! 
style="text-align:left;"| GeForce 8800 GTS 512 <ref name="nvidia-071118">{{cite web | url=http://www.nvidia.com/page/geforce_8800.html | title=GeForce 8800 | date=2008 | author=Nvidia Corporation}}</ref> | Dec 2007 | G92 | 65 | 754 | 324 | PCIe x16 2.0 | 512 | 128:64:16 | 650 | 1625 | 1940 | 10400 | 41600 | 62.7 | GDDR3 | 256 | 10 | 2.1 | 624 | 143 |- ! style="text-align:left;"| GeForce 8800 GTX <ref name="nvidia-071118">{{cite web | url=http://www.nvidia.com/page/geforce_8800.html | title=GeForce 8800 | date=2008 | author=Nvidia Corporation}}</ref> | Nov 2006 | G80 | 90 | 681 | 484 | PCIe x16 | 768 | 128:64<sup>2</sup>:24 | 575 | 1350 | 1800 | 13800 | 36800 | 86.4 | GDDR3 | 384 | 10 | 2.0 | 518 | 155 |- ! style="text-align:left;"| GeForce 8800 Ultra <ref name="nvidia-071118">{{cite web | url=http://www.nvidia.com/page/geforce_8800.html | title=GeForce 8800 | date=2008 | author=Nvidia Corporation}}</ref> | May 2007 | G80 | 90 | 681 | 484 | PCIe x16 | 768 | 128:64<sup>2</sup>:24 | 612 | 1500 | 2160 | 14688 | 39168 | 103.7 | GDDR3 | 384 | 10 | 2.0 | 576 | 171 |- |} ==== Features ==== *Compute Capability: 1.1 has support for Atomic functions, which are used to write thread-safe programs. {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! colspan=7 style="text-align:center;" | Features |- ! Coverage Sample Anti-Aliasing ! Angle Independent Anisotropic Filtering ! 128Bit OpenEXR HDR ! Scalable Link Interface (SLI) ! 3-Way Scalable Link Interface (SLI) ! [[PureVideo]] HD with VP1 ! [[PureVideo]] 2 with VP2, BSP Engine, and AES128 Engine ! Compute Capability 1.1 |- ! style="text-align:left;"| GeForce 8400 GS | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no}} | {{no}} | {{yes}} | {{yes|1.1}} |- ! style="text-align:left;"| GeForce 8500 GT | {{yes}} | {{yes}} | {{yes}} | {{yes}} (No SLI Bridge) | {{no}} | {{no}} | {{yes}} | {{yes|1.1}} |- ! style="text-align:left;"| GeForce 8600 GT | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no}} | {{yes}} | {{yes|1.1}} |- ! style="text-align:left;"| GeForce 8600 GTS | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no}} | {{yes}} | {{yes|1.1}} |- ! style="text-align:left;"| GeForce 8800 GS | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no}} | {{yes}} | |- ! style="text-align:left;"| GeForce 8800 GTS | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} | {{no}} | {{no|1.0}} |- ! style="text-align:left;"| GeForce 8800 GTS Rev. 2 | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{yes}} | {{no}} | |- ! style="text-align:left;"| GeForce 8800 GT | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no}} | {{yes}} | {{yes|1.1}} |- ! style="text-align:left;"| GeForce 8800 GTS 512 | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no}} | {{yes}} | |- ! style="text-align:left;"| GeForce 8800 GTX | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no|1.0}} |- ! style="text-align:left;"| GeForce 8800 Ultra | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{no}} | {{no|1.0}} |} ===GeForce 9 series=== {{main|GeForce 9 Series}} *<sup>1</sup> [[Unified shader model|Unified Shaders]] ([[Vertex shader]]/[[Geometry shader]]/[[Pixel shader]]) : [[Texture mapping unit]] : [[Render Output unit]] {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | Transistors (Million) ! rowspan=2 | Die Size (mm<sup>2</sup>) ! 
rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory min ([[Mebibyte|MiB]]) ! rowspan=2 | Config core<sup>1</sup> ! colspan=3 style="text-align:center;" | Reference clock rate ! colspan=2 style="text-align:center;" | [[Fillrate]] ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | GFLOPs (MADD/MUL) ! rowspan=2 | TDP (Watts) |- ! Core ([[Hertz|MHz]]) ! Shader ([[Hertz|MHz]]) ! Memory ([[Hertz|MHz]]) ! Pixel ([[Pixel|MP]]/s) ! Texture ([[Texel (graphics)|MT]]/s) ! Bandwidth reference ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;"| GeForce 9500 GT | 2008 | G96 | 65 | ? | 144 | PCIe x16 2.0 | 256 | 32:16:8 | 550 | 1375 | 800/1600 | 4400 | 8800 | 12.8/25.6 | DDR2/GDDR3 | 128 | 10 | 2.1 | 132 | ? |- ! style="text-align:left;"| GeForce 9600 GSO<ref>[http://www.nvidia.com/object/geforce_9600gso.html NVIDIA GeForce 9600 GSO<!-- Bot generated title -->]</ref> | May 2008 | G92 | 65 | 754 | 324 | PCIe x16 2.0 | 384 | 96:48:12 | 550 | 1375 | 1600 | 6600 | 26400 | 38.4 | GDDR3 | 192 | 10 | 2.1 | 396 | 84 |- ! style="text-align:left;"| GeForce 9600 GT<ref name="nvidia-071119">{{cite web | url=http://www.nvidia.com/object/geforce_9600gt.html | title=Nvidia GeForce 9600 GT | date=2008 | author=Nvidia Corporation}}</ref> | Feb 2008 | G94 | 65 | 505 | 240 | PCIe x16 2.0 | 512, 1024 | 64:32:16 | 650 | 1625 | 1800 | 10400 | 20800 | 57.6 | GDDR3 | 256 | 10 <ref>[http://www.pconline.com.cn/diy/graphics/reviews/0801/1205247_1.html PConline review], accessed January 14, 2008.</ref> | 2.1 | 312 | 95 |- ! style="text-align:left;"| GeForce 9800 GTX | April 2008 | G92 | 65 | 754 | 324 | PCIe x16 2.0 | 512 | 128:64:16 | 675 | 1688 | 2200 | 10800 | 43200 | 70.4 | GDDR3 | 256 | 10 | 2.1 | 648 | 156 |- ! style="text-align:left;"| GeForce 9800 GTX+ | July 2008 | G92b | 55 | 754 | 230 | PCIe x16 2.0 | 512 | 128:64:16 | 738 | 1836 | 2200 | 11808 | 47232 | 70.4 | GDDR3 | 256 | 10 | 2.1 | 705 | ? |- ! style="text-align:left;"| GeForce 9800 GX2 | Mar 2008 | '''2x''' G92 | 65 | '''2x''' 754 | '''2x''' 324 | PCIe x16 2.0 | '''2x''' 512 | '''2x''' 128:64:16 | 600 | 1500 | 2000 | '''2x''' 9600 | '''2x''' 38400 | '''2x''' 64.0 | GDDR3 | '''2x''' 256 | 10 | 2.1 | '''2x''' 576 | 197 |- |} ==== Features ==== {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! colspan=8 style="text-align:center;" | Features |- ! Coverage Sample Anti-Aliasing ! Angle Independent Anisotropic Filtering ! 128Bit OpenEXR HDR ! Scalable Link Interface (SLI) ! [[PureVideo]] 2 with VP2, BSP Engine, and AES128 Engine |- ! style="text-align:left;"| GeForce 9600 GS0 | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{yes}} |- ! style="text-align:left;"| GeForce 9600 GT | {{yes}} | {{yes}} | {{yes}} | {{yes}} | {{yes}} |- ! style="text-align:left;"| GeForce 9800 GTX | {{yes}} | {{yes}} | {{yes}} | {{yes}}<br/>3-way (link 3 GTXs) | {{yes}} |- ! 
style="text-align:left;"| GeForce 9800 GX2 | {{yes}} | {{yes}} | {{yes}} | {{yes}}<br/>Quad (link 2 GX2s) | {{yes}} |} ===GeForce 200 series=== {{main|GeForce 200 Series}} *<sup>1</sup> [[Unified shader model|Unified Shaders]] ([[Vertex shader]]/[[Geometry shader]]/[[Pixel shader]]) : [[Texture mapping unit]] : [[Render Output unit]] * The GeForce 200 series uses second-generation unified shaders that perform 50 percent better than the shaders found on the GeForce 9 series.<ref name='Dailytech'>{{cite news | first=Kristopher | last=Kubicki | title=Next-gen NVIDIA GeForce Specifications Unveiled | date=2008-05-20 | url=http://www.dailytech.com/article.aspx?newsid=11842 | work=dailytech.com | accessdate=2008-06-04}}</ref> It is not yet clear whether this refers to a clock-for-clock improvement (indicating architectural changes) or simply to the fact that the D10U GPU sports up to 240 stream processors versus the previous generation's maximum of 128. Some sources<ref name='Fudzilla'>{{cite news | first=Fuad | last=Abazovic | title=Geforce GTX 280's 240 Shaders are brute force | date=2008-05-30 | publisher=Fudzilla | url=http://www.fudzilla.com/index.php?option=com_content&task=view&id=7604&Itemid=1 | work=Fudzilla.com | accessdate=2008-06-04}}</ref> hint at the latter, stating that D10U is more an extension of the G80 and G92 generations than a significant deviation or reinvention. * The GeForce 200 series features on-board NVIDIA PhysX technology.{{Fact|date=July 2008}} {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | Transistors (Million) ! rowspan=2 | Die Size (mm<sup>2</sup>) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory min ([[Mebibyte|MiB]]) ! rowspan=2 | Config core<sup>1</sup> ! colspan=3 style="text-align:center;" | Reference clock rate ! colspan=2 style="text-align:center;" | [[Fillrate]] ! colspan=3 style="text-align:center;" | Reference Memory Configuration ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | GFLOPs (MADD/MUL) ! rowspan=2 | TDP (Watts) |- ! Core ([[Hertz|MHz]]) ! Shader ([[Hertz|MHz]]) ! Memory ([[MT/s]]) ! Pixel ([[Pixel|MP]]/s) ! Texture ([[Texel (graphics)|GT]]/s) ! Bandwidth ([[Gigabyte|GB]]/s) ! DRAM type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;"| GeForce GTX 260 | June 26, 2008 | D10U-20 | 65 | 1400 | 576 | PCIe x16 2.0 | 896 | 192:64:28 | 576 | 1242 | 1998 | 16128 | 36.9 | 111.9 | GDDR3 | 448 32x14 | 10 | 2.1 | 715 | 182 |- ! style="text-align:left;"| GeForce GTX 280 <ref>[http://www.xtremesystems.org/forums/showpost.php?p=2918592&postcount=73 XtremeSystems Forums - View Single Post - Next Nvidia card is D10U-30 comes before GT200<!-- Bot generated title -->]</ref><ref>[http://forums.vr-zone.com/showthread.php?t=271801 GeForce 9900 GTX (GT200) specs leaked - VR-Zone IT & Lifestyle Forum!<!-- Bot generated title -->]</ref> | June 17, 2008 | D10U-30 | 65 | 1400 | 576 | PCIe x16 2.0 | 1024 | 240:80:32 | 602 | 1296 | 2214 | 19264 | 48.2 | 141.7 | GDDR3 | 512 32x16 | 10 | 2.1 | 933 | 236 |- |}
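The pixel and texture fillrates, the memory bandwidth and the GFLOPs figure in the table above are theoretical values that follow directly from the clock, unit-count and bus-width columns of the same row. The short sketch below is a minimal Python example; the variable names are illustrative, and reading the GFLOPs (MADD/MUL) column as three floating-point operations per shader per clock (one MADD counted as two operations plus one MUL) is an assumption, although it matches the listed figures. It reproduces the reference numbers for the GeForce GTX 280:

<source lang="python">
# Back-of-the-envelope check of the derived columns for the GeForce GTX 280,
# using the reference values from the table above as inputs.
core_clock_mhz = 602           # core clock (MHz)
shader_clock_mhz = 1296        # shader clock (MHz)
memory_mt_s = 2214             # effective memory transfer rate (MT/s)
shaders, tmus, rops = 240, 80, 32   # config core 240:80:32
bus_width_bits = 512

pixel_fill_mp_s = core_clock_mhz * rops                    # 602 * 32 = 19264 MP/s
texture_fill_gt_s = core_clock_mhz * tmus / 1000           # 602 * 80 -> 48.2 GT/s
bandwidth_gb_s = memory_mt_s * bus_width_bits / 8 / 1000   # -> 141.7 GB/s
gflops_madd_mul = shaders * shader_clock_mhz * 3 / 1000    # 3 FLOPs/shader/clock -> 933

print(pixel_fill_mp_s, round(texture_fill_gt_s, 1),
      round(bandwidth_gb_s, 1), round(gflops_madd_mul))
</source>

The same arithmetic gives the GTX 260 row (for example 192 × 1242 MHz × 3 ≈ 715 GFLOPS), and the fillrate and bandwidth columns of the earlier desktop tables follow the same pattern, with fillrate generally equal to the core clock multiplied by the number of texture units.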
==Comparison Table: Mobile GPUs== ===GeForce Go 7 series=== The GeForce Go 7 series is the notebook implementation of the GeForce 7 architecture. *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] *<sup>2</sup> Graphics card supports [[TurboCache]], memory size entries in bold indicate total memory (VRAM + System RAM), otherwise entries are VRAM only {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | GeForce Go 7300<sup>2</sup> | Jan 2006 | G72M | 90 | PCIe x16 | 128, 256, '''512''' | 350 | 700 | 3:4:4:2 | 1400 | 5.60 | GDDR3 | 64 | 9.0c | 2.0 | Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7400<sup>2</sup> | Jan 2006 | G72M | 90 | PCIe x16 | 128, 256, '''384''' | 450 | 700 | 3:4:4:2 | 1800 | 7.20 | GDDR3 | 64 | 9.0c | 2.0 | Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7600 | Mar 2006 | G73M | 90 | PCIe x16 | 256, 512 | 450 | 800 | 5:8:8:8 | 3600 | 12.8 | GDDR3 | 128 | 9.0c | 2.0 | Scalable Link Interface (SLI), Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7600 GS | 2006 | G73M | 90 | PCIe x16 | 256 | 400 | 800 | | | 12.8 | GDDR3 | 128 | 9.0c | 2.0 | Scalable Link Interface (SLI), Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7600 GT | 2006 | G73M | 90 | PCIe x16 | 256 | 500 | 1200 | 5:12:12:8 | 6000 | 19.2 | GDDR3 | 128 | 9.0c | 2.0 | Scalable Link Interface (SLI), Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7700 | 2006 | | 80 | PCIe x16 | 512 | 450 | 1000 | 5:12:12:8 | 5400 | 16.0 | GDDR3 | 128 | 9.0c | 2.0 | Scalable Link Interface (SLI), Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7800 | | G70M | 110 | PCIe x16 | 256 | 400 | 1100 | 6:16:16:8 | 6400 | 35.2 | GDDR3 | 256 | 9.0c | 2.0 | Scalable Link Interface (SLI), Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7800 GTX | Oct 2005 | G70M | 110 | PCIe x16 | 256 | 440 | 1100 | 8:24:24:16 | 9600 | 35.2 | GDDR3 | 256 | 9.0c | 2.0 | Scalable Link Interface (SLI), Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7900 GS | Apr 2006 | G71M | 90 | PCIe x16 | 256 | 375 | 1000 | 7:20:20:16 | 7500 | 32.0 | GDDR3 | 256 | 9.0c | 2.0 | Scalable Link Interface (SLI), Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7900 GTX | Apr 2006 | G71M | 90 | PCIe x16 | 256/512 | 500 | 1200 | 8:24:24:16 | 12000 | 38.4 | GDDR3 | 256 | 9.0c | 2.0 | Scalable Link Interface (SLI), Transparency Anti-Aliasing |- ! style="text-align:left;" | GeForce Go 7950 GTX | Oct 2006 | G71M | 90 | PCIe x16 | 512 | 575 | 1400 | 8:24:24:16 | 13800 | 44.8 | GDDR3 | 256 | 9.0c | 2.0 | Scalable Link Interface (SLI), Transparency Anti-Aliasing |} ===GeForce 8M series=== The GeForce 8M series is the notebook implementation of the GeForce 8 architecture.
*<sup>1</sup> [[Unified shader model|Unified Shaders]] ([[Vertex shader]]/[[Geometry shader]]/[[Pixel shader]]) : [[Texture mapping unit]] : [[Render Output unit]] {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! colspan=3 style="text-align:center;" | Clock speed ! rowspan=2 | Config core<sup>1</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Core ([[Hertz|MHz]]) ! Shader ([[Hertz|MHz]]) ! Memory ([[Hertz|MHz]]) ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | GeForce 8400M G | May 2007 | NB8M | 80 | PCIe x16 | 128 / 256 | 400 | 800 | 1200 | 8:?:? | 3200 | 9.6 | | 64 | 10 | 2.0 | [[PureVideo]] '''NO h.264''', HD with VP2, BSP Engine, and AES128 Engine |- ! style="text-align:left;" | GeForce 8400M GS | May 2007 | NB8M | 80 | PCIe x16 | 128 / 256 | 400 | 800 | 1200 | 16:?:? | 3200 | 9.6 | | 64 | 10 | 2.0 | [[PureVideo]] HD with VP2, BSP Engine, and AES128 Engine |- ! style="text-align:left;" | GeForce 8400M GT | May 2007 | NB8M | 80 | PCIe x16 | 256 / 512 | 450 | 900 | 1200 | 16:?:? | 3600 | 19.2 | | 128 | 10 | 2.0 | [[PureVideo]] HD with VP2, BSP Engine, and AES128 Engine |- ! style="text-align:left;" | GeForce 8500M GT | May 2007 | NB8M | 80 | PCIe x16 | 256 / 512 | 450 | 900 | 1200 | 16:?:? | 3600 | 12.8 | | 128 | 10 | 2.0 | [[PureVideo]] HD with VP2, BSP Engine, and AES128 Engine |- ! style="text-align:left;" | GeForce 8600M GS | May 2007 | NB8P | 80 | PCIe x16 | 256 / 512 | 600 | 1200 | 1400 | 16:?:? | 4800 | 22.4 | DDR2 / GDDR3 | 128 | 10 | 2.0 | [[PureVideo]] HD with VP2, BSP Engine, and AES128 Engine |- ! style="text-align:left;" | GeForce 8600M GT | May 2007 | NB8P | 80 | PCIe x16 | 256 / 512 | 475 | 950 | 800 / 1400 | 32:16:8 | 7600 | 12.8 / 22.4 |DDR2 / GDDR3 | 128 | 10 | 2.0 | [[PureVideo]] HD with VP2, BSP Engine, and AES128 Engine |- ! style="text-align:left;" | GeForce 8700M GT | June 2007 | ? | 80 | PCIe x16 | 256 / 512 | 625 | 1250 | 1600 | 32:16:8 | 10000 | 25.6 | GDDR3 | 128 | 10 | 2.0 | [[SLI|Scalable Link Interface]], [[PureVideo]] HD with VP2, BSP Engine, and AES128 Engine |- ! style="text-align:left;" | GeForce 8800M GTS | November 2007 | NB8P | 65 | PCIe 2.0 x16 | 512 | 500 | 1250 | 1600 | 64:28:16 | 16000 | 51.2 | GDDR3 | 256 | 10 | 2.0 | [[SLI|Scalable Link Interface]], [[PureVideo]] HD with VP2, BSP Engine, and AES128 Engine |- ! style="text-align:left;" | GeForce 8800M GTX | November 2007 | NB8P | 65 | PCIe 2.0 x16 | 512 | 500 | 1250 | 1600 | 96:28:16 | 24000 | 51.2 | GDDR3 | 256 | 10 | 2.0 | [[SLI|Scalable Link Interface]], [[PureVideo]] HD with VP2, BSP Engine, and AES128 Engine |} ===GeForce 9M series=== The GeForce 9M series for notebooks architecture. *<sup>1</sup> [[Unified shader model|Unified Shaders]] ([[Vertex shader]]/[[Geometry shader]]/[[Pixel shader]]) : [[Texture mapping unit]] : [[Render Output unit]] {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! 
rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! colspan=3 style="text-align:center;" | Clock speed ! rowspan=2 | Config core<sup>1</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|GT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Core ([[Hertz|MHz]]) ! Shader ([[Hertz|MHz]]) ! Memory ([[Hertz|MHz]]) ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! style="text-align:left;" | GeForce 9300M G | 2008 | | 80 | | 256 | 400 | 800 | 1200 | 16:?:? | 3.2 | 9.6 | | 64 | 10 | 2.1 |Rebranded 8400M GS |- ! style="text-align:left;" | GeForce&nbsp;9500M&nbsp;GS | 2008 | | 80 | | 512 | 475 | 950 | 1400 | 32:?:? | 7.6 | 22.4 | | 128 | 10 | 2.1 |Rebranded 8600M GT |- ! style="text-align:left;" | GeForce&nbsp;9600M&nbsp;GS | 2008 | | 65 | | 512 | 500 | 950 | 1600 | 32:?:? | | | | 128 | 10 | 2.1 | |- ! style="text-align:left;" | GeForce&nbsp;9600M&nbsp;GT | 2008 | | 65 | | 512 | 600 | 1200 | 1600 | 32:?:? | | | | 128 | 10 | 2.1 | |- ! style="text-align:left;" | GeForce&nbsp;9650M&nbsp;GS | 2008 | | 80 | | 512 | 625 | 1250 | 1600 | 32:?:? | ? | ? | | 128 | 10 | 2.1 |Rebranded 8700M GT |} ==Comparison Table: Workstation GPUs== ===Quadro=== {{main|NVIDIA Quadro}} *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] *<sup>2</sup>[[Unified shader model|Unified Shaders]] ([[Vertex shader]]/[[Geometry shader]]/[[Pixel shader]]) : [[Texture mapping unit]] : [[Render Output unit]] *<sup>3</sup> NV31, NV34 and NV36 are 2x2 pipeline designs if running vertex shader, otherwise they are 4x4 pipeline designs. {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup><sup>2</sup><sup>3</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! 
[[OpenGL]] |- !Quadro |NV10GL |220 |AGP 4x |64 |135 |166 |0.5:4:4:4 |480 |2.66 |SDR |128 |7 |1.2 | |- !Quadro2 MXR |NV11GL |180 |AGP 4x |64 |175 |183 |0.5:2:4:4 |800 |2.93 |SDR |128 |7 |1.2 | |- !Quadro2 EX |NV11GL |180 |AGP 4x |64 |175 |166 |0.5:2:4:4 |700 |2.7 |SDR |128 |7 |1.2 | |- !Quadro2 PRO |NV15GL |150 |AGP 4x |64 |250 |400 |0.5:4:8:8 |2000 |6.4 |DDR |128 |7 |1.2 | |- !Quadro DCC |NV20GL |180 |AGP 4x |128 |200 |460 |1:4:8:8 |1600 |7.4 |DDR |128 |8.0 |1.4 | |- !Quadro4 380XGL |NV18GL |150 |AGP 8x |128 |275 |513 |0.5:2:4:4 |1100 |8.2 |DDR |128 |8 |1.4 | |- !Quadro4 500XGL |NV17GL |150 |AGP 4x |128 |250 |166 |0.5:2:4:4 |1000 |2.7 |SDR |128 |8 |1.4 | |- !Quadro4 550XGL |NV17GL |150 |AGP 4x |64 |270 |400 |0.5:2:4:4 |1000 |6.4 |DDR |128 |8 |1.4 | |- !Quadro4 580XGL |NV18GL |150 |AGP 8x |64 | | | | | |DDR |128 |8 |1.4 | |- !Quadro4 700XGL |NV25 |150 |AGP 4x |64 |275 |550 |2:4:8:8 |2200 |8.8 |DDR |128 |8 |1.4 | |- !Quadro4 750XGL |NV25 |150 |AGP 4x |128 |275 |550 |2:4:8:8 |2200 |8.8 |DDR |128 |8 |1.4 |Stereo Display |- !Quadro4 900XGL |NV25 |150 |AGP 4x |128 |300 |650 |2:4:8:8 |2400 |10.4 |DDR |128 |8 |1.4 |Stereo Display |- !Quadro4 980XGL |NV28GL |150 |AGP 8x |128 |300 |650 |2:4:8:8 |2400 |10.4 |DDR |128 |8 |1.4 |Stereo Display |- !Quadro FX 600 |NV34GL |150 |PCI |256 |350 |800 |1:2:2:2<br>*:4:4:4 |1000 |7.8 |DDR |128 |9.0 |2.0 |Stereo Display |- !Quadro FX 500 |NV34GL |150 |AGP 8x |128 |270 |480 |1:2:2:2<br>*:4:4:4 |1080 |7.687 |DDR |128 |9.0 |2.0 |Stereo Display |- !Quadro FX 500 |NV35GL |150 |AGP 8x |128 |275 |275 |1:2:2:2<br>*:4:4:4 |1100 |4.4 |DDR |128 |9.0 |2.0 | |- !Quadro FX 1000 |NV30GL |130 |AGP 8x |128 |300 |300 |2:4:8:8 |2400 |9.6 |DDR2 |128 |9.0 |2.0 |Stereo Display |- !Quadro FX 1100 |NV36GL |130 |AGP 8x |128 |425 |325 |3:2:2:2<br>*:4:4:4 |1700 |5.2 |DDR2 |128 |9.0 |2.0 | |- !Quadro FX 2000 |NV30GL |130 |AGP 8x |128 |400 |800 |2:4:8:8 |3200 |12.8 |DDR |128 |9.0 |2.0 |Stereo display |- !Quadro FX 3000 |NV35GL |130 |AGP 8x |256 |400 |850 |3:4:8:8 |3200 |27.2 |DDR |256 |9.0 |2.0 |Stereo display |- !Quadro FX 3000G |NV35GL |130 |AGP 8x |256 |400 |850 |3:4:8:8 |3200 |27.2 |DDR |256 |9.0 |2.0 |Stereo display, [[Genlock]] |- !Quadro FX 4000 |NV40GL |130 |AGP 8x |256 |375 |500 |5:12:12:8 |4500 |32.0 |GDDR3 |256 |9.0c |2.0 |Stereo Display |- !Quadro FX 4000 SDI |NV40GL |130 |AGP 8x |256 |375 |500 |5:12:12:8 |4500 |32.0 |GDDR3 |256 |9.0c |2.0 |Stereo Display, Genlock |- !Quadro FX 350 |G72GL |90 |PCIe x16 |128 |550 |810 |3:4:4:2 |1100 |6.48 |DDR2 |128 |9.0c |2.0 | |- !Quadro FX 550 |G73GL |90 |PCIe x16 |128 |360 |800 |4:8:8:8 |2880 |12.8 |GDDR3 |128 |9.0c |2.0 | |- !Quadro FX 560 |G73GL |90 |PCIe x16 |128 |350 |1200 |5:12:12:8 |4200 |19.2 |GDDR3 |128 |9.0c |2.0 | |- !Quadro FX 570 |G84GL |? |PCIe x16 |256 |460 |800 |? |? |12.8 |DDR2 |128 |10 |2.1 | |- !Quadro FX 1400 |NV41 |130 |PCIe x16 |128 |350 |600 |5:12:12:8 |4200 |19.2 |DDR |256 |9.0c |2.0 |Stereo Display, [[SLI]] |- !Quadro FX 1500 |G71 |90 |PCIe x16 |256 |375 |800 |7:20:20:16 |7500? |40.0 |GDDR3 |256 |9.0c |2.0 | |- !Quadro FX 1700 |G84GL |? |PCIe x16 |512 |460 |800 |? |?
|12.8 |DDR2 |128 |10 |2.1 | |- !Quadro FX 3450 |NV41 |130 |PCIe x16 |256 |425 |1000 |5:12:12:8 |5100 |32.0 |GDDR3 |256 |9.0c |2.0 |Stereo display, [[SLI]] |- !Quadro FX 3500 |G71GL |90 |PCIe x16 |256 |470 |1320 |7:20:20:16 |9400 |42.2 |GDDR3 |256 |9.0c |2.0 |Stereo display, [[SLI]] |- !Quadro FX 3700<sup>2</sup> |G92 |65 |PCIe 2.0 x16 |512 |600 |1800 |112:56:16 |33600 |51.2 |GDDR3 |256 |10 |2.1 |Stereo display, [[SLI]] |- !Quadro FX 4500 |G70 |110 |PCIe x16 |512 |470 |1050 |8:24:24:16 |11280 |33.6 |GDDR3 |256 |9.0c |2.0 |Stereo display, [[SLI]], Genlock |- !Quadro FX 4500X2 |G70 |110 |PCIe x16 |1024 |470 |1050 |16:48:48:32 |11280 |33.6 |GDDR3 |256 |9.0c |2.0 |Stereo display, Genlock |- !Quadro FX 4500 SDI |G70 |110 |PCIe x16 |512 |470 |1050 |8:24:24:16 |11280 |33.6 |GDDR3 |256 |9.0c |2.0 |Stereo display, Genlock |- !Quadro FX 4600<sup>2</sup> |G80 |90 |PCIe x16 |768 |500 |1400 |128:32:24 |32000 |67.2 |GDDR3 |384 |10 |2.0 |Stereo display, [[SLI]], Genlock |- !Quadro FX 5500 |G71 |90 |PCIe x16 |1024 |700 |1050 |8:24:24:16 |16800 |33.6 |DDR2 |256 |9.0c |2.0 |Stereo display, [[SLI]], Genlock |- !Quadro FX 5500 SDI |G71 |90 |PCIe x16 |1024 |700 |1050 |8:24:24:16 |16800 |33.6 |DDR2 |256 |9.0c |2.0 |Stereo display, [[SLI]], Genlock |- !Quadro FX 5600<sup>2</sup> |G80 |90 |PCIe x16 |1536 |600 |1600 |128:32:24 |38400 |76.8 |GDDR3 |384 |10 |2.0 |Stereo display, [[SLI]], Genlock |- ! rowspan=2 | Model ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory max ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock max ([[Hertz|MHz]]) ! rowspan=2 | Memory clock max ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup><sup>2</sup><sup>3</sup> ! rowspan=2 | [[Fillrate]] max ([[Texel (graphics)|MT]]/s) ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) ! rowspan=2 | Features |- ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |} === Tesla === {{main|NVIDIA Tesla}} * <sup>1</sup> Specifications not published by NVIDIA are assumed to be based on the [[GeForce 8 series#GeForce 8800|GeForce 8800 GTX]] * <sup>2</sup> Specifications not published by NVIDIA are assumed to be based on the [[GeForce_200_Series#GeForce_GTX_200|GeForce GTX 280]] * For the basic specifications of Tesla, refer to the GPU Computing Processor specifications. * Because Tesla products have no display outputs, fillrate and graphics API compatibility are not applicable. {| class="wikitable" style="font-size: 85%; text-align: center;" |- ! rowspan=2 style="width:12em" | Configuration ! rowspan=2 | Model ! rowspan=2 | # of GPUs ! rowspan=2 | Core clock <br/>in MHz (each) ! colspan=2 style="text-align:center;" | Shaders ! colspan=5 style="text-align:center;" | Memory ! rowspan=2 | Processing Power<br/>(GigaFLOPS, total) ! rowspan=2 | Form factor<br/>and features |- ! Thread Processors (total) ! Clock in MHz (each) ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! Total size ([[Mebibytes|MiB]]) ! Clock (MHz) |- valign="top" ! style="text-align:left;" | GPU Computing<br/>Processor<sup>1</sup> | C870 | 1 | 600 | 128 | 1350 | 77 | GDDR3 | 384 | 1536 | 1600 | 519 | Full-height [[video card]] |- valign="top" ! style="text-align:left;" | Deskside Supercomputer<sup>1</sup> | D870 | 2 | 600 | 256 | 1350 | 154 | GDDR3 | 384 | 3072 | 1600 | 1037 | [[NVIDIA Quadro Plex|Deskside system]] or [[Rack unit]] |- valign="top" !
style="text-align:left;" | GPU Computing<br> Server<sup>1</sup> | S870 | 4 | 600 | 512 | 1350 | 307 | GDDR3 | 384 | 6144 | 1600 | 2074 | [[19-inch rack|1U Rack]] |- valign="top" ! style="text-align:left;" | C1060 <br> Computing Processor<sup>2</sup> | C1060 | 1 | 602 | 240 | 1300 | 102 | GDDR3 | 512 | 4096 | 1600 | 936 | Full-height [[video card]]<br />[[IEEE 754r]] capabilities |- valign="top" ! style="text-align:left;" | S1070 1U<br />GPU Computing <br/>Server<sup>2</sup> | S1070 | 4 | 602 | 960 | 1500 | 410 | GDDR3 | 512 | 16384 | 1600 | 3744 | [[19-inch rack|1U Rack]]<br />[[IEEE 754r]] capabilities |} ==Comparison Table: Miscellaneous== *<sup>1</sup> [[Vertex shader]] : [[Pixel shader]] : [[Texture mapping unit]] : [[Render Output unit]] *<sup>2</sup> Number of RSX shader pipelines and fillrates not confirmed by Sony / NVIDIA, speculation only {| class="wikitable" style="font-size: 85%; text-align: center; width: auto;" |- ! rowspan=2 | Model ! rowspan=2 | [[Year]] ! rowspan=2 | [[Code name]] ! rowspan=2 | Fab ([[Nanometer|nm]]) ! rowspan=2 | [[Computer bus|Bus]] [[I/O interface|interface]] ! rowspan=2 | Memory ([[Mebibyte|MiB]]) ! rowspan=2 | Core clock ([[Hertz|MHz]]) ! rowspan=2 | Memory clock ([[Hertz|MHz]]) ! rowspan=2 | Config core<sup>1</sup> ! colspan=3 style="text-align:center;" | [[Fillrate]] ! colspan=3 style="text-align:center;" | Memory ! colspan=2 style="text-align:center;" | Graphics library support (version) |- ! Pixel ([[Pixel|MP]]/s) ! Vertex ([[Vertex (geometry)#Vertices in computer graphics|MV]]/s) ! Texture ([[Texel (graphics)|MT]]/s) ! Bandwidth max ([[Gigabyte|GB]]/s) ! Bus type ! Bus width ([[bit]]) ! [[DirectX]] ! [[OpenGL]] |- ! XGPU ([[Xbox]]) | [[November 2001|Nov 2001]] | NV2A | 150 | Integrated | 64 (shared) | 233 | 400 | 2:4:8:4 | 932 | 115 | 1864 | 6.4 | DDR | 128 | 8.1 | N/A |- ![[RSX 'Reality Synthesizer'|Reality Synthesizer]] ([[PlayStation 3|PS3]]) |[[November 2006|Nov 2006]] |RSX<br>G71 |90 |FlexIO |256<br>256 (shared) |550 |1400<br>3200 |8:24:?:8<br> |4400 |1100 |12000 |22.4 +<br>35(shared) |GDDR3<br>XDR |128<br>Unknown |N/A |[[OpenGL ES|ES 2.0]] |} ==References== {{reflist|2}} == See also == *[[Comparison of AMD Processors]] *[[Comparison of ATI graphics processing units]] *[[Comparison of Intel processors]] *[[Graphics card]] *[[Graphics processing unit]] *[[Quadro]] *[[Scalable Link Interface|SLI]] *[[TurboCache]] *[[CUDA]] ==External links == *[http://www.nvidia.com/ NVIDIA homepage] *[http://www.opengl.org/ OpenGL homepage] *[http://www.microsoft.com/windows/directx/default.mspx Directx homepage] *[http://www.nvidia.com/page/geforce_8800.html NVIDIA factsheet for 8800 series cards] *[http://www.techpowerup.com/gpudb techPowerUp! 
GPU Database] *[http://www.gpureview.com/videocards.php?manufacturer=nvidia NVIDIA Cards @ GPUReview] *[http://www.tomshardware.com/site/vgacharts/index.html Benchmarks and comparisons of nearly all consumer graphics cards] *[http://download.nvidia.com/developer/Papers/2005/OpenGL_2.0/NVIDIA_OpenGL_2.0_Support.pdf OpenGL 2.0 support on NVIDIA GPUs (PDF document)] *[http://developer.download.nvidia.com/opengl/glsl/glsl_release_notes.pdf Release Notes for NVIDIA OpenGL Shading Language Support (PDF document)] *[http://www.beyond3d.com/reviews/leadtek/7300 Beyond3D 7300 GS Review] *[http://www.anandtech.com/video/showdoc.aspx?i=2977 AnandTech Comparison of PureVideo HD with VP1 and VP2] *[http://www.anandtech.com/video/showdoc.aspx?i=2984 AnandTech GeForce 8M Features] *[http://www.dailytech.com/GeForce+9600GT+Benchmarked+Out+in+the+Wild/article10397.htm DailyTech GeForce 9600GT Benchmarked] *[http://www.nvidia.com/object/IO_43357.html NVIDIA GeForce 8700M GT Existence] *[http://www.xbitlabs.com/news/video/display/20070613232302.html NVIDIA GeForce 8700M GT Details] *[http://www.drivermax.com/driver/statistics/display_nvidia.php NVIDIA adapters sorted by popularity] *[http://www.hardware-aktuell.com/viewnews.php?article=707 GeForce 9800 GTX Specs (German)] *[http://www.dailytech.com/article.aspx?newsid=11842 GeForce GTX Details] *[http://www.hardware-infos.com/news.php?news=2090 GeForce GTX Specs] *[http://www.fudzilla.com/index.php?option=com_content&task=view&id=7830&Itemid=1 GeForce GTX 280 Final Specs] *[http://www.fudzilla.com/index.php?option=com_content&task=view&id=7831&Itemid=1 GeForce GTX 260 Final Specs] {{NVIDIA}} [[Category:Graphics cards|*Comparison of NVIDIA Graphics Processing Units]] [[Category:Nvidia|*Comparison of NVIDIA Graphics Processing Units]] [[Category:Computing comparisons|NVIDIA Graphics Processing Units]] [[pt:Comparativo das unidades de processamento gráfico da Nvidia]] [[ru:Сравнение графических процессоров NVIDIA]] [[vi:NVIDIA GPU]] [[zh:NVIDIA顯示核心列表]]