GeForce 2 series

From Wikipedia, the free encyclopedia

The GeForce2 (codenamed NV15) was the second generation of GeForce graphics cards from NVIDIA Corporation and the successor to the GeForce 256.

GeForce2 GTS

The first model, the GeForce2 GTS (also known as NV15), was named for its texel fillrate of 1.6 billion texels per second: GigaTexel Shader. Thanks to the addition of a second TMU (texture map unit) to each of its four pixel pipelines and a higher core clock rate (200 MHz vs. 120 MHz), the GTS's texel fillrate is 3.3 times that of its predecessor, the GeForce 256 (480 Mtexels/s). Other hardware enhancements included an upgraded video-processing pipeline called HDVP (high-definition video processor). HDVP supported motion-video playback at HDTV resolutions (MP@HL), although playback of high-resolution video still required a powerful CPU. The GeForce2 also introduced the NVIDIA Shading Rasterizer (NSR), a primitive form of what are today known as pixel shaders (the GeForce 256 also had this feature, but it was never publicly announced).
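These fillrate figures follow directly from core clock × pixel pipelines × TMUs per pipeline. A quick sketch of the arithmetic (the helper function is ours, for illustration; the clock and pipeline figures are those quoted above):

 # Peak texel fillrate = core clock x pixel pipelines x TMUs per pipeline.
 def texel_fillrate_mtps(core_mhz: int, pipelines: int, tmus_per_pipe: int) -> int:
     """Peak texel fillrate in megatexels per second."""
     return core_mhz * pipelines * tmus_per_pipe
 
 geforce_256 = texel_fillrate_mtps(120, 4, 1)  # 480 Mtexels/s
 gts         = texel_fillrate_mtps(200, 4, 2)  # 1600 Mtexels/s = 1.6 Gtexels/s
 
 print(gts / geforce_256)  # ~3.33 -- the "3.3 times" figure quoted above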

In 3D benchmarks and gaming applications, the GTS outperformed its predecessor by up to 40%.[1] In OpenGL games (such as Quake III), the GTS outperformed the ATI Radeon and 3dfx Voodoo 5 cards in both 16 bpp and 32 bpp (true-color) display modes, but in Direct3D games the Radeon was sometimes able to take the lead in 32-bit color modes.[2]

As studies of the GeForce 256 and GeForce2 architecture progressed, it was determined to be severely memory-bandwidth constrained.[3] The chips wasted memory bandwidth and pixel fillrate on unoptimized Z-buffer usage, drawing of hidden surfaces, and relatively inefficient RAM controllers. Their main competitor, the ATI Radeon DDR, had effective optimizations (HyperZ) that combated these issues.[4] Because of the memory bandwidth constraint, the GeForce chips could not approach their theoretical performance potential, and the Radeon, despite having fewer pixel pipelines, offered strong competition simply through greater efficiency. The later NV17 revision of the NV10 design, used for the GeForce4 MX, was far more efficient in memory management; although the GeForce4 MX 460 was a 2×2-pipeline design, it could outperform the GeForce2 Ultra.
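A rough back-of-envelope illustration of the constraint (the per-pixel byte costs here are our assumptions, not figures from the cited reviews): at 32-bit color with a 32-bit Z-buffer, each pixel drawn needs a color write plus a Z read and a Z write, so the GTS's raw pixel fillrate alone can demand far more bandwidth than its memory bus supplies.

 # Back-of-envelope bandwidth demand for the GeForce2 GTS (illustrative only).
 PIXEL_FILLRATE = 4 * 200e6   # 4 pipelines x 200 MHz core = 800 Mpixels/s
 COLOR_WRITE    = 4           # bytes per pixel at 32-bit color
 Z_TRAFFIC      = 4 + 4       # assumed 32-bit Z read + Z write per pixel
 
 demand_gbs = PIXEL_FILLRATE * (COLOR_WRITE + Z_TRAFFIC) / 1e9  # ~9.6 GB/s
 supply_gbs = 5.3                                               # GTS memory bandwidth
 
 # Texture fetches only add to the demand side, so the shortfall is even
 # larger in practice -- hence the value of HyperZ-style optimizations.
 print(f"demand ~{demand_gbs:.1f} GB/s vs supply {supply_gbs} GB/s")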

Additionally, the state of PC gaming software at the time, and the then-new DirectX 7 API, likely limited the number of games able to take advantage of hardware multitexturing, the most significant difference between the GeForce 256 and the GeForce2. Most games emphasized single-layer texturing on surfaces, which did not benefit from the multitexturing hardware found on the GeForce2 or Radeon.

GeForce2 cores

There were three more revisions of the GeForce2 GTS core. The first was the GeForce2 Ultra, launched in late 2000. Architecturally identical to the GTS, the Ultra simply shipped with higher core and memory clocks. Some speculate the Ultra was intended to defeat 3dfx's Voodoo 5 6000; later tests showed the Ultra outperforming it, but the Voodoo 5 6000 never reached the consumer market. Regardless, the Ultra opened a definite lead over the Radeon and Voodoo 5, and even outperformed the first GeForce3 products, which initially had a lower texture fillrate; it was not until the GeForce3 Ti 500 in late 2001 that the Ultra was finally overtaken.

The other two GTS revisions were the GeForce2 Pro and the GeForce2 Ti (for "titanium"). Both parts fell between the GTS and the Ultra and were positioned as cheaper, but less advanced (feature-wise), alternatives to the high-end GeForce3, which lacked a mass-market version. The GeForce2 Ti, the final incarnation of the GTS core, performed competitively with the Radeon 7500, although the 7500 had the advantage of dual-display support. The GeForce2 Ti was released in the summer of 2001 but did not last long, being replaced by the faster, dual-monitor-capable GeForce4 MX 440 in January 2002.

GeForce2 MX

Finally, the most successful GeForce2 part was the budget-model GeForce2 MX. Like its predecessor, the RIVA TNT2 M64, the GeForce2 MX was extremely popular with OEM system builders because of its low cost and relatively competent 3D feature set. The MX retained the GTS's core 3D architecture and features but removed two of the four pixel pipelines and half of the GTS's memory bandwidth. NVIDIA also added true dual-display support to the MX; the GTS and subsequent non-MX models could drive a separate TV encoder, but that second display was always tied to the primary desktop.

ATI's competing Radeon VE (later Radeon 7000) had better dual-monitor display software but did not offer hardware T&L. The Radeon SDR was released late, lacked multi-display support, and was not priced competitively with the MX. In addition to being released early and achieving the best price/performance ratio, the MX and the rest of the GeForce2 line were backed by a single reliable driver, unlike ATI, whose products suffered from unreliable drivers at the time.

The MX performed well enough to make it a viable mainstream alternative to the GTS (and its later revisions). Among the gamer community, the MX effectively replaced the older RIVA TNT2 cards. NVIDIA eventually split the MX product line into "performance-oriented" (MX 400) and "cost-oriented" (MX 200/MX 100) versions. The MX 400, like the original MX, had a 128-bit SDR memory bus, which could also be configured as a 64-bit DDR bus with the same peak bandwidth (see the sketch after the models table below). The MX 200 had a 64-bit SDR memory bus, greatly limiting its gaming potential, and the lowest model, the MX 100, was equipped with a mere 32-bit SDR memory bus.[5]

The GeForce2 MX was later used by NVIDIA as the integrated graphics core of its nForce line of motherboard chipsets for AMD Athlon and Duron processors. It was also used in notebooks as the GeForce2 Go, a reduced-power variant drawing only 2.6 W at peak.[6]

Successor

The successor to the GeForce2 (non-MX) line was the GeForce3. The MX line's market position was taken over by the GeForce4 MX. The new MX was codenamed "NV17", meaning it was generally of the same family as the GeForce2. However, the GeForce4 MX gained a far superior memory controller, more efficient use of memory bandwidth, and a multi-sampling anti-aliasing unit, all from the GeForce4 Ti line.[7] As a result, the GeForce4 MX 440 and 460 efficiently achieved performance similar to, or better than, the GeForce2 GTS line (even beating the "brute-force" GeForce2 Ultra), and the GeForce4 MX also offered dual-monitor support.

Models

Card Name        Codename   Core Clock  Memory Clock  Bandwidth  Pixel Pipelines  Bus Type  Bus Width   Interface
GeForce2 MX 100  NV11       143 MHz     166 MHz       0.6 GB/s   2                SDR       32-bit      AGP/PCI
GeForce2 MX 200  NV11       175 MHz     166 MHz       1.2 GB/s   2                SDR/DDR   64-bit      AGP/PCI
GeForce2 MX      NV11       175 MHz     166 MHz       2.7 GB/s   2                SDR       128-bit     AGP/PCI
GeForce2 MX 400  NV11       200 MHz     183 MHz       2.9 GB/s   2                SDR/DDR   128/64-bit  AGP/PCI
GeForce2 GTS     NV15       200 MHz     166 MHz       5.3 GB/s   4                DDR       128-bit     AGP
GeForce2 Pro     NV15       200 MHz     200 MHz       6.4 GB/s   4                DDR       128-bit     AGP
GeForce2 Ti      NV15       250 MHz     200 MHz       6.4 GB/s   4                DDR       128-bit     AGP
GeForce2 Ultra   NV15/NV16  250 MHz     230 MHz       7.4 GB/s   4                DDR       128-bit     AGP
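The Bandwidth column follows directly from memory clock × bus width × transfers per clock (2 for DDR, 1 for SDR). A minimal sketch of that arithmetic (the helper function and model list are ours, for illustration; the figures are taken from the table):

 # Peak memory bandwidth = clock (Hz) x bus width (bytes) x transfers per clock.
 def bandwidth_gbs(clock_mhz: float, bus_bits: int, transfers: int) -> float:
     return clock_mhz * 1e6 * (bus_bits // 8) * transfers / 1e9
 
 # (model, memory clock in MHz, bus width in bits, 1 = SDR, 2 = DDR)
 models = [
     ("GeForce2 MX",     166, 128, 1),  # ~2.7 GB/s
     ("GeForce2 MX 400", 183, 128, 1),  # ~2.9 GB/s
     ("GeForce2 GTS",    166, 128, 2),  # ~5.3 GB/s
     ("GeForce2 Ultra",  230, 128, 2),  # ~7.4 GB/s
 ]
 for name, clock, bits, transfers in models:
     print(f"{name}: {bandwidth_gbs(clock, bits, transfers):.1f} GB/s")
 
 # The MX's two bus options deliver identical raw bandwidth, which is why
 # it could be configured either way:
 assert bandwidth_gbs(166, 128, 1) == bandwidth_gbs(166, 64, 2)  # both ~2.7 GB/s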

Discontinued support

NVIDIA has stopped supporting the GeForce2 GTS, GeForce2 Pro, GeForce2 Ti, and GeForce2 Ultra in its drivers on all supported operating systems. The last driver for Linux/FreeBSD was 1.0-7184, released on August 24, 2006. The last driver for Windows 9x and Windows Me was 71.84, released on March 11, 2005; the last driver for 32-bit Windows 2000 and 32-bit Windows XP was 71.89, released on April 14, 2005; and the last driver for Windows Vista was 6.0.6000.16386, released on June 21, 2006.


References

  1. ^ Lal Shimpi, Anand. NVIDIA GeForce 2 GTS, Anandtech, April 26, 2000.
  2. ^ Witheiler, Matthew. ATI Radeon 64MB DDR, Anandtech, July 17, 2000.
  3. ^ Lal Shimpi, Anand. NVIDIA GeForce2 Ultra, Anandtech, August 14, 2000.
  4. ^ Lal Shimpi, Anand. ATI Radeon 256 Preview (HyperZ), Anandtech, April 25, 2000: p.5.
  5. ^ Worobyev, Andrey. Leadtek WinFast GeForce2 MX 64 MBytes, Leadtek WinFast GeForce2 MX64 and Leadtek WinFast GeForce2 MX MAX on NVIDIA GeForce2 MX, MX200 and MX400, Digit-Life, accessed September 6, 2006.
  6. ^ Smith, Rick. NVIDIA GeForce2 Go: High performance graphics to go, Reviews OnLine, December 13, 2000.
  7. ^ Freeman, Vince. VisionTek Xtasy GeForce4 MX 440 Review, Sharky Extreme, April 19, 2002: p.3.

