Saturday, November 8, 2008


The GeForce logo used since 2007

GeForce is a brand of PC graphics processing units (GPUs) designed by Nvidia. The first GeForce products were designed and marketed for the high-margin PC gaming market, but later releases expanded the line to cover every tier of the graphics market, from low-end to high-end. As of 2008, there have been ten iterations of the design. Nvidia only designs the chips; manufacturing is outsourced. While several companies (notably Intel) design low-end GPUs, only Nvidia's GeForce and ATI's Radeon series compete in the high-end GPU market.



Name origin

The "GeForce" name originated from a contest held by Nvidia in early 1999. Called "Name That Chip", the contest asked the public to name the successor to the RIVA TNT2 line of graphics boards. Over 12,000 entries were received, and seven winners each received a RIVA TNT2 Ultra graphics board as a reward.[1][2]


GeForce 256
Launched on August 31, 1999, the GeForce 256 (NV10) was the first PC graphics chip with hardware transform, lighting, and shading, although 3D games utilizing these features did not appear until later. Initial GeForce 256 boards shipped with SDR SDRAM memory; later boards shipped with faster DDR SDRAM memory.
GeForce2
Launched in April 2000, the first GeForce2 (NV15) was another high-performance graphics chip. Nvidia moved to a twin-texture-processor-per-pipeline (4x2) design, doubling the texture fillrate per clock compared to the GeForce 256. Later, Nvidia released the GeForce2 MX (NV11), which offered performance similar to the GeForce 256 at a fraction of the cost. The MX was a compelling value in the low- and mid-range market segments and was popular with OEM PC manufacturers and users alike.
GeForce3
Launched in February 2001, the GeForce3 (NV20) introduced DirectX 8.0 programmable pixel shaders to the GeForce family. It had good overall performance and shader support, making it popular with enthusiasts, although it never hit the midrange price point. A derivative of the GeForce3, the NV2A, was developed for the Microsoft Xbox game console.
GeForce4
Launched in February 2002, the high-end GeForce4 Ti (NV25) was mostly a refinement of the GeForce3. The biggest advancements included enhanced anti-aliasing capabilities, an improved memory controller, a second vertex shader, and a manufacturing-process size reduction that increased clock speeds. Another family member, the budget GeForce4 MX, was based on the GeForce2, with a few additions from the new GeForce4 Ti line. It targeted the value segment of the market and lacked pixel shaders.
GeForce FX
Officially launched in November 2002, the GeForce FX (NV30) was a huge architectural change compared to its predecessors. The GPU was designed not only to support the new Shader Model 2 specification but also to perform well on older DirectX 7 and 8 titles. However, initial models suffered from weak floating-point shader performance and excessive heat that required two-slot cooling solutions. Products in this series carry the 5000 model number, as it is the fifth generation of the GeForce, though Nvidia marketed the cards as GeForce FX rather than GeForce 5 to show off "the dawn of cinematic rendering".
GeForce 6
Launched in April 2004, the GeForce 6 (NV40) added Shader Model 3.0 support to the GeForce family while correcting the weak floating-point shader performance of its predecessor. It also implemented high-dynamic-range imaging and introduced SLI (Scalable Link Interface) and PureVideo capability.
GeForce 7
The seventh-generation GeForce (G70/NV47) was launched in June 2005. The design was a refined version of the GeForce 6, with the major improvements being a widened pipeline and an increase in clock speed. The GeForce 7 also introduced new transparency supersampling and transparency multisampling anti-aliasing modes (TSAA and TMAA), which were later enabled for the GeForce 6 series as well.
A modified version of the GeForce 7800 GTX, called the RSX 'Reality Synthesizer', is used as the main GPU in Sony's PlayStation 3.
GeForce 8
Released on November 8, 2006, the eighth-generation GeForce (originally the G80) was the first GPU to fully support DirectX 10, built on a brand-new, fully unified shader architecture. The lineup initially consisted only of the 8800 GTX; the 8800 GTS was released months into the product line's life, and it took nearly six months for mid-range and OEM/mainstream cards to be integrated into the 8-series. Die shrinks and revisions of the G80 design, codenamed G92, were brought into the 8-series with the 8800 GS, 8800 GT, and 8800 GTS 512.
GeForce 9 / GeForce 100 Series
The GeForce 9 series succeeded the GeForce 8 series; the first product was released on February 21, 2008.[3] Before launch, little concrete information was known beyond officials claiming the next-generation products would approach 1 TFLOPS of performance with GPU cores made on a 65 nm process, and reports that Nvidia was downplaying the significance of DirectX 10.1.[4] So far, all 9-series designs, both released and rumored, are simply revisions of existing late 8-series products. The 9600 GT uses the G94 architecture, which differs from the G92 in that the GPU has 64 stream processors.[5] The 9800 GX2 uses two G92 GPUs, as used in later 8800 cards, in a dual-PCB configuration while still requiring only a single PCI Express x16 slot. The 9800 GX2 utilizes two separate 256-bit memory buses, one for each GPU and its respective 512 MB of memory, which equates to 1 GB of memory on the card overall (although the SLI configuration of the chips necessitates mirroring the frame buffer between the two chips, effectively giving the memory performance of a 256-bit/512 MB configuration). The later 9800 GTX features a single G92 GPU, a 256-bit data bus, and 512 MB of GDDR3 memory.[6]
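The physical-versus-effective-memory point above can be illustrated with a quick back-of-the-envelope calculation (a hypothetical sketch; the figures are the card specifications quoted above, and the function name is made up for illustration):

```python
# Sketch: physical vs. effective memory on a dual-GPU card such as the
# 9800GX2, where SLI mirrors the frame buffer across both GPUs.

def sli_effective_memory(num_gpus, memory_per_gpu_mb, bus_width_bits):
    """Return (physical_mb, effective_mb, effective_bus_bits).

    Each GPU holds a full copy of the frame buffer, so usable capacity
    and bus width match a single GPU rather than the sum of both.
    """
    physical = num_gpus * memory_per_gpu_mb  # what is soldered on the card
    effective = memory_per_gpu_mb            # mirrored, so only one copy counts
    return physical, effective, bus_width_bits

physical, effective, bus = sli_effective_memory(2, 512, 256)
print(f"Physical: {physical} MB, effective: {effective} MB on a {bus}-bit bus")
# → Physical: 1024 MB, effective: 512 MB on a 256-bit bus
```

This is why the card is marketed with 1 GB on board yet behaves like a 256-bit/512 MB configuration in practice.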
GeForce GTX 200
Based on the GT200 graphics processor, which consists of 1.4 billion transistors, the GTX 200 series launched at 06:30 PDT on June 16, 2008.[7] This generation takes the card-naming scheme in a controversial new direction, replacing the series number (such as 8800 for 8-series cards) with the GTX prefix (which formerly went at the end of card names to denote their rank among similar models) and appending model numbers such as 260 and 280. The "GTX" could be interpreted as standing for 10000, since the predecessor was the 9000 series and the Roman numeral X represents 10; on that reading, a GeForce GTX 200 card could also be read as a 10200.[8] The series features the new GT200 core on a 65 nm process.[9] The first products are the GeForce GTX 260 and the more expensive GeForce GTX 280.[10]
