Creating the 512 MB 7800 GTX


Insights into the High End


by Josh Walrath


            The road to the 512 MB 7800 GTX was a strange and twisted one, but it eventually led to the product released today.  As of November 14, 2005, it is the fastest 3D graphics card on the planet.  Period.  Two of them in SLI are insane.  They are so fast that most test platforms are starting to appear CPU limited in many situations.

            The 512 MB 7800 GTX runs at a blistering 550 MHz core and 850 MHz memory (1700 MHz effective GDDR-3).  To keep this cool, NVIDIA designed a large dual slot cooler with integrated heatpipes and a large diameter fan.  This heatsink/fan combination has been seen previously on Quadro cards as well as Leadtek’s 7800 GTX TDH Extreme.  NVIDIA also did another revision of the PCB for this card so that the core and memory speeds could be maximized.  This probably makes the PCB a little more expensive than a standard 7800 GTX’s, but the speed gains are very impressive to say the least.
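For perspective on those memory clocks, here is a quick back-of-the-envelope calculation (a sketch, assuming the 7800 GTX's standard 256-bit memory bus, which is unchanged between the 256 MB and 512 MB boards) of how much peak bandwidth the faster GDDR-3 buys over the stock 600 MHz (1200 MHz effective) part:

```python
# Rough memory-bandwidth arithmetic for the clocks quoted above.
# Assumes the 7800 GTX's 256-bit (32-byte) memory bus.

BUS_WIDTH_BYTES = 256 // 8  # 256-bit GDDR-3 bus

def bandwidth_gb_s(effective_mhz):
    """Peak theoretical bandwidth in GB/s for a DDR effective rate."""
    return effective_mhz * 1e6 * BUS_WIDTH_BYTES / 1e9

gtx_512 = bandwidth_gb_s(1700)  # 850 MHz GDDR-3, DDR -> 1700 MHz effective
gtx_256 = bandwidth_gb_s(1200)  # stock 7800 GTX: 600 MHz -> 1200 MHz effective

print(f"512 MB 7800 GTX: {gtx_512:.1f} GB/s")  # 54.4 GB/s
print(f"256 MB 7800 GTX: {gtx_256:.1f} GB/s")  # 38.4 GB/s
```

That is roughly a 40% jump in theoretical memory bandwidth, which goes a long way toward explaining the gains at high resolutions with AA enabled.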

            This product simply beats everything on the market.  Even in applications where the ATI X1800 XT really showed up the 7800 GTX, the 512 MB version tops everything.  Nothing else can touch this card when it is in an SLI configuration.  People who want to play their games at high resolutions with AA and high quality AF enabled will obviously be attracted to this card.  The only thing standing between most people and pure 3D happiness is the $649 MSRP.  At the time of this writing most retailers are selling the card for between $699 and $749.  I know most people at NVIDIA and its board partners are usually unhappy when it appears as though retailers are gouging customers, but the laws of supply and demand can often be painful.

            So how did the 512 MB 7800 GTX come into existence?  The story is quite interesting, and I can hopefully take readers behind the scenes and explain the series of events that transpired to turn this idea into reality.  Some of this is confirmed, and some of it was gathered from industry rumors.  The story is not simply “let’s make this faster” but rather “how can we one-up our competition without going totally insane?”

Sweating Bullets

            The 3D graphics companies do not operate in a vacuum, and due to the nature of the business they are usually painfully aware of exactly what the competition is doing.  How so?  First of all, both ATI and NVIDIA share the same primary foundry, TSMC.  Secondly, they deal with many of the same subcomponent manufacturers, such as Samsung, Infineon, and others.  Each knows when the other places big orders, cancels orders, or makes major changes to its orders.  Another factor is that engineers talk to each other.  Silicon Valley, while large, has a pretty close-knit community of engineers who have known each other since the early days of 3D.  It is not uncommon to go to local eateries and hear these guys talking about what they are working on.

            In early December NVIDIA found out that ATI had received working silicon of the R520 chip on 90 nm.  They also heard that it was optimized for high clock speeds, though with only 16 pixel shader pipelines.  Rumors of 700 MHz core speeds quickly spread.  NVIDIA was quite far along with the G70, and most likely had silicon back for that product as well.  The chip was able to clock well into the 400s without a problem, but things could get dicey if ATI was able to deliver a very highly clocked R520.

            To combat this potential monster of a chip, NVIDIA felt they needed to be able to clock the G70 to possibly 600 MHz and beyond, depending on what ATI was able to deliver.  While the G70’s improved 24 pixel shader design would not have a problem competing with the faster-clocked 16 pixel shader R520, both chips share an 8 vertex shader design, and that was an area where ATI could really come out ahead.  There was also a question about what speeds the G70’s memory controller could handle.
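To see why clock speed was the lever NVIDIA cared about, a rough sketch of raw pixel pipeline throughput (pipes times core clock, ignoring all per-pipe architectural differences, so treat it only as a crude proxy) shows how a rumored 700 MHz R520 would stack up against the G70 at its launch and target clocks:

```python
# Back-of-the-envelope pipeline throughput: pixel pipes x core clock,
# in billions of pipeline-cycles per second.  A crude proxy only;
# real performance depends heavily on per-pipe architecture.

def pipe_throughput(pipes, core_mhz):
    """Pixel pipes x clock, in billions of pipeline-cycles per second."""
    return pipes * core_mhz * 1e6 / 1e9

g70_stock  = pipe_throughput(24, 430)  # 7800 GTX at launch clocks
g70_target = pipe_throughput(24, 600)  # the 600 MHz goal mentioned above
r520_rumor = pipe_throughput(16, 700)  # rumored fast-clocked R520

print(f"G70 @ 430 MHz:  {g70_stock:.2f}")   # 10.32
print(f"G70 @ 600 MHz:  {g70_target:.2f}")  # 14.40
print(f"R520 @ 700 MHz: {r520_rumor:.2f}")  # 11.20
```

By this simple measure, a 700 MHz R520 would actually edge out the 430 MHz G70 despite having fewer pipes, which is exactly why pushing the G70's clocks became a priority.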

            From what I gather, NVIDIA initially planned to call the 7800 GTX the “Ultra” and leave the 7800 GT as is.  With the threat of ATI’s R520 looming, NVIDIA decided they needed to change their lineup to allow some flexibility.  By early winter 2005 NVIDIA still hadn’t decided exactly what to do, but the wheels were put into motion and some interesting ideas were put forward to get the 7800 “Ultra” into shape to compete.

            The 7800 GTX was going to be left as-is, with a single slot cooler and a 430 MHz core speed.  What was up in the air was the price at which it would be introduced.  If the R520 was the performer that NVIDIA feared, then NVIDIA expected X1800 XL, X1800 XT, and X1800 XT/PE editions.  In that case the 7800 GT would be introduced at $399, the 7800 GTX at $499, and the 7800 Ultra at $649 to compete with everything ATI could throw at them.


Next: Cooling and Design




Copyright 1999-2005 PenStar Systems, LLC.