NVIDIA's GeForce 6800 Ultra - NV40 Debuts


The GeForce 6800 Ultra
NVIDIA's NV40 Debuts...

By Marco Chiappetta
April 14, 2004

NVIDIA is poised to fire the first shot in the next battle of the war for 3D supremacy with today's official unveiling of their new NV40 GPU. The past two years have been especially hard fought. Both NVIDIA and ATi have been introducing new products every few months in a continuing effort to "one-up" each other in the eyes of influential enthusiasts, casual gamers, and budget-conscious consumers looking for the best return on their investment. We've seen a myriad of new high-end, mid-range, and budget GPUs from NVIDIA and ATi, each designed to offer its own unique features and benefits at a specific price point. The performance lead changed hands between ATi and NVIDIA in the low-end and mid-range market segments a few times over the past couple of years, but ever since the introduction of the Radeon 9700 Pro back in August of 2002, ATi has held onto the top spot in the sought-after enthusiast segment with a firm grasp. The R300, and the evolution of high-end "enthusiast-class" products based on its core technology, essentially remained one step ahead of NVIDIA's flagship NV3x products for all of 2003.

This put NVIDIA in the unfamiliar position of playing "catch-up", which did not sit well with their outspoken CEO, Jen-Hsun Huang. When asked about ATi's ability to snatch the performance crown from NVIDIA, he responded with, "Tiger Woods doesn't win every day. We don't deny that ATI has a wonderful product and it took the performance lead from us. But if they think they're going to hold onto it, they're smoking something hallucinogenic." With what we know today, the confidence, and perhaps brashness, Jen-Hsun exuded in this statement seems to have stemmed from his knowledge of NVIDIA's next-gen GPU architecture, codenamed NV40. With the NV40, NVIDIA's goals were to dramatically improve performance and image quality, while adding support for the latest DirectX feature set. The culmination of their efforts resulted in the new GeForce 6 Series of products, powered by the NV40.

The video card we'll be looking at today on HotHardware is NVIDIA's latest flagship product, the GeForce 6800 Ultra (yes, the FX moniker is gone).  With the GeForce 6800 Ultra, NVIDIA strives to erase all of the NV3x's shortcomings, while emphatically building upon its strengths.  The result is a product that doesn't simply outperform the previous generation - it destroys it...

Specifications & Features of The GeForce 6800 Ultra
NVIDIA's Newest Flagship GPU

                    

CINEFX 3.0 SHADING ARCHITECTURE
  • Vertex Shaders
    ° Support for Microsoft DirectX 9.0 Vertex Shader 3.0
    ° Displacement mapping
    ° Vertex frequency stream divider
    ° Infinite length vertex programs*
  • Pixel Shaders
    ° Support for DirectX 9.0 Pixel Shader 3.0
    ° Full pixel branching support
    ° Support for Multiple Render Targets (MRTs) - see the capability-check sketch after this list
    ° Infinite length pixel programs*
  • Next-Generation Texture Engine
    ° Up to 16 textures per rendering pass
    ° Support for 16-bit floating point format and 32-bit floating point format
    ° Support for non-power of two textures
    ° Support for sRGB texture format for gamma textures
    ° DirectX and S3TC texture compression
  • Full 128-bit studio-quality floating point precision through the entire rendering pipeline with native hardware support for 32bpp, 64bpp, and 128bpp rendering modes
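
The Shader Model 3.0 and Multiple Render Target support listed above is exposed to games through the standard Direct3D 9 caps structure. As a rough sketch (the function name and the already-created IDirect3DDevice9 are our own assumptions here, not NVIDIA sample code), an application might verify those capabilities like this:

#include <d3d9.h>

// Sketch only: checks whether the installed GPU exposes the CineFX 3.0
// features described above (Shader Model 3.0 and Multiple Render Targets).
// Assumes a valid IDirect3DDevice9* has already been created elsewhere.
bool SupportsShaderModel3AndMRT(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    if (FAILED(device->GetDeviceCaps(&caps)))
        return false;

    // Vertex Shader 3.0 and Pixel Shader 3.0 (dynamic branching, long programs)
    const bool shaderModel3 =
        caps.VertexShaderVersion >= D3DVS_VERSION(3, 0) &&
        caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);

    // Multiple Render Targets: the driver reports how many color targets can
    // be bound at once; Shader Model 3.0 hardware exposes four.
    const bool multipleRenderTargets = caps.NumSimultaneousRTs >= 4;

    return shaderModel3 && multipleRenderTargets;
}

An NV3x-class board reports ps_2_0 (plus extended 2.x caps bits) and fails the first test; at the time of this writing, the NV40 is the first GPU to report full vs_3_0/ps_3_0 support.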

NVIDIA HIGH-PRECISION DYNAMIC-RANGE (HPDR) TECHNOLOGY

  • Full floating point support throughout entire pipeline
  • Floating point filtering improves the quality of images in motion
  • Floating point texturing drives new levels of clarity and image detail
  • Floating point frame buffer blending gives detail to special effects like motion blur and explosions
  • New rotated-grid anti-aliasing removes jagged edges for incredible edge quality

INTELLISAMPLE 3.0 TECHNOLOGY

  • Advanced 16x anisotropic filtering
  • Blistering-fast anti-aliasing and compression performance
  • Support for advanced lossless compression algorithms for color, texture, and z-data at even higher resolutions and frame rates
  • Fast z-clear
  • High-resolution compression technology (HCT) increases performance at higher resolutions through advances in compression technology

ULTRASHADOW II TECHNOLOGY

  • Designed to enhance the performance of shadow-intensive games, like id Software's Doom III

ADVANCED ENGINEERING

  • Over 220 million transistors
  • Designed for PCI Express x16
  • Supports PCI Express high-speed interconnect (HSI) technology for bidirectional interconnect protocol conversion
  • Full support of AGP 8X including Fast Writes and sideband addressing
  • Support for the industry's fastest GDDR3 memory
  • 256-bit advanced memory interface
  • 0.13 micron process technology
  • Advanced thermal management and thermal monitoring
  • 40mm x 40mm BGA flip-chip package

Architecture Characteristics of the GeForce 6 Series

Pixel pipelines: 16
Superscalar shader: Yes
Pixel shader operations per pixel: 8
Pixel shader operations per clock: 128
Pixel shader precision: 32 bits
Single-texture pixels per clock: 16
Dual-texture pixels per clock: 8
Adaptive anisotropic filtering: Yes
Z-stencil pixels per clock: 32

ADVANCED VIDEO AND DISPLAY FUNCTIONALITY
  • Dedicated on-chip video processor
  • MPEG video encode and decode
  • WMV9 decode acceleration
  • Advanced adaptive de-interlacing
  • High-quality video scaling and filtering
  • Integrated NTSC/PAL TV encoder with built-in Macrovision copy protection, supporting resolutions up to 1024x768 without the need for panning
  • DVD and HDTV-ready MPEG-2 decoding up to 1920x1080i resolutions
  • Dual integrated 400 MHz RAMDACs for display resolutions up to and including 2048x1536 at 85Hz.
  • Dual DVO ports for interfacing to external TMDS transmitters and external TV encoders
  • Microsoft® Video Mixing Renderer (VMR) supports multiple video windows with full video quality and features in each window
  • VIP 1.1 interface support for video-in function
  • Full NVIDIA® nView® multi-display technology capability

NVIDIA® DIGITAL VIBRANCE CONTROL™ (DVC) 3.0

  • DVC color controls
  • DVC image sharpening controls

OPERATING SYSTEMS

  • Windows XP
  • Windows ME
  • Windows 2000
  • Windows 9X
  • Macintosh OS, including OS X
  • Linux

API SUPPORT

  • Complete DirectX support, including the latest version of Microsoft DirectX 9.0
  • Full OpenGL, including OpenGL 1.5

* The operating system or APIs can impose limits, but the hardware does not limit shader program length.

 
 
The Chip

Substrate & Die

A Wafer & The Official Badge

Physically, the GeForce 6800 Ultra looks much like the GeForce FX 5950 Ultra, but don't let initial impressions fool you. The GeForce 6800 Ultra incorporates some cutting-edge technology and has quite a few new and useful features. If it's not entirely clear in these photos, the GeForce 6800 Ultra's cooling solution is still a two-slot design; the model we have here does encroach on the first PCI slot. The blower and shroud are designed to pull air in through the front and blow it across the heatsinks mounted over the GPU and RAM. When operating at full speed, we found the fan to be somewhat louder than the ones installed on most of the retail-ready 5950 Ultras we have reviewed, but we expect NVIDIA's AIC partners will come up with some innovative cooling solutions of their own. We wouldn't be surprised if a few single-slot, near-silent models hit store shelves in the coming months.

Also notice that our sample was equipped with two DVI connectors, for those looking to run dual independent digital displays (or dual analog displays using DVI-to-DB15 adapters). Dual-DVI cards have been few and far between, but we're told that not all 6800 Ultras will be dual-DVI, so don't get too excited just yet. Some will ship with one DB15 and one DVI connector. The next aspect of the GeForce 6800 Ultra that may catch your eye is its pair of Molex power connectors. The NV40 core is built on a 0.13 micron manufacturing process and is composed of roughly 222 million transistors. If you're keeping track, that's approximately 25% more transistors than a Pentium 4 Extreme Edition CPU, which makes the NV40 an extremely complex (and large) ASIC. As such, it demands a lot of power. NVIDIA is recommending that a 480W power supply be used with the GeForce 6800 Ultra. The two power cables can't be split off of a single line, either; the GeForce 6800 Ultra requires connections from two separate supplemental power rails. Keep this in mind if you think a GeForce 6800 Ultra is in your future, as a power supply upgrade may be in order as well. The need for this kind of "external" power stems from a limitation in the AGP spec. An AGP slot can provide a maximum of only 25W of power to the video card. PCI Express should help alleviate the situation a bit, as the PCI Express standard calls for 60W.


We also disassembled the cooling hardware mounted on our GeForce 6800 Ultra to get a closer look at the underlying PCB, the memory, and the NV40 chip itself. Once we had the card apart, it was interesting to find that the heatsinks used on the card were constructed of aluminum instead of copper. During conversations with NVIDIA, we were told that although the NV40's die is larger and requires a lot of power, it runs cooler than the NV38 because of tweaks made to the manufacturing process and its slightly lower clock speed. We haven't done extensive testing, but our experience with the card so far seems to back up this claim. At idle, we witnessed core temperatures hovering around 36°C. After a few hours of benchmarking, though, temperatures climbed into the upper 40s. With more elaborate (and expensive) copper coolers, temperatures could likely be brought down further.

It was also interesting to find that all 256MB (8x32MB) of the RAM installed on the card was mounted on one side of the PCB. Samsung's GDDR3 chips (K4J55323QF) are available in higher densities than standard DDR RAM, which eliminates the need to mount chips on the backside of the card, unless NVIDIA plans to increase the RAM beyond 256MB.

Right in the center of the PCB, you'll see the massive NV40 itself. The die is so large because NVIDIA has designed the NV40 to have far more pixel shading performance than last generation's high-end parts. The NV38 (GeForce FX 5950 Ultra), for example, is a 4x2 or 8x0 (Z-only, no color data) architecture. In most real-world gaming scenarios, this means it can process 4 dual-textured pixels per clock cycle, which equates to a theoretical peak fillrate of 3.8 GTexels/s. The NV40, on the other hand, is a 16x1 or 32x0 architecture. The NV40 has four times as many pixel pipelines as the NV38, which accounts for a large number of the transistors that comprise the core. NVIDIA didn't just increase the number of pipelines; they also incorporated a second pixel shader unit per pipe and brought the number of vertex units up to 6.
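
For a quick sanity check on those fillrate figures, peak texel fillrate is simply the core clock multiplied by the number of texture samples the chip can take per clock. The sketch below runs that arithmetic using the NV38's 475MHz core clock and, as an assumption on our part, a 400MHz core clock for the GeForce 6800 Ultra:

#include <cstdio>

int main()
{
    // Clock speeds: 475MHz for the GeForce FX 5950 Ultra (NV38); the 400MHz
    // figure for the GeForce 6800 Ultra (NV40) is our assumption here.
    const double nv38ClockHz = 475e6;
    const double nv40ClockHz = 400e6;

    // NV38: 4 pixel pipelines x 2 texture units each = 8 texture samples/clock.
    const double nv38TexelRate = nv38ClockHz * 4 * 2;   // ~3.8 GTexels/s

    // NV40: 16 pixel pipelines x 1 texture unit each = 16 texture samples/clock.
    const double nv40TexelRate = nv40ClockHz * 16 * 1;  // ~6.4 GTexels/s

    std::printf("NV38 peak fillrate: %.1f GTexels/s\n", nv38TexelRate / 1e9);
    std::printf("NV40 peak fillrate: %.1f GTexels/s\n", nv40TexelRate / 1e9);
    return 0;
}

Even at a lower clock, the wider 16x1 design comes out well ahead in raw texturing throughput, and the second shader unit per pipe adds math throughput on top of that.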

Although we'll be shining the spotlight on the GeForce 6800 Ultra today, NVIDIA is also announcing the 6800 "non-Ultra". The GeForce 6800 is based on the same NV40 architecture, but it will have "only" 12 pixel pipelines, as opposed to the Ultra's 16. NVIDIA hasn't disclosed final clock speeds just yet, but they have informed us that the base GeForce 6800 will be a single-slot design (like the card pictured at the very top of this page), and it will require only one Molex power connection. As more solid information about the GeForce 6800 comes in, we're sure we'll be able to tell you more, but for now, let's move on to some of the GeForce 6800 Ultra's other key features...

New Features & The Drivers

