Technical Brief

Lumenex Engine: The New Standard in GPU Image Quality

November 2006

TB-02824-001_v01

Introduction to the Lumenex Engine

At NVIDIA we are extremely passionate about image quality. The people who design our award-winning NVIDIA® GeForce® graphics processors hail from a variety of backgrounds. Some came with experience in high-end workstation systems, where thousands of fine lines had to be rendered with the utmost precision. Others spent their careers in CGI, where pixel shaders could run for days on end to produce the subtle effects that made great films like The Incredibles and Cars. When these engineers turned their attention to our next-generation architecture—the GeForce 8800—they set out to build a GPU with the best image processing engine in the world. They named the new technology the NVIDIA® Lumenex™ engine.

Lumenex comes from the two Latin words luminosus and lumens. It symbolizes the amazing quality of light—at once both bright and scintillating. Before the introduction of the GeForce 8800 GPU Series, PC-based graphics chips could not live up to this ideal for a variety of reasons. Chief among them was the conflict between rendering well and rendering quickly; graphics processors simply did not have the resources to render a scene in its most faithful representation without slowing to a crawl. The result was watered-down images that were neither crisp nor luminous.

The GeForce 8800 with the Lumenex engine solves these problems and raises image quality to the next level. The new Lumenex engine brings several key innovations:

16× Coverage Sampling Antialiasing (CSAA)

16× near-perfect angle-independent anisotropic filtering

16-bit and 32-bit floating-point texture filtering

Fully orthogonal 128-bit high dynamic-range (HDR) rendering with all the above features

A full 10-bit display pipeline

Lumenex Antialiasing Engine

Since NVIDIA introduced multisample antialiasing (MSAA) to the industry in 2001, gamers have embraced the new graphics possibilities with smooth edges and crisp textures. Over the years we continually improved our antialiasing engine, bringing features such as gamma-corrected antialiasing and transparency antialiasing for alpha textures. With the GeForce 8800 architecture, we were given the chance to completely rethink our antialiasing strategy and design a solution that sets a new standard in interactive graphics.

The current method of antialiasing relies on using multiple subpixel samples to calculate the color of object silhouettes. Storing and reading multiple samples from memory requires a proportionate increase in resources as the number of samples increases. For example, 4× multisampling requires four times the storage and ROP bandwidth of standard rendering. NVIDIA GPUs, having been designed with multisampling in mind, can perform 4× MSAA at high resolutions with little performance degradation. However, attaining even higher quality requires additional samples, which was infeasible on prior generations of hardware.
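
As a rough illustration of that cost, the sketch below estimates the color and depth storage for a 1600 × 1200 render target with and without 4× multisampling. The 32-bit color and depth formats and the absence of framebuffer compression are simplifying assumptions made only for the arithmetic, not a description of the GeForce 8800's actual memory layout.

# Rough storage estimate for a multisampled render target.
# Assumes 32-bit color + 32-bit depth per sample and no framebuffer
# compression -- simplified numbers, not actual GeForce 8800 figures.
def msaa_buffer_bytes(width, height, samples, bytes_per_sample=4 + 4):
    """Color + depth storage for one render target with N samples per pixel."""
    return width * height * samples * bytes_per_sample

base = msaa_buffer_bytes(1600, 1200, samples=1)
msaa4 = msaa_buffer_bytes(1600, 1200, samples=4)
print(f"1x: {base / 2**20:.1f} MiB")    # ~14.6 MiB
print(f"4x: {msaa4 / 2**20:.1f} MiB")   # ~58.6 MiB, four times the storage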

The Lumenex engine was designed with one goal in mind: to provide the highest image quality with the lowest performance impact. To realize this goal, we designed an antialiasing subsystem that employs a new algorithm called Coverage Sampling Antialiasing (CSAA), introduced with the GeForce 8800 GPUs. Unlike brute-force multisampling, Coverage Sampling Antialiasing uses intelligent coverage information to perform ultrahigh-quality antialiasing without bogging down the memory system.
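
The brief does not spell out how coverage information is stored, but the idea can be illustrated with a toy resolve step. The sketch below assumes a pixel keeps a small set of fully stored colors plus a larger set of lightweight coverage samples that merely index into those colors; the specific counts and the averaging rule are illustrative assumptions, not a description of the hardware.

# Toy CSAA-style resolve for a single pixel (illustrative only).
# Assumption: a few fully stored colors plus 16 coverage samples that
# each reference one stored color. Edge gradients get 16 levels of
# blending while far fewer full color samples are written to memory.
def resolve_pixel(stored_colors, coverage_indices):
    n = len(coverage_indices)
    return tuple(
        sum(stored_colors[i][c] for i in coverage_indices) / n
        for c in range(3)
    )

# A silhouette pixel: 11 of 16 coverage samples hit a white triangle,
# the remaining 5 hit the black background.
colors = [(1.0, 1.0, 1.0), (0.0, 0.0, 0.0)]
coverage = [0] * 11 + [1] * 5
print(resolve_pixel(colors, coverage))   # (0.6875, 0.6875, 0.6875)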

The Lumenex engine sets a new standard in antialiasing by raising the total number of samples per pixel to 16—an ultrahigh quality often used in offline rendering. The resulting images show lines with near-perfect gradients, dramatically reduced shimmering, and unrivalled picture clarity.

In bandwidth-constrained scenarios, traditional GPUs slowed down drastically when rendering with 16× antialiasing. The Lumenex engine, however, was designed for high performance, so the GeForce 8800 GTX performs 16× antialiasing at nearly the same speed as traditional 4× MSAA. This is a significant breakthrough for antialiasing in interactive graphics—for the first time, graphics can be rendered with near-CGI-quality antialiasing at real-time frame rates.

Case Study: Battlefield 2

Figure 1 is a screenshot from the popular game Battlefield 2. The screenshot was taken at 1600 × 1200, a reasonably high resolution. But as is evident in the highlighted boxes (see the enlargements in Figure 2), aliasing is still prevalent. This example illustrates why aliasing cannot be eliminated by merely increasing the screen resolution—there will always be lines and details fine enough to cause aliasing at any resolution.

In the next section we see what a dramatic difference the Lumenex engine's Coverage Sampling Antialiasing makes to the image quality.

Figure 1. Examples of aliasing (image taken from Battlefield 2)

The three pairs of images shown in Figure 2 compare the results of default rendering and 16× CSAA. The crane without antialiasing is an awkward mixture of jagged edges and missing pixels. With 16× CSAA enabled, all edges are rendered smoothly and the fine lines on the right are accurately depicted.

The second image shows the improvements in high-contrast areas. The default rendering causes distracting discontinuities in the building's edges. With 16× CSAA, the lines are perfectly smoothed out.

Finally, the third image depicts a special case—aliasing on alpha textures. Traditional antialiasing techniques cannot detect alpha textures, so they are not effective on these objects. However, the Lumenex antialiasing engine supports transparency antialiasing, enabling smooth rendering of foliage, chain-link fences, and other alpha textures.

Figure 2. Comparing results of default rendering and 16× CSAA (image taken from Battlefield 2)

Case Study: Half-Life 2

Half-Life 2 shows complex indoor and outdoor environments with high dynamic-range lighting. With the GeForce 7 Series of GPUs, antialiasing could not be used in conjunction with high dynamic-range lighting. The Lumenex engine, however, handles all scenes equally well, providing the highest image quality with no limitations (Figure 3).

Figure 3. The Lumenex engine provides the highest image quality with no limitations (image taken from Half-Life 2)

No AA vs. 4× MSAA vs. 16× CSAA

Figure 4 compares the differences between no antialiasing, traditional 4× multisampling, and 16× CSAA in Half-Life 2. Default rendering once again depicts crude, jagged edges. 4× MSAA offers noticeable relief, but the gradient steps are still clearly visible. 16× CSAA shows far superior gradation, with jagged edges smoothed out to a near-perfect transition.

Figure 4. Comparing no AA vs. 4× MSAA vs. 16× CSAA

Incredible Performance

With Lumenex, high image quality doesn't mean low performance (Table 1). For most applications, 16× CSAA costs only 10 to 20 percent more than standard 4× MSAA.

Table 1. Comparing Performance

Application          1600 × 1200, 4× MSAA    1600 × 1200, 16× CSAA
3DMark 2006          7419 3DMarks            7044 3DMarks
Call of Duty         93.6 FPS                88.0 FPS
FarCry               134.3 FPS               113.2 FPS
X3: The Reunion      78.9 FPS                67.0 FPS
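
For reference, the relative cost implied by Table 1 can be computed directly; the short sketch below reproduces the percentage slowdown for each entry, using only the figures from the table above.

# Percentage slowdown of 16x CSAA relative to 4x MSAA, from Table 1.
results = {
    "3DMark 2006":     (7419.0, 7044.0),   # 3DMarks
    "Call of Duty":    (93.6, 88.0),       # FPS
    "FarCry":          (134.3, 113.2),     # FPS
    "X3: The Reunion": (78.9, 67.0),       # FPS
}
for name, (msaa4, csaa16) in results.items():
    drop = (msaa4 - csaa16) / msaa4 * 100
    print(f"{name:16s} {drop:4.1f}% slower with 16x CSAA")
# Output: roughly 5%, 6%, 16%, and 15% respectively.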

Lumenex Texture Filtering Engine

Antialiasing removes artifacts on polygon edges, but the interior of polygons, where textures are applied, does not receive any treatment. To display textures with all their fine details, the GPU must perform high-quality texture filtering.

Textures viewed in 2D rarely need filtering, since one pixel in the texture corresponds to one pixel on the screen; at 100 percent view, the texture is depicted with perfect accuracy. Viewing the texture at 25 percent zoom requires resampling the image to fit into a smaller area. In this case, every 2 × 2 block of 4 pixels is averaged down to 1, reducing the image to a quarter of its original pixel count. This is a very simple form of texture filtering.
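
The averaging described above amounts to a 2 × 2 box filter. The following minimal sketch performs that reduction on a small grayscale image; it assumes even dimensions and ignores everything a real texture unit handles, such as mipmap chains and texture formats.

# Minimal 2x2 box-filter downsample: average every 4 pixels down to 1.
def downsample_2x2(img):
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            row.append((img[y][x] + img[y][x + 1] +
                        img[y + 1][x] + img[y + 1][x + 1]) / 4.0)
        out.append(row)
    return out

tex = [[0, 0, 255, 255],
       [0, 0, 255, 255],
       [255, 255, 0, 0],
       [255, 255, 0, 0]]
print(downsample_2x2(tex))   # [[0.0, 255.0], [255.0, 0.0]]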

In 3D applications, textures are almost never seen at 100 percent view and are frequently viewed at an angle relative to the screen. They usually recede from the viewer, much like the opening title of Star Wars. Textures in this oblique orientation are difficult to depict accurately. The GPU must take into account the angle at which the texture faces the screen and take multiple samples from the texture at different locations. This process is known as anisotropic texture filtering.

Modern GPUs can take up to 128 texture samples for each screen pixel when conducting anisotropic texture filtering. This sampling provides high-quality filtering, but requires enormous bandwidth. Applied indiscriminately, it can dramatically slow down the application.
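
A simplified way to picture this is to spread several texture taps along the long axis of the pixel's footprint in texture space and average them, as in the sketch below. Real hardware combines mipmapped, weighted taps; the uniform spacing, the toy checkerboard texture, and the tap count here are illustrative assumptions only.

# Toy anisotropic sampling: average several taps spread along the long
# axis of a pixel's footprint in texture space. Oblique surfaces have
# elongated footprints, which is why many taps per pixel are needed.
def aniso_sample(texture_fn, center_u, center_v, axis_u, axis_v, taps):
    total = 0.0
    for i in range(taps):
        t = (i + 0.5) / taps - 0.5           # spread taps from -0.5 to +0.5
        total += texture_fn(center_u + t * axis_u, center_v + t * axis_v)
    return total / taps

checker = lambda u, v: float((int(u * 8) + int(v * 8)) % 2)   # toy texture
# A steeply angled surface: the footprint is long in u and thin in v.
print(aniso_sample(checker, 0.5, 0.5, axis_u=0.25, axis_v=0.0, taps=16))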

To get around this performance penalty, GPUs typically enable high-quality anisotropic filtering only at certain angles. For example, most games depict straight corridors with walls erected at 90 degrees. Likewise, most of the game geometry is placed perpendicular to the floor. Previous GPUs optimized their texture filtering engines by fully filtering only surfaces at these key angles. The result was that all objects parallel to the walls and ceiling were correctly filtered, while objects at other angles received only approximate filtering. While this approach was a reasonable trade-off at the time, today's titles are quite a different story from the rectangular corridors and box rooms of the last generation. To enable the maximum image quality for today's games, far better texture filtering is required.

Case Study: Unreal Tournament 2004

Figure 5 is a scene from Unreal Tournament 2004 that shows the limits of texture filtering on today's GPUs.

Figure 6 and Figure 7 show close-ups of the red box in Figure 5. The ramp is divided into three sections. Section A receives good filtering due to its simple 90-degree projection. However, sections B and C are at an angle and so receive little filtering, resulting in blurred textures with little detail (Figure 6).

Figure 5. Filtering

Figure 6. Regular texture filtering

The Lumenex engine delivers a more robust anisotropic filtering algorithm that accounts for all surfaces, regardless of their orientation. Figure 7 is the same image rendered on the GeForce 8800 GTX. Note how sections B and C are better defined.

Figure 7. Anisotropic texture filtering on the GeForce 8800 GTX

Near-Perfect Results

To take the Lumenex engine to its limits we put it through the 'torture test.' This test consists of a cylindrical tunnel that effectively captures all possible angles to which textures can be mapped. If the hardware does not apply anisotropic filtering to all portions of the scene, artifacts are produced.

In Figure 8, the left scene is rendered with traditional texture filtering. The result shows glaring streaks appearing at 45-degree intervals. In a 3D scene, these areas would receive the lowest-quality filtering.

The right side in Figure 8 is the Lumenex engine at work. The result is a near-perfect circle—the ideal result for this test. Translated to a 3D scene, this means near-perfect results at any angle.

Figure 8. Default anisotropic texture filtering (GeForce 7 Series on left, GeForce 8 Series on right)

128-Bit High Dynamic-Range Rendering

High dynamic-range (HDR) rendering is a technique used to render scenes with large variations of brightness, producing images that exhibit lifelike contrast and tone. Almost all of today's popular games employ HDR rendering—FarCry, Half-Life 2: Episode One, and The Elder Scrolls IV: Oblivion are just a few examples.

Most HDR graphics engines employ 16 bits per color component (red, green, blue, and alpha), or a total of 64 bits, for high dynamic-range rendering. While this is fine today, future applications will require greater precision.

The Lumenex engine is designed for the highest level of precision by offering 32-bit floating-point precision for each color component, or a total of 128 bits for high dynamic-range rendering—a level of accuracy that exceeds many film renderers (Figure 9). This format is also especially useful for scientific computing, where 32-bit precision is a common standard. By offering 32-bit scalar precision and 128-bit vector precision, the Lumenex engine is fully prepared for tomorrow's advanced applications.
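
The precision gap between 16-bit and 32-bit floating-point components can be seen directly by rounding a bright HDR value through each format. The sketch below uses NumPy's half- and single-precision types as stand-ins for the storage formats; it is an approximation of the numerics, not the GPU's actual rendering path.

# 16-bit vs. 32-bit floating-point color components, using NumPy types
# as stand-ins for the GPU storage formats.
import numpy as np

hdr_value = 1000.1234                    # a bright HDR luminance value
as_fp16 = np.float16(hdr_value)
as_fp32 = np.float32(hdr_value)
print(float(as_fp16) - hdr_value)        # about -0.12: detail is lost
print(float(as_fp32) - hdr_value)        # well under 1e-4: value preserved
print(float(np.finfo(np.float16).max))   # 65504.0: limited range
print(float(np.finfo(np.float32).max))   # ~3.4e38: far wider range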

Figure 9. High dynamic-range rendering (image courtesy of Masaki Kawase)

10-Bit Display Pipeline

Today's displays use 8 bits of information for each primary color, allowing a total of 16.7 million colors to be displayed. The human eye, however, is sensitive to a much greater range of colors and brightness.

To bring the full spectrum of colors to life, the Lumenex engine is built with a full 10-bit display pipeline. This allows over a billion unique colors to be displayed—sixty-four times more than standard 8-bit color. With the next generation of 10-bit content and displays, the Lumenex engine will be able to display images of amazing depth and richness.
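
The arithmetic behind those figures is straightforward, as the short sketch below shows.

# Color counts for 8-bit vs. 10-bit per-component display pipelines.
colors_8bit = (2 ** 8) ** 3      # 16,777,216 -- about 16.7 million
colors_10bit = (2 ** 10) ** 3    # 1,073,741,824 -- over a billion
print(colors_10bit // colors_8bit)   # 64: "sixty-four times more"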

Conclusion

The Lumenex engine sets a new standard in image quality. It introduces the industry's highest-quality antialiasing with 16× CSAA, enabling studio-quality rendering with lightning-fast performance.

Texture filtering is taken to a new level with near-perfect results at every angle, allowing next-generation games to be rendered with the highest level of detail. Combined with full support for 128-bit HDR rendering and a 10-bit display subsystem, the Lumenex engine represents the new gold standard in image quality.

Image Courtesy of Futuremark

Notice

ALL NVIDIA DESIGN SPECIFICATIONS, REFERENCE BOARDS, FILES, DRAWINGS, DIAGNOSTICS, LISTS, AND OTHER DOCUMENTS (TOGETHER AND SEPARATELY, "MATERIALS") ARE BEING PROVIDED "AS IS." NVIDIA MAKES NO WARRANTIES, EXPRESSED, IMPLIED, STATUTORY, OR OTHERWISE WITH RESPECT TO THE MATERIALS, AND EXPRESSLY DISCLAIMS ALL IMPLIED WARRANTIES OF NONINFRINGEMENT, MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE.

Information furnished is believed to be accurate and reliable. However, NVIDIA Corporation assumes no responsibility for the consequences of use of such information or for any infringement of patents or other rights of third parties that may result from its use. No license is granted by implication or otherwise under any patent or patent rights of NVIDIA Corporation. Specifications mentioned in this publication are subject to change without notice. This publication supersedes and replaces all information previously supplied. NVIDIA Corporation products are not authorized for use as critical components in life support devices or systems without express written approval of NVIDIA Corporation.

Trademarks

NVIDIA, the NVIDIA logo, GeForce, and Lumenex are trademarks or registered trademarks of NVIDIA Corporation in the United States and other countries. Other company and product names may be trademarks of the respective companies with which they are associated.

Copyright

© 2006 NVIDIA Corporation. All rights reserved.

NVIDIA Corporation

2701 San Tomas Expressway
