Monday, April 20, 2009

Rasterization: King of Anti-Aliasing?

With the ongoing marketing of interactive ray tracing, I have continued to evaluate why rasterization will remain the dominant rendering method in games for quite some time. Ray tracing is considered by many to be the be-all and end-all of computer graphics, and some people seem to simply turn their brains off when it comes time to discuss its disadvantages. While conversing with an old colleague on that note, I realized I've never met a rasterization fanatic who insists everything has to be rasterized. Wouldn't it make more sense to use each technique in the areas where it excels? As you read the next part, keep in mind that rasterization still dominates the movie industry for good reasons, and that those same principles apply to games.

Rasterization: King (or Queen) of Anti-Aliasing
It's pretty much undisputed (among sane individuals) that rasterization trumps ray tracing in terms of anti-aliasing. When Pixar's REYES renderer was being developed, one of the primary goals was high-quality anti-aliasing. This is one of rasterization's biggest advantages and one of the principal reasons off-line renderers still use rasterization for visibility. It comes down to a basic difference between ray tracers and rasterizers. A ray tracer fires one ray per pixel and computes an intersection for that ray. To anti-alias, it fires more rays into the scene per pixel: if you want n samples, you fire n rays and compute n intersections. Those intersections happen in 3D and are relatively expensive to compute. A rasterizer, on the other hand, does a lot of work up front to get a primitive onto the screen, but once it's there, it can be sampled efficiently in 2D. So rasterizers project geometry into 2D, and we'd like to sample in 2D since we're making a 2D image anyway. How convenient.
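To make that cost model concrete, here's a minimal sketch (not the author's code; the sphere, the camera model, and every name in it are made up for illustration) of what an n-by-n supersampled pixel costs a ray tracer: each sample is another full 3D intersection test.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// One 3D ray-sphere intersection: does |o + t*d - c|^2 = r^2 have a real
// solution? (For brevity this ignores the t >= 0 check a real tracer needs.)
// Every anti-aliasing sample pays for a test like this, usually after
// traversing an acceleration structure to find candidate primitives.
bool hitSphere(Vec3 o, Vec3 d, Vec3 c, float r) {
    Vec3  oc   = sub(o, c);
    float b    = 2.0f * dot(oc, d);
    float cc   = dot(oc, oc) - r * r;
    float disc = b * b - 4.0f * dot(d, d) * cc;
    return disc >= 0.0f;
}

// n*n samples per pixel means n*n full 3D intersection tests:
// the cost scales linearly with the sample count.
int coveredSamples(Vec3 eye, Vec3 c, float r, int n /* samples per axis */) {
    int covered = 0;
    for (int j = 0; j < n; ++j)
        for (int i = 0; i < n; ++i) {
            // Hypothetical pinhole camera: jitter the direction per sample.
            Vec3 dir = { (i + 0.5f) / n - 0.5f, (j + 0.5f) / n - 0.5f, 1.0f };
            if (hitSphere(eye, dir, c, r)) ++covered;
        }
    return covered;
}
```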

[Figure: a pixel subdivided into a grid of sample points. Once the primitive is on the screen, rasterizers can crank out the samples.]

Imagine the pixel above, and suppose we want nine samples instead of one to produce a more alias-free image. With ray tracing, each sample is another expensive intersection test. With rasterization, the renderer simply has to decide, "Does this sample lie inside the primitive or outside?" Hrmm, which one seems simpler? That may seem trite since a computer is doing all the work, but it's the difference between solving a complex 3D intersection and a simple 2D region test. With ray tracing, firing n times more rays is roughly n times slower. With rasterization, the brunt of the work is getting the primitive onto the screen; once it's projected into 2D, samples are dirt cheap.
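And here's a minimal sketch of that 2D region test, assuming the primitive is a triangle that has already been projected and set up. The edge-function formulation is the standard rasterization trick; the vertex names and the 3x3 grid are just for illustration.

```cpp
// After projection, a triangle is just three 2D edges; a sample is "inside"
// if it sits on the interior side of all three.
struct Point2 { float x, y; };

// Signed-area test: result > 0 means p lies to the left of edge a->b.
static float edge(Point2 a, Point2 b, Point2 p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// Count how many of the 3x3 sample points in pixel (px, py) a
// counter-clockwise triangle covers. Each sample costs three
// multiply-subtracts: no 3D math, no rays, no acceleration structure.
int coverage3x3(Point2 v0, Point2 v1, Point2 v2, int px, int py) {
    int covered = 0;
    for (int j = 0; j < 3; ++j)
        for (int i = 0; i < 3; ++i) {
            Point2 s = { px + (i + 0.5f) / 3.0f, py + (j + 0.5f) / 3.0f };
            if (edge(v0, v1, s) >= 0.0f &&
                edge(v1, v2, s) >= 0.0f &&
                edge(v2, v0, s) >= 0.0f)
                ++covered;
        }
    return covered;
}
```

Hardware rasterizers also exploit the fact that these edge functions are linear, so they can be evaluated incrementally across a whole tile of samples, which makes each additional sample cheaper still.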

Who cares about anti-aliasing?
The topic of anti-aliasing certainly isn't sexy. It doesn't require secondary rays, funky texture maps, or any complex data structures. It's usually just more work. But it's the one thing we haven't seen really pushed on GPUs so much as settled on: a modern video card can do 16x full-screen anti-aliasing and could certainly do much more.

Look at the image below. There's nothing really complicated in this scene--mirrored reflections, simple smoke, some large textures, and a lot of little details--certainly nothing that would require a ray tracer. For all I know this image could have been ray traced--probably, considering every modeler ships with some kind of ray tracer--but it screams for a rasterizer. You'll soon see games turning this stuff out in real time.



Anti-aliasing in games
With production studios sometimes requiring more than 64 samples per pixel, it's no wonder companies are still rasterizing away. Current graphics cards easily perform 8x sampling per pixel while ray tracers chug away on beastly machines just to compete, so it would take very little for GPUs to flex their anti-aliasing muscle with even more samples. Anti-aliasing right now is pretty good and it's only getting better. Movie-quality anti-aliasing in a game? Woo hoo!
