Author Topic: ANTIALIASING  (Read 951 times)

Offline Vulcan

  • Plutonium Member
  • *******
  • Posts: 9891
Re: ANTIALIASING
« Reply #15 on: November 07, 2009, 09:15:54 PM »
This is 50% truth and 50% untrue scoffing :)

A monitor, LCD or CRT, cannot render something smaller than its dot pitch. At a certain point anti-aliasing becomes blurring, and you lose clarity.
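A rough sketch of the arithmetic behind that (the 24" 1920x1080 panel here is just a made-up example, plug in your own numbers): the physical pixel pitch is the smallest feature a panel can address, so any anti-aliased gradient narrower than that gets smeared across whole pixels.

[code]
/* Pixel pitch from diagonal size and resolution.
 * The 24" 1920x1080 monitor is a hypothetical example. */
#include <math.h>
#include <stdio.h>

int main(void) {
    double diag_in = 24.0;             /* hypothetical panel diagonal, inches */
    double res_x = 1920.0, res_y = 1080.0;

    /* diagonal resolution in pixels */
    double diag_px = sqrt(res_x * res_x + res_y * res_y);

    /* physical pitch of one pixel, in millimetres */
    double pitch_mm = diag_in * 25.4 / diag_px;

    printf("pixel pitch: %.3f mm\n", pitch_mm);   /* ~0.277 mm */
    return 0;
}
[/code]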

Offline eagl

  • Platinum Member
  • ******
  • Posts: 6769
Re: ANTIALIASING
« Reply #16 on: November 07, 2009, 09:55:37 PM »
Quote from: Vulcan on November 07, 2009, 09:15:54 PM
A monitor, LCD or CRT, cannot render something smaller than its dot pitch. At a certain point anti-aliasing becomes blurring, and you lose clarity.

At those high resolutions, the clarity loss is VASTLY outweighed by the elimination of digital artifacts. I learned graphics programming as a pup, actually writing the code for a half-dozen different antialiasing techniques, including the math (matrix manipulation, ugh!). I spent a lot of time on the subject, and the simple fact is that eliminating digital artifacts is critical to achieving realistic imaging and getting that "suspension of disbelief" in gaming.
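For anyone curious, here's a minimal sketch of the simplest of those techniques, ordered-grid supersampling (SSAA) with a box-filter downsample. The grayscale buffer layout and names are just illustrative, not any particular engine's code:

[code]
/* Downsample a (factor*lo_w) x (factor*lo_h) grayscale buffer into
 * lo_w x lo_h by box-filtering each factor x factor tile of
 * subsamples into one display pixel. */
void ssaa_downsample(const unsigned char *hi, unsigned char *lo,
                     int lo_w, int lo_h, int factor)
{
    int hi_w = lo_w * factor;
    for (int y = 0; y < lo_h; y++) {
        for (int x = 0; x < lo_w; x++) {
            unsigned sum = 0;
            /* average the factor x factor block of subsamples */
            for (int sy = 0; sy < factor; sy++)
                for (int sx = 0; sx < factor; sx++)
                    sum += hi[(y * factor + sy) * hi_w + (x * factor + sx)];
            lo[y * lo_w + x] =
                (unsigned char)(sum / (unsigned)(factor * factor));
        }
    }
}
[/code]

The box filter is exactly where the "blur" Vulcan mentions comes from: edge coverage that used to be all-or-nothing per pixel becomes a weighted average.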

If you believe that image sharpness is best, then by all means turn off image processing and advanced graphics techniques.  But you're missing out.  If you don't believe me, take a peek at some of the DX10 and DX11 "reviews" that look at the actual technology and algorithms that are being used.  To a great degree, many of the newest graphics features dramatically reduce sharpness, especially ones dealing with volumetric fog effects and HDR lighting.  In a static screenshot, the images sometimes even look pretty horrible compared to the same scene without the new techniques.  But put them into dynamic motion, and they look far more realistic. 
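To make that sharpness trade concrete, here's a minimal sketch of Reinhard's global tone-mapping operator, one common way an HDR lighting pipeline compresses scene luminance into display range (the sample values below are made up). Bright values get squeezed together, which is exactly why the static screenshot looks flatter:

[code]
/* Reinhard global tone mapping: L_out = L / (1 + L).
 * Compresses unbounded HDR luminance into [0, 1). */
#include <stdio.h>

static double reinhard(double luminance) {
    return luminance / (1.0 + luminance);
}

int main(void) {
    /* hypothetical HDR luminance samples, arbitrary linear units */
    double samples[] = { 0.1, 1.0, 10.0, 100.0 };
    for (int i = 0; i < 4; i++)
        printf("%8.1f -> %.3f\n", samples[i], reinhard(samples[i]));
    return 0;   /* 10 and 100 land at 0.909 and 0.990: contrast lost */
}
[/code]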

If you want to see the jaggies, then keep all the sharpness you want. Heck, some monitors and cards will let you apply additional sharpness processing so you can see even more "detail". It'll look like crap, no matter how high your resolution is. If you want to eliminate digital artifacts including moire effects, crawling edges, and fences with portions that turn invisible when you look at them from certain angles, you're going to have to apply antialiasing and sacrifice a bit of sharpness to make the image look better. And I say "better" in the sense that at any resolution you can display, seeing an entire fence that is blurred by 4xSSAA looks far more realistic than a fence that shimmers when you move, has sections that disappear when you stop, and has jaggies on every portion that is visible. I'll take realistic and reduced digital artifacting over unrealistic "sharpness" any day. And I did the ugly ugly math and wrote the code to prove what I was seeing wasn't just wishful thinking, back when I first came to hate digital artifacts.
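A minimal sketch of that disappearing-fence effect (illustrative numbers, not my original code): a single sample at the pixel centre can miss a 0.4-pixel-wide slat entirely, while four evenly spaced subsamples always catch a slat wider than their 0.25-pixel spacing:

[code]
#include <stdio.h>

/* 1.0 if position px lies inside the slat [left, left + width) */
static double slat(double px, double left, double width) {
    return (px >= left && px < left + width) ? 1.0 : 0.0;
}

int main(void) {
    double left = 0.55, width = 0.4;   /* hypothetical slat, pixel units */

    /* point sampling: one sample at the pixel centre (0.5) */
    double point = slat(0.5, left, width);

    /* 4x ordered-grid supersampling across the pixel [0, 1) */
    double sum = 0.0;
    for (int i = 0; i < 4; i++)
        sum += slat((i + 0.5) / 4.0, left, width);
    double ssaa = sum / 4.0;

    /* prints: point sample: 0.00   4x SSAA: 0.50 */
    printf("point sample: %.2f   4x SSAA: %.2f\n", point, ssaa);
    return 0;
}
[/code]

The point-sampled fence section vanishes; the supersampled one renders as a half-covered (yes, blurred) pixel that stays put as you move.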
Everyone I know, goes away, in the end.