tags: directx
views: 514
answers: 2

I have a little problem. I recently created an algorithm that draws thick lines onscreen (as quads), but when a line is long and diagonal the aliasing is very strong and the line looks bad. What are my chances of reducing the aliasing while keeping performance high?

I'm using (as the tags say) DirectX as my graphics API.
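
For reference, here is a minimal sketch of the kind of quad expansion described above, in plain 2D; the Vec2 struct and the LineToQuad name are hypothetical, not part of any DirectX API:

    #include <cmath>

    struct Vec2 { float x, y; };

    // Expand segment (a, b) into the four corners of a quad of the given
    // thickness by offsetting along the segment's unit normal.
    void LineToQuad(Vec2 a, Vec2 b, float thickness, Vec2 quad[4])
    {
        float dx = b.x - a.x, dy = b.y - a.y;
        float len = std::sqrt(dx * dx + dy * dy);
        float nx = -dy / len, ny = dx / len;   // unit normal
        float h  = thickness * 0.5f;
        quad[0] = { a.x + nx * h, a.y + ny * h };
        quad[1] = { b.x + nx * h, b.y + ny * h };
        quad[2] = { b.x - nx * h, b.y - ny * h };
        quad[3] = { a.x - nx * h, a.y - ny * h };
    }

Rasterized with hard edges like this, the quad's diagonal silhouette is exactly where the aliasing shows up.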

+1  A: 

There is a very good article in GPU Gems 2 about an antialiasing technique for lines; see it here:

http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter22.html
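
The gist, for anyone who can't follow the link: the chapter ("Fast Prefiltered Lines") extrudes the line quad outward by the filter radius and gives each vertex its signed distance from the centerline; the pixel shader then maps the interpolated distance through a precomputed intensity falloff. A rough sketch of the vertex-side setup, extending the quad code from the question (the names are mine, and the article's actual implementation differs in detail):

    #include <cmath>

    struct Vec2 { float x, y; };

    struct LineVertex
    {
        Vec2  pos;   // extruded quad corner
        float dist;  // signed distance from the line's centerline
    };

    // Widen the quad by filterRadius on each side. The pixel shader
    // interpolates `dist` and looks up alpha in a prefiltered 1D texture
    // indexed by |dist|: opaque inside the half-thickness, fading to zero
    // at half-thickness + filterRadius.
    void BuildPrefilteredLineQuad(Vec2 a, Vec2 b, float thickness,
                                  float filterRadius, LineVertex quad[4])
    {
        float dx = b.x - a.x, dy = b.y - a.y;
        float len = std::sqrt(dx * dx + dy * dy);
        float nx = -dy / len, ny = dx / len;          // unit normal
        float h  = thickness * 0.5f + filterRadius;   // extruded half-width
        quad[0] = { { a.x + nx * h, a.y + ny * h },  h };
        quad[1] = { { b.x + nx * h, b.y + ny * h },  h };
        quad[2] = { { b.x - nx * h, b.y - ny * h }, -h };
        quad[3] = { { a.x - nx * h, a.y - ny * h }, -h };
    }

Alpha blending has to be enabled so the faded fringe composites correctly against the background.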

Stringer Bell
Nice article. I'll try to implement this algorithm, even if I need to change how my system works a little. :P
feal87
Yes, I think it's one of the best papers on line AA out there. Good luck :)
Stringer Bell
A: 

GPU multisample AA will be much faster than anything you can do on the CPU or GPU yourself.

You should really try that before optimizing in an almost certainly wrong direction.
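
To give an idea of what that involves: in Direct3D 9 (assumed here, since the question doesn't pin a version) it comes down to validating a sample count and requesting it at device creation. A rough sketch, with error handling omitted and the helper name my own:

    #include <d3d9.h>

    // Ask the adapter whether 4x multisampling works with the chosen
    // back-buffer format; fall back to no AA if it doesn't.
    D3DMULTISAMPLE_TYPE PickMultisample(IDirect3D9* d3d, DWORD* quality)
    {
        if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(
                D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
                TRUE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, quality)))
            return D3DMULTISAMPLE_4_SAMPLES;
        *quality = 1;                 // so that quality - 1 == 0 below
        return D3DMULTISAMPLE_NONE;
    }

    // When filling in D3DPRESENT_PARAMETERS:
    //   pp.SwapEffect         = D3DSWAPEFFECT_DISCARD; // required for MSAA
    //   pp.MultiSampleType    = PickMultisample(d3d, &quality);
    //   pp.MultiSampleQuality = quality - 1;
    // Multisampling is then on by default (D3DRS_MULTISAMPLEANTIALIAS).

Direct3D 10 is analogous: ID3D10Device::CheckMultisampleQualityLevels plus the SampleDesc field of DXGI_SWAP_CHAIN_DESC.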

Axel Gneiting
Many integrated cards, especially on DX10, don't support antialiasing at all; it's not an option...
feal87
All DX10 chips except Intel's IGP crap support multisampling. And it has nothing to do with DX10 - they never supported it for prior DX versions either. For DX10.1 and DX11 they will be forced to implement it, because it is required by the API. Are Intel GMA owners really your target audience?
Axel Gneiting
I'm not doing gaming after all, so yes, they may be my target audience.
feal87