How do I change the thickness of lines when drawing line lists using Direct3D?

This post says that line width is not supported and goes on to provide a workaround. Other options?

While we are on this topic, do shaders allow one to draw lines with dash patterns?

+3  A: 

You can use a geometry shader that takes as input a segment of your line and outputs a quad (a triangle strip made of two triangles), sized so that the width of the quad is constant in screen space and matches the desired line thickness. It works perfectly well (I have implemented it in a CAD 3D engine).
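Roughly, such a geometry shader could look like the following sketch (HLSL for Direct3D 10; the constant buffer layout and all names are illustrative assumptions):

    // Sketch only: expands each line segment into a screen-space quad
    // of constant pixel width. Names and layout are assumptions.
    cbuffer LineParams
    {
        float2 g_ViewportSize;   // render target size in pixels (assumed)
        float  g_LineWidth;      // desired line width in pixels (assumed)
    };

    struct GSInput  { float4 pos : SV_Position; };
    struct GSOutput { float4 pos : SV_Position; };

    [maxvertexcount(4)]
    void LineToQuadGS(line GSInput input[2],
                      inout TriangleStream<GSOutput> stream)
    {
        // Perspective divide: clip space -> normalized device coordinates.
        float2 p0 = input[0].pos.xy / input[0].pos.w;
        float2 p1 = input[1].pos.xy / input[1].pos.w;

        // Segment direction in pixel space and its perpendicular.
        float2 dir  = normalize((p1 - p0) * g_ViewportSize);
        float2 norm = float2(-dir.y, dir.x);

        // Half a line width on each side; 1 pixel == 2/viewport in NDC.
        float2 offset = norm * g_LineWidth / g_ViewportSize;

        GSOutput v;
        v.pos = float4((p0 + offset) * input[0].pos.w, input[0].pos.zw);
        stream.Append(v);
        v.pos = float4((p0 - offset) * input[0].pos.w, input[0].pos.zw);
        stream.Append(v);
        v.pos = float4((p1 + offset) * input[1].pos.w, input[1].pos.zw);
        stream.Append(v);
        v.pos = float4((p1 - offset) * input[1].pos.w, input[1].pos.zw);
        stream.Append(v);
    }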

If a geometry shader is not an option, a workaround could be to use a vertex shader, but it will require some rework of your VB. Keep in mind that the VS must then know the line segment in its entirety, so you will end up storing p and p+1 for each of your VB elements, plus the cost of duplicated indices/vertices (depending on the topology used and whether you render your lines as indexed primitives).
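One possible encoding for this VS variant, sketched under the same assumptions (storing the opposite endpoint and a side sign per vertex is just one way to do it):

    // Sketch only: each segment is stored as four vertices; every vertex
    // carries both endpoints plus a side sign so the VS can expand alone.
    cbuffer LineParams
    {
        float4x4 g_WorldViewProj;
        float2   g_ViewportSize;
        float    g_LineWidth;
    };

    struct VSInput
    {
        float3 pos      : POSITION;   // this endpoint (p or p+1)
        float3 otherPos : TEXCOORD0;  // the opposite endpoint
        float  side     : TEXCOORD1;  // +1 or -1: which edge of the quad
    };

    float4 ExpandLineVS(VSInput input) : SV_Position
    {
        float4 p0 = mul(float4(input.pos, 1.0), g_WorldViewProj);
        float4 p1 = mul(float4(input.otherPos, 1.0), g_WorldViewProj);

        // Screen-space perpendicular, as in the geometry shader version.
        float2 dir  = normalize((p1.xy / p1.w - p0.xy / p0.w) * g_ViewportSize);
        float2 norm = float2(-dir.y, dir.x);

        // Push this vertex sideways by half the line width.
        p0.xy += norm * input.side * g_LineWidth / g_ViewportSize * p0.w;
        return p0;
    }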

If performance is not an issue, doing the expansion on the CPU may be the way to go.

EDIT:

About dash patterns: you can use a geometry shader here too, in order to emulate glLineStipple behavior. If you have a GL_LINES topology, that is to say isolated lines, the pattern restarts at each new line segment. So you just have to compute, in the geometry shader, the screen-space horizontal start (or vertical start, depending on the orientation) of your line segment and pass this extra information to the pixel shader. The pixel shader is then responsible for discarding fragments according to the factor and pattern values (the DirectX 10/11 integer and bitwise instructions make this easy).
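A sketch of such a pixel shader, with glLineStipple-style factor/pattern values (all names are assumptions; the geometry shader is assumed to have written the pixel distance from the segment start into an interpolant):

    // Sketch only: discards fragments according to a 16-bit stipple
    // pattern, as glLineStipple does. Names are assumptions.
    cbuffer StippleParams
    {
        uint g_Pattern;   // 16-bit pattern, LSB first (as in glLineStipple)
        uint g_Factor;    // repeat count for each pattern bit
    };

    struct PSInput
    {
        float4 pos  : SV_Position;
        float  dist : TEXCOORD0;   // pixels from the segment start (from GS)
    };

    float4 StipplePS(PSInput input) : SV_Target
    {
        // Integer/bitwise ops are available from Shader Model 4.0 on.
        uint bit = (uint(input.dist) / g_Factor) & 15u;
        if ((g_Pattern & (1u << bit)) == 0u)
            discard;
        return float4(1.0, 1.0, 1.0, 1.0);   // line color (placeholder)
    }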

Again this works well, and you can combine it with emulated line width (using the first technique mentioned above).

Now if you have a GL_LINE_STRIP topology, the pattern restarts at each new primitive (so at each new draw call). The situation becomes a bit more complicated, since you need to know, for each line segment, how many pixels have been rendered before it.

You can achieve that by rendering your line strip into a temporary VB using the DirectX 10 stream-out functionality (each element of this VB corresponds to the screen-space length of one segment). Then you do a parallel prefix sum (a.k.a. scan) over this VB to accumulate the segment lengths.

Lastly, you render your line strip as in the GL_LINES case, but use this extra scan VB information in the PS.
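A sketch of the measuring pass (the scan pass and the stream-output declaration on the API side are omitted; names are assumptions):

    // Sketch only: first pass streams out one record per strip segment
    // containing its screen-space length; a prefix sum over these records
    // is done in a separate pass.
    cbuffer Params
    {
        float2 g_ViewportSize;
    };

    struct GSInput  { float4 pos : SV_Position; };
    struct SOOutput { float segmentLength : LENGTH; };

    [maxvertexcount(1)]
    void MeasureSegmentGS(line GSInput input[2],
                          inout PointStream<SOOutput> stream)
    {
        // Endpoints in pixels (relative to the viewport center).
        float2 a = input[0].pos.xy / input[0].pos.w * g_ViewportSize * 0.5;
        float2 b = input[1].pos.xy / input[1].pos.w * g_ViewportSize * 0.5;

        SOOutput o;
        o.segmentLength = length(b - a);
        stream.Append(o);
    }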

Stringer Bell
+1  A: 

Line thickness is not only unsupported by Direct3D, it is not supported by any currently existing GPU either. There is no GPU that I am aware of that can even draw proper lines at all (they all fake lines by using degenerate polygons; that is, the second and third vertices are at the same position, so the triangle essentially collapses to a single line).

While it is possible to set a line width in OpenGL, the OpenGL driver creates extruded quads (which are not supported by any current GPU either, and are emulated using two triangles per quad) to draw the "lines" on the screen.

So there's probably no way around creating extruded quads for that purpose. There are several ways to achieve that, as Stringer Bell explained in his answer.

The easiest way that works for me is to create a vertex buffer that contains each vertex twice, with normals pointing "right" or "left", depending on whether they are the right or left edge of the line. Then a very simple vertex shader can perform the extrusion (by changing the vertex position by a multiple of its normal vector). This way you can quickly change the line width, without having to recalculate your geometry on the CPU. This can be useful if you want to adjust the line width depending on the object's size or distance.
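A minimal sketch of that vertex shader (names are assumptions; the half-width is taken in world units here, but converting it to screen space works the same way):

    // Sketch only: each vertex is duplicated with a normal pointing to
    // the left or right side of the line; the VS extrudes along it.
    cbuffer LineParams
    {
        float4x4 g_WorldViewProj;
        float    g_HalfWidth;   // half the line width (assumed world units)
    };

    struct VSInput
    {
        float3 pos    : POSITION;
        float3 normal : NORMAL;   // points away from the line's center
    };

    float4 ExtrudeVS(VSInput input) : SV_Position
    {
        // Changing g_HalfWidth changes the width without touching the VB.
        float3 extruded = input.pos + input.normal * g_HalfWidth;
        return mul(float4(extruded, 1.0), g_WorldViewProj);
    }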

Pepor
This is an interesting solution. Did you manage to get the same results as with a glLineWidth call (on a per-pixel basis)?
Stringer Bell
I didn't check it per-pixel, but for untextured, antialiased lines the results should be sufficiently similar. Using a texture (to emulate OpenGL's stipple patterns, for example) is quite sure to get you into a world of pain, with nonlinear interpolation of texture coordinates (which NVidia Quadros handle surprisingly badly, even compared to the mostly identical GeForce parts). The shader needs to perform a lot of additional work to get the texcoords right.
Pepor
You can try that yourself: just use some sort of "line length in screen pixels" for the texture coordinates and draw the line(s) with a texture that has a stipple pattern on it. You'll be surprised how uneven the pattern is. At least I was. ;-)
Pepor