I believe that for a client or server to be classed as 'rendering' something, it must perform some form of calculation on some input to produce output that is intended for viewing or printing. Beyond that, we then have to look at how much work is being done, relatively speaking, in order to produce that output.
So, I would not class a server that pumps out pixel data over a network connection as having rendered anything unless it has done some work to get those pixels. E.g. a web server piping back a bitmap hasn't rendered anything, but a terminal server that is streaming back a live computer desktop has.
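To make that distinction concrete, here's a rough Python sketch; the function names and the gradient maths are purely illustrative, not taken from any real server:

```python
from pathlib import Path

def serve_static_bitmap(path: str) -> bytes:
    """No rendering: the bytes already exist and the server just pipes them back."""
    return Path(path).read_bytes()

def render_frame(width: int, height: int, t: float) -> bytes:
    """Rendering: every pixel value is calculated from the inputs (a toy
    animated greyscale gradient here), much as a terminal server has to
    compute the desktop image it streams."""
    pixels = bytearray()
    for y in range(height):
        for x in range(width):
            pixels.append(int(x + y + t) % 256)  # derived value, not stored bytes
    return bytes(pixels)
```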
It's best to think of application examples and try to describe them in terms of client/server.
A Desktop App that reads a database and displays its data is rendering client-side.
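As a minimal sketch of that case (the in-memory SQLite table and the formatting are just stand-ins for whatever the real app uses):

```python
import sqlite3

# Stand-in for the app's data store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, price REAL)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [("Widget", 9.99), ("Gadget", 14.50)])

# The database only hands back raw rows...
rows = conn.execute("SELECT name, price FROM products").fetchall()

# ...so all the rendering (turning data into something viewable) happens
# here, on the client.
for name, price in rows:
    print(f"{name:<10} {price:>8.2f}")
```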
A Desktop App that employs some form of parameterised, XML-template-based UI that is stored in a database and retrieved (and formatted) by a stored procedure could be argued to involve both client- and server-side rendering.
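SQLite has no stored procedures, so the sketch below fakes the idea with a query that does the formatting on the database side; the table and markup are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ui_fields (label TEXT, value TEXT)")
conn.executemany("INSERT INTO ui_fields VALUES (?, ?)",
                 [("Name", "Ada"), ("Role", "Engineer")])

# The formatting happens inside the query, standing in for a stored procedure
# that returns ready-made XML: the server has done part of the rendering
# before the client ever sees the data.
xml_rows = conn.execute(
    "SELECT '<field label=\"' || label || '\">' || value || '</field>' "
    "FROM ui_fields"
).fetchall()

# The client's remaining job is just to place the pre-formatted fragments.
print("<ui>\n" + "\n".join(row[0] for row in xml_rows) + "\n</ui>")
```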
A Direct3D game that downloads vertex data from a server is rendering client-side.
However, if the vertex data is dynamic, and the server has to do some calculations to serve it up, then you could also argue that client + server rendering is taking place.
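A toy Python version of that split might look like this; the geometry and function names are made up, and client_draw just stands in for the actual Direct3D draw calls:

```python
import math

def server_build_vertices(radius: float, segments: int):
    """Server side: the vertex data doesn't exist until it is calculated
    from the request parameters, so the server does part of the work that
    ends up on screen."""
    return [(radius * math.cos(2 * math.pi * i / segments),
             radius * math.sin(2 * math.pi * i / segments),
             0.0)
            for i in range(segments)]

def client_draw(vertices):
    """Client side: stand-in for the Direct3D calls that actually turn the
    vertices into pixels."""
    for x, y, z in vertices:
        print(f"vertex({x:+.2f}, {y:+.2f}, {z:+.2f})")

client_draw(server_build_vertices(radius=1.0, segments=6))
```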
Web pages are nearly always a mix of client + server, especially those based on ASP.NET, JSP or PHP: since the page output is dynamic, the HTML has to be 'rendered' by the server before then being rendered by the client.
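Here's a bare-bones Python stand-in for that first, server-side pass (the page content is obviously invented); the browser on the other end still has to do its own rendering of the markup into pixels:

```python
from datetime import datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

class PageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Server-side 'rendering': the HTML is generated per request, the way
        # an ASP.NET/JSP/PHP page is, rather than being read off disk.
        body = (f"<html><body><h1>Hello</h1>"
                f"<p>Generated at {datetime.now():%H:%M:%S}</p>"
                f"</body></html>").encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        # The client (browser) then renders this markup into pixels.
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PageHandler).serve_forever()
```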
It's difficult to provide concrete answers for any single technology.
Windows Forms, Direct3D, OpenGL, iPhone et al, Flash, pure WPF, Silverlight and all the rest are all capable of being pure client-side UI, or a mix of the two, depending on the application.
I suppose you could argue, however, that there is no such thing as a pure server-side UI (I can't wait for a comment from someone who can point to one!).