What is their purpose?
They take load off developer machines and provide a stable, reproducible environment for builds.
Why aren't the developers building the project on their local machines, or are they?
Because with complex software, an amazing number of things can go wrong when just "compiling through". Problems I have actually encountered:
- Various flavors of incomplete dependency checks, resulting in binaries that are not updated
- Publish commands failing, the error message in the log being ignored, and old DLLs getting delivered
- Builds including local sources not yet committed to source control (fortunately, no "damn customers" message boxes yet...)
- When trying to avoid the above problem by building from another folder, some files being picked up from the wrong folder
- The target folder where binaries are aggregated containing additional stale developer files that should not be released
We've seen an amazing increase in stability since all public releases start with a fresh get from source control into an empty folder. Before that, there were lots of "funny problems" that "went away when Joe gave me a new DLL".
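To make that concrete, here is a minimal sketch of such a release build script. Everything specific in it is hypothetical (the workspace path, the repository URL, the solution name, and the choice of git and msbuild as tools); the point is the shape: wipe, fetch fresh, build, and fail loudly.

```python
import shutil
import subprocess
from pathlib import Path

# Hypothetical values -- adjust for your own project and tooling.
WORKSPACE = Path("C:/build/release")            # wiped before every release build
REPO_URL = "https://example.com/scm/product"    # placeholder source control URL

def clean_release_build() -> None:
    # Start from an empty folder so stale files and uncommitted local
    # changes cannot leak into the released binaries.
    if WORKSPACE.exists():
        shutil.rmtree(WORKSPACE)
    WORKSPACE.mkdir(parents=True)

    # Fetch exactly what is in source control, and nothing else.
    subprocess.run(["git", "clone", REPO_URL, str(WORKSPACE)], check=True)

    # check=True aborts on any failed step instead of silently
    # delivering old DLLs (the "error message ignored" problem above).
    subprocess.run(
        ["msbuild", "Product.sln", "/t:Rebuild", "/p:Configuration=Release"],
        cwd=WORKSPACE,
        check=True,
    )

if __name__ == "__main__":
    clean_release_build()
```

The `check=True` calls are the important part: any failing step stops the release instead of quietly shipping yesterday's binaries.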
Are some projects so large that more powerful machines are needed to build them in a reasonable amount of time?
What's "reasonable"? If I run a batch build on my local machine, there are many things I can't do in the meantime. Rather than pay developers to wait for builds to complete, pay IT to buy a real build machine already.
Is it that I have just not worked on projects large enough?
Size is certainly one factor, but not the only one.