I need a tool that analyzes C++ sources and reports which code isn't used. The size of the sources is ~500 MB.
PC-Lint is good. If it needs to be free/open source, your choices dwindle. Cppcheck is free and will check for unused private functions. I don't think it looks for things like uninstantiated classes the way PC-Lint does.
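For illustration, here is the kind of thing Cppcheck's style checks will flag (a sketch; the exact flags and message wording may vary between versions):

    // widget.cpp
    // Run:  cppcheck --enable=style,unusedFunction widget.cpp
    // Cppcheck reports something like: "Unused private function: 'Widget::helper'"
    class Widget {
    public:
        void doWork() {}     // called below, so not reported
    private:
        void helper() {}     // never called -> flagged as an unused private function
    };

    int main() {
        Widget w;
        w.doWork();
        return 0;
    }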
A code coverage tool is what you need, but you will have to run your program through all of its functionality and see what is reported as unused. Since some of the code could be DLL-exported functions, you will have to make sure nothing uses them externally. Some code coverage tools: Purify and CTC++, and Boundschecker may have code coverage functionality if I remember right, plus a bunch of other tools.
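As a rough sketch of that workflow using the GNU toolchain (gcov is just one free option; the commercial tools above give similar per-line execution counts):

    // main.cpp -- build with coverage instrumentation, run, then inspect:
    //   g++ --coverage -O0 main.cpp -o app
    //   ./app
    //   gcov main.cpp      // writes main.cpp.gcov with per-line hit counts
    //
    // Lines whose count stays at 0 across all of your runs are only
    // *candidates* for dead code, not proof (see the warning below about
    // exported functions).
    #include <iostream>

    void used()   { std::cout << "called\n"; }
    void unused() { std::cout << "never reached\n"; }  // gcov shows count 0 here

    int main() {
        used();       // exercised -> non-zero count
        return 0;
    }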
Be very careful about removing any function that may have been exported if you don't know what external program may be linking to or using it.
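For example, a function like the following can look dead to any analysis that only sees your own sources (the export macro here is illustrative; your project's spelling may differ):

    // plugin_api.cpp -- nothing inside this codebase calls PluginInit, so both
    // coverage and per-module static analysis will report it as unused.  But if
    // it is part of the DLL's exported interface, an external host application
    // may load the library and call it at runtime.
    #ifdef _WIN32
      #define PLUGIN_EXPORT extern "C" __declspec(dllexport)
    #else
      #define PLUGIN_EXPORT extern "C"
    #endif

    PLUGIN_EXPORT int PluginInit(int version) {
        return version >= 2 ? 0 : -1;   // called only by code you never see
    }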
Once again, I'll throw AQTime into the discussion. It has static code analysis for most, if not all, of the supported languages. I didn't really go into that part though; I mainly used the dynamic profilers (memory, performance, and so on).
You could use a code coverage tool (dynamic analysis) to get an idea of what code isn't being executed, and then hand analyze to see if that code is really useless.
If you want a static analysis, you need a tool that can read the entire 500 MB of source code (est. 20 million lines? Wow!) and compute a conservative estimate of what is used. This requires doing a points-to analysis over the entire system.
Here's why: if you leave out any module Z and decide that FOO is unused, you might find out later that Z happened to be the one that used FOO, or, more subtly, that Z copied a pointer value containing &FOO to a third module M, which in turn called the "unused" function through the pointer.
What this means is that no static analysis tool that reads just single modules (compilation units) can answer this question safely. And at your scale, you can't afford to make dumb mistakes.
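To make that FOO example concrete (the module names and signatures are hypothetical, just to show the shape of the problem):

    // foo.cpp -- FOO has no direct callers anywhere in the codebase.
    void FOO() { /* ... */ }

    // z.cpp -- module Z only takes the address of FOO and hands it on.
    void FOO();
    void (*grab_callback())() { return &FOO; }

    // m.cpp -- module M calls through the pointer; it never names FOO.
    void (*grab_callback())();
    void run() { grab_callback()(); }   // FOO actually executes here

A tool that looks at foo.cpp or m.cpp in isolation sees no call to FOO; only a whole-program points-to analysis connects the address taken in z.cpp to the indirect call in m.cpp.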
My company, Semantic Designs, has done points-to analysis for 35-million-line systems of C code using our DMS Software Reengineering Toolkit. DMS can read very large systems of source code. It required a custom tool, not so much because the source code was in an odd (archaic) dialect of C++ (systems in extremely modern dialects can't be this big; there hasn't been enough time to code them!), but rather because in very large systems there are other peculiar factors at play. For the C system we did, there was a custom dynamic linker, and that affected the points-to analysis, which in turn had to be customized.
Because systems of the scale you are discussing always have surprises like this (BIBSEH: "Because In Big Systems, Everything Happens"), you will likely need a custom tool to answer the question. DMS is designed to be customized. See http://www.semanticdesigns.com/Products/DMS/DMSToolkit.html and http://www.semanticdesigns.com/Products/FrontEnds/CppFrontEnd.html