"I need to be able to roughly estimate how much additional time to allocate."
That's silly. There's no additional time.
Here's what you do. Take your existing testing budget. The 40% of development effort at the end of the project.
Spread most of this testing effort through the life of the project as unit testing. Call it 30% of the total effort, allocated everywhere.
Leave some of the testing effort at the end for "integration" and "performance" testing. Call it 10% of the total effort, allocated just at the end, covering integration testing, performance testing, and User Acceptance Testing -- whatever is left over that you didn't unit test.
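To make the arithmetic concrete, here's a back-of-the-envelope sketch in Python. The 1,000-hour total is an invented figure; only the 40/30/10 split comes from above.

    # Rough sketch of the budget split described above.
    # The 1,000-hour total is a made-up number, purely for illustration.
    total_effort_hours = 1_000

    testing_budget = 0.40 * total_effort_hours   # the testing you had to do anyway
    unit_testing   = 0.30 * total_effort_hours   # spread through the whole project as TDD
    end_of_project = 0.10 * total_effort_hours   # integration, performance, UAT at the end

    print(f"Testing budget:          {testing_budget:.0f} hours")
    print(f"Unit testing (spread):   {unit_testing:.0f} hours")
    print(f"End-of-project testing:  {end_of_project:.0f} hours")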
There's no "additional". You have to do the testing anyway. You can either do it as first-class software, during the project, or you can scramble around at the end of the project doing a bad job.
The "cost" of TDD is -- at best -- a subjective impression. Read the excellent study summary carefully. "Subjectively, the teams experienced a 15–35% increase in initial development time after adopting TDD". [Emphasis added.]
Actually, there is zero cost to TDD. TDD simply saves time.
It's much less expensive to do TDD than it is to create software other ways.
Why?
You have to test anyway. Since you have to test, you may as well plan for testing by driving all of your development around the testing.
When you have a broken test case, you have a focused set of tasks. When you are interrupted by phone calls, meetings, product demos, or operational calls to support previous releases, it's easy to get distracted. When you have tests that fail, you get your focus back immediately.
You have to test anyway. It's either Code then Test or it's Test then Code. You can't escape the cost of testing. Since you can't escape, do it first.
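Here's a minimal sketch of "Test then Code" using Python's unittest. The parse_rate function and its expected behavior are invented for illustration.

    import unittest

    # Written first, before parse_rate existed.  At that point the test failed,
    # and the failure was the to-do list: make this pass.
    class TestParseRate(unittest.TestCase):
        def test_percent_string_becomes_fraction(self):
            self.assertEqual(parse_rate("40%"), 0.40)

    # Written second: just enough code to make the test pass.
    def parse_rate(text):
        return float(text.rstrip("%")) / 100

    if __name__ == "__main__":
        unittest.main()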
The idea that TDD has "additional" or "incremental" cost is crazy. Even a well-documented study like http://www.springerlink.com/content/q91566748q234325/ can't -- actually -- compare the same project done two ways. The idea of "additional development time" cannot actually be measured. That's why it's a "subjective impression". And the teams reporting it are simply wrong about it.
When you ask programmers -- who are new to TDD -- if TDD slowed them down, they lie about it.
Yes. They Lie.
One. You made them change. Change is bad. Everyone knows that. They can't say it was easier, because it was "different".
Two. It calls management into question. We can't say bad things about our managers; that's bad. Everyone knows that. They can't say it was easier, because previous things our managers demanded are now obviously wrong.
Three. Most programmers (and managers) think that there's a distinction between "real" code and "test" code. TDD takes longer to get to the "real" code because you spend your up-front time doing "test" code.
This "real" code vs. "test" code is a false distinction. Anyone who says this doesn't get how important testing is. Since testing is central to demonstrating that an application works, test code should be first-class. Making this distinction is wrong. Test code is real code.
The time spent writing test code comes -- effectively -- out of the design time for the real code. Rather than create "paper" designs, you are creating a working, living design in the form of test cases.
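For illustration, here's what that working, living design can look like. The ShippingCalculator class and its rules are invented; the point is that the test cases pin down the interface -- constructor arguments, method names, error behavior -- the way a paper design would, except that this design runs.

    import unittest

    class ShippingCalculator:
        """Minimal implementation, written after the tests below pinned down the design."""
        def __init__(self, base_rate):
            self.base_rate = base_rate

        def cost(self, weight_kg):
            if weight_kg <= 0:
                raise ValueError("weight must be positive")
            return self.base_rate * weight_kg

    class TestShippingCalculatorDesign(unittest.TestCase):
        """These tests are the design record: constructor arguments, method names, error behavior."""
        def test_cost_scales_with_weight(self):
            calc = ShippingCalculator(base_rate=2.5)
            self.assertEqual(calc.cost(4), 10.0)

        def test_rejects_nonpositive_weight(self):
            calc = ShippingCalculator(base_rate=2.5)
            with self.assertRaises(ValueError):
                calc.cost(0)

    if __name__ == "__main__":
        unittest.main()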
TDD saves time.
Folks who say otherwise are resisting change and/or trying to protect management from appearing to be wrong and/or making a false distinction between real code and test code. All things that are simply wrong.