How would you go about determining how much time was added by working in legacy code rather than tested code, for cost analysis, if we really don't have a benchmark of working in non-legacy code to compare it to?
It's really hard to say how much time you could save if the code were maintainable. What you end up comparing is "how much would it cost to develop the same code again?" versus "how much does it cost to keep fixing the remaining issues?" If you argue this way, you usually lose: the cost of developing something from scratch is almost always much higher than the maintenance cost.
Why? Because maintenance cost can be spread over a long time, while a rewrite has to be paid for in advance. So even if maintenance ends up costing five times as much, it won't feel that way. On top of that, the new code would need time to mature before it is as stable as what you have now. And you're rarely in a position to make that argument anyway: your boss has already decided that everything stays the way it is, so you'd first have to convince him of the big change.
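To see why the rewrite argument loses even when the totals are similar, here is a back-of-envelope sketch; the figures are made up purely for illustration, not taken from any real project:

```python
# Illustrative numbers only -- every figure here is an assumption.
rewrite_upfront = 250_000        # hypothetical one-time cost of a from-scratch rewrite
maintenance_per_month = 5_000    # hypothetical ongoing cost of patching the legacy code
months = 60                      # five-year horizon

total_maintenance = maintenance_per_month * months
print(f"Rewrite (paid now):       {rewrite_upfront}")
print(f"Maintenance over 5 years: {total_maintenance}")
# Even when the totals come out comparable, the rewrite demands one large
# budget item up front, while maintenance is absorbed month by month --
# which is why the from-scratch proposal rarely gets approved.
```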
To make old crap maintainable, I usually start adding tests as I fix bugs. That lets me make the code more and more maintainable while also spreading the cost over time. It makes each fix a bit more expensive, but you can always argue that you're simply trying to do a proper job.
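A minimal sketch of that workflow, assuming Python and the standard unittest module; the function name and the bug are invented for illustration. The only point is the order of operations: write the test that reproduces the bug report first, make it pass, and keep it in the suite as a regression guard.

```python
import unittest

def legacy_parse_quantity(raw):
    """Hypothetical legacy helper. The bug report said an empty form field
    blew up with ValueError; the fix (the 'if not raw.strip()' guard) is
    shown here already applied."""
    if not raw.strip():
        return 0
    return int(raw)

class ParseQuantityRegressionTest(unittest.TestCase):
    def test_empty_field_means_zero(self):
        # Written first, straight from the bug report: it failed against the
        # unfixed code and now documents the intended behaviour.
        self.assertEqual(legacy_parse_quantity("   "), 0)

    def test_normal_value_still_works(self):
        # A cheap extra check so the fix doesn't break the happy path.
        self.assertEqual(legacy_parse_quantity("42"), 42)

if __name__ == "__main__":
    unittest.main()
```

Each bug fix leaves one or two tests behind, so coverage grows exactly in the parts of the code that actually change and break.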