Every modern source control system can slice and dice the history of a program, and there are many tools to statically and dynamically analyze code. What sort of mathematical formula would allow me to integrate the amount of activity in a file with the number of deployments of that software? We are finding that even if a program passes all of its unit tests, it requires more work than we would expect at upgrade time. A measure of this type should be possible, but even working out its units has me stumped.
Update: If something gets sent to a test machine, I could see marking it less rotten. If something gets sent to all test boxes, I could see it getting a fresh marker. If something goes to production, I could give it a nod and reduce its bitrot score. If there is a lot of activity within its files and it never gets sent anywhere, I would ding the crap out of it. Don't focus on the code; assume that any data I need is at hand.
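To make the units concrete, those rules could be sketched as a running score per file. Everything here is hypothetical — the event names, the weights, and the clamp are placeholders, not a validated model:

```python
# Hypothetical bitrot score for one file: rises with undeployed churn,
# falls with each deployment milestone reached. All weights are made up.

DEPLOY_CREDIT = {
    "one_test_box": 1.0,    # sent to a single test machine: slightly less rotten
    "all_test_boxes": 3.0,  # sent to every test box: fresh marker
    "production": 5.0,      # deployed to production: big reduction
}

CHURN_PENALTY = 2.0  # per commit touching the file since its last deployment


def bitrot_score(commits_since_deploy: int, deploy_events: list[str]) -> float:
    """Higher = more rotten. Activity that never ships 'dings' the file."""
    score = CHURN_PENALTY * commits_since_deploy
    for event in deploy_events:
        score -= DEPLOY_CREDIT.get(event, 0.0)
    # Clamp at zero: a freshly deployed file is not "anti-rotten".
    return max(score, 0.0)


# A file with 4 commits that only ever reached a single test box:
print(bitrot_score(4, ["one_test_box"]))  # 7.0
```

The clamp is a design choice: without it, repeated production deployments of an untouched file would drive the score negative, which has no obvious meaning in this scheme.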
What kind of commit analysis is fair data to apply: commit comments (mentioned below) or the time between commits?
Update: I think the dimensional analysis could probably just be based on age. Anything relative to that is a little more difficult. Old code is rotten. The average age of each line of code is still simply a measure of time. Does a larger source module rot faster than a smaller, more complex one?
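The average-line-age idea is easy to compute if per-line last-modified timestamps are at hand (e.g. from blame data). A minimal sketch, assuming such timestamps are available — the function name and inputs are illustrative:

```python
# Average age of a file's lines, in days. The dimensional analysis holds:
# each age is a duration, so the mean is also simply a measure of time.
from datetime import datetime, timezone

SECONDS_PER_DAY = 86400


def average_line_age_days(line_timestamps: list[datetime],
                          now: datetime) -> float:
    """Mean age in days across every line in a file (from blame data)."""
    ages = [(now - ts).total_seconds() / SECONDS_PER_DAY
            for ts in line_timestamps]
    return sum(ages) / len(ages)


# Two lines, last touched 10 and 5 days ago:
now = datetime(2024, 1, 11, tzinfo=timezone.utc)
stamps = [datetime(2024, 1, 1, tzinfo=timezone.utc),
          datetime(2024, 1, 6, tzinfo=timezone.utc)]
print(average_line_age_days(stamps, now))  # 7.5
```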
Update: Code coverage is measured in lines. Code that is executed often must, by definition, be less rotten than code that is never executed. To measure bitrot accurately, you would need coverage analysis to act as a damper.
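One way to apply that damper is to scale a raw rot score by the fraction of lines coverage never touches, so a fully covered file damps to zero. This is a sketch under that assumption, not a standard formula:

```python
# Coverage as a damper: lines executed by tests are "less rotten", so the
# raw score is scaled by the uncovered fraction of the file.

def damped_bitrot(raw_score: float, covered_lines: int,
                  total_lines: int) -> float:
    """Scale raw_score by the fraction of lines never executed by tests."""
    if total_lines == 0:
        return 0.0  # an empty file cannot rot
    uncovered_fraction = 1.0 - covered_lines / total_lines
    return raw_score * uncovered_fraction


# 80% coverage keeps only 20% of the raw score; zero coverage keeps it all.
print(damped_bitrot(10.0, 80, 100))
print(damped_bitrot(10.0, 0, 100))  # 10.0
```

A multiplicative damper is the simplest choice; it also has the arguably desirable property that a fully covered, regularly exercised file scores zero no matter how much churn it sees.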