I am trying to find a reference for approximately how many CPU cycles various operations require.
I don't need exact numbers (these will vary between CPUs), but I'd like something reasonably credible that gives ballpark figures I could cite in discussions with friends.
As an example, we all know that floating-point division takes more CPU cycles than, say, a bit shift.
I'd guess the division is around 100 cycles whereas a shift is about 1, but I'm looking for something to cite to back that up.
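To frame what I mean, here is a very rough micro-benchmark sketch of the kind of thing I could run myself, assuming an x86-64 machine with GCC or Clang and the __rdtsc intrinsic; it only gives crude ticks-per-iteration numbers, not the authoritative latencies I'm hoping to cite:

```c
/* Very rough sketch, assuming an x86-64 box with GCC or Clang
 * (__rdtsc comes from x86intrin.h). It times loops of dependent
 * operations in TSC ticks, so the numbers are ballpark at best. */
#include <stdio.h>
#include <stdint.h>
#include <x86intrin.h>

#define N 100000000ULL

int main(void) {
    /* Dependent floating-point divisions: each iteration waits on the last. */
    double x = 1e9;
    uint64_t t0 = __rdtsc();
    for (uint64_t i = 0; i < N; i++)
        x = x / 1.0000001;
    uint64_t t1 = __rdtsc();

    /* Dependent integer shifts (a rotate built from two shifts). */
    uint64_t y = 0xdeadbeefULL;
    uint64_t t2 = __rdtsc();
    for (uint64_t i = 0; i < N; i++) {
        y = (y >> 1) | (y << 63);
        __asm__ volatile("" : "+r"(y)); /* stop the compiler collapsing the loop */
    }
    uint64_t t3 = __rdtsc();

    printf("fp divide: ~%.1f ticks/iter (result %g)\n", (double)(t1 - t0) / N, x);
    printf("shift:     ~%.1f ticks/iter (result %llu)\n",
           (double)(t3 - t2) / N, (unsigned long long)y);
    return 0;
}
```

Compiled with something like `gcc -O2 bench.c`, I'd expect the divide loop to cost noticeably more ticks per iteration than the shift loop, but TSC ticks aren't exactly core cycles on modern CPUs, which is why I want a proper reference rather than my own numbers.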
Can anyone recommend such a resource?