I'm interested in the time cost, on a modern desktop CPU, of some floating point operations, in order to optimize a mathematical evaluation. In particular, I'm interested in the comparison between complex operations like `exp` and `log` and simple operations like `+`, `*`, and `/`.
I tried to search for this information, but I couldn't find a source.
What is the cost of floating point operations?