The C standard does not dictate what machine code the compiler generates for a given piece of C. You can sometimes make assumptions if you understand the underlying architecture, but even that is unwise.
The days when CPUs were simple beasts are long past, now that they have pipelining, multiple levels of cache and all sorts of other wondrous things to push their speed to the limit.
You should not be worrying about this level of optimization until you have a specific problem (some would say "at all").
Write your code to be readable. That should be rules number 1, 2 and 3. Which do you think is the greater problem in software development: code running at 99.5% of its maximum speed, or developers spending days trying to figure out and/or fix what a colleague (or even they themselves) did six months ago?
My advice is to worry about performance only when you find it's a problem, then benchmark on the target platforms to see where the greatest improvement can be gained. A 1% improvement in an if
statement is likely to be dwarfed by choosing a better algorithm elsewhere in your code (other things, such as the number of times the code is called, being equal of course). Optimization should always be targeted to get the best bang for the buck.