It's been a while since I've done any C/C++, so I'm not quite sure what's happening with *((int*)&a). However, to answer your last question, float-to-integer conversion is as simple as ~~3.145 (which yields 3), since bitwise operators in JavaScript convert their operands to 32-bit integers.
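For example (note that the double-NOT truncates toward zero rather than rounding down, and only works within 32-bit range):

~~3.145;        // 3
~~-3.145;       // -3 (Math.floor(-3.145) would give -4)
~~5000000000;   // 705032704 (values outside 32 bits wrap around)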
You certainly don't get anything low-level like that in JavaScript. It would be extremely dangerous to allow recasting and pointer-frobbing in a language that has to be safe for untrusted, potentially hostile web sites to use.
If you want to get the 32-bit IEEE 754 representation of a single-precision value held in a Number (which, remember, is not an int either; the only number type you get in JavaScript is double), you will have to build it yourself by fiddling the sign, exponent and mantissa bits together. There's example code here.
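In case that link rots, here is a minimal sketch of the idea (toFloat32Bits is a hypothetical name of mine, and it assumes a finite, nonzero, normal input; zero, NaN, Infinity, denormals and IEEE round-half-to-even rounding are not handled):

function toFloat32Bits(value) {
    var sign = value < 0 ? 1 : 0;
    var abs = Math.abs(value);
    // normalise abs into [1, 2) by exact halving/doubling,
    // counting the unbiased exponent as we go
    var exponent = 0;
    while (abs >= 2) { abs /= 2; exponent++; }
    while (abs < 1) { abs *= 2; exponent--; }
    // 23 mantissa bits, with the implicit leading 1 dropped
    var mantissa = Math.round((abs - 1) * 0x800000);
    if (mantissa === 0x800000) { mantissa = 0; exponent++; } // rounding spilled over
    // bias the exponent and pack the three fields into one uint32
    return ((sign << 31) | ((exponent + 127) << 23) | mantissa) >>> 0;
}

toFloat32Bits(1.5).toString(16); // "3fc00000"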
There are some functions that implement the conversion here. The solution is quite slow, however, and you may wish to consider a redesign.
As the other posters have said, JavaScript is loosely typed, so there is no distinction between float and int data types; every number is a double.
However, what you're looking for is:
float to int:
Math.floor( 3.9 ); // result: 3 (rounds down)
or
Math.round( 3.9 ); // result: 4 (rounds to the nearest whole number)
depending on which you'd like. Note that in C/C++ a float-to-int cast truncates toward zero, which matches Math.floor only for non-negative values; see the sketch below for how they differ on negatives.
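A quick sketch of that difference:

Math.floor( -3.9 ); // result: -4 (rounds down)
Math.round( -3.9 ); // result: -4 (nearest whole number)
~~-3.9;             // result: -3 (truncates toward zero, like a C int cast)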
int to float:
var a = 10;
a.toFixed( 3 ); // result: "10.000"
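One caveat: toFixed returns a string, not a number, so convert back with parseFloat if you need to do arithmetic with the result. A quick sketch:

var a = 10;
var s = a.toFixed( 3 ); // "10.000"
typeof s;               // "string"
parseFloat( s );        // 10, back to a number if needed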