int num = 5;
int denom = 7;
double d = num / denom;
This results in d being 0.0, because num / denom is evaluated as integer division (which truncates to 0) before the assignment ever happens. I know you can force it to work by doing
double d = ((double) num) / denom;
but there has to be another way, right? I don't like casting primitives; who knows what might happen.
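For what it's worth, a common cast-free alternative is to multiply by the double literal 1.0 (or write one operand as a double to begin with), which promotes the whole expression to double before the division runs. A small sketch of both behaviors (the class name Demo is just a placeholder):

```java
public class Demo {
    public static void main(String[] args) {
        int num = 5;
        int denom = 7;

        // Both operands are int, so this is integer division:
        // 5 / 7 truncates to 0, then 0 widens to 0.0.
        double truncated = num / denom;

        // Multiplying by the double literal 1.0 promotes num to double
        // first, so the division itself is floating-point.
        double promoted = 1.0 * num / denom;

        System.out.println(truncated); // prints 0.0
        System.out.println(promoted);  // prints 0.7142857142857143
    }
}
```

This relies on Java's binary numeric promotion: when one operand of `/` is a double, the other is converted to double automatically, so the literal does the same work as the cast without an explicit `(double)` in the source.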