In the code below, compiled on a system where int is 16 bits wide, the value of prod is not 9,000,000; it holds a garbage value instead. Why do num1 and num2 need to be of type long?
#include <stdio.h>
int main()
{
    int num1 = 3000, num2 = 3000;
    long int prod = num1 * num2;
    printf("%ld\n", prod);
    return 0;
}
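
A minimal sketch of one common fix, assuming a platform where int is 16 bits (which is where the overflow appears): cast one operand to long so that the multiplication itself is carried out in long arithmetic. Without the cast, num1 * num2 is evaluated as int first, overflows, and only the already-wrong result is converted to long for the assignment.

#include <stdio.h>
int main()
{
    int num1 = 3000, num2 = 3000;
    /* Casting one operand to long promotes the other operand too,
       so the multiplication is done in long arithmetic and cannot
       overflow a 16-bit int before the assignment. */
    long int prod = (long) num1 * num2;
    printf("%ld\n", prod);   /* prints 9000000 */
    return 0;
}

Declaring num1 and num2 as long from the start achieves the same thing, since the operands of the multiplication are then already long.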