views: 398
answers: 5

If I write "long i = 1;" instead of "long i = 1l;", will the 1 be recognized as int and then implicitly converted to long?

Edit: Thank you all. I see there's no type conversion. Is this also the case with the suffix u (like 10u)? Then what's the use of those l and u?

+1  A: 

Most modern compilers should be smart enough to see that you're assigning the literal to a long, and will make the literal of that type instead of forcing a pre-assignment conversion.

Adam Maras
A: 

Today's compilers will recognize it and generate the same result.

Codism
+1  A: 

Pretty sure that if written exactly as stated, it will be equivalent to i = 1l; any conversion will be done at compile time.

However, if you write

long i = (unsigned int)-1;

then i will probably not be what you expected.
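
A minimal sketch (added here, not part of the original answer) of what happens, assuming an LP64 platform such as 64-bit Linux or OS X where int is 32 bits and long is 64 bits; on a platform where long is also 32 bits the result is typically -1 instead:

#include <stdio.h>

int main(void)
{
    /* (unsigned int)-1 is UINT_MAX (0xffffffff); converting it to the wider
       signed long preserves the value, so i becomes 4294967295 rather than -1. */
    long i = (unsigned int)-1;

    printf("i = %ld (0x%lx)\n", i, (unsigned long)i);

    return 0;
}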

Frank
When I compile that, `i` is -1. How is this unexpected?
Chris Lutz
On a 64-bit system with an LP64 model and two's-complement arithmetic (OS X or Linux, for example), this will assign the value 0x00000000ffffffffL to i.
Stephen Canon
That's not unexpected, that's just what happens when you convert between signed and unsigned types and make assumptions about type sizes for "normal" platforms.
Chris Lutz
I did say *probably*. If you do not know enough to already know the answer, I would suggest that 0x00000000ffffffff probably was unexpected.
Frank
As -1 can be represented by long, the conversion will not take place. Check the ANSI C standard for the details of type conversion rules.
steve
A: 

The compiler will see what you are trying to assign and set the value to 1 immediately. There is no type conversion that happens with a literal. Even if you said long x = 1.0, you wouldn't see a runtime type conversion.

By the way, on Windows, long and int are the same so there wouldn't be a type conversion anyway.
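
For anyone who wants to check their own platform, here is a quick sketch (mine, not part of the original answer): on 32-bit Windows and on Win64 it reports 4 bytes for both int and long, while on LP64 Linux or OS X long comes out as 8 bytes.

#include <stdio.h>

int main(void)
{
    /* Report the widths of int and long on the current platform. */
    printf("sizeof(int) = %lu, sizeof(long) = %lu\n",
           (unsigned long)sizeof(int), (unsigned long)sizeof(long));

    return 0;
}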

[Edit: made last comment specific to Windows; removed reference to preprocessor]

Steve Rowe
64-bit OS X and Linux aren't modern platforms, then? =)
Stephen Canon
Can you tell I'm a Windows programmer? I didn't realize Linux and OSX changed the definition of long when they went to 64-bits.
Steve Rowe
Yep; I think that there are a few unixes that even adopted the ILP64 model, so you can't even assume that `int` didn't change with the transition to 64 bits.
Stephen Canon
The preprocessor is not involved in type conversion.
Laurence Gonsalves
I've fixed what were called out as the errors but people keep voting this down. If you are going to vote it down, please explain why so I can make the answer better.
Steve Rowe
Probably because your answer sounds misleading. There is a type conversion when you say `long x = 1.0`, but your answer makes it sound otherwise, for example.
GMan
Thanks GMan. Is the word "runtime" really that confusing?
Steve Rowe
I'm not sure. Possibly, because the OP didn't ask about runtime specifically, just conversions in general. There is a compile-time conversion though, which is what the OP was probably asking about.
GMan
+3  A: 

The type of the constant 1 is int, so technically a type conversion will be done but it'll be done at compile time and nothing is lost.

However, consider the more interesting example of:

#include <stdio.h>

int main(void)
{
    long long i = -2147483648;      /* unary minus applied to the constant 2147483648 */
    long long j = -2147483647 - 1;  /* -2147483648 written so that every operand fits in an int */

    printf(" i is %lld, j is %lld\n", i, j);

    return(0);
}

I get the following results from various compilers:

  • MSVC 9 (Version 15.00.21022.08):

                i is 2147483648, j is -2147483648
    
  • GCC (3.4.5):

                i is -2147483648, j is 0
    
  • Comeau (4.3.10.1):

                i is 2147483648, j is -2147483648
    
  • Digital Mars:

                i is -2147483648, j is -2147483648
    

I'm not sure yet how to account for the differences. It could be one or more of:

  • compiler bugs
  • C90 vs. C99 rules in operand promotion ("long long" support is C99, but some of these compilers might be compiling for C90 with "long long" as an extension)
  • implementation defined behavior
  • undefined behavior

FWIW, the behavior of MSVC and Comeau is what I expected - which is something that many might still find surprising. The logic (in my mind) for the first operation is:

  • -2147483648 gets tokenized as '-' and 2147483648
  • 2147483648 is an unsigned int (since it can't fit into an int - I believe this is different in C99)
  • applying the unary '-' operator results in 2147483648 again due to unsigned arithmetic rules
  • converting that into a long long doesn't change the sign.

The logic for the second operation is:

  • -2147483647 gets tokenized as '-' and 2147483647
  • 2147483647 is a signed int
  • subtracting 1 results in -2147483648 since there's no problem representing that number
  • converting that into a long long doesn't change the sign.
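
As a side note on the question's edit about the l and u suffixes, here is a small sketch (added here, not part of the original answer): giving the literal an explicit LL suffix sidesteps the whole issue, because the literal is long long before the unary minus is applied, so no unsigned wraparound can occur.

#include <stdio.h>

int main(void)
{
    /* With the LL suffix the literal 2147483648 is long long from the start,
       so negating it stays within a signed 64-bit type. */
    long long k = -2147483648LL;

    printf("k is %lld\n", k);   /* prints: k is -2147483648 */

    return 0;
}
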
Michael Burr