Looking at Scala's Predef object, which is automatically imported, I found the following gem:

implicit def char2int(x: Char): Int

This has let some sneaky bugs slip into my code (I used _1 instead of _2 with a Map[Char, Int]). I really don't get it: why would I want to implicitly convert Char to Int? The whole point of having a Char type (which is a mere number underneath) is so that I won't accidentally use it as a number (or vice versa).

I use Scala's type system precisely to avoid errors like that!
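
For illustration, a hypothetical reconstruction of the kind of bug described (the Map contents and the spend function are made up):

val counts: Map[Char, Int] = Map('a' -> 1, 'b' -> 2)

def spend(budget: Int) { println(budget) }

for (pair <- counts)
  spend(pair._1) // meant pair._2, but this compiles anyway:
                 // the Char key is silently widened to its code point (97, 98)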

The only (bad) excuse I could think of is compatibility with Java's horrible behaviour.

Update: the main reason given by the two answers so far is that the implicit conversion exists to support arithmetic and ordering operations on Char, so that, for instance, 'c' + 1 would produce 'd'. If that's what you want, you should write

class Char ...
    ...
    def +(x: Int): Char = (this.toInt + x).toChar
    def <(x: Char): Boolean = this.toInt < x.toInt

and you could add and compare characters to your liking. The fact that Char is the only unsigned 16-bit type only means we need a new Word (or unsigned Short) type.
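
A minimal, compilable sketch of that idea, using a hypothetical SafeChar wrapper (the name and all details beyond the question's two methods are my assumptions, not anything in Predef):

class SafeChar(val underlying: scala.Char) {
  // arithmetic stays in character-land; nothing widens to Int implicitly
  def +(x: Int): SafeChar = new SafeChar((underlying.toInt + x).toChar)
  def <(that: SafeChar): Boolean = underlying < that.underlying
}

val d = new SafeChar('c') + 1 // a SafeChar wrapping 'd'
// val n: Int = d             // does not compile: no implicit widening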

A: 

It may not be perfect but consider the alternative...

It's just an implementation choice. At one point, many languages did distinguish sharply between char and int, but since they always had to provide conversion functions anyway, the distinction didn't actually prevent much and just became a big pain.

No one really wants to go back to the days of chr(n) and ord(c) all over the place. Imagine what fun UTF-8 handling would be.

Once C came out with chars really being ints, it rocked, and we've stayed with that even in much more strongly typed languages.

Now, should you really want an encapsulated Char that doesn't automatically convert, nothing stops you from defining a new type, say ElazarChar, for your own code. I suspect that you will hate it.
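
For what it's worth, a sketch of the wrapping such a type forces on every call site (the ElazarChar definition below is my own guess at the suggestion, not code from the answer):

case class ElazarChar(c: Char) // deliberately no conversion to Int

val counts = Map(ElazarChar('a') -> 1, ElazarChar('b') -> 2)
// counts('a')           // does not compile: a bare Char is not a key
counts(ElazarChar('a'))  // every lookup must wrap explicitly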

DigitalRoss
I considered the alternative and proposed one in the updated question.
Elazar Leibovich
You have very good points, but remember that the actual question you asked was not *"is this a good idea"* or *"how about this alternative"*, but *"Why does Scala..."*. I always try to answer the exact question asked, though that does often get me in trouble. Someone once asked how to shallow copy an (immutable) list so I gave that exact answer and got blasted for telling him to copy a val. :-)
DigitalRoss
+4  A: 

Well, if you think in terms of how a Char is represented: a Char is just an unsigned 16-bit field, with a range from 0 to 2^16 - 1. That fits without overflow in an Int (32-bit signed, with a range from -2^31 to 2^31 - 1).

Some of Scala's basic types are, in order of the length of their representation in bits:

  • Byte (8)
  • Char (16)
  • Int (32)
  • Long (64)

All are signed except for Char, and each is implicitly convertible to every type below it in this list, since such a conversion can never overflow or underflow. The one exception is Byte to Char, which doesn't exist, because a negative Byte has no Char representation. You can see all of these implicit conversions in Predef.
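
A quick demonstration of those widenings (the value names are arbitrary):

val b: Byte = 42
val i: Int = b     // Byte widens implicitly to Int
val l: Long = i    // Int widens implicitly to Long
val n: Int = 'a'   // the conversion in question: Char widens to Int (97)
// val c: Char = b // does not compile: there is no Byte-to-Char conversion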

This, I believe, is the reason the implicit conversions exist - to allow expressions like the following:

def foo(n: Int) { println(n) }
foo('a')     // char2int widens 'a' to 97 and prints it
foo('a' + 2) // 'a' + 2 is already an Int, no conversion needed; prints 99

Another explanation is along the lines of the one you gave (a Char is just a number...). To me it does make sense: the set of all Chars is included in the set of all Ints, and therefore, applying my own guidelines for using implicits, the conversion really should be implicit.

I do understand your annoyance, as I like compilers to flag errors like the one you just gave as an example. It would be nice if Scala had a way to turn implicit conversions off (or to turn specific implicit conversions off, as turning them all off would probably wreak havoc!).

The only solution I see for your problem is using Map[RichChar, Int] or something similar - a RichChar is not implicitly converted to an Int, as implicit conversions cannot be chained (Char to RichChar to Int would take two steps). EDIT: it turns out there is actually no implicit conversion from RichChar back to Char either, as the following demonstrates.

def foo(x: Int) = x + 1

import scala.runtime.RichChar

val ch = 'a'
val rch = new RichChar('a')

foo(ch) // compiles fine
// foo(rch) // does not compile

def bar(ch: Char) = println(ch) 

// bar(rch) // oops... does not compile
implicit def rch2char(rch: RichChar): Char = rch.self.asInstanceOf[Char]

bar(rch) // yay!

EDIT: Actually, if you have a good look at the Scala API, Char does have an overloaded + method which takes an Int argument. Same goes for Int. This could have to do with the fact that the underlying JVM does something similar.

Also note that the example I gave had nothing to do with allowing the addition of Ints to Chars - that is already allowed by the API, since Char's + takes an Int. The more subtle point is that such an addition yields an Int, and the implicit conversion is what lets a plain Char be used in the same places an Int is expected.
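
A short illustration of both points, using nothing beyond the standard API (the helper twice is made up):

val i: Int = 'a' + 2           // Char's overloaded + returns an Int: 99
val c: Char = ('a' + 2).toChar // going back to Char must be explicit: 'c'

def twice(n: Int) = n * 2
twice('a') // char2int lets a plain Char stand in for an Int: 194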

Also note the more theoretical answer I gave above: Char is a subset of Int!


Flaviu Cipcigan