I'm using ScalaQuery with Scala.

If I have an Array[Byte] object, how do I insert it into the table?

object TestTable extends BasicTable[Test]("test") {
  def id = column[Long]("mid", O.NotNull)
  def extInfo = column[Blob]("mbody", O.Nullable)

  def * = id ~ extInfo <> (Test, Test.unapply _)
}

case class Test(id: Long, extInfo: Blob)

Can I define the column as `def extInfo = column[Array[Byte]]("mbody", O.Nullable)` instead? And how do I perform UPDATE, INSERT and SELECT operations on a BLOB-typed field?

BTW: there is no ScalaQuery tag.

+2  A: 

Since the BLOB field is nullable, I suggest changing its Scala type to Option[Blob], giving the following definition:

object TestTable extends Table[Test]("test") {
  def id = column[Long]("mid")
  def extInfo = column[Option[Blob]]("mbody")
  def * = id ~ extInfo <> (Test, Test.unapply _)
}

case class Test(id: Long, extInfo: Option[Blob])

You can use a raw, nullable Blob value if you prefer, but then you need to use orElse(null) on the column to actually get a null value out of it (instead of throwing an Exception):

      def * = id ~ extInfo.orElse(null) <> (Test, Test.unapply _)

Now for the actual BLOB handling. Reading is straightforward: you just get a Blob object in the result, implemented by the JDBC driver, e.g.:

  // Note: Option(t.extInfo) assumes the raw, nullable Blob variant;
  // getBytes copies the BLOB's contents into an Array[Byte]
  Query(TestTable) foreach { t =>
    println("mid=" + t.id + ", mbody = " +
      Option(t.extInfo).map { b => b.getBytes(1, b.length.toInt).mkString })
  }
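
If you use the Option[Blob] column suggested above instead, t.extInfo already is an Option, so the extra wrapper can be dropped. A minimal sketch of that variant:

  // Sketch for the Option[Blob] table variant
  Query(TestTable) foreach { t =>
    println("mid=" + t.id + ", mbody = " +
      t.extInfo.map { b => b.getBytes(1, b.length.toInt).mkString })
  }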

If you want to insert or update data, you need to create your own BLOBs. A suitable implementation for a stand-alone Blob object is provided by JDBC's RowSet feature:

import javax.sql.rowset.serial.SerialBlob

TestTable insert Test(1, null)
TestTable insert Test(2, new SerialBlob(Array[Byte](1,2,3)))
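
Updates go through a query that selects the column to change. A minimal sketch, assuming the raw-Blob variant of the table, the usual driver imports and an open session (the id value 2 is just an example):

  // Sketch: replace the BLOB stored for the row with id 2
  val q = for (t <- TestTable if t.id is 2L) yield t.extInfo
  q.update(new SerialBlob(Array[Byte](4, 5, 6)))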

Edit: And here's a TypeMapper[Array[Byte]] for Postgres (whose BLOBs are not yet supported by ScalaQuery):

  implicit object PostgresByteArrayTypeMapper extends
      BaseTypeMapper[Array[Byte]] with TypeMapperDelegate[Array[Byte]] {
    def apply(p: BasicProfile) = this
    val zero = new Array[Byte](0)
    val sqlType = java.sql.Types.BLOB
    override val sqlTypeName = "BYTEA" // Postgres's native binary type
    // Set a non-null parameter
    def setValue(v: Array[Byte], p: PositionedParameters) {
      p.pos += 1
      p.ps.setBytes(p.pos, v)
    }
    // Set a nullable parameter
    def setOption(v: Option[Array[Byte]], p: PositionedParameters) {
      p.pos += 1
      if(v eq None) p.ps.setBytes(p.pos, null) else p.ps.setBytes(p.pos, v.get)
    }
    // Read a value from a result set
    def nextValue(r: PositionedResult) = {
      r.pos += 1
      r.rs.getBytes(r.pos)
    }
    // Write a value through an updatable result set
    def updateValue(v: Array[Byte], r: PositionedResult) {
      r.pos += 1
      r.rs.updateBytes(r.pos, v)
    }
    // BYTEA values are not inlined into SQL literals
    override def valueToSQLLiteral(value: Array[Byte]) =
      throw new SQueryException("Cannot convert BYTEA to literal")
  }
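
For completeness, here is a sketch of how such a mapper might be wired into a table, using hypothetical BinTest/BinTestTable names. Note that the implicit has to be visible before the column definition that needs it, as the comments below work out:

  case class BinTest(id: Long, extInfo: Option[Array[Byte]])

  // The implicit PostgresByteArrayTypeMapper from above must already be
  // in scope at this point, or column[Option[Array[Byte]]] will fail to
  // find a TypeMapper and the compile error quoted below appears.
  object BinTestTable extends Table[BinTest]("test") {
    def id = column[Long]("mid")
    def extInfo = column[Option[Array[Byte]]]("mbody")
    def * = id ~ extInfo <> (BinTest, BinTest.unapply _)
  }

  BinTestTable insert BinTest(1, Some(Array[Byte](1, 2, 3)))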
szeiger
I used the above code, but the PostgreSQL driver throws an exception: at org.postgresql.jdbc2.AbstractJdbc2ResultSet.toLong(AbstractJdbc2ResultSet.java:2736), at org.postgresql.jdbc2.AbstractJdbc2ResultSet.getLong(AbstractJdbc2ResultSet.java:2032), at org.postgresql.jdbc3.Jdbc3ResultSet.getBlob(Jdbc3ResultSet.java:52), at org.scalaquery.session.PositionedResult.nextBlob(PositionedResult.scala:22). Looking at the PostgreSQL JDBC docs (http://jdbc.postgresql.org/documentation/83/binary-data.html), it seems getBytes needs to be used on the ResultSet instead.
Googol Shan
Indeed, ScalaQuery's standard BLOB handling does not work on Postgres yet. (The example works for me on H2, MySQL, Derby and HsqlDb). I have opened http://github.com/szeiger/scala-query/issues/issue/7 for this problem. For now, if you need to give the DB non-standard type names or use non-standard accessors, you can implement your own TypeMapper (possibly for a custom type that is more convenient for you to use than java.sql.Blob).
szeiger
Sorry, I tried to implement the TypeMapper myself. I defined a new `implicit object BArrayTypeMapper extends BaseTypeMapper[Array[Byte]]` and a `class BArrayTypeMapperDelegate extends TypeMapperDelegate[Array[Byte]]`, but one compile error blocks me: `could not find implicit value for evidence parameter of type org.scalaquery.ql.TypeMapper[Option[Array[Byte]]]`. Could you send me sample code implementing Array[Byte] on the PostgreSQL driver? My email is [email protected]. Thank you very much.
Googol Shan
I've edited my answer to include a TypeMapper[Array[Byte]] which works for me with Postgres's BYTEA type.
szeiger
I defined the same code as yours yesterday but couldn't compile it; the error was `could not find implicit value for evidence parameter of type org.scalaquery.ql.TypeMapper[Option[Array[Byte]]]`. Should I import some implicit helper object?
Googol Shan
I defined the implicit object in my Table object.
Googol Shan
The implicit TypeMapper needs to be available whenever you use a column of its type, not just inside the tables that define those columns. Make sure to import it at the points in your code where you are getting those compiler errors.
szeiger
I got this compiler error where TestTable defines `def extInfo = column[Option[Array[Byte]]]("mbody", O.Nullable)`. I think the compiler can't find the implicit conversion from TypeMapper[T] to TypeMapper[Option[T]], even though I already import `org.scalaquery.ql._` and `org.scalaquery.ql.TypeMapper._`. The mysterious thing is that it compiled without error once, and the code worked when I tested it, but it failed to compile again after I changed some other class. I build the project with sbt; I don't know why this happens.
Googol Shan
I found the reason: I had defined the implicit object PostgresByteArrayTypeMapper after the `def extInfo = ...` line. The compiler looks for the implicit when it reaches the `def extInfo` line, but at that point no TypeMapper[Array[Byte]] is in scope yet. The mysterious earlier success was probably because the implicit object had been compiled in a previous run, so the compiler could still find it; after cleaning the project it failed to compile again. Once I moved the implicit object's definition above the `def extInfo` line, the error did not appear again. Thanks for your great help.
Googol Shan