I am using clojure.contrib.sql to fetch some records from an SQLite database.

(defn read-all-foo []
  (with-connection *db*
    (with-query-results res ["select * from foo"]
       (into [] res))))

Now, I don't really want to realize the whole sequence before returning from the function (i.e. I want to keep it lazy), but if I return res directly or wrap it in some kind of lazy wrapper (for example, I want to apply a certain map transformation to the result sequence), the SQL-related bindings will be reset and the connection will be closed after the function returns, so realizing the sequence will throw an exception.

How can I enclose the whole function in a closure and return a kind of iterator block (like yield in C# or Python)?

Or is there another way to return a lazy sequence from this function?

A: 

I have never used SQLite with Clojure before, but my guess is that with-connection closes the connection when its body has been evaluated. So you need to manage the connection yourself if you want to keep it open, and close it when you have finished reading the elements you're interested in.
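
For illustration, a minimal sketch of managing the connection by hand with plain JDBC and clojure.core's resultset-seq (the JDBC URL and the function names here are assumptions, and the SQLite JDBC driver is assumed to be on the classpath):

(import '(java.sql DriverManager))

(defn open-foo-cursor []
  (let [conn (DriverManager/getConnection "jdbc:sqlite:foo.db")
        stmt (.createStatement conn)
        rs   (.executeQuery stmt "select * from foo")]
    ;; resultset-seq is lazy; conn must stay open while rows are consumed
    {:conn conn
     :rows (resultset-seq rs)}))

(defn close-foo-cursor [{:keys [conn]}]
  (.close conn))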

Michiel Borkent
That's what I want to do, but I want it to be handled automagically by an iterator-block closure (or some other form of closure that implements the lazy seq interface).
Alex B
+5  A: 

The resultset-seq that with-query-results returns is probably already as lazy as you're going to get. Laziness only works as long as the handle is open, as you said. There's no way around this. You can't read from a database if the database handle is closed.

If you need to do I/O and keep the data after the handle is closed, then open the handle, slurp it in fast (defeating laziness), close the handle, and work with the results afterward. If you want to iterate over some data without keeping it all in memory at once, then open the handle, get a lazy seq on the data, doseq over it, then close the handle.

So if you want to do something with each row (for side-effects) and discard the results without eating the whole resultset into memory, then you could do this:

(defn do-something-with-all-foo [f]
  (let [sql "select * from foo"]
    (with-connection *db*
      (with-query-results res [sql]
        (doseq [row res]
          (f row))))))

user> (do-something-with-all-foo println)
{:id 1}
{:id 2}
{:id 3}
nil

;; transforming the data as you go
user> (do-something-with-all-foo #(println (assoc % :bar :baz)))
{:id 1, :bar :baz}
{:id 2, :bar :baz}
{:id 3, :bar :baz}

If you want your data to hang around long-term, then you may as well slurp it all in using your read-all-foo function above (thus defeating laziness). If you want to transform the data, then map over the results after you've fetched it all. Your data will all be in memory at that point, but the map call itself and your post-fetch data transformations will be lazy.
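
For example, a small sketch reusing read-all-foo from the question:

(let [rows (read-all-foo)]            ; eager: the whole resultset is in memory
  (map #(assoc % :bar :baz) rows))    ; lazy: each transformation runs on demand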

Brian Carper
+1  A: 

It is in fact possible to add a "terminating side-effect" to a lazy sequence, to be executed once, when the entire sequence is consumed for the first time:

(def s (lazy-cat (range 10) (do (println :foo) nil)))

(first s)
; => returns 0, prints out nothing

(doall (take 10 s))
; => returns (0 1 2 3 4 5 6 7 8 9), prints nothing

(last s)
; => returns 9, prints :foo

(doall s)
; => returns (0 1 2 3 4 5 6 7 8 9), prints :foo
; or rather, prints :foo if it's the first time s has been
; consumed in full; you'll have to redefine it if you called
; (last s) earlier

I'm not sure I'd use this to close a DB connection, though -- I think it's considered best practice not to hold on to a DB connection indefinitely, and putting your connection-closing call at the end of your lazy sequence of results would not only hold on to the connection longer than strictly necessary, but also open up the possibility that your program will fail for an unrelated reason without ever closing the connection. Thus for this scenario, I would normally just slurp in all the data. As Brian says, you can store it all somewhere unprocessed, then perform any transformations lazily, so you should be fine as long as you're not trying to pull in a really huge dataset in one chunk.

But then I don't know your exact circumstances, so if it makes sense from your point of view, you can definitely call a connection-closing function at the tail end of your result sequence. As Michiel Borkent points out, you wouldn't be able to use with-connection if you wanted to do this.
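
A rough sketch of what that might look like -- fetch-foo-lazily and close-connection! are hypothetical helpers, not part of clojure.contrib.sql:

(defn read-all-foo-lazy [conn]
  ;; the (do ... nil) tail runs once, when the seq is consumed in full,
  ;; closing the connection as a terminating side-effect
  (lazy-cat (fetch-foo-lazily conn "select * from foo")
            (do (close-connection! conn) nil)))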

Michał Marczyk
That's a very nice trick...
Hamza Yerlikaya
A: 

There is no way to create a function or macro "on top" of with-connection and with-query-results to add laziness. Both close their Connection and ResultSet, respectively, when control flow leaves their lexical scope.

As Michał said, it would be no problem to create a lazy seq that closes its ResultSet and Connection lazily. As he also said, it wouldn't be a good idea unless you can guarantee that the sequences are eventually consumed in full.

A feasible solution might be:

(def *deferred-resultsets*)

(defmacro with-deferred-close [& body]
  `(binding [*deferred-resultsets* (atom #{})]
     (let [ret# (do ~@body)]
       ;; close all resultsets opened while body was running
       (doseq [rs# @*deferred-resultsets*]
         (.close rs#))
       ret#)))

(defmacro with-deferred-results [bind-form sql & body]
  `(let [resultset# (execute-query ...)]
     (swap! *deferred-resultsets* conj resultset#)
     ;; execute body, similar to with-query-results,
     ;; but leave the resultset open
     ))

This would allow for e.g. keeping the resultsets open until the current request is finished.
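
Hypothetical usage, building on the sketch above:

(with-deferred-close
  ;; the resultset stays open until with-deferred-close exits
  (with-deferred-results res ["select * from foo"]
    (doseq [row res]
      (println row))))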

Bendlas