I have a scenario that I'm trying to turn into a more responsive UI by pre-fetching, where possible, some sub-elements of the results before the user actually requires them. I'm unclear on how best to approach the threading, so I'm hoping someone can provide some advice.

Scenario

There is a search form (.NET rich client) that enables the user to select an account for a given customer. The user searches for given text to find a collection of customers, which are then displayed in a result grid. When the user selects a customer, the list of accounts for that customer is fetched and displayed in a second grid for user selection, in order to make up the final context (that is, an account) to open.

Existing System

I have this all running in a request/response manner, using regular background threading to resolve customers and accounts in direct response to the user's selections. The UI is locked/disabled (but responsive) until the accounts are found.
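In lieu of the real form code, the existing request/response pattern might be sketched like this (a minimal sketch; `BeginSearch`, `findAccountsFor`, and the callback wiring are illustrative names, not from the original post):

```csharp
using System;
using System.ComponentModel;

// Sketch of the existing flow: the UI is disabled, the account search runs
// on a BackgroundWorker, and the result is delivered via RunWorkerCompleted
// (which WinForms marshals back to the UI thread when a form is present).
public class AccountSearch
{
    public void BeginSearch(string customerId,
                            Func<string, object> findAccountsFor,  // slow web-service call
                            Action<object> onCompleted)            // e.g. re-enable the grid
    {
        var worker = new BackgroundWorker();
        worker.DoWork += (s, e) => e.Result = findAccountsFor((string)e.Argument);
        worker.RunWorkerCompleted += (s, e) => onCompleted(e.Result);
        worker.RunWorkerAsync(customerId);
    }
}
```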

Goal

What I want to achieve is to commence fetching the accounts for the top N customers before the user has selected them, where N is the number of items being displayed in the grid.

As the user scrolls the grid, the newly displayed items will be added to the "queue" to be fetched.

Questions

  1. Is the thread pool an appropriate mechanism for managing the threads? If so, can you force a single queued work item to jump up in priority (e.g. if the user selects that customer before its fetch has started or finished)?
  2. If not, what else should I be doing?
  3. Either way, are you aware of any good blog posts and/or open source projects that exhibit this functionality?
+2  A: 

Yes, the ThreadPool is a good choice, maybe behind a BackgroundWorker or .NET 4's Task Parallel Library (TPL).

But you cannot (and should not) 'bump' a ThreadPool work item's priority, and I don't think that would be useful here anyway.

What you should probably do is use a thread-safe queue (of the top N items) and 2+ threads to process it. When an item is selected but not yet being processed, you move it to the front of the queue or start a separate thread for it immediately.
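A minimal sketch of that idea, assuming a hypothetical `PrefetchQueue<T>` (the class name and `Promote` are illustrative only; worker threads would loop on `TryDequeue` and fetch accounts for each item):

```csharp
using System;
using System.Collections.Generic;

// A locked work queue drained by a couple of worker threads; selecting a
// customer moves its work item to the front so it is fetched next.
public class PrefetchQueue<T>
{
    private readonly LinkedList<T> _items = new LinkedList<T>();
    private readonly object _sync = new object();

    public void Enqueue(T item)
    {
        lock (_sync) _items.AddLast(item);
    }

    // Called when the user selects an item: jump the queue.
    public void Promote(T item)
    {
        lock (_sync)
        {
            if (_items.Remove(item)) _items.AddFirst(item);
        }
    }

    // Workers call this in a loop; false means the queue is drained.
    public bool TryDequeue(out T item)
    {
        lock (_sync)
        {
            if (_items.Count == 0) { item = default(T); return false; }
            item = _items.First.Value;
            _items.RemoveFirst();
            return true;
        }
    }
}
```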

Henk Holterman
As long as you don't mind the data potentially being loaded twice and throwing out the old copy, the cache-with-preloader approach is definitely the easiest to get right. If you're loading anything stateful or particularly expensive then you might need an automatic (lazy-loading) cache.
Aaronaught
A: 

If .NET 4 is an option, might I recommend the new ConcurrentStack<T> collection.

http://msdn.microsoft.com/en-us/library/dd267331(v=VS.100).aspx

You can push all the items you want to pre-fetch, and if an item is selected by the user, you can push that selection onto the stack, making it the next instance to be retrieved. This works great with the new PLINQ and TPL, and makes use of the ThreadPool improvements in .NET 4.

http://channel9.msdn.com/shows/Going+Deep/Erika-Parsons-and-Eric-Eilebrecht--CLR-4-Inside-the-new-Threadpool/
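A sketch of that approach (the `Prefetcher` class and its member names are illustrative, not an established API; the LIFO semantics of `ConcurrentStack<T>` do the "bumping" for free):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

// Pre-fetch candidates are pushed onto a ConcurrentStack; a user selection is
// simply pushed again, so it becomes the next item popped (last in, first out).
public class Prefetcher<T>
{
    private readonly ConcurrentStack<T> _pending = new ConcurrentStack<T>();
    private readonly Action<T> _load;   // e.g. the slow account lookup

    public Prefetcher(Action<T> load) { _load = load; }

    public void QueueVisible(IEnumerable<T> visible)
    {
        foreach (var item in visible)
            _pending.Push(item);        // last pushed = first fetched
    }

    // A selection jumps the line just by being pushed on top of the stack.
    public void Select(T item) { _pending.Push(item); }

    // Drain the stack on a background task.
    public Task Start()
    {
        return Task.Factory.StartNew(() =>
        {
            T next;
            while (_pending.TryPop(out next))
                _load(next);
        });
    }
}
```

Note that a selected item may end up loaded twice (once by the pre-fetcher, once for the selection), which matches the cache-with-preloader caveat in the comment on the answer above.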

James Alexander
+1  A: 

By way of an update/solution: I followed Henk's suggestion (sort of), in that I keep a queue of work item objects but still process them using the ThreadPool. Selected items are 'bumped' by putting them at the front of the queue rather than the back (note: this needed a special double-ended collection).

The following describes it in detail (in lieu of code):

  • Modified Customer to keep an IList<Account> property named KnownValues, plus an object used for locking named KnownValuesSyncObject (since I want KnownValues to be null while the accounts aren't yet known).
  • The search form maintains an instance variable of type Deque<CustomerAccountLoadingWorkItem> (from PowerCollections).
  • A CustomerAccountLoadingWorkItem maintains a reference to the Customer it's meant to process, as well as the ManualResetEvent handle it was created with.
  • In order to only start loading for visible items (i.e. not scrolled off the screen), I used answers from my other post to use data virtualization as a mechanism for queuing CustomerAccountLoadingWorkItems as each 'page' is loaded. Immediately after adding to the work item queue, a task is added to the ThreadPool queue; the context/state for the ThreadPool.QueueUserWorkItem call is my _accountLoadingWorkItems queue.
  • The WaitCallback delegate launched by the ThreadPool takes the work item queue passed to it, then (using locks) dequeues the top item (returning immediately if there isn't one) and processes it. The function is a static member of CustomerAccountLoadingWorkItem so that it can access the item's private readonly state (that is, the Customer and ManualResetEvent).
  • At the end of processing, the static processing function also sets Customer.KnownValues (using KnownValuesSyncObject for locking).
  • When the user selects a value in the customers grid, its already-separate thread (via BackgroundWorker) checks whether Customer.KnownValues is filled; if not (i.e. it's probably still sitting in the queue somewhere), it adds a new CustomerAccountLoadingWorkItem to the front of the work item queue (this is why the Deque was required) and also adds a new processing task to the ThreadPool. Then, since it created the work item, it calls ManualResetEvent.WaitOne() to wait for the thread pool task to complete.
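The steps above might look roughly like this as code (a sketch only: member names follow the description, `FetchAccounts` is a placeholder for the real web-service call, and `LinkedList<T>` stands in here for PowerCollections' `Deque<T>`, with `AddLast`/`AddFirst` playing the roles of `AddToBack`/`AddToFront`):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

public class Account { }

public class Customer
{
    public readonly object KnownValuesSyncObject = new object();
    public IList<Account> KnownValues;   // null until the accounts are known
}

public class CustomerAccountLoadingWorkItem
{
    public readonly Customer Customer;
    public readonly ManualResetEvent Done = new ManualResetEvent(false);

    public CustomerAccountLoadingWorkItem(Customer customer) { Customer = customer; }

    // WaitCallback run on a ThreadPool thread: dequeue the front item (if any)
    // under a lock, process it, publish KnownValues, then signal completion.
    public static void Process(object state)
    {
        var queue = (LinkedList<CustomerAccountLoadingWorkItem>)state;
        CustomerAccountLoadingWorkItem item;
        lock (queue)
        {
            if (queue.Count == 0) return;   // another thread already took it
            item = queue.First.Value;
            queue.RemoveFirst();
        }
        IList<Account> accounts = FetchAccounts(item.Customer);
        lock (item.Customer.KnownValuesSyncObject)
            item.Customer.KnownValues = accounts;
        item.Done.Set();
    }

    private static IList<Account> FetchAccounts(Customer c)
    {
        return new List<Account>();   // placeholder for the web-service call
    }
}
```

Pre-fetches then go on the back of the queue (`AddLast`) while a user selection goes on the front (`AddFirst`), each followed by `ThreadPool.QueueUserWorkItem(CustomerAccountLoadingWorkItem.Process, queue)`; after a selection, the selecting thread calls `Done.WaitOne()` on the work item it created.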

I hope that made sense...

Of note: since my solution still uses the ThreadPool, when the user selects an item I still have to wait for the currently running thread pool threads to finish before my work item gets picked up. I figured that was OK, and possibly even desirable, seeing as the resource used to query for the accounts (a web service) would be semi-locked-up anyway, so it would probably come up about as quickly either way (due to some poor architecture and a shared web service proxy).

Overall, I certainly made what should have been an easy-ish job somewhat more difficult, and had I been able to use Framework 4 I would certainly have looked at going down the TPL route.

Reddog