In a desktop app, whether WinForms or WPF, I try to do as much work as possible on a background thread, for two reasons. First, the user experience is better. Second, in my WPF testing I have found it performs better when loading a lot of data, such as records into a grid or list.
Loading data up-front vs. lazy loading is really a per-application decision. I would build a central data object that handles both scenarios. The way I would recommend doing this is to create an event-driven dependency model: place an event or callback registration function on a data manager object that various units of code subscribe to when they need data, and they are called back when the data are available. If the data are already available, the callback occurs immediately; otherwise, the code unit is called back once the data have been loaded on a background thread. For example, in some window or component you might have code that looks like:
DataManager.LoadDataAsync(dataCommandPatternObject, CallbackFunction);
...
public void CallbackFunction(SomeDataObjectClass data)
{
    // load data into UI
}
If data loading goes through a central mechanism, then when the same data are requested twice, a cached version can be served, or the second request can simply wait on the first if it is still running.
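As a rough sketch, assuming .NET's `Task` API, such a central mechanism might look like the following (`DataCommand`, its `Key` property, and `Execute` are hypothetical stand-ins for your command-pattern object):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class DataManager
{
    // One Task per data key: completed Tasks act as the cache, and
    // in-flight Tasks let a second request wait on the first.
    private readonly ConcurrentDictionary<string, Task<SomeDataObjectClass>> _loads
        = new ConcurrentDictionary<string, Task<SomeDataObjectClass>>();

    // Call from the UI thread so the callback is marshalled back to it.
    public void LoadDataAsync(DataCommand command, Action<SomeDataObjectClass> callback)
    {
        // Start the load once per key; later requests share the same Task.
        Task<SomeDataObjectClass> load = _loads.GetOrAdd(
            command.Key, _ => Task.Run(() => command.Execute()));

        // If the data are already loaded the continuation runs immediately;
        // otherwise it runs when the background load completes.
        load.ContinueWith(t => callback(t.Result),
            TaskScheduler.FromCurrentSynchronizationContext());
    }
}
```

The key design point is that callers never know or care whether the data came from cache, an in-flight request, or a fresh background load; they just receive the callback.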
If data need to be loaded up-front, a loading screen (splash screen) can request a number of pieces of data, and a callback fires as each block of data loads. When all the callbacks have fired, the splash screen exits.
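A minimal sketch of that splash-screen countdown, assuming each up-front request reports back through a shared counter (the class and member names are illustrative):

```csharp
using System;
using System.Threading;

public class SplashScreenLoader
{
    private int _pending;
    private readonly Action _onComplete;

    public SplashScreenLoader(int pieceCount, Action onComplete)
    {
        _pending = pieceCount;
        _onComplete = onComplete; // e.g. close the splash screen, show the main window
    }

    // Pass this as the callback for each piece of data requested up-front.
    public void PieceLoaded()
    {
        // Interlocked keeps the count correct when callbacks
        // arrive from multiple background loads.
        if (Interlocked.Decrement(ref _pending) == 0)
            _onComplete();
    }
}
```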
These are just a few points from various techniques I have used over the years to manage the loading of large, mostly static/lookup datasets. On top of all of this, I would also recommend some sort of client-side disk caching for very large datasets that rarely change, combined with change tracking in the database. This allows the client to load the data from local disk, which is faster than going to the DB. It also lets the DB scale better, since it is not serving out highly repetitive data and can instead focus on transactional data.
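The disk-cache-plus-change-tracking idea can be sketched like this; `GetChangeVersionFromDatabase`, `ReadCacheFile`, `LoadFromDatabase`, and `WriteCacheFile` are hypothetical helpers you would implement against your own schema and serialization format:

```csharp
// Sketch: serve a rarely-changing dataset from local disk while its
// version still matches the database, and refresh the local copy otherwise.
public SomeDataObjectClass LoadLookupData(string cacheFile)
{
    // Cheap version check, e.g. against a rowversion or last-modified column.
    long dbVersion = GetChangeVersionFromDatabase();

    if (System.IO.File.Exists(cacheFile))
    {
        var cached = ReadCacheFile(cacheFile);   // deserialize { Version, Data }
        if (cached.Version == dbVersion)
            return cached.Data;                  // fast path: no bulk DB read
    }

    // Local copy missing or stale: pull from the DB and refresh the cache.
    var fresh = LoadFromDatabase();
    WriteCacheFile(cacheFile, dbVersion, fresh);
    return fresh;
}
```

The one query the client always makes is the tiny version check, which is far cheaper for the database than repeatedly streaming the whole lookup dataset.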