I have 6000 work items stored in TFS

  1. Firstly, I need to receive them all (well, this is fairly simple to do using WIQL or something else)
  2. Then, I need to filter out all work items I did not work on during a particular date range. In practice, I check each work item's history for changes made within that range.
  3. Finally, I show work items on the web page (the fastest step).

However, the whole process takes about 300 seconds to complete. I assume this is because I need to analyze the history of each work item. So, are there any tricks that could improve the time?

More details: I have a web application that needs to do all this, I am using .NET, I am using the WorkItemStore cache (which does not seem to help much), and I am free to use any tool to speed up the process.
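One trick worth trying before anything heavier: push the date filter into the WIQL query itself, so the server returns only the items changed in the range instead of all 6000. A sketch (the date literals are placeholders; note that `[System.ChangedDate]` reflects only the latest revision, so an item also changed after the range end would slip past the upper bound and you would still need to inspect its revisions):

```sql
SELECT [System.Id], [System.Title], [System.ChangedDate]
FROM WorkItems
WHERE [System.ChangedDate] >= '2010-01-01'
  AND [System.ChangedDate] <= '2010-01-31'
ORDER BY [System.ChangedDate] DESC
```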

A: 

Absolute fastest way: an SSIS package (or any ETL process) that performs the data transformations and stores the denormalized data this application needs somewhere queryable. Depending on your scenario, the package could run nightly, hourly, or at whatever frequency (within sane limits) you need. However, if you need a real-time view of the data (everybody thinks they do, but rarely actually does), this won't work. In that case I'd look into caching the data and only fetching and filtering the items that have changed.

It seems to me that the real bottleneck in this process is that you're grabbing all of the work items and then doing the filtering.
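The incremental idea above can be sketched language-agnostically (Python here for brevity; the cache shape, `fetch_changed_since`, and the work-item dictionaries are all hypothetical stand-ins for whatever store and TFS client you actually use):

```python
from datetime import datetime

# Hypothetical in-memory cache: work item id -> latest known snapshot.
cache = {}
last_refresh = datetime.min

def refresh(fetch_changed_since, now):
    """Pull only items changed since the previous refresh and merge them in.

    fetch_changed_since(ts) stands in for a server-side query such as
    a WIQL "... WHERE [System.ChangedDate] >= ts" round trip.
    """
    global last_refresh
    for item in fetch_changed_since(last_refresh):
        cache[item["id"]] = item  # insert, or overwrite the stale copy
    last_refresh = now

def items_changed_between(start, end):
    """Answer the date-range question from the cache, with no server round trip."""
    return [i for i in cache.values() if start <= i["changed"] <= end]
```

The point is that after the first full load, each refresh transfers only the delta, and the expensive per-item history analysis happens once per change rather than once per page view.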

marr75
A: 

The fastest way would be to query the TfsWorkItemTracking database on the server directly via SQL. This is not recommended, of course, but I have done it for a similar web application and it works quite well and reasonably fast. The database structure is not too complicated.
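For reference, a hedged sketch of what such a direct query might look like. The table and column names (`dbo.WorkItemsAre` holding the current revision of each item, bracketed field names with spaces) match the TFS 2005-2010 era schema as I recall it, but verify them against your own TfsWorkItemTracking database before relying on this:

```sql
-- WorkItemsAre: one row per work item, current revision (schema varies by TFS version).
SELECT wi.[ID], wi.[Title], wi.[Changed Date]
FROM dbo.WorkItemsAre AS wi
WHERE wi.[Changed Date] >= '2010-01-01'
  AND wi.[Changed Date] <  '2010-02-01'
ORDER BY wi.[Changed Date] DESC;
```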

Jan