What is a good way for Excel to test network connection speed to the network drive my Excel project is stored on?

Many of my clients have been complaining that my Excel project runs too slowly (running it off the network). This is because some of the offices have slower connections than others to the network drive where it (and its Access backend) is stored.

I think it'd be best to try to write a macro which measures connection speed to where the Excel sheet is being run from, and then if it is below a certain threshold I'll show some sort of warning message telling the user they should download a copy locally.

Any recommendations of how I can do this?

Thanks, Dan

My only idea so far is to output 10,000 repeated characters to a randomly named file in ActiveWorkbook.Path, then delete it. Based on the Now() before and after this operation I can estimate connection speed. This has lots of potential issues... a full drive, write permissions, and the balance between runtime and accuracy.
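A minimal sketch of that idea might look like the following. This is illustrative only: the function name and file-naming scheme are my own, and it uses Timer (sub-second resolution) instead of Now(), which only resolves to whole seconds. The error handler covers the full-drive and write-permission concerns by returning a sentinel value.

```vba
' Sketch: time how long it takes to write ~10,000 characters to the
' workbook's folder, returning elapsed seconds, or -1 on any failure
' (full drive, no write permission, etc.).
Function EstimateWriteSeconds() As Double
    Dim sPath As String
    Dim iFile As Integer
    Dim tStart As Single

    sPath = ActiveWorkbook.Path & "\" & _
            "speedtest_" & Format(Now, "hhmmss") & ".tmp"

    On Error GoTo CleanFail
    tStart = Timer
    iFile = FreeFile
    Open sPath For Output As #iFile
    Print #iFile, String$(10000, "X")   ' 10,000 repeated characters
    Close #iFile
    EstimateWriteSeconds = Timer - tStart
    Kill sPath                          ' remove the probe file
    Exit Function
CleanFail:
    EstimateWriteSeconds = -1           ' signal: could not measure
End Function
```

Workbook_Open could then compare the result against a threshold and raise a MsgBox warning when the write took too long (or returned -1).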

+2  A: 

I'd suspect that the problem is not with your Excel application per se. Whether it's an xls or xla, the file gets loaded into RAM on the client machine. The problem is the 800-pound Access gorilla.

As you may be aware, Access is not a client-server database regardless of the fact that the MDB may be sitting on a file server. When you run a query (whether SQL or QueryDef) from the client all the data required in the query is transferred to the client. This is in contrast to a true client-server database (think SQL Server or Oracle) where the query is resolved on the server and only the results returned to the client.

Some suggestions:

1) Record the time just before and after query executions and write the results to a central text file. This serves only to identify the slowest queries. Fix the worst culprits first.
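One way to sketch that logging idea in VBA (the log path and query name below are placeholders I've invented, not anything from the original post):

```vba
' Illustrative sketch: append each query's elapsed time to a shared log
' file so the slowest queries can be identified later.
Const LOG_PATH As String = "\\server\share\query_timings.log"

Sub LogQueryTime(ByVal sQueryName As String, ByVal dSeconds As Double)
    Dim iFile As Integer
    On Error Resume Next                 ' never let logging break the app
    iFile = FreeFile
    Open LOG_PATH For Append As #iFile
    Print #iFile, Format(Now, "yyyy-mm-dd hh:nn:ss") & vbTab & _
                  Environ$("COMPUTERNAME") & vbTab & _
                  sQueryName & vbTab & Format(dSeconds, "0.00")
    Close #iFile
End Sub

' Usage around an existing query:
'   Dim t As Single: t = Timer
'   ... run the query ...
'   LogQueryTime "qryMonthlySales", Timer - t
```

Including the machine name lets you see whether the slow timings cluster in particular offices.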

2) Using a copy of the MDB, experiment with table indexing. Look at which non-unique fields are being used in where clauses. Reduce joins as much as you can.

3) If this is mainly a reporting tool, can you denormalise the tables? Access suffers badly from table depth performance issues.

4) Can you cache data locally? In one solution I allowed users to select what data they required (typically a fraction of the whole) and then cached it on their C: drive. Subsequent queries were executed against the local data.

5) Are you using a shared workbook? Avoid this if possible.

6) Talk to your users. Find out precisely what they find slow, when it happens, and what actions they were taking at the time.

All the best

Marcus from London
I significantly normalized my database which gave good speed and space savings. I have done lots of indexing and think I've done the best optimizing I can do for the 2 queries (with changing where statements). At this point I don't think it can get any better, so I'm just trying to figure out how to test the network speed.
Dan
A: 

Try writing a fixed number of characters to the same directory the workbook is stored in (ActiveWorkbook.Path). Do this in a loop, repeating until 10,000 characters have been written (or whatever number you choose). Take a timing measurement before and after. Figure out an appropriate cutoff in seconds and display a message for anyone whose write took longer than that.

Dan