... is very slow. We're trying to deploy a 280 MB cspkg file through the VS2010 tools, and it takes roughly 35 minutes to upload, and another 10 minutes to deploy.

Are there any ways to speed up this upload process? We're contemplating putting invariant data into a blob and pulling it from there, but we'd like to know what's happening in the first place.

Edited to reflect that we're using the VS2010 Azure integration tools.

A: 

Both deployment methods (API and Portal) allow you to deploy from a file that is already uploaded to Azure Storage. The VSTS tools are just utilizing this feature behind the scenes. (In 2010 you have to provide storage credentials for this reason).

You should look into uploading the .cspkg into a blob directly (rather than through VSTS), then write a simple upload client that breaks the upload into blocks that can be uploaded simultaneously. You can then tweak the block size and the number of blocks in flight at a time to better utilize your outgoing bandwidth. Finally, use the API to "assemble" the blocks in Azure once they are all there. This should really speed up the upload.

I think, to answer your question as to "what's happening": you are just getting synchronous WebClient I/O to Azure Storage, with all the limitations that come with it.

Taylor
A: 

We have been hitting a very similar problem recently, as we had to package about 40 MB of third-party libraries to establish a SQL connection to Oracle from Windows Azure.

Through Lokad.CQRS, we did exactly what you suggest: moving all the big static libraries into blob storage and keeping the Azure package as lean as possible. It works very nicely.

Joannes Vermorel