tags:
views: 19
answers: 1
I have an app that I'm thinking about moving to Azure as a Worker Role with an external-facing endpoint. It's a small process that runs in about 200-400ms, but our users would like to start running the job 50K-100K times a day, per user. Before I go building the Azure prototype, I need to figure out what kind of latency I can expect communicating with an Azure external endpoint. Obviously, the latency depends on the size of the information I'm sending and receiving and on the speed of my internet connection, but I can't find any metrics anywhere. Are there any kind of baseline numbers out there?

For the sake of argument, let's say I'm on a T1 and I'm sending 10K up and 10K down with each job run.
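As a rough back-of-envelope check (assuming "T1" means the usual ~1.544 Mbps line rate and "10K" means 10 kilobytes), the raw transfer time alone works out to roughly 53 ms in each direction, before counting network round-trip latency or protocol overhead:

```python
# Back-of-envelope transfer-time estimate.
# Assumptions: T1 line rate = 1.544 Mbps, payload = 10 KB each way;
# ignores TCP/HTTP overhead and propagation delay.
T1_BPS = 1.544e6               # T1 line rate in bits per second
PAYLOAD_BITS = 10 * 1024 * 8   # 10 KB payload expressed in bits

one_way_seconds = PAYLOAD_BITS / T1_BPS
print(f"~{one_way_seconds * 1000:.0f} ms per 10 KB each way")  # → ~53 ms per 10 KB each way
```

So even on a perfect link, the two transfers together eat about 100 ms on top of whatever the round-trip latency to the data center turns out to be.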

+1  A: 

I don't think latency is exactly the term you're looking for; that's the delay involved in sending each packet over the network, which is affected more by your distance from the server and the nature of your network.

Having said that, everyone's latency results will be different; the only way to be sure is to set up a prototype and run some performance tests against it. Also remember that with Azure you can specify your data center, so select one near you.
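The timing part of such a test is easy to sketch. Here's a minimal, generic round-trip timer (the `run_job` callable and the repeat count are placeholders; you'd pass in a function that actually calls your Azure endpoint). It reports min/median/95th percentile rather than a single average, since latency distributions are usually skewed:

```python
import time
import statistics

def time_round_trips(run_job, repeats=100):
    """Time run_job (a no-arg callable performing one request/response
    round trip) `repeats` times and return timing stats in milliseconds."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        run_job()
        samples.append((time.perf_counter() - start) * 1000.0)
    samples.sort()
    return {
        "min_ms": samples[0],
        "median_ms": statistics.median(samples),
        "p95_ms": samples[int(0.95 * (len(samples) - 1))],
    }

# Stand-in job for demonstration; replace the lambda with a real call
# to your deployed endpoint to get meaningful numbers.
stats = time_round_trips(lambda: sum(range(1000)), repeats=50)
print(stats)
```

Run it from the machine and connection your users will actually use, against the data center you plan to deploy to, since both ends matter.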

Doobi
I was thinking about latency because my process is fairly small and atomic, and the time it takes to send the packets across the network will most likely dominate the total time the job takes.
Jonathan Beerhalter
I suggest caution, and if possible a rethink, on that design. I'm currently maintaining a system that someone built similarly, and it's a nightmare with respect to scalability, because latency is the one thing you're stuck with: you can't just throw more CPU, memory, or bandwidth at it.
Doobi