Feb 05, 2015

How important is network latency when I am choosing enterprise cloud-based apps?

There was a recent question about distant data centers, which brought up the issue of latency. When looking at cloud based apps, how important is latency? Does it significantly impact user experiences?
Very important. When we move on-premises apps into the cloud, there are many subtleties that can affect network latency in ways we don't immediately realize, and getting a handle on them can be difficult. Endpoints aren't fixed: apps and users can be anywhere in the world, attached to a wide range of link speeds. The cloud infrastructure configuration can affect latency as well. For example, a typical Hadoop Big Data app can involve dozens or even hundreds of compute servers located at various data centers around the planet, and employ virtualized network infrastructure that adds further packet delays to the mix.

On top of this, traditional latency measurements using traceroute and ping don't tell the whole story. You want to measure end-to-end application performance using HTTP and the other Internet protocols that your apps actually use across your connections.
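To illustrate the difference between a ping-style probe and an application-level measurement, here is a minimal sketch in Python that times the individual phases of a single HTTP GET (DNS lookup, TCP connect, time to first byte). The host, port, and path are placeholders you would replace with your own endpoint; this is a rough illustration, not a substitute for a proper monitoring tool.

```python
import socket
import time

def http_latency(host, port=80, path="/"):
    """Time the phases of one HTTP GET; returns seconds per phase."""
    timings = {}

    # DNS resolution
    t0 = time.perf_counter()
    addr = socket.getaddrinfo(host, port)[0][-1]
    timings["dns"] = time.perf_counter() - t0

    # TCP three-way handshake
    t1 = time.perf_counter()
    sock = socket.create_connection(addr, timeout=10)
    timings["connect"] = time.perf_counter() - t1

    try:
        # Send a bare HTTP/1.1 request and wait for the first response byte
        t2 = time.perf_counter()
        request = ("GET %s HTTP/1.1\r\nHost: %s\r\n"
                   "Connection: close\r\n\r\n" % (path, host))
        sock.sendall(request.encode("ascii"))
        sock.recv(1)
        timings["first_byte"] = time.perf_counter() - t2
    finally:
        sock.close()

    timings["total"] = time.perf_counter() - t0
    return timings
```

Run repeatedly over time, per-phase numbers like these reveal whether delay comes from name resolution, the network path, or the server itself, which a single ping round-trip number hides.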

Ideally, you want to reduce latency across several dimensions:
• Reduce latency of each network node
• Reduce number of network nodes needed to traverse from one stage to another
• Eliminate network congestion
• Reduce transport protocol latency
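The payoff from each dimension can be sketched with a toy additive model. All the numbers below are illustrative assumptions, not measurements; real latency interacts in messier ways, but the model shows why cutting hop count and congestion compounds.

```python
def end_to_end_latency_ms(per_node_ms, node_count, congestion_ms, transport_ms):
    """Toy additive model: per-node delay times hop count,
    plus congestion queuing and transport-protocol overhead."""
    return per_node_ms * node_count + congestion_ms + transport_ms

# Hypothetical baseline: 12 hops at 2 ms each, 15 ms of queuing,
# 30 ms of transport (handshake/retransmit) overhead.
baseline = end_to_end_latency_ms(per_node_ms=2.0, node_count=12,
                                 congestion_ms=15.0, transport_ms=30.0)
# After halving the hop count and eliminating congestion:
improved = end_to_end_latency_ms(per_node_ms=2.0, node_count=6,
                                 congestion_ms=0.0, transport_ms=30.0)
```

In this sketch the baseline comes to 69 ms and the improved path to 42 ms, with transport overhead now the dominant remaining term, which is why the fourth dimension matters once the path itself is short.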
A good tool to help with these measurements is Dyn’s Internet Intelligence. You can read a review that I did for Network World here.

Finally, you will want to look at ways to connect directly to the cloud provider, using services such as AWS Direct Connect, Google Cloud Interconnect, or Azure ExpressRoute, which provide more consistent connections and lower latency.