While businesses, and government agencies for that matter, are considering buying hosted services and moving data to the ‘cloud’, there is growing concern about where the data is actually stored and where the services are hosted. ‘Cloud’ based services have raised issues of security, control, and trust, but one fact many tend to disregard is that the geographical location of your hosting provider matters well beyond the security of your data.
Low latency and high bandwidth are essential to a good user experience. Bandwidth is just one element of what an average person perceives as the speed of a network; latency is the other.
Latency in a packet-switched network is measured either one-way (the time from the source sending a packet to the destination receiving it), or round-trip (the one-way latency from source to destination plus the one-way latency from the destination back to the source).
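Round-trip latency is the easier of the two to measure yourself. As a rough sketch (not a substitute for proper tools like ping), the time a TCP handshake takes approximates one round trip, since connect() only returns once the peer’s SYN-ACK has come back. The demo below runs against a local listener so it is self-contained; in practice you would point it at a remote host and port:

```python
import socket
import time

def tcp_rtt(host: str, port: int, timeout: float = 2.0) -> float:
    """Approximate round-trip latency as the duration of a TCP
    three-way handshake: connect() returns only after the peer's
    SYN-ACK arrives, i.e. after one full round trip."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return time.perf_counter() - start

# Demo against a local listening socket, so the example is self-contained.
server = socket.socket()
server.bind(("127.0.0.1", 0))        # let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

rtt = tcp_rtt("127.0.0.1", port)
print(f"round-trip time: {rtt * 1000:.3f} ms")
server.close()
```

On localhost this will report a fraction of a millisecond; against a server on another continent the same measurement easily reaches hundreds of milliseconds.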
The term latency refers to the delays typically incurred in the processing of network data. Put simply, according to Wikipedia: «(…) in a non-trivial network, a typical packet will be forwarded over many links via many gateways, each of which will not begin to forward the packet until it has been completely received. In such a network, the minimal latency is the sum of the minimum latency of each link, plus the transmission delay of each link except the final one, plus the forwarding latency of each gateway. In practice, this minimal latency is further augmented by queuing and processing delays.»
The Internet is what we may call a «non-trivial network», where data travels over many links and through many gateways. As the distance increases, the number of links and gateways, or nodes, increases, and so does latency.
Moreover, the transport medium itself, the cable, suffers from latency, even fibre optics: «(…) light travelling about 1.5 times faster in a vacuum than it does in the cable. This works out to about 4.9 microseconds of latency for every kilometer. In shorter metro networks, the latency performance rises a bit more due to building risers and cross-connects and can bring the latency as high as 5 microseconds per kilometer.» Add that to the above inefficiencies, and the round-trip latency on even the best networks can approach 300 ms.
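The arithmetic behind those figures is straightforward. A short sketch, assuming a typical refractive index of about 1.5 for fibre and an illustrative cable distance of 8 500 km between Norway and the American West Coast (both numbers are rough assumptions, not measurements):

```python
# Propagation delay in optical fibre: light travels roughly 1.5 times
# slower in glass than in a vacuum (refractive index ~1.5, an assumption).
C_VACUUM_KM_S = 299_792.458      # speed of light in vacuum, km/s
REFRACTIVE_INDEX = 1.5           # typical for fibre

def fibre_latency_ms(distance_km: float, round_trip: bool = True) -> float:
    """Propagation delay over `distance_km` of fibre, in milliseconds."""
    one_way_s = distance_km * REFRACTIVE_INDEX / C_VACUUM_KM_S
    return (2 if round_trip else 1) * one_way_s * 1000

# Per-kilometre delay, one way: about 5 microseconds.
print(f"{fibre_latency_ms(1, round_trip=False) * 1000:.1f} µs/km")

# Norway to the US West Coast, assuming ~8 500 km of cable:
print(f"{fibre_latency_ms(8500):.0f} ms round trip")
```

Note that this is the physical floor for propagation alone; queuing, forwarding, and processing delays at each node come on top, which is how real-world round trips approach the 300 ms mentioned above.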
High latency, in turn, yields slow page loads, delays in your web applications, and lag in your remote desktop or terminal sessions; in short, a horrific user experience. Buying lots of cheap bandwidth on the American West Coast will not help your business operation; not even a 10 Gbps connection can make your data travel faster than the speed of light. According to Google: «Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we’ve seen in our internal studies that when a site responds slowly, visitors spend less time there.»
In a packet-switched «non-trivial network» such as the Internet, routing directs the transit of logically addressed packets from their source toward their ultimate destination through intermediate nodes. As stated above, when distance increases, these packets must travel through an increased number of nodes to reach their destination. This does not only create latency; each node is typically a hardware device, and as such they sometimes break.
A broken node usually forces packets to find another way through the Internet, but sometimes such a route may not exist. When no route to the host exists, your users will experience downtime instead of latency, even if your Internet connection, and theirs, seems otherwise fine. Without a route to your servers, your users will perceive your services as down and inaccessible; for them, it’s as if someone has pulled the plug on you. Hosting your servers far away from your users makes you more prone to perceived downtime.
Search Engine Marketing
Appearing in organic search results in Google can be crucial for an Internet business. Despite what one might think, Google is, to some extent, quite open about what factors affect search engine rankings. Two very important factors are page load speed and the geographical location of your server.
In April 2010, Google made it official that page load speed affects Google rankings, stating: «(…) faster sites don’t just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that’s why we’ve decided to take site speed into account in our search rankings.»
So slow page loads will affect your ranking, but the geographical location of your web server can also affect your Google rankings by itself. To determine the geotargeting of your site, Google considers your TLD and your server location. If you use a gTLD (generic Top Level Domain, such as .com/.net/.org), Google will try to determine your server location to find out what country you operate in. Why? The search engine rankings for a given keyword differ in country-specific search results, because Google considers it important for its users to have information served in their own language.
In short, this means that if you operate in, for example, Norway using a .com domain, or even worse, a wrong ccTLD (country code Top Level Domain) such as .as, and host your site on the American East Coast, you will be outranked in the search results by Norwegian sites hosted in Norway and Norwegian sites with a .no TLD (note that when you use a ccTLD, Google does not consider the geographical location of your server).
Whether you are a hosting provider or considering buying hosted, or ‘cloud’, services, you should make sure the servers are located close to your users and invest in high-end hosting with fast networks. You might save a few hundred bucks each year using a service like Amazon, which is high-end but far away from Norway, but the stakes are high, and I think you will agree that the upside is relatively small.