Overall Summary
Over the past 6 months, Cloud Spectator has been tracking uptime and response time for websites built on different web host providers in an effort to analyze and improve performance in the web hosting industry. Commissioned by GoDaddy, the project uses Pingdom, a third-party web monitoring tool, to test and measure uptime and response time of web servers hosted in the US from nodes across North America and Europe.
You can view monthly summaries from our previous blog posts.
The past 6 months of active tracking have resulted in a general understanding of web host provider performance. For Linux hosting, A Small Orange and GoDaddy were the leaders in response time and uptime. For Windows hosting, GoDaddy and Rackspace exhibited the fastest response times, while HostGator showed the best uptime. Finally, for Managed WordPress hosting, FlyWheel and WPEngine emerged as leaders in response time, while FlyWheel and WebSynthesis maintained the best uptime.
Response Time: Linux
The average response time of all Linux web host providers was 1185ms over 6 months of testing. Among Linux web host providers, A Small Orange, GoDaddy and 1&1 displayed the shortest average response times (551ms, 588ms and 869ms respectively). A Small Orange, GoDaddy, 1&1 and FatCow showed high stability (with their standard deviations being less than 10% of their averages) over the duration of the testing.
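For reference, the stability criterion used throughout this report compares a provider's standard deviation to its average. The minimal sketch below illustrates that rule; the function name and sample values are hypothetical and are not Cloud Spectator's actual tooling or data.

```python
from statistics import mean, stdev

def is_stable(samples, threshold=0.10):
    """Return True if the standard deviation is less than `threshold`
    (10% by default) of the mean -- the stability rule used in this report."""
    avg = mean(samples)
    return stdev(samples) < threshold * avg

# Hypothetical monthly response-time averages (ms) for one provider
monthly_response_ms = [540, 555, 548, 562, 551, 550]
print(is_stable(monthly_response_ms))  # True: the std dev is well under 10% of the mean
```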
Response Time: Windows
The average response time of all Windows providers was 4616ms over 6 months of testing. Among Windows web host providers, GoDaddy, Rackspace and HostGator showed the shortest average response times (501ms, 536ms and 543ms respectively). Variation across providers was large: the slowest Windows provider's response time was nearly 30 times that of the fastest. All Windows providers except for Rackspace displayed high stability over the duration of the testing.
Response Time: Managed WordPress
The average response time for all Managed WordPress providers was 469ms over 6 months of testing, which was lower than that of both Linux and Windows providers. Among Managed WordPress providers, WPEngine, FlyWheel and BlueHost exhibited the shortest average response times (257ms, 258ms and 330ms respectively). All Managed WordPress providers except for Pagely and Pressable showed high stability over the duration of the testing.
Uptime: Linux
The average uptime of all Linux web host providers was 99.86% over 6 months of testing. A Small Orange, Rackspace and GoDaddy exhibited the highest average uptime results (99.99%, 99.97% and 99.96% uptime respectively). A Small Orange, GoDaddy, iPage, Rackspace, HostGator and iPower all had high stability (with their standard deviations being less than 0.1% of their averages) over the duration of the testing.
Uptime: Windows
The average uptime for all Windows web host providers was 99.84% over 6 months of testing. HostGator, Rackspace and GoDaddy showed the highest average uptime results (99.98%, 99.86% and 99.86% uptime respectively). GoDaddy and HostGator displayed high stability over the duration of the testing.
Uptime: Managed WordPress
The average uptime for all Managed WordPress web host providers was 99.67% over 6 months of testing. FlyWheel, WPEngine and WebSynthesis showed the highest average uptime results (100.00%, 99.99% and 99.99% uptime respectively). All Managed WordPress providers except for Pagely and Pressable displayed high stability over the duration of the testing.
Methodology
CloudSpecs™ gathered accurate metrics for response time and uptime with Pingdom’s monitoring service. Cloud Spectator adopted Pingdom’s measurement methodology to further maintain objectivity.
Cloud Spectator tracked three types of hosting offerings: Linux, Windows and Managed WordPress. For each offering, a mock website was created to simulate the end-user experience as accurately as possible. Linux websites used WordPress, while Windows websites used DotNetNuke (DNN). All web content (images, text, etc.) was hosted on the local server, so performance did not depend on files or objects stored outside of the web server. Not all of the providers carried all three types of offerings; therefore, the providers listed in each section vary.
The web hosting providers measured in the three offerings were:
| Offering | Providers |
| --- | --- |
| Linux | GoDaddy, 1&1, Network Solutions, FatCow, A Small Orange, BlueHost, Domain.com, HostGator, iPage, iPower and Rackspace |
| Windows | GoDaddy, 1&1, HostGator, Rackspace and WinHost |
| Managed WordPress | GoDaddy, Pagely, WPEngine, BlueHost, WebSynthesis, FlyWheel, SiteGround and Pressable |
Cloud Spectator set up anonymous accounts on each hosting provider using all default settings.
Pingdom gathered uptime data by pinging the server at set intervals. The web server was recorded as up if a response was received. If no response was received for two sequential pings, the web server was marked as down for that interval.
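The sketch below illustrates that two-sequential-failures rule with a simple HTTP check loop. It is only an illustration of the logic described above, not Pingdom's implementation; the URL, interval and helper names are assumptions.

```python
import time
import urllib.request

def check_once(url, timeout=10):
    """Return True if the site responds to a single request, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except Exception:
        return False

def monitor(url, interval_s=60, checks=5):
    """Ping at set intervals; mark downtime only after two sequential failures."""
    consecutive_failures = 0
    for _ in range(checks):
        if check_once(url):
            consecutive_failures = 0
        else:
            consecutive_failures += 1
            if consecutive_failures >= 2:
                print("down at", time.strftime("%H:%M:%S"))
        time.sleep(interval_s)

# monitor("https://example.com")  # illustrative target
```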
Pingdom gathered response-time data while the server was up. Response time was tracked using Pingdom’s Node Group, a feature reserved for its Enterprise users. A node pinged the web server and recorded the duration between when the message was sent and when the reply was received, as an indication of network speed. These durations were tracked globally from several locations in the node group on a rotational basis, which included the following probes (a simplified sketch of the measurement appears after the table):
| Region | Nodes |
| --- | --- |
| North America East | Toronto, Newark |
| North America Central | St. Louis, Denver, Calgary |
| North America West | Los Angeles, Las Vegas |
| Europe | Prague, Amsterdam, Strasbourg, London |
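As a rough illustration of the measurement itself, the sketch below times a single request the way one probe location might, then averages several samples. It is a simplified stand-in for Pingdom's probes, and the URL, sample count and pause are assumptions.

```python
import time
import urllib.request
from statistics import mean

def timed_request(url, timeout=30):
    """Return the round-trip time in milliseconds for one request."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # fetch the body so the full response is measured
    return (time.perf_counter() - start) * 1000

def sample_response_time(url, samples=5, pause_s=1):
    """Average several timed requests, as one probe location would report."""
    times = []
    for _ in range(samples):
        times.append(timed_request(url))
        time.sleep(pause_s)
    return mean(times)

# print(sample_response_time("https://example.com"))  # illustrative target
```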