Cloud Spectator released the CloudSpecs™ Web Host Monitor to help consumers make informed decisions when selecting web hosting providers and to promote transparency in the shared hosting space. By tracking the uptime and response time of twenty-four Linux, Windows and Managed WordPress offerings, we were able to present performance rankings based on data-driven results.
Below is a summary of the uptime and response time results for a variety of web hosting providers from the CloudSpecs™ Web Host Monitor.
Summary
Taken as a group, Linux providers showed average response time and uptime performance relative to the three offering types overall. Among Linux hosting providers, A Small Orange stood out with both the shortest response time (547 ms) and perfect uptime (100%). GoDaddy also exhibited a short response time (621 ms) but a slightly lower uptime score (99.98%), while Rackspace showed a longer response time (880 ms) despite its perfect uptime score (100%).
Compared with the results from the Linux and Managed WordPress hosting options, the response times of Windows offerings varied far more widely, with the slowest provider responding nearly 30 times slower than the fastest. GoDaddy posted the best response time for a Windows offering (467 ms), though its uptime score left room for improvement (99.84%). Rackspace paired a perfect uptime score (100%) with a longer response time (652 ms), placing third behind GoDaddy and HostGator (511 ms).
Generally speaking, uptime results from Managed WordPress hosting providers were a little more scattered than those from Linux and Windows hosting providers.
Managed WordPress offerings had a 60% shorter average response time than Linux hosting providers. Flywheel scored at the top for both response time (244 ms) and uptime (100%) among providers carrying Managed WordPress offerings; its response time was also the shortest across all Linux, Windows and Managed WordPress hosting providers. GoDaddy and WPEngine ranked highly for both response time and uptime (328 ms and 100%, and 252 ms and 100%, respectively). While Pagely and BlueHost scored a little lower on uptime (99.95% and 99.99% respectively), their response times remained in the top tier among all providers (331 ms and 339 ms respectively).
It should be noted that although the data collected on uptime and response time is a good indication of web hosting performance, users should be cautious when generalizing these results, given that the tests were performed with all default settings and a limited set of probe locations. If specific hosting requirements apply, we suggest customers seek relevant consultation and perform customized tests.
Uptime Leaders
A higher percentage is better, as it indicates the share of overall time the server was up during the month.
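For context, an uptime percentage can be translated into the downtime it implies over a 30-day month. The short Python sketch below is purely illustrative (it is not part of the monitoring toolchain) and uses the uptime scores mentioned in the summary above:

```python
# Illustration only: translate a monthly uptime percentage into the
# downtime it implies over a 30-day month.
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month


def downtime_minutes(uptime_pct: float) -> float:
    """Minutes of downtime implied by an uptime percentage for one month."""
    return MINUTES_PER_MONTH * (1 - uptime_pct / 100)


for pct in (100.0, 99.98, 99.95, 99.84):
    print(f"{pct:.2f}% uptime -> {downtime_minutes(pct):.1f} min of downtime")
```

By this arithmetic, 99.98% uptime corresponds to roughly 8.6 minutes of downtime in a month, while 99.84% corresponds to about 69 minutes.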
Linux
Windows
Managed WordPress
Response Time Leaders
Lower values are better for response time as they indicate less latency.
Linux
Windows
Managed WordPress
Methodology
CloudSpecs™ gathered accurate metrics for response time and uptime with Pingdom's monitoring service. Cloud Spectator adopted Pingdom's methodology for measurements to further maintain objectivity.
Cloud Spectator tracked three types of hosting offerings: Linux, Windows and Managed WordPress. For each offering, a mock website was created to simulate the end-user experience as accurately as possible. Linux websites used WordPress, while Windows websites used DotNetNuke (DNN). All web content (images, text, etc.) was hosted on the local server, so performance did not depend on files or objects stored outside the web server. Not all of the providers carried all three types of offerings; therefore, the providers listed in each section vary.
The web hosting providers measured in the three offerings were:
Offering | Providers
---|---
Linux | GoDaddy, 1&1, Network Solutions, FatCow, A Small Orange, BlueHost, Domain.com, HostGator, iPage, iPower and Rackspace
Windows | GoDaddy, 1&1, HostGator, Rackspace and WinHost
Managed WordPress | GoDaddy, Pagely, WPEngine, BlueHost, WebSynthesis, Flywheel, Siteground and Pressable
Cloud Spectator set up anonymous accounts on each hosting provider using all default settings.
Pingdom gathered uptime data by pinging the server at set intervals. The web server was recorded as up if a response was received; if no response was received for two sequential pings, the web server was marked as down for that interval.
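As a rough sketch of that check logic (an illustration under assumed settings, not Pingdom's actual implementation), the following Python loop issues an HTTP check at a fixed interval and only records downtime after two consecutive failures; the URL and interval are placeholders:

```python
import time

import requests  # assumed HTTP client; any library that can issue a GET works

CHECK_INTERVAL_SECONDS = 60             # assumed polling interval
SITE_URL = "http://example-host.test/"  # placeholder for the monitored mock site


def site_is_up(url: str) -> bool:
    """Return True if the site answers an HTTP request within the timeout."""
    try:
        return requests.get(url, timeout=10).status_code < 500
    except requests.RequestException:
        return False


consecutive_failures = 0
while True:
    if site_is_up(SITE_URL):
        consecutive_failures = 0
    else:
        consecutive_failures += 1
        if consecutive_failures >= 2:
            # Two sequential failed checks: record this interval as downtime.
            print("down at", time.strftime("%Y-%m-%d %H:%M:%S"))
    time.sleep(CHECK_INTERVAL_SECONDS)
```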
Pingdom gathered response data while the server was up. Response time was tracked using Pingdom's Node Group, a feature reserved for its Enterprise users. A node pinged the web server and recorded the duration between sending the message and receiving the reply as an indication of network speed. These durations were tracked globally, with the locations in the node group taking turns on a rotational basis (a simplified sketch of this rotation follows the probe table below). The node group included the following probes:
Region | Nodes
---|---
North America East | Toronto, Newark
North America Central | St. Louis, Denver, Calgary
North America West | Los Angeles, Las Vegas
Europe | Prague, Amsterdam, Strasbourg, London
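The Python sketch below illustrates the general idea of rotating probes that time a request/response round trip. It is a single-machine stand-in for illustration only, whereas Pingdom's nodes run remotely from the regions listed above; the URL and interval are placeholders:

```python
import itertools
import time

import requests  # assumed HTTP client

SITE_URL = "http://example-host.test/"  # placeholder for the monitored mock site
NODES = ["Toronto", "Newark", "St. Louis", "Denver", "Calgary",
         "Los Angeles", "Las Vegas", "Prague", "Amsterdam", "Strasbourg", "London"]


def measure_response_ms(url: str) -> float:
    """Time one request/response round trip in milliseconds."""
    start = time.perf_counter()
    requests.get(url, timeout=30)
    return (time.perf_counter() - start) * 1000.0


# Rotate through the probe locations. In the real node group each probe runs
# in its own region; here a single machine simply labels each check.
for node in itertools.cycle(NODES):
    print(f"{node}: {measure_response_ms(SITE_URL):.0f} ms")
    time.sleep(60)  # assumed check interval
```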