08 Nov 2018

In the last blog post we answered the question: Why Whitelist an Email Address?

TL;DR: "If you expect to receive important emails from a trusted email address it is worth whitelisting the address to make sure that emails won't be accidentally blocked by an overzealous email client."

Here we provide step-by-step instructions on how to do it in Gmail by creating a filter:

1) Log in to Gmail, click on the gear icon and select "Settings":

select Gmail settings

2) Select "Filters and blocked addresses":

select Gmail filters

3) Scroll past all your existing filters and select "Create a new filter":

create new filter Gmail

4) Add the email address that you want to whitelist to the "From" field. Here we added monitor@downtimemonkey.com to make sure that we never miss a 'website down' alert:

add email to filter Gmail

5) Check the "Never send to spam" box and click "Create Filter". The email address will now be whitelisted!

create filter Gmail

Whitelisting A Whole Domain

In 'Step 4' we whitelisted a single email address. It's also possible to whitelist all emails from a domain.

By adding @downtimemonkey.com to the "From" field instead of monitor@downtimemonkey.com we would whitelist every email address belonging to downtimemonkey.com.

Whitelisting Multiple Email Addresses

To whitelist more than one email address simply add each email address separated by the pipe symbol. For example, "monitor@downtimemonkey.com | verify@downtimemonkey.com".

The pipe symbol is a vertical bar '|' that can be added with shift and backslash on most keyboards.

31 Oct 2018

Whatever email client you use, be it Gmail or Thunderbird, Outlook or Apple Mail, you can be sure that it comes with some kind of spam management built in.

Most of the time this works well - legitimate emails are delivered to your inbox and spam is either rejected or gets funnelled to your spambox.

whitelisting an email address

However, differentiating between genuine emails and spam is a notoriously difficult problem to automate and despite utilising sophisticated artificial intelligence, email clients intermittently fail to do this successfully.

This means you will occasionally see spam emails make it through to your inbox, or worse... legitimate emails may be incorrectly blocked and you miss them!

Whitelisting Prevents Overzealous Blocking

If you expect to receive important emails from a trusted email address it is worth whitelisting the address to make sure that emails won't be accidentally blocked by an overzealous email client.

You can do this for monitor@downtimemonkey.com to be 100% sure that you'll receive your email alerts if a website that you monitor goes down.

It's very unlikely that these emails would be blocked (we've never had problems) but whitelisting is straightforward and gives peace of mind.

How To Whitelist An Email Address

Whitelisting is completed in the email client and each email client is different.

In the next few blog posts we'll give step-by-step instructions for whitelisting an email address in individual email clients, starting with Gmail.

26 Sep 2018

One of our main aims when developing Downtime Monkey is to make the user experience outstanding.

With that in mind we've rolled out an improvement to the user interface that enables you to view the current status of all your websites at a glance.

Here's a screenshot of the new 'overview box' - this is now the first thing you'll see when you view your monitors:

website monitoring user interface

Down Websites Listed First

Every website monitor is still listed individually. Individual monitors now appear just below the 'overview box' and can be easily scrolled through.

We've also made a small change to the list of monitors - previously all monitors were listed in alphabetical order but now websites that are down are shown first:

website monitor list

Who Benefits?

These changes should benefit users who monitor lots of websites because they can quickly find any websites that are down without having to scroll through a long list of monitors.

Since Free Plan users can monitor up to 60 websites, Pro Plan users can monitor up to 1000 sites and Enterprise users have unlimited monitors, this could benefit everyone and we've rolled it out across the board.

To check out the new user experience simply sign up, add some websites to monitor and view your monitors. It takes less than 2 minutes!

28 Aug 2018

Over the last few months we've made several upgrades for Pro users, including: bulk import of monitors and customisable alert times.

This time, we have an upgrade for our Free Plan users...

free email support

Feedback and FAQs

Until today email support was for Pro customers only.

Free account holders could provide feedback, report problems or ask questions via our online feedback form but didn't have access to email support.

When a good question was asked - we'd add it to our list of FAQs which can be viewed by anyone who is logged in.

Email Support For All Users

In reality though, every time a Free account holder asked a question through the feedback form we emailed them right back... it just seemed like the right thing to do.

With this in mind we're now officially providing email support to Free customers as well as Pro users.

To contact support, simply sign-up and visit help and support. Email us at the address provided and we'll get right back to you.

26 Jul 2018

...as opposed to just slow?

When you visit a webpage that is down, most of the time you'll see an error: a 404 error if the page can't be found or a 503 if the server is unavailable.

Although this is not what you want to see, it is helpful. You know that the site is down and have a rough idea why.

But sometimes you don't see an error... just a spinning wheel.

You may wait for 5 seconds, or 20 seconds, or a minute but at some point if the webpage doesn't appear, you'll decide that the site is not just slow, but down.

snail saying 'when is website down or slow'

Timeout Threshold

Downtime Monkey's monitoring scripts go through the same decision-making process.

If a server is running so slowly that it doesn't respond, at some point we have to call time and mark the site down.

The question is: "How long should we wait?"

Experimenting With Timeout Threshold

Over the past few months we have been varying the timeout threshold.

We've used various times between 9 seconds and 30 seconds and examined the effect the changes have had on the efficiency and run-times of our monitoring scripts.

The aim was to find the timeout setting that allowed our monitoring scripts to run most efficiently and quickly, and set timeouts to this value.

Speed & Efficiency

We found that the monitoring scripts were quickest and most efficient when the timeout setting was lowest - this isn't surprising really but it forced us to re-think our approach.

If we dropped the timeout setting further the scripts would run faster still - in theory we could set timeouts to 1 millisecond and the monitoring scripts would be super fast. However, websites would constantly be marked as down when they didn't respond in time.

Obviously, it is unreasonable to expect a website response within a millisecond - but this poses the question: "What is a reasonable timeout setting?"

New Approach

Instead of choosing the timeout threshold that lets our monitoring scripts run fastest, we decided to use the value that is most widely considered appropriate.

So we set up a survey asking: "How long without a website response before you consider the site down, as opposed to just slow?"

The options for answering were: 5 seconds, 10 seconds, 20 seconds, 30 seconds and 1 minute.

Google+ was chosen as the survey platform because of its high response rate to polls.

We polled 9 Google+ communities (details here) with specific communities being selected because of the appropriate knowledge of the community members and their enthusiasm for past polls.

Thanks to everyone who took part!

Survey Results

Here are the full results from all polls combined. The total number of votes and the breakdown across the options (5 sec, 10 sec, 20 sec, 30 sec, 1 min) are shown in the results image.

Mode Timeout: 10 sec

Mean Timeout: 17 sec

website down or slow results

Mode and Mean

The most popular choice (the mode) was 10 seconds.

The average time (the mean), taking all votes into consideration, was 17 seconds.

It's noteworthy that 10 seconds was the most popular choice by a considerable margin and that this result was repeated across all individual polls: in every community 10 seconds was the most popular choice.

The distribution of votes was also very consistent across the individual polls - we were surprised at how reproducible the results were. In every poll the mean time was between 16 and 19 seconds.

You can see results from each individual poll towards the end of the post.

Putting The Results Into Practice

We have now set the timeout threshold on all our monitoring scripts to 17 seconds, the mean choice of all votes.

It's true that our scripts would run slightly faster if we used a lower timeout but we believe that this is the most appropriate timeout setting for our users, based on the opinions of the tech/developer/designer community.

sloth talking about slow website

What Happens After A Monitor Times Out?

When a monitor passes the timeout threshold without a response, the site is marked as down and the event is recorded with a response code of 0.

Stats for the monitor are automatically updated to take the downtime into consideration.

Pro users can view all individual timeouts, and will see the explanation: "No Response: No HTTP code was received. Possible reasons for this are timeout (the server is not responding in time) or being blocked by a firewall."

SMS and email alerts will be sent if the site remains down for the time specified by the user in their alert settings.

Free users receive SMS alerts instantly and email alerts if the site stays down for one minute. Pro users can set their own custom alert times.
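As an illustration of this decision logic - a hypothetical sketch using Python's standard library, not Downtime Monkey's actual code - a check can return the HTTP status code when one is received, and 0 when nothing arrives within the timeout threshold:

```python
import socket
import urllib.error
import urllib.request

TIMEOUT = 17  # seconds without a response before a site is marked down

def check(url):
    """Return the HTTP status code for a URL, or 0 if no code was received."""
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT) as resp:
            return resp.status          # e.g. 200: the site is up
    except urllib.error.HTTPError as e:
        return e.code                   # e.g. 404 or 503: down, but responding
    except (urllib.error.URLError, socket.timeout):
        return 0                        # timeout or firewall: no HTTP code
```

A returned 0 here corresponds to the "No Response" explanation described above.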

Results From Specific Polls

The number of votes in each poll is shown in the results images. The mean timeouts were:

Programming: 16 seconds
PHP Programmers: 16 seconds
Computer Science: 17 seconds
Web Development: 18 seconds
Computer Programmers: 16 seconds
Cloud Computing: 16 seconds
Web Design: 16 seconds
Web Designers: 17 seconds
Web Design & Development: 17 seconds

Defining Response Time

You may already have asked: "How is response time evaluated?" Good Question.

'Response Time' is not 'Page Load Time'

It's important not to confuse response time with page load time when considering the question: "How long without a website response before you consider the site down?"

Response time is always much quicker than page load time - read on to see why...

What happens when you visit a webpage

When you visit a website your computer sends a request to the web server, asking for the webpage data.

The server sends a response, which includes a status line, HTTP headers and webpage content.

The first thing that is received by your computer is the status line - this is just one line and contains only a few bytes of data. It tells your computer whether the request was successful or not.

Next, your computer receives the HTTP headers which contain details about the webpage - these are several lines long and typical headers size is 700-800 bytes (although they can be anything from 200 bytes to over 2KB).

Finally, your computer receives the webpage content - the data size varies considerably depending on the webpage but is usually in the megabytes. In 2017 the average webpage size was 3.034MB - that's over 3 million bytes!

Time To First Byte

Time to first byte (TTFB) is the time taken to receive the first byte of data of the status line (technical details here).

TTFB is sometimes used as a measure of response time. However, we don't think this is accurate because the first byte of the status line is not the same as the first byte of page data appearing in the web browser.

Page Load Time

Page load time is "the time it takes to download and display the entire content of a web page in the browser window".

This is not a useful measure of response time because content-heavy web pages can take a long time (sometimes minutes!) to fully load.

Time To Receive Headers

We consider the time taken to receive the headers to be the most practical measure of response time - headers are received just before the first content is loaded into the web browser. This is the measure our monitoring scripts use to record response time.
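As a rough sketch of how this can be measured in practice (illustrative only, using Python's standard library rather than our actual monitoring code): urlopen() returns as soon as the status line and headers have been received, before the page content is read, so timing it approximates the time to receive headers.

```python
import socket
import time
import urllib.error
import urllib.request

def response_time(url, timeout=17):
    """Return (status code, seconds until the headers were received).

    urlopen() returns once the status line and headers have arrived,
    before the page body is read, so this measures response time
    rather than page load time. No response is recorded as code 0.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, time.monotonic() - start
    except urllib.error.HTTPError as e:
        return e.code, time.monotonic() - start
    except (urllib.error.URLError, socket.timeout):
        return 0, None
```

Reading the body afterwards (resp.read()) would add the content download time, which is what page-load measurements include and this deliberately excludes.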

26 Jun 2018

...new features

We've rolled out a bunch of new features designed to make it quicker and easier to monitor large numbers of websites - you can now add, edit and delete website monitors in bulk.

Thanks to everyone who has submitted feature requests - this helps us focus the development of Downtime Monkey on the areas that you want.

bulk imports

Bulk Add/Import Monitors

There are two ways to add website monitors in bulk - you can import from a spreadsheet (as a CSV file) or add manually as a comma separated list.

Import from a CSV file

You can add hundreds of monitors in seconds by importing from a CSV file. It's the best way to add monitors if you have a spreadsheet of all your websites.

Simply save the spreadsheet as a .csv file and upload.

Add Manually as a Comma Separated List

It's also possible to add multiple monitors manually. Simply follow each website with a comma in the input form... piece of cake!

bulk add website monitors

Don't worry about duplicates...

Downtime Monkey removes any duplicate URLs before creating new monitors - therefore you'll only ever have one monitor per webpage, making things easier to keep track of.

...or mistakes

Downtime Monkey checks that all URLs are valid. Invalid URLs aren't added as monitors but are shown as invalid in the results, so that you can find and correct them easily.

...or other text

Plain text (or any text that isn't a URL) is ignored too - this means that you can confidently import from a spreadsheet that contains other text as well as website URLs. Only the valid URLs will be imported.

...or dead websites

Every URL that you add is visited to check that the webpage is real and operational. If the page is redirected or there is no response then the monitor won't be created - you'll be informed in the import results.
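The cleaning steps above (ignore non-URL text, reject invalid URLs, drop duplicates) can be sketched as follows. This is a hypothetical illustration, not Downtime Monkey's actual importer, and it omits the final step of visiting each URL to check it responds:

```python
import csv
import io
from urllib.parse import urlparse

def parse_bulk_input(text):
    """Split a CSV upload or comma-separated list into candidate URLs,
    keeping only valid, unique http(s) URLs in their original order."""
    candidates = [cell.strip()
                  for row in csv.reader(io.StringIO(text))
                  for cell in row]
    valid, seen = [], set()
    for c in candidates:
        parts = urlparse(c)
        # plain text and malformed entries fail this check and are ignored
        if parts.scheme in ("http", "https") and parts.netloc:
            if c not in seen:          # duplicates are silently dropped
                seen.add(c)
                valid.append(c)
    return valid
```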

bulk import website monitors from CSV file

Bulk Edit Settings

You can now update the settings of all your website monitors at once. Sign up and log in, go to your monitors (Pro) and select 'Edit All'.

bulk edit website monitors

You can turn email and SMS alerts on or off, select the email address and phone number for alerts and customise when alerts should be sent - click the button and all your monitors will be updated to the new settings.

It's still possible to apply individual settings to specific monitors - simply edit the settings of the individual monitor as before.

Bulk Delete Monitors

Much the same as bulk edit you can now delete all your monitors at once.

Note that when you delete a monitor there's no going back and all uptime stats will be deleted along with the monitor. If in doubt, we'd recommend keeping the monitors but turning alerts off.

bulk delete website monitors

Pro Features

These features will be most useful for power users who monitor lots of websites and for this reason we've rolled them out to Pro users. We intend to develop some more features for power users in the coming weeks and months and also something for our Free users.

If you have any feature requests, please let us know via the feedback form on the help and support page.

28 May 2018

...it's GDPR

We'll bet that last week you received an onslaught of "we've updated our privacy policy" emails.

If you're a website manager maybe you've been writing those emails and ensuring your site is compliant with the new regulations.

It's been interesting (for us anyway) to listen to reactions to GDPR. It seems that people are split between "what a nightmare - so much paperwork" and "this is great - it protects our privacy" and there is no doubt that we have felt a bit of both.

tired of paperwork


There is an entire section of the tech industry that has a business model of: provide a free app, get people's personal information and sell it to anyone who'll pay.

You know... that app that wants access to your location, all the details of your contacts, your emails and browsing history so that you can play virtual ping-pong.

But free ping-pong is awesome and what's the harm anyway?

If the Cambridge Analytica scandal is anything to go by, then enough to threaten the fabric of democracy in the developed world!


At Downtime Monkey we've always considered privacy and security important and right from the start we've put a fundamental principle at the heart of our service:

We don't ask for personal information unless we truly need it.

We keep third parties to a minimum but use some to enable us to provide our service and we apply the same principle to them.

Here's an example:

We use a text message API to send downtime alerts because the service is reliable worldwide and it saves us having to reinvent the wheel.

If your website goes down and you've set up SMS alerts Downtime Monkey relays your phone number and the website URL to the API and a text message is sent to your phone. We don't include your name or any other details - just the information needed to get the job done.


Following this principle means that there has been very little for us to change for GDPR. We haven't had to make any changes to the functionality of our application.

However, we have had to check our systems, ensure that all the third parties that we use are GDPR compliant, and put some documentation in place.

We dedicated a few days' work to this and although we'd rather have spent the time developing our services, we're happy that privacy is being taken seriously by (some) regulators.

So will GDPR fix the tech industry's privacy problem?

Although GDPR may help, it's unlikely to fix the situation completely. It's predictable that the organisations whose income is made from collecting their users' personal information and selling it to the highest bidder will find a way to get round the regulation, or just accept the fines that come their way.

One way or another we can't see this section of the tech industry disappearing overnight.

On a positive note though, maybe the regulation will encourage more developers and startups to use a model of business that puts customer privacy first.

To support our privacy-centric model of business Sign Up and then upgrade to a Pro Account. If you're already a Pro user thanks for your support - you make it all possible!

Oh yes... and we've updated our privacy policy.

25 Apr 2018

Downtime occurs. It's an unfortunate fact of online life.

No website is able to provide 100% uptime - even tech giants like Google suffer downtime, albeit very occasionally.

So, some amount of downtime is inevitable, but how much is acceptable?

This question is obviously subjective - downtime that's acceptable for one person may be intolerable for another. Therefore, we undertook a little research...

apollo reliability quote

The Survey

We ran polls across 14 different Google+ communities, asking the question "What's the minimum level of acceptable uptime for a website?"

The options for answering were: 99%, 99.9%, 99.99%, 99.999% and 'other'.

A big thanks to everyone who took the time to respond!

A community was selected for the poll if it was active, responsive, welcoming and if the topic of website uptime was considered relevant to the community.

website offline quote

Overall Results

Here are the combined results from all communities that were polled. The total number of votes and the breakdown across the options (99%, 99.9%, 99.99%, 99.999% and 'other') are shown in the results image:

acceptable uptime results

Average Result

We can see that, although the most popular result was 99.999% there was no 'runaway winner'. Therefore, we calculated an average result that would take all votes into consideration.

Note that simply taking the mean of all the results would have led to an average that was skewed towards the lowest option of 99%. To avoid this we calculated a meaningful average that allocated all votes an equal weight - you can see the method used at the end of the article.

Here is the average result as a percentage and as actual downtime:

Average Acceptable Uptime: 99.95%

Downtime per week: 4min 50sec

Downtime per month: 21min 2sec

Downtime per year: 4hr 12min

acceptable uptime results

Top Comments

"For e-commerce 5 nines for sure, but for a personal blog 99% would be acceptable."

"How many nines can you afford?"

"How much does it cost if the site is down?"

"99.999% (or even more) is pretty doable as long as you have the right architecture. I highly recommend reading the book 'Site Reliability Engineering: How Google Runs Production Systems'"

"Fun fact, the Apollo space program had 99.9% reliability as a goal while airlines today achieve 99.99999% reliability"

"It depends on when the downtime happens"

"It's not about what is acceptable, it's about 'what-it-is'"

"If a site goes offline on the web and no one is around to see it, does it make a 503?"

Results By Community

Here are the results broken down by community... or to go straight to the conclusions click here.

From the communities with more than 20 responses, 'Cloud Computing' had the highest result (99.977% average acceptable uptime) and 'Web Design' had the lowest result (99.898% average acceptable uptime).

The communities polled included: Programming, Computer Programmers, PHP Programmers, Web Development, Computer Science, Cloud Computing, Web Design, Web Performance, Web Designers, Entrepreneurs, Self-Employed & Small Business, Internet Marketing, an Entrepreneurs/Self-Employed community and a Blog community. The number of votes and the average acceptable uptime for each community are shown in the per-community results images.


Conclusions

Uptime of 99.95% was the average result from the survey and this seems like a reasonable value, allowing just over 4 hours of downtime per year.

However, not all websites are the same. Busy sites for businesses will require higher availability while 99% uptime is acceptable for casual sites with few visitors.

Not All Downtime Is The Same

Two hours of downtime at 4am on a Sunday may affect fewer users than 5 minutes of downtime on a Tuesday afternoon. So, if downtime is inevitable, say for essential maintenance, it makes sense to schedule it during off-peak hours.

Accurate Monitoring

Setting an uptime goal is one thing but making sure you achieve it is another. You should monitor your site and check the uptime stats regularly. Our free accounts have 3-minute checks and uptime stats to 1 decimal place and our pro accounts have checks every minute and uptime stats to 3 decimal places.

When To Be Alerted

Consider your acceptable uptime when setting custom alert times. For a website that requires 99.99% uptime or more you will probably want to be alerted the instant the site goes down, but for a website that requires 99% uptime you could schedule alerts to be sent when the site has been down for 10 minutes.

Appendix - The Meaningful Average

To avoid an average that was skewed towards the lowest option of 99% each option was allocated a weighted value on a linear scale. 99% was given a value of x, 99.9% a value of 2x, 99.99% a value of 3x and 99.999% a value of 4x.

A curve was then plotted of the weighted value against the percentage uptime.

The mean weighted value was calculated by multiplying the number of votes for each option by their weighted value, adding the products together and dividing the total by the total number of votes.

The mean weighted value was then applied to the curve and the corresponding percentage uptime was found.
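The method above can be sketched in a few lines of Python. The vote counts in the example are hypothetical placeholders (the real counts are in the poll images), and the curve is assumed to be piecewise linear between adjacent options:

```python
# Weighted values on a linear scale, as described above
SCALE = {99.0: 1, 99.9: 2, 99.99: 3, 99.999: 4}

def meaningful_average(votes):
    """votes: dict mapping uptime option (%) -> number of votes."""
    total = sum(votes.values())
    mean_w = sum(SCALE[u] * n for u, n in votes.items()) / total
    # Map the mean weighted value back to an uptime percentage,
    # interpolating linearly between adjacent options
    points = sorted(SCALE.items(), key=lambda kv: kv[1])
    for (u1, w1), (u2, w2) in zip(points, points[1:]):
        if w1 <= mean_w <= w2:
            return u1 + (u2 - u1) * (mean_w - w1) / (w2 - w1)
    return points[-1][0]

# Hypothetical example: an even split across the four options gives a
# mean weighted value of 2.5, i.e. midway between 99.9% and 99.99%
```

Note how the weighting stops a handful of 99% votes dragging the plain mean far below every other option.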

21 Mar 2018

Not all websites are the same. From personal blogs to business websites, online shops to community forums, SaaS applications to video streaming services, websites come in all shapes, sizes and flavours.

It follows that not all websites have the same uptime requirements. If a personal blog goes down for 20 minutes it might not be a big problem, but the same downtime for a popular online shop could be a major concern.

With this in mind we developed a feature which enables the timing of alerts to be customised specifically for each website monitor.

lots of penguins

Customise Your Alert Times

For each monitor, you can set a custom alert time. This is the time that the website must remain down, before an alert is sent.

What are the options?

The alert time can be set to instant, 1, 2, 3, 5, 10, 15, 30 or 60 minutes. If a monitor's alert time is set to instant, an alert would be sent as soon as the website goes down. If it's set to 5 minutes, an alert would be sent if the site stays down for longer than 5 minutes.

Set different times for different websites

If you monitor multiple websites you can set different thresholds for each monitor. You could set your personal blog's alert time to 15 minutes and your business website's to 3 minutes.

Set different times for email and SMS alerts

It's possible to set different alert times for email and SMS alerts on the same monitor. For example, the email alert time could be set to 3 minutes while the SMS alert time is set at 10 minutes. If the website stays down for 5 minutes then you'd receive an email alert but no SMS alert. However, if the site stays down for 12 minutes, you'd receive both email and SMS alerts.

Do alert times affect stats & logs?

No. Alert times only affect when alerts are sent. Every downtime, no matter how short, is recorded and can be viewed in the monitor's logs and stats.

For example, if you set an alert time to 5 minutes and the website goes down for 4 minutes, no alert would be sent but the details of the downtime would be logged and the 24 hour stats would show as 99.722% uptime.
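The arithmetic behind that figure is simple - a minimal sketch:

```python
def uptime_percent(downtime_minutes, window_minutes=24 * 60):
    """Percentage uptime over a window, to 3 decimal places."""
    return round(100 * (1 - downtime_minutes / window_minutes), 3)

# 4 minutes of downtime in a 24-hour window:
print(uptime_percent(4))  # 99.722
```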

Great For Slow Websites

Custom alert times are really useful for monitoring websites on servers that are slow. There are many reasons that a website can be slow including hosting on overloaded servers or using bulky content management systems.

In an ideal world we'd all have super-fast websites but in reality, budget and time constraints mean that a lot of websites run slowly - 30% of the web now runs on WordPress!


When a website fails to return a response to Downtime Monkey within 30 seconds it is marked as down due to timeout. 30 seconds is a long time for a site to respond (we're not talking total page load time here - just the time for the page to respond) and for a well optimised site this should almost never occur (see how to speed your website up).

However, for slow websites this can occur fairly regularly with the result that they may receive a lot of alerts.

The default alert time for email alerts is 1 minute so alerts will be sent if a website times out for two consecutive checks, one minute apart. For SMS alerts, the default setting is instant so every time a site times out an alert will be sent.

With a custom alert time in place, you can decide how long a site should be allowed to be down for before you're alerted. Setting a slightly longer period of, say, 5 minutes can really cut down on the number of alerts received from slow sites but means you'll still be informed if a prolonged downtime takes place.

Ideal For Bulk Monitoring

If you monitor several websites then setting a custom alert time is a good way to cut down on the number of alerts that you receive, while still getting notified if a site goes down for a longer period.

For users who manage a lot of sites we also recommend that you rate limit your SMS alerts.

How To Set A Custom Alert Time

This is an advanced feature and is only available to Pro users.

Step 1 (when adding a new monitor)

Log in and navigate to Add Monitor.

Step 1 (when editing an existing monitor)

Log in, navigate to Monitors, and select the gear icon to edit the monitor's settings.

Step 2 - Set Email Alert Time

Select a time from the dropdown menu under 'Downtime before sending email'. Here the alert time is set to 5 minutes.

custom email alert time

Step 3 - Set SMS Alert Time

Select a time from the dropdown menu under 'Downtime before sending SMS'. Here the alert time is set to 10 minutes.

custom SMS alert time

Step 4

Click 'Add Monitor' or 'Update Monitor'

update monitor

If you'd like to propose an additional feature for Downtime Monkey, login and submit a feature request via the feedback form.