Greatly Reduced Server Loads

06 Feb 2020

...achieved with a few simple code changes!

As explained in the last post, we've spent a few weeks working on small fixes and improvements, focusing on jobs that aren't urgent but shouldn't be overlooked.

Here we'll look at how a few simple code changes greatly increased efficiency when serving pages on the Downtime Monkey website. These improvements focus on reducing the server's CPU load and memory use.

The site is already highly tuned for page speed and, although these changes don't speed it up noticeably, they allow it to receive more traffic without slowing due to server overload.

[Image: 'Too Much Johnson' (1938), directed by Orson Welles]

TL;DR: we avoid looping through large datasets when serving a webpage. Instead, pre-processed data is stored in the database and only the necessary values are pulled. When real-time data is needed on-page, only the relevant rows are pulled from the database, so there's a much smaller dataset to work with.

Is Efficiency Important?

To quote Donald Knuth from his paper 'Structured Programming with Go To Statements': "premature optimisation is the root of all evil". Working hard on performance and scaling at the very beginning of a start-up can be a big waste of time - it's better to focus on much-needed features.

On the other hand, a poorly tuned site can run slowly or even crash when there's an increase in traffic, and optimising your code is a lot cheaper than throwing RAM and CPU cores at the problem.

[Image: web server]

The Optimal Time For Optimisation

In theory, the ideal time to optimise is just before you run into load problems - right before your traffic scales.

But as the saying goes: "in theory there is no difference between theory and practice, while in practice there is". It can be best to err on the side of caution and make optimisations well in advance of problems.

Traffic to the Downtime Monkey website has increased steadily each year, from just 350 visitors a month when we first launched to 3,500 in 2018 and over 9,000 at times through 2019. As 2020 gets underway we want to be prepared for continued growth. Time for some optimisations...

Minimise Processing On Busy Pages

Remove On-Page Loops

Looping through large amounts of data is CPU intensive and avoiding it is one of the low-hanging fruits of performance optimisation. We took note of pages that looped through data before page load and, where possible, put an alternative in place.

A good example is the home page where the number of downtimes logged in the last 90 days is displayed. Originally, this was calculated in real time using a foreach loop to count all the logged downtimes. For those interested, here's the PHP:

//pull all the records from the database
$record = new Table($statusChanges);
$all_records = $record->get('status_changes');
$records = $record->tableData();

$downtime_count = 0;

$ninety_days_ago = strtotime('-90 days');

//loop through each record and count the relevant ones
foreach ($records as $record1) {
    if (strtotime($record1->change_time) > $ninety_days_ago) {
        if ($record1->change_to == 'down') {
            $downtime_count++;
        }
    }
}

//format for display on page
$downtime_count = number_format($downtime_count);

Not pretty, but it was simple, easy to read, and it worked seamlessly for over two years.

However, when we first wrote this code there were only a few thousand downtimes in a 90 day period - now there can be over 150,000. It really doesn't make sense for the server to be crunching that kind of data every time someone visits the home page.

Instead, we now count the number of logged downtimes behind the scenes and store the figure in the database. When someone visits the home page we simply pull the figure from the database - no foreach loop required.

This was a little more work on the backend because we needed to schedule a script to run once a day and set up a table in the database to store the results (there's a sketch of this kind of script after the list below). However, there were two major efficiency gains:

1) less memory is now used as only a few rows are pulled from the database rather than over 100,000

2) fewer CPU cycles are needed, giving big savings in processing power
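
Here's a minimal sketch of what such a daily script could look like, using PDO and plain SQL. To be clear, this isn't our production code: the connection details are placeholders and the layout of the stats table (a single row of pre-calculated figures) is an assumption for illustration - only the table and column names are taken from the snippets in this post.

<?php

//hypothetical connection details
$pdo = new PDO('mysql:host=localhost;dbname=monitoring', 'db_user', 'db_pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

//let the database count downtimes from the last 90 days once a day,
//instead of the webserver looping over 150,000+ rows on every page view
$stmt = $pdo->query(
    "SELECT COUNT(*) FROM status_changes
     WHERE change_to = 'down'
     AND change_time > DATE_SUB(NOW(), INTERVAL 90 DAY)"
);
$downtime_count = (int) $stmt->fetchColumn();

//store the pre-calculated figure in the (assumed single-row) stats
//table for the home page to read
$update = $pdo->prepare("UPDATE dtm_stats SET downtime_count = :count");
$update->execute([':count' => $downtime_count]);

Running it once a day is then just a standard cron entry along the lines of 0 2 * * * php /path/to/update_stats.php (the time and path are illustrative).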

There is also the added advantage of slimmed-down code for the webpage:

//pull the pre-calculated stats from the database
$stat = new Table($statsInstance);
$all_stats = $stat->get('dtm_stats');
$stats = $stat->tableData();

//select the downtime count (it's first) and format it for display on page
$downtime_count = number_format($stats[0]->downtime_count);

When Real Time Data Is Needed

The above solution worked because the data displayed doesn't need to be accurate in real time. However, in some circumstances we need to show data in real time. For example, when a website's percentage uptime is displayed it's important that it is accurate at the time it's viewed.

Originally, to display a website's percentage uptime, we pulled all the downtime logs from the database and looped through these to select the downtimes for the specific website:

//pull all the records from the database
$record = new Table($statusChanges);
$all_records = $record->get('status_changes');
$records = $record->tableData();

//loop through each record and select the records specific to the monitor
foreach ($records as $record1) {
    if ($record1->monitor_id == $monitor_id) {
        //calculate percentage uptime
    }
}

Now only the data for the specific website is pulled from the database. This consumes much less server memory and also saves CPU cycles because there's a smaller dataset to loop through. It wasn't any more work to implement than the original and, in retrospect, we should have done it this way from the start:

//pull only the records we need from the database
$record = new Tablerows($statusChanges);
$specific_records = $record->getByMonID('status_changes', $monitor_id);
$records = $record->tableData();

//loop through every record
foreach ($records as $record1) {
    //calculate percentage uptime
}
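
The getByMonID() method itself isn't shown in this post - conceptually, it just moves the filtering from PHP into SQL. Here's a minimal hypothetical sketch, assuming the class wraps a PDO connection (the real internals may well differ):

//hypothetical internals for a class like Tablerows
class Tablerows {

    private $pdo;
    private $data = [];

    public function __construct(PDO $pdo) {
        $this->pdo = $pdo;
    }

    //fetch only the rows belonging to one monitor - the WHERE clause
    //filters in the database, so other monitors' rows never leave it
    public function getByMonID($table, $monitor_id) {
        //$table comes from our own code, never from user input;
        //the monitor id is bound via a prepared statement
        $stmt = $this->pdo->prepare(
            "SELECT * FROM {$table} WHERE monitor_id = :id"
        );
        $stmt->execute([':id' => $monitor_id]);
        $this->data = $stmt->fetchAll(PDO::FETCH_OBJ);
        return $this->data;
    }

    public function tableData() {
        return $this->data;
    }
}

The nice thing about filtering in SQL is that, with an index on monitor_id, the database can jump straight to the relevant rows rather than scanning the whole table - so the saving grows as the table does.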

Performance tuning has the potential to be never-ending, but these simple fixes gave huge gains in efficiency when rolled out across the whole site. For now, they're all we need. In the future it would be great to have so many users that we need to re-optimise - sign up and start monitoring your websites to help us get there!
