Time of the essence on the web
There’s a well-known maxim in online retailing: if your site takes more than four seconds to load in a visitor’s browser, you’ll lose their custom.
I would extend that rule to every kind of web site. Unless you are an organisation like Revenue and Customs (www.hmrc.gov.uk), where people have no choice but to complete the transaction, you could be driving visitors away without even knowing it.
The four-second rule was established three years ago by a survey from Akamai (www.akamai.com). It found that three out of four people would not return to a site that took longer than four seconds to load.
To make matters worse, one in three respondents said their experiences on a site coloured their entire view of the company or organisation behind it, and that they would spread the word to family and friends.
There are two quick ways to find out how your site performs. The first is to visit it using a dial-up connection. If it’s sluggish in any way, it will be glaringly obvious.
Of course, many sites these days contain video, audio or images that are designed for broadband. That’s where the second test comes in.
Go to http://websiteoptimization.com/services/analyze and type in your organisation’s web address.
The site will return a full report on its download performance, highlighting in red any section with a poor showing.
So let’s try the test on one of the most famous sites in the world. The White House (www.whitehouse.gov) gets a red rating on several counts.
The total size, for example, is more than 1MB, which on a 56kbps dial-up modem would take a staggering four and a half minutes to load.
Even on some of the slower broadband connections it takes nearly 30 seconds, because the page is laden with scripts, images and video.
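The arithmetic behind figures like these is easy to check yourself. Here is a minimal Python sketch; the 0.7 efficiency factor is an assumption standing in for protocol overhead and real-world modem throughput, so exact results will vary:

```python
def download_time_seconds(page_bytes: int, link_kbps: float,
                          efficiency: float = 0.7) -> float:
    """Rough download-time estimate: page size in bits divided by an
    effective link rate. The efficiency factor is an assumed figure
    covering protocol overhead and real-world line speed."""
    bits = page_bytes * 8
    effective_bps = link_kbps * 1000 * efficiency
    return bits / effective_bps

# A page of "more than 1MB" on a nominal 56kbps modem:
minutes = download_time_seconds(1_100_000, 56) / 60
print(f"{minutes:.1f} minutes on dial-up")

# The same page on a slow 512kbps broadband line:
print(f"{download_time_seconds(1_100_000, 512):.0f} seconds on broadband")
```

Plug in your own page weight and connection speed to see where your site falls.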
While broadband is undoubtedly wonderful, it has tempted developers to add more and more content to their sites — and that’s not always a good thing.
There’s another useful online tool called Pingdom (http://tools.pingdom.com), which will check your site for broken links, another major source of visitor frustration.
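The principle behind such a checker is straightforward: harvest every link from a page, then try to fetch each one. A minimal sketch using only Python’s standard library follows; the sample page and the `link_is_broken` helper are illustrative assumptions, not Pingdom’s actual method:

```python
from html.parser import HTMLParser
import urllib.error
import urllib.request

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_links(html: str) -> list:
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def link_is_broken(url: str, timeout: float = 5.0) -> bool:
    """True if the URL cannot be fetched (4xx/5xx or network error)."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        urllib.request.urlopen(req, timeout=timeout)
        return False
    except (urllib.error.URLError, ValueError):
        return True

# A hypothetical fragment of a page:
page = '<p><a href="/about">About</a> <a href="http://example.com/">Home</a></p>'
print(find_links(page))  # every href found on the page
```

In practice you would fetch your own pages, resolve relative links against the site root, and run `link_is_broken` over the result.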
A campaign called Save the Pixel (http://savethepixel.org) is dedicated to simple web design. It sets out to prove that less can be more.
One designer who supports its aims sets out his own list of simple but excellent web sites here: www.webdesignfromscratch.com/10-best-designed-web-sites.php.
But there are many more things that can trip you up in terms of speed and functionality.
How capable is your site when it comes to handling lots of traffic, for example? Some online retailers only find out the answer to this question the hard way, when their sites crash during the Christmas rush.
There are few things worse than losing potential custom at one of the most profitable times of the year. Again, there are lots of online and downloadable tools that enable you to test your site under load.
Some, listed at www.softwareqatest.com/qatweb1.html, add hundreds of virtual visitors and measure the resulting stress on your system. The problem for some organisations is that hundreds of users are not enough to put the software through its paces properly; in some cases you need thousands, or even tens of thousands.
Often, then, these automated tools are not enough on their own, and you need a solution tailored to your own particular needs.
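The pattern these tools follow can be sketched in a few lines: fire concurrent requests at the site and record the latencies. The sketch below runs against a throwaway local server so it is self-contained; the visitor counts are purely illustrative and far below what a genuine load test would use:

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# A throwaway local server stands in for the site under test,
# so this sketch runs anywhere without hammering a real site.
server = http.server.ThreadingHTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

def timed_request(_):
    """One virtual visitor: fetch the page and report the latency."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

# 50 virtual visitors, 20 at a time; a real test would use thousands.
with ThreadPoolExecutor(max_workers=20) as pool:
    latencies = list(pool.map(timed_request, range(50)))

server.shutdown()
print(f"worst response: {max(latencies):.3f}s, "
      f"mean: {sum(latencies) / len(latencies):.3f}s")
```

Watching how the worst-case latency grows as you raise the visitor count gives a first feel for where your site starts to buckle.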
The other aspects of web content that are often overlooked are accessibility for disabled (usually visually impaired) users and the quality of written content.
I could write a whole column on the former, but for a quick introduction you can visit the W3C’s guidelines (www.w3.org/TR/WAI-WEBCONTENT). As for text, you should ensure that it is provided by a skilled copywriter. Poorly written content, with grammatical errors and mangled sentences, gives a bad impression of your company or organisation. You can find more on the subject here: www.webcredible.co.uk.