building for reliability

It's a simple fact: People expect your site to work. And they expect it to work the same way it did the last time they used it. But a lot of sites struggle with reliability, especially as they grow.

There are many reasons a site may not work (or may not work as expected): It may have grown too quickly, its owners may have underinvested in equipment, it may not have been tested properly, or it may simply have been built poorly.

An unreliable site may suffer from:

  • Traffic overload

  • Bandwidth blackout

  • Compatibility failure

  • Buggy code

traffic overload When sites or applications within them "go down," the culprit is often traffic. The site may have outgrown its infrastructure, or it may be experiencing a sudden surge of user requests that it's unequipped to handle. (Unexpected spikes can bring down even well-established sites.)

There are two factors limiting a site's capacity: its bandwidth and its servers. Bandwidth determines how much data can be transferred from your site at once; if you're serving large files, like video, you're likely to hit a bottleneck. The servers determine how many users can connect simultaneously and how many requests can be processed at once. An overloaded server will have to turn away visitors and may end up crashing (and serving no users at all).
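To make the bandwidth bottleneck concrete, here's a back-of-the-envelope sketch. The function name and all the figures are illustrative assumptions, not numbers from this book:

```python
# Back-of-the-envelope capacity estimate (illustrative numbers only).

def concurrent_downloads(bandwidth_mbps, stream_kbps):
    """How many simultaneous streams fit in the available bandwidth?"""
    return int(bandwidth_mbps * 1000 // stream_kbps)

# A hypothetical 100 Mbps connection serving 500 kbps video streams:
print(concurrent_downloads(100, 500))  # 200 simultaneous streams
```

The same arithmetic explains why large files cause bottlenecks: double the stream size and you halve the number of visitors you can serve at once.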

bandwidth blackout Sometimes your hosting service may experience an outage, during which user requests can't get through to your servers. Even established providers experience occasional blackouts, but that doesn't make them acceptable. Follow up with your provider, and make sure the problem is addressed.

compatibility failure Sometimes a site that appears unreliable may actually be incompatible with a particular browser or platform. An application that performs perfectly on a PC may perform erratically on a Mac, or not at all. Mac-only users would think the site is broken, while users who switch between platforms would see it as unreliable.



buggy code Sometimes unreliability is caused by bad code. A program may have a bug that causes it to function erratically, or a web page may have errors that cause it to fail under certain conditions.

improving reliability

As you can see, good planning is really the key to a reliable web site. And while you can't exactly go back and start over, there's a lot you can do to make your site more reliable, even once you're up and running.

  • Test your site thoroughly. Errors in code and in compatibility can be uncovered in the QA process (if you have a QA process, that is). It's essential for any site that takes itself seriously to make every effort to identify and repair site errors (including those that emerge only in highly specific situations) before its users do.

  • Increase server capacity. If your site is growing and you're seeing large increases in traffic (or if you've begun serving large files, like audio or video), you need to think about adding server capacity, either by increasing the number of servers or their individual processing power.

  • Increase bandwidth. As you grow, you'll need to add more bandwidth to support your users. This is the double-edged sword of web success: More users mean higher costs.
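The arithmetic behind that double-edged sword can be sketched as follows. The visitor counts and page size are made-up numbers for illustration, not figures from this book:

```python
# Illustrative sketch: monthly data transfer grows linearly with traffic,
# and so does your bandwidth bill. All figures here are assumptions.

def monthly_transfer_gb(visitors_per_day, pages_per_visit, page_size_kb):
    """Estimate monthly transfer in GB for a 30-day month."""
    kb_per_month = visitors_per_day * pages_per_visit * page_size_kb * 30
    return kb_per_month / 1_000_000  # KB -> GB (decimal units)

# A hypothetical site: 10,000 visitors/day, 5 pages each, 200 KB per page:
print(monthly_transfer_gb(10_000, 5, 200))  # 300.0 GB/month
```

Double the visitors and the transfer (and the cost) doubles with them, which is exactly why success forces you to keep buying bandwidth.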



The Unusually Useful Web Book
ISBN: 0735712069
Year: 2006
Pages: 195
Author: June Cohen