Caching is a computer's way of recalling things quickly. Frequently accessed data (or even data that isn't accessed so frequently) is stored in a place where the computer can retrieve it more quickly and easily than it can from the data's original location. A cache acts as a pocket of information that allows easy access. Figure 14.1 illustrates the benefits a cache provides.
Figure 14.1. A cache is an easy-access storage location.
If you've ever used a Web browser, you should be familiar with caching already. When you visit a page, the browser caches that page (and any images) in a special location on your hard drive. This does two things for you. First, the next time you visit that page, instead of having to download the page (or images) all over again, the browser simply loads it from your hard drive. Second, it allows you to view that page (or images) again without having to revisit the page at all. This showcases the two most important aspects of caching: fast access, and access when it normally wouldn't be possible.
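The check-the-cache-first logic a browser uses can be boiled down to a few lines. Although this book's examples use C# and VB.NET, the idea is language-neutral, so here's a minimal sketch in Python; `fetch_page`, `get_page`, and `page_cache` are made-up names standing in for a real download and storage mechanism, not any actual browser API.

```python
# A minimal read-through cache, analogous to a browser's page cache.
page_cache = {}

def fetch_page(url):
    # Stand-in for an expensive network download.
    return "<html>content of %s</html>" % url

def get_page(url):
    if url in page_cache:          # cache hit: serve the fast local copy
        return page_cache[url]
    page = fetch_page(url)         # cache miss: do the slow fetch...
    page_cache[url] = page         # ...and remember it for next time
    return page
```

The first call to `get_page` for a given URL pays the full cost of the fetch; every later call returns instantly from the dictionary, which is the "fast access" benefit described above.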
This brings up an interesting point: What happens if the data you've cached gets too old? What if the Web server replaces the page or image with a more up-to-date version? Because the cache is the first place your computer looks, you'll end up with the old version. When this happens, the cache needs to be invalidated; its period of usefulness has expired.
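One common way to handle stale data is time-based invalidation: each cached entry remembers when it was stored, and anything older than a chosen maximum age is thrown away and re-fetched. Here's a minimal Python sketch of that idea; `MAX_AGE`, `fetch_page`, and `get_page` are illustrative names, and the `now` parameter exists only so the expiry behavior is easy to demonstrate.

```python
import time

MAX_AGE = 60.0  # seconds an entry stays valid before it's considered stale

cache = {}  # maps url -> (timestamp_stored, page)

def fetch_page(url):
    # Stand-in for a real download of the current version.
    return "fresh copy of %s" % url

def get_page(url, now=None):
    now = time.time() if now is None else now
    entry = cache.get(url)
    if entry is not None:
        stored_at, page = entry
        if now - stored_at < MAX_AGE:
            return page            # still fresh: serve from the cache
        del cache[url]             # too old: invalidate the entry
    page = fetch_page(url)         # fetch the up-to-date version...
    cache[url] = (now, page)       # ...and cache it with a new timestamp
    return page
```

Real-world caches refine this in various ways (the HTTP `Cache-Control` and `Expires` headers let the server dictate the maximum age, for example), but the core trade-off is the same: a longer lifetime means faster access but a greater risk of serving stale data.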
In general, caching is a good thing. (Don't let anyone tell you otherwise!) Anytime you can cache something, you receive a performance benefit; after all, that's the whole reason for caching. As mentioned in Day 4, "Using ASP.NET Objects with C# and VB.NET," the Session object keeps its data in the server's memory, and computers nowadays have plenty of RAM to spare for holding data. Most of the time only a small fraction of a server's resources is actually in use; it only makes sense to put the rest to work through caching. Throughout today's lesson I'll give you recommendations on the when and where of caching, and just how much of a benefit it can be.