It's difficult to provide concrete rules for when caching should be used, because the decision is highly application-dependent. In general, any form of caching will be beneficial as long as your server has sufficient memory and you don't try to cache everything. Perhaps the best rule of thumb is to cache data that is used frequently and that is expensive (in performance terms) to fetch or build, which most often means files and database results.

Listing 6.23. Using Cache Removal Callbacks
Since there are several types of caching, Table 6.8 gives a brief overview of each technique, where the cached data is visible, and the size of data for which each technique is useful.
You can see that because the cached data is stored in different places, its visibility differs, and this may dictate which technique you use. When you investigate caching, it is important to test performance both before and after caching is applied. You may believe a page is faster because it, or the data it relies on, is being cached, but what happens with 500 simultaneous users, or 500 requests a second? How will the server cope with a heavy load? How can you fine-tune the application, and find the optimum amount of data to cache or the optimum time to cache it for, if you don't stress-test it?
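The before-and-after measurement the text calls for can be sketched very simply. The following Python snippet (an illustration, not the chapter's code) times repeated requests against a hypothetical slow data source, first uncached and then through a cache; `slow_source` merely simulates an expensive database or file read with a short sleep.

```python
import time
import timeit

def slow_source(key):
    """Hypothetical expensive call: simulates a slow database/file read."""
    time.sleep(0.01)
    return len(key)

cache = {}
def cached_source(key):
    """Same data, but only the first request pays the full cost."""
    if key not in cache:
        cache[key] = slow_source(key)
    return cache[key]

# Time 20 "requests" each way, mimicking repeated hits on the same page data.
uncached_time = timeit.timeit(lambda: slow_source("report"), number=20)
cached_time = timeit.timeit(lambda: cached_source("report"), number=20)
```

A real test would use a load-testing tool with hundreds of concurrent requests rather than a single loop, but the principle is the same: measure both paths under the load you actually expect before deciding how much to cache and for how long.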