The parameters of the algorithm can be adjusted to suit the complexity of the problem being solved. This section defines some of these parameters along with the effects they can produce.
The initial temperature must be high enough to permit movement of solutions to other parts of the solution landscape. Graham Kendall shows that knowing the maximum distance between one neighbor (solution) and another permits the calculation of the initial temperature [Kendall 2002].
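Kendall's idea can be sketched by inverting the Metropolis acceptance test for the worst possible uphill move. The 80% acceptance probability and the maximum neighbor distance of 50 cost units below are illustrative assumptions, not values from the text:

```python
import math

def initial_temperature(max_delta, accept_prob=0.8):
    # Invert the Metropolis acceptance rule exp(-delta / T) = accept_prob
    # for the worst uphill move (delta = max_delta), solving for T.
    # accept_prob = 0.8 is an assumed target, not a value from the text.
    return -max_delta / math.log(accept_prob)

# With a maximum neighbor distance of 50 (assumed), accepting even the
# worst move 80% of the time requires a starting temperature near 224.
t0 = initial_temperature(50.0)
```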
The initial temperature can also be determined dynamically. By tracking the rate at which worse solutions are accepted and new better solutions are discovered, the temperature can be raised until a sufficient number of acceptances and discoveries have occurred. This is analogous to heating a solid until it becomes liquid; beyond that point there is no benefit in raising the temperature further [Dowsland 1995].
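This warm-up procedure can be sketched as a loop that heats the system until most worse moves would be accepted. The 90% target rate, the 1.5 heating factor, and the sampled cost increases are all illustrative assumptions:

```python
import math
import random

def find_initial_temperature(deltas, target_rate=0.9, t=1.0, factor=1.5):
    # Raise the temperature until the fraction of worse moves that
    # would be accepted reaches target_rate.  `deltas` is a sample of
    # cost increases observed from random moves.  The target rate,
    # starting t, and heating factor are assumed values.
    def acceptance_rate(temp):
        return sum(math.exp(-d / temp) for d in deltas) / len(deltas)

    while acceptance_rate(t) < target_rate:
        t *= factor  # "heat" the system until it is effectively liquid
    return t

# Sample some hypothetical uphill cost differences and warm up.
random.seed(1)
sample = [random.uniform(0.0, 10.0) for _ in range(100)]
t0 = find_initial_temperature(sample)
```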
While zero is the natural symbolic final temperature, the geometric cooling function used in this example approaches zero only asymptotically, so the algorithm would run far longer than is practically useful. Therefore, the final temperature in Listing 2.1 is set to 0.5 degrees. Depending upon the temperature function used, this value may vary.
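The cost of chasing zero can be made concrete by counting cooling steps. With an assumed starting temperature of 100 and cooling rate of 0.98 (neither taken from Listing 2.1), stopping at 0.5 degrees takes a few hundred steps, while merely approaching zero multiplies the run length, and exactly zero is never reached:

```python
import math

def steps_to_reach(t0, t_final, alpha=0.98):
    # Number of geometric cooling steps (T <- alpha * T) needed to
    # fall from t0 to t_final; alpha = 0.98 is an assumed cooling rate.
    return math.ceil(math.log(t_final / t0) / math.log(alpha))

few = steps_to_reach(100.0, 0.5)    # stop at 0.5 degrees
many = steps_to_reach(100.0, 1e-9)  # chase a temperature near zero
```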
The temperature function itself can be varied to suit the problem being solved. Figure 2.5 shows the temperature over time using a geometric function. A large variety of other cooling functions can be used; some produce a steep reduction in the first half of the temperature schedule, others a slow reduction followed by a steep drop. A very nice illustration of sample cooling schedules can be found at Brian Luke's Web site, "Simulated Annealing Cooling Schedules" [Luke 2001].
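A few schedule shapes can be sketched as functions of the step index k. The geometric form corresponds to Figure 2.5; the linear and logarithmic forms are common alternatives mentioned here as assumptions, not schedules from the text:

```python
import math

def geometric(t0, alpha, k):
    # The shape from Figure 2.5: steep drop early, then a long slow
    # tail that approaches (but never reaches) zero.
    return t0 * alpha ** k

def linear(t0, k, total_steps):
    # Constant decrement per step; reaches zero exactly at total_steps.
    return t0 * (1.0 - k / total_steps)

def logarithmic(c, k):
    # A Boltzmann-style schedule: extremely slow cooling.
    return c / math.log(k + 2.0)
```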
At higher temperatures, the simulated annealing algorithm searches for the global optimum over the entire solution landscape. As cooling proceeds, movement decreases and the algorithm refines the local optimum it has found. The number of iterations to perform at each temperature step is therefore important. In our sample problem (see Listing 2.1) we specified that 100 iterations are performed per step. It's useful to experiment with this number to find what works best for the problem at hand.
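Putting these parameters together, the overall loop can be sketched as follows. This is a minimal illustration, not the code from Listing 2.1; the toy cost function, neighbor move, and parameter values are assumptions:

```python
import math
import random

def anneal(cost, neighbor, start, t0=100.0, t_final=0.5,
           alpha=0.98, iters_per_temp=100):
    # Run iters_per_temp Metropolis trials at each temperature step,
    # then cool geometrically until the final temperature is reached.
    current, t = start, t0
    best = current
    while t > t_final:
        for _ in range(iters_per_temp):
            candidate = neighbor(current)
            delta = cost(candidate) - cost(current)
            # Always accept improvements; accept worse moves with
            # probability exp(-delta / t), which shrinks as t falls.
            if delta < 0.0 or random.random() < math.exp(-delta / t):
                current = candidate
                if cost(current) < cost(best):
                    best = current
        t *= alpha
    return best

# Toy usage: minimize f(x) = x^2 starting far from the optimum.
random.seed(0)
result = anneal(lambda x: x * x,
                lambda x: x + random.uniform(-1.0, 1.0),
                start=10.0)
```

At high t nearly every move is accepted and the loop wanders the whole landscape; by the time t nears 0.5 only small improvements survive, so the inner 100 iterations act as local refinement.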