The Annealing Algorithm - R.H.J.M. Otten, L.P.P.P. van Ginneken
Kluwer Academic Pub (1989)

With a book as technical as this, it's hard to know who my potential review readers will be. Nevertheless, I will give a very CONCRETE and ACCESSIBLE review. Bear two things in mind: the book does have some concrete and accessible aspects before getting into complex issues; moreover -- and maybe this is the most important point of all -- the simpler, concrete, accessible aspects of this book DO HAVE UTILITY. I'll give a personal example of the utility of the book's simpler material at the end of my review. But now let's proceed in a nice logical order...

The book deals with "combinatorial optimization" problems. These are problems with (1) a gigantic number of possible discrete configurations, (2) a way of scoring how desirable a configuration is, and (3) ways of changing the present configuration into a new one. Examples include the scheduling problem of assigning 20 workers to one job apiece for 20 jobs (with different worker/job pairings having different costs); and, of course, the famous traveling salesman problem -- requiring precisely one visit to each of N cities and a return to the first.

The most easily understood algorithm for solving combinatorial optimization problems is BLIND RANDOM SEARCH (BRS): generate a random configuration, score it, and repeat, always keeping the best score yet encountered and its corresponding configuration saved in memory. You can choose stop criteria as you wish -- including an OR'd pair (which I find to be a great improvement in itself) -- such as: UNTIL (a) the score is X or better, OR (b) you have generated N random configurations.
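To make BRS concrete, here is a minimal sketch in Python (the book's own code is in Pascal). The toy traveling-salesman instance and both stop thresholds are invented for illustration:

```python
import random

# Toy TSP instance: six invented city coordinates (not from the book).
CITIES = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 6), (6, 5)]

def tour_length(tour):
    """Score a configuration: total length of the closed tour."""
    total = 0.0
    for i in range(len(tour)):
        (x1, y1) = CITIES[tour[i]]
        (x2, y2) = CITIES[tour[(i + 1) % len(tour)]]
        total += ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return total

def blind_random_search(good_enough=18.0, max_tries=10_000, seed=0):
    """BRS with an OR'd stop: score X or better, OR N tries exhausted."""
    rng = random.Random(seed)
    best_tour, best_score = None, float("inf")
    for _ in range(max_tries):                     # criterion (b): N tries
        tour = rng.sample(range(len(CITIES)), len(CITIES))
        score = tour_length(tour)
        if score < best_score:                     # keep best yet encountered
            best_tour, best_score = tour, score
        if best_score <= good_enough:              # criterion (a): X or better
            break
    return best_tour, best_score
```

Every generated configuration is independent of the previous one -- which is exactly the weakness the next algorithm addresses.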

BRS performs relatively poorly. A HUGE improvement is an algorithm called "Iterative Improvement" (II), covered on pages 6 through 8 of the book. The idea is to take a BRS configuration and then make modest moves around it -- scoring and repeating until you have had k consecutive failures to improve. The best configuration obtained becomes that one BRS "point". Then generate a new BRS point and compare it to the old one as usual -- except that now the II loop has probably substantially improved the old BRS score you are comparing against.
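The II loop the book gives in Pascal can be sketched in Python along these lines. The 8x8 assignment-cost matrix (a scaled-down stand-in for the 20-worker scheduling example) and the value of k are my own assumptions, not the book's:

```python
import random

# Invented 8x8 worker/job cost matrix (small stand-in for the 20x20 example).
random.seed(1)
N = 8
COST = [[random.randint(1, 20) for _ in range(N)] for _ in range(N)]

def score(assignment):
    """Total cost when worker i does job assignment[i]; lower is better."""
    return sum(COST[i][assignment[i]] for i in range(N))

def iterative_improvement(k=100, seed=2):
    """One random 'BRS point', then modest moves around it (swap two
    workers' jobs), keeping a move only if it improves the score;
    stop after k consecutive failures to improve."""
    rng = random.Random(seed)
    current = rng.sample(range(N), N)   # the random starting configuration
    best = score(current)
    failures = 0
    while failures < k:
        i, j = rng.sample(range(N), 2)
        current[i], current[j] = current[j], current[i]
        trial = score(current)
        if trial < best:
            best, failures = trial, 0   # improvement: reset failure count
        else:
            current[i], current[j] = current[j], current[i]  # undo the swap
            failures += 1
    return current, best
```

The swap move keeps every configuration a valid one-job-per-worker assignment, which is what makes the "modest moves" idea work here.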

Both BRS and II involve only "downhill" moves: only a lower score and its companion configuration are kept, and a new configuration never becomes the current one if its score is worse. The danger is "getting stuck" in a local minimum as opposed to the global minimum (the truly best score). To avoid this danger there is the "probabilistic hill-climbing" algorithm of Metropolis. An improved configuration (one with a better score) still always becomes the next current configuration, but now there is also some probability of accepting the contender configuration as the next current one even when the contender's score is worse. That probability is related to the score difference and to a parameter that might be thought of as temperature.
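The acceptance rule just described has a standard form, exp(-delta/T). Here is my Python rendering of that standard Metropolis criterion (not the book's Pascal):

```python
import math
import random

def metropolis_accept(delta, temperature, rng):
    """Standard Metropolis rule for minimization: always accept an
    improving move (delta <= 0); accept a worsening move with
    probability exp(-delta / temperature)."""
    if delta <= 0:
        return True                    # downhill moves always win
    if temperature <= 0:
        return False                   # at zero temperature, never go uphill
    return rng.random() < math.exp(-delta / temperature)

# At high temperature most uphill moves are accepted; at low, almost none.
rng = random.Random(0)
accepted_hot = sum(metropolis_accept(1.0, 10.0, rng) for _ in range(1000))
accepted_cold = sum(metropolis_accept(1.0, 0.1, rng) for _ in range(1000))
```

Note the limiting behavior: at very high temperature this degenerates toward blind random walking, and at zero temperature it degenerates to the pure downhill rule of II.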

From the probabilistic hill-climbing algorithm of Metropolis, all you need to get to an annealing algorithm is a schedule for appropriately reducing the "temperature" parameter (which controls the uphill acceptance probability) in successive steps. The analogy is slowly freezing a liquid to get its perfectly crystalline line-up of atoms, free of defects. Cool too fast and you may get a glass rather than a crystal.
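Putting the two together, a full annealing loop might look like the sketch below. I use a geometric cooling schedule (multiply T by a constant alpha at each stage), which is a common textbook choice but NOT necessarily the schedule the book recommends -- deriving a general schedule is precisely the book's subject. All parameter values are invented:

```python
import math
import random

def simulated_annealing(score, neighbor, start,
                        t0=10.0, alpha=0.95, steps_per_t=100,
                        t_min=1e-3, seed=0):
    """Metropolis moves under a geometric cooling schedule t <- alpha * t."""
    rng = random.Random(seed)
    current, current_score = start, score(start)
    best, best_score = current, current_score
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            cand = neighbor(current, rng)
            delta = score(cand) - current_score
            # Metropolis rule: downhill always; uphill with prob exp(-delta/t).
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                current, current_score = cand, current_score + delta
                if current_score < best_score:
                    best, best_score = current, current_score
        t *= alpha        # cool: uphill moves become rarer and rarer
    return best, best_score
```

On a trivial one-dimensional test (minimize (x-7)^2 over the integers, moving by ±1 from a start of 0), this loop settles on x = 7.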

The book's chief aim is to recommend, IN GENERAL and without recourse to your specific problem, a schedule for the "temperature" changes. If that is your aim in considering this book, you need look no further: here's your book.

But what about the less technical reader? First of all, the book does gently introduce you to combinatorial optimization, blind random search, Metropolis, and annealing. Second, the few pages on Iterative Improvement are EMINENTLY USEFUL in a PRACTICAL sense -- and are a good, simple alternative to annealing (my example comes at the end). Third, the book includes several ancillary extras.

The ancillary extras:

· a tutorial on matrix mathematics
· a tutorial on Markov chains
· material on probability and conditional probability
· a tutorial on statistics -- especially the normal distribution and the Central Limit Theorem

I'm not saying that these ancillary extras are the best available for a novice-level reader, but most folks would not expect to find this material in a book called "The Annealing Algorithm".

The final bit of ancillary material is Pascal code for all the algorithms in the book, plus a complete program that applies annealing to the combinatorial problem of electronic chip placement.


My problem is not combinatorial optimization, but it can still use the ideas of Iterative Improvement, since I am solving a deterministic problem (one without any random element) by Monte Carlo methods (using random numbers). My problem: I have the coordinates of the midpoint of a line segment; the segment's length is also known and is roughly one-fourth the diameter of a circle; the segment lies in the annular region between this circle and a second circle whose radius is half a segment-length larger than that of the original circle; lastly, given the rotation angle of my segment, I ask: what are the coordinates (x,y) of the intersection of the segment and the original circle? (I took steps to check that YES, there was an intersection.) Solving the problem analytically didn't work -- or at least, I couldn't do it. I had been using a BRS Monte Carlo approach. Then, re-reading this book, it occurred to me to use the book's Iterative Improvement algorithm from pages 6-8 (Pascal code on page 8). I got a big reduction in the error. Obviously, I have had to omit details from this review (like how I even know the error in my problem -- and, if I do know it, why I can't fix it exactly; hint: a circle is the locus of all points equidistant from a given point). But the POINT FOR YOU is that I attained a great improvement in my problem just by using the book's explicit algorithm (Pascal code) for Iterative Improvement.
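Since the review necessarily omits the actual details, here is a purely hypothetical reconstruction of that kind of problem in Python: every number (circle radius, midpoint, segment length, angle) is invented, and the II-style loop is my own rendering, with an added OR'd stop (k consecutive failures OR an iteration cap) and a move size that narrows as the search stalls:

```python
import math
import random

# Hypothetical instance: circle of radius R centered at the origin, and a
# segment with known midpoint (MX, MY), length L, and rotation angle THETA,
# chosen so that the segment really does cross the circle.
R = 10.0
MX, MY = 10.5, 1.0
L = 5.0                         # roughly one-fourth of the circle's diameter
THETA = math.radians(30)

def point_on_segment(t):
    """Point at signed offset t (|t| <= L/2) from the segment's midpoint."""
    return MX + t * math.cos(THETA), MY + t * math.sin(THETA)

def error(t):
    """How far the point at offset t is from lying exactly on the circle."""
    x, y = point_on_segment(t)
    return abs(math.hypot(x, y) - R)

def improve(k=200, max_iters=20_000, seed=0):
    """II-style search: one random start, then modest moves around the best
    offset found so far, narrowing the move size whenever a move fails."""
    rng = random.Random(seed)
    t_best = rng.uniform(-L / 2, L / 2)
    e_best, failures, step = error(t_best), 0, L / 4
    for _ in range(max_iters):                    # stop (b): iteration cap
        if failures >= k:                         # stop (a): k failures
            break
        t = min(L / 2, max(-L / 2, t_best + rng.uniform(-step, step)))
        e = error(t)
        if e < e_best:
            t_best, e_best, failures = t, e, 0
        else:
            failures += 1
            step = max(step * 0.99, 1e-8)         # narrow the moves
    return point_on_segment(t_best), e_best

(x, y), residual = improve()
```

The "score" here is simply the distance of the candidate point from the circle, so driving it toward zero is exactly the error reduction the review describes.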

Finally, the book is nice to read, with both very easy-on-the-eyes typography (unusual for a "math" book) and a good flow to the authors' writing.

Product Details
No. of Pages 201