Annealing Solver
Simulated Annealing & Solution Analysis
How Simulated Annealing Works
Simulated annealing (SA) is a metaheuristic inspired by the physical process of cooling metal. When metal is heated and slowly cooled (annealed), atoms settle into a low-energy crystalline structure. SA applies the same principle to optimization.
The Algorithm
1. Start with a random binary state x
2. Set temperature T = T_start (high)
3. Repeat for num_steps:
a. Pick a random variable i
b. Flip it: x_i = 1 - x_i
c. Compute energy change: dE = E(new) - E(old)
d. If dE <= 0: accept (it's an improvement)
e. If dE > 0: accept with probability exp(-dE / T)
f. Cool: T = T * cooling_rate
4. Return the final state and its energy

Why It Works
The acceptance probability exp(-dE / T) is the key insight: at high temperature, almost any move is accepted, so the search roams freely and can climb out of local minima; as T falls, worsening moves become exponentially unlikely, and the search settles into refining the best basin it has found.
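The loop above can be sketched in a few lines of Python. This is a minimal illustration, not a production solver; the function name, the dict-based QUBO encoding, and all parameter defaults are illustrative choices, and the per-step full energy recomputation is O(|Q|) for clarity rather than speed.

```python
import math
import random

def simulated_anneal(Q, num_steps=10_000, T_start=10.0, T_end=0.01, seed=None):
    """Minimize x^T Q x over binary x via SA with geometric cooling.

    Q is a dict mapping (i, j) index pairs to coefficients; diagonal
    entries (i, i) are the linear terms. Illustrative sketch only.
    """
    rng = random.Random(seed)
    n = max(max(i, j) for i, j in Q) + 1
    x = [rng.randint(0, 1) for _ in range(n)]   # 1. random start

    def energy(state):
        return sum(c * state[i] * state[j] for (i, j), c in Q.items())

    E = energy(x)
    T = T_start                                  # 2. start hot
    cooling_rate = (T_end / T_start) ** (1.0 / num_steps)
    for _ in range(num_steps):                   # 3. anneal
        i = rng.randrange(n)
        x[i] = 1 - x[i]                          # flip a random variable
        dE = energy(x) - E
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            E += dE                              # accept the move
        else:
            x[i] = 1 - x[i]                      # reject: undo the flip
        T *= cooling_rate                        # cool
    return x, E                                  # 4. final state and energy
```

Note the `dE <= 0` short-circuit: improvements are always taken, so the Metropolis probability is only evaluated for uphill moves.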
Temperature Schedule
The geometric cooling schedule multiplies temperature by a constant factor each step:
cooling_rate = (T_end / T_start) ^ (1 / num_steps)
T_next = T_current * cooling_rate

This produces exponential decay: large absolute temperature drops early on, while the search is still exploring broadly and fine temperature control matters little, and small drops near the end, where precision matters.
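A few lines make the schedule concrete. The specific values here are illustrative; the point is that the rate is chosen so the temperature lands exactly on T_end after num_steps multiplications, and that the per-step drop shrinks as the run proceeds.

```python
T_start, T_end, num_steps = 10.0, 0.01, 1000
cooling_rate = (T_end / T_start) ** (1 / num_steps)   # ≈ 0.99312 here

temps = []
T = T_start
for _ in range(num_steps):
    temps.append(T)
    T *= cooling_rate
# After num_steps multiplications, T has decayed to T_end (up to
# floating-point error), since r^num_steps == T_end / T_start exactly.
# Absolute drops shrink over time: the first step loses far more
# temperature than the last step does.
```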
Key parameters:
- T_start — initial temperature; high enough that most early moves are accepted
- T_end — final temperature; low enough that only improvements are accepted at the end
- num_steps — iterations per read; more steps mean slower, more reliable cooling
Multiple Reads
SA is stochastic — different starting points lead to different solutions. Running multiple independent chains (num_reads) and keeping the best increases your chance of finding the global optimum.
The energy distribution across reads tells you about the landscape: if most reads land at the same low energy, the landscape is easy and that solution is probably optimal; if energies are widely spread, the landscape is rugged and more reads (or slower cooling) are warranted.
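The multi-read pattern looks like the sketch below. To keep it self-contained, `one_read` is a stand-in that returns a random state's energy rather than running a full anneal; in practice you would call your SA routine with a different seed per read.

```python
import random
import statistics

def one_read(Q, seed):
    """Stand-in for a single SA chain: scores one random binary state.

    Replace the body with a real annealing run; only the seed-per-read
    pattern and the aggregation below are the point here.
    """
    rng = random.Random(seed)
    n = max(max(i, j) for i, j in Q) + 1
    x = [rng.randint(0, 1) for _ in range(n)]
    E = sum(c * x[i] * x[j] for (i, j), c in Q.items())
    return x, E

def best_of_reads(Q, num_reads=50):
    results = [one_read(Q, seed) for seed in range(num_reads)]
    energies = [E for _, E in results]
    best_x, best_E = min(results, key=lambda r: r[1])
    # A tight spread suggests an easy landscape; a wide spread,
    # a rugged one that needs more reads or slower cooling.
    spread = statistics.pstdev(energies)
    return best_x, best_E, spread
```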
Simulated vs Quantum Annealing
| Property | Simulated Annealing | Quantum Annealing |
|---|---|---|
| Escape mechanism | Thermal fluctuation | Quantum tunneling |
| Hardware | Any CPU | D-Wave quantum processor |
| Problem size | No hard limit | ~5000 qubits (with embedding) |
| Speed | O(num_steps × num_reads) | Fixed anneal time (~20μs) |
| Guarantee | Converges to optimal as T→0 (infinitely slow) | No formal guarantee |
| Practical advantage | Well-understood, reliable | Debated — active research |
The honest assessment: for problems under ~100 variables, well-tuned SA is competitive with or better than current quantum hardware. The QUBO framework is the valuable part — it works with both backends.
Analyzing Solutions
After running the solver, you should check:
- the reported energy, recomputed from the QUBO to confirm it matches the returned state
- how that energy compares against simple baselines (next section)
- whether more reads or steps still improve it (see Convergence Analysis below)
Baseline Comparisons
Always compare against simple baselines:
- random sampling — score many random binary states and keep the best
- greedy selection — choose each variable by its individual benefit alone
Greedy fails when features interact: it might pick two correlated features because each is individually good, missing that together they're redundant. SA considers the full QUBO structure, including the pairwise terms.
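A tiny worked example shows the failure mode. The toy QUBO below is an assumption built for illustration: features 0 and 1 are each beneficial alone (negative diagonal) but redundant together (large positive interaction). Exhaustive search stands in for a well-tuned SA run, which is feasible at this size.

```python
from itertools import product

# Features 0 and 1: good alone, redundant together. Feature 2: mildly good.
Q = {(0, 0): -3, (1, 1): -3, (0, 1): 5, (2, 2): -1}
n = 3

def energy(x, Q):
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

def greedy_select(Q, n):
    # Pick every feature whose individual (diagonal) term is beneficial,
    # ignoring the pairwise interaction terms entirely.
    return [1 if Q.get((i, i), 0) < 0 else 0 for i in range(n)]

x_greedy = greedy_select(Q, n)   # picks all three, eating the +5 penalty
x_best = min(product([0, 1], repeat=n), key=lambda x: energy(x, Q))
# Greedy lands at energy -2; the true optimum (one of the two correlated
# features plus feature 2) reaches -4.
```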
Convergence Analysis
Plot best energy vs num_reads and best energy vs num_steps: if the curve is still dropping, increase the budget; once it plateaus, additional reads or steps are wasted compute.
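The data for the reads curve is just a running minimum. The sketch below fakes per-read energies with a trivial random scorer so it runs standalone; substitute the energies from your actual SA reads, then plot `best_so_far` against read count.

```python
import random

def read_energy(seed, n=20):
    """Stand-in for one SA read's final energy (toy diagonal QUBO)."""
    rng = random.Random(seed)
    return sum(-rng.randint(0, 1) for _ in range(n))

energies = [read_energy(s) for s in range(200)]

best_so_far = []
best = float("inf")
for E in energies:
    best = min(best, E)
    best_so_far.append(best)
# best_so_far is non-increasing by construction; a long flat tail
# means extra reads stopped paying off and the budget is sufficient.
```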
What's Next
In Module 5, you'll build an interactive dashboard that makes this analysis visual — sliders for parameters, charts for energy distributions, side-by-side comparisons with baselines.
This is chapter 4 of Quantum Optimization for AI.
Get the full hands-on course for $100 and build the complete system. Your projects become your portfolio.