Local Search Algorithms
Beyond Classical Search: Local Search Algorithms
Definition
Local search algorithms are heuristic search techniques that start from an initial solution and iteratively move to neighboring solutions in search of improvement. Unlike systematic search methods, they explore the solution space by making small local changes, which makes them particularly effective for large, complex problems where exhaustive global search is infeasible.
Key Concepts
- Initial Solution: A starting point for the search, typically a complete candidate solution (e.g., a full assignment of values to the problem's variables).
- Neighborhood: The set of solutions reachable from the current solution by making small changes.
- Objective Function: A function that evaluates the quality of a solution and guides the search (see the small sketch after this list).
- Local Optima: Solutions that are better than all of their neighbors but not necessarily the best solution overall (the global optimum).
- Hill Climbing: An iterative algorithm that moves to the neighbor with the highest objective function value.
- Simulated Annealing: A probabilistic algorithm that allows occasional moves to worse solutions to escape local optima.
- Genetic Algorithms: Algorithms inspired by natural selection that use crossover and mutation to explore the solution space.
- Tabu Search: An algorithm that uses memory structures to avoid revisiting previously explored solutions.
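To make these terms concrete, here is a minimal Python sketch of an objective function and a neighborhood for a hypothetical toy problem (maximizing a simple quadratic over the integers); the problem and the function names are illustrative assumptions, not part of the original notes.

```python
# Hypothetical toy problem used only to illustrate the concepts above.

def objective(x: int) -> int:
    """Objective function: quality of a candidate solution (higher is better)."""
    return -(x - 7) ** 2  # single peak at x = 7

def neighbors(x: int) -> list[int]:
    """Neighborhood: solutions reachable from x by a small local change."""
    return [x - 1, x + 1]

initial_solution = 0                 # an arbitrary complete candidate solution
print(objective(initial_solution))   # -49
print(neighbors(initial_solution))   # [-1, 1]
```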
Detailed Explanation
- Hill Climbing:
  - Procedure (sketched in code below):
    1. Start with an initial solution.
    2. Evaluate the neighbors of the current solution.
    3. Move to the neighbor with the best objective function value.
    4. Repeat steps 2 and 3 until no improvement is possible (local optimum).
  - Limitations: Can get stuck in local optima; does not guarantee finding the global optimum.
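A minimal steepest-ascent hill-climbing sketch in Python, following the numbered steps above; the toy objective and neighborhood (a quadratic over the integers) are assumptions made for illustration.

```python
def objective(x: int) -> int:
    return -(x - 7) ** 2  # toy objective with a single peak at x = 7

def neighbors(x: int) -> list[int]:
    return [x - 1, x + 1]

def hill_climbing(start: int) -> int:
    current = start                                    # step 1: initial solution
    while True:
        best = max(neighbors(current), key=objective)  # step 2: evaluate neighbors
        if objective(best) <= objective(current):      # step 4: stop when no neighbor improves
            return current                             # local optimum reached
        current = best                                 # step 3: move to the best neighbor

print(hill_climbing(0))  # 7 for this single-peaked toy objective
```

On a multi-peaked objective the same loop would stop at whichever local optimum is closest to the starting point, which is exactly the limitation noted above.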
- Simulated Annealing:
  - Procedure (sketched in code below):
    1. Start with an initial solution and an initial temperature.
    2. At each step, randomly select a neighbor.
    3. If the neighbor is better, move to it; if it is worse, move to it with a probability that decreases with the temperature.
    4. Gradually decrease the temperature according to a cooling schedule.
    5. Repeat until the system is "frozen" (the temperature is low).
  - Advantages: Can escape local optima; the probability of accepting worse solutions decreases over time.
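A compact simulated-annealing sketch in Python; the objective, the random neighbor move, and the cooling parameters (initial temperature, geometric cooling factor, freezing threshold) are illustrative assumptions rather than values from the notes.

```python
import math
import random

def objective(x: float) -> float:
    return -(x - 7) ** 2  # toy objective, peak at x = 7

def random_neighbor(x: float) -> float:
    return x + random.uniform(-1.0, 1.0)  # small random local change

def simulated_annealing(start: float,
                        temp: float = 10.0,
                        cooling: float = 0.95,
                        min_temp: float = 1e-3) -> float:
    current = start                              # step 1: initial solution and temperature
    while temp > min_temp:                       # step 5: stop when "frozen"
        candidate = random_neighbor(current)     # step 2: random neighbor
        delta = objective(candidate) - objective(current)
        # Step 3: always accept improvements; accept worse moves with
        # probability exp(delta / temp), which shrinks as temp falls.
        if delta > 0 or random.random() < math.exp(delta / temp):
            current = candidate
        temp *= cooling                          # step 4: geometric cooling schedule
    return current

print(round(simulated_annealing(0.0), 2))  # typically close to 7
```

Early on, the high temperature lets the search accept sizeable downhill moves; as the temperature drops, the acceptance probability falls and the search settles into a good region.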
- Genetic Algorithms:
  - Procedure (sketched in code below):
    1. Start with a population of random solutions.
    2. Evaluate the fitness of each solution.
    3. Select pairs of solutions to reproduce based on their fitness.
    4. Apply crossover and mutation to produce new solutions.
    5. Replace the old population with the new one.
    6. Repeat steps 2-5 until a stopping criterion is met.
  - Advantages: Explores a broad search space; effective for large and complex problems.
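A small genetic-algorithm sketch in Python; the OneMax fitness (count of 1 bits), roulette-wheel selection, one-point crossover, and every parameter value are assumptions chosen to keep the example compact.

```python
import random

LENGTH, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 50, 0.02

def fitness(bits: list[int]) -> int:
    return sum(bits)                       # step 2: fitness = number of 1 bits (OneMax)

def select(population: list[list[int]]) -> list[int]:
    # Step 3: fitness-proportionate (roulette-wheel) selection; +1 avoids zero weights.
    return random.choices(population, weights=[fitness(p) + 1 for p in population])[0]

def crossover(a: list[int], b: list[int]) -> list[int]:
    point = random.randint(1, LENGTH - 1)  # step 4a: one-point crossover
    return a[:point] + b[point:]

def mutate(bits: list[int]) -> list[int]:
    # Step 4b: flip each bit with a small probability.
    return [1 - b if random.random() < MUTATION_RATE else b for b in bits]

# Step 1: random initial population.
population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):               # step 6: fixed generation budget as stopping criterion
    # Step 5: replace the old population with newly produced offspring.
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]

print(max(fitness(p) for p in population))  # usually near LENGTH (20)
```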
- Tabu Search:
  - Procedure (sketched in code below):
    1. Start with an initial solution and initialize a tabu list.
    2. At each step, evaluate the neighbors of the current solution.
    3. Move to the best neighbor that is not in the tabu list.
    4. Update the tabu list with the current solution.
    5. Repeat steps 2-4 until a stopping criterion is met.
  - Advantages: Avoids cycles; can escape local optima by forbidding recently visited solutions.
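A minimal tabu-search sketch in Python on the same kind of toy integer problem; the objective, neighborhood, tabu tenure, and iteration budget are illustrative assumptions. The tabu list here is a fixed-length queue of recently visited solutions.

```python
from collections import deque

def objective(x: int) -> int:
    return -(x - 7) ** 2  # toy objective, peak at x = 7

def neighbors(x: int) -> list[int]:
    return [x - 1, x + 1]

def tabu_search(start: int, tenure: int = 5, iterations: int = 50) -> int:
    current = best = start                   # step 1: initial solution
    tabu = deque(maxlen=tenure)              # step 1: short-term memory (tabu list)
    for _ in range(iterations):              # step 5: fixed iteration budget
        tabu.append(current)                 # step 4: record the current solution
        # Steps 2-3: evaluate neighbors and pick the best one that is not tabu.
        candidates = [n for n in neighbors(current) if n not in tabu]
        if not candidates:
            break
        current = max(candidates, key=objective)
        if objective(current) > objective(best):
            best = current                   # keep the best solution seen so far
    return best

print(tabu_search(0))  # 7
```

Note that the search keeps moving even after reaching the peak, because recently visited solutions are forbidden; the separately tracked best solution is what gets returned.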
Diagrams
- Hill Climbing Process (diagram placeholder)
- Simulated Annealing Cooling Schedule (diagram placeholder)
Links to Resources
- Hill Climbing Algorithm
- Simulated Annealing Explained
- Genetic Algorithms Overview
- Tabu Search Algorithm
Notes and Annotations
- Summary of key points:
- Local search algorithms improve an initial solution by exploring its neighborhood.
- Hill climbing is simple but can get stuck in local optima.
- Simulated annealing allows escaping local optima by probabilistically accepting worse solutions.
- Genetic algorithms use crossover and mutation to explore a broad search space.
- Tabu search uses memory to avoid revisiting previously explored solutions.
- Personal annotations and insights:
- Local search algorithms are particularly useful for optimization problems where the search space is large and complex.
- Understanding the strengths and limitations of each algorithm helps in selecting the appropriate one for a given problem.
- Combining local search with other techniques, such as constraint propagation, can lead to more robust and efficient solutions.