Boundary Evolution Algorithm for SAT-NP
A Boundary Evolution Algorithm (BEA) is proposed that simultaneously takes into account crossover and mutation at both the bottom level and the high level, i.e., at the boundary of the hierarchical genetic algorithm. Operators based on simulated annealing and on optimal individuals are designed. Building on the many variants of the genetic algorithm, the boundary evolution approach with crossover and mutation has been tested on the SAT problem and compared with two competing methods, a traditional genetic algorithm and a traditional hierarchical genetic algorithm, among others. The comparative experiments on SAT show that the new hierarchical genetic algorithm based on simulated annealing and optimal individuals (BEA) considerably improves the success rate and convergence speed, because it avoids both divergence and the loss of optimal individuals; as a corollary, it is conducive to NP problems more generally. Although more extensive comparisons against further algorithms remain to be made, the notion of boundary elasticity in hierarchical genetic algorithms carries implications for evolutionary algorithms at large.
💡 Research Summary
The paper introduces a novel evolutionary framework called the Boundary Evolution Algorithm (BEA) designed to improve the performance of hierarchical genetic algorithms (HGAs) on Boolean satisfiability (SAT) problems, which are canonical NP‑complete tasks. The authors argue that evolution in nature often occurs at the “boundary” between neighboring species, where crossover (mixing of genetic material) and mutation (small changes) happen. Translating this biological insight into algorithmic terms, they define a “boundary elasticity” concept: the algorithm should simultaneously manipulate both the lower‑level (sub‑population) and the higher‑level (population of sub‑populations) boundaries to avoid loss of high‑quality individuals and to prevent premature divergence.
Two technical contributions constitute the core of BEA. First, the authors embed simulated annealing (SA) principles into the crossover and mutation operators at the lower level. When two parents A and B produce offspring C and D, the algorithm computes the maximum fitness among parents (t1) and among offspring (t2). If t2 ≥ t1 the offspring are accepted unconditionally. If t2 < t1, the offspring are still accepted with probability exp(−(t1−t2)/T), where T is the current annealing temperature. An analogous rule governs mutation: a mutated individual is always accepted if its fitness improves; otherwise it is accepted with the same SA‑based probability. The temperature T is decreased each generation by a cooling factor (0.95), thus allowing exploratory moves early on and increasingly greedy decisions later.
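The acceptance rule and cooling schedule described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names and the initial-temperature handling are assumptions, while the acceptance probability exp(−(t1−t2)/T) and the 0.95 cooling factor come from the summary.

```python
import math
import random

def accept_offspring(parent_fitness, offspring_fitness, temperature):
    """SA-based acceptance for a crossover step: t1 is the best parent
    fitness, t2 the best offspring fitness. Offspring replace parents
    unconditionally if t2 >= t1, and otherwise with probability
    exp(-(t1 - t2) / T)."""
    t1 = max(parent_fitness)
    t2 = max(offspring_fitness)
    if t2 >= t1:
        return True
    return random.random() < math.exp(-(t1 - t2) / temperature)

def cool(temperature, factor=0.95):
    """Geometric cooling applied once per generation (factor 0.95),
    so exploratory acceptances become rarer as the run proceeds."""
    return temperature * factor
```

The same rule applies to mutation: an improving mutant is always kept, and a worsening one is kept with the same SA probability, so early generations explore while later, cooler generations behave greedily.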
Second, the authors redesign the high‑level selection operator. Traditional HGAs select sub‑populations based solely on their mean fitness, which can discard a sub‑population that contains a single elite individual because the mean is dragged down by many low‑fitness members. BEA computes a weighted score for each sub‑population i as α·ri + β·gi, where ri is the mean fitness, gi is the fitness of the best individual in that sub‑population, and α, β ∈
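The weighted high-level score can be sketched as below. The formula α·ri + β·gi is from the summary; the particular weight values and the list-based representation of sub-populations are illustrative assumptions.

```python
def subpopulation_scores(subpops, alpha=0.5, beta=0.5):
    """Score each sub-population as alpha * (mean fitness) +
    beta * (best individual's fitness), so a sub-population holding
    a single elite individual is not discarded just because many
    low-fitness members drag its mean down. The weights alpha and
    beta here are illustrative, not values from the paper."""
    scores = []
    for fitnesses in subpops:
        r_i = sum(fitnesses) / len(fitnesses)  # mean fitness r_i
        g_i = max(fitnesses)                   # elite fitness g_i
        scores.append(alpha * r_i + beta * g_i)
    return scores
```

For example, a sub-population [1, 1, 9] has a lower mean than [4, 4, 4] but a far better elite member, so under this weighted score it outranks the flat sub-population, which a mean-only selection would prefer.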