2. Minimize Mathematical Function Rosenbrock
In the previous two examples we minimized a 2-dimensional Rastrigin function. Let's crank it up a notch and tackle an arbitrarily hard problem (how about a 15-dimensional function?). The objective of this part is to teach about multithreading, sub populations, and migration.
Rosenbrock is another mathematical function commonly used to evaluate genetic algorithms. For n-dimensional problems it can be expressed as

f(x) = Σ_{i=0}^{n-2} [ b·(x_{i+1} - x_i²)² + (a - x_i)² ]

and has its optimal solution at the point f(1,1,...,1) = 0.
Similarly to the Rastrigin example, we first need to define a utility or fitness function.
Function<double[], Double> rosenbrock = (x) -> {
    double value = 0;
    for (int i = 0; i < x.length - 1; i++) {
        // a and b are constants defined on the class (a = 1, b = 100, see below)
        value += b * Math.pow(x[i + 1] - Math.pow(x[i], 2), 2) + Math.pow(a - x[i], 2);
    }
    return value;
};
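Before wiring the function into the GA, we can sanity-check it at the known optimum. The following standalone sketch (with the a = 1, b = 100 constants inlined) verifies f(1,...,1) = 0 and, for reference, evaluates f(0,...,0), which works out to n - 1:

```java
import java.util.Arrays;
import java.util.function.Function;

public class RosenbrockCheck {
    static int a = 1;
    static int b = 100;

    public static void main(String[] args) {
        Function<double[], Double> rosenbrock = (x) -> {
            double value = 0;
            for (int i = 0; i < x.length - 1; i++) {
                value += b * Math.pow(x[i + 1] - Math.pow(x[i], 2), 2)
                       + Math.pow(a - x[i], 2);
            }
            return value;
        };

        double[] optimum = new double[15];
        Arrays.fill(optimum, 1);                       // f(1,1,...,1) = 0
        System.out.println(rosenbrock.apply(optimum)); // prints 0.0

        double[] origin = new double[15];              // f(0,...,0): 14 terms of value 1
        System.out.println(rosenbrock.apply(origin));  // prints 14.0
    }
}
```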
Remember that in part 1 we said that the double prototype expects an array with the lower and upper bounds for each variable in our method? We could do this manually,
double[][] initialRange2D = new double[][] {
    { -4, 8 },   // initial min/max of x0
    { -4, 3 }};  // initial min/max of x1

double[][] initialRange3D = new double[][] {
    { -4, 8 },   // initial min/max of x0
    { -4, 3 },   // initial min/max of x1
    { -2, 4 }};  // initial min/max of x2

double[][] initialRange4D = new double[][] {
    { -4, 8 },   // initial min/max of x0
    { -4, 3 },   // initial min/max of x1
    { -6, 6 },   // initial min/max of x2
    { -4, 4 }};  // initial min/max of x3
but for higher dimensions this is tedious to write. Therefore, let us write a helper method which creates the array for an arbitrary n.
private static double[][] generateInitialRange(int dimensions) {
    var RNG = new PcgRSFast();
    double[][] initialRangeND = new double[dimensions][2];
    // Initialize the array with random bounds
    for (int dim = 0; dim < dimensions; dim++) {
        initialRangeND[dim][0] = RNG.nextInt(6) - 7; // min of x_dim, in [-7, -2]
        initialRangeND[dim][1] = RNG.nextInt(6) + 1; // max of x_dim, in [1, 6]
    }
    return initialRangeND;
}
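To see what the helper produces, here is a standalone sketch; java.util.Random stands in for PcgRSFast (it offers the same nextInt(bound) contract), so the snippet runs without the Darwin dependency:

```java
import java.util.Random;

public class InitialRangeDemo {
    // Same logic as generateInitialRange, but with java.util.Random so the
    // sketch compiles without the Darwin library on the classpath.
    private static double[][] generateInitialRange(int dimensions) {
        Random rng = new Random();
        double[][] range = new double[dimensions][2];
        for (int dim = 0; dim < dimensions; dim++) {
            range[dim][0] = rng.nextInt(6) - 7; // lower bound in [-7, -2]
            range[dim][1] = rng.nextInt(6) + 1; // upper bound in [ 1,  6]
        }
        return range;
    }

    public static void main(String[] args) {
        double[][] range = generateInitialRange(15);
        for (double[] r : range) {
            // Every lower bound is strictly below its upper bound
            System.out.println(r[0] < r[1]); // prints true, 15 times
        }
    }
}
```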
Nothing out of the ordinary happens here. We generate an n-dimensional 2D double array with random lower bounds between -7 and -2 and random upper bounds between 1 and 6. A special note concerns the random number generator we are using. Java's built-in RNG isn't good enough for simulation purposes: it is slow and the statistical quality of the generated numbers is low. The same is true for the commonly used Mersenne Twister, therefore Darwin ships with an RNG of the PCG family. Random number generation turned out to be a bottleneck of this genetic algorithm, which is why additional steps were taken to circumvent this issue. For more information see https://github.com/KilianB/pcg-java.
new PcgRSFast() creates an RNG that is not thread safe, while a new PcgRS() instance is thread safe.
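Since PcgRSFast is not thread safe, a common pattern in multithreaded code is to give each thread its own instance. This is a sketch of that pattern using ThreadLocal, with java.util.Random standing in for PcgRSFast so it runs without the Darwin dependency:

```java
import java.util.Random;

public class PerThreadRng {
    // One RNG instance per thread: ThreadLocal hands every thread its own
    // generator, so a non-thread-safe RNG is never shared across threads.
    private static final ThreadLocal<Random> RNG =
            ThreadLocal.withInitial(Random::new);

    public static void main(String[] args) throws InterruptedException {
        Runnable worker = () -> {
            Random rng = RNG.get(); // each thread sees its own instance
            // Draw a lower bound the same way generateInitialRange does
            System.out.println(rng.nextInt(6) - 7 >= -7); // prints true
        };
        Thread t1 = new Thread(worker);
        Thread t2 = new Thread(worker);
        t1.start(); t2.start();
        t1.join(); t2.join();
    }
}
```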
import java.util.concurrent.TimeUnit;
import java.util.function.Function;
// Darwin library imports omitted

public class RosenbrockNDimensional {

    // The rosenbrock function has 2 additional parameters
    static int a = 1;   // if a != 0 the function is not symmetric
    static int b = 100;

    private static double[][] generateInitialRange(int dimensions) {
        var RNG = new PcgRSFast();
        double[][] initialRangeND = new double[dimensions][2];
        // Initialize the array with random bounds
        for (int dim = 0; dim < dimensions; dim++) {
            initialRangeND[dim][0] = RNG.nextInt(6) - 7; // min of x_dim
            initialRangeND[dim][1] = RNG.nextInt(6) + 1; // max of x_dim
        }
        return initialRangeND;
    }

    public static void main(String[] args) {
        Function<double[], Double> rosenbrock = (x) -> {
            double value = 0;
            for (int i = 0; i < x.length - 1; i++) {
                value += b * Math.pow(x[i + 1] - Math.pow(x[i], 2), 2) + Math.pow(a - x[i], 2);
            }
            return value;
        };

        int dimensions = 15;
        double[][] initialRangeND = generateInitialRange(dimensions);

        var ga = GeneticAlgorithm.builder()
                .withPrototype(new DoublePrototype(initialRangeND, rosenbrock))
                .withMaxExecutionTime(10, TimeUnit.SECONDS)
                .withMaxGenerationCount(Integer.MAX_VALUE)
                .build();

        Result result = ga.calculate(10000);
        System.out.println(result);
    }
}
Our results indicate that even after 10 seconds of execution our target fitness wasn't reached. Let's remember our last tutorial and use a fuzzy crossover approach:

.withCrossoverStrategy(new ScatteredFuzzy(2))
The results are better but we are not quite there yet. For now, let's increase the population count stepwise and see if we can achieve better results.
There is a direct relationship between population size and the time the algorithm takes to perform a single generation. At least for this example, doubling the population size results in half the generations being created in the given runtime.
| Population Size | Generations Created | Fitness  |
|-----------------|---------------------|----------|
| 20              | 424718              | 0.018744 |
| 30              | 310282              | 0.001266 |
| 40              | 233961              | 0.00239  |
| 50              | 193956              | 0.00106  |
| 100             | 101591              | 0.00139  |
| 200             | 49186               | 0.00691  |
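We can make the fixed-budget tradeoff explicit: multiplying each row's population size by its generation count gives the effective number of individuals evaluated, which stays roughly constant (about 8.5 to 10 million) because the runtime is fixed at 10 seconds:

```java
public class ThroughputCheck {
    public static void main(String[] args) {
        // Population sizes and generation counts from the table above
        int[] popSize     = { 20, 30, 40, 50, 100, 200 };
        int[] generations = { 424718, 310282, 233961, 193956, 101591, 49186 };

        // Effective individuals evaluated = generations * population size.
        // With a fixed 10-second runtime this product barely changes.
        for (int i = 0; i < popSize.length; i++) {
            long evaluated = (long) popSize[i] * generations[i];
            System.out.println(popSize[i] + " -> " + evaluated);
        }
    }
}
```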
No matter how many individuals we add, we don't reach our target fitness of 1e-4. The genetic algorithm is an iterative algorithm: it has to wait for a generation to finish calculating before it can perform its next steps. As with many approaches in programming, divide and conquer is key.
Instead of calculating an entire population of 100 individuals on 1 core we can take 25 individuals each and calculate them on several cores. That is the basic idea behind sub populations.
.migration()
.withNewSubpopulations(4)
.build()
Appending the above commands to the builder will create 4 sub populations which all inherit the settings as defined beforehand. Executing the genetic algorithm we can see that the console output changed:
Overall Population ||| Subpopulation
Gen | Best | Worst | Average | Sum ||| Best(0) | Best(1) | Best(2) | Best(3)
0 | 14437,7882 | 33908913,47 | 659416,72 | 65941672,28 ||| 24671,6697 | 14437,7882 | 19636,2021 | 16965,4176
10000 | 6,1834 | 4540185,70 | 171303,69 | 17130369,25 ||| 6,1834 | 10,8710 | 10,6274 | 10,1723
20000 | 4,9766 | 2798181,49 | 63764,50 | 6376450,16 ||| 4,9766 | 8,9518 | 8,7993 | 8,7156
Now we also get information about the best individual in every single sub population. Once the algorithm terminates we can see that, in fact, we computed 295649 generations with 4 × 25 individuals. It is about 3 times faster than if we had taken a single population of 100 individuals.
Sadly the result did not improve much. Yes, we calculate more possible solutions, but it's the same as executing the algorithm with 25 individuals 4 times. To harness the true power of sub populations we also need to introduce the concept of migration. Migration is the process of the best individuals moving from one sub population to another. Genetic algorithms exploit fit individuals: if a good solution has been found, it can be beneficial to let other populations know about the progress we have made. Initially migration is disabled, so let's configure the genetic algorithm to let individuals migrate every 100 generations.
.migration()
.withMigrationInterval(100)
.withNewSubpopulations(4)
.build()
A migration interval that is too low (e.g. every generation) won't give the sub populations enough time to evolve before individuals are moved. An interval that is too high will slow down finding a valid solution.
Now execute the algorithm again, and we can see that we found a valid solution in less than 7 seconds: Fitness: 0.00099976, Runtime Total: 6819 ms.
So far we created 4 identical sub populations which used the same settings throughout. Darwin gives you full control over individual sub populations.
.migration()
    .withMigrationProcess(new ForwardMigration())
    .withMigrationInterval(100)
    // With default settings
    .withNewSubpopulations(2)
    // Customized
    .withNewSubpopulation()
        .withScalingStrategy(new TopScaling())
        .withSelectionStrategy(new Tournament(3))
        .withCrossoverStrategy(new ScatteredFitnessFuzzy(2))
    // Another one...
    .withNewSubpopulation()
        ....
.build();
The above code creates 2 sub populations with the default settings defined before, and a third sub population which uses a different scaling, selection, and crossover strategy. The migration process indicates that not every sub population migrates into every other population; instead, migration only takes place from left to right. Full documentation can be found on the other wiki pages as well as in the javadocs.
Population diversity is an important topic in genetic algorithms. If all individuals are equivalent, a crossover operation will not yield new results. Therefore, it is important to keep the population diverse enough to have different genes to play with, while still allowing the population to eventually converge.
.withForceCloneMutation(enabled, cutoff)

controls an additional filter step in the algorithm which forces all individuals to be unique at the end of a generation. This is achieved by mutating duplicates until every individual is unique.
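The filter can be pictured with a small sketch. This is only an illustration of the idea, not Darwin's actual implementation: duplicated genomes are re-mutated until every individual in the population is unique.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Random;
import java.util.Set;

public class ForceCloneMutationSketch {
    public static void main(String[] args) {
        Random rng = new Random(42);
        // A toy population with three identical genomes
        int[][] population = { {1, 2, 3}, {1, 2, 3}, {4, 5, 6}, {1, 2, 3} };

        Set<String> seen = new HashSet<>();
        for (int[] individual : population) {
            // Set.add returns false for a duplicate: keep mutating a random
            // gene until this genome has not been seen before
            while (!seen.add(Arrays.toString(individual))) {
                int gene = rng.nextInt(individual.length);
                individual[gene] += rng.nextInt(10) + 1; // small random mutation
            }
        }
        System.out.println(seen.size()); // prints 4: all genomes are distinct
    }
}
```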
Usually this step speeds up computation quite a bit; in fact, it has such a severe impact for many problem sets that it is enabled by default, but in this case it does not help. There isn't a definite rule for when this setting should be turned on or off, but as it has a great impact, this is one of the features you should play around with first. Usually, the more variables are involved, the less likely force clone mutation is to contribute positively to the performance of the GA. Go ahead and disable it and you will win another second.
- Sub populations give you a handy tool to increase the population count and multithread genetic algorithms without increasing the run time of the algorithm
- Numerical problems should always attempt to use fuzzy crossover strategies
- force clone mutation is one of the first settings to play around with.
- For the best performance, one sub population per core results in the highest throughput.
This chart shows that sub population counts with #SubPopulations % ProcessorCount == 0 (a multiple of your CPU cores) are best for performance. The effective individual count (Generations × Population Size) doesn't even increase much between 1 and 4 sub populations, even though we increase our computation power. This has to do with the overhead of thread handling, statistics, and individual sorting. Still, with regard to the final solution, an individual in generation 100 is worth more than an individual in the first generation.
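To pick a sub population count that is a multiple of the core count at runtime, we can query the JVM. This is a small sketch; Darwin's own defaults may differ:

```java
public class SubpopulationCount {
    public static void main(String[] args) {
        // Query the number of logical cores the JVM sees
        int cores = Runtime.getRuntime().availableProcessors();

        // Pick a sub population count that is a multiple of the core count,
        // so every core stays busy (#SubPopulations % ProcessorCount == 0)
        int subPopulations = cores * 2;

        System.out.println(subPopulations % cores == 0); // prints true
    }
}
```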
We have set withMaxGenerationCount() to a value of Integer.MAX_VALUE, which is fine if we are only concerned about the actual run time.
.withForceCloneMutation(false, 0)

Careful! Do not share the RNG across threads if it is not thread safe.