nomuramasahir0 / crfmnes
(CEC2022) Fast Moving Natural Evolution Strategy for High-Dimensional Problems
Home Page: https://arxiv.org/abs/2201.11422
License: MIT License
I've noticed that if "lamb" is set to an odd number, a ValueError is raised because the sampler tries to split the population unevenly (except for 3, which mysteriously works just fine).
My suggestion is to add this line to the CRFMNES __init__:
assert lamb > 0 and lamb % 2 == 0, "The value of 'lamb' must be a positive even integer"
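For context, my understanding (an assumption on my part, based on the antithetic sampling described in the FM-NES line of work) is that the lamb samples are generated as lamb/2 mirrored pairs, which is why an odd lamb cannot be split. A minimal sketch of that pairing:

```python
import numpy as np

dim, lamb = 5, 6  # lamb must be even for the pairing below to work
z_half = np.random.randn(dim, lamb // 2)  # lamb/2 base samples
z = np.hstack([z_half, -z_half])          # mirror each sample -> lamb columns
# With an odd lamb, lamb // 2 pairs would yield only lamb - 1 samples.
```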
Importance sampling is used in large-scale (N)ES so that samples from the previous generation can be reused, improving sample efficiency and reducing evaluation time. Could we have a function that returns the pdf of a sample for this purpose? I come from a very shallow stats/maths background, so this is not straightforward for me to implement.
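In the meantime, here is a minimal sketch of what such a helper could look like, assuming you can reconstruct the search distribution's mean vector and full covariance matrix from the optimizer state (the function name and this interface are hypothetical; as far as I know the library does not expose them):

```python
import numpy as np

def gaussian_log_pdf(x, mean, cov):
    """log N(x; mean, cov), computed stably via a Cholesky factorization."""
    d = x.shape[0]
    L = np.linalg.cholesky(cov)       # cov = L @ L.T
    z = np.linalg.solve(L, x - mean)  # whitened residual
    log_det = 2.0 * np.sum(np.log(np.diag(L)))
    return -0.5 * (d * np.log(2.0 * np.pi) + log_det + z @ z)
```

An importance weight for a sample x drawn under the previous generation's distribution would then be np.exp(gaussian_log_pdf(x, m_new, C_new) - gaussian_log_pdf(x, m_old, C_old)).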
I added two versions of the algorithm to EvoJax; see https://github.com/google/evojax/tree/main/evojax/algo
The algorithm performs exceptionally well for the EvoJax benchmarks.
Additionally, there are quite interesting results when using it as part of a QD (quality-diversity) algorithm,
see google/evojax#52 (not yet merged)
It has also been added as a QD emitter in fcmaes; see
https://github.com/dietmarwo/fast-cma-es/blob/master/tutorials/Diversity.adoc
Finally, it was applied at
https://www.esa.int/gsp/ACT/projects/spoc-2023/ (Surface Exploration with Morphing Rovers), where Team fcmaes ranked 3rd.
While testing the simple example code that optimizes np.sum(x**2),
I noticed that the optimization process stalls for a while when the dimension is increased to a large value (e.g. ~1000).
Code to reproduce:
import numpy as np
from crfmnes import CRFMNES
import matplotlib.pyplot as plt

def loss_func(x):
    x = x.reshape(-1)
    return np.sum(x**2)

dim = 1000
mean = np.random.randn(dim, 1) * 0.25 + 0.5
sigma = 0.2
lamb = 6
crfmnes = CRFMNES(dim, loss_func, mean, sigma, lamb)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.set_xlabel("X")
ax.set_ylabel("Y")
ax.set_zlabel("Iteration")
X = list(range(dim))
ax.plot(X, mean, 0, ".")

for i in range(40):
    x_best, f_best = crfmnes.optimize(100)
    ax.plot(X, x_best, i + 1, ".")
    print(f_best)

plt.grid()
plt.show()
Example output (no progress for ~11 iterations of 100 optimizer steps):
353.90549790931476
353.90549790931476
353.90549790931476
353.90549790931476
353.90549790931476
353.90549790931476
353.90549790931476
353.90549790931476
353.90549790931476
353.90549790931476
353.90549790931476
339.25010722424685
286.1063896053389
247.91645354571136
213.37571948612083
183.2348610899695
154.01953822981847
134.00792653279
107.80148620615931
87.5663151895065
74.59439720712179
61.14215951634783
...
Note that this does not occur when dim is reduced to 100 (with 10 optimizer steps per call instead).
I just wanted to check if this behavior is at all expected, or if something has gone wrong.
I noticed that as the number of dimensions grows in CRFMNES(dim, ...),
the values computed by "f" inside the get_h_inv(dim)
calculation reach magnitudes of ~10^300, eventually overflowing for high enough dim values. This also makes the calculation take many iterations to converge, as it first "explodes" and then only very gradually approaches the 1e-10 target tolerance.
A simple fix that I've found is to replace the initial h_inv value with anything between 2 and 10 (6 seems like a good value), which rapidly converges to the same final values without exploding.
I suggest implementing this change unless there are drawbacks that I've missed.
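To illustrate, here is a sketch of the kind of damped Newton iteration involved, assuming get_h_inv solves f(a) = (1 + a^2) e^{a^2/2} / 0.24 - 10 - dim = 0 (an assumption; check against the actual source), with the proposed initial value of 6 instead of 1:

```python
import math

def get_h_inv(dim):
    # f and its derivative; f'(a) = a * exp(a^2/2) * (3 + a^2) / 0.24
    f = lambda a: ((1.0 + a * a) * math.exp(a * a / 2.0) / 0.24) - 10.0 - dim
    f_prime = lambda a: (1.0 / 0.24) * a * math.exp(a * a / 2.0) * (3.0 + a * a)
    h_inv = 6.0  # proposed initial guess; starting from 1.0 overshoots and can overflow
    while abs(f(h_inv)) > 1e-10:
        h_inv -= 0.5 * (f(h_inv) / f_prime(h_inv))
    return h_inv
```

Starting above the root, each damped step moves h_inv monotonically down toward the solution, so the exponential never blows up; starting at 1.0, the first step is roughly +(dim)/55, which sends exp(a^2/2) toward overflow for large dim.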