From e3738c9bb5db8c839611d4b564a05205644c410e Mon Sep 17 00:00:00 2001
From: Thieu Nguyen
Date: Fri, 20 Oct 2023 20:55:59 +0700
Subject: [PATCH] Update README

---
 README.md | 18 ++++++++++++------
 1 file changed, 12 insertions(+), 6 deletions(-)

diff --git a/README.md b/README.md
index 788ec56a..788d774b 100644
--- a/README.md
+++ b/README.md
@@ -43,7 +43,8 @@ approximate optimization.
 
 * **Dependencies:** numpy, scipy, pandas, matplotlib
 
-### Goals
+
+<h2 align="center">Goals</h2>
 
 Our goals are to implement all of the classical as well as the state-of-the-art nature-inspired algorithms, create a simple interface that helps researchers access optimization algorithms as quickly as possible, and share knowledge of the optimization field with everyone without a fee. What you can do with mealpy:
@@ -54,6 +55,10 @@ Our goals are to implement all of the classical as well as the state-of-the-art
 - Save results in various formats (csv, json, pickle, png, pdf, jpeg)
 - Export and import models can also be done with Mealpy.
 
+
+
+
+
 
 ### Citation Request
 
@@ -83,6 +88,7 @@ Please include these citations if you plan to use this library:
 
 # Usage
 
+
 ## Installation
 
 * Install the stable (latest) version from [PyPI release](https://pypi.python.org/pypi/mealpy):
@@ -175,11 +181,11 @@ from mealpy.utils.problem import Problem
 
 # Our custom problem class
 class Squared(Problem):
-    def __init__(self, lb=(-5, -5, -5, -5, -5, -5), ub=(5, 5, 5, 5, 5, 5), minmax="min", name="Squared", **kwargs):
+    def __init__(self, lb=(-5, -5, -5, -5, -5, -5), ub=(5, 5, 5, 5, 5, 5), minmax="min", name="Squared", **kwargs):
         super().__init__(lb, ub, minmax, **kwargs)
-        self.name = name
+        self.name = name
 
-    def fit_func(self, solution):
+    def fit_func(self, solution):
         return np.sum(solution ** 2)
 ```
@@ -617,7 +623,7 @@ All visualization examples: [Link](https://mealpy.readthedocs.io/en/latest/pages
   * **SADE**: Qin, A. K., & Suganthan, P. N. (2005, September). Self-adaptive differential evolution algorithm for numerical optimization. In 2005 IEEE congress on evolutionary computation (Vol. 2, pp. 1785-1791). IEEE.
   * **SHADE**: Tanabe, R., & Fukunaga, A. (2013, June). Success-history based parameter adaptation for differential evolution. In 2013 IEEE congress on evolutionary computation (pp. 71-78). IEEE.
   * **L_SHADE**: Tanabe, R., & Fukunaga, A. S. (2014, July). Improving the search performance of SHADE using linear population size reduction. In 2014 IEEE congress on evolutionary computation (CEC) (pp. 1658-1665). IEEE.
-  * **SAP_DE**: Teo, J. (2006). Exploring dynamic self-adaptive populations in differential evolution. Soft Computing, 10(8), 673-686.
+  * **SAP_DE**: Teo, J. (2006). Exploring dynamic self-adaptive populations in differential evolution. Soft Computing, 10(8), 673-686.
 * **DSA - Differential Search Algorithm (not done)**
   * **BaseDSA**: Civicioglu, P. (2012). Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm. Computers & Geosciences, 46, 229-247.
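For readers without mealpy installed, the custom `Squared` problem in the hunk above can be sanity-checked in plain NumPy. The `Problem` stand-in below only mimics the constructor signature of `mealpy.utils.problem.Problem` so the sketch is self-contained; it is not the real base class.

```python
import numpy as np

# Minimal stand-in that mimics the constructor of mealpy's Problem base
# class (mealpy.utils.problem.Problem); NOT the real implementation.
class Problem:
    def __init__(self, lb, ub, minmax, **kwargs):
        self.lb, self.ub, self.minmax = list(lb), list(ub), minmax

# Same custom problem as in the patch: minimise the sum of squares
# over the 6-dimensional box [-5, 5]^6.
class Squared(Problem):
    def __init__(self, lb=(-5,) * 6, ub=(5,) * 6, minmax="min", name="Squared", **kwargs):
        super().__init__(lb, ub, minmax, **kwargs)
        self.name = name

    def fit_func(self, solution):
        return np.sum(solution ** 2)

problem = Squared()
print(problem.fit_func(np.array([1.0, 2.0, 0.0, 0.0, 0.0, 0.0])))  # 5.0
```

With the real library, an instance like this is what gets passed to an optimizer's `solve()` call.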
@@ -801,7 +807,7 @@ All visualization examples: [Link](https://mealpy.readthedocs.io/en/latest/pages
 * **PSO - Particle Swarm Optimization**
   * **OriginalPSO**: Eberhart, R., & Kennedy, J. (1995, October). A new optimizer using particle swarm theory. In MHS'95. Proceedings of the Sixth International Symposium on Micro Machine and Human Science (pp. 39-43). IEEE.
   * **PPSO**: Ghasemi, M., Akbari, E., Rahimnejad, A., Razavi, S. E., Ghavidel, S., & Li, L. (2019). Phasor particle swarm optimization: a simple and efficient variant of PSO. Soft Computing, 23(19), 9701-9718.
-  * **HPSO_TVAC**: Ghasemi, M., Aghaei, J., & Hadipour, M. (2017). New self-organising hierarchical PSO with jumping time-varying acceleration coefficients. Electronics Letters, 53(20), 1360-1362.
+  * **HPSO_TVAC**: Ghasemi, M., Aghaei, J., & Hadipour, M. (2017). New self-organising hierarchical PSO with jumping time-varying acceleration coefficients. Electronics Letters, 53(20), 1360-1362.
   * **C_PSO**: Liu, B., Wang, L., Jin, Y. H., Tang, F., & Huang, D. X. (2005). Improved particle swarm optimization combined with chaos. Chaos, Solitons & Fractals, 25(5), 1261-1271.
   * **CL_PSO**: Liang, J. J., Qin, A. K., Suganthan, P. N., & Baskar, S. (2006). Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE transactions on evolutionary computation, 10(3), 281-295.
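The PSO variants listed above all build on the global-best scheme of Eberhart & Kennedy (1995). As a rough illustration of that baseline, here is a textbook gbest PSO in NumPy; it is a generic sketch, not mealpy's `OriginalPSO`, and the coefficients `w`, `c1`, `c2` are common defaults chosen only for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Same fitness as the Squared problem: sum of squares, minimum at 0.
    return float(np.sum(x ** 2))

def pso(fit_func, lb, ub, pop_size=30, epochs=200, w=0.7, c1=1.5, c2=1.5):
    """Generic global-best PSO sketch (not mealpy's implementation)."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    pos = rng.uniform(lb, ub, size=(pop_size, dim))   # random initial swarm
    vel = np.zeros((pop_size, dim))
    pbest = pos.copy()                                # per-particle best positions
    pbest_fit = np.array([fit_func(p) for p in pos])
    gbest = pbest[np.argmin(pbest_fit)].copy()        # swarm-wide best position
    for _ in range(epochs):
        r1 = rng.random((pop_size, dim))
        r2 = rng.random((pop_size, dim))
        # Inertia + cognitive pull toward pbest + social pull toward gbest.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lb, ub)              # keep particles in bounds
        fit = np.array([fit_func(p) for p in pos])
        better = fit < pbest_fit
        pbest[better] = pos[better]
        pbest_fit[better] = fit[better]
        gbest = pbest[np.argmin(pbest_fit)].copy()
    return gbest, float(pbest_fit.min())

best_pos, best_fit = pso(sphere, lb=[-5] * 6, ub=[5] * 6)
```

On the 6-dimensional sphere function this converges to a fitness near zero within the 200 iterations used here.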