Fireworks Algorithm (FWA)

(Prof. Tan's speech on FWA, PPT, ZIP file, FWA Matlab Codes, FWA C Codes)

Swarm intelligence

Optimization problems are among the major problems found everywhere, especially in academia and industry, and many methods have been proposed to solve them. Recently, Swarm Intelligence (SI) has become a very active approach to optimization problems and has shown great success. James Kennedy defined it as follows: "Swarm intelligence refers to a kind of problem-solving ability that emerges in the interactions of simple information-processing units." [3] The units that make up the swarm can be living creatures or lifeless bodies. In the years since its introduction, SI has developed into a number of branches, with many researchers exploring the concepts, behavior, and cognition underlying nature, making great contributions to fields such as Particle Swarm Optimization (PSO) [4], Ant Colony Optimization (ACO) [5], Genetic Algorithms (GA) [6], and Fish School Search (FSS) [7]. PSO, for example, simulates the behavior of flocks searching for food, in which social information and previous experiences are shared among the swarm and between individuals.
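The PSO dynamic of blending an individual's own experience with the swarm's shared experience can be illustrated with the standard velocity update. This is a minimal 1-D sketch; the inertia weight `w` and acceleration coefficients `c1`, `c2` are common illustrative values, not ones specified in this text:

```python
import random

def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One PSO iteration: each particle blends its own best experience
    (pbest, individual memory) with the swarm's best (gbest, shared
    social information)."""
    new_pos, new_vel = [], []
    for x, v, p in zip(positions, velocities, pbest):
        r1, r2 = random.random(), random.random()
        v = w * v + c1 * r1 * (p - x) + c2 * r2 * (gbest - x)
        new_vel.append(v)
        new_pos.append(x + v)
    return new_pos, new_vel
```

Calling `pso_step` repeatedly, while updating `pbest` and `gbest` from the evaluated positions, drives the swarm toward the best region found so far.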

 


Fig. 1. From biological phenomenon or process to artificial computational approach

 

Motivation of FWA

In the same way, inspired by observing fireworks explosions, a novel swarm intelligence algorithm, called the Fireworks Algorithm (FA), was proposed for the global optimization of complex functions [1]. The FA is implemented by simulating the explosion process of fireworks. In the FA, two explosion (search) processes are employed, and mechanisms for keeping the diversity of sparks are also carefully designed.

 

Fig. 2. Mathematical modeling process of the Fireworks Algorithm

 

Fireworks Algorithm

When a firework is set off, a shower of sparks fills the local space around the firework. In our opinion, the explosion process of a firework can be viewed as a search, through the sparks generated in the explosion, in the local space around the specific point where the firework is set off. Mimicking the process of setting off fireworks, a rough framework of the FA is depicted in Fig. 3.

Fig. 3. Framework of the Fireworks Algorithm
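As a rough 1-D illustration of this framework, the sketch below follows the allocation idea of [1]: better (lower-fitness) fireworks get more sparks within a smaller explosion amplitude (fine local search), while worse ones get fewer sparks within a larger amplitude (coarse exploration). The constants and the simplified selection step are illustrative assumptions, not the exact scheme of the paper:

```python
import random

def fireworks_step(fireworks, f, n_sparks=50, max_amp=40.0, eps=1e-10):
    """One FA generation (minimization): allocate sparks and amplitudes
    by relative fitness, explode, then select the next generation."""
    fitness = [f(x) for x in fireworks]
    y_max, y_min = max(fitness), min(fitness)
    den_s = sum(y_max - y + eps for y in fitness)  # spark-count normalizer
    den_a = sum(y - y_min + eps for y in fitness)  # amplitude normalizer
    sparks = list(fireworks)
    for x, y in zip(fireworks, fitness):
        # Better fireworks: more sparks, smaller amplitude.
        s_i = max(1, round(n_sparks * (y_max - y + eps) / den_s))
        a_i = max_amp * (y - y_min + eps) / den_a
        for _ in range(s_i):
            sparks.append(x + random.uniform(-a_i, a_i))
    # Keep the best spark; pick the rest at random (a stand-in for the
    # distance-based diversity selection of the paper).
    sparks.sort(key=f)
    return [sparks[0]] + random.sample(sparks[1:], len(fireworks) - 1)
```

Because the current fireworks are kept in the candidate pool and the best candidate always survives, the best fitness never worsens from one generation to the next.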

 

To validate the convergence speed of the FA, we conducted more thorough experiments. Fig. 4 depicts the convergence curves of the FA, the CPSO, and the SPSO on eight benchmark functions, averaged over 20 independent runs. From these results, we can conclude that the proposed FA converges much faster than the CPSO and the SPSO. From Table 1, we can also see that the FA finds excellent solutions within only 10,000 function evaluations, which again reflects the fast convergence of the proposed FA.

Fig. 4. Convergence curves of the FA, the CPSO, and the SPSO on eight benchmark functions

 

Table 1. Statistical mean and standard deviation of the solutions found by the FA, the CPSO, and the SPSO on nine benchmark functions over 20 independent runs of 10,000 function evaluations

Application based on FWA

Low-rank approximations are utilized in several content-based retrieval and data mining applications, such as text and multimedia mining and web search, and achieve a more compact representation of the data with only limited loss of information. They reduce storage and runtime requirements, and also reduce redundancy and noise in the data representation while capturing the essential associations. Non-negative Matrix Factorization (NMF; Lee and Seung 1999) leads to a low-rank approximation that satisfies non-negativity constraints: NMF approximates a data matrix A by A ≈ WH, where W and H are the NMF factors, and requires all entries in A, W, and H to be zero or positive.


Figure 5 – Scheme of a very coarse NMF approximation with very low rank k. Although k is significantly smaller than m and n, the typical structure of the original data matrix can be retained (note the three different groups of data objects in the left, middle, and right parts of A).
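For concreteness, here is a minimal sketch of NMF itself, using the classic multiplicative update rules of Lee and Seung (the baseline algorithm, not the swarm-assisted variant discussed in this section); `eps` is a small constant guarding against division by zero:

```python
import numpy as np

def nmf(A, k, iters=200, eps=1e-9):
    """Rank-k NMF via Lee-Seung multiplicative updates: A (m x n,
    non-negative) is approximated by W (m x k) @ H (k x n), with all
    entries of W and H kept >= 0 by construction of the updates."""
    rng = np.random.default_rng(0)
    m, n = A.shape
    W = rng.random((m, k))
    H = rng.random((k, n))
    for _ in range(iters):
        H *= (W.T @ A) / (W.T @ W @ H + eps)   # update H (non-negative)
        W *= (A @ H.T) / (W @ H @ H.T + eps)   # update W (non-negative)
    return W, H
```

The multiplicative form is what preserves non-negativity: each entry is scaled by a non-negative ratio, so no entry can ever become negative.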

Optimization Strategy 1 – Initialization


Algorithm 1 – Pseudocode of the initialization procedure for the NMF factors W and H. The two for-loops in lines 4 and 10 can be executed concurrently. SIO = Swarm Intelligence Optimization.
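The structure of Algorithm 1 can be sketched as follows. This is a hedged approximation: plain random search stands in for the SIO step, and fitting the rows of W against a random reference H before refitting the columns of H is our assumption about the procedure, not a statement of the paper's exact setup:

```python
import numpy as np

def init_nmf_factors(A, k, n_candidates=200):
    """Sketch of Algorithm 1: each row of W and each column of H is
    initialized by an independent optimizer run minimizing its own
    approximation error. Random search stands in for the SIO; the rows
    (and the columns) are independent, so each loop could run its
    iterations concurrently, as noted in the caption."""
    rng = np.random.default_rng(0)
    m, n = A.shape
    H = rng.random((k, n))                       # random reference factor
    W = np.empty((m, k))
    for i in range(m):                           # first for-loop: rows of W
        cands = rng.random((n_candidates, k))
        errs = np.linalg.norm(cands @ H - A[i], axis=1)
        W[i] = cands[np.argmin(errs)]            # best candidate row
    H = np.empty((k, n))
    for j in range(n):                           # second for-loop: columns of H
        cands = rng.random((n_candidates, k))
        errs = np.linalg.norm(W @ cands.T - A[:, [j]], axis=0)
        H[:, j] = cands[np.argmin(errs)]         # best candidate column
    return W, H
```

Replacing the candidate sampling with an actual SI optimizer (PSO, FWA, etc.) per row/column recovers the spirit of the paper's initialization.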

Evaluation of Optimization Strategy 1

Initialization: Before evaluating the improvement of the NMF approximation quality as such, we first measure the initial error after initializing W and H (before running the NMF algorithm). Figure 6 shows the average approximation error (i.e., Frobenius norm / fitness) per row (left) and per column (right) for the data set DS-RAND.

Figure 6 – Left-hand side: average approximation error per row (after initializing the rows of W). Right-hand side: average approximation error per column (after initializing the columns of H). NMF rank k = 5. Legends are ordered according to approximation error (top = worst, bottom = best).

For more details, please see [8] and [9].

 

References

[1] Y. Tan and Y. Zhu (2010). Fireworks algorithm for optimization. ICSI 2010, Part I, LNCS 6145, pp. 355-364

[2] D. Bratton and J. Kennedy (2007). Defining a Standard for Particle Swarm Optimization. Proceedings of the 2007 IEEE Swarm Intelligence Symposium (SIS 2007), pp. 120-127

[3] J. Kennedy (2006). Swarm intelligence. In: Handbook of Nature-Inspired and Innovative Computing, Springer, Chapter 6, pp. 187-219.

[4] J. Kennedy and R. Eberhart (1995). Particle Swarm Optimization. Proceedings of the 1995 IEEE International Conference on Neural Networks (Perth, Australia), IEEE Service Center, Piscataway, NJ, Vol. IV, pp. 1942-1948.

[5] M. Dorigo, V. Maniezzo and A. Colorni (1996). Ant system: optimization by a colony of cooperating agents. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, Vol. 26, Issue 1, pp. 29-41.

[6] J. Holland (1975). Adaptation in natural and artificial systems. University of Michigan Press,  Ann Arbor.

[7] C. J. A. B. Filho, F. B. de Lima Neto, A. J. C. C. Lins, A. I. S. Nascimento and M. P. Lima (2008). A novel search algorithm based on fish school behavior. Proceedings of the IEEE International Conference on Systems, Man and Cybernetics. pp. 2646-2651

[8] A. Janecek and Y. Tan (2011). Swarm Intelligence for Non-negative Matrix Factorization. International Journal of Swarm Intelligence Research (IJSIR).

[9] A. Janecek and Y. Tan (2011). Using population based algorithms for initializing nonnegative matrix factorization. ICSI 2011, Part II, LNCS 6729, pp. 307-316.