SOM+PSO. A Novel Method to Obtain Classification Rules.

Authors

  • Laura Cristina Lanzarini Instituto de Investigación en Informática LIDI, Facultad de Informática, Universidad Nacional de La Plata, La Plata, Argentina
  • Augusto Villa Monte Instituto de Investigación en Informática LIDI, Facultad de Informática, Universidad Nacional de La Plata, La Plata, Argentina
  • Franco Ronchetti Instituto de Investigación en Informática LIDI, Facultad de Informática, Universidad Nacional de La Plata, La Plata, Argentina

Keywords:

adaptive strategies, self-organizing maps, particle swarm optimization, data mining, classification rules

Abstract

Currently, most processes accumulate a volume of historical information that makes manual processing difficult. Data mining, one of the most significant stages of the Knowledge Discovery in Databases (KDD) process, provides a set of techniques capable of modeling and summarizing these historical data, making them easier to understand and supporting decision making in future situations. This article presents a new adaptive data mining technique called SOM+PSO that builds, from the available information, a reduced set of simple classification rules from which the most significant relations between the recorded features can be derived. These rules operate on both numeric and nominal attributes, and they are built by combining a variation of a population-based metaheuristic (particle swarm optimization) with a competitive neural network (a self-organizing map). The proposed method was compared with the PART method on 19 databases (mostly from the UCI repository), with satisfactory results.
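
As a rough illustration of the combination described above, the sketch below pairs a tiny self-organizing map (the competitive neural network, used here to obtain cluster prototypes) with a binary PSO that selects which attributes enter a rule antecedent. It is only a sketch of the general idea under assumed choices: the support-times-confidence fitness, the per-attribute tolerance width, the SOM grid size, and the PSO coefficients are all illustrative and do not reproduce the authors' SOM+PSO implementation.

```python
# Minimal sketch of combining a self-organizing map with binary PSO to build
# classification rules. Every parameter and the fitness definition below are
# illustrative assumptions, not the formulation used in the paper.
import numpy as np

rng = np.random.default_rng(0)


def train_som(X, grid=(3, 3), iters=2000, lr=0.5, sigma=1.0):
    """Tiny SOM; returns the prototype vectors of the competitive layer."""
    h, w = grid
    protos = rng.uniform(X.min(0), X.max(0), size=(h * w, X.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], dtype=float)
    for t in range(iters):
        x = X[rng.integers(len(X))]
        bmu = np.argmin(((protos - x) ** 2).sum(1))        # best matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(1)
        gain = np.exp(-d2 / (2 * sigma ** 2)) * lr * (1 - t / iters)
        protos += gain[:, None] * (x - protos)
    return protos


def rule_fitness(mask, center, width, X, y, target):
    """Support x confidence of: IF the selected attributes lie within
    'width' of 'center' THEN class = target."""
    if not mask.any():
        return 0.0
    covered = np.all(np.abs(X[:, mask] - center[mask]) <= width[mask], axis=1)
    if not covered.any():
        return 0.0
    return covered.mean() * (y[covered] == target).mean()


def binary_pso_rule(X, y, target, center, width, particles=20, iters=50):
    """Binary PSO over which attributes appear in the antecedent (one bit each)."""
    d = X.shape[1]
    pos = rng.integers(0, 2, size=(particles, d)).astype(bool)
    vel = rng.normal(0.0, 1.0, size=(particles, d))
    pbest = pos.copy()
    pbest_fit = np.array([rule_fitness(p, center, width, X, y, target) for p in pos])
    for _ in range(iters):
        gbest = pbest[pbest_fit.argmax()].astype(float)
        r1, r2 = rng.random((2, particles, d))
        vel = (0.7 * vel
               + 1.5 * r1 * (pbest.astype(float) - pos)
               + 1.5 * r2 * (gbest - pos))
        pos = rng.random((particles, d)) < 1.0 / (1.0 + np.exp(-vel))  # sigmoid sampling
        fit = np.array([rule_fitness(p, center, width, X, y, target) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
    best = pbest_fit.argmax()
    return pbest[best], pbest_fit[best]


# Toy usage: each SOM prototype seeds one rule search for an assumed target class.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 2] > 0).astype(int)
width = X.std(0)                               # illustrative per-attribute tolerance
for proto in train_som(X, grid=(2, 2)):
    mask, fit = binary_pso_rule(X, y, target=1, center=proto, width=width)
    print("attributes in antecedent:", np.flatnonzero(mask), "fitness:", round(fit, 3))
```

A fuller implementation would also adjust the condition intervals and choose each rule's class per cluster; both are fixed here to keep the sketch short.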

References

[1] R. Agrawal and R. Srikant, “Fast algorithms for mining association rules in large databases,” in Proceedings of the 20th International Conference on Very Large Data Bases, ser. VLDB ’94. San Francisco, CA, USA: Morgan Kaufmann Publishers Inc., 1994, pp. 487-499.
[2] T. Scheffer, “Finding association rules that trade support optimally against confidence,” in Principles of Data Mining and Knowledge Discovery, ser. Lecture Notes in Computer Science, L. Raedt and A. Siebes, Eds. Springer Berlin Heidelberg, 2001, vol. 2168, pp. 424-435.
[3] Y. Ye and C.-C. Chiang, “A parallel apriori algorithm for frequent itemsets mining,” in Proceedings of the Fourth International Conference on Software Engineering Research, Management and Applications, ser. SERA ’06. Washington, DC, USA: IEEE Computer Society, 2006, pp. 87-94.
[4] J. R. Quinlan, C4.5: Programs for Machine Learning. San Francisco, CA, USA: Morgan Kaufmann Publishers Inc., 1993.
[5] E. Frank and I. H. Witten, “Generating accurate rule sets without global optimization,” in Proceedings of the Fifteenth International Conference on Machine Learning, ser. ICML ’98. San Francisco, CA, USA: Morgan Kaufmann Publishers Inc., 1998, pp. 144-151.
[6] Z. Wang, X. Sun, and D. Zhang, “A PSO-based classification rule mining algorithm,” in Proceedings of the 3rd International Conference on Intelligent Computing: Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence, ser. ICIC ’07. Berlin, Heidelberg: Springer-Verlag, 2007, pp. 377-384.
[7] T. Sousa, A. Silva, and A. Neves, “Particle swarm based data mining algorithms for classification tasks,” Parallel Computing, vol. 30, no. 5-6, pp. 767-783, May 2004.
[8] N. Khan, M. Iqbal, and A. Baig, “Data mining by discrete PSO using natural encoding,” in Future Information Technology (FutureTech), 2010 5th International Conference on, 2010, pp. 1-6.
[9] N. Khan, A. Baig, and M. Iqbal, “A new discrete PSO for data classification,” in Information Science and Applications (ICISA), 2010 International Conference on, 2010, pp. 1-6.
[10] M. Chen and S. Ludwig, “Discrete particle swarm optimization with local search strategy for rule classification,” in Nature and Biologically Inspired Computing (NaBIC), 2012 Fourth World Congress on, 2012, pp. 162-167.
[11] Y. Jiang, L. Wang, and L. Chen, “A hybrid dynamical evolutionary algorithm for classification rule discovery,” in Intelligent Information Technology Application, 2008. IITA ’08. Second International Symposium on, vol. 3, 2008, pp. 76-79.
[12] H. Wang and Y. Zhang, “Improvement of discrete particle swarm classification system,” in Fuzzy Systems and Knowledge Discovery (FSKD), 2011 Eighth International Conference on, vol. 2, 2011, pp. 1027-1031.
[13] L. Yan and J. Zeng, “Using particle swarm optimization and genetic programming to evolve classification rules,” in Intelligent Control and Automation, 2006. WCICA 2006. The Sixth World Congress on, vol. 1, 2006, pp. 3415-3419.
[14] A. Ozcift, M. Kaya, A. Gülten, and M. Karabulut, “Swarm optimized organizing map (SWOM): A swarm intelligence based optimization of self-organizing map,” Expert Systems with Applications, vol. 36, no. 7, pp. 10640-10648, 2009.
[15] C. Hung and L. Huang, “Extracting rules from optimal clusters of self-organizing maps,” in Computer Modeling and Simulation, 2010. ICCMS ’10. Second International Conference on, vol. 1, 2010, pp. 382-386.
[16] H. W. and L. L., “Dynamic self-organizing maps,” in XXXI Conf. Latinoamericana de Informática, CLEI 2005, 2005.
[17] T. Kohonen, “Self-organized formation of topologically correct feature maps,” in Neurocomputing: Foundations of Research, J. A. Anderson and E. Rosenfeld, Eds. Cambridge, MA, USA: MIT Press, 1988, pp. 509-521.
[18] J. B. MacQueen, “Some methods for classification and analysis of multivariate observations,” in Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, L. M. Le Cam and J. Neyman, Eds., vol. 1. University of California Press, 1967, pp. 281-297.
[19] T. Kohonen, M. R. Schroeder, and T. S. Huang, Eds., Self-Organizing Maps, 3rd ed. Secaucus, NJ, USA: Springer-Verlag New York, Inc., 2001.
[20] J. Kennedy and R. C. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, 1995, pp. 1942-1948.
[21] J. Kennedy and R. C. Eberhart, “A discrete binary version of the particle swarm algorithm,” in Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, vol. 5. Washington, DC, USA: IEEE Computer Society, 1997, pp. 4104-4108.
[22] L. Lanzarini, J. Lopez, J. A. Maulini, and A. Giusti, “A new binary PSO with velocity control,” in Advances in Swarm Intelligence, ser. Lecture Notes in Computer Science, Y. Tan, Y. Shi, Y. Chai, and G. Wang, Eds. Springer Berlin Heidelberg, 2011, vol. 6728, pp. 111-119.
[23] G. Venturini, “SIA: A supervised inductive algorithm with genetic search for learning attributes based concepts,” in Machine Learning: ECML-93, ser. Lecture Notes in Computer Science, P. Brazdil, Ed. Springer Berlin Heidelberg, 1993, vol. 667, pp. 280-296.
[24] Y. Shi and R. Eberhart, “Parameter selection in particle swarm optimization,” in Evolutionary Programming VII, ser. Lecture Notes in Computer Science, V. Porto, N. Saravanan, D. Waagen, and A. Eiben, Eds. Springer Berlin Heidelberg, 1998, vol. 1447, pp. 591-600.
[25] J. Kennedy and R. C. Eberhart, Swarm Intelligence. San Francisco, CA, USA: Morgan Kaufmann Publishers Inc., 2001.
[26] K. Bache and M. Lichman, “UCI machine learning repository,” 2013. [Online]. Available: http://archive.ics.uci.edu/ml

Published

2015-04-01

How to Cite

Lanzarini, L. C., Villa Monte, A., & Ronchetti, F. (2015). SOM+PSO. A Novel Method to Obtain Classification Rules. Journal of Computer Science and Technology, 15(01), pp. 15–22. Retrieved from https://journal.info.unlp.edu.ar/JCST/article/view/524

Section

Original Articles
