Quick Hidden Layer Size Tuning in ELM for Classification Problems
Abstract
The extreme learning machine (ELM) is a fast neural network with outstanding performance. However, selecting an appropriate number of hidden nodes is time-consuming, because training must be run for several candidate values, which is undesirable when a real-time response is required. We propose moving average, exponential moving average, and divide-and-conquer strategies to reduce the number of training runs required to select this size. Compared with the original, constrained, mixed, sum, and random sum extreme learning machines, the proposed methods reduce tuning time by up to 98% with equal or better generalization ability.
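As an illustration only, the following Python sketch pairs a basic ELM classifier (sigmoid hidden layer, pseudo-inverse output weights, following the standard ELM scheme) with a divide-and-conquer search that shrinks an interval of candidate hidden layer sizes instead of training at every value. The search range, tolerance, and validation criterion are assumptions for the sketch, not the authors' exact algorithm.

import numpy as np

def train_elm(X, y_onehot, n_hidden, rng):
    # Basic ELM: random input weights and biases, least-squares output weights.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden activations
    beta = np.linalg.pinv(H) @ y_onehot      # Moore-Penrose pseudo-inverse solution
    return W, b, beta

def elm_accuracy(model, X, y):
    W, b, beta = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.mean(np.argmax(H @ beta, axis=1) == y)

def divide_and_conquer_size(X_tr, y_tr, X_va, y_va, lo=10, hi=1000, tol=10, seed=0):
    # Keep the half of [lo, hi] around the better of two interior sizes; this needs
    # far fewer trainings than sweeping every candidate size.
    # Labels are assumed to be integers 0..C-1.
    rng = np.random.default_rng(seed)
    Y = np.eye(int(y_tr.max()) + 1)[y_tr]
    while hi - lo > tol:
        m1 = lo + (hi - lo) // 3
        m2 = hi - (hi - lo) // 3
        acc1 = elm_accuracy(train_elm(X_tr, Y, m1, rng), X_va, y_va)
        acc2 = elm_accuracy(train_elm(X_tr, Y, m2, rng), X_va, y_va)
        if acc1 >= acc2:
            hi = m2   # the smaller size performs at least as well; drop the upper part
        else:
            lo = m1
    return (lo + hi) // 2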
Copyright (c) 2024 MENDEL
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.