
Vladimir O. Nikiforov, D.Sc., Prof.
doi: 10.17586/2226-1494-2025-25-1-169-173
Application of the dynamic regressor extension and mixing approach in machine learning on the example of a perceptron

For citation: Nikiforov V.O. Application of the dynamic regressor extension and mixing approach in machine learning on the example of a perceptron. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2025, vol. 25, no. 1, pp. 169–173. doi: 10.17586/2226-1494-2025-25-1-169-173
Abstract
This paper explores the application of the Dynamic Regressor Extension and Mixing method to improve the learning speed in machine learning tasks. The proposed approach is demonstrated using a perceptron applied to regression and binary classification problems. The method transforms a multi-parameter optimization problem into a set of independent scalar regressions, significantly accelerating the convergence of the algorithm and reducing computational costs. Results from computer simulations, including comparisons with stochastic gradient descent and Adam methods, confirm the advantages of the proposed approach in terms of convergence speed and computational efficiency.
Acknowledgements. The research was supported by the Ministry of Science and Higher Education of the Russian Federation (project No. FSER-2025-0002).
References
- Rosenblatt F. The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review, 1958, vol. 65, no. 6, pp. 386–408. https://doi.org/10.1037/h0042519
- Haykin S.S. Neural Networks: A Comprehensive Foundation. Macmillan, 1994, 696 p.
- Karthick K. Comprehensive overview of optimization techniques in machine learning training. Control Systems and Optimization Letters, 2024, vol. 2, no. 1, pp. 23–27. https://doi.org/10.59247/csol.v2i1.69
- Reyad M., Sarhan A.M., Arafa M. A modified Adam algorithm for deep neural network optimization. Neural Computing and Applications, 2023, vol. 35, no. 23, pp. 17095–17112. https://doi.org/10.1007/s00521-023-08568-z
- Wang Y., Xiao Z., Cao G. A convolutional neural network method based on Adam optimizer with power-exponential learning rate for bearing fault diagnosis. Journal of Vibroengineering, 2022, vol. 24, no. 4, pp. 666–678. https://doi.org/10.21595/jve.2022.22271
- Liu M., Yao D., Liu Z., Guo J., Chen J. An improved Adam optimization algorithm combining adaptive coefficients and composite gradients based on randomized block coordinate descent. Computational Intelligence and Neuroscience, 2023, vol. 10, no. 1, art. 4765891. https://doi.org/10.1155/2023/4765891
- Demidenko E.Z. Linear and Nonlinear Regression. Moscow, Finstat Publ., 1981, 302 p. (in Russian)
- Aranovskiy S., Bobtsov A., Ortega R., Pyrkin A. Performance enhancement of parameter estimators via dynamic regressor extension and mixing. IEEE Transactions on Automatic Control, 2017, vol. 62, no. 7. https://doi.org/10.1109/tac.2016.2614889
- Ljung L. System Identification: Theory for the User. Prentice-Hall, 1987, 519 p.