The mixture of multiple regression equations: open problems
Abstract
In this article multiple regression equations are considered. The study is based on a sample influenced by an external environment, which is represented as a set of factors acting on the main sample. The sample is divided into parts, a multiple regression equation is constructed for each part, and from these equations a mixture of regression equations is built. Open problems are posed concerning the determination of the coefficients of a mixture of nonlinear regression equations via lasso, ridge and elastic net estimators.
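As a minimal sketch of the setting (the notation here is ours, not taken from the article): let the sample $(X_j, Y_j)$, $j = 1, \dots, n$, be drawn from a mixture of $M$ components, where an observation belonging to the $m$-th component satisfies $Y_j = g(X_j; \beta_m) + \varepsilon_j$ for a regression function $g$ and an unknown coefficient vector $\beta_m$. With nonnegative weights $w_{jm}$ reflecting the (known or estimated) probability that the $j$-th observation comes from the $m$-th component, a penalized least-squares estimator of that component's coefficients could take the elastic net form
$$ \hat{\beta}_m = \arg\min_{\beta} \sum_{j=1}^{n} w_{jm} \bigl( Y_j - g(X_j; \beta) \bigr)^2 + \lambda_1 \|\beta\|_1 + \lambda_2 \|\beta\|_2^2, $$
which reduces to the lasso for $\lambda_2 = 0$ and to ridge regression for $\lambda_1 = 0$; the open problems concern the behaviour of such estimators when $g$ is nonlinear.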
References
R. Maiboroda, V. Miroshnychenko, O. Sugakova, Quantile estimators for regression errors in mixture models with varying concentrations, Bulletin of Taras Shevchenko National University of Kyiv. Physical and Mathematical Sciences, 78 (2024), №1, 45–50. https://doi.org/10.17721/1812-5409.2024/1.8
R. Maiboroda, V. Miroshnychenko, Asymptotic normality of modified LS estimator for mixture of nonlinear regressions, Modern Stochastics: Theory and Applications, 7 (2020) №4, 435–448. https://doi.org/10.15559/20-VMSTA167
R. Maiboroda, O. Sugakova, Estimation and classification by observations from a mixture, Kyiv University, Kyiv, 2008. (in Ukrainian)
V.O. Miroshnychenko, Residual analysis in regression mixture model, Bulletin of Taras Shevchenko National University of Kyiv, Series: Physics and Mathematics, (2019), №3, 8–16. https://doi.org/10.17721/1812-5409.2019/3.1
B. Grün, F. Leisch, Fitting finite mixtures of linear regression models with varying & fixed effects in R. In Alfredo Rizzi and Maurizio Vichi (Eds.), Compstat 2006, Proceedings in Computational Statistics, Heidelberg: Physica Verlag, 2006, 853–860.
Ya.I. Yeleyko, O.A. Yarova, Mixture of distributions based on the Markov chain, Cybernetics and System Analysis, 58 (2022), №5, 754–757. https://doi.org/10.1007/s10559-022-00508-4
M. Gruber, Improving efficiency by shrinkage: The James–Stein and Ridge regression estimators, CRC Press, part 2, 1998.
Y. Jiang, Variable selection with prior information for generalized linear models via the prior lasso method, J. Amer. Stat. Assoc., 111 (2016), №513, 355–376. https://doi.org/10.1080/01621459.2015.1008363
R. Tibshirani, Regression shrinkage and selection via the lasso, J. Royal Stat. Soc. Series B (Methodological), 58 (1996), №1, 267–288. https://doi.org/10.1111/j.2517-6161.1996.tb02080.x
A.K. Md. Ehsanes Saleh, M. Arashi, B.M. Kibria, Theory of Ridge regression estimation with applications, New York: John Wiley & Sons, 2019.
J.K. Tay, B. Narasimhan, T. Hastie, Elastic net regularization paths for all generalized linear models, J. Stat. Software, 106 (2023), №1. https://doi.org/10.18637/jss.v106.i01
H. Zou, T. Hastie, Regularization and variable selection via the elastic net, J. Royal Stat. Soc. Series B (Statistical Methodology), 67 (2005), №2, 301–320. https://doi.org/10.1111/j.1467-9868.2005.00503.x