Multifactor Regression Analysis for Cardiac Diagnosis Prediction Based on Amplitude Variability Function

Authors

  • A. S. Sverstiuk Ternopil Ivan Puluj National Technical University
  • L. Ye. Mosiy Ternopil Ivan Puluj National Technical University

Keywords:

amplitude variability function, multifactor regression model, electrocardiographic signal, cardiac diagnosis, stepwise regression, extrasystole, left bundle branch block, CDPCAVF prediction coefficient, statistical predictors, automatic classification of cardiac pathologies

Abstract

A multifactor regression model for automatic cardiac diagnosis prediction based on statistical characteristics of the amplitude variability function of electrocardiographic signals is proposed. The model employs seven statistically significant predictors (mean, median, mode, standard deviation, sample variance, kurtosis, and skewness) selected by stepwise regression with forward variable selection from an initial set of 13 parameters; the inclusion criterion was a significance level of p < 0.05. The cardiac diagnosis prediction model based on the amplitude variability function assigns a weight coefficient to each predictor, determined by the least squares method. Central tendency measures make the greatest contribution to the diagnostic outcome (β = 201.78 for the mean and β = 68.69 for the median).
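A minimal sketch of this selection and fitting procedure is given below, assuming the 13 candidate statistics of the amplitude variability function are stored as columns of a pandas DataFrame and the diagnosis is encoded as a numeric outcome; the column names, the outcome coding, and the use of statsmodels are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: forward stepwise selection with a p < 0.05 inclusion
# criterion, predictor weights estimated by ordinary least squares.
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(X: pd.DataFrame, y: pd.Series, alpha: float = 0.05):
    """Greedily add the candidate predictor with the smallest p-value
    until no remaining candidate is significant at the alpha level."""
    selected = []
    remaining = list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            fit = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
            pvals[cand] = fit.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break                      # no candidate meets p < 0.05
        selected.append(best)
        remaining.remove(best)
    final = sm.OLS(y, sm.add_constant(X[selected])).fit()
    return selected, final             # final.params holds the beta weights

# Usage (column names of the candidate statistics are assumptions):
# cols = ["mean", "median", "mode", "std", "variance", "kurtosis", "skewness", ...]
# predictors, model = forward_stepwise(df[cols], df["diagnosis_code"])
# print(model.params)
```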

The model provides effective differentiation among three clinical states: the conditional norm, cardiac rhythm disorders in the form of extrasystole, and morphological conduction abnormalities represented by incomplete left bundle branch block, achieving a Nagelkerke coefficient of determination R² = 0.991. Normal states are characterized by minimal amplitude variability values (mathematical expectation 0.00003…0.00064 mV), while extrasystole demonstrates an increase of three to four orders of magnitude. Model validation on a representative sample of 204 electrocardiographic signals (102 norm, 51 extrasystole, 51 left bundle branch block) confirmed its high statistical significance (Fisher's F-criterion 953.93, p < 0.001) and compliance with the fundamental assumptions of regression analysis. Residual analysis demonstrated normality and homoscedasticity, confirming model adequacy.
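The validation steps mentioned above (overall significance and residual checks) can be reproduced on a fitted regression result roughly as follows; the specific tests chosen here (Shapiro–Wilk for normality, Breusch–Pagan for homoscedasticity) are assumptions and are not necessarily the procedures used by the authors.

```python
# Hypothetical validation sketch: overall F-test plus residual diagnostics
# for normality and homoscedasticity on a fitted OLS model.
from scipy import stats
from statsmodels.stats.diagnostic import het_breuschpagan

def validate(model):
    # Overall model significance (the paper reports F = 953.93, p < 0.001)
    print(f"F = {model.fvalue:.2f}, p = {model.f_pvalue:.3g}")

    resid = model.resid
    # Normality of residuals (p > 0.05: no evidence against normality)
    w_stat, w_p = stats.shapiro(resid)
    print(f"Shapiro-Wilk: W = {w_stat:.3f}, p = {w_p:.3g}")

    # Homoscedasticity (p > 0.05: no evidence of heteroscedasticity)
    bp_stat, bp_p, _, _ = het_breuschpagan(resid, model.model.exog)
    print(f"Breusch-Pagan: LM = {bp_stat:.3f}, p = {bp_p:.3g}")

# validate(model)   # 'model' is the OLS fit from the previous sketch
```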

The proposed approach combines the high interpretability of classical statistical methods with an innovative application of the amplitude variability function for comprehensive analysis of the morphological and rhythmic characteristics of cardiac signals. Its practical significance lies in providing mathematical tools for automated cardiovascular diagnostic systems and clinical decision support systems.

Author Biographies

A. S. Sverstiuk, Ternopil Ivan Puluj National Technical University

Dr. Sc. (Eng.), Professor, Professor of the Chair of Medical Informatics of Ternopil Ivan Horbachevsky National Medical University, Professor of the Chair of Computer Science of Ternopil Ivan Puluj National Technical University

L. Ye. Mosiy, Ternopil Ivan Puluj National Technical University

Post-Graduate Student of the Chair of Computer Science



Published

2025-10-10

How to Cite

[1]
A. S. Sverstiuk and L. Y. Mosiy, “Multifactor Regression Analysis for Cardiac Diagnosis Prediction Based on Amplitude Variability Function”, Вісник ВПІ, no. 4, pp. 136–145, Oct. 2025.

Issue

Section

Information technologies and computer sciences
