Cardiovascular diseases (CVD) are one of the leading causes of death in the world. According to data from the World Health Organization (WHO), in 2018 in Argentina 35% of all registered deaths were due to CVD. These pathologies manifest as alterations in the electrical behaviour of the heart, which can be detected and classified according to changes in the shape of the electrocardiogram (ECG) trace. From these changes, problems such as arrhythmias, ventricular blocks, fibrillation, ischemia and infarction, among others, can be identified. The engineering field has developed and adapted a large number of methods for processing and analyzing ECG signals, making great strides in the detection and early diagnosis of CVD. However, in some cases the lack of specific databases makes it difficult to optimize computational techniques, resulting in an increased number of false positives and low specificity when a computer-assisted diagnosis is made. One solution to this type of problem has been the creation of mathematical/computational models of cardiac electrical activity, which make it possible to recreate diverse and complex electrophysiological situations. This document is composed of three papers that use the ECG signal as their starting point.

The first paper addresses the problem of data loss in Heart Rate Variability (HRV) time series obtained from electrocardiographic recordings measured in laboratory rats with different cardiovascular conditions. This kind of problem has a severe impact on the calculation of the HRV indices used to stratify cardiovascular risk. As an initial step, we detected and quantified heartbeat losses and determined the number and occurrence of these artifacts in our database. With this information we built a heartbeat-loss generator model and tested several artifact-correction algorithms. Five correction methods were used to deal with this problem: Deletion, Linear Interpolation, Cubic Spline Interpolation, modified Moving Average Window and Nonlinear Predictive Interpolation. The performance of these correction methods was evaluated through the calculation of different HRV indices, which have been widely used in several investigations to detect and estimate cardiac risk. In this work, such indices were calculated from HRV signals without losses and after the respective artifact corrections. From the results, we can establish that at low levels of information loss most of the correction methods solve the problem efficiently, yielding very similar HRV indices in all the analyzed situations. However, when we model a high percentage of missing heartbeats, we observe low performance in almost all correction techniques, except for the nonlinear predictive interpolation method. For this reason, and thanks to the possibility of modeling different types of losses, we can establish that the latter algorithm is the most effective method to recover lost heartbeats in HRV series without altering or adding information to their linear and nonlinear parameters.
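As an illustration of one of the correction methods named above, the following is a minimal sketch, not the thesis implementation, of cubic spline interpolation over an RR series in which the positions of the lost beats are assumed to be already detected; the array names and the toy signal are hypothetical, and SciPy's CubicSpline is used for the interpolation.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def correct_rr_with_spline(rr_ms, missing_idx):
    """Fill missing RR intervals (ms) by cubic spline interpolation.

    rr_ms       : 1-D array of RR intervals, with np.nan at lost beats
    missing_idx : indices of the lost beats
    """
    idx = np.arange(len(rr_ms))
    valid = ~np.isin(idx, missing_idx)
    # Fit the spline only on the observed RR intervals
    spline = CubicSpline(idx[valid], rr_ms[valid])
    corrected = rr_ms.copy()
    corrected[missing_idx] = spline(idx[missing_idx])
    return corrected

# Toy example: a synthetic RR series with two simulated beat losses
rng = np.random.default_rng(0)
rr = 800 + 30 * np.sin(np.linspace(0, 6 * np.pi, 120)) + rng.normal(0, 10, 120)
lost = np.array([40, 41])            # hypothetical positions of lost beats
rr_with_loss = rr.copy()
rr_with_loss[lost] = np.nan
rr_corrected = correct_rr_with_spline(rr_with_loss, lost)
print(np.abs(rr_corrected[lost] - rr[lost]))   # interpolation error at the gaps
```

In the same spirit, HRV indices would be computed on the corrected series and compared against the loss-free reference, which is how the performance of each method is evaluated in the first paper.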
The second and third papers present two approaches developed for the modeling and synthesis of ECG signals with myocardial ischemia due to occlusion of one of the main coronary arteries. First, the modeling system known as ECGSyn was used, which is based on a system of coupled nonlinear differential equations supplied with specific parameters (width: ai, height: bi, location: ti) extracted from the P, Q, R, S and T waves of the ECG signal. In this case, with the participation of an expert cardiologist, electrocardiographic signals were selected from occlusions in the right coronary artery without the presence of other cardiovascular pathologies. All heartbeats were then segmented and submitted to a feature-extraction process based on a nonlinear fitting procedure that uses the sum of five Gaussian functions and a nonlinear gradient descent method. All the characteristic parameters were adjusted by means of normal distribution functions and thus represented the population behavior of the analyzed subjects. In this way, it was possible to generate random parameters for the ECGSyn model and produce artificial heartbeats with ischemic features. Finally, we verified that the simulated beats are highly comparable with the real signals, presenting morphological dynamics that follow the same trend as the real heartbeats, with variations of the ST-T complex in agreement with the observations from the original database.

The previous modeling approach showed some problems in the data-characterization and heartbeat-generation stages for high levels of ischemia, introducing important morphological alterations. To address this, a Gaussian mixture model (GMM) was used to learn the dynamics of the data and generate ECG signals from patients who suffered occlusion of the right and left anterior descending coronary arteries. All the information was classified according to the involved coronary artery and its corresponding ECG leads; this data was then used to train and validate each GMM. The final step consisted of generating artificial heartbeats for the different coronary arteries involved in this work. The results showed better performance in comparison with the previous model, ECGSyn, as the GMM permitted the generation of more complex morphologies and the inclusion of new occlusion sites.

We can conclude that, through the modeling of heartbeat loss, it is possible to quantitatively compare the methods used for artifact correction in RR series. In addition, it was possible to generate electrocardiographic signals with ischemic morphology. The latter allow the training and validation of automatic detection systems, as well as the optimization of algorithms for noise suppression in this kind of biomedical recording.
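As a rough illustration of the GMM-based synthesis step described above, and not the thesis code, the sketch below fits a Gaussian mixture to a set of per-beat feature vectors (for instance, the height, width and location parameters extracted from each segmented beat) and samples new parameter vectors from it; the feature matrix, the number of mixture components and the use of scikit-learn are assumptions made for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical feature matrix: one row per segmented heartbeat, columns are
# the extracted beat parameters (e.g. height, width and location of each of
# the five Gaussian components, i.e. 15 features per beat).
rng = np.random.default_rng(0)
beat_features = rng.normal(size=(500, 15))    # placeholder for real features

# One GMM would be trained per occlusion class / ECG lead; the number of
# components is a modeling choice selected by validation (e.g. via BIC).
gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
gmm.fit(beat_features)

# Sample new parameter vectors; each sample can then be rendered into an
# artificial heartbeat with the same beat model used for feature extraction.
new_params, component_idx = gmm.sample(n_samples=10)
print(new_params.shape)           # (10, 15)
print(gmm.bic(beat_features))     # one criterion for choosing n_components
```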