Academic literature on the topic 'MIT/BIH data base'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'MIT/BIH data base.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "MIT/BIH data base"

1

Singh, G. K., A. Sharma, and S. Velusami. "Automatic Detection of Diagnostic Features Using Real-Time ECG Signals: Application to Patients Prone to Cardiac Arrhythmias." International Journal of BioSciences and Technology (IJBST) 2, no. 7 (2009): 96–125. https://doi.org/10.5281/zenodo.1436599.

Full text
Abstract:
A composite method for the automatic detection of diagnostic features related to the depolarization sequence (P-QRS complex) of the heart, for arrhythmia classification, using single-lead ECG is presented. The non-syntactic approach based upon slope and amplitude thresholds along with a set of empirical criteria is employed for segmenting QRS complexes from a variety of noisy ECG recordings acquired from the MIT/BIH arrhythmia database. The background noise is removed from the non-QRS portions using an appropriate filtering method that causes no change in the amplitudes or boundaries of P and T waves. In R-R intervals, the isoelectric line is determined by developing a technique based upon the method of least squares approximation. Amplitude threshold bands are set up above and below the isoelectric line to detect P and/or T wave peaks. Using a group of decision logic rules, framed on the basis of an exhaustive study of normal and arrhythmic ECG signals and detailed consultations with two independent cardiologists, the P waves are discriminated from the T waves. In total, 37 useful diagnostic features have been deduced pertaining to the QRS and P waves in the time domain. The QRS detection algorithm of the composite method was validated using 32,800 beats of several records of the MIT/BIH database, for which a detection accuracy of 99.96% was achieved within the tolerance limits recommended by the CSE Working Party. The composite method for detection of both P and QRS wave features has been validated using all the 25 records for lead II of the CSE Dataset-3, before being applied to approximately 8000 beats of the MIT/BIH database. As regards noisy signals, only those were analysed that had a baseline wander not exceeding 0.25 Hz. Keywords: Arrhythmia, ECG signal, P-QRS complex, MIT/BIH data base, atrio-ventricular conduction ratio, diagnostic feature. http://www.ijbst.org/Home/papers-published/ijbst-2009-volume-2-issue-7
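As a rough illustration of the two ideas named above (slope/amplitude thresholding for QRS segmentation and a least-squares fit for the isoelectric level), the Python sketch below may help; the thresholds, refractory period, and all function names are our own assumptions, not the authors' published algorithm.

```python
import numpy as np

def detect_qrs(ecg, fs, slope_factor=0.4, amp_factor=0.5, refractory_s=0.25):
    """Candidate R-peak indices from simple slope and amplitude thresholds."""
    baseline = np.median(ecg)
    slope = np.abs(np.diff(ecg))
    slope_thr = slope_factor * slope.max()
    amp_thr = amp_factor * np.abs(ecg - baseline).max()
    refractory = int(refractory_s * fs)        # suppress double detections
    peaks, last = [], -refractory
    for i in range(1, len(ecg)):
        if (slope[i - 1] > slope_thr
                and abs(ecg[i] - baseline) > amp_thr
                and i - last > refractory):
            peaks.append(i)
            last = i
    return np.asarray(peaks)

def isoelectric_level(rr_segment):
    """Least-squares straight-line fit over an R-R segment; the line value at the
    segment midpoint approximates the isoelectric level used for P/T amplitude bands."""
    t = np.arange(len(rr_segment))
    slope, intercept = np.polyfit(t, rr_segment, 1)
    return slope * (len(rr_segment) / 2) + intercept
```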
2

Ziti Fariha Mohd Apandi, Ryojun Ikeura, Soichiro Hayakawa, and Shigeyoshi Tsutsumi. "QRS Detection Based on Discrete Wavelet Transform for ECG Signal with Motion Artifacts." Journal of Advanced Research in Applied Sciences and Engineering Technology 40, no. 1 (2024): 118–28. http://dx.doi.org/10.37934/araset.40.1.118128.

Full text
Abstract:
Motion artifacts in ECG signals recorded during physical exercise can affect the diagnosis of arrhythmia. To minimize faults in arrhythmia detection, it is important to choose an accurate algorithm for detecting QRS complexes in ECG signals contaminated by noise produced during the patients' physical movements. Therefore, a QRS detection algorithm that copes well with noise and motion artifacts is needed for arrhythmia detection analysis. A QRS detection method based on the Discrete Wavelet Transform was implemented and presented in this paper. The performance of the algorithm was assessed using the MIT-BIH Arrhythmia Database and the MIT-BIH Noise Stress Test Database. For the MIT-BIH Arrhythmia Database, the average Sensitivity (Se) and positive Predictivity (+P) of the algorithm were 98.24% and 98.61%, respectively. The algorithm had a lower average false negative rate (FNR) than the Pan-Tompkins algorithm when applied to the MIT-BIH Noise Stress Test Database: 0.033% for record 118 and 0.032% for record 119. The results demonstrate that the algorithm performs well when dealing with arrhythmia data and motion artifacts at various signal-to-noise ratios.
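For readers who want to reproduce the reported figures, Sensitivity and positive Predictivity are simple beat-level ratios; the sketch below shows one way to compute them, together with a toy wavelet decomposition using PyWavelets. The wavelet family, decomposition level, and matching tolerance are illustrative assumptions, not the authors' settings.

```python
import numpy as np
import pywt

def se_plus_p(detected, reference, fs, tol_s=0.15):
    """Sensitivity Se = TP/(TP+FN) and positive predictivity +P = TP/(TP+FP),
    matching each annotated beat to a detection within +/- tol_s seconds.
    Simplified matcher, for illustration only."""
    detected = np.asarray(detected)
    tol = int(tol_s * fs)
    tp = sum(np.any(np.abs(detected - r) <= tol) for r in reference)
    fn = len(reference) - tp
    fp = max(len(detected) - tp, 0)
    return tp / (tp + fn), tp / (tp + fp)

# Mid-scale detail coefficients concentrate QRS energy while attenuating baseline wander.
signal = np.random.randn(3600)                  # stand-in for one 10 s ECG strip
coeffs = pywt.wavedec(signal, 'db4', level=4)   # [cA4, cD4, cD3, cD2, cD1]
qrs_band = coeffs[1]                            # e.g. cD4; peak picking would follow
```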
3

Yan, Wei, and Zhen Zhang. "Online Automatic Diagnosis System of Cardiac Arrhythmias Based on MIT-BIH ECG Database." Journal of Healthcare Engineering 2021 (December 16, 2021): 1–9. http://dx.doi.org/10.1155/2021/1819112.

Full text
Abstract:
Arrhythmias are a relatively common type of cardiovascular disease, and most cardiovascular diseases are accompanied by arrhythmias. In clinical practice, the electrocardiogram (ECG) can be used as a primary diagnostic tool for cardiac activity and is commonly used to detect arrhythmias. Given the hidden and sudden nature of the signals in the MIT-BIH ECG database and their small amplitudes, this paper constructs a hybrid model for the temporal correlation characteristics of the MIT-BIH ECG data in order to learn the deep essential features of the target data, combine the characteristics of the information processing mechanism of an online automatic arrhythmia diagnosis system, and automatically extract the spatial and temporal features of the diagnostic data. First, a combination of a median filter and a band-stop filter is used to preprocess the ECG data, which show individual differences in waveform morphology; without such preprocessing, feature inaccuracy and the omission of useful features prevent effective extraction of the information hidden in the massive volume of ECG signals. The diagnostic algorithm integrates feature extraction and classification into one step, which avoids some bias in the feature extraction process and provides a new idea for the automatic diagnosis of cardiovascular diseases. To address the variability of feature importance in the temporal data of the MIT-BIH ECG database, a hybrid model is constructed by introducing deep neural network algorithms, which can enhance diagnostic efficiency.
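The preprocessing step mentioned above (a median filter combined with a band-stop filter) can be sketched as follows; the window lengths, the 50 Hz notch frequency, and the 360 Hz sampling rate typical of MIT-BIH records are illustrative assumptions rather than the paper's exact configuration.

```python
import numpy as np
from scipy.signal import medfilt, iirnotch, filtfilt

def preprocess_ecg(ecg, fs=360):
    """Remove baseline wander with cascaded median filters, then notch out powerline noise."""
    ecg = np.asarray(ecg, dtype=float)
    w1 = int(0.2 * fs) | 1                      # ~200 ms window, forced odd
    w2 = int(0.6 * fs) | 1                      # ~600 ms window, forced odd
    baseline = medfilt(medfilt(ecg, w1), w2)    # two-stage baseline estimate
    detrended = ecg - baseline
    b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)     # narrow band-stop at an assumed 50 Hz
    return filtfilt(b, a, detrended)
```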
4

Yang, Guangying. "Electrocardiogram Arrhythmia Pattern Recognition Based on an Improved Wavelet Neural Network." Journal of Mechanics in Medicine and Biology 13, no. 01 (2013): 1350018. http://dx.doi.org/10.1142/s0219519413500188.

Full text
Abstract:
Electrocardiography (ECG) is a transthoracic interpretation of the electrical activity of the heart over a period of time, as detected by electrodes attached to the outer surface of the skin and recorded by a device external to the body. ECG signal classification is very important for the clinical detection of arrhythmia. This paper presents an application of an improved wavelet neural network structure to the classification of ECG beats, chosen for its high precision and fast learning rate. The feature extraction method used in this paper is the wavelet transform. The experimental data are taken from the MIT-BIH arrhythmia database, on which the QRS wave detection rate is 95%. The proposed method is applied to a large number of ECG signals consisting of 600 training samples and 120 test samples from the MIT-BIH database. The samples equally represent six ECG signal types: normal beat, atrial premature beat, ventricular premature beat, left bundle branch block, right bundle branch block, and paced beat. In comparison with pattern recognition based on BP neural networks, RBF neural networks, and Support Vector Machines (SVM), the results of this experiment show that the wavelet neural network method has a better recognition rate when classifying electrocardiogram signals. The experimental results demonstrate that the proposed method is effective for arrhythmia pattern recognition.
5

Auliya, Ghina, and Jannes Effendi. "Detection of Atrial Fibrillation Based on Long Short-Term Memory." Computer Engineering and Applications Journal 10, no. 1 (2021): 21–31. http://dx.doi.org/10.18495/comengapp.v10i1.361.

Full text
Abstract:
Atrial fibrillation is a quivering or irregular heartbeat (arrhythmia) that can lead to blood clots, stroke, heart failure, and even sudden cardiac death. This study used several public electrocardiogram (ECG) datasets, including MIT-BIH Atrial Fibrillation, the China Physiological Signal Challenge 2018, MIT-BIH Normal Sinus Rhythm based on the QT Database, and the Fantasia Database. All datasets were divided into three cases with experiment window sizes of 10, 5, and 2 seconds for two classes, namely Normal and Atrial Fibrillation. The recurrent neural network method is appropriate for processing sequential data such as ECG signals, and k-fold cross-validation helps evaluate models effectively to achieve high performance. Overall, the LSTM achieved an accuracy, sensitivity, specificity, precision, and F1-score of 94.56%, 94.67%, 94.67%, 94.43%, and 94.51%, respectively.
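A minimal sketch of the windowing-plus-LSTM setup described above is given below; the sampling rate, layer sizes, and training settings are assumptions for illustration, not the study's actual configuration.

```python
import numpy as np
import tensorflow as tf

FS, WINDOW_S = 250, 10                       # assumed sampling rate and the 10 s case
WIN = FS * WINDOW_S

def make_windows(signal):
    """Cut a 1-D ECG signal into fixed-length, non-overlapping windows."""
    n = len(signal) // WIN
    return signal[:n * WIN].reshape(n, WIN, 1).astype("float32")

# Two-class (Normal vs. atrial fibrillation) LSTM classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WIN, 1)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=..., batch_size=...)
```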
6

Wang, Ludi, Xiaoguang Zhou, Ying Xing, and Siqi Liang. "A Fast and Simple Adaptive Bionic Wavelet Transform: ECG Baseline Shift Correction." Cybernetics and Information Technologies 16, no. 6 (2016): 60–68. http://dx.doi.org/10.1515/cait-2016-0077.

Full text
Abstract:
An ECG baseline shift correction method based on the adaptive bionic wavelet transform is presented. After modifying the bionic wavelet transform according to the characteristics of the ECG signal, we propose a novel adaptive BWT algorithm. Using contaminated and actual data from the MIT-BIH database, the fast and simple adaptive bionic wavelet transform effectively corrects baseline shift while maintaining the geometric characteristics of the ECG signal. Evaluation of the proposed method shows that the average SNR improvement of FABWT is 2.187 dB higher than the best CWT-based result.
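An "SNR improvement" figure of the kind quoted above is usually the SNR of the corrected signal minus that of the contaminated one, both measured in dB against a clean reference; a generic helper is sketched below and may differ from the paper's exact evaluation protocol.

```python
import numpy as np

def snr_db(clean, test):
    """Signal-to-noise ratio of `test` relative to the clean reference, in dB."""
    clean, test = np.asarray(clean, float), np.asarray(test, float)
    noise = clean - test
    return 10.0 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

def snr_improvement(clean, contaminated, corrected):
    """Positive values mean the correction step moved the signal closer to the reference."""
    return snr_db(clean, corrected) - snr_db(clean, contaminated)
```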
7

João Vitor Mendes Pinto dos Santos and Thamiles Rodrigues de Melo. "Machine Learning-Based Cardiac Arrhythmia Detection in Electrocardiogram Signals." Journal of Bioengineering, Technologies and Health 7, no. 2 (2024): 113–16. http://dx.doi.org/10.34178/jbth.v7i2.378.

Full text
Abstract:
The cardiovascular system is vital for human physiology, regulating blood circulation. Cardiovascular Diseases (CVDs), including cardiac arrhythmias, can disrupt the heartbeat rhythm, impacting blood circulation. Black-box computational modeling of this system can facilitate the development of novel methods and devices to assist in diagnosing and treating CVDs. Artificial Neural Networks (ANNs) represent an effective black-box approach. Implementation involves selecting a database, separating training and test sets, and defining the model structure. The MIT-BIH database is commonly utilized to train computational models to detect cardiac arrhythmias. However, preliminary results with the ANN model trained using MIT-BIH data failed to meet the expected objectives, presenting numerous challenges. Nonetheless, given its nascent stage, there remains potential for optimizations, rendering it a prospective tool for diagnosing cardiac arrhythmias.
8

Mathunjwa, Bhekumuzi M., Yin-Tsong Lin, Chien-Hung Lin, Maysam F. Abbod, Muammar Sadrawi, and Jiann-Shing Shieh. "ECG Recurrence Plot-Based Arrhythmia Classification Using Two-Dimensional Deep Residual CNN Features." Sensors 22, no. 4 (2022): 1660. http://dx.doi.org/10.3390/s22041660.

Full text
Abstract:
In this paper, an effective electrocardiogram (ECG) recurrence plot (RP)-based arrhythmia classification algorithm that can be implemented in portable devices is presented. Public databases from PhysioNet were used to conduct this study including the MIT-BIH Atrial Fibrillation Database, the MIT-BIH Arrhythmia Database, the MIT-BIH Malignant Ventricular Ectopy Database, and the Creighton University Ventricular Tachyarrhythmia Database. ECG time series were segmented and converted using an RP, and two-dimensional images were used as inputs to the CNN classifiers. In this study, two-stage classification is proposed to improve the accuracy. The ResNet-18 architecture was applied to detect ventricular fibrillation (VF) and noise during the first stage, whereas normal, atrial fibrillation, premature atrial contraction, and premature ventricular contractions were detected using ResNet-50 in the second stage. The method was evaluated using 5-fold cross-validation which improved the results when compared to previous studies, achieving first and second stage average accuracies of 97.21% and 98.36%, sensitivities of 96.49% and 97.92%, positive predictive values of 95.54% and 98.20%, and F1-scores of 95.96% and 98.05%, respectively. Furthermore, a 5-fold improvement in the memory requirement was achieved when compared with a previous study, making this classifier feasible for use in resource-constricted environments such as portable devices. Even though the method is successful, first stage training requires combining four different arrhythmia types into one label (other), which generates more data for the other category than for VF and noise, thus creating a data imbalance that affects the first stage performance.
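The key representational step in the study above is turning an ECG segment into a recurrence plot image; a hedged sketch of that conversion is shown below, with the embedding dimension, delay, and threshold chosen arbitrarily rather than taken from the paper.

```python
import numpy as np

def recurrence_plot(segment, dim=3, delay=2, eps=0.1):
    """Binary recurrence plot of a 1-D segment via time-delay embedding."""
    segment = np.asarray(segment, dtype=float)
    n = len(segment) - (dim - 1) * delay
    # Each row of `emb` is one point of the reconstructed phase-space trajectory.
    emb = np.stack([segment[i:i + n] for i in range(0, dim * delay, delay)], axis=1)
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dist <= eps * dist.max()).astype(np.uint8)   # n x n image fed to the CNN
```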
9

Rajeshwari, M. R., and K. S. Kavitha. "Enhanced tolerance-based intuitionistic fuzzy rough set theory feature selection and ResNet-18 feature extraction model for arrhythmia classification." Multiagent and Grid Systems 18, no. 3-4 (2023): 241–61. http://dx.doi.org/10.3233/mgs-220317.

Full text
Abstract:
Arrhythmia classification on electrocardiogram (ECG) signals is an important process in the diagnosis of cardiac and arrhythmia disease. Existing research on arrhythmia classification is limited by the data imbalance problem and by overfitting in classification. This research applies Fuzzy C-Means (FCM) – Enhanced Tolerance-based Intuitionistic Fuzzy Rough Set Theory (ETIFRST) for feature selection in arrhythmia classification. The features selected by FCM-ETIFRST were applied to a Multi-class Support Vector Machine (MSVM) for arrhythmia classification. A ResNet-18 Convolutional Neural Network (CNN) was applied for feature extraction from the input signal to overcome the data imbalance problem. Conventional features along with CNN features are applied to the FCM-ETIFRST feature selection process. The FCM-ETIFRST method for arrhythmia classification is evaluated on the MIT-BIH and CPCS 2018 datasets. FCM-ETIFRST achieves 98.95% accuracy and Focal loss-CNN achieves 98.66% accuracy on the MIT-BIH dataset; the FCM-ETIFRST method achieves 98.45% accuracy and the Explainable Deep learning Model (XDM) method achieves 93.6% accuracy on the CPCS 2018 dataset.
10

Wang, Di, Yujuan Si, Weiyi Yang, Gong Zhang, and Jia Li. "A Novel Electrocardiogram Biometric Identification Method Based on Temporal-Frequency Autoencoding." Electronics 8, no. 6 (2019): 667. http://dx.doi.org/10.3390/electronics8060667.

Full text
Abstract:
For good performance, most existing electrocardiogram (ECG) identification methods still need to adopt a denoising process to remove noise interference beforehand. This specific signal preprocessing technique requires great effort in algorithm engineering and is usually complicated and time-consuming. To more conveniently remove the influence of noise interference and realize accurate identification, a novel temporal-frequency autoencoding based method is proposed. In particular, the raw data is firstly transformed into the wavelet domain, where a multi-level time-frequency representation is achieved. Then, a prior knowledge-based feature selection is proposed and applied to the transformed data to discard noise components and retain identity-related information simultaneously. Afterward, the stacked sparse autoencoder is introduced to learn intrinsic discriminative features from the selected data, and a Softmax classifier is used to perform the identification task. The effectiveness of the proposed method is evaluated on two public databases, namely, the ECG-ID and the Massachusetts Institute of Technology–Beth Israel Hospital arrhythmia (MIT-BIH-AHA) databases. Experimental results show that our method can achieve high multiple-heartbeat identification accuracies of 98.87%, 92.3%, and 96.82% on raw ECG signals from the ECG-ID (Two-recording), ECG-ID (All-recording), and MIT-BIH-AHA databases, respectively, indicating that our method can provide an efficient way for ECG biometric identification.
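The "stacked sparse autoencoder plus Softmax" pipeline mentioned above can be approximated with an L1 activity penalty on the code layer, as in the sketch below; the dimensions, penalty weight, and omission of the wavelet-domain feature selection step are our simplifications, not the authors' design.

```python
import tensorflow as tf

def sparse_autoencoder(input_dim, code_dim, l1=1e-4):
    """Single sparse autoencoder layer; stacking repeats this on the learned codes."""
    inp = tf.keras.Input(shape=(input_dim,))
    code = tf.keras.layers.Dense(
        code_dim, activation="relu",
        activity_regularizer=tf.keras.regularizers.l1(l1))(inp)
    recon = tf.keras.layers.Dense(input_dim, activation="linear")(code)
    return tf.keras.Model(inp, recon), tf.keras.Model(inp, code)

autoencoder, encoder = sparse_autoencoder(input_dim=256, code_dim=64)
autoencoder.compile(optimizer="adam", loss="mse")
# After unsupervised pre-training, the encoder's codes feed a Dense softmax layer
# that performs the per-subject identification task.
```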

Dissertations / Theses on the topic "MIT/BIH data base"

1

Petrova, Mila. "(Mis)trusting health research synthesis studies : exploring transformations of 'evidence'." Thesis, University of Exeter, 2014. http://hdl.handle.net/10871/14426.

Full text
Abstract:
This thesis explores the transformations of evidence in health research synthesis studies – studies that bring together evidence from a number of research reports on the same/ similar topic. It argues that health research synthesis is a broad and intriguing field in a state of pre-formation, in spite of the fact that it may appear well established if equated with its exemplar method – the systematic review inclusive of meta-analysis. Transformations of evidence are processes by which pieces of evidence are modified from what they are in the primary study report into what is needed in the synthesis study while, supposedly, having their integrity fully preserved. Such processes have received no focused attention in the literature. Yet they are key to the validity and reliability of synthesis studies. This work begins to describe them and explore their frequency, scope and drivers. A ‘meta-scientific’ perspective is taken, where ‘meta-scientific’ is understood to include primarily ideas from the philosophy of science and methodological texts in health research, and, to a lesser extent, social studies of science and psychology of science thinking. A range of meta-scientific ideas on evidence and factors that shape it guide the analysis of processes of “data extraction” and “coding” during which much evidence is transformed. The core of the analysis involves the application of an extensive Analysis Framework to 17 highly heterogeneous research papers on cancer. Five non-standard ‘injunctions’ complement the Analysis Framework – for comprehensiveness, extensive multiple coding, extreme transparency, combination of critical appraisal and critique, and for first coding as close as possible to the original and then extending towards larger transformations. Findings suggest even lower credibility of the current overall model of health research synthesis than initially expected. Implications are discussed and a radical vision for the future proposed.
2

Pesce, Elena. "Model-based Design of Experiments for Large Dataset." Doctoral thesis, Università degli studi di Genova, 2021. http://hdl.handle.net/11567/1057610.

Full text
Abstract:
The first part of this thesis presents the motivation for adapting ideas and methods from the theory of model-based optimal design of experiments to the context of Big Data while guarding against different sources of bias. In particular, the key focus is on guarding against bias from confounders and on how to use the theory of the design of experiments and randomization to remove bias, depending on the constraints in the design. Starting with A/B experiments, widely used by major tech companies in online marketing, the theory of circuits is introduced and an algebraic method that gives a wide choice of randomization schemes is presented. Furthermore, a robust exchange algorithm to deal with the problem of outliers in a big dataset is proposed. The second part is based on a marine insurance use case sponsored by Swiss Re Corporate Solutions, the commercial insurance division of the Swiss Re Group. Several temporal disaggregation methods for dealing with time series collected at different frequencies are reviewed and applied to real data in order to obtain a curated dataset for predicting future losses.
3

Corbetta, Alessandro. "Multiscale Crowd Dynamics: Physical Analysis, Modeling and Applications." Doctoral thesis, Politecnico di Torino, 2016. http://hdl.handle.net/11583/2659720.

Full text
Abstract:
In this thesis we investigate the dynamics of pedestrian crowds in a fundamental and applied perspective. Envisioning a quantitative understanding we employ ad hoc large-scale experimental measurements as well as analytic and numerical models. Moreover, we analyze current regulations in matter of pedestrians' structural actions (structural loads), in view of the need of guaranteeing pedestrian safety in serviceable built environments. This work comes in three complementary parts, in which we adopt distinct perspectives and conceptually different tools, respectively from statistical physics, mathematical modeling and structural engineering. Chapter 1 introduces these perspectives and gives an outline of the thesis. The statistical dynamics of individual pedestrians is the subject of Part I. Although individual trajectories may appear random, once we analyze them in large ensembles we expect "preferred" behaviors to emerge. Thus, we envisage individual paths as fluctuations around such established routes. To investigate this aspect, we perform year-long 24/7 measurements of pedestrian trajectories in real-life conditions, which we analyze statistically and via Langevin-like models. Two measurement locations have been considered: a corridor-shaped landing in the Metaforum building at Eindhoven University of Technology and the main walkway within Eindhoven Train Station. The measurement technique we employ, based on overhead Microsoft Kinect 3D-range sensors and on ad hoc tracking algorithms, is introduced in Chapter 2. In Chapter 3 we describe the low density pedestrian flows in the Metaforum landing. In this location hundreds of thousands of high-resolution trajectories have been collected. First, we discuss standard crowd-traffic descriptions based on average quantities such as fundamental diagrams. Then, thanks to our large dataset, we address the dynamics beyond average values via probability distributions of pedestrian positions and velocities. Chapter 4 focuses on the dynamics of pedestrians crossing the landing alone, i.e. undisturbed by peers. The simple crossing dynamics is affected by stochastic fluctuations due to the variability of individuals' behavior as well as external factors. In the chapter we propose a quantitative Langevin-like model for these stochastic fluctuations, that we compare with the experimental data in terms of stationary velocity distributions and time correlation functions. The avoidance regime which takes place when two pedestrians walk simultaneously in the landing and in opposite directions is addressed in Chapter 5. In this regime, the statistical features of pedestrian motion change from the undisturbed case (Chapter 4). Here, we study the avoidance dynamics as a linear superposition of the undisturbed motion and an interaction force. First, we estimate average interaction force fields from the data. Then, we extend the Langevin model of Chapter 4 to reproduce statistics of the pair-wise interactions. Finally, in Chapter 6, we discuss in brief the measurements collected at Eindhoven Train Station in view of future dense crowd analyses. In Part II we zoom out from the perspective of individual pedestrians and we look at crowds, adopting a genuine mathematical modeling point of view. In this context a microscopic, i.e. particle-like, or a macroscopic, i.e. fluid-like, observation scale can be employed.
In Chapter 7, we establish a general background of crowd dynamics modeling, which includes an introduction of the modeling framework by Cristiani, Piccoli and Tosin (CPT framework), in use in Chapters 8, 9, 11 and 12. This framework is suitable to model systems governed by social interactions and stands on a first order measure-valued evolution equation. The use of measures is crucial in the following, as it enables a unified treatment of crowd flows at the microscopic and macroscopic scales. Chapter 8 comprises a comparison of microscopic and macroscopic dynamics given via the CPT framework. In a Wasserstein space context, we wonder when these two dynamics are consistent as the number of agents involved grows. In this comparison we consider agents whose mass (in a measure sense) is independent of the size of the crowd. In Chapter 9 we focus on the modeling of crowds moving across elongated geometries resembling footbridges. We address pedestrians' motion in a macroscopic perspective via the CPT framework. According to the framework, dynamics are prescribed as a linear superposition of two components: a desired velocity (that encodes the motion of pedestrians walking alone) and a social velocity (that weights the crowd mass via an interaction kernel to assess individual reactions to mutual presence). Footbridge-like geometries are simple scenarios in which, from phenomenological considerations, we are able to give these components a reasonable form and thus perform simulations. In Part III we consider crowd flows on footbridges in relation to the way the safety of pedestrians is ensured by the current building practice and in terms of crowd-structure dynamics interaction. Chapter 10 addresses crowd-footbridge systems in terms of featured uncertainties. We provide a categorized review of the literature giving a synthetic comparison of the uncertainties involved. In general, beside the uncertainties affecting the mechanical properties of the structure, the status of the crowd is itself uncertain. Taking inspiration from wind engineering, we approach the crowd dynamics through a separation of the approaching and the crossing traffic. Within the review, we consider how building regulations address the crowd load. On one hand, no uncertainty, nor variability, is considered on the crowd state, therefore the roughest possible model (constant load) is typically retained. On the other hand, we notice how a large dissent is present in the prescribed load values, suggesting a possible inadequacy in regulations. Chapter 11 rises from the point made in Chapter 10. We propose a framework to deal with uncertainties related to the crowd traffic on footbridges. The framework addresses the pedestrian density, a major player in the determination of live loads. Following the previous categorization, the framework is a composition of different modeling blocks and it considers approaching and crossing traffic at different scales, respectively macroscopic and microscopic. The output is a probabilistic description of the spatial density of the crossing crowd. In Chapter 12 we consider the dynamics of the human-structure system as a whole, targeting the vertical vibrations of slender footbridges excited by crowds of walking pedestrians. We combine the microscopic counterpart of the CPT modeling framework for the pedestrian dynamics with a simple single degree of freedom structural model to provide a modeling framework for the crowd-structure interaction.
We realize the coupling by modeling the dynamical forces exchanged between the structure and each pedestrian via per-pedestrian single degree of freedom vertical oscillators. We study how the active presence of pedestrians influences the structure dynamics in terms of vertical accelerations and varied effective damping. A discussion chapter, addressing the content of the parts independently and then commenting on them as a whole, closes the thesis.
4

Hübner, Sarah. "Zur Entwicklung des tierärztlichen Berufsstandes in Deutschland seit dem Jahr 2000 - eine empirische Verbleibstudie mit Geschlechtervergleich." Doctoral thesis, Universitätsbibliothek Leipzig, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-225606.

Full text
Abstract:
At present there is no quantitative overall picture and no nationwide comparison of the numbers of first-year students, graduates who have passed the veterinary examination (Tierärztliche Prüfung, TP), veterinarians holding a licence to practise (Approbation), and chamber memberships. This study examines, for the graduating cohorts 2000 to 2010, the relationship between the number of TPs awarded by the veterinary schools, the number of licences granted in Germany, and the existing compulsory memberships in the state veterinary chambers. Data from the Stiftung für Hochschulzulassung, the five veterinary schools, the Deutsches Tierärzteblatt, the licensing authorities, and the Zentrale Tierärztedatei Dresden were used and then supplemented by searches in public media. In total, n = 8036 persons were included in the study, of whom n = 6715 (84%) could be evaluated; the proportion of women consistently averaged 82%. The vast majority (92%) of the evaluable persons received their licence within the first three months after passing the TP, and 84% let no more than three months pass between receiving the licence and joining a chamber. 75% of graduates remain in, or return to, the federal state in which they trained, so a veterinary school or faculty has a retention effect on skilled professionals for its state. Among the main fields of activity, the trend continues to point towards practitioners (52%); persons not practising and doctoral candidates make up the second-largest share (17%). The classification of veterinary doctoral candidates as professionally 'active' or 'not active' is open to debate, as it still lies in a legal grey area in Germany. The registration system for veterinarians in Germany, from application for the licence through to chamber membership, works quite well, with fewer than 3% unregistered chamber memberships and fewer than 1% of licences never applied for. This appears to be due primarily to the strong sense of duty of German veterinarians. Gaps in the cooperation between licensing authorities and state veterinary chambers, or sources of error in data transmission, have not become apparent so far, and legal action over omissions by individual veterinarians plays a minor role, if any, in chamber administration, since legal offences are in fact exceptions. Nevertheless, the data basis and the data flow between the institutions involved should be standardised, verified, and evaluated regularly, because without control and sanction measures the legally binding compulsory membership is de facto a mere self-commitment. A uniform statement on the status of doctoral candidates by the professional policy bodies is urgently needed. Doctoral candidates should count among those professionally 'active' as veterinarians, and proof of the licence should be mandatory for all steps of work associated with the doctorate. Given that a nationwide regulation on this point is still lacking, the question of whether one can practise veterinary medicine in Germany without a licence and without difficulty must clearly be answered 'yes'.
5

Bode, Thomas. "Ein Datenbanksystem (P.A.S.T) zur Verarbeitung und Interpretation von palynologischen Daten aus dem Paläogen Mitteleuropas mit Diversitätsbetrachtungen." Doctoral thesis, 2001. http://hdl.handle.net/11858/00-1735-0000-0006-B25B-E.

Full text
6

"Data disaggregation with ecological inference: implementation of models based in the truncated normal and on the binomial-beta via em algorithm." Tese, MAXWELL, 2000. http://www.maxwell.lambda.ele.puc-rio.br/cgi-bin/db2www/PRG_0991.D2W/SHOW?Cont=1347:pt&Mat=&Sys=&Nr=&Fun=&CdLinPrg=pt.

Full text
7

Hübner, Sarah. "Zur Entwicklung des tierärztlichen Berufsstandes in Deutschland seit dem Jahr 2000 - eine empirische Verbleibstudie mit Geschlechtervergleich." Doctoral thesis, 2016. https://ul.qucosa.de/id/qucosa%3A15682.

Full text
Abstract:
At present there is no quantitative overall picture and no nationwide comparison of the numbers of first-year students, graduates who have passed the veterinary examination (Tierärztliche Prüfung, TP), veterinarians holding a licence to practise (Approbation), and chamber memberships. This study examines, for the graduating cohorts 2000 to 2010, the relationship between the number of TPs awarded by the veterinary schools, the number of licences granted in Germany, and the existing compulsory memberships in the state veterinary chambers. Data from the Stiftung für Hochschulzulassung, the five veterinary schools, the Deutsches Tierärzteblatt, the licensing authorities, and the Zentrale Tierärztedatei Dresden were used and then supplemented by searches in public media. In total, n = 8036 persons were included in the study, of whom n = 6715 (84%) could be evaluated; the proportion of women consistently averaged 82%. The vast majority (92%) of the evaluable persons received their licence within the first three months after passing the TP, and 84% let no more than three months pass between receiving the licence and joining a chamber. 75% of graduates remain in, or return to, the federal state in which they trained, so a veterinary school or faculty has a retention effect on skilled professionals for its state. Among the main fields of activity, the trend continues to point towards practitioners (52%); persons not practising and doctoral candidates make up the second-largest share (17%). The classification of veterinary doctoral candidates as professionally 'active' or 'not active' is open to debate, as it still lies in a legal grey area in Germany. The registration system for veterinarians in Germany, from application for the licence through to chamber membership, works quite well, with fewer than 3% unregistered chamber memberships and fewer than 1% of licences never applied for. This appears to be due primarily to the strong sense of duty of German veterinarians. Gaps in the cooperation between licensing authorities and state veterinary chambers, or sources of error in data transmission, have not become apparent so far, and legal action over omissions by individual veterinarians plays a minor role, if any, in chamber administration, since legal offences are in fact exceptions. Nevertheless, the data basis and the data flow between the institutions involved should be standardised, verified, and evaluated regularly, because without control and sanction measures the legally binding compulsory membership is de facto a mere self-commitment. A uniform statement on the status of doctoral candidates by the professional policy bodies is urgently needed. Doctoral candidates should count among those professionally 'active' as veterinarians, and proof of the licence should be mandatory for all steps of work associated with the doctorate. Given that a nationwide regulation on this point is still lacking, the question of whether one can practise veterinary medicine in Germany without a licence and without difficulty must clearly be answered 'yes'.

Books on the topic "MIT/BIH data base"

1

Varlamov, Oleg. Fundamentals of creating MIVAR expert systems. INFRA-M Academic Publishing LLC., 2021. http://dx.doi.org/10.12737/1513119.

Full text
Abstract:
Methodological and applied issues of the basics of creating knowledge bases and expert systems of logical artificial intelligence are considered. The software package "MIV Expert Systems Designer" (KESMI) "Wi!Mi RAZUMATOR" (version 2.1), a convenient tool for the development of intelligent information systems, is described. Examples of creating mivar expert systems and several laboratory exercises are given. The reader, having studied this tutorial, will be able to independently create expert systems based on KESMI. The textbook in the field of training "Computer Science and Computer Engineering" is intended for students, bachelors, undergraduates, and postgraduates studying artificial intelligence methods used in information processing and management systems, as well as for users and specialists who create mivar knowledge models, expert systems, automated control systems and decision support systems. Keywords: cybernetics, artificial intelligence, mivar, mivar networks, databases, data models, expert system, intelligent systems, multidimensional open epistemological active network, MOGAN, MIPRA, KESMI, Wi!Mi, Razumator, knowledge bases, knowledge graphs, knowledge networks, Big knowledge, products, logical inference, decision support systems, decision-making systems, autonomous robots, recommendation systems, universal knowledge tools, expert system designers, logical artificial intelligence.
2

Chu-ho, Pak. Han'guk kyoyuk chŏngch'aek nonp'yŏng: Pik teit'ŏ kiban chŏngkwŏnbyŏl kyoyuk chŏngch'aek ŭi chuje punsŏk, chaengchŏm mit taean nonŭi = Review of educational policies in South Korea : discussion of alternatives, issues, and topics by analyzing government policies based on big data. Pagyŏng Story, 2022.

Find full text
3

Garbsch, Christian. Data Warehouse Factory: BI-Automation durch Data Vault mit SSIS und SAS Base. Diplomica Verlag, 2018.

Find full text
4

Goorhuis-Brouwer, Sieneke, and Bas Levering. Dolgedraaid. Mogen peuters nog peuteren en kleuters nog kleuteren? Uitgeverij SWP, 2006. http://dx.doi.org/10.36254/978-90-6665-702-1.

Full text
Abstract:
The spontaneous development of toddlers and preschoolers is under great pressure. Not only educationalists and teachers are concerned about current developments in early childhood education, where pupil tracking systems and testing and compensation programmes are gaining the upper hand; medical specialists are worried as well. More than before, they see their outpatient clinics fill up with 'healthy' children who are nevertheless presumed to have delays in motor, social-emotional, or language development. This makes it painfully clear that the variations belonging to a normal developmental process are no longer always recognised. This volume of discussion contributions by experts from education, pedagogy, developmental psychology, and the medical sciences calls attention to the distinct position and identity of toddlers and preschoolers in education. May toddlers still be allowed to be toddlers and preschoolers still be preschoolers? Remedial educationalist Sieneke Goorhuis-Brouwer is professor of speech and language disorders in children at the University of Groningen. Bas Levering is lector in General Pedagogy at Fontys Hogescholen and editor-in-chief of Pedagogiek in Praktijk Magazine.
5

Pijnenburg, Huub, Jo Hermanns, Tom van Yperen, Giel Hutschemaekers, and Adri van Montfoort. Zorgen dat het werkt: Werkzame factoren in de zorg voor jeugd. 2nd ed. Uitgeverij SWP, 2011. http://dx.doi.org/10.36254/978-90-8850-131-9.

Full text
Abstract:
Evidence-based practice in youth care? Fine! But what do we do with questions such as: in whose hands do interventions work, and what characterises effective professionals? What is the influence of the working alliance between professionals and clients? Why do interventions work, and under which conditions? How can we make use of supportive factors in the living environment of young people and their caregivers? What does all this mean for the way we should organise help and train professionals? Five contributions make this book valuable for youth care professionals and students. Five authors at home in both the field and the research community shed light on: the interplay of effective factors, with emphasis on the characteristics of effective professionals and the importance of the client-practitioner alliance (Huub Pijnenburg); innovative views on the organisation of contextual youth care and binding cooperation in complex cases (Jo Hermanns); possibilities for improving effectiveness, including attention to the implementation of effective interventions (Tom van Yperen); recent developments in thinking about evidence-based practice and the search for a working alliance between practice and science (Giel Hutschemaekers); and the connection between an integrated vision of youth care, the interests of governments, and dimensions of professionals' work and training (Adri van Montfoort). The first contribution is an adaptation of Huub Pijnenburg's inaugural lecture on accepting his research professorship 'Werkzame Factoren in de Zorg voor Jeugd' at the Hogeschool van Arnhem en Nijmegen. Together with practitioners, this research group seeks answers to questions about what makes care for young people effective and what this means for professionals and institutions. The factors that influence the effectiveness of psychosocial care for young people reveal themselves as a motley family. More knowledge about the members of this family and their mutual ties will increase the effectiveness of youth care. For that is and remains the great challenge: making sure that it works.
6

mhGAP-interventiegids bij psychische, neurologische stoornissen en aandoeningen door middelengebruik in niet-gespecialiseerde zorginstellingen: Actieprogramma voor geestelijke gezondheid (mhGAP) - versie 2.0. Pan American Health Organization, 2022. http://dx.doi.org/10.37774/9789275222553.

Full text
Abstract:
Mental and neurological disorders and substance use disorders (MNS disorders) are common and cause a large burden of disease and disability worldwide. To bridge the gap between available resources and the great demand for care, the World Health Organization launched the Mental Health Gap Action Programme (mhGAP). The aim of mhGAP is to scale up care and services using evidence-based interventions for the prevention and treatment of priority MNS disorders. The mhGAP Intervention Guide version 1.0 for MNS disorders in non-specialised health settings was developed in 2010 as a simple technical tool to enable integrated management of priority MNS disorders by means of clinical decision-making protocols. The mhGAP-IG 1.0 is used in more than 90 countries and has been a great success. It is a pleasure to present mhGAP version 2.0, with new evidence-based guidance, improved usability, and new sections that can be used by health care providers as well as programme managers for even greater reach. We hope that this guide will continue to provide direction for delivering care and services to people with MNS disorders around the world, and that it brings us a step closer to our goal of universal health coverage.

Book chapters on the topic "MIT/BIH data base"

1

Tawalbeh, Lo’ai, Swathi Lakkineni, Fadi Muheidat, Ummugul Bulut, and Ahmed A. Abd El-latif. "Big Data Analytics for Secure Edge-Based Manufacturing Internet of Things (MIoT)." In EAI/Springer Innovations in Communication and Computing. Springer International Publishing, 2024. http://dx.doi.org/10.1007/978-3-031-51097-7_12.

Full text
2

Wang, Xiao, Yingfu Sai, Lai Man, Xingze Zhang, Yan Zhao, and Shawn Wilson. "A Design Based on Big Data Processing Frame for Data Mid-platform in Time of IoT." In Lecture Notes in Electrical Engineering. Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-4775-9_97.

Full text
3

Durá Gil, Juan V., Alfredo Remon, Iván Martínez Rodriguez, et al. "3D Human Big Data Exchange Between the Healthcare and Garment Sectors." In Technologies and Applications for Big Data Value. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-78307-5_11.

Full text
Abstract:
3D personal data is a type of data that contains useful information for product design, online sale services, medical research and patient follow-up. Currently, hospitals store and grow massive collections of 3D data that are not accessible by researchers, professionals or companies. About 2.7 petabytes a year are stored in the EU26. In parallel to the advances made in the healthcare sector, a new, low-cost 3D body-surface scanning technology has been developed for the goods consumer sector, namely, apparel, animation and art. It is estimated that currently one person is scanned every 15 min in the USA and Europe. And increasing. The 3D data of the healthcare sector can be used by designers and manufacturers of the consumer goods sector. At the same time, although 3D body-surface scanners have been developed primarily for the garment industry, 3D scanners' low cost, non-invasive character and ease of use make them appealing for widespread clinical applications and large-scale epidemiological surveys. However, companies and professionals of the consumer goods sector cannot easily access the 3D data of the healthcare sector. And vice versa. Even exchanging information between data owners in the same sector is a big problem today. It is necessary to overcome problems related to data privacy and the processing of huge 3D datasets. To break these silos and foster the exchange of data between the two sectors, the BodyPass project has developed: (1) processes to harmonize 3D databases; (2) tools able to aggregate 3D data from different huge datasets; (3) tools for exchanging data and to assure anonymization and data protection (based on blockchain technology and distributed query engines); (4) services and visualization tools adapted to the necessities of the healthcare sector and the garment sector. These developments have been applied in practical cases by hospitals and companies in the garment sector.
4

Fu, Juanlin, Li Yan, Chunrong Zhao, et al. "Allocation of Public Service Facilities Based on Community Life Circles: A Case Study in Mianyang, China." In Lecture Notes in Civil Engineering. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-8401-1_55.

Full text
Abstract:
The allocation of public service facilities based on the community life circle is a main measure of urban social governance at present. Although there is much research on the allocation of public service facilities, evaluation across different levels of community life circle is still insufficient. Taking Mianyang city of Sichuan Province as an example, and based on big data such as city blocks, roads and POI, different levels of community life circles are delineated and their public service facilities are evaluated. The results show the following: (1) Combining 15, 10 and 5 minute walking times with roads and administrative boundaries, three levels of community life circle are defined comprehensively. (2) The public service facility coverage rate and the community life circle standard rate gradually decrease from the 15-minute community life circle (15min-CLC) to the 10-minute community life circle (10min-CLC) and then the 5-minute community life circle (5min-CLC). Unevenly distributed basic public service facilities, such as secondary schools and hospitals, mainly account for the low rates. (3) All categories of public service facilities present remarkable spatial auto-correlation, showing a characteristic transformation from a "core-periphery" to a "one-core multi-center" distribution pattern as the community life circle level decreases. The study reveals the spatial distribution characteristics of urban public service facilities under different community life circle levels and provides a theoretical basis for optimizing the allocation of public services.
5

Williams, Jason. "CyVerse for Reproducible Research: RNA-Seq Analysis." In Plant Bioinformatics. Springer US, 2022. http://dx.doi.org/10.1007/978-1-0716-2067-0_3.

Full text
Abstract:
Posing complex research questions poses complex reproducibility challenges. Datasets may need to be managed over long periods of time. Reliable and secure repositories are needed for data storage. Sharing big data requires advance planning and becomes complex when collaborators are spread across institutions and countries. Many complex analyses require the larger compute resources only provided by cloud and high-performance computing infrastructure. Finally at publication, funder and publisher requirements must be met for data availability and accessibility and computational reproducibility. For all of these reasons, cloud-based cyberinfrastructures are an important component for satisfying the needs of data-intensive research. Learning how to incorporate these technologies into your research skill set will allow you to work with data analysis challenges that are often beyond the resources of individual research institutions. One of the advantages of CyVerse is that there are many solutions for high-powered analyses that do not require knowledge of command line (i.e., Linux) computing. In this chapter we will highlight CyVerse capabilities by analyzing RNA-Seq data. The lessons learned will translate to doing RNA-Seq in other computing environments and will focus on how CyVerse infrastructure supports reproducibility goals (e.g., metadata management, containers), team science (e.g., data sharing features), and flexible computing environments (e.g., interactive computing, scaling).
6

Urrea, Claudia, Kirky Delong, Joe Diaz, et al. "MIT Full STEAM Ahead: Bringing Project-Based, Collaborative Learning to Remote Learning Environments." In Knowledge Studies in Higher Education. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-82159-3_20.

Full text
Abstract:
With schools and educational centers around the country moving from in-person to emergency remote learning due to the COVID-19 pandemic, education faces an unprecedented crisis (Hodges et al., Educause Review 27, 2020). This case study presents the efforts and impact of Full STEAM Ahead (FSA) launched by the Massachusetts Institute of Technology (MIT) in response to the pandemic to support remote collaborative learning for K-12 learners, parents, and educators. We present two FSA initiatives: (1) weekly themed packages with developmentally appropriate activities for K-12 remote learning and (2) Full STEAM Ahead Into Summer (FSAIS), an online summer program for middle school Massachusetts students, specifically targeting students who are at risk for "COVID Slide." (Institute-wide Task Force on the Future of MIT Education-Final Report: http://web.mit.edu/future-report/TaskForceFinal_July28.pdf?) Our operative theory of change is that we can improve K-12 remote collaborative learning experiences through developing and sharing a curriculum that exemplifies the minds-on and hands-on approach advocated by MIT, strategically leveraging existing structures and projects within MIT, and establishing partnerships with the local and international community. We gauge the effect of these efforts on contributing members of the MIT community and targeted learners by analyzing data gathered through participant surveys and artifacts such as the website, packages, modules, and student projects created during the summer programs. Our findings indicate that existing structures and resources – with community building – facilitated the achievement of our goal to develop and distribute problem-based learning activities and that interaction and community building were central in meeting those goals. This work contributes to the knowledge base regarding emergency online learning and the development of effective university outreach efforts.
8

Pandey, Anukul, Barjinder Singh Saini, Butta Singh, and Neetu Sood. "Analysis of Electrocardiogram Data Compression Techniques." In Data Analytics in Medicine. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-1204-3.ch049.

Full text
Abstract:
In this chapter, a MATLAB-based approach is presented for the compression of electrocardiogram (ECG) data. The methodology spans three different domains, namely direct, transformed, and parameter-extraction methods. The techniques selected from the direct ECG compression methods are TP, AZTEC, Fan, and Cortes; the techniques selected from the transformed ECG compression methods are the Walsh transform, DCT, and wavelet transform. For each technique, the basic implementation of the algorithm was explored and performance measures were calculated. All 48 records of the MIT-BIH arrhythmia ECG database were employed for the performance evaluation of the implemented techniques. Based on requirements, any of the basic techniques can be selected for further innovative processing, which may include lossless encoding.
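A minimal Python sketch (the chapter itself works in MATLAB) of one of the direct methods it evaluates, the turning-point (TP) algorithm, together with the PRD distortion measure commonly reported for ECG compression; the synthetic trace and the zero-order-hold reconstruction are illustrative assumptions, not the chapter's code:

```python
# Sketch of fixed 2:1 turning-point (TP) ECG compression and the PRD measure.
import numpy as np

def turning_point_compress(x):
    """From each incoming pair, keep the sample that preserves a slope
    sign change relative to the last retained sample."""
    retained = [x[0]]
    last = x[0]
    for i in range(1, len(x) - 1, 2):
        x1, x2 = x[i], x[i + 1]
        s1 = np.sign(x1 - last)
        s2 = np.sign(x2 - x1)
        keep = x1 if s1 * s2 < 0 else x2   # turning point between -> keep x1
        retained.append(keep)
        last = keep
    return np.asarray(retained)

def prd(original, reconstructed):
    """Percentage RMS difference, a common ECG compression distortion measure."""
    original = original[: len(reconstructed)]
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2)
                           / np.sum(original ** 2))

# Toy usage on a synthetic "ECG-like" trace (stand-in for an MIT-BIH record).
t = np.linspace(0, 2, 720)
ecg_like = np.sin(2 * np.pi * 1.2 * t) + 0.2 * np.sin(2 * np.pi * 25 * t)
compressed = turning_point_compress(ecg_like)
reconstructed = np.repeat(compressed, 2)[: len(ecg_like)]  # hold-based expansion
print(len(ecg_like) / len(compressed), prd(ecg_like, reconstructed))
```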
APA, Harvard, Vancouver, ISO, and other styles
9

Pandey, Anukul, Barjinder Singh Saini, Butta Singh, and Neetu Sood. "Analysis of Electrocardiogram Data Compression Techniques." In Computational Tools and Techniques for Biomedical Signal Processing. IGI Global, 2017. http://dx.doi.org/10.4018/978-1-5225-0660-7.ch013.

Full text
Abstract:
In this chapter, a MATLAB-based approach is presented for the compression of electrocardiogram (ECG) data. The methodology spans three different domains, namely direct, transformed, and parameter-extraction methods. The techniques selected from the direct ECG compression methods are TP, AZTEC, Fan, and Cortes; the techniques selected from the transformed ECG compression methods are the Walsh transform, DCT, and wavelet transform. For each technique, the basic implementation of the algorithm was explored and performance measures were calculated. All 48 records of the MIT-BIH arrhythmia ECG database were employed for the performance evaluation of the implemented techniques. Based on requirements, any of the basic techniques can be selected for further innovative processing, which may include lossless encoding.
APA, Harvard, Vancouver, ISO, and other styles
10

Singh, Butta, Manjit Singh, and Dixit Sharma. "Chaotic Function Based ECG Encryption System." In Handbook of Research on Healthcare Administration and Management. IGI Global, 2017. http://dx.doi.org/10.4018/978-1-5225-0920-2.ch013.

Full text
Abstract:
Remote health-care monitoring systems communicate biomedical information (e.g. the electrocardiogram (ECG)) over insecure networks. Protecting the integrity, authentication, and confidentiality of the medical data is a challenging issue. This chapter proposes an encryption process with a 4-round, five-step encryption structure that includes random pixel insertion, row separation, substitution of each separated row, row combination, and rotation. The accuracy and security of the proposed method for 2D ECG encryption are evaluated on the MIT-BIH arrhythmia database.
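As a rough illustration of the kind of chaotic, round-based structure the chapter describes, the hypothetical sketch below drives a per-row XOR substitution and a row rotation from a logistic-map keystream; the map parameters, key values, and round composition are assumptions, not the authors' scheme:

```python
# Hypothetical sketch: logistic-map keystream driving row substitution/rotation
# on a 2D block of quantised ECG samples. Illustrative only.
import numpy as np

def logistic_keystream(x0, r, n):
    """Generate n bytes from the logistic map x_{k+1} = r * x_k * (1 - x_k)."""
    x, out = x0, np.empty(n, dtype=np.uint8)
    for k in range(n):
        x = r * x * (1.0 - x)
        out[k] = int(x * 256) % 256
    return out

def encrypt_round(ecg2d, x0=0.3712, r=3.99):
    """One round: XOR-substitute every row with a keystream, then rotate rows."""
    rows, cols = ecg2d.shape
    ks = logistic_keystream(x0, r, rows * cols).reshape(rows, cols)
    substituted = ecg2d ^ ks                       # per-row substitution via XOR
    return np.roll(substituted, shift=1, axis=0)   # simple row rotation

# Toy usage: 8-bit quantised ECG samples arranged as a 2D block ("image").
rng = np.random.default_rng(0)
ecg_block = rng.integers(0, 256, size=(16, 32), dtype=np.uint8)
cipher = ecg_block.copy()
for _ in range(4):                                 # 4 rounds, mirroring the abstract
    cipher = encrypt_round(cipher)
print(cipher.shape)
```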
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "MIT/BIH data base"

1

Lin, Jia Lun, Jin Yang Xia, and Xiao Ling Li. "Research on a CNN Based Clinical Electrocardiogram Classification Model." In 12th Annual International Conference on Material Science and Engineering. Trans Tech Publications Ltd, 2025. https://doi.org/10.4028/p-fj7a4x.

Full text
Abstract:
The electrocardiogram (ECG) is the most commonly used diagnostic method for heart diseases such as arrhythmia. However, its inherent complexity reduces, to some extent, the accuracy of diagnosis. To quickly and automatically identify the type of arrhythmia, this paper constructs a clinical ECG classification model based on a Convolutional Neural Network (CNN) to assist clinicians in analyzing ECG signals. The MIT-BIH ECG database is used as the research data source, and heartbeats are classified into 5 categories based on the AAMI EC57 standard. 95% of the ECG data is randomly divided into training and testing sets, and the remaining 5% is used as an internal test set. Based on the experimental outcomes, the model's accuracy exceeds 96%, indicating commendable overall performance.
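A minimal 1D-CNN sketch in PyTorch for five-class (AAMI EC57) beat classification of the kind described above; the segment length, layer sizes, and training snippet are illustrative assumptions rather than the paper's exact architecture:

```python
# Sketch of a small 1D CNN for five-class ECG beat classification.
import torch
import torch.nn as nn

class BeatCNN(nn.Module):
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                 # x: (batch, 1, segment_length)
        z = self.features(x).squeeze(-1)  # (batch, 64)
        return self.classifier(z)         # (batch, n_classes) logits

# Toy usage with random tensors standing in for segmented MIT-BIH beats.
model = BeatCNN()
beats = torch.randn(8, 1, 187)            # 8 beats, 1 lead, 187 samples each (assumed)
logits = model(beats)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 5, (8,)))
loss.backward()
print(logits.shape)
```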
APA, Harvard, Vancouver, ISO, and other styles
2

Wang, Yushu, and Luke Jain. "MIC Assessment Model for Upstream Production and Transport Facilities." In CORROSION 2016. NACE International, 2016. https://doi.org/10.5006/c2016-07769.

Full text
Abstract:
A model has been developed to improve microbiologically influenced corrosion (MIC) assessment and direct mitigation for upstream production and transport facilities. This model leverages previous MIC modeling activities, with attention to increasing the delineation of predicted MIC probability, incorporating practical field experience, considering data from advanced bio-monitoring techniques, and assessing MIC risk. The input parameters for the model are of two different types: prediction factors and monitoring factors. These two types of factors are treated by two different mathematical operations. The probability of MIC for a given system is provided as an index as well as a semi-quantitative probability class. The MIC risk is based on the probability of MIC combined with the Consequence of Failure (COF) due to an internal corrosion failure. The risk assessment component of the model recommends general surveillance, bio-monitoring, and/or mitigation for both design and operating scenarios based on the results of the MIC risk assessment.
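A purely hypothetical sketch of how such a model's two input types might be combined: weighted prediction factors forming a base index, monitoring factors applied as multiplicative adjustments, and the resulting probability class crossed against a COF class in a risk matrix. All factor names, weights, thresholds, and matrix entries are assumptions, not the paper's model:

```python
# Hypothetical MIC probability index and risk-matrix lookup (illustrative only).

def mic_probability_index(prediction, monitoring):
    """prediction: factor -> score in [0, 1]; monitoring: factor -> multiplier."""
    weights = {"water_content": 0.3, "stagnancy": 0.3,
               "temperature": 0.2, "nutrients": 0.2}          # assumed weights
    base = sum(weights[f] * prediction[f] for f in weights)    # weighted sum
    adjust = 1.0
    for m in monitoring.values():                              # e.g. qPCR, ATP results
        adjust *= m
    return min(base * adjust, 1.0)

def probability_class(index):
    return "low" if index < 0.3 else "medium" if index < 0.6 else "high"

RISK_MATRIX = {("low", "low"): "low",       ("low", "high"): "medium",
               ("medium", "low"): "medium", ("medium", "high"): "high",
               ("high", "low"): "medium",   ("high", "high"): "very high"}

# Toy usage for one pipeline segment with an assumed high consequence of failure.
idx = mic_probability_index(
    prediction={"water_content": 0.8, "stagnancy": 0.9,
                "temperature": 0.5, "nutrients": 0.4},
    monitoring={"qpcr_srb": 1.2, "atp_assay": 1.1})
print(idx, RISK_MATRIX[(probability_class(idx), "high")])
```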
APA, Harvard, Vancouver, ISO, and other styles
3

Kharshan, Margarita, Alla Furman, and Brian Wuertz. "Biodegradable VCI Building Block for Biofuels." In CORROSION 2007. NACE International, 2007. https://doi.org/10.5006/c2007-07362.

Full text
Abstract:
For years the chemical industry has relied on petroleum as the primary ingredient in thousands of products, and numerous industrial product manufacturers use petroleum-derived substances in their formulations. However, drastic increases in oil and gas prices have raised concern by increasing the cost of those products, and tighter environmental regulations continue to put pressure on oil-based product producers and their users to find safer solutions. As an alternative, renewable biobased products give manufacturers and users environmentally safe options with comparable performance and economics, plus the added benefit of biodegradability of the final products. Incorporation of volatile corrosion inhibitors (VCI) in lubricating products provides a number of advantages. VCIs, when added to the carrier, provide corrosion protection to machinery during operation, storage, or transportation. A properly chosen combination of inhibitors prolongs the service life of machinery by minimizing corrosive wear of fuel systems and storage tanks. Building Blocks for fuels (BBB) and BBB Bio are utilized to control the corrosive characteristics of fuels. BBB and BBB Bio provide protection in all three phases: liquid, interface, and vapor. In addition, BBB Bio was developed with soybean oil as a carrier, allowing it to be added to a variety of biofuels and conventional fuels, including diesel and gasoline, during operation, storage, transport, and distribution. BBB and BBB Bio pass the rust test in accordance with MIL-PRF-25017 and ASTM D-665-92. These and other laboratory test data and photos are presented.
APA, Harvard, Vancouver, ISO, and other styles
4

Singh, Vishavpreet, Suman Tewary, Viren Sardana, and H. K. Sardana. "Arrhythmia Detection - A Machine Learning based Comparative Analysis with MIT-BIH ECG Data." In 2019 IEEE 5th International Conference for Convergence in Technology (I2CT). IEEE, 2019. http://dx.doi.org/10.1109/i2ct45611.2019.9033665.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Ramirez-Rodriguez, C. "A hybrid neuro-fuzzy system for the classification of normal, fusion and PVC cardiac beats in the MIT-BIH database." In IEE Colloquium on Artificial Intelligence Methods for Biomedical Data Processing. IEE, 1996. http://dx.doi.org/10.1049/ic:19960637.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Ungureanu, Florina, and Robert Gabriel Lupu. "The Assessment of Learning Emotional State Using the EEG Headsets." In eLSE 2015. Carol I National Defence University Publishing House, 2015. http://dx.doi.org/10.12753/2066-026x-15-085.

Full text
Abstract:
In the last decade, interest in developing tools and devices for recognizing human emotions in the learning process has increased continuously. It has been shown that recording electrical brain activity using electroencephalography (EEG) represents a useful methodological tool for understanding the cortical processes that underlie performance and students' engagement in learning activities. Specific emotional states, like mental stress, concentration, relaxation, fatigue, and cognitive load, increase activation in the Delta (0.5-4 Hz), Theta (4-7 Hz), Alpha 1 and 2 (8-12 Hz), Beta 1 and 2 (12-30 Hz), and Gamma (30-70 Hz) frequencies. For example, an increase in frontal Beta-1 spectral power is associated with cognitive task demands, and a decrease in Beta-1 power reflects relaxation. Alpha is the dominant frequency in the human EEG and is generated in widespread areas of the cortex through cortico-cortical and thalamo-cortical interactions reflecting emotions. EEG-based Brain Computer Interface (BCI) systems have been widely studied in medical labs with high-quality EEG recording devices, which are expensive and require specialized operation. As an alternative to such professional equipment, several low-cost EEG devices have been developed for out-of-the-lab applications, e.g. schools, universities, sports medicine, psychology, or even neurophysiology. Two pertinent and detailed studies from Princeton University (http://compmem.princeton.edu/experimenter) and the University of Mons (www.biomedical-engineering-online.com/content/12/1/56) indicate that the Emotiv Epoc headset could be taken into account. This low-cost EEG device has higher relative operational and maintenance costs than its medical-grade competitor; it "is able to record EEG data in a satisfying manner" but should only be chosen for non-critical applications such as games, communication, or feedback evaluation in well-known scenarios. Our study aims to use the Emotiv headset for evaluating students' emotional state during the learning process. To this end, several steps were completed. EEG data acquisition and analysis: the headset connects wirelessly to any PC via a USB dongle allowing freedom of movement, has 14 channels/sensors, and records at a sample rate of 128 Hz. Because of eye/head movements, automatic artefact detection must be performed, and only artefact-free segments used for analysis. A good option is to use the MATLAB toolbox EEGLAB, freely available open-source research software (http://sccn.ucsd.edu/eeglab), and the Independent Component Analysis (ICA) method for general EEG analysis and artefact removal. Spectral power for each frequency band is computed and used for statistical analysis. Statistical analysis: mean values and standard deviations can be calculated, and relevant statistical tests (Kolmogorov-Smirnov, ANOVA, or Mauchly) have to be performed for different learning states, such as reading a very simple text or a very difficult one, discussing a subject, playing a game, or programming under time restrictions. The raw data of the acquired signal should be compared with those provided by the MIT-BIH Database Distribution (http://ecg.mit.edu/). Based on this EEG feedback approach, conclusions can be formulated regarding students' involvement in learning or the difficulty of the teaching materials.
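A minimal sketch of the band-power step described above: a Welch power spectral density of a 128 Hz EEG channel (the Emotiv sample rate) is integrated over the standard bands. The band edges and the synthetic test signal are illustrative, and real use would follow EEGLAB/ICA artefact rejection first:

```python
# Sketch: per-band spectral power of a single EEG channel via Welch's method.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

FS = 128  # Hz, Emotiv EPOC sampling rate
BANDS = {"delta": (0.5, 4), "theta": (4, 7), "alpha": (8, 12),
         "beta1": (12, 20), "beta2": (20, 30), "gamma": (30, 70)}

def band_powers(eeg_channel, fs=FS):
    freqs, psd = welch(eeg_channel, fs=fs, nperseg=4 * fs)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = trapezoid(psd[mask], freqs[mask])  # integrate PSD over band
    return powers

# Toy usage on a synthetic 10 s segment with a dominant alpha (10 Hz) component.
t = np.arange(0, 10, 1 / FS)
channel = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
print(band_powers(channel))
```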
APA, Harvard, Vancouver, ISO, and other styles
7

Chen, Pengfei, Yong Qi, Xinyi Li, and Li Su. "An ensemble MIC-based approach for performance diagnosis in big data platform." In 2013 IEEE International Conference on Big Data. IEEE, 2013. http://dx.doi.org/10.1109/bigdata.2013.6691701.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Oliveira, Gustavo Henrique de, and Franklin César Flores. "Classification of heart arrhythmia by digital image processing and machine learning." In Seminário Integrado de Software e Hardware. Sociedade Brasileira de Computação - SBC, 2023. http://dx.doi.org/10.5753/semish.2023.230225.

Full text
Abstract:
The electrocardiogram (ECG) exam can be used reliably to monitor the functionality of the cardiovascular system. Although there are many similarities between different ECG conditions, most studies have focused on classifying signals from databases such as the PhysioNet MIT-BIH and PTB Diagnostics data sets, rather than classifying real exam images. In this article, we propose methods to extract features from the exam image; algorithms such as CNN, decision tree, extra trees, and random forest are then used to classify the exams accurately according to the AAMI EC57 standard. According to the results, the suggested method is capable of making predictions with an average accuracy of 97.4%.
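A small scikit-learn sketch of the extract-features-then-classify pipeline the paper describes, here with a random forest; the intensity-statistics feature extractor and the synthetic images are illustrative stand-ins for the paper's image-processing features:

```python
# Sketch: image feature extraction followed by random-forest classification.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_features(ecg_image):
    """Illustrative features: intensity statistics per horizontal band of the image."""
    bands = np.array_split(ecg_image, 8, axis=0)
    feats = []
    for b in bands:
        feats.extend([b.mean(), b.std(), b.min(), b.max()])
    return np.array(feats)

# Synthetic stand-ins for scanned ECG exam images and their class labels.
rng = np.random.default_rng(1)
images = rng.random((200, 64, 128))
labels = rng.integers(0, 5, size=200)           # 5 AAMI classes

X = np.stack([extract_features(img) for img in images])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(accuracy_score(y_te, clf.predict(X_te)))
```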
APA, Harvard, Vancouver, ISO, and other styles
9

Ulyanov, S., A. V. Shevchenko, A. A. Shevchenko, and A. Reshetnikov. "IMPERFECT KNOWLEDGE BASE SELF-ORGANIZATION IN ROBOTIC INTELLIGENT COGNITIVE CONTROL: QUANTUM SUPREMACY ON BIG DATA ANALYSIS." In 9th International Conference "Distributed Computing and Grid Technologies in Science and Education". Crossref, 2021. http://dx.doi.org/10.54546/mlit.2021.19.68.001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Chih-Yen Chiang, I-Ting Kuo, Mei-Yung Tsou, Kuang-Yi Chang, S. J. Hsu, and Chia-Tai Chan. "MID-based electronic data collection and adaptive drug delivery model for improving quality of PCA services." In 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI). IEEE, 2012. http://dx.doi.org/10.1109/bhi.2012.6211651.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "MIT/BIH data base"

1

Neyedley, K., J. J. Hanley, P. Mercier-Langevin, and M. Fayek. Ore mineralogy, pyrite chemistry, and S isotope systematics of magmatic-hydrothermal Au mineralization associated with the Mooshla Intrusive Complex (MIC), Doyon-Bousquet-LaRonde mining camp, Abitibi greenstone belt, Québec. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/328985.

Full text
Abstract:
The Mooshla Intrusive Complex (MIC) is an Archean polyphase magmatic body located in the Doyon-Bousquet-LaRonde (DBL) mining camp of the Abitibi greenstone belt, Québec. The MIC is spatially associated with numerous gold (Au)-rich VMS, epizonal 'intrusion-related' Au-Cu vein systems, and shear zone-hosted (orogenic?) Au deposits. To elucidate genetic links between deposits and the MIC, mineralized samples from two of the epizonal 'intrusion-related' Au-Cu vein systems (Doyon and Grand Duc Au-Cu) have been characterized using a variety of analytical techniques. Preliminary results indicate gold (as electrum) from both deposits occurs relatively late in the systems as it is primarily observed along fractures in pyrite and gangue minerals. At Grand Duc gold appears to have formed syn- to post-crystallization relative to base metal sulphides (e.g. chalcopyrite, sphalerite, pyrrhotite), whereas base metal sulphides at Doyon are relatively rare. The accessory ore mineral assemblage at Doyon is relatively simple compared to Grand Duc, consisting of petzite (Ag3AuTe2), calaverite (AuTe2), and hessite (Ag2Te), while accessory ore minerals at Grand Duc are comprised of tellurobismuthite (Bi2Te3), volynskite (AgBiTe2), native Te, tsumoite (BiTe) or tetradymite (Bi2Te2S), altaite (PbTe), petzite, calaverite, and hessite. Pyrite trace element distribution maps from representative pyrite grains from Doyon and Grand Duc were collected and confirm petrographic observations that Au occurs relatively late. Pyrite from Doyon appears to have been initially trace-element poor, then became enriched in As, followed by the ore metal stage consisting of Au-Ag-Te-Bi-Pb-Cu enrichment and lastly a Co-Ni-Se(?) stage enrichment. Grand Duc pyrite is more complex with initial enrichments in Co-Se-As (Stage 1) followed by an increase in As-Co(?) concentrations (Stage 2). The ore metal stage (Stage 3) is indicated by another increase in As coupled with Au-Ag-Bi-Te-Sb-Pb-Ni-Cu-Zn-Sn-Cd-In enrichment. The final stage of pyrite growth (Stage 4) is represented by the same element assemblage as Stage 3 but at lower concentrations. Preliminary sulphur isotope data from Grand Duc indicates pyrite, pyrrhotite, and chalcopyrite all have similar delta-34S values (~1.5 ± 1 permille) with no core-to-rim variations. Pyrite from Doyon has slightly higher delta-34S values (~2.5 ± 1 permille) compared to Grand Duc but similarly does not show much core-to-rim variation. At Grand Duc, the occurrence of Au concentrating along the rim of pyrite grains and associated with an enrichment in As and other metals (Sb-Ag-Bi-Te) shares similarities with porphyry and epithermal deposits, and the overall metal association of Au with Te and Bi is a hallmark of other intrusion-related gold systems. The occurrence of the ore metal-rich rims on pyrite from Grand Duc could be related to fluid boiling which results in the destabilization of gold-bearing aqueous complexes. Pyrite from Doyon does not show this inferred boiling texture but shares characteristics of dissolution-reprecipitation processes, where metals in the pyrite lattice are dissolved and then reconcentrated into discrete mineral phases that commonly precipitate in voids and fractures created during pyrite dissolution.
APA, Harvard, Vancouver, ISO, and other styles
2

Daudelin, Francois, Lina Taing, Lucy Chen, Claudia Abreu Lopes, Adeniyi Francis Fagbamigbe, and Hamid Mehmood. Mapping WASH-related disease risk: A review of risk concepts and methods. United Nations University Institute for Water, Environment and Health, 2021. http://dx.doi.org/10.53328/uxuo4751.

Full text
Abstract:
The report provides a review of how risk is conceived of, modelled, and mapped in studies of infectious water, sanitation, and hygiene (WASH) related diseases. It focuses on spatial epidemiology of cholera, malaria and dengue to offer recommendations for the field of WASH-related disease risk mapping. The report notes a lack of consensus on the definition of disease risk in the literature, which limits the interpretability of the resulting analyses and could affect the quality of the design and direction of public health interventions. In addition, existing risk frameworks that consider disease incidence separately from community vulnerability have conceptual overlap in their components and conflate the probability and severity of disease risk into a single component. The report identifies four methods used to develop risk maps: i) observational, ii) index-based, iii) associative modelling and iv) mechanistic modelling. Observational methods are limited by a lack of historical data sets and their assumption that historical outcomes are representative of current and future risks. The more general index-based methods offer a highly flexible approach based on observed and modelled risks and can be used for partially qualitative or difficult-to-measure indicators, such as socioeconomic vulnerability. For multidimensional risk measures, indices representing different dimensions can be aggregated to form a composite index or be considered jointly without aggregation. The latter approach can distinguish between different types of disease risk such as outbreaks of high frequency/low intensity and low frequency/high intensity. Associative models, including machine learning and artificial intelligence (AI), are commonly used to measure current risk, future risk (short-term for early warning systems) or risk in areas with low data availability, but concerns about bias, privacy, trust, and accountability in algorithms can limit their application. In addition, they typically do not account for gender and demographic variables that allow risk analyses for different vulnerable groups. As an alternative, mechanistic models can be used for similar purposes as well as to create spatial measures of disease transmission efficiency or to model risk outcomes from hypothetical scenarios. Mechanistic models, however, are limited by their inability to capture locally specific transmission dynamics. The report recommends that future WASH-related disease risk mapping research:
- Conceptualise risk as a function of the probability and severity of a disease risk event. Probability and severity can be disaggregated into sub-components. For outbreak-prone diseases, probability can be represented by a likelihood component while severity can be disaggregated into transmission and sensitivity sub-components, where sensitivity represents factors affecting health and socioeconomic outcomes of infection.
- Employ jointly considered unaggregated indices to map multidimensional risk. Individual indices representing multiple dimensions of risk should be developed using a range of methods to take advantage of their relative strengths.
- Develop and apply collaborative approaches with public health officials, development organizations and relevant stakeholders to identify appropriate interventions and priority levels for different types of risk, while ensuring the needs and values of users are met in an ethical and socially responsible manner.
- Enhance identification of vulnerable populations by further disaggregating risk estimates and accounting for demographic and behavioural variables and using novel data sources such as big data and citizen science.
This review is the first to focus solely on WASH-related disease risk mapping and modelling. The recommendations can be used as a guide for developing spatial epidemiology models in tandem with public health officials and to help detect and develop tailored responses to WASH-related disease outbreaks that meet the needs of vulnerable populations. The report’s main target audience is modellers, public health authorities and partners responsible for co-designing and implementing multi-sectoral health interventions, with a particular emphasis on facilitating the integration of health and WASH services delivery contributing to Sustainable Development Goals (SDG) 3 (good health and well-being) and 6 (clean water and sanitation).
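A small sketch of the report's recommended treatment of risk: sub-indices for likelihood, transmission, and sensitivity are either aggregated into a weighted composite index or kept as a jointly considered, unaggregated profile. The weights and example district values are illustrative assumptions:

```python
# Sketch: composite vs jointly considered (unaggregated) risk sub-indices.
import numpy as np

def composite_index(sub_indices, weights):
    """Weighted aggregation of normalised sub-indices into one composite score."""
    keys = sorted(sub_indices)
    v = np.array([sub_indices[k] for k in keys])
    w = np.array([weights[k] for k in keys])
    return float(np.dot(v, w) / w.sum())

def joint_profile(sub_indices):
    """Keep dimensions separate so high-frequency/low-intensity and
    low-frequency/high-intensity risks remain distinguishable."""
    return {k: round(v, 2) for k, v in sub_indices.items()}

district = {"likelihood": 0.7,      # probability of an outbreak event
            "transmission": 0.4,    # severity: how efficiently it spreads
            "sensitivity": 0.8}     # severity: health/socioeconomic vulnerability

weights = {"likelihood": 0.4, "transmission": 0.3, "sensitivity": 0.3}
print(composite_index(district, weights))   # single aggregated score
print(joint_profile(district))              # unaggregated, jointly considered
```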
APA, Harvard, Vancouver, ISO, and other styles