
Dissertations / Theses on the topic 'Signal processing. Time-series analysis'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Signal processing. Time-series analysis.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Morrill, Jeffrey P., and Jonathan Delatizky. "REAL-TIME RECOGNITION OF TIME-SERIES PATTERNS." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/608854.

Full text
Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada. This paper describes a real-time implementation of the pattern recognition technology originally developed by BBN [Delatizky et al] for post-processing of time-sampled telemetry data. This makes it possible to monitor a data stream for a characteristic shape, such as an arrhythmic heartbeat or a step-response whose overshoot is unacceptably large. Once programmed to recognize patterns of interest, it generates a symbolic description of a time-series signal in intuitive, object-oriented terms. The basic technique is to decompose the signal into a hierarchy of simpler components using rules of grammar, analogous to the process of decomposing a sentence into phrases and words. This paper describes the basic technique used for pattern recognition of time-series signals and the problems that must be solved to apply the techniques in real time. We present experimental results for an unoptimized prototype demonstrating that 4000 samples per second can be handled easily on conventional hardware.
APA, Harvard, Vancouver, ISO, and other styles
2

Jiang, Wei. "Signal processing strategies for ground-penetrating radar." Thesis, University of Bath, 2011. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.538111.

Full text
Abstract:
Interpretation of ground penetrating radar (GPR) signals can be a key point in the overall operability of a GPR system. In stepped-frequency and Frequency-Modulated Continuous-Wave (FMCW) GPR systems in particular, the target or object of interest is often located by analysis of Fast Fourier Transform (FFT) derived data. Increasing the GPR system bandwidth can improve resolution, but at the cost of reduced penetration depth. The challenge is to develop high-resolution signal processing strategies for GPR. A number of Fourier based methods are investigated. However, the width of the main response around a target's position can make it difficult to recognise closely spaced targets. The least-squares method is found to be the best autoregression-based estimator; however, it requires a high signal-to-noise ratio to achieve high resolution. Furthermore, a number of subspace-based methods are investigated. Although the MUltiple SIgnal Classification (MUSIC) method can theoretically offer infinite resolution, it must be seeded with the number of targets actually present. A superimposed MUSIC technique is proposed to suppress false targets. A novel windowed MUSIC (W-MUSIC) algorithm is developed, which offers high resolution while still minimising spurious responses. Since the performance of any FMCW GPR is critically linked to the linearity of the sweep frequency, the effect of non-linearity on target range estimation is studied. A novel Short-Time MUSIC method is proposed, achieving higher time and frequency resolution than the conventional Short-Time Fourier Transform method. In addition, a modified Adaptive Sampling method is proposed to solve the non-linearity problem by utilising a reference channel in a GPR system.
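As a rough illustration of the subspace idea behind MUSIC (not the windowed or superimposed variants developed in the thesis), the sketch below computes a MUSIC pseudospectrum from a single sampled beat signal; the snapshot length m and the frequency grid are assumed, illustrative parameters.

```python
import numpy as np

def music_spectrum(x, n_targets, m=64, n_grid=2048):
    """Generic MUSIC pseudospectrum sketch for a (possibly complex) sampled signal."""
    snaps = np.lib.stride_tricks.sliding_window_view(x, m)      # overlapping length-m snapshots
    R = (snaps.conj().T @ snaps) / snaps.shape[0]               # sample covariance, m x m
    eigval, eigvec = np.linalg.eigh(R)                          # eigenvalues in ascending order
    En = eigvec[:, : m - n_targets]                             # noise subspace
    freqs = np.linspace(0.0, 0.5, n_grid)                       # normalized frequency grid
    a = np.exp(2j * np.pi * np.outer(np.arange(m), freqs))      # frequency/steering vectors
    return freqs, 1.0 / np.sum(np.abs(En.conj().T @ a) ** 2, axis=0)
```

Peaks of the pseudospectrum indicate candidate frequencies (i.e. target ranges); as the abstract notes, the number of targets has to be supplied.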
APA, Harvard, Vancouver, ISO, and other styles
3

Terwilleger, Erin. "Multidimensional time-frequency analysis /." free to MU campus, to others for purchase, 2002. http://wwwlib.umi.com/cr/mo/fullcit?p3052223.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Yerramothu, Madhu Kishore. "Stochastic Gaussian and non-Gaussian signal modeling." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2008. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Santos, Rui Pedro Silvestre dos. "Time series morphological analysis applied to biomedical signals events detection." Master's thesis, Faculdade de Ciências e Tecnologia, 2011. http://hdl.handle.net/10362/10227.

Full text
Abstract:
Dissertation submitted in fulfillment of the requirements for the Degree of Master in Biomedical Engineering. Automated techniques for biosignal data acquisition and analysis have become increasingly powerful, particularly in the biomedical engineering research field. Nevertheless, there is a clear need to improve tools for signal pattern recognition and classification systems, in which the detection of specific events and the automatic signal segmentation are preliminary processing steps. The present dissertation introduces a signal-independent algorithm which detects significant events in a biosignal. From a time series morphological analysis, the algorithm computes the instants when the most significant standard deviation discontinuities occur, segmenting the signal. An iterative optimization step is then applied, ensuring that a minimal error is achieved when modeling these segments with polynomial regressions. The adjustment of a scale factor gives different detail levels of events detection. An accurate and objective algorithm performance evaluation procedure was designed. When applied to a set of synthetic signals with known and quantitatively predefined events, an overall mean error of 20 samples between the detected and the actual events showed the high accuracy of the proposed algorithm. Its ability to detect signal activation onsets and transient waveshapes was also assessed, resulting in higher reliability than signal-specific standard methods. Some case studies, with signal processing requirements for which the developed algorithm can be suitably applied, were approached. The real-time implementation of the algorithm, as part of an application developed during this research work, is also reported. The proposed algorithm detects significant signal events with accuracy and significant noise immunity. Its versatile design allows application to different signals without previous knowledge of their statistical properties or specific preprocessing steps. It also brings added objectivity when compared with exhaustive and time-consuming examiner analysis. The tool introduced in this dissertation represents a relevant contribution to events detection, a particularly important issue within the wide digital biosignal processing research field.
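A minimal sketch of the kind of procedure the abstract describes, locating the largest discontinuities of the windowed standard deviation and then fitting a polynomial to each resulting segment; the window length, event count and polynomial order are assumed values, not those of the dissertation.

```python
import numpy as np

def detect_events(x, win=100, n_events=5, poly_order=3):
    """Flag the largest jumps in windowed standard deviation, then fit each segment."""
    sd = np.lib.stride_tricks.sliding_window_view(x, win).std(axis=1)
    jumps = np.abs(np.diff(sd))
    events = np.sort(np.argsort(jumps)[-n_events:]) + win // 2   # candidate event instants
    bounds = np.concatenate(([0], events, [len(x)]))
    models = []
    for a, b in zip(bounds[:-1], bounds[1:]):
        if b - a > poly_order:                                   # enough samples to fit
            t = np.arange(a, b)
            models.append((a, b, np.polyfit(t, x[a:b], poly_order)))
    return events, models
```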
APA, Harvard, Vancouver, ISO, and other styles
6

Pentaris, Fragkiskos. "Digital signal processing for structural health monitoring of buildings." Thesis, Brunel University, 2014. http://bura.brunel.ac.uk/handle/2438/10560.

Full text
Abstract:
Structural health monitoring (SHM) is a relatively new discipline, studying the structural condition of buildings and other constructions. Current SHM systems are either wired or wireless, with a relatively high cost and low accuracy. This thesis exploits a blend of digital signal processing methodologies for SHM and develops a wireless SHM system that provides a low-cost yet reliable and robust implementation. Existing technologies of wired and wireless sensor network platforms with high-sensitivity accelerometers are combined in order to create a system for monitoring the structural characteristics of buildings very economically and functionally, so that it can be easily implemented at low cost in buildings. Well-known and established statistical time series methods are applied to SHM data collected from real concrete structures subjected to earthquake excitation, and their strong and weak points are investigated. The necessity to combine parametric and non-parametric approaches is justified, and in this direction novel and improved digital signal processing techniques and indexes are applied to vibration data recordings in order to eliminate noise and reveal structural properties and characteristics of the buildings under study that deteriorate due to environmental, seismic or anthropogenic impact. A characteristic and potentially damaging case study is presented, in which the consequences to structures of a strong earthquake of magnitude 6.4 M are investigated. Furthermore, a seismic influence profile of the buildings under study is introduced, related to the seismic sources that exist in the broader region of study.
APA, Harvard, Vancouver, ISO, and other styles
7

Wang, Xiaoni. "A STUDY OF EQUATORIAL IONOSPHERIC VARIABILITY USING SIGNAL PROCESSING TECHNIQUES." Doctoral diss., University of Central Florida, 2007. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/2415.

Full text
Abstract:
The dependence of the equatorial ionosphere on solar irradiance and geomagnetic activity is studied in this dissertation using signal processing techniques. Statistical time series, digital signal processing and wavelet methods are applied to study the ionospheric variations. The ionospheric data used are the Total Electron Content (TEC) and the critical frequency of the F2 layer (foF2). Solar irradiance data are from recent satellites, the Student Nitric Oxide Explorer (SNOE) satellite and the Thermosphere Ionosphere Mesosphere Energetics Dynamics (TIMED) satellite. The Disturbance Storm-Time (Dst) index is used as a proxy of geomagnetic activity in the equatorial region. The results are summarized as follows. (1) For short-term variations (< 27 days), the previous three days' solar irradiances have significant correlation with the present-day TEC, which may contribute 18% of the total variations in the TEC. The 3-day delay between solar irradiances and TEC suggests the effects of neutral densities on the ionosphere. The correlations between solar irradiances and TEC are significantly higher than those using the F10.7 flux, a conventional proxy for the short-wavelength band of solar irradiance. (2) For variations < 27 days, solar soft X-rays show similar or higher correlations with the ionospheric electron densities than the Extreme Ultraviolet (EUV). The correlations between solar irradiances and foF2 decrease from morning (0.5) to the afternoon (0.1). (3) Geomagnetic activity plays an important role in the ionosphere for short-term variations < 10 days. The average correlation between TEC and Dst is 0.4 at the 2-3, 3-5, 5-9 and 9-11 day scales, which is higher than that between foF2 and Dst. The correlations between TEC and Dst increase from morning to afternoon. Moderate/quiet geomagnetic activity plays a distinct role in these short-term variations of the ionosphere (~0.3 correlation). Ph.D., School of Electrical Engineering and Computer Science.
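The reported 3-day delay is the kind of result a simple lagged correlation analysis exposes. The sketch below assumes two equally sampled daily series, `irradiance` and `tec`, of the same length (hypothetical variable names, not the dissertation's datasets).

```python
import numpy as np

def lagged_correlation(irradiance, tec, max_lag=10):
    """Correlation between daily solar irradiance and TEC at different day lags."""
    out = {}
    for lag in range(max_lag + 1):
        a = irradiance[: len(irradiance) - lag] if lag else irradiance
        b = tec[lag:]                      # positive lag: irradiance leads TEC by `lag` days
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out
```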
APA, Harvard, Vancouver, ISO, and other styles
8

Kwong, Siu-shing. "Detection of determinism of nonlinear time series with application to epileptic electroencephalogram analysis." View the Table of Contents & Abstract, 2005. http://sunzi.lib.hku.hk/hkuto/record/B35512222.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kirsch, Matthew Robert. "Signal Processing Algorithms for Analysis of Categorical and Numerical Time Series: Application to Sleep Study Data." Case Western Reserve University School of Graduate Studies / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=case1278606480.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Jayaraman, Vinoth, Sivakumaran Sivalingam, and Sangeetha Munian. "Analysis of Real Time EEG Signals." Thesis, Linnéuniversitetet, Institutionen för fysik och elektroteknik (IFE), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-34164.

Full text
Abstract:
The recent evolution of the multidisciplinary fields of engineering, neuroscience, microelectronics, bioengineering and neurophysiology has reduced the gap between human and machine intelligence. Many methods and algorithms have been developed for the analysis and classification of biosignals, one- or two-dimensional, in the time or frequency domain. The integration of signal processing with electronic devices serves as a major root for the development of various biomedical applications. There is much ongoing research in this area aimed at building efficient human-robot systems. Electroencephalography (EEG) is an efficient way of recording the electrical activity of the brain. The advancement of EEG technology in biomedical applications helps in diagnosing various brain disorders such as tumors, seizures, Alzheimer's disease, epilepsy and other malfunctions of the human brain. The main objective of our thesis is the acquisition and pre-processing of real-time EEG signals using a single dry electrode placed on the forehead. The raw EEG signals are transmitted wirelessly (Bluetooth) to the local acquisition server and stored on the computer. Various machine learning techniques are preferred to classify EEG signals precisely. Different algorithms are built to analyse various signal processing techniques for processing the signals. These results can be further used for the development of better brain-computer interface systems.
APA, Harvard, Vancouver, ISO, and other styles
11

Samimy, Bahman. "Mechanical Signature Analysis Using Time-Frequency Signal Processing /." The Ohio State University, 1995. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487931512618312.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Little, M. A. "Biomechanically informed nonlinear speech signal processing." Thesis, University of Oxford, 2007. http://ora.ox.ac.uk/objects/uuid:6f5b84fb-ab0b-42e1-9ac2-5f6acc9c5b80.

Full text
Abstract:
Linear digital signal processing based around linear, time-invariant systems theory finds substantial application in speech processing. The linear acoustic source-filter theory of speech production provides ready biomechanical justification for using linear techniques. Nonetheless, biomechanical studies surveyed in this thesis display significant nonlinearity and non-Gaussianity, casting doubt on the linear model of speech production. In order therefore to test the appropriateness of linear systems assumptions for speech production, surrogate data techniques can be used. This study uncovers systematic flaws in the design and use of existing surrogate data techniques, and, by making novel improvements, develops a more reliable technique. Collating the largest set of speech signals to date compatible with this new technique, this study next demonstrates that the linear assumptions are not appropriate for all speech signals. Detailed analysis shows that while vowel production from healthy subjects cannot be explained within the linear assumptions, consonants can. Linear assumptions also fail for most vowel production by pathological subjects with voice disorders. Combining this new empirical evidence with information from biomechanical studies leads to the conclusion that the most parsimonious model for speech production, explaining all these findings in one unified set of mathematical assumptions, is a stochastic nonlinear, non-Gaussian model, which subsumes both Gaussian linear and deterministic nonlinear models. As a case study, to demonstrate the engineering value of nonlinear signal processing techniques based upon the proposed biomechanically-informed, unified model, the study investigates the biomedical engineering application of disordered voice measurement. A new state space recurrence measure is devised and combined with an existing measure of the fractal scaling properties of stochastic signals. Using a simple pattern classifier, these two measures outperform all combinations of linear methods for the detection of voice disorders on a large database of pathological and healthy vowels, making explicit the effectiveness of such biomechanically-informed, nonlinear signal processing techniques.
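For context, the classical Fourier-transform surrogate, which the thesis criticises and improves upon, preserves a signal's power spectrum while randomising its phases, so rejecting the surrogate ensemble argues against a linear Gaussian model. A minimal sketch of the classical construction (not the thesis's improved technique):

```python
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    """Fourier-transform surrogate: same power spectrum as x, randomized phases."""
    rng = np.random.default_rng() if rng is None else rng
    X = np.fft.rfft(x - np.mean(x))
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0                      # keep the DC bin real
    if len(x) % 2 == 0:
        phases[-1] = 0.0                 # keep the Nyquist bin real for even-length signals
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x)) + np.mean(x)
```

A nonlinear statistic computed on the original signal is then compared with its distribution over many such surrogates.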
APA, Harvard, Vancouver, ISO, and other styles
13

Shah, Nauman. "Statistical dynamical models of multivariate financial time series." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:428015e6-8a52-404e-9934-0545c80da4e1.

Full text
Abstract:
The last few years have witnessed an exponential increase in the availability and use of financial market data, which is sampled at increasingly high frequencies. Extracting useful information about the dependency structure of a system from these multivariate data streams has numerous practical applications and can aid in improving our understanding of the driving forces in the global financial markets. These large and noisy data sets are highly non-Gaussian in nature and require the use of efficient and accurate interaction measurement approaches for their analysis in a real-time environment. However, most frequently used measures of interaction have certain limitations to their practical use, such as the assumption of normality or computational complexity. This thesis has two major aims; firstly, to address this lack of availability of suitable methods by presenting a set of approaches to dynamically measure symmetric and asymmetric interactions, i.e. causality, in multivariate non-Gaussian signals in a computationally efficient (online) framework, and secondly, to make use of these approaches to analyse multivariate financial time series in order to extract interesting and practically useful information from financial data. Most of our proposed approaches are primarily based on independent component analysis, a blind source separation method which makes use of higher-order statistics to capture information about the mixing process which gives rise to a set of observed signals. Knowledge about this information allows us to investigate the information coupling dynamics, as well as to study the asymmetric flow of information, in multivariate non-Gaussian data streams. We extend our multivariate interaction models, using a variety of statistical techniques, to study the scale-dependent nature of interactions and to analyse dependencies in high-dimensional systems using complex coupling networks. We carry out a detailed theoretical, analytical and empirical comparison of our proposed approaches with some other frequently used measures of interaction, and demonstrate their comparative utility, efficiency and accuracy using a set of practical financial case studies, focusing primarily on the foreign exchange spot market.
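A minimal sketch of the kind of ICA decomposition the thesis builds on, using scikit-learn's FastICA; the `returns` matrix (samples by assets) is an assumed placeholder, not data from the thesis.

```python
import numpy as np
from sklearn.decomposition import FastICA

# returns: array of shape (n_samples, n_assets) of log-returns (assumed to exist)
ica = FastICA(n_components=returns.shape[1], random_state=0)
sources = ica.fit_transform(returns)   # estimated independent components over time
mixing = ica.mixing_                   # how each asset loads on each independent component
```

The loadings in `mixing` are one starting point for studying information coupling between assets.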
APA, Harvard, Vancouver, ISO, and other styles
14

Dang, Pei. "Time-frequency analysis based on mono-components." Thesis, University of Macau, 2011. http://umaclib3.umac.mo/record=b2489938.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

McLaughlin, John J. "Applications of operator theory to time-frequency analysis and classification /." Thesis, Connect to this title online; UW restricted, 1997. http://hdl.handle.net/1773/5861.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Thonet, Gilles. "New aspects of time-frequency analysis for biomedical signal processing /." Lausanne : EPFL, 1999. http://library.epfl.ch/theses/?nr=1913.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Smith, Daniel. "An analysis of blind signal separation for real time application." Access electronically, 2006. http://www.library.uow.edu.au/adt-NWU/public/adt-NWU20070815.152400/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Niethammer, Marc. "Application of time frequency representations to characterize ultrasonic signals." Thesis, Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/19005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Zhang, Xiaojun. "Analysis and pre-processing of signals observed in optical feedback self-mixing interferometry." School of Electrical, Computer and Telecommunications Engineering - Faculty of Informatics, 2008. http://ro.uow.edu.au/theses/102.

Full text
Abstract:
Since laser technology has been applied to provide highly precise measurement, laser-interferometry-based systems have found increasing application in distance and displacement measurement and related applications. Recently, a simple construction of laser interferometer using the so-called optical feedback self-mixing interferometry (OFSMI) effect has become a popular technique in the optical measurement field. In comparison with conventional interferometers, OFSMI enables simple, compact and cheap interferometer devices to be implemented. This thesis studies the spectral characteristics of OFSMI signals and outlines novel approaches to analyse and process the noisy signal in the time and frequency domains simultaneously. The work is motivated by the observation that, when the OFSMI signal is obtained in the weak feedback regime (feedback parameter C < 1), the signal is strictly bandlimited, and consequently a linear band-pass filter can be applied to remove the noise disturbance while preserving the signal waveform. On the other hand, when the OFSMI signal is obtained with C > 1, an efficient denoising algorithm based on a joint time-frequency representation (TFR) can be applied. It has been found that the TFR approach provides a sufficient perspective for studying the behaviour of OFSMI signals for C > 1. This work contributes to the framework of pre-processing and analysing OFSMI signals. The thesis focuses on the spectral characteristics and noise attenuation in the weak and moderate feedback regimes. To achieve this, the abilities of band-pass FIR filters and TFR methods in OFSMI signal processing have been evaluated and compared. The results of this work lead to a significant improvement in the performance of OFSMI-based laser measurement systems.
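A minimal sketch of the weak-feedback branch described above, i.e. linear FIR band-pass filtering with SciPy; the tap count and band edges are assumed, illustrative values.

```python
import numpy as np
from scipy.signal import firwin, filtfilt

def bandpass(x, fs, f_lo, f_hi, numtaps=201):
    """Linear-phase FIR band-pass filtering of an OFSMI-like signal (illustrative sketch)."""
    taps = firwin(numtaps, [f_lo, f_hi], pass_zero=False, fs=fs)  # band-pass design
    return filtfilt(taps, [1.0], x)                               # zero-phase filtering
```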
APA, Harvard, Vancouver, ISO, and other styles
20

Ahlström, Christer. "Nonlinear phonocardiographic Signal Processing." Doctoral thesis, Linköpings universitet, Fysiologisk mätteknik, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-11302.

Full text
Abstract:
The aim of this thesis work has been to develop signal analysis methods for a computerized cardiac auscultation system, the intelligent stethoscope. In particular, the work focuses on classification and interpretation of features derived from the phonocardiographic (PCG) signal by using advanced signal processing techniques. The PCG signal is traditionally analyzed and characterized by morphological properties in the time domain, by spectral properties in the frequency domain or by nonstationary properties in a joint time-frequency domain. The main contribution of this thesis has been to introduce nonlinear analysis techniques based on dynamical systems theory to extract more information from the PCG signal. In particular, Takens' delay embedding theorem has been used to reconstruct the underlying system's state space based on the measured PCG signal. This processing step provides a geometrical interpretation of the dynamics of the signal, whose structure can be utilized for both system characterization and classification as well as for signal processing tasks such as detection and prediction. In this thesis, the PCG signal's structure in state space has been exploited in several applications. Change detection based on recurrence time statistics was used in combination with nonlinear prediction to remove obscuring heart sounds from lung sound recordings in healthy test subjects. Sample entropy and mutual information were used to assess the severity of aortic stenosis (AS) as well as mitral insufficiency (MI) in dogs. A large number of partly nonlinear features was extracted and used for distinguishing innocent murmurs from murmurs caused by AS or MI in patients with probable valve disease. Finally, novel work related to very accurate localization of the first heart sound by means of ECG-gated ensemble averaging was conducted. In general, the presented nonlinear processing techniques have shown considerably improved results in comparison with other PCG-based techniques. In modern health care, auscultation has found its main role in primary or home health care, when deciding whether special care and more extensive examinations are required. Making a decision based on auscultation is, however, difficult, which is why a simple tool able to screen and assess murmurs would be both time- and cost-saving while relieving many patients from needless anxiety. In the emerging field of telemedicine and home care, an intelligent stethoscope with decision support abilities would be of great value.
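A minimal sketch of Takens' delay embedding, the reconstruction step the abstract refers to; the embedding dimension and delay are assumed values that in practice would be chosen from the data (e.g. via mutual information and false nearest neighbours).

```python
import numpy as np

def delay_embed(x, dim=3, tau=10):
    """Reconstruct a state-space trajectory from a scalar signal via delay embedding."""
    n = len(x) - (dim - 1) * tau                      # number of reconstructed state vectors
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
```

Recurrence statistics, sample entropy and nonlinear prediction can then be computed on the rows of the returned trajectory matrix.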
APA, Harvard, Vancouver, ISO, and other styles
21

Lopes, David Manuel Baptista. "Signal reconstruction from partial or modified linear time frequency representations." Thesis, University of Southampton, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.364726.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Dahl, Jason F. "Time Aliasing Methods of Spectrum Estimation." Diss., CLICK HERE for online access, 2003. http://contentdm.lib.byu.edu/ETD/image/etd157.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Bikdash, Marwan. "Analysis and filtering of time-varying signals." Thesis, Virginia Polytechnic Institute and State University, 1988. http://hdl.handle.net/10919/80015.

Full text
Abstract:
The characterization, analysis and filtering of a slowly time-varying (STV) deterministic signal are considered. A STV signal is characterized as a sophisticated signal whose windowed sections are elementary signals. Mixed time-frequency representations (MTFRs) such as the Wigner distribution (WD), the pseudo-Wigner distribution (PWD), the short-time Fourier transform (STFT) and the optimally smoothed Wigner distribution (OSWD) used in analyzing STV signals are analyzed and compared. The OSWD is shown to perform satisfactorily even if the signals are amplitude modulated. The OSWD is shown to yield the exact instantaneous frequency for STV signals having quadratic phase, and to have a minimal and meaningful bandwidth (BW) that does not depend on the slope of the instantaneous frequency curve in the time-frequency plane, unlike the BW of the spectrogram. We also present some contributions to the ongoing debate addressing the issue of choosing the MTFR that is best suited to the analysis of STV signals. Using analytical and experimental results, the performances of the different MTFRs are compared, and the conditions under which a given MTFR performs better are considered. The filtering of a signal from a noise-corrupted measurement, and the decomposition of a STV signal into its components in the presence of noise, are considered. These two related problems have been solved through masking the MTFRs of the measured signal. This approach has been successfully used in the case of the WD, PWD and the STFT. We propose extending the use of this approach to the OSWD. An equivalent time-domain implementation based on linear shift-variant (LSV) filters is derived and fully analyzed. It is based on the concept of local nonstationarity cancellation. The proposed filter is shown to have a superior performance when compared to the filter based on masking the STFT. The sensitivity of the filter is studied. The filter's ability to suppress white noise and to decompose a STV signal into its components is analyzed and illustrated. Master of Science.
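For illustration, masking an STFT and resynthesising, one of the approaches mentioned above (the thesis itself focuses on the OSWD and an equivalent time-domain LSV filter), might look like the following sketch; the window length and the user-supplied mask function are assumptions.

```python
import numpy as np
from scipy.signal import stft, istft

def tf_mask_filter(x, fs, keep, nperseg=256):
    """Mask the STFT of x with the boolean function keep(f, t) and resynthesise."""
    f, t, Z = stft(x, fs=fs, nperseg=nperseg)
    Z_masked = Z * keep(f, t)                 # zero out time-frequency cells outside the mask
    _, x_rec = istft(Z_masked, fs=fs, nperseg=nperseg)
    return x_rec[: len(x)]

# Example mask keeping a fixed 50-200 Hz band at all times (purely illustrative):
# keep = lambda f, t: np.outer((f > 50) & (f < 200), np.ones(len(t), dtype=bool))
```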
APA, Harvard, Vancouver, ISO, and other styles
24

Bayram, Saffet. "Overloaded Array Processing: System Analysis, Signal Extraction Techniques, and Time-delay Estimation." Thesis, Virginia Tech, 2000. http://hdl.handle.net/10919/36039.

Full text
Abstract:
In airborne communication systems such as airborne cell-extender repeaters, the receiver faces the challenge of demodulating the signal of interest (SOI) in the presence of excessive amounts of co-channel interference (CCI) from a large number of sources. This results in an overloaded environment where the number of near-equal-power co-channel interferers exceeds the number of antenna array elements. This thesis first analyzes the interference environment experienced by an airborne cellular repeater flying at high altitudes. Link budget analysis using a two-ray propagation model shows that the antenna array mounted on an airborne receiver has to recover the SOI out of hundreds of co-channel interfering signals. This necessitates the use of complex overloaded array signal processing techniques. An extensive literature survey on narrowband signal extraction algorithms shows that joint detection schemes, coupled with antenna arrays, provide a solution for the narrowband overloaded array problem where traditional beamforming techniques fail. Simulation results in this thesis investigate three promising overloaded array processing algorithms: the Multi-User Decision Feedback Equalizer (MU-DFE), Iterative Least Squares with Projection (ILSP), and Iterative Least Squares with Enumeration (ILSE). ILSE, a non-linear joint maximum-likelihood detector, is shown to demodulate many more signals than elements even when the users are closely spaced and the channel is blindly estimated. Multi-user time-delay estimation is one of the most important aspects of channel estimation for overloaded array processing. The final chapter of the thesis proposes a low-complexity data-aided time-delay estimation structure for embedding in a Per Survivor Processing (PSP) trellis for overloaded array processing. An extensive analysis proves that the multi-user delay estimation is separable, which leads to the proposed multi-user algorithm that estimates the user delays with a bank of simple data-aided synchronization loops to reduce the complexity. This thesis shows simulation results for the single-user case where the low-complexity Delay Locked Loop (DLL) structure, working at a low oversampling rate of 2 samples per symbol, estimates and compensates for any integer or non-integer sample delay within ±Tsym (symbol period). Two extensions to this technique are proposed to provide efficient multi-user delay estimation. The first multi-user structure employs a bank of DLLs, which compensate for the timing offset of each user simultaneously. This multi-user algorithm is suitable for CDMA-type applications, where each user has a distinct PN-code with good auto- and cross-correlation properties. We show that for a spreading gain of 31, the presence of an interpolator enables us to reduce the oversampling factor from 4 to 2 samples per chip. Thus, the requirements of the A/D converter are relaxed without sacrificing system performance. Furthermore, we show that the proposed scheme meets the requirements of multi-user interference cancellation techniques for residual worst-case timing errors, i.e., residual timing error < 0.2 Tc, as reported in [200]. Finally, the thesis recommends a similar multi-user structure for narrowband TDMA-type systems, which is based on a bank of DLLs with whitening pre-filters at the front end of each branch. Master of Science.
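A minimal sketch of the ILSP idea referenced above, alternating least-squares estimates with projection onto a finite symbol alphabet; the BPSK alphabet, iteration count and random initialisation are assumptions, not the thesis's configuration.

```python
import numpy as np

def ilsp(Y, num_users, alphabet=np.array([-1.0, 1.0]), iters=20, seed=0):
    """Iterative Least Squares with Projection for Y ≈ A @ S (antennas x symbols)."""
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    A = rng.standard_normal((m, num_users))       # random initial channel estimate
    for _ in range(iters):
        S = np.linalg.pinv(A) @ Y                 # least-squares symbol estimate
        S = alphabet[np.argmin(np.abs(S[..., None] - alphabet), axis=-1)]  # project to alphabet
        A = Y @ np.linalg.pinv(S)                 # least-squares channel estimate
    return A, S
```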
APA, Harvard, Vancouver, ISO, and other styles
25

Bekker, Scott Henry. "Continuous real-time recovery of optical spectral features distorted by fast-chirped readout." Thesis, Montana State University, 2006. http://etd.lib.montana.edu/etd/2006/bekker/BekkerS0506.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Rankine, Luke. "Newborn EEG seizure detection using adaptive time-frequency signal processing." Queensland University of Technology, 2006. http://eprints.qut.edu.au/16200/.

Full text
Abstract:
Dysfunction in the central nervous system of the neonate is often first identified through seizures. The difficulty in detecting clinical seizures, which involves the observation of physical manifestations characteristic of newborn seizure, has placed greater emphasis on the detection of newborn electroencephalographic (EEG) seizure. The high incidence of newborn seizure has resulted in considerable mortality and morbidity rates in the neonate. Accurate and rapid diagnosis of neonatal seizure is essential for proper treatment and therapy. This has impelled researchers to investigate possible methods for the automatic detection of newborn EEG seizure. This thesis is focused on the development of algorithms for the automatic detection of newborn EEG seizure using adaptive time-frequency signal processing. The assessment of newborn EEG seizure detection algorithms requires large datasets of nonseizure and seizure EEG which are not always readily available and often hard to acquire. This has led to the proposition of realistic models of newborn EEG which can be used to create large datasets for the evaluation and comparison of newborn EEG seizure detection algorithms. In this thesis, we develop two simulation methods which produce synthetic newborn EEG background and seizure. The simulation methods use nonlinear and time-frequency signal processing techniques to allow for the demonstrated nonlinear and nonstationary characteristics of the newborn EEG. Atomic decomposition techniques incorporating redundant time-frequency dictionaries are exciting new signal processing methods which deliver adaptive signal representations or approximations. In this thesis we have investigated two prominent atomic decomposition techniques, matching pursuit and basis pursuit, for their possible use in an automatic seizure detection algorithm. In our investigation, it was shown that matching pursuit generally provided the sparsest (i.e. most compact) approximation for various real and synthetic signals over a wide range of signal approximation levels. For this reason, we chose MP as our preferred atomic decomposition technique for this thesis. A new measure, referred to as structural complexity, which quantifies the level or degree of correlation between signal structures and the decomposition dictionary, was proposed. Using the change in structural complexity, a generic method of detecting changes in signal structure was proposed. This detection methodology was then applied to the newborn EEG for the detection of state transitions (i.e. nonseizure to seizure state) in the EEG signal. To optimize the seizure detection process, we developed a time-frequency dictionary that is coherent with the newborn EEG seizure state based on the time-frequency analysis of the newborn EEG seizure. It was shown that using the new coherent time-frequency dictionary and the change in structural complexity, we can detect the transition from nonseizure to seizure states in synthetic and real newborn EEG. Repetitive spiking in the EEG is a classic feature of newborn EEG seizure. Therefore, the automatic detection of spikes can be fundamental in the detection of newborn EEG seizure. The capacity of two adaptive time-frequency signal processing techniques to detect spikes was investigated. It was shown that a relationship between the EEG epoch length and the number of repetitive spikes governs the ability of both matching pursuit and the adaptive spectrogram in detecting repetitive spikes. However, it was demonstrated that this law was less restrictive for the adaptive spectrogram, which was shown to outperform matching pursuit in detecting repetitive spikes. The method of adapting the window length associated with the adaptive spectrogram used in this thesis was the maximum correlation criterion. It was observed that for the time instants where signal spikes occurred, the optimal window lengths selected by the maximum correlation criterion were small. Therefore, spike detection directly from the adaptive window optimization method was demonstrated and also shown to outperform matching pursuit. An automatic newborn EEG seizure detection algorithm was proposed based on the detection of repetitive spikes using the adaptive window optimization method. The algorithm shows excellent performance with real EEG data. A comparison of the proposed algorithm with four well documented newborn EEG seizure detection algorithms is provided. The results of the comparison show that the proposed algorithm has significantly better performance than the existing algorithms (i.e. our proposed algorithm achieved a good detection rate (GDR) of 94% and a false detection rate (FDR) of 2.3%, compared with the leading algorithm which only produced a GDR of 62% and an FDR of 16%). In summary, the novel contribution of this thesis to the fields of time-frequency signal processing and biomedical engineering is the successful development and application of sophisticated algorithms based on adaptive time-frequency signal processing techniques to the solution of automatic newborn EEG seizure detection.
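As a reference point for the atomic decomposition discussion, a bare-bones matching pursuit over a generic dictionary might look like the sketch below; the dictionary D (unit-norm columns, e.g. Gabor atoms) and the atom count are assumptions, and this is not the coherent seizure dictionary developed in the thesis.

```python
import numpy as np

def matching_pursuit(x, D, n_atoms=10):
    """Greedy matching pursuit: approximate x as a sparse combination of dictionary columns.
    D has shape (n_samples, n_atoms_in_dictionary) with unit-norm columns."""
    residual = x.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_atoms):
        corr = D.T @ residual                 # correlate the residual with every atom
        k = np.argmax(np.abs(corr))           # pick the best-matching atom
        coeffs[k] += corr[k]
        residual -= corr[k] * D[:, k]         # remove its contribution
    return coeffs, residual
```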
APA, Harvard, Vancouver, ISO, and other styles
27

Sodagar, Iraj. "Analysis and design of time-varying filter banks." Diss., Georgia Institute of Technology, 1994. http://hdl.handle.net/1853/13437.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Marti, Gautier. "Some contributions to the clustering of financial time series and applications to credit default swaps." Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLX097/document.

Full text
Abstract:
In this thesis we first review the scattered literature about clustering financial time series. We then try to give as much colour as possible on the credit default swap market, a market relatively unknown to the general public except for its role in the contagion of bank failures during the global financial crisis of 2007-2008, while introducing the datasets that have been used in the empirical studies. Unlike the existing body of literature, which mostly offers descriptive studies, we aim at building models and large information systems based on clusters which are seen as basic building blocks: these foundations must be stable. That is why the work undertaken and described in the following intends to further ground the clustering methodologies. For that purpose, we discuss their consistency and propose alternative measures of similarity that can be plugged into the clustering methodologies. We study empirically their impact on the clusters. Results of the empirical studies can be explored at www.datagrapple.com.
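A minimal sketch of correlation-based hierarchical clustering of return series, in the spirit of the methodologies discussed; the distance, linkage and cluster count are assumed choices, not the thesis's proposed measures.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def cluster_series(returns, n_clusters=4):
    """Cluster time series (rows of `returns`) with a correlation distance and average linkage."""
    corr = np.corrcoef(returns)
    dist = np.sqrt(0.5 * (1.0 - corr))          # correlation-based distance
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")
```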
APA, Harvard, Vancouver, ISO, and other styles
29

Hartquist, John E. "Real-time Musical Analysis of Polyphonic Guitar Audio." DigitalCommons@CalPoly, 2012. https://digitalcommons.calpoly.edu/theses/808.

Full text
Abstract:
In this thesis, we analyze the audio signal of a guitar to extract musical data in real-time. Specifically, the pitch and octave of notes and chords are displayed over time. Previous work has shown that non-negative matrix factorization is an effective method for classifying the pitches of simultaneous notes. We explore the effect of window size, hop length, and other parameters to maximize the resolution and accuracy of the output. Other groups have required prerecorded note samples to build a library of note templates to search for. We automate this step and compute the library at run-time, tuning it specifically for the input guitar. The program we present generates a musical visualization of the results in addition to suggestions for fingerings of chords in the form of a fretboard display and tablature notation. This program is built as an applet and is accessible from the web browser.
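A minimal sketch of the spectrogram factorisation step, assuming SciPy and scikit-learn; the `audio` array, sampling rate `fs`, window sizes and component count are placeholders rather than the parameters tuned in the thesis.

```python
import numpy as np
from scipy.signal import stft
from sklearn.decomposition import NMF

def note_templates(audio, fs, n_components=12):
    """Factor a magnitude spectrogram into spectral templates and their activations."""
    f, t, Z = stft(audio, fs=fs, nperseg=4096, noverlap=3072)
    V = np.abs(Z)                                   # non-negative magnitude spectrogram
    model = NMF(n_components=n_components, init="nndsvda", max_iter=400)
    W = model.fit_transform(V)                      # columns: spectral template per component
    H = model.components_                           # rows: activation of each template per frame
    return f, t, W, H
```

Peaks of each column of W indicate the harmonic series of a note, while the rows of H show when that note is active.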
APA, Harvard, Vancouver, ISO, and other styles
30

Kyriazis, Panagiotis A. "Analysis and processing of mechanically stimulated electrical signals for the identification of deformation in brittle materials." Thesis, Brunel University, 2010. http://bura.brunel.ac.uk/handle/2438/4604.

Full text
Abstract:
The fracture of brittle materials is of utmost importance for civil engineering and seismology applications. A different approach towards the aim of early identification of fracture and the prediction of failure before it occurs is attempted in this work. Laboratory experiments were conducted on a variety of rock and cement-based material specimens of various shapes and sizes. The applied loading schemes were cyclic or increasing, and the specimens were subjected to compression and bending type loading of various levels. The techniques of Pressure Stimulated Current and Bending Stimulated Current were used for the detection of electric signal emissions during the various deformation stages of the specimens. The detected signals were analysed macroscopically and microscopically so as to find suitable criteria for fracture prediction and correlation between the electrical and mechanical parameters. The macroscopic proportionality of the mechanically stimulated electric signal and the strain was experimentally verified, the macroscopic trends of the PSC and BSC electric signals were modelled, and the effects of material memory on the electric signals were examined. The current of a time-varying RLC electric circuit was tested against experimental data with satisfactory results, and it was proposed as an electrical equivalent model. Wavelet-based analysis of the signal revealed the correlation between the frequency components of the electric signal and the deformation stages of the material samples. In particular, the increase of the high-frequency component of the electric signal seems to be a good precursor of the macrocracking initiation point. The additional electric stimulus of a DC voltage application seems to boost the frequency content of the signal and reveals the stages of the cracking process more clearly. The microscopic analysis method is scale-free and thus can cope with the problems of size effects and material property effects. The AC conductivity time series of fractured and pristine specimens were also analysed by means of the wavelet transform, and spectral analysis was used to differentiate between the specimens. A non-destructive technique may be based on these results. Analysis has shown that the electric signal perturbation is an indicator of forthcoming fracture, as well as of fracture that has already occurred in specimens.
APA, Harvard, Vancouver, ISO, and other styles
31

Kosek, Paul C. "Improved analysis of musical sounds using time-frequency distributions." Thesis, McGill University, 2005. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=83189.

Full text
Abstract:
The objective of this research is to improve the analysis of musical sounds in comparison to traditional additive analysis, i.e. Fourier Analysis. Namely, the focus of this study is to improve the tracking of time-evolving partials. Traditional analysis methods assume constant amplitudes and frequencies over each successive frame in which a signal is analyzed. Tracking the time-evolution of these partials, however, can require the implementation of complex probabilistic techniques. This thesis presents an alternative method in which the Ambiguity Function, a distribution in both time and frequency, is used to create a clearer, more accurate representation that requires fewer complex methods to track partials. Through the use of a more accurate spectral representation and the inclusion of a chirp rate parameter, partials may be more readily followed based upon spectral parameters alone. This new method that is presented will build upon the traditional methods by first employing Fourier analysis to identify partials, and then utilizing the Analytic Signal and Ambiguity Function to improve individual spectral parameter estimations and partial tracking. The overall intent of this work is that through this method, one may create an improved spectral model that is more useful to musical analysis.
APA, Harvard, Vancouver, ISO, and other styles
32

Nayebi, Kambiz. "A time domain framework for the analysis and design of FIR multirate filter bank systems." Diss., Georgia Institute of Technology, 1990. http://hdl.handle.net/1853/13867.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Hung, Roy. "Time domain analysis and synthesis of cello tones based on perceptual quality and playing gestures /." Hong Kong : University of Hong Kong, 1998. http://sunzi.lib.hku.hk/hkuto/record.jsp?B20665672.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Shandilya, Sharad. "ASSESSMENT AND PREDICTION OF CARDIOVASCULAR STATUS DURING CARDIAC ARREST THROUGH MACHINE LEARNING AND DYNAMICAL TIME-SERIES ANALYSIS." VCU Scholars Compass, 2013. http://scholarscompass.vcu.edu/etd/3198.

Full text
Abstract:
In this work, new methods of feature extraction, feature selection, stochastic data characterization/modeling, variance reduction and measures for parametric discrimination are proposed. These methods have implications for data mining, machine learning, and information theory. A novel decision-support system is developed in order to guide intervention during cardiac arrest. The models are built upon knowledge extracted with signal-processing, non-linear dynamic and machine-learning methods. The proposed ECG characterization, combined with information extracted from PetCO2 signals, shows viability for decision-support in clinical settings. The approach, which focuses on integration of multiple features through machine learning techniques, suits well to inclusion of multiple physiologic signals. Ventricular Fibrillation (VF) is a common presenting dysrhythmia in the setting of cardiac arrest whose main treatment is defibrillation through direct current countershock to achieve return of spontaneous circulation. However, often defibrillation is unsuccessful and may even lead to the transition of VF to more nefarious rhythms such as asystole or pulseless electrical activity. Multiple methods have been proposed for predicting defibrillation success based on examination of the VF waveform. To date, however, no analytical technique has been widely accepted. For a given desired sensitivity, the proposed model provides a significantly higher accuracy and specificity as compared to the state-of-the-art. Notably, within the range of 80-90% of sensitivity, the method provides about 40% higher specificity. This means that when trained to have the same level of sensitivity, the model will yield far fewer false positives (unnecessary shocks). Also introduced is a new model that predicts recurrence of arrest after a successful countershock is delivered. To date, no other work has sought to build such a model. I validate the method by reporting multiple performance metrics calculated on (blind) test sets.
APA, Harvard, Vancouver, ISO, and other styles
35

Darrington, John Mark. "Real time extraction of ECG fiducial points using shape based detection." University of Western Australia. School of Computer Science and Software Engineering, 2009. http://theses.library.uwa.edu.au/adt-WU2009.0152.

Full text
Abstract:
The electrocardiograph (ECG) is a common clinical and biomedical research tool used for both diagnostic and prognostic purposes. In recent years computer aided analysis of the ECG has enabled cardiographic patterns to be found which were hitherto not apparent. Many of these analyses rely upon the segmentation of the ECG into separate time delimited waveforms. The instants delimiting these segments are called the fiducial points.
APA, Harvard, Vancouver, ISO, and other styles
36

Purkayastha, Pratik. "Diagnostics and Prognostics of safety critical systems using machine learning, time and frequency domain analysis." Thesis, Blekinge Tekniska Högskola, Institutionen för tillämpad signalbehandling, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17603.

Full text
Abstract:
The prime focus of this thesis was to develop a robust Prognostic and Diagnostic Health Management (PDHM) module, capable of detecting faults, classifying faults, tracking fault progression and estimating time to failure. The priority was to obtain as much accuracy as possible with the bare minimum number of sensors. Algorithms such as k-Nearest Neighbors (k-NN) and linear and non-linear regression were deployed, and a rule engine was developed to identify safe operating limits. The entire solution was developed using R (v 3.5.0). An accuracy of around 98% was obtained in diagnostics. For prognostics, our ability to predict time to failure more accurately increases with time. Some balance must be struck between the learning horizon and the prediction horizon in order to get good predictions with reasonable time left before catastrophic failure. In conclusion, the PDHM module works as desired and makes predictive maintenance, smart replacement and crisis prediction possible, ensuring the safety and security of people on board and of assets.
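The thesis itself was implemented in R; purely as an illustration of the k-NN diagnostic step, a Python sketch with placeholder feature matrix `X` and fault labels `y` might look like this.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# X: matrix of sensor-derived features, y: fault-class labels (assumed to exist)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("diagnostic accuracy:", clf.score(X_test, y_test))   # fraction of correctly classified faults
```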
APA, Harvard, Vancouver, ISO, and other styles
37

Upperman, Gary J. "Implementation of a cyclostationary spectral analysis algorithm on an SRC reconfigurable computer for real-time signal processing." Thesis, Monterey, Calif. : Naval Postgraduate School, 2008. http://bosun.nps.edu/uhtbin/hyperion-image.exe/08Mar%5FUpperman%5FGary.pdf.

Full text
Abstract:
Thesis (M.S. in Electrical Engineering), Naval Postgraduate School, March 2008. Thesis advisors: Douglas J. Fouts and Phillip E. Pace. Includes bibliographical references (pp. 101-102). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
38

Lee, Sang-Kwon. "Adaptive signal processing and higher order time frequency analysis for acoustic and vibration signatures in condition monitoring." Thesis, University of Southampton, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.242731.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Bartůšek, Jan. "Time Frequency Analysis of ERP Signals." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2007. http://www.nusl.cz/ntk/nusl-412769.

Full text
Abstract:
This thesis deals with improving an algorithm for clustering ERP signals by analysing the temporal and spatial properties of pseudo-signals obtained using Independent Component Analysis. Our interest is in finding new features that would improve the existing results. The thesis examines the use of the Fourier transform, FIR filtering and the short-time Fourier transform to enhance the information available to the clustering algorithms. The principle and applicability of the method are described and demonstrated with an example algorithm. The results showed that the method can extract interesting information from the input data, which can be successfully used to improve the clustering results.
APA, Harvard, Vancouver, ISO, and other styles
40

Parry, Robert Mitchell. "Separation and Analysis of Multichannel Signals." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/19743.

Full text
Abstract:
Music recordings contain the mixed contribution of multiple overlapping instruments. In order to better understand the music, it would be beneficial to understand each instrument independently. This thesis focuses on separating the individual instrument recordings within a song. In particular, we propose novel algorithms for separating instrument recordings given only their mixture. When the number of source signals does not exceed the number of mixture signals, we focus on a subclass of source separation algorithms based on joint diagonalization. Each approach leverages a different form of source structure. We introduce repetitive structure as an alternative that leverages unique repetition patterns in music and compare its performance against the other techniques. When the number of source signals exceeds the number of mixtures (i.e. the underdetermined problem), we focus on spectrogram factorization techniques for source separation. We extend single-channel techniques to utilize the additional spatial information in multichannel recordings, and use phase information to improve the estimation of the underlying components.
APA, Harvard, Vancouver, ISO, and other styles
41

Löfgren, Isabelle. "Interharmonic Analysis of Sustainable Energy Sources and Loads : Comparing two signal processing methods for estimation of interharmonics." Thesis, Högskolan Dalarna, Energiteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:du-34236.

Full text
Abstract:
In this report, studies on interharmonics from three different measurement sites are performed. The first site is a wind park with three turbines, where the measurements are performed at the point of common coupling of the three. The second site is a network which consists of a PV inverter and two types of EV chargers: a DC charger and an AC charger. Measurements are performed with three different set-ups at this site: only the AC charger connected, only the DC charger connected, and the AC charger and PV inverter connected simultaneously. The third site where measurements were made is a microgrid using frequency control to signal how the microgrid should operate at the moment. The interharmonic analysis was conducted using the desynchronized processing technique (DP) and Sliding-Window Estimation of Signal Parameters via Rotational Invariance Techniques (Sliding-Window ESPRIT or SWESPRIT). The result from the wind park is that closely and evenly spaced interharmonics can be seen when the current suddenly increases (possibly due to fast variations in wind speed). It is, however, uncertain whether these interharmonics are caused by spectral leakage, since SWESPRIT estimates the fundamental frequency to vary drastically when the wind speed varies. It is observed that the SWESPRIT estimation of the fundamental frequency could be affected by sudden changes in phase angle as the current varies. Further investigation and analysis are needed. The result from the measurements at the site with EV chargers and a PV inverter is that eight distinct patterns can be observed. Some patterns appear to come from the upstream grid, while some appear to be caused by either one of the EV chargers or the PV inverter, or by interaction between them. Further studies are needed. The result from the microgrid measurements is that two distinct patterns at high frequencies (above 1000 Hz) can be observed during grid-connected mode and island mode, respectively. During transitions between grid connection and island mode, or vice versa, the fundamental frequency varies drastically, and it is therefore hard to analyse potential interharmonics and draw inferences. Further studies are needed. Advantages and disadvantages, as well as ideas for improvements, of the two applied signal processing methods are discussed throughout the different case studies.
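For orientation, a bare-bones (non-sliding-window) ESPRIT frequency estimator is sketched below; the subspace size L and model order p are assumed parameters, and for a real-valued waveform each tone occupies two complex exponentials, so p should be set accordingly.

```python
import numpy as np

def esprit(x, p, L=100):
    """Estimate p normalized frequencies (cycles/sample) from signal x via basic ESPRIT."""
    X = np.lib.stride_tricks.sliding_window_view(x, L).T   # L x (N-L+1) data matrix
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    Us = U[:, :p]                                          # signal subspace
    Phi = np.linalg.pinv(Us[:-1]) @ Us[1:]                 # rotational invariance between shifts
    return np.angle(np.linalg.eigvals(Phi)) / (2 * np.pi)  # frequency of each component
```

Interharmonics then appear as estimated frequencies that are not integer multiples of the fundamental.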
APA, Harvard, Vancouver, ISO, and other styles
42

Yang, Zhenghong. "Joint time frequency analysis of Global Positioning System (GPS) multipath signals." Ohio : Ohio University, 1998. http://www.ohiolink.edu/etd/view.cgi?ohiou1176234303.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Wang, Yan Bo. "Adaptive decomposition of signals into mono-components." Thesis, University of Macau, 2010. http://umaclib3.umac.mo/record=b2489954.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Guan, Yunpeng. "Velocity Synchronous Approaches for Planetary Gearbox Fault Diagnosis under Non-Stationary Conditions." Thesis, Université d'Ottawa / University of Ottawa, 2019. http://hdl.handle.net/10393/38636.

Full text
Abstract:
Time-frequency methods are widely used tools for diagnosing planetary gearbox faults under non-stationary conditions. However, the existing time-frequency methods still have problems, such as the smearing effect and cross-term interference, and these problems limit their effectiveness in planetary gearbox fault diagnosis under non-stationary conditions. To address these problems, four time-frequency methods are proposed in this thesis. As a large portion of industrial equipment is nowadays equipped with tachometers, the first three methods are for cases in which the shaft rotational speed is easily accessible, and the last method is for cases in which it is not. The proposed methods are as follows: (1) the velocity synchronous short-time Fourier transform (VSSTFT), a linear transform based on domain mappings and the short-time Fourier transform, which addresses the smearing effect of existing linear transforms under known time-varying speed conditions; (2) the velocity synchrosqueezing transform (VST), a remapping method based on the domain mapping and the synchrosqueezing transform, which addresses the smearing effect of existing remapping methods under known time-varying speed conditions; (3) the velocity synchronous bilinear distribution (VSBD), a bilinear distribution based on generalized demodulation and Cohen's class of bilinear distributions, which addresses the smearing effect and cross-term interference of existing bilinear distributions under known time-varying speed conditions; and (4) the velocity synchronous linear chirplet transform (VSLCT), a non-parametric approach combining a linear transform with concentration-index-guided parameter determination, which provides a smear-free and cross-term-free time-frequency representation (TFR) under unknown time-varying speed conditions. In this work, simple algorithms are developed to avoid the signal resampling step required by the domain mappings or demodulations of the first three methods (the VSSTFT, VST and VSBD). These methods are designed to have different resolutions, readabilities, noise tolerances and computational efficiencies, and are therefore suited to different application conditions. The VSLCT, as a linear transform, is designed for unknown rotational speed conditions. It uses a set of shaft-rotational-speed-synchronous bases to address the smearing problem and can dynamically determine the signal processing parameters (window length and normalized angle) to provide a clear TFR with the desired time-frequency resolution in response to condition variations. All of the proposed methods are smear-free and cross-term-free, and the TFRs they generate are clearer and more precise than those of existing time-frequency methods. Faults in planetary gearboxes, if any, can be diagnosed by identifying the fault-induced components in the obtained TFRs. All four methods are newly applied to fault diagnosis, and their effectiveness has been validated using both simulated and experimental vibration signals of planetary gearboxes collected under non-stationary conditions.
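The domain-mapping idea underlying these methods can be illustrated with conventional computed order tracking: resampling the vibration signal from uniform time to uniform shaft angle using the measured speed, so that speed-proportional components stop smearing in the spectrum. The sketch below shows only that baseline (the proposed methods specifically avoid the explicit resampling step); the speed profile, shaft orders and sampling parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 5000.0
t = np.arange(0, 4.0, 1 / fs)
speed_hz = 10.0 + 5.0 * t                          # shaft speed ramps 10 -> 30 Hz
phase = 2 * np.pi * np.cumsum(speed_hz) / fs       # shaft angle in radians
x = np.sin(12 * phase) + 0.3 * np.sin(27 * phase)  # 12th and 27th shaft orders

# Domain mapping: resample from uniform time to uniform shaft angle.
revs = phase / (2 * np.pi)
samples_per_rev = 64
rev_grid = np.arange(0.0, revs[-1], 1.0 / samples_per_rev)
x_angle = np.interp(rev_grid, revs, x)

# An FFT in the angle domain gives an order spectrum free of speed smearing.
spectrum = np.abs(np.fft.rfft(x_angle * np.hanning(len(x_angle))))
orders = np.fft.rfftfreq(len(x_angle), d=1.0 / samples_per_rev)
peaks, _ = find_peaks(spectrum, height=0.2 * spectrum.max())
print(orders[peaks])                               # expected near orders 12 and 27
```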
APA, Harvard, Vancouver, ISO, and other styles
45

Björk, Anders. "Chemometric and signal processing methods for real time monitoring and modeling : applications in the pulp and paper industry." Doctoral thesis, KTH, Kemi, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4383.

Full text
Abstract:
In the production of paper, the quality of the pulp is an important factor both for productivity and for the final product quality. Reliable real-time measurements of pulp quality are therefore needed. One option is to use acoustic or vibration sensors, which give information-rich signals, placed at suitable locations in a pulp production line. However, these sensors are not selective for the pulp properties of interest, so advanced signal processing and multivariate calibration are essential tools. The current work has focused on the development of calibration routes for extracting information from acoustic sensors and on signal processing algorithms for enhancing the information selectivity for a specific pulp property or class of properties. Multivariate analysis methods such as Principal Component Analysis (PCA), Partial Least Squares (PLS) and Orthogonal Signal Correction (OSC) have been used for visualization and calibration. Signal processing methods such as the Fast Fourier Transform (FFT), Fast Wavelet Transform (FWT) and Continuous Wavelet Transform (CWT) have been used in the development of novel signal processing algorithms for extracting information from vibration/acoustic sensors. It is shown that using OSC combined with PLS to predict Canadian Standard Freeness (CSF) from FFT spectra of vibration data from a Thermo Mechanical Pulping (TMP) process gives lower prediction errors and a more parsimonious model than PLS alone. The combination of FFT and PLS was also used for monitoring the beating of kraft pulp and for screen monitoring. When regular FFT spectra of process acoustic data are used, the obtained information tends to overlap. To circumvent this, two new signal processing methods were developed: Wavelet Transform Multi Resolution Spectra (WT-MRS), based on the combination of FWT and FFT, and Continuous Wavelet Transform Fibre Length Extraction (CWT-FLE), based on the CWT. Applying WT-MRS gave PLS models that were more parsimonious, with lower prediction error for CSF, than those based on regular FFT spectra. For a Medium Consistency (MC) pulp stream, WT-MRS gave prediction errors comparable to those of the reference methods for CSF and brightness. The CWT-FLE method was validated against a commercial fibre length analyzer and good agreement was obtained; the CWT-FLE curves could therefore be used instead of other fibre distribution curves for process control. Furthermore, the CWT-FLE curves were used for PLS modelling of tensile strength and optical parameters with good results. In addition to these results, a comprehensive overview of technologies used with acoustic sensors and related applications has been compiled.
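As an illustration of the calibration route described above, the sketch below fits a PLS regression model to FFT magnitude spectra and reports its prediction error. The data are synthetic placeholders standing in for sensor frames and a pulp property such as CSF; the OSC pre-filtering and the WT-MRS/CWT-FLE features are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_frames, frame_len = 200, 1024
frames = rng.normal(size=(n_frames, frame_len))                   # stand-in sensor frames
X = np.abs(np.fft.rfft(frames * np.hanning(frame_len), axis=1))   # FFT magnitude spectra
y = X[:, 40:60].mean(axis=1) + 0.1 * rng.normal(size=n_frames)    # stand-in "CSF" values

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
rmsep = np.sqrt(np.mean((pls.predict(X_test).ravel() - y_test) ** 2))
print(f"RMSEP on held-out frames: {rmsep:.3f}")
```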
APA, Harvard, Vancouver, ISO, and other styles
46

Brockman, Erik. "TIME-FREQUENCY ANALYSIS OF INTRACARDIAC ELECTROGRAM." DigitalCommons@CalPoly, 2009. https://digitalcommons.calpoly.edu/theses/188.

Full text
Abstract:
The Cardiac Rhythm Management Division of St. Jude Medical specializes in the development of implantable cardioverter defibrillators that improve the quality of life for patients diagnosed with a variety of cardiac arrhythmias, especially patients prone to sudden cardiac death. With the goal of improving the detection of cardiac arrhythmias, this study explored the value of time-frequency analysis of intracardiac electrograms in four steps. The first two steps characterized, in the frequency domain, the waveforms that make up the cardiac cycle. The third step developed a new algorithm intended to provide the least computationally expensive way of identifying cardiac waveforms in the frequency domain. Lastly, this novel approach to analyzing intracardiac electrograms was compared with a threshold-crossing algorithm that operates strictly in the time domain and is currently used by St. Jude Medical. The new algorithm proved equally effective at identifying the QRS complex on the ventricular channel. The next steps in pursuing time-frequency analysis of intracardiac electrograms include implementing the new algorithm on a testing platform that emulates the latest implantable cardioverter defibrillator manufactured by St. Jude Medical and developing a similar algorithm that can be employed on the atrial channel.
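A generic way to identify the QRS complex in the frequency domain is to track the short-time energy in a band where the complex concentrates. The sketch below illustrates that idea only; the 10-25 Hz band, window length and threshold are illustrative assumptions (borrowed from surface-ECG practice), and this is neither the thesis's algorithm nor the St. Jude Medical detector.

```python
import numpy as np
from scipy.signal import stft, find_peaks

def detect_qrs(ecg, fs, band=(10.0, 25.0), win_s=0.10):
    """Return approximate QRS detection times (s) from band energy per STFT frame."""
    nperseg = int(win_s * fs)
    f, t, Z = stft(ecg, fs, nperseg=nperseg, noverlap=nperseg // 2)
    in_band = (f >= band[0]) & (f <= band[1])
    energy = np.sum(np.abs(Z[in_band]) ** 2, axis=0)     # band energy per frame
    # Peaks well above the median band energy, at least 200 ms apart.
    min_dist = max(1, int(0.2 / (t[1] - t[0])))
    peaks, _ = find_peaks(energy, height=4 * np.median(energy), distance=min_dist)
    return t[peaks]
```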
APA, Harvard, Vancouver, ISO, and other styles
47

Hill, Adam J. "Analysis, modeling and wide-area spatiotemporal control of low-frequency sound reproduction." Thesis, University of Essex, 2012. http://hdl.handle.net/10545/230034.

Full text
Abstract:
This research aims to develop a low-frequency response control methodology capable of delivering a consistent spectral and temporal response over a wide listening area. Low-frequency room acoustics are naturally plagued by room modes, the result of standing waves at frequencies whose half-wavelengths fit an integer number of times into one or more room dimensions. The standing wave pattern is different for each modal frequency, producing a complicated sound field with a highly position-dependent frequency response. Enhanced systems with multiple degrees of freedom (independently controllable sound-radiating sources) are investigated to provide adequate low-frequency response control. The proposed solution, termed a chameleon subwoofer array (CSA), adopts the most advantageous aspects of existing room-mode correction methodologies while emphasizing efficiency and practicality. Multiple degrees of freedom are ideally achieved by employing what is designated a hybrid subwoofer, which provides four orthogonal degrees of freedom configured within a modest-sized enclosure. The CSA software algorithm integrates both objective and subjective measures to address listener preferences, including the possibility of individual real-time control. CSAs and existing techniques are evaluated within a novel acoustical modeling system (an FDTD simulation toolbox) developed to meet the requirements of this research. Extensive virtual development of CSAs has led to experimentation using a prototype hybrid subwoofer. The resulting performance is in line with the simulations: variance across a wide listening area is reduced by over 50% with only four degrees of freedom. A supplemental correction algorithm addresses correction issues in select narrow frequency bands. These frequencies are filtered from the signal and replaced using virtual bass, a psychoacoustic effect that gives the impression of low-frequency content while retaining the aural information. Virtual bass is synthesized using an original hybrid approach that combines two mainstream synthesis procedures while suppressing each method's inherent weaknesses. This algorithm is demonstrated to improve CSA output efficiency while maintaining acceptable subjective performance.
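The modal behaviour described above can be reproduced with even a one-dimensional FDTD model: a rigid-ended space of length L develops pressure peaks near f_n = n·c/(2L). The sketch below is such a minimal 1-D simulation, with an illustrative grid, excitation and probe position and the air density normalised to 1; it is far simpler than the thesis's FDTD toolbox and involves no CSA optimisation.

```python
import numpy as np

c, L = 343.0, 6.0                      # speed of sound (m/s), room length (m)
nx = 120                               # spatial cells
dx = L / nx
dt = 0.9 * dx / c                      # time step within the Courant limit
steps = 20000

p = np.zeros(nx)                       # pressure at cell centres
u = np.zeros(nx + 1)                   # particle velocity at cell faces (staggered grid)
probe = np.zeros(steps)
src = nx // 10                         # excitation position near one end

for n in range(steps):
    p[src] += np.exp(-0.5 * ((n - 200) / 40.0) ** 2)   # soft Gaussian pulse source
    u[1:-1] -= (dt / dx) * (p[1:] - p[:-1])            # rigid ends: u stays 0 there
    p -= (dt / dx) * c**2 * (u[1:] - u[:-1])
    probe[n] = p[int(0.85 * nx)]

freqs = np.fft.rfftfreq(steps, dt)
spectrum = np.abs(np.fft.rfft(probe * np.hanning(steps)))
band = (freqs > 20) & (freqs < 40)
print(freqs[band][np.argmax(spectrum[band])])          # expect ~28.6 Hz, i.e. c / (2L)
```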
APA, Harvard, Vancouver, ISO, and other styles
48

Britton, Matthew Scott. "Stochastic task scheduling in time-critical information delivery systems." Title page, contents and abstract only, 2003. http://web4.library.adelaide.edu.au/theses/09PH/09phb8629.pdf.

Full text
Abstract:
"January 2003" Includes bibliographical references (leaves 120-129) Presents performance analyses of dynamic, stochastic task scheduling policies for a real- time-communications system where tasks lose value as they are delayed in the system.
APA, Harvard, Vancouver, ISO, and other styles
49

洪觀宇 and Roy Hung. "Time domain analysis and synthesis of cello tones based on perceptual quality and playing gestures." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1998. http://hub.hku.hk/bib/B31215348.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Harris, Jack D. "Online source separation in reverberant environments exploiting known speaker locations." Thesis, Loughborough University, 2015. https://dspace.lboro.ac.uk/2134/19627.

Full text
Abstract:
This thesis concerns blind source separation techniques for reverberant environments using second-order and higher-order statistics. A focus of the thesis is algorithmic simplicity, with a view to the algorithms being implemented in their online forms. The main challenge for blind source separation applications is handling reverberant acoustic environments; a further complication is changes in the acoustic environment, such as when human speakers physically move. A novel time-domain method is proposed which utilises a pair of finite impulse response filters. The method of principal angles is defined, which exploits a singular value decomposition for their design. The pair of filters is implemented within a generalised sidelobe canceller structure, so the method can be considered a beamforming method that cancels one source. An adaptive filtering stage is then employed to recover the remaining source, exploiting the output of the beamforming stage as a noise reference. A common approach to blind source separation is to use methods based on higher-order statistics, such as independent component analysis. When dealing with realistic convolutive audio and speech mixtures, processing in the frequency domain at each frequency bin is required, which introduces the permutation problem, inherent in independent component analysis, across the frequency bins. Independent vector analysis directly addresses this issue by modelling the dependencies between frequency bins, namely by making use of a source vector prior. An alternative source prior for real-time (online) natural gradient independent vector analysis is proposed. A Student's t probability density function, known to be better suited to speech sources because of its heavier tails, is incorporated into a real-time version of natural gradient independent vector analysis. The final algorithm is realised as a real-time embedded application on a floating-point Texas Instruments digital signal processor platform. Moving sources, along with reverberant environments, cause significant problems in realistic source separation systems, as the mixing filters become time-variant. A method employing the pair of cancellation filters to cancel one source, coupled with an online natural gradient independent vector analysis technique, is proposed to improve average separation performance in the context of step-wise moving sources. This addresses 'dips' in performance when sources move, and results show that the average convergence time of the performance parameters is improved. The online methods introduced in the thesis are tested using impulse responses measured in reverberant environments, demonstrating their robustness, and are shown to perform better than established methods in a variety of situations.
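For context, the sketch below shows a batch natural-gradient IVA update with the common spherical Laplacian source prior, where the score function couples all frequency bins and thereby avoids the permutation problem. The thesis's contributions (the Student's t prior and the online, embedded implementation) are not reproduced, and the input STFT tensor, step size and iteration count are illustrative assumptions.

```python
import numpy as np

def iva_natural_gradient(X, n_iter=200, eta=0.1, eps=1e-8):
    """Batch natural-gradient IVA for a determined mixture.

    X: complex STFT tensor of shape (n_bins, n_channels, n_frames),
    produced elsewhere. Returns per-bin unmixing matrices and estimates.
    """
    n_bins, n_ch, n_frames = X.shape
    W = np.tile(np.eye(n_ch, dtype=complex), (n_bins, 1, 1))   # unmixing per bin
    I = np.eye(n_ch)
    for _ in range(n_iter):
        Y = W @ X                                   # (n_bins, n_ch, n_frames)
        # Source-wise envelope across all bins couples the bins together and
        # resolves the permutation ambiguity (spherical Laplacian prior).
        r = np.sqrt(np.sum(np.abs(Y) ** 2, axis=0)) + eps       # (n_ch, n_frames)
        Phi = Y / r[None, :, :]                     # multivariate score function
        for k in range(n_bins):
            G = I - (Phi[k] @ Y[k].conj().T) / n_frames
            W[k] += eta * G @ W[k]                  # natural gradient step
            # Fixed step size: real data may need normalisation or a smaller eta.
    return W, W @ X
```

A scaling correction (e.g. the minimal distortion principle) would normally follow to resolve the per-bin scaling ambiguity before the inverse STFT.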
APA, Harvard, Vancouver, ISO, and other styles
