Academic literature on the topic 'Gaussian Mixture Model'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Gaussian Mixture Model.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Gaussian Mixture Model"

1

Zickert, Gustav, and Can Evren Yarman. "Gaussian mixture model decomposition of multivariate signals." Signal, Image and Video Processing 16, no. 2 (October 29, 2021): 429–36. http://dx.doi.org/10.1007/s11760-021-01961-y.

Abstract:
We propose a greedy variational method for decomposing a non-negative multivariate signal as a weighted sum of Gaussians, which, borrowing the terminology from statistics, we refer to as a Gaussian mixture model. Notably, our method has the following features: (1) It accepts multivariate signals, i.e., sampled multivariate functions, histograms, time series, images, etc., as input. (2) The method can handle general (i.e., ellipsoidal) Gaussians. (3) No prior assumption on the number of mixture components is needed. To the best of our knowledge, no previous method for Gaussian mixture model decomposition simultaneously enjoys all these features. We also prove an upper bound, which cannot be improved by a global constant, for the distance from any mode of a Gaussian mixture model to the set of corresponding means. For mixtures of spherical Gaussians with common variance $\sigma^2$, the bound takes the simple form $\sqrt{n}\,\sigma$. We evaluate our method on one- and two-dimensional signals. Finally, we discuss the relation between clustering and signal decomposition, and compare our method to the baseline expectation maximization algorithm.
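To make the bound concrete, here is a small numerical check in Python (a sketch assuming a two-component spherical mixture with common variance; the mixture parameters are illustrative and this is not the authors' decomposition method): hill-climbing the density from each mean finds a mode, whose distance to the nearest mean should not exceed $\sqrt{n}\,\sigma$.

    import numpy as np
    from scipy.optimize import minimize

    def neg_density(x, means, weights, sigma):
        # Negative density of a spherical GMM with common variance sigma^2.
        n = means.shape[1]
        sq = np.sum((x - means) ** 2, axis=1)
        norm = (2.0 * np.pi * sigma ** 2) ** (n / 2.0)
        return -np.sum(weights * np.exp(-sq / (2.0 * sigma ** 2)) / norm)

    means = np.array([[0.0, 0.0], [2.0, 1.0]])   # illustrative parameters
    weights = np.array([0.6, 0.4])
    sigma = 1.0
    bound = np.sqrt(means.shape[1]) * sigma      # sqrt(n) * sigma
    for start in means:
        mode = minimize(neg_density, start, args=(means, weights, sigma)).x
        dist = np.min(np.linalg.norm(means - mode, axis=1))
        print(f"distance from mode to nearest mean: {dist:.3f} (bound {bound:.3f})")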
2

MA, JINWEN, and TAIJUN WANG. "ENTROPY PENALIZED AUTOMATED MODEL SELECTION ON GAUSSIAN MIXTURE." International Journal of Pattern Recognition and Artificial Intelligence 18, no. 08 (December 2004): 1501–12. http://dx.doi.org/10.1142/s0218001404003812.

Abstract:
Gaussian mixture modeling is a powerful approach for data analysis, and determining the number of Gaussians, or clusters, is in fact the problem of Gaussian mixture model selection, which has been investigated in several respects. This paper proposes a new kind of automated model selection algorithm for Gaussian mixture modeling via entropy penalized maximum-likelihood estimation. Experiments demonstrate that the proposed algorithm performs model selection automatically during parameter estimation, with the mixing proportions of the extra Gaussians attenuating to zero. Compared with the BYY automated model selection algorithms, it converges more stably and accurately as the number of samples becomes large.
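For orientation, a generic entropy-penalized objective of this kind can be written as follows (an assumed illustrative form; the paper's exact penalty may differ):

    \max_{\{\alpha_k, \mu_k, \Sigma_k\}} \;\; \sum_{t=1}^{N} \log \sum_{k=1}^{K} \alpha_k \, \mathcal{N}(x_t \mid \mu_k, \Sigma_k) \;+\; \lambda \sum_{k=1}^{K} \alpha_k \log \alpha_k

Since $\sum_k \alpha_k \log \alpha_k$ is the negative entropy of the mixing proportions, a positive $\lambda$ rewards low-entropy weight vectors, which is consistent with the behaviour described above: the proportions of superfluous components are driven toward zero during estimation.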
3

Mirra, J., and S. Abdullah. "Bayesian gaussian finite mixture model." Journal of Physics: Conference Series 1725 (January 2021): 012084. http://dx.doi.org/10.1088/1742-6596/1725/1/012084.

4

Wichert, Andreas. "Quantum-like Gaussian mixture model." Soft Computing 25, no. 15 (June 11, 2021): 10067–81. http://dx.doi.org/10.1007/s00500-021-05941-9.

5

Lotsi, Anani, and Ernst Wit. "Sparse Gaussian graphical mixture model." Afrika Statistika 11, no. 2 (December 1, 2016): 1041–59. http://dx.doi.org/10.16929/as/2016.1041.91.

6

Nguyen, Thanh Minh, Q. M. Jonathan Wu, and Hui Zhang. "Bounded generalized Gaussian mixture model." Pattern Recognition 47, no. 9 (September 2014): 3132–42. http://dx.doi.org/10.1016/j.patcog.2014.03.030.

7

Xie, Fangzheng, and Yanxun Xu. "Bayesian Repulsive Gaussian Mixture Model." Journal of the American Statistical Association 115, no. 529 (April 1, 2019): 187–203. http://dx.doi.org/10.1080/01621459.2018.1537918.

8

Alangari, Nourah, Mohamed El Bachir Menai, Hassan Mathkour, and Ibrahim Almosallam. "Intrinsically Interpretable Gaussian Mixture Model." Information 14, no. 3 (March 3, 2023): 164. http://dx.doi.org/10.3390/info14030164.

Abstract:
Understanding the reasoning behind a predictive model's decision is an important and longstanding problem driven by ethical and legal considerations. Most recent research has focused on the interpretability of supervised models, whereas unsupervised learning has received less attention. Moreover, most of that work interprets the whole model, in ways that can undermine accuracy or model assumptions, while local interpretation has received much less attention. We therefore propose an intrinsic interpretation for the Gaussian mixture model that provides both global insight and local interpretations. We employ the Bhattacharyya coefficient to measure the overlap and divergence across clusters, providing a global interpretation in terms of the differences and similarities between the clusters. By analyzing the GMM exponent with the Garthwaite-Kock corr-max transformation, a local interpretation is provided in terms of the relative contribution of each feature to the overall distance. Experimental results obtained on three datasets show that the proposed interpretation method outperforms the post hoc model-agnostic LIME in determining the feature contribution to the cluster assignment.
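The Bhattacharyya coefficient for two Gaussian components has a standard closed form; a minimal Python sketch follows (the component parameters would come from a fitted GMM, and the surrounding interpretation pipeline is the paper's own):

    import numpy as np

    def bhattacharyya_coefficient(mu1, cov1, mu2, cov2):
        # BC = exp(-D_B): 1 for identical components, near 0 for well-separated ones.
        cov = 0.5 * (cov1 + cov2)
        diff = mu1 - mu2
        d_b = 0.125 * diff @ np.linalg.solve(cov, diff) \
              + 0.5 * np.log(np.linalg.det(cov)
                             / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
        return np.exp(-d_b)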
9

Kim, Sung-Suk, Keun-Chang Kwak, Jeong-Woong Ryu, and Myung-Geun Chun. "A Neuro-Fuzzy Modeling using the Hierarchical Clustering and Gaussian Mixture Model." Journal of Korean Institute of Intelligent Systems 13, no. 5 (October 1, 2003): 512–19. http://dx.doi.org/10.5391/jkiis.2003.13.5.512.

10

Wei, Hui, and Wei Zheng. "Image Denoising Based on Improved Gaussian Mixture Model." Scientific Programming 2021 (September 22, 2021): 1–8. http://dx.doi.org/10.1155/2021/7982645.

Abstract:
An image denoising method is proposed based on an improved Gaussian mixture model to reduce noise and enhance image quality. Unlike traditional image denoising methods, the proposed method models the pixel information in the neighborhood around each pixel in the image. The Gaussian mixture model is employed to measure the similarity between pixels by calculating the L2 norm between the Gaussian mixture models corresponding to the two pixels. The Gaussian mixture model captures statistical information, such as the mean and variance, of the pixel information in an image region, and the L2 norm between two such models reflects the difference in local grayscale intensity and in the richness of detail around the two pixels. In this sense, the L2 norm between Gaussian mixture models can measure the similarity between pixels more accurately. The experimental results show that the proposed method improves denoising performance while retaining the detailed information of the image.
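The L2 distance between two GMMs is available in closed form because the integral of a product of two Gaussian densities is itself a Gaussian evaluation; a Python sketch of that identity follows (the patch-level modelling choices are specific to the paper):

    import numpy as np
    from scipy.stats import multivariate_normal

    def gauss_overlap(m1, c1, m2, c2):
        # Integral of N(x; m1, c1) * N(x; m2, c2) over x equals N(m1; m2, c1 + c2).
        return multivariate_normal.pdf(m1, mean=m2, cov=c1 + c2)

    def gmm_l2_distance(w1, mu1, cov1, w2, mu2, cov2):
        # ||p - q||_2 for mixtures given as (weights, list of means, list of covariances).
        def cross(wa, ma, ca, wb, mb, cb):
            return sum(wa[i] * wb[j] * gauss_overlap(ma[i], ca[i], mb[j], cb[j])
                       for i in range(len(wa)) for j in range(len(wb)))
        return np.sqrt(cross(w1, mu1, cov1, w1, mu1, cov1)
                       + cross(w2, mu2, cov2, w2, mu2, cov2)
                       - 2.0 * cross(w1, mu1, cov1, w2, mu2, cov2))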

Dissertations / Theses on the topic "Gaussian Mixture Model"

1

Lan, Jing. "Gaussian mixture model based system identification and control." [Gainesville, Fla.] : University of Florida, 2006. http://purl.fcla.edu/fcla/etd/UFE0014640.

2

Lu, Liang. "Subspace Gaussian mixture models for automatic speech recognition." Thesis, University of Edinburgh, 2013. http://hdl.handle.net/1842/8065.

Abstract:
In most state-of-the-art speech recognition systems, Gaussian mixture models (GMMs) are used to model the density of the emitting states in the hidden Markov models (HMMs). In a conventional system, the model parameters of each GMM are estimated directly and independently given the alignment. This results in a large number of model parameters to be estimated and, consequently, a large amount of training data is required to fit the model. In addition, different sources of acoustic variability that impact the accuracy of a recogniser, such as pronunciation variation, accent, speaker characteristics and environmental noise, are only weakly modelled and factorized by adaptation techniques such as maximum likelihood linear regression (MLLR), maximum a posteriori adaptation (MAP) and vocal tract length normalisation (VTLN). In this thesis, we discuss an alternative acoustic modelling approach, the subspace Gaussian mixture model (SGMM), which is expected to deal with these two issues better. In an SGMM, the model parameters are derived from low-dimensional model and speaker subspaces that can capture phonetic and speaker correlations. Given these subspaces, only a small number of state-dependent parameters are required to derive the corresponding GMMs. Hence, the total number of model parameters can be reduced, which allows acoustic modelling with a limited amount of training data. In addition, the SGMM-based acoustic model factorizes the phonetic and speaker factors, and within this framework other sources of acoustic variability may also be explored. In this thesis, we propose a regularised model estimation for SGMMs, which avoids overtraining when the training data is sparse. We also take advantage of the structure of SGMMs to explore cross-lingual acoustic modelling for low-resource speech recognition. Here, the model subspace is estimated from out-of-domain data and ported to the target language system; in this case, only the state-dependent parameters need to be estimated, which relaxes the requirement on the amount of training data. To improve the robustness of SGMMs against environmental noise, we propose to apply the joint uncertainty decoding (JUD) technique, which is shown to be efficient and effective. We report experimental results on the Wall Street Journal (WSJ) database and GlobalPhone corpora to evaluate the regularisation and cross-lingual modelling of SGMMs. Noise compensation using JUD for SGMM acoustic models is evaluated on the Aurora 4 database.
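For reference, in the standard SGMM formulation (after Povey et al.; conventional notation, not necessarily the thesis's) the mean and mixture weight of Gaussian $i$ in state $j$ are derived from a low-dimensional state vector $\mathbf{v}_j$:

    \mu_{ji} = \mathbf{M}_i \mathbf{v}_j, \qquad w_{ji} = \frac{\exp(\mathbf{w}_i^{\top} \mathbf{v}_j)}{\sum_{i'=1}^{I} \exp(\mathbf{w}_{i'}^{\top} \mathbf{v}_j)}

Only the $\mathbf{v}_j$ are state-dependent; the matrices $\mathbf{M}_i$ and vectors $\mathbf{w}_i$ are shared globally, which is what keeps the number of state-dependent parameters small.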
3

Vakil, Sam. "Gaussian mixture model based coding of speech and audio." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=81575.

Abstract:
The transmission of speech and audio over communication channels has always required speech and audio coders with reasonable search and computational complexity and good performance relative to the corresponding distortion measure.
This work introduces a coding scheme that operates in a perceptual auditory domain. The high-dimensional input frames of audio and speech are transformed to the power spectral domain using either the DFT or the MDCT. The log-spectral vectors are then transformed to the excitation domain. In the quantizer section, the vectors are DCT-transformed and decorrelated; this makes it possible to use diagonal covariances when modelling the data. Finally, a GMM-based VQ is performed on the vectors.
The decoder performs the inverse operations. However, to prevent negative power spectrum elements arising from the inverse perceptual transformation, a nonnegative least squares algorithm is used instead of direct inversion to switch back to the frequency domain. For comparison, a reference subband-based "Excitation Distortion coder" was implemented, and a comparison of the resulting coded files showed better performance for the proposed GMM-based coder.
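A heavily simplified skeleton of such an analysis chain (a sketch only: the perceptual/excitation-domain transform is omitted, and the frame extraction, component count, and quantizer design are assumptions, not the thesis's implementation):

    import numpy as np
    from scipy.fft import dct
    from sklearn.mixture import GaussianMixture

    def analysis(frames):
        # frames: (n_frames, frame_len) array of windowed audio samples.
        power = np.abs(np.fft.rfft(frames, axis=1)) ** 2  # power spectral domain
        logspec = np.log(power + 1e-12)                   # log-spectral vectors
        return dct(logspec, norm="ortho", axis=1)         # DCT decorrelation

    # The decorrelation step is what justifies diagonal covariances in the GMM:
    # gmm = GaussianMixture(n_components=8, covariance_type="diag").fit(analysis(frames))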
4

Sadarangani, Nikhil 1979. "An improved Gaussian mixture model algorithm for background subtraction." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/87293.

Thesis (M.Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2002. Includes bibliographical references (leaves 71-72).
5

Stuttle, Matthew Nicholas. "A gaussian mixture model spectral representation for speech recognition." Thesis, University of Cambridge, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.620077.

6

Wang, Juan. "Estimation of individual treatment effect via Gaussian mixture model." HKBU Institutional Repository, 2020. https://repository.hkbu.edu.hk/etd_oa/839.

Abstract:
In this thesis, we investigate the estimation of treatment effects from a Bayesian perspective: one first obtains the posterior distribution of the unobserved potential outcome from observed data, and then the posterior distribution of the treatment effect. We mainly consider how to represent a joint distribution of the two potential outcomes - one from the treated group and one from the control group - which gives an indirect impression of their correlation, since the estimation of the treatment effect depends on the correlation between the two potential outcomes. The first part of this thesis illustrates the effectiveness of adapting Gaussian mixture models to the treatment effect problem. We apply two mixture models - Gaussian Mixture Regression (GMR) and Gaussian Mixture Linear Regression (GMLR) - as potentially simple and powerful tools to investigate the joint distribution of the two potential outcomes. For GMR, we consider a joint distribution of the covariate and the two potential outcomes. For GMLR, we consider a joint distribution of the two potential outcomes, which depend linearly on the covariate. Through developing an EM algorithm for GMLR, we find that GMR and GMLR are effective in estimating means and variances, but not in capturing the correlation between the two potential outcomes. In the second part of this thesis, GMLR is modified to capture an unobserved covariance structure (the correlation between outcomes) that can be explained by latent variables introduced through an important model assumption. We propose a much more efficient Pre-Post EM algorithm to implement the proposed GMLR model with unobserved covariance structure in practice. Simulation studies show that the Pre-Post EM algorithm performs well not only in estimating means and variances, but also in estimating covariance.
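In GMR form, the joint model described here can be written as a finite Gaussian mixture over the covariate and both potential outcomes (a standard formulation, not the thesis's exact notation):

    p(x, y_0, y_1) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}\big( (x, y_0, y_1)^{\top} \mid \mu_k, \Sigma_k \big)

The difficulty reported above is visible in this form: the block of each $\Sigma_k$ linking $y_0$ and $y_1$ is never directly observed, since each unit reveals only one of the two potential outcomes.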
7

Delport, Marion. "A spatial variant of the Gaussian mixture of regressions model." Diss., University of Pretoria, 2017. http://hdl.handle.net/2263/65883.

Abstract:
In this study the finite mixture of multivariate Gaussian distributions is discussed in detail, including the derivation of maximum likelihood estimators, a discussion on identifiability of mixture components, as well as a discussion on the singularities typically occurring during the estimation process. Examples demonstrate the application of the finite mixture of univariate and bivariate Gaussian distributions. The finite mixture of multivariate Gaussian regressions is discussed, including the derivation of maximum likelihood estimators. An example is used to demonstrate the application of the mixture of regressions model. Two methods of calculating the coefficient of determination for measuring model performance are introduced. The application of finite mixtures of Gaussian distributions and regressions to image segmentation problems is examined. The traditional finite mixture models, however, have a shortcoming in that commonality of location of observations (pixels) is not taken into account when clustering the data. In the literature, this shortcoming is addressed by including a Markov random field prior for the mixing probabilities, and the present study discusses this theoretical development. The resulting finite spatial variant mixture of Gaussian regressions model is defined and its application is demonstrated in a simulated example. It was found that the spatial variant mixture of Gaussian regressions delivered accurate spatial clustering results and simultaneously accurately estimated the component model parameters. This study contributes an application of the spatial variant mixture of Gaussian regressions model in the agricultural context: maize yields in the Free State are modelled as a function of precipitation, type of maize and season; GPS coordinates linked to the observations provide the location information. A simple linear regression and a traditional mixture of Gaussian regressions model were fitted for comparative purposes, and the latter identified three distinct clusters without accounting for location information. It was found that the application of the spatial variant mixture of regressions model resulted in spatially distinct and informative clusters, especially with respect to the type-of-maize covariate. However, the estimated component regression models for this data set were quite similar. The investigated data set was not perfectly suited to the spatial variant mixture of regressions model, and possible solutions were proposed to improve the model results in future studies. A key learning from the present study is that the effectiveness of the spatial variant mixture of regressions model depends on clear and distinguishable spatial dependencies in the underlying data set when it is applied to map-type data.
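As a reference point, a $K$-component mixture of Gaussian regressions models the response as below; the spatial variant replaces the global weights $\pi_k$ with observation-specific weights governed by a Markov random field prior (standard formulation, not the dissertation's notation):

    p(y_i \mid x_i) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}\big( y_i \mid x_i^{\top} \beta_k, \, \sigma_k^2 \big)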
Dissertation (MSc), Statistics--University of Pretoria, 2017.
8

Malsiner-Walli, Gertraud, Sylvia Frühwirth-Schnatter, and Bettina Grün. "Model-based clustering based on sparse finite Gaussian mixtures." Springer, 2016. http://dx.doi.org/10.1007/s11222-014-9500-2.

Abstract:
In the framework of Bayesian model-based clustering based on a finite mixture of Gaussian distributions, we present a joint approach to estimate the number of mixture components and identify cluster-relevant variables simultaneously as well as to obtain an identified model. Our approach consists in specifying sparse hierarchical priors on the mixture weights and component means. In a deliberately overfitting mixture model the sparse prior on the weights empties superfluous components during MCMC. A straightforward estimator for the true number of components is given by the most frequent number of non-empty components visited during MCMC sampling. Specifying a shrinkage prior, namely the normal gamma prior, on the component means leads to improved parameter estimates as well as identification of cluster-relevant variables. After estimating the mixture model using MCMC methods based on data augmentation and Gibbs sampling, an identified model is obtained by relabeling the MCMC output in the point process representation of the draws. This is performed using K-centroids cluster analysis based on the Mahalanobis distance. We evaluate our proposed strategy in a simulation setup with artificial data and by applying it to benchmark data sets. (authors' abstract)
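The same "deliberately overfit, then let a sparse prior empty the extra components" idea is available in variational (rather than MCMC) form in scikit-learn; a minimal sketch on synthetic data (an analogue for illustration, not the authors' sampler):

    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.default_rng(0)
    # Three well-separated clusters, deliberately overfitted with 10 components.
    X = np.vstack([rng.normal(c, 0.5, size=(200, 2)) for c in ([0, 0], [4, 0], [0, 4])])
    bgm = BayesianGaussianMixture(n_components=10,
                                  weight_concentration_prior=1e-3,  # sparse prior on weights
                                  random_state=0).fit(X)
    print(np.round(bgm.weights_, 3))
    print("non-empty components:", int(np.sum(bgm.weights_ > 0.01)))  # expect 3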
9

Tran, Denis. "A study of bit allocation for Gaussian mixture model quantizers and image coders /." Thesis, McGill University, 2005. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=83937.

Abstract:
This thesis describes different bit allocation schemes and their performance when applied to coding line spectral frequencies (LSF) using the GMM-based coder designed by Subramaniam and a simple image transform coder. The new algorithms are compared to the original bit allocation formula, the Pruning algorithm used by Subramaniam, Segall's method, and the Greedy bit allocation algorithm, using the log spectral distortion and the mean-square error for the LSF quantizer and the peak signal-to-noise ratio for the image coder.
First, a Greedy level allocation algorithm is developed based on the philosophy of the Greedy algorithm, but operating level by level, considering the best benefit and bit cost yielded by an allocation. The Greedy level allocation algorithm is computationally intensive in general, so we discuss combining it with other algorithms to obtain lower costs.
Second, another algorithm is proposed that solves the problems of negative bit allocations and integer levels. The level allocations are kept in a certain ratio with respect to each other throughout the algorithm in order to remain closest to the condition for lowest distortion. Moreover, the original formula assumes a 6 dB gain for each added bit, which is not generally true. The algorithm introduces a new parameter k, which controls the benefit of adding one bit and is usually set at 0.5 in the high-rate optimal bit allocation formula for MSE; the new algorithm is called the Two-Stage Iterative Bit Allocation (TSIBA) algorithm. Simulations show that modifying the bit allocation formula effectively brings about gains over the previous methods.
The formula containing the new parameter is then generalized into one that weights not only the variances but also the dimensions, training the new parameter on their distribution function. TSIBA was an a-posteriori decision algorithm, in which the value of k yielding the lowest distortion was selected only after computing all distortions. The Generalized TSIBA (GTSIBA), on the other hand, uses a training procedure to estimate the weighting factor to set for each dimension at a given bit rate. Simulation results show yet another improvement when using the Generalized TSIBA over all previous methods.
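For context, the classical high-rate optimal bit allocation for MSE from which these algorithms start assigns to the $i$-th of $N$ transform coefficients (a standard result, not the thesis's final formula):

    b_i = \bar{b} + \frac{1}{2} \log_2 \frac{\sigma_i^2}{\big( \prod_{j=1}^{N} \sigma_j^2 \big)^{1/N}}

where $\bar{b}$ is the average bit budget per coefficient and $\sigma_i^2$ are the coefficient variances. The $\tfrac{1}{2}$ factor is the parameter $k = 0.5$ that TSIBA generalizes, and each added bit yields the idealized 6 dB gain mentioned above.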
10

Shashidhar, Sanda, and Amirisetti Sravya. "Online Handwritten Signature Verification System : using Gaussian Mixture Model and Longest Common Sub-Sequences." Thesis, Blekinge Tekniska Högskola, Institutionen för tillämpad signalbehandling, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-15807.


Books on the topic "Gaussian Mixture Model"

1

Anomaly Detection Using a Variational Autoencoder Neural Network with a Novel Objective Function and Gaussian Mixture Model Selection Technique. Independently Published, 2019.

2

Krishna, M. Vamsi. Brain Tumor Segmentation Using Bivariate Gaussian Mixture Models. 1st ed. Selfypage Developers Pvt Ltd, 2022.

3

Speaker Verification in the Presence of Channel Mismatch Using Gaussian Mixture Models. Storming Media, 1997.

4

Cheng, Russell. Finite Mixture Examples; MAPIS Details. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198505044.003.0018.

Abstract:
Two detailed numerical examples are given in this chapter illustrating and comparing mainly the reversible jump Markov chain Monte Carlo (RJMCMC) and the maximum a posteriori/importance sampling (MAPIS) methods. The numerical examples are the well-known galaxy data set with sample size 82, and the Hidalgo stamp issues thickness data with sample size 485. A comparison is made of the estimates obtained by the RJMCMC and MAPIS methods for (i) the posterior k-distribution of the number of components, k, (ii) the predictive finite mixture distribution itself, and (iii) the posterior distributions of the component parameters and weights. The estimates obtained by MAPIS are shown to be more satisfactory and meaningful. Details are given of the practical implementation of MAPIS for five non-normal mixture models, namely: the extreme value, gamma, inverse Gaussian, lognormal, and Weibull. Mathematical details are also given of the acceptance-rejection importance sampling used in MAPIS.

Book chapters on the topic "Gaussian Mixture Model"

1

Sarang, Poornachandra. "Gaussian Mixture Model." In Thinking Data Science, 197–207. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-02363-7_11.

2

Scrucca, Luca, Chris Fraley, T. Brendan Murphy, and Adrian E. Raftery. "Visualizing Gaussian Mixture Models." In Model-Based Clustering, Classification, and Density Estimation Using mclust in R, 153–88. Boca Raton: Chapman and Hall/CRC, 2023. http://dx.doi.org/10.1201/9781003277965-6.

3

Wang, Jingdong, Jianguo Lee, and Changshui Zhang. "Kernel Trick Embedded Gaussian Mixture Model." In Lecture Notes in Computer Science, 159–74. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-39624-6_14.

4

Azam, Muhammad, Basim Alghabashi, and Nizar Bouguila. "Multivariate Bounded Asymmetric Gaussian Mixture Model." In Unsupervised and Semi-Supervised Learning, 61–80. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-23876-6_4.

5

Ahn, Sung Mahn, and Sung Baik. "Minimal RBF Networks by Gaussian Mixture Model." In Lecture Notes in Computer Science, 919–27. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11538059_95.

6

Hussain, H., S. H. Salleh, C. M. Ting, A. K. Ariff, I. Kamarulafizam, and R. A. Suraya. "Speaker Verification Using Gaussian Mixture Model (GMM)." In IFMBE Proceedings, 560–64. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21729-6_140.

7

Yang, Xi, Kaizhu Huang, and Rui Zhang. "Unsupervised Dimensionality Reduction for Gaussian Mixture Model." In Neural Information Processing, 84–92. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-12640-1_11.

8

Hufnagel, Heike. "A Generative Gaussian Mixture Statistical Shape Model." In A Probabilistic Framework for Point-Based Shape Modeling in Medical Image Analysis, 27–55. Wiesbaden: Vieweg+Teubner Verlag, 2011. http://dx.doi.org/10.1007/978-3-8348-8600-2_3.

9

Palmer, Jason A., Kenneth Kreutz-Delgado, and Scott Makeig. "Super-Gaussian Mixture Source Model for ICA." In Independent Component Analysis and Blind Signal Separation, 854–61. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11679363_106.

10

Sun, Mengya. "Pruning Technology Based on Gaussian Mixture Model." In The 2021 International Conference on Machine Learning and Big Data Analytics for IoT Security and Privacy, 137–44. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-89508-2_18.


Conference papers on the topic "Gaussian Mixture Model"

1

Lucas, Alexandre, Salvador Carvalhosa, and Sara Golmaryami. "Gaussian Mixture Model for Battery Operation Anomaly Detection." In 2024 International Conference on Smart Energy Systems and Technologies (SEST), 1–6. IEEE, 2024. http://dx.doi.org/10.1109/sest61601.2024.10694471.

2

Garcia, Vincent, Frank Nielsen, and Richard Nock. "Hierarchical Gaussian mixture model." In 2010 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2010. http://dx.doi.org/10.1109/icassp.2010.5495750.

3

Gong, Dayong, and Zhihua Wang. "An improved Gaussian mixture model." In 2012 International Conference on Graphic and Image Processing, edited by Zeng Zhu. SPIE, 2013. http://dx.doi.org/10.1117/12.2010876.

4

Janoušek, Jan, Petr Gajdoš, Michal Radecký, and Václav Snášel. "Gaussian Mixture Model Cluster Forest." In 2015 IEEE 14th International Conference on Machine Learning and Applications (ICMLA). IEEE, 2015. http://dx.doi.org/10.1109/icmla.2015.12.

5

Wan, Yuchai, Xiabi Liu, and Yuyang Tang. "Simplifying Gaussian mixture model via model similarity." In 2016 23rd International Conference on Pattern Recognition (ICPR). IEEE, 2016. http://dx.doi.org/10.1109/icpr.2016.7900124.

6

Haindl, Michal, and Vojtech Havlicek. "Three-dimensional Gaussian mixture texture model." In 2016 23rd International Conference on Pattern Recognition (ICPR). IEEE, 2016. http://dx.doi.org/10.1109/icpr.2016.7899934.

7

Jaimes, Luis G., and Juan M. Calderon. "Gaussian mixture model for crowdsensing incentivization." In 2018 IEEE 8th Annual Computing and Communication Workshop and Conference (CCWC). IEEE, 2018. http://dx.doi.org/10.1109/ccwc.2018.8301762.

8

Jagtap, Shilpa S., and D. G. Bhalke. "Speaker verification using Gaussian Mixture Model." In 2015 International Conference on Pervasive Computing (ICPC). IEEE, 2015. http://dx.doi.org/10.1109/pervasive.2015.7087080.

9

Song, Bo, and Victor O. K. Li. "Gaussian mixture model of evolutionary algorithms." In GECCO '14: Genetic and Evolutionary Computation Conference. New York, NY, USA: ACM, 2014. http://dx.doi.org/10.1145/2576768.2598252.

10

Zhang, Jiehao, Xianbin Hong, Sheng-Uei Guan, Xuan Zhao, Huang Xin, and Nian Xue. "Maximum Gaussian Mixture Model for Classification." In 2016 8th International Conference on Information Technology in Medicine and Education (ITME). IEEE, 2016. http://dx.doi.org/10.1109/itme.2016.0139.


Reports on the topic "Gaussian Mixture Model"

1

Gardiner, Thomas, and Allen Robinson. Gaussian Mixture Model Solvers for the Boltzmann Equation. Office of Scientific and Technical Information (OSTI), October 2022. http://dx.doi.org/10.2172/2402991.

2

De Leon, Phillip L., and Richard D. McClanahan. Efficient speaker verification using Gaussian mixture model component clustering. Office of Scientific and Technical Information (OSTI), April 2012. http://dx.doi.org/10.2172/1039402.

3

Ramakrishnan, Aravind, Ashraf Alrajhi, Egemen Okte, Hasan Ozer, and Imad Al-Qadi. Truck-Platooning Impacts on Flexible Pavements: Experimental and Mechanistic Approaches. Illinois Center for Transportation, November 2021. http://dx.doi.org/10.36501/0197-9191/21-038.

Abstract:
Truck platoons are expected to improve safety and reduce fuel consumption. However, their use is projected to accelerate pavement damage due to channelized load application (lack of wander) and potentially reduced duration between truck-loading applications (reduced rest period). The effect of wander on pavement damage is well documented, while relatively few studies are available on the effect of rest period on pavement permanent deformation. Therefore, the main objective of this study was to quantify the impact of rest period theoretically, using a numerical method, and experimentally, using laboratory testing. A 3-D finite-element (FE) pavement model was developed and run to quantify the effect of rest period. Strain recovery and accumulation were predicted by fitting Gaussian mixture models to the strain values computed from the FE model. The effect of rest period was found to be insignificant for truck spacing greater than 10 ft. An experimental program was conducted, and several asphalt concrete (AC) mixes were tested at various stress levels, temperatures, and rest periods. Test results showed that AC deformation increased with rest period, irrespective of AC-mix type, stress level, and/or temperature. This observation was attributed to a well-documented hardening-relaxation mechanism that occurs during AC plastic deformation. Hence, the experimental and FE-model results conflict, owing to modeling AC as viscoelastic and to the difference in loading mechanisms. A shift model was developed by extending the time-temperature superposition concept to incorporate rest period, using the experimental data. The shift factors were used to compute the equivalent number of cycles for various platoon scenarios (truck spacings or rest periods). The shift model was implemented in the AASHTOware pavement mechanistic-empirical design (PMED) guidelines for the calculation of rutting using the equivalent number of cycles.
4

Yu, Guoshen, and Guillermo Sapiro. Statistical Compressive Sensing of Gaussian Mixture Models. Fort Belvoir, VA: Defense Technical Information Center, October 2010. http://dx.doi.org/10.21236/ada540728.

5

Hogden, J., and J. C. Scovel. MALCOM X: Combining maximum likelihood continuity mapping with Gaussian mixture models. Office of Scientific and Technical Information (OSTI), November 1998. http://dx.doi.org/10.2172/677150.

6

Yu, Guoshen, Guillermo Sapiro, and Stephane Mallat. Solving Inverse Problems with Piecewise Linear Estimators: From Gaussian Mixture Models to Structured Sparsity. Fort Belvoir, VA: Defense Technical Information Center, June 2010. http://dx.doi.org/10.21236/ada540722.

