Academic literature on the topic 'Gaussian mixture models'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, book chapters, conference papers, reports, and other scholarly sources on the topic 'Gaussian mixture models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Gaussian mixture models"

1

Ju, Zhaojie, and Honghai Liu. "Fuzzy Gaussian Mixture Models." Pattern Recognition 45, no. 3 (2012): 1146–58. http://dx.doi.org/10.1016/j.patcog.2011.08.028.

2

McNicholas, Paul David, and Thomas Brendan Murphy. "Parsimonious Gaussian mixture models." Statistics and Computing 18, no. 3 (2008): 285–96. http://dx.doi.org/10.1007/s11222-008-9056-0.

3

Viroli, Cinzia, and Geoffrey J. McLachlan. "Deep Gaussian mixture models." Statistics and Computing 29, no. 1 (2017): 43–51. http://dx.doi.org/10.1007/s11222-017-9793-z.

4

Verbeek, J. J., N. Vlassis, and B. Kröse. "Efficient Greedy Learning of Gaussian Mixture Models." Neural Computation 15, no. 2 (2003): 469–85. http://dx.doi.org/10.1162/089976603762553004.

Abstract:
This article concerns the greedy learning of Gaussian mixtures. In the greedy approach, mixture components are inserted into the mixture one after the other. We propose a heuristic for searching for the optimal component to insert. In a randomized manner, a set of candidate new components is generated. For each of these candidates, we find the locally optimal new component and insert it into the existing mixture. The resulting algorithm resolves the sensitivity to initialization of state-of-the-art methods, like expectation maximization, and has running time linear in the number of data points an…
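
To make the greedy-growth strategy described in this abstract concrete, here is a minimal sketch that grows a mixture one component at a time, seeding each candidate component at a randomly chosen data point and refitting with EM. It uses scikit-learn's GaussianMixture rather than the authors' partial-EM candidate search, so it illustrates the general idea only, not their exact algorithm; the function name greedy_gmm and the seeding rule are our own choices.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def greedy_gmm(X, k_max, n_candidates=10, seed=0):
    """Grow a GMM from 1 to k_max components, greedily keeping, at each step,
    the best of several randomly seeded candidate components (a sketch of the
    greedy strategy, not the paper's partial-EM search)."""
    rng = np.random.default_rng(seed)
    best = GaussianMixture(n_components=1, random_state=seed).fit(X)
    for k in range(2, k_max + 1):
        best_k, best_ll = None, -np.inf
        for _ in range(n_candidates):
            cand_mean = X[rng.integers(len(X))]        # candidate mean: a random data point
            means_init = np.vstack([best.means_, cand_mean])
            gm = GaussianMixture(n_components=k, means_init=means_init,
                                 random_state=seed).fit(X)
            ll = gm.score(X)                           # mean log-likelihood per sample
            if ll > best_ll:
                best_k, best_ll = gm, ll
        best = best_k
    return best
```
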
5

Kunkel, Deborah, and Mario Peruggia. "Anchored Bayesian Gaussian mixture models." Electronic Journal of Statistics 14, no. 2 (2020): 3869–913. http://dx.doi.org/10.1214/20-ejs1756.

6

Chassagnol, Bastien, Antoine Bichat, Cheïma Boudjeniba, et al. "Gaussian Mixture Models in R." R Journal 15, no. 2 (2023): 56–76. http://dx.doi.org/10.32614/rj-2023-043.

Abstract:
Gaussian mixture models (GMMs) are widely used for modelling stochastic problems. Indeed, a wide diversity of packages have been developed in R. However, no recent review describing the main features offered by these packages and comparing their performances has been performed. In this article, we first introduce GMMs and the EM algorithm used to retrieve the parameters of the model and analyse the main features implemented among seven of the most widely used R packages. We then empirically compare their statistical and computational performances in relation with the choice of the initi…
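
Because this abstract centres on the EM algorithm used to retrieve GMM parameters, a bare-bones sketch of the two alternating steps may be a useful companion. It is written in Python/NumPy for brevity and is not taken from the article or from any of the R packages it compares, which add careful initialisation, covariance structures, model selection, and convergence checks.

```python
import numpy as np

def em_gmm(X, k, n_iter=100, seed=0):
    """Bare-bones EM for a diagonal-covariance GMM (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, size=k, replace=False)]    # initial means: random data points
    variances = np.tile(X.var(axis=0), (k, 1))         # shared initial variances
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities resp[i, j] = P(component j | x_i)
        log_p = -0.5 * (np.log(2 * np.pi * variances)
                        + (X[:, None, :] - means) ** 2 / variances).sum(axis=2)
        log_p += np.log(weights)
        log_p -= log_p.max(axis=1, keepdims=True)      # stabilise before exponentiating
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of weights, means, and variances
        nk = resp.sum(axis=0) + 1e-12                  # guard against empty components
        weights = nk / n
        means = (resp.T @ X) / nk[:, None]
        variances = (resp.T @ X**2) / nk[:, None] - means**2 + 1e-8
    return weights, means, variances
```
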
7

Ruzgas, Tomas, and Indrė Drulytė. "Kernel Density Estimators for Gaussian Mixture Models." Lietuvos statistikos darbai 52, no. 1 (2013): 14–21. http://dx.doi.org/10.15388/ljs.2013.13919.

Abstract:
The problem of nonparametric estimation of probability density function is considered. The performance of kernel estimators based on various common kernels and a new kernel K (see (14)) with both fixed and adaptive smoothing bandwidth is compared in terms of the symmetric mean absolute percentage error using the Monte Carlo method. The kernel K is everywhere positive but has lighter tails than the Gaussian density. Gaussian mixture models from a collection introduced by Marron and Wand (1992) are taken for Monte Carlo simulations. The adaptive kernel method outperforms the smoothing with a fix…
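
As a small, hypothetical illustration of the kind of comparison this abstract describes (not the authors' simulation design), one can draw a sample from a known Gaussian mixture, fit a fixed-bandwidth Gaussian kernel density estimate, and score it against the true density with a symmetric mean absolute percentage error. The specific mixture, sample size, grid, and Scott's-rule bandwidth below are arbitrary choices for the sketch.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(0)

# True density: a two-component Gaussian mixture (a stand-in for a Marron-Wand model)
weights, means, sds = np.array([0.5, 0.5]), np.array([-1.5, 1.5]), np.array([0.5, 0.5])

def true_pdf(x):
    return np.sum(weights * norm.pdf(x[:, None], means, sds), axis=1)

# Draw a sample and fit a fixed-bandwidth Gaussian KDE (Scott's rule)
sample = np.concatenate([rng.normal(m, s, int(w * 1000))
                         for w, m, s in zip(weights, means, sds)])
kde = gaussian_kde(sample)

# Symmetric mean absolute percentage error between estimated and true density on a grid
grid = np.linspace(-4.0, 4.0, 200)
est, truth = kde(grid), true_pdf(grid)
smape = np.mean(2.0 * np.abs(est - truth) / (np.abs(est) + np.abs(truth)))
print(f"sMAPE: {smape:.3f}")
```
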
8

Chen, Yongxin, Tryphon T. Georgiou, and Allen Tannenbaum. "Optimal Transport for Gaussian Mixture Models." IEEE Access 7 (2019): 6269–78. http://dx.doi.org/10.1109/access.2018.2889838.

9

Nasios, N., and A. G. Bors. "Variational learning for Gaussian mixture models." IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics) 36, no. 4 (2006): 849–62. http://dx.doi.org/10.1109/tsmcb.2006.872273.


Dissertations / Theses on the topic "Gaussian mixture models"

1

Kunkel, Deborah Elizabeth. "Anchored Bayesian Gaussian Mixture Models." The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1524134234501475.

2

Nkadimeng, Calvin. "Language identification using Gaussian mixture models." Thesis, Stellenbosch: University of Stellenbosch, 2010. http://hdl.handle.net/10019.1/4170.

Abstract:
Thesis (MScEng (Electrical and Electronic Engineering))--University of Stellenbosch, 2010. ENGLISH ABSTRACT: The importance of Language Identification for African languages is seeing a dramatic increase due to the development of telecommunication infrastructure and, as a result, an increase in volumes of data and speech traffic in public networks. By automatically processing the raw speech data the vital assistance given to people in distress can be speeded up, by referring their calls to a person knowledgeable in that language. To this effect a speech corpus was developed and various…
3

Gundersen, Terje. "Voice Transformation based on Gaussian mixture models." Thesis, Norwegian University of Science and Technology, Department of Electronics and Telecommunications, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-10878.

Abstract:
In this thesis, a probabilistic model for transforming a voice to sound like another specific voice is tested. The model is fully automatic and only requires some 100 training sentences from both speakers with the same acoustic content. The classical source-filter decomposition allows prosodic and spectral transformation to be performed independently. The transformations are based on a Gaussian mixture model and a transformation function suggested by Y. Stylianou. Feature vectors of the same content from the source and target speaker, aligned in time by dynamic time warping, are fitted to a…
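
For orientation, the GMM-based conversion function this abstract attributes to Y. Stylianou is usually written as a posterior-weighted linear regression derived from a joint GMM over time-aligned source and target features. The sketch below shows that generic mapping for a single feature frame; the parameter layout (weights, mu_x, mu_y, S_xx, S_yx) is hypothetical and the code is not taken from the thesis.

```python
import numpy as np
from scipy.stats import multivariate_normal

def convert_frame(x, weights, mu_x, mu_y, S_xx, S_yx):
    """Map one source feature vector x to the target space using a fitted joint GMM:
    F(x) = sum_i P(i | x) [mu_y_i + S_yx_i S_xx_i^{-1} (x - mu_x_i)].
    (Generic GMM-mapping sketch; hypothetical parameter layout.)"""
    # Posterior probability of each mixture component given the source frame
    post = np.array([w * multivariate_normal.pdf(x, m, S)
                     for w, m, S in zip(weights, mu_x, S_xx)])
    post /= post.sum()
    # Posterior-weighted sum of per-component linear regressions
    return sum(p * (my + Syx @ np.linalg.solve(Sxx, x - mx))
               for p, mx, my, Sxx, Syx in zip(post, mu_x, mu_y, S_xx, S_yx))
```
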
4

Subramaniam, Anand D. "Gaussian mixture models in compression and communication." Diss., University of California, San Diego, 2003. http://wwwlib.umi.com/cr/ucsd/fullcit?p3112847.

5

Cilliers, Francois Dirk. "Tree-based Gaussian mixture models for speaker verification." Thesis, University of Stellenbosch, 2005. http://hdl.handle.net/10019.1/1639.

6

Lu, Liang. "Subspace Gaussian mixture models for automatic speech recognition." Thesis, University of Edinburgh, 2013. http://hdl.handle.net/1842/8065.

Abstract:
In most state-of-the-art speech recognition systems, Gaussian mixture models (GMMs) are used to model the density of the emitting states in the hidden Markov models (HMMs). In a conventional system, the model parameters of each GMM are estimated directly and independently given the alignment. This results in a large number of model parameters to be estimated, and consequently, a large amount of training data is required to fit the model. In addition, different sources of acoustic variability that impact the accuracy of a recogniser such as pronunciation variation, accent, speaker factor and en…
7

Pinto, Rafael Coimbra. "Continuous reinforcement learning with incremental Gaussian mixture models." Biblioteca Digital de Teses e Dissertações da UFRGS, 2017. http://hdl.handle.net/10183/157591.

Abstract:
The original contribution of this thesis is a new algorithm that integrates a highly sample-efficient function approximator with reinforcement learning in continuous state spaces. The complete research includes the development of an online, incremental algorithm capable of learning from a single pass over the data. This algorithm, called the Fast Incremental Gaussian Mixture Network (FIGMN), was employed as an efficient function approximator for the state space of continuous reinforcement learning tasks which, combined with linear Q-learning, results in performance…
8

Chockalingam, Prakash. "Non-rigid multi-modal object tracking using Gaussian mixture models." Thesis, Clemson University, 2009. http://etd.lib.clemson.edu/documents/1252937467/.

Abstract:
Thesis (M.S.)--Clemson University, 2009. Contains additional supplemental files. Title from first page of PDF file. Document formatted into pages; contains vii, 54 p.; also includes color graphics.
9

Wang, Bo Yu. "Deterministic annealing EM algorithm for robust learning of Gaussian mixture models." Thesis, University of Macau, 2011. http://umaclib3.umac.mo/record=b2493309.

10

Plasse, Joshua H. "The EM Algorithm in Multivariate Gaussian Mixture Models using Anderson Acceleration." Digital WPI, 2013. https://digitalcommons.wpi.edu/etd-theses/290.

Abstract:
Over the years analysts have used the EM algorithm to obtain maximum likelihood estimates from incomplete data for various models. The general algorithm admits several appealing properties such as strong global convergence; however, the rate of convergence is linear which in some cases may be unacceptably slow. This work is primarily concerned with applying Anderson acceleration to the EM algorithm for Gaussian mixture models (GMM) in hopes of alleviating slow convergence. As preamble we provide a review of maximum likelihood estimation and derive the EM algorithm in detail. The iterates that…
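
The core idea in this abstract, viewing each EM update as a fixed-point map and extrapolating with Anderson acceleration, can be sketched generically as follows. The argument g stands for one EM sweep over the flattened GMM parameters (a placeholder, not the thesis's implementation), and the update follows the standard Anderson type-II least-squares recipe rather than the specific formulation in the thesis.

```python
import numpy as np

def anderson_accelerate(g, x0, m=5, n_iter=100, tol=1e-8):
    """Anderson acceleration of a fixed-point iteration x <- g(x).
    For EM, g would be one EM update applied to the flattened GMM parameters."""
    x = np.asarray(x0, dtype=float)
    G_hist, F_hist = [], []                    # recent g(x) values and residuals
    for _ in range(n_iter):
        gx = g(x)
        f = gx - x                             # fixed-point residual g(x) - x
        G_hist.append(gx)
        F_hist.append(f)
        if len(G_hist) > m:                    # keep only the last m iterates
            G_hist.pop(0)
            F_hist.pop(0)
        if np.linalg.norm(f) < tol:
            return gx
        if len(F_hist) == 1:
            x = gx                             # plain (EM) step until history builds up
        else:
            F = np.column_stack(F_hist)
            G = np.column_stack(G_hist)
            dF, dG = np.diff(F, axis=1), np.diff(G, axis=1)
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x = gx - dG @ gamma                # extrapolated iterate
    return x
```

A caller would supply something like anderson_accelerate(lambda theta: em_update(theta, X), theta0), where em_update (hypothetical here) performs one E-step and M-step and returns the updated parameter vector.
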

Books on the topic "Gaussian mixture models"

1

Krishna, M. Vamsi. Brain Tumor Segmentation Using Bivariate Gaussian Mixture Models. Selfypage Developers Pvt Ltd, 2022.

2

Speaker Verification in the Presence of Channel Mismatch Using Gaussian Mixture Models. Storming Media, 1997.

3

Cheng, Russell. Finite Mixture Examples; MAPIS Details. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198505044.003.0018.

Abstract:
Two detailed numerical examples are given in this chapter illustrating and comparing mainly the reversible jump Markov chain Monte Carlo (RJMCMC) and the maximum a posteriori/importance sampling (MAPIS) methods. The numerical examples are the well-known galaxy data set with sample size 82, and the Hidalgo stamp issues thickness data with sample size 485. A comparison is made of the estimates obtained by the RJMCMC and MAPIS methods for (i) the posterior k-distribution of the number of components, k, (ii) the predictive finite mixture distribution itself, and (iii) the posterior distributions o…
4

Anomaly Detection Using a Variational Autoencoder Neural Network with a Novel Objective Function and Gaussian Mixture Model Selection Technique. Independently Published, 2019.


Book chapters on the topic "Gaussian mixture models"

1

Yu, Dong, and Li Deng. "Gaussian Mixture Models." In Automatic Speech Recognition. Springer London, 2014. http://dx.doi.org/10.1007/978-1-4471-5779-3_2.

2

Reynolds, Douglas. "Gaussian Mixture Models." In Encyclopedia of Biometrics. Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-73003-5_196.

3

Reynolds, Douglas. "Gaussian Mixture Models." In Encyclopedia of Biometrics. Springer US, 2015. http://dx.doi.org/10.1007/978-1-4899-7488-4_196.

4

Liu, Honghai, Zhaojie Ju, Xiaofei Ji, Chee Seng Chan, and Mehdi Khoury. "Fuzzy Gaussian Mixture Models." In Human Motion Sensing and Recognition. Springer Berlin Heidelberg, 2017. http://dx.doi.org/10.1007/978-3-662-53692-6_5.

5

Scrucca, Luca, Chris Fraley, T. Brendan Murphy, and Adrian E. Raftery. "Visualizing Gaussian Mixture Models." In Model-Based Clustering, Classification, and Density Estimation Using mclust in R. Chapman and Hall/CRC, 2023. http://dx.doi.org/10.1201/9781003277965-6.

6

Lee, Hyoung-joo, and Sungzoon Cho. "Combining Gaussian Mixture Models." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-28651-6_98.

7

Aladjem, Mayer. "Projection Pursuit Fitting Gaussian Mixture Models." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-70659-3_41.

8

Blömer, Johannes, and Kathrin Bujna. "Adaptive Seeding for Gaussian Mixture Models." In Advances in Knowledge Discovery and Data Mining. Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-31750-2_24.

9

Zeng, Jia, and Zhi-Qiang Liu. "Type-2 Fuzzy Gaussian Mixture Models." In Type-2 Fuzzy Graphical Models for Pattern Recognition. Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-662-44690-4_4.

10

Ponsa, Daniel, and Xavier Roca. "Unsupervised Parameterisation of Gaussian Mixture Models." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-36079-4_34.


Conference papers on the topic "Gaussian mixture models"

1

Rudić, Branislav, Markus Pichler-Scheder, and Dmitry Efrosinin. "Valid Decoding in Gaussian Mixture Models." In 2024 IEEE 3rd Conference on Information Technology and Data Science (CITDS). IEEE, 2024. https://doi.org/10.1109/citds62610.2024.10791365.

2

Li, Chuchen, Bingqing Zhao, Shanjun Tang, Xing Li, and Xin Liao. "Target point alignment with Gaussian mixture models." In Sixteenth International Conference on Signal Processing Systems (ICSPS 2024), edited by Robert Minasian and Li Chai. SPIE, 2025. https://doi.org/10.1117/12.3060665.

3

Maas, Ryan, Jeremy Hyrkas, Olivia Grace Telford, Magdalena Balazinska, Andrew Connolly, and Bill Howe. "Gaussian Mixture Models Use-Case." In the 3rd VLDB Workshop. ACM Press, 2015. http://dx.doi.org/10.1145/2803140.2803143.

4

Beaufays, F., M. Weintraub, and Yochai Konig. "Discriminative mixture weight estimation for large Gaussian mixture models." In 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing. Proceedings. ICASSP99 (Cat. No.99CH36258). IEEE, 1999. http://dx.doi.org/10.1109/icassp.1999.758131.

5

Levine, Stacey, Katie Heaps, Joshua Koslosky, and Glenn Sidle. "Image Fusion using Gaussian Mixture Models." In British Machine Vision Conference 2013. British Machine Vision Association, 2013. http://dx.doi.org/10.5244/c.27.89.

6

Keselman, Leonid, and Martial Hebert. "Direct Fitting of Gaussian Mixture Models." In 2019 16th Conference on Computer and Robot Vision (CRV). IEEE, 2019. http://dx.doi.org/10.1109/crv.2019.00012.

7

Zeng, Jia, Lei Xie, and Zhi-Qiang Liu. "Gaussian Mixture Models with Uncertain Parameters." In 2007 International Conference on Machine Learning and Cybernetics. IEEE, 2007. http://dx.doi.org/10.1109/icmlc.2007.4370617.

8

D'souza, Kevin, and K. T. V. Talele. "Voice conversion using Gaussian Mixture Models." In 2015 International Conference on Communication, Information & Computing Technology (ICCICT). IEEE, 2015. http://dx.doi.org/10.1109/iccict.2015.7045743.

9

Bouguila, Nizar. "Non-Gaussian mixture image models prediction." In 2008 15th IEEE International Conference on Image Processing. IEEE, 2008. http://dx.doi.org/10.1109/icip.2008.4712321.

10

Gupta, Hitesh Anand, and Vinay M. Varma. "Noise classification using Gaussian Mixture Models." In 2012 1st International Conference on Recent Advances in Information Technology (RAIT). IEEE, 2012. http://dx.doi.org/10.1109/rait.2012.6194530.


Reports on the topic "Gaussian mixture models"

1

Yu, Guoshen, and Guillermo Sapiro. Statistical Compressive Sensing of Gaussian Mixture Models. Defense Technical Information Center, 2010. http://dx.doi.org/10.21236/ada540728.

2

Hogden, J., and J. C. Scovel. MALCOM X: Combining maximum likelihood continuity mapping with Gaussian mixture models. Office of Scientific and Technical Information (OSTI), 1998. http://dx.doi.org/10.2172/677150.

3

Yu, Guoshen, Guillermo Sapiro, and Stephane Mallat. Solving Inverse Problems with Piecewise Linear Estimators: From Gaussian Mixture Models to Structured Sparsity. Defense Technical Information Center, 2010. http://dx.doi.org/10.21236/ada540722.

4

Ramakrishnan, Aravind, Ashraf Alrajhi, Egemen Okte, Hasan Ozer, and Imad Al-Qadi. Truck-Platooning Impacts on Flexible Pavements: Experimental and Mechanistic Approaches. Illinois Center for Transportation, 2021. http://dx.doi.org/10.36501/0197-9191/21-038.

Abstract:
Truck platoons are expected to improve safety and reduce fuel consumption. However, their use is projected to accelerate pavement damage due to channelized-load application (lack of wander) and potentially reduced duration between truck-loading applications (reduced rest period). The effect of wander on pavement damage is well documented, while relatively few studies are available on the effect of rest period on pavement permanent deformation. Therefore, the main objective of this study was to quantify the impact of rest period theoretically, using a numerical method, and experimentally, using…
5

Gardiner, Thomas, and Allen Robinson. Gaussian Mixture Model Solvers for the Boltzmann Equation. Office of Scientific and Technical Information (OSTI), 2022. http://dx.doi.org/10.2172/2402991.

6

De Leon, Phillip L., and Richard D. McClanahan. Efficient speaker verification using Gaussian mixture model component clustering. Office of Scientific and Technical Information (OSTI), 2012. http://dx.doi.org/10.2172/1039402.
