Journal articles on the topic "Gaussian mixture models"

Create a correct reference in APA, MLA, Chicago, Harvard, and many other styles

Consult the top 50 journal articles on the topic "Gaussian mixture models".

Next to every work in the list of references there is an "Add to bibliography" button. Press it, and we will automatically create the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a ".pdf" file and read its abstract online whenever the corresponding details are available in the metadata.

Browse journal articles on a wide variety of disciplines and compile your bibliography correctly.

1

Ju, Zhaojie, and Honghai Liu. "Fuzzy Gaussian Mixture Models." Pattern Recognition 45, no. 3 (2012): 1146–58. http://dx.doi.org/10.1016/j.patcog.2011.08.028.

Full text available
APA, Harvard, Vancouver, ISO, and other citation styles
2

McNicholas, Paul David, and Thomas Brendan Murphy. "Parsimonious Gaussian mixture models." Statistics and Computing 18, no. 3 (2008): 285–96. http://dx.doi.org/10.1007/s11222-008-9056-0.

3

Viroli, Cinzia, and Geoffrey J. McLachlan. "Deep Gaussian mixture models." Statistics and Computing 29, no. 1 (2017): 43–51. http://dx.doi.org/10.1007/s11222-017-9793-z.

4

Verbeek, J. J., N. Vlassis, and B. Kröse. "Efficient Greedy Learning of Gaussian Mixture Models." Neural Computation 15, no. 2 (2003): 469–85. http://dx.doi.org/10.1162/089976603762553004.

Abstract:
This article concerns the greedy learning of gaussian mixtures. In the greedy approach, mixture components are inserted into the mixture one after the other. We propose a heuristic for searching for the optimal component to insert. In a randomized manner, a set of candidate new components is generated. For each of these candidates, we find the locally optimal new component and insert it into the existing mixture. The resulting algorithm resolves the sensitivity to initialization of state-of-the-art methods, like expectation maximization, and has running time linear in the number of data points an…
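The abstract above describes growing a mixture by inserting components one after the other. A minimal single-candidate caricature of that idea is sketched below in NumPy; the paper itself searches over a randomized set of candidates with partial EM, so the synthetic data, the worst-explained-point heuristic used here, and the iteration counts are illustrative assumptions only.

```python
import numpy as np

def em_step(x, w, mu, var):
    # One EM iteration for a 1-D Gaussian mixture
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    resp = dens / dens.sum(axis=1, keepdims=True)   # E-step: responsibilities
    nk = resp.sum(axis=0)                           # M-step: closed-form updates
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return w, mu, var

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 300), rng.normal(3, 1, 300)])

# Start from a single component fitted to all the data
w, mu, var = np.array([1.0]), np.array([x.mean()]), np.array([x.var()])

# Greedy insertion: place a candidate component at the worst-explained point,
# give it a small weight, then let EM refine the enlarged mixture
dens = (w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
        / np.sqrt(2 * np.pi * var)).sum(axis=1)
w = np.append(0.9 * w, 0.1)
mu = np.append(mu, x[np.argmin(dens)])
var = np.append(var, x.var())
for _ in range(100):
    w, mu, var = em_step(x, w, mu, var)

print(np.round(np.sort(mu), 1))  # means near -3 and 3
```

Restarting EM from the previous fit plus one new component is what makes the greedy scheme less sensitive to initialization than fitting all components at once.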
5

Kunkel, Deborah, and Mario Peruggia. "Anchored Bayesian Gaussian mixture models." Electronic Journal of Statistics 14, no. 2 (2020): 3869–913. http://dx.doi.org/10.1214/20-ejs1756.

6

Chassagnol, Bastien, Antoine Bichat, Cheïma Boudjeniba, et al. "Gaussian Mixture Models in R." R Journal 15, no. 2 (2023): 56–76. http://dx.doi.org/10.32614/rj-2023-043.

7

Chassagnol, Bastien. "Gaussian Mixture Models in R." R Journal 15, no. 2 (2023): 56–76. https://doi.org/10.32614/RJ-2023-043.

Abstract:
Gaussian mixture models (GMMs) are widely used for modelling stochastic problems. Indeed, a wide diversity of packages have been developed in R. However, no recent review describing the main features offered by these packages and comparing their performances has been performed. In this article, we first introduce GMMs and the EM algorithm used to retrieve the parameters of the model and analyse the main features implemented among seven of the most widely used R packages. We then empirically compare their statistical and computational performances in relation with the choice of the initi…
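The EM algorithm mentioned in the abstract has simple closed-form updates for Gaussian mixtures. The NumPy sketch below recovers the parameters of a two-component 1-D mixture; it is an illustrative toy, not code from any of the R packages the paper reviews, and the synthetic data and starting values are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data from a known two-component 1-D Gaussian mixture
x = np.concatenate([rng.normal(-2.0, 0.5, 400), rng.normal(1.0, 1.0, 600)])

# Crude starting values (a hypothetical initialization)
w = np.array([0.5, 0.5])
mu = np.array([x.min(), x.max()])
var = np.array([x.var(), x.var()])

for _ in range(200):
    # E-step: posterior responsibility of each component for each point
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: closed-form updates of the weights, means, and variances
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

order = np.argsort(mu)
print(np.round(mu[order], 2), np.round(w[order], 2))  # means near (-2, 1), weights near (0.4, 0.6)
```

As the abstract notes, the choice of initialization matters in general; here the two clusters are well separated, so even crude min/max starting means converge.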
8

Ruzgas, Tomas, and Indrė Drulytė. "Kernel Density Estimators for Gaussian Mixture Models." Lietuvos statistikos darbai 52, no. 1 (2013): 14–21. http://dx.doi.org/10.15388/ljs.2013.13919.

Abstract:
The problem of nonparametric estimation of a probability density function is considered. The performance of kernel estimators based on various common kernels and a new kernel K (see (14)) with both fixed and adaptive smoothing bandwidth is compared in terms of the symmetric mean absolute percentage error using the Monte Carlo method. The kernel K is everywhere positive but has lighter tails than the Gaussian density. Gaussian mixture models from a collection introduced by Marron and Wand (1992) are taken for the Monte Carlo simulations. The adaptive kernel method outperforms the smoothing with a fix…
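For readers unfamiliar with the setup the abstract compares, a fixed-bandwidth Gaussian kernel density estimator evaluated against a known Gaussian-mixture target looks roughly like this. The bimodal target, Silverman rule-of-thumb bandwidth, and mean-absolute-error metric are illustrative assumptions; the paper itself uses the symmetric mean absolute percentage error and also studies adaptive bandwidths.

```python
import numpy as np

rng = np.random.default_rng(0)
# Sample from a bimodal Gaussian-mixture target (in the spirit of Marron and Wand)
x = np.concatenate([rng.normal(-1.5, 0.5, 500), rng.normal(1.5, 0.5, 500)])

def kde(grid, data, h):
    # Fixed-bandwidth Gaussian kernel density estimate on a grid
    u = (grid[:, None] - data) / h
    return np.exp(-0.5 * u ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def true_density(grid):
    # The known mixture density the sample was drawn from
    comp = lambda m: np.exp(-0.5 * ((grid - m) / 0.5) ** 2) / (0.5 * np.sqrt(2 * np.pi))
    return 0.5 * comp(-1.5) + 0.5 * comp(1.5)

grid = np.linspace(-4.0, 4.0, 200)
h = 1.06 * x.std() * len(x) ** (-0.2)  # Silverman's rule-of-thumb bandwidth
err = float(np.abs(kde(grid, x, h) - true_density(grid)).mean())
print(round(err, 3))
```

On multimodal targets like this one, the rule-of-thumb bandwidth oversmooths the peaks, which is exactly the kind of effect adaptive-bandwidth methods are meant to reduce.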
9

Chen, Yongxin, Tryphon T. Georgiou, and Allen Tannenbaum. "Optimal Transport for Gaussian Mixture Models." IEEE Access 7 (2019): 6269–78. http://dx.doi.org/10.1109/access.2018.2889838.

10

Nasios, N., and A. G. Bors. "Variational learning for Gaussian mixture models." IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics) 36, no. 4 (2006): 849–62. http://dx.doi.org/10.1109/tsmcb.2006.872273.

11

Zhang, Baibo, Changshui Zhang, and Xing Yi. "Active curve axis Gaussian mixture models." Pattern Recognition 38, no. 12 (2005): 2351–62. http://dx.doi.org/10.1016/j.patcog.2005.01.017.

12

Zeng, Jia, Lei Xie, and Zhi-Qiang Liu. "Type-2 fuzzy Gaussian mixture models." Pattern Recognition 41, no. 12 (2008): 3636–43. http://dx.doi.org/10.1016/j.patcog.2008.06.006.

13

Bolin, David, Jonas Wallin, and Finn Lindgren. "Latent Gaussian random field mixture models." Computational Statistics & Data Analysis 130 (February 2019): 80–93. http://dx.doi.org/10.1016/j.csda.2018.08.007.

14

Maebashi, K., N. Suematsu, and A. Hayashi. "Component Reduction for Gaussian Mixture Models." IEICE Transactions on Information and Systems E91-D, no. 12 (2008): 2846–53. http://dx.doi.org/10.1093/ietisy/e91-d.12.2846.

15

Nowakowska, Ewa, Jacek Koronacki, and Stan Lipovetsky. "Clusterability assessment for Gaussian mixture models." Applied Mathematics and Computation 256 (April 2015): 591–601. http://dx.doi.org/10.1016/j.amc.2014.12.038.

16

Di Zio, Marco, Ugo Guarnera, and Orietta Luzi. "Imputation through finite Gaussian mixture models." Computational Statistics & Data Analysis 51, no. 11 (2007): 5305–16. http://dx.doi.org/10.1016/j.csda.2006.10.002.

17

Yin, Jian Jun, and Jian Qiu Zhang. "Convolution PHD Filtering for Nonlinear Non-Gaussian Models." Advanced Materials Research 213 (February 2011): 344–48. http://dx.doi.org/10.4028/www.scientific.net/amr.213.344.

Abstract:
A novel probability hypothesis density (PHD) filter, called the Gaussian mixture convolution PHD (GMCPHD) filter, was proposed. The PHD within the filter is approximated by a Gaussian sum, as in the Gaussian mixture PHD (GMPHD) filter, but the model may be non-Gaussian and nonlinear. This is implemented by a bank of convolution filters with Gaussian approximations to the predicted and posterior densities. The analysis shows that the GMCPHD filter has lower complexity than the convolution PHD (CPHD) filter, is more amenable to parallel implementation, and is able to deal with complex obser…
18

Maleki, Mohsen, and A. R. Nematollahi. "Autoregressive Models with Mixture of Scale Mixtures of Gaussian Innovations." Iranian Journal of Science and Technology, Transactions A: Science 41, no. 4 (2017): 1099–107. http://dx.doi.org/10.1007/s40995-017-0237-6.

19

Meinicke, Peter, and Helge Ritter. "Resolution-Based Complexity Control for Gaussian Mixture Models." Neural Computation 13, no. 2 (2001): 453–75. http://dx.doi.org/10.1162/089976601300014600.

Abstract:
In the domain of unsupervised learning, mixtures of gaussians have become a popular tool for statistical modeling. For this class of generative models, we present a complexity control scheme, which provides an effective means for avoiding the problem of overfitting usually encountered with unconstrained (mixtures of) gaussians in high dimensions. According to some prespecified level of resolution as implied by a fixed variance noise model, the scheme provides an automatic selection of the dimensionalities of some local signal subspaces by maximum likelihood estimation. Together with a resoluti…
20

Wei, Hui, and Wei Zheng. "Image Denoising Based on Improved Gaussian Mixture Model." Scientific Programming 2021 (September 22, 2021): 1–8. http://dx.doi.org/10.1155/2021/7982645.

Abstract:
An image denoising method is proposed based on an improved Gaussian mixture model to reduce the noise and enhance image quality. Unlike traditional image denoising methods, the proposed method models the pixel information in the neighborhood around each pixel in the image. The Gaussian mixture model is employed to measure the similarity between pixels by calculating the L2 norm between the Gaussian mixture models corresponding to the two pixels. The Gaussian mixture model can capture statistical information such as the mean and variance of the pixel information in the image area. T…
21

Ueda, Naonori, Ryohei Nakano, Zoubin Ghahramani, and Geoffrey E. Hinton. "SMEM Algorithm for Mixture Models." Neural Computation 12, no. 9 (2000): 2109–28. http://dx.doi.org/10.1162/089976600300015088.

Abstract:
We present a split-and-merge expectation-maximization (SMEM) algorithm to overcome the local maxima problem in parameter estimation of finite mixture models. In the case of mixture models, local maxima often involve having too many components of a mixture model in one part of the space and too few in another, widely separated part of the space. To escape from such configurations, we repeatedly perform simultaneous split-and-merge operations using a new criterion for efficiently selecting the split-and-merge candidates. We apply the proposed algorithm to the training of gaussian mixtures and mi…
22

Krasilnikov, A. I. "Classification of Models of Two-component Mixtures of Symmetrical Distributions with Zero Kurtosis Coefficient." Èlektronnoe modelirovanie 45, no. 5 (2023): 20–38. http://dx.doi.org/10.15407/emodel.45.05.020.

Abstract:
On the basis of a family of two-component mixtures of distributions, a class K of symmetric non-Gaussian distributions with a zero kurtosis coefficient is defined, which is divided into two groups and five types. The dependence of the fourth-order cumulant on the weight coefficient of the mixture is studied, as a result of which the conditions are determined under which the kurtosis coefficient of the mixture is equal to zero. The use of a two-component mixture of Subbotin distributions for modeling single-vertex symmetric distributions with a zero kurtosis coefficient is justified. Examples o…
23

Røge, Rasmus E., Kristoffer H. Madsen, Mikkel N. Schmidt, and Morten Mørup. "Infinite von Mises–Fisher Mixture Modeling of Whole Brain fMRI Data." Neural Computation 29, no. 10 (2017): 2712–41. http://dx.doi.org/10.1162/neco_a_01000.

Abstract:
Cluster analysis of functional magnetic resonance imaging (fMRI) data is often performed using gaussian mixture models, but when the time series are standardized such that the data reside on a hypersphere, this modeling assumption is questionable. The consequences of ignoring the underlying spherical manifold are rarely analyzed, in part due to the computational challenges imposed by directional statistics. In this letter, we discuss a Bayesian von Mises–Fisher (vMF) mixture model for data on the unit hypersphere and present an efficient inference procedure based on collapsed Markov chain Mont…
24

Masmoudi, Khalil, and Afif Masmoudi. "An EM algorithm for singular Gaussian mixture models." Filomat 33, no. 15 (2019): 4753–67. http://dx.doi.org/10.2298/fil1915753m.

Abstract:
In this paper, we introduce finite mixture models with singular multivariate normal components. These models are useful when the observed data involves collinearities, that is, when the covariance matrices are singular. They are also useful when the covariance matrices are ill-conditioned. In the latter case, the classical approaches may lead to numerical instabilities and give inaccurate estimations. Hence, an extension of the Expectation Maximization algorithm, with complete proof, is proposed to derive the maximum likelihood estimators and cluster the data instances for mixtures of singular…
25

Yamada, Makoto, and Masashi Sugiyama. "Direct Importance Estimation with Gaussian Mixture Models." IEICE Transactions on Information and Systems E92-D, no. 10 (2009): 2159–62. http://dx.doi.org/10.1587/transinf.e92.d.2159.

26

Ye, Peng, Fang Liu, and Zhiyong Zhao. "Multiple Gaussian Mixture Models for Image Registration." IEICE Transactions on Information and Systems E97.D, no. 7 (2014): 1927–29. http://dx.doi.org/10.1587/transinf.e97.d.1927.

27

Yang, Jianbo, Xin Yuan, Xuejun Liao, et al. "Video Compressive Sensing Using Gaussian Mixture Models." IEEE Transactions on Image Processing 23, no. 11 (2014): 4863–78. http://dx.doi.org/10.1109/tip.2014.2344294.

28

Lu, Zhiwu, and H. H. S. Ip. "Generalized Competitive Learning of Gaussian Mixture Models." IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 39, no. 4 (2009): 901–9. http://dx.doi.org/10.1109/tsmcb.2008.2012119.

29

Akaho, Shotaro, and Hilbert J. Kappen. "Nonmonotonic Generalization Bias of Gaussian Mixture Models." Neural Computation 12, no. 6 (2000): 1411–27. http://dx.doi.org/10.1162/089976600300015439.

Abstract:
Theories of learning and generalization hold that the generalization bias, defined as the difference between the training error and the generalization error, increases on average with the number of adaptive parameters. This article, however, shows that this general tendency is violated for a gaussian mixture model. For temperatures just below the first symmetry breaking point, the effective number of adaptive parameters increases and the generalization bias decreases. We compute the dependence of the neural information criterion on temperature around the symmetry breaking. Our results are conf…
30

Yu, Guoshen, and Guillermo Sapiro. "Statistical Compressed Sensing of Gaussian Mixture Models." IEEE Transactions on Signal Processing 59, no. 12 (2011): 5842–58. http://dx.doi.org/10.1109/tsp.2011.2168521.

31

Matza, Avi, and Yuval Bistritz. "Skew Gaussian mixture models for speaker recognition." IET Signal Processing 8, no. 8 (2014): 860–67. http://dx.doi.org/10.1049/iet-spr.2013.0270.

32

Arellano, Claudia, and Rozenn Dahyot. "Robust ellipse detection with Gaussian mixture models." Pattern Recognition 58 (October 2016): 12–26. http://dx.doi.org/10.1016/j.patcog.2016.01.017.

33

Lee, Kevin H., and Lingzhou Xue. "Nonparametric Finite Mixture of Gaussian Graphical Models." Technometrics 60, no. 4 (2018): 511–21. http://dx.doi.org/10.1080/00401706.2017.1408497.

34

Jones, Daniel M., and Alan F. Heavens. "Gaussian mixture models for blended photometric redshifts." Monthly Notices of the Royal Astronomical Society 490, no. 3 (2019): 3966–86. http://dx.doi.org/10.1093/mnras/stz2687.

Abstract:
Future cosmological galaxy surveys such as the Large Synoptic Survey Telescope (LSST) will photometrically observe very large numbers of galaxies. Without spectroscopy, the redshifts required for the analysis of these data will need to be inferred using photometric redshift techniques that are scalable to large sample sizes. The high number density of sources will also mean that around half are blended. We present a Bayesian photometric redshift method for blended sources that uses Gaussian mixture models to learn the joint flux–redshift distribution from a set of unblended training g…
35

Wallet, Bradley C., and Robert Hardisty. "Unsupervised seismic facies using Gaussian mixture models." Interpretation 7, no. 3 (2019): SE93–SE111. http://dx.doi.org/10.1190/int-2018-0119.1.

Abstract:
As the use of seismic attributes becomes more widespread, multivariate seismic analysis has become more commonplace for seismic facies analysis. Unsupervised machine-learning techniques provide methods of automatically finding patterns in data with minimal user interaction. When using unsupervised machine-learning techniques, such as k-means or Kohonen self-organizing maps (SOMs), the number of clusters can often be ambiguously defined and there is no measure of how confident the algorithm is in the classification of data vectors. The model-based probabilistic formulation of…
36

Silva, Diogo S. F., and Clayton V. Deutsch. "Multivariate data imputation using Gaussian mixture models." Spatial Statistics 27 (October 2018): 74–90. http://dx.doi.org/10.1016/j.spasta.2016.11.002.

37

Hedelin, P., and J. Skoglund. "Vector quantization based on Gaussian mixture models." IEEE Transactions on Speech and Audio Processing 8, no. 4 (2000): 385–401. http://dx.doi.org/10.1109/89.848220.

38

Zhang, J., and D. Ma. "Nonlinear Prediction for Gaussian Mixture Image Models." IEEE Transactions on Image Processing 13, no. 6 (2004): 836–47. http://dx.doi.org/10.1109/tip.2004.828197.

39

Hao, Jiucang, Te-Won Lee, and Terrence J. Sejnowski. "Speech Enhancement Using Gaussian Scale Mixture Models." IEEE Transactions on Audio, Speech, and Language Processing 18, no. 6 (2010): 1127–36. http://dx.doi.org/10.1109/tasl.2009.2030012.

40

Fiez, Tanner, and Lillian J. Ratliff. "Gaussian Mixture Models for Parking Demand Data." IEEE Transactions on Intelligent Transportation Systems 21, no. 8 (2020): 3571–80. http://dx.doi.org/10.1109/tits.2019.2939499.

41

Burges, Christopher John. "Discriminative Gaussian mixture models for speaker verification." Journal of the Acoustical Society of America 113, no. 5 (2003): 2393. http://dx.doi.org/10.1121/1.1584172.

42

Morgan, Grant B. "Generating Nonnormal Distributions via Gaussian Mixture Models." Structural Equation Modeling: A Multidisciplinary Journal 27, no. 6 (2020): 964–74. http://dx.doi.org/10.1080/10705511.2020.1718502.

43

Reynolds, Douglas A., Thomas F. Quatieri, and Robert B. Dunn. "Speaker Verification Using Adapted Gaussian Mixture Models." Digital Signal Processing 10, no. 1-3 (2000): 19–41. http://dx.doi.org/10.1006/dspr.1999.0361.

44

Chuan, Ching-Hua. "Audio Classification and Retrieval Using Wavelets and Gaussian Mixture Models." International Journal of Multimedia Data Engineering and Management 4, no. 1 (2013): 1–20. http://dx.doi.org/10.4018/jmdem.2013010101.

Abstract:
This paper presents an audio classification and retrieval system using wavelets for extracting low-level acoustic features. The author performed multiple-level decomposition using the discrete wavelet transform to extract acoustic features from audio recordings at different scales and times. The extracted features are then translated into a compact vector representation. Gaussian mixture models with the expectation maximization algorithm are used to build models for audio classes and individual audio examples. The system is evaluated using three audio classification tasks: speech/music, male/female sp…
45

Améndola, Carlos, Alexander Engström, and Christian Haase. "Maximum number of modes of Gaussian mixtures." Information and Inference: A Journal of the IMA 9, no. 3 (2019): 587–600. http://dx.doi.org/10.1093/imaiai/iaz013.

Abstract:
Gaussian mixture models are widely used in statistics. A fundamental aspect of these distributions is the study of the local maxima of the density, or modes. In particular, it is not known how many modes a mixture of $k$ Gaussians in $d$ dimensions can have. We give a brief account of this problem's history. Then, we give improved lower bounds and the first upper bound on the maximum number of modes, provided it is finite.
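The quantity the abstract studies, the number of modes of a Gaussian mixture, is easy to probe numerically in one dimension. The grid-based counter below is a rough illustrative sketch unrelated to the paper's proof techniques; the example mixtures are arbitrary choices.

```python
import numpy as np

def mixture_density(t, mus, sigmas, weights):
    # Density of a 1-D Gaussian mixture evaluated at the points in t
    z = (t[:, None] - np.asarray(mus)) / np.asarray(sigmas)
    return (np.asarray(weights) * np.exp(-0.5 * z ** 2)
            / (np.asarray(sigmas) * np.sqrt(2 * np.pi))).sum(axis=1)

def count_modes(mus, sigmas, weights, lo=-10.0, hi=10.0, n=100_001):
    # Count local maxima of the density on a fine grid (slope changes + to -)
    t = np.linspace(lo, hi, n)
    d = np.diff(mixture_density(t, mus, sigmas, weights))
    return int(((d[:-1] > 0) & (d[1:] < 0)).sum())

# Well-separated equal-weight components give two modes...
print(count_modes([-3.0, 3.0], [1.0, 1.0], [0.5, 0.5]))   # 2
# ...while strongly overlapping components merge into a single mode
print(count_modes([-0.5, 0.5], [1.0, 1.0], [0.5, 0.5]))   # 1
```

In one dimension with equal weights and variances, two components merge into one mode once the means are within two standard deviations of each other; the paper's question is how large the mode count can get in higher dimensions.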
46

Zeng, Xiaoying, and Eugene Pinsky. "Elliptical Mixture Models Improve the Accuracy of Gaussian Mixture Models with Expectation-maximization Algorithm." International Journal on Cybernetics & Informatics 14, no. 2 (2025): 87–106. https://doi.org/10.5121/ijci.2025.140206.

Abstract:
This study addresses the limitations of Gaussian Mixture Models (GMMs) in clustering complex datasets and proposes Elliptical Mixture Models (EMMs) as a robust and flexible alternative. By adapting the Expectation-Maximization (EM) algorithm to handle elliptical distributions, the study introduces a novel computational framework that enhances clustering performance for data with irregular shapes and heavy tails. Leveraging the integration of R's advanced statistical tools into Python workflows, this approach enables practical implementation of EMMs. Empirical evaluations on three datasets Rice…
47

Zhao, Mingyang, Xiaohong Jia, Lubin Fan, Yuan Liang, and Dong-Ming Yan. "Robust Ellipse Fitting Using Hierarchical Gaussian Mixture Models." IEEE Transactions on Image Processing 30 (2021): 3828–43. http://dx.doi.org/10.1109/tip.2021.3065799.

48

Belton, D., S. Moncrieff, and J. Chapman. "Processing tree point clouds using Gaussian Mixture Models." ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences II-5/W2 (October 16, 2013): 43–48. http://dx.doi.org/10.5194/isprsannals-ii-5-w2-43-2013.

49

Nishimoto, Hiroki, Renyuan Zhang, and Yasuhiko Nakashima. "GPGPU Implementation of Variational Bayesian Gaussian Mixture Models." IEICE Transactions on Information and Systems E105.D, no. 3 (2022): 611–22. http://dx.doi.org/10.1587/transinf.2021edp7121.

50

Liang, Xi-Long, Yu-Qin Chen, Jing-Kun Zhao, and Gang Zhao. "Partitioning the Galactic halo with Gaussian Mixture Models." Research in Astronomy and Astrophysics 21, no. 5 (2021): 128. http://dx.doi.org/10.1088/1674-4527/21/5/128.
