Academic literature on the topic 'Marked log-Gaussian Cox process'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Marked log-Gaussian Cox process.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Marked log-Gaussian Cox process"

1. Medialdea, Adriana, José Miguel Angulo, and Jorge Mateu. "Structural Complexity and Informational Transfer in Spatial Log-Gaussian Cox Processes." Entropy 23, no. 9 (August 31, 2021): 1135. http://dx.doi.org/10.3390/e23091135.

Abstract:
The doubly stochastic mechanism generating the realizations of spatial log-Gaussian Cox processes is empirically assessed in terms of generalized entropy, divergence and complexity measures. The aim is to characterize the contribution to stochasticity from the two phases involved, in relation to the transfer of information from the intensity field to the resulting point pattern, as well as regarding their marginal random structure. A number of scenarios are explored regarding the Matérn model for the covariance of the underlying log-intensity random field. Sensitivity with respect to varying values of the model parameters, as well as of the deformation parameters involved in the generalized informational measures, is analyzed on the basis of regular lattice partitionings. Both a marginal global assessment based on entropy and complexity measures, and a joint local assessment based on divergence and relative complexity measures, are addressed. A Poisson process and a log-Gaussian Cox process with white noise intensity, the first providing an upper bound for entropy, are considered as reference cases. Differences regarding the transfer of structural information from the intensity field to the subsequently generated point patterns, reflected by entropy, divergence and complexity estimates, are discussed according to the specifications considered. In particular, the magnitude of the decrease in marginal entropy estimates between the intensity random fields and the corresponding point patterns quantitatively discriminates the global effect of the additional source of variability involved in the second phase of the double stochasticity.
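The two-phase generating mechanism assessed in this abstract is straightforward to reproduce numerically. Below is a minimal Python sketch of one scenario; the lattice size, the Matérn smoothness (fixed at 1/2, the exponential case), and all parameter values are illustrative assumptions rather than the paper's settings:

    import numpy as np
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(42)

    # Phase 1: Gaussian log-intensity field on a regular lattice over the unit square.
    n = 20
    xs = (np.arange(n) + 0.5) / n
    grid = np.array([[x, y] for x in xs for y in xs])
    sigma2, rho = 1.0, 0.1                           # illustrative variance and range
    cov = sigma2 * np.exp(-cdist(grid, grid) / rho)  # Matern with smoothness 1/2
    chol = np.linalg.cholesky(cov + 1e-9 * np.eye(n * n))
    log_lam = -sigma2 / 2 + chol @ rng.standard_normal(n * n)  # mean set so E[exp(field)] = 1

    # Phase 2: given the realized intensity, cell counts are conditionally independent Poisson.
    base_rate = 500.0                                # expected points per unit area
    counts = rng.poisson(base_rate * np.exp(log_lam) / (n * n))

    # Plug-in Shannon entropy of the normalized cell counts, comparable across scenarios.
    p = counts / counts.sum()
    entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))
    print(f"points: {counts.sum()}, entropy estimate: {entropy:.3f}")

Running the same computation on a homogeneous Poisson sample of equal size mimics the upper-bound reference case discussed in the abstract.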
2. Liu, Jia, and Jarno Vanhatalo. "Bayesian model based spatiotemporal survey designs and partially observed log Gaussian Cox process." Spatial Statistics 35 (March 2020): 100392. http://dx.doi.org/10.1016/j.spasta.2019.100392.

3. Beneš, Viktor, Karel Bodlák, Jesper Møller, and Rasmus Waagepetersen. "A Case Study on Point Process Modelling in Disease Mapping." Image Analysis & Stereology 24, no. 3 (2005): 159–68. http://dx.doi.org/10.5566/ias.v24.p159-168.

Abstract:
We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis (TBE), and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence of the risk on the covariates. Instead of using the common area level approaches we base the analysis on a Bayesian approach for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods. A particular problem which is thoroughly discussed is to determine a model for the background population density. The risk map shows a clear dependency with the population intensity models and the basic model which is adopted for the population intensity determines what covariates influence the risk of TBE. Model validation is based on the posterior predictive distribution of various summary statistics.
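As a rough computational illustration of the approach (not the authors' implementation, which includes covariates and a population-density offset), a random-walk Metropolis sampler for a discretized log-Gaussian Cox process can be written in a few lines; all data and parameter values below are synthetic stand-ins:

    import numpy as np
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(0)

    # Discretization: counts y_i in n cells; exponential covariance between cell centres.
    n = 25
    cells = rng.random((n, 2))
    cov = np.exp(-cdist(cells, cells) / 0.3)
    cov_inv = np.linalg.inv(cov + 1e-8 * np.eye(n))
    a = np.ones(n)                                # unit cell areas for simplicity
    y = rng.poisson(2.0, size=n)                  # stand-in counts; real case data go here

    def log_post(eta):
        # Poisson likelihood of the cell counts plus Gaussian prior on the log-intensity.
        return np.sum(y * eta - a * np.exp(eta)) - 0.5 * eta @ cov_inv @ eta

    eta, draws = np.zeros(n), []
    for it in range(20000):
        prop = eta + 0.05 * rng.standard_normal(n)   # random-walk Metropolis proposal
        if np.log(rng.random()) < log_post(prop) - log_post(eta):
            eta = prop
        if it >= 5000 and it % 20 == 0:
            draws.append(np.exp(eta))                # retained posterior intensity draws

    risk_map = np.mean(draws, axis=0)                # posterior mean intensity per cell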
4. Samartsidis, Pantelis, Claudia R. Eickhoff, Simon B. Eickhoff, Tor D. Wager, Lisa Feldman Barrett, Shir Atzil, Timothy D. Johnson, and Thomas E. Nichols. "Bayesian log-Gaussian Cox process regression: applications to meta-analysis of neuroimaging working memory studies." Journal of the Royal Statistical Society: Series C (Applied Statistics) 68, no. 1 (June 29, 2018): 217–34. http://dx.doi.org/10.1111/rssc.12295.

5. Rostami, Mehran, Younes Mohammadi, Abdollah Jalilian, and Bashir Nazparvar. "Modeling spatio-temporal variations of substance abuse mortality in Iran using a log-Gaussian Cox point process." Spatial and Spatio-temporal Epidemiology 22 (August 2017): 15–25. http://dx.doi.org/10.1016/j.sste.2017.05.002.

6. Valente, Fernanda, and Márcio Laurini. "Tornado Occurrences in the United States: A Spatio-Temporal Point Process Approach." Econometrics 8, no. 2 (June 11, 2020): 25. http://dx.doi.org/10.3390/econometrics8020025.

Abstract:
In this paper, we analyze tornado occurrences in the United States. To perform inference for the spatio-temporal point process, we adopt a dynamic representation of the log-Gaussian Cox process. This representation is based on the decomposition of the intensity function into components of trend, cycles, and spatial effects. In this model, spatial effects are also represented by a dynamic functional structure, which allows analyzing possible changes in the spatio-temporal distribution of tornado occurrences due to possible changes in climate patterns. The model was estimated using Bayesian inference through integrated nested Laplace approximations (INLA). We use data from the Storm Prediction Center’s Severe Weather Database between 1954 and 2018, and the results provided evidence, from new perspectives, that trends in annual tornado occurrences in the United States have remained relatively constant, supporting previously reported findings.
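In schematic form, the dynamic decomposition described above writes the log-intensity as a sum of structural components (the paper's full specification is richer; the notation here is a hedged sketch, not the authors' own):

    \log \lambda(s, t) = \mu + \tau_t + c_t + \xi(s, t)

with \tau_t a long-run trend, c_t a cyclical component, and \xi(s, t) a spatio-temporal Gaussian random field for the spatial effects.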
7. Pulido, Eliana Soriano, Carlos Valencia Arboleda, and Juan Pablo Rodríguez Sánchez. "Study of the spatiotemporal correlation between sediment-related blockage events in the sewer system in Bogotá (Colombia)." Water Science and Technology 79, no. 9 (May 1, 2019): 1727–38. http://dx.doi.org/10.2166/wst.2019.172.

Abstract:
The planning and scheduling of maintenance operations of large conventional sewer systems generate a complex decision-making environment due to the difficulty in the collection and analysis of the spatiotemporal information about the operational and structural condition of their components (e.g. pipes, gully pots and manholes). As such, water utilities generally carry out these operations following a corrective approach. This paper studies the impact of the spatiotemporal correlation between these failure events using Log-Gaussian Cox Process (LGCP) models. In addition, the association of failure events to physical and environmental covariates was assessed. The proposed methods were applied to analyze sediment-related blockages in the sewer system of an operative zone in Bogotá (Colombia). The results of this research allowed the identification of significant covariates that were further used to model spatiotemporal clusters with high sediment-related failure risk in sewer systems. The LGCP model proved to be more accurate in comparison to those models that build upon a fundamental assumption that a failure is equally likely to occur at any time regardless of the state of the system and the system's history of failures (i.e. a homogeneous Poisson process model).
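The model comparison in the final sentence contrasts two intensity specifications, which can be summarized schematically (the notation is ours, not the paper's):

    \lambda_{\mathrm{HPP}}(s, t) = \lambda_0
    \qquad \text{versus} \qquad
    \lambda_{\mathrm{LGCP}}(s, t) = \exp\{ z(s, t)^\top \beta + G(s, t) \}

where z(s, t) collects the physical and environmental covariates and G(s, t) is a correlated Gaussian field capturing residual spatiotemporal clustering of blockage events.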
8. Lewy, Peter, and Kasper Kristensen. "Modelling the distribution of fish accounting for spatial correlation and overdispersion." Canadian Journal of Fisheries and Aquatic Sciences 66, no. 10 (October 2009): 1809–20. http://dx.doi.org/10.1139/f09-114.

Abstract:
The spatial distribution of cod (Gadus morhua) in the North Sea and the Skagerrak was analysed over a 24-year period using the Log Gaussian Cox Process (LGCP). In contrast to other spatial models of the distribution of fish, LGCP avoids problems with zero observations and includes the spatial correlation between observations. It is therefore possible to predict and interpolate unobserved densities at any location in the area. This is important for obtaining unbiased estimates of stock concentration and other measures depending on the distribution in the entire area. Results show that the spatial correlation and dispersion of cod catches remained unchanged during winter throughout the period, in spite of a drastic decline in stock abundance and a movement of the centre of gravity of the distribution towards the northeast in the same period. For the age groups considered, the concentration of the stock was found to be constant or declining in the period. This means that cod does not follow the theory of density-dependent habitat selection, as the concentration of the stock does not increase when stock abundance decreases.
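The prediction and interpolation of unobserved densities mentioned here rests on Gaussian conditioning of the latent log-density field. A sketch under strong simplifying assumptions (known covariance, and the latent field treated as if observed at the survey stations; all values synthetic):

    import numpy as np
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(7)

    # Latent log-density "observed" at 40 survey stations (in practice it is estimated).
    obs_xy = rng.random((40, 2))
    kernel = lambda A, B: np.exp(-cdist(A, B) / 0.25)   # illustrative exponential covariance
    K = kernel(obs_xy, obs_xy) + 1e-6 * np.eye(40)
    field_obs = np.linalg.cholesky(K) @ rng.standard_normal(40)

    # Conditional (kriging) mean of the field at unsampled grid locations.
    g = np.linspace(0.05, 0.95, 10)
    new_xy = np.array([[x, y] for x in g for y in g])
    mu_new = kernel(new_xy, obs_xy) @ np.linalg.solve(K, field_obs)

    density = np.exp(mu_new)    # predicted relative density on the intensity scale

A full analysis would integrate over the posterior of the latent field rather than conditioning on a single estimate.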
9. Bäuerle, Heidi, and Arne Nothdurft. "Spatial modeling of habitat trees based on line transect sampling and point pattern reconstruction." Canadian Journal of Forest Research 41, no. 4 (April 2011): 715–27. http://dx.doi.org/10.1139/x11-004.

Abstract:
An approach is presented for the spatial modeling of rare habitat trees surveyed by line transect sampling (LTS) in a protected area of the European Natura 2000 network. The observed tree pattern is defined as a realization of a thinned point process where the thinning can be modeled by a parametric detection function. A complete pattern is reconstructed using an optimization algorithm. The start configuration contains detected tree locations and randomly generated tree positions. Empirical cumulative distribution functions (ECDFs) for intertree and location-to-tree distances estimated from the original LTS are set as target characteristics. The same ECDFs are estimated by means of virtual LTS in the reconstruction. Tree positions are relocated during the optimization. The sum of squared deviations between the ECDFs from the original LTS and the virtual LTS in the reconstruction is considered as a contrast measure. A new configuration is accepted if the contrast is lowered compared with the previous state. The nonparametrically reconstructed habitat tree patterns are described by a log Gaussian Cox process model. Evaluations by means of line transect resamplings in a complete habitat pattern show small deviations between the second-order functional characteristics obtained from the true pattern and their analogs derived from the reconstructions.
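The accept-if-improved relocation loop described in this abstract is easy to emulate. In the sketch below, everything is simplified: the target ECDF is built from the detected trees alone rather than from the original LTS estimators, and only the intertree-distance characteristic is matched:

    import numpy as np

    rng = np.random.default_rng(4)

    detected = rng.random((30, 2))               # detected tree locations (synthetic)
    filled = rng.random((20, 2))                 # randomly generated start positions
    pattern = np.vstack([detected, filled])

    def ecdf_intertree(pts, grid=np.linspace(0.0, 0.5, 50)):
        # Empirical CDF of pairwise intertree distances, a target characteristic.
        d = np.sqrt(((pts[:, None] - pts[None]) ** 2).sum(-1))
        d = d[np.triu_indices(len(pts), 1)]
        return (d[:, None] <= grid).mean(0)

    target = ecdf_intertree(detected)            # stand-in for the LTS-estimated target

    def contrast(pts):
        # Sum of squared ECDF deviations, the contrast measure to be minimized.
        return np.sum((ecdf_intertree(pts) - target) ** 2)

    best = contrast(pattern)
    for _ in range(2000):
        i = rng.integers(len(detected), len(pattern))   # relocate only non-detected trees
        old = pattern[i].copy()
        pattern[i] = rng.random(2)                      # propose a new position
        c = contrast(pattern)
        if c < best:
            best = c                                    # keep the move if contrast drops
        else:
            pattern[i] = old                            # otherwise revert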
10. Mørkrid, Lars, Alexander D. Rowe, Katja B. P. Elgstoen, Jess H. Olesen, George Ruijter, Patricia L. Hall, Silvia Tortorelli, et al. "Continuous Age- and Sex-Adjusted Reference Intervals of Urinary Markers for Cerebral Creatine Deficiency Syndromes: A Novel Approach to the Definition of Reference Intervals." Clinical Chemistry 61, no. 5 (May 1, 2015): 760–68. http://dx.doi.org/10.1373/clinchem.2014.235564.

Abstract:
BACKGROUND: Urinary concentrations of creatine and guanidinoacetic acid divided by creatinine are informative markers for cerebral creatine deficiency syndromes (CDSs). The renal excretion of these substances varies substantially with age and sex, challenging the sensitivity and specificity of postanalytical interpretation.
METHODS: Results from 155 patients with CDS and 12,507 reference individuals were contributed by 5 diagnostic laboratories. They were binned into 104 adjacent age intervals and renormalized with Box–Cox transforms (Ξ). Estimates for central tendency (μ) and dispersion (σ) of Ξ were obtained for each bin. Polynomial regression analysis was used to establish the age dependence of both μ[log(age)] and σ[log(age)]. The regression residuals were then calculated as z-scores = {Ξ − μ[log(age)]}/σ[log(age)]. The process was iterated until all z-scores outside Tukey fences ±3.372 were identified and removed. Continuous percentile charts were then calculated and plotted by retransformation.
RESULTS: Statistically significant and biologically relevant subgroups of z-scores were identified. Significantly higher marker values were seen in females than males, necessitating separate reference intervals in both adolescents and adults. Comparison between our reconstructed reference percentiles and current standard age-matched reference intervals highlights an underlying risk of false-positive and false-negative events at certain ages.
CONCLUSIONS: Disease markers depending strongly on covariates such as age and sex require large numbers of reference individuals to establish peripheral percentiles with sufficient precision. This is feasible only through collaborative data sharing and the use of appropriate statistical methods. Broad application of this approach can be implemented through freely available Web-based software.
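The transform-and-iterate step can be sketched for a single age bin (the data below are simulated; the real procedure fits μ and σ as polynomials in log(age) across the 104 bins and iterates the outlier removal):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    values = rng.lognormal(mean=0.5, sigma=0.4, size=500)  # stand-in marker/creatinine ratios

    xi, lam = stats.boxcox(values)            # Box-Cox renormalization, lambda by max. likelihood
    mu, sigma = xi.mean(), xi.std(ddof=1)     # central tendency and dispersion within the bin
    z = (xi - mu) / sigma                     # z-scores as in the paper's iteration

    keep = np.abs(z) <= 3.372                 # Tukey fences quoted in the abstract
    print(f"flagged {np.count_nonzero(~keep)} of {values.size} reference values")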

Dissertations / Theses on the topic "Marked log-Gaussian Cox process"

1. Liu, Jia. "Heterogeneous Sensor Data based Online Quality Assurance for Advanced Manufacturing using Spatiotemporal Modeling." Diss., Virginia Tech, 2017. http://hdl.handle.net/10919/78722.

Abstract:
Online quality assurance is crucial for elevating product quality and boosting process productivity in advanced manufacturing. However, the inherent complexity of advanced manufacturing, including nonlinear process dynamics, multiple process attributes, and low signal/noise ratio, poses severe challenges for both maintaining stable process operations and establishing efficacious online quality assurance schemes. To address these challenges, four different advanced manufacturing processes, namely, fused filament fabrication (FFF), binder jetting, chemical mechanical planarization (CMP), and the slicing process in wafer production, are investigated in this dissertation for applications of online quality assurance, with utilization of various sensors, such as thermocouples, infrared temperature sensors, accelerometers, etc. The overarching goal of this dissertation is to develop innovative integrated methodologies tailored for these individual manufacturing processes but addressing their common challenges to achieve satisfying performance in online quality assurance based on heterogeneous sensor data. Specifically, three new methodologies are created and validated using actual sensor data, namely, (1) Real-time process monitoring methods using Dirichlet process (DP) mixture model for timely detection of process changes and identification of different process states for FFF and CMP. The proposed methodology is capable of tackling non-Gaussian data from heterogeneous sensors in these advanced manufacturing processes for successful online quality assurance. (2) Spatial Dirichlet process (SDP) for modeling complex multimodal wafer thickness profiles and exploring their clustering effects. The SDP-based statistical control scheme can effectively detect out-of-control wafers and achieve wafer thickness quality assurance for the slicing process with high accuracy. (3) Augmented spatiotemporal log Gaussian Cox process (AST-LGCP) quantifying the spatiotemporal evolution of porosity in binder jetting parts, capable of predicting high-risk areas on consecutive layers. This work fills the long-standing research gap of lacking rigorous layer-wise porosity quantification for parts made by additive manufacturing (AM), and provides the basis for facilitating corrective actions for product quality improvements in a prognostic way. These developed methodologies surmount some common challenges of advanced manufacturing which paralyze traditional methods in online quality assurance, and embody key components for implementing effective online quality assurance with various sensor data. There is a promising potential to extend them to other manufacturing processes in the future.
2. Tang, Man. "Statistical methods for variant discovery and functional genomic analysis using next-generation sequencing data." Diss., Virginia Tech, 2020. http://hdl.handle.net/10919/104039.

Abstract:
The development of high-throughput next-generation sequencing (NGS) techniques produces massive amounts of data, allowing the identification of biomarkers in early disease diagnosis and driving the transformation of most disciplines in biology and medicine. Greater focus is needed on developing novel, powerful, and efficient tools for NGS data analysis. This dissertation focuses on modeling "omics" data in various NGS applications with a primary goal of developing novel statistical methods to identify sequence variants, find transcription factor (TF) binding patterns, and decode the relationship between TF and gene expression levels. Accurate and reliable identification of sequence variants, including single nucleotide polymorphisms (SNPs) and insertion-deletion polymorphisms (INDELs), plays a fundamental role in NGS applications. Existing methods for calling these variants often make the simplified assumption of positional independence and fail to leverage the dependence of genotypes at nearby loci induced by linkage disequilibrium. We propose vi-HMM, a hidden Markov model (HMM)-based method for calling SNPs and INDELs in mapped short-read data. Simulation experiments show that, under various sequencing depths, vi-HMM outperforms existing methods in terms of sensitivity and F1 score. When applied to human whole-genome sequencing data, vi-HMM demonstrates higher accuracy in calling SNPs and INDELs. One important NGS application is chromatin immunoprecipitation followed by sequencing (ChIP-seq), which characterizes protein-DNA relations through genome-wide mapping of TF binding sites. Multiple TFs, binding to DNA sequences, often show complex binding patterns, which indicate how TFs with similar functionalities work together to regulate the expression of target genes. To help uncover the transcriptional regulation mechanism, we propose a novel nonparametric Bayesian method to detect the clustering pattern of multiple-TF bindings from ChIP-seq datasets. A simulation study demonstrates that our method performs best with regard to precision, recall, and F1 score, in comparison to traditional methods. We also apply the method to real data and observe several TF clusters that have been recognized previously in mouse embryonic stem cells. Recent advances in ChIP-seq and RNA sequencing (RNA-Seq) technologies provide more reliable and accurate characterization of TF binding sites and gene expression measurements, which serve as a basis to study the regulatory functions of TFs on gene expression. We propose a log-Gaussian Cox process with a wavelet-based functional model to quantify the relationship between TF binding site locations and gene expression levels. Through a simulation study, we demonstrate that our method performs well, especially with large sample size and small variance. It also shows a remarkable ability to distinguish real local features in the function estimates.
The development of high-throughput next-generation sequencing (NGS) techniques produces massive amounts of data and brings innovations to biology and medicine. Greater focus is needed on developing novel, powerful, and efficient tools for NGS data analysis. In this dissertation, we mainly focus on three problems closely related to NGS and its applications: (1) how to improve variant calling accuracy, (2) how to model transcription factor (TF) binding patterns, and (3) how to quantify the contribution of TF binding to gene expression. We develop novel statistical methods to identify sequence variants, find TF binding patterns, and explore the relationship between TF binding and gene expression. We expect our findings will be helpful in promoting a better understanding of disease causality and facilitating the design of personalized treatments.
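To make the hidden Markov model machinery behind vi-HMM concrete, here is a generic Viterbi decoder over a small state space; the actual method's states, emission model, and transition structure are specific to mapped short reads and are not reproduced here:

    import numpy as np

    def viterbi(log_init, log_trans, log_emit):
        # Most probable hidden-state path; log_emit has shape (T, n_states).
        T, S = log_emit.shape
        score = log_init + log_emit[0]
        back = np.zeros((T, S), dtype=int)
        for t in range(1, T):
            cand = score[:, None] + log_trans        # (from-state, to-state) scores
            back[t] = np.argmax(cand, axis=0)
            score = cand[back[t], np.arange(S)] + log_emit[t]
        path = [int(np.argmax(score))]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t][path[-1]]))
        return path[::-1]

    # Toy two-state example, e.g. "reference" vs. "variant" at consecutive loci; the
    # persistent transitions loosely mimic dependence induced by linkage disequilibrium.
    log_init = np.log([0.9, 0.1])
    log_trans = np.log([[0.95, 0.05], [0.10, 0.90]])
    log_emit = np.log([[0.8, 0.2], [0.3, 0.7], [0.2, 0.8], [0.9, 0.1]])
    print(viterbi(log_init, log_trans, log_emit))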
3. Héda, Ivan. "Modely kótovaných bodových procesů" [Models of Marked Point Processes]. Master's thesis, 2016. http://www.nusl.cz/ntk/nusl-346977.

Abstract:
Title: Models of Marked Point Processes Author: Ivan Héda Department: Department of Probability and Mathematical Statistics Supervisor: doc. RNDr. Zbyněk Pawlas, Ph.D. Abstract: In the first part of the thesis, we present the necessary theoretical basics as well as the definition of functional characteristics used for the examination of marked point patterns. The second part is dedicated to reviewing some known marking strategies. The core of the thesis lies in the study of intensity-marked point processes. A general formula for the characteristics is proven for this marking strategy, and a general class of models with analytically computable characteristics is introduced. This class generalizes some known models. The theoretical results are used for real data analysis in the last part of the thesis. Keywords: marked point process, marked log-Gaussian Cox process, intensity-marked point process
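For the intensity-marked construction at the core of the thesis, a simulation sketch fixes ideas: points come from a log-Gaussian Cox process and each point receives a mark that is a function of the latent log-intensity at its location. The linear-plus-noise link used below is an illustrative choice, not the thesis's model:

    import numpy as np
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(5)

    # Latent Gaussian log-intensity field on a lattice over the unit square.
    n = 30
    xs = (np.arange(n) + 0.5) / n
    grid = np.array([[x, y] for x in xs for y in xs])
    cov = np.exp(-cdist(grid, grid) / 0.15)
    field = np.linalg.cholesky(cov + 1e-9 * np.eye(n * n)) @ rng.standard_normal(n * n)

    # LGCP points: Poisson counts per cell given the realized (normalized) intensity.
    counts = rng.poisson(300.0 * np.exp(field - 0.5) / (n * n))

    # Intensity-dependent marking: marks depend on the local log-intensity plus noise,
    # so marks and locations are dependent through the shared latent field.
    cells = np.repeat(np.arange(n * n), counts)
    points = grid[cells]                  # cell centres stand in for exact locations
    marks = 1.0 + 0.5 * field[cells] + 0.1 * rng.standard_normal(cells.size)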
4. Leininger, Thomas Jeffrey. "Bayesian Analysis of Spatial Point Patterns." Diss., Duke University, 2014. http://hdl.handle.net/10161/8730.

Abstract:

We explore the posterior inference available for Bayesian spatial point process models. In the literature, discussion of such models is usually focused on model fitting and rejecting complete spatial randomness, with model diagnostics and posterior inference often left as an afterthought. Posterior predictive point patterns are shown to be useful in performing model diagnostics and model selection, as well as providing a wide array of posterior model summaries. We prescribe Bayesian residuals and methods for cross-validation and model selection for Poisson processes, log-Gaussian Cox processes, Gibbs processes, and cluster processes. These novel approaches are demonstrated using existing datasets and simulation studies.
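A posterior predictive check of the kind advocated here takes only a few lines once posterior draws of the intensity are available; the draws below are synthetic placeholders for the output of an actual model fit:

    import numpy as np

    rng = np.random.default_rng(3)

    n_cells, n_draws = 25, 400
    post_lambda = rng.gamma(2.0, 1.0, size=(n_draws, n_cells))  # stand-in posterior draws
    y_obs = rng.poisson(2.0, size=n_cells)                      # stand-in observed counts

    y_rep = rng.poisson(post_lambda)      # one replicated pattern (as counts) per draw

    # Posterior predictive p-value for a summary statistic, here the total point count.
    T_obs, T_rep = y_obs.sum(), y_rep.sum(axis=1)
    print("predictive p-value:", np.mean(T_rep >= T_obs))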



Conference papers on the topic "Marked log-Gaussian Cox process"

1. Fu, Huiqiao, Kaiqiang Tang, Peng Li, Wenqi Zhang, Xinpeng Wang, Guizhou Deng, Tao Wang, and Chunlin Chen. "Deep Reinforcement Learning for Multi-contact Motion Planning of Hexapod Robots." In Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI-21). California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/328.

Abstract:
Legged locomotion in a complex environment requires careful planning of the footholds of legged robots. In this paper, a novel Deep Reinforcement Learning (DRL) method is proposed to implement multi-contact motion planning for hexapod robots moving on uneven plum-blossom piles. First, the motion of hexapod robots is formulated as a Markov Decision Process (MDP) with a specified reward function. Second, a transition feasibility model is proposed for hexapod robots, which describes the feasibility of the state transition under the condition of satisfying kinematics and dynamics, and in turn determines the rewards. Third, the footholds and Center-of-Mass (CoM) sequences are sampled from a diagonal Gaussian distribution and the sequences are optimized through learning the optimal policies using the designed DRL algorithm. Both the simulation and experimental results on physical systems demonstrate the feasibility and efficiency of the proposed method. Videos are shown at https://videoviewpage.wixsite.com/mcrl.
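The diagonal Gaussian sampling mentioned in the third step is standard across deep reinforcement learning; a generic sketch with invented dimensions (the paper's action space and policy network are not reproduced here):

    import numpy as np

    rng = np.random.default_rng(9)

    # A policy network would output these; fixed values stand in for it here.
    mean = np.zeros(18)                   # e.g. foothold / CoM action dimensions
    log_std = np.full(18, -0.5)

    # Reparameterized sample from the diagonal Gaussian and its log-probability.
    eps = rng.standard_normal(18)
    action = mean + np.exp(log_std) * eps
    log_prob = -0.5 * np.sum(eps ** 2 + 2 * log_std + np.log(2 * np.pi))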
