Journal articles on the topic 'Marked log-Gaussian Cox process'

Consult the top 19 journal articles for your research on the topic 'Marked log-Gaussian Cox process.'

1. Medialdea, Adriana, José Miguel Angulo, and Jorge Mateu. "Structural Complexity and Informational Transfer in Spatial Log-Gaussian Cox Processes." Entropy 23, no. 9 (August 31, 2021): 1135. http://dx.doi.org/10.3390/e23091135.

Abstract:
The doubly stochastic mechanism generating the realizations of spatial log-Gaussian Cox processes is empirically assessed in terms of generalized entropy, divergence and complexity measures. The aim is to characterize the contribution to stochasticity from the two phases involved, in relation to the transfer of information from the intensity field to the resulting point pattern, as well as regarding their marginal random structure. A number of scenarios are explored regarding the Matérn model for the covariance of the underlying log-intensity random field. Sensitivity with respect to varying values of the model parameters, as well as of the deformation parameters involved in the generalized informational measures, is analyzed on the basis of regular lattice partitionings. Both a marginal global assessment based on entropy and complexity measures, and a joint local assessment based on divergence and relative complexity measures, are addressed. A Poisson process and a log-Gaussian Cox process with white noise intensity, the first providing an upper bound for entropy, are considered as reference cases. Differences regarding the transfer of structural information from the intensity field to the subsequently generated point patterns, reflected by entropy, divergence and complexity estimates, are discussed according to the specifications considered. In particular, the magnitude of the decrease in marginal entropy estimates between the intensity random fields and the corresponding point patterns quantitatively discriminates the global effect of the additional source of variability involved in the second phase of the double stochasticity.
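
The two-phase mechanism described above is easy to mimic numerically. The sketch below is not taken from the paper; the grid size, Matérn parameters and entropy summary are illustrative choices. It simulates a log-Gaussian intensity with Matérn covariance on a regular lattice, draws conditionally Poisson counts per cell, and compares the Shannon entropies of the normalized intensity field and of the resulting point pattern:

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.special import gamma, kv

def matern_cov(d, var=1.0, rho=0.1, nu=1.5):
    """Matern covariance evaluated at distances d (illustrative parameter values)."""
    d = np.where(d == 0, 1e-12, d)          # avoid 0/0 at zero distance
    k = np.sqrt(2.0 * nu) * d / rho
    return var * (2.0 ** (1.0 - nu) / gamma(nu)) * (k ** nu) * kv(nu, k)

rng = np.random.default_rng(1)
n = 20                                       # n x n lattice on the unit square
grid = np.linspace(0.0, 1.0, n)
xy = np.stack(np.meshgrid(grid, grid), axis=-1).reshape(-1, 2)
C = matern_cov(cdist(xy, xy)) + 1e-8 * np.eye(n * n)

# Phase 1: latent Gaussian field -> log-Gaussian intensity, scaled to ~500 points
Z = rng.multivariate_normal(np.zeros(n * n), C)
lam = np.exp(Z) * (500.0 / np.exp(Z).sum())

# Phase 2: conditionally independent Poisson counts, one per lattice cell
counts = rng.poisson(lam)

def shannon_entropy(weights):
    p = weights / weights.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

print("entropy of the intensity field:", shannon_entropy(lam))
print("entropy of the point pattern  :", shannon_entropy(counts))
```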

2. Liu, Jia, and Jarno Vanhatalo. "Bayesian model based spatiotemporal survey designs and partially observed log Gaussian Cox process." Spatial Statistics 35 (March 2020): 100392. http://dx.doi.org/10.1016/j.spasta.2019.100392.

3. Beneš, Viktor, Karel Bodlák, Jesper Møller, and Rasmus Waagepetersen. "A Case Study on Point Process Modelling in Disease Mapping." Image Analysis & Stereology 24, no. 3: 159–68. http://dx.doi.org/10.5566/ias.v24.p159-168.

Abstract:
We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis (TBE), and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence of the risk on the covariates. Instead of using the common area-level approaches, we base the analysis on a Bayesian approach for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods. A particular problem which is thoroughly discussed is how to determine a model for the background population density. The risk map depends clearly on the population intensity model, and the basic model adopted for the population intensity determines which covariates influence the risk of TBE. Model validation is based on the posterior predictive distribution of various summary statistics.
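
As a schematic reminder of the model class (notation chosen here, not copied from the paper), a covariate-driven log Gaussian Cox process discretized over cells C_i yields conditionally Poisson cell counts:

```latex
\lambda(s) \;=\; \exp\{\, z(s)^{\top}\beta + \Psi(s) \,\}, \qquad
N(C_i) \mid \lambda \;\sim\; \mathrm{Poisson}\!\Big(\int_{C_i} \lambda(s)\,\mathrm{d}s\Big),
```

with z(s) the covariates, Psi a Gaussian random field, and posterior inference for beta and the discretized field carried out by MCMC.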

4. Samartsidis, Pantelis, Claudia R. Eickhoff, Simon B. Eickhoff, Tor D. Wager, Lisa Feldman Barrett, Shir Atzil, Timothy D. Johnson, and Thomas E. Nichols. "Bayesian log-Gaussian Cox process regression: applications to meta-analysis of neuroimaging working memory studies." Journal of the Royal Statistical Society: Series C (Applied Statistics) 68, no. 1 (June 29, 2018): 217–34. http://dx.doi.org/10.1111/rssc.12295.

5. Rostami, Mehran, Younes Mohammadi, Abdollah Jalilian, and Bashir Nazparvar. "Modeling spatio-temporal variations of substance abuse mortality in Iran using a log-Gaussian Cox point process." Spatial and Spatio-temporal Epidemiology 22 (August 2017): 15–25. http://dx.doi.org/10.1016/j.sste.2017.05.002.

6. Valente, Fernanda, and Márcio Laurini. "Tornado Occurrences in the United States: A Spatio-Temporal Point Process Approach." Econometrics 8, no. 2 (June 11, 2020): 25. http://dx.doi.org/10.3390/econometrics8020025.

Abstract:
In this paper, we analyze tornado occurrences in the United States. To perform inference for the spatio-temporal point process, we adopt a dynamic representation of the Log-Gaussian Cox Process. This representation is based on the decomposition of the intensity function into components of trend, cycles, and spatial effects. In this model, spatial effects are also represented by a dynamic functional structure, which allows analyzing possible changes in the spatio-temporal distribution of tornado occurrences due to possible changes in climate patterns. The model was estimated using Bayesian inference through Integrated Nested Laplace Approximations. We use data from the Storm Prediction Center's Severe Weather Database between 1954 and 2018, and the results provide evidence, from new perspectives, that trends in annual tornado occurrences in the United States have remained relatively constant, supporting previously reported findings.
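
Schematically, and with symbols chosen here for illustration rather than taken from the article, the dynamic decomposition of the log-intensity described above can be written as

```latex
\log \lambda(s,t) \;=\; \mu \;+\; \beta_t \;+\; c_t \;+\; \xi(s,t), \qquad
\beta_t \;=\; \beta_{t-1} + \eta_t, \quad \eta_t \sim \mathcal{N}(0,\sigma_\beta^2),
```

where mu is an overall level, beta_t a dynamic (random-walk) trend, c_t a cyclical component, and xi(s,t) a spatio-temporal Gaussian random field; the hyperparameters are estimated with INLA.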

7. Pulido, Eliana Soriano, Carlos Valencia Arboleda, and Juan Pablo Rodríguez Sánchez. "Study of the spatiotemporal correlation between sediment-related blockage events in the sewer system in Bogotá (Colombia)." Water Science and Technology 79, no. 9 (May 1, 2019): 1727–38. http://dx.doi.org/10.2166/wst.2019.172.

Abstract:
The planning and scheduling of maintenance operations of large conventional sewer systems generate a complex decision-making environment due to the difficulty in the collection and analysis of the spatiotemporal information about the operational and structural condition of their components (e.g. pipes, gully pots and manholes). As such, water utilities generally carry out these operations following a corrective approach. This paper studies the impact of the spatiotemporal correlation between these failure events using Log-Gaussian Cox Process (LGCP) models. In addition, the association of failure events to physical and environmental covariates was assessed. The proposed methods were applied to analyze sediment-related blockages in the sewer system of an operative zone in Bogotá (Colombia). The results of this research allowed the identification of significant covariates that were further used to model spatiotemporal clusters with high sediment-related failure risk in sewer systems. The LGCP model proved to be more accurate in comparison to those models that build upon a fundamental assumption that a failure is equally likely to occur at any time regardless of the state of the system and the system's history of failures (i.e. a homogeneous Poisson process model).

8. Lewy, Peter, and Kasper Kristensen. "Modelling the distribution of fish accounting for spatial correlation and overdispersion." Canadian Journal of Fisheries and Aquatic Sciences 66, no. 10 (October 2009): 1809–20. http://dx.doi.org/10.1139/f09-114.

Abstract:
The spatial distribution of cod (Gadus morhua) in the North Sea and the Skagerrak was analysed over a 24-year period using the Log Gaussian Cox Process (LGCP). In contrast to other spatial models of the distribution of fish, LGCP avoids problems with zero observations and includes the spatial correlation between observations. It is therefore possible to predict and interpolate unobserved densities at any location in the area. This is important for obtaining unbiased estimates of stock concentration and other measures depending on the distribution in the entire area. Results show that the spatial correlation and dispersion of cod catches remained unchanged during winter throughout the period, in spite of a drastic decline in stock abundance and a movement of the centre of gravity of the distribution towards the northeast in the same period. For the age groups considered, the concentration of the stock was found to be constant or declining in the period. This means that cod does not follow the theory of density-dependent habitat selection, as the concentration of the stock does not increase when stock abundance decreases.

9. Bäuerle, Heidi, and Arne Nothdurft. "Spatial modeling of habitat trees based on line transect sampling and point pattern reconstruction." Canadian Journal of Forest Research 41, no. 4 (April 2011): 715–27. http://dx.doi.org/10.1139/x11-004.

Abstract:
An approach is presented for the spatial modeling of rare habitat trees surveyed by line transect sampling (LTS) in a protected area of the European Natura 2000 network. The observed tree pattern is defined as a realization of a thinned point process where the thinning can be modeled by a parametric detection function. A complete pattern is reconstructed using an optimization algorithm. The start configuration contains detected tree locations and randomly generated tree positions. Empirical cumulative distribution functions (ECDFs) for intertree and location-to-tree distances estimated from the original LTS are set as target characteristics. The same ECDFs are estimated by means of virtual LTS in the reconstruction. Tree positions are relocated during the optimization. The sum of squared deviations between the ECDFs from the original LTS and the virtual LTS in the reconstruction is considered as a contrast measure. A new configuration is accepted if the contrast is lowered compared with the previous state. The nonparametrically reconstructed habitat tree patterns are described by a log Gaussian Cox process model. Evaluations by means of line transect resamplings in a complete habitat pattern show small deviations between the second-order functional characteristics obtained from the true pattern and their analogs derived from the reconstructions.
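
A stripped-down version of the reconstruction idea, simplified to a single target characteristic (the ECDF of inter-point distances) and with all sizes and counts chosen arbitrarily for illustration, could look like the Python below; the paper additionally matches location-to-tree distance ECDFs estimated by virtual line transect sampling:

```python
import numpy as np

rng = np.random.default_rng(0)

def ecdf_on_grid(values, grid):
    """Empirical CDF of `values` evaluated at the points of `grid`."""
    return np.searchsorted(np.sort(values), grid, side="right") / len(values)

def pairwise_dists(pts):
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    return d[np.triu_indices(len(pts), k=1)]

def contrast(pts, target_ecdf, grid):
    """Sum of squared deviations between current and target ECDFs."""
    return float(((ecdf_on_grid(pairwise_dists(pts), grid) - target_ecdf) ** 2).sum())

# Illustrative target: ECDF of inter-point distances of a reference pattern
# (in the paper this comes from the original line transect sample).
target_pts = rng.uniform(0.0, 1.0, size=(60, 2))
grid = np.linspace(0.0, 1.2, 50)
target = ecdf_on_grid(pairwise_dists(target_pts), grid)

pts = rng.uniform(0.0, 1.0, size=(60, 2))        # start configuration
best = contrast(pts, target, grid)
for _ in range(5000):                            # improvement-only relocation moves
    i = rng.integers(len(pts))
    proposal = pts.copy()
    proposal[i] = rng.uniform(0.0, 1.0, size=2)  # relocate one randomly chosen point
    c = contrast(proposal, target, grid)
    if c < best:                                 # accept only if the contrast decreases
        pts, best = proposal, c

print("final contrast:", best)
```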

10. Mørkrid, Lars, Alexander D. Rowe, Katja B. P. Elgstoen, Jess H. Olesen, George Ruijter, Patricia L. Hall, Silvia Tortorelli, et al. "Continuous Age- and Sex-Adjusted Reference Intervals of Urinary Markers for Cerebral Creatine Deficiency Syndromes: A Novel Approach to the Definition of Reference Intervals." Clinical Chemistry 61, no. 5 (May 1, 2015): 760–68. http://dx.doi.org/10.1373/clinchem.2014.235564.

Abstract:
BACKGROUND Urinary concentrations of creatine and guanidinoacetic acid divided by creatinine are informative markers for cerebral creatine deficiency syndromes (CDSs). The renal excretion of these substances varies substantially with age and sex, challenging the sensitivity and specificity of postanalytical interpretation. METHODS Results from 155 patients with CDS and 12 507 reference individuals were contributed by 5 diagnostic laboratories. They were binned into 104 adjacent age intervals and renormalized with Box–Cox transforms (Ξ). Estimates for central tendency (μ) and dispersion (σ) of Ξ were obtained for each bin. Polynomial regression analysis was used to establish the age dependence of both μ[log(age)] and σ[log(age)]. The regression residuals were then calculated as z-scores = {Ξ − μ[log(age)]}/σ[log(age)]. The process was iterated until all z-scores outside Tukey fences ±3.372 were identified and removed. Continuous percentile charts were then calculated and plotted by retransformation. RESULTS Statistically significant and biologically relevant subgroups of z-scores were identified. Significantly higher marker values were seen in females than males, necessitating separate reference intervals in both adolescents and adults. Comparison between our reconstructed reference percentiles and current standard age-matched reference intervals highlights an underlying risk of false-positive and false-negative events at certain ages. CONCLUSIONS Disease markers depending strongly on covariates such as age and sex require large numbers of reference individuals to establish peripheral percentiles with sufficient precision. This is feasible only through collaborative data sharing and the use of appropriate statistical methods. Broad application of this approach can be implemented through freely available Web-based software.
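
The fitting recipe in the abstract (Box–Cox renormalization, smooth age dependence for centre and spread, z-scores, iterative trimming at the ±3.372 Tukey fences) can be sketched as follows. This is a simplified stand-in, not the authors' code: it fits global polynomials in log(age) rather than binning into 104 age intervals, and it uses a crude scaled-absolute-residual estimate of the dispersion.

```python
import numpy as np
from scipy import stats

def fit_reference_model(age, value, degree=3, fence=3.372, n_iter=5):
    """Iteratively Box-Cox-transform the marker, model centre and spread as
    polynomials in log(age), and drop observations outside the Tukey fences."""
    keep = np.ones(len(value), dtype=bool)
    for _ in range(n_iter):
        y, lam = stats.boxcox(value[keep])                 # renormalisation (Xi)
        la = np.log(age[keep])
        mu_coef = np.polyfit(la, y, degree)                # central tendency vs log(age)
        resid = y - np.polyval(mu_coef, la)
        sd_coef = np.polyfit(la, np.abs(resid) * np.sqrt(np.pi / 2), degree)
        sd = np.maximum(np.polyval(sd_coef, la), 1e-6)     # crude dispersion vs log(age)
        z = resid / sd
        inliers = np.abs(z) <= fence
        if inliers.all():
            break
        keep[np.where(keep)[0][~inliers]] = False
    return lam, mu_coef, sd_coef, keep

rng = np.random.default_rng(2)
age = rng.uniform(0.1, 80.0, 2000)
value = np.exp(1.5 - 0.3 * np.log(age) + 0.25 * rng.standard_normal(2000))  # synthetic marker
lam, mu_coef, sd_coef, keep = fit_reference_model(age, value)
print("Box-Cox lambda:", round(lam, 3), "| observations kept:", int(keep.sum()))
```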

11. Chalimatusadiah, Chalimatusadiah, Donny Citra Lesmana, and Retno Budiarti. "Penentuan Harga Opsi Dengan Volatilitas Stokastik Menggunakan Metode Monte Carlo" [Determining Option Prices with Stochastic Volatility Using the Monte Carlo Method]. Jambura Journal of Mathematics 3, no. 1 (April 28, 2021): 80–92. http://dx.doi.org/10.34312/jjom.v3i1.10137.

Abstract:
What matters most in options trading is determining the optimal option price. In real market conditions, however, fluctuations in asset prices indicate that the volatility of asset prices is not constant, which makes it difficult for investors to determine the optimal option price. This article discusses the optimal pricing of European-type options with stochastic volatility using the Monte Carlo method, and the effect of the initial stock price, strike price, and time to maturity on European option prices. The stochastic volatility model used in this study is the Heston model, which assumes that the stock price process (St) follows a log-normal distribution and that the volatility process (Vt) follows a Cox-Ingersoll-Ross process. The first step in this study is to estimate the parameters of the Heston model using the ordinary least squares method and the Euler-Maruyama numerical scheme. The second step is to simulate stock prices and obtain the European option price using Monte Carlo simulation. The results show that the Monte Carlo method for pricing European options under the Heston stochastic volatility model produces a fairly good solution, because it has a small error and converges to the exact solution as the number of simulations increases. In addition, the Monte Carlo simulations show that the strike price, initial stock price, and time to maturity influence the option price in a manner consistent with option pricing theory.
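
For illustration only (parameter values and scheme details are chosen here, not taken from the article, and the ordinary-least-squares calibration step is not shown), the Euler-Maruyama / Monte Carlo part of such a procedure might look like this:

```python
import numpy as np

def heston_call_mc(S0=100.0, K=100.0, T=1.0, r=0.03, v0=0.04,
                   kappa=2.0, theta=0.04, xi=0.3, rho=-0.7,
                   n_paths=100_000, n_steps=250, seed=0):
    """Monte Carlo price of a European call under the Heston model,
    using a full-truncation Euler-Maruyama discretisation."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.full(n_paths, S0)
    v = np.full(n_paths, v0)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n_paths)
        v_pos = np.maximum(v, 0.0)                     # keep the CIR variance non-negative
        S *= np.exp((r - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
        v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
    payoff = np.maximum(S - K, 0.0)
    disc = np.exp(-r * T)
    return disc * payoff.mean(), disc * payoff.std(ddof=1) / np.sqrt(n_paths)

price, se = heston_call_mc()
print(f"European call under Heston: {price:.3f} (MC standard error {se:.3f})")
```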

12. Jansen, Teunis, Kasper Kristensen, Jeroen van der Kooij, Søren Post, Andrew Campbell, Kjell Rong Utne, Pablo Carrera, et al. "Nursery areas and recruitment variation of Northeast Atlantic mackerel (Scomber scombrus)." ICES Journal of Marine Science 72, no. 6 (October 29, 2014): 1779–89. http://dx.doi.org/10.1093/icesjms/fsu186.

Abstract:
There are currently no dedicated recruitment survey data available in support of the assessment of the abundance and distribution of Northeast Atlantic (NEA) mackerel (Scomber scombrus), one of the most widespread and commercially important fish stocks in the North Atlantic. This is despite the fact that an estimate of recruitment is an important requirement for the provision of advice to fishery managers. The work here addresses this by compiling catch rates of juvenile mackerel from bottom-trawl surveys conducted between October and March during 1998–2012 and applying a log Gaussian Cox (LGC) process geostatistical model incorporating spatio-temporal correlations. A statistically significant correlation between the modelled catch rates in adjacent quarters 4 and 1 (Q4 and Q1) demonstrates that bottom-trawl surveys in winter are an appropriate platform for sampling juvenile mackerel, and that the LGC model is successful in extracting a population abundance signal from the data. In this regard, the model performed appreciably better than a more commonly used raising algorithm based on survey swept-area estimates. Therefore, the LGC model was expanded to include data from the entire survey time-series, and a recruitment index was developed for use in the annual ICES stock assessment. We hypothesize that catchability is positively density-dependent, and provide supporting evidence from acoustic observations. Various density-dependent transformations of the modelled catch rates were furthermore found to improve the correlation between the derived annual recruitment index and recruitment estimated by backcalculation of adult mackerel data. Square root transformation led to the strongest correlation, so this is recommended for further analysis of mackerel abundance. Finally, we provide maps of spatial distributions, showing that the most important nursery areas are around Ireland, north and west of Scotland, in the northern North Sea north of 59°N and, to some extent, also in the Bay of Biscay.

13. Kim, Byung Soo, Chul Won Choi, and Seok Jin Kim. "Serum VEGF Per Platelet Count as a Prognosis Predicting Factor in Advanced Gastric Cancer Patients." Blood 112, no. 11 (November 16, 2008): 5467. http://dx.doi.org/10.1182/blood.v112.11.5467.5467.

Abstract:
Background: New blood vessel formation is a crucial step in the process of tumor growth and systemic metastasis. Recent studies have shown that VEGF expression not in tissues but in serum samples is correlated with tumor vascularity, and that high serum VEGF levels could predict poor prognosis in cancer patients. However, there have been no data regarding the clinical and prognostic significance of serum VEGF levels per platelet count in advanced gastric cancer. In this study, we evaluated the prognostic implication of serum VEGF per platelet count in patients with advanced gastric cancer. Methods: 111 patients with histologically confirmed gastric cancer and 35 patients with early gastric cancer were included, and control serum samples were acquired from 25 healthy volunteers. The levels of VEGF were measured using a human VEGF quantitative enzyme-linked immunosorbent assay (ELISA). Survival curves were calculated using the Kaplan-Meier method and survival comparisons were made by the log rank test in metastatic gastric cancer. The Cox proportional hazards regression model was utilized for multivariate analyses after univariate analysis defined relevant prognostic variables. Results: The mean serum VEGF level was higher in patients with AGC compared to those with EGC and controls (AGC 465 ± 315.8 pg/ml; EGC 306 ± 97.8 pg/ml; controls 230.8 ± 53.2 pg/ml; P < 0.033). A trend toward a significant positive correlation between serum VEGF and platelet counts was observed in patients with AGC (r = 0.477, P = 0.000, Fig 2), and there was a significant correlation between serum VEGF levels and differentiation of the tumor (p = 0.014) and stage (p = 0.036). The overall survival (log rank, p = 0.0432) and the progression-free survival (median 4.5 vs. 8.9 months; log rank, p = 0.0116) were significantly shorter in patients with a high VEGF per platelet count (≥1.626 pg/10⁶). In the multivariate analysis, performance status (P = 0.025), the presence of peritoneal carcinomatosis (P = 0.006), and serum VEGF per platelet (P = 0.005) were found to be significantly associated with short progression-free survival. Conclusions: This study demonstrated that serum VEGF per platelet count is correlated with short overall survival and progression-free survival in advanced gastric cancer patients. Therefore, serum VEGF per platelet may be a useful marker for predicting the prognosis of advanced gastric cancer patients.

14. Chang, Jinjia, Midie Xu, Hui Sun, Wenhua Li, Min Ye, Weiwei Weng, and Xiaodong Zhu. "Prognostic value of DNA repair gene based on stratification of gastric cancer." Journal of Clinical Oncology 37, no. 4_suppl (February 1, 2019): 12. http://dx.doi.org/10.1200/jco.2019.37.4_suppl.12.

Abstract:
Background: DNA repair genes can be used as prognostic biomarkers in many types of cancer. We aimed to identify prognostic DNA repair genes in patients with gastric cancer (GC) by systematic bioinformatic approaches using web-based databases. Methods: Global gene expression profiles from altogether 1,325 GC patients' samples from six independent datasets were included in the study. Clustering analysis was performed to screen potentially abnormal DNA repair genes related to the prognosis of GC, followed by unsupervised clustering analysis to identify molecular subtypes of GC. Characteristics and prognosis differences were analyzed among these molecular subtypes, and modular key genes in molecular subtypes were identified based on changes in expression correlation. Multivariate Cox proportional hazards analysis was used to find independent prognostic genes. The Kaplan-Meier method and log-rank test were used to estimate correlations of key DNA repair genes with GC patients' overall survival. Results: There were 57 key genes significantly associated with GC patients' prognosis, and patients were stratified into three molecular clusters based on their expression profiles, in which patients in Cluster 3 showed the best survival (P < 0.05). After a three-phase training, test and validation process, an expression profile of 13 independent key DNA repair genes was identified that can classify the prognostic risk of patients. Compared with patients with a low risk score, patients with a high risk score in the training set had shorter overall survival (P < 0.0001). Furthermore, we verified equivalent findings by these key DNA repair genes in the test set (P < 0.0001) and the independent validation set (P = 0.0024). Conclusions: Our results suggest a great potential for the use of DNA repair gene profiling as a powerful marker in prognostication and to inform treatment decisions for GC patients.

15. Kuronen, Mikko, Aila Särkkä, Matti Vihola, and Mari Myllymäki. "Hierarchical log Gaussian Cox process for regeneration in uneven-aged forests." Environmental and Ecological Statistics, August 20, 2021. http://dx.doi.org/10.1007/s10651-021-00514-3.

Abstract:
We propose a hierarchical log Gaussian Cox process (LGCP) for point patterns, where a set of points x affects another set of points y but not vice versa. We use the model to investigate the effect of large trees on the locations of seedlings. In the model, every point in x has a parametric influence kernel or signal, which together form an influence field. Conditionally on the parameters, the influence field acts as a spatial covariate in the intensity of the model, and the intensity itself is a non-linear function of the parameters. Points outside the observation window may affect the influence field inside the window. We propose an edge correction to account for this missing data. The parameters of the model are estimated in a Bayesian framework using Markov chain Monte Carlo where a Laplace approximation is used for the Gaussian field of the LGCP model. The proposed model is used to analyze the effect of large trees on the success of regeneration in uneven-aged forest stands in Finland.
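
In symbols chosen here for illustration (the paper's exact parametrization may differ), the construction reads roughly

```latex
Z(s) \;=\; \sum_{x_i \in \mathbf{x}} k\bigl(\lVert s - x_i \rVert;\,\theta\bigr), \qquad
\lambda_{\mathbf{y}}(s) \;=\; \exp\bigl\{\beta_0 + g\bigl(Z(s);\alpha\bigr) + G(s)\bigr\},
```

where k is the parametric influence kernel attached to each large tree, Z(s) is the resulting influence field acting as a covariate, g is a non-linear function of the parameters, and G is the Gaussian random field of the LGCP; points of x outside the observation window contribute to Z(s) through the proposed edge correction.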

16. Jullum, Martin. "Investigating mesh-based approximation methods for the normalization constant in the log Gaussian Cox process likelihood." Stat 9, no. 1 (July 21, 2020). http://dx.doi.org/10.1002/sta4.285.

17. Neyens, Thomas, Peter J. Diggle, Christel Faes, Natalie Beenaerts, Tom Artois, and Emanuele Giorgi. "Mapping species richness using opportunistic samples: a case study on ground-floor bryophyte species richness in the Belgian province of Limburg." Scientific Reports 9, no. 1 (December 2019). http://dx.doi.org/10.1038/s41598-019-55593-x.

Abstract:
In species richness studies, citizen-science surveys where participants make individual decisions regarding sampling strategies provide a cost-effective approach to collect a large amount of data. However, it is unclear to what extent the bias inherent to opportunistically collected samples may invalidate our inferences. Here, we compare spatial predictions of forest ground-floor bryophyte species richness in Limburg (Belgium), based on crowd- and expert-sourced data, where the latter are collected by adhering to a rigorous geographical randomisation and data collection protocol. We develop a log-Gaussian Cox process model to analyse the opportunistic sampling process of the crowd-sourced data and assess its sampling bias. We then fit two geostatistical Poisson models to both data-sets and compare the parameter estimates and species richness predictions. We find that the citizens had a higher propensity for locations that were close to their homes and environmentally more valuable. The estimated effects of ecological predictors and spatial species richness predictions differ strongly between the two geostatistical models. Unknown inconsistencies in the sampling process, such as unreported observer’s effort, and the lack of a hypothesis-driven study protocol can lead to the occurrence of multiple sources of sampling bias, making it difficult, if not impossible, to provide reliable inferences.

18. Becker, Devan G., Douglas G. Woolford, and Charmaine B. Dean. "Algorithmically deconstructing shot locations as a method for shot quality in hockey." Journal of Quantitative Analysis in Sports, October 10, 2020. http://dx.doi.org/10.1515/jqas-2020-0012.

Abstract:
Spatial point processes have been successfully used to model the relative efficiency of shot locations for each player in professional basketball games. Those analyses were possible because each player makes enough baskets to reliably fit a point process model. Goals in hockey are rare enough that a point process cannot be fit to each player’s goal locations, so novel techniques are needed to obtain measures of shot efficiency for each player. A Log-Gaussian Cox Process (LGCP) is used to model all shot locations, including goals, of each NHL player who took at least 500 shots during the 2011–2018 seasons. Each player’s LGCP surface is treated as an image and these images are then used in an unsupervised statistical learning algorithm that decomposes the pictures into a linear combination of spatial basis functions. The coefficients of these basis functions are shown to be a very useful tool to compare players. To incorporate goals, the locations of all shots that resulted in a goal are treated as a “perfect player” and used in the same algorithm (goals are further split into perfect forwards, perfect centres and perfect defence). These perfect players are compared to other players as a measure of shot efficiency. This analysis provides a map of common shooting locations, identifies regions with the most goals relative to the number of shots and demonstrates how each player’s shot location differs from scoring locations.
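
A minimal sketch of the "intensity surfaces as images" step, with non-negative matrix factorisation standing in for the unsupervised decomposition the abstract describes; the fitted LGCP surfaces are replaced here by synthetic images, and all dimensions, hot-spot locations and component counts are illustrative:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)

# Stand-in for fitted LGCP intensity surfaces: one 40 x 20 half-rink grid per player,
# flattened to a row; in practice these would come from the fitted point-process models.
n_players, ny, nx = 200, 40, 20
yy, xx = np.mgrid[0:ny, 0:nx]
hot_spots = [(8, 10), (20, 5), (20, 15), (32, 10)]
blobs = [np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 18.0) for cy, cx in hot_spots]
weights = rng.gamma(2.0, 1.0, size=(n_players, len(blobs)))
images = weights @ np.stack([b.ravel() for b in blobs]) + rng.uniform(0.0, 0.05, (n_players, ny * nx))

# Decompose the images into a linear combination of non-negative spatial basis functions.
model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
coefs = model.fit_transform(images)            # per-player coefficients
bases = model.components_.reshape(-1, ny, nx)  # spatial basis surfaces

# Rows of `coefs` can then be compared directly, e.g. against a reference row built
# from goal locations only (the "perfect player" idea in the abstract).
print(coefs.shape, bases.shape)
```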

19. Lacquaniti, Antonio, Susanna Campo, Teresa Casuscelli Di Tocco, and Paolo Monardo. "MO902 Serum Free Light Chains in Hemodialysis Patients: A Bridge between Inflammation, Immune System Dysfunction and Mortality Risk." Nephrology Dialysis Transplantation 36, Supplement_1 (May 1, 2021). http://dx.doi.org/10.1093/ndt/gfab102.003.

Abstract:
Background and Aims: Uremic toxins, poorly removed by conventional hemodialysis (HD), represent independent risk factors for mortality in end-stage renal disease (ESRD). Middle-molecule uremic toxins have been associated with pathological features of uremia, such as immune dysfunction and inflammation. These two entities are not mutually exclusive, but they could represent two sides of the same coin. ESRD-associated inflammation is closely related to the activation of the innate immune system. Free light chains (FLC) may be a specific assessment of inflammation, representing a direct function of adaptive immunity through B-cell lineage production rather than a general marker of inflammation. While several studies have assessed the relation between FLCs and mortality risk in chronic kidney disease (CKD), FLCs as uremic toxins in non-multiple-myeloma dialyzed patients have been only marginally analyzed. The aim of this prospective study was to evaluate the clinical impact of FLC levels in HD patients during a 2-year follow-up, analysing the relations with biomarkers of inflammation, such as C-reactive protein (CRP) and procalcitonin (PCT), main lymphocyte subsets, such as CD4+ and CD8+ T cell counts, and high mobility group box (HMGB)-1 levels, as an expression of the innate immune system. The potential link between FLC levels and mortality risk was assessed. Method: 190 patients on chronic hemodialysis at the Nephrology and Dialysis Unit of Papardo Hospital in Messina, Italy, were enrolled and followed for 2 years. Inclusion criteria were: age >18 years, absent or <200 ml/day residual diuresis, and a κ/λ ratio within the renal reference range (0.37–3.1). Receiver operating characteristic (ROC) analysis was performed to estimate the cut-off points of HMGB-1 and cFLC. Kaplan-Meier survival analysis and a multivariate Cox proportional hazards model were used for clinical outcome. Results: HD patients were characterized by high FLC levels. κFLC values were 182.3 (IQR: 140.2–216.1) mg/L, whereas λFLC levels were 108.2 (IQR: 72.7–143.2) mg/L. The median combined (c)FLC concentration was 182.9 mg/L (IQR = 207.8–330.2), far greater than the median reported in the general population (normal range = 9.3–43.3 mg/L) and in CKD patients [68.9 mg/L (IQR = 49.4–100.9)]. No differences in cFLC levels were revealed according to dialysis techniques. HD patients showed a significant reduction of CD4+ counts and of the CD4+/CD8+ ratio. High HMGB1 levels were detected in HD patients (161.3 ± 39.7 ng/ml) and were related to PCT and cFLC (r = -0.38; p < 0.001), with an inverse relation to the CD4+/CD8+ ratio. cFLC positively correlated with β2-microglobulin, hemoglobin, and HMGB1. Conversely, an inverse correlation was revealed with surrogate markers of inflammation, such as CRP, procalcitonin, and neutrophil counts. There were 49 deaths during the follow-up. The majority (23/49) of deaths were attributed to cardiovascular disease, the remainder to infection and malignancy. cFLC and sHMGB-1 levels in this group were significantly elevated. By ROC analysis, HMGB-1 levels > 100.9 ng/mL and cFLC > 223.4 mg/l were associated with a significantly lower survival rate (p < 0.02 by log-rank test) than for patients with lower levels when using Kaplan-Meier analysis. After adjusting for confounding factors by the Cox proportional hazards method, the difference remained statistically significant (p = 0.02). Conclusion: Our study demonstrated an independent relation between high cFLC levels and mortality in HD patients. cFLCs represent a potential biomarker of "inflammunity", a physiopathological process playing a pivotal role in ESRD, based on a vicious circle between inflammation and immune dysfunction. Further in-depth examinations should verify our findings, determining whether therapeutic measures targeting cFLC balance, such as hemodiafiltration and expanded dialysis, would be helpful to reduce the "inflammunity" process characterizing dialyzed patients.