Academic literature on the topic 'Universal background models'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Universal background models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Universal background models"

1

St-Charles, Pierre-Luc, Guillaume-Alexandre Bilodeau, and Robert Bergevin. "Universal Background Subtraction Using Word Consensus Models." IEEE Transactions on Image Processing 25, no. 10 (2016): 4768–81. http://dx.doi.org/10.1109/tip.2016.2598691.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Billeb, Stefan, Christian Rathgeb, Herbert Reininger, Klaus Kasper, and Christoph Busch. "Biometric template protection for speaker recognition based on universal background models." IET Biometrics 4, no. 2 (2015): 116–26. http://dx.doi.org/10.1049/iet-bmt.2014.0031.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Rua, Enrique Argones, Emanuele Maiorana, Jose Luis Alba Castro, and Patrizio Campisi. "Biometric Template Protection Using Universal Background Models: An Application to Online Signature." IEEE Transactions on Information Forensics and Security 7, no. 1 (2012): 269–82. http://dx.doi.org/10.1109/tifs.2011.2168213.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Eichhorn, Astrid, Tim Koslowski, and Antonio Pereira. "Status of Background-Independent Coarse Graining in Tensor Models for Quantum Gravity." Universe 5, no. 2 (2019): 53. http://dx.doi.org/10.3390/universe5020053.

Full text
Abstract:
A background-independent route towards a universal continuum limit in discrete models of quantum gravity proceeds through a background-independent form of coarse graining. This review provides a pedagogical introduction to the conceptual ideas underlying the use of the number of degrees of freedom as a scale for a Renormalization Group flow. We focus on tensor models, for which we explain how the tensor size serves as the scale for a background-independent coarse-graining flow. This flow provides a new probe of a universal continuum limit in tensor models. We review the development and setup of this tool and summarize results in the two- and three-dimensional case. Moreover, we provide a step-by-step guide to the practical implementation of these ideas and tools by deriving the flow of couplings in a rank-4-tensor model. We discuss the phenomenon of dimensional reduction in these models and find tentative first hints for an interacting fixed point with potential relevance for the continuum limit in four-dimensional quantum gravity.
APA, Harvard, Vancouver, ISO, and other styles
5

Ayoub, Bouziane, Kharroubi Jamal, and Zarghili Arsalane. "Towards an Optimal Speaker Modeling in Speaker Verification Systems using Personalized Background Models." International Journal of Electrical and Computer Engineering (IJECE) 7, no. 6 (2017): 3655–63. https://doi.org/10.11591/ijece.v7i6.pp3655-3663.

Full text
Abstract:
This paper presents a novel speaker modeling approach for speaker recognition systems. The basic idea of this approach consists of deriving the target speaker model from a personalized background model, composed only of the UBM Gaussian components which are really present in the speech of the target speaker. The motivation behind the derivation of speakers' models from personalized background models is to exploit the observed difference in some acoustic classes between speakers, in order to improve the performance of speaker recognition systems. The proposed approach was evaluated for the speaker verification task using various amounts of training and testing speech data. The experimental results showed that the proposed approach is efficient in terms of both verification performance and computational cost during the testing phase of the system, compared to traditional UBM-based speaker recognition systems.
APA, Harvard, Vancouver, ISO, and other styles
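The derivation described in the abstract, keeping only those UBM components that are actually present in the target speaker's speech, can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code; the occupancy threshold `min_occupancy` is an assumed parameter.

```python
import numpy as np

def log_gauss_diag(X, mean, var):
    """Per-frame log-density of a diagonal-covariance Gaussian."""
    return -0.5 * (np.sum(np.log(2 * np.pi * var))
                   + np.sum((X - mean) ** 2 / var, axis=1))

def personalize_ubm(X, weights, means, variances, min_occupancy=0.01):
    """Keep only the UBM components really present in speaker frames X."""
    # Posterior responsibility of each component for each frame.
    log_p = np.stack([np.log(w) + log_gauss_diag(X, m, v)
                      for w, m, v in zip(weights, means, variances)], axis=1)
    log_p -= log_p.max(axis=1, keepdims=True)
    post = np.exp(log_p)
    post /= post.sum(axis=1, keepdims=True)
    occupancy = post.mean(axis=0)   # average responsibility per component
    keep = occupancy > min_occupancy
    new_weights = weights[keep] / weights[keep].sum()
    return keep, new_weights, means[keep], variances[keep]
```

The target speaker model would then be adapted from this reduced mixture rather than from the full UBM.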
6

Fatemi, Bahare, Siamak Ravanbakhsh, and David Poole. "Improved Knowledge Graph Embedding Using Background Taxonomic Information." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 3526–33. http://dx.doi.org/10.1609/aaai.v33i01.33013526.

Full text
Abstract:
Knowledge graphs are used to represent relational information in terms of triples. To enable learning about domains, embedding models, such as tensor factorization models, can be used to make predictions of new triples. Often there is background taxonomic information (in terms of subclasses and subproperties) that should also be taken into account. We show that existing fully expressive (a.k.a. universal) models cannot provably respect subclass and subproperty information. We show that minimal modifications to an existing knowledge graph completion method enables injection of taxonomic information. Moreover, we prove that our model is fully expressive, assuming a lower-bound on the size of the embeddings. Experimental results on public knowledge graphs show that despite its simplicity our approach is surprisingly effective.
APA, Harvard, Vancouver, ISO, and other styles
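One way to make the abstract's idea concrete: in a bilinear (DistMult-style) factorization with non-negative embeddings, constraining a subproperty's relation vector to lie elementwise below its superproperty's guarantees the taxonomic entailment score(sub) ≤ score(super) for every entity pair. This is a hypothetical illustration of injecting taxonomic information, not the paper's exact construction.

```python
import numpy as np

def score(h, r, t):
    # DistMult-style triple score: sum_i h_i * r_i * t_i.
    return float(np.sum(h * r * t))

rng = np.random.default_rng(0)
dim = 8
h, t = rng.random(dim), rng.random(dim)   # non-negative entity embeddings
r_super = rng.random(dim)                 # e.g. a relation "locatedIn"
r_sub = r_super * rng.random(dim)         # elementwise <= r_super: a subproperty

# Non-negativity plus the elementwise ordering make the entailment hold
# for every (h, t): the model provably respects the subproperty axiom.
assert score(h, r_sub, t) <= score(h, r_super, t)
```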
7

Dişken, Gökay, Zekeriya Tüfekci, and Ulus Çevik. "Speaker Model Clustering to Construct Background Models for Speaker Verification." Archives of Acoustics 42, no. 1 (2017): 127–35. http://dx.doi.org/10.1515/aoa-2017-0014.

Full text
Abstract:
Conventional speaker recognition systems use the Universal Background Model (UBM) as an imposter for all speakers. In this paper, speaker models are clustered to obtain better imposter model representations for speaker verification purposes. First, a UBM is trained, and speaker models are adapted from the UBM. Then, the k-means algorithm with the Euclidean distance measure is applied to the speaker models. The speakers are divided into two, three, four, and five clusters. The resulting cluster centers are used as background models of their respective speakers. Experiments showed that the proposed method consistently produced lower Equal Error Rates (EER) than the conventional UBM approach for 3-, 10-, and 30-second-long test utterances, and also for channel mismatch conditions. The proposed method is also compared with the i-vector approach. The three-cluster model achieved the best performance, with a 12.4% relative EER reduction on average compared to the i-vector method. Statistical significance of the results is also given.
APA, Harvard, Vancouver, ISO, and other styles
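The clustering step the abstract describes can be sketched as follows, assuming each speaker is represented by the mean supervector of their adapted GMM; the tiny k-means below stands in for any standard implementation and the toy supervectors are invented for illustration.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means with Euclidean distance, applied to speaker models."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Speaker models as mean supervectors (one row per speaker); each cluster
# center then serves as the background (imposter) model for the speakers
# assigned to that cluster.
supervectors = np.array([[0.0, 0.0], [0.2, 0.0], [9.8, 10.0], [10.0, 10.0]])
backgrounds, assignment = kmeans(supervectors, k=2)
```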
8

You, Zuyao, Lingyu Kong, Lingchen Meng, and Zuxuan Wu. "FOCUS: Towards Universal Foreground Segmentation." Proceedings of the AAAI Conference on Artificial Intelligence 39, no. 9 (2025): 9580–88. https://doi.org/10.1609/aaai.v39i9.33038.

Full text
Abstract:
Foreground segmentation is a fundamental task in computer vision, encompassing various subdivision tasks. Previous research has typically designed task-specific architectures for each task, leading to a lack of unification. Moreover, they primarily focus on recognizing foreground objects without effectively distinguishing them from the background. In this paper, we emphasize the importance of the background and its relationship with the foreground. We introduce FOCUS, the Foreground ObjeCts Universal Segmentation framework that can handle multiple foreground tasks. We develop a multi-scale semantic network using the edge information of objects to enhance image features. To achieve boundary-aware segmentation, we propose a novel distillation method, integrating the contrastive learning strategy to refine the prediction mask in multi-modal feature space. We conduct extensive experiments on a total of 13 datasets across 5 tasks, and the results demonstrate that FOCUS consistently outperforms the state-of-the-art task-specific models on most metrics.
APA, Harvard, Vancouver, ISO, and other styles
9

Evstafiev, Dmitry G., and Lubov A. Tsyganova. "After Post-Modernity: Discussion Points Against the Background of Global Transformations." RUDN Journal of Political Science 25, no. 2 (2023): 293–307. http://dx.doi.org/10.22363/2313-1438-2023-25-2-293-307.

Full text
Abstract:
The transition of a basically local military-political crisis in Western Eurasia into a real focus of global geopolitical transformations and civilizational confrontation has brought to the agenda the degradation of the principles of universality that underpinned globalization in both the socio-political and socio-economic spheres, demonstrating the deepening interaction between them. The world faces the prospect of competition between different models of development and their political and social localization, reflecting the specifics of the socio-economic environment. The fact that global transformations resulted from the interaction of objective and subjective, contextual factors, and were sometimes brought to reality through the interaction of political leaders, leads to the conclusion that the world is now in a transitional era containing several points of bifurcation of a political nature, which in turn drive different models of socioeconomic development. The sharp nature of the ongoing transformations reflects a situation in which most of the paradigms and instrumental models once regarded as axiomatically universal, such as institutional governance and representative democracy, have begun to lose their relevance as political and socio-political management tools. The same is true of many global economic tools, such as the universally protected nature of economic interdependence and international trade; yet all of this was the basis for globalization. The system of global political and economic relations, which until quite recently resided in an environment of nearly total universality, has begun to lose synergy and integrity while forming complex localized formats in which political and socio-cultural factors play the leading role, ahead of the economic basis and socio-economic relationships.
The research concludes that two competing models claiming global status may emerge, and examines the specifics of the interaction between them and their key differences.
APA, Harvard, Vancouver, ISO, and other styles
10

Mosser, Kurt. "The Grammatical Background of Kant's General Logic." Kantian Review 13, no. 1 (2008): 116–40. http://dx.doi.org/10.1017/s1369415400001114.

Full text
Abstract:
In the Critique of Pure Reason, Kant conceives of general logic as a set of universal and necessary rules for the possibility of thought, or as a set of minimal necessary conditions for ascribing rationality to an agent (exemplified by the principle of non-contradiction). Such a conception, of course, contrasts with contemporary notions of formal, mathematical or symbolic logic. Yet, in so far as Kant seeks to identify those conditions that must hold for the possibility of thought in general, such conditions must hold a fortiori for any specific model of thought, including axiomatic treatments of logic and standard natural deduction models of first-order predicate logic. Kant's general logic seeks to isolate those conditions by thinking through, or better, reflecting on, those conditions that themselves make thought possible.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Universal background models"

1

Giannantonio, Tommaso. "Constraining cosmological models with cosmic microwave background fluctuations from the late universe." Thesis, University of Portsmouth, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.516237.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bekli, Zeid, and William Ouda. "A performance measurement of a Speaker Verification system based on a variance in data collection for Gaussian Mixture Model and Universal Background Model." Thesis, Malmö universitet, Fakulteten för teknik och samhälle (TS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-20122.

Full text
Abstract:
Voice recognition has become a more focused and researched field in the last century, and new techniques to identify speech have been introduced. A part of voice recognition is speaker verification, which is divided into a front-end and a back-end. The first component is the front-end, or feature extraction, where techniques such as Mel-Frequency Cepstrum Coefficients (MFCC) are used to extract the speaker-specific features of a speech signal; MFCC is mostly used because it is based on the known variations of the human ear's critical frequency bandwidth. The second component is the back-end, which handles the speaker modeling. The back-end is based on the Gaussian Mixture Model (GMM) and Gaussian Mixture Model-Universal Background Model (GMM-UBM) methods for enrollment and verification of the specific speaker. In addition, normalization techniques such as Cepstral Mean Subtraction (CMS) and feature warping are also used for robustness against noise and distortion. In this paper, we build a speaker verification system, experiment with varying amounts of training data for the true speaker model, and evaluate the system's performance. To further investigate the area of security in a speaker verification system, two methods are compared (GMM and GMM-UBM) to determine which is more secure depending on the amount of training data available. This research therefore contributes to understanding how much data is really necessary for a secure system where the false positive rate is as close to zero as possible, how the amount of training data affects the false negative (FN) rate, and how this differs between GMM and GMM-UBM. 
The results show that an increase in speaker-specific training data increases the performance of the system. However, too much training data has been shown to be unnecessary, because the performance of the system eventually reaches its highest point; in this case it was around 48 minutes of data. The results also show that the GMM-UBM models containing 48 to 60 minutes of data outperformed the GMM models.
APA, Harvard, Vancouver, ISO, and other styles
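The GMM-UBM verification step compared in the thesis reduces to an average log-likelihood ratio between the target-speaker model and the UBM. A minimal NumPy sketch, assuming diagonal-covariance mixtures; the toy model parameters in the usage below are invented for illustration:

```python
import numpy as np

def gmm_loglik(X, weights, means, variances):
    """Per-frame log-likelihood under a diagonal-covariance GMM (log-sum-exp)."""
    lp = np.stack([np.log(w)
                   - 0.5 * (np.sum(np.log(2 * np.pi * v))
                            + np.sum((X - m) ** 2 / v, axis=1))
                   for w, m, v in zip(weights, means, variances)], axis=1)
    mx = lp.max(axis=1, keepdims=True)
    return (mx + np.log(np.exp(lp - mx).sum(axis=1, keepdims=True))).ravel()

def verification_score(X, speaker_gmm, ubm):
    """Average log-likelihood ratio; accept if it exceeds a tuned threshold."""
    return float(np.mean(gmm_loglik(X, *speaker_gmm) - gmm_loglik(X, *ubm)))
```

A genuine trial (frames matching the speaker model) yields a positive score and an imposter trial a negative one; the decision threshold is tuned on development data.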
3

Errard, Josquin. "A hunt for cosmic microwave background B-modes in the systematic contaminants jungle." Paris 7, 2012. http://www.theses.fr/2012PA077260.

Full text
Abstract:
This thesis presents a study of selected instrumental and astrophysical systematics which may affect the performance of the new generation of future observations of the Cosmic Microwave Background (CMB) polarization. It elaborates on their impact on the science goals of those observations and discusses techniques and approaches for their removal. Its focus is on general issues typical of entire classes of experiments, but also on specific problems as encountered in the context of a CMB B-mode experiment, POLARBEAR. The main target of the CMB polarization effort currently underway in the field is a detection of the primordial B-mode anisotropies, a so far undetected signature of the inflationary theories. This would have far-reaching impact on our understanding of the universe but also on the fundamental laws of physics. 
Understanding, modelling, and ultimately removing the systematics are essential steps in any modern CMB analysis pipeline, and their successful accomplishment, together with a high instrumental sensitivity, will decide the final success of the entire effort. In this thesis I first describe the optics of typical CMB experiments and introduce a parametrization of instrumental and cross-polarization effects particularly convenient for the analysis of their impact. Second, I present a model describing the atmospheric contamination and use it to provide some insights into the atmosphere's role and its impact on the performance of ground-based experiments. I also outline how it could be used further to improve control of atmospheric effects in CMB data analysis. Then, I discuss another source of sky systematics: the polarized astrophysical foregrounds. In this context I present, on the one hand, a new approach to forecasting the performance of future experiments which accounts for the presence of the foregrounds, while on the other I propose a framework for optimizing the hardware of such experiments to let them achieve better performance. This part of the thesis stems from joint work with F. Stivoli and R. Stompor. I finally present one of the leading CMB polarization experiments, POLARBEAR, in which I have been involved over the course of my PhD studies. I describe its current status and performance as well as selected steps of its data analysis pipeline. In particular, I show methods to estimate some of the parameters introduced for the systematics modeling from simulated data. This work has been performed in collaboration with members of the POLARBEAR team.
APA, Harvard, Vancouver, ISO, and other styles
4

Pieroni, Mauro. "Classification of inflationary models and constraints on fundamental physics." Thesis, Sorbonne Paris Cité, 2016. http://www.theses.fr/2016USPCC258/document.

Full text
Abstract:
This work focuses on the study of early-time cosmology and in particular on the study of inflation. After an introduction to the standard Big Bang theory, we discuss the physics of the CMB and explain how its observations can be used to set constraints on cosmological models. 
We introduce inflation and carry out its simplest realization by presenting the observables and the experimental constraints that can be set on inflationary models. The possibility of observing primordial gravitational waves (GW) produced during inflation is discussed. We present the reasons to define a classification of inflationary models and introduce the β-function formalism for inflation, explaining why in this framework we can naturally define a set of universality classes for inflationary models. Theoretical motivations to support the formulation of inflation in terms of this formalism are presented. Some generalized models of inflation are introduced and the extension of the β-function formalism to these models is discussed. Finally, we focus on the study of models where the (pseudo-scalar) inflaton is non-minimally coupled to Abelian gauge fields that can be present during inflation. The analysis is carried out using a characterization of inflationary models in terms of their asymptotic behavior. A wide set of theoretical aspects and observational consequences is discussed.
APA, Harvard, Vancouver, ISO, and other styles
5

Verdet, Florian. "Exploring variabilities through factor analysis in automatic acoustic language recognition." PhD thesis, Université d'Avignon, 2011. http://tel.archives-ouvertes.fr/tel-00954255.

Full text
Abstract:
Language recognition is the problem of discovering the language of a spoken utterance. This thesis achieves this goal by using short-term acoustic information within a GMM-UBM approach. The main problem of many pattern recognition applications is the variability of the observed data. In the context of Language Recognition (LR), this troublesome variability is due to speaker characteristics, speech evolution, and acquisition and transmission channels. In the context of speaker recognition, the variability problem is solved by the Joint Factor Analysis (JFA) technique. Here, we introduce this paradigm to language recognition. The success of JFA relies on several assumptions. The global JFA assumption is that the observed information can be decomposed into a universal global part, a language-dependent part, and a language-independent variability part. The second, more technical assumption is that the unwanted variability part lives in a low-dimensional, globally defined subspace. In this work, we analyze how JFA behaves in the context of a GMM-UBM LR framework. We also introduce and analyze its combination with Support Vector Machines (SVMs). The first JFA publications put all unwanted information (hence the variability) into one and the same component, which is thought to follow a Gaussian distribution. This handles diverse kinds of variability in a unique manner. But in practice, we observe that this hypothesis is not always verified. We have, for example, the case where the data can be divided into two clearly separate subsets, namely data from telephony and from broadcast sources. In this case, our detailed investigations show that there is some benefit in handling the two kinds of data with two separate systems and then electing the output score of the system which corresponds to the source of the testing utterance. 
For selecting the score of one or the other system, we need a channel source detector. We propose here different novel designs for such automatic detectors. In this framework, we show that JFA's variability factors (of the subspace) can be used with success for detecting the source. This opens the interesting perspective of partitioning the data into automatically determined channel source categories, avoiding the need for source-labeled training data, which is not always available. The JFA approach results in up to 72% relative cost reduction compared to the GMM-UBM baseline system. Using source-specific systems followed by a score selector, we achieve 81% relative improvement.
APA, Harvard, Vancouver, ISO, and other styles
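The JFA decomposition the abstract refers to writes a GMM mean supervector as M = m + Vy + Ux, with m the universal (UBM) part, Vy the language-dependent part, and Ux the unwanted variability confined to a low-dimensional subspace U. A deliberately simplified sketch of the compensation step, using a least-squares point estimate of x in place of the full probabilistic JFA machinery:

```python
import numpy as np

def remove_variability(supervector, ubm_mean, U):
    """Estimate the variability factor x in span(U) and subtract U @ x."""
    residual = supervector - ubm_mean
    x, *_ = np.linalg.lstsq(U, residual, rcond=None)
    return supervector - U @ x
```

In real JFA the factors are estimated jointly with Gaussian priors and zeroth/first-order statistics; this sketch only conveys the subspace-removal idea.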
6

Alizadeh, Hassan. "Intrusion detection and traffic classification using application-aware traffic profiles." Doctoral thesis, Universidade de Aveiro, 2018. http://hdl.handle.net/10773/23545.

Full text
Abstract:
Doutoramento em Engenharia Eletrotécnica no âmbito do programa doutoral MAP-tele<br>Along with the ever-growing number of applications and end-users, online network attacks and advanced generations of malware have continuously proliferated. Many studies have addressed the issue of intrusion detection by inspecting aggregated network traffic with no knowledge of the responsible applications/services. Such systems may detect abnormal tra c, but fail to detect intrusions in applications whenever their abnormal traffic ts into the network normality profiles. Moreover, they cannot identify intrusion-infected applications responsible for the abnormal traffic. This work addresses the detection of intrusions in applications when their traffic exhibits anomalies. To do so, we need to: (1) bind traffic to applications; (2) have per-application traffic profiles; and (3) detect deviations from profiles given a set of traffic samples. The first requirement has been addressed in our previous works. Assuming that such binding is available, this thesis' work addresses the last two topics in the detection of abnormal traffic and thereby identify its source (possibly malware-infected) application. Applications' traffic profiles are not a new concept, since researchers in the field of Traffic Identification and Classification (TIC) make use of them as a baseline of their systems to identify and categorize traffic samples by application (types-of-interest). But they do not seem to have received much attention in the scope of intrusion detection systems (IDS). We first provide a survey on TIC strategies, within a taxonomy framework, focusing on how the referred TIC techniques could help us for building application's traffic profiles. 
As a result of this study, we found that most TIC methodologies are based on some statistical (well-known) assumptions extracted from different traffic sources and make the use of machine learning techniques in order to build models (profiles) for recognition of either application types-of-interest or application-layer protocols. Moreover, the literature of traffic classification observed some traffic sources (e.g. first few packets of ows and multiple sub- ows) that do not seem to have received much attention in the scope of IDS research. An IDS can take advantage of such traffic sources in order to provide timely detection of intrusions before they propagate their infected traffic. First, we utilize conventional Gaussian Mixture Models (GMMs) to build per-application profiles. No prior information on data distribution of each application is available. Despite the improvement in performance, stability in high-dimensional data and calibrating a proper threshold for intrusion detection are still main concern. Therefore, we improve the framework restoring universal background model (UBM) to robustly learn application specific models. The proposed anomaly detection systems are based on class-specific and global thresholding mechanisms, where a threshold is set at Equal Error Rate (EER) operating point to determine whether a ow claimed by an application is genuine. Our proposed modelling approaches can also be used in a traffic classification scenario, where the aim is to assign each specific ow to an application (type-of-interest). We also investigate the suitability of the proposed approaches with just a few, initial packets from a traffic ow, in order to provide a more eficient and timely detection system. Several tests are conducted on multiple public datasets collected from real networks. 
In the numerous experiments that are reported, evidence of the effectiveness of the proposed approaches is provided.
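The per-application-model-versus-UBM decision described in this abstract amounts to a log-likelihood-ratio test. The sketch below is a minimal illustration, not the thesis's actual pipeline: it substitutes single diagonal Gaussians for full GMMs, and the flow features, numbers, and zero threshold are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-application flow features (e.g. sizes and inter-arrival
# times of the first few packets); all values here are illustrative.
app_train = rng.normal([500.0, 60.0], [40.0, 5.0], size=(200, 2))     # one application
all_train = rng.normal([300.0, 80.0], [150.0, 30.0], size=(1000, 2))  # pooled traffic

def fit_gaussian(X):
    """Fit a diagonal Gaussian (a stand-in for a full GMM)."""
    return X.mean(axis=0), X.var(axis=0) + 1e-6

def log_likelihood(X, mean, var):
    """Per-sample log-density under a diagonal Gaussian."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (X - mean) ** 2 / var, axis=1)

app_model = fit_gaussian(app_train)  # application-specific model
ubm_model = fit_gaussian(all_train)  # universal background model (UBM)

def llr(X):
    """Log-likelihood ratio of application model vs. UBM."""
    return log_likelihood(X, *app_model) - log_likelihood(X, *ubm_model)

# Flows claimed by the application are accepted when the LLR clears a
# threshold (0.0 here; the thesis calibrates it at the EER operating point).
genuine = rng.normal([500.0, 60.0], [40.0, 5.0], size=(50, 2))
impostor = rng.normal([100.0, 120.0], [30.0, 10.0], size=(50, 2))
print(np.mean(llr(genuine) > 0.0), np.mean(llr(impostor) > 0.0))
```

In the thesis's framework the threshold would instead be set from held-out genuine and impostor scores at the EER operating point, per class or globally.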
APA, Harvard, Vancouver, ISO, and other styles
7

Perbost, Camille. "Matrices de bolomètres supraconducteurs pour la mesure de la polarisation du fond diffus cosmologique : application à l’expérience QUBIC." Thesis, Sorbonne Paris Cité, 2016. http://www.theses.fr/2016USPCC283/document.

Full text
Abstract:
The cosmic microwave background (CMB) is the very first light of the Universe and thus constitutes the oldest picture of its initial state. These photons carry valuable information constraining both the energy content and the history of the Universe.
CMB observations allow us to reconstruct what occurred before the CMB anisotropies were imprinted. The most promising theoretical models all postulate an epoch of exponential expansion known as inflation just after the Big Bang. One of the major challenges of observational cosmology is hence to confirm or falsify inflation, as well as to discover how inflation was realized in a particular model, by searching for its imprint on the CMB polarization B-mode. This signal is however expected to be extremely weak and its detection requires a very sensitive experiment. This thesis reports on contributions to the technology development for the innovative QUBIC instrument, focusing on the perfection of an array of several hundred superconducting bolometric detectors. A method was defined to design the detector array through tuning the relevant parameters to best meet our requirements. Then a 256-detector prototype array was fully manufactured and characterized. The preliminary characterization gave promising results for the forthcoming implementation of the QUBIC focal plane.
APA, Harvard, Vancouver, ISO, and other styles
8

PIETROBON, DAVIDE. "Making the best of cosmological perturbations: theory and data analysis." Doctoral thesis, Università degli Studi di Roma "Tor Vergata", 2010. http://hdl.handle.net/2108/1197.

Full text
Abstract:
Cosmology has entered the precision epoch thanks to several very accurate experiments. Cosmologists now have access to an array of tools to test the cosmological concordance model and constrain its parameters; the Cosmic Microwave Background radiation (CMB), in particular, has been playing a crucial role in this ambition. Many questions remain nonetheless unanswered, especially concerning the physics of the early Universe, the inflationary mechanism which set the initial conditions for the Universe's expansion, and the nature of the late-time acceleration of the Universe's expansion. My research contributes to both of these subjects, the common ground being the development of a statistical tool - needlets, a new "frame" on the sphere - to analyse the CMB. By means of needlets, we measure the Integrated Sachs-Wolfe effect by cross-correlating WMAP and NVSS datasets and characterise dark energy properties using a phenomenological fluid model. Motivated by our findings, we study in detail a parameterisation of the dark components, dark matter and dark energy, which makes use of an affine equation of state, constraining the parameters of the model by combining WMAP and SDSS datasets. We apply needlets to the WMAP 5-year data release, testing the Gaussianity of the CMB perturbations.
Our approach is twofold: we first focus on the maps, detecting anomalous spots located in the southern hemisphere, and check their effect on the angular power spectrum. We next measure the needlet three-point correlation function (bispectrum) and characterise it in terms of its overall amplitude, constraining the primordial fNL parameter, and considering its properties according to the geometry of the triangle configurations which contribute to the total power. We find a significant anomaly in the isosceles configurations, again in the southern hemisphere. Finally, we focus on the construction of an optimal estimator for the (needlet) bispectrum, taking into account foreground residuals due mainly to Galactic emission.
APA, Harvard, Vancouver, ISO, and other styles
9

Wong, Kim-Yung Eddie. "Automatic spoken language identification utilizing acoustic and phonetic speech information." Thesis, Queensland University of Technology, 2004. https://eprints.qut.edu.au/37259/1/Kim-Yung_Wong_Thesis.pdf.

Full text
Abstract:
Automatic spoken Language Identification (LID) is the process of identifying the language spoken within an utterance. The challenge that this task presents is that no prior information is available indicating the content of the utterance or the identity of the speaker. The trend of globalization and the pervasive popularity of the Internet will amplify the need for the capabilities spoken language identification systems provide. A prominent application arises in call centers dealing with speakers speaking different languages. Another important application is to index or search huge speech data archives and corpora that contain multiple languages. The aim of this research is to develop techniques targeted at producing a fast and more accurate automatic spoken LID system compared to the previous National Institute of Standards and Technology (NIST) Language Recognition Evaluation. Acoustic and phonetic speech information are targeted as the most suitable features for representing the characteristics of a language. To model the acoustic speech features a Gaussian Mixture Model based approach is employed. Phonetic speech information is extracted using existing speech recognition technology. Various techniques to improve LID accuracy are also studied. One approach examined is the employment of Vocal Tract Length Normalization to reduce the speech variation caused by different speakers. A linear data fusion technique is adopted to combine the various aspects of information extracted from speech. As a result of this research, a LID system was implemented and presented for evaluation in the 2003 Language Recognition Evaluation conducted by the NIST.
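The GMM-based acoustic scoring this abstract outlines reduces to picking the language whose model gives the highest average frame likelihood. This is a minimal sketch, with single diagonal Gaussians standing in for per-language GMMs and synthetic two-dimensional "frames"; the language names, features, and numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative acoustic feature vectors for two hypothetical languages;
# a real LID system would use MFCC-style features and full GMMs.
train = {
    "lang_a": rng.normal([0.0, 0.0], 1.0, size=(300, 2)),
    "lang_b": rng.normal([3.0, 3.0], 1.0, size=(300, 2)),
}

# One (mean, variance) model per language
models = {lang: (X.mean(0), X.var(0) + 1e-6) for lang, X in train.items()}

def utterance_score(frames, mean, var):
    """Average frame log-likelihood under a diagonal Gaussian model."""
    ll = -0.5 * (np.log(2 * np.pi * var) + (frames - mean) ** 2 / var)
    return ll.sum(axis=1).mean()

def identify(frames):
    """Pick the language whose model best explains the utterance."""
    return max(models, key=lambda lang: utterance_score(frames, *models[lang]))

utterance = rng.normal([3.0, 3.0], 1.0, size=(80, 2))  # frames from lang_b
print(identify(utterance))
```

The linear data fusion mentioned in the abstract would combine such acoustic scores with phonetic-stream scores via weighted sums before the argmax.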
APA, Harvard, Vancouver, ISO, and other styles
10

Lacasa, Fabien. "Non-Gaussianity and extragalactic foregrounds to the Cosmic Microwave Background." Phd thesis, Université Paris Sud - Paris XI, 2013. http://tel.archives-ouvertes.fr/tel-00955975.

Full text
Abstract:
This PhD thesis, written in English, studies the non-Gaussianity (NG) of extragalactic foregrounds to the Cosmic Microwave Background (CMB), the latter being one of the golden observables of today's cosmology. In the last decade, research has emerged on deviations of the CMB from the Gaussian law, as such deviations would discriminate between models for the generation of primordial perturbations. However, CMB measurements, e.g. by the Planck satellite, are contaminated by several foregrounds. I studied in particular the extragalactic foregrounds which trace the large-scale structure of the universe: radio and infrared point-sources and the thermal Sunyaev-Zel'dovich effect (tSZ). I hence describe the statistical tools to characterise a random field: the correlation functions, and their harmonic counterparts: the polyspectra. In particular, the bispectrum is the lowest-order indicator of NG, with the highest potential signal-to-noise ratio (SNR). I describe how it can be estimated on data, accounting for a potential mask (e.g. galactic), and propose a method to visualise the bispectrum which is better adapted than the existing ones. I then describe the covariance of a polyspectrum measurement, a method to generate non-Gaussian simulations, and how the statistics of a 3D field project onto the sphere when integrating along the line of sight. I then describe the generation of density perturbations by the standard inflation model and their possible NG, how they yield the CMB anisotropies and grow to form the large-scale structure of today's universe. To describe this large-scale structure, I present the halo model and propose a diagrammatic method to compute the polyspectra of the galaxy density field and to obtain a simple and powerful representation of the involved terms. I then describe the foregrounds to the CMB, galactic as well as extragalactic.
I briefly describe the physics of the thermal Sunyaev-Zel'dovich effect and how to describe its spatial distribution with the halo model. I then describe the extragalactic point-sources and present a prescription for the NG of clustered sources. For the Cosmic Infrared Background (CIB) I introduce a physical modeling with the halo model and the diagrammatic method. I compute numerically the 3D galaxy bispectrum and produce the first theoretical prediction of the CIB angular bispectrum. I show the contributions of the different terms and the temporal evolution of the galaxy bispectrum. For the CIB angular bispectrum, I show its different terms, its scale and configuration dependence, and how it varies with model parameters. By Fisher analysis, I show that it allows very good constraints on these parameters, complementary to or better than those coming from the power spectrum. Finally, I describe my work on measuring NG. I first introduce an estimator for the amplitude of the CIB bispectrum, and show how to combine it with similar ones for radio sources and the CMB, for a joint constraint on the different sources of NG. I quantify the contamination of extragalactic point-sources to the estimation of primordial NG; for Planck it is negligible at the central CMB frequencies. I then describe my measurement of the CIB bispectrum on Planck data; it is very significantly detected at 217, 353 and 545 GHz, with SNR ranging from 5.8 to 28.7. Its shape is consistent between frequencies, as is the intrinsic amplitude of NG. Ultimately, I describe my measurement of the tSZ bispectrum, on simulations and on Compton-parameter maps estimated by Planck, validating the robustness of the estimation thanks to realistic foreground simulations. The tSZ bispectrum is very significantly detected with SNR~200. Its amplitude and its scale and configuration dependence are consistent with the projected map of detected clusters and with tSZ simulations.
Finally, this measurement allows us to place a constraint on the cosmological parameters: sigma_8*(Omega_b/0.049)^0.35 = 0.74+/-0.04, in agreement with other tSZ statistics.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Universal background models"

1

Saha, Prasenjit, and Paul A. Taylor. The Cosmic Microwave Background. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198816461.003.0009.

Full text
Abstract:
The measurement of the acoustic modes of the cosmic microwave background has been perhaps the most exciting new astrophysical territory explored in the 21st century. This cosmological area includes the study of some of the earliest moments of the Universe, such as the significant change of state from a gas of ionized particles to one of atoms (called recombination), which greatly reduced the opacity for photons and led to the observable phenomenon of the microwave background today. This chapter builds up to a calculation of the sound horizon and hence the location of the first observed peak in the fluctuation spectrum.
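The sound horizon the chapter builds up to is the comoving distance sound waves travel before recombination, r_s = ∫ c_s da / (a² H(a)). A rough numerical sketch, assuming illustrative parameter values (flat matter-plus-radiation universe, sound speed c/√3, i.e. baryon loading neglected) rather than the book's own numbers, lands near the familiar ~150 Mpc scale:

```python
import numpy as np

# Illustrative parameter values; not taken from the chapter.
c = 299792.458        # speed of light, km/s
H0 = 70.0             # Hubble constant, km/s/Mpc
omega_m, omega_r = 0.3, 9.0e-5
a_rec = 1.0 / 1100.0  # scale factor at recombination

def hubble(a):
    """H(a) for a matter + radiation universe, km/s/Mpc."""
    return H0 * np.sqrt(omega_m / a**3 + omega_r / a**4)

# Comoving sound horizon: integral of c_s da / (a^2 H(a)) from 0 to a_rec
a = np.linspace(1e-8, a_rec, 200_000)
f = (c / np.sqrt(3.0)) / (a**2 * hubble(a))
r_s = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(a))  # trapezoid rule
print(f"comoving sound horizon ~ {r_s:.0f} Mpc")
```

Including baryon loading in the sound speed would lower this estimate somewhat; the point of the sketch is only the structure of the integral.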
APA, Harvard, Vancouver, ISO, and other styles
2

Omorogbe, Yinka. Universal Access to Modern Energy Services. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198819837.003.0002.

Full text
Abstract:
This chapter examines the role that law plays in the enablement and empowerment of the world’s energy poor, with a particular focus on Africa, and in doing so, provides the rationale for the research. Against the background of contemporary measures to promote universal access to modern energy services, it critically analyses key concepts such as energy poverty, sustainable development and access to energy. The role of the law as a critical component for achieving this goal and the need for its centrality to be recognized as a necessary ingredient for success is ultimately reinforced. Further, the chapter discusses key concepts such as energy poverty, sustainable development, and access to energy, which underpin most of the contributions, and then highlights the indispensability of modern energy as an essential component of sustainable development. It highlights the need for complementary pro-energy-poor policies and critical success factors of energy planning and finance.
APA, Harvard, Vancouver, ISO, and other styles
3

Deruelle, Nathalie, and Jean-Philippe Uzan. The Lambda-CDM model of the hot Big Bang. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198786399.003.0059.

Full text
Abstract:
This chapter introduces the Lambda-CDM (cold dark matter) model. In 1948, under the impetus of George Gamow, Robert Hermann, Ralph Alpher, and Hans Bethe in particular, relativistic cosmology entered the second phase of its history. In this phase, physical processes, in particular, nuclear and atomic processes, are taken into account. This provides two observational tests of the model: primordial nucleosynthesis, which explains the origin of light nuclei, and the existence of the cosmic microwave background, and it establishes the fact that the universe has a thermal history. Study of the large-scale structure of the universe then indicates the existence of dark matter and a nonzero cosmological constant. This model, known as the Λ‎CDM model, is the standard model of contemporary cosmology.
APA, Harvard, Vancouver, ISO, and other styles
4

Kachelriess, Michael. Inflation. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198802877.003.0024.

Full text
Abstract:
This chapter introduces inflation as a phase of nearly exponential expansion in the early universe. The slow-roll conditions are derived and possible inflationary models are discussed. Reheating connects the end of inflation with the standard hot big-bang model. The spectrum of fluctuations generated by inflation is calculated and it is shown that it is nearly scale-invariant and Gaussian. The fluctuations have fixed phase relations on superhorizon scales that cause characteristic oscillations of the temperature fluctuations of the cosmic microwave background.
APA, Harvard, Vancouver, ISO, and other styles
5

Kragh, Helge, and Malcolm S. Longair, eds. The Oxford Handbook of the History of Modern Cosmology. Oxford University Press, 2019. http://dx.doi.org/10.1093/oxfordhb/9780198817666.001.0001.

Full text
Abstract:
Although some of the observational and conceptual roots of modern cosmology can be traced back to the nineteenth century, it was only in the twentieth century that the study of the universe as a whole emerged as a genuine physical science. The development through the twentieth and now well into the twenty-first century has been far from smooth, but in spite of a number of false trails it has been tremendously fruitful and surprisingly successful scientifically. The volume presents a comprehensive overview of the development of cosmology from about 1860 to the most recent discoveries. It describes and explains the historical background to what we know about the universe today and what people in the past thought they knew about the universe, starting with the first observations of spiral nebulae and ending with the discovery of gravitational waves. The book is organized into thirteen roughly chronologically ordered chapters, some focusing on theory and others more on observations and technological advances. A few of the chapters are of a more general nature, relating to larger contexts such as politics, philosophy and religious world views. The chapters are written by eight different authors, some of whom are astrophysicists or cosmologists while others have backgrounds in the history and philosophy of science. Each chapter can be read separately but also has a symbiotic relation with the other chapters. As a result, the book describes the history of modern cosmology coherently, comprehensively and with ample references to the relevant sources.
APA, Harvard, Vancouver, ISO, and other styles
6

Vigdor, Steven E. Expansion Everlasting. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198814825.003.0005.

Full text
Abstract:
Chapter 5 presents experiments illuminating the cosmological evolution of the universe and its energy budget, accounting for its longevity. The observations establishing Hubble's Law, the linear relationship between intergalactic distances and recession speeds, and their interpretation in terms of the expansion of cosmic space are reviewed. The evidence for big bang cosmology from nucleosynthesis and the cosmic microwave background (CMB) is presented. The measurements that establish the ongoing acceleration of the cosmic expansion are reviewed: distant supernova recession speeds, tiny CMB anisotropies, baryon acoustic oscillations, and gravitational lensing. Excellent model fits to these data, assuming general relativity, cold dark matter, and a cosmological constant, lead to precise determinations of both the age of the universe and the energy budget of the universe. The cosmic history of the expansion rate and the energy budget are inferred, along with the remarkable flatness of cosmic space within the observable portion of the universe.
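The Hubble's Law relationship reviewed in the chapter reduces to fitting a slope through the origin, v = H0 * d. A minimal sketch on synthetic data; the sample size, scatter, and H0 value are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic galaxy sample: distances in Mpc and recession speeds built
# from an assumed true H0 of 70 km/s/Mpc plus peculiar-velocity scatter.
true_H0 = 70.0
d = rng.uniform(10.0, 400.0, size=100)              # distances, Mpc
v = true_H0 * d + rng.normal(0.0, 300.0, size=100)  # speeds, km/s

# Least-squares slope through the origin for v = H0 * d
H0_est = np.sum(v * d) / np.sum(d * d)
print(f"H0 estimate: {H0_est:.1f} km/s/Mpc")
```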
APA, Harvard, Vancouver, ISO, and other styles
7

Iliopoulos, John. A Brief History of Cosmology. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198805175.003.0002.

Full text
Abstract:
We present the evolution of our ideas concerning the history of the Cosmos. They are based on Einstein’s theory of General Relativity in which E.P. Hubble and G. Lemaître brought two fundamental new concepts: the expansion of the Universe and the model of the Big Bang. They form the basic elements of the modern theory of Cosmology. We present very briefly the observational evidence which corroborates this picture based on a vast amount of data, among which the most recent ones come from the Planck mission with a detailed measurement of the cosmic microwave background (CMB) radiation. We show that during its evolution the Universe went through several phase transitions giving rise to the formation of particles, atoms, nuclei, etc. A particular phase transition, which occurred very early in the cosmic history, around 10–12 seconds after the Big Bang, is the Brout–Englert–Higgs (BEH) transition during which a fraction of the energy was transformed into mass, thus making it possible for most elementary particles to become massive.
APA, Harvard, Vancouver, ISO, and other styles
8

Higdon, Dave, Katrin Heitmann, Charles Nakhleh, and Salman Habib. Combining simulations and physical observations to estimate cosmological parameters. Edited by Anthony O'Hagan and Mike West. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780198703174.013.26.

Full text
Abstract:
This article focuses on the use of a Bayesian approach that combines simulations and physical observations to estimate cosmological parameters. It begins with an overview of the Λ-cold dark matter (CDM) model, the simplest cosmological model in agreement with the cosmic microwave background (CMB) and largescale structure analysis. The CDM model is determined by a small number of parameters which control the composition, expansion and fluctuations of the universe. The present study aims to learn about the values of these parameters using measurements from the Sloan Digital Sky Survey (SDSS). Computationally intensive simulation results are combined with measurements from the SDSS to infer about a subset of the parameters that control the CDM model. The article also describes a statistical framework used to determine a posterior distribution for these cosmological parameters and concludes by showing how it can be extended to include data from diverse data sources.
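The Bayesian machinery the article describes, combining simulation output with physical measurements to obtain a posterior over cosmological parameters, can be caricatured in one dimension. The "simulator", noise level, and prior below are toy assumptions standing in for expensive cosmological simulations and SDSS measurements:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "simulator": maps one parameter to a vector of predicted observables.
def simulator(theta):
    return np.array([theta, theta**2, np.sin(theta)])

true_theta = 0.8
sigma = 0.05
data = simulator(true_theta) + rng.normal(0.0, sigma, size=3)  # mock "survey"

# Posterior on a grid: uniform prior on [0, 2] times a Gaussian likelihood
grid = np.linspace(0.0, 2.0, 2001)
log_like = np.array([-0.5 * np.sum((data - simulator(t)) ** 2) / sigma**2
                     for t in grid])
post = np.exp(log_like - log_like.max())
post /= post.sum()

theta_mean = np.sum(grid * post)  # posterior mean of the parameter
print(f"posterior mean: {theta_mean:.2f}")
```

In the article's setting the simulator is far too costly to call on a grid, which is what motivates the emulation framework it describes.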
APA, Harvard, Vancouver, ISO, and other styles
9

Avilez, GerShun. Introduction. University of Illinois Press, 2017. http://dx.doi.org/10.5406/illinois/9780252040122.003.0001.

Full text
Abstract:
This introductory chapter provides a background of Black Nationalism. Black Nationalism is a political philosophy that has played an integral part in African American social thought from the nineteenth century forward. There are two main threads of this philosophical tradition: classical and modern. Classical Black Nationalism is a political framework guided primarily by concerns with the creation of a sovereign Black state and uplifting and “civilizing” the race. With regards to Black Nationalist thought in the twentieth century, two moments loom large: Marcus Garvey's Universal Negro Improvement Association (UNIA) in the 1910s/1920s and the Black Power Movement in the 1960s/1970s. Modern Black Nationalism is characterized by two specific shifts away from the foundational ideas that governed the classical form. It departs from its predecessor in the general lack of an explicit emphasis on an independent Black nation-state. It also shifts attention to mass culture and Black working-class life.
APA, Harvard, Vancouver, ISO, and other styles
10

Kragh, Helge. Physics and Cosmology. Edited by Jed Z. Buchwald and Robert Fox. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199696253.013.30.

Full text
Abstract:
This article considers the role of physics in transforming cosmology into a research field which relies heavily on fundamental physical knowledge. It begins with an overview of astrophysics and the state of physical cosmology prior to the introduction of relativity, followed by a discussion of Albert Einstein’s application of his new theory of gravitation to cosmology. It then examines the development of a theory about the possibility of an expanding universe, citing the work of such scientists as Edwin Hubble, Alexander Friedmann, Georges Lemaître, and George Gamow; the emergence of the field of nuclear archaeology to account for the origins of the early universe; and the controversy sparked by the steady-state theory. It also describes the discovery of a cosmic microwave background of the kind that Alpher and Herman had predicted in 1948 before concluding with a review of modern cosmological hypotheses such as the idea of ‘multiverse’.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Universal background models"

1

Reynolds, Douglas. "Universal Background Models." In Encyclopedia of Biometrics. Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-73003-5_197.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Reynolds, Douglas. "Universal Background Models." In Encyclopedia of Biometrics. Springer US, 2015. http://dx.doi.org/10.1007/978-1-4899-7488-4_197.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Monteiro, João C., and Jaime S. Cardoso. "A Novel Application of Universal Background Models for Periocular Recognition." In Biomedical Engineering Systems and Technologies. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27707-3_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Driessen, P. "A Fractal Model Of The Universe." In Examining the Big Bang and Diffuse Background Radiations. Springer Netherlands, 1996. http://dx.doi.org/10.1007/978-94-009-0145-2_50.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Frieman, J. A. "Inflation, Microwave Background Anisotropy, and Open Universe Models." In Examining the Big Bang and Diffuse Background Radiations. Springer Netherlands, 1996. http://dx.doi.org/10.1007/978-94-009-0145-2_32.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Tran, Huyen, Dat Tran, Wanli Ma, and Phuoc Nguyen. "EEG-Based Person Authentication with Variational Universal Background Model." In Network and System Security. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36938-5_25.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Panek, Mirosław. "Large-Scale Microwave Background Anisotropies in Cosmological Models with Exotic Components." In The Post-Recombination Universe. Springer Netherlands, 1988. http://dx.doi.org/10.1007/978-94-009-3035-3_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Nour-Eddine, Lachachi, and Adla Abdelkader. "Reduced Universal Background Model for Speech Recognition and Identification System." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-31149-9_31.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Sugiyama, Naoshi, Naoteru Gouda, and Misao Sasaki. "Constraints on Universe Models With Cosmological Constant from Cosmic Microwave Background Anisotropy." In Primordial Nucleosynthesis and Evolution of Early Universe. Springer Netherlands, 1991. http://dx.doi.org/10.1007/978-94-011-3410-1_79.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Tran, Huyen, Dat Tran, Wanli Ma, and Phuoc Nguyen. "$$\mathsf {vUBM}$$: A Variational Universal Background Model for EEG-Based Person Authentication." In Communications in Computer and Information Science. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36808-1_52.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Universal background models"

1

Martinez-Diaz, M., J. Fierrez, and J. Ortega-Garcia. "Universal Background Models for Dynamic Signature Verification." In 2007 First IEEE International Conference on Biometrics: Theory, Applications, and Systems. IEEE, 2007. http://dx.doi.org/10.1109/btas.2007.4401942.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

C. Monteiro, João, and Jaime S. Cardoso. "Periocular Recognition under Unconstrained Settings with Universal Background Models." In International Conference on Bio-inspired Systems and Signal Processing. SCITEPRESS - Science and and Technology Publications, 2015. http://dx.doi.org/10.5220/0005195900380048.

3

Snyder, David, Daniel Garcia-Romero, and Daniel Povey. "Time delay deep neural network-based universal background models for speaker recognition." In 2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU). IEEE, 2015. http://dx.doi.org/10.1109/asru.2015.7404779.

4

Qwam Alden, Arz Y., Andrew G. Geeslin, Jeffrey C. King, and Peter A. Gustafson. "A Finite Element Model of a Surgical Knot." In ASME 2017 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/imece2017-72201.

Abstract:
Background: Surgical knots are one of several structures which can fail during surgical repair. However, there is no universal agreement on the superiority (best/safest) of one particular surgical knot technique. Tensile testing of repaired soft tissue has been used to assess the efficacy of surgical knot-tying techniques; however, few computational models exist. The purpose of this study was to create a validated biomechanical model to evaluate the effect of knot configuration on the mechanical performance of surgical sutures. Methods: Two sutures were tested experimentally to find the mechanical properties and strength. Single-throw knots were also tested for strength. Finite element models were constructed of each configuration and correlation was established. Results: The finite element results are quantitatively and qualitatively consistent with experimental findings. The FE model stress concentrations are also consistent with published strength reductions. Model and experimental results are presented using as-manufactured No. 2 FiberWire as well as its core and jacket constituents separately. Clinical Relevance: This paper describes a model which can evaluate the effect of knot topology on the mechanics of surgical suture. In the future, the model may be used to evaluate the mechanical differences between surgical techniques and suture materials. The findings may impact choices for suture and knot types selected for soft tissue repairs.
5

Povey, Daniel, Stephen M. Chu, and Balakrishnan Varadarajan. "Universal background model based speech recognition." In ICASSP 2008 - 2008 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2008. http://dx.doi.org/10.1109/icassp.2008.4518671.

6

Ngo, Melissa, Philippe Rauffet, and Siobhan Banks. "Exploring Multitasking Performance and Fatigue with the MAT-B II: A Narrative Review." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1001564.

Abstract:
Multitasking and switching between tasks is a universal function in many occupations, as juggling tasks simultaneously can increase task productivity, especially under factors such as workload that can lead to decrements and impair human performance. Fatigue can refer to the effects or after-effects of exerting mental and/or physical effort on a task. Fatigue-inducing factors such as high workload and time-on-task can impact task management, optimization, and prioritization, which can lead to performance decrements. Despite the universality of multitasking, from aviation to driving a car while talking simultaneously, it is unclear which underlying cognitive processes are affected by induced fatigue. This brief narrative review explores the dynamics of cognitive processes under induced fatigue across individual operator and task contexts. With an interest in cognitive-behavioral models and the Multi-Attribute Task Battery II (MAT-B II), this review aims to provide a conceptual background of the MAT-B II and its diverse use in modelling multitasking environments. By describing and investigating fatigue with multidisciplinary expertise, countermeasures can be developed and implemented to mitigate the deleterious effects of workload and time-on-task on performance.
7

Wang, Zhifeng, Qingtang Liu, Jia Chen, and Huang Yao. "Recording Source Identification Using Device Universal Background Model." In 2015 International Conference of Educational Innovation through Technology (EITT). IEEE, 2015. http://dx.doi.org/10.1109/eitt.2015.11.

8

Alizadeh, Hassan, Samaneh Khoshrou, and André Zúquete. "Application-Specific Traffic Anomaly Detection Using Universal Background Model." In CODASPY'15: Fifth ACM Conference on Data and Application Security and Privacy. ACM, 2015. http://dx.doi.org/10.1145/2713579.2713586.

9

"An Analytical Investigation of the Characteristics of the Dropout Students in Higher Education." In InSITE 2018: Informing Science + IT Education Conferences: La Verne California. Informing Science Institute, 2018. http://dx.doi.org/10.28945/3979.

Abstract:
Aim/Purpose: [This Proceedings paper was revised and published in the 2018 issue of the journal Issues in Informing Science and Information Technology, Volume 15] Student dropout in higher education institutions is a universal problem. This study identifies the characteristics of dropouts. In addition, it develops a mathematical model to predict students who may drop out. Background: This study compared dropout rates over one and a half years of enrollment among Traditional Undergraduate Students. The sample includes 555 freshmen in a non-profit private university. Methodology: The study uses both descriptive statistics, such as cross tabulation, and a binary regression model to predict student dropout. Contribution: The paper makes two major contributions: first, it raises questions regarding causes of dropout and can thus, hopefully, result in better allocation of resources at higher education institutions; second, it develops a predictive model that may be used to predict the probability of a student dropping out and to take preventive actions. Findings: Two major findings are that some of the resources designed to assist students are misallocated, and that the proposed model predicted with 66.6% accuracy which students will drop out. Recommendations for Practitioners: The study recommends that institutions create initiatives to assist freshman students and conduct annual assessments to measure the success of those initiatives. Recommendation for Researchers: Analytical models can be used to predict dropout with fair accuracy. Impact on Society: The study should result in better allocation of resources in higher education institutions. Future Research: The research will continue developing and testing the model using a wider sample and other institutions.
10

Popov, A., D. Barsukov, A. Ivanchik, and S. Bobashev. "Spectrum of positrons produced due to interaction of gamma-ray background photons with soft background photons." In Modern astronomy: from the Early Universe to exoplanets and black holes. Special Astrophysical Observatory of the Russian Academy of Sciences, 2024. https://doi.org/10.26119/vak2024.015.

Abstract:
The interaction of cosmological gamma-ray background photons with the soft cosmological background, producing electron-positron pairs, is considered. It is shown that the majority of positrons are produced with energies of 10 GeV - 1 TeV. However, the interaction of "X-ray" cosmological background photons may produce positrons with energies of 10-100 keV.

Reports on the topic "Universal background models"

1

Batliwala, Srilatha. Transformative Feminist Leadership: What It Is and Why It Matters. United Nations University International Institute of Global Health, 2022. http://dx.doi.org/10.37941/rr/2022/2.

Abstract:
The words of ancient Chinese philosopher Lao Tsu make the simplest, yet most profound, case for transformation – a change of direction, a fundamental shift in the nature or character of something, recasting the existing order and ways of doing things. This is what the world needs now, as institutions and systems of the past century prove unable to address the challenges of impending planetary disaster, persistent poverty, pandemics, rising fundamentalism and authoritarianism, wars, and everyday violence. Against a background of a worldwide backlash against women’s rights, gender parity in leadership positions – in legislatures, corporations, or civil society – has proved inadequate, as women in these roles often reproduce dominant patriarchal leadership models or propagate ideologies and policies that do not actually advance equality or universal human rights. What is required is truly transformative, visionary leadership, whereby new paradigms, relationships and structures are constructed on the basis of peace, planetary health, and social and economic justice.
2

Irudayaraj, Joseph, Ze'ev Schmilovitch, Amos Mizrach, Giora Kritzman, and Chitrita DebRoy. Rapid detection of food borne pathogens and non-pathogens in fresh produce using FT-IRS and raman spectroscopy. United States Department of Agriculture, 2004. http://dx.doi.org/10.32747/2004.7587221.bard.

Abstract:
Rapid detection of pathogens and hazardous elements in fresh fruits and vegetables after harvest requires the use of advanced sensor technology at each step in the farm-to-consumer or farm-to-processing sequence. Fourier-transform infrared (FTIR) spectroscopy and the complementary Raman spectroscopy, an advanced optical technique based on light scattering, will be investigated for rapid and on-site assessment of produce safety. Paving the way toward the development of this innovative methodology, the specific original objectives were to (1) identify and distinguish different serotypes of Escherichia coli, Listeria monocytogenes, Salmonella typhimurium, and Bacillus cereus by FTIR and Raman spectroscopy, (2) develop spectroscopic fingerprint patterns and detection methodology for fungi such as Aspergillus, Rhizopus, Fusarium, and Penicillium, and (3) validate a universal spectroscopic procedure to detect foodborne pathogens and non-pathogens in food systems. The original objectives were very ambitious; hence, modifications were necessary to fit the funding. Elaborate experiments were conducted for sensitivity; additionally, testing a wide range of pathogens (more than the list proposed) was necessary to demonstrate the robustness of the instruments. Most crucially, algorithms for differentiating a specific organism of interest in mixed cultures were conceptualized and validated, and finally neural network and chemometric models were tested on a variety of applications. Food systems tested were apple juice and buffer systems. Pathogens tested include Enterococcus faecium, Salmonella enteritidis, Salmonella typhimurium, Bacillus cereus, Yersinia enterocolitica, Shigella boydii, Staphylococcus aureus, Serratia marcescens, Pseudomonas vulgaris, Vibrio cholerae, Hafnia alvei, Enterobacter cloacae, Enterobacter aerogenes, E. coli (O103, O55, O121, O30 and O26), Aspergillus niger (NRRL 326) and Fusarium verticillioides (NRRL 13586), Saccharomyces cerevisiae (ATCC 24859), Lactobacillus casei (ATCC 11443), Erwinia carotovora pv. carotovora, and Clavibacter michiganense. The sensitivity of FTIR detection was 10³ CFU/ml, and a clear differentiation was obtained between the different organisms, both at the species and at the strain level, for the tested pathogens. A very crucial step in the direction of analyzing mixed cultures was taken: the vector-based algorithm was able to identify a target pathogen of interest in a mixture of up to three organisms. Efforts will be made to extend this to 10-12 key pathogens. The experience gained was very helpful in laying the foundations for extracting the true fingerprint of a specific pathogen irrespective of the background substrate. This is very crucial, especially when experimenting with solid samples as well as complex food matrices. Spectroscopic techniques, especially FTIR and Raman methods, are being pursued by agencies such as DARPA and the Department of Defense to combat threats to homeland security. Through the BARD US-3296-02 feasibility grant, the foundations for detection, sample handling, and the needed algorithms and models were developed. Successive efforts will be made in transferring the methodology to fruit surfaces and to other complex food matrices, which can be accomplished with creative sampling methods and experimentation. Even a marginal success in this direction will result in a very significant breakthrough, because FTIR and Raman methods, in spite of their limitations, are still among the most rapid and nondestructive methods available. Continued interest and effort in improving the components, as well as refinement of the procedures, are bound to result in a significant breakthrough in sensor technology for food safety and biosecurity.
3

Pritchett, Lant, Kirsty Newman, and Jason Silberstein. Focus to Flourish: Five Actions to Accelerate Progress in Learning. Research on Improving Systems of Education (RISE), 2022. http://dx.doi.org/10.35489/bsg-rise-misc_2022/07.

Abstract:
There is a severe global learning crisis. While nearly all children start school, far too many do not learn even the most foundational skills of reading, writing, and basic mathematics during the years they spend there. The urgent need to address this crisis requires no elaborate reasoning. If one starts with love for a child, a human universal, it is easy to see that in the modern world a child’s dignity, self-worth, and freedom to define their own destiny require an adequate education. An adequate education is what will then enable that child to lead a full adult life as a parent, community member, citizen, and worker in the 21st century. To enable every child to leave school with the foundational skills they need will require fundamental changes to education systems. Since 2015, the Research on Improving Systems of Education (RISE) Programme, with which we are affiliated, has been conducting research exploring how to make these changes through country research teams in seven countries (Ethiopia, India, Indonesia, Nigeria, Pakistan, Tanzania, and Vietnam) and crosscutting teams on the political economy of education reform. Drawing on the cumulative body of research on learning outcomes and systems of education in the developing world, both from the RISE Programme and other sources, we advocate for five key actions to drive system transformation. A message cutting across all five actions is “focus to flourish”. Education systems have been tremendously successful at achieving specific educational goals, such as expanding schooling, because that is what they committed to, that is what they measured, that is what they were aligned for, and that is what they supported. In order to achieve system transformation for learning, systems must focus on learning and then act accordingly. Only after a system prioritises learning from among myriad competing educational goals can it dedicate the tremendous energies necessary to succeed at improving learning.
The research points to these five actions as a means to chart a path out of the learning crisis and toward a future that offers foundational skills to all children. The first section that follows provides background on the depth and nature of the learning crisis. The remainder of the document explains each of the five actions in turn, synthesising the research that informs each action, contrasting that action with the prevailing status quo, and describing what the action would entail in practice.