Dissertations on the topic "Event measurement"

To view other types of publications on this topic, follow the link: Event measurement.

Format your source in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations for your research on the topic "Event measurement".

Next to every source in the list of references there is an "Add to bibliography" button. Use it, and we will automatically generate the bibliographic reference to the chosen work in the citation style of your choice: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, when these are available in the metadata.

Browse dissertations across a wide range of disciplines and compile an accurate bibliography.

1

Schulz, Holger. "Measurement of the Underlying Event using track-based event shapes in Z -> ℓ+ℓ− events with ATLAS". Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät, 2015. http://dx.doi.org/10.18452/17129.

Full text of the source
APA, Harvard, Vancouver, ISO and other styles
Abstract:
This thesis describes a measurement of hadron-collider event shapes in proton-proton collisions at a centre-of-momentum energy of 7 TeV at the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire) near Geneva, Switzerland. The analysed data (integrated luminosity: 1.1 inverse fb) were recorded in 2011 with the ATLAS experiment. Events in which a Z boson was produced in the hard sub-process and subsequently decayed into an electron-positron or muon-antimuon pair were selected for this analysis. The observables are calculated using all reconstructed tracks of charged particles within the acceptance of the ATLAS inner detector, except those of the leptons from the Z decay; this is the first measurement of its kind. The observables were corrected for background processes using data-driven methods, and a novel technique was developed and successfully applied to correct for so-called pile-up (multiple overlapping proton-proton collisions). The data were further unfolded to correct for remaining detector effects. The resulting distributions are especially sensitive to the so-called Underlying Event and can be compared with the predictions of Monte Carlo event generators directly, i.e. without running time-consuming simulations of the ATLAS detector. Finally, an attempt was made to improve the predictions of the event generators Pythia8 and Sherpa by finding an optimised setting of the relevant model parameters, a procedure known as tuning. It became apparent, however, that the underlying Sjostrand-Zijl model is unable to give a good description of the measured event-shape distributions.
2

Desmarais, Bruce A. (advisor: Carsey, Thomas M.). "Discrete measurement, continuous time and event history modeling." Chapel Hill, N.C.: University of North Carolina at Chapel Hill, 2008. http://dc.lib.unc.edu/u?/etd,1900.

Abstract:
Thesis (M.A.)--University of North Carolina at Chapel Hill, 2008.
Title from electronic title page (viewed Dec. 11, 2008). "... in partial fulfillment of the requirements for the degree of Master of Political Science in the Department of Political Science." Discipline: Political Science; Department/School: Political Science.
3

Waugh, Robert George. "Measurement of event shape variables in deep inelastic scattering." Thesis, University of Glasgow, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.301504.

4

Hanlon, Steven James Henry. "Measurement of event shapes in deep inelastic scattering at HERA." Thesis, University of Glasgow, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.400807.

5

South, Andrew. "Design and development of an event related potential measurement system." Thesis, Sheffield Hallam University, 1999. http://shura.shu.ac.uk/20387/.

Abstract:
Event-related potentials have been found to be a useful indicator of brain states and brain abnormality. The contingent negative variation, P300, and Bereitschaftspotential are well-researched event-related potentials of particular interest. Many factors have to be considered in the design of measurement systems that record multiple channels of these signals accurately: the correlation between channels must be high, channel noise and distortion must be minimal, and the system as a whole must meet the requirements of the medical safety standards. For further research there was found to be a requirement for a dedicated thirty-two-channel ERP measurement system that met these criteria. This has been achieved in a PC-based system that uses simultaneous sampling of all channels and filters that extend to very low frequencies. Software control of the system enables user adjustment of recording parameters and paradigm implementation. Data processing using high-level software enables digital signal processing techniques to be applied for further noise removal and signal analysis. The system has been tested using synthetically generated signals and by limited recording of the three ERPs. The results show that the system is a suitable tool for high-accuracy, multi-channel recording of ERPs.
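The core signal-recovery step behind systems like this, extracting a microvolt-scale ERP from much larger background EEG, is coherent averaging over stimulus-locked trials. A minimal sketch in Python (the sampling rate, amplitudes, and synthetic "P300-like" waveform are all invented for illustration, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1 kHz sampling, 300 ms epochs: a synthetic bump near 150 ms
# buried in noise several times larger than the signal itself.
t = np.arange(0.0, 0.3, 0.001)                              # seconds
erp = 5e-6 * np.exp(-((t - 0.15) ** 2) / (2 * 0.02 ** 2))   # ~5 uV peak

n_trials = 400
noise = 20e-6 * rng.standard_normal((n_trials, t.size))     # ~20 uV RMS noise
trials = erp + noise                        # each row = one recorded epoch

# Coherent averaging: the ERP is time-locked to the stimulus, the noise is
# not, so averaging N trials shrinks the noise by roughly a factor sqrt(N).
average = trials.mean(axis=0)

peak_latency = t[np.argmax(average)]
print(f"recovered peak at {peak_latency * 1000:.0f} ms")
```

The sqrt(N) noise reduction only holds when channels are sampled simultaneously and time-locked accurately, which is why the system above emphasises simultaneous sampling and high inter-channel correlation.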
6

Liu, Jian. "Ambulatory Fall Event Detection with Integrative Ambulatory Measurement (IAM) Framework." Diss., Virginia Tech, 2008. http://hdl.handle.net/10919/77184.

Abstract:
Injuries associated with fall accidents pose a significant health problem to society, both in terms of human suffering and economic losses. Existing fall intervention approaches face various limitations. This dissertation presented an effort to advance the indirect type of injury prevention approach. The overall objective was to develop a new fall event detection algorithm and a new integrative ambulatory measurement (IAM) framework that could further improve the algorithm's performance in detecting slip-induced backward falls. This type of fall was chosen because slipping contributes a major portion of fall-related injuries. The new fall detection algorithm was designed to utilize trunk angular kinematics information as measured by inertial measurement units (IMUs). Two empirical studies were conducted to demonstrate the utility of the new detection algorithm and the IAM framework in fall event detection. The first study involved a biomechanical analysis of trunk motion features during common activities of daily living (ADLs) and slip-induced falls using an optical motion analysis system. The second study involved collecting laboratory data of common ADLs and slip-induced falls using ambulatory sensors and evaluating the performance of the new algorithm in fall event detection. Results indicated that backward falls were characterized by the unique, simultaneous occurrence of an extremely high trunk extension angular velocity and a slight trunk extension angle. The quadratic form of the two-dimensional discrimination function showed close-to-ideal overall detection performance (ROC AUC = 0.9952). The sensitivity, specificity, and average response time associated with the specific configuration of the new algorithm were found to be 100%, 95.65%, and 255 ms, respectively. Individual calibration significantly improved the response time by 2.4% (6 ms). It was therefore concluded that slip-induced backward falls are clearly distinguishable from ADLs in the trunk angular phase plot. The new algorithm, utilizing a gyroscope and orientation sensor, was able to detect backward falls prior to impact with a high level of sensitivity and specificity. In addition, the individual calibration provided by the IAM framework further enhanced fall detection performance.
Ph. D.
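The kind of two-dimensional quadratic discrimination the abstract describes can be illustrated with a hand-rolled Gaussian (quadratic) classifier over the two trunk features. Everything below, the class means, covariances, and sample counts, is synthetic for illustration; the thesis derives its boundary from measured trunk kinematics, not these numbers:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the two features named in the abstract:
# trunk extension angle (deg) and trunk extension angular velocity (deg/s).
adl  = rng.multivariate_normal([5.0,  30.0], [[25, 0], [0,  400]], size=500)
fall = rng.multivariate_normal([8.0, 250.0], [[25, 0], [0, 2500]], size=500)

def fit_gaussian(x):
    """Mean and covariance of one class (the basis of a quadratic discriminant)."""
    return x.mean(axis=0), np.cov(x.T)

def quad_score(x, mu, cov):
    """Per-sample Gaussian log-likelihood (a quadratic function of x)."""
    d = x - mu
    inv = np.linalg.inv(cov)
    return -0.5 * np.einsum("ij,jk,ik->i", d, inv, d) - 0.5 * np.log(np.linalg.det(cov))

mu_a, cov_a = fit_gaussian(adl)
mu_f, cov_f = fit_gaussian(fall)

# Quadratic discrimination function: positive -> classify as fall.
samples = np.vstack([adl, fall])
labels = np.array([0] * 500 + [1] * 500)
g = quad_score(samples, mu_f, cov_f) - quad_score(samples, mu_a, cov_a)

sensitivity = np.mean(g[labels == 1] > 0)
specificity = np.mean(g[labels == 0] <= 0)
print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f}")
```

Because the two classes separate mainly along the angular-velocity axis, even this simple discriminant achieves high sensitivity and specificity on its own training data; the thesis additionally evaluates response time before impact, which this sketch does not model.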
7

Otis, Craig H., and Steve M. Lewis. "AN EVENT TIMING SYSTEM USING FIBER OPTIC SENSORS." International Foundation for Telemetering, 1992. http://hdl.handle.net/10150/608886.

Abstract:
International Telemetering Conference Proceedings / October 26-29, 1992 / Town and Country Hotel and Convention Center, San Diego, California
A fiber optic event timing system was developed for the High Speed Test Track at Holloman Air Force Base, Alamogordo, NM. The system uses fiber optic sensors to detect the passage of rocket sleds by different stations along the track. The sensors are connected by fiber optic cables to an electronics package that records the event time to a resolution of 100 nanoseconds. By use of a GPS receiver as the timebase, the event time is stored to an absolute accuracy of 300 nanoseconds. Custom VMEbus boards were developed for the event timing function, and these boards are controlled by a programmable high speed sequencer, which allows for complicated control functions. Each board has 4 electro-optic channels, and multiple boards can be used in a VMEbus card cage controlled by a single board computer. The system has been tested in a series of missions at the Test Track.
8

Schulz, Holger [Verfasser], Heiko [Akademischer Betreuer] Lacker, Arno [Akademischer Betreuer] Straessner, and Klaus [Akademischer Betreuer] Moenig. "Measurement of the Underlying Event using track-based event shapes in Z -> ℓ+ℓ− events with ATLAS / Holger Schulz. Gutachter: Heiko Lacker ; Arno Straessner ; Klaus Moenig". Berlin : Mathematisch-Naturwissenschaftliche Fakultät, 2015. http://d-nb.info/1067297219/34.

9

Bjarnevik, Therese, and Elin Borgström. "Mätning och utvärdering av Eventmarknadsföring" [Measurement and Evaluation of Event Marketing]. Thesis, Högskolan i Borås, Akademin för textil, teknik och ekonomi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:hb:diva-966.

Abstract:
This bachelor thesis in business administration deals with event marketing, focusing on how companies and organisations measure and evaluate it and on the complications that surround this. The study uses a qualitative method and is based on theories and practical examples gathered through interviews at companies and organisations. Measurement and evaluation of event marketing is sparsely researched, and there is a lack of knowledge on how to apply it in practice in companies and organisations. However, one can see that traditional marketing is losing ground while event marketing is rising. Exactly what event marketing means is somewhat diffuse, since everyone seems to have their own definition of the topic, but briefly it is about making use of events of various kinds to communicate a product or a message. Even though the method is widely used, the results are rarely evaluated. This was not a problem a few years back, but companies and organisations now demand it more and more. Because of this, there is a lot of interest in learning more about the measurement and evaluation of event marketing. During our interviews we used an interview guide to ensure that appropriate questions were put to our respondents and that we obtained comparable information across the interviews. The interview guide was created from the theory we found and was used in the five interviews conducted at five different companies. Our respondents operate in various industries, which gave the thesis insight from several sectors. Theory and empirical evidence indicate that the topic we have investigated is extremely complex, and it has therefore been difficult to draw clear conclusions. What we have observed is that there is no unified truth about how measurement and evaluation are done, because it differs depending on the industry you belong to and your goals for the event. Nor are there any standardized evaluation tools, although these are requested by both researchers and our respondents. Moreover, we have noted that goals must be set already during planning in order to be able to measure and evaluate afterwards. The study also shows that event marketing is an appropriate method to use because it easily creates a dialogue between companies and consumers in which attitudes and purchasing behaviour can be changed.
10

Kluge, Thomas. "Measurement and QCD analysis of event shape variables in deep inelastic electron proton collisions at HERA." Hamburg : DESY, 2004. http://deposit.d-nb.de/cgi-bin/dokserv?idn=971990387.

11

Rieck, Patrick. "Measurement of s-channel single top-quark production with the ATLAS detector using total event likelihoods." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät, 2016. http://dx.doi.org/10.18452/17620.

Abstract:
A measurement of s-channel single top-quark production in proton-proton collisions at a centre-of-mass energy of 8 TeV is presented. The data set has been recorded with the ATLAS detector at the LHC and corresponds to an integrated luminosity of 20.3 inverse femtobarn. Collision events are selected so that a subset of the data is obtained where the signal fraction is relatively high. Selected events contain one isolated electron or muon, missing transverse momentum and 2 jets, both of which are induced by b-quarks. All of these objects have large transverse momenta. The resulting set of events is still dominated by background processes, most notably top-quark pair production and the production of W bosons in association with jets. In order to further separate the signal from the backgrounds, several approximate event likelihoods are computed. They are based on different hypotheses regarding the scattering process at hand. Together they result in a function of the measured momenta which allows for the desired separation of the signal process. A statistical model of the corresponding distribution is used in a fit to the measured data. The fit results in a signal significance of 3.4 standard deviations and a total cross section of 5.3^+1.8_-1.6 picobarn. This is the first evidence for s-channel single top-quark production in proton-proton collisions. The results agree with the standard model prediction.
12

Scarboro, Sarah Brashear. "The use of a thyroid uptake system for assaying internal contamination following a radioactive dispersal event." Thesis, Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/22639.

Abstract:
Thesis (M. S.)--Mechanical Engineering, Georgia Institute of Technology, 2008.
Committee Chair: Nolan Hertel; Committee Member: Armin Ansari; Committee Member: Chris Wang; Committee Member: Rebecca Howell.
13

Wynne, Benjamin Michael. "Measurement of the underlying event in pp collisions using the ATLAS detector and development of a software suite for Bayesian unfolding." Thesis, University of Edinburgh, 2013. http://hdl.handle.net/1842/7845.

Abstract:
First measurements are made of the underlying event in calorimeter jet events at the LHC, using 37 pb⁻¹ of pp collisions at √s = 7 TeV recorded during 2010 by the ATLAS detector. Results are compared for an assumed di-jet topology based on a single identified jet and for an exclusive di-jet requirement. The number of charged particles in the azimuthal region transverse to the jet axis is recorded, as well as their total and average transverse momentum. The total energy carried by all particles, charged and neutral, is also calculated using the full calorimeter acceptance |η| < 4.8. Distributions are constructed to show the variation of these quantities with the transverse momentum of the selected jet over the range 20-800 GeV. Additional jets in the transverse region are shown to dramatically influence the measured activity. Software is developed to perform Bayesian iterative unfolding, testing closure of the process and stability with respect to the number of iterations performed. Pseudo-experiments are used to propagate systematic errors, and the intrinsic error due to unfolding is estimated. Although the correction relies on a prior probability distribution, the model-dependence is reduced to an uncertainty comparable to or smaller than the experimental systematic errors. The software is used to correct underlying event measurements for effects introduced by the ATLAS detector. Unfolded results are compared to predictions from different Monte Carlo event generators used in LHC analyses, showing general agreement in the range |η| < 2.5 but discrepancies in the forward region. Comparison with other ATLAS results shows compatible behaviour in events defined by any high-momentum charged particle or by leptonic Z-boson decays.
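Bayesian iterative unfolding of the d'Agostini type, the general technique behind software suites like the one described here, fits in a few lines. The 3-bin response matrix and spectrum below are toy numbers for a noise-free closure test, not the thesis data:

```python
import numpy as np

# Response matrix: R[j, i] = P(reconstructed in bin j | true in bin i).
# A 3-bin toy with modest bin-to-bin migration (illustrative numbers only).
R = np.array([[0.8, 0.1, 0.0],
              [0.2, 0.8, 0.2],
              [0.0, 0.1, 0.8]])

true = np.array([100.0, 300.0, 200.0])
measured = R @ true            # toy "detector-level" spectrum, no noise

def dagostini_unfold(R, measured, iterations=50):
    """Bayesian (d'Agostini) iterative unfolding from a flat starting prior."""
    eff = R.sum(axis=0)                                   # efficiency per true bin
    u = np.full(R.shape[1], measured.sum() / R.shape[1])  # flat prior
    for _ in range(iterations):
        folded = R @ u                     # expected reco spectrum for current prior
        # Bayes' theorem: P(true i | reco j) shares each measured count
        # among the true bins, then efficiency-corrects.
        weights = R * u / folded[:, None]
        u = (weights * measured[:, None]).sum(axis=0) / eff
    return u

unfolded = dagostini_unfold(R, measured)
print(unfolded)
```

In this exact, invertible toy the iterations converge toward the generating truth; with real data the number of iterations trades regularization against statistical fluctuations, which is why the thesis tests closure and stability in the iteration count.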
14

Heidari, Haratmeh Bardia. "New Framework for Real-time Measurement, Monitoring, and Benchmarking of Construction Equipment Emissions." Thesis, Virginia Tech, 2014. http://hdl.handle.net/10919/64345.

Abstract:
The construction industry is one of the largest emitters of greenhouse gases and health-related pollutants. Monitoring and benchmarking emissions will provide practitioners with information to assess environmental impacts and improve the sustainability of construction. This research focuses on real-time measurement of emissions from non-road construction equipment and development of a monitoring-benchmarking tool for comparison of expected vs. actual emissions. First, exhaust emissions were measured using a Portable Emission Measurement System (PEMS) during the operation of 18 pieces of construction equipment at actual job sites. Second-by-second emission rates and emission factors for carbon dioxide, carbon monoxide, nitrogen oxides, and hydrocarbons were calculated for all equipment. Results were compared to those of other commonly used emission estimation models. Significant differences in emission factors associated with different activities were not observed, except for idling and hauling. Moreover, emission rates were up to 200 times lower than the values estimated using EPA and California Air Resources Board (CARB) guidelines. Second, the resulting database of emissions was used in an automated, real-time environmental assessment system. Based on videos of actual construction activities, this system enabled real-time action recognition of construction operations. From the resulting time-series of activities, emissions were estimated for each piece of equipment and differed by only 2% from those estimated by manual action recognition. Third, the actual emissions were compared to estimated ones using discrete event simulation, a computational model of construction activities. Actual emissions were 28% to 144% of those estimated by manual action recognition. Results of this research will aid practitioners in implementing strategies to measure, monitor, benchmark, and possibly reduce air pollutant emissions stemming from construction.
Master of Science
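The monitoring step described above, turning a recognised activity time series into an emission estimate, reduces to summing per-activity emission rates over time. The rates and activity log below are hypothetical placeholders, not the PEMS-derived values from the study:

```python
# Hypothetical second-by-second CO2 emission rates (g/s) for one machine;
# the study derives such factors from PEMS measurements in the field.
rates = {"idling": 1.2, "digging": 6.5, "hauling": 8.0, "dumping": 5.0}

# A toy second-by-second activity log, as produced by action recognition
# on site video (one label per second of operation).
activity_log = (["idling"] * 120 + ["digging"] * 300 +
                ["hauling"] * 180 + ["dumping"] * 60)

total_g = sum(rates[a] for a in activity_log)
by_activity = {a: rates[a] * activity_log.count(a) for a in rates}
print(f"total CO2: {total_g / 1000:.2f} kg", by_activity)
```

Comparing such an activity-driven total against a discrete-event-simulation estimate of the same operation is, in outline, how the actual-vs-expected benchmarking in the study works.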
15

Wagner, Adrian [Verfasser], and S. [Akademischer Betreuer] Fuchs. "Event-Based Measurement and Mean Annual Flux Assessment of Suspended Sediment in Meso Scale Catchments / Adrian Wagner ; Betreuer: S. Fuchs." Karlsruhe : KIT-Bibliothek, 2020. http://d-nb.info/1203212003/34.

16

Baliki, Ghassan. "Empirical Advances in the Measurement and Analysis of Violent Conflict." Doctoral thesis, Humboldt-Universität zu Berlin, 2017. http://dx.doi.org/10.18452/18363.

Abstract:
Violent conflict is one of the most persistent challenges affecting the economic livelihoods and food security of individuals worldwide. Despite the surge in literature studying the impacts and drivers of armed conflict, notable knowledge and methodological gaps remain, particularly regarding the quality of conflict event data. Using various advanced econometric and statistical techniques, this monograph contributes empirically to this literature by studying three interrelated issues: (i) the impact of violence exposure on radicalization; (ii) the magnitude of selection and veracity biases in media-based conflict event data; and (iii) the significance of incorporating violence in nearby locations when predicting armed conflict onset and escalation. First, evidence from the 2009 war on Gaza shows that individuals who experienced violence directly are less likely, on average, to support radical groups. However, when controlling for past electoral preferences, the results reveal a polarization effect among voters exposed directly to violence. Second, by matching conflict event data from several international and national media sources on the Syrian war, media reports are found to capture less than 10% of the estimated total number of events in the study period. Moreover, reported events across the sources exhibit systematic spatial clustering and actor-specific biases. Third, using a grid-level panel dataset, the temporal and spatial dynamics of violence, among other geographic factors, are found to significantly drive both conflict onset and escalation. However, violence in neighbouring grids does not enhance the prediction of armed conflict when high-precision units of analysis are used. In addition to these main findings, I propose and discuss a novel methodology, namely crowdseeding, for collecting conflict event data, which works directly with primary sources on the ground to provide reliable information for researchers and policy-makers alike.
17

Hu, Liang. "Dynamic state estimation for power grids with unconventional measurements." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/12692.

Abstract:
The state estimation problem for power systems has long been a fundamental issue that demands a variety of methodologies depending on the system settings. With the recent introduction of advanced phasor measurement units (PMUs) and dedicated communication networks, the infrastructure of power grids has been greatly improved. Coupled with these infrastructure improvements are three emerging issues for the state estimation problem: the coexistence of traditional and PMU measurements; incomplete information resulting from delayed, missing and quantized measurements due to communication constraints; and cyber-attacks on the communication channels. Three challenging problems arise when dealing with these issues in the state estimation program of power grids: 1) how to include PMU measurements in the state estimator design; 2) how to account for the phenomena of incomplete information occurring in the measurements and design effective state estimators resilient to such phenomena; and 3) how to identify system vulnerabilities in the state estimation scheme and protect the estimation system against cyber-attacks. In this thesis, aiming to solve the above problems, we develop several state estimation algorithms which tackle the issues of mixed measurements and incomplete information, and we examine the cyber-security of the dynamic state estimation scheme.
• To improve the estimation performance of power grids including PMU measurements, a hybrid extended Kalman filter and particle swarm optimization algorithm is developed, which is both scalable to the number of installed PMUs and compatible with existing dynamic state estimation software.
• Two kinds of network-induced phenomena, which lead to incomplete information in the measurements, are considered. Specifically, missing measurements are assumed to occur randomly with a probability governed by a random variable, and a quantized nonlinear measurement model of power systems is presented in which the quantization is assumed to be of logarithmic type. The impact of the incomplete information on the overall estimation performance is then taken into account when designing the estimator: a modified extended Kalman filter is developed which is insensitive to missing measurements up to an acceptable probability, and a recursive filter is designed for the system with quantized measurements such that an upper bound on the estimation error is guaranteed and minimized by appropriately designing the filter gain.
• To reduce or eliminate the occurrence of the above-mentioned network-induced phenomena, we propose an event-based state estimation scheme with which communication from the meters to the control centre can be greatly reduced. To ensure estimation performance, we design the estimator gains by solving constrained optimization problems such that the estimation error covariances are guaranteed to remain below a finite upper bound.
• We examine the cyber-security of the dynamic state estimation system in power grids where an adversary is able to inject false data into the communication channels between PMUs and the control centre. The condition under which the attacks cause unbounded estimation errors is found. Furthermore, for systems that are vulnerable to cyber-attacks, we propose a protection scheme through which only a few (rather than all) communication channels require protection against false data injection attacks.
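The missing-measurement idea described in the abstract can be sketched with a minimal linear Kalman filter in which measurement arrival is Bernoulli-distributed. This is an illustrative toy (the matrices, noise levels and arrival probability are invented for the example), not the thesis's modified extended Kalman filter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system x_{k+1} = A x_k + w_k, z_k observed only when gamma_k = 1,
# where gamma_k ~ Bernoulli(p) models randomly missing measurements.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)      # process noise covariance
R = np.array([[0.1]])     # measurement noise covariance
p_arrival = 0.8           # probability a measurement arrives

def kf_step(x, P, z, received):
    # Predict
    x = A @ x
    P = A @ P @ A.T + Q
    # Update only when the measurement actually arrived
    if received:
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
    return x, P

x_true = np.array([0.0, 1.0])
x_est, P = np.zeros(2), np.eye(2)
errors = []
for _ in range(200):
    x_true = A @ x_true + rng.multivariate_normal(np.zeros(2), Q)
    received = rng.random() < p_arrival
    z = H @ x_true + rng.multivariate_normal(np.zeros(1), R)
    x_est, P = kf_step(x_est, P, z, received)
    errors.append(np.linalg.norm(x_true - x_est))

print(f"mean estimation error: {np.mean(errors[50:]):.3f}")
```

Skipping the update step when a sample is lost keeps the covariance honest: the filter's uncertainty grows during outages and contracts again when measurements resume.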
18

Dosé, Tiffany, and Alexander Åström. "The Taco Theory : - A repeated measurement study of the effects of experiential event marketing on brand relationship quality in the FMCG industry." Thesis, Uppsala universitet, Företagsekonomiska institutionen, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-324789.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
Consumer marketing scholars keenly emphasize a proposed paradigm shift toward interactive relationships and lived brand experiences, yet little has been done to investigate the link between the two, until now. This study attempts to measure the effects of lived brand experiences on consumers' perceived relationship with a brand by testing an academically established brand relationship quality model on the concept of experiential event marketing. Susan Fournier's (2000) brand relationship quality scale was chosen as the construct to be tested in the experiential event marketing context. Through theoretical argumentation, it was hypothesized that the experiential event intervention would produce positive direct effects within the scale, but that these would decline with time. This was tested through a repeated measurement study, set at an experiential food truck event hosted by the Swedish FMCG brand Santa Maria. Respondents ranked their perceived brand relationship quality with the brand on three occasions: directly before, directly after, and two weeks after being exposed to the experiential event. In this way, not only the immediate effect but also the effect over time could be measured. All but one construct produced positive direct effects, but only half of these were significant. In all cases but one, the effect declined significantly when measured two weeks afterwards, and in several cases returned to approximately the same level as in the initial measurement. These findings have important implications for both academics and practitioners. Most notably, we argue that the link between lived brand experiences in the form of typical FMCG experiential events and strengthened longer-term brand relationship quality can be invalidated.
19

Kampmann, Philipp René [Verfasser], Livia [Akademischer Betreuer] Ludhová, and Christopher Henrik V. [Akademischer Betreuer] Wiebusch. "Energy scale non-linearity and event reconstruction for the neutrino mass ordering measurement of the JUNO experiment / Philipp René Kampmann ; Livia Ludhová, Christopher Wiebusch." Aachen : Universitätsbibliothek der RWTH Aachen, 2020. http://d-nb.info/1227447124/34.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
20

Rieck, Patrick [Verfasser], Thomas [Gutachter] Lohse, Heiko [Gutachter] Lacker, and Ulrich [Gutachter] Husemann. "Measurement of s-channel single top-quark production with the ATLAS detector using total event likelihoods / Patrick Rieck ; Gutachter: Thomas Lohse, Heiko Lacker, Ulrich Husemann." Berlin : Mathematisch-Naturwissenschaftliche Fakultät, 2016. http://d-nb.info/1117081303/34.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
21

Yamac, Pinar Isil. "Improvement Proposal For A Software Requirements Management Process." Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/12607268/index.pdf.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
This thesis focuses on measurement-based software process improvement, especially improvement of the requirements change management process. The literature on software measurement is investigated, software process improvement methodologies are studied, and requirements change management metrics are examined. The requirements change management process at a private company working in the defense industry is observed, and metrics obtained from various tools are aggregated. Moreover, an improvement proposal, which also simplifies collecting metrics, is presented for the requirements change management process. A tool is developed for evaluating the performance of the improvement proposal using the event-driven simulation method.
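Event-driven simulation of the kind mentioned above can be sketched with a minimal priority-queue model. The "requirements-change review" setting here is hypothetical and only illustrates the mechanism; it is not the tool developed in the thesis:

```python
import heapq

# Minimal discrete-event simulation: change requests arrive at given
# times and a single reviewer processes each one in review_time units.
def simulate(arrival_times, review_time=2.0):
    events = [(t, "arrive") for t in arrival_times]
    heapq.heapify(events)               # event list ordered by time
    busy_until, completed, waits = 0.0, 0, []
    while events:
        clock, kind = heapq.heappop(events)
        if kind == "arrive":
            start = max(clock, busy_until)   # queue if the reviewer is busy
            waits.append(start - clock)
            busy_until = start + review_time
            heapq.heappush(events, (busy_until, "done"))
        else:
            completed += 1
    return completed, sum(waits) / len(waits)

done, avg_wait = simulate([0.0, 1.0, 2.5, 6.0])
print(done, avg_wait)  # 4 requests completed, average wait 0.625
```

The heap advances the simulation clock from event to event instead of in fixed steps, which is what makes this style of model cheap to run over long horizons.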
22

Junqueira, Cinthia Amorim de Oliveira. "Investigação da estabilidade inter e intra-examinador na identificação do P300 auditivo: análise de erros." Universidade de São Paulo, 2001. http://www.teses.usp.br/teses/disponiveis/59/59134/tde-30082002-112247/.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
The auditory P300 is an evoked potential that reflects the neurophysiological activity of the auditory cognitive abilities of attention, memory, discrimination and decision making. The possibility of correlating aspects of auditory behavior with observable physiological phenomena has attracted the interest of professionals from various fields who study auditory dysfunctions. Because the P300 is a recent procedure, the methods for its analysis and interpretation have not yet been standardized; they must therefore be explored and discussed to allow safer clinical and scientific application. This study investigated the stability of the analysis and interpretation of the auditory P300 according to a pre-determined set of rules (criterion). Four audiology professionals analyzed, on two different occasions, 70 P300 records from healthy children and adolescents between 8 and 18 years of age, following the same rules to identify the waves (N1, P2, N2 and P3) and to mark their latencies. The P300 latency measurements were submitted to qualitative and quantitative analyses. The qualitative analysis investigated the types of errors made by the examiners in applying the P300 determination criterion (5.9% of the 560 measurements obtained). The most frequent errors were failing to identify the P300 as the largest wave immediately following the N1-P2-N2 complex and identifying a "false" P300 wave. The quantitative analysis investigated the variability of the P300 latency measurement attributable to the examiner. The results showed no significant differences between the inter- and intra-examiner analyses; significant correlations were found between the latency measurements, indicating good test-retest reliability and high agreement among the examiners in the way they analyzed the wave records. The criterion used in this study proved useful for determining the P300 and can be safely recommended for clinical and scientific use.
23

Kook, Kyung Soo Soo. "Dynamic Model Based Novel Findings in Power Systems Analysis and Frequency Measurement Verification." Diss., Virginia Tech, 2007. http://hdl.handle.net/10919/27761.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
This study selects several new advanced topics in power systems and verifies their usefulness using simulation. In the study of the ratio of the equivalent reactance and resistance of bulk power systems, the simulation results give a more accurate value of X/R for the bulk power system, which explains why active power compensation is also important in voltage flicker mitigation. In the application study of the Energy Storage System (ESS) for wind power, a new model implementation of an ESS connected to wind power is proposed, and the control effect of the ESS on the intermittency of the wind power is verified. This study also conducts intensive simulations to clarify the behavior of the wide-area power system frequency as well as the possibility of on-line instability detection. In our Power IT Laboratory, the U.S. national frequency monitoring network (FNET) has been operated continuously since 2003 to monitor the wide-area power system frequency in the U.S. Using the measured frequency data, power system events are triggered, and their location and scale are estimated. This study also explores the possibility of using simulation technologies to contribute to the applications of FNET, finds similarity in the event detection orders between the frequency measurements and the simulations in the U.S. Eastern power grid, and develops a new methodology for estimating event location based on simulated N-1 contingencies using the frequency measurements. It has been pointed out that simulation results cannot represent the actual response of power systems, due to the inevitable limits of modeling power systems and the different operating conditions of the systems at every second.
However, given the need to test such an important infrastructure, which supplies electric energy, without putting it at risk, software-based simulation is the best way to verify new technologies in power system engineering, and for this purpose new models and better applications of simulation should be proposed. Through extensive simulation studies, this dissertation verified that the actual X/R ratio of bulk power systems is much lower than its commonly assumed typical value, showed the effectiveness of ESS control in mitigating the intermittency of wind power from the perspective of the power grid using the newly proposed simulation model of an ESS connected to wind power, and found many characteristics of wide-area frequency wave propagation. The possibility of using simulated responses of the power system to replace measured data was also confirmed, which is very promising for future applications of simulation to on-line analysis of power systems based on FNET measurements.
Ph. D.
24

Adam, Lennart [Verfasser]. "Precision Matters: Measurement of the W boson mass and width with the ATLAS detector at a centre-of-mass energy of 7 TeV and the activity of the Underlying Event in Z boson events at a centre-of-mass energy of 13 TeV : Precision Matters: Measurement of the W boson mass and width with the ATLAS detector at √s = 7 TeV and the activity of the Underlying Event in Z boson events at √s = 13 TeV / Lennart Adam." Mainz : Universitätsbibliothek der Johannes Gutenberg-Universität Mainz, 2021. http://d-nb.info/122807190X/34.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
25

Brickwedde, Bernard [Verfasser]. "Measurement of the differential Drell-Yan production cross-section and application of deep convolutional neural networks on event images in the context of pileup mitigation / Bernard Brickwedde." Mainz : Universitätsbibliothek der Johannes Gutenberg-Universität Mainz, 2020. http://d-nb.info/1224895738/34.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
26

Kluge, Thomas [Verfasser]. "Measurement and QCD analysis of event shape variables in deep inelastic electron proton collisions at HERA / Deutsches Elektronen-Synchrotron in der Helmholtz-Gemeinschaft, DESY. Vorgelegt von Thomas Kluge." Hamburg : DESY, 2004. http://d-nb.info/971990387/34.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
27

Stuart, Graeme. "Monitoring energy performance in local authority buildings." Thesis, De Montfort University, 2011. http://hdl.handle.net/2086/4964.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
Energy management has been an important function of organisations since the oil crisis of the mid-1970s led to hugely increased energy costs. Although the financial costs of energy are still important, the growing recognition of the environmental costs of fossil-fuel energy is becoming more important. Legislation is also a key driver: the UK has set an ambitious greenhouse gas (GHG) reduction target of 80% of 1990 levels by 2050, in response to a strong international commitment to reduce GHG emissions globally. This work is concerned with the management of energy consumption in buildings through the analysis of energy consumption data. Buildings are a key source of emissions, with a wide range of energy-consuming equipment, such as photocopiers, refrigerators, boilers, air-conditioning plant and lighting, delivering services to the building occupants. Energy wastage can be identified through an understanding of consumption patterns and, in particular, of changes in these patterns over time. Changes in consumption patterns may have any number of causes: a fault in heating controls; a boiler or lighting replacement scheme; or a change in working practice entirely unrelated to energy management. Standard data analysis techniques such as degree-day modelling and CUSUM provide a means to measure and monitor consumption patterns. These techniques were designed for use with monthly billing data, whereas modern energy metering systems automatically generate data at half-hourly or better resolution; standard techniques are not designed to capture the detailed information contained in this comparatively high-resolution data. The introduction of automated metering also introduces the need for automated analysis. This work assumes that consumption patterns are generally consistent in the short term but will inevitably change. A novel statistical method is developed which builds automated event detection into a novel consumption modelling algorithm.
Understanding these changes to consumption patterns is critical to energy management. Leicester City Council has provided half-hourly data from over 300 buildings covering up to seven years of consumption (a total of nearly 50 million meter readings). Automatic event detection pinpoints and quantifies over 5,000 statistically significant events in the Leicester dataset. It is shown that the total impact of these events is a decrease in overall consumption. Viewing consumption patterns in this way allows for a new, event-oriented approach to energy management where large datasets are automatically and rapidly analysed to produce summary meta-data describing their salient features. These event-oriented meta-data can be used to navigate the raw data event by event and are highly complementary to strategic energy management.
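A one-sided CUSUM of the kind the abstract builds on can be sketched in a few lines. The baseline, slack and threshold below are illustrative values applied to synthetic half-hourly readings; this is not the thesis's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic half-hourly consumption: steady around 100 kWh, with a
# step decrease of 15 kWh at reading 300 (e.g. a plant replacement).
baseline = 100.0
readings = rng.normal(baseline, 3.0, 500)
readings[300:] -= 15.0

k = 2.5    # slack: ignore drifts smaller than this per reading
h = 40.0   # decision threshold for flagging an event
s_lo, event_at = 0.0, None
for i, r in enumerate(readings):
    # One-sided (decrease-detecting) CUSUM of deviations from baseline
    s_lo = min(0.0, s_lo + (r - baseline) + k)
    if s_lo < -h:
        event_at = i
        break

print(f"decrease detected at reading {event_at}")
```

Because the statistic accumulates small deviations, a sustained change is flagged within a few readings of its onset, which is what makes the event-oriented meta-data described above possible at scale.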
28

Perkowski, Matthew Paul. "An analysis of the gypsy moth event monitor modified forest vegetation simulator and the stand damage model using empirical long-term measurement plot data from the Appalachian hardwood and the Atlantic Coastal Plain mixed pine-hardwood regions." Morgantown, W. Va. : [West Virginia University Libraries], 2008. https://eidr.wvu.edu/etd/documentdata.eTD?documentid=5856.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
Thesis (M.S.)--West Virginia University, 2008.
Title from document title page. Document formatted into pages; contains ix, 103 p. : ill. (some col.), col. maps. Includes abstract. Includes bibliographical references (p. 77-79).
29

Smith, Austin. "Agreement Level of Running Temporal Measurements, Kinetics, and Force-Time Curves Calculated from Inertial Measurement Units." Digital Commons @ East Tennessee State University, 2021. https://dc.etsu.edu/etd/3861.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
Inertial measurement units (IMUs) and wearable sensors have enabled athlete monitoring and research to become more ecologically valid due to their small size and low cost. IMUs and accelerometers that are placed on the body close to the point of impact and that record at sufficiently high frequencies have demonstrated the highest validity when measuring temporal gait events such as ground contact time (GCT) and flight time (FT), as well as peak forces (PF), during upright running. While the use of IMUs has increased in sport performance and athlete monitoring, the technology's potential for estimating running force-time curves using the two-mass model (TMM) remains unexplored. The purpose of this study was two-fold: first, to determine the validity of measuring temporal gait events and peak forces with a commercially available shank-mounted inertial measurement unit; second, to determine the validity of force-time curves generated from the TMM using data from shank-mounted inertial measurement units. Ten subjects voluntarily completed submaximal treadmill tests equipped with a force plate while wearing shank-mounted IMUs on each leg. Using the raw data from the IMUs, GCT, FT, total step time (ST), PF, and two-mass model-based force-time (F-t) curves were generated for 25 steps at 8 different speeds. Paired-sample t-tests were performed on the gait events and peak force between the IMU and treadmill, using both individual step comparisons and averages per speed. 95% confidence intervals were calculated for each timepoint of the force-time curves. No statistically significant differences (p > 0.05) and nearly perfect relationships were observed for the step averages for each speed with FT, ST, and PF. Confidence intervals of the corrected mean difference suggest that F-t curves calculated from the TMM may not be valid when assessing the running population as a whole.
When performing a sub-group analysis of skilled and recreational runners, F-t curves derived from shank-mounted IMUs appear more valid in skilled runners than in recreational runners. In skilled runners, the 95% CI for the mean difference contained zero within the first 60% of the GCT duration, whereas for recreational runners the 95% CI contained a zero value over a smaller percentage of the GCT, located only in the middle of the GCT at the curve peak. The results of this study suggest that interchangeability between shank-mounted IMUs and force plates may be very limited when estimating temporal gait events and kinetics. While agreement was low between F-t curves after the peak in skilled runners, shank-mounted IMUs may still offer several benefits in skilled runners for assessing peak forces and force development from initial contact until peak force.
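The agreement analysis described above (a mean difference with a 95% confidence interval) is in the spirit of a Bland-Altman comparison. The sketch below uses synthetic peak-force numbers, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic paired peak forces (N): a force plate and an IMU-based
# estimate with a small bias and extra noise (values are invented).
force_plate = rng.normal(2000.0, 150.0, 50)
imu = force_plate + rng.normal(10.0, 40.0, 50)

diff = imu - force_plate
bias = diff.mean()                          # mean difference (bias)
sd = diff.std(ddof=1)
se = sd / np.sqrt(diff.size)
ci = (bias - 1.96 * se, bias + 1.96 * se)   # 95% CI of the bias
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

print(f"bias {bias:.1f} N, 95% CI ({ci[0]:.1f}, {ci[1]:.1f}), "
      f"LoA ({loa[0]:.1f}, {loa[1]:.1f})")
```

If the CI of the bias contains zero, the two methods show no detectable systematic difference at that sample size; the wider limits of agreement indicate how far an individual step's estimate may stray from the reference.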
30

Bruce, Julie. "Measurement and monitoring of surgical adverse events." Thesis, University of Aberdeen, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.408939.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
The aim of this thesis is to investigate the validity of the measurement and monitoring of surgical adverse events. Objectives: 1. To select surgical adverse events for detailed evaluation; 2. To investigate events with different epidemiological characteristics and attributes to assess the validity of their measurement and monitoring. Three events were selected: surgical wound infection, anastomotic leak and chronic post-surgical pain. Methods: Four systematic reviews and two epidemiological studies were conducted to investigate surgical wound infection, anastomotic leak and chronic post-surgical pain. A total of 41 definitions of wound infection were identified in the surgical literature, with little evidence of formal theoretical assessment of validity and reliability. Modified versions of the Centers for Disease Control (CDC) definitions are currently used by UK nosocomial surveillance systems, although the impact of these modifications has not been evaluated. A total of 56 definitions of anastomotic leak were found. Although a national surgical consensus group proposed a definition of anastomotic leak, no evidence of its use was found in the surgical literature. The cohort study of 435 patients undergoing gastrointestinal surgery found that patients with anastomotic leak had poorer long-term survival at four years postoperatively, although patient numbers were small. The systematic review of chronic pain after cardiac surgery identified six prevalence studies worldwide, none of which used the 'standard' definitions proposed by the International Association for the Study of Pain (IASP). A total of 1080 patients undergoing cardiac surgery at one regional cardiothoracic centre were assessed at two years postoperatively; the cumulative prevalence of chronic pain was 39% using a definition based on timing, location and pain characteristics.
31

Danardono. "Multiple Time Scales and Longitudinal Measurements in Event History Analysis." Doctoral thesis, Umeå : Dept. of Statistics, Umeå Univ, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-420.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
32

Gedion, Michael. "Contamination des composants électroniques par des éléments radioactifs." Thesis, Montpellier 2, 2012. http://www.theses.fr/2012MON20267/document.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
This work studies the radioactive elements that can affect the proper functioning of electronic components at ground level. These radioactive elements are called alpha emitters. Intrinsic to electronic components, they decay and emit alpha particles that ionize the material of the electronic device and trigger SEUs (Single Event Upsets). This thesis aims to assess the reliability of digital circuits subject to this radiative constraint internal to electronic components. To this end, all alpha-emitting natural or artificial isotopes that can contaminate the materials of digital circuits were identified and classified into two categories: natural impurities and introduced radionuclides. Natural impurities result from natural or accidental contamination of the materials used. To assess their effects on reliability, the SER (Soft Error Rate) was determined by Monte Carlo simulations for different technology nodes in the case of secular equilibrium, and a new analytical approach was developed to determine the consequences of secular disequilibrium on the reliability of digital circuits. Moreover, with the miniaturization of digital circuits, new chemical elements have been suggested or used in nanoelectronics. The introduced radionuclides comprise this type of element, which naturally contains alpha emitters. Studies based on Monte Carlo simulations and analytical approaches were conducted to evaluate the reliability of electronic devices. Subsequently, recommendations were proposed on the use of new chemical elements in nanotechnology.
33

Colling, David John. "Measurement of heavy flavour semileptonic branching ratios at ALEPH." Thesis, Imperial College London, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.243991.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
34

Cunha, de Sousa Ines Pereira Silva. "Analysis of Repeated Measurements and Time-to-Event Outcomes in Longitudinal Studies." Thesis, Lancaster University, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.504199.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
35

Hofman, Jiří. "Testovací metody pro hodnocení radiačních efektů v přesných analogových a signálově smíšených obvodech pro aplikace v kosmické elektronice." Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2019. http://www.nusl.cz/ntk/nusl-401588.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
The traditional radiation testing of space electronics has been used for more than fifty years to support the radiation hardness assurance. Its typical goal is to ensure reliable operation of the spacecraft in the harsh environment of space. This PhD research looks into the radiation testing from a different perspective; the goal is to develop radiation testing methods that are focused not only on the reliability of the components but also on a continuous radiation-induced degradation of their performance. Such data are crucial for the understanding of the impact of radiation on the measurement uncertainty of data acquisition systems onboard research space missions.
36

Hopkins, Mark Franklin 1963. "Infrared imaging spectrometer for measurement of temperature in high-speed events." Diss., The University of Arizona, 1998. http://hdl.handle.net/10150/282751.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
Анотація:
Munition development has always been driven by the necessity of delivering enough explosive to a targeted object to destroy it. Targets protected by steel-reinforced concrete housings have become increasingly difficult to destroy. Improvements must be made in munitions engineering design to either deliver more payload to the target or make the weapon more potent. In most cases, due to aircraft weight limitations, delivering more payload is not an option. Therefore, improving the destructive power of a weapon of a given payload requires the use of more powerful explosives. However, when the potency of an explosive is increased, its sensitivity to premature detonation also increases. The characteristics of the metal casing containing the explosive contribute significantly to the weapon's detonation sensitivity. Casings experience significant heating during weapon penetration, and this heating can cause the weapon to detonate before it reaches its target location. In the past, computer codes used to model detonating weapons have not taken heating into account in their performance predictions; consequently, the theoretical models and the actual field tests are not in agreement. New models that include temperature information are currently being developed, based on work done in the area of computational fluid dynamics. In this research, a remotely located, high-speed, infrared (IR) camera is used to obtain detailed measurements of the passive radiation from an object in an energetic environment. This radiation information is used to determine both the emissivity and the temperature of the surface of the object. Before the temperature or emissivity could be determined, however, the functional form of the emissivity was calculated to be an Mth-degree polynomial in wavelength.
With the advent of large, high-speed, IR detector arrays, it has now become possible to realize IR imaging spectrometers that have very high spatial resolution. The IR spectrometer system developed in this research utilized a large detector array to allow multiple spectral images to be formed simultaneously on the image plane. In conjunction with the correct emissivity model, this imaging IR spectrometer can determine temperature to within ±5 degrees Celsius. These experimentally verified temperature maps were then integrated into the newly developed computer models. This additional information will result in more accurate computer codes for modeling the energetic environment. In turn, this will allow the weapon designer to accurately optimize weapon performance with respect to different materials, geometries and kinetics.
37

PRASAD, RAVI B. "HEAT TRANSFER STUDIES OF A PYROTECHNIC EVENT AND ITS EFFECT ON FUEL POOL IGNITION." University of Cincinnati / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1109262274.

38

Zuo, Jian. "The Frequency Monitor Network (FNET) Design and Situation Awareness Algorithm Development." Diss., Virginia Tech, 2008. http://hdl.handle.net/10919/26721.

Abstract:
Wide Area Measurements (WAMs) have been widely used in the energy management systems (EMS) of power systems for monitoring, operation and control. In recent years, the advent of the synchronized Phasor Measurement Unit (PMU) has added another dimension to the field of wide-area measurement. However, the high cost of the PMU, which includes manufacture and deployment, is a hurdle to its wide use in power systems. Unlike traditional PMUs, the frequency monitoring network (FNET) developed by the Virginia Tech Power IT Lab is an Internet-based, GPS-synchronized, wide-area frequency monitoring network deployed at the distribution level, providing a low-cost and easily deployable WAMs solution. The research work in this dissertation falls into two parts: FNET design and situation awareness algorithm development.
Ph. D.
39

Zora, Leydi Tatiana. "Thesis PMU Applications Prioritization Based in Wide Area Disturbance Events." Thesis, Virginia Tech, 2015. http://hdl.handle.net/10919/71829.

Abstract:
Synchrophasor measurement units (PMUs) are devices that not only measure but also time-stamp voltage, current, and frequency, among other quantities. PMUs take these synchronized measurements as often as 60 times per second; compared with the traditional 2-4 second SCADA measurements, PMUs give a much clearer, real-time picture of what is happening in the power system. PMUs have been increasingly deployed across transmission power grids worldwide; in the USA this is primarily done by utilities through projects sponsored mainly by SIGS and Smart Grid grants. Synchrophasors support a range of applications, both off-line and real-time. However, due to budget constraints, technology development and characteristics specific to each system, not all applications are equally suitable and essential for every electric power system. This thesis provides a method for prioritizing PMU applications based on the analysis and results of wide-area disturbance events.
Master of Science
40

Bindi, Marcello <1981&gt. "Measurement of the charm production cross section in DIS events at HERA." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/853/.

41

Lo, Ling Hsiang. "Evaluation of narrowband frequency domain measurements of electrostatic discharge, ESD, events." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape11/PQDD_0014/MQ52468.pdf.

42

Hay, Timothy Deane. "MAX-DOAS measurements of bromine explosion events in McMurdo Sound, Antarctica." Thesis, University of Canterbury. Physics and Astronomy, 2010. http://hdl.handle.net/10092/5394.

Abstract:
Reactive halogen species (RHS) are responsible for ozone depletion and oxidation of gaseous elemental mercury and dimethyl sulphide in the polar boundary layer, but the sources and mechanisms controlling their catalytic reaction cycles are still not completely understood. To further investigate these processes, ground-based Multi-Axis Differential Optical Absorption Spectroscopy (MAX-DOAS) observations of boundary layer BrO and IO were made from a portable instrument platform in McMurdo Sound during the Antarctic spring of 2006 and 2007. Measurements of surface ozone, temperature, pressure, humidity, and wind speed and direction were also made, along with fourteen tethersonde soundings and the collection of snow samples for mercury analysis. A spherical multiple scattering Monte Carlo radiative transfer model (RTM) was developed for the simulation of box-air-mass-factors (box-AMFs), which are used to determine the weighting functions and forward model differential slant column densities (DSCDs) required for optimal estimation. The RTM employed the backward adjoint simulation technique for the fast calculation of box-AMFs for specific solar zenith angles (SZA) and MAX-DOAS measurement geometries. Rayleigh and Henyey-Greenstein scattering, ground topography and reflection, refraction, and molecular absorption by multiple species were included. Radiance and box-AMF simulations for MAX-DOAS measurements were compared with nine other RTMs and showed good agreement. A maximum a posteriori (MAP) optimal estimation algorithm was developed to retrieve trace gas concentration profiles from the DSCDs derived from the DOAS analysis of the measured absorption spectra. The retrieval algorithm was validated by performing an inversion of artificial DSCDs, simulated from known NO2 profiles. Profiles with a maximum concentration near the ground were generally well reproduced, but the retrieval of elevated layers was less accurate.
Retrieved partial vertical column densities (VCDs) were similar to the known values, and investigation of the averaging kernels indicated that these were the most reliable retrieval product. NO₂ profiles were also retrieved from measurements made at an NO₂ measurement and profiling intercomparison campaign in Cabauw, Netherlands in July 2009. Boundary layer BrO was observed on several days throughout both measurement periods in McMurdo Sound, with a maximum retrieved surface mixing ratio of 14.4±0.3 ppt. The median partial VCDs up to 3 km were 9.7±0.07 × 10¹² molec cm⁻² in 2007, with a maximum of 2.3±0.07 × 10¹³ molec cm⁻², and 7.4±0.06 × 10¹² molec cm⁻² in 2006, with a maximum of 1.05±0.07 × 10¹³ molec cm⁻². The median mixing ratio of 7.5±0.5 ppt for 2007 was significantly higher than the median of 5.2±0.5 ppt observed in 2006, which may be related to the more extensive first-year sea ice in 2007. These values are consistent with, though lower than, estimated boundary layer BrO concentrations at other polar coastal sites. Four out of five observed partial ozone depletion events (ODEs) occurred during strong winds and blowing snow, while BrO was present in the boundary layer in both stormy and calm conditions, consistent with the activation of RHS in these two weather extremes. Air mass back trajectories, modelled using the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model, indicated that the events were locally produced rather than transported from other sea ice zones. Boundary layer IO mixing ratios of 0.5-2.5±0.2 ppt were observed on several days. These values are low compared to measurements at Halley and Neumayer Stations, as well as mid-latitudes. Significantly higher total mercury concentrations observed in 2007 may be related to the higher boundary layer BrO concentrations, but further measurements are required to verify this.
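The linear maximum a posteriori step at the heart of such a retrieval — combining DSCDs, weighting functions, and prior covariances, and producing the averaging kernels the author inspects — can be sketched in a standard optimal-estimation form. This is illustrative only; the matrices below are toy stand-ins, not the thesis's forward model.

```python
import numpy as np

def map_retrieval(K, y, x_a, S_a, S_e):
    """Linear MAP (optimal estimation) retrieval.
    K   : weighting functions (box-AMFs), mapping a profile x to DSCDs y = K @ x
    x_a : a priori profile, with covariance S_a
    S_e : measurement-error covariance of the DSCDs
    Returns the retrieved profile and the averaging-kernel matrix A = G @ K."""
    S_a_inv = np.linalg.inv(S_a)
    S_e_inv = np.linalg.inv(S_e)
    G = np.linalg.solve(K.T @ S_e_inv @ K + S_a_inv, K.T @ S_e_inv)  # gain matrix
    x_hat = x_a + G @ (y - K @ x_a)
    return x_hat, G @ K

# Toy check: 8 viewing geometries, 5 vertical layers, noise-free columns.
rng = np.random.default_rng(1)
K = rng.uniform(0.5, 2.0, size=(8, 5))
x_true = np.array([4.0, 3.0, 2.0, 1.0, 0.5])   # e.g. ppt-level mixing ratios
x_a = np.ones(5)
x_hat, A = map_retrieval(K, K @ x_true, x_a, np.eye(5), 1e-6 * np.eye(8))
```

With informative measurements the averaging kernel approaches the identity; rows of A far from unit rows indicate layers (such as elevated ones) where the retrieval leans on the prior, matching the validation behaviour described above.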
43

Foti, Maria Giovanna. "Measurement of the top quark mass in multijet events with the CMS experiment at LHC." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/7714/.

Abstract:
The top quark mass is measured here using data collected with the CMS experiment in proton-proton collisions at the LHC, at a centre-of-mass energy of 8 TeV. The dataset corresponds to an integrated luminosity of 18.2 fb⁻¹. The measurement uses events with six or more jets, at least two of which are b-tagged (identified as originating from the hadronization of bottom quarks). The result is a top quark mass of (173.95 ± 0.43 (stat)) GeV/c², in agreement with the world average.
44

Zorita, Paz Mendez-Bonito. "Family functioning, life events, and depression: Accounting for contamination of family functioning measures by depression variables, and error of measurement in life events measures." Case Western Reserve University School of Graduate Studies / OhioLINK, 1991. http://rave.ohiolink.edu/etdc/view?acc_num=case1059416374.

45

MacEwen, Clare. "Can data fusion techniques predict adverse physiological events during haemodialysis?" Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:1ef92d5d-920d-4ff4-b368-5e892527e675.

Abstract:
Intra-dialytic haemodynamic instability is a common and disabling problem which may lead to morbidity and mortality through repeated organ ischaemia, but it has proven difficult to link any particular blood pressure threshold with hard patient outcomes. The relationship between blood pressure and downstream organ ischaemia during haemodialysis has not been well characterised. Previous attempts to predict and prevent intra-dialytic hypotension have had mixed results, partly due to patient and event heterogeneity. Using the brain as the indicator organ, we aimed to model the dynamic relationship between blood pressure, real-time symptoms, and downstream organ ischaemia during haemodialysis, in order to identify the most physiologically grounded, prognostic definition of intra-dialytic decompensation. Following on from this, we aimed to predict the onset of intra-dialytic decompensation using personalised, probabilistic models of multivariate, continuous physiological data, ultimately working towards an early warning system for intra-dialytic adverse events. This was a prospective study of 60 prevalent haemodialysis patients who underwent extensive, continuous physiological monitoring of haemodynamic, cardiorespiratory, tissue oxygenation and dialysis machine parameters for 3-4 weeks. In addition, longitudinal cognitive function testing was performed at baseline and at 12 months. Despite their use in clinical practice, we found that blood pressure thresholds alone have a poor trade-off between sensitivity and specificity for predicting downstream tissue ischaemia during haemodialysis. However, the performance of blood pressure thresholds could be improved by stratifying for the presence or absence of cerebral autoregulation, and personalising thresholds according to the individual lower limit of autoregulation. For patients without autoregulation, the optimal blood pressure target was a mean arterial pressure (MAP) of 70 mmHg.
A key finding was that cumulative intra-dialytic exposure to cerebral ischaemia, but not to hypotension per se, corresponded to change in executive cognitive function over 12 months. Therefore we chose cerebral ischaemia as the definition of intra-dialytic decompensation for predictive modelling. We were able to demonstrate that the development of cerebral desaturation could be anticipated from earlier deviations of univariate physiological data from the expected trajectory for a given patient, but sensitivity was limited by the heterogeneity of events even within one individual. The most useful physiological data streams included peripheral saturation variance, cerebral saturation variance, heart rate and mean arterial pressure. Multivariate data fusion techniques using these variables created promising personalised models capable of giving an early warning of decompensation. Future work will involve the refinement and prospective testing of these models. In addition, we envisage a prospective study assessing the benefit of autoregulation-guided blood pressure targets on short term outcomes such as patient symptoms and wellbeing, as well as longer term outcomes such as cognitive function.
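To give a flavour of the "personalised, probabilistic models of multivariate, continuous physiological data" idea, a per-patient baseline with a Mahalanobis novelty score looks like the sketch below. This is a deliberately simplified stand-in, not the models developed in the thesis; variable names and thresholds are assumptions.

```python
import numpy as np

def fit_baseline(X):
    """Fit a per-patient baseline from stable monitoring data.
    Rows of X are time points; columns are fused streams (e.g. heart rate,
    MAP, cerebral and peripheral saturation variance)."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    return mu, cov_inv

def novelty_score(x, mu, cov_inv):
    """Squared Mahalanobis distance from baseline: large values flag a
    deviation from the expected trajectory, i.e. a candidate early warning."""
    d = np.asarray(x) - mu
    return float(d @ cov_inv @ d)

# Toy check: a simulated stable baseline, then a jointly deviated observation.
rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 4))          # 2000 time points, 4 fused streams
mu, P = fit_baseline(X)
stable = novelty_score(X[0], mu, P)     # typical in-baseline score
warning = novelty_score(mu + 5.0, mu, P)  # large joint deviation
```

Scoring the joint deviation rather than each stream separately is what lets a fusion approach catch correlated drifts that no single univariate threshold would flag.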
46

Xu, Xueyan. "Prediction of life-threatening events in infants using heart rate variability measurements." Morgantown, W. Va. : [West Virginia University Libraries], 2002. http://etd.wvu.edu/templates/showETD.cfm?recnum=2288.

Abstract:
Thesis (Ph. D.)--West Virginia University, 2002.
Title from document title page. Document formatted into pages; contains viii, 250 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 240-250).
47

Wilt, Brian A. "Charged multiplicity measurement for simulated pp events in the Compact Muon Solenoid (CMS) detector." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/40922.

Abstract:
Thesis (S.B.)--Massachusetts Institute of Technology, Dept. of Physics, 2007.
Includes bibliographical references (p. 61-64).
In this thesis, I studied the effectiveness of a method for measuring the charged multiplicity of proton-proton collisions in the Compact Muon Solenoid (CMS) experiment at LHC energies ... This technique involves counting reconstructed hits in the innermost layer of the pixel tracker. By using the relationship between pseudorapidity and deposited charge of the hits, we can distinguish between signal and background. We calculate a transformation function as the division of the average Monte Carlo track distribution by the average reconstructed hit distribution. By applying this transformation to the reconstructed hit distributions on an event-by-event basis, we can collect information about minimum bias events. This method gives us access to low-pT particles which cannot be reconstructed in charged multiplicity methods using tracklets. A description of the method is given, followed by preliminary results: reconstructed Nch distributions for ... distribution.
by Brian A. Wilt.
S.B.
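The correction described in Wilt's abstract — divide the average Monte Carlo track distribution by the average reconstructed hit distribution, then apply that ratio event by event — can be sketched like this. It is a toy with an invented binning and hit response, not the CMS analysis code.

```python
import numpy as np

def build_transfer(mc_tracks, reco_hits):
    """Bin-by-bin transfer function: average MC track distribution divided by
    the average reconstructed pixel-hit distribution (e.g. in pseudorapidity).
    Bins with no hits map to zero to avoid division by zero."""
    mc_avg = mc_tracks.mean(axis=0)
    hit_avg = reco_hits.mean(axis=0)
    return np.divide(mc_avg, hit_avg, out=np.zeros_like(mc_avg),
                     where=hit_avg > 0)

def corrected_multiplicity(event_hits, transfer):
    """Apply the transfer function to one event's hit distribution and sum
    over bins to estimate the event's charged multiplicity."""
    return float(np.sum(event_hits * transfer))

# Toy check: each charged particle leaves on average 1.5 first-layer hits,
# over 500 simulated events and 10 pseudorapidity bins.
rng = np.random.default_rng(3)
true_tracks = rng.poisson(20.0, size=(500, 10)).astype(float)
hits = 1.5 * true_tracks
T = build_transfer(true_tracks, hits)
n_est = corrected_multiplicity(hits[0], T)
```

Because the transfer function is built from averages but applied per event, it recovers multiplicity even for events whose tracks are too soft to reconstruct individually, which is the point of the hit-counting method.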
48

Wang, Joshua Kevin. "Identification, Analysis, and Control of Power System Events Using Wide-Area Frequency Measurements." Diss., Virginia Tech, 2009. http://hdl.handle.net/10919/26250.

Abstract:
The power system has long been operated in a shroud of introspection. Only recently have dynamic, wide-area, time-synchronized grid measurements brought to light the complex relationships between large machines thousands of miles apart. These measurements are invaluable to understanding the health of the system in real time, for disturbances to the balance between generation and load are manifest in the propagation of electromechanical waves throughout the grid. The global perspective of wide-area measurements provides a platform from which the destructive effects of these disturbances can be avoided. Virginia Tech's distributed network of low-voltage frequency monitors, FNET, is able to track these waves as they travel throughout the North American interconnected grids. In contrast to other wide-area measurement systems, the ability to easily measure frequency throughout the grid provides a way to identify, locate, and analyze disturbances with high dynamic accuracy. The unique statistical properties of wide-area measurements require robust tools in order to accurately understand the nature of these events. Expert systems and data conditioning can then be used to quantify the magnitude and location of these disturbances without requiring any knowledge of the system state or topology. Adaptive application of these robust methods forms the basis for real-time situational awareness and control. While automated control of the power system rarely utilizes wide-area measurements, global insight into grid behavior can only improve disturbance rejection.
Ph. D.
49

Seo, Seon-Hee. "All-Sky Measurements of the Mesospheric "Frontal Events" From Bear Lake Observatory, Utah." DigitalCommons@USU, 1998. https://digitalcommons.usu.edu/etd/3838.

Abstract:
Studies of internal gravity waves in the earth's upper atmosphere are of considerable interest. These waves play a very important role in the dynamics of the mesosphere and lower thermosphere (MLT) region, where they can transfer large amounts of energy and momentum from the lower atmosphere via wave saturation and dissipation. In particular, small-scale, short-period waves associated with "frontal events" can exhibit high apparent phase speeds (>50 m s⁻¹). Another unusual characteristic of "frontal events" is an apparent reversal in contrast of the wave structures as imaged in the hydroxyl (OH) emission (peak altitude ~87 km) when compared with the oxygen (OI) "green line" (557.7 nm) emission (peak altitude ~96 km) that can sometimes occur. In one isolated case, observed from Haleakala, Hawaii, the bright wave crests in the OH emission appeared to propagate through a dark, structureless sky, whereas in the OI emission the same waves appeared to propagate into a bright sky, leaving an apparently depleted emission in their wake. Recent theoretical studies based on novel measurements have shown that frontal events may be due to a "bore-like" intrusion that raises the OI (557.7 nm) layer by a few km and at the same time depresses the OH layer by a similar amount. However, studies of fronts and bores in the MLT region are exceptionally rare. I have discovered and analyzed 16 frontal events from image data recorded at Bear Lake Observatory, Utah (41.6°N, 111.6°W), over the past four years. I have investigated some of their properties, such as their horizontal wavelengths, horizontal phase speeds, observed periods, and directions of motion. In addition, I have made comparative measurements of their relative intensities in the OH and OI emissions. These studies provide the first "extensive" data set on such events, detailing their morphology and dynamics, and should provide important information necessary for a deeper understanding of their occurrence frequency and properties.
50

Dorris, Simon James. "A measurement of the colour factors of quantum chromodynamics from four-jet events at LEP." Thesis, University of Glasgow, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.360179.

