To see the other types of publications on this topic, follow the link: Source of objects.

Dissertations / Theses on the topic 'Source of objects'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Source of objects.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Litke, Katrina C., You-Hua Chu, Abigail Holmes, Robert Santucci, Terrence Blindauer, Robert A. Gruendl, Chuan-Jui Li, Kuo-Chuan Pan, Paul M. Ricker, and Daniel R. Weisz. "Nature of the Diffuse Source and Its Central Point-like Source in SNR 0509-67.5." IOP Publishing Ltd, 2017. http://hdl.handle.net/10150/624374.

Full text
Abstract:
We examine a diffuse emission region near the center of SNR 0509-67.5 to determine its nature. Within this diffuse region we observe a point-like source that is bright in the near-IR but is not visible in the B and V bands. We consider an emission line observed at 6766 Å and the possibilities that it is Lyα, Hα, or [O II] λ3727. We examine the spectral energy distribution (SED) of the source, comprising Hubble Space Telescope B, V, I, J, and H bands in addition to Spitzer/IRAC 3.6, 4.5, 5.8, and 8 μm bands. The peak of the SED is consistent with a background galaxy at z ≈ 0.8 ± 0.2, and a possible Balmer jump places the galaxy at z ≈ 0.9 ± 0.3. These SED considerations support the emission line's identification as [O II] λ3727. We conclude that the diffuse source in SNR 0509-67.5 is a background galaxy at z ≈ 0.82. Furthermore, we identify the point-like source superposed near the center of the galaxy as its central bulge. Finally, we find no evidence for a surviving companion star, indicating a double-degenerate origin for SNR 0509-67.5.
APA, Harvard, Vancouver, ISO, and other styles
2

Huerta, Escudero Eliu Antonio. "Source modelling of extreme and intermediate mass ratio inspirals." Thesis, University of Cambridge, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.609770.

3

Dilley, Jerome Alexander Martin. "A single-photon source for quantum networking." Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:380a4aaf-e809-4fff-84c7-5b6a0856a6cf.

Abstract:
Cavity quantum electrodynamics (cavity QED) with single atoms and single photons provides a promising route toward scalable quantum information processing (QIP) and computing. A strongly coupled atom-cavity system should act as a universal quantum interface, allowing the generation and storage of quantum information. This thesis describes the realisation of an atom-cavity system used for the production and manipulation of single photons. These photons are shown to exhibit strong sub-Poissonian statistics and indistinguishability, both prerequisites for their use in realistic quantum systems. Further, the ability to control the temporal shape and internal phase of the photons, as they are generated in the cavity, is demonstrated. This high degree of control presents a novel mechanism enabling the creation of arbitrary photonic quantum bits.
4

Vardoulaki, Eleni. "Understanding the nature of the faint radio-source population." Thesis, University of Oxford, 2009. http://ora.ox.ac.uk/objects/uuid:b5750339-d1cd-4d2c-8125-a1bc645b8de8.

Abstract:
This DPhil dissertation presents two new and independent samples of faint radio sources. The first sample is the 37 SXDS radio sources with flux densities at 1.4 GHz above 2 mJy, a spectroscopic completeness of 65% and a median redshift z_med ≈ 1.1. The second sample is the 47 TOOT00 radio sources with flux densities at 151 MHz above 100 mJy, a spectroscopic completeness of 85% and z_med ∼ 1.25. Optical, near- and mid-IR photometry, optical spectroscopy, and radio observations are used in the analysis and comparison of the two samples. The quasar fraction in the TOOT00 radio sources is 0.13 < f_q < 0.25 above the FRI/FRII break in radio luminosity, while use of 24 μm data reveals objects with significant but sometimes obscured accretion and gives a quasar-mode fraction of 0.5 → 0.9 above the FRI/FRII break. The FRI/FRII divide seen at z < ∼ 0.5 is also observed at z ∼ 1 for FRII objects in the TOOT00 and SXDS samples, although examples of FRI radio sources above the FRI/FRII break do exist. The total number of TOOT00 objects and their distribution are consistent with simulations based on extrapolations from previous work, while for the SXDS objects the results are only broadly similar. Based on that comparison, the redshift spikes seen at z ∼ 1.3 in TOOT00 and at z ∼ 0.65 & 2.7 in SXDS appear to be significant, and might be due to large-scale structure. A V/Vmax test suggests that the cosmic evolution of the TOOT00 and SXDS samples is different. The TOOT00 radio sources are about twice as luminous in host-galaxy starlight as the SXDS radio sources. The near-proportionality between radio luminosity at 1.4 GHz and 24 μm luminosity suggests that L_rad traces accretion luminosity, and L[OII] ∝ L_rad^0.7 may reflect imperfections in the L[OII]-accretion luminosity scaling. 
Mid-IR 24 μm observations in the SXDS sample suggest that 30% of the light from the nucleus is absorbed by the torus and re-emitted in the mid-IR, while ∼ 1% of the light is scattered above and below the torus.
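For context, the V/Vmax statistic mentioned in this abstract is the standard Schmidt (1968) test for cosmic evolution: each source's enclosed volume V is compared with the maximum volume Vmax within which it would still lie above the survey flux limit. A sketch of the Euclidean, non-evolving case (the exact implementation in the thesis may differ):

```latex
\frac{V}{V_{\max}} = \left(\frac{S}{S_{\lim}}\right)^{-3/2},
\qquad
\left\langle \frac{V}{V_{\max}} \right\rangle = \frac{1}{2}
\quad \text{for a uniform, non-evolving population.}
```

A sample mean significantly above or below 1/2 indicates positive or negative cosmic evolution, which is the sense in which the test can distinguish the TOOT00 and SXDS samples.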
5

Oldham, James Martin. "Combination of a cold ion and cold molecular source." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:ef33adcb-609a-4329-b4d8-aca8a1c48661.

Abstract:
This thesis describes the combination of two sources of cold atomic or molecular species which can be used to study a wide range of ion-molecule reactions. The challenges in forming these species and in determining the fate of reactive events are explored throughout. Reactions occur in a volume within a radio-frequency ion trap, in which ions have previously been cooled to sub-Kelvin temperatures. Ions are laser-cooled, with their motion slowed sufficiently to form a quasi-crystalline spheroidal structure, termed a Coulomb crystal. Fluorescence emitted as a consequence of laser cooling is detected; the resulting fluorescence profiles are used to determine the number of ions in the crystal and, in combination with complementary simulations, their temperature. Motion imparted by the trapping fields can be substantial, and simulations are required to determine collision energies accurately. A beam of decelerated molecules is aimed at this stationary ion target. An ammonia-seeded molecular beam enters a Stark decelerator, based on the original design of Meijer and co-workers. The decelerator uses time-varying electric fields to remove kinetic energy from the molecules, which exit at speeds down to 35 m/s. A fast-opening shutter and focussing elements are subsequently used to maximise the decelerated flux in the reaction volume while minimising the transmission of undecelerated molecules. Substantial fluxes of decelerated ammonia are obtained with narrow velocity distributions, providing a suitable source of reactant molecules. The combination of these two techniques permits studies of reactions between atomic ions and decelerated molecules that can be entirely state-specific. Changes in the Coulomb crystal fluorescence profile denote changes in the ion identities; the rate of these changes can be used to obtain rate constants, even though neither reactant nor product ions are directly observed. 
This work has studied reactions between sympathetically cooled Xe+ ions and guided ND3, obtaining data consistent with prior studies. The determination of reactive events is complicated if ion identities can change without affecting the fluorescence profile, or if multiple reaction channels are possible. A range of spectroscopic techniques is discussed and considered with regard to determining rate constants and product identities. Pulsed axial excitation of the trapped ions can follow rapid changes in the average ion mass, and subtle changes for small crystals. Time-of-flight mass spectrometry is also demonstrated using the trapping electrodes and is suitable for discriminating between ions formed within the trap.
6

Twardzik, Cedric. "Study of the earthquake source process and seismic hazards." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:c2553a3f-f6ce-46a0-9c47-d68f5957cdac.

Abstract:
To obtain the rupture history of the Parkfield, California, earthquake, we perform 12 kinematic inversions using elliptical sub-faults. The preferred model has a seismic moment of 1.21 x 10^18 Nm, distributed on two distinct ellipses. The average rupture speed is ~2.7 km/s. The good spatial agreement with previous large earthquakes and aftershocks in the region suggests the presence of permanent asperities that break during large earthquakes. We investigate our inversion method with several tests and demonstrate its capability to retrieve the rupture process. We show that the convergence of the inversion is controlled by the space-time location of the rupture front. Additional inversions show that our procedure is not strongly influenced by high-frequency signal, while we observe high sensitivity to the duration of the waveforms. After considering kinematic inversion, we present a full dynamic inversion for the Parkfield earthquake using elliptical sub-faults. The best-fitting model has a seismic moment of 1.18 x 10^18 Nm, distributed on one ellipse. The rupture speed is ~2.8 km/s. Inside the parameter space, the models are distributed according to rupture speed and final seismic moment, defining an optimal region where models fit the data correctly. Furthermore, to make the preferred kinematic model dynamically correct while still fitting the data, we show it is necessary to connect the two ellipses; this is done by adopting a new approach that uses b-spline curves. Finally, we relocate earthquakes in the vicinity of the Darfield, New Zealand, earthquake during the 40 years prior to the event, finding a possible migration of earthquakes towards its epicentral region. Once the 2010-2011 earthquake sequence is triggered, we observe earthquakes migrating into regions of stress increase. We also observe a stress increase on a large seismic gap of the Alpine Fault, as well as on some portions of the Canterbury Plains that remain seismically quiet today.
7

Fleury, Rob. "Evaluation of Thermal Radiation Models for Fire Spread Between Objects." Thesis, University of Canterbury. Civil and Natural Resources Engineering, 2010. http://hdl.handle.net/10092/4959.

Abstract:
Fire spread between objects within a compartment is primarily due to the impingement of thermal radiation from the fire source. In order to estimate if or when an object remote from the fire will ignite, one must be able to quantify the radiative heat flux received by the target. There are a variety of methods in the literature that attempt to calculate the thermal radiation to a target, each one based on assumptions about the fire. The performance of six of these methods, of varying complexity, is investigated in this research: the common point source model, three different cylindrical models, a basic correlation and a planar model. In order to determine the performance of each method, the predictions made by the models were compared with actual measurements of radiant heat flux. This involved taking heat flux readings at numerous locations surrounding a propane gas burner; different fire scenarios were represented by varying the burner geometry and heat release rate. Video recordings of the experiments were used to determine the mean flame heights using video image analysis software. After comparing the measured data with predictions made by the theoretical radiation methods, the point source model was found to be the best-performing method on average. This was unexpected given the relative simplicity of the model in comparison to some of its counterparts. Additionally, the point source model proved to be the most robust of the six methods investigated, being least affected by the experimental variables. The Dayan and Tien method, one of the cylindrical models, was the second most accurate over the range of conditions tested in this work. Based on these findings, recommendations are made as to the most appropriate method for use in a radiation sub-model within existing zone-model software. The accuracy shown by the point source model, coupled with its ease of implementation, means that it should be suitable for such a use.
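For reference, the point source model compared in this study is commonly written as q'' = χr·Q̇/(4πR²): the radiative portion of the heat release rate is assumed to be emitted uniformly from a single point. A minimal sketch follows; the function name and the default radiative fraction of 0.3 are illustrative assumptions, not values from the thesis.

```python
import math

def point_source_flux(q_total_kw: float, distance_m: float,
                      radiative_fraction: float = 0.3) -> float:
    """Incident radiant heat flux (kW/m^2) at a target, point source model.

    Treats the fire as a point radiator: the radiative output
    (radiative_fraction * total heat release rate) is spread evenly
    over a sphere of radius `distance_m`.
    """
    radiative_output_kw = radiative_fraction * q_total_kw
    return radiative_output_kw / (4.0 * math.pi * distance_m ** 2)

# Example: a 100 kW fire with a target 2 m away.
flux = point_source_flux(100.0, 2.0)
```

The model needs only a heat release rate, a radiative fraction and a separation distance, which is consistent with the robustness observed in the study: there are few inputs to get wrong.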
8

Lam, Jessica. "Creating a source for cold, magnetically-trapped bromine atoms." Thesis, University of Oxford, 2015. https://ora.ox.ac.uk/objects/uuid:7a66d81c-9613-47c5-84ad-b295face98e4.

Abstract:
This thesis demonstrates the feasibility of producing the first cold source of halogen atoms, using Br atoms as the focus. Br atoms are produced by photodissociation of Br2 and detected by (2+1) REMPI and time-of-flight measurements. Ground- and excited-state Br fragments are formed with a recoil velocity directed along the molecular beam axis. The excess energy from the dissociation laser provides the backscattered Br fragments with sufficient recoil velocity to match and cancel out the average velocity of the molecular beam. The Br fragments which undergo sufficient velocity cancellation remain in the laser detection volume for up to 5 ms. A magnetic trap, composed of two permanent bar magnets with their north poles facing each other, is placed around the detection volume with its axis perpendicular to the molecular beam and laser axes. The centres of the trap and of the laser interaction volume are overlapped. The magnetic field is linear near the centre of the trap and forms a three-dimensional well with a depth of 0.22 T, equivalent to a trap depth of U0/kB = 255 mK (upper and lower standard errors of [215, 325] mK) for ground-state Br. With this configuration, Br atoms have been detected at delays of up to 99 ms, the interval to the next laser pulse, suggesting the possibility of accumulating density over successive molecular beam cycles. The decay of Br atoms from the trap is measured, and a Markov chain Monte Carlo method is implemented to extract the intensity of the Br signal, which decreases to the order of the background noise at delays close to 99 ms. Monte Carlo simulations demonstrate that trap loss is primarily due to collisions with the molecular beam and background gas, inhibiting the ability to accumulate trap density. These simulations also show that Majorana transitions to higher quantum states are minimal and can be ignored. 
Experimental measurements confirm that near the peak of the molecular beam, when the strongest signal of Br is observed at long delays, around 60% of initially trapped Br atoms are lost due to molecular beam collisions, and (34 ± 3)% due to collisions with the background gas. To minimise molecular beam collisions, a chopper construct is designed to reduce the beam pulse width and is placed between the molecular beam valve and detection volume. The chopper runs from 3,000 rpm to 80,000 rpm, during which the duration of the molecular beam at the detection volume is shortened from 130 μs (measured at the full width at half maximum) to between 80 μs and 13 μs, respectively. However, the chopper significantly reduces the initial Br2 density in the trapping region, and the influence of the chopper construct in reducing molecular beam collisions requires further experimental work. Further work is also necessary to improve the operational components, such as reducing the base pressure in the detection chamber to lower background gas collisions, or using a heavier carrier gas to increase density of initially trapped Br atoms. These improvements can lead us closer toward building density of cold Br atoms in the trap over successive molecular beam cycles, with which a source of cold, dense halogen atoms can be realised.
9

Affonso, Cláudia Andressa Cruz. "Gestão de configuração e colaboração em plataformas de apoio às comunidades Open Source Design." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/18/18156/tde-07072017-095017/.

Abstract:
Open Source Design (OSD) is a development strategy adopted by communities that collaboratively develop manufactured products under copyleft conditions and through a virtual peer-to-peer network. The result is a body of information that enables remote fabrication, in which users can build the product themselves, rather than a specific, previously defined production system as in traditional development. Tangibility and the physical distance between developers impose barriers to collaboration, but advances such as 3D printing technology have given rise to virtual file-sharing platforms that make this reality possible. The number of such platforms is significant, and there are problems in the theoretical delimitation of the phenomenon as well as open questions. OSD development requires the use of boundary objects (BOs). What is the role of these objects? Are they present in the platforms mentioned? Do the existing platforms have features for managing them? Which characteristics must these platforms offer to make OSD development feasible? This work brings together a set of investigations to elucidate these questions. Through an initial case study and systematic reviews, the concepts were analysed and theoretically proposed. A survey of 686 projects from six different OSD communities hosted on the Thingiverse platform, complemented by a content analysis of the relation between platform and collaboration in six projects from different platforms, made it possible to discuss the role of the platforms used by OSD communities. The relation to project performance and to the collaboration achieved is discussed, identifying potential problems and latent solutions. The results indicate that the platforms studied, including the most widespread ones, present limitations. 
Finally, through a combined analysis of the results, in the light of the configuration management practices of traditional development, it was possible to identify a list of latent properties that could be incorporated into the platforms, such as: product structure, versioning, storage and traceability tools, audits, configuration control, and interface governance. The identified properties are an important contribution towards their incorporation into collaborative design tools.
10

Fox, Benjamin Daniel. "Seismic source parameter determination using regional intermediate-period surface waves." Thesis, University of Oxford, 2007. http://ora.ox.ac.uk/objects/uuid:6b89e41d-8dd0-4286-9bf0-d22c4a349bb7.

Abstract:
In general, the depths of shallow earthquakes are poorly resolved in current catalogues. Variations in depth of ±10 km can significantly alter the tectonic interpretation of such earthquakes, and if the depth of a seismic event is in error then moment tensor estimates can also be significantly altered. In the context of nuclear-test-ban monitoring, a seismic event whose depth can be confidently shown to exceed, say, 10 km is unlikely to be an explosion. Surface wave excitation is sensitive to source depth, especially at intermediate and short periods, owing to the approximately exponential decay of surface wave displacements with depth. The radiation pattern and amplitude of surface waves are controlled by the depth variations in the six components of the strain tensor associated with the surface wave eigenfunctions. The potential therefore exists for improvements to depth and moment tensor estimates by analysing surface wave amplitudes and radiation patterns. A new method is developed to better constrain seismic source parameters by analysing 100-20 s period amplitude spectra of fundamental-mode surface waves. Synthetic amplitude spectra are generated for all double-couple sources over a suitable depth range and compared with data in a grid-search algorithm. Best-fitting source parameters are calculated and appropriate bounds are placed on these results. This approach is tested and validated using a representative set of globally distributed events. Source parameters are determined for 14 moderately sized earthquakes (5.4 ≤ Mw ≤ 6.5), occurring in a variety of tectonic regimes, with depths calculated between 4 and 39 km. For very shallow earthquakes the use of surface wave recordings with periods as short as 15 s is shown to improve estimates of source parameters, especially depth. Analysis of aftershocks (4.8 ≤ Mw ≤ 6.0) of the 2004 great Sumatra earthquake is performed to study the depth distribution of seismicity in the region. 
Three distinct tectonic regimes are identified and depth estimates calculated between 3 and 61 km, including the identification of one CMT depth estimate in error by some 27 km.
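The grid-search procedure summarised in this abstract, generating synthetics for every candidate mechanism and depth and keeping the best fit, can be sketched in outline as follows. The toy forward model, parameter increments and RMS misfit here are illustrative assumptions, not the thesis's actual method or code.

```python
import itertools
import math

def rms_misfit(observed, synthetic):
    """Root-mean-square difference between two amplitude spectra."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, synthetic))
                     / len(observed))

def grid_search(observed, forward_model, depths_km,
                strikes=range(0, 360, 10), dips=range(0, 91, 10),
                rakes=range(-90, 91, 10)):
    """Exhaustive search over double-couple mechanisms and source depths.

    forward_model(strike, dip, rake, depth) must return a synthetic
    amplitude spectrum sampled at the same periods as `observed`.
    Returns (best_misfit, (strike, dip, rake, depth)).
    """
    best_misfit, best_params = float("inf"), None
    for params in itertools.product(strikes, dips, rakes, depths_km):
        misfit = rms_misfit(observed, forward_model(*params))
        if misfit < best_misfit:
            best_misfit, best_params = misfit, params
    return best_misfit, best_params

# Illustrative use: a toy forward model whose spectrum depends only on
# depth and strike, so those two parameters are recoverable exactly.
def _toy_model(strike, dip, rake, depth):
    return [0.1 * depth + 0.001 * strike] * 8

observed = _toy_model(20, 30, 0, 10)
misfit, best_params = grid_search(observed, _toy_model,
                                  depths_km=range(2, 40, 2))
```

A real implementation scores misfit against observed fundamental-mode spectra and derives bounds from the shape of the misfit surface, rather than reporting only the single best grid point.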
11

Bowler, David Robert. "A theoretical investigation of gas source growth of the Si(001) surface." Thesis, University of Oxford, 1997. http://ora.ox.ac.uk/objects/uuid:a817986f-114d-4a8a-8001-767f795d0e55.

Abstract:
The growth of the Si(001) surface from gas sources such as disilane is technologically important, as well as scientifically interesting. The aspects of growth covered are: the clean surface, its defects and steps; the action of bismuth, a surfactant; the diffusion behaviour of hydrogen in different environments; and the entire pathway for formation of a new layer of silicon from adsorption of fragments of disilane to nucleation of dimer strings. The theoretical methods used, density functional theory and tight binding, are described. Four linear scaling tight binding methods are compared. The construction of the tight binding parameterisations used is also explained. The structure of the most common defect on the Si(001) surface is identified by comparison of the electronic structure with scanning tunneling microscopy (STM) images. The energy and structure of steps is calculated, and their kinking behaviour is modelled, achieving good agreement with experimental results. Two unusual features which form when bismuth is placed on the surface and annealed are investigated. The first has possible applications as a quantum wire, and its structure and growth are described. The second relates to a controversial area in the field; a structure is proposed which fits all available experimental evidence. The behaviour of hydrogen is vital to understanding growth, as large amounts are deposited during disilane growth. After validating the tight binding parameterisation against DFT and experiment for the system of a single hydrogen diffusing on the clean Si(001) surface, the barriers for diffusion on the saturated surface, down a step and away from a defect are found, and prove to be in good agreement with available experimental data. The pathway for the formation of a new layer of silicon from disilane is described step by step, giving barriers and structures for all events. 
The interaction with experiment is highlighted, and demonstrates that great benefit accrues from such close work, and that the atomistic modelling techniques used in the thesis produce results in close agreement with reality.
12

Lawrie, Scott. "Understanding the plasma and improving extraction of the ISIS Penning H⁻ ions source." Thesis, University of Oxford, 2017. https://ora.ox.ac.uk/objects/uuid:1648761a-57b1-4d6f-8281-9d1c36ccd46a.

Abstract:
A Penning-type surface-plasma negative hydrogen (H-) ion source has been delivering beam at the ISIS pulsed spallation neutron and muon facility for over thirty years. It is one of the most powerful and well-renowned H- sources in the world. Although long-term experience has allowed the source to be operated reliably and set up in a repeatable way, it is treated as something of a 'black box': the detailed plasma physics of why it works has always been unclear. A vacuum Vessel for Extraction and Source Plasma Analyses (VESPA) has been developed to understand the ISIS ion source plasma and improve the beam extracted from it. The VESPA ion source is operated in a completely new regime whereby the analysing sector dipole magnet housed inside a refrigerated 'cold box', presently used on ISIS, is replaced by an on-axis extraction system. The new extraction system incorporates a novel einzel lens with an elliptical aperture. This is the first demonstration of an elliptical einzel being used to focus an asymmetric H- ion beam. With the dipole magnet removed, the ion source has been shown to produce 85 mA of H- beam current at normal settings, of which 80 mA is transported through the new einzel lens system, with a normalised RMS emittance of 0.2 π mm mrad. Optical emission spectroscopy measurements have shown a plasma density of 10^19 m^-3, an H2 dissociation rate of 70%, an almost constant electron temperature of 3.5 eV and an atomic temperature which increases linearly above the electron temperature. In support of these principal measurements, rigorous particle tracking, electrostatic and thermal simulations were performed. In addition, a suite of new equipment was manufactured by the author, including a fast pressure gauge, a temperature controller, a high voltage einzel lens circuit, a fast beam chopper and a caesium detection system.
13

Bernabeu, Llinares Miguel Oscar. "An open source HPC-enabled model of cardiac defibrillation of the human heart." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:9ca44896-8873-4c91-9358-96744e28d187.

Abstract:
Sudden cardiac death following cardiac arrest is a major killer in the industrialised world. The leading cause of sudden cardiac death are disturbances in the normal electrical activation of cardiac tissue, known as cardiac arrhythmia, which severely compromise the ability of the heart to fulfill the body's demand of oxygen. Ventricular fibrillation (VF) is the most deadly form of cardiac arrhythmia. Furthermore, electrical defibrillation through the application of strong electric shocks to the heart is the only effective therapy against VF. Over the past decades, a large body of research has dealt with the study of the mechanisms underpinning the success or failure of defibrillation shocks. The main mechanism of shock failure involves shocks terminating VF but leaving the appropriate electrical substrate for new VF episodes to rapidly follow (i.e. shock-induced arrhythmogenesis). A large number of models have been developed for the in silico study of shock-induced arrhythmogenesis, ranging from single cell models to three-dimensional ventricular models of small mammalian species. However, no extrapolation of the results obtained in the aforementioned studies has been done in human models of ventricular electrophysiology. The main reason is the large computational requirements associated with the solution of the bidomain equations of cardiac electrophysiology over large anatomically-accurate geometrical models including representation of fibre orientation and transmembrane kinetics. In this Thesis we develop simulation technology for the study of cardiac defibrillation in the human heart in the framework of the open source simulation environment Chaste. The advances include the development of novel computational and numerical techniques for the solution of the bidomain equations in large-scale high performance computing resources. 
More specifically, we have considered the implementation of effective domain decomposition, the development of new numerical techniques for the reduction of communication in Chaste's finite element method (FEM) solver, and the development of mesh-independent preconditioners for the solution of the linear system arising from the FEM discretisation of the bidomain equations. The developments presented in this Thesis have brought Chaste to the level of performance and functionality required to perform bidomain simulations with large three-dimensional cardiac geometries made of tens of millions of nodes and including accurate representation of fibre orientation and membrane kinetics. These advances have enabled the in silico study of shock-induced arrhythmogenesis for the first time in the human heart, therefore bridging an important gap in the field of cardiac defibrillation research.
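For reference, the bidomain equations discussed in this abstract are, in their conventional parabolic-elliptic form (standard textbook notation, not necessarily Chaste's):

```latex
\chi \left( C_m \frac{\partial V_m}{\partial t}
      + I_{\mathrm{ion}}(V_m, \mathbf{u}) \right)
  = \nabla \cdot \left( \boldsymbol{\sigma}_i \nabla (V_m + \phi_e) \right),
\qquad
\nabla \cdot \left( (\boldsymbol{\sigma}_i + \boldsymbol{\sigma}_e)
      \nabla \phi_e \right)
  = - \nabla \cdot \left( \boldsymbol{\sigma}_i \nabla V_m \right)
    - I_{\mathrm{stim}},
```

where V_m is the transmembrane potential, φ_e the extracellular potential, σ_i and σ_e the intracellular and extracellular conductivity tensors (which encode fibre orientation), χ the membrane surface-to-volume ratio, C_m the membrane capacitance, u the cell-model state variables and I_stim an extracellular stimulus such as a defibrillation shock. The coupled parabolic-elliptic structure is what makes the linear systems mentioned above so large and preconditioning so important.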
14

Bednář, Jan. "Srovnání komerčních BI reportovacích nástrojů s nástroji Open Source." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-165134.

Full text
Abstract:
This diploma thesis compares commercial and Open Source Business Intelligence (BI) reporting tools. Its main aim is to provide a list of BI reporting tools and a comparison of them, based on a set of criteria. Each criterion is assigned a value that represents its importance within its group, and the same procedure is applied to the groups themselves; the final evaluation is based on the values defined for these groups. The output of the thesis is a table structured into five parts according to the defined groups of criteria. The second part of the thesis walks the reader through a practical showcase of implementing one of the selected tools, SAP BusinessObjects Enterprise 4.0. It begins with a description of the report proposal, covering the graphical design and functionality requirements, and then presents the whole process in detail.
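The two-level weighted scoring the abstract describes can be sketched as follows; all criteria, groups, weights, and scores below are invented for illustration and are not taken from the thesis:

```python
# Hypothetical sketch of a two-level weighted evaluation: each criterion has
# a weight within its group, and each group has a weight in the overall score.

def tool_score(scores, criterion_weights, group_weights):
    """scores: {group: {criterion: 0..10}}; weights are normalised per level."""
    total = 0.0
    for group, gw in group_weights.items():
        cw = criterion_weights[group]
        wsum = sum(cw.values())
        # Weighted average of the criteria inside this group.
        group_score = sum(scores[group][c] * w for c, w in cw.items()) / wsum
        total += gw * group_score
    # Weighted average across groups.
    return total / sum(group_weights.values())

criterion_weights = {
    "functionality": {"ad_hoc_reports": 3, "dashboards": 2},
    "cost": {"licence": 4, "support": 1},
}
group_weights = {"functionality": 2, "cost": 1}
scores = {
    "functionality": {"ad_hoc_reports": 8, "dashboards": 6},
    "cost": {"licence": 3, "support": 7},
}
print(round(tool_score(scores, criterion_weights, group_weights), 2))  # → 6.07
```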
15

Pritzker, David Thomas. "Canopy of everlasting joy : an early source in Tibetan historiography and the history of West Tibet." Thesis, University of Oxford, 2017. http://ora.ox.ac.uk/objects/uuid:f231f283-1711-4428-820d-0c2ccfa11f5f.

Full text
Abstract:
A more descriptive title for the dissertation might be "Early historiography in Purang-Guge and its relationship between orality, kingship, and Tibetan identity: a close study of a recently uncovered 12th century historical manuscript from Tholing monastery in West Tibet." The present study is therefore a close textual analysis of all the outer and inner features of the Tholing Manuscript. When reading the text, there is the gradual realization that the archaic peculiarities in script, binding, spelling, vocabulary, prose, and narrative twists all mark the work as a rare and wholly different version from the early histories typically found in Central Tibet. The key difference lies primarily in the focal point of the narrative. Whereas most similar narratives from the time of the phyi dar (11th-13th centuries) onwards place the history of Buddhism in Tibet at the core of their structure, the Tholing text puts kingship and the history of kings in Tibet at its central focus. For this reason, while Buddhism plays an essential and integral part in the story as a whole, the text can be viewed as a more secular work than any comparable monastic history of the period. The narrative structure of the manuscript, with its heavy use of rhythmical prose, similes, and archaic topoi and motifs, is strikingly reminiscent of parallel passages found among Old Tibetan Documents and is emblematic of the liminal period in which the text was written. At this time, histories were transitioning from dispersed, and possibly oral, transmissions to predominantly formal, organised written traditions. The poetic nature of the text, together with its unusual physical features, raises questions about its purpose and function, including the possibility of its use as a ritual manuscript for royal legitimization.
Through a close study of the text, I offer some insights on the formative nature of early Tibetan historiography in establishing the sacred and political power of the kings of West Tibet.
16

Dysart, Thomas. "Systems within systems : free and open source software licences under German and United States law." Thesis, University of Oxford, 2017. http://ora.ox.ac.uk/objects/uuid:4632118c-1ef6-47b9-ac89-2b3c7889f881.

Full text
Abstract:
Free and Open Source Software (FOSS) licences channel the exclusionary and individualising force of copyright to establish a qualitatively different, somewhat subversive, system for the exploitation of software. This thesis examines how it is that FOSS licences establish this 'system within a system' under both German and United States law. The inquiry begins with a detailed examination of FOSS licence templates as the instruments which transform code from its default position as the 'res' of proprietary relations to its status as 'open' or 'free'. The thesis then considers whether FOSS licence templates, as the legal basis for this subversive move, are valid and enforceable under domestic law. In addressing this question, the thesis undertakes a critical analysis of the leading case law in each jurisdiction. Going beyond the immediate case law, the thesis considers the broader systemic effects of FOSS licence enforcement. It highlights how building a system within a system foments certain tensions and contradictions within the law, in turn giving rise to unintended consequences and legal uncertainty. By highlighting these tensions, the thesis argues that the questions of FOSS licence enforcement in Germany and the United States may not be as settled as some may think.
17

Dušan, Jovanović. "Модел објектно оријентисане класификације у идентификацији геопросторних објеката." Phd thesis, Univerzitet u Novom Sadu, Fakultet tehničkih nauka u Novom Sadu, 2015. http://www.cris.uns.ac.rs/record.jsf?recordId=95426&source=NDLTD&language=en.

Full text
Abstract:
This PhD thesis gives an overview of existing methods for identifying geospatial objects from remote sensing data, essentially satellite or aerial images. Existing problems are analysed, along with the steps necessary to obtain the best possible results in identifying geospatial objects. Mapping procedures, image segmentation methods, criteria for the identification, selection, and classification of geospatial objects, and classification methods are also analysed. The result of this analysis is a model for identifying geospatial objects based on object-oriented image analysis. The proposed model was then verified in case studies identifying buildings, farmland, forests, and water areas.
18

Sadler, Jeffrey Michael. "Hydrologic Data Sharing Using Open Source Software and Low-Cost Electronics." BYU ScholarsArchive, 2015. https://scholarsarchive.byu.edu/etd/4425.

Full text
Abstract:
While it is generally accepted that environmental data are critical to understanding environmental phenomena, there are yet improvements to be made in their consistent collection, curation, and sharing. This thesis describes two research efforts to improve two different aspects of hydrologic data collection and management. First described is a recipe for the design, development, and deployment of a low-cost data logging and transmission system for environmental sensors and its connection to an open source data-sharing network. The hardware is built using several low-cost, open-source, mass-produced components. The system automatically ingests data into HydroServer, a standards-based server in the open source Hydrologic Information System (HIS) created by the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI). A recipe for building the system is provided along with several test deployment results. Second, a connection between HydroServer and HydroShare is described. While the CUAHSI HIS system is intended to empower the hydrologic sciences community with better data storage and distribution, it lacks support for the kind of “Web 2.0” collaboration and social-networking capabilities that are increasing scientific discovery in other fields. The design, development, and testing of a software system that integrates CUAHSI HIS with the HydroShare social hydrology architecture is presented. The resulting system supports efficient archive, discovery, and retrieval of data, extensive creator and science metadata, assignment of a persistent digital identifier such as a Digital Object Identifier (DOI), scientific discussion and collaboration around the data, and other basic social-networking features. In this system, HydroShare provides functionality for social interaction and collaboration while the existing HIS provides the distributed data management and web services framework.
The system is expected to enable scientists, for the first time, to access and share both national- and research-lab-scale hydrologic time series in a standards-based web services architecture combined with a social network developed specifically for the hydrologic sciences. Together, these two research projects address, and provide a solution for, significant challenges in the automatic collection, curation, and feature-rich sharing of hydrologic data.
19

Steer, Edward. "Development and characterisation of a cold molecule source and ion trap for studying cold ion-molecule chemistry." Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:13c3a622-ba78-4a53-902c-666ec461f708.

Full text
Abstract:
A novel apparatus, combining buffer-gas cooling, electrostatic velocity selection and ion trapping, has been constructed and characterised. This apparatus is designed to investigate cold ion-molecule chemistry in the laboratory, at a variable translational and internal (rotational) temperature. This improves on previous experiments with translationally cold but rotationally hot molecule sources. The ability to vary the rotational temperature of cold molecules will allow for the experimental investigation of post-Langevin capture theories.
20

Hewett, David Peter. "Sound propagation in an urban environment." Thesis, University of Oxford, 2010. http://ora.ox.ac.uk/objects/uuid:e7a1d40b-2bf4-4f48-8a6b-ce6f575e955e.

Full text
Abstract:
This thesis concerns the modelling of sound propagation in an urban environment. For most of the thesis a point source of sound is assumed, and both 2D and 3D geometries are considered. Buildings are modelled as rigid blocks, with the effects of surface inhomogeneities neglected. In the time-harmonic case, assuming that the wavelength is short compared to typical lengthscales of the domain (street widths and lengths), ray theory is used to derive estimates for the time-averaged acoustic power flows through a network of interconnecting streets in the form of integrals over ray angles. In the impulsive case, the propagation of wave-field singularities in the presence of obstacles is considered, and a general principle concerning the weakening of singularities when they are diffracted by edges and vertices is proposed. The problem of switching on a time-harmonic source is also studied, and an exact solution for the diffraction of a switched-on plane wave by a rigid half-line is obtained and analysed. The pulse diffraction theory is then applied in a study of the inverse problem for an impulsive source, where the aim is to locate an unknown source using Time Differences Of Arrival (TDOA) at multiple receivers. By using reflected and diffracted pulse arrivals, the standard free-space TDOA method is extended to urban environments. In particular, approximate source localisation is found to be possible even when the exact building distribution is unknown.
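The free-space TDOA localisation that the thesis extends can be sketched as a simple residual minimisation: find the candidate source position whose predicted arrival-time differences best match the measured ones. Everything below (receiver layout, grid-search approach, numbers) is an illustrative assumption, not the thesis's method:

```python
import numpy as np

C = 343.0  # speed of sound in air, m/s

def tdoa_measurements(source, receivers, c=C):
    """Arrival-time differences of each receiver relative to receiver 0."""
    d = np.linalg.norm(receivers - source, axis=1)
    return (d - d[0]) / c

def locate_source(receivers, tdoas, bounds=(-50, 50), n=401, c=C):
    """Brute-force 2D grid search minimising the squared TDOA residual."""
    xs = np.linspace(*bounds, n)
    X, Y = np.meshgrid(xs, xs)
    pts = np.stack([X.ravel(), Y.ravel()], axis=1)
    d = np.linalg.norm(pts[:, None, :] - receivers[None, :, :], axis=2)
    pred = (d - d[:, :1]) / c                 # predicted TDOAs at each pixel
    err = np.sum((pred - tdoas) ** 2, axis=1)
    return pts[np.argmin(err)]

# Four receivers at street corners, one impulsive source (positions in metres).
receivers = np.array([[0.0, 0.0], [40.0, 0.0], [0.0, 40.0], [40.0, 40.0]])
true_source = np.array([12.5, -7.5])
est = locate_source(receivers, tdoa_measurements(true_source, receivers))
```

In free space, four receivers over-determine a 2D source position; the thesis's contribution is to make this work in a city by also exploiting reflected and diffracted arrivals.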
21

Solana, Javier. "All that glitters is not gold : the re-use of securities collateral as a source of systemic risk." Thesis, University of Oxford, 2017. https://ora.ox.ac.uk/objects/uuid:4f5df3ab-ca74-425f-9e35-9a25cd8336b6.

Full text
Abstract:
Since the 1980s, regulators in the U.S. and the U.K. have protected the collateral taker's right to re-use securities collateral in securities financing and OTC derivatives markets on the understanding that it would promote liquidity and credit growth, and reduce systemic risk. However, this rationale was incomplete: it failed to acknowledge the full implications of collateral re-use for systemic risk. In this dissertation, I aim to complete that understanding by illustrating how the re-use of securities collateral in those markets can aggravate systemic risk. In particular, I describe two effects. First, re-using securities collateral multiplies the number of market participants that will be exposed to changes in the price of the collateral asset and can thus amplify the role of asset prices as channels of contagion. Second, by conferring a right to re-use, the collateral provider will effectively waive its proprietary interests in the collateral assets and retain a mere contractual claim against the collateral taker for the return of equivalent securities. This transformation will accentuate the incentive of the collateral provider to run from an over-collateralised collateral taker if the latter were to experience financial difficulty. Information asymmetries and a lack of coordination among collateral providers could push the collateral taker over the brink of insolvency. These risks pose an obvious question for regulators: what should we do about collateral re-use? At a time when international bodies are drawing their attention to this widespread market practice, the question is an invitation to a very timely reflection. The final chapter of the dissertation offers an answer to this question and assesses the potential efficacy of the most recent regulatory initiatives in relation to collateral re-use.
22

Close, Tom A. "Robust quantum phenomena for quantum information processing." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:95324cad-e44b-4bd8-b6e1-173753959993.

Full text
Abstract:
This thesis is concerned with finding technologically useful quantum phenomena that are robust against real world imperfections. We examine three different areas covering techniques for spin measurement, photon preparation and error correction. The first research chapter presents a robust spin-measurement procedure, using an amplification approach: the state of the spin is propagated over a two-dimensional array to a point where it can be measured using standard macroscopic state measurement techniques. Even in the presence of decoherence, our two-dimensional scheme allows a linear growth in the total spin polarisation - an important increase over the √t obtainable in one dimension. The work is an example of how simple propagation rules can lead to predictable macroscopic behaviour and the techniques should be applicable in other state propagation schemes. The next chapter is concerned with strategies for obtaining a robust and reliable single photon source. Using a microscopic model of electron-phonon interactions and a quantum master equation, we examine phonon-induced decoherence and assess its impact on the rate of production, and indistinguishability, of single photons emitted from an optically driven quantum dot system. We find that, above a certain threshold of desired indistinguishability, it is possible to mitigate the deleterious effects of phonons by exploiting a three-level Raman process for photon production. We introduce a master equation technique for quantum jump situations that should have wide application in other situations. The final chapter focusses on toric error correcting codes. Toric codes form part of the class of surface codes that have attracted a lot of attention due to their ability to tolerate a high level of errors, using only local operations. We investigate the power of small scale toric codes and determine the minimum size of code necessary for a first experimental demonstration of toric coding power.
23

Garmulewicz, Alysia. "3D printing in the commons : knowledge and the nature of digital and physical resources." Thesis, University of Oxford, 2015. http://ora.ox.ac.uk/objects/uuid:669993b7-edef-4905-a461-8b1054dad443.

Full text
Abstract:
3D printers are a type of digital fabrication tool being used by communities committed to shared software, hardware, and digital designs. This shared digital knowledge can be understood as an emerging common resource for the fabrication of physical goods and services. Yet the knowledge associated with physical resources used in 3D printing is less understood. This thesis explores what factors enable or prevent knowledge about physical materials entering the commons. 3D printing, with its particular configuration of digital and physical goods, offers a unique angle to advance the field of commons scholarship. This thesis elaborates the use of commons theory for traversing the boundary between knowledge associated with physical materials and digital content from the perspective of 3D printer users. Particular contributions are made to the branch of knowledge commons theory: notably, how design rules in technological systems can be used to theorise boundaries; how differentiating between the nature of underlying resources can help explain the inclusion of knowledge in the commons; and, how patterns of user engagement with types of knowledge in the commons can be studied over time. To develop these contributions I employ theory on the design rules of technological architecture, and use insights from the study of peer production in online communities. Empirical data comes from a qualitative study of users of Fab Labs, community workshops for digital fabrication, as well as from a quantitative study of the online user forum for the Ultimaker 3D printer.
24

Gyöngy, Miklós. "Passive cavitation mapping for monitoring ultrasound therapy." Thesis, University of Oxford, 2010. http://ora.ox.ac.uk/objects/uuid:af6f3c5a-bec5-4378-a617-c89d2b16d95d.

Full text
Abstract:
Cavitation is a phenomenon present during many ultrasound therapies, including the thermal ablation of malignant tissue using high intensity focused ultrasound (HIFU). Inertial cavitation, in particular, has been previously shown to result in increased heat deposition and to be associated with broadband noise emissions that can be readily monitored using a passive receiver without interference from the main ultrasound signal. The present work demonstrates how an array of passive receivers can be used to generate maps of cavitation distribution during HIFU exposure, uncovering a new potential method of monitoring HIFU treatment. Using a commercially available ultrasound system (z.one, Zonare, USA), pulse transmission can be switched off and data from 64 elements of an array can be simultaneously acquired to generate passive maps of acoustic source power. For the present work, a 38 mm aperture 5-10 MHz linear array was used, with the 64 elements chosen to span the entire aperture. Theory and simulations were used to show the spatial resolution of the system, the latter showing that the broadband nature of inertial cavitation makes passive maps robust to interference between cavitating bubbles. Passive source mapping was first applied to wire scatterers, demonstrating the ability of the system to resolve broadband sources. With the array transversely placed to the HIFU axis, high-resolution passive maps are generated, and emissions from several cavitating bubbles are resolved. The sensitivity of passive mapping during HIFU exposure is compared with that of an active cavitation detector following exposure. The array was then placed within a rectangular opening in the centre of the HIFU transducer, providing a geometric setup that could be used clinically to monitor HIFU treatment. Cavitation was instigated in continuous and disjoint regions in agar tissue mimicking gel, with the expected regions of cavitation validating the passive maps obtained. 
Finally, passive maps were generated for samples of ox liver exposed to HIFU. The onset of inertial cavitation as detected by the passive mapping approach was found to provide a much more robust indicator of lesioning than post-exposure B-mode hyperecho, which is in current clinical use. Passive maps based on the broadband component of the received signal were able to localize the lesions both transversely and axially; however, cavitation was generally indicated 5 mm prefocal to the lesions, and further work is needed to establish the source of this discrepancy. It is believed that with the use of an appropriately designed cavitation detection array, passive mapping will represent a major advance in ultrasound-guided HIFU therapy. Not only can it be utilized in real time during HIFU exposure, without the need to turn the therapeutic ultrasound field off, but it has also been shown in the present work to provide a strong indicator of successful lesioning and a high signal-to-noise ratio compared to conventional B-mode ultrasound techniques.
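The passive source mapping described above is, at its core, a delay-and-sum operation on the passively received channels: for each candidate pixel, the channel signals are time-shifted by their relative time of flight and summed, and the time-integrated power of the sum maps the source strength. The sketch below is a minimal illustration of that idea; the array geometry, sampling rate, and brute-force pixel loop are assumptions, not the thesis's implementation:

```python
import numpy as np

C = 1540.0   # speed of sound in soft tissue, m/s
FS = 50e6    # sampling rate, Hz

def passive_power_map(signals, elem_x, grid, fs=FS, c=C):
    """signals: (n_elem, n_samples) passive RF data; elem_x: (n_elem,) element
    x-positions along the array; grid: (n_pix, 2) candidate (x, z) positions."""
    n_elem, n_samp = signals.shape
    t = np.arange(n_samp) / fs
    power = np.zeros(len(grid))
    for i, (x, z) in enumerate(grid):
        dist = np.hypot(elem_x - x, z)        # pixel-to-element distances
        delays = (dist - dist.min()) / c      # relative times of flight
        summed = np.zeros(n_samp)
        for k in range(n_elem):
            # Advance each channel by its relative delay (linear interpolation).
            summed += np.interp(t, t - delays[k], signals[k], left=0, right=0)
        power[i] = np.sum(summed ** 2)        # time-integrated power
    return power

# Tiny demo: one broadband source at (0 mm, 30 mm) in front of a 38 mm,
# 64-element aperture; the map is evaluated at three candidate pixels.
t = np.arange(2048) / FS
elem_x = np.linspace(-19e-3, 19e-3, 64)
dist = np.hypot(elem_x - 0.0, 30e-3)
pulse = lambda tt: np.exp(-(tt / 0.2e-6) ** 2) * np.cos(2 * np.pi * 5e6 * tt)
signals = np.stack([pulse(t - d / C) for d in dist])
grid = np.array([[-5e-3, 30e-3], [0.0, 30e-3], [5e-3, 30e-3]])
power = passive_power_map(signals, elem_x, grid)
# The power map peaks at the true source pixel (index 1).
```

Because broadband emissions from distinct bubbles are mutually incoherent, off-pixel contributions largely fail to sum coherently, which is why such maps are robust to interference between cavitating bubbles.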
25

Spring, Justin Benjamin. "Single photon generation and quantum computing with integrated photonics." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:b08937c7-ec87-47f8-b5ac-902673f87ce2.

Full text
Abstract:
Photonics has consistently played an important role in the investigation of quantum-enhanced technologies and the corresponding study of fundamental quantum phenomena. The majority of these experiments have relied on the free space propagation of light between bulk optical components. This relatively simple and flexible approach often provides the fastest route to small proof-of-principle demonstrations. Unfortunately, such experiments occupy significant space, are not inherently phase stable, and can exhibit significant scattering loss which severely limits their use. Integrated photonics offers a scalable route to building larger quantum states of light by surmounting these barriers. In the first half of this thesis, we describe the operation of on-chip heralded sources of single photons. Loss plays a critical role in determining whether many quantum technologies have any hope of outperforming their classical analogues. Minimizing loss leads us to choose Spontaneous Four-Wave Mixing (SFWM) in a silica waveguide for our source design; silica exhibits extremely low scattering loss and emission can be efficiently coupled to the silica chips and fibers that are widely used in quantum optics experiments. We show there is a straightforward route to maximizing heralded photon purity by minimizing the spectral correlations between emitted photon pairs. Fabrication of identical sources on a large scale is demonstrated by a series of high-visibility interference experiments. This architecture offers a promising route to the construction of nonclassical states of higher photon number by operating many on-chip SFWM sources in parallel. In the second half, we detail one of the first proof-of-principle demonstrations of a new intermediate model of quantum computation called boson sampling. 
While likely less powerful than a universal quantum computer, boson sampling machines appear significantly easier to build and may allow the first convincing demonstration of a quantum-enhanced computation in the not-distant future. Boson sampling requires a large interferometric network, which is challenging to build with bulk optics; we therefore perform our experiment on-chip. We model the effect of loss on our postselected experiment and implement a circuit characterization technique that accounts for this loss. Experimental imperfections, including higher-order emission from our photon-pair sources and photon distinguishability, are modeled and found to explain the sampling error observed in our experiment.
26

Mosley, Peter James. "Generation of heralded single photons in pure quantum states." Thesis, University of Oxford, 2007. http://ora.ox.ac.uk/objects/uuid:44c36e1e-11ee-41e2-ba29-611c932ce4ff.

Full text
Abstract:
Single photons - discrete wavepackets of light - are one of the most fundamental entities in physics. In recent years, the ability to consistently create and manipulate both single photons and pairs of photons has facilitated everything from tests of quantum theory to the implementation of quantum-enhanced precision measurements. These activities all fall within the scope of the rapidly-growing field of quantum information - the exploitation of the properties of quantum states (and specifically their capability to exist in superpositions) to accomplish tasks that would not be possible with classical objects. One stated goal of research in quantum information is to build a device consisting of a network of quantum logic gates that can evaluate quantum algorithms. The photonic implementation of individual logic gates has already been demonstrated. However, partly due to standard methods of preparing single photons, current schemes have severe limitations in terms of scaling up from a single logic gate to multiple concatenated operations. Until now it has not been proven that single photons can be generated in pure and indistinguishable quantum states, something upon which the successful operation of optical quantum logic gates relies. This thesis presents an experimental demonstration of simultaneous generation of almost identical single photons in highly pure states from two independent sources based on parametric downconversion. This is a process of photon pair generation during the passage of a light beam through a nonlinear crystal; one photon from the resulting pair is detected to herald the other. The work herein describes, refines, and implements a technique that minimises the strong quantum correlations usually present within each pair by spectral engineering of the source. This allows the heralded single photons to be in pure states, a property that is confirmed by observing a high-visibility two-photon interference effect without spectral filtering.
27

Pibre, Lionel. "Localisation d'objets urbains à partir de sources multiples dont des images aériennes." Thesis, Montpellier, 2018. http://www.theses.fr/2018MONTS107/document.

Full text
Abstract:
This thesis addresses problems related to the localisation and recognition of urban objects in very-high-resolution multi-source images (optical, infrared, Digital Surface Model) acquired from the air. Urban objects (street lights, poles, cars, trees, etc.) vary greatly in size, shape, texture, and colour. They may stand right next to one another and are small relative to the size of an image. They are present in large numbers but can be partially occluded. All of this makes urban objects difficult to identify with current image-processing techniques. We first compared classical learning approaches, consisting of two stages (feature extraction with a predefined descriptor, followed by a classifier), with deep learning approaches, more precisely Convolutional Neural Networks (CNNs). CNNs give better results, but their performance is not sufficient for industrial use, so we proposed two improvements. Our first contribution is an effective way of combining data from different sources. We compared a naive approach, which treats all sources as components of a multidimensional image, with an approach that fuses the information within the CNN itself by processing the different sources in separate branches. We showed that when the training set contains little data, intelligently combining the sources in a pre-processing step (we combine the optical and infrared channels to create an NDVI image) before feeding them to the CNN improves performance.
For our second contribution, we focused on the problem of incomplete data. Until then we had assumed that every source was available for every image, but a source may be missing or unusable for a given image. We proposed an architecture that takes all the data into account even when a source is missing for one or more images, and showed that in an enrichment scenario it yields a gain of more than 2% in F-measure. The proposed methods were tested on a public database. They are intended to be integrated into software from the Berger-Levrault company in order to enrich geographic databases and so help local authorities manage their territory.
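The optical/infrared pre-processing fusion mentioned in the abstract reduces, in its simplest form, to the standard Normalized Difference Vegetation Index (NDVI). Below is a minimal textbook-style sketch (band values and shapes are made up; this is the generic index, not the thesis's exact pipeline):

```python
import numpy as np

def ndvi(nir, red, eps=1e-8):
    """Per-pixel NDVI in [-1, 1] from near-infrared and red bands."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Vegetation reflects strongly in NIR, so vegetated pixels score near +1
# while bare surfaces such as asphalt score near -1.
nir = np.array([[0.6, 0.1]])
red = np.array([[0.1, 0.6]])
print(ndvi(nir, red))  # ≈ [[ 0.714 -0.714]]
```

Collapsing two raw bands into one physically meaningful layer is one way to inject domain knowledge before the CNN when training data are scarce.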
APA, Harvard, Vancouver, ISO, and other styles
28

Ba, Mouhamadou Lamine. "Exploitation de la structure des données incertaines." Electronic Thesis or Diss., Paris, ENST, 2015. http://www.theses.fr/2015ENST0013.

Full text
Abstract:
This thesis addresses some fundamental problems inherent to the need for uncertainty handling in multi-source Web applications with structured information, namely uncertain version control in Web-scale collaborative editing platforms, integration of uncertain Web sources under constraints, and truth finding over structured Web sources. Its major contributions are: uncertainty management in version control of tree-structured data using a probabilistic XML model; initial steps towards a probabilistic XML data integration system for uncertain and dependent Web sources; precision measures for location data; and exploration algorithms for an optimal partitioning of the input attribute set during a truth-finding process over conflicting Web sources
APA, Harvard, Vancouver, ISO, and other styles
29

Palacino, Julian. "Outils de spatialisation sonore pour terminaux mobiles : microphone 3D pour une utilisation nomade." Thesis, Le Mans, 2014. http://www.theses.fr/2014LEMA1007/document.

Full text
Abstract:
Mobile technologies (such as smartphones and tablets) are now common devices on the consumer market. In this PhD we want to use those technologies as a way to introduce sound spatialization tools to the mass market. Today, the size and number of transducers used to pick up and to render a spatial sound scene are the main factors limiting the portability of those devices. As a first step, a listening test based on a spatial audio recording of an opera allowed us to evaluate the 3D audio technologies available today for headphone rendering. The results of this test show that, using an appropriate binaural decoding, it is possible to achieve good binaural rendering using only the four sensors of the Soundfield microphone. Then, the steps of the development of a 3D sound pick-up system are described. Several configurations are evaluated and compared. The device, composed of 3 cardioid microphones, was developed following an approach inspired by sound source localization and by the concept of "object format" encoding. Using the microphone signals and adapted post-processing, it is possible to determine the directions of the sources and to extract a sound signal that is representative of the sound scene. In this way, it is possible to completely describe the sound scene and to compress the audio information. This method offers the advantage of being cross-platform compatible: the sound scene encoded with it can be rendered over any reproduction system. A second method to extract the spatial information is proposed. It uses the real in situ characteristics of the microphone array to perform the sound scene analysis. Some propositions are made to complement the 3D audio chain, allowing the result of the sound scene encoding to be rendered over a binaural system or any kind of loudspeaker array, using all the capabilities of today's mobile devices
APA, Harvard, Vancouver, ISO, and other styles
30

Reipurth, Bo. "Herbig-Haro objects and their energy sources /." København : Københavns Universitets Astronomiske Observatorium, 1999. http://www.gbv.de/dms/goettingen/269253831.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Walkinshaw, Neil. "Partitioning object-oriented source code for inspections." Thesis, University of Strathclyde, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.436084.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Ostrom, Christopher Lloyd II. "Open Source Toolkit for Reentry Object Modeling." DigitalCommons@CalPoly, 2015. https://digitalcommons.calpoly.edu/theses/1528.

Full text
Abstract:
Predicting the mass, position, and velocity of an object during its reentry is critical to satisfying NASA and ESA requirements. This thesis outlines a 3-D orbit and mass determination system for use in low Earth orbit, applicable to general objects of various materials and sizes. The solution uses analytical models to calculate heat flux and aerodynamic drag, with some basic numerical models for simple orbit propagation and mass flow rate due to ablation. The system outlined in this thesis currently provides a framework for rough estimates of demise altitude and final mass, but also allows for many potential accuracy and speed improvements. 77 aerospace materials were tested, as solid spheres, cubes, and cylinders; it was found that materials with a low latent heat of fusion (less than 10 kJ/kg) demise before reaching the ground, while materials with higher melting point temperatures (over 1200 K), high specific heats, and a high latent heat of fusion (over 30 kJ/kg) lose small amounts of mass before hitting the ground at speeds of 200-300 m/s. The results of this thesis code are validated against NASA's Debris Assessment System (DAS), specifically the test cases of Acrylic, Molybdenum, and Silver.
APA, Harvard, Vancouver, ISO, and other styles
33

Alarcón, Reynaldo. "Sources of happiness : What makes people happy?" Pontificia Universidad Católica del Perú, 2002. http://repositorio.pucp.edu.pe/index/handle/123456789/100299.

Full text
Abstract:
The present research had the following goals: (a) to identify the favorite objects for reaching happiness; and (b) to determine whether the favorite objects are associated with the gender, marital status and age of the participant. The sample consisted of 163 middle-class people: 81 men and 82 women, all from the city of Lima. They were asked to choose, from a list of 15 items, the three objects they considered the most important for happiness. The favorite objects were: "to have good health", "to have a good relationship with God" and "to have a good family". There were no significant differences in object choice according to gender, although there were differences according to age and marital status. Multiple regression analysis showed that the objects considered most important explain 66% of the variance of the happiness variable. The results are discussed and compared with similar works.
APA, Harvard, Vancouver, ISO, and other styles
34

Danbury, Richard M. "The 'full liberty of public writers' : special treatment of journalism in English law." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:5299bf69-f793-4280-9525-9f3cc6f50ccc.

Full text
Abstract:
This thesis investigates whether institutional journalism should receive special treatment at the hands of the law. Special treatment encompasses the affording of benefits to and the imposition of liabilities on journalistic institutions and the individuals who work for them. The arguments against special treatment are pragmatic and theoretical: pragmatic arguments emphasise, inter alia, the difficulty of providing a definition of journalism, and theoretical arguments emphasise the difficulty in explaining why special treatment can be coherent. The former can be addressed by describing how special treatment is already afforded to institutional journalism, both liabilities and benefits, to individuals and institutions, and showing that some of the problems foreseen by the pragmatic arguments have not proved as difficult as they appear. The arguments that special treatment is incoherent can be addressed by arguing that the credibility and assessability of institutional journalism still provide a prima facie rationale for special treatment irrespective of the rise of public speech on the Internet, when combined with the integral nature of journalism to democracy. Two basic arguments are advanced why this is so. The first, the free speech values argument, is a consequentialist account that holds that special treatment is appropriate when (or because) institutional journalism contributes to free speech values. It is attractive, but presents difficulties, both when considered in the abstract and when applied to the free speech value of democracy. The second, a rights-based argument, based on the notion that freedoms of speech and of the Press are distinguishable, can be based either on Dworkin’s theory of rights as trumps or on Raz’s theory of rights as interests. Raz’s account is preferable, as it complements the free speech values thesis in explaining the coherence of special treatment.
APA, Harvard, Vancouver, ISO, and other styles
35

Masakapalli, Shyam Kumar. "Network flux analysis of central metabolism in plants." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:ac8b3836-9ab7-4060-b50a-df8aaa0e4ba5.

Full text
Abstract:
The aim of this thesis was to develop stable-isotope steady-state metabolic flux analysis (MFA) based on 13C labeling to quantify intracellular fluxes of central carbon metabolism in plants. The experiments focus on the analysis of a heterotrophic cell suspension culture of Arabidopsis thaliana (L) Heynh. (ecotype Landsberg erecta). The first objective was to develop a robust methodology based on combining high quality steady-state stable labeling data, metabolic modeling and computational analysis. A comprehensive analysis of the factors that influence the outcome of MFA was undertaken and best practice established. This allowed a critical analysis of the subcellular compartmentation of carbohydrate oxidation in the cell culture. The second objective was to apply the methodology to nutritional perturbations of the cell suspension. A comparison of growth on different nitrogen sources revealed that transfer to an ammonium-free medium: (i) increased flux through the oxidative pentose phosphate pathway (oxPPP) by 10% relative to glucose utilisation; (ii) caused a substantial decrease in entry of carbon into the tricarboxylic acid cycle (TCA); and (iii) increased the carbon conversion efficiency from 55% to 69%. Although growth on nitrate alone might be expected to increase the demand for reductant, the cells responded by decreasing the assimilation of inorganic N. Cells were also grown in media containing different levels of inorganic phosphate (Pi). Comparison of the flux maps showed that decreasing Pi availability: (i) decreased flux through the oxPPP; (ii) increased the proportion of substrate fully oxidised by the TCA cycle; and (iii) decreased carbon conversion efficiency. These changes are consistent with redirection of metabolism away from biosynthesis towards cell maintenance as Pi is depleted. 
Although published genome-wide transcriptomic and metabolomic studies suggest that Pi starvation leads to the restructuring of carbon and nitrogen metabolism, the current analysis suggests that the impact on metabolic organisation is much less extreme.
APA, Harvard, Vancouver, ISO, and other styles
36

Acharya, Rupesh. "Object Oriented Design Pattern Extraction From Java Source Code." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-207394.

Full text
Abstract:
In software architecture reconstruction, design pattern detection plays a vital role, since the presence of a pattern reflects a point of design decision. Currently, most studied approaches focus only on the Gang of Four (GOF) design patterns, so those tools are not flexible enough to identify other, proprietary pattern instances. Moreover, the GOF design patterns can be implemented in various ways, which many tools fail to detect. Apart from that, design patterns are not the only artifacts of vital importance for software architecture reconstruction; other patterns, such as anti-patterns and the presence of bad-smell code, are equally important. The approach discussed here is therefore a solution for detecting any pattern instances (not only GOF patterns) from source code, provided that the relevant information is extracted during the static analysis phase. Our approach is based on graph pattern matching, where the source code is modeled as a graph and the pattern to search for is provided as a graph query pattern. For the detection of patterns we focus on structural and behavioral analysis of source code, as in the case of a tool called PINOT. The novelty of our approach compared to PINOT is that the choice of behavioral analyzers can be provided as a constraint in the graph query pattern, rather than being hardcoded as in PINOT. Moreover, we can provide more than one constraint in the graph query pattern at the node, edge or whole-graph level; hence we can compose the query pattern as we want, which lets us specify new kinds of patterns and handle varying implementations of design patterns as well.
APA, Harvard, Vancouver, ISO, and other styles
37

Khayundi, Peter. "A comparison of open source object-oriented database products." Thesis, University of Fort Hare, 2009. http://hdl.handle.net/10353/254.

Full text
Abstract:
Object-oriented databases have been gaining popularity over the years. Their ease of use and the advantages they offer over relational databases have made them a popular choice among database administrators. Their use in previous years was restricted to business and administrative applications, but improvements in technology and the emergence of new, data-intensive applications have led to an increase in the use of object databases. This study investigates four open source object-oriented databases with respect to their ability to carry out the standard database operations of storing, querying, updating and deleting database objects. Each of these databases is timed in order to measure which is capable of performing a particular function faster than the others.
APA, Harvard, Vancouver, ISO, and other styles
38

劉美楣 and Mei-mei Lau. "A study of the gamma ray production from extragalactic objects." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1987. http://hub.hku.hk/bib/B31207698.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Lau, Mei-mei. "A study of the gamma ray production from extragalactic objects /." [Hong Kong : University of Hong Kong], 1987. http://sunzi.lib.hku.hk/hkuto/record.jsp?B12335320.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Rivas, Teobaldo. "Objetos de aprendizagem no contexto das comunidades virtuais auto-organizadas para a produção de software livre e de código aberto." Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/18/18140/tde-23012010-195124/.

Full text
Abstract:
The present thesis aims to produce subsidies for the development of a theoretical-methodological referential on the use of learning objects in the context of self-organized virtual communities for the development of free and open source software. The methodologies used were virtual ethnography (HINE, 2000), together with the mediated action analytical theory (WERTSCH, 1991) and content analysis (BARDIN, 2000). Data was collected in two phases from the discussion forums of four (4) communities, and from an incubator community for the development of free and open source software projects. The learning objects appear to mediate the problem-solving process, since all the problems of the analyzed sample were solved, despite the lack of a standard of compliance for those objects. This limitation is overcome due to the specific dominant profile of the active members of the community, who display a high level of collaborative/cooperative behavior, voluntary initiative, detachment, obstinacy, self-learning capacity, autonomy, independence, discipline, responsibility and commitment to deadlines, product quality and other requirements established by the community. Another relevant aspect is that the communities are rich in human interaction, which qualifies the mediation process and the collaborative environment in the actions referring to the location, assembly and contextualization of the learning objects. The significant results obtained by those communities have led large software production organizations to review their corporate strategies, development best practices, and people, team and project management.
On the other hand, it is inferred that the sustainability of those communities cannot rest only on personal attributes and abilities, especially because the location of the learning objects for problem solving is based on the tacit knowledge of their members. It is necessary to bring innovation into the form and functionality of those communities (standard of compliance, technological methods and tools), not only to enable effective and universal accessibility of the knowledge produced for more efficient problem solving, but also to incorporate members with diverse behaviors and abilities. The results of the present research contribute to future innovation, in both the theoretical and practical fields, in the definition of a standard of compliance for the specification, indexing, use, combination and evaluation of learning objects, in addition to motivating changes in behavior, culture and ways of learning.
APA, Harvard, Vancouver, ISO, and other styles
41

Guagliumi, Arthur Robert. "Assemblage art: origins and sources /." Access Digital Full Text version, 1990. http://pocketknowledge.tc.columbia.edu/home.php/bybib/10910244.

Full text
Abstract:
Thesis (Ed.D)--Teachers College, Columbia University, 1990.
Includes appendices. Typescript; issued also on microfilm. Sponsor: Justin Schorr. Dissertation Committee: David S. Nateman. Bibliography: leaves 162-186.
APA, Harvard, Vancouver, ISO, and other styles
42

Aval, Josselin. "Automatic mapping of urban tree species based on multi-source remotely sensed data." Thesis, Toulouse, ISAE, 2018. http://www.theses.fr/2018ESAE0021/document.

Full text
Abstract:
With the expansion of urban areas, air pollution and the heat island effect are increasing, leading to health issues for inhabitants and global climate change. In this context, urban trees are a valuable resource for both improving air quality and promoting islands of coolness. On the other hand, canopies are subject to specific conditions in the urban environment, causing the spread of diseases and decreased life expectancy among the trees. This thesis explores the potential of remote sensing for automatic urban tree mapping, from the detection of individual tree crowns to the estimation of their species, an essential preliminary task for designing the future green cities and for effective vegetation monitoring. Based on airborne hyperspectral, panchromatic and Digital Surface Model data, the first objective of this thesis is to take advantage of several data sources to improve existing urban tree maps, by testing different fusion strategies (feature-level and decision-level fusion). The nature of the results led us to optimize the complementarity of the sources. In particular, the second objective is to investigate in depth the richness of the hyperspectral data, by developing an ensemble-classifier approach based on vegetation indices, where the classifiers are species-specific. Finally, the first part highlighted the interest of discriminating street trees from the other structures of urban trees. In a Marked Point Process (MPP) framework, the third objective is to detect trees in urban alignments. Through the first objective, this thesis demonstrates that the hyperspectral data are the main driver of species prediction accuracy. The decision-level fusion strategy is the most appropriate one for improving performance in comparison with the hyperspectral data alone, but only slight improvements are obtained (a few percent) due to the low complementarity of textural and structural features in addition to the spectral ones.
The ensemble-classifier approach developed in the second part allows the tree species to be classified from ground-based references, with significant improvements over a standard feature-level classification approach. Each extracted species classifier reflects the discriminative spectral attributes of the species and can be related to the expertise of botanists. Finally, the street trees can be mapped thanks to the proposed MPP interaction term, which models their contextual features (alignment and similar heights). Many improvements remain to be explored, such as more accurate tree crown delineation, and several perspectives are conceivable after this thesis, among which the monitoring of the state of health of urban trees
APA, Harvard, Vancouver, ISO, and other styles
43

Laverdet, Caroline. "Aspects juridiques des mondes virtuels." Thesis, Paris 2, 2020. http://www.theses.fr/2020PA020006.

Full text
Abstract:
"Virtual worlds", or "metaverses", allow many users to immerse themselves online, in three-dimensional, interactive and persistent spaces, through their avatars. The economic enthusiasm generated by these universes runs up against a legal framework that is still almost non-existent today. Claims include, for example, property rights over virtual objects, the protection of freedom of expression within virtual universes, and specific protection of the avatar, particularly when the publisher unilaterally decides to delete a user's account. However, these rights and freedoms generally conflict with the rules and conditions of use set by publishers, conditions that must be accepted by users in order to access the persistent universes. Should we therefore apply, and more simply adapt, the legal rules existing in the real world to virtual worlds? Through the study of the legal aspects of virtual worlds, the objective of this thesis is to examine how the law has so far dealt with persistent spaces and the conditions for a better legal apprehension in the future
APA, Harvard, Vancouver, ISO, and other styles
44

Katarina, Gavrić. "Mining large amounts of mobile object data." Phd thesis, Univerzitet u Novom Sadu, Fakultet tehničkih nauka u Novom Sadu, 2017. https://www.cris.uns.ac.rs/record.jsf?recordId=105036&source=NDLTD&language=en.

Full text
Abstract:
Within this thesis, we examined the possibilities of using an increasing amount of publicly available metadata about locations and people's activities in order to gain new knowledge and develop new models of the behavior and movement of people. The purpose of the research conducted for this thesis was to solve practical problems, such as: analyzing attractive tourist sites, defining the most frequent routes people are taking, defining the main modes of transportation, and discovering behavioral patterns with a view to defining strategies to suppress the spread of virus infections. To this end, a practical study was carried out on the basis of protected (aggregated and anonymized) CDR (Caller Data Records) data and metadata of geo-referenced multimedia content. The approach is based on the application of artificial intelligence and data-mining techniques.
APA, Harvard, Vancouver, ISO, and other styles
45

Lee, Young, and Kai-Hsiung Chang. "Automated source code measurement environment for software quality." Auburn, Ala., 2007. http://repo.lib.auburn.edu/2007%20Fall%20Dissertations/Lee_Young_28.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Dowthwaite, J. C. "Very energetic gamma rays from binary X-ray sources and other astronomical objects." Thesis, Durham University, 1987. http://etheses.dur.ac.uk/7064/.

Full text
Abstract:
This thesis describes the observation of a number of astronomical objects using the University of Durham Atmospheric Cerenkov light detectors. The array of telescopes was used to study the Very High Energy (V.H.E.) gamma-radiation from these objects from June 1981 until November 1984. The general features of gamma-ray astronomy are briefly discussed, and a review of the main results of previous gamma-ray observations is given. The basic theory and general characteristics of Atmospheric Cerenkov Effect experiments are reviewed. Details of the design, operation and performance of the University of Durham facility are presented, in addition to details of the improvements achieved in the development of a new telescope. In particular, the new optical system is described. The main analysis procedures are explained. The adaptation of statistical techniques used to analyse the intensity of the Cerenkov light flash is described in some detail. A discussion of the problems involved in conducting an extensive search for periodicity in the data collected from Cygnus X-3 is given. A procedure for testing for transient pulsed gamma-ray emission from the Crab Pulsar is also described. The results of the observations of several objects are presented: the binary X-ray sources Cygnus X-3, Hercules X-1 and 4U0115+63, the Crab pulsar and the Galactic Plane. In addition, the preliminary results from observations of seven radio pulsars and seven other objects are given. A review of the main production mechanisms of V.H.E. gamma-radiation is given, with particular emphasis on the models proposed for the high-energy processes in Cygnus X-3, other binary X-ray sources and pulsars.
APA, Harvard, Vancouver, ISO, and other styles
47

Bossuyt, Bernard J., and Byron B. Snyder. "Software testing tools : analyses of effectiveness on procedural and object-orientated source code." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2001. http://handle.dtic.mil/100.2/ADA397128.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Snyder, Byron B. "Software testing tools : analyses of effectiveness on procedural and object-orientated source code." Thesis, Monterey, California. Naval Postgraduate School, 2001. http://hdl.handle.net/10945/1938.

Full text
Abstract:
The levels of quality, maintainability, testability, and stability of software can be improved and measured through the use of automated testing tools throughout the software development process. Automated testing tools assist software engineers to gauge the quality of software by automating the mechanical aspects of the software-testing task. Automated testing tools vary in their underlying approach, quality, and ease-of-use, among other characteristics. Evaluating available tools and selecting the most appropriate suite of tools can be a difficult and time-consuming process. In this thesis, we propose a suite of objective metrics for measuring tool characteristics, as an aid in systematically evaluating and selecting automated testing tools. Future work includes further research into the validity and utility of this suite of metrics, conducting similar research using a larger software project, and incorporating a larger set of tools into similar research.
US Navy (USN) author
APA, Harvard, Vancouver, ISO, and other styles
49

Davorka, Radaković. "Metadata-Supported Object-Oriented Extension of Dynamic Geometry Software." Phd thesis, Univerzitet u Novom Sadu, Prirodno-matematički fakultet u Novom Sadu, 2019. https://www.cris.uns.ac.rs/record.jsf?recordId=111034&source=NDLTD&language=en.

Full text
Abstract:
Nowadays, Dynamic Geometry Software (DGS) is widely accepted as a tool for creating and presenting visually rich interactive teaching and learning materials, called dynamic drawings. Dynamic drawings are specified by writing expressions in functional domain-specific languages. Due to the wide acceptance of DGS, there has arisen a need for their extensibility, by adding new semantics and visual objects (visuals). We have developed a programming framework for Dynamic Geometry Software, SLGeometry, with a genericized functional language and a corresponding expression evaluator that act as a framework into which specific semantics is embedded in the form of code annotated with metadata. The framework transforms an ordinary expression tree evaluator into an object-oriented one, and provides guidelines and examples for the creation of interactive objects with dynamic properties, which participate in evaluation optimization at run-time. Whereas other DGS are based on purely functional expression evaluators, our solution has the advantages of being more general and easy to implement, and it provides a natural way of specifying object properties in the user interface, minimizing typing and syntax errors. SLGeometry is implemented in C# on the .NET Framework. Although attributes are the preferred mechanism for associating declarative information with C# code, they have certain restrictions which limit their application to representing complex structured metadata. By developing a metadata infrastructure which is independent of attributes, we were able to overcome these limitations. Our solution, presented in this dissertation, provides extensibility to simple and complex data types, unary and binary operations, type conversions, functions and visuals, thus enabling developers to seamlessly add new features to SLGeometry by implementing them as C# classes annotated with metadata.
It also provides insight into the way the domain-specific functional language of dynamic geometry software can be genericized and customized for specific needs by extending or restricting the set of types, operations, type conversions, functions and visuals. Furthermore, we have conducted experiments with several groups of mathematics students and high-school pupils, in order to test how our approach compares to existing practice. The experimental subjects tested mathematical games using interactive visual controls (UI controls) and sequential behavior controllers. Finally, we present a new evaluation algorithm, which was compared to the usual approach employed in DGS and found to perform well, introducing advantages while maintaining the same level of performance.
APA, Harvard, Vancouver, ISO, and other styles
50

Sodreau, Alexandre. "Design de précurseurs organométalliques et synthèse contrôlée de nano-objets de germaniure de fer." Thesis, Toulouse 3, 2018. http://www.theses.fr/2018TOU30290/document.

Full text
Abstract:
The controlled synthesis of iron germanide nano-alloys has gained renewed interest thanks to the recent discovery of new applications in the field of information storage. However, the chemistry of the iron-germanium pair is complex and remains little studied. The work presented in this thesis combines molecular chemistry and nano-object chemistry to explore the potential of single-source precursors for the solution synthesis, under mild conditions, of iron germanide NPs. First, we focused on the formation of new complexes with an amidinatogermylene-type architecture offering a balance between the stabilization of the complexes and their decomposition temperatures, for example the mono-germylene iron complexes {[iPrNC(tBu)NiPr]GeCl}Fe(CO)4 and {[iPrNC(tBu)NiPr]GeHMDS}Fe(CO)4 or the bis-germylene iron complex {[iPrNC(tBu)NiPr]GeCl}2Fe(CO)3. In a second step, we show that this method is a route of choice for the formation of iron germanide nano-alloys and that the architecture of the single-source precursors provides control over the final nanoparticles. In particular, the decomposition at 200 °C of the {[iPrNC(tBu)NiPr]GeHMDS}Fe(CO)4 complex leads to the formation of spherical nanoparticles of the Fe3.2Ge2 phase, with a mean diameter of 6.5 ± 0.8 nm, exhibiting ferromagnetic behavior.
APA, Harvard, Vancouver, ISO, and other styles