
Dissertations / Theses on the topic 'Methods and techniques of biology'



Consult the top 50 dissertations / theses for your research on the topic 'Methods and techniques of biology.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Veneziano, Dario. "Knowledge bases, computational methods and data mining techniques with applications to A-to-I RNA editing, Synthetic Biology and RNA interference." Doctoral thesis, Università di Catania, 2015. http://hdl.handle.net/10761/4085.

Full text
Abstract:
Bioinformatics, also known as Computational Biology, is a relatively new field that aims to solve biological problems through computational approaches. Among its many goals, this interdisciplinary science pursues two in particular: on the one hand, the construction of biological databases to store rationally the ever-larger quantities of data that are becoming available; on the other, the development and application of algorithms to extract predictive patterns and to infer from such data new knowledge that would otherwise be impossible to obtain. This thesis presents new results on both of these fronts. The research work described in this doctoral thesis aimed to develop heuristics and data mining techniques for the collection and analysis of data on post-transcriptional regulation mechanisms and RNA interference, as well as to link the phenomenon of A-to-I RNA editing with miRNA-mediated gene regulation. In particular, the efforts were directed towards the development of a database for the prediction of binding sites for miRNAs modified by A-to-I RNA editing; an algorithm for the design of synthetic miRNAs with high specificity; and a knowledge base equipped with data mining algorithms for the functional annotation of microRNAs, proposed as a unified resource for miRNA research.
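The first of these resources lends itself to a small illustration. The sketch below is hypothetical throughout (the sequence, editing position, and seed coordinates are invented, and real predictors also score pairing thermodynamics): it shows only the core idea that A-to-I editing, read by the cell as A→G, changes a miRNA's seed and therefore the target sites predicted for it.

```python
# Hedged sketch: how A-to-I editing inside a miRNA seed (positions 2-8)
# changes the predicted binding site. Inosine base-pairs like guanosine,
# so the edit is modelled as A -> G. The sequence below is invented.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def seed(mirna):
    """Seed region: miRNA positions 2-8 (0-based slice 1:8)."""
    return mirna[1:8]

def edit_a_to_i(seq, position):
    """Apply A-to-I editing at `position` (inosine read as G)."""
    assert seq[position] == "A", "A-to-I editing requires an adenosine"
    return seq[:position] + "G" + seq[position + 1:]

def target_site(seed_seq):
    """Reverse complement of the seed: the site a target mRNA must carry."""
    return "".join(COMPLEMENT[b] for b in reversed(seed_seq))

mirna = "UAGCAUAUCGGAUACGUGAA"   # hypothetical miRNA
edited = edit_a_to_i(mirna, 4)   # edit the adenosine at position 4 (in the seed)

print(target_site(seed(mirna)))   # site recognised before editing
print(target_site(seed(edited)))  # a different site after editing
```

A single edit within the seed is enough to redirect the predicted target repertoire, which is why a dedicated database of edited-miRNA binding sites is useful.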
APA, Harvard, Vancouver, ISO, and other styles
2

Wallrapp, Frank. "Mixed quantum and classical simulation techniques for mapping electron transfer in proteins." Doctoral thesis, Universitat Pompeu Fabra, 2011. http://hdl.handle.net/10803/22685.

Full text
Abstract:
The focus of this PhD thesis lies on electron transfer (ET) processes, which belong to the simplest yet most crucial reactions in biochemistry. Obtaining direct information on the forces driving the process and on the actual electron pathway is not a trivial task. Such atomically and electronically detailed information, however, is very valuable for a better understanding of the enzymatic cycle, which might lead, for example, to more efficient protein inhibitor design. The main objective of this thesis was the development of a methodology for the quantitative study of ET in biological systems. In this regard, we developed a novel approach to map long-range electron transfer pathways, called QM/MM e-Pathway. The method is based on a successive search for important ET residues, modifying the QM region while following the evolution of the spin density of the electron (hole) within a given transfer region. We proved the usefulness and applicability of the algorithm on the P450cam/Pdx complex, identifying the key role of Arg112 of P450cam and Asp48 of Pdx in its ET pathway, both known from the literature to be important. Beyond identifying ET pathways, we further quantified their importance in terms of the electronic coupling between donor and acceptor when the particular pathway residues are incorporated. In this regard, we performed two systematic evaluations of the influence of solvent and temperature on electronic coupling in oligopeptide model systems. Both studies revealed that electronic coupling values fluctuate strongly along the molecular dynamics trajectories obtained, and that the mechanism of electron transfer is affected by the conformational space the system is able to occupy. Combining ET mapping and electronic coupling calculations, we finally investigated the electron transfer in the CcP/Cytc complex.
Our findings indicate the key role of Trp191 as the bridge-localized state of the ET, as well as a main pathway between CcP and Cytc consisting of Ala194, Ala193, Gly192 and Trp191. Both findings are confirmed by the literature. Moreover, our calculations on several snapshots indicate a non-gated ET mechanism in this protein complex. The methodology developed in this thesis, mapping ET pathways and evaluating them through electronic coupling calculations, offers a straightforward and promising approach to investigating long-range ET in proteins.
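The successive-search loop at the heart of the QM/MM e-Pathway idea can be outlined in a few lines. This is only a control-flow sketch under loud assumptions: the real method obtains spin densities from QM/MM calculations, whereas here a stub returns a mock ranking (reusing residue names from the abstract) so the loop is runnable.

```python
# Toy sketch of the successive-search idea behind QM/MM e-Pathway.
# Assumption: spin_density() is a stand-in for a QM/MM run; it returns a
# mock spin density per residue. Residues already moved into the QM region
# no longer carry the excess spin, so the search advances along the path.

def spin_density(qm_region):
    """Mock QM/MM result: the highest-spin residue not yet in the region."""
    ranking = ["Arg112", "Asp48", "acceptor"]  # illustrative ordering only
    for res in ranking:
        if res not in qm_region:
            return {res: 1.0}
    return {}

def e_pathway(donor, acceptor, max_steps=10):
    """Grow the QM region one residue at a time, always adding the residue
    carrying the most spin density, until the acceptor is reached."""
    qm_region = [donor]
    for _ in range(max_steps):
        density = spin_density(qm_region)
        if not density:
            break
        best = max(density, key=density.get)
        qm_region.append(best)
        if best == acceptor:
            break
    return qm_region

print(e_pathway("donor", "acceptor"))  # residues visited, in order of importance
```

The appeal of the scheme is that each iteration asks only "where does the migrating spin sit now?", so the pathway emerges without enumerating all residue combinations.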
3

Botha, Sabine [Verfasser], and Christian [Akademischer Betreuer] Betzel. "Developing Methods towards the Structure Determination of Biological Particles using Crystallographic and Single Particle Imaging Techniques / Sabine Botha ; Betreuer: Christian Betzel." Hamburg : Staats- und Universitätsbibliothek Hamburg, 2018. http://d-nb.info/115890035X/34.

Full text
4

Kim, Leo Dhohoon. "Visualizing discourses and governance of human embryonic stem cell research in South Korea (in comparison to the UK)." Thesis, University of Sussex, 2016. http://sro.sussex.ac.uk/id/eprint/65434/.

Full text
Abstract:
This thesis investigates how the discourses and governance of human embryonic stem cell (hESC) research operated in South Korea. Comparing South Korea with the UK across three fields (government, newspapers, and public responses), and reflecting on scientific misconduct within the South Korean scientific community, the study seeks to identify hidden variables that influenced the national trajectory. To capture dynamic yet underrepresented national and cultural characteristics, the author analysed microscopic interactions, including actors' utterances, media framing, human relations, and strategies. By combining sociological approaches with semantic and social network analysis, concepts usually inferred and narrated by the researcher gain a visual and measurable representation in terms of Actor-Networks. The study concludes that the failure to institutionalise a sustainably cooperative research environment and (bio)ethical regulation in South Korea was an outcome of a lack of reflexive social discourse and deliberative governance. These national characteristics derived mainly from the subdued status of experts (scientists) within government and from predominant media framing that represented life science as a mere tool for economic development. More crucially, the public at large accepted this economy-oriented discourse. The semantic network analysis shows that public attitudes were constructed mainly from people's limited objective and desire to use science to pursue social status and economic development. South Koreans largely disregarded the possible threat that hESC research posed to women's bodies and the associated human-rights concerns.
A new scientific leadership should recognise this culturally embedded atmosphere and mediate more effectively among government, mass media, the lay public, and the scientific community by reconstituting the expert role, encouraging critical media framing of science, and broadening deliberation on the social function of scientific knowledge.
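The semantic-network step admits a toy illustration. This is an assumption about the general technique, not the author's actual pipeline or data: words that co-occur in a sentence become weighted edges, and the heaviest edges surface the dominant discourse linkages.

```python
# Minimal co-occurrence network sketch (illustrative sentences, not the
# thesis corpus): each sentence contributes an edge between every pair of
# its words, and edge weights count how often pairs co-occur.
from collections import Counter
from itertools import combinations

sentences = [
    "stem cell research drives economic development",
    "economic development needs stem cell research",
    "ethics concerns stem cell research",
]

edges = Counter()
for s in sentences:
    words = sorted(set(s.split()))          # deduplicate, fix pair ordering
    for a, b in combinations(words, 2):
        edges[(a, b)] += 1

# The heaviest edges reveal the dominant linkages in the toy corpus.
for pair, weight in edges.most_common(3):
    print(pair, weight)
```

Even this crude version shows how "economic development" can dominate a discourse map when it co-occurs with research terms more often than, say, ethics terms do.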
5

Roudot, Philippe. "Image processing methods for dynamical intracellular processes analysis in quantitative fluorescence microscopy." Thesis, Rennes 1, 2014. http://www.theses.fr/2014REN1S025/document.

Full text
Abstract:
In the first part of this manuscript we study the instrumentation required for quantification in frequency-domain fluorescence lifetime imaging microscopy (FD FLIM). An FD FLIM measurement is defined as a series of images with sinusoidal intensity variations, and the fluorescence lifetime as the nanosecond-scale delay between excitation and emission of fluorescence. We propose two main contributions in this area: a model of the image formation process and of the noise introduced by the acquisition system (ICCD sensor), and a robust statistical method for lifetime estimation on moving structures and intracellular vesicles. The second part presents a contribution to the tracking of multiple particles exhibiting heterogeneous transport in dense conditions. We focus on the switching between confined diffusion in the cytosol and motor-mediated active transport in random directions. We show that current multiple-model filtering and gating strategies fail to estimate unpredictable transitions between Brownian and directed displacements. We propose a new algorithm, built on the u-track algorithm [Jaqaman et al., 2008], that runs a set of Kalman filters adapted to several motion types for each tracked object. The algorithm has been evaluated on simulated and real data (vimentin, virus). We show that our method outperforms competing methods in the targeted scenario, and also on more homogeneous types of dynamics challenged by density.
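The transition-detection problem the tracker addresses can be caricatured in a few lines. This is a deliberately minimal stand-in, not the thesis's algorithm (u-track runs full Kalman filters with gating and per-object model sets): each motion model is reduced here to a one-step predictor, and each step is labelled by whichever predictor explains the new position better.

```python
# Two competing one-step predictors for a 1-D particle track:
#   Brownian model: a random walk, so the best guess is "stay put";
#   directed model: constant velocity, so "keep the last displacement".
# Each step is assigned to whichever model predicts it with less error.

def classify_steps(track):
    """Label each step of a 1-D track as 'brownian' or 'directed'."""
    labels = []
    for i in range(2, len(track)):
        brownian_pred = track[i - 1]
        directed_pred = track[i - 1] + (track[i - 1] - track[i - 2])
        err_b = abs(track[i] - brownian_pred)
        err_d = abs(track[i] - directed_pred)
        labels.append("directed" if err_d < err_b else "brownian")
    return labels

# A toy track: jitter around 0, then a run of constant-velocity motion.
track = [0.0, 0.1, -0.1, 0.0, 1.0, 2.0, 3.0, 4.0]
print(classify_steps(track))
```

On this toy track the Brownian jitter and the directed run separate cleanly; the real difficulty motivating the thesis is doing so under noise, high particle density and unpredictable switching.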
6

Kratsch, Christina [Verfasser], Alice [Akademischer Betreuer] McHardy, Martin [Akademischer Betreuer] Lercher, and Martin [Akademischer Betreuer] Beer. "Computational methods to study phenotype evolution and feature selection techniques for biological data under evolutionary constraints / Christina Kratsch. Gutachter: Martin Lercher ; Martin Beer. Betreuer: Alice McHardy." Düsseldorf : Universitäts- und Landesbibliothek der Heinrich-Heine-Universität Düsseldorf, 2014. http://d-nb.info/1063085128/34.

Full text
8

Nagaraj, Nagarjuna [Verfasser], and Matthias [Akademischer Betreuer] Mann. "Developing mass spectrometry towards applications in clinical proteomics : improved sample preparation techniques and mass spectrometric methods for unbiased identification of proteome from clinical samples / Nagarjuna Nagaraj. Betreuer: Matthias Mann." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2010. http://d-nb.info/1015203140/34.

Full text
9

Ramstein, Gérard. "Application de techniques de fouille de données en Bio-informatique." Habilitation à diriger des recherches, Université de Nantes, 2012. http://tel.archives-ouvertes.fr/tel-00706566.

Full text
Abstract:
The research presented by the author concerns the application of knowledge discovery in databases (KDD) techniques to biology. Two major research themes in bioinformatics are addressed: the search for remote homologues in protein families, and transcriptome analysis. Remote homology search from protein sequences aims to discover new members of a protein family. Since such a family generally shares a biological function, identifying the family makes it possible to investigate the role of a protein sequence. Classifiers were developed to discriminate a particular protein superfamily, the cytokines. These proteins are involved in the immune system, and their study is of crucial therapeutic importance. Support Vector Machines (SVMs) were chosen, as this technique had given the most promising results for this type of application. An original classification method was designed, based on a preliminary step that discovers over-represented words in the family of interest. The benefit of this approach is that it uses a restricted dictionary of discriminative motifs, in contrast to techniques that use a global space of k-words. A comparison with the latter methods shows the relevance of this approach in terms of classification performance. The second contribution on this theme concerns the aggregation of classifiers based on grammatical swarms. This method aims to optimise the combination of classifiers according to models of social behaviour, in the manner of genetic optimisation algorithms. The second line of research deals with the analysis of transcriptome data.
The study of the transcriptome is of considerable importance, both for understanding the mechanisms of life and for clinical and pharmacological applications. Implicative analysis of association rules, initially developed by Régis Gras, was applied to transcriptome data. An original approach based on observation ranks was proposed. Two applications illustrate the relevance of this method: the selection of informative genes and the classification of tumours. Finally, a close collaboration with an INSERM team led by Rémi Houlgatte resulted in the enrichment of a software suite dedicated to DNA microarray data. This collection of tools, named MADTOOLS, aims at the integration of transcriptome data and support for meta-analysis. A major application of this suite uses public data on muscular pathologies. Meta-analysis, by relying on independent datasets, greatly improves the robustness of the results. The systematic study of these data revealed groups of recurrently co-expressed genes. These groups retain their discriminative property across datasets that are highly diverse in terms of species, diseases and experimental conditions. This study can obviously be generalised to all public transcriptome data, and it opens the way to a very large-scale approach to this type of data for the study of other human pathologies.
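The restricted-dictionary idea can be sketched as follows. The sequences, k-mer length and over-representation ratio below are invented, and the classifier itself (an SVM in the work described) is omitted: the sketch stops at building feature vectors over motifs over-represented in the family of interest, rather than over the full space of k-words.

```python
# Sketch of the feature-construction step: keep only motifs that are
# over-represented in the family of interest relative to a background set,
# then represent any sequence by its counts over that restricted dictionary.
from collections import Counter

def kmers(seq, k=3):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def overrepresented(family, background, k=3, ratio=2.0):
    """Motifs at least `ratio` times more frequent in the family."""
    fam = Counter(m for s in family for m in kmers(s, k))
    bg = Counter(m for s in background for m in kmers(s, k))
    fam_total = sum(fam.values()) or 1
    bg_total = sum(bg.values()) or 1
    return sorted(m for m, c in fam.items()
                  if c / fam_total >= ratio * (bg.get(m, 0) / bg_total + 1e-9))

def features(seq, dictionary, k=3):
    """Count vector of a sequence over the restricted motif dictionary."""
    counts = Counter(kmers(seq, k))
    return [counts[m] for m in dictionary]

family = ["CYSCYS", "AACYSC"]      # toy 'family' sequences (invented)
background = ["AAAAAA", "GGGAAA"]  # toy background set (invented)
motifs = overrepresented(family, background)
print(motifs)
print(features("CYSAAA", motifs))
```

The resulting short, discriminative vectors are what a downstream classifier such as an SVM would consume; the dictionary restriction is what keeps the feature space small.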
10

Denolin, Vincent. "Sources of contrast and acquisition methods in functional MRI of the human brain." Doctoral thesis, Université Libre de Bruxelles, 2002. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/211408.

Full text
Abstract:

Functional Magnetic Resonance Imaging (fMRI) has developed considerably since its discovery in the early 1990s. Most often based on the BOLD (Blood Oxygenation Level Dependent) effect, this technique provides entirely non-invasive maps of brain activation, with better spatial and temporal resolution than pre-existing methods such as positron emission tomography (PET). Easily carried out with the NMR scanners available in hospitals, it has led to numerous applications in neuroscience and in the study of brain pathologies.

It is now well established that the BOLD effect is due to an increase in the oxygenation of venous blood in the brain regions where neuronal activation occurs, implying a decrease in the magnetic susceptibility difference between blood and the surrounding tissues (deoxyhaemoglobin being paramagnetic and oxyhaemoglobin diamagnetic), and consequently an increase in signal if the acquisition method is sensitive to magnetic field inhomogeneities. However, many unknowns remain concerning the mechanisms that link variations in oxygenation, blood flow and blood volume to the observed signal increase, and the dependence of the phenomenon on parameters such as field strength, spatial resolution and the type of NMR sequence used. The first part of the thesis is therefore devoted to the study of the BOLD effect, in the particular case of the contributions of draining veins in gradient-echo sequences made motion-sensitive by the addition of field gradients. The model developed shows that, contrary to the behaviour suggested by previous publications, the effect of these gradients is not a monotonic decrease of the signal difference as gradient strength increases. Large oscillations are produced by the phase effect due to the displacement of blood spins in the additional gradients, and by the variation of this phase following the increase in blood flow. Experimental validation of the model is carried out with the PRESTO sequence (Principles of Echo-Shifting combined with a Train of Observations), i.e. a gradient-echo sequence in which additional gradients increase the sensitivity to field inhomogeneities, and hence to the BOLD effect. Qualitative agreement with theory is established by showing that the observed signal variation can increase when the additional gradients are intensified.

Another source of continuing debate in fMRI concerns the optimisation of acquisition methods, notably with regard to their sensitivity to the BOLD effect, their spatial and temporal resolution, their sensitivity to various artefacts such as signal loss in regions with large-scale field inhomogeneities, and the contamination of activation maps by contributions from large veins, which can be distant from the actual site of activation. Spin-echo sequences are known to be less sensitive to the last two problems, which is why the second part of the thesis is devoted to a new technique that gives images a T2 rather than T2* weighting. The basic principle of the method is not new, since it is 'T2 preparation' (T2prep), which attenuates the longitudinal magnetisation differently according to the value of the T2 relaxation time, but it had never been applied to fMRI. Its advantages over other hybrid T2/T2* methods are mainly the gains in temporal resolution and in electromagnetic energy deposition in the tissues. The contrast generated by these sequences is studied by means of steady-state solutions of the Bloch equations. Predictions are made for the BOLD contrast on the basis of these steady-state solutions and of a simplified description of the BOLD effect in terms of variations of T2 and T2*. A method is proposed to keep the signal constant across the pulse train by varying the flip angle from one pulse to the next, which reduces blurring in the images. In vitro experiments show excellent quantitative agreement with the theoretical predictions for the measured signal intensities, both for the constant flip angle and for the series of variable angles.
Visual-cortex activation experiments demonstrate the feasibility of fMRI with T2prep sequences and confirm the theoretical predictions for the signal variation caused by activation.

The third part of the thesis is the logical continuation of the first two, since it is devoted to an extension of the echo-shifting principle to steady-state spin-echo sequences, which makes it possible to obtain strong T2 and T2* weighting while maintaining a short repetition time, and hence good temporal resolution. A thorough theoretical analysis of signal formation in such sequences is presented. It is based partly on the technique for solving the Bloch equations used in the second part, which consists of calculating the steady-state magnetisation as a function of the precession angles in the transverse plane, and then integrating over the isochromats to obtain the resulting signal of a voxel (volume element). The problem is also considered from the angle of 'coherence pathways', i.e. the subdivision of the signal into components that are more or less dephased by the combined effect of the RF pulses, the applied gradients and the inhomogeneities of the main magnetic field. This approach makes it possible to interpret the signal intensity in echo-shifted sequences as the result of destructive interference between various physically interpretable components, and to understand how varying the phase of the excitation pulse (RF spoiling) eliminates this interference. In vitro experiments show excellent quantitative agreement with the theoretical calculations, and the feasibility of the method in vivo is established. It is not yet possible to draw conclusions about the applicability of the new method to fMRI, but the proposed theoretical approach has in any case allowed a thorough re-examination of the signal-formation mechanisms of all echo-shifted methods, since the gradient-echo case turns out to be completely similar to the spin-echo case.

The thesis thus progresses from the modelling of the BOLD effect towards sequence design, thereby addressing two fundamental aspects of the physics of fMRI.
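For orientation, the simplest steady-state solution of the Bloch equations, the spoiled gradient-echo (Ernst) signal, shows how a steady state ties signal to flip angle and relaxation. This is a generic textbook formula, not the thesis's richer analysis (which treats transverse coherences and echo shifting), and the TR and T1 values below are assumed for illustration only.

```python
# Steady-state spoiled gradient-echo signal (Ernst formula):
#   S(alpha) = sin(alpha) * (1 - E1) / (1 - E1 * cos(alpha)),  E1 = exp(-TR/T1)
# The signal is maximised at the Ernst angle, alpha_E = acos(E1).
import math

def ernst_signal(alpha_deg, tr_ms, t1_ms):
    e1 = math.exp(-tr_ms / t1_ms)
    a = math.radians(alpha_deg)
    return math.sin(a) * (1 - e1) / (1 - e1 * math.cos(a))

def ernst_angle(tr_ms, t1_ms):
    """Flip angle (degrees) maximising the steady-state signal."""
    return math.degrees(math.acos(math.exp(-tr_ms / t1_ms)))

tr, t1 = 30.0, 1000.0  # assumed, typical-order values (not from the thesis)
best = ernst_angle(tr, t1)
print(round(best, 1))  # Ernst angle in degrees
print(ernst_signal(best, tr, t1) > ernst_signal(best + 10, tr, t1))
```

Even this minimal model conveys why varying the flip angle through a pulse train, as the thesis does for T2prep, reshapes the signal: the steady state depends on the whole excitation history, not on each pulse in isolation.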


Doctorate in applied sciences
info:eu-repo/semantics/nonPublished

11

Mieth, Bettina [Verfasser], Klaus-Robert [Akademischer Betreuer] Müller, Klaus-Robert [Gutachter] Müller, Arcadi [Gutachter] Navarro, and Peter [Gutachter] Martus. "Combining traditional methods with novel machine learning techniques to understand the translation of genetic code into biological function / Bettina Mieth ; Gutachter: Klaus-Robert Müller, Arcadi Navarro, Peter Martus ; Betreuer: Klaus-Robert Müller." Berlin : Technische Universität Berlin, 2021. http://d-nb.info/1240309414/34.

Full text
12

Paley, Christopher John. "Network methods in evolutionary biology." Thesis, University of Cambridge, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.611209.

Full text
13

Nalbandian, Christopher John. "Catalytic Methods for Chemical Biology." Thesis, University of California, San Diego, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10812166.

Full text
Abstract:

Inspired by atropisomerism and by examples of enantiomeric compounds displaying differential biology, I set out to develop a late-stage regioselective functionalization of known kinase inhibitors that exhibit atropisomerism. Regioselective functionalization of these atropisomeric scaffolds is pivotal to increasing their barrier to rotation, rendering them isolable enantiomers. Chapter 1 explores the chemical methods developed to achieve this transformation. A mild, late-stage regioselective chlorination, and more broadly halogenation, of several diverse aromatics and of the target atropisomeric kinase inhibitor was achieved by employing phosphine sulfide Lewis base catalysts. Encouraged by this mild catalytic approach to functionalizing electron-rich aromatics and heterocycles, I investigated whether similar conditions could be extended to aromatic sulfenylation. Chapter 2 explores aromatic sulfenylation and the development of several bifunctional Lewis base-Bronsted acid catalysts to effect a mild sulfenylation of electron-rich aza-heterocycles. The sulfenylation reagents employed were diverse, and the notable azido group was incorporated into one of them. This advance was applied to bioactives and peptides. When functionalizing di-substituted and tetra-substituted peptides, our sulfenylation conditions proved compatible with other electron-rich side chains, including the amino acids histidine and tyrosine. Though selective, this methodology is limited in scope to electron-rich aza-heterocycles. This limitation was overcome by employing more electron-rich selenoether Lewis base catalysts together with catalytic acid, as described in Chapter 3. The findings in Chapter 3 improve on mild sulfenylation methodologies by broadening the reaction scope and uncovering key mechanistic findings.
The SCF3 group was applied to three FDA-approved drugs, with significant increases in reactivity when a catalytic selenoether Lewis base and catalytic acid are present. A comparison of electron-rich and electron-poor sulfenylation reagents corroborates the kinetic findings of this project.

14

Zeldin, Robert Oliver. "Methods development for structural biology." Thesis, University of Oxford, 2013. https://ora.ox.ac.uk/objects/uuid:6f86c710-c507-405c-b58f-cd2eb6264eca.

Full text
Abstract:
Two research questions are investigated here: the first, major, section addresses the problem of uneven distributions of dose (absorbed energy per unit mass) in crystals used for macromolecular crystallography (MX), and the second presents the development of a high-throughput metalloprotein characterisation technique, HT microPIXE. In MX, the advent of X-ray microbeam data collection has led to uneven distributions of dose within the crystal volume becoming increasingly common. In these cases, the rotation method creates a highly damaged central region of crystal that stays within the beam throughout exposure, and less damaged outer regions, which are introduced during rotation. This thesis presents a new software program, raddose-3d, which performs a full 3D simulation of the profile of absorbed energy (the dose state) within a crystal during X-ray exposure. In order to utilise this time resolved, 3D picture of the dose state of the crystal, a new metric – Diffraction Weighted Dose – is proposed. This metric is then experimentally validated, and is found to summarise the dose state into a single dose value, which reflects the damage state of the crystal. Simulations are performed using raddose-3d and Diffraction Weighted Dose to compare possible dose spreading strategies, and generalised recommendations for MX experimentalists are offered. Uniquely identifying the species and stoichiometry of bound metals in protein samples is a significant challenge for biophysical characterisation. Low throughput microbeam Proton Induced X-ray Emission (microPIXE) provides an unambiguous analysis of these properties, but has a limited throughput of ∼10 samples per day. As a consequence, its applicability has been restricted to niche cases.
This thesis presents significant progress, including proof of principle experiments, on developing sample preparation methods, data acquisition systems, and data analysis protocols to increase this throughput by an order of magnitude, opening up major new applications for the technique.
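The Diffraction Weighted Dose idea can be illustrated with a toy one-dimensional version. This is a drastic simplification of what raddose-3d does (the real program simulates a full time-resolved 3D dose profile), and the half-dose constant below is an assumed number, not one from the thesis: each voxel's accumulated dose is weighted by the diffracting power it still retains.

```python
# Toy 1-D illustration of a diffraction-weighted dose summary metric.
# Assumption: diffracting power decays exponentially with dose, with an
# invented "half dose" of 15 MGy. Heavily damaged voxels contribute little
# to the measured diffraction, so they are down-weighted in the average.
import math

HALF_DOSE_MGY = 15.0  # assumed decay constant for illustration

def diffraction_efficiency(dose_mgy):
    """Fraction of diffracting power remaining after `dose_mgy`."""
    return math.exp(-math.log(2) * dose_mgy / HALF_DOSE_MGY)

def diffraction_weighted_dose(voxel_doses):
    """Dose averaged over voxels, weighted by remaining diffraction."""
    weights = [diffraction_efficiency(d) for d in voxel_doses]
    return sum(d * w for d, w in zip(voxel_doses, weights)) / sum(weights)

# A microbeam hits the central voxel hard; rotation brings in fresh edges.
voxel_doses = [2.0, 10.0, 40.0, 10.0, 2.0]
dwd = diffraction_weighted_dose(voxel_doses)
print(round(dwd, 2))
print(dwd < sum(voxel_doses) / len(voxel_doses))  # DWD sits below the plain mean
```

The point the metric captures is exactly this asymmetry: a burnt-out central region no longer reports its own huge dose in the diffraction pattern, so a plain mean dose overstates the damage state that the data actually reflect.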
APA, Harvard, Vancouver, ISO, and other styles
15

Beal, Craig Rubidge. "Improved rehomogenization techniques for nodal methods." Thesis, Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/19277.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Gimati, Yousef M. T. "Bootstrapping techniques to improve classification methods." Thesis, University of Leeds, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.401072.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Inzunza, José. "New micromanipulative techniques in reproductive biology /." Stockholm, 2003. http://diss.kib.ki.se/2003/91-7349-568-9/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Ma, Yingnan. "Intelligent energy management system : techniques and methods." Thesis, City University London, 2011. http://openaccess.city.ac.uk/1212/.

Full text
Abstract:
Our environment is an asset to be managed carefully and is not an expendable resource to be taken for granted. The main original contribution of this thesis is in formulating intelligent techniques and simulating case studies to demonstrate the significance of the present approach for achieving a low carbon economy. Energy boosts crop production, drives industry and increases employment. Wise energy use is the first step to ensuring sustainable energy for present and future generations. Energy services are essential for meeting internationally agreed development goals. The energy management system lies at the heart of all infrastructures, from communications and the economy to transportation, which has made the system more complex and more interdependent. The increasing number of disturbances occurring in the system has raised the priority of the energy management infrastructure, which has been improved with the aid of technology and investment; suitable methods are presented in this thesis to optimize the system. Since the current system faces various problems, including increasing disturbances, operation at its limits, aging equipment and load changes, an improvement is essential to minimize them. To enhance the current system and resolve the issues it is facing, the smart grid has been proposed as a solution to power problems and a means of preventing future failures. This thesis argues that the smart grid, consisting of computational intelligence and smart meters, improves the reliability, stability and security of power. In comparison with the current system, it is more intelligent, reliable, stable and secure, and will reduce the number of blackouts and other failures that occur on the power grid. The thesis also reports that smart metering is technically feasible for improving energy efficiency.
In the thesis, a new technique using wavelet transforms, a floating-point genetic algorithm and an artificial neural network based hybrid model for accurate short-term load forecasting has been developed. The new model is more accurate than a radial basis function network. Actual data have been used to test the proposed method, demonstrating that this integrated intelligent technique is very effective for load forecasting. Choosing the appropriate algorithm is important for implementing optimization in the daily operation of the power system. The potential for applying swarm intelligence to Optimal Reactive Power Dispatch (ORPD) is shown in this thesis. After comparing the results derived from swarm intelligence, an improved genetic algorithm and a conventional gradient-based optimization method, it is concluded that swarm intelligence is better in terms of performance and precision in solving optimal reactive power dispatch problems.
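Swarm intelligence methods such as particle swarm optimisation move a population of candidate solutions toward their personal and global bests. A minimal global-best sketch follows; the stand-in objective and coefficient values are illustrative, not the thesis's ORPD formulation, where the objective would be network loss as a function of reactive power settings:

```python
import random

def pso(objective, dim, bounds, n_particles=20, iters=200, seed=1):
    """Minimal particle swarm optimiser with a global-best topology."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in convex objective in place of a real power-flow loss function.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5.0, 5.0))
```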
APA, Harvard, Vancouver, ISO, and other styles
19

Dunbar, Charles David. "Methods and techniques for valuation of patents." CSUSB ScholarWorks, 2003. https://scholarworks.lib.csusb.edu/etd-project/2306.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Pilát, Zdeněk. "Optical Micromanipulation Techniques Combined with Microspectroscopic Methods." Doctoral thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2015. http://www.nusl.cz/ntk/nusl-234266.

Full text
Abstract:
This dissertation deals with the combination of optical micromanipulation and microspectroscopic methods. We used laser tweezers for the transport and sorting of living microorganisms such as unicellular algae and yeast. Using Raman spectroscopy, we analysed the chemical composition of individual cells and used this information for the automatic selection of cells with chosen properties. We combined pulse amplitude modulated fluorescence microspectroscopy, optical micromanipulation and other techniques to map the stress response of optically trapped cells at various exposure times, wavelengths and intensities of the trapping laser. We fabricated various types of microfluidic chips and constructed Raman tweezers for sorting micro-objects, above all living cells, in a microfluidic environment.
APA, Harvard, Vancouver, ISO, and other styles
21

BAGOZI, ADA. "METHODS AND TECHNIQUES FOR BIG DATA EXPLORATION." Doctoral thesis, Università degli studi di Brescia, 2021. http://hdl.handle.net/11379/554944.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Miller, David J. Ghosh Avijit. "New methods in computational systems biology /." Philadelphia, Pa. : Drexel University, 2008. http://hdl.handle.net/1860/2810.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Li, Limin, and 李丽敏. "Machine learning methods for computational biology." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2010. http://hub.hku.hk/bib/B44546749.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Selega, Alina. "Computational methods for RNA integrative biology." Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/29630.

Full text
Abstract:
Ribonucleic acid (RNA) is an essential molecule, which carries out a wide variety of functions within the cell, from its crucial involvement in protein synthesis to catalysing biochemical reactions and regulating gene expression. Such a diverse functional repertoire is owed to the complex structures that RNA can adopt and its flexibility as an interacting molecule. It has become possible to experimentally measure these two crucial aspects of RNA's regulatory role with such technological advancements as next-generation sequencing (NGS). NGS methods can rapidly obtain the nucleotide sequence of many molecules in parallel. Designing experiments where only the desired parts of the molecule (or specific parts of the transcriptome) are sequenced allows the study of various aspects of RNA biology. Analysis of NGS data is intractable without computational methods. One such experimental method is RNA structure probing, which aims to infer RNA structure from sequencing chemically altered transcripts. RNA structure probing data is inherently noisy, affected both by technological biases and the stochasticity of the underlying process. Most existing methods do not adequately address the issue of noise, resorting to heuristics and limiting the informativeness of their output. In this thesis, a statistical pipeline was developed for modelling RNA structure probing data, which explicitly captures biological variability, provides automated bias-correcting strategies, and generates a probabilistic output based on experimental measurements. The output of our method agrees with known RNA structures, can be used to constrain structure prediction algorithms, and remains robust to reduced sequence coverage, thereby increasing the sensitivity of the technology. Another recent experimental innovation maps RNA-protein interactions at very high temporal resolution, making it possible to study rapid binding events happening on a minute time scale.
In this thesis, a non-parametric algorithm was developed for identifying significant changes in RNA-protein binding time-series between different conditions. The method was applied to novel yeast RNA-protein binding time-course data to study the role of RNA degradation in stress response. It revealed pervasive changes in the binding to the transcriptome of the yeast transcription termination factor Nab3 and the cytoplasmic exoribonuclease Xrn1 under nutrient stress. This challenged the common assumption of viewing transcriptional changes as the major driver of changes in RNA expression during stress and highlighted the importance of degradation. These findings inspired a dynamical model for RNA expression, where transcription and degradation rates are modelled using RNA-protein binding time-series data.
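The thesis's change-detection algorithm is more elaborate, but the core non-parametric idea, assessing a difference between conditions without distributional assumptions by shuffling condition labels, can be illustrated with a generic two-sample permutation test (the data here are invented):

```python
import random

def permutation_test(xs, ys, n_perm=5000, seed=0):
    """Two-sample permutation test on the difference of means: shuffle the
    pooled measurements many times and ask how often a label-shuffled
    difference is at least as large as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(xs) / len(xs) - sum(ys) / len(ys))
    pooled = list(xs) + list(ys)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(xs)], pooled[len(xs):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one-smoothed p-value

# Invented binding-signal measurements under two conditions.
control = [1.0, 1.2, 0.9, 1.1, 1.0, 0.95]
stress = [2.0, 2.3, 1.9, 2.1, 2.2, 2.05]
p = permutation_test(control, stress)
```

A small p-value indicates that the separation between conditions is unlikely under random relabelling.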
APA, Harvard, Vancouver, ISO, and other styles
25

Zagordi, Osvaldo. "Statistical physics methods in computational biology." Doctoral thesis, SISSA, 2007. http://hdl.handle.net/20.500.11767/3971.

Full text
Abstract:
The interest of statistical physics in combinatorial optimization is not new; it suffices to think of such a famous tool as simulated annealing. Recently, it has also resorted to statistical inference to address some "hard" optimization problems, developing a new class of message passing algorithms. Three applications to computational biology are presented in this thesis, namely: 1) Boolean networks, a model for gene regulatory networks; 2) haplotype inference, to study the genetic information present in a population; 3) clustering, a general machine learning tool.
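As a concrete instance of the statistical-physics toolbox named above, simulated annealing accepts uphill moves with probability exp(-ΔE/T) while the temperature T is gradually lowered. A generic sketch on a toy bit-flip problem follows (the problem instance is illustrative, not one of the thesis's applications):

```python
import math
import random

def simulated_annealing(energy, neighbour, x0, t0=10.0, cooling=0.995,
                        steps=5000, seed=42):
    """Generic simulated annealing: accept worse states with probability
    exp(-dE/T), lowering T geometrically, and track the best state seen."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best, best_e = x, e
    t = t0
    for _ in range(steps):
        y = neighbour(x, rng)
        de = energy(y) - e
        if de <= 0 or rng.random() < math.exp(-de / t):
            x, e = y, e + de
            if e < best_e:
                best, best_e = x, e
        t *= cooling
    return best, best_e

# Toy problem: flip bits to match a target; energy = Hamming distance.
target = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
energy = lambda s: sum(a != b for a, b in zip(s, target))
neighbour = lambda s, rng: [b ^ 1 if i == rng.randrange(len(s)) else b
                            for i, b in enumerate(s)]
best, best_e = simulated_annealing(energy, neighbour, [0] * len(target))
```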
APA, Harvard, Vancouver, ISO, and other styles
26

Jung, Min Kyung. "Statistical methods for biological applications." [Bloomington, Ind.] : Indiana University, 2007. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3278454.

Full text
Abstract:
Thesis (Ph.D.)--Indiana University, Dept. of Mathematics, 2007.
Source: Dissertation Abstracts International, Volume: 68-10, Section: B, page: 6740. Adviser: Elizabeth A. Housworth. Title from dissertation home page (viewed May 20, 2008).
APA, Harvard, Vancouver, ISO, and other styles
27

Marin, Oana. "Boundary integral methods for Stokes flow : Quadrature techniques and fast Ewald methods." Doctoral thesis, KTH, Skolan för teknikvetenskap (SCI), 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-105540.

Full text
Abstract:
Fluid phenomena dominated by viscous effects can, in many cases, be modeled by the Stokes equations. The boundary integral form of the Stokes equations reduces the number of degrees of freedom in a numerical discretization by reformulating the three-dimensional problem to two-dimensional integral equations to be discretized over the boundaries of the domain. Hence for the study of objects immersed in a fluid, such as drops or elastic/solid particles, integral equations are to be discretized over the surfaces of these objects only. As outer boundaries or confinements are added, these must also be included in the formulation. An inherent difficulty in the numerical treatment of boundary integrals for Stokes flow is the integration of the singular fundamental solution of the Stokes equations, i.e. the so-called Stokeslet. To alleviate this problem we developed a set of high-order quadrature rules for the numerical integration of the Stokeslet over a flat surface. Such a quadrature rule was first designed for singularities of the type . To assess the convergence properties of this quadrature rule a theoretical analysis has been performed. The slightly more complicated singularity of the Stokeslet required certain modifications of the integration rule developed for . An extension of this type of quadrature rule to a cylindrical surface is also developed. These quadrature rules are also tested on physical problems that have an analytic solution in the literature. Another difficulty associated with boundary integral problems is introduced by periodic boundary conditions. For a set of particles in a periodic domain, periodicity is imposed by requiring that the motion of each particle has an added contribution from all periodic images of all particles all the way up to infinity. This leads to an infinite sum which is not absolutely convergent, and an additional physical constraint which removes the divergence needs to be imposed.
The sum is decomposed into two fast-converging sums, one that handles the short range interactions in real space and the other that sums up the long range interactions in Fourier space. Such decompositions are already available in the literature for kernels that are commonly used in boundary integral formulations. Here a decomposition into faster-decaying sums than the ones present in the literature is derived for the periodic kernel of the stress tensor. However, the computational complexity of the sums, regardless of the decomposition they stem from, is . This complexity can be lowered using a fast summation method, as introduced here for simulating a sedimenting fiber suspension. The fast summation method was initially designed for point particles, which could be used for fibers discretized numerically almost without any changes. However, when two fibers are very close to each other, analytical integration is used to eliminate numerical inaccuracies due to the nearly singular behavior of the kernel, and the real space part in the fast summation method was modified to allow for this analytical treatment. The method we have developed for sedimenting fiber suspensions allows for simulations in large periodic domains and we have performed a set of such simulations at a larger scale (larger domain/more fibers) than previously feasible.

QC 20121122
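The real-space/Fourier-space splitting can be illustrated with the classical scalar case: 1/r decomposes into a rapidly decaying erfc(αr)/r part, summed directly, and a smooth erf(αr)/r remainder, summed in Fourier space. The thesis derives the analogous, more involved, decomposition for the periodic stress-tensor kernel; the sketch below only shows the scalar principle:

```python
import math

def short_range(r, alpha):
    """Rapidly decaying real-space part of the 1/r kernel."""
    return math.erfc(alpha * r) / r

def long_range(r, alpha):
    """Smooth remainder, which an Ewald method sums in Fourier space."""
    return math.erf(alpha * r) / r

# The two parts recombine exactly to 1/r; the splitting parameter alpha
# trades work between the real-space and Fourier-space sums.
alpha = 2.0
r = 1.3
total = short_range(r, alpha) + long_range(r, alpha)
```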

APA, Harvard, Vancouver, ISO, and other styles
28

Jiang, Hao, and 姜昊. "Construction and computation methods for biological networks." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hub.hku.hk/bib/B50662144.

Full text
Abstract:
Biological systems are complex in that they comprise a large number of interacting entities, and their dynamics follow mechanistic regulation of movement and of the organisation of biological function. Computational modeling is an established and powerful approach to studying and manipulating biologically relevant systems. The inner structure and behavior of complex biological systems can be analyzed and understood through computable biological networks. In this thesis, models and computation methods are proposed for biological networks. The study of Genetic Regulatory Networks (GRNs) is an important research topic in genomic research. Several promising techniques have been proposed for capturing the behavior of gene regulation in biological systems. One of the promising models for GRNs, the Boolean Network (BN), has gained a lot of attention. However, little light has been shed on the internal connection between the dynamics of biological molecules and network systems. Inference and completion problems of a BN from a given set of singleton attractors are considered important in understanding the relationship between the dynamics of biological molecules and network systems. Discrete dynamic systems models have recently been proposed to model time-course microarray measurements of genes, but delay effects should also be modeled as a realistic factor in studying GRNs. A delay discrete dynamic systems model is therefore developed to model GRNs. Inference and analysis of networks is one of the grand challenges in modern statistical biology. Machine learning methods, in particular the Support Vector Machine (SVM), have been successfully applied in predicting internal connections embedded in networks. Kernels in conjunction with SVMs demonstrate strong ability in performing various tasks such as biomedical diagnosis, function prediction and motif extraction. In biomedical diagnosis, data sets are often high dimensional, which poses a challenging research problem in the machine learning area.
Novel kernels using distance metrics that are uncommon in the machine learning framework are proposed for the tumor differentiation problem. The protein function prediction problem is a hot topic in bioinformatics. The K-spectrum kernel is among the most popular models for describing protein sequences. Taking the positive semi-definiteness required in kernel construction into consideration, an Eigen-matrix translation technique is introduced in a novel kernel formulation to give better prediction results. In a further step, the power of the Eigen-matrix translation technique in feature selection is demonstrated through mathematical formulation. Due to the structural complexity of carbohydrates, the study of carbohydrate sugar chains has lagged behind that of DNA and proteins. A weighted q-gram kernel is constructed for classifying glycan structures, with limitations in feature extraction. A biochemically-weighted tree kernel is then proposed to enhance the ability in both classification and motif extraction. Finally, the problem of metabolite biomarker discovery is researched. Human diseases, in particular metabolic diseases, can be directly caused by the lack of essential metabolites. Identification of metabolite biomarkers is of significant importance in the study of biochemical reaction and signaling networks. A promising computational approach is proposed to identify metabolic biomarkers through integrating biomedical data and disease-specific gene expression data.
Mathematics
Doctoral
Doctor of Philosophy
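The K-spectrum kernel mentioned above compares two sequences through their shared length-k substrings. A minimal sketch follows (without the Eigen-matrix translation step the thesis adds on top):

```python
from collections import Counter

def k_spectrum(seq, k):
    """Count all length-k substrings (k-mers) of a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(a, b, k):
    """K-spectrum kernel: inner product of the two k-mer count vectors."""
    sa, sb = k_spectrum(a, k), k_spectrum(b, k)
    return sum(sa[m] * sb[m] for m in sa if m in sb)

# Two toy protein fragments sharing the 2-mers MK, KV and AA.
k1 = spectrum_kernel("MKVLAA", "MKVQAA", 2)
```

Because it is an inner product of explicit count vectors, the kernel is positive semi-definite by construction and can be plugged directly into an SVM.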
APA, Harvard, Vancouver, ISO, and other styles
29

Ljungkvist, Karl. "Techniques for finite element methods on modern processors." Licentiate thesis, Uppsala universitet, Avdelningen för beräkningsvetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-242186.

Full text
Abstract:
In this thesis, methods for efficient utilization of modern computer hardware for numerical simulation are considered. In particular, we study techniques for speeding up the execution of finite-element methods. One of the greatest challenges in finite-element computation is how to perform the system matrix assembly efficiently in parallel, due to its complicated memory access pattern. The main difficulty lies in the fact that many entries of the matrix are being updated concurrently by several parallel threads. We consider transactional memory, an exotic hardware feature for concurrent update of shared variables, and conduct benchmarks on a prototype processor supporting it. Our experiments show that transactions can both simplify programming and provide good performance for concurrent updates of floating point data. Furthermore, we study a matrix-free approach to finite-element computation which avoids the matrix assembly. Motivated by its computational properties, we implement the matrix-free method for execution on graphics processors, using either atomic updates or a mesh coloring approach to handle the concurrent updates. A performance study shows that on the GPU, the matrix-free method is faster than a matrix-based implementation for many element types, and allows for solution of considerably larger problems. This suggests that the matrix-free method can speed up execution of large realistic simulations.
UPMARC
eSSENCE
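The mesh coloring approach mentioned above partitions elements so that no two elements sharing a node carry the same colour; all elements of one colour can then update the global system concurrently without conflicting writes. A greedy sketch on a toy 1D mesh (illustrative, not the thesis's GPU implementation):

```python
def colour_elements(elements):
    """Greedy colouring: give two elements that share a mesh node different
    colours, so each colour class can be assembled in parallel without
    atomics or transactions."""
    colours = {}
    node_colours = {}  # node -> set of colours already touching that node
    for e, nodes in enumerate(elements):
        used = set()
        for n in nodes:
            used |= node_colours.get(n, set())
        c = 0
        while c in used:
            c += 1
        colours[e] = c
        for n in nodes:
            node_colours.setdefault(n, set()).add(c)
    return colours

# Four linear elements on a 1D mesh with nodes 0..4; neighbours share a node.
elements = [(0, 1), (1, 2), (2, 3), (3, 4)]
colours = colour_elements(elements)
```

On this chain mesh two colours suffice, so assembly proceeds in two conflict-free parallel sweeps.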
APA, Harvard, Vancouver, ISO, and other styles
30

Frohm, Anna. "Patellar tendinopathy : on evaluation methods and rehabilitation techniques /." Stockholm, 2006. http://diss.kib.ki.se/2006/91-7140-994-7/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

De, Rosa Mattia. "New methods, techniques and applications for sketch recognition." Doctoral thesis, Universita degli studi di Salerno, 2014. http://hdl.handle.net/10556/1460.

Full text
Abstract:
2012-2013
The use of diagrams is common in various disciplines. Typical examples include maps, line graphs, bar charts, engineering blueprints, architects’ sketches, hand drawn schematics, etc. In general, diagrams can be created either by using pen and paper, or by using specific computer programs. These programs provide functions to facilitate the creation of the diagram, such as copy-and-paste, but the classic WIMP interfaces they use are unnatural when compared to pen and paper. Indeed, it is not rare that a designer prefers to use pen and paper at the beginning of the design, and then transfer the diagram to the computer later. To avoid this double step, a solution is to allow users to sketch directly on the computer. This requires both specific hardware and sketch recognition based software. As regards hardware, many pen/touch based devices such as tablets, smartphones, interactive boards and tables, etc. are available today, also at reasonable costs. Sketch recognition is needed when the sketch must be processed and not considered as a simple image, and it is crucial to the success of this new modality of interaction. It is a difficult problem due to the inherent imprecision and ambiguity of a freehand drawing and to the many domains of application. The aim of this thesis is to propose new methods and applications regarding sketch recognition. The presentation of the results is divided into several contributions, facing problems such as corner detection, sketched symbol recognition and autocompletion, graphical context detection, and sketched Euler diagram interpretation. The first contribution regards the problem of detecting the corners present in a stroke. Corner detection is often performed during preprocessing to segment a stroke into single simple geometric primitives such as lines or curves.
The corner recognizer proposed in this thesis, RankFrag, is inspired by the method proposed by Ouyang and Davis in 2011 and improves the accuracy percentages compared to other methods recently proposed in the literature. The second contribution is a new method to recognize multi-stroke hand drawn symbols, which is invariant with respect to scaling and supports symbol recognition independently from the number and order of strokes. The method is an adaptation of the algorithm proposed by Belongie et al. in 2002 to the case of sketched images. This is achieved by using stroke related information. The method has been evaluated on a set of more than 100 symbols from the Military Course of Action domain and the results show that the new recognizer outperforms the original one. The third contribution is a new method for recognizing multi-stroke partially hand drawn symbols which is invariant with respect to scale, and supports symbol recognition independently from the number and order of strokes. The recognition technique is based on subgraph isomorphism and exploits a novel spatial descriptor, based on polar histograms, to represent relations between two stroke primitives. The tests show that the approach gives a satisfactory recognition rate with partially drawn symbols, also with a very low level of drawing completion, and outperforms the existing approaches proposed in the literature. Furthermore, as an application, a system presenting a user interface to draw symbols and implementing the proposed autocompletion approach has been developed. Moreover a user study aimed at evaluating the human performance in hand drawn symbol autocompletion has been presented. Using the set of symbols from the Military Course of Action domain, the user study evaluates the conditions under which the users are willing to exploit the autocompletion functionality and those under which they can use it efficiently. 
The results show that the autocompletion functionality can be used in a profitable way, with a drawing time saving of about 18%. The fourth contribution regards the detection of the graphical context of hand drawn symbols, and in particular, the development of an approach for identifying attachment areas on sketched symbols. In the field of syntactic recognition of hand drawn visual languages, the recognition of the relations among graphical symbols is one of the first important tasks to be accomplished and is usually reduced to recognizing the attachment areas of each symbol and the relations among them. The approach is independent of the method used to recognize symbols and assumes that the symbol has already been recognized. The approach is evaluated through a user study aimed at comparing the attachment areas detected by the system to those devised by the users. The results show that the system can identify attachment areas with a reasonable accuracy. The last contribution is EulerSketch, an interactive system for the sketching and interpretation of Euler diagrams (EDs). The interpretation of a hand drawn ED produces two types of text encodings of the ED topology called static code and ordered Gauss paragraph (OGP) code, and a further encoding of its regions. Given the topology of an ED expressed through static or OGP code, EulerSketch automatically generates a new topologically equivalent ED in its graphical representation. [edited by author]
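The polar-histogram spatial descriptor used in the third contribution can be sketched generically as a histogram of point positions binned by angle and log-radius around a reference point. The bin layout below is an assumption for illustration, not the thesis's exact formulation:

```python
import math

def polar_histogram(ref, points, n_angle=8, n_radius=3, r_max=10.0):
    """Histogram of point positions relative to `ref`, binned by angle and
    log-spaced radius: a shape-context-style descriptor for encoding the
    spatial relation between stroke primitives."""
    hist = [[0] * n_angle for _ in range(n_radius)]
    for (x, y) in points:
        dx, dy = x - ref[0], y - ref[1]
        r = math.hypot(dx, dy)
        if r == 0 or r > r_max:
            continue
        a_bin = int(((math.atan2(dy, dx) + math.pi) / (2 * math.pi))
                    * n_angle) % n_angle
        # Log-spaced radial bins: bin grows with log2(r), capped at r_max.
        r_bin = min(n_radius - 1,
                    max(0, n_radius - 1 - int(math.log2(r_max / r))))
        hist[r_bin][a_bin] += 1
    return hist

# Three sample points of a second stroke seen from the origin.
pts = [(1.0, 0.0), (0.0, 2.0), (-4.0, 0.0)]
h = polar_histogram((0.0, 0.0), pts)
```

Two symbols can then be compared by matching such histograms, which makes the comparison robust to small local distortions of the strokes.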
XII n.s.
APA, Harvard, Vancouver, ISO, and other styles
32

Seelig, Johannes. "Optical methods for nanoscale investigations in biology /." Zürich : ETH, 2006. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=16856.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Cherry, Amanda M. "Methods to Characterize Orofacial Development." VCU Scholars Compass, 2018. https://scholarscompass.vcu.edu/etd/5484.

Full text
Abstract:
In this thesis, several techniques were combined to optimize, evaluate and characterize craniofacial development in Xenopus, with additional focus on understanding the alterations made during maturation in the craniofacial region and the cartilage. Three important techniques used were: confocal microscopy in conjunction with Acridine Orange (AO) labeling, Alcian Blue (AB) labeling, and geometric morphometric analysis. I found that facial width increased across all techniques used to evaluate it. Included within this focus was the study of the development of the ceratohyal (CH) cartilage, which supported the mouth and snout. This was also found to increase width-wise, in unison with facial and orofacial growth. These data may suggest a link between the face, mouth and CH growth, in which the developing cartilage elongates and widens, causing the increase seen in the width and distension of the mouth.
APA, Harvard, Vancouver, ISO, and other styles
34

Tegeder, Roland W. "Large deviations, Hamiltonian techniques and applications in biology." Thesis, University of Cambridge, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.308322.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Meer, Ralph Raymond. "Rapid methods for the detection of toxigenic Clostridium perfringens." Diss., The University of Arizona, 1996. http://hdl.handle.net/10150/290593.

Full text
Abstract:
Clostridium perfringens may be the most widely occurring bacterial pathogen and is responsible for a variety of diseases in both humans and animals. The virulence of this organism is associated with the ability to produce an estimated 17 potential exotoxins. The production of one or more of the four major toxins (α, β, ε, and ι) is the basis for placing isolates into five toxigenic types, A through E. Enterotoxin (CPE) is not used in typing but is considered a major virulence factor. A multiplex PCR genotyping assay was developed, utilizing primers derived from sequences of cpa, cpb, etx, iA, and cpe, yielding products of 324, 196, 655, 446, and 233 bp, respectively. Template for this assay was derived from individual colonies suspended in 200 μl of HPLC-grade water, boiled for 20 min or heated in a microwave oven for 10 min at 700 W. Included in the 50 μl reaction volume was 10 μl of template, 0.15 to 0.7 μM of each primer, 0.1 mM dNTPs, 2 mM MgCl₂, and 2 units of Taq DNA polymerase. The PCR products were examined by electrophoresis in a 1.5% agarose gel stained with EtBr. Correlation of genotype with toxin phenotype in strains examined by mouse inoculation was excellent, and it was possible to provide results rapidly, usually in < 4 h. An ELISA procedure was established for detection of β toxin produced by C. perfringens types B and C. The ELISA was used to differentiate Cpb⁺ from Cpb⁻ isolates grown in overnight broth cultures and to measure β toxin in commercial fermentations of type C organisms. In addition to the above assays, preliminary work was initiated on the development of a PCR procedure for quantitation of C. perfringens in clinical or environmental samples, and involved the construction of a 233 bp homologous, competitive mimic from a restriction digest of a 323 bp PCR product generated from cpa.
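The genotype-to-toxinotype call implied by this multiplex assay can be sketched from the amplicon sizes given above together with the standard typing rules (type A: α only; B: α, β, ε; C: α, β; D: α, ε; E: α, ι), with cpe reported separately as the enterotoxin marker. The size tolerance below is an illustrative assumption:

```python
# Standard C. perfringens toxinotyping rules, keyed by detected toxin genes.
TYPE_RULES = {
    frozenset({"cpa"}): "A",
    frozenset({"cpa", "cpb", "etx"}): "B",
    frozenset({"cpa", "cpb"}): "C",
    frozenset({"cpa", "etx"}): "D",
    frozenset({"cpa", "iA"}): "E",
}

# Expected amplicon sizes (bp) from the multiplex assay described above.
AMPLICONS = {"cpa": 324, "cpb": 196, "etx": 655, "iA": 446, "cpe": 233}

def genotype(detected_sizes, tol=5):
    """Call the toxinotype from observed product sizes (bp); returns the
    type letter and whether the cpe enterotoxin gene was detected."""
    genes = {g for g, size in AMPLICONS.items()
             if any(abs(s - size) <= tol for s in detected_sizes)}
    toxinotype = TYPE_RULES.get(frozenset(genes - {"cpe"}), "untypeable")
    return toxinotype, "cpe" in genes

t, has_cpe = genotype([324, 196, 655])  # cpa + cpb + etx
```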
APA, Harvard, Vancouver, ISO, and other styles
36

Burr, Mark Daniel 1949. "An evaluation of DNA fingerprinting methods for subtyping Salmonella." Diss., The University of Arizona, 1996. http://hdl.handle.net/10150/290630.

Full text
Abstract:
The use of DNA typing and fingerprinting methods to identify and discriminate strains of bacteria, including Salmonella, has increased dramatically in recent years. Traditional typing methods, including serotyping and phage typing, have often not adequately discriminated strains, nor have they always identified virulent or antibiotic resistant strains. In a literature review, DNA-based methods, including plasmid analysis, restriction fragment length polymorphism (RFLP) analysis, and polymerase chain reaction (PCR) fingerprinting methods were evaluated. Plasmid analysis, including plasmid profiles and plasmid fingerprints have been shown to be useful primarily in short-term investigations of disease outbreak. However, plasmid profiles or possession of individual plasmids have generally not been good indicators of cell phenotypes overall. RFLP fingerprinting of Salmonella utilizing probes from ribosomal DNA, insertion sequence IS200, or random sequences has been reported. Ribotypes detected with ribosomal probes have generally been shared among different serotypes, whereas IS200 profiles have tended to be more serotype-specific. AP PCR and rep-PCR primers have been shown to discriminate Salmonella isolates, but fingerprints have been more difficult to reproduce and interpret than RFLP fingerprints. Several authors have reported bands of varying intensities, and some faint bands have not been reproducible. Improved methods of resolving and detecting PCR products are necessary. In a laboratory study, 85 environmental Salmonella isolates belonging to 22 serotypes were fingerprinted by 16S RFLP ribotyping, by rep-PCR, using ERIC (enterobacterial repetitive intergenic consensus) primers, and by AP PCR. Ribotypes were shared by isolates from different serotypes. ERIC PCR and one AP PCR primer produced fingerprints that discriminated among the different isolates, but did not identify serotypes. 
Another AP PCR primer produced simple patterns that neither discriminated isolates, nor identified serotypes. In a second related laboratory study, computer-assisted matching of AP PCR fingerprints of several known isolates was evaluated. Aliquots of the PCR reaction were run in the same and different gels, and the fingerprint bands were scored by two technicians on a presence-absence basis, and matched by creating dendrograms. Although replicate fingerprints of an isolate appeared reproducible, they were not always scored identically. Thus, the computer was not always able to correctly match fingerprints.
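Matching presence-absence band patterns is typically done with a set-similarity coefficient before clustering fingerprints into dendrograms; a sketch using the Dice coefficient (the source does not specify which coefficient the software used, so this is an illustrative choice):

```python
def dice_similarity(bands_a, bands_b):
    """Dice coefficient between two presence/absence band sets:
    2 * |shared bands| / (total bands in both lanes)."""
    a, b = set(bands_a), set(bands_b)
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# Band positions (fragment sizes in bp) scored from two gel lanes;
# a scoring disagreement (600 vs 450) lowers the similarity.
lane1 = {1500, 1200, 900, 600, 400}
lane2 = {1500, 1200, 900, 450, 400}
s = dice_similarity(lane1, lane2)
```

This also illustrates the failure mode reported above: if two technicians score the same faint band differently, the computed similarity between replicate fingerprints drops below 1.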
APA, Harvard, Vancouver, ISO, and other styles
37

Hansen, Cristina M. "Novel methods of disease surveillance in wildlife." Thesis, University of Alaska Fairbanks, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3702799.

Full text
Abstract:

Both infectious and noninfectious disease agents in wildlife impact human health, and accurate research, monitoring, and diagnostic methods are necessary. The objectives of the research reported here were to develop and implement novel methods for bacterial and toxicological disease agent surveillance in wildlife. This dissertation begins with a review of tularemia, a zoonotic disease important to the state of Alaska and the Northern hemisphere. In chapter two, I show the development and implementation of broad-based PCR and quantitative PCR (qPCR) surveillance methods for bacterial DNA in tissue samples; 1298 tissue samples were assayed, numerous potential bacterial pathogens were identified, and qPCR detection limits were quantified for various tissue matrices. Chapter three describes an investigation into microbial infection as a source of embryo mortality in greater white-fronted geese (Anser albifrons) in Arctic Alaska. This chapter builds upon our previously developed PCR surveillance techniques, by which I demonstrated that bacterial infection is responsible for some greater white-fronted goose embryo mortality in Arctic Alaska. Chapter four describes the development and validation of a cellulose filter paper method for quantifying total mercury in whole blood. I determined that filter paper technology is useful for monitoring total mercury in whole blood, with excellent recoveries (82 - 95% of expected values) and R² values (0.95 - 0.97) when regressed against total mercury concentrations in whole blood measured by the technique generally considered the "gold standard" for mercury detection. These methods will aid in the accurate detection of disease agents in wildlife, as demonstrated by our white-fronted goose work.
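The reported recoveries and R² values come from regressing filter-paper results against the whole-blood reference method; the arithmetic can be sketched with invented paired measurements:

```python
def recovery_percent(measured, expected):
    """Recovery of an analyte: measured value as a percentage of expected."""
    return 100.0 * measured / expected

def r_squared(xs, ys):
    """Coefficient of determination for a simple linear fit of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# Hypothetical paired total-Hg measurements (ug/L): filter paper vs the
# whole-blood "gold standard" method.
blood = [1.0, 2.0, 4.0, 8.0, 16.0]
paper = [0.9, 1.8, 3.5, 7.2, 14.1]
rec = recovery_percent(paper[2], blood[2])
r2 = r_squared(blood, paper)
```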

38

Sankaran, Krishnaswamy. "Accurate domain truncation techniques for time-domain conformal methods /." Zürich : ETH, 2007. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=17447.

Full text
39

Islas, Michael. "EFFICIENCY IMPROVEMENT TECHNIQUES FOR HIGH VOLTAGE CAPACITOR CHARGING METHODS." Master's thesis, University of Central Florida, 2009. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/2969.

Full text
Abstract:
The goal of this thesis is to design and fabricate a DC-to-DC converter for use in high-voltage capacitor charging applications. The primary objectives are to increase the efficiency and reduce the cost of the traditional methods used for this application. Traditional methods were not designed specifically for high-voltage capacitor charging and were thus very primitive and exhibited lower efficiency. Prior methods made use of a high-voltage power supply and a current-limiting resistor or control scheme: the power supply would often operate efficiently only at a single voltage value and would thus function poorly over the range used in charging a capacitor, and the resistor would dissipate a fair amount of power, further limiting efficiency. This design uses a traditional flyback topology with a controller developed specifically for this application, centering the design approach on the LT3750 and thus taking full advantage of the efficiency-improving control scheme it provides. Additionally, through the use of advanced techniques to eliminate noise and power losses, the efficiency may be significantly improved. A detailed theoretical analysis of the charger is also presented and applied in optimization techniques to select ideal component values meeting specific design specifications. In this research, a specifically designed and developed prototype is used to experimentally verify the theoretical work and optimization techniques.
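The efficiency argument can be made concrete. Charging a capacitor through a resistor from a fixed supply stores at most half of the delivered energy (the other half is dissipated in the resistor regardless of its value), which is one reason a switched-mode topology such as the flyback is preferred. A brief sketch with invented component values:

```python
def cap_energy(c_farads, v_volts):
    """Energy stored in a capacitor: E = 1/2 * C * V^2 (joules)."""
    return 0.5 * c_farads * v_volts ** 2

def resistive_charge_efficiency():
    """Charging a capacitor through a resistor from a fixed DC supply
    dissipates exactly as much energy in the resistor as is stored in
    the capacitor, capping efficiency at 50% regardless of R."""
    return 0.5

def charger_efficiency(e_stored, e_input):
    """Measured efficiency of a charger: stored energy over input energy."""
    return e_stored / e_input

# Example: a 100 uF capacitor charged to 500 V stores 12.5 J.
e = cap_energy(100e-6, 500)
print(e)  # 12.5
```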
M.S.E.E.
School of Electrical Engineering and Computer Science
Engineering and Computer Science
Electrical Engineering MSEE
40

Lishchuk, Viktor. "Geometallurgical programs – critical evaluation of applied methods and techniques." Licentiate thesis, Luleå tekniska universitet, Mineralteknik och metallurgi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-26607.

Full text
Abstract:
Geometallurgy is a team-based multidisciplinary approach aimed at integrating geological, mineralogical and metallurgical information to yield a spatial quantitative predictive model for production management. Production management includes forecasting, controlling and optimizing product quality (concentrates and tailings) and metallurgical performance (e.g. recoveries and throughput), and minimizing environmental impact. Favourable characteristics of an ore body calling for a geometallurgical model are high variability, low mineral grades, complex mineralogy and several alternative processing routes or beneficiation methods. The industrial application of geometallurgy is called a geometallurgical program. This study undertook a critical review and evaluation of the methods and techniques used in geometallurgical programs, aiming to define how a geometallurgical program should be carried out for different kinds of ore bodies. The methods applied here were an industry survey (questionnaire) along with the development and use of a synthetic ore body built up of geometallurgical modules. The survey on geometallurgical programs included fifty-two case studies drawn from both industry professionals and comprehensive literature studies, and focused on answering why and how geometallurgical programs are built. This resulted in a two-dimensional classification system in which the depth of application of a geometallurgical program is presented in six levels. Geometallurgical methods and techniques were summarised accordingly under three approaches: traditional, proxy and mineralogical. Through the classification it was established that, owing to similar geometallurgical reasoning and methodologies, the deposit and process data could be organized in a common way; thus, a uniform data structure (Papers I, II) was proposed.
Traditionally, scientific development in geometallurgy takes place through case studies; this is slow and the results are often confidential. Therefore an alternative way is needed, and here a synthetic testing framework for geometallurgy was established and used as that alternative. The framework consists of a synthetic ore body and a mineral processing circuit. The generated digital ore body is sampled through a synthetic sampling module, followed by chemical and mineralogical analyses and by geometallurgical and metallurgical testing conducted in a synthetic laboratory. The synthetic testing framework aims at being so realistic that an expert studying the data it offers could not distinguish it from a real one. An important and unique aspect is that the geological ore body model is based on minerals: the synthetic ore body carries full mineralogical composition and property information at any point, which makes it possible to run different characterisation techniques in the synthetic analysis laboratory.
The first framework built was based on the Malmberget iron ore mine (LKAB). Two aspects were studied: the sampling density required for a geometallurgical program, and the difference in prediction capability between the geometallurgical approaches. Applying the synthetic testing framework confirmed that the mineralogical approach presents a clear advantage in product quality prediction for production planning purposes. Another conclusion was that optimising production based solely on head grade, without accounting for variability in processing properties, gives significantly less reliable forecast and optimisation information for the mining value chain. For the iron ore case study it was concluded that the number of samples required for a geometallurgical program must vary with the parameters to be forecast: a reliable recovery model could be established from some tens of samples, whereas reliable concentrate quality prediction (e.g. metal grade, penalty elements) required more than 100 samples. In the latter, the mineralogical approach proved significantly better in prediction quality than the traditional approach based on elemental grades. A model based on the proxy approach could forecast the response in magnetic separation performance well with the help of the Davis tube test, but the lack of geometallurgical tests for flotation and gravity separation meant that, in total, the forecast capability of the proxy approach was worse than that of the mineralogical approach. This study is part of a larger research program, PREP (Primary resource efficiency by enhanced prediction), and the results will be applied to on-going industrial case studies.
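The conclusion that the required number of samples varies with the forecast parameter follows standard sampling logic: higher-variability properties need disproportionately more samples for the same precision. A minimal sketch using the classical sample-size formula (the sigma and margin values below are invented, not taken from the thesis):

```python
import math

def samples_needed(sigma, margin, z=1.96):
    """Classical sample-size estimate: n = (z * sigma / margin)^2,
    i.e. the samples needed so that the mean of a property with standard
    deviation sigma is known to within +/- margin at ~95% confidence."""
    return math.ceil((z * sigma / margin) ** 2)

# Invented figures: a low-variability response (e.g. recovery, sigma = 2
# percentage points, +/- 1 point wanted) versus a high-variability one
# (e.g. a penalty element, sigma = 6, +/- 1) -- the latter demands far
# more drill-core samples for the same confidence.
print(samples_needed(2.0, 1.0))
print(samples_needed(6.0, 1.0))
```

Tripling the variability multiplies the sample requirement ninefold, consistent with tens of samples sufficing for a recovery model while concentrate quality prediction needed over a hundred.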

For approval; 2016; 20160516 (viklis); The following person will hold a licentiate seminar for the degree of Licentiate of Engineering. Name: Viktor Lishchuk. Subject: Mineral Processing. Thesis: Geometallurgical Programs – Critical Evaluation of Applied Methods and Techniques. Examiner: Professor Pertti Lamberg, Department of Civil, Environmental and Natural Resources Engineering, Division of Minerals and Metallurgical Engineering, Luleå University of Technology. Discussant: PhD Simon Michaux, BRGM, University of Liege, Institution of Genie Mineral, Materiaux et Environment, Brisbane. Time: Thursday 2 June 2016, 10.00. Place: F531, Luleå University of Technology.

41

Yeganehtalab, Babak. "Construction Management Methods and Techniques in Army Tactical Shelter." Thesis, University of North Texas, 2019. https://digital.library.unt.edu/ark:/67531/metadc1609144/.

Full text
Abstract:
This thesis presents a research effort aimed at applying construction management methods and techniques to an army tactical shelter. The first step focuses on developing and identifying the activities and the work breakdown structure (WBS) applicable to the shelter prototype. The next step focuses on resource allocation: resources are allocated according to the delivered project in alternative one, while in the second alternative the allocation is optimized to level resources and minimize the resource peak. In addition, the cost is calculated for the whole project as well as for each WBS element and activity (alternative one); in the second alternative, cost mitigation is applied according to the available resources by adjusting the predecessors and successors of each activity. In conclusion, the two alternatives are compared, the resulting outcomes are presented, and future work is suggested for the project team to continue this effort.
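The resource-leveling step can be illustrated with a toy schedule: compute the daily resource profile, then delay an activity within its float to cut the peak. The activity data below are invented, not taken from the thesis:

```python
def resource_profile(activities, horizon):
    """Daily resource usage from (start_day, duration, crew_size) activities."""
    profile = [0] * horizon
    for start, dur, units in activities:
        for day in range(start, start + dur):
            profile[day] += units
    return profile

# Hypothetical shelter-build activities: (start day, duration, crew size).
plan_a = [(0, 3, 4), (0, 2, 3), (3, 2, 2)]
# Alternative: delay the second activity within its float to level the peak.
plan_b = [(0, 3, 4), (3, 2, 3), (3, 2, 2)]

print(max(resource_profile(plan_a, 6)))  # peak crew demand, alternative one
print(max(resource_profile(plan_b, 6)))  # lower peak after leveling
```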
42

Синилкіна, Анна. "Innovative techniques and methods in teaching english at universities." Thesis, Київський національний університет технологій та дизайну, 2017. https://er.knutd.edu.ua/handle/123456789/7232.

Full text
43

Mitu, Leonard Gabriel. "Methods and techniques for bio-system's materials behaviour analysis." Doctoral thesis, Universitat Politècnica de València, 2014. http://hdl.handle.net/10251/35445.

Full text
Abstract:
In the context of the rapid development of research on materials for biosystem structures, a representative direction is the analysis of the behaviour of these materials. This direction of research requires various means and methods for theoretical and experimental measurement and evaluation. The PhD thesis "Methods and means for analyzing the behavior of biosystems structure materials" addresses this area of research, aiming to study the behaviour of polymeric and composite materials in biosystem structures, and in particular in the skeletal biosystem. To this end, a specific research method is developed, based on theoretical models for predicting the mechanical, thermal and machinability properties of these materials, using the Moldflow, SolidWorks and Ansys software packages. To validate the theoretical research, experimental studies were designed and conducted on the mechanical properties and behaviour of the polymeric biomaterials ABS, UHMWPE, HDPE, PA, PC, PET, PP and PP_GF-30%, and of composite materials with thermoplastic polymer matrices from the skeletal biosystem's structure. To analyse the theoretical-experimental correlations, the experimental data were processed with the statistical analysis packages SPSS v17, Origin v8 and Palisade Decision Tools. In conclusion, the thesis provides technical and scientific support for analysing the behaviour of new polymeric and composite materials in biosystem structures.
Mitu, LG. (2014). Methods and techniques for bio-system's materials behaviour analysis [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/35445
TESIS
44

Ding, Jiarui. "Computational methods for systems biology data of cancer." Thesis, University of British Columbia, 2016. http://hdl.handle.net/2429/58164.

Full text
Abstract:
High-throughput genome sequencing and other techniques provide a cost-effective way to study cancer biology and seek precision treatment options. In this dissertation I address three challenges in cancer systems biology research: 1) predicting somatic mutations, 2) interpreting mutation functions, and 3) stratifying patients into biologically meaningful groups. Somatic single nucleotide variants are frequent therapeutically actionable mutations in cancer, e.g., the ‘hotspot’ mutations in known cancer driver genes such as EGFR, KRAS, and BRAF. However, only a small proportion of cancer patients harbour these known driver mutations. Therefore, there is a great need to systematically profile a cancer genome to identify all the somatic single nucleotide variants. I develop methods to discover these somatic mutations from cancer genomic sequencing data, taking into account the noise in high-throughput sequencing data and valuable validated genuine somatic mutations and non-somatic mutations. Of the somatic alterations acquired for each cancer patient, only a few mutations ‘drive’ the initialization and progression of cancer. To better understand the evolution of cancer, as well as to apply precision treatments, we need to assess the functions of these mutations to pinpoint the driver mutations. I address this challenge by predicting the mutations correlated with gene expression dysregulation. The method is based on hierarchical Bayes modelling of the influence of mutations on gene expression, and can predict the mutations that impact gene expression in individual patients. Although probably no two cancer genomes share exactly the same set of somatic mutations because of the stochastic nature of acquired mutations across the three billion base pairs, some cancer patients share common driver mutations or disrupted pathways. These patients may have similar prognoses and potentially benefit from the same kind of treatment options. 
I develop an efficient clustering algorithm to cluster high-throughput and high-dimensional biological datasets, with the potential to put cancer patients into biologically meaningful groups for treatment selection.
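As an illustration of clustering patients into groups from expression profiles, here is a plain two-cluster k-means sketch (the dissertation's algorithm is more sophisticated; the data points below are invented two-gene profiles):

```python
def kmeans2(points, iters=10):
    """Minimal two-cluster k-means with deterministic seeding
    (first and last points as initial centers)."""
    centers = [points[0], points[-1]]
    clusters = [[], []]
    for _ in range(iters):
        clusters = [[], []]
        for p in points:
            # Assign each point to the nearer center (squared Euclidean).
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[d.index(min(d))].append(p)
        # Recompute each center as the mean of its cluster.
        centers = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return clusters

# Invented "patients" measured on two genes; two clearly separated groups.
data = [(1.0, 1.1), (0.9, 1.0), (1.2, 0.8),
        (5.0, 5.2), (5.1, 4.9), (4.8, 5.0)]
groups = kmeans2(data)
print([len(g) for g in groups])  # [3, 3]
```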
Science, Faculty of
Computer Science, Department of
Graduate
45

Cong, Yang, and 丛阳. "Optimization models and computational methods for systems biology." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2012. http://hub.hku.hk/bib/B47752841.

Full text
Abstract:
Systems biology is a comprehensive quantitative analysis of the manner in which all the components of a biological system interact functionally over time. Mathematical modeling and computational methods are indispensable in such studies, especially for interpreting and predicting the complex interactions among all the components so as to obtain desirable system properties. System dynamics, system robustness and control methods are three crucial topics in systems biology; in this thesis, these properties are studied in four different biological systems. The outbreak and spread of infectious diseases have been investigated for years; understanding the spread mechanism and predicting the course of a disease could enable scientists to evaluate isolation plans that significantly affect a particular epidemic. A differential equation model is proposed to study the dynamics of HIV spread in a network of prisons. In prisons, screening and quarantining are both efficient control measures, and an optimization model is proposed to study optimal strategies for the control of HIV spread in a prison system. A primordium (plural: primordia) is an organ or tissue in its earliest recognizable stage of development. Primordial development in plants is critical to the proper positioning and development of plant organs; an optimization model and two control mechanisms are proposed to study the dynamics and robustness of primordial systems. Probabilistic Boolean Networks (PBNs) are mathematical models for studying the switching behavior of genetic regulatory networks. An algorithm is proposed to identify singleton and small attractors in PBNs, which correspond to cell types and cell states. The underlying problem is NP-hard in general; our algorithm is theoretically and computationally demonstrated to be much more efficient than the naive algorithm that examines all possible states.
The goal of studying the long-term behavior of a genetic regulatory network is to find control strategies such that the system attains desired properties. A control method is proposed to study multiple external interventions while minimizing the control cost. Robustness is a paramount property of living organisms. The impact degree is a measure of the robustness of a metabolic system against the deletion of single or multiple reactions. An algorithm is proposed to study the impact degree in the Escherichia coli metabolic system. Moreover, an approximation method based on branching processes is proposed for estimating the impact degree of metabolic networks. The effectiveness of our method is assured by testing with real-world Escherichia coli, Bacillus subtilis, Saccharomyces cerevisiae and Homo sapiens metabolic systems.
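The naive attractor search that the proposed algorithm improves upon can be written down directly: enumerate all 2^n states of a Boolean network and keep the fixed points (singleton attractors). A toy three-gene example with invented update rules:

```python
from itertools import product

def singleton_attractors(update_fns, n):
    """Exhaustive O(2^n) search for fixed points (singleton attractors)
    of an n-gene Boolean network -- the naive baseline the thesis
    improves upon."""
    fixed = []
    for state in product((0, 1), repeat=n):
        if tuple(f(state) for f in update_fns) == state:
            fixed.append(state)
    return fixed

# Toy network (rules invented for illustration):
fns = [
    lambda s: s[1] & s[2],   # x0' = x1 AND x2
    lambda s: s[0],          # x1' = x0
    lambda s: s[0] | s[2],   # x2' = x0 OR x2
]
print(singleton_attractors(fns, 3))  # [(0, 0, 0), (0, 0, 1), (1, 1, 1)]
```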
published_or_final_version
Mathematics
Doctoral
Doctor of Philosophy
46

Gamalielsson, Jonas. "Developing semantic pathway comparison methods for systems biology." Thesis, Heriot-Watt University, 2009. http://hdl.handle.net/10399/2294.

Full text
Abstract:
Systems biology is an emerging multi-disciplinary field in which the behaviour of complex biological systems is studied by considering the interaction of many cellular and molecular constituents rather than using a “traditional” reductionist approach where constituents are studied individually. Systems are often studied over time with the ultimate goal of developing models which can be used to understand and predict complex biological processes, such as human diseases. To support systems biology, a large number of biological pathways are being derived for many different organisms, and these are stored in various databases. This pathway collection presents an opportunity to compare and contrast pathways, and to utilise the knowledge they represent. This thesis presents some of the first algorithms that are designed to explore this opportunity. It is argued that the methods will be useful to biologists in order to assess the biological plausibility of derived pathways, compare different biological pathways for semantic similarities, and to derive putative pathways that are semantically similar to documented biological pathways. The methods will therefore extend the systems biology toolbox that biologists can use to make new biological discoveries.
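A crude stand-in for comparing pathways for similarity: treat each pathway as a set of component identifiers and score their overlap. (The thesis develops richer, ontology-aware semantic measures; the gene symbols below are illustrative only.)

```python
def pathway_similarity(p1, p2):
    """Jaccard overlap between two pathways given as component sets."""
    return len(p1 & p2) / len(p1 | p2)

# A documented pathway vs. a putative derived pathway (illustrative genes).
documented = {"HK1", "PFKM", "PKM", "GAPDH"}
derived = {"HK1", "PFKM", "PKM", "LDHA"}
print(pathway_similarity(documented, derived))  # 0.6
```

Ranking derived pathways by such a score against documented ones is the simplest version of assessing their biological plausibility.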
47

Wang, Naining. "Quantitative cellular methods in the evaluation of prostate cancer /." Stockholm, 2000. http://diss.kib.ki.se/2000/91-628-3929-2/.

Full text
48

Fuxelius, Hans-Henrik. "Methods and Applications in Comparative Bacterial Genomics." Doctoral thesis, Uppsala universitet, Molekylär evolution, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-8398.

Full text
Abstract:
Comparative studies of bacterial genomes, now counting in the hundreds, generate massive amounts of information. To support a systematic and efficient approach to genomic analyses, a database-driven system with graphic visualization of genomic properties was developed: GenComp. The software was applied to studies of obligate intracellular bacteria. In all studies, ORFs were extracted and grouped into ORF families. Based on gene order synteny, orthologous clusters of core genes and variable spacer ORFs were identified and extracted for alignments and computation of substitution frequencies. The software was applied to the genomes of six Chlamydia trachomatis strains to identify the most rapidly evolving genes; five genes were chosen for genotyping, achieving close to a 3-fold higher discrimination capacity than serotyping. With GenComp as the backbone, a massive comparative analysis was performed on the variable gene set in the Rickettsiaceae, which includes Rickettsia prowazekii and Orientia tsutsugamushi, the agents of epidemic and scrub typhus, respectively. O. tsutsugamushi has the most exceptional bacterial genome identified to date: the 2.2 Mb genome is 200-fold more repeated than the 1.1 Mb R. prowazekii genome, owing to an extensive proliferation of conjugative type IV secretion systems and associated genes. GenComp identified 688 core genes conserved across 7 closely related Rickettsia genomes, along with a set of 469 variably present genes with homologs in other species. The analysis indicates that up to 70% of the extensively degraded and variably present genes represent mobile genetic elements and genes putatively acquired by horizontal gene transfer, which explains the paradox of the high pseudogene load in the small Rickettsia genomes.
This study demonstrates that GenComp provides an efficient system for pseudogene identification and may help distinguish genes from spurious ORFs in the many pan-genome sequencing projects going on worldwide.
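One simple heuristic in the spirit of pseudogene identification (not GenComp's actual algorithm, which the abstract does not detail): flag ORFs that are markedly shorter than the median length of their ortholog cluster as putatively degraded. All strain names and lengths below are invented:

```python
def flag_pseudogenes(orf_lengths, cluster_median, cutoff=0.8):
    """Flag ORFs much shorter than their ortholog cluster's median length
    as putatively degraded (pseudogene candidates)."""
    return {name: length / cluster_median < cutoff
            for name, length in orf_lengths.items()}

# Invented ortholog cluster: lengths (codons) of one gene across strains.
lengths = {"strain1": 402, "strain2": 399, "strain3": 180, "strain4": 405}
print(flag_pseudogenes(lengths, 400))  # only strain3 is flagged
```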
49

Ollivier, Julien. "Scalable methods for modelling complex biochemical networks." Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=104586.

Full text
Abstract:
In cells, complex networks of interacting biomolecules process both environmental and endogenous signals to control gene expression and other cellular processes. This poses a challenge to researchers who attempt to develop mathematical and computational models of biochemical networks that reflect this complexity. In this thesis, I propose methods that help manage complexity by exploiting the finding that, as for other biological systems, cellular networks are characterized by a modularity that appears at all levels of organization. The first part of this work focuses on the modular properties of proteins and how their function can be characterized through their structure and allosteric properties. I develop a modular rule-based framework and formal modelling language that describes the computations performed by allosteric proteins and that is rooted in biophysical principles. Rule-based modelling conventionally addresses the problem of combinatorial complexity, whereby protein interactions can generate a combinatorial explosion of protein complex states. However, I explore how these same interactions can potentially require a combinatorial number of parameters to describe them. I demonstrate that my rule-based framework effectively addresses this problem of regulatory complexity, and describes allosteric proteins and networks in a unified, consistent, and modular fashion. I use the framework in three applications. First, I show that allostery can make macromolecular assembly more efficacious when a protein that joins two separable parts of a complex is present in excessively high concentrations. Second, I demonstrate that I can straightforwardly analyze the complex cooperative interactions that arise when competitive ligands bind to a multimeric protein. Third, I analyze a new model of G protein-coupled receptor signalling and demonstrate that it explains the functional selectivity of these receptors while being parsimonious in the number of parameters used.
Overall, I find that my rule-based modelling framework, implemented as the Allosteric Network Compiler software tool, can ease the modelling and analysis of complex allosteric interactions. If cellular networks are modular, this implies that small sub-systems can be studied in isolation, provided that external inputs and perturbations to the system can be modelled appropriately. However, cellular networks are subject both to intrinsic noise, which is endogenous to the system, and to extrinsic noise arising from noisy inputs. Furthermore, many inputs may be dynamic, whether due to experimental protocols or perhaps reflecting the cyclic process of cell division. This motivates my development, in the second part of this work, of efficient stochastic simulation algorithms for biochemical networks that can accommodate time-varying biochemical parameters. Starting from Gillespie's well-known First Reaction Method and Gibson and Bruck's Next Reaction Method, I develop two new algorithms that allow time-varying inputs of arbitrary functional form while scaling well to systems comprising many biochemical reactions. I analyze their scaling properties and find that a modified First Reaction Method may scale better than a modified Next Reaction Method in some applications. The third and last part of this thesis introduces a new software tool, Facile, that eases the creation, update and simulation of biochemical network models. Models created through a simple and intuitive textual language are automatically converted into a form usable by downstream tools, for example ordinary differential equations for simulation by Matlab. Facile also conveniently accommodates mathematical and time-varying expressions in rate laws.
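Gillespie's First Reaction Method, the starting point for the algorithms above, draws a candidate waiting time for every reaction and fires the earliest. A constant-rate birth-death sketch (the thesis's contribution is the extension to time-varying rates, which is not shown here; the rate values are invented):

```python
import math
import random

def first_reaction_method(x0, rates, stoich, t_end, seed=1):
    """Gillespie's First Reaction Method for a one-species system with
    constant rate laws: draw an exponential waiting time per reaction,
    fire the earliest, repeat until t_end."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(0.0, x0)]
    while t < t_end:
        props = [r(x) for r in rates]
        waits = [rng.expovariate(a) if a > 0 else math.inf for a in props]
        j = min(range(len(waits)), key=waits.__getitem__)
        if math.isinf(waits[j]):
            break  # no reaction can fire
        t += waits[j]
        if t > t_end:
            break
        x += stoich[j]
        path.append((t, x))
    return path

# Birth-death: production at rate 2.0, first-order degradation at 0.1 * x.
path = first_reaction_method(0, [lambda x: 2.0, lambda x: 0.1 * x], [+1, -1], 50.0)
print(len(path), path[-1][1])  # steady state fluctuates around 2.0 / 0.1 = 20
```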
50

Dimont, Emmanuel. "Methods for the Analysis of Differential Composition of Gene Expression." Thesis, Harvard University, 2015. http://nrs.harvard.edu/urn-3:HUL.InstRepos:14226062.

Full text
Abstract:
Modern next-generation sequencing and microarray-based assays have empowered the computational biologist to measure various aspects of biological activity. This has led to the growth of genomics, transcriptomics and proteomics as fields of study of the complete set of DNA, RNA and proteins in living cells respectively. One major challenge in the analysis of this data, however, has been the widespread lack of sufficiently large sample sizes due to the high cost of new emerging technologies, making statistical inference difficult. In addition, due to the hierarchical nature of the various types of data, it is important to correctly integrate them to make meaningful biological discoveries and better informed decisions for the successful treatment of disease. In this dissertation I propose: (1) a novel method for more powerful statistical testing of differential digital gene expression between two conditions, (2) a framework for the integration of multi-level biologic data, demonstrated with the compositional analysis of gene expression and its link to promoter structure, and (3) an extension to a more complex generalized linear modeling framework, demonstrated with the compositional analysis of gene expression and its link to pathway structure adjusted for confounding covariates.
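A simple baseline for testing differential digital (count-based) gene expression between two conditions is a two-proportion z-test on a tag's counts relative to the library sizes. The dissertation develops more powerful tests; the counts below are invented:

```python
import math

def two_sample_count_z(c1, n1, c2, n2):
    """Two-proportion z statistic for a tag observed c1 times in a library
    of n1 reads vs. c2 times in a library of n2 reads. Large |z| suggests
    differential expression between the two conditions."""
    p1, p2 = c1 / n1, c2 / n2
    p = (c1 + c2) / (n1 + n2)           # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical tag counts: 50 of 1,000,000 reads vs. 120 of 1,200,000 reads.
z = two_sample_count_z(50, 1_000_000, 120, 1_200_000)
print(round(z, 2))  # well beyond +/-1.96, i.e. significant at the 5% level
```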
