Academic literature on the topic 'WE 2100'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'WE 2100.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "WE 2100":

1

Bontempo, R., and M. Manna. "Highly accurate error estimate of the momentum theory as applied to wind turbines." Wind Energy 20, no. 8 (March 9, 2017): 1405–19. http://dx.doi.org/10.1002/we.2100.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

de Marsily, Ghislain. "Will We Soon Run Out of Water?" Annals of Nutrition and Metabolism 76, Suppl. 1 (2020): 10–16. http://dx.doi.org/10.1159/000515019.

Abstract:
In 2000, the World population was 6.2 billion; it reached 7 billion in 2012 and should reach 9.5 billion (±0.4) in 2050 and 11 billion (±1.5) in 2100, according to UN projections. The trend after 2100 is still one of global demographic growth, but after 2060, Africa would be the only continent where the population would still increase. The amount of water consumed annually to produce the food necessary to meet the needs varies greatly between countries, from about 600 to 2,500 m³/year per capita, depending on their wealth, their food habits (particularly meat consumption), and the percentage of food waste they generate. In 2000, the total food production was on the order of 3,300 million tons (in cereal equivalents). In 2019, about 0.8 billion inhabitants of the planet still suffer from hunger and do not get the nutrition they need to be in good health or, in the case of children, to grow properly (both physically and intellectually). Assuming a World average water consumption for food of 1,300 m³/year per capita in 2000, 1,400 m³/year in 2050, and 1,500 m³/year in 2100, a volume of water of around 8,200 km³/year was needed in 2000, 13,000 km³/year will be needed in 2050, and 16,500 km³/year in 2100. Will that much water be available on earth? Can there be conflicts related to a food deficit? Some preliminary answers and scenarios for food production will be given from a hydrologist's viewpoint.
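The volumes quoted in this abstract follow from multiplying each UN population projection by the assumed per-capita water consumption for food. As an illustrative sketch (the year/value pairs are taken from the abstract itself; the rounding is the abstract's own):

```python
# Water needed for food production: population x per-capita use,
# converted from m^3/year to km^3/year (1 km^3 = 1e9 m^3).
# Figures as quoted in the abstract above.
scenarios = {
    2000: (6.2e9, 1_300),   # (population, m^3/year per capita)
    2050: (9.5e9, 1_400),
    2100: (11.0e9, 1_500),
}

for year, (population, per_capita_m3) in scenarios.items():
    km3_per_year = population * per_capita_m3 / 1e9
    print(f"{year}: ~{km3_per_year:,.0f} km^3/year")
```

This yields about 8,060 km³/year for 2000 and 13,300 for 2050 (close to the abstract's rounded figures of 8,200 and 13,000) and exactly 16,500 km³/year for 2100.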
3

Heinrich, Michael, and José M. Prieto. "Diet and healthy ageing 2100: Will we globalise local knowledge systems?" Ageing Research Reviews 7, no. 3 (October 2008): 249–74. http://dx.doi.org/10.1016/j.arr.2007.08.002.

4

Grenfell, J. L., D. T. Shindell, and V. Grewe. "Sensitivity studies of oxidative changes in the troposphere in 2100 using the GISS GCM." Atmospheric Chemistry and Physics 3, no. 5 (September 3, 2003): 1267–83. http://dx.doi.org/10.5194/acp-3-1267-2003.

Abstract:
Abstract. We examine the relative importance of chemical precursor emissions affecting ozone (O3) and hydroxyl (OH) for the year 2100. Runs were developed from the Comparison of Tropospheric Oxidants (Ox_Comp) modeling workshop year 2100 A2p emissions scenario, part of the Intergovernmental Panel on Climate Change (IPCC) third assessment report (TAR). While TAR examined only cumulative change, we examine individual components (NOx, CH4, CO, etc.). Also, since there will be climate changes in 2100 (not accounted for by TAR), we investigate the effect of changing our fixed SSTs/ocean ice from present day to 2100 conditions, as projected by a coupled ocean-atmosphere model with doubled CO2. Unlike TAR we perform multiannual integrations and we include interactive lightning. Largest changes arose from the run with 2100 industrial NOx (O3 = +16.9%, OH = +29.4% in July) and the run with 2100 methane (O3 = +17.4%, OH = −19.1% in July). In the latter run, large ozone increases in the NH upper troposphere appeared to repartition HO2 into OH to such an extent that the lowering in OH associated with increased methane was overwhelmed in that region. Incorporating all changes collectively led to the July tropospheric ozone burden increasing from 426 to 601 Tg (+41.1%) and the July OH concentration increasing from 13.6 to 15.2×10⁵ molecules/cm³ (+11.8%).
5

Grenfell, J. L., D. T. Shindell, and V. Grewe. "Sensitivity studies of oxidative changes in the troposphere in 2100 using the GISS GCM." Atmospheric Chemistry and Physics Discussions 3, no. 2 (April 1, 2003): 1805–42. http://dx.doi.org/10.5194/acpd-3-1805-2003.

Abstract:
Abstract. We examine the relative importance of chemical precursor emissions affecting ozone (O3) and hydroxyl (OH) for the year 2100. Runs were developed from the Comparison of Tropospheric Oxidants (Ox_Comp) modeling workshop year 2100 A2p emissions scenario, part of the Intergovernmental Panel on Climate Change (IPCC) third assessment report (TAR). While TAR examined only cumulative change, we examine individual components (NOx, CH4, CO, etc.). Also, since there will be climate changes in 2100 (not accounted for by TAR), we investigate the effect of changing our fixed SSTs/ocean ice from present day to 2100 conditions, as projected by a coupled ocean-atmosphere model with doubled CO2. Largest changes arose from the run with 2100 industrial NOx (O3 = +16.9%, OH = +29.4% in July) and the run with 2100 methane (O3 = +17.4%, OH = −19.1% in July). In the latter run, large ozone increases in the NH upper troposphere appeared to repartition HO2 into OH to such an extent that the lowering in OH associated with increased methane was overwhelmed in that region. Incorporating all changes collectively led to the July tropospheric ozone burden increasing from 426 to 601 Tg (+41.1%) and the July OH concentration increasing from 13.6 to 15.2×10⁵ molecules/cm³ (+11.8%).
6

Tsuda, Izumi, Masayuki Hino, Takayuki Takubo, Tomoko Katagami, Hiroshi Kubota, Seiki Kawai, and N. Tatsumi. "First basic performance evaluation of the XE-2100 haematology analyser." Journal of Automated Methods and Management in Chemistry 21, no. 4 (1999): 127–33. http://dx.doi.org/10.1155/s1463924699000152.

Abstract:
The newly developed XE-2100 haematology analyser can provide complete blood counts, perform leukocyte differentials and reticulocyte analysis, and obtain quantitative data on nucleated red blood cells (NRBCs). In this study, we evaluated the basic performance of this instrument using routinely obtained blood specimens treated with ethylenediaminetetraacetic acid-2K. Reproducibility, carryover, stability during storage at 4°C and room temperature, and accuracy were evaluated. In this evaluation, reproducibility was good and little carryover was found. Accurate measurements were possible for up to 48 h of storage. A good correlation between findings with the XE-2100 and SE-9000 haematology analysers was found for complete blood count on the 210 samples tested. The leukocyte differential obtained with the XE-2100 correlated well with eye counts and with the results obtained with the SE-9000 automated haematology analyser, with r values over 0.9 for the percentages of neutrophils, lymphocytes, and eosinophils. The precision and accuracy of NRBC and reticulocyte counts by the XE-2100 were satisfactory. We used the XE-2100 to obtain differential counts for bone marrow aspirates, and good correlations with manual differentials were obtained for total nucleated cell count, percentage of myeloid cells, and percentage of erythroid cells. The performance of the XE-2100 was excellent, and this instrument should be able to provide reliable data to clinical laboratories.
7

Le Goff, Héloïse, Mike D. Flannigan, and Yves Bergeron. "Potential changes in monthly fire risk in the eastern Canadian boreal forest under future climate change." Canadian Journal of Forest Research 39, no. 12 (December 2009): 2369–80. http://dx.doi.org/10.1139/x09-153.

Abstract:
The main objective of this paper is to evaluate whether future climate change would trigger an increase in the fire activity of the Waswanipi area, central Quebec. First, we used regression analyses to model the historical (1973–2002) link between weather conditions and fire activity. Then, we calculated Fire Weather Index system components using 1961–2100 daily weather variables from the Canadian Regional Climate Model for the A2 climate change scenario. We tested linear trends in 1961–2100 fire activity and calculated rates of change in fire activity between 1975–2005, 2030–2060, and 2070–2100. Our results suggest that the August fire risk would double (+110%) for 2100, while the May fire risk would slightly decrease (–20%), moving the fire season peak later in the season. Future climate change would trigger weather conditions more favourable to forest fires and a slight increase in regional fire activity (+7%). While considering this long-term increase, interannual variations of fire activity remain a major challenge for the development of sustainable forest management.
8

Ruzicka, Katharina, Mario Veitl, Renate Thalhammer-Scherrer, and Ilse Schwarzinger. "The New Hematology Analyzer Sysmex XE-2100." Archives of Pathology & Laboratory Medicine 125, no. 3 (March 1, 2001): 391–96. http://dx.doi.org/10.5858/2001-125-0391-tnhasx.

Abstract:
Abstract Context.—The new hematology analyzer Sysmex XE-2100 (TOA Medical Electronics, Kobe, Japan) has a novel, combined, white blood cell differential technology and a special reagent system to enumerate nucleated red blood cells. Design.—Performance evaluation of both technologies of the Sysmex XE-2100 according to the H20-A protocol of the National Committee for Clinical Laboratory Standards and comparison of the results with those for the hematology analyzer Sysmex NE-8000 (TOA Medical Electronics). Specimens.—Five hundred forty-four blood samples randomly chosen from various inpatient and outpatient departments of the Vienna University hospital. Results.—Five-part white blood cell differential counts on the XE-2100 revealed excellent correlation with the manual reference method for neutrophils, lymphocytes, and eosinophils (r = .925, .922, and .877, respectively) and good correlation for monocytes and basophils (r = .756 and .763, respectively). The efficiency rates of flagging for the presence of ≥1% abnormal white blood cells were 83% (XE-2100) and 66% (NE-8000). The correlation of automated and microscopic nucleated red blood cell counts was excellent (r = .97). Conclusions.—From the present evaluation and our former experience with other types of Sysmex analyzers, we conclude that the new white blood cell differential technology of the XE-2100 represents a further development toward more efficient flagging of abnormal white blood cells.
9

Palter, Jaime B., Thomas L. Frölicher, David Paynter, and Jasmin G. John. "Climate, ocean circulation, and sea level changes under stabilization and overshoot pathways to 1.5 K warming." Earth System Dynamics 9, no. 2 (June 13, 2018): 817–28. http://dx.doi.org/10.5194/esd-9-817-2018.

Abstract:
Abstract. The Paris Agreement has initiated a scientific debate on the role that carbon removal – or net negative emissions – might play in achieving less than 1.5 K of global mean surface warming by 2100. Here, we probe the sensitivity of a comprehensive Earth system model (GFDL-ESM2M) to three different atmospheric CO2 concentration pathways, two of which arrive at 1.5 K of warming in 2100 by very different pathways. We run five ensemble members of each of these simulations: (1) a standard Representative Concentration Pathway (RCP4.5) scenario, which produces 2 K of surface warming by 2100 in our model; (2) a “stabilization” pathway in which atmospheric CO2 concentration never exceeds 440 ppm and the global mean temperature rise is approximately 1.5 K by 2100; and (3) an “overshoot” pathway that passes through 2 K of warming at mid-century, before ramping down atmospheric CO2 concentrations, as if using carbon removal, to end at 1.5 K of warming at 2100. Although the global mean surface temperature change in response to the overshoot pathway is similar to the stabilization pathway in 2100, this similarity belies several important differences in other climate metrics, such as warming over land masses, the strength of the Atlantic Meridional Overturning Circulation (AMOC), ocean acidification, sea ice coverage, and the global mean sea level change and its regional expressions. In 2100, the overshoot ensemble shows a greater global steric sea level rise and weaker AMOC mass transport than in the stabilization scenario, with both of these metrics close to the ensemble mean of RCP4.5. There is strong ocean surface cooling in the North Atlantic Ocean and Southern Ocean in response to overshoot forcing due to perturbations in the ocean circulation. 
Thus, overshoot forcing in this model reduces the rate of sea ice loss in the Labrador, Nordic, Ross, and Weddell seas relative to the stabilized pathway, suggesting a negative radiative feedback in response to the early rapid warming. Finally, the ocean perturbation in response to warming leads to strong pathway dependence of sea level rise in northern North American cities, with overshoot forcing producing up to 10 cm of additional sea level rise by 2100 relative to stabilization forcing.
10

Wu, S., L. J. Mickley, J. O. Kaplan, and D. J. Jacob. "Impacts of changes in land use and land cover on atmospheric chemistry and air quality over the 21st century." Atmospheric Chemistry and Physics 12, no. 3 (February 14, 2012): 1597–609. http://dx.doi.org/10.5194/acp-12-1597-2012.

Abstract:
Abstract. The effects of future land use and land cover change on the chemical composition of the atmosphere and air quality are largely unknown. To investigate the potential effects associated with future changes in vegetation driven by atmospheric CO2 concentrations, climate, and anthropogenic land use over the 21st century, we performed a series of model experiments combining a general circulation model with a dynamic global vegetation model and an atmospheric chemical-transport model. Our results indicate that climate- and CO2-induced changes in vegetation composition and density between 2100 and 2000 could lead to decreases in summer afternoon surface ozone of up to 10 ppb over large areas of the northern mid-latitudes. This is largely driven by the substantial increases in ozone dry deposition associated with increases in vegetation density in a warmer climate with higher atmospheric CO2 abundance. Climate-driven vegetation changes over the period 2000–2100 lead to general increases in isoprene emissions, globally by 15% in 2050 and 36% in 2100. These increases in isoprene emissions result in decreases in surface ozone concentrations where the NOx levels are low, such as in remote tropical rainforests. However, over polluted regions, such as the northeastern United States, ozone concentrations are calculated to increase with higher isoprene emissions in the future. Increases in biogenic emissions also lead to higher concentrations of secondary organic aerosols, which increase globally by 10% in 2050 and 20% in 2100. Summertime surface concentrations of secondary organic aerosols are calculated to increase by up to 1 μg m−3 and double for large areas in Eurasia over the period of 2000–2100. When we use a scenario of future anthropogenic land use change, we find less increase in global isoprene emissions due to replacement of higher-emitting forests by lower-emitting cropland. 
The global atmospheric burden of secondary organic aerosols changes little by 2100 when we account for future land use change, but both secondary organic aerosols and ozone show large regional changes at the surface.

Dissertations / Theses on the topic "WE 2100":

1

Biswas, Abin. "A quantitative analysis of the optical and material properties of metaphase spindles." Doctoral thesis, Humboldt-Universität zu Berlin, 2020. http://dx.doi.org/10.18452/21836.

Abstract:
The metaphase spindle is a self-organising molecular machine that performs the critical function of segregating the genome equally during cell division. Spindle length and shape are emergent properties brought about by complex networks of interactions between molecules. Although significant progress has been made in understanding the individual molecular players influencing its length and shape, we have only recently started exploring the links between spindle morphology, dynamics, and material properties. A thorough analysis of spindle material properties is essential if we are to comprehend how such a dynamic structure responds to forces and maintains its steady-state length and shape. In this work, I first quantitatively investigated the role of two molecular force generators, Kinesin-5 and Dynein, in regulating Xenopus egg extract spindle shape. Perturbing their activity altered spindle morphology without impacting total microtubule mass. To physically perturb spindle shape, an Optical Stretcher (OS) setup was developed. Although the OS could deform vesicles in extracts, force could not be exerted on spindles. Investigating the structure's refractive index using Optical Diffraction Tomography (ODT) revealed that there was no difference between the spindle and cytoplasm. Correlative fluorescence and ODT imaging revealed how material properties varied spatially within different biomolecules. Additionally, spindle mass density and the microtubule density were correlated. The total dry mass of the spindle scaled with length while overall density remained constant. Interestingly, spindles in HeLa cells were denser than the cytoplasm. Finally, perturbing microtubule density uncovered how total tubulin concentration regulated spindle size, overall mass, and material properties.
Overall, this study provides a fundamental characterisation of the spindle’s physical properties and helps illuminate links between the biochemistry and biophysics of an active form of soft matter.
2

Dunwell, Lara Dalene. "We make fiction because we are fiction : authorities displaced in the novels of Russell Hoban." Master's thesis, University of Cape Town, 1995. http://hdl.handle.net/11427/21400.

Abstract:
Russell Hoban, born in Pennsylvania in 1925, is the author of fifty children's books and eight novels. This thesis provides a critical reading of his novels Kleinzeit (1974), The Medusa Frequency (1987), Riddley Walker (1980) and Pilgermann (1983). The thesis argues that the alienation of the protagonist from his society -- a theme common to the novels above -- is the result of the operation of the Derridean process of displacement. Hoban's novels work deconstructively to undermine binary oppositions (such as "reality" versus "fantasy"). I argue that the novels aim to recuperate the marginal by displacing the centre. In Kleinzeit and The Medusa Frequency, reality itself is figured as an absent centre. Through a discussion of magical realism, I show how Hoban questions the idea of a "consensus reality". I argue that by denying authority to the authors in these texts, Hoban privileges the uncertain authority of language itself. Using Derrida's concept of différance, I show that language in Kleinzeit is figured as an endless deferral of meaning. In Chapter II, I turn to an analysis of the invented post-atomic language of Riddley Walker, and examine how the neologisms and futuristic orthography of the text contribute towards significant wordplay. I argue that Riddley's attempts to read his culture's past offer a critique of the contemporary reader's assumptions, both about her present and about reading itself. I rely on Mircea Eliade's The Myth of the Eternal Return (1965) in discussing the nature of myth-making in Riddley Walker. In the final chapter, I discuss in detail the mechanism of displacement in Pilgermann. By examining the role of the grotesque in the novel, I argue that Pilgermann can be read hymeneutically. Derrida's figure of the hymen becomes the emblem of marginalisation. Using the example of the mode of the grotesque (which is prominent in the novel), I argue that the marginal is always already present in the very centre which would expel it.
Pilgermann is read as an attempt to recuperate the margin in spite of "the confusion between the present and the non-present" (Derrida, 1984: 212) which is the hymen. Finally, I conclude that Hoban's works, while focussing on displacement, unwittingly displace women, by figuring them as absences whose existence is primarily metaphorical.
3

McMahon, Debbie L. "Hispanic assimilation: Are we there yet?" Waco, Tex.: Baylor University, 2008. http://hdl.handle.net/2104/5175.

4

Ciolli, Mattioli Camilla. "Post-transcriptional mechanisms contributing to RNA and protein localization: study of local translation and alternative 3′UTRs in induced neurons." Doctoral thesis, Humboldt-Universität zu Berlin, 2019. http://dx.doi.org/10.18452/20702.

Abstract:
Asymmetric distribution of mRNA and proteins inside a cell defines polarity, which allows tight regulation of gene expression in space and time. In this thesis I investigated how asymmetric distribution characterizes the somatic and neuritic compartments of induced neurons, in terms of transcriptome and translatome. Spatial ribosome profiling analysis revealed that half of the local proteome is defined by mRNA localization and local translation. These are processes accomplished by the synergistic activity of trans- and cis-acting elements. I focused on MOV10 as a trans-acting element, and on alternative 3′UTRs as cis-elements, to investigate their role in asymmetry. MOV10 is an RNA helicase which participates in many aspects of RNA metabolism. With RIP and PAR-CLIP I showed that MOV10 targets are localized to the neurites, consistent with MOV10's neuritic localization, and that MOV10 might be involved in translational repression. Indeed, among MOV10 protein interactors, I identified several proteins involved in translational repression, i.e. AGO2, FMR1, and TRIM71. On the side of cis-elements, I performed mapping of alternative 3′UTRs. This analysis identified several genes expressing differentially localized 3′UTR isoforms. In particular, I focused on Cdc42. I showed that the two isoforms of Cdc42 are differentially localized at the mRNA level, and that the 3′UTR is the driver of mRNA and protein localization. Moreover, I identified several RBPs that might be involved in Cdc42 localization. This analysis points to usage of alternative 3′UTR isoforms as a novel mechanism to provide for differential localization of functionally diverse alternative protein isoforms.
5

Considine, Laura. "What we talk about when we talk about trust : nuclear weapons in the Nixon and Reagan Administrations." Thesis, Aberystwyth University, 2014. http://hdl.handle.net/2160/97656a84-3295-499f-8002-ca0a28379a13.

Abstract:
This thesis asks what it is that we are doing when we talk about trust in international politics. It begins by reviewing the recent and growing body of literature on trust and International Relations, locating this more nascent collection of literature within a wider, established body of social science work on trust in disciplines such as psychology, political science, business and management studies. It claims that an implicit but ubiquitous assumption about how words gain meaning underpins the literature, and that this assumption precedes and limits the range of possibilities for the form of the subsequent research. The thesis challenges this way of understanding by deploying Ludwig Wittgenstein's Philosophical Investigations. It then undertakes an alternative study of trust that acts as an ostensive challenge to the literature and thus shows by example how accepting different sites and processes of meaning can add to our understanding of words such as trust in International Relations. It accomplishes this through a 'grammatical investigation' of the uses of trust by President Richard M. Nixon and President Ronald Reagan regarding nuclear weapons and nuclear arms control with the Soviet Union. Using these examples, the thesis then suggests several alternative ways of talking about trust that would provide avenues for further research while avoiding the semantic and methodological difficulties of the dominant social science approaches. The contribution of this work is to challenge prevailing assumptions about words and meaning that exist within the literature and in so doing, to open up a path for alternative ways to talk about words like trust in International Relations.
6

Speegle, Jonathan. "We believe in the Communion of Saints: a proposed Protestant reclamation of the doctrine." Waco, Tex.: Baylor University, 2006. http://hdl.handle.net/2104/4830.

7

Woodward, Peter. "From boring to divine encounter: Can we preach without the violence of certitude and hegemony?" Institut für Praktische Theologie, 2019. https://ul.qucosa.de/id/qucosa%3A36324.

Abstract:
“Preaching is boring” is the expectation of most who sit in the pews Sunday after Sunday. The dominant paradigm for that preaching is “preaching the gospel” as the truth that listeners need to hear: a message delivered with certitude and directiveness. This presentation of the Good News of Jesus Christ bears the marks of hegemony and violence visited on both listeners and the preacher. This paper explores an approach to preaching which eschews certitude and hegemony by providing a reflective and invitational alternative, organised around five categories: What is God doing? What is the aim and intention of preaching? Preaching and the preacher; Preparation and delivery; and Evaluation.
8

Sills, Rebekah S. "'We Shall Not Fail Freedom': Oveta Culp Hobby's role in the formation and implementation of the Women's Army Corps." Waco, Tex.: Baylor University, 2007. http://hdl.handle.net/2104/5089.

9

Mercado, Auf der Maur Adhemar. ""We do not play music for the applause!" : explorations of Andean autochthonous music as worlding practices in urban Bolivia." Thesis, Aberystwyth University, 2017. http://hdl.handle.net/2160/11fbc4cd-5932-4515-b2ea-570627e6c671.

Abstract:
This thesis is an exploration of Andean autochthonous music as a practice of decolonisation in the urban context of Bolivia. It follows the cultural, social, religious and political activities of different music groups who play autochthonous music in the city and the surrounding area of Cochabamba and La Paz. Following their stories it contrasts and contextualises these groups’ journeys within the wider socio–political processes of Bolivian society. In this sense, my research follows Anders Burman’s call for a move away from the ‘critical intellectual theorizing’ of decolonisation and towards a more practice-oriented approach to decolonisation. Music in this context is understood as a complex, interdependent and inherently situated practice that is in constant process of creating worlds. The thesis dwells on the implications for academic knowledge production when we take seriously the claims, practices and experiences of those people we engage with in our research. The thesis thus explores the ramifications and importance of the claim made by autochthonous musicians that music is more than just an artistic performance, an aesthetic endeavour for applause or for political vindications. Doing so, the thesis problematizes the questions of authenticity, folklorisation and politics of recognition more broadly that are generally associated with Andean autochthonous music. The thesis seeks to take the experience and ideas of urban autochthonous musicians seriously by engaging with those worlds, and spiritual hinterlands that are invoked through Andean autochthonous music. The question then is not whether music can be an instrument of decolonisation. Rather the thesis asks: under what circumstances does music contribute to decolonisation and what kind of decolonisation processes does music bring about? 
In this sense, the project explores the possibilities and limitations of the discourses and activities of urban autochthonous music groups wherein Quechua and Aymara political vindications and the empowerment of Andean ways of knowing and being become possible. Through the example of urban autochthonous music groups, the thesis engages with the idea of, and the conditions of possibility necessary for, social and political change. I suggest looking at music, and autochthonous music in particular, as a site of many worldings, where the pluriverse gets enacted and performed. The thesis aims to further our understanding of how decolonial, post-colonial or critical theoretical frameworks continue to perpetuate colonial structures and contribute to the further folklorisation and cooptation of indigenous and other marginalised cultures and lived experiences rather than their liberation.
10

"We Love To Hate Help Desk." University of Technology, Sydney. School of Computing Sciences, 2000. http://hdl.handle.net/2100/237.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Customer satisfaction with the Information Technology Help Desk is the focus of this study. Technology in the workplace has increased exponentially, so customers are more reliant on the Help Desk than ever before. This has raised the importance of the role that the Help Desk plays in the functioning of an organisation. The fundamental aim of this study is to answer the questions below: 1. Is dissatisfaction truly present for individual problems, or is it a generalisation or "urban myth"? 2. Which of the five hypotheses are the most significant in causing dissatisfaction amongst customers? The five hypotheses focus on the areas of Communication, Solutions, Service, Knowledge (up-to-date), and Morale. A computer-based survey was used to query the customers, with questions linked back to the hypotheses. Each customer was given the opportunity to make an optional comment, to discover any sensitive issues the survey did not address. The average "overall satisfaction" rating for the survey suggested the general population is more satisfied than dissatisfied with the services of the Help Desk. From the study I was able to conclude that dissatisfaction is present for individual problems, but dissatisfied customers account for only 8% of the surveyed population. Having established that customer dissatisfaction is present, the next step was to determine the nature of the problem, to provide useful information for reducing it. This was done by investigating the surveys by problem category. The results indicated that customer dissatisfaction was most prevalent in calls concerning changes made to PCs and server interruptions; the Help Desk therefore needs to re-evaluate its processes for handling problems of this nature. In contrast, customers were most satisfied with assistance for problems relating to desktop software and hardware. Therefore dissatisfaction is not an "urban myth".
Of all the five hypotheses, Help Desk morale stood out as producing more satisfaction than any of the others, including "overall satisfaction", and it proved to be significantly different in nature from the other four. The morale of the Help Desk team is therefore a fundamental ingredient for brewing a successful service: get this wrong and all aspects of the team and the service will decline. The most important influence on "overall satisfaction" was "satisfaction with keeping up with technological change", and the least important factor was "satisfaction with ability to predict problems through good communication". This indicates that an up-to-date Help Desk is more likely to have satisfied customers.

Books on the topic "WE 2100":

1

Hawkins, John. Transpluto or Should We Call Him Bacchus the Ruler of Taurus?/Includes: Sign, Houses, Aspects, Midpoints and Ephemeris 1750-2100. Amer Federation of Astrologers Inc, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

organizers, Women's March. Together we rise: Behind the scenes of the protest heard around the world. 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Markley, Robert. Kim Stanley Robinson. University of Illinois Press, 2019. http://dx.doi.org/10.5622/illinois/9780252042751.001.0001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Kim Stanley Robinson is the first full-length study of one of the most widely read and influential science-fiction writers of our era. In discussing eighteen of his novels published since 1984 and a selection of his short fiction, this study explores the significance of his work in reshaping contemporary literature. Three of the chapters are devoted to Robinson’s major trilogies: the Orange County trilogy (1984-90), the Mars trilogy (1992-96), and the Science in the Capital trilogy (2004-07). Two other chapters consider his groundbreaking alternative histories, including “The Lucky Strike” (1984), The Years of Rice and Salt (2002), and Shaman (2014), and his future histories set among colonies in the solar system, notably Galileo’s Dream (2009) and 2312 (2012). The concluding chapter examines Robinson’s most recent novels, Aurora (2015) and New York 2140 (2017). In interviews, Robinson describes his fiction as weaving together, in various combinations, Marxism, ecology, and Buddhist thought, and all of his novels explore how we might imagine forms of utopian political action. His novels—from the Mars trilogy to New York 2140—offer a range of possible futures that chart humankind’s uneven progress, often over centuries, toward the greening of science, technology, economics, and politics. Robinson filters our knowledge of the past and our imagination of possible futures through two superimposed lenses: the ecological fate of the Earth (or other planets) and the far-reaching consequences of moral, political, and socioeconomic decisions of individuals, often scientists and artists, caught up in world or solar-systemic events. In this respect, his fiction charts a collective struggle to think beyond the contradictions of historical existence, and beyond our locations in time, culture, and geography.
4

(Editor), Alan J. Avery-Peck, and Jacob Neusner (Editor), eds. Where We Stand: Judaism in Late Antiquity (Handbook of Oriental Studies/Handbuch Der Orientalistik). Brill Academic Publishers, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

(Editor), Alan J. Avery-Peck, and Jacob Neusner (Editor), eds. Where We Stand: The Special Problem of the Synagogue (Handbook of Oriental Studies/Handbuch Der Orientalistik). Brill Academic Publishers, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Bagno, Raoni Barros, Matheus Luiz Pontelo de Souza, Lin Chih Cheng, Adriana Ferreira de Faria, Jonathan Simões Freitas, Leonel Del Rey de Melo Filho, Elimar Pires Vasconcellos, et al. Perspectivas sobre o empreendedorismo tecnológico: Da ação empreendedora aos programas de apoio e dinâmica do ecossistema. Brazil Publishing, 2020. http://dx.doi.org/10.31012/978-65-5861-210-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The importance of technology entrepreneurship stems from its growing relevance for economic and social development. Managers, whether in companies, tech parks, or startup-related institutions, entrepreneurs, students, researchers, public policymakers, and other agents have become involved with the process of creating and developing new technology-based ventures. Such a dynamic is, however, complex and full of uncertainties. Further, the current literature usually focuses on a single level of analysis, bringing to the arena debates of interest to one or a few of these agents. This book offers a highly transversal approach, touching the subject from the perspective of diverse agents and stimulating an integrated and non-simplistic discussion, whether for students or experienced practitioners. The book marks the 25-year trajectory of the Núcleo de Tecnologia da Qualidade e da Inovação (Technology Center for Quality and Innovation - NTQI / UFMG). We bring the contribution of more than 20 NTQI collaborators, all of them specialists in technology entrepreneurship, who made every effort to combine academic rigor with the objectivity and didactics needed to properly support practitioners, students, and researchers in their involvement with this intricate and multifaceted phenomenon. The 18 chapters of this work therefore offer diverse perspectives on technology entrepreneurship: from entrepreneurial action and thought, and processes and methods for new venture creation, through to entrepreneurial ecosystems, tech parks, and the perspective of incumbent companies. Whether for students or practitioners, it is a reference for an innovative and integrated discussion.
7

(Editor), Jacob Neusner, and Alan J. Avery-Peck (Editor), eds. Judaism in Late Antiquity: Where We Stand, Issues and Debates in Ancient Judaism (Handbook of Oriental Studies/Handbuch Der Orientalistik). Brill Academic Publishers, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

(Editor), Jacob Neusner, and Alan J. Avery-Peck (Editor), eds. Judaism in Late Antiquity: Where We Stand : Issues and Debates in Ancient Judaism (Handbook of Oriental Studies/Handbuch Der Orientalistik). Brill Academic Publishers, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "WE 2100":

1

Bartholy, J., R. Pongrácz, Gy Gelybó, and A. Kern. "What Climate Can We Expect in Central/Eastern Europe by 2071–2100?" In Bioclimatology and Natural Hazards, 3–14. Dordrecht: Springer Netherlands, 2009. http://dx.doi.org/10.1007/978-1-4020-8876-6_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Landis, John D., and Michael Reilly. "How We Will Grow: Baseline Projections of California’s Urban Footprint Through the Year 2100." In Integrated Land Use and Environmental Models, 55–98. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-662-05109-2_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Lüderitz, B. "Implantable Atrial Defibrillator: Where Are We and Where Are We Going?" In Cardiac Arrhythmias 2001, 479–84. Milano: Springer Milan, 2002. http://dx.doi.org/10.1007/978-88-470-2103-7_75.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Gottlieb, Gerald L. "What do We Plan or Control When We Perform a Voluntary Movement?" In Biomechanics and Neural Control of Posture and Movement, 354–62. New York, NY: Springer New York, 2000. http://dx.doi.org/10.1007/978-1-4612-2104-3_28.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Phillips, R. L., S. M. Kaeppler, and V. M. Peschke. "Do We Understand Somaclonal Variation?" In Progress in Plant Cellular and Molecular Biology, 131–41. Dordrecht: Springer Netherlands, 1990. http://dx.doi.org/10.1007/978-94-009-2103-0_19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Zirkin, Barry R., and Bernard Robaire. "Unlocking the Mysteries of Male Reproductive Function: How Far Have We Come and Where Are We Going?" In The Testis, 3–9. New York, NY: Springer New York, 2000. http://dx.doi.org/10.1007/978-1-4612-2106-7_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Mandal, Bappaditya, and Nizar Ouarti. "Spontaneous Versus Posed Smiles—Can We Tell the Difference?" In Advances in Intelligent Systems and Computing, 261–71. Singapore: Springer Singapore, 2016. http://dx.doi.org/10.1007/978-981-10-2107-7_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Lunati, M., G. Magenta, G. Cattafi, R. Vecchi, M. Paolucci, and T. Di Camillo. "Ventricular Resynchronization: What May We Expect from Technological Advances?" In Cardiac Arrhythmias 2001, 181–84. Milano: Springer Milan, 2002. http://dx.doi.org/10.1007/978-88-470-2103-7_28.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Vietri, M. "What Have We learned about Gamma Ray Bursts from Afterglows?" In Recent Developments in General Relativity, Genoa 2000, 261–75. Milano: Springer Milan, 2002. http://dx.doi.org/10.1007/978-88-470-2101-3_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bergfeldt, L. "Syncope or Seizures: What Have We Learned from Electrocardiographic Monitoring?" In Cardiac Arrhythmias 2001, 19–21. Milano: Springer Milan, 2002. http://dx.doi.org/10.1007/978-88-470-2103-7_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "WE 2100":

1

van de Ketterij, R. G. "Emissions reduction at the Netherlands ministry of defence: potential, possibilities and impact." In 14th International Naval Engineering Conference and Exhibition. IMarEST, 2018. http://dx.doi.org/10.24868/issn.2515-818x.2018.065.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
To limit the global temperature rise to 1.5°C in 2100 compared to the mid-nineteenth century, net post-2015 emissions should amount to at most 200 Gigaton Carbon (GTC), or 734 GT of CO2 emissions [Millar, 2017]. The annual world CO2 emission rate was 36.2 GT, and the CO2_eq emission rate (the combined impact of all emissions on global warming, translated to the equivalent impact of CO2 emissions) was 49 GT in 2016 [Carbonatlas, 2017]. Currently only a 685 GT CO2 emission quota is left, or 14 years of emitting at the current emission rate. Estimates vary widely: the IPCC estimates we only have a 485 GT CO2 emission quota left, while the most pessimistic estimates talk about only 200 GT CO2. With this in mind, the ambition of the Dutch Operational Energy Strategy [Schulten 2017] to reduce the dependency on fossil fuels (and hence CO2 emissions) by 20% by 2030 is not sufficient to meet the objectives of the Treaty of Paris. We have to choose whether to keep this ambition, define much stricter ambitions, or invest differently to keep global warming within acceptable limits. This paper discusses CO2 emissions and their distribution, both across sectors and geographically, worldwide. Next, the paper discusses the options we have in the short and medium term to reduce emissions, and their impact on emission reduction.
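The "14 years left" figure in the abstract follows directly from dividing the remaining quota by the annual CO2-equivalent rate. A quick sanity check of that arithmetic, using only the figures quoted in the abstract (the constant-emission-rate assumption is of course a simplification):

```python
# All figures are taken from the abstract above; this is an
# arithmetic illustration, not a climate projection.

ANNUAL_CO2EQ_GT = 49.0        # 2016 CO2-equivalent emission rate, GT/year
BUDGETS_GT = {                # remaining post-2015 CO2 budgets cited
    "Millar (2017)": 685.0,
    "IPCC": 485.0,
    "most pessimistic": 200.0,
}

def years_left(budget_gt: float, rate_gt_per_year: float) -> float:
    """Years until the budget is exhausted at a constant emission rate."""
    return budget_gt / rate_gt_per_year

for name, budget in BUDGETS_GT.items():
    print(f"{name}: ~{years_left(budget, ANNUAL_CO2EQ_GT):.0f} years left")
```

At the cited 49 GT/year rate this reproduces the abstract's ~14 years for the 685 GT budget, and gives roughly 10 and 4 years for the IPCC and pessimistic budgets respectively.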
2

Goldstein, Neil, Steven Adler-Golden, Xuemin Jin, Jamine Lee, Steven Richtsmeier, and Carlos A. Arana. "Temperature and Temperature Profile Measurements in the Combustor Flowpath Using Structured Emission Thermography." In ASME Turbo Expo 2003, collocated with the 2003 International Joint Power Generation Conference. ASMEDC, 2003. http://dx.doi.org/10.1115/gt2003-38695.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Structured Emission Thermometry (SET) is a new optical technique for deriving temperature and species concentrations in high-temperature regions of a combustor flow-path from measurements of water and soot emission spectra in the 1 micron region. A prototype instrument has been built for gas- and liquid-fuel combustors with temperatures ranging from 1400 to 2500 K (2100 to 4000 °F). This instrument can be used for test-stand instrumentation and routine monitoring in fixed installations. With further evolution, the technique may be suitable for an aircraft engine control system. The emissions are measured along narrow lines of sight using small, simple passive fiber probes placed at a short standoff distance from the hot gas flow-path. With proper placement of multiple probes, data may be collected over overlapping lines of sight and inverted to produce a low-resolution spatial map of the temperature and emitter density in the flow-path. In this work, we describe a series of combustor measurements, including cross-validation measurements in a well-controlled laboratory flame and measurements in combustor test stands. Both line-of-sight measurements and reconstructed temperature profiles are presented and discussed.
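The quoted Fahrenheit figures are rounded versions of the kelvin range; the conversion is a one-liner worth spelling out:

```python
def kelvin_to_fahrenheit(k: float) -> float:
    """Convert a temperature from kelvin to degrees Fahrenheit."""
    return (k - 273.15) * 9.0 / 5.0 + 32.0

# The abstract's 1400-2500 K range corresponds to about 2060-4040 F,
# rounded in the abstract to 2100-4000 F:
for k in (1400.0, 2500.0):
    print(f"{k:.0f} K = {kelvin_to_fahrenheit(k):.0f} F")
```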
3

Rodrigues Minucci, Frederico, Auteliano Antunes dos Santos, and Rafael A. Lima e Silva. "Comparison of Multiaxial Fatigue Criteria to Evaluate the Life of Crankshafts." In ASME 2010 International Mechanical Engineering Congress and Exposition. ASMEDC, 2010. http://dx.doi.org/10.1115/imece2010-39018.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Crankshafts are subject to multiaxial fatigue because of the complex stress distribution caused by the loads when this engine component is in service. So far there is no universally accepted approach to multiaxial fatigue; each theory is used for a particular application. Among the many high-cycle fatigue criteria, two have been largely accepted for use with mechanical components: one based on the critical plane and one based on the stress invariant. This work presents a comparison of the safety factors calculated with those criteria for a conventional crankshaft. As critical plane criteria we chose those known as Matake, McDiarmid, and Dang Van, each with a different approach. Likewise, we chose the Sines, Crossland, and Kakuna-Kawada criteria as the stress invariant approaches. First, the work describes the basic concepts and the fatigue criteria listed. Following that, the loads over the crankshaft are estimated from the loads flowing through the conrods and from the dynamics of the movement for critical points of the crankshaft. The third step was to apply the criteria to evaluate the safety when the component is working at 1700, 2100, and 3050 rpm, the speeds corresponding to the maximum-torque, maximum-power, and maximum-speed conditions. The results showed that the engine analyzed did not fail, that the critical plane criteria are more conservative, and that the safety factor is not smaller than 1.77.
4

Tickle, Evelyn. "Oyster Hack." In 2018 ACSA International Conference. ACSA Press, 2018. http://dx.doi.org/10.35483/acsa.intl.2018.57.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
There is a state of emergency in the USA: catastrophic coastal erosion, sea levels rising at the rate of one-eighth of an inch per year, and poor water quality. Oysters can help. Oysters filter the water, removing toxins. Oyster reefs are living infrastructures that protect coastlines from storms and tidal surges. But many of the world’s existing oyster reefs are functionally impaired. The Chesapeake Bay is dying: untreated chemical run-off and human waste is creating ‘Dead Zones’ where there is no oxygen to support marine life. Much of Hurricane Sandy’s damage to New York City could have been prevented; in the early 1800s the Harbor was lined with living oyster reefs. Now these are dead or dying, fragile and vulnerable. Miami is flooded on a regular basis, the Miami Herald reports. Our oyster reefs must be revived or rebuilt; they will help. Walls are not the answer. 14% of US coastal cities already have massive sea-walls. National Geographic reports that by 2100 one-third of our coastal cities will be protected by walls that cost billions of dollars and will not provide protection from the most severe storms. I believe in the power of the oyster. The oyster is an engineer: its reefs and shells work together as a “system of systems” to protect our waters and coastlines. Without them we are sunk, literally, no matter how many engineered systems we humans try to substitute and pay billions of dollars to implement.
5

Gramstad, Odin, Elzbieta Bitner-Gregersen, and Erik Vanem. "Projected Changes in the Occurrence of Extreme and Rogue Waves in Future Climate in the North Atlantic." In ASME 2017 36th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/omae2017-61795.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
We investigate the future wave climate in the North Atlantic with respect to extreme events, as well as wave parameters that have previously not been considered in much detail in the perspective of wave climate change, such as those associated with the occurrence of rogue waves. A number of future wave projections are obtained by running the third-generation wave model WAM with wind input derived from several global circulation models. In each case the wave model has been run for the 30-year historical period 1971–2000 and the future period 2071–2100, assuming the two different future climate scenarios RCP 4.5 and RCP 8.5. The wave model runs have been carried out by the Norwegian Meteorological Institute in Bergen, and the climate model results are taken from the Coupled Model Intercomparison Project Phase 5 (CMIP5). In addition to standard wave parameters such as significant wave height and peak period, the wave model runs provided the full two-dimensional wave spectrum, which has enabled the study of a larger set of wave parameters. The focus of the present study is the projected future change in the occurrence of extreme sea states and extreme and rogue waves. The investigations are limited to parameters related to this in a few selected locations in the North Atlantic. Our results show that there are large uncertainties in many of the parameters considered in this study, and in many cases the different climate models and different model scenarios provide contradicting results with respect to the predicted change from past to future climate. There are, however, some situations for which a clearer tendency is observed.
6

Smit, C., I. Varekamp, F. Rosendaal, A. Bröcker-Vriends, T. Suurmeijer, and E. Briët. "THE BENEFITS OF MODERN SUBSTITUTION THERAPY IN HEMOPHILIA." In XIth International Congress on Thrombosis and Haemostasis. Schattauer GmbH, 1987. http://dx.doi.org/10.1055/s-0038-1644026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Coagulation factor preparations became available for the treatment of hemophilia about twenty years ago, followed by the introduction of prophylactic therapy and home-treatment. The purpose of our longitudinal study was to quantify the impact of these treatment modalities on the medical and social situation of hemophiliacs. We carried out three mail surveys (1972, 1978 and 1985) among Dutch hemophiliacs. In 1985 we sent questionnaires to 1162 of the estimated total of 1300 patients, with a response of 81%. Eighty-six percent of the respondents had hemophilia A and 14% hemophilia B; 41% had severe hemophilia (<1%), 19% moderately severe (1-5%) and 40% mild hemophilia (>5%). Growth of prophylactic therapy and home-treatment for severe hemophilia was … The mean age increased from 21 yrs in 1972 (n=435) to 29 yrs in 1985 (n=935) (general male population: 34 yrs). The number of manifest bleedings decreased from 25 in 1972 to 15 in 1985, with a corresponding decrease in the number of transfusions for acute bleedings. Hospitalization per 100 patients with severe hemophilia decreased from more than 2100 days in 1972 to 440 days in 1985. Non-attendance at school caused by hemophilia dropped from 6 to 2 weeks per year, and sick leave from work from 35 to 15 days per year, so that it now equals sick leave among the general male population. Unemployment figures for hemophiliacs were similar to those for the general population, but disability figures are still higher. Our study shows in a quantitative way that the benefits of modern hemophilia treatment are impressive and that its costs are more than justified.
7

Sapač, Klaudija, Simon Rusjan, Nejc Bezak, and Mojca Šraj. "ANALYSIS OF LOW-FLOW CONDITIONS IN A HETEROGENEOUS KARST CATCHMENT AS A BASIS FOR FUTURE PLANNING OF WATER RESOURCE MANAGEMENT." In XXVII Conference of the Danubian Countries on Hydrological Forecasting and Hydrological Bases of Water Management. Nika-Tsentr, 2020. http://dx.doi.org/10.15407/uhmi.conference.01.20.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Understanding and prediction of low-flow conditions are fundamental for efficient water resources planning and management, as well as for the identification of water-related environmental problems. This is especially problematic in view of water use in economic sectors (e.g., tourism) where water-use peaks usually coincide with low-flow conditions in the summer. In our study, we evaluated various low-flow characteristics at 11 water stations in the non-homogeneous Ljubljanica river catchment in Slovenia. Approximately 90% of the catchment is covered by karst with a diverse subsurface containing numerous karst caves; the streams in the remaining part of the catchment have mainly torrential characteristics. Based on daily discharge data, we calculated and analyzed the values of 5 low-flow indices. In addition, by analyzing hydrograph recession curves, recession constants were determined to assess the catchment’s responsiveness to the absence of precipitation. By using various calculation criteria, we analyzed the influence of individual criteria on the values of the low-flow recession constants. Recession curves are widely used in different fields of hydrology, for example in hydrological models, baseflow studies, and low-flow forecasting, and in assessing groundwater storage, which is crucial for assessing water availability when planning water resource management. Moreover, we also investigated the possible impact of projected climate change (scenario RCP4.5) on low-flow conditions in two sub-catchments of the Ljubljanica river catchment. For the evaluation we used the lumped conceptual hydrological model implemented in the R package airGR. For the periods 2011-2040, 2041-2070, and 2071-2100, low-flow conditions were evaluated based on flow duration curves compared with the 1981-2010 period. The lowest discharges at all water stations in the Ljubljanica river catchment occur mostly during the summer months.
Our results for the future show that we can expect a decrease in the lowest low-flows in the first two 30-year periods, while in the last one low-flows could increase by approximately 15%. However, the uncertainty and variability of the results are very high and should be taken into account when interpreting and using them. This study demonstrates that the evaluation of several low-flow characteristics is needed for a comprehensive and holistic overview of low-flow dynamics. In non-homogeneous catchments with a strong karstic influence, the hydrogeological conditions of the rivers should also be taken into account in order to interpret the results of low-flow analyses adequately. This proved to be important even in the case of neighboring water stations.
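The abstract does not specify the exact recession-analysis procedure used; as an illustration of the general idea behind recession constants, the classical exponential model Q(t) = Q0 * k^t can be fitted to a falling limb of daily discharges by linear regression on log-discharge. The function name and the synthetic series below are illustrative, not from the study:

```python
import math

def recession_constant(discharges):
    """
    Fit the classical exponential recession model Q(t) = Q0 * k**t
    to a falling-limb discharge series (one value per day) via a
    least-squares line through log(Q). Returns k, with 0 < k < 1;
    values near 1 indicate slow recession, i.e. large storage.
    """
    logs = [math.log(q) for q in discharges]
    n = len(logs)
    t_mean = (n - 1) / 2.0
    log_mean = sum(logs) / n
    # Slope of log(Q) vs. time is log(k).
    slope = (sum((t - t_mean) * (lq - log_mean) for t, lq in enumerate(logs))
             / sum((t - t_mean) ** 2 for t in range(n)))
    return math.exp(slope)

# Synthetic falling limb generated with k = 0.95:
series = [10.0 * 0.95 ** t for t in range(10)]
print(round(recession_constant(series), 3))  # 0.95
```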
8

Lee, Anbae, Seungwoo Jin, Younghwan Joo, Ilsik Jang, Jaechun Cha, Kichel Jeong, Hyosang Kang, et al. "How can we improve sub 40 nm Transistor properties by using Ion implantation." In ION IMPLANTATION TECHNOLOGY 2101: 18th International Conference on Ion Implantation Technology IIT 2010. AIP, 2011. http://dx.doi.org/10.1063/1.3548426.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Hijikata, Takatoshi, and Tadafumi Koyama. "Transport of High-Temperature Molten Salt Slurry for Pyro-Reprocessing." In 17th International Conference on Nuclear Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/icone17-75379.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Pyro-reprocessing is one of the most promising technologies for an advanced fuel cycle, with favorable economic potential and intrinsic proliferation resistance. The development of transport technology for molten salt is a key issue in the industrialization of pyro-reprocessing. For pure molten LiCl-KCl eutectic salt at approximately 773 K, we have already reported successful transport results using gravity and a centrifugal pump. However, molten salt in an electrorefiner mixes with insoluble fines when spent fuel is dissolved in the porous anode basket. These insoluble fines consist of noble-metal fission products such as Pd, Ru, Mo, and Zr. There have been very few transport studies of a molten salt slurry (a mixture of metal fines and molten salt). Hence, transport experiments on a molten salt slurry were carried out to investigate the behavior of the slurry in a tube. The apparatus consisted of a supply tank, a 10° inclined transport tube (10 mm inner diameter), a valve, a filter, and a recovery tank. Stainless steel (SS) fines with diameters from 53 to 415 μm were used. To disperse these fines homogeneously, the molten salt and fines were stirred in the supply tank by an impeller at speeds from 1200 to 2100 rpm. The molten salt slurry, containing 0.2 to 0.4 vol.% SS fines, was transported from the supply tank to the recovery tank through the transport tube. In the recovery tank, the fines were separated from the molten salt by the filter to measure the transport behavior of the molten salt and SS fines. When the velocity of the slurry was 0.02 m/s, only 1% of the fines were transported to the recovery tank. On the other hand, most of the fines were transported when the velocity of the slurry was more than 0.6 m/s. Consequently, the molten salt slurry can be transported when the velocity is more than 0.6 m/s.
10

Chrane, Colton, and Sathish Kumar. "An Examination of Tor Technology Based Anonymous Internet." In InSITE 2015: Informing Science + IT Education Conferences: USA. Informing Science Institute, 2015. http://dx.doi.org/10.28945/2190.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The concept of an anonymous Internet is gaining traction in the wake of recent NSA (National Security Agency) privacy allegations. Based on our initial findings, Tor appears to be a solution for this, allowing users to take control of the Internet and of their own privacy. Our study reveals that Tor’s power and future ability to thrive revolve around its expansion: Tor needs to grow in size, specifically in nodes as well as in software (Torrenting). The more servers and relays in the system, along with a few tweaks including a constant shuffling of servers, the more protected a user would be. We plan to put further research and development effort into Tor to ensure that it becomes a viable source of anonymous Internet access as well as a popular mainstream browser.

To the bibliography