
Books on the topic 'Detect the Difference'

Consult the top 50 books for your research on the topic 'Detect the Difference.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse books from a wide variety of disciplines and organise your bibliography correctly.

1

Grodzki, Erika. Using Lacuna Theory to Detect Cultural Differences in American and German Automotive Advertising (Kulturwissenschaftliche Werbeforschung). Germany: Peter Lang Publishing, 2002.

2

Brown, Andrew W., Tapan S. Mehta, and David B. Allison. Publication Bias in Science. Edited by Kathleen Hall Jamieson, Dan M. Kahan, and Dietram A. Scheufele. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780190497620.013.10.

Abstract:
When we rely on science to inform decisions about matters such as the environment, teaching strategies, economics, government, and medicine, evidence-based decision-making can only be as reliable as the totality of the science itself. We must avoid distortions of the scientific literature such as publication bias, which is an expected systematic difference between estimates of associations, causal effects, or other quantities of interest and the actual values of those quantities, caused by differences between the research that is published and the totality of research conducted. Publication bias occurs when the probability of publishing a result of a study is influenced by the result obtained. It appears to be common and can produce misleading conclusions about interventions, make effects appear greater than they are, lead to irreproducible research, and ultimately undermine the credibility of science in general. Methods to detect publication bias and steps to reduce it are discussed.
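
As a rough, hypothetical illustration of the mechanism this abstract defines (not material from the chapter), the short simulation below shows how publishing only statistically significant results inflates the average reported effect; all numbers are made up.

```python
# Illustrative sketch only (not from the chapter): simulate many small studies
# of the same true effect and compare the average estimate across all studies
# with the average across only the "published" (statistically significant) ones.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_effect = 0.2      # true mean difference between groups
n_per_group = 30       # per-group sample size in each simulated study
n_studies = 2000

all_estimates, published_estimates = [], []
for _ in range(n_studies):
    treatment = rng.normal(true_effect, 1.0, n_per_group)
    control = rng.normal(0.0, 1.0, n_per_group)
    estimate = treatment.mean() - control.mean()
    _, p_value = stats.ttest_ind(treatment, control)
    all_estimates.append(estimate)
    if p_value < 0.05:                 # crude publication filter
        published_estimates.append(estimate)

print(f"true effect:                {true_effect:.2f}")
print(f"mean over all studies:      {np.mean(all_estimates):.2f}")
print(f"mean over 'published' only: {np.mean(published_estimates):.2f}")
```
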
3

Schotte, Simone, Michael Danquah, Robert Darko Osei, and Kunal Sen. The labour market impact of COVID-19 lockdowns: Evidence from Ghana. 27th ed. UNU-WIDER, 2021. http://dx.doi.org/10.35188/unu-wider/2021/965-5.

Abstract:
In this paper, we provide causal evidence of the impact of stringent lockdown policies on labour market outcomes at both the extensive and intensive margins, using Ghana as a case study. We take advantage of a specific policy setting, in which strict stay-at-home orders were issued and enforced in two spatially delimited areas, bringing Ghana’s major metropolitan centres to a standstill, while in the rest of the country less stringent regulations were in place. Using a difference-in-differences design, we find that the three-week lockdown had a large and significant immediate negative impact on employment in the treated districts, particularly among workers in informal self-employment. While the gap in employment between the treated and control districts had narrowed four months after the lockdown was lifted, we detect a persistent nationwide impact on labour market outcomes, jeopardizing particularly the livelihoods of small business owners mainly operating in the informal economy.
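
The difference-in-differences design named here can be sketched generically; the snippet below fits a two-period, two-group specification on simulated data (the variable names and numbers are hypothetical, not the authors' data or exact model).

```python
# Generic two-period difference-in-differences on simulated data -- a sketch of
# the design named in the abstract, not the authors' specification or dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 4000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # 1 = district under lockdown
    "post": rng.integers(0, 2, n),      # 1 = observed after the lockdown
})
# Simulate a 15-point drop in employment probability only for treated x post.
p_employed = 0.8 - 0.15 * df["treated"] * df["post"]
df["employed"] = rng.binomial(1, p_employed)

# The coefficient on treated:post is the difference-in-differences estimate.
model = smf.ols("employed ~ treated * post", data=df).fit(cov_type="HC1")
print(model.params["treated:post"], model.bse["treated:post"])
```
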
4

Call, Josep. Bonobos, chimpanzees and tools: Integrating species-specific psychological biases and socio-ecology. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198728511.003.0012.

Abstract:
Over the years there has been some controversy regarding the comparison between chimpanzees and bonobos. Whereas some authors have stressed their differences, others have stressed their similarities. One striking difference between wild chimpanzees and bonobos is tool use, especially in foraging contexts. While several chimpanzee populations possess tool kits formed by multiple tools (and their associated techniques) to exploit embedded resources, bonobos display no such tool specialization. However, studies in the laboratory have shown that bonobos are perfectly capable of using tools. In fact, several studies devoted to investigating the cognitive abilities underlying tool use have failed to detect any substantial differences between the two species. This chapter explores three aspects that could explain the difference between chimpanzees and bonobos in their propensity to use tools in the wild: socio-ecological factors, social versus technical cognition, and personality profiles.
5

Grodzki, Erika. Using Lacuna Theory To Detect Cultural Differences In American And German Automotive Advertising (Kulturwissenschaftliche Werbeforschung). Peter Lang Pub Inc, 2002.

6

Walsh, Bruce, and Michael Lynch. Using Molecular Data to Detect Selection: Signatures from Recent Single Events. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198830870.003.0009.

Abstract:
Different types and phases of a selective sweep (hard, soft, partial, polygenic) generate different patterns of departures from neutrality, and hence require different tests. It is thus not surprising that a large number of tests have been proposed that use sequence information to detect ongoing, or very-recently completed, episodes of selection. This chapter critically reviews over 50 such tests, which use information on allele-frequency change, linkage disequilibrium patterns, spatial allele-frequency patterns, site-frequency spectrum data, allele-frequency spectrum data, and haplotype structure. This chapter discusses the domain of applicability for each test, and their strengths and weaknesses. Finally, this chapter examines application of these methods in the search for recent, or ongoing, selection in humans and for genes involved in the domestication process in plants and animals.
7

Walsh, Bruce, and Michael Lynch. Using Molecular Data to Detect Selection: Signatures from Multiple Historical Events. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198830870.003.0010.

Abstract:
This chapter examines the search for a pattern of repetitive adaptive substitutions over evolutionary time. In contrast with the previous chapter, only a modest number of tests toward this aim have been proposed. The HKA and McDonald-Kreitman tests contrast the polymorphism-to-divergence ratio between different genomic classes (such as different genes, or silent versus replacement sites within the same gene). These approaches can detect an excess of substitutions, which allows one to estimate the fraction of adaptive sites. This chapter reviews the empirical data on estimates of this fraction and discusses some of the sources of bias in its estimation. Over an even longer time scale, one can contrast the rate of change of sites in a sequence over a phylogeny. These tests require a rather special type of selection, wherein the same specific site (usually a codon) experiences multiple adaptive substitutions over a phylogeny, such as might occur in arms-race genes.
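
To make the McDonald-Kreitman contrast mentioned above concrete, here is a minimal sketch with invented counts (not from the book); it also computes the common alpha estimator for the fraction of adaptive nonsynonymous substitutions.

```python
# McDonald-Kreitman-style 2x2 contrast: nonsynonymous vs synonymous counts,
# within-species polymorphism vs between-species divergence. Counts are made up.
from scipy.stats import fisher_exact

Pn, Ps = 10, 40   # nonsynonymous / synonymous polymorphisms
Dn, Ds = 30, 40   # nonsynonymous / synonymous fixed differences (divergence)

_, p_value = fisher_exact([[Pn, Ps], [Dn, Ds]])

# Fraction of adaptive nonsynonymous substitutions (the 'alpha' estimator).
alpha = 1 - (Ds * Pn) / (Dn * Ps)
print(f"Fisher exact p = {p_value:.4f}, alpha = {alpha:.2f}")
```
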
8

Traxler, Matthew J. Using Multilevel Models to Evaluate Individual Differences in Deaf Readers. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190455651.003.0014.

Abstract:
Understanding how and why individuals vary is an important aspect of understanding language function. In assessing literacy in deaf readers, we must supplement normative models of functioning with models that take into account how individual differences enhance or detract from skill attainment. This chapter provides a brief case for and description of multilevel models (sometimes known as hierarchical linear models) as a tool to aid research on individual differences. These kinds of models have been applied successfully to understand variability in both hearing and deaf readers. This chapter explains how multilevel models resemble and differ from other commonly applied data analysis techniques, and why they offer a better alternative than those techniques for many applications within deaf education research.
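
As a minimal sketch of the model class described here (a random-intercept multilevel model, fitted with statsmodels on simulated data; the variable names are hypothetical and not from the chapter):

```python
# Random-intercept (multilevel / hierarchical linear) model on simulated data.
# Readers are the grouping factor; 'vocab' is a hypothetical reader-level predictor.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_readers, n_items = 60, 20
reader = np.repeat(np.arange(n_readers), n_items)
reader_intercept = rng.normal(0.0, 0.5, n_readers)[reader]  # between-reader variation
vocab = rng.normal(0.0, 1.0, n_readers)[reader]             # reader-level covariate
score = 2.0 + 0.6 * vocab + reader_intercept + rng.normal(0.0, 1.0, reader.size)

df = pd.DataFrame({"reader": reader, "vocab": vocab, "score": score})

# Fixed effect of vocab, random intercept for each reader.
result = smf.mixedlm("score ~ vocab", df, groups=df["reader"]).fit()
print(result.params["vocab"], result.cov_re)  # slope estimate and random-effect variance
```
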
9

Ellam, Rob. 4. Measuring isotopes. Oxford University Press, 2016. http://dx.doi.org/10.1093/actrade/9780198723622.003.0004.

Abstract:
‘Measuring isotopes: counting the atoms’ explores how isotopes are measured. For stable isotopes, atoms of each isotope are counted using a mass spectrometer. This turns atoms into charged ions and separates them into the different isotopic species using a mass filter. Precise measurements of isotopic abundance can be achieved in a few minutes or hours. Mass spectrometry could be used for radioactive isotopes, but for short-lived isotopes, their low abundance often makes them difficult to detect. The alternative is to use nuclear spectroscopy or counting methods to detect the characteristic energy released by the radioactive decay of a particular isotope, but these can be much longer processes.
10

Jacquemyn, Yves, and Anneke Kwee. Antenatal and intrapartum fetal evaluation. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780198713333.003.0006.

Abstract:
Antenatal and intrapartum fetal monitoring aim to identify the beginning of the process of fetal hypoxia before irreversible fetal damage has taken place. Fetal movement counting by the mother has not been reported to be of any benefit. The biophysical profile score, incorporating ultrasound and fetal heart rate monitoring, has not been proven to reduce perinatal mortality in randomized trials. Doppler ultrasound allows the exploration of the perfusion of different fetal organ systems and provides data on possible hypoxia and fetal anaemia. Maternal uterine artery Doppler can be used to select women with a high risk for intrauterine growth restriction and pre-eclampsia but does not directly provide information on fetal status. Umbilical artery Doppler has been shown to reduce perinatal mortality significantly in high-risk pregnancies (but not in low-risk women). Adding middle cerebral artery Doppler to umbilical artery Doppler does not increase accuracy for detecting adverse perinatal outcome. Ductus venosus Doppler demonstrates moderate value in diagnosing fetal compromise; it is not known whether its use adds any value to umbilical artery Doppler alone. Cardiotocography (CTG) reflects the interaction between the fetal brain and peripheral cardiovascular system. Prelabour routine use of CTG in low-risk pregnancies has not been proven to improve outcome; computerized CTG significantly reduces perinatal mortality in high-risk pregnancies. Monitoring the fetus during labour with intermittent auscultation has not been compared to no monitoring at all; when compared with CTG no difference in perinatal mortality or cerebral palsy has been noted. CTG does lower the rate of neonatal seizures and is accompanied by a statistically non-significant rise in caesarean delivery. Fetal blood sampling to detect fetal pH and base deficit lowers the rates of caesarean delivery and neonatal convulsions when used as an adjunct to CTG. Determination of fetal scalp lactate has not been shown to have an effect on neonatal outcome or on the rate of instrumental deliveries but is less often hampered by technical failure than fetal scalp pH. Analysis of the ST segment of the fetal ECG (STAN®) in combination with CTG during labour results in fewer vaginal operative deliveries, less need for neonatal intensive care, and less use of fetal blood sampling during labour, without a change in fetal metabolic acidosis when compared to CTG alone.
11

Cuenca-Estrella, Manuel. Guidelines for the diagnosis of fungal disease. Edited by Christopher C. Kibbler, Richard Barton, Neil A. R. Gow, Susan Howell, Donna M. MacCallum, and Rohini J. Manuel. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780198755388.003.0044.

Abstract:
This chapter summarizes the current recommendations about the diagnostic methods used to detect fungal diseases. The aim of this chapter is to appraise the different techniques and procedures for detecting and investigating fungal infections, including recommendations about conventional methods of microbiological diagnosis such as microscopic examination, culture, and identification of microorganisms, and alternative diagnostic procedures—also known as ‘non-culture procedures’—based on biomarker detection.
12

Law, John. The Materials of STS. Edited by Dan Hicks and Mary C. Beaudry. Oxford University Press, 2012. http://dx.doi.org/10.1093/oxfordhb/9780199218714.013.0006.

Abstract:
Matter matters — this is the issue which is explored in this article: how science, technology, and society (STS) imagines that matter matters. In STS, materiality is usually understood as a relational effect. Something becomes material because it makes a difference, because somehow or other it is detectable. It depends, then, on a relation between that which is detected and that which does the detecting. Matter that does not make a difference does not matter. It is not matter since there is no relation. No relation of difference and detection. No relation at all. This article further explains the functioning of STS through various case studies. Some of these describe the social shaping of technologies. This article says that materials — technologies — are moulded by the intersection of natural and social factors. A detailed analysis of ontological politics and difference, and their influence on materialization, concludes the article.
13

Morlino, Leonardo. Equality, Freedom, and Democracy. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198813873.001.0001.

Abstract:
A democratic regime is assumed to implement freedom and equality as the two critical and most important values. The question we intend to address here is: how and why has the actual implementation of freedom and equality been changing in the 1990–2020 period? Researching this topic, we cannot ignore the impact of the Great Recession since 2008. Thus, in this comparative research, we analyse France, Germany, Italy, Poland, Spain, and the United Kingdom to detect the changes. As expected, the six largest European democracies have been differently affected by the crisis, as they also had different background factors. We address an additional question: what is the impact of the European Union on the two democratic values? Accordingly, we analyse economic inequality, social inequality, and ethnic inequality with the related changing trends and explanations. We also detect and analyse the trend of freedoms, especially personal dignity, civil rights, and political rights. Thus, the relative decline of equalities and freedoms in the six countries emerges in its different complex facets. We also explore the demand for equalities and freedoms by citizens and the political commitments of party leaders. The other issues we address include how and why, respectively, equalities and freedoms are affected by domestic aspects and the role of external factors, especially the European Union. By connecting equalities and freedoms and drawing together the lines of the entire research, we show that there are three different paths in the future of democracy: balanced democracy, protest democracy, and unaccountable democracy.
14

Kröber, Hans-Ludwig. Mental illness versus mental disorder: Arguments and forensic implications. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780198722373.003.0011.

Abstract:
Modern psychiatry uses a theoretical concept of ‘disorder’: it describes various impaired functions without distinguishing pathological disorders from non-pathological disorders, or even from disorders similar to an illness. Of course, this usage eliminates neither illnesses nor the subjective experience of being ill, but it has implications for forensic psychiatry and for the assessment of a person’s legal responsibility. Having schizophrenic delusions constitutes a categorically different state from having only wishful illusions or a vivid imagination. In the context of medicine and psychiatry, we certainly encounter stages that signal fundamental differences. These differences are easily detected when assessing psychotic disorders or similar states, but a lot of differences also arise when assessing perpetrators with personality disorders or simply antisocial behavior. Where, in these states, are the borders that demarcate full responsibility from substantially reduced social responsibility?
15

Graham, Gordon. Architecture. Edited by Jerrold Levinson. Oxford University Press, 2009. http://dx.doi.org/10.1093/oxfordhb/9780199279456.003.0031.

Abstract:
This article traces the ideas that have marked and divided the major architectural fashions of the last 150 years, and the refinements that have been given to these ideas by philosophers of architecture working within a wider philosophical perspective. In fact, despite the differences between the various schools of thought just alluded to, it is not difficult to detect an underlying unity in the central conceptual problem that both philosophers and architects have sought to address. This may be summarized in the question ‘How is architecture to be secured a place in the sphere of the aesthetic?’ or, more simply, ‘What makes architecture an art?’
16

Carroll, Maureen. Mors Immatura II. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199687633.003.0007.

Abstract:
To gain insight into the family’s investment in and attachment to infants, as expressed in burial rituals, cemetery data are explored in Chapter 7. In order to recognize whether infants were afforded special treatment based on their age, the various ways in which their bodies were treated and prepared for burial are examined. The chapter also seeks to detect regional and perhaps cultural differences in the deposition of an array of grave goods that accompanied infants, and to understand possible patterns and meanings in those assemblages. In addition to evidence from Argenton, Beaumont, Kempten, Marseille, Poundbury, Sétif and Tavant, material from many other sites is also discussed in this chapter.
17

Basu, Sanjay. Fundamentals. Edited by Sanjay Basu. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780190667924.003.0001.

Abstract:
In this chapter, the author defines and provides examples of several key terms used in public health and healthcare modeling research. The chapter begins by clarifying the differences between key terms used to describe rates of disease (incidence, prevalence, and mortality) as well as the performance characteristics of tests used to detect disease (sensitivity, specificity, positive predictive value, and negative predictive value), prevent or treat disease (odds ratios, relative risks), understand studies (case-control, cohort, and randomized controlled trials), and avoid common study problems (bias, confounding). Understanding these key terms and how they are used in research studies allows the reader to correctly interpret study results.
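
A small worked example of the test-performance terms defined in this abstract (the counts below are invented for illustration and are not from the chapter):

```python
# Hypothetical 2x2 screening table: 100 diseased and 900 healthy people.
true_pos, false_neg = 90, 10      # diseased: test positive / test negative
false_pos, true_neg = 45, 855     # healthy:  test positive / test negative

sensitivity = true_pos / (true_pos + false_neg)   # P(test+ | disease)
specificity = true_neg / (true_neg + false_pos)   # P(test- | no disease)
ppv = true_pos / (true_pos + false_pos)           # P(disease | test+)
npv = true_neg / (true_neg + false_neg)           # P(no disease | test-)

print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}  "
      f"PPV={ppv:.2f}  NPV={npv:.2f}")
```
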
18

Wyatt, Tristram D. 2. Sensing and responding. Oxford University Press, 2017. http://dx.doi.org/10.1093/actrade/9780198712152.003.0002.

Abstract:
How an animal behaves is coordinated by nerves and hormones in different, complementary ways. Stimuli, such as the sound of a predator, cause fast behavioural responses coordinated by nerve signals. The stimuli also cause longer lasting physiological changes via hormones, which release energy sources needed for the muscle action required for escape. ‘Sensing and responding’ considers the sensory responses of bats and moths, and then explains selective sensitivity—how animals evolve to detect only what affects their survival or reproductive success. It also shows how the study of neural circuits in simple model systems, such as sea slugs, can help us understand more complicated behaviours in other animals.
19

Kreit, John W. Severe Obstructive Lung Disease. Edited by John W. Kreit. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780190670085.003.0013.

Abstract:
Although chronic obstructive lung disease, asthma, bronchiectasis, and bronchiolitis have very different causes, clinical features, and therapies, they share the same underlying pathophysiology. They are referred to as obstructive lung diseases because airway narrowing causes increased resistance and slowing of expiratory gas flow. Mechanical ventilation of patients with severe obstructive lung disease often produces two problems that must be recognized and effectively managed: over-ventilation and dynamic hyperinflation. Severe Obstructive Lung Disease reviews these two major adverse consequences of mechanical ventilation in patients with severe air flow obstruction. The chapter explains how to detect and correct both of these problems and provides guidelines for managing patients with respiratory failure caused by severe obstructive lung disease.
20

Zahn, Roland, and Alistair Burns. Dementia disorders. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780198779803.003.0001.

Abstract:
This chapter provides a brief overview of the different forms of dementia syndromes and provides a simple algorithm for initial differential diagnosis. Rapidly progressive dementias have to be excluded; this requires specific investigations to detect Creutzfeldt–Jakob disease as well as inflammatory and autoimmune diseases. A lead symptom-based approach in patients with slowly progressive cognitive and behavioural impairments without neurological symptoms is applied: progressive and primary impairments in recent memory are characteristic of typical Alzheimer’s dementia, primary behavioural changes point to the behavioural variant of frontotemporal dementia, primary impairments of language or speech are distinctive for progressive aphasias, fluctuating impairments of attention are a hallmark of Lewy body dementia, whereas primary visuospatial impairments suggest a posterior cortical atrophy. The chapter further discusses updated vascular dementia guidelines and DSM-5 revisions of the definition of dementia. Current diagnostic criteria for the different dementias are referenced and the role of neuroimaging is illustrated.
21

Pollack, Detlef, and Gergely Rosta. Reflections on the Concept of Religion. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198801665.003.0003.

Abstract:
There is no definition of religion that is universally valid and generally accepted in religious studies. Increasing numbers of scholars of religion see the attempt to define religion as doomed to failure, and therefore do not even try. A concept of religion is, however, indispensable for staking out the subject area which the sociology of religion and religious studies are concerned with. Defining clearly what is meant by religion is necessary not only to determine the content of the object to be examined and to distinguish it from other objects, but also to detect changes in the field of study. After discussing different approaches that are taken to define religion, the chapter proposes a working definition that combines substantive and functional arguments. The different forms of religious meaning available to mediate between immanence and transcendence can be classified as religious identification, religious practices, and religious belief and experience.
22

Martin, Graham R. Postscript: Conclusions, Implications, and Comment. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780199694532.003.0010.

Abstract:
The natural world contains a huge amount of constantly changing information, but specializations within sensory systems mean that each species receives only a small part of that information. Information is filtered by sensory systems. We cannot assume what a bird can detect; it is important to measure its sensory capacities and to quantify the sensory challenges posed for the conduct of tasks in different environments. No sensory system can function adequately throughout the full ranges of stimuli that are found in the natural world. There have been many trade-offs in the evolution of particular sensory capacities, as well as trade-offs and complementarity between different sensory capacities within a species. Birds may often be guided by information at the limits of their sensory capacities. Information that guides behaviours may often be sparse and partial. Key behaviours may only be possible because of cognitive abilities which allow adequate interpretation of such partial information.
23

Cleary, Paul, Sam Ghebrehewet, and David Baxter. Essential statistics and epidemiology. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780198745471.003.0022.

Abstract:
This chapter provides a grounding in basic statistics, descriptive epidemiology, analytical epidemiology, and hypothesis testing appropriate for health protection practitioners. The analysis of categorical data using frequency distributions and charts, and the interpretation of epidemic curves, are described. The description of quantitative data, including central tendency, standard deviation, and interquartile range, is concisely explained. The role of geographical information systems and different disease map types is used to demonstrate how disease clusters may be detected. Determining possible association between specific risk factors and outcome is described in the section on analytical epidemiology, using the risk ratio and the odds ratio. The use of these in different study/investigation types is explained. The importance of confounding, matching, and standardization in study design is described. The final part of the chapter covers hypothesis testing to distinguish between real differences and chance variation, and the use of confidence intervals.
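
For readers unfamiliar with the two association measures named here, a small worked example (a sketch with made-up counts, not from the chapter) computing the risk ratio, its approximate 95% confidence interval, and the odds ratio from a 2x2 table:

```python
# Hypothetical exposure-outcome 2x2 table.
import math

a, b = 30, 70    # exposed:   cases, non-cases
c, d = 15, 85    # unexposed: cases, non-cases

risk_ratio = (a / (a + b)) / (c / (c + d))
odds_ratio = (a * d) / (b * c)

# Approximate 95% CI for the risk ratio, computed on the log scale.
se_log_rr = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
ci_low = math.exp(math.log(risk_ratio) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(risk_ratio) + 1.96 * se_log_rr)

print(f"RR = {risk_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f}), OR = {odds_ratio:.2f}")
```
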
24

O'Callaghan, Casey. Perception and Multimodality. Edited by Eric Margolis, Richard Samuels, and Stephen P. Stich. Oxford University Press, 2012. http://dx.doi.org/10.1093/oxfordhb/9780195309799.013.0005.

Abstract:
The article presents some findings concerning multimodality, and the philosophical implications of these findings. One of the findings is that crossmodal illusions show that perception involves interactions among processes associated with different modalities. Patterns of crossmodal bias and recalibration reveal the organization of multimodal perceptual processes. Multimodal interactions obey intelligible principles: they resolve conflicts, and they enhance the reliability of perception. Multimodal processes also demonstrate a concern across the senses for common features and individuals, for several reasons: for example, the intermodal biasing and recalibration responsible for crossmodal illusions require that information from sensory stimulation associated with different senses be taken to be commensurable. The commensurable information from different senses shares, or traces to, a common source, since conflict resolution requires a common subject matter. One important lesson of multimodal effects is that an analog of the correspondence problem within a modality holds between modalities. Spatio-temporal unity, objectual unity, and integration are tied to the capacity to detect constancies and solve correspondence problems across modalities. Solving crossmodal correspondence problems requires a common modal or multimodal code that is shared among modalities.
25

Al-Nahhas, Adil, and Imene Zerizer. Nuclear medicine. Oxford University Press, 2013. http://dx.doi.org/10.1093/med/9780199642489.003.0070.

Abstract:
The application of nuclear medicine techniques in the diagnosis and management of rheumatological conditions relies on its ability to detect physiological and pathological changes in vivo, usually at an earlier stage compared to structural changes visualized on conventional imaging. These techniques are based on the in-vivo administration of a gamma-emitting radionuclide whose distribution can be monitored externally using a gamma camera. To guide a radionuclide to the area of interest, it is usually bound to a chemical label to form a 'radiopharmaceutical'. There are hundreds of radiopharmaceuticals in clinical use with different 'homing' mechanisms, such as 99mTc-HDP for bone scans and 99mTc-MAA for lung scans. Comparing pre- and post-therapy scans can aid in monitoring response to treatment. More recently, positron emission tomography combined with simultaneous computed tomography (PET/CT) has been introduced into clinical practice. This technique provides superb spatial resolution and anatomical localization compared to gamma-camera imaging. The most widely used PET radiopharmaceutical, fluorodeoxyglucose (18F-FDG), is a fluorinated glucose analogue, which can detect hypermetabolism and has therefore been used in imaging and monitoring response to treatment of a variety of cancers as well as inflammatory conditions such as vasculitis, myopathy, and arthritides. Other PET radiopharmaceuticals targeting inflammation and activated macrophages are becoming available and could open new frontiers in PET imaging in rheumatology. Nuclear medicine procedures can also be used therapeutically. Beta-emitting radiopharmaceuticals, such as yttrium-90, invoke localized tissue damage at the site of injection and can be used in the treatment of synovitis.
26

Klempe, Sven Hroar. Music and Imagination. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190468712.003.0012.

Abstract:
Music is close to language, and when we listen to music, we may also imagine certain events, stories, and the like. The differences, although obvious, are not so easy to detect. These subtle nuances are examined in this chapter with the aim of delineating the general traits of musical imagination. The author defines musical imagination in terms of a human act that provides a type of framework for cognition in which cognition and sensations are united in feelings. This also forms the basis for verticality, which is expressed in terms of musical polyphony. The multitude in musical polyphony opens up for a sort of community, which brings in a social dimension. As long as the social community forms the basis of cultural psychology, a thorough understanding of musical imagination may contribute to a more complete understanding of cultural psychology as well.
27

Meijer, Ewout H., and Bruno Verschuere. Detection Deception Using Psychophysiological and Neural Measures. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190612016.003.0010.

Abstract:
The use of physiological signals to detect deception can be traced back almost a century. Historically, the polygraph has been used—and debated. This chapter discusses the merits of polygraph testing, and to what extent the introduction of measures of brain activity—most notably functional magnetic resonance imaging (fMRI)—has solved the problems associated with polygraph testing. It discusses the different question formats used with polygraph and brain activity measures, and argues that these formats are the main factor contributing to the tests’ validity. Moreover, the authors argue that erroneous test outcomes are caused by errors in logical inferences, and that these errors will not be remedied by new technology. The biggest challenge for the field is to find a question format that isolates deception, and to corroborate laboratory data with methodologically sound field studies.
28

Medforth, Janet, Linda Ball, Angela Walker, Sue Battersby, and Sarah Stables. Medical conditions during pregnancy. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780198754787.003.0010.

Abstract:
This chapter comprises a review of common medical conditions, including asthma, thyroid conditions, cardiac conditions, and renal diseases. There is a small section on renal transplant and care of the woman during pregnancy. Insulin-dependent diabetes, non-insulin-dependent diabetes, gestational diabetes, and their impact on pregnancy are discussed. The effect of pregnancy on the conditions themselves is reviewed, along with recognition and management of pregnancy changes due to the condition. Pregnancy management options, including altered physiology and pharmacological treatment, are discussed. Clinical and laboratory investigations are also listed. The section on cardiac conditions refers to circulatory changes during pregnancy and how these may be affected by a range of different cardiac conditions, both those that are congenital and those acquired. Pregnancy management, investigations, and clinical observations used to detect deterioration are included.
29

Ellam, Rob. 6. Measuring isotopes. Oxford University Press, 2016. http://dx.doi.org/10.1093/actrade/9780198723622.003.0006.

Abstract:
Mass spectrometers have become routine laboratory instruments in many disciplines. ‘Measuring isotopes: mass spectrometers’ concentrates on those used to quantify the abundance of different isotopes—gas source isotope ratio, thermal ionization, inductively coupled plasma, and secondary ion mass spectrometers. A mass spectrometer can be used to quantify the concentration of a particular element by monitoring an isotope of that element not overlapped by isotopes of other elements. All mass spectrometers have three essential components: an ion source, a mass filter, and a detector. There are two main types of detector: Faraday detectors measure large signals and a variant of photomultiplier tubes measures small isotope signals.
30

Walsh, Bruce, and Michael Lynch. Hitchhiking and Selective Sweeps. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198830870.003.0008.

Abstract:
When a favorable allele increases in frequency, it alters the coalescent structure (the pattern of times back to a common ancestor) at linked sites relative to that under drift. This creates patterns of sequence polymorphism that can be used to potentially detect ongoing, or very recent, selection. This idea of a neutral allele hitchhiking up to high frequency when coupled to a favorable allele is the notion of a selective sweep, and this chapter reviews the considerable body of associated population-genetics theory on sweeps. Different types of sweeps leave different signatures, resulting in the very diverse collection of tests of selection discussed in Chapter 9. Either a history of recurrent sweeps, or of background selection, results in linked genomic regions of reduced effective population size. This implies that more mutations in such regions are effectively neutral, which can result in increased substitution rates and lower codon bias. Finally, the chapter examines the theory for when response is expected to start from existing variation, as opposed to waiting for the appearance of new mutations.
31

Martin, Jeffrey J. Doing Research. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190638054.003.0005.

Abstract:
Researchers have to consider a host of factors when planning their research and analyzing their data. This chapter discusses a number of important research considerations. For instance, when planning research it is important to have a large enough sample to prevent conducting an underpowered study that would be unable to detect true differences when they exist. When selecting measures, researchers should understand exactly what they are assessing and determine if the scales used have a history of producing valid and reliable scores with similar samples. When developing measures, researchers should avoid the jingle-jangle fallacy and avoid creating scales that are redundant with already developed scales or that use names that confuse the reader. When analyzing their data, scientists should avoid dichotomizing continuous constructs and should shun stepwise regression techniques. When compiling findings, researchers need to consider if their results are meaningful, so effect sizes should be reported and interpreted in light of absolute standards and relative to prior research.
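
The sample-size point made above can be illustrated with a standard power calculation (a sketch using statsmodels; the effect sizes are arbitrary examples, not values from the chapter):

```python
# Per-group sample size needed for 80% power in a two-sample t-test at alpha=0.05,
# for small, medium, and large standardized effect sizes (Cohen's d).
import math
from statsmodels.stats.power import TTestIndPower

power_analysis = TTestIndPower()
for d in (0.2, 0.5, 0.8):
    n = power_analysis.solve_power(effect_size=d, power=0.8, alpha=0.05)
    print(f"d = {d}: about {math.ceil(n)} participants per group")
```
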
32

Eibach, Joachim. Violence and Masculinity. Edited by Paul Knepper and Anja Johansen. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780199352333.013.9.

Abstract:
A consistent overrepresentation of men in recorded violent crimes and thus a certain disposition of male aggressiveness has been evident from the late Middle Ages to today. However, we can also detect several major shifts in the history of interpersonal male violence from the eighteenth century onward. From a cultural historical perspective, violent actions by men or women cannot be interpreted as contingent, individual acts, but rather must be seen as practices embedded in sociocultural contexts and accompanied by informal norms. Because one grand theory cannot account convincingly for the history of violence and masculinity, an array of approaches is more likely to shed light on the issue. Interestingly, shifts in the history of violence have often corresponded with changes to prevailing notions of masculinity. This essay delineates the relevant historical shifts from the early modern “culture of dispute” to the different paths of interpersonal violence over the twentieth century.
33

Huschka, Sabine. Pina Bausch, Mary Wigman, and the Aesthetic of “Being Moved”. University of Illinois Press, 2017. http://dx.doi.org/10.5406/illinois/9780252036767.003.0012.

Abstract:
This chapter rethinks the relationship between Mary Wigman and Pina Bausch from a viewpoint informed by recent philosophical approaches to dance history. Dance research often draws a genealogy that connects Wigman's approach to that of Bausch, the central representative of German Tanztheater as it emerged in the 1970s. However, it is argued Bausch took a fundamentally different position compared to the one propagated by her predecessor: turning her attention away from absolute truth and toward the truthfulness of any given physical movement on stage, while retaining the appeal to feeling, she sought to develop emotionally determined forms of movement and to create a shared space of human experience beyond any essentialism. But what about the choreographed body in these theatrical spaces of experience? How do movements and gestures function to reveal a perspective on the human being? Which choreographic or theatrical means are used, at the discretion of the individual body, to produce an impression of unmediated immediacy? The radical difference between Wigman and Bausch can be detected in their aesthetics of representation, in the way in which they choreograph emotion.
34

Hardt, Yvonne. Engagements with the Past in Contemporary Dance. University of Illinois Press, 2017. http://dx.doi.org/10.5406/illinois/9780252036767.003.0014.

Abstract:
For a long time, what has been considered “avant-garde” embodied the “new” and was perceived as different from those dance forms considered traditional, historical, or marked by ethnic inheritance. This chapter traces how contemporary dance performances and dance historical writing have challenged these demarcations as one detects a remarkable trend toward evoking the past in contemporary dance. Numerous artists and festivals increasingly feature works that address the past, having discovered the potential for a self-reflexivity of dance in conversation with its history. From this larger group of artists, the chapter focuses on four contemporary European choreographers: Jérôme Bel, Xavier Le Roy, Eszter Salamon, and Martin Nachbar to discuss what working with the past in contemporary performance can entail. These choreographers expose different modes of taking up the past; however, they all engage a concept of history understood as a construction based on the needs of the present.
35

Kettler, Mark D. Large Circumscribed Mass in Young Female. Edited by Christoph I. Lee, Constance D. Lehman, and Lawrence W. Bassett. Oxford University Press, 2018. http://dx.doi.org/10.1093/med/9780190270261.003.0021.

Abstract:
Fibroepithelial lesions account for the overwhelming majority of solid breast masses affecting women younger than age 20. Nearly all present as self-detected or provider-detected palpable masses. All fibroepithelial lesions are composed of stromal (fibrous) and glandular (epithelial) elements and variable histology. Rapidly growing mobile breast masses in girls or female adolescents may represent juvenile fibroadenomas, which have different but benign histological features when compared to typical fibroadenomas. Benign phyllodes tumors closely resemble usual fibroadenomas and juvenile fibroadenomas on imaging. Decisions whether to biopsy these tumors are made clinically; the diagnosis of phyllodes tumor depends on histological assessment. This chapter, appearing in the section on circumscribed mass, reviews the key clinical and imaging features, differential diagnosis, and management recommendations for large solid breast masses affecting young women, including typical fibroadenomas, giant fibroadenomas, juvenile fibroadenomas, and phyllodes tumors.
36

Kolanoski, Hermann, and Norbert Wermes. Particle Detectors. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198858362.001.0001.

Abstract:
The book describes the fundamentals of particle detectors in their different forms as well as their applications, presenting the abundant material as clearly as possible and as deeply as needed for a thorough understanding. The target group for the book comprises both students who want an introduction or wish to deepen their knowledge of the subject, and lecturers and researchers who intend to extend their expertise. The book is also suited as a preparation for instrumental work in nuclear, particle and astroparticle physics and in many other fields (addressed in chapter 2). The detection of elementary particles, nuclei and high-energy electromagnetic radiation, in this book commonly designated as ‘particles’, proceeds through interactions of the particles with matter. A detector records signals originating from the interactions occurring in or near the detector and (in general) feeds them into an electronic data acquisition system. The book describes the various steps in this process, beginning with the relevant interactions with matter, then proceeding to their exploitation for different detector types like tracking detectors, detectors for particle identification, detectors for energy measurements, detectors in astroparticle experiments, and ending with a discussion of signal processing and data acquisition. Besides the introductory and overview chapters (chapters 1 and 2), the book is divided into five subject areas: fundamentals (chapters 3 to 5); detection of tracks of charged particles (chapters 6 to 9); phenomena and methods mainly applied for particle identification (chapters 10 to 14); energy measurement in accelerator and non-accelerator experiments (chapters 15 and 16); and electronics and data acquisition (chapters 17 and 18). Comprehensive lists of literature, keywords and abbreviations can be found at the end of the book.
37

Williams, Jerry R. Diagnostic radiology equipment. Oxford University Press, 2015. http://dx.doi.org/10.1093/med/9780199655212.003.0012.

Abstract:
The chapter is concerned with the features of radiographic and fluoroscopic equipment that present radiation protection issues for both patients and staff. These are managed through regulation, manufacturing standards, and adherence to safe working practices. The situation is different for patients, who are deliberately irradiated in accordance with justification protocols not considered here. Radiation protection is based on the ALARP principle, which requires the resultant dose to be minimized while keeping image quality sufficient to provide an accurate and safe diagnosis. Dose minimization is critically dependent on detector efficiency. Quality control of dose for individual examinations is particularly important to provide assurance of ALARP. It should include not only patient dose assessment but also detector dose indicators, particularly in radiography. These issues are discussed in detail together with other dose-saving features and discussion on objective methods of image quality assessment. Commissioning and lifetime tests are required for quality assurance programmes. These are described.
38

Giannitsis, Evangelos, and Hugo A. Katus. Biomarkers in acute coronary syndromes. Oxford University Press, 2015. http://dx.doi.org/10.1093/med/9780199687039.003.0036.

Abstract:
Biomarker testing in the evaluation of a patient with acute chest pain is best established for cardiac troponins that allow the diagnosis of myocardial infarction, risk estimation of short- and long-term risk of death and myocardial infarction, and guidance of pharmacological therapy, as well as the need and timing of invasive strategy. Newer, more sensitive troponin assays have become commercially available and have the capability to detect myocardial infarction earlier and more sensitively than standard assays, but they are hampered by a lack of clinical specificity, i.e. the ability to discriminate myocardial ischaemia from myocardial necrosis not related to ischaemia such as myocarditis, pulmonary embolism, or decompensated heart failure. Strategies to improve clinical specificity (including strict adherence to the universal myocardial infarction definition and the need for serial troponin measurements to detect an acute rise and/or fall of cardiac troponin) will improve the interpretation of the increasing number of positive results. Other biomarkers of inflammation, activated coagulation/fibrinolysis, and increased ventricular stress mirror different aspects of the underlying disease activity and may help to improve our understanding of the pathophysiological mechanisms of acute coronary syndromes. Among the flood of new biomarkers, there are several novel promising biomarkers, such as copeptin that allows an earlier rule-out of myocardial infarction in combination with cardiac troponin, whereas MR-proANP and MR-proADM appear to allow a refinement of cardiovascular risk. GDF-15 might help to identify candidates for an early invasive vs conservative strategy. A multi-marker approach to biomarkers becomes more and more attractive, as increasing evidence suggests that a combination of several biomarkers may help to predict individual risk and treatment benefits, particularly among troponin-negative subjects. Future goals include the acceleration of rule-in and rule-out of patients with suspected acute coronary syndrome, in order to shorten lengths of stay in the emergency department, and to optimize patient management and the use of health care resources. New algorithms using high-sensitivity cardiac troponin assays at low cut-offs alone, or in combination with additional biomarkers, allow to establish accelerated rule-out algorithms within 1 or 2 hours.
39

Giannitsis, Evangelos, and Hugo A. Katus. Biomarkers in acute coronary syndromes. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780199687039.003.0036_update_001.

Abstract:
Biomarker testing in the evaluation of a patient with acute chest pain is best established for cardiac troponins that allow the diagnosis of myocardial infarction, risk estimation of short- and long-term risk of death and myocardial infarction, and guidance of pharmacological therapy, as well as the need and timing of invasive strategy. Newer, more sensitive troponin assays have become commercially available and have the capability to detect myocardial infarction earlier and more sensitively than standard assays, but they are hampered by a lack of clinical specificity, i.e. the ability to discriminate myocardial ischaemia from myocardial necrosis not related to ischaemia such as myocarditis, pulmonary embolism, or decompensated heart failure. Strategies to improve clinical specificity (including strict adherence to the universal myocardial infarction definition and the need for serial troponin measurements to detect an acute rise and/or fall of cardiac troponin) will improve the interpretation of the increasing number of positive results. Other biomarkers of inflammation, activated coagulation/fibrinolysis, and increased ventricular stress mirror different aspects of the underlying disease activity and may help to improve our understanding of the pathophysiological mechanisms of acute coronary syndromes. Among the flood of new biomarkers, there are several novel promising biomarkers, such as copeptin that allows an earlier rule-out of myocardial infarction in combination with cardiac troponin, whereas MR-proANP and MR-proADM appear to allow a refinement of cardiovascular risk. GDF-15 might help to identify candidates for an early invasive vs conservative strategy. A multi-marker approach to biomarkers becomes more and more attractive, as increasing evidence suggests that a combination of several biomarkers may help to predict individual risk and treatment benefits, particularly among normal-troponin subjects. Future goals include the acceleration of rule-in and rule-out of patients with suspected acute coronary syndrome, in order to shorten lengths of stay in the emergency department, and to optimize patient management and the use of health care resources. New algorithms using high-sensitivity cardiac troponin assays at low cut-offs alone, or in combination with additional biomarkers, allow to establish accelerated rule-out algorithms within 1 or 2 hours.
40

Giannitsis, Evangelos, and Hugo A. Katus. Biomarkers in acute coronary syndromes. Oxford University Press, 2018. http://dx.doi.org/10.1093/med/9780199687039.003.0036_update_002.

Abstract:
Biomarker testing in the evaluation of a patient with acute chest pain is best established for cardiac troponins that allow the diagnosis of myocardial infarction, risk estimation of short- and long-term risk of death and myocardial infarction, and guidance of pharmacological therapy, as well as the need and timing of invasive strategy. Newer, more sensitive troponin assays have become commercially available and have the capability to detect myocardial infarction earlier and more sensitively than standard assays, but they are hampered by a lack of clinical specificity, i.e. the ability to discriminate myocardial ischaemia from myocardial necrosis not related to ischaemia such as myocarditis, pulmonary embolism, or decompensated heart failure. Strategies to improve clinical specificity (including strict adherence to the universal myocardial infarction definition and the need for serial troponin measurements to detect an acute rise and/or fall of cardiac troponin) will improve the interpretation of the increasing number of positive results. Other biomarkers of inflammation, activated coagulation/fibrinolysis, and increased ventricular stress mirror different aspects of the underlying disease activity and may help to improve our understanding of the pathophysiological mechanisms of acute coronary syndromes. Among the flood of new biomarkers, there are several novel promising biomarkers, such as copeptin that allows an earlier rule-out of myocardial infarction in combination with cardiac troponin, whereas MR-proANP and MR-proADM appear to allow a refinement of cardiovascular risk. GDF-15 might help to identify candidates for an early invasive vs conservative strategy. A multi-marker approach to biomarkers becomes more and more attractive, as increasing evidence suggests that a combination of several biomarkers may help to predict individual risk and treatment benefits, particularly among normal-troponin subjects. Future goals include the acceleration of rule-in and rule-out of patients with suspected acute coronary syndrome, in order to shorten lengths of stay in the emergency department, and to optimize patient management and the use of health care resources. New algorithms using high-sensitivity cardiac troponin assays at low cut-offs alone, or in combination with additional biomarkers, allow to establish accelerated rule-out algorithms within 1 or 2 hours.
41

Hagendorff, Andreas. Cardiac involvement in systemic diseases. Oxford University Press, 2011. http://dx.doi.org/10.1093/med/9780199599639.003.0020.

Abstract:
Systemic diseases are generally an interdisciplinary challenge in clinical practice. Systemic diseases are able to induce tissue damage in different organs with ongoing duration of the illness. The heart and the circulation are important targets in systemic diseases. The cardiac involvement in systemic diseases normally introduces a chronic process of alterations in cardiac tissue, which causes cardiac failure in the end stage of the diseases or causes dangerous and life-threatening problems by induced acute cardiac events, such as myocardial infarction due to coronary thrombosis. Thus, diagnostic methods—especially imaging techniques—are required, which can be used for screening as well as for the detection of early stages of the diseases. Two-dimensional echocardiography is the predominant diagnostic technique in cardiology for the detection of injuries in cardiac tissue—e.g. the myocardium, endocardium, and the pericardium—due to the overall availability of the non-invasive procedure. The quality of the echocardiography and the success rate of detecting cardiac pathologies in patients with primary non-cardiac problems depend on the competence and expertise of the investigator. Especially in this scenario, clinical knowledge about the influence of the systemic disease on cardiac anatomy and physiology is essential for the central diagnostic problem. Therefore, the primary echocardiography in these patients should be performed by an experienced clinician or investigator. It is possible to detect changes of cardiac morphology and function at different stages of systemic diseases as well as complications of the systemic diseases by echocardiography. The different parts of this chapter will show proposals for qualified transthoracic echocardiography focusing on cardiac structures which are mainly involved in different systemic diseases.
42

Taberlet, Pierre, Aurélie Bonin, Lucie Zinger, and Eric Coissac. Environmental DNA. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198767220.001.0001.

Abstract:
Environmental DNA (eDNA), i.e. DNA released in the environment by any living form, represents a formidable opportunity to gather high-throughput and standard information on the distribution or feeding habits of species. It has therefore great potential for applications in ecology and biodiversity management. However, this research field is fast-moving, involves different areas of expertise and currently lacks standard approaches, which calls for an up-to-date and comprehensive synthesis. Environmental DNA for biodiversity research and monitoring covers current methods based on eDNA, with a particular focus on “eDNA metabarcoding”. Intended for scientists and managers, it provides the background information to allow the design of sound experiments. It revisits all steps necessary to produce high-quality metabarcoding data such as sampling, metabarcode design, optimization of PCR and sequencing protocols, as well as analysis of large sequencing datasets. All these different steps are presented by discussing the potential and current challenges of eDNA-based approaches to infer parameters on biodiversity or ecological processes. The last chapters of this book review how DNA metabarcoding has been used so far to unravel novel patterns of diversity in space and time, to detect particular species, and to answer new ecological questions in various ecosystems and for various organisms. Environmental DNA for biodiversity research and monitoring constitutes an essential reading for all graduate students, researchers and practitioners who do not have a strong background in molecular genetics and who are willing to use eDNA approaches in ecology and biomonitoring.
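
As a toy illustration of one routine step in the sequence-analysis stage described above (dereplicating reads before taxonomic assignment; the sequences are invented, and this is not code from the book):

```python
# Collapse identical reads and count their abundance -- a common first step
# when analysing metabarcoding sequence data. Sequences here are made up.
from collections import Counter

reads = ["ACGTACGT", "ACGTACGT", "ACGTTCGT", "ACGTACGT", "TTGTACGA", "ACGTTCGT"]
unique_reads = Counter(reads)

for sequence, count in unique_reads.most_common():
    print(sequence, count)
```
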
43

Eljaafari, Assia, and Pierre Miossec. Cellular side of acquired immunity (T cells). Oxford University Press, 2013. http://dx.doi.org/10.1093/med/9780199642489.003.0049.

Abstract:
The adaptive T-cell response represents the most sophisticated component of the immune response. Foreign invaders are recognized first by cells of the innate immune system. This leads to a rapid and non-specific inflammatory response, followed by induction of the adaptive and specific immune response. Different adaptive responses can be promoted, depending on the predominant effector cells that are involved, which themselves depend on the microbial/antigen stimuli. As examples, Th1 cells contribute to cell-mediated immunity against intracellular pathogens, Th2 cells protect against parasites, and Th17 cells act against extracellular bacteria and fungi that are not cleared by Th1 and Th2 cells. Among the new subsets, Th22 cells protect against disruption of epithelial layers secondary to invading pathogens. Finally these effector subsets are regulated by regulatory T cells. These T helper subsets counteract each other to maintain the homeostasis of the immune system, but this balance can be easily disrupted, leading to chronic inflammation or autoimmune diseases. The challenge is to detect early changes in this balance, prior to its clinical expression. New molecular tools such as microarrays could be used to determine the predominant profile of the immune effector cells involved in a disease process. Such understanding should provide better therapeutic tools to counteract deregulated effector cells.
44

Glazov, M. M. Spin Resonance. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198807308.003.0003.

Abstract:
This chapter is devoted to one of the key phenomena in the field of spin physics, namely, resonant absorption of electromagnetic waves under conditions where the Zeeman splitting of spin levels in a magnetic field is equal to the photon energy. This method is particularly important for the identification of nuclear spin effects, because resonance spectra provide fingerprints of the different spin species involved and make it possible to distinguish different nuclear isotopes. As discussed in this chapter, nuclear magnetic resonance also provides access to the local magnetic fields acting on nuclear spins. These fields are caused by the magnetic interactions between the nuclei and by the quadrupole splittings of nuclear spin states in an anisotropic crystalline environment. Manifestations of spin resonance in optical responses of semiconductors (that is, optically detected magnetic resonance) are discussed.
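
The resonance condition stated at the start of this abstract (photon energy equal to the Zeeman splitting) can be made concrete with a standard back-of-the-envelope calculation; the example below uses the proton as the nuclear spin and textbook constant values, and is not taken from the book.

```python
# Resonance condition: h * f equals the Zeeman splitting, so f = gamma * B / (2*pi).
# Example: proton (1H) in a 1 tesla field, using standard constants.
import math

h = 6.62607015e-34               # Planck constant, J*s
gamma_proton = 2.6752218744e8    # proton gyromagnetic ratio, rad s^-1 T^-1
B = 1.0                          # magnetic field strength, tesla

frequency = gamma_proton * B / (2 * math.pi)   # Larmor (resonance) frequency, Hz
photon_energy = h * frequency                  # energy matching the Zeeman splitting, J

print(f"resonance frequency ~ {frequency / 1e6:.1f} MHz")
print(f"photon energy ~ {photon_energy:.2e} J")
```
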
45

Scooby-Doo! Read and Solve: Volume 6 - Disappearing Donuts. New York, USA: Advance Publishers, 2006.

46

Scooby-Doo! Read and Solve: Volume 11 - Football Fight. New York, USA: Advance Publishers, 2006.

47

Scooby-Doo! Read and Solve: Volume 8 - Vanishing Apples. New York, USA: Advance Publishers, 2006.

48

Scooby-Doo! Read and Solve: Volume 2 - Mummies at the Mall. Houston, USA: Advance Publishers, 2006.

49

Scooby-Doo! Read and Solve: Volume 9 - Howling on the Playground. New York, USA: Advance Publishers, 2006.

50

Scooby-Doo! Read and Solve: Volume 1 - Map in the Mystery Machine. Houston, USA: Advance Publishers, 2006.
