To see the other types of publications on this topic, follow the link: Event data methods.

Books on the topic 'Event data methods'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 books for your research on the topic 'Event data methods.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse books on a wide variety of disciplines and organise your bibliography correctly.

1

Interval-censored time-to-event data: Methods and applications. Chapman and Hall/CRC, 2012.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Lemeshow, Stanley. Applied Survival Analysis: Regression Modeling of Time to Event Data. 2nd ed. Wiley-Interscience, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Hosmer, David W. Applied survival analysis: Regression modeling of time-to-event data. 2nd ed. John Wiley & Sons, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Hosmer, David W. Applied survival analysis: Regression modeling of time-to-event data. 2nd ed. Wiley-Interscience, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Lemeshow, Stanley, ed. Applied survival analysis: Regression modeling of time to event data. Wiley, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Rosenbluth, William. Black box data from accident vehicles: Methods of retrieval, translation, and interpretation. ASTM International, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Luttmer, Erzo F. P. Measuring poverty dynamics and inequality in transition economies: Disentangling real events from noisy data. World Bank, Europe and Central Asia Region, Poverty Reduction and Economic Management Sector Unit, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Luttmer, Erzo F. P. Measuring poverty dynamics and inequality in transition economies: Disentangling real events from noisy data. World Bank, Europe and Central Asia Region, Poverty Reduction and Economic Management Sector Unit, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Schneider, Jörg, and Ton Vrouwenvelder. Introduction to safety and reliability of structures. 3rd ed. International Association for Bridge and Structural Engineering (IABSE), 1997. http://dx.doi.org/10.2749/sed005.

Full text
Abstract:
Society expects that buildings and other structures are safe for the people who use them or who are near them. The failure of a building or structure is expected to be an extremely rare event. Thus, society implicitly relies on the expertise of the professionals involved in the planning, design, construction, operation and maintenance of the structures it uses.

Structural engineers devote all their effort to meeting society's expectations efficiently. Engineers and scientists work together to develop solutions to structural problems. Given that nothing is absolutely and eternally safe, the goal is to attain an acceptably small probability of failure for a structure, a facility, or a situation. Reliability analysis is part of the science and practice of engineering today, not only with respect to the safety of structures, but also for questions of serviceability and other requirements of technical systems that might be impacted by some probability.

The present volume takes a rather broad approach to safety and reliability in structural engineering. It treats the underlying concepts of safety, reliability and risk and, in a first chapter, introduces the reader to the main concepts and strategies for dealing with hazards. The next chapter is devoted to the processing of data into information that is relevant for applying reliability theory. Two following chapters deal with the modelling of structures and with methods of reliability analysis. Another chapter focuses on problems related to establishing target reliabilities, assessing existing structures, and effective strategies against human error. The last chapter presents an outlook to more advanced applications. The Appendix supports the application of the methods proposed and refers readers to a number of related computer programs.

This book is aimed at both students and practicing engineers. It presents the concepts and procedures of reliability analysis in a straightforward, understandable way, making use of simple examples rather than extended theoretical discussion. It is hoped that this approach serves to advance the application of safety and reliability analysis in engineering practice.

The book is complemented by free access to an educational version of a Variables Processor computer program. FreeVaP can be downloaded free of charge and supports the understanding of the subjects treated in this book.
APA, Harvard, Vancouver, ISO, and other styles
10

Rosenblatt, Alan J., ed. International relations: Using MicroCase ExplorIt. Wadsworth/Thomson Learning, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
11

Murashko, Mikhail, Igor Ivanov, and Nadezhda Knyazyuk. THE BASICS OF MEDICAL CARE QUALITY AND SAFETY PROVISION. Advertising and Information Agency "Standards and Quality", 2020. http://dx.doi.org/10.35400/978-5-600-02711-4.

Full text
Abstract:
SUMMARY

This monograph reviews the key approaches to creating an effective internal quality and safety control system for an organization, based on a patient-oriented approach, a process approach, risk management, continuous process improvement, and other methods. It defines all of the terms used, gives numerous examples and step-by-step guidance on carrying out the key measures and activities needed to create and develop a quality control system, and provides samples of local documentation.
Target audience for this monograph: hospital leadership, including the CMO, the deputy CMO for quality, the head of the quality control committee or a designated quality control specialist, and other medical workers.

ABOUT "THE BASICS OF MEDICAL CARE QUALITY AND SAFETY PROVISION"
All changes and reforms in healthcare should lead to improvements in the quality of medical care and to the preservation of the life and health of all citizens. The once abstract word "quality" now has a specific meaning, acquired through legislative validation of the term "medical care quality and safety". Providing healthcare quality and safety is one of the key priorities of the Russian Federation's national policy for the protection of citizens' health.
This volume presents current knowledge and practical experience in medical care quality and safety control and in the continuous improvement of a medical organization's efficiency. It addresses theoretical and practical aspects of introducing a management and internal quality and safety control system in medical care. It also contains a methodological description of the Proposals (practical recommendations) of the Federal Service for Supervision in Healthcare, developed from a synthesis of global experience, adapted to Russian specifics, and aimed at quality and safety provision. The volume includes a large number of samples, examples, templates, and checklist tables. The material gathered in the monograph allows readers to create a proper system of measures in a medical organization to comply with Order No. 381-н of the Ministry of Health of the Russian Federation, "On approving Requirements towards organizing and executing medical care internal quality and safety control".

TARGET AUDIENCE
This volume is intended for a wide range of readers interested in management: healthcare organization leaders, CMOs and deputy CMOs, deputy CMOs for quality, quality control committee leaders or designated quality control specialists, physicians, nurses, medical academics and students, and all specialists interested in the stable development and improvement of medical organizations.
APA, Harvard, Vancouver, ISO, and other styles
12

Handy, Todd C., ed. Event-related potentials: A methods handbook. MIT Press, 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
13

Rubino, Gerardo, and Bruno Tuffin, eds. Rare event simulation using Monte Carlo methods. Wiley, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
14

PEP (Professional Engineering Publishers). Optical Methods and Data Processing in Heat and Fluid Flow (IMechE Event Publications). Wiley, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
15

PEP (Professional Engineering Publishers). Optical Methods and Data Processing in Heat and Fluid Flow (IMechE Event Publications). Wiley, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
16

Andersen, Per Kragh, and Niels Keiding, eds. Survival and event history analysis. Wiley, 2006.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
17

Hosmer, David W., Jr., Stanley Lemeshow, and Susanne May. Applied Survival Analysis: Regression Modeling of Time-to-Event Data. John Wiley & Sons, Incorporated, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
18

Hosmer, David W., Jr., Stanley Lemeshow, and Susanne May. Applied Survival Analysis: Regression Modeling of Time-to-Event Data. John Wiley & Sons, Incorporated, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
19

Hosmer, David W., Jr., and Stanley Lemeshow. Applied Survival Analysis: Regression Modeling of Time to Event Data. Wiley-Interscience, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
20

Hosmer, David W., Jr., Stanley Lemeshow, and Susanne May. Applied Survival Analysis: Regression Modeling of Time-to-Event Data. John Wiley & Sons, Incorporated, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
21

Federation of European Chemical Societies, Chemometrics Society, Österreichische Gesellschaft für Mikrochemie und Analytische Chemie im Verein Österreichischer Chemiker, and Verein Österreichischer Chemiker Working Group "Computers in Chemistry", eds. COBAC IV, computer based methods in analytical chemistry: 90th event of FECS, Graz, Austria, September 15-19, 1986. Austrian Society for Microchemistry and Analytical Chemistry, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
22

Keiding, Niels, and Per Kragh Andersen, eds. Survival and Event History Analysis (Wiley Reference Series in Biostatistics). Wiley, 2006.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
23

Hosmer, David W., Jr., and Stanley Lemeshow. Applied Survival Analysis: Time-to-Event. John Wiley & Sons, Incorporated, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
24

Beck, Nathaniel. Time‐Series Cross‐Section Methods. Edited by Janet M. Box-Steffensmeier, Henry E. Brady, and David Collier. Oxford University Press, 2009. http://dx.doi.org/10.1093/oxfordhb/9780199286546.003.0020.

Full text
Abstract:
This article outlines the literature on time-series cross-sectional (TSCS) methods. First, it addresses time-series properties including issues of nonstationarity. It moves to cross-sectional issues including heteroskedasticity and spatial autocorrelation. The ways that TSCS methods deal with heterogeneous units through fixed effects and random coefficient models are shown. In addition, a discussion of binary variables and their relationship to event history models is provided. The best way to think about modeling single time series is to think about modeling the time-series component of TSCS data. On the cross-sectional side, the best approach is one based on thinking about cross-sectional issues like a spatial econometrician. In general, the critical insight is that TSCS and binary TSCS data present a series of interesting issues that must be carefully considered, and not a standard set of nuisances that can be dealt with by a command in some statistical package.
APA, Harvard, Vancouver, ISO, and other styles
25

International Atomic Energy Agency, ed. Derived intervention levels for application in controlling radiation doses to the public in the event of a nuclear accident or radiological emergency: Principles, procedures, and data. International Atomic Energy Agency; Lanham, MD: Unipub-Bernan [distributor], 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
26

Applied Longitudinal Data Analysis: Modeling Change and Event Occurrence. Oxford University Press, USA, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
27

Kam, Julia W. Y., and Todd C. Handy. Electroencephalogram Recording in Humans. Oxford University Press, 2015. http://dx.doi.org/10.1093/med/9780199939800.003.0006.

Full text
Abstract:
This chapter provides an elementary introduction to the theory and practical application of electroencephalogram (EEG) recording for the purpose of studying neurocognitive processes. It is aimed at readers who have had little or no experience in EEG data collection, and would like to gain a better understanding of scientific papers employing this methodology or start their own EEG experiment. We begin with a definition of EEG, and a summary of the strengths and limitations of EEG-based techniques. Following this is a description of the basic theory concerning the cellular mechanisms underlying EEG, as well as two types of data generated by EEG recording. We then present a brief summary of the equipment necessary for EEG data acquisition and important considerations for presentation software. Finally, we provide an overview of the protocol for data acquisition and processing, as well as methods for quantifying both EEG and event-related potentials data.
APA, Harvard, Vancouver, ISO, and other styles
28

Puvenesvary, M., Radziah Abdul Rahim, R. Sivabala Naidu, Mastura Badzis, Noor Fadhilah Mat Nayan, and Noor Hashima Abd Aziz. Qualitative Research: Data Collection & Data Analysis Techniques. UUM Press, 2008. http://dx.doi.org/10.32890/9789833827596.

Full text
Abstract:
Qualitative Research: Data Collection & Data Analysis Techniques is written especially for anyone who is interested in doing, or learning more about, qualitative research methods. The reader-friendly organisation and writing style of the book make it accessible to everyone: academics, professionals, undergraduates, postgraduates, researchers, and even those who are just beginning to explore the field of qualitative research. Each chapter provides clear, contextualized, and comprehensive coverage of the main qualitative research methods (interviews, focus groups, observations, diary studies, archival documents, and content analysis) and will thus equip readers with a thorough understanding of the steps and skills needed to undertake qualitative research effectively. Bringing together qualitative research scholars from three different tertiary institutions in the country (Assoc. Prof. Dr. Puvenesvary Muthiah, Dr. Radziah Abdul Rahim, Puan Noor Hashima Abd Aziz, and Noor Fadhilah Mat Nayan from Universiti Utara Malaysia (UUM); Assoc. Prof. Dr. Mastura Badzis from Universiti Pendidikan Sultan Idris (UPSI); and R. Sivabala Naidu from Darulaman Teacher Training Institute), this book addresses some of the most important questions facing students and researchers in qualitative research.
APA, Harvard, Vancouver, ISO, and other styles
29

Walsh, Bruce, and Michael Lynch. Using Molecular Data to Detect Selection: Signatures from Recent Single Events. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198830870.003.0009.

Full text
Abstract:
Different types and phases of a selective sweep (hard, soft, partial, polygenic) generate different patterns of departures from neutrality, and hence require different tests. It is thus not surprising that a large number of tests have been proposed that use sequence information to detect ongoing, or very-recently completed, episodes of selection. This chapter critically reviews over 50 such tests, which use information on allele-frequency change, linkage disequilibrium patterns, spatial allele-frequency patterns, site-frequency spectrum data, allele-frequency spectrum data, and haplotype structure. This chapter discusses the domain of applicability for each test, and their strengths and weaknesses. Finally, this chapter examines application of these methods in the search for recent, or ongoing, selection in humans and for genes involved in the domestication process in plants and animals.
APA, Harvard, Vancouver, ISO, and other styles
30

Jemielniak, Dariusz. Thick Big Data. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198839705.001.0001.

Full text
Abstract:
The social sciences are becoming datafied. Questions that were once considered the domain of sociologists are now answered by data scientists operating on large datasets and breaking with methodological tradition, for better or worse. The traditional social sciences, such as sociology or anthropology, are thus under the double threat of becoming marginalized or even irrelevant: both because of the new methods of research, which require more computational skills, and because of the increasing competition from the corporate world, which gains an additional advantage based on data access. However, sociologists and anthropologists still have some important assets. Unlike data scientists, they have a long history of doing qualitative research. The more quantified datasets we have, the more difficult it is to interpret them without adding layers of qualitative interpretation. Big Data needs Thick Data. This book presents the available arsenal of new tools for studying society quantitatively, but it also shows the new methods of analysis from the qualitative side and encourages their combination. It argues that Big Data can and should be supplemented and interpreted through thick data, as well as cultural analysis, in a novel approach of Thick Big Data. The book is critically important for students and researchers in the social sciences who want to understand the possibilities of digital analysis, both quantitative and qualitative, and to successfully build mixed-methods approaches.
APA, Harvard, Vancouver, ISO, and other styles
31

Atkeson, Lonna Rae, and R. Michael Alvarez. Introduction to Polling and Survey Methods. Edited by Lonna Rae Atkeson and R. Michael Alvarez. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780190213299.013.34.

Full text
Abstract:
Polling and survey methods is an interdisciplinary activity and includes actors in all areas of society, including academia, government, and the private sector. Designing, implementing, and analyzing high-quality, accurate, and cost-effective polls and surveys requires a combination of skills and methodological perspectives. Despite the well-publicized issues that have cropped up in recent political polling, a great deal is known today about how to collect high-quality polling and survey data even in complex and difficult environments. Quality surveys and good survey data are important because social scientists are only as good as the data produced. Therefore, it is critical to follow best practices and guidelines and help researchers assess a variety of factors to make good choices when collecting and analyzing data. Equally important is transmitting those results to others in a clear and accessible manner.
APA, Harvard, Vancouver, ISO, and other styles
32

Mastroianni, George R. Matters of Method. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190638238.003.0003.

Full text
Abstract:
Chapter 3 explores issues in the application of the traditional methods of psychological science to understanding the Holocaust. Three such issues are (a) What is the nature of available “data” in studying the Holocaust? (b) Can contemporary laboratory research inform our understanding of the Holocaust? (c) Can we reconcile the exculpatory implications of a deterministic analysis with our moral assessment of the Holocaust? Psychologists accustomed to exerting considerable control over the data they produce and study confront considerable challenges in the use of historical data. Laboratory research, while potentially relevant to the Holocaust, similarly presents considerable challenges in assessing the generalizability of such findings to temporally and culturally distant events. Insofar as psychological explanations are framed in the vernacular of social science, the risk of such explanations being seen as inappropriately exculpatory cannot be avoided.
APA, Harvard, Vancouver, ISO, and other styles
33

Witkov, Carey, and Keith Zengel. Chi-Squared Data Analysis and Model Testing for Beginners. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198847144.001.0001.

Full text
Abstract:
This book is the first to make chi-squared model testing, one of the data analysis methods used to discover the Higgs boson and gravitational waves, accessible to undergraduate students in introductory physics laboratory courses. By including uncertainties in the curve fitting, chi-squared data analysis improves on the centuries old ordinary least squares and linear regression methods and combines best fit parameter estimation and model testing in one method. A toolkit of essential statistical and experimental concepts is developed from the ground up with novel features to interest even those familiar with the material. The presentation of one- and two-parameter chi-squared model testing, requiring only elementary probability and algebra, is followed by case studies that apply the methods to simple introductory physics lab experiments. More challenging topics, requiring calculus, are addressed in an advanced topics chapter. This self-contained and student-friendly introduction to chi-squared analysis and model testing includes a glossary, end-of-chapter problems with complete solutions, and software scripts written in several popular programming languages, that the reader can use for chi-squared model testing. In addition to introductory physics lab students, this accessible introduction to chi-squared analysis and model testing will be of interest to all who need to learn chi-squared model testing, e.g. beginning researchers in astrophysics and particle physics, beginners in data science, and lab students in other experimental sciences.
APA, Harvard, Vancouver, ISO, and other styles
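
The chi-squared model testing summarized in the abstract above can be illustrated with a short, self-contained sketch. This is not code from the book (which supplies its own scripts in several languages); it is a minimal, hypothetical Python example of a one-parameter fit of the model y = m*x to data with known measurement uncertainties, combining best-fit parameter estimation with a goodness-of-fit check. The data values are invented for illustration.

    # Minimal one-parameter chi-squared fit, y = m*x, with measurement uncertainties.
    # Illustrative sketch only; the data values below are invented.
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])      # independent variable
    y = np.array([2.1, 3.9, 6.2, 7.9, 10.3])     # measured values
    sigma = np.array([0.2, 0.2, 0.3, 0.3, 0.4])  # one-sigma measurement uncertainties

    w = 1.0 / sigma**2
    m_hat = np.sum(w * x * y) / np.sum(w * x**2)   # chi-squared-minimizing slope
    m_err = np.sqrt(1.0 / np.sum(w * x**2))        # one-sigma uncertainty on the slope

    chi2 = np.sum(((y - m_hat * x) / sigma) ** 2)  # chi-squared of the best fit
    dof = len(x) - 1                               # data points minus fitted parameters

    print(f"m = {m_hat:.3f} +/- {m_err:.3f}")
    print(f"chi2/dof = {chi2:.2f}/{dof} = {chi2 / dof:.2f}")

A reduced chi-squared far from 1 would suggest that either the straight-line model or the quoted uncertainties deserve a second look, which is exactly the kind of model testing the book develops.
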
34

Clark, James S., Dave Bell, Michael Dietze, et al. Assessing the probability of rare climate events. Edited by Anthony O'Hagan and Mike West. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780198703174.013.16.

Full text
Abstract:
This article focuses on the use of Bayesian methods in assessing the probability of rare climate events, and more specifically the potential collapse of the meridional overturning circulation (MOC) in the Atlantic Ocean. It first provides an overview of climate models and their use to perform climate simulations, drawing attention to uncertainty in climate simulators and the role of data in climate prediction, before describing an experiment that simulates the evolution of the MOC through the twenty-first century. MOC collapse is predicted by the GENIE-1 (Grid Enabled Integrated Earth system model) for some values of the model inputs, and Bayesian emulation is used for collapse probability analysis. Data comprising a sparse time series of five measurements of the MOC from 1957 to 2004 are analysed. The results demonstrate the utility of Bayesian analysis in dealing with uncertainty in complex models, and in particular in quantifying the risk of extreme outcomes.
APA, Harvard, Vancouver, ISO, and other styles
35

Zawiszewski, Adam. Processing Ergativity: Behavioral and Electrophysiological Evidence. Edited by Jessica Coon, Diane Massam, and Lisa Demena Travis. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780198739371.013.28.

Full text
Abstract:
So far ergativity has been mostly studied from a language-theoretic perspective and the evidence on how it is processed and represented is rather scarce. In this paper I provide an insight into ergativity from an experimental approach. First, I present an overview of the experimental methods used to investigate ergativity (self-paced reading, event-related potentials and functional magnetic resonance imaging) and next I review studies that examined behavioral, electrophysiological and neuroanatomical correlates of ergativity in both native and non-native speakers, as well as those focused on the universality of processing strategies in ergative languages. Finally, I also review and discuss the experimental data from works that dealt with syntactic and semantic aspects of ergativity and discuss the implication of the results for future research.
APA, Harvard, Vancouver, ISO, and other styles
36

Rathbun, Brian Christopher. Interviewing and Qualitative Field Methods: Pragmatism and Practicalities. Edited by Janet M. Box-Steffensmeier, Henry E. Brady, and David Collier. Oxford University Press, 2009. http://dx.doi.org/10.1093/oxfordhb/9780199286546.003.0029.

Full text
Abstract:
This article recommends the use of intensive, in-depth interviews which can help to establish motivations and preferences, even though they must deal with the perils of ‘strategic reconstruction’. The first section of this article makes the pragmatic case for interviewing. The second portion is devoted to assembling in one place the consensus in the literature on the basics of how to undertake interviews, including issues of how to build arguments using interview data, how to structure questionnaires, the proper role to adopt vis-à-vis respondents, and how to gain access to conversation partners. Doubts about the status of interview data and the reliability of respondents must be taken into account but can be addressed. These disadvantages rarely outweigh the unique advantages of interviewing.
APA, Harvard, Vancouver, ISO, and other styles
37

Hayes, Gillian R. Design, Action, and Practice. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198733249.003.0010.

Full text
Abstract:
Action research (AR) is an approach to research that involves engaging with a community to address some problem or challenge, and through this problem-solving, to develop scholarly knowledge. AR is not a method, nor even a suite of methods, but a perspective that makes use of a wide variety of methods. AR is explicitly democratic, collaborative, and interdisciplinary. It focuses on highly contextualized, localized solutions with a greater emphasis on transferability than generalizability. In other words, scholars and community partners work together to develop and learn from solutions that work in a single context; in addition, they collect data that will enable these solutions to be adapted or transferred to other contexts. Most importantly, AR claims that the intervention, the learning, and the doing and knowing cannot be disentangled.
APA, Harvard, Vancouver, ISO, and other styles
38

Dowd, Cate. Digital Journalism, Drones, and Automation. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780190655860.001.0001.

Full text
Abstract:
Advances in online technology and news systems, such as automated reasoning across digital resources and connectivity to cloud servers for storage and software, have changed digital journalism production and publishing methods. Integrated media systems used by editors are also conduits to search systems and social media, but the lure of big data and rise in fake news have fragmented some layers of journalism, alongside investments in analytics and a shift in the loci for verification. Data has generated new roles to exploit data insights and machine learning methods, but access to big data and data lakes is so significant it has spawned newsworthy partnerships between media moguls and social media entrepreneurs. However, digital journalism does not even have its own semantic systems that could protect the values of journalism, but relies on the affordances of other systems. Amidst indexing and classification systems for well-defined vocabulary and concepts in news, data leaks and metadata present challenges for journalism. By contrast data visualisations and real-time field reporting with short-form mobile media and civilian drones set new standards during the European asylum seeker crisis. Aerial filming with drones also adds to the ontological base of journalism. An ontology for journalism and intersecting ontologies can inform the design of new semantic learning systems. The Semantic CAT Method, which draws on participatory design and game design, also assists the conceptual design of synthetic players with emotion attributes, towards a meta-model for learning. The design of context-aware sensor systems to protect journalists in conflict zones is also discussed.
APA, Harvard, Vancouver, ISO, and other styles
39

Elwood, Mark. Chance variation. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780199682898.003.0008.

Full text
Abstract:
This chapter explains chance variation and statistical tests, including discrete and continuous measures, the concept of significance, one- and two-sided tests, exact tests, precision, and confidence limits. It shows tests of differences in proportions and chi-square tests, the Mantel-Haenszel test, and the calculation of confidence limits, for simple tables and for stratified data. It covers heterogeneity tests, multiplicative and additive models, ordered exposure variables, and tests of trend. It explains statistical tests for matched studies and in multivariate models. Multiple testing, the Bonferroni correction, issues of hypothesis testing and hypothesis generation, and subgroup analyses are discussed. Stopping rules and repeated testing in trials are covered. It explains how to calculate study power and the necessary size of a study. The chapter describes time-to-event analysis, including survival curves, product-limit and actuarial (life-table) methods, the calculation of confidence limits, relative survival ratios, the log-rank test with control for confounding, and multivariate analysis.
APA, Harvard, Vancouver, ISO, and other styles
40

Chou, Roger, Rongwei Fu, Susan Carson, et al., eds. Empirical evaluation of the association between methodological shortcomings and estimates of adverse events. U.S. Dept. of Health and Human Services, Agency for Healthcare Research and Quality, 2006.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
41

Whiting, Rebecca, Helen Roby, Gillian Symon, and Petros Chamakiotis. Participant-led video diaries. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198796978.003.0010.

Full text
Abstract:
Rebecca Whiting, Helen Roby, Gillian Symon, and Petros Chamakiotis develop an unconventional research design using video methods, asking participants to produce their own video diaries, a process which is then followed by narrative interviews. This approach generates multi-modal data: audio, visual, and textual, and involves adopting a qualitative perspective, and a social constructionist epistemology. This participant-led research design allows researchers to investigate a range of issues that are not often recalled in interviews or surveys, by capturing naturally occurring, real-time events and activities, and micro-interactions including non-verbal behaviours. Although video methods are used in other disciplines, they are rare in organizational research. The approach is illustrated by a study which explored how digital technologies affect our ability to manage switches across work-life boundaries. Analysis of participants’ video diaries illustrates the theoretical and reflexive insights that can be gained from this method. The problems and pitfalls encountered in this study are also considered.
APA, Harvard, Vancouver, ISO, and other styles
42

Alexander, Peter D. G., and Malachy O. Columb. Presentation and handling of data, descriptive and inferential statistics. Edited by Jonathan G. Hardman. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780199642045.003.0028.

Full text
Abstract:
The need for any doctor to comprehend, assimilate, analyse, and form an opinion on data cannot be overestimated. This chapter examines the presentation and handling of such data and its subsequent statistical analysis. It covers the organization and description of data, measures of central tendency such as mean, median, and mode, measures of dispersion (standard deviation), and the problems of missing data. Theoretical distributions, such as the Gaussian distribution, are examined and the possibility of data transformation discussed. Inferential statistics are used as a means of comparing groups, and the rationale and use of parametric and non-parametric tests and confidence intervals is outlined. The analysis of categorical variables using the chi-squared test and assessing the value of diagnostic tests using sensitivity, specificity, positive and negative predictive values, and a likelihood ratio are discussed. Measures of association are covered, namely linear regression, as is time-to-event analysis using the Kaplan–Meier method. Finally, the chapter discusses the statistical analysis used when comparing clinical measurements—the Bland and Altman method. Illustrative examples, relevant to the practice of anaesthesia, are used throughout and it is hoped that this will provide the reader with an outline of the methodologies employed and encourage further reading where necessary.
APA, Harvard, Vancouver, ISO, and other styles
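
Many of the survival-analysis titles in this list, including the entry above, cover time-to-event analysis with the Kaplan-Meier (product-limit) method. As a rough orientation, here is a minimal, hypothetical Python sketch of that estimator for right-censored data; the toy times and event indicators are invented, and a real analysis would use an established package rather than this hand-rolled version.

    # Minimal Kaplan-Meier (product-limit) estimator for right-censored data.
    # Illustrative sketch only; the toy data below are invented.
    import numpy as np

    def kaplan_meier(times, events):
        # times: follow-up times; events: 1 = event observed, 0 = right-censored
        times = np.asarray(times, dtype=float)
        events = np.asarray(events, dtype=int)
        order = np.argsort(times)
        times, events = times[order], events[order]

        event_times = np.unique(times[events == 1])   # distinct times with observed events
        survival = []
        s = 1.0
        for t in event_times:
            d = np.sum((times == t) & (events == 1))  # events occurring at time t
            at_risk = np.sum(times >= t)              # subjects still at risk just before t
            s *= 1.0 - d / at_risk                    # product-limit update
            survival.append(s)
        return event_times, np.array(survival)

    # Hypothetical example: 8 subjects, event = 0 marks censored observations.
    t = [2, 3, 3, 5, 6, 8, 9, 12]
    e = [1, 1, 0, 1, 0, 1, 1, 0]
    for ti, si in zip(*kaplan_meier(t, e)):
        print(f"S({ti:.0f}) = {si:.3f}")

The survival estimate drops only at observed event times, while censored observations merely shrink the risk set; this is the quantity underlying the survival curves and log-rank comparisons mentioned in the surrounding entries.
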
43

Pickles, Andrew, Barbara Maughan, and Michael E. J. Wadsworth, eds. Epidemiological methods in life course research. Oxford University Press, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
44

Hari, Riitta, and Aina Puce. MEG-EEG Primer. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780190497774.001.0001.

Full text
Abstract:
This book provides newcomers and more experienced researchers with the very basics of magnetoencephalography (MEG) and electroencephalography (EEG)—two noninvasive methods that can inform about the neurodynamics of the human brain on a millisecond scale. These two closely related methods are addressed side by side, starting from their physical and physiological bases and then advancing to methods of data acquisition, analysis, visualization, and interpretation. Special attention is paid to careful experimentation, guiding the readers to differentiate brain signals from various biological and non-biological artifacts and to ascertain that the collected data are reliable. The strengths and weaknesses of MEG and EEG are presented relative to each other and to other available brain-imaging methods. Necessary instrumentation and laboratory set-ups, as well as potential pitfalls in data collection and analysis are discussed. Spontaneous brain rhythms and evoked responses to sensory and multisensory stimulation are covered and examined both in healthy individuals and in various brain disorders, such as epilepsy. MEG/EEG signals related to motor, cognitive, and social events are discussed as well. The integration of MEG and EEG information with other methods to assess human brain function is discussed with respect to the current state-of-the art in the field. The book ends with a look to future developments in equipment design, and experimentation, emphasizing the role of accurate temporal information for human brain function.
APA, Harvard, Vancouver, ISO, and other styles
45

Moseley, Mason W. Uneven Democracy and Contentious Politics. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190694005.003.0007.

Full text
Abstract:
Building on the previous chapter, this chapter analyzes variation in protest activity across Argentine provinces using statistical analysis. Drawing on two sources of protest events data, survey data, and an inventive method for measuring subnational democracy introduced by Gervasoni (2010), I trace how characteristics of subnational democratic institutions related to electoral competition and executive dominance produce different protest outcomes over the past twenty years. Departing from prior studies of protest in Latin America, I focus on the differential effects of subnational democracy on distinct protest repertoires. That is, might certain institutional characteristics of provinces spur aggressive modes of contention but diminish the incidence of peaceful protests, and vice versa? In conclusion, this chapter reveals that even in a protest state like Argentina, significant subnational variation in terms of democratic quality can produce stark variation in both the prevalence and type of contentious politics.
APA, Harvard, Vancouver, ISO, and other styles
46

Giacovazzo, Carmelo. Phasing in Crystallography. Oxford University Press, 2013. http://dx.doi.org/10.1093/oso/9780199686995.001.0001.

Full text
Abstract:
Modern crystallographic methods originate from the synergy of two main research streams, the small-molecule and the macro-molecular streams. The first stream was able to definitively solve the phase problem for molecules up to 200 atoms in the asymmetric unit. The achievements obtained by the macromolecular stream are also impressive. A huge number of protein structures have been deposited in the Protein Data Bank. The solution of them is no longer reserved to an elite group of scientists, but may be attained in a large number of laboratories around the world, even by young scientists. New probabilistic approaches have been tailored to deal with larger structures, errors in the experimental data, and modest data resolution. Traditional phasing techniques like ab initio, molecular replacement, isomorphous replacement, and anomalous dispersion techniques have been revisited. The new approaches have been implemented in robust phasing programs, which have been organized in automatic pipelines usable even by non-experts. Protein structures, which 50 years ago could take months or even years to solve, can now be solved in a matter of hours, partly also due to technological advances in computer science. This book describes all modern crystallographic phasing methods, and introduces a new rational classification of them. A didactic approach is used, with the techniques described simply and logically in the main text, and further mathematical details confined to the Appendices for motivated readers. Numerous figures and applicative details illustrate the text.
APA, Harvard, Vancouver, ISO, and other styles
47

Kirchman, David L. Microbial growth, biomass production, and controls. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198789406.003.0008.

Full text
Abstract:
Soon after the discovery that bacteria are abundant in natural environments, the question arose as to whether or not they were active. Although the plate count method suggested that they were dormant if not dead, other methods indicated that a large fraction of bacteria and fungi are active, as discussed in this chapter. It goes on to discuss fundamental equations for exponential growth and logistic growth, and it describes phases of growth in batch cultures, continuous cultures, and chemostats. In contrast with measuring growth in laboratory cultures, it is difficult to measure in natural environments for complex communities with co-occurring mortality. Among many methods that have been suggested over the years, the most common one for bacteria is the leucine approach, while for fungi it is the acetate-in ergosterol method. These methods indicate that the growth rate of the bulk community is on the order of days for bacteria in their natural environment. It is faster in aquatic habitats than in soils, and bacteria grow faster than fungi in soils. But bulk rates for bacteria appear to be slower than those for phytoplankton. All of these rates for natural communities are much slower than rates measured for most microbes in the laboratory. Rates in subsurface environments hundreds of meters from light-driven primary production and high organic carbon conditions are even lower. Rates vary greatly among microbial taxa, according to data on 16S rRNA. Copiotrophic bacteria grow much faster than oligotrophic bacteria, but may have low growth rates when conditions turn unfavorable. Some of the factors limiting heterotrophic bacteria and fungi include temperature and inorganic nutrients, but the supply of organic compounds is perhaps most important in most environments.
APA, Harvard, Vancouver, ISO, and other styles
48

Wilson, Mark. Pragmatics’ Place at the Table. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198803478.003.0001.

Full text
Abstract:
Physical events that transpire across many size scales require significant data compression for their successful handling. A popular remedy practiced within modern multiscalar methods breaks a descriptive task into sub-problems focused upon dominant behaviors that arise on different length scales. Each localized form of description employs the same language in different ways. This contextualization requires that these localized veins of description share data with one another in non-standard ways. We employ allied techniques in everyday life as well and philosophical confusions arise when the underlying strategic architecture is not properly recognized. Nine general morals concerning language usage are abstracted from this examination.
APA, Harvard, Vancouver, ISO, and other styles
49

Ingram, Scott E. Climate. Edited by Barbara Mills and Severin Fowles. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199978427.013.40.

Full text
Abstract:
This chapter serves as an introduction to and reference for climate–human behavior studies in the Southwest. These studies investigate potential climatic impacts on social change and historical trajectories. To build foundational understanding, a representative climate–human behavior model is presented and evaluated, commonly used paleoclimatic data are detailed, and methods for identifying climate extremes (e.g., droughts, wet periods) in these data are described. Some extreme climate events and the challenge of identifying their influence (if any) on social change are noted. A familiarity with these aspects of climate–human behavior studies is essential for effectively evaluating interpretations of historical trajectories that invoke climatic influences.
APA, Harvard, Vancouver, ISO, and other styles
50

Hora, Stephen. Probability Elicitation. Edited by Alan Hájek and Christopher Hitchcock. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199607617.013.30.

Full text
Abstract:
Probability Elicitation refers to the practice and methods of encoding the judgments of experts into probabilities or probability distributions. Such probabilistic judgments are often used to provide information that cannot be obtained directly from data, observation, or first principles. Best practices have been developed for probability elicitation by statisticians, psychologists, and decision analysts, among others, and have been found useful in the evaluation of risks and prediction of future events. These best principles include methods for qualifying experts and selecting the number of experts to employ, the organization of these experts, and techniques for encoding the judgments as probabilities. Studies have been conducted to evaluate the reliability of these elicited probabilities, to develop a set of desirable properties for these judgments and the measurement of these properties, and to compare various methods and protocols used to obtain the judgments.
APA, Harvard, Vancouver, ISO, and other styles
