
Books on the topic 'Computational Advantage'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 15 books for your research on the topic 'Computational Advantage.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse books on a wide variety of disciplines and organise your bibliography correctly.

1

Vickery, JR. Food Science and Technology in Australia. CSIRO Publishing, 1990. http://dx.doi.org/10.1071/9780643105003.

Full text
Abstract:
The main purpose of this book is to give food technologists in industry and students in training a comprehensive review of research findings by Australian workers in government, university and industrial laboratories from 1900 to 1990. To further its aims as a reference book, detailed bibliographies of some 1400 research papers have been compiled, particularly for the period before references became accessible through databases. Another aim was to draw attention to the many contributions which brought international recognition to their authors, particularly those who did not have the advantages of modern separation, analytical and computational techniques.
APA, Harvard, Vancouver, ISO, and other styles
2

Brown, Andrew R., Damián Keller, and Maria Helena de Lima. How Ubiquitous Technologies Support Ubiquitous Music. Edited by Brydie-Leigh Bartleet and Lee Higgins. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780190219505.013.5.

Full text
Abstract:
Pervasive computing technologies are providing opportunities and challenges for new musical practices and offering greater access to musical interactions for people at all levels of musical experience. In this chapter we review theoretical insights and practical experiences of taking advantage of these opportunities and meeting these challenges; we describe how to leverage ubiquitous technologies to support ubiquitous music; and we discuss ideas and techniques that can assist in ensuring that social music activities provide an appropriate variety of experiences and strategies to maximize socially positive and musically creative outcomes. Strategies include starting with what is known and available, enhancing human skills with computational automation, and increasing participation through simplification to improve access and promote cultures of open sharing. Three case studies illustrate how these ideas are put into practice, covering experiences from across the world based in varied social contexts and using differing technologies, but sharing the same ambition of enhancing everyday experience through musical interactions mediated by pervasive technologies.
APA, Harvard, Vancouver, ISO, and other styles
3

Coveney, Peter V., and Shunzhou Wan. Molecular Dynamics: Probability and Uncertainty. Oxford University Press, 2025. https://doi.org/10.1093/9780198893479.001.0001.

Full text
Abstract:
This book explores the intersection of molecular dynamics (MD) simulation with advanced probabilistic methodologies to address the inherent uncertainties in the approach. Beginning with a clear and comprehensible introduction to classical mechanics, the book transitions into the probabilistic formulation of MD, highlighting the importance of ergodic theory, kinetic theory, and unstable periodic orbits, concepts which are largely unknown to current practitioners within the domain. It discusses ensemble-based simulations, free energy calculations and the study of polymer nanocomposites using multi-scale modelling, providing detailed guidance on performing reliable and reproducible computations. A thorough discussion of Verification, Validation, and Uncertainty Quantification (VVUQ) lays out a definitive approach to formulating the uncertainty of MD modelling and simulation. Its interaction with artificial intelligence is examined in the light of these issues. While machine learning (ML) methods offer some advantages and less often noted drawbacks, the integration of ML with physics-based MD simulations may in future enhance our ability to predict new drugs and advanced materials. The final chapter, ‘The End of Certainty’, synthesizes these themes, advocating a systematic and careful approach to computational research and the embrace of uncertainty as an integral part of innovation. This book serves as a highly original, conceptual and comprehensible guide for researchers and practitioners, emphasizing the need for rapid, accurate, precise and actionable techniques in the rapidly evolving field of molecular dynamics.
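The abstract describes ensemble-based MD and uncertainty quantification only at a high level. Purely as an illustration (not taken from the book), the following minimal Python sketch integrates a 1D harmonic oscillator with the standard velocity-Verlet scheme over an ensemble of perturbed initial conditions, so that the estimate of an observable comes with a spread, the simplest version of the uncertainty the book is concerned with.

```python
import numpy as np

def velocity_verlet(x0, v0, force, mass=1.0, dt=1e-3, steps=10_000):
    """Integrate one trajectory with the velocity-Verlet scheme."""
    x, v = x0, v0
    a = force(x) / mass
    xs = np.empty(steps)
    for i in range(steps):
        x += v * dt + 0.5 * a * dt**2
        a_new = force(x) / mass
        v += 0.5 * (a + a_new) * dt
        a = a_new
        xs[i] = x
    return xs

# Harmonic force as a stand-in potential (illustrative choice, k = 1).
force = lambda x: -1.0 * x

# Ensemble of trajectories with slightly perturbed initial conditions.
rng = np.random.default_rng(0)
estimates = []
for _ in range(50):
    xs = velocity_verlet(x0=1.0 + 0.01 * rng.standard_normal(),
                         v0=0.0, force=force)
    estimates.append(np.mean(xs**2))  # crude <x^2> observable per replica

print(f"<x^2> = {np.mean(estimates):.4f} +/- {np.std(estimates):.4f}")
```

Reporting the observable as a mean with a spread over replicas, rather than as a single number from one trajectory, is the basic habit that the book's VVUQ discussion formalizes.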
APA, Harvard, Vancouver, ISO, and other styles
4

Allen, Colin, Peter M. Todd, and Jonathan M. Weinberg. Reasoning and Rationality. Edited by Eric Margolis, Richard Samuels, and Stephen P. Stich. Oxford University Press, 2012. http://dx.doi.org/10.1093/oxfordhb/9780195309799.013.0003.

Full text
Abstract:
The article explores five strands of Cartesian thought, namely individualism, internalism, rationalism, universalism, and human exceptionalism, as they appear in philosophical and psychological theories of rationality. Ecological rationality comes about through the coadaptation of minds and their environments. The internal bounds, comprising the capacities of the cognitive system, can be shaped by evolution, learning, or development to take advantage of the structure of the external environment. The external bounds, comprising the structure of information available in the environment, can be shaped by the effects of minds making decisions in the world, including, most notably in humans, the process of cultural evolution. The internal constraints on decision-making, such as limited computational power and limited memory in the organism, and the external ones, such as limited time, push toward simple cognitive mechanisms for making decisions quickly and without much information. Human exceptionalism is the strand of Residual Cartesianism that puts the greatest focus on language and symbolic reasoning as the basis for human rationality. The invention of symbolic systems exhibits how humans deliberately and creatively alter their environments to enhance learning and memory and to support reasoning. Nonhuman animals also alter their environments in ways that support adaptive behavior. Stigmergy, an important mechanism for swarm intelligence, is the product of interactions among multiple agents and their environments and is enhanced through cumulative modification of the environment by individuals.
APA, Harvard, Vancouver, ISO, and other styles
5

Jemielniak, Dariusz. Thick Big Data. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198839705.001.0001.

Full text
Abstract:
The social sciences are becoming datafied. Questions that were once considered the domain of sociologists are now answered by data scientists operating on large datasets and breaking with methodological tradition, for better or worse. The traditional social sciences, such as sociology or anthropology, are thus under the double threat of becoming marginalized or even irrelevant: both because of the new methods of research, which require more computational skills, and because of the increasing competition from the corporate world, which gains an additional advantage based on data access. However, sociologists and anthropologists still have some important assets, too. Unlike data scientists, they have a long history of doing qualitative research. The more quantified datasets we have, the more difficult it is to interpret them without adding layers of qualitative interpretation. Big Data needs Thick Data. This book presents the available arsenal of new tools for studying society quantitatively, but also shows new methods of analysis from the qualitative side and encourages their combination. It shows that Big Data can and should be supplemented and interpreted through thick data, as well as cultural analysis, in a novel approach of Thick Big Data. The book is critically important for students and researchers in the social sciences seeking to understand the possibilities of digital analysis, both quantitative and qualitative, and to successfully build mixed-methods approaches.
APA, Harvard, Vancouver, ISO, and other styles
6

Patisaul, Heather B., and Scott M. Belcher. The Path Forward. Oxford University Press, 2017. http://dx.doi.org/10.1093/acprof:oso/9780199935734.003.0008.

Full text
Abstract:
This chapter focuses on the contemporary research approaches being used to understand the actions of endocrine-disrupting chemicals (EDCs) and on emerging high-throughput screening approaches to examine new and existing chemicals for endocrine-disrupting activities. Concepts arising from the 2007 NRC report “Toxicity Testing in the 21st Century: A Vision and a Strategy” are delineated, and the ongoing development of predictive computational toxicology approaches is addressed. The screening strategies being developed under the Tox21 and Toxicity Forecaster (ToxCast) programs are described, with a review of advantages, challenges, and progress to date. There is a brief overview of the EPA’s Interactive Chemical Safety for Sustainability (iCSS) Dashboard as a portal for accessing the ToxCast data through ToxCastDB, and the EPA’s Aggregated Computational Toxicology data warehouse (ACToR), which contains all publicly available EPA chemical toxicity data. Additional challenges related to the inability of current screening approaches to address the complex physiology involved in neuroendocrine disruption are also addressed.
APA, Harvard, Vancouver, ISO, and other styles
7

Flarend, Alice, and Robert Hilborn. Quantum Computing: From Alice to Bob. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780192857972.001.0001.

Full text
Abstract:
Quantum Computing: From Alice to Bob provides a distinctive and accessible introduction to the rapidly growing fields of quantum information science (QIS) and quantum computing (QC). The book is designed for undergraduate students and upper-level secondary school students with little or no background in physics, computer science, or mathematics beyond secondary school algebra and trigonometry. While broadly accessible, the book provides a solid conceptual and formal understanding of quantum states and entanglement—the key ingredients in quantum computing. The authors give detailed treatments of many of the classic quantum algorithms that demonstrate how and when QC has an advantage over classical computers. The book provides a solid explanation of the physics of QC and QIS and then weds that knowledge to the mathematics of QC algorithms and how those algorithms deploy the principles of quantum physics to solve problems. This book connects the physics concepts, the computer science vocabulary, and the mathematics, providing a complete picture of how QIS and QC work. The authors give multiple representations of the concepts—textual, graphical, and symbolic (state vectors, matrices, and Dirac notation)—which are the lingua franca of QIS and QC. Those multiple representations allow the readers to develop a broader and deeper understanding of the fundamental concepts and their applications. In addition, the book provides examples of recent experimental demonstrations of quantum teleportation and the applications of quantum computational chemistry. The last chapter connects to the growing commercial world of QC and QIS and provides recommendations for further study.
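As a concrete taste of the state-vector and matrix representations the abstract mentions (an illustrative sketch, not an excerpt from the book), the following few lines of Python/NumPy build the entangled Bell state (|00> + |11>)/sqrt(2) by applying a Hadamard and a CNOT gate to |00>.

```python
import numpy as np

# Computational basis states |0> and |1> as state vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Single-qubit Hadamard, identity, and the two-qubit CNOT gate as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to the first qubit, then CNOT: an entangled Bell pair.
psi = np.kron(ket0, ket0)
psi = CNOT @ (np.kron(H, I) @ psi)

print(psi)  # approx [0.7071, 0, 0, 0.7071], i.e. (|00> + |11>)/sqrt(2)
```

The resulting vector cannot be written as a product of two single-qubit states, which is exactly the entanglement property the book builds its algorithms on.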
APA, Harvard, Vancouver, ISO, and other styles
8

Anderson, James A. Computing Hardware. Oxford University Press, 2018. http://dx.doi.org/10.1093/acprof:oso/9780199357789.003.0002.

Full text
Abstract:
Chapter 2 presents a kind of computation currently unfamiliar to most: the analog computer. Fifty years ago, analog computers were considered viable competitors to the newer digital computer. Analog computers compute by the use of physical analogs, using, for example, voltages, currents, or shaft positions to represent numbers. They compute using the device properties, not logic. Examples include the balance, a simple device known for millennia; the “Antikythera mechanism,” a complex astronomical calculator from the first century BC; the slide rule; the US Navy’s Mark I fire control computer used for much of the 20th century to aim naval gunfire; and electronic analog computers built in large numbers after World War II. Analog computers can have advantages in ruggedness, simplicity, and reliability but lack the flexibility of digital computers.
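To make the idea of computing "using the device properties, not logic" concrete, here is a small illustrative sketch (not from the chapter): the classic electronic analog-computer patch for the oscillator equation x'' = -x wires two integrators and a sign inverter in a loop, and the Python below mimics that continuous integration with a simple time-stepping loop standing in for the op-amp integrators.

```python
import math

# Analog patch for x'' = -x: inverter (-x) -> integrator (gives x') -> integrator (gives x).
# A digital time-stepping loop stands in for the continuous integration that the
# hardware performs with capacitor charge.
dt = 1e-4
x, xdot = 1.0, 0.0                            # initial "voltages" on the two integrators
for _ in range(int(2 * math.pi / dt)):        # run for one full oscillation period
    xddot = -x                                # the inverter feeds -x back into the loop
    xdot += xddot * dt                        # first integrator output
    x += xdot * dt                            # second integrator output

print(f"x after one period: {x:.4f}  (exact value: 1.0)")
```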
APA, Harvard, Vancouver, ISO, and other styles
9

Allwein, Gerard, and Jon Barwise. Logical Reasoning with Diagrams. Oxford University Press, 1996. http://dx.doi.org/10.1093/oso/9780195104271.001.0001.

Full text
Abstract:
One effect of information technology is the increasing need to present information visually. The trend raises intriguing questions. What is the logical status of reasoning that employs visualization? What are the cognitive advantages and pitfalls of this reasoning? What kinds of tools can be developed to aid in the use of visual representation? This newest volume in the Studies in Logic and Computation series addresses the logical aspects of the visualization of information. The authors of these specially commissioned papers explore the properties of diagrams, charts, and maps, and their use in problem solving and teaching basic reasoning skills. As computers make visual representations more commonplace, it is important for professionals, researchers and students in computer science, philosophy, and logic to develop an understanding of these tools; this book can clarify the relationship between visuals and information.
APA, Harvard, Vancouver, ISO, and other styles
10

Green, Peter, Kanti Mardia, Vysaul Nyirongo, and Yann Ruffieux. Bayesian modelling for matching and alignment of biomolecules. Edited by Anthony O'Hagan and Mike West. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780198703174.013.2.

Full text
Abstract:
This article describes Bayesian modelling for matching and alignment of biomolecules. One particular task where statistical modelling and inference can be useful in scientific understanding of protein structure is that of matching and alignment of two or more proteins. In this regard, statistical shape analysis potentially has something to offer in solving biomolecule matching and alignment problems. The article discusses the use of Bayesian methods for shape analysis to assist with understanding the three-dimensional structure of protein molecules, with a focus on the problem of matching instances of the same structure in the CoMFA (Comparative Molecular Field Analysis) database of steroid molecules. It introduces a Bayesian hierarchical model for pairwise matching and for alignment of multiple configurations before concluding with an overview of some advantages of the Bayesian approach to problems in protein bioinformatics, along with modelling and computation issues, alternative approaches, and directions for future research.
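As a toy illustration of Bayesian matching of configurations (a deliberately simplified 2D sketch under assumed Gaussian noise, not the hierarchical model of the chapter), the following Python code infers the rotation relating two labelled landmark sets with a random-walk Metropolis sampler over the rotation angle.

```python
import numpy as np

# Toy 2D Bayesian pairwise alignment: infer a rotation angle theta relating two
# labelled landmark configurations, Y ~ R(theta) X + Gaussian noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(6, 2))                      # "template" configuration
theta_true = 0.6
R = lambda t: np.array([[np.cos(t), -np.sin(t)],
                        [np.sin(t),  np.cos(t)]])
Y = X @ R(theta_true).T + 0.05 * rng.normal(size=X.shape)

def log_post(theta, sigma=0.05):
    """Gaussian likelihood on the residuals, flat prior on theta."""
    resid = Y - X @ R(theta).T
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis over the rotation angle.
theta, samples = 0.0, []
for _ in range(20_000):
    prop = theta + 0.1 * rng.normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

burned = np.array(samples[5_000:])
print(f"posterior mean {burned.mean():.3f} +/- {burned.std():.3f} (true {theta_true})")
```

The chapter's models additionally handle unknown correspondences, translations, and multiple configurations; the sketch only shows the basic posterior-sampling idea.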
APA, Harvard, Vancouver, ISO, and other styles
11

Di Ventra, Massimiliano. MemComputing. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780192845320.001.0001.

Full text
Abstract:
From the originator of MemComputing comes the very first book on this new computing paradigm that employs time non-locality (memory) to both process and store information. The book discusses the rationale behind MemComputing, its theoretical foundations, and wide-ranging applicability to combinatorial optimization problems, Machine Learning, and Quantum Mechanics. The book is ideal for graduate students in Physics, Computer Science, Electrical Engineering, and Mathematics as well as researchers in both academia and industry interested in unconventional computing. The author relies on extensive margin notes, important remarks, and several artworks to better explain the main concepts and clarify all the jargon, making the book as self-contained as possible. The reader will be guided from the basic notions to the more advanced ones with a writing style that is always clear and engaging. Along the way, the reader will appreciate the advantages of this computing paradigm and the major differences that set it apart from the prevailing Turing model of computation, and even Quantum Computing.
APA, Harvard, Vancouver, ISO, and other styles
12

Laver, Michael, and Ernest Sergenti. Party Competition. Princeton University Press, 2017. http://dx.doi.org/10.23943/princeton/9780691139036.001.0001.

Full text
Abstract:
Party competition for votes in free and fair elections involves complex interactions by multiple actors in political landscapes that are continuously evolving, yet classical theoretical approaches to the subject leave many important questions unanswered. This book offers the first comprehensive treatment of party competition using the computational techniques of agent-based modeling. This exciting new technology enables researchers to model competition between several different political parties for the support of voters with widely varying preferences on many different issues. The book models party competition as a true dynamic process in which political parties rise and fall, a process where different politicians attack the same political problem in very different ways, and where today's political actors, lacking perfect information about the potential consequences of their choices, must constantly adapt their behavior to yesterday's political outcomes. This book shows how agent-based modeling can be used to accurately reflect how political systems really work. It demonstrates that politicians who are satisfied with relatively modest vote shares often do better at winning votes than rivals who search ceaselessly for higher shares of the vote. It reveals that politicians who pay close attention to their personal preferences when setting party policy often have more success than opponents who focus solely on the preferences of voters, that some politicians have idiosyncratic “valence” advantages that enhance their electability—and much more.
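For readers unfamiliar with agent-based models of party competition, the following toy Python sketch (an illustration only; it is not the book's model or its decision rules) places voters and two parties on a one-dimensional policy space and lets one party adapt its position in response to changes in its vote share.

```python
import numpy as np

# Toy 1D spatial-voting ABM: voters support the nearest party; a "hunter"-style
# party keeps moving in whatever direction last raised its vote share, while a
# "sticker" never moves.
rng = np.random.default_rng(42)
voters = rng.normal(0.0, 1.0, size=5_000)         # voter ideal points
positions = {"hunter": 2.0, "sticker": -0.5}       # initial party positions

def shares(pos):
    d = np.abs(voters[:, None] - np.array(list(pos.values())))
    winners = np.array(list(pos))[d.argmin(axis=1)]
    return {p: np.mean(winners == p) for p in pos}

step, last_share = 0.1, shares(positions)["hunter"]
for _ in range(100):
    positions["hunter"] += step
    new_share = shares(positions)["hunter"]
    if new_share < last_share:                     # reverse course when support drops
        step = -step
    last_share = new_share

print(positions, shares(positions))
```

Even this crude adaptive rule drifts toward the voter mass, which gives a flavour of the richer dynamics, with many parties and several decision rules, that the book studies systematically.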
APA, Harvard, Vancouver, ISO, and other styles
13

Audring, Jenny, and Francesca Masini, eds. The Oxford Handbook of Morphological Theory. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199668984.001.0001.

Full text
Abstract:
Morphology, the science of words, is a complex theoretical landscape, where a multitude of frameworks, each with their own tenets and formalism, compete for the explanation of linguistic facts. The Oxford Handbook of Morphological Theory is a comprehensive guide through this jungle of morphological theories. It provides a rich and up-to-date overview of theoretical frameworks, from Structuralism to Optimality Theory and from Minimalism to Construction Morphology. In the core part of the handbook (Part II), each theory is introduced by a practitioner, who guides the reader through its principles and technicalities, its advantages and disadvantages. All chapters are written to be accessible, authoritative, and critical. Cross-references reveal agreements and disagreements among frameworks, and a rich body of references encourages further reading. As well as introducing individual theories, the volume speaks to the bigger picture. Part I identifies time-honoured issues in word-formation and inflection that have set the theoretical scene. Part III connects morphological theory to other fields of linguistics. These include typology and creole linguistics, diachronic change and synchronic variation, first and second language acquisition, psycho-/neurolinguistics, computational linguistics, and sign language theory. Each of these fields informs and challenges morphological theory in particular ways. By linking specialist data and insights from the various subfields, the volume fosters the dialogue among sub-disciplines that is much needed for a graceful integration of linguistic thinking.
APA, Harvard, Vancouver, ISO, and other styles
14

Wikle, Christopher K. Spatial Statistics. Oxford University Press, 2018. http://dx.doi.org/10.1093/acrefore/9780190228620.013.710.

Full text
Abstract:
The climate system consists of interactions between physical, biological, chemical, and human processes across a wide range of spatial and temporal scales. Characterizing the behavior of components of this system is crucial for scientists and decision makers. There is substantial uncertainty associated with observations of this system as well as our understanding of various system components and their interaction. Thus, inference and prediction in climate science should accommodate uncertainty in order to facilitate the decision-making process. Statistical science is designed to provide the tools to perform inference and prediction in the presence of uncertainty. In particular, the field of spatial statistics considers inference and prediction for uncertain processes that exhibit dependence in space and/or time. Traditionally, this is done descriptively through the characterization of the first two moments of the process, one expressing the mean structure and one accounting for dependence through covariability.
Historically, there are three primary areas of methodological development in spatial statistics: geostatistics, which considers processes that vary continuously over space; areal or lattice processes, which considers processes defined on a countable discrete domain (e.g., political units); and spatial point patterns (or point processes), which consider the locations of events in space to be a random process. All of these methods have been used in the climate sciences, but the most prominent has been the geostatistical methodology. This methodology was simultaneously discovered in geology and in meteorology; it provides a way to do optimal prediction (interpolation) in space and can facilitate parameter inference for spatial data. These methods rely strongly on Gaussian process theory, which is increasingly of interest in machine learning. These methods are common in the spatial statistics literature, but much development is still being done in the area to accommodate more complex processes and “big data” applications. Newer approaches are based on restricting models to neighbor-based representations or reformulating the random spatial process in terms of a basis expansion. There are many computational and flexibility advantages to these approaches, depending on the specific implementation. Complexity is also increasingly being accommodated through the use of the hierarchical modeling paradigm, which provides a probabilistically consistent way to decompose the data, process, and parameters corresponding to the spatial or spatio-temporal process.
Perhaps the biggest challenge in modern applications of spatial and spatio-temporal statistics is to develop methods that are flexible yet can account for the complex dependencies between and across processes, account for uncertainty in all aspects of the problem, and still be computationally tractable. These are daunting challenges, yet it is a very active area of research, and new solutions are constantly being developed. New methods are also being rapidly developed in the machine learning community, and these methods are increasingly more applicable to dependent processes. The interaction and cross-fertilization between the machine learning and spatial statistics communities is growing, which will likely lead to a new generation of spatial statistical methods that are applicable to climate science.
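As a minimal illustration of the geostatistical prediction described above (a sketch under simplifying assumptions: one spatial dimension, a known zero mean, a fixed exponential covariance and nugget, none of which come from the article), the following Python performs simple kriging at a few new sites from noisy observations.

```python
import numpy as np

# Simple-kriging sketch: predict a zero-mean spatial field at new sites from
# noisy observations, using an exponential covariance function, i.e. the
# "first two moments" characterization described in the abstract.
rng = np.random.default_rng(0)

def exp_cov(a, b, sill=1.0, range_=0.5):
    d = np.abs(a[:, None] - b[None, :])            # pairwise distances (1D sites)
    return sill * np.exp(-d / range_)

s_obs = rng.uniform(0, 1, 15)                       # observation locations
z_obs = np.sin(2 * np.pi * s_obs) + 0.1 * rng.normal(size=15)
s_new = np.linspace(0, 1, 5)                        # prediction locations

C = exp_cov(s_obs, s_obs) + 0.1**2 * np.eye(15)     # data covariance + nugget
c0 = exp_cov(s_obs, s_new)                          # cross-covariance
weights = np.linalg.solve(C, c0)                    # kriging weights
z_pred = weights.T @ z_obs
pred_var = exp_cov(s_new, s_new).diagonal() - np.sum(weights * c0, axis=0)

for s, z, v in zip(s_new, z_pred, pred_var):
    print(f"s={s:.2f}  prediction={z:+.2f}  kriging variance={v:.3f}")
```

The kriging variance grows at prediction sites far from the data, which is the explicit accounting for uncertainty that the article argues climate applications require.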
APA, Harvard, Vancouver, ISO, and other styles
15

Hilgurt, S. Ya, and O. A. Chemerys. Reconfigurable signature-based information security tools of computer systems. PH “Akademperiodyka”, 2022. http://dx.doi.org/10.15407/akademperiodyka.458.297.

Full text
Abstract:
The book is devoted to the research and development of methods for combining computational structures for reconfigurable signature-based information protection tools for computer systems and networks in order to increase their efficiency. Network security tools based, among others, on such AI approaches as deep neural networks, despite the great progress shown in recent years, still suffer from a nonzero recognition error probability. Even a low probability of such an error in a critical infrastructure can be disastrous. Therefore, signature-based recognition methods, with their theoretically exact matching, remain relevant when creating information security systems such as network intrusion detection systems, antivirus, anti-spam, and worm-containment systems. The real-time multi-pattern string matching task has been a major performance bottleneck in such systems. To speed up the recognition process, developers use a reconfigurable hardware platform based on FPGA devices. Such a platform provides almost software-level flexibility and near-ASIC performance. The most important component of a signature-based information security system in terms of efficiency is the recognition module, in which the multi-pattern matching task is directly solved. It must not only check each byte of input data, at speeds of tens and hundreds of gigabits per second, against hundreds of thousands or even millions of patterns in the signature database, but also change its structure every time a new signature appears or the operating conditions of the protected system change. As a result of the analysis of numerous examples of the development of reconfigurable information security systems, the three most promising approaches to the construction of hardware circuits for recognition modules were identified, namely content-addressable memory based on digital comparators, Bloom filters, and Aho–Corasick finite automata. A method for fast quantification of the components of the recognition module and the entire system is proposed. The method makes it possible to exclude resource-intensive procedures for synthesizing digital circuits on FPGAs when building complex reconfigurable information security systems and their components. To improve the efficiency of the systems under study, structural-level combination methods are proposed, which allow several matching schemes built on different approaches, and their modifications, to be combined into a single recognition device in such a way that their advantages are enhanced and their disadvantages are eliminated. In order to achieve the maximum efficiency of combining, optimization methods are used. The methods of parallel combining, sequential cascading, and vertical junction have been formulated and investigated. The principle of multi-level combining of these methods is also considered and researched. Algorithms for the implementation of the proposed combining methods have been developed. Software has been created that allows experiments to be conducted with the developed methods and tools. Quantitative estimates are obtained for the increase in efficiency of constructing recognition modules as a result of using the combination methods. The issue of optimization of reconfigurable devices described in hardware description languages is considered. A modification of the method of affine transformations is presented, which allows the parallelization of loops that cannot be optimized by other methods.
In order to facilitate the practical application of the developed methods and tools, a web service using high-performance grid and cloud computing technologies is considered. The proposed methods for increasing the efficiency of the matching procedure can also be used to solve important problems in other fields of science, such as data mining and the analysis of DNA molecules.
Keywords: information security, signature, multi-pattern matching, FPGA, structural combining, efficiency, optimization, hardware description language.
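For readers unfamiliar with the multi-pattern matching task at the heart of the book, here is a compact software sketch (illustrative only; the book's contribution concerns hardware realizations and their combination) of the Aho–Corasick automaton, one of the three approaches named in the abstract, matching many signatures against a byte stream in a single pass. The example signatures are hypothetical.

```python
from collections import deque

# Compact Aho-Corasick automaton: scans the input once regardless of how many
# signatures are loaded, which is why it suits line-rate recognition modules.
def build(patterns):
    goto, fail, out = [{}], [0], [set()]        # node 0 is the root
    for p in patterns:                          # build the keyword trie
        s = 0
        for ch in p:
            if ch not in goto[s]:
                goto.append({}); fail.append(0); out.append(set())
                goto[s][ch] = len(goto) - 1
            s = goto[s][ch]
        out[s].add(p)
    q = deque(goto[0].values())                 # BFS to set failure links
    while q:
        r = q.popleft()
        for ch, s in goto[r].items():
            q.append(s)
            f = fail[r]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[s] = goto[f].get(ch, 0)
            out[s] |= out[fail[s]]              # inherit matches ending here
    return goto, fail, out

def scan(data, goto, fail, out):
    s, hits = 0, []
    for i, ch in enumerate(data):
        while s and ch not in goto[s]:          # follow failure links on mismatch
            s = fail[s]
        s = goto[s].get(ch, 0)
        for p in out[s]:
            hits.append((i - len(p) + 1, p))    # (offset, matched signature)
    return hits

# Hypothetical example signatures; real systems load hundreds of thousands.
sigs = [b"cmd.exe", b"\x90\x90\x90\x90", b"evil-payload"]
automaton = build(sigs)
print(scan(b"GET /dl?f=cmd.exe\x90\x90\x90\x90", *automaton))
```

The FPGA designs discussed in the book implement this same automaton (alongside comparator-based CAM and Bloom-filter schemes) directly in hardware, and the combination methods decide which scheme handles which part of the signature set.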
APA, Harvard, Vancouver, ISO, and other styles