Academic literature on the topic 'Operatic database'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Operatic database.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Operatic database"

1

van Nieuwkerk, Mascha M., Liselotte Salters, R. M. Helmers, and Ivan Kisjes. "Operatic Productions in the Netherlands, 1886–1995: from Printed Annals to Searchable Performance Data." Research Data Journal for the Humanities and Social Sciences 5, no. 2 (November 4, 2020): 79–90. http://dx.doi.org/10.1163/24523666-00502007.

Full text
Abstract:
This data paper accompanies the database Operatic productions in the Netherlands, an open dataset containing details on over five thousand opera productions in the Netherlands between 1885 and 1995 extracted from the Annalen van de Nederlandse Opera-gezelschappen (Annals of the Opera Companies in the Netherlands), which appeared in book form in 1996. These data give an extremely rich account of the performance history of operatic works and the personnel involved in their production. Since the original publication lacks a critical introduction, the authors have attempted to reconstruct the origins and systematics of the collection. They also discuss the attributes of the data and the basic data structure in order to give users relevant information to use and restructure the data for their interests. The data structure and metadata classifications are based on an inventory of the classifications used in existing performing arts databases across Europe. This facilitates future connections to other relevant performing arts datasets. The transfer of the Annals into a relational database finally brings out their full research potential.
APA, Harvard, Vancouver, ISO, and other styles
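The data paper above describes turning the printed Annals into a relational database of productions, works, and the personnel involved. The schema sketched below is purely illustrative and assumed (the table and column names are hypothetical, not the published data structure); it only shows how such performance data might be laid out relationally, using Python's built-in sqlite3 module.

    import sqlite3

    # Hypothetical, simplified layout for opera production data; the dataset's real
    # structure is documented in the data paper and differs from this sketch.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE work        (work_id INTEGER PRIMARY KEY, title TEXT, composer TEXT);
    CREATE TABLE production  (prod_id INTEGER PRIMARY KEY,
                              work_id INTEGER REFERENCES work(work_id),
                              company TEXT, season_start INTEGER);
    CREATE TABLE person      (person_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE involvement (prod_id INTEGER REFERENCES production(prod_id),
                              person_id INTEGER REFERENCES person(person_id),
                              role TEXT);
    """)
    conn.execute("INSERT INTO work VALUES (1, 'Carmen', 'Georges Bizet')")
    conn.execute("INSERT INTO production VALUES (1, 1, 'Example Opera Company', 1950)")
    conn.execute("INSERT INTO person VALUES (1, 'Example Conductor')")
    conn.execute("INSERT INTO involvement VALUES (1, 1, 'conductor')")

    # Performance-history style query: productions of one work and who was involved.
    rows = conn.execute("""
        SELECT w.title, p.company, p.season_start, pe.name, i.role
        FROM production p
        JOIN work w USING (work_id)
        JOIN involvement i ON i.prod_id = p.prod_id
        JOIN person pe USING (person_id)
        WHERE w.title = 'Carmen'
    """).fetchall()
    print(rows)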
2

Albrecht-Hohmaier, Martin, Berthold Over, Emilia Pelliccia, and Sonia Wronkowska. "Pasticcio-Daten und Daten-Pasticcio – zur Edition kompilierter musikalischer Werke." editio 34, no. 1 (November 1, 2020): 45–71. http://dx.doi.org/10.1515/editio-2020-0004.

Full text
Abstract:
This text, which was written by members of the project Pasticcio. Ways of arranging attractive operas, uses Gérard Genette’s term ‘paratext’ in a broader sense. Assuming that every text besides the edited work, which is part of the genesis of these oscillating works compiled of pre-existing musical material called pasticcios, can be understood as a paratext, it reveals the importance of printed librettos, related musical sources and philological contexts (like the often underestimated influence of the singers in the creation of a musical work) for their understanding and edition. Aiming at connecting a database with the online edition of four operatic works via XML, the project members present ideas and structures of the project’s data which can be described as a paratext as well.
APA, Harvard, Vancouver, ISO, and other styles
3

SZABÓOVÁ, Veronika, Csaba SZABÓ, Valerie NOVITZKÁ, and Emília DEMETEROVÁ. "GAME SEMANTICS OF THE TRANSACTION ROLLBACK DATABASE OPERATION." Acta Electrotechnica et Informatica 15, no. 1 (March 1, 2015): 3–8. http://dx.doi.org/10.15546/aeei-2015-0001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Pham, Shirzadi, Shahabi, Omidvar, Singh, Sahana, Asl, Ahmad, Quoc, and Lee. "Landslide Susceptibility Assessment by Novel Hybrid Machine Learning Algorithms." Sustainability 11, no. 16 (August 13, 2019): 4386. http://dx.doi.org/10.3390/su11164386.

Full text
Abstract:
Landslides have multidimensional effects on the socioeconomic as well as environmental conditions of the impacted areas. The aim of this study is the spatial prediction of landslides using hybrid machine learning models including bagging (BA), random subspace (RS) and rotation forest (RF) with alternating decision tree (ADTree) as the base classifier in the northern part of the Pithoragarh district, Uttarakhand, Himalaya, India. To construct the database, ten conditioning factors and a total of 103 landslide locations with a ratio of 70/30 were used. The significant factors were determined by the chi-square attribute evaluation (CSEA) technique. The validity of the hybrid models was assessed by true positive rate (TP Rate), false positive rate (FP Rate), recall (sensitivity), precision, F-measure and area under the receiver operating characteristic curve (AUC). Results concluded that land cover was the most important factor, while curvature had no effect on landslide occurrence in the study area and was removed from the modelling process. Additionally, results indicated that although all ensemble models enhanced the predictive power of the ADTree classifier (AUCtraining = 0.859; AUCvalidation = 0.813), the RS ensemble model (AUCtraining = 0.883; AUCvalidation = 0.842) outperformed the RF (AUCtraining = 0.871; AUCvalidation = 0.840) and the BA (AUCtraining = 0.865; AUCvalidation = 0.836) ensemble models. The obtained results would be helpful for recognizing landslide-prone areas in the future to better manage and decrease the damage and negative impacts on the environment.
APA, Harvard, Vancouver, ISO, and other styles
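The landslide study above couples ensemble methods (bagging, random subspace, rotation forest) with a decision-tree base classifier and scores them by AUC. The sketch below is only a rough analogue under stated assumptions: it uses scikit-learn with synthetic data, a plain DecisionTreeClassifier in place of ADTree, approximates the random subspace method through feature subsampling in BaggingClassifier, and omits rotation forest, which scikit-learn does not provide.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic stand-in for a conditioning-factor database: 10 factors per location,
    # binary landslide / non-landslide label (the real study used mapped locations).
    X, y = make_classification(n_samples=300, n_features=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    base = DecisionTreeClassifier(max_depth=3, random_state=0)  # stand-in for ADTree

    models = {
        # classic bagging: bootstrap-resampled training rows
        "bagging": BaggingClassifier(base, n_estimators=50, random_state=0),
        # random-subspace-style ensemble: each tree sees a random half of the features
        "random_subspace": BaggingClassifier(base, n_estimators=50, bootstrap=False,
                                             max_features=0.5, random_state=0),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"{name}: validation AUC = {auc:.3f}")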
5

IGARI, Shinji, Fumiki TANAKA, and Masahiko ONOSATO. "3268 Computer aided operation planning for an actual machine tool based on updatable machining database and database oriented planning algorithm." Proceedings of International Conference on Leading Edge Manufacturing in 21st century : LEM21 2011.6 (2011): _3268–1_—_3268–6_. http://dx.doi.org/10.1299/jsmelem.2011.6._3268-1_.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gorla, Narasimhaiah, and Warren J. Boe. "Database operating efficiency in fragmented databases in mainframe, mini, and micro system environments." Data & Knowledge Engineering 5, no. 1 (March 1990): 1–19. http://dx.doi.org/10.1016/0169-023x(90)90030-h.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Chenxi, Zhang, Ci Yungui, and Liu Bo. "Implementation of prolog databases and database operation builtins in the WAM-Plus model." Future Generation Computer Systems 6, no. 1 (June 1990): 3–9. http://dx.doi.org/10.1016/0167-739x(90)90003-v.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Chen, Jason, Ming-Hsien Yang, and Tian-Lih Koo. "A Control-Data-Mapping Entity-Relationship Model for Internal Controls Construction in Database Design." International Journal of Knowledge-Based Organizations 4, no. 2 (April 2014): 20–36. http://dx.doi.org/10.4018/ijkbo.2014040102.

Full text
Abstract:
The construction of internal controls for a transaction system is important to management, operation and auditing. In a manual-operation environment, the internal controls of the transaction process are all implemented by manual mechanisms. However, after the transaction-processing environment changed from manual operation to computerized operation, internal control techniques were gradually transformed from manual mechanisms to computerized methods. The essence of internal controls in operational activities is data expressions or constraints. The adoption of information systems often results in internal control deficiencies and operating risks because the data needed for the data expressions of internal controls are unavailable in the database. Hence, how to design a database schema that supports the internal control mechanism is becoming a crucial issue for a computerized enterprise. Therefore, this paper draws on the Entity-Relationship model (ER model) to propose a Control-Data-Mapping Entity-Relationship (CDMER) model that manipulates the required fields of tables to design databases that support internal controls construction. Finally, a simple simulated case is prepared to illustrate the CDMER model. The contribution of this paper is to enhance the reliability of information systems through internal controls construction by applying the model to design databases.
APA, Harvard, Vancouver, ISO, and other styles
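The abstract above frames internal controls as data expressions or constraints that the database schema must be able to carry. The CDMER model itself is not reproduced here; the sketch below only illustrates that underlying idea with a hypothetical purchase-order/payment schema in SQLite, where one control (payments must not exceed the approved amount) is expressed as a trigger-enforced constraint.

    import sqlite3

    # Hypothetical schema; only the paper's idea is illustrated: once the fields an
    # internal control needs are present, the control can be stated as a data constraint.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE purchase_order (po_id INTEGER PRIMARY KEY, approved_amount REAL NOT NULL);
    CREATE TABLE payment (
        pay_id INTEGER PRIMARY KEY,
        po_id  INTEGER NOT NULL REFERENCES purchase_order(po_id),
        amount REAL NOT NULL CHECK (amount > 0)
    );
    -- Control: cumulative payments against an order may not exceed its approved amount.
    CREATE TRIGGER ctrl_payment_limit BEFORE INSERT ON payment
    BEGIN
        SELECT CASE WHEN NEW.amount +
                         COALESCE((SELECT SUM(amount) FROM payment
                                   WHERE po_id = NEW.po_id), 0)
                         > (SELECT approved_amount FROM purchase_order
                            WHERE po_id = NEW.po_id)
                    THEN RAISE(ABORT, 'internal control violated: overpayment') END;
    END;
    """)
    conn.execute("INSERT INTO purchase_order VALUES (1, 100.0)")
    conn.execute("INSERT INTO payment (po_id, amount) VALUES (1, 60.0)")
    try:
        conn.execute("INSERT INTO payment (po_id, amount) VALUES (1, 50.0)")  # exceeds 100
    except sqlite3.DatabaseError as exc:   # RAISE(ABORT, ...) surfaces as an error here
        print("rejected:", exc)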
9

Nhu, Viet-Ha, Saeid Janizadeh, Mohammadtaghi Avand, Wei Chen, Mohsen Farzin, Ebrahim Omidvar, Ataollah Shirzadi, et al. "GIS-Based Gully Erosion Susceptibility Mapping: A Comparison of Computational Ensemble Data Mining Models." Applied Sciences 10, no. 6 (March 17, 2020): 2039. http://dx.doi.org/10.3390/app10062039.

Full text
Abstract:
Gully erosion destroys agricultural and domestic grazing land in many countries, especially those with arid and semi-arid climates and easily eroded rocks and soils. It also generates large amounts of sediment that can adversely impact downstream river channels. The main objective of this research is to accurately detect and predict areas prone to gully erosion. In this paper, we couple hybrid models of a commonly used base classifier (reduced error pruning tree, REPTree) with AdaBoost (AB), bagging (Bag), and random subspace (RS) algorithms to create gully erosion susceptibility maps for a sub-basin of the Shoor River watershed in northwestern Iran. We compare the performance of these models in terms of their ability to predict gully erosion and discuss their potential use in other arid and semi-arid areas. Our database comprises 242 gully erosion locations, which we randomly divided into training and testing sets with a ratio of 70/30. Based on expert knowledge and analysis of aerial photographs and satellite images, we selected 12 conditioning factors for gully erosion. We used multi-collinearity statistical techniques in the modeling process, and checked model performance using statistical indexes including precision, recall, F-measure, Matthews correlation coefficient (MCC), the receiver operating characteristic (ROC) curve, the precision–recall graph (PRC), Kappa, root mean square error (RMSE), relative absolute error (PRSE), mean absolute error (MAE), and relative absolute error (RAE). Results show that rainfall, elevation, and river density are the most important factors for gully erosion susceptibility mapping in the study area. All three hybrid models that we tested significantly enhanced the predictive power of REPTree (AUC = 0.800), but the RS-REPTree (AUC = 0.860) ensemble model outperformed the Bag-REPTree (AUC = 0.841) and the AB-REPTree (AUC = 0.805) models. We suggest that decision makers, planners, and environmental engineers employ the RS-REPTree hybrid model to better manage gully erosion-prone areas in Iran.
APA, Harvard, Vancouver, ISO, and other styles
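The gully erosion study above validates its ensembles with a battery of statistical indexes (precision, recall, F-measure, MCC, Kappa, ROC/AUC, RMSE, MAE). As a small illustration of how such indexes are computed, the sketch below applies scikit-learn metric functions to hypothetical ground-truth labels and predicted probabilities, not the paper's data or models.

    import numpy as np
    from sklearn.metrics import (cohen_kappa_score, f1_score, matthews_corrcoef,
                                 mean_absolute_error, mean_squared_error,
                                 precision_score, recall_score, roc_auc_score)

    # Hypothetical ground truth (1 = gully present) and model probabilities for a few
    # held-out cells; real studies evaluate on mapped test locations.
    y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
    y_prob = np.array([0.9, 0.2, 0.7, 0.4, 0.1, 0.3, 0.8, 0.6, 0.75, 0.05])
    y_pred = (y_prob >= 0.5).astype(int)

    print("precision :", precision_score(y_true, y_pred))
    print("recall    :", recall_score(y_true, y_pred))
    print("F-measure :", f1_score(y_true, y_pred))
    print("MCC       :", matthews_corrcoef(y_true, y_pred))
    print("Kappa     :", cohen_kappa_score(y_true, y_pred))
    print("AUC       :", roc_auc_score(y_true, y_prob))
    print("MAE       :", mean_absolute_error(y_true, y_prob))
    print("RMSE      :", np.sqrt(mean_squared_error(y_true, y_prob)))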
10

Volkanovski, Andrija, Antonio Ballesteros Avila, and Miguel Peinador Veira. "Statistical Analysis of Loss of Offsite Power Events." Science and Technology of Nuclear Installations 2016 (2016): 1–9. http://dx.doi.org/10.1155/2016/7692659.

Full text
Abstract:
This paper presents the results of the statistical analysis of the loss of offsite power (LOOP) events registered in four reviewed databases. The reviewed databases include the IRSN (Institut de Radioprotection et de Sûreté Nucléaire) SAPIDE database and the GRS (Gesellschaft für Anlagen- und Reaktorsicherheit mbH) VERA database, reviewed over the period from 1992 to 2011. The US NRC (Nuclear Regulatory Commission) Licensee Event Reports (LERs) database and the IAEA International Reporting System (IRS) database were screened for relevant events registered over the period from 1990 to 2013. The number of LOOP events in each year of the analysed period and the mode of operation are assessed during the screening. The LOOP frequencies obtained for the French and German nuclear power plants (NPPs) during critical operation are of the same order of magnitude, with plant-related events as the dominant contributor. A frequency of one LOOP event per shutdown year is obtained for German NPPs in shutdown mode of operation. For the US NPPs, the obtained LOOP frequency for critical and shutdown mode is comparable to the one assessed in NUREG/CR-6890. A decreasing trend is obtained for the LOOP events registered in three databases (IRSN, GRS, and NRC).
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Operatic database"

1

Gong, Guohui. "On concurrency control in logbased databases." Thesis, Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/8175.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gonzaga, André dos Santos. "The Similarity-aware Relational Division Database Operator." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-17112017-135006/.

Full text
Abstract:
In Relational Algebra, the Division operator (÷) is an intuitive tool used to write queries with the concept of for all, and thus it is constantly required in real applications. However, as we demonstrate in this MSc work, the division does not support many of the needs common to modern applications, particularly those that involve complex data analysis, such as processing images, audio, genetic data, large graphs, fingerprints, and many other non-traditional data types. The main issue is the existence of intrinsic comparisons of attribute values in the operator, which, by definition, are always performed by identity (=), despite the fact that complex data must be compared by similarity. Recent works focus on supporting similarity comparison in relational operators, but none treats the division. This MSc work proposes the new Similarity-aware Division (÷) operator. Our novel operator is naturally well suited to answering queries built on the idea of candidate elements and exigencies over complex data from high-impact real applications. For example, it is potentially useful to support agriculture, genetic analyses, and digital library search, and even to help control the quality of manufactured products and identify new clients in industry. We validate our proposal by studying the first two of these applications.
O operador de Divisão (÷) da Álgebra Relacional permite representar de forma simples consultas com o conceito de para todos, e por isso é requerido em diversas aplicações reais. Entretanto, evidencia-se neste trabalho de mestrado que a divisão não atende às necessidades de diversas aplicações atuais, principalmente quando estas analisam dados complexos, como imagens, áudio, textos longos, impressões digitais, entre outros. Analisando o problema verifica-se que a principal limitação é a existência de comparações de valores de atributos intrínsecas à Divisão Relacional, que, por definição, são efetuadas sempre por identidade (=), enquanto objetos complexos devem geralmente ser comparados por similaridade. Hoje, encontram-se na literatura propostas de operadores relacionais com suporte à similaridade de objetos complexos, entretanto, nenhuma trata a Divisão Relacional. Este trabalho de mestrado propõe investigar e estender o operador de Divisão da Álgebra Relacional para melhor adequá-lo às demandas de aplicações atuais, por meio de suporte a comparações de valores de atributos por similaridade. Mostra-se aqui que a Divisão por Similaridade é naturalmente adequada a responder consultas diversas com um conceito de elementos candidatos e exigências descrito na monografia, envolvendo dados complexos de aplicações reais de alto impacto, com potencial por exemplo, para apoiar a agricultura, análises de dados genéticos, buscas em bibliotecas digitais, e até mesmo para controlar a qualidade de produtos manufaturados e a identificação de novos clientes em indústrias. Para validar a proposta, propõe-se estudar as duas primeiras aplicações citadas.
APA, Harvard, Vancouver, ISO, and other styles
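The thesis above extends relational division so that attribute values are matched by similarity rather than identity. The sketch below is not the thesis's operator or algorithm; it is a minimal pure-Python illustration of the general idea, assuming Euclidean distance on small feature vectors and a user-chosen threshold eps.

    import math

    # Minimal sketch only (not the thesis's operator): an element of R qualifies if,
    # for every exemplar in S, it has at least one feature vector within distance eps.
    def similarity_division(R, S, eps):
        """R: iterable of (element, feature_vector); S: iterable of feature_vectors."""
        features_by_element = {}
        for element, vector in R:
            features_by_element.setdefault(element, []).append(vector)
        return {element
                for element, vectors in features_by_element.items()
                if all(any(math.dist(v, s) <= eps for v in vectors) for s in S)}

    # Toy usage: which candidates cover every required 2-D exemplar within eps = 1.0?
    R = [("a", (0.0, 0.0)), ("a", (5.0, 5.0)), ("b", (0.2, 0.1))]
    S = [(0.0, 0.0), (5.0, 4.5)]
    print(similarity_division(R, S, eps=1.0))   # {'a'}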
3

Chang, Sidney H. (Sidney Hsiao-Ning) 1978. "Adapting an object-oriented database for disconnected operation." Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/86646.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Tomé, Diego Gomes. "A near-data select scan operator for database systems." reponame:Repositório Institucional da UFPR, 2017. http://hdl.handle.net/1884/53293.

Full text
Abstract:
Advisor: Eduardo Cunha de Almeida
Co-advisor: Marco Antonio Zanata Alves
Master's thesis (dissertação de mestrado) - Universidade Federal do Paraná, Setor de Ciências Exatas, Programa de Pós-Graduação em Informática. Defense: Curitiba, 21/12/2017
Includes references: p. 61-64
Resumo: Um dos grandes gargalos em sistemas de bancos de dados focados em leitura consiste em mover dados em torno da hierarquia de memória para serem processados na CPU. O movimento de dados é penalizado pela diferença de desempenho entre o processador e a memória, que é um problema bem conhecido chamado memory wall. O surgimento de memórias inteligentes, como o novo Hybrid Memory Cube (HMC), permitem mitigar o problema do memory wall executando instruções em chips de lógica integrados a uma pilha de DRAMs. Essas memórias possuem potencial para computação de operações de banco de dados direto em memória além do armazenamento de bancos de dados. O objetivo desta dissertação é justamente a execução do operador algébrico de seleção direto em memória para reduzir o movimento de dados através da memória e da hierarquia de cache. O foco na operação de seleção leva em conta o fato que a leitura de colunas a serem filtradas movem grandes quantidades de dados antes de outras operações como junções (ou seja, otimização push-down). Inicialmente, foi avaliada a execução da operação de seleção usando o HMC como uma DRAM comum. Posteriormente, são apresentadas extensões à arquitetura e ao conjunto de instruções do HMC, chamado HMC-Scan, para executar a operação de seleção próximo aos dados no chip lógico do HMC. Em particular, a extensão HMC-Scan tem o objetivo de resolver internamente as dependências de instruções. Contudo, nós observamos que o HMC-Scan requer muita interação entre a CPU e a memória para avaliar a execução de filtros de consultas. Portanto, numa segunda contribuição, apresentamos a extensão arquitetural HIPE-Scan para diminuir esta interação através da técnica de predicação. A predicação suporta a avaliação de predicados direto em memória sem necessidade de decisões da CPU e transforma dependências de controle em dependências de dados (isto é, execução predicada). Nós implementamos a operação de seleção próximo aos dados nas estratégias de execução de consulta orientada a linha/coluna/vetor para a arquitetura x86 e para nas duas extensões HMC-Scan e HIPE-Scan. Nossas simulações mostram uma melhora de desempenho de até 3.7× para HMC-Scan e 5.6× para HIPE-Scan quando executada a consulta 06 do benchmark TPC-H de 1 GB na estratégia de execução orientada a coluna. Palavras-chave: SGBD em Memória, Cubo de Memória Híbrido, Processamento em Memória.
Abstract: A large burden of processing read-mostly databases consists of moving data around the memory hierarchy rather than processing data in the processor. The data movement is penalized by the performance gap between the processor and the memory, which is the well-known problem called the memory wall. The emergence of smart memories, such as the new Hybrid Memory Cube (HMC), allows mitigating the memory wall problem by executing instructions in logic chips integrated into a stack of DRAMs. These memories can enable not only in-memory databases but also in-memory computation of database operations. In this dissertation, we focus on the discussion of near-data query processing to reduce data movement through the memory and cache hierarchy. We focus on the select scan database operator, because the scanning of columns moves large amounts of data prior to other operations like joins (i.e., push-down optimization). Initially, we evaluate the execution of the select scan using the HMC as an ordinary DRAM. Then, we introduce extensions to the HMC Instruction Set Architecture (ISA) to execute our near-data select scan operator inside the HMC, called HMC-Scan. In particular, we extend the HMC ISA with HMC-Scan to internally solve instruction dependencies. To support branch-less evaluation of the select scan and transform control-flow dependencies into data-flow dependencies (i.e., predicated execution), we propose another HMC ISA extension called HIPE-Scan. HIPE-Scan leads to less interaction between the processor and the HMC during the execution of query filters that depend on in-memory data. We implemented the near-data select scan in the row/column/vector-wise query engines for x86 and the two HMC extensions, HMC-Scan and HIPE-Scan, achieving performance improvements of up to 3.7× for HMC-Scan and 5.6× for HIPE-Scan when executing Query-6 from the 1 GB TPC-H database column-wise. Keywords: In-Memory DBMS, Hybrid Memory Cube, Processing-in-Memory.
APA, Harvard, Vancouver, ISO, and other styles
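The dissertation above relies on predicated execution: the select scan's filter becomes a data dependency instead of a per-row branch. The sketch below does not model the HMC hardware or the HMC-Scan/HIPE-Scan extensions; it only illustrates predication in software, using NumPy boolean masks over synthetic columns with a loosely TPC-H Q6-shaped filter.

    import numpy as np

    # Not a model of the HMC hardware; this only shows predication in software: the
    # selection predicate becomes a boolean mask (a data dependency), so no per-row
    # branch is taken while scanning the columns.
    rng = np.random.default_rng(0)
    n = 1_000_000
    quantity = rng.uniform(0.0, 50.0, size=n)     # synthetic columns, loosely Q6-shaped
    discount = rng.uniform(0.0, 0.10, size=n)
    price    = rng.uniform(1.0, 100.0, size=n)

    mask = (quantity < 24) & (discount >= 0.05) & (discount <= 0.07)
    revenue = float(np.sum(price * discount * mask))   # rows failing the filter add 0
    print(revenue)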
5

McLean, Angus L. M. Thom III. "Real-time distributed simulation analysis : an application of temporal database and simulation systems research." Diss., Georgia Institute of Technology, 2002. http://hdl.handle.net/1853/9124.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Oosthoek, Peter B. "A DATABASE SYSTEM CONCEPT TO SUPPORT FLIGHT TEST - MEASUREMENT SYSTEM DESIGN AND OPERATION." International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/608879.

Full text
Abstract:
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
Information management is of essential importance during the design and operation of flight test measurement systems to be used for aircraft airworthiness certification. The reliability of the data generated by the real-time and post-processing processes is heavily dependent on the reliability of all provided information about the flight test measurement system used. Databases are well suited to the task of information management. They do, however, need additional application software to store, manage and retrieve the measurement system configuration data in a specified way to support all persons and aircraft- and ground-based systems that are involved in the design and operation of flight test measurement systems. At the Dutch National Aerospace Laboratory (NLR) a "Measurementsystem Configuration DataBase" (MCDB) is being developed under contract with the Netherlands Agency for Aerospace Programs (NIVR) and in cooperation with Fokker to provide the required information management. This paper addresses the functional and operational requirements for the MCDB, its data contents and computer configuration, and a description of its intended way of operation.
APA, Harvard, Vancouver, ISO, and other styles
7

Procházka, Jiří. "Databázový systém pro výrobu desek plošných spojů." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2009. http://www.nusl.cz/ntk/nusl-218033.

Full text
Abstract:
Analysis of problems in designing a web database and its structure for the requirements of a PCB manufacturer. Study of applications intended for the creation, administration and protection of a database system. Systems for operating a web server. Principles of designing tables in databases. Design of the database structure for a PCB production system.
APA, Harvard, Vancouver, ISO, and other styles
8

Motara, Yusuf Moosa. "File integrity checking." Thesis, Rhodes University, 2006. http://hdl.handle.net/10962/d1007701.

Full text
Abstract:
This thesis looks at file execution as an attack vector that leads to the execution of unauthorized code. File integrity checking is examined as a means of removing this attack vector, and the design, implementation, and evaluation of a best-of-breed file integrity checker for the Linux operating system is undertaken. We conclude that the resultant file integrity checker does succeed in removing file execution as an attack vector, does so at a computational cost that is negligible, and displays innovative and useful features that are not currently found in any other Linux file integrity checker.
APA, Harvard, Vancouver, ISO, and other styles
9

Alqahatni, Zakyah. "Hierarchical Alignment of Tuples in Databases for Fast Join Processing." OpenSIUC, 2019. https://opensiuc.lib.siu.edu/theses/2604.

Full text
Abstract:
Data is distributed across interconnected relations in relational databases. Relationships between tuples in distinct relations can be reconstructed by matching the values of the join attribute, a process called the equi-join operation. Unlike standard attempts to design efficient join algorithms, this thesis proposes an approach to align tuples in relations so that joins can be performed readily and efficiently. We position tuples in their respective relations, a process called relation alignment, so that matching join-attribute values occupy corresponding positions. We also address how to align relations and perform joins on aligned relations. Experiments were conducted in this research to measure and analyze the efficiency of the proposed approach compared to standard MySQL joins.
APA, Harvard, Vancouver, ISO, and other styles
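The thesis above proposes keeping tuples aligned on the join attribute so that equi-joins become cheap. The sketch below is not the thesis's alignment scheme; it only illustrates the underlying intuition with a sort-based alignment followed by a single merge-style pass, in plain Python on toy relations.

    # Illustrative only (not the thesis's alignment scheme): if both relations are kept
    # ordered on the join attribute, an equi-join needs just one merge-style pass.
    def merge_equi_join(r, s, key_r=0, key_s=0):
        r = sorted(r, key=lambda t: t[key_r])   # "align" both relations on the join key
        s = sorted(s, key=lambda t: t[key_s])
        out, i, j = [], 0, 0
        while i < len(r) and j < len(s):
            kr, ks = r[i][key_r], s[j][key_s]
            if kr < ks:
                i += 1
            elif kr > ks:
                j += 1
            else:
                j0 = j
                while j < len(s) and s[j][key_s] == kr:
                    out.append(r[i] + s[j])     # emit every pairing for this key value
                    j += 1
                i += 1
                if i < len(r) and r[i][key_r] == kr:
                    j = j0                      # rewind for the next duplicate key in r
        return out

    orders   = [(2, "bob"), (1, "alice"), (2, "carol")]
    payments = [(2, 9.99), (3, 5.00), (2, 4.50)]
    print(merge_equi_join(orders, payments))    # joins on the first column of each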
10

Poulon, Fanny. "Tissue database of autofluorescence response to improve intra-operative diagnosis of primitive brain tumors." Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLS236/document.

Full text
Abstract:
Le premier traitement standard pour les tumeurs cérébrales est la résection chirurgicale. Dans cette procédure un enjeu important demeure, l'identification des berges tumorales pour assurer une résection totale et éviter le risque de récidive pour le patient. A ce jour aucune technique d'imagerie peropératoire est capable de résoudre l'infiltration tumorale du tissu sain. La norme pour le diagnostic des berges tumorales est l'analyse histologique des biopsies. Une méthode ex vivo qui requiert un à plusieurs jours pour fournir le rapport pathologique final, un laps de temps qui peut s'avérer fatal pour le patient. La microscopie optique a récemment été développer vers une utilisation clinique peropératoire pour répondre à cet enjeu. Dans ce travail, la technique de microscopie à deux-photons a été préférée pour essayer de répondre à cette problématique. Cette méthode donne accès à deux contrastes d'imagerie, la génération de seconde harmonique et l'émission de fluorescence, qui peuvent être combinés à des mesures quantitatives, tel que la spectroscopie et le temps de vie de fluorescence. Combiner ces quatre modalités de détection donnera une information complète sur la structure et le métabolisme de la région observée. Pour soutenir le développement technique vers une sonde endomicroscopique visant une utilisation peropératoire, les données en résultants doivent être fiables, et se montrer d'un intérêt pour le chirurgien. Par conséquent, une base de données sur le signal d'autofluorescence des tissus a été construite et présentée dans ce manuscrit, avec des algorithmes capables de discriminer de façon fiable les régions tumorales des régions saines. Des algorithmes qui ont montré le potentiel d'être automatisé dans une configuration clinique, afin de fournir une réponse en temps-réel au chirurgien
The first standard approach for brain tumor treatment is surgical resection. In this protocol an important challenge remains: the identification of tumor margins to ensure a complete resection and avoid the risk of tumor recurrence. Nowadays no intra-operative means of contrast are able to resolve infiltrated regions from healthy tissue. The standard for tumor margin diagnosis is the histological analysis of biopsies, an ex vivo method that requires one to several days to issue a final pathological report, a time lapse that could be fatal to the patient. Optical microscopy has recently been developed towards intra-operative clinical use to answer this challenge. In this work, the technique of two-photon microscopy based on the autofluorescence of tissue has been favored. This technique gives access to two imaging contrasts, second-harmonic generation and emission of fluorescence, and can be combined with quantitative measurements such as spectroscopy and fluorescence lifetime. The combination of these four modalities of detection will give complete structural and metabolic information on the observed region. To support the technical development towards an endomicroscopic probe, the resulting data have to be reliable and prove to be of interest to the surgeon. Consequently, an extensive database of the autofluorescence response of brain tumor tissue has been constructed and presented in this manuscript, with algorithms able to reliably discriminate tumoral from healthy regions. These algorithms have shown potential to be automated in a clinical setting, in order to give a real-time answer to the surgeons.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Operatic database"

1

Forbes, Dorothy, ed. The Linux database. New York, N.Y.: MIS Press, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wong, Bill, and George Baklarz, eds. DB2 Universal Database v8.1 for Linux, UNIX, and Windows database administration certification guide. 5th ed. Upper Saddle River, N.J.: Prentice Hall PTR, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Cummings, Mark. Distributed database technology trends. Menlo Park, CA (333 Ravenswood Ave., Menlo Park 94025-3476): SRI International, Business Intelligence Program, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bourdon, Roger J., ed. Advanced Pick open database and operating system. 2nd ed. Harlow, England: Addison-Wesley, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Vu, Anthony Ma. Optimistic partitioned operation in distributed database systems. Urbana, Ill: Dept. of Computer Science, University of Illinois at Urbana-Champaign, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Zikopoulos, Paul, and George Baklarz, eds. DB2 9 for Linux, UNIX, and Windows: DBA guide, reference, and exam prep. 6th ed. Upper Saddle River, NJ: IBM Press/Pearson, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Shivaratri, Niranjan G., ed. Advanced concepts in operating systems: Distributed, database, and multiprocessor operating systems. New York: McGraw-Hill, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Shivaratri, Niranjan G., ed. Advanced concepts in operating systems: Distributed, database, and multiprocessor operating systems. New York: McGraw-Hill, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bobak, Angelo R., ed. Ranade's OS/2 extended edition: Database manager. New York: Bantam Books, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

ISACA (Serving IT Governance Professionals). Security, audit and control features: Oracle database. 3rd ed. Rolling Meadows, IL: ISACA, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Operatic database"

1

Weik, Martin H. "database operator." In Computer Science and Communications Dictionary, 340. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/1-4020-0613-6_4195.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Jensen, Christian S., and Richard T. Snodgrass. "Timeslice Operator." In Encyclopedia of Database Systems, 1–2. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4899-7993-3_1426-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Jensen, Christian S., and Richard T. Snodgrass. "Timeslice Operator." In Encyclopedia of Database Systems, 3120–21. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-39940-9_1426.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Jensen, Christian S., and Richard T. Snodgrass. "Timeslice Operator." In Encyclopedia of Database Systems, 4159. New York, NY: Springer New York, 2018. http://dx.doi.org/10.1007/978-1-4614-8265-9_1426.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Pacitti, Esther. "Intra-operator Parallelism." In Encyclopedia of Database Systems, 1. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4899-7993-3_1480-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Tan, Pang-Ning. "Receiver Operating Characteristic." In Encyclopedia of Database Systems, 1–5. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4899-7993-3_569-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Hardavellas, Nikos, and Ippokratis Pandis. "Operator-Level Parallelism." In Encyclopedia of Database Systems, 1–6. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4899-7993-3_661-2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Tan, Pang-Ning. "Receiver Operating Characteristic." In Encyclopedia of Database Systems, 2349–52. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-39940-9_569.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Hardavellas, Nikos, and Ippokratis Pandis. "Operator-Level Parallelism." In Encyclopedia of Database Systems, 1981–85. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-39940-9_661.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Pacitti, Esther. "Inter-Operator Parallelism." In Encyclopedia of Database Systems, 1566. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-0-387-39940-9_1085.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Operatic database"

1

Li, Huwei, Duo Li, and Huasheng Xiong. "Research on Distribute Real-Time Database Based on Vxworks." In 18th International Conference on Nuclear Engineering. ASMEDC, 2010. http://dx.doi.org/10.1115/icone18-30366.

Full text
Abstract:
Distributed control systems (DCS) are widely used in industrial process control. As a matter of fact, the database in a DCS is of great importance to the safety and reliability of the whole system. This paper focuses on the distributed real-time database in a dedicated DCS used in a small nuclear power station. A database that runs under the VxWorks embedded operating system and satisfies the basic database requirements was established using the RDM Embedded database management system, which reduces operating-system resource consumption. Three redundant databases run in three distributed embedded computers, and the UDP protocol is used to implement their synchronization and the data transmission from low-level computers to them. This system architecture is able to enhance the database reliability and the overall system performance. Experimental testing results show that the developed redundant database structure, running in three embedded computers called operator stations, is efficient and reliable for managing real-time data, and that it can meet its functional and performance requirements in a small nuclear power plant.
APA, Harvard, Vancouver, ISO, and other styles
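The paper above describes three redundant real-time databases kept in step over UDP, with low-level nodes pushing data up to them. The sketch below is not the paper's VxWorks/RDM implementation; it only illustrates the fire-and-forget UDP fan-out pattern in Python, with hypothetical replica addresses and a made-up tag=value wire format.

    import socket

    # Hypothetical replica addresses and a made-up "tag=value" wire format; real systems
    # would add sequencing, acknowledgement, or retransmission on top of raw UDP.
    REPLICAS = [("127.0.0.1", 9001), ("127.0.0.1", 9002), ("127.0.0.1", 9003)]

    def publish_update(tag: str, value: float) -> None:
        message = f"{tag}={value}".encode()
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            for address in REPLICAS:
                sock.sendto(message, address)   # best-effort fan-out; UDP is unreliable

    publish_update("coolant_flow", 42.7)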
2

Koubogiannis, D. G., V. P. Iliadis, and K. C. Giannakoglou. "A Parallel CFD Tool to Produce Faulty Blade Signatures for Diagnostic Purposes." In ASME 1998 International Gas Turbine and Aeroengine Congress and Exhibition. American Society of Mechanical Engineers, 1998. http://dx.doi.org/10.1115/98-gt-169.

Full text
Abstract:
In the turbomachinery field, many diagnostic systems utilize databases with symptoms corresponding to the most frequent operation faults. Thanks to Computational Fluid Dynamics (CFD), such databases can be created without costly experiments, while the use of unstructured grids in combination with parallel processing makes the whole task easy and fast to accomplish. In this paper, a procedure that builds up a database for gas-turbine fault diagnosis is demonstrated. Advanced CFD tools that operate concurrently on multi-processor platforms are used. The database prepared in this way contributes to the identification of faults through the analysis of the unsteady pressure signals that correspond to hypothetical sensors located in the inter-blade region. The pressure signals are post-processed in a similar way to the one experimentalists employ for fast-response pressure measurements. Symptoms related to displaced and/or twisted blades in an industrial high-speed compressor cascade, at design and off-design operating conditions, are analyzed.
APA, Harvard, Vancouver, ISO, and other styles
3

Volkanovski, Andrija, Antonio Ballesteros Avila, and Miguel Peinador Veira. "Results of the Loss of Offsite Power Events Analysis." In 2016 24th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/icone24-60153.

Full text
Abstract:
This paper presents the results of statistical and engineering analysis of Loss of Offsite Power (LOOP) events registered in four reviewed databases. The paper includes events registered in the IRSN (Institut de Radioprotection et de Sûreté Nucléaire) SAPIDE and GRS (Gesellschaft für Anlagen- und Reaktorsicherheit mbH) VERA databases in the period 1992 to 2011. The Nuclear Regulatory Commission (NRC) Licensee Event Reports (LERs) database and the IAEA International Reporting System (IRS) database are screened for the relevant events registered in the period 1990 to 2013. In total, 228 relevant events were identified in the IRSN database, 190 in GRS, 120 in LER and 52 in IRS. The data include events registered both during the critical (at power) and shutdown operation of the plants. The identified events were classified considering nine different categories. In three databases (SAPIDE, VERA, IAEA-IRS) the largest numbers of events are registered for the plant-centered category. The largest number of events in the NRC-LER database is found for switchyard-centered events. According to the mode of operation, most events were reported during critical power operation in all four databases. “Partial loss of external power” events are the most frequent type of event found in the IRSN and NRC databases, while “Physical loss of electrical busbars” is the main type in the GRS and IAEA databases. The largest number of events in all databases is identified for switchyard failures, followed by interconnection failures (both lines and transformers). LOOP events are mainly identified through fault reports in the control room. Electrical deficiency is detected as the main direct cause of events. Environment is registered as the main contributor to electrical grid deficiency in the French and NRC databases. Electrical failures are the dominant contributor to electrical grid deficiency in the German and IAEA databases. The principal root cause of the LOOP events is human failure, with human errors during testing, inspection and maintenance as the largest sub-group. The largest number of LOOP events resulted in reactor trip followed by Emergency Diesel Generator (EDG) start. The majority of the reported LOOP events lasted more than 2 minutes. Main lessons learned from the analysed events and potential actions for decreasing the number of LOOP events are presented.
APA, Harvard, Vancouver, ISO, and other styles
4

Volkanovski, Andrija, Antonio Ballesteros Avila, and Miguel Peinador Veira. "Trend Analysis of Loss of Offsite Power Events." In 2016 24th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/icone24-60154.

Full text
Abstract:
This paper presents the results of the trend analysis of Loss of Offsite Power (LOOP) events registered in two reviewed databases. The reviewed databases include the Nuclear Regulatory Commission (NRC) Licensee Event Reports (LERs) database and the IAEA International Reporting System (IRS). Both databases were screened for the relevant events registered in the period 1990 to 2012. A statistical analysis of the identified relevant LOOP events is performed. The analysis includes assessment of the LOOP initiating-event frequency, the distribution of the events per year in the analysed period and a trend analysis of the identified events. The LOOP frequency is calculated for LOOP events registered in NRC LERs, subdivided into four types by cause or location: plant-centered, switchyard-centered, grid-related, and weather-related. These four LOOP categories are assessed for two modes of operation (critical and shutdown operation). The number of LOOP events in each year over the analysed period and the distribution of events per unit in a given year were assessed from the reviewed databases. Trend analysis of the identified events is performed using four trend measures. The analysis is done for events registered during power and shutdown operation and for their sum. The obtained LOOP frequency for events registered in NRC LERs for critical and shutdown mode is comparable to the one assessed in NUREG/CR-6890. A decreasing trend is obtained for the LOOP events registered in the NRC LER database. Different trends are identified during critical and shutdown modes of operation for the events registered in the IAEA database. The sum of the LOOP events reported during critical and shutdown modes in the IAEA IRS shows no trend.
APA, Harvard, Vancouver, ISO, and other styles
5

Ortner, Susan, and Milan Brumovsky. "Age 60+ — Applicability of Ageing Related Data Bases and Methodologies for Ensuring Safe Operation of LWR Beyond 60 Years." In ASME 2017 Pressure Vessels and Piping Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/pvp2017-65283.

Full text
Abstract:
The AGE60+ project aimed at the application of ageing-related databases and methodologies to ensuring the safe operation of Light Water Reactors beyond 60 years, and the longer-term safe storage of waste. It acted by encouraging European researchers to share data in order to maximize its utility. As an initial step, the AGE60+ consortium considered the development and utilization of a number of databases set up to collate data and derive information relevant to the management of nuclear power plants. This review was used to identify the factors which lead to a database successfully making a useful contribution. The AGE60+ project has been a very fruitful collaboration. It has:
• Raised the concept of sharing data and setting up databases related to the ageing of concrete in waste disposal, and received a generally positive response to this first step.
• Identified an area in which a new database would provide the opportunity to advance predictive modeling of the degradation of internals.
• Collated and reviewed data on the thermal ageing of low alloy steels.
• Identified information gaps in the area of internals degradation and the thermal ageing of low alloy steel components.
• Produced an up-to-date database on WWER-440 RPV steel embrittlement.
• Produced a new embrittlement trend curve (ETC), relating the fracture toughness transition temperature to fluence and composition for WWER-440 weld and base metal.
• Expanded and utilized a database on MnMoNi steel embrittlement.
The AGE60+ project was realized as a part of the NUGENIA+ Call and lasted 18 months, with participating organizations UJV Rez a.s. (Czech Republic), AREVA GmbH (Germany), CIEMAT (Spain), MTA EK (Hungary) and NNL (UK) as a leading organization.
APA, Harvard, Vancouver, ISO, and other styles
6

Tsai, Jhy-Cherng, and Weirong Tsai. "A Knowledge Base Approach for Tolerancing in Computer-Aided Process Planning." In ASME 1999 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1999. http://dx.doi.org/10.1115/detc99/dac-8632.

Full text
Abstract:
This paper presents a knowledge-base approach that assists a designer in evaluating possible process plans and associated costs based on the tolerancing specifications of the designed part. It is an effort to take dimensional tolerances into computer-aided process planning (CAPP) for cylindrical parts through the use of databases and knowledge bases. Geometric features with tolerancing specifications in a CAD system are first used to determine possible machining operations that can achieve the specified tolerances, based on data from the machining feature database, the process precision grade database, and the precision grade database. Process plans are then generated based on rules and knowledge from the process sequence knowledge base and the machining feature database. Possible process plans are further organized as a graph. The optimal process plan with least cost is then selected by searching through the graph. This is achieved based on machine set-up and operation costs that are derived from the machine tool resource database, the process parameter database, and the machine set-up and operation cost database. A CAPP software prototype supporting tolerance design on the AutoCAD platform is also demonstrated with examples to illustrate this approach.
APA, Harvard, Vancouver, ISO, and other styles
7

Lou, Kuiyang, Subramaniam Jayanti, Natraj Iyer, Yagnanarayanan Kalyanaraman, Sunil Prabhakar, and Karthik Ramani. "A Reconfigurable 3D Engineering Shape Search System: Part II — Database Indexing, Retrieval, and Clustering." In ASME 2003 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2003. http://dx.doi.org/10.1115/detc2003/cie-48188.

Full text
Abstract:
This paper introduces database and related techniques for a reconfigurable, intelligent 3D engineering shape search system, which retrieves similar 3D models based on their shape content. Feature vectors, which are numeric “fingerprints” of 3D models, and skeletal graphs, which are the “minimal representations of the shape content” of a 3D model, represent the shape content. The Euclidean distance of the feature vectors, as well as the distance between skeletal graphs, provides indirect measures of shape similarity between the 3D models. Critical database issues regarding 3D shape search systems are discussed: (a) database indexing, (b) semantic gap, (c) subjectivity of similarity, and (d) database clustering. An R-tree-based multidimensional index is used to speed up the feature-vector-based search operation, while a decision-tree-based approach is used for efficiently indexing/searching skeletal graphs. Interactions among users and the search system, such as relevance feedback and feature vector reconfiguration, are used to bridge the semantic gap and to customize the system for different users. Database clustering of the R-tree index is compared with that generated by a self-organizing map (SOM). Synthetic databases and real 3D model databases are employed to investigate the efficiency of the multidimensional index and the effectiveness of relevance feedback.
APA, Harvard, Vancouver, ISO, and other styles
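The paper above represents each 3D model by a feature vector and uses Euclidean distance between vectors as a proxy for shape similarity. The sketch below is only a brute-force illustration of that retrieval step on made-up feature vectors; the paper's R-tree index, skeletal graphs, and relevance feedback are not reproduced.

    import numpy as np

    # Made-up "fingerprints": each row is the feature vector of one 3-D model in the
    # database; the query is ranked by Euclidean distance (no R-tree index here).
    database = np.array([
        [0.10, 0.80, 0.30],   # model 0
        [0.12, 0.79, 0.31],   # model 1 (near-duplicate of model 0)
        [0.90, 0.10, 0.50],   # model 2
    ])
    query = np.array([0.11, 0.80, 0.30])

    distances = np.linalg.norm(database - query, axis=1)
    ranking = np.argsort(distances)              # most similar models first
    print(list(zip(ranking.tolist(), distances[ranking].round(4).tolist())))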
8

Marzoev, Alana, Lara Timbó Araújo, Malte Schwarzkopf, Samyukta Yagati, Eddie Kohler, Robert Morris, M. Frans Kaashoek, and Sam Madden. "Towards Multiverse Databases." In HotOS '19: Workshop on Hot Topics in Operating Systems. New York, NY, USA: ACM, 2019. http://dx.doi.org/10.1145/3317550.3321425.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Grubbs, Paul, Thomas Ristenpart, and Vitaly Shmatikov. "Why Your Encrypted Database Is Not Secure." In HotOS '17: Workshop on Hot Topics in Operating Systems. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3102980.3103007.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Silva, Yasin N., Walid G. Aref, and Mohamed H. Ali. "The similarity join database operator." In 2010 IEEE 26th International Conference on Data Engineering (ICDE 2010). IEEE, 2010. http://dx.doi.org/10.1109/icde.2010.5447873.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Operatic database"

1

Yue, Lei, Guanzhang Mu, Zengmao Lin, and Haolin Sun. Impact of low-dose intrathecal morphine on orthopedic surgery: a protocol of a systematic review and meta-analysis of randomized controlled trials. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, February 2022. http://dx.doi.org/10.37766/inplasy2022.2.0029.

Full text
Abstract:
Review question / Objective: Patients undergoing orthopedic surgery usually suffer considerably from peri-operative pain, and intrathecal morphine (ITM) has recently been used as an effective analgesia method. The intrathecal morphine dose achieving optimal analgesia for orthopedic surgery while minimizing side effects has not yet been determined. There is currently a lack of literature synthesis on the safety and effects of low-dose ITM in orthopedic surgery. Condition being studied: Low-dose intrathecal morphine in orthopedic surgery. Information sources: We will search the following electronic databases, registries and websites on January 11th 2022, unrestricted by date. Grey literature and non-English studies will not be excluded. English databases: PubMed, Cochrane Library and Web of Science. Chinese database: Cnki.net. Trial registries: ClinicalTrials.gov.
APA, Harvard, Vancouver, ISO, and other styles
2

Salko, Jr., Robert, Aaron C. Graham, Robert Alexander Lefebvre, and Brandon R. Langley. Demonstration of MSR salt property database operation with a reactor analysis tool. Office of Scientific and Technical Information (OSTI), September 2019. http://dx.doi.org/10.2172/1615804.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ma, Zhegang. NRC Reactor Operating Experience Results and Databases Trend Summary: 2020 Update. Office of Scientific and Technical Information (OSTI), March 2022. http://dx.doi.org/10.2172/1894884.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sumbukeni, H. Operating and maintenance of computerized geosciences database management systems in some of Zambia's mineral exploration and mining sectors. Natural Resources Canada/ESS/Scientific and Technical Publishing Services, 1994. http://dx.doi.org/10.4095/193887.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Sriraj, P. S., Bo Zou, Lise Dirks, Nahid Parvez Farazi, Elliott Lewis, and Jean Paul Manzanarez. Maritime Freight Data Collection Systems and Database to Support Performance Measures and Market Analyses. Illinois Center for Transportation, December 2020. http://dx.doi.org/10.36501/0197-9191/20-021.

Full text
Abstract:
The Illinois Marine Transportation System (IMTS) is a key component of the nation’s inland waterway system. IMTS is comprised of 27 locks and dams, 19 port districts, more than 350 active terminals, and 1,118 miles of navigable inland waterways traversing along the borderline or within the state of Illinois. However, the infrastructure of IMTS is aging and its conditions are deteriorating. To monitor the performance of IMTS and guide infrastructure investment to enhance safety, efficiency, and reliability of the system, a comprehensive performance measurement program is needed. To this end, the objective of this project is to create an integrated, comprehensive, and maintainable database that facilitates performance measurement of maritime freight to, from, and through Illinois. To achieve this objective, a review of the literature on maritime freight transportation both in the United States and abroad was performed. To gauge practitioners’ points of view, a series of phone interviews and online surveys of Illinois’ neighboring state DOT officials, officials from the US Army Corps of Engineers, Illinois port district authorities, and carriers operating in Illinois was also conducted. With the findings from the literature review and an understanding of state DOT practices, the needed and available data sources for a maritime freight performance measurement program were identified. Building on all the above efforts, a first-of-its-kind PM database for IMTS was designed and developed, along with a detailed user manual, ready for IDOT’s immediate use and future updates. In addition, opportunities for IDOT to use the database to conduct analysis are discussed. Key programmatic recommendations that outline the role of IDOT as a champion and as a facilitator are further included. The outcome of this project will help IDOT gain much-needed knowledge of and develop programs to improve IMTS performance, increase multimodal transportation network capacity, and expand the transportation and logistics sector of the state, which ultimately benefit the people and economy of Illinois.
APA, Harvard, Vancouver, ISO, and other styles
6

Busch, Julian Conn, Madhavi Muralidharan, Jasmine Wu, Laura Di Taranti, Enrique Torres Hernandez, Meredith Collard, and Meghan Lane-Fall. Systematic review of OR to ICU handoff standardization interventions highlights need for focus on sustainability and patient outcomes. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, February 2022. http://dx.doi.org/10.37766/inplasy2022.2.0035.

Full text
Abstract:
Review question / Objective: The objective of this review is to examine if and how interventional studies on handoffs of patients from the operating room (OR) to the intensive care unit (ICU) analyze interventional sustainability and their impacts on patient outcomes. Eligibility criteria: Inclusion criteria for studies were as follows: (1) publication of the study as a full-text manuscript in a peer-reviewed journal and (2) description of an intervention to standardize the OR to ICU handoff. Information sources: The following electronic databases: ABI Inform, Business Source Complete and HealthBusiness FullText (EBSCO), CINAHL, ClinicalTrials.gov, Cochrane Review, EMBASE, Ovid Medline, PubMed, Scopus, and Web of Science.
APA, Harvard, Vancouver, ISO, and other styles
7

Lewis, Dustin, ed. Database of States’ Statements (August 2011–October 2016) concerning Use of Force in relation to Syria. Harvard Law School Program on International Law and Armed Conflict, May 2017. http://dx.doi.org/10.54813/ekmb4241.

Full text
Abstract:
Many see armed conflict in Syria as a flashpoint for international law. The situation raises numerous unsettling questions, not least concerning normative foundations of the contemporary collective-security and human-security systems, including the following: Amid recurring reports of attacks directed against civilian populations and hospitals with seeming impunity, what loss of legitimacy might law suffer? May—and should—states forcibly intervene to prevent (more) chemical-weapons attacks? If the government of Syria is considered unwilling or unable to obviate terrorist threats from spilling over its borders into other countries, may another state forcibly intervene to protect itself (and others), even without Syria’s consent and without an express authorization of the U.N. Security Council? What began in Daraa in 2011 as protests escalated into armed conflict. Today, armed conflict in Syria implicates a multitude of people, organizations, states, and entities. Some are obvious, such as the civilian population, the government, and organized armed groups (including designated terrorist organizations, for example the Islamic State of Iraq and Syria, or ISIS). Other implicated actors might be less obvious. They include dozens of third states that have intervened or otherwise acted in relation to armed conflict in Syria; numerous intergovernmental bodies; diverse domestic, foreign, and international courts; and seemingly innumerable NGOs. Over time, different states have adopted wide-ranging and diverse approaches to undertaking measures (or not) concerning armed conflict in Syria, whether in relation to the government, one or more armed opposition groups, or the civilian population. Especially since mid-2014, a growing number of states have undertaken military operations directed against ISIS in Syria. For at least a year-and-a-half, Russia has bolstered military strategies of the Syrian government. At least one state (the United States) has directed an operation against a Syrian military base. And, more broadly, many states provide (other) forms of support or assistance to the government of Syria, to armed opposition groups, or to the civilian population. Against that backdrop, the Harvard Law School Program on International Law and Armed Conflict (HLS PILAC) set out to collect states’ statements made from August 2011 through November 2016 concerning use of force in relation to Syria. A primary aim of the database is to provide a comparatively broad set of reliable resources regarding states’ perspectives, with a focus on legal parameters. A premise underlying the database is that through careful documentation of diverse approaches, we can better understand those perspectives. The intended audience of the database is legal practitioners. The database is composed of statements made on behalf of states and/or by state officials. For the most part, the database focuses on statements regarding legal parameters concerning use of force in relation to Syria. HLS PILAC does not pass judgment on whether each statement is necessarily legally salient for purposes of international law. Nor does HLS PILAC seek to determine whether a particular statement may be understood as an expression of opinio juris or an act of state practice (though it might be).
APA, Harvard, Vancouver, ISO, and other styles
8

Visser, R., H. Kao, R. M. H. Dokht, A. B. Mahani, and S. Venables. A comprehensive earthquake catalogue for northeastern British Columbia: the northern Montney trend from 2017 to 2020 and the Kiskatinaw Seismic Monitoring and Mitigation Area from 2019 to 2020. Natural Resources Canada/CMSS/Information Management, 2021. http://dx.doi.org/10.4095/329078.

Full text
Abstract:
To increase our understanding of induced seismicity, we develop and implement methods to enhance seismic monitoring capabilities in northeastern British Columbia (NE BC). We deploy two different machine learning models to identify earthquake phases using waveform data from regional seismic stations and utilize an earthquake database management system to streamline the construction and maintenance of an up-to-date earthquake catalogue. This study yields a comprehensive catalogue for NE BC from 2014 to 2020 by building upon our previous 2014-2016 and 2017-2018 catalogues. Earthquakes were located within the area bounded by 55.5°N-60.0°N and 119.8°W-123.5°W. The earthquakes in the catalogue were initially detected by machine learning models, then reviewed by an analyst to confirm correct identification, and finally located using the Non-Linear Location (NonLinLoc) algorithm. Two distinct sub-areas within these bounds cover different periods to supplement what was not included in previously published reports: the Northern Montney Trend (NMT) is covered from 2017 to 2020, while the Kiskatinaw Seismic Monitoring and Mitigation Area (KSMMA) is covered from 2019 to 2020. The two sub-areas are distinguished by the BC Oil & Gas Commission (BCOGC) due to differences in their geographic location and geology. The catalogue was produced by picking arrival phases on continuous seismic waveforms from 51 stations operated by various organizations in the region. A total of 17,908 events passed our quality control criteria and are included in the final catalogue. By comparison, the routine Canadian National Seismograph Network (CNSN) catalogue reports 207 seismic events; all events in the CNSN catalogue are present in our catalogue. Our catalogue benefits from the use of enhanced station coverage and improved methodology. The total numbers of events in our catalogue in 2017, 2018, 2019, and 2020 were 62, 47, 9,579, and 8,220, respectively. The first two years correspond to seismicity in the NMT, where poor station coverage makes it difficult to detect small-magnitude events. The magnitude of completeness within the KSMMA (ML = ~0.7) is significantly smaller than that obtained for the NMT (ML = ~1.4). The new catalogue is released with separate files for origins, arrivals, and magnitudes, which can be joined using the unique ID assigned to each event.
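Because the catalogue is distributed as separate origin, arrival, and magnitude files keyed by a shared event ID, a minimal sketch of how a user might recombine them is given below; the file names, column names, and CSV format are assumptions for illustration rather than a description of the actual release.

```python
# Minimal sketch, assuming the origin, arrival, and magnitude tables are
# available as CSV files; file names and column names (event_id, ml,
# origin_time) are illustrative assumptions, not the actual release format.
import pandas as pd

origins = pd.read_csv("origins.csv")        # one row per event: event_id, origin_time, lat, lon, depth
arrivals = pd.read_csv("arrivals.csv")      # many rows per event: event_id, station, phase, pick_time
magnitudes = pd.read_csv("magnitudes.csv")  # one or more magnitude estimates per event_id

# Attach magnitudes to origins, then count the picks contributing to each event.
catalogue = origins.merge(magnitudes, on="event_id", how="left")
picks_per_event = arrivals.groupby("event_id").size().rename("n_picks")
catalogue = catalogue.join(picks_per_event, on="event_id")

# Example filter: KSMMA-period events at or above the reported magnitude of
# completeness (ML ~ 0.7).
ksmma = catalogue[(catalogue["ml"] >= 0.7) & (catalogue["origin_time"] >= "2019-01-01")]
print(len(ksmma))
```

Splitting the release into per-entity files joined on a single event ID keeps each table normalized while still letting users rebuild a flat catalogue with one or two joins.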
APA, Harvard, Vancouver, ISO, and other styles
9

Harris, Gregory, Brooke Hatchell, Davelin Woodard, and Dwayne Accardo. Intraoperative Dexmedetomidine for Reduction of Postoperative Delirium in the Elderly: A Scoping Review. University of Tennessee Health Science Center, July 2021. http://dx.doi.org/10.21007/con.dnp.2021.0010.

Full text
Abstract:
Background/Purpose: Post-operative delirium (POD) leads to significant morbidity in elderly patients, yet there is no established regimen to prevent it. Opioid use in the elderly surgical population is one of the most significant risk factors for developing POD. The purpose of this scoping review is to recognize that dexmedetomidine mitigates cognitive dysfunction secondary to acute pain and the use of narcotic analgesia by decreasing the amount of norepinephrine (an excitatory neurotransmitter) released during times of stress. This mechanism of action also provides analgesia through decreased perception and modulation of pain. Methods: The authors developed eligibility criteria for inclusion of articles and performed a systematic search of several databases. Each of the authors initially selected five articles for inclusion in the scoping review. We created annotated literature tables for easy screening by co-authors. After reviewing the annotated literature tables, four articles were excluded, leaving 11 articles for inclusion in the scoping review. There were six level I meta-analyses/systematic reviews, four level II randomized clinical trials, and one level IV qualitative research article. Next, we created a data-charting form on Microsoft Word for extraction of data items and synthesis of results. Results: Two of the studies found no significant difference in POD between dexmedetomidine groups and control groups. The nine remaining studies noted decreases in the rate, duration, and risk of POD in the groups receiving dexmedetomidine either intraoperatively or postoperatively. Multiple studies found secondary benefits in addition to decreased POD, such as a reduction of tachycardia, hypertension, stroke, hypoxemia, and narcotic use. One study, however, found that the incidence of hypotension and bradycardia was increased among the elderly population. Implications for Nursing Practice: Surgery is a tremendous stressor in any age group, but especially in the elderly population. It has been shown that postoperative delirium occurs in 17-61% of major surgical procedures, with 30-40% of cases assumed to be preventable. Opioid administration in the elderly surgical population is one of the most significant risk factors for developing POD. With anesthesia practice already leaning towards opioid-free and opioid-limited anesthetics, the incorporation of dexmedetomidine could prove to be a valuable resource in reducing both opioid use and POD in the elderly surgical population. Although more research is needed, the current evidence is promising.
APA, Harvard, Vancouver, ISO, and other styles
10

Downes, Jane, ed. Chalcolithic and Bronze Age Scotland: ScARF Panel Report. Society of Antiquaries of Scotland, September 2012. http://dx.doi.org/10.9750/scarf.09.2012.184.

Full text
Abstract:
The main recommendations of the panel report can be summarised under five key headings:
- Building the Scottish Bronze Age: Narratives should be developed to account for the regional and chronological trends and diversity within Scotland at this time. A chronology based upon Scottish as well as external evidence, combining absolute dating (and the statistical modelling thereof) with re-examined typologies based on a variety of sources – material cultural, funerary, settlement, and environmental evidence – is required to construct a robust and up to date framework for advancing research.
- Bronze Age people: How society was structured and demographic questions need to be imaginatively addressed, including the degree of mobility (both short and long-distance communication), hierarchy, and the nature of the ‘family’ and the ‘individual’. A range of data and methodologies need to be employed in answering these questions, including harnessing experimental archaeology systematically to inform archaeologists of the practicalities of daily life, work and craft practices.
- Environmental evidence and climate impact: The opportunity to study the effects of climatic and environmental change on past society is an important feature of this period, as both palaeoenvironmental and archaeological data can be of suitable chronological and spatial resolution to be compared. Palaeoenvironmental work should be more effectively integrated within Bronze Age research, and inter-disciplinary approaches promoted at all stages of research and project design. This should be a two-way process, with environmental science contributing to interpretation of prehistoric societies, and in turn, the value of archaeological data to broader palaeoenvironmental debates emphasised. Through effective collaboration, questions such as the nature of settlement and land-use and how people coped with environmental and climate change can be addressed.
- Artefacts in Context: The Scottish Chalcolithic and Bronze Age provide good evidence for resource exploitation and the use, manufacture and development of technology, with particularly rich evidence for manufacture. Research into these topics requires the application of innovative approaches in combination. This could include biographical approaches to artefacts or places, ethnographic perspectives, and scientific analysis of artefact composition. In order to achieve this there is a need for data collation, robust and sustainable databases, and a review of the categories of data.
- Wider Worlds: Research into the Scottish Bronze Age has a considerable amount to offer other European pasts, with a rich archaeological data set that includes intact settlement deposits, burials and metalwork of every stage of development that has been the subject of a long history of study. Research should operate over different scales of analysis, tracing connections and developments from the local and regional to the international context. In this way, Scottish Bronze Age studies can contribute to broader questions relating both to the Bronze Age and to human society in general.
APA, Harvard, Vancouver, ISO, and other styles
