To see the other types of publications on this topic, follow the link: Operatic database.

Journal articles on the topic 'Operatic database'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Operatic database.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

van Nieuwkerk, Mascha M., Liselotte Salters, R. M. Helmers, and Ivan Kisjes. "Operatic Productions in the Netherlands, 1886–1995: from Printed Annals to Searchable Performance Data." Research Data Journal for the Humanities and Social Sciences 5, no. 2 (November 4, 2020): 79–90. http://dx.doi.org/10.1163/24523666-00502007.

Full text
Abstract:
This data paper accompanies the database Operatic productions in the Netherlands, an open dataset containing details on over five thousand opera productions in the Netherlands between 1885 and 1995 extracted from the Annalen van de Nederlandse Opera-gezelschappen (Annals of the Opera Companies in the Netherlands), which appeared in book form in 1996. These data give an extremely rich account of the performance history of operatic works and the personnel involved in their production. Since the original publication lacks a critical introduction, the authors have attempted to reconstruct the origins and systematics of the collection. They also discuss the attributes of the data and the basic data structure in order to give users relevant information to use and restructure the data for their interests. The data structure and metadata classifications are based on an inventory of the classifications used in existing performing arts databases across Europe. This facilitates future connections to other relevant performing arts datasets. The transfer of the Annals into a relational database finally brings out their full research potential.
APA, Harvard, Vancouver, ISO, and other styles
2

Albrecht-Hohmaier, Martin, Berthold Over, Emilia Pelliccia, and Sonia Wronkowska. "Pasticcio-Daten und Daten-Pasticcio – zur Edition kompilierter musikalischer Werke." editio 34, no. 1 (November 1, 2020): 45–71. http://dx.doi.org/10.1515/editio-2020-0004.

Full text
Abstract:
This text, which was written by members of the project Pasticcio. Ways of arranging attractive operas, uses Gérard Genette’s term ‘paratext’ in a broader sense. Assuming that every text besides the edited work, which is part of the genesis of these oscillating works compiled of pre-existing musical material called pasticcios, can be understood as a paratext, it reveals the importance of printed librettos, related musical sources and philological contexts (like the often underestimated influence of the singers in the creation of a musical work) for their understanding and edition. Aiming at connecting a database with the online edition of four operatic works via XML, the project members present ideas and structures of the project’s data which can be described as a paratext as well.
APA, Harvard, Vancouver, ISO, and other styles
3

SZABÓOVÁ, Veronika, Csaba SZABÓ, Valerie NOVITZKÁ, and Emília DEMETEROVÁ. "GAME SEMANTICS OF THE TRANSACTION ROLLBACK DATABASE OPERATION." Acta Electrotechnica et Informatica 15, no. 1 (March 1, 2015): 3–8. http://dx.doi.org/10.15546/aeei-2015-0001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Pham, Shirzadi, Shahabi, Omidvar, Singh, Sahana, Asl, Ahmad, Quoc, and Lee. "Landslide Susceptibility Assessment by Novel Hybrid Machine Learning Algorithms." Sustainability 11, no. 16 (August 13, 2019): 4386. http://dx.doi.org/10.3390/su11164386.

Full text
Abstract:
Landslides have multidimensional effects on the socioeconomic as well as environmental conditions of the impacted areas. The aim of this study is the spatial prediction of landslides using hybrid machine learning models including bagging (BA), random subspace (RS) and rotation forest (RF) with an alternating decision tree (ADTree) as base classifier in the northern part of the Pithoragarh district, Uttarakhand, Himalaya, India. To construct the database, ten conditioning factors and a total of 103 landslide locations with a ratio of 70/30 were used. The significant factors were determined by the chi-square attribute evaluation (CSEA) technique. The validity of the hybrid models was assessed by true positive rate (TP Rate), false positive rate (FP Rate), recall (sensitivity), precision, F-measure and area under the receiver operating characteristic curve (AUC). Results showed that land cover was the most important factor, while curvature had no effect on landslide occurrence in the study area and was removed from the modelling process. Additionally, although all ensemble models enhanced the predictive power of the ADTree classifier (AUC training = 0.859; AUC validation = 0.813), the RS ensemble model (AUC training = 0.883; AUC validation = 0.842) outperformed the RF (AUC training = 0.871; AUC validation = 0.840) and the BA (AUC training = 0.865; AUC validation = 0.836) ensemble models. The obtained results would be helpful for recognizing landslide-prone areas in the future in order to better manage and decrease the damage and negative impacts on the environment.
APA, Harvard, Vancouver, ISO, and other styles
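The validation workflow summarised in the abstract above (70/30 split, TP/FP rates, precision, recall, F-measure and AUC for a bagged tree ensemble) can be illustrated with a short, hedged sketch. This is not the authors' code: it uses scikit-learn, a synthetic dataset in place of the landslide inventory, and scikit-learn's default decision tree standing in for the ADTree base classifier.

```python
# Illustrative sketch only (not the study's code): a bagged tree ensemble
# evaluated with the metrics named in the abstract, on synthetic data split 70/30.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import (confusion_matrix, f1_score, precision_score,
                             recall_score, roc_auc_score)
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=103, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Default base estimator is a decision tree (a stand-in for ADTree here).
model = BaggingClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)

y_pred = model.predict(X_te)
tn, fp, fn, tp = confusion_matrix(y_te, y_pred).ravel()
print("TP rate:", tp / (tp + fn), " FP rate:", fp / (fp + tn))
print("precision:", precision_score(y_te, y_pred),
      " recall:", recall_score(y_te, y_pred),
      " F-measure:", f1_score(y_te, y_pred))
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```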
5

IGARI, Shinji, Fumiki TANAKA, and Masahiko ONOSATO. "3268 Computer aided operation planning for an actual machine tool based on updatable machining database and database oriented planning algorithm." Proceedings of International Conference on Leading Edge Manufacturing in 21st century : LEM21 2011.6 (2011): _3268–1_—_3268–6_. http://dx.doi.org/10.1299/jsmelem.2011.6._3268-1_.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gorla, Narasimhaiah, and Warren J. Boe. "Database operating efficiency in fragmented databases in mainframe, mini, and micro system environments." Data & Knowledge Engineering 5, no. 1 (March 1990): 1–19. http://dx.doi.org/10.1016/0169-023x(90)90030-h.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Chenxi, Zhang, Ci Yungui, and Liu Bo. "Implementation of prolog databases and database operation builtins in the WAM-Plus model." Future Generation Computer Systems 6, no. 1 (June 1990): 3–9. http://dx.doi.org/10.1016/0167-739x(90)90003-v.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Chen, Jason, Ming-Hsien Yang, and Tian-Lih Koo. "A Control-Data-Mapping Entity-Relationship Model for Internal Controls Construction in Database Design." International Journal of Knowledge-Based Organizations 4, no. 2 (April 2014): 20–36. http://dx.doi.org/10.4018/ijkbo.2014040102.

Full text
Abstract:
The internal controls construction of a transaction system is important to management, operation and auditing. In the environment of manual operation, the internal controls of the transaction process are all done by manual mechanism. However, after the transaction processing environment has been changed from manual operation to computerized operation, the internal control techniques have been gradually transformed from manual mechanisms to computerized methods. The essence of internal controls in operational activities is the data expressions or constraints. The adoption of information systems often results in internal control deficiencies and operating risks due to the data unavailable in database for the data expressions of internal controls. Hence, how to design database schema to support internal controls mechanism is becoming a crucial issue for a computerized enterprise. Therefore, this paper referred Entity-Relationship model (ER model) in order to propose a Control-Data-Mapping Entity-relationship (CDMER) model by manipulating the required fields of tables to design database to support internal controls construction. Finally, a simple simulated case is prepared for illustration of the CDMER model. The contribution of this paper is to enhance the reliability of information systems through internal controls construction by applying the model to design databases.
APA, Harvard, Vancouver, ISO, and other styles
9

Nhu, Viet-Ha, Saeid Janizadeh, Mohammadtaghi Avand, Wei Chen, Mohsen Farzin, Ebrahim Omidvar, Ataollah Shirzadi, et al. "GIS-Based Gully Erosion Susceptibility Mapping: A Comparison of Computational Ensemble Data Mining Models." Applied Sciences 10, no. 6 (March 17, 2020): 2039. http://dx.doi.org/10.3390/app10062039.

Full text
Abstract:
Gully erosion destroys agricultural and domestic grazing land in many countries, especially those with arid and semi-arid climates and easily eroded rocks and soils. It also generates large amounts of sediment that can adversely impact downstream river channels. The main objective of this research is to accurately detect and predict areas prone to gully erosion. In this paper, we couple hybrid models of a commonly used base classifier (reduced pruning error tree, REPTree) with AdaBoost (AB), bagging (Bag), and random subspace (RS) algorithms to create gully erosion susceptibility maps for a sub-basin of the Shoor River watershed in northwestern Iran. We compare the performance of these models in terms of their ability to predict gully erosion and discuss their potential use in other arid and semi-arid areas. Our database comprises 242 gully erosion locations, which we randomly divided into training and testing sets with a ratio of 70/30. Based on expert knowledge and analysis of aerial photographs and satellite images, we selected 12 conditioning factors for gully erosion. We used multi-collinearity statistical techniques in the modeling process, and checked model performance using statistical indexes including precision, recall, F-measure, Matthews correlation coefficient (MCC), receiver operating characteristic curve (ROC), precision–recall graph (PRC), Kappa, root mean square error (RMSE), relative absolute error (PRSE), mean absolute error (MAE), and relative absolute error (RAE). Results show that rainfall, elevation, and river density are the most important factors for gully erosion susceptibility mapping in the study area. All three hybrid models that we tested significantly enhanced and improved the predictive power of REPTree (AUC = 0.800), but the RS-REPTree (AUC = 0.860) ensemble model outperformed the Bag-REPTree (AUC = 0.841) and the AB-REPTree (AUC = 0.805) models. We suggest that decision makers, planners, and environmental engineers employ the RS-REPTree hybrid model to better manage gully erosion-prone areas in Iran.
APA, Harvard, Vancouver, ISO, and other styles
10

Volkanovski, Andrija, Antonio Ballesteros Avila, and Miguel Peinador Veira. "Statistical Analysis of Loss of Offsite Power Events." Science and Technology of Nuclear Installations 2016 (2016): 1–9. http://dx.doi.org/10.1155/2016/7692659.

Full text
Abstract:
This paper presents the results of the statistical analysis of the loss of offsite power events (LOOP) registered in four reviewed databases. The reviewed databases include the IRSN (Institut de Radioprotection et de Sûreté Nucléaire) SAPIDE database and the GRS (Gesellschaft für Anlagen- und Reaktorsicherheit mbH) VERA database reviewed over the period from 1992 to 2011. The US NRC (Nuclear Regulatory Commission) Licensee Event Reports (LERs) database and the IAEA International Reporting System (IRS) database were screened for relevant events registered over the period from 1990 to 2013. The number of LOOP events in each year in the analysed period and mode of operation are assessed during the screening. The LOOP frequencies obtained for the French and German nuclear power plants (NPPs) during critical operation are of the same order of magnitude with the plant related events as a dominant contributor. A frequency of one LOOP event per shutdown year is obtained for German NPPs in shutdown mode of operation. For the US NPPs, the obtained LOOP frequency for critical and shutdown mode is comparable to the one assessed in NUREG/CR-6890. Decreasing trend is obtained for the LOOP events registered in three databases (IRSN, GRS, and NRC).
APA, Harvard, Vancouver, ISO, and other styles
11

Osmanov, I. H. "METHODOLOGY FOR SYSTEM CONSTRUCTION FOR COLLECTING AND PROCESSING INFORMATION ABOUT THE TECHNICAL AND OPERATING CONDITION OF OBJECTS OF LINEAR-STRETCHED CONSTRUCTION ON THE EXAMPLE OF THE HIGHWAYS OF REPUBLIC (HIGHWAY DATA BANK)." Construction economic and environmental management 77, no. 4 (2021): 50–55. http://dx.doi.org/10.37279/2519-4453-2020-4-50-55.

Full text
Abstract:
The research considers a methodology for constructing a system for collecting and processing information about the technical and operating condition of linear-length construction objects, using the example of the republic's highways (highway databank). The elaborated republican databank consists of various systems of ordered information (databases) united by common principles and rules for their automated processing. Creating the system makes it possible to develop economically feasible current and annual programs for new construction, reconstruction, reconditioning and operation of highways in the republic, harmonized with material, technical, labor and financial resources, and to obtain current information for effective controlling actions at various levels of management.
APA, Harvard, Vancouver, ISO, and other styles
12

Wagner, Zdenek. "Co-operation of a Database with LaTeX." Zpravodaj Československého sdružení uživatelů TeXu 10, no. 1-3 (2000): 49–78. http://dx.doi.org/10.5300/2000-1-3/49.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Qi, Weiqiang, Hongkai Wang, and Tong Chen. "Multimedia System Design and Data Storage Optimization Based on Machine Learning Algorithm." Computational Intelligence and Neuroscience 2022 (August 1, 2022): 1–12. http://dx.doi.org/10.1155/2022/6426551.

Full text
Abstract:
With the advancement of science and technology, digital technology and Internet of Things network technology have developed rapidly, and multimedia technology has also been widely used. Multimedia formats such as digital TV and elevator posters are shaking up traditional media. At the same time, many media operation models and multimedia technologies are combined to plan operational strategies, determine operational goals, and change the traditional media structure to achieve commercial profit and social benefit. However, due to limitations in the existing operating model or unreasonable technical solutions, it is not easy to maximize the value of multimedia technology. An XML-based database is proposed to carry out the business requirements of the transaction network and its business platform, and an integrated management mechanism is analyzed and applied. The framework design includes a parallel quota processing module, an update processing module, a result processing module, and a storage library and database connection management module; these modules run the parts of the system together and complete the database. The development of cloud databases is based on cloud computing, which can effectively fill the shortcomings and gaps of traditional database storage and processing and can also provide highly interactive databases for storage and management services with high reliability. Cloud servers use fair weighted round-robin algorithms to achieve load balancing and use the in-memory database Redis to realize terminal data caching. After a comprehensive test of the system, the system can perform all functions normally, and it has good performance and stable operation.
APA, Harvard, Vancouver, ISO, and other styles
14

Stepanenko, A. A., and V. I. Martyanov. "Design of databases in road industry using set-theory analysis of complex systems." Journal «Izvestiya vuzov. Investitsiyi. Stroyitelstvo. Nedvizhimost» 12, no. 2 (2022): 214–23. http://dx.doi.org/10.21285/2227-2917-2022-2-214-223.

Full text
Abstract:
In the article, a method for designing databases for the road industry that meets the requirements of the digital economy was developed. The set-theory analysis of complex systems implemented by relational databases (RDBs) was used. This approach meets current trends in the global digital economy, where relational databases provide efficient operation of the banking sector, industrial production management, fast data processing (bank cards, Internet search), etc. The Department of Roads at Irkutsk National Research Technical University designed and is developing the road database for the Irkutsk region. The paper considers the concept of this project and prospects of its further development with a view to the possible use of this database by remote users. The specifics of digital economy in the Russian Federation call for establishing branches and territorial databases, including the road database for the Russian regions. The efficient solutions for such a database were proposed, including, in particular, integrating remote users with the back-end database based on the attribute characteristics to limit the available content and allowable changes to the back-end data.
APA, Harvard, Vancouver, ISO, and other styles
15

Kungurtsev, A. B., S. L. Zinovatnaya, and M. Al Abdo. "Prediction of a relational database’s operation in the information system." Odes’kyi Politechnichnyi Universytet. Pratsi, no. 1 (March 31, 2015): 113–20. http://dx.doi.org/10.15276/opu.1.45.2015.19.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Chen, Jeang-Kuo, and Wei-Zhe Lee. "An Introduction of NoSQL Databases Based on Their Categories and Application Industries." Algorithms 12, no. 5 (May 16, 2019): 106. http://dx.doi.org/10.3390/a12050106.

Full text
Abstract:
The popularization of big data means that enterprises need to store more and more data. The data in an enterprise's database must be accessed as fast as possible, but the Relational Database (RDB) has a speed limitation due to the join operation. Many enterprises have switched to a NoSQL database, which can meet the requirement of fast data access. However, there are more than hundreds of NoSQL databases. It is important to select a suitable NoSQL database for a given enterprise because this decision will affect the performance of the enterprise's operations. In this paper, fifteen categories of NoSQL databases are introduced to find out the characteristics of every category. Some principles and examples are proposed for choosing an appropriate NoSQL database for different industries.
APA, Harvard, Vancouver, ISO, and other styles
17

Sinaga, Theresia Liana, Novrido Charibaldi, and Nur Heri Cahyana. "Perbandingan Waktu Respon Aplikasi Database NoSQL Elasticsearch dan MongoDB pada Pengujian Operasi CRUD." JISKA (Jurnal Informatika Sunan Kalijaga) 8, no. 1 (January 30, 2023): 22–35. http://dx.doi.org/10.14421/jiska.2023.8.1.22-35.

Full text
Abstract:
Currently, humans live in an era of data oceans, where the amount of data production is increasing from time to time, which is followed by severe challenges in terms of processing, storing, and analyzing data, especially big data. The increase in the amount of data production can affect the speed of access to the database, the effectiveness, and the speed of response time in data processing. Relational databases have been the leading model for data storage, analysis, processing, and retrieval for more than forty years. However, due to the increasing need for large-scale data storage, the scalability and performance of data processing systems, and the constant growth of the amount of data, another alternative to databases emerged, namely NoSQL technology. Previous studies comparing response time and database performance conclude on average that NoSQL performance is more effective and efficient than that of relational databases. Based on the implementation and testing, it can be concluded that the NoSQL database application MongoDB is proven to be superior in every CRUD command tested compared to the Elasticsearch NoSQL database application. In testing the command to create data from a JSON file, the MongoDB database application is 42.5 times faster than the Elasticsearch database application. In testing the command to create data in a database containing different amounts of data, the MongoDB database application is 333.9 times faster than the average response time of the Elasticsearch database application. In testing the read command for data in a database containing different amounts of data, the MongoDB database application is 35.5 times faster than the Elasticsearch database application. In testing the update operation on data in a database containing different amounts of data, the MongoDB database application is 9.8 times faster than the Elasticsearch database application. In testing the delete operation on data in a database containing different amounts of data, the MongoDB database application is 58.9 times faster than the Elasticsearch database application.
APA, Harvard, Vancouver, ISO, and other styles
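For readers who want to reproduce the flavour of the CRUD timing comparison described in the abstract above, the following hedged sketch times a simple create workload against both stores. It assumes local MongoDB and Elasticsearch instances, the pymongo and elasticsearch Python clients (the document= parameter follows the 8.x Elasticsearch client), and made-up collection and index names; absolute numbers will of course differ from the study's results.

```python
# Hypothetical timing harness in the spirit of the comparison above;
# assumes MongoDB and Elasticsearch are running locally on default ports.
import time
from pymongo import MongoClient
from elasticsearch import Elasticsearch

docs = [{"seq": i, "payload": "x" * 100} for i in range(1000)]

mongo = MongoClient("mongodb://localhost:27017")["bench"]["records"]
es = Elasticsearch("http://localhost:9200")

t0 = time.perf_counter()
for d in docs:
    mongo.insert_one(dict(d))                      # create in MongoDB
mongo_time = time.perf_counter() - t0

t0 = time.perf_counter()
for i, d in enumerate(docs):
    es.index(index="records", id=i, document=d)    # create in Elasticsearch
es_time = time.perf_counter() - t0

print(f"MongoDB: {mongo_time:.3f}s  Elasticsearch: {es_time:.3f}s")
```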
18

Carluccio, G., G. Olivo, A. Calisti, and A. Lotto. "Database of the operating room." Urologia Journal 61, no. 1 (February 1994): 43–44. http://dx.doi.org/10.1177/039156039406100108.

Full text
Abstract:
Statistical evaluations and revisions of surgical case histories are often difficult since each surgical operation is defined by numerous data. In a traditional surgery-book such revisions are often not homogeneous because reported data are incomplete or inaccurate. Homogeneous and orderly files facilitate rational management and fast, correct research. On the basis of this consideration, we have been studying and proposing a database of surgical operations.
APA, Harvard, Vancouver, ISO, and other styles
19

Liekovuori, Kaisa, Samu Rautio, and Aatu Härkönen. "Shared Parameter Database of War Gaming Software: Case Study on Commercial Simulation Databases." Security Dimensions 35, no. 35 (March 31, 2021): 82–99. http://dx.doi.org/10.5604/01.3001.0014.8241.

Full text
Abstract:
Background: The current research brings up the perspective of security-critical information systems in shared parameter databases in the context of processing sensitive data at Finnish Naval Warfare Centre. It refers to the environment of isolated military war gaming simulation and modeling systems. The research problem is: How to make an optimal solution for data distribution in different military war gaming simulation and modeling software? Objectives: The objective is to create a single shared database usable with different detail level software, e.g. high-level scenario simulation, technical system-of-system simulations, and system-level physical simulations. Methods: The methods are modeling, simulation and operation analysis. The approach is inductive, the strategy is a qualitative case study and the data collection was implemented by exploring database models and their combinations. The integration was implemented in an object-relational database management system (ORDBMS), PostgreSQL. Results: The shared database led to efficient access to simulation parameters, more straightforward system integration and improved scalability. Conclusions: The results of modeling and simulation indicated that the integration is possible to implement.
APA, Harvard, Vancouver, ISO, and other styles
20

Menshikov, V. V. "Lesson on the theme "Database management systems. Sorting. Requests to select data"." Informatics in school 1, no. 1 (March 18, 2021): 9–19. http://dx.doi.org/10.32517/2221-1993-2021-20-1-9-19.

Full text
Abstract:
The article presents one of the lessons of the "Databases" section of the ninth grade informatics course. This lesson explores such concepts as "DBMS", "sorting", "filter", "selection", "request". Methods of sorting information in a database are considered, as well as methods of searching for information in a database using filters and requests. The content of the lesson corresponds to the teaching materials on informatics for the ninth grade by L. L. Bosova and A. Yu. Bosova. A feature of the presented lesson is that it is intended for students studying at a cadet school (cadets), therefore a military component has been added to it — the examples of databases considered in the lesson are related to aviation. The databases used as examples for the lesson's educational material are a database containing information about the characteristics of aircraft of the USSR and Russia, and a database containing information about flights operated from Moscow Domodedovo and Sheremetyevo airports.
APA, Harvard, Vancouver, ISO, and other styles
21

Okido, Toshihisa, Yuichi Kodama, Jun Mashima, Takehide Kosuge, Takatomo Fujisawa, and Osamu Ogasawara. "DNA Data Bank of Japan (DDBJ) update report 2021." Nucleic Acids Research 50, no. D1 (November 9, 2021): D102—D105. http://dx.doi.org/10.1093/nar/gkab995.

Full text
Abstract:
The Bioinformation and DDBJ (DNA Data Bank of Japan) Center (DDBJ Center; https://www.ddbj.nig.ac.jp) operates archival databases that collect nucleotide sequences, study and sample information, and distribute them without access restriction to progress life science research as a member of the International Nucleotide Sequence Database Collaboration (INSDC), in collaboration with the National Center for Biotechnology Information (NCBI) and the European Bioinformatics Institute. Besides the INSDC databases, the DDBJ Center also provides the Genomic Expression Archive for functional genomics data and the Japanese Genotype-phenotype Archive for human data requiring controlled access. Additionally, the DDBJ Center started a new public repository, MetaboBank, for experimental raw data and metadata from metabolomics research in October 2020. In response to the COVID-19 pandemic, the DDBJ Center openly shares SARS-CoV-2 genome sequences in collaboration with Shizuoka Prefecture and Keio University. The operation of DDBJ is based on the National Institute of Genetics (NIG) supercomputer, which is open for large-scale sequence data analysis for life science researchers. This paper reports recent updates on the archival databases and the services of DDBJ.
APA, Harvard, Vancouver, ISO, and other styles
22

IORDACHE, Dorin. "DATABASE – WEB INTERFACE VULNERABILITIES." STRATEGIES XXI - Security and Defense Faculty 17, no. 1 (November 1, 2021): 279–87. http://dx.doi.org/10.53477/2668-2001-21-35.

Full text
Abstract:
The importance of information security in general, and of information managed at the level of a database in particular, has increased with the expansion of the Internet. It has also acquired new facets as users gain access to ever more resources. The large volume of private data in use and the need to limit unauthorized actions on information have brought new aspects to the issue of ensuring their protection. The scope of this field is wide and allows work in several directions: identification, description, creation, implementation and testing of mechanisms aimed at improving the working environment in which database management systems operate. Due to the importance of the information managed by a DBMS[1], it is necessary to define a framework that is safe and easy to use. The database fulfills not only the role of storage, but also of data provider to users. Thus, the information must be protected throughout the interaction process: generation, storage, processing, modification, deletion, etc. Therefore, the security of databases must not be reduced only to the protection of certain data considered sensitive, but must extend to the creation of a secure, authorized and controlled global environment through which information becomes available to users. [1] DBMS – DataBase Management System
APA, Harvard, Vancouver, ISO, and other styles
23

Masic, Izet, Slobodan Jankovic, Doncho Donev, Muharem Zildzic, and Izet Hozo. "Operation of Medical Journal Citation Databases Without Control. Dilemma: Are They What They Want to Be in the Eyes of Scientific Community." Materia Socio Medica 34, no. 4 (2022): 248. http://dx.doi.org/10.5455/msm.2022.34.248-253.

Full text
Abstract:
The decision of the citation database to include or not include a journal is not subject to the control of another entity, or the professional public, and there are no internationally established ethical standards that the citation database would have to apply. As a consequence of the absence of control, the already mentioned offensive reviews and arbitrary interpretation of the criteria for journal inclusion appear. Given that a journal’s entry into the citation database is a condition for its long-term survival, people who make decisions in the citation databases gain the power to shut down or revive certain journals based on personal preferences. Any power that is not controlled is eventually abused. Therefore, our proposal is to urgently establish the principles of ethical behavior of citation databases at the global level and find ways to ensure compliance with such principles.
APA, Harvard, Vancouver, ISO, and other styles
24

M, Natarajan, and Manimegalai R. "Key agreement based secure Kerberos authentication protocol (KASKAP) for distributed database access in secured manner." International Journal of Engineering & Technology 7, no. 2.9 (April 26, 2018): 24. http://dx.doi.org/10.14419/ijet.v7i2.9.9364.

Full text
Abstract:
A distributed database is a collection of multiple databases that can be stored at different network sites. It plays an important role in today's world for storing and retrieving huge amounts of data. The implementation of a distributed database offers advantages such as data replication, low operating costs, and faster data transactions and processing, but security is still a significant problem. This paper explains the security issues of distributed databases and gives suggestions to improve their security. Subsequently, a secured distributed database design based on a trusted node is proposed. The design contains a unique node in the system, called a trusted node, for each site, through which every other node accesses the database. The trusted node processes client requests, joins the results from the concerned distributed databases and forwards them to the authenticated client. The mechanism adopted by the trusted nodes in order to provide authentication is the Key Agreement based Secure Kerberos Authentication Protocol (KASKAP). Hence only authenticated users can access the database.
APA, Harvard, Vancouver, ISO, and other styles
25

Choi, Wonsang. "A Study on the Intelligent Disaster Management System Based on Artificial Intelligence." Journal of the Korean Society of Hazard Mitigation 20, no. 1 (February 29, 2020): 127–40. http://dx.doi.org/10.9798/kosham.2020.20.1.127.

Full text
Abstract:
This study was conducted in the context of the government's disaster safety management, which is being promoted as part of an intelligent government with the help of information and communications technology. We suggest policy measures, also referred to as determinants, for the application of artificial intelligence (AI). To this end, we review the government's disaster safety management and the development and operation of AI. AI-based government disaster safety management is examined, showing databases organized by disaster area, the infrastructure of information systems operated by institutions, and the existing infrastructure operating on one disaster platform. Furthermore, this paper proposes fostering manpower and organizations for the operation of AI on the deployed infrastructure and platforms.
APA, Harvard, Vancouver, ISO, and other styles
26

Putra, Raden Bagus Dimas, Eko Setia Budi, and Abdul Rahman Kadafi. "Perbandingan Antara SQLite, Room, dan RBDLiTe Dalam Pembuatan Basis Data pada Aplikasi Android." JURIKOM (Jurnal Riset Komputer) 7, no. 3 (June 14, 2020): 376. http://dx.doi.org/10.30865/jurikom.v7i3.2161.

Full text
Abstract:
Android is one of the largest mobile operating system platforms today. Amid the ever-increasing needs of users, there are still too few Android programmers to meet current market needs. One of the problems that causes this is the difficulty of creating an internal database, so many people give up on learning to build Android applications. Android's internal database is hard to create because the entire data structure of that database must be built with queries executed in the program code. Two of the common internal databases currently used are SQLite and Room. Both are quite difficult to learn: with SQLite everything must be built from scratch, while Room requires studying the structure and systematics of the library. Therefore, in this article, the author proposes a library and template named "Relational Database Library and Template", referred to as RBDLiTe, which can easily create an internal database for Android applications and which is also compared with SQLite and Room in its use.
APA, Harvard, Vancouver, ISO, and other styles
27

Tan, Xiu Ling, Sae Cheong Yap, Xiang Li, and Leonard W. Yip. "Comparison of Ethnic-specific Databases in Heidelberg Retina Tomography-3 to Discriminate Between Early Glaucoma and Normal Chinese Eyes." Open Ophthalmology Journal 11, no. 1 (February 28, 2017): 40–46. http://dx.doi.org/10.2174/1874364101711010040.

Full text
Abstract:
Purpose: To compare the diagnostic accuracy of the 3 race-specific normative databases in Heidelberg Retina Tomography (HRT)-3, in differentiating between early glaucomatous and healthy normal Chinese eyes. Method: 52 healthy volunteers and 25 glaucoma patients were recruited for this prospective cross-sectional study. All underwent standardized interviews, ophthalmic examination, perimetry and HRT optic disc imaging. Area under the curve (AUC) receiver operating characteristics, sensitivity and specificity were derived to assess the discriminating abilities of the 3 normative databases, for both Moorfields Regression Analysis (MRA) and Glaucoma Probability Score (GPS). Results: A significantly higher percentage (65%) of patients were classified as “within normal limits” using the MRA-Indian database, as compared to the MRA-Caucasian and MRA-African-American databases. However, for GPS, this was observed using the African-American database. For MRA, the highest sensitivity was obtained with both Caucasian and African-American databases (68%), while the highest specificity was from the Indian database (94%). The AUC for discrimination between glaucomatous and normal eyes by MRA-Caucasian, MRA-African-American and MRA-Indian databases were 0.77 (95% CI, 0.67-0.88), 0.79 (0.69-0.89) and 0.73 (0.63-0.84) respectively. For GPS, the highest sensitivity was obtained using either Caucasian or Indian databases (68%). The highest specificity was seen with the African-American database (98%). The AUC for GPS-Caucasian, GPS-African-American and GPS-Indian databases were 0.76 (95% CI, 0.66-0.87), 0.77 (0.67-0.87) and 0.76 (0.66-0.87) respectively. Conclusion: Comparison of the 3 ethnic databases did not reveal significant differences to differentiate early glaucomatous from normal Chinese eyes.
APA, Harvard, Vancouver, ISO, and other styles
28

Dewan, Rajiv M. "On the Distributed Database Union Operation." Journal of Database Management 3, no. 2 (April 1992): 24–35. http://dx.doi.org/10.4018/jdm.1992040103.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Clemencic, M., I. Shapoval, M. Cattaneo, H. Degaudenzi, and R. Santinelli. "LHCb Conditions database operation assistance systems." Journal of Physics: Conference Series 396, no. 5 (December 13, 2012): 052022. http://dx.doi.org/10.1088/1742-6596/396/5/052022.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Petrosyan, Arthur S., and Gurgen S. Petrosyan. "Development of Management System for eduroam Database Updated Specification." Mathematical Problems of Computer Science 53 (July 10, 2020): 57–62. http://dx.doi.org/10.51408/1963-0081.

Full text
Abstract:
This paper presents a system developed to simplify the eduroam database management. The goal of the eduroam database is to provide the necessary information needed for the operation of the eduroam service and related supporting services like eduroam monitoring, service location maps and usage statistics, as well as the eduroam CAT tool. eduroam database specifications have been updated to the second version and the update has recently become mandatory, requiring many changes for databases of all National Roaming Operators (NRO). The solution presented here can be useful for NRO administrators to simplify the eduroam database management.
APA, Harvard, Vancouver, ISO, and other styles
31

Petrosyan, Arthur, and Gurgen Petrosyan. "Development of Management System for eduroam Database Updated Specification." Mathematical Problems of Computer Science 53 (July 10, 2020): 57–62. http://dx.doi.org/10.51408/1963-0075.

Full text
Abstract:
This paper presents a system developed to simplify the eduroam database management. The goal of the eduroam database is to provide the necessary information needed for the operation of the eduroam service and related supporting services like eduroam monitoring, service location maps and usage statistics, as well as the eduroam CAT tool. eduroam database specifications have been updated to the second version and the update has recently become mandatory, requiring many changes for databases of all National Roaming Operators (NRO). The solution presented here can be useful for NRO administrators to simplify the eduroam database management.
APA, Harvard, Vancouver, ISO, and other styles
32

Gaensslen, R. E. "Should Biological Evidence or DNA Be Retained by Forensic Science Laboratories after Profiling? No, except under Narrow Legislatively-Stipulated Conditions." Journal of Law, Medicine & Ethics 34, no. 2 (2006): 375–79. http://dx.doi.org/10.1111/j.1748-720x.2006.00042.x.

Full text
Abstract:
DNA profiling and databasing have become commonplace in criminal investigation and prosecution. There is a body of both state and federal legislation enabling the establishment and operation of profile databases for law enforcement purposes. Most legislation is specific as to who (or what evidence) may be profiled for inclusion in a database. The majority of state laws permit DNA profile databasing of offenders convicted of certain defined crimes, of missing persons and their relatives, and of DNA profiles from criminal-case evidence where the depositor is unknown. More recently, a few states have acted to permit databasing profiles of suspects of certain types of crimes, and there appears to be a trend toward wider adoption of this practice. The legislation adopted or proposed thus far defines whose DNA profiles can be databased, and under what circumstances. Less attention has been given to the matter of specimen retention following profiling and databasing.
APA, Harvard, Vancouver, ISO, and other styles
33

Alekseev, Konstantin. "Relational database problems." Кибернетика и программирование, no. 2 (February 2020): 7–18. http://dx.doi.org/10.25136/2644-5522.2020.2.34076.

Full text
Abstract:
The relevance of this article lies in the fact that today's databases are the basis of numerous information systems. The information accumulated in them is extremely valuable material, and database processing methods are now widely used for extracting additional knowledge from it, which is connected with generalization and various additional methods of information processing. The object of research in this work is relational databases and DBMSs; the subject of research is the features of their use in applied programming. In accordance with the set goal, it is necessary to solve the following tasks: 1) to consider the concept and essence of a relational database; 2) to analyze the problematic aspects of relational databases in modern conditions. Relational databases are among the most widespread due to their simplicity and clarity at the creation stage and at the user level. It should also be noted that the main advantage of the RDB is its compatibility with the main query language, SQL, which is intuitive for users. Nevertheless, with all the variety of approaches, there are still some canons whose violation greatly affects both the design of the database and its operation. For example, the problem of database normalization is very relevant: neglecting normalization makes the database structure confusing and the database itself unreliable. Promising directions include the development of queries to a relational database using heuristic methods, as well as the method of accumulating previously optimized queries with subsequent verification of the derivability of the current query from the accumulated ones. Finally, a very slow decline in relational databases is probably happening. While they are still the primary storage medium, especially in large enterprise projects, they are gradually being replaced by non-relational solutions that will become the majority over time.
APA, Harvard, Vancouver, ISO, and other styles
34

Ray, Loye L., and Henry Felch. "Methodology for Detecting Advanced Persistent Threats in Oracle Databases." International Journal of Strategic Information Technology and Applications 5, no. 1 (January 2014): 42–53. http://dx.doi.org/10.4018/ijsita.2014010104.

Full text
Abstract:
Advanced persistent threats (APTs) have become a big problem for computer systems. Databases are vulnerable to these threats and can give attackers access to an organization's sensitive data. Oracle databases are at greater risk due to their heavy use as back-ends to corporate applications such as enterprise resource planning software. This paper describes a methodology for finding APTs that may be hiding or operating deep within an Oracle database system. A deep understanding of normal Oracle operations provides a baseline to assist in discovering APT behavior. Incorporating these techniques into a database intrusion detection system can improve the ability to find these threats.
APA, Harvard, Vancouver, ISO, and other styles
35

Wang, Nuo, Yan Li, and Li Min Yuan. "Simulation on Optimized Intrusion Detection of Multi-Layer, Distributed and Large Differences Database." Applied Mechanics and Materials 556-562 (May 2014): 2886–89. http://dx.doi.org/10.4028/www.scientific.net/amm.556-562.2886.

Full text
Abstract:
Unlike traditional single databases, there is a big difference between the data of different layers in a multi-level database, while the differentiation of categorical attributes is small. The traditional database intrusion detection process simply considers point-to-point data detection between the layers, without considering the similarity between the layers and ignoring the optimization of the detected properties of the classification applied between the levels, resulting in lower detection accuracy. In order to avoid the above-mentioned defects of the conventional algorithm, this paper proposes an intrusion detection model for multi-layered networks by introducing the coarse-to-fine concept. The intrusion features of the computer database are extracted to be used as the basis for intrusion detection of the database. A particle swarm distinguishing tree is established to perform hierarchical processing of the nodes. Through the probability operation of database intrusion detection in different layers, intrusion detection of multi-layer, distributed and large-difference databases can be achieved. Experimental results show that the use of this intrusion detection algorithm for multi-layer, distributed and large-difference databases can increase the security of the database and ensure its safe operation.
APA, Harvard, Vancouver, ISO, and other styles
36

Marchand, Mathilde, Yves-Marie Saint-Drenan, Laurent Saboret, Etienne Wey, and Lucien Wald. "Performance of CAMS Radiation Service and HelioClim-3 databases of solar radiation at surface: evaluating the spatial variation in Germany." Advances in Science and Research 17 (July 14, 2020): 143–52. http://dx.doi.org/10.5194/asr-17-143-2020.

Full text
Abstract:
The present work deals with the spatial consistency of two well-known databases of solar radiation received at ground level: the CAMS Radiation Service database version 3.2, abbreviated as CAMS-Rad, and the HelioClim-3 database version 5, abbreviated as HC3v5. Both databases are derived from satellite images. They are validated against 10 min means of irradiance for the period 2010–2018 recorded in a network of 26 ground stations in Germany operated by the Deutscher Wetterdienst (DWD). For the CAMS-Rad database, the correlation coefficient between ground measurements and estimates ranges between 0.83 and 0.92 for all sky conditions. The bias ranges from −41 to 32 W m⁻² (−11 % to 10 % of the mean irradiance). The standard deviation ranges between 89 and 129 W m⁻² (25 % and 39 %). For the HC3v5 database, the correlation coefficient ranges between 0.90 and 0.95. The bias lies between −22 and 16 W m⁻² (−6 % and 5 %), and the standard deviation between 70 and 104 W m⁻² (20 % and 31 %). For the CAMS-Rad database, overestimation is observed in the South and underestimation in the North, with a faint tendency of the bias to increase from East to West. For the HC3v5 database, the bias is fairly homogeneous across Germany. For both databases, there is no noticeable spatial trend in the standard deviation.
APA, Harvard, Vancouver, ISO, and other styles
37

Kofroň, Jan, Michal Opletal, and Matyáš Zrno. "Analysis of the Malian conflict dynamics – exploiting ACLED database." Vojenské rozhledy 29, no. 4 (November 24, 2020): 046–64. http://dx.doi.org/10.3849/2336-2995.29.2020.04.046-064.

Full text
Abstract:
Focusing on the current Malian conflict, the aim of the article is to demonstrate the usefulness of the "Armed Conflict Location & Event Data Project" (ACLED) database for the analysis of intra-state conflicts. At the macro level the paper analyzes the geographical spread of the conflict and its key quantitative characteristics (numbers of fatalities stemming from different types of incidents). At the micro level it focuses on the Malian region of Mopti, analyzing the geographical distribution of various incidents and the interaction of the key armed groups operating within the region.
APA, Harvard, Vancouver, ISO, and other styles
38

Li, Changqing, and Jianhua Gu. "A Paged Prefetching Model for Join Operations of Cross-Databases." Xibei Gongye Daxue Xuebao/Journal of Northwestern Polytechnical University 38, no. 6 (December 2020): 1225–34. http://dx.doi.org/10.1051/jnwpu/20203861225.

Full text
Abstract:
As the applications of big data increase, NoSQL (Not only Structured Query Language) database management systems have developed rapidly. How to integrate NoSQL and relational databases effectively has become one of the research hotspots. In existing research results, the paged query method used for join operations across these heterogeneous databases can produce a large delay. In view of this deficiency, this paper presents a paged prefetching model and focuses on its basic composition, prefetching mode and operation mechanism. A prototype system is designed and developed; the effect of the model is verified, and the expected targets are achieved. Compared with the non-prefetched paging query method, the outstanding contribution of this research is that it reduces the delay of paged queries and thus improves the efficiency of join operations across databases.
APA, Harvard, Vancouver, ISO, and other styles
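The core idea of the paged prefetching model summarised above, fetching page n+1 in the background while page n is being consumed, can be sketched generically and independently of any particular DBMS. In the sketch below, fetch_page is a hypothetical placeholder for the actual cross-database join query; it is not taken from the paper.

```python
# Generic sketch of paged prefetching: while the caller consumes page n,
# a background worker already fetches page n+1 into a small buffer.
import queue
import threading

def fetch_page(page_no, page_size=100):
    # Placeholder for a real cross-database join query returning one page of rows;
    # here pages 0-4 have data, after which the result set ends.
    return [f"row-{page_no}-{i}" for i in range(page_size)] if page_no < 5 else []

def prefetching_pages(page_size=100, buffer_pages=2):
    buf = queue.Queue(maxsize=buffer_pages)

    def worker():
        page_no = 0
        while True:
            page = fetch_page(page_no, page_size)
            buf.put(page)
            if not page:          # an empty page marks the end of the result set
                break
            page_no += 1

    threading.Thread(target=worker, daemon=True).start()
    while True:
        page = buf.get()          # the next page is usually already buffered
        if not page:
            break
        yield page

for page in prefetching_pages():
    print(len(page), "rows in page")
```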
39

Geng, Wei Jiang. "Research on Database Design Based on Wireless Sensor Network." Applied Mechanics and Materials 539 (July 2014): 239–42. http://dx.doi.org/10.4028/www.scientific.net/amm.539.239.

Full text
Abstract:
This paper analyzes the architecture and sensor-node composition of wireless sensor networks, the component models running on the nodes' embedded operating system, and the nesC-based programming model. We then analyze databases for wireless sensor networks and design and implement the wireless sensor network database TaraxDB, focusing on the design and implementation of receiving, analyzing and presenting client queries, sending queries and receiving results between the database client and the sensor network, and receiving queries and returning results between the client and the user. The main features of the database cover the front end and the network nodes, as well as the collaborative processing of key technical queries between nodes.
APA, Harvard, Vancouver, ISO, and other styles
40

Tien, Ngo Manh, Nguyen Huu Huan, Tran Van Chung, Tong Phuoc Hoang Son, Vo Trong Thach, and Pham Thi Thu Thuy. "XÂY DỰNG KHUNG CƠ SỞ DỮ LIỆU SỐ VỀ HẢI DƯƠNG, MÔI TRƯỜNG VÙNG BIỂN NINH THUẬN - BÌNH THUẬN." Tạp chí Khoa học và Công nghệ Biển 17, no. 4 (August 6, 2018): 445–58. http://dx.doi.org/10.15625/1859-3097/17/4/8516.

Full text
Abstract:
This paper presents the building and design of a digital database frame for the oceanographic and environmental elements of the coastal waters of Ninh Thuan and Binh Thuan provinces, based on the integration of data from observations, multi-temporal remote sensing and simulated sources. In particular, VNREDSat-1, Vietnam's first high-resolution remote sensing imagery, was also considered in building the digital database frame. Building the digital database frame is the first important step towards achieving an overview of the data's structure and then building the database's structure. The database's structure is built in the form of a B-tree, which optimizes operations when updating the data.
APA, Harvard, Vancouver, ISO, and other styles
41

Климаш, М., О. Костів, О. Гордійчук-Бублівська, and І. Чайковський. "ДОСЛІДЖЕННЯ ЕФЕКТИВНОСТІ ВИКОРИСТАННЯ РОЗПОДІЛЕНИХ БАЗ ДАНИХ В СИСТЕМАХ IIOT." Information and communication technologies, electronic engineering 2, no. 1 (August 2022): 12–18. http://dx.doi.org/10.23939/ictee2022.01.012.

Full text
Abstract:
The Industrial Internet of Things (IIoT) determines the transformation of centralized systems into decentralized, more flexible, and efficient ones. Cloud technologies allow much more optimal use of IIoT resources. In the paper, the main features of Industrial Internet of Things systems were investigated and the problems of smart manufacturing were analyzed. The necessity of using a distributed architecture and cloud resources for organizing flexible industrial systems was determined. In addition, the advantages of distributed computing for big data processing were established. The preference for relational databases over non-relational ones for data processing and the reliability of servicing user requests were determined, and the peculiarities of relational database operation were considered. To improve computational efficiency, the use of a distributed database architecture was investigated. The benefits of involving cloud and distributed technologies in IIoT systems were identified, along with the possibility of choosing the most suitable parameters depending on the requirements for industrial system productivity. The opportunities for improving the quality of services in the Industrial Internet of Things by choosing an optimally distributed database architecture were determined.
APA, Harvard, Vancouver, ISO, and other styles
42

Jurga, Anna, and Jakub Mondzelewski. "Functioning of DNA Database in Poland." Issues of Forensic Science 297 (2017): 59–65. http://dx.doi.org/10.34836/pk.2017.297.2.

Full text
Abstract:
Forensic DNA databases, which operate in the zone forming an interface between science and law, have the purpose of gathering and processing DNA profiles for the needs of law enforcement and judicial authorities responsible for preventing and combating crime. Therefore, their appropriate functioning is important. On the one hand, it improves the efficiency of police work; on the other hand, it has to play its required role in protecting citizens' rights and personal data. The National DNA Database has functioned in Poland since 2007. Its effectiveness is correlated with the number of stored profiles. Despite its small collection, the Database has on numerous occasions proven highly useful in solving criminal cases. The possibility of carrying out searches in other countries' databases, as well as legislative and organisational undertakings aimed at improving the Database's operation, are gradually bringing results and increasing the investigative potential of this tool.
APA, Harvard, Vancouver, ISO, and other styles
43

Shalabi, Hossam, and George Hadjisophocleous. "CANDU FIRE DATABASE." CNL Nuclear Review 8, no. 2 (December 1, 2019): 179–89. http://dx.doi.org/10.12943/cnr.2017.00019.

Full text
Abstract:
The Nuclear Energy Agency (NEA) is a specialized agency within the Organization for Economic Co-operation and Development (OECD). The International Fire Data Exchange Project (OECD FIRE) was designed by the NEA to encourage multilateral co-operation in the collection and analysis of data relating to fire events in nuclear power plants. We used Python advanced software to analyze the data related to CANDU reactor plants in Canada from the OECD FIRE Database, while providing weighting factors/percentage tables to be used in CANDU Fire probabilistic risk assessment analysis. We also used 5 different time-series methods to predict future potential fires in CANDU reactors, compared the results from different methods, and identified the best method to predict future fires in CANDU power plants.
APA, Harvard, Vancouver, ISO, and other styles
44

Kumar, M. Sandeep, and Prabhu .J. "Comparison of NoSQL Database and Traditional Database-An emphatic analysis." JOIV : International Journal on Informatics Visualization 2, no. 2 (March 3, 2018): 51. http://dx.doi.org/10.30630/joiv.2.2.58.

Full text
Abstract:
A huge amount of data is manipulated by web applications such as Facebook, Twitter and other social sites, and most of it is unstructured. Storing, processing and analyzing such huge data in a relational database is not desirable. This paves the way towards NoSQL databases, which are widely used for handling big data. In this paper, we present the performance of store and query operations in a NoSQL database, estimating the performance of both read and write operations using simple and complex queries. The results show that, comparing Cassandra with a relational database, Cassandra outperforms the relational database. Most organizations use only HBase and Cassandra for cost benefits. A comparison of various NoSQL databases and the issues encountered while using them is also presented.
APA, Harvard, Vancouver, ISO, and other styles
45

Li, Wei Min, and Shi Jian Zhang. "The Key Technology Research of Mechanical Parts Dynamic Management Database System." Applied Mechanics and Materials 631-632 (September 2014): 1039–44. http://dx.doi.org/10.4028/www.scientific.net/amm.631-632.1039.

Full text
Abstract:
Accurately and efficiently managing a mechanical parts database is extremely important in the machinery industry. In this paper, a dynamic management database system for mechanical parts is established using the Microsoft Access database management system and the VB visual integrated development environment. This system adopts the ADOX object to establish new databases and edit existing databases dynamically, applies data files and the ADO and ADOX objects to dynamically manage the database, and utilizes the DataCombo control to realize the selection of data. Being simple, convenient and feasible, this system is an intelligent man-machine interactive operating system which enhances work efficiency and substantially decreases repetitive operations.
APA, Harvard, Vancouver, ISO, and other styles
46

Göhring, R. "Operating experience with the relational patient database." Medical Informatics 11, no. 3 (January 1986): 237–48. http://dx.doi.org/10.3109/14639238609003730.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Milne, R. "Operation Bigfoot – a volume crime database project." Science & Justice 41, no. 3 (July 2001): 215–17. http://dx.doi.org/10.1016/s1355-0306(01)71895-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Yanti, Neti Rusri, Alimah Alimah, and Desi Afrida Ritonga. "Implementasi Algoritma Data Encryption Standard Pada Penyandian Record Database." J-SAKTI (Jurnal Sains Komputer dan Informatika) 2, no. 1 (March 27, 2018): 23. http://dx.doi.org/10.30645/j-sakti.v2i1.53.

Full text
Abstract:
Database records are generally still often displayed in text form as information for users, which makes it easy for a cryptanalyst to access them and provides opportunities to leak, distribute or modify the records. One of the cryptographic algorithms used to secure data is the DES algorithm, which encrypts the data to be stored or sent. The DES algorithm belongs to the family of symmetric cryptographic systems and is a type of block cipher. DES operates on a 64-bit block size: it encrypts 64 bits of plaintext into 64 bits of ciphertext using a 56-bit internal key and the subkeys derived from it. The internal key is generated from a 64-bit external key. This research describes the process of securing database records by encrypting them with the DES algorithm, so that text database records take an enciphered form that is difficult for others to read and understand. This is done in an attempt to minimize the misuse of database records.
APA, Harvard, Vancouver, ISO, and other styles
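As a minimal illustration of the record-encryption scheme described in the abstract above (64-bit blocks, a 64-bit external key with 56 effective bits), the following sketch encrypts and decrypts one record field. It assumes the pycryptodome package, uses ECB mode purely to keep the example short, and the key and field values are illustrative; it is not the authors' implementation.

```python
# Minimal DES sketch: encrypt a record field before storing it, then decrypt it.
from Crypto.Cipher import DES
from Crypto.Util.Padding import pad, unpad

external_key = b"8bytekey"                     # 64-bit key, 56 effective bits
cipher = DES.new(external_key, DES.MODE_ECB)   # ECB keeps the example minimal

record_field = "confidential record value".encode("utf-8")
ciphertext = cipher.encrypt(pad(record_field, DES.block_size))   # store this value

plaintext = unpad(DES.new(external_key, DES.MODE_ECB).decrypt(ciphertext),
                  DES.block_size)
assert plaintext == record_field
```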
49

Adil, S. H., M. Ebrahim, S. S. A. Ali, and K. Raza. "Performance Analysis of Duplicate Record Detection Techniques." Engineering, Technology & Applied Science Research 9, no. 5 (October 9, 2019): 4755–58. http://dx.doi.org/10.48084/etasr.3036.

Full text
Abstract:
In this paper, a comprehensive performance analysis of duplicate data detection techniques for relational databases has been performed. The research focuses on traditional SQL based and modern bloom filter techniques to find and eliminate records which already exist in the database while performing bulk insertion operation (i.e. bulk insertion involved in the loading phase of the Extract, Transform, and Load (ETL) process and data synchronization in multisite database synchronization). The comprehensive performance analysis was performed on several data sizes using SQL, bloom filter, and parallel bloom filter. The results show that the parallel bloom filter is highly suitable for duplicate detection in the database.
APA, Harvard, Vancouver, ISO, and other styles
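A bloom filter of the kind compared in the study above can be sketched in a few lines of pure Python; the bit-array size, hash count and record keys below are illustrative only and are not taken from the paper.

```python
# Self-contained sketch of bloom-filter duplicate screening before a bulk insert.
import hashlib

class BloomFilter:
    def __init__(self, size_bits=1 << 20, num_hashes=5):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, key: str):
        # Derive several bit positions per key from salted SHA-256 digests.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{key}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, key: str):
        for p in self._positions(key):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, key: str) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(key))

bf = BloomFilter()
for k in ["rec-1", "rec-2", "rec-3"]:      # keys already present in the database
    bf.add(k)

incoming = ["rec-2", "rec-4"]
to_insert = [k for k in incoming if not bf.might_contain(k)]   # skip likely duplicates
print(to_insert)                                               # ['rec-4']
```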
50

Ait El Mouden, Zakariyaa, and Abdeslam Jakimi. "A New Algorithm for Storing and Migrating Data Modelled by Graphs." International Journal of Online and Biomedical Engineering (iJOE) 16, no. 11 (October 5, 2020): 137. http://dx.doi.org/10.3991/ijoe.v16i11.15545.

Full text
Abstract:
NoSQL databases have moved from theoretical solutions for exceeding the limits of relational databases to a practical and indisputable choice for storing and manipulating big data. In terms of variety, NoSQL databases store heterogeneous data without being obliged to respect a predefined schema, as is the case with relational and object-relational databases. NoSQL solutions surpass traditional databases in storage capacity; consider MongoDB, for example, which is a document-oriented database capable of storing an unlimited number of documents, with a maximal size of 32 TB depending on the machine that runs the database and also the operating system. Also, in terms of velocity, many studies have compared the execution times of different transactions and proved that NoSQL databases are the perfect solution for real-time applications. This paper presents an algorithm to store data modelled by graphs as NoSQL documents. The purpose of this study is to exploit the large amount of data stored in SQL databases and to make such data usable by recent clustering algorithms and other data science tools. The study links relational data to document datastores by defining an effective algorithm for reading relational data, modelling those data as graphs and storing those data as NoSQL documents.
APA, Harvard, Vancouver, ISO, and other styles
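One simple way to serialise a graph as NoSQL documents, broadly in the spirit of the abstract above though not the authors' algorithm, is to emit one adjacency-list document per node. The sketch below assumes the networkx package; the node labels are made up, and the MongoDB call in the final comment is optional and assumes pymongo.

```python
# Hedged sketch: turn a graph into one document per node, ready for a document store.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("customer", "order"), ("order", "product"), ("customer", "address")])

documents = [
    {"_id": node, "neighbours": sorted(G.neighbors(node)), "degree": G.degree(node)}
    for node in G.nodes
]

for doc in documents:
    print(doc)
# e.g. with pymongo:  MongoClient()["graphdb"]["nodes"].insert_many(documents)
```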