To see the other types of publications on this topic, follow the link: Database migrations.

Dissertations / Theses on the topic 'Database migrations'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Database migrations.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Jonsson, Peter. "Automated Testing of Database Schema Migrations." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-255012.

Full text
Abstract:
Modern applications use databases, and the majority of them are relational databases, which use schemas to impose data integrity constraints. As applications change, so do their databases. Database schemas are changed using migrations. Certain conditions can cause migrations to fail in production environments, leaving the database in a broken state, and testing can be problematic without access to production data, which can be sensitive. Two migration validation methods were proposed and implemented to automatically reject invalid migrations that are not compatible with the database state. The methods were based on, and compared to, a default method that used Liquibase to structure and perform migrations. The assertion method used knowledge of what a valid state would look like to generate pre-conditions from assertions, verifying that the database's state matched expectations and that the migrations were compatible with the database's state prior to migration. The schema method used a copy of the production database's schema to perform migrations on an empty database in order to test the compatibility of the old and new schemas. 108 test cases, each consisting of a migration and a database state, were used to test all methods; both valid test cases and invalid test cases that were not compatible with the database's state were included. The distribution of aborted, failed, and successful migrations was analysed along with the automation, traceability, reliability, database interoperability, preservability, and scalability of each method. Both the assertion method and the schema method could be used to stop most of the invalid migrations without access to the production data. A combination of the assertion method and the schema method would result in only 2/108 migrations failing, and the failure rate could be reduced even further by using a schema to reduce complexity for uniqueness constraints and to improve support for handling data type conversions.
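The assertion method described in this abstract comes down to checking explicit pre-conditions against the live schema before a change is applied, so that an incompatible migration is rejected rather than left half-applied. A minimal sketch of that idea in Python, using SQLite and hypothetical table and column names (the thesis itself implements the pre-conditions with Liquibase):

```python
# Sketch of an assertion-style pre-condition check; table/column names are hypothetical.
import sqlite3

def column_exists(conn: sqlite3.Connection, table: str, column: str) -> bool:
    # PRAGMA table_info returns one row per column; index 1 holds the column name.
    return any(row[1] == column for row in conn.execute(f"PRAGMA table_info({table})"))

def migrate_add_email(conn: sqlite3.Connection) -> None:
    # Pre-condition: abort cleanly instead of failing halfway through the migration.
    if column_exists(conn, "users", "email"):
        raise RuntimeError("pre-condition failed: users.email already exists")
    conn.execute("ALTER TABLE users ADD COLUMN email TEXT")
    conn.commit()

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    migrate_add_email(db)        # first run succeeds
    try:
        migrate_add_email(db)    # second run is rejected by the pre-condition
    except RuntimeError as err:
        print(err)
```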
APA, Harvard, Vancouver, ISO, and other styles
2

Moatassem, Nawal N. "A Study of Migrating Biological Data from Relational Databases to NoSQL Databases." Youngstown State University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1442486094.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ibanez, Enric. "Reengineering Project: Database Optimization and Migration." Thesis, Linnéuniversitetet, Institutionen för datavetenskap, fysik och matematik, DFM, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-9168.

Full text
Abstract:
The purpose of this project is to help a growing company reform their static information system into a dynamic system compatible with growth. The solution consists of migrating the legacy system that they have in FileMaker to an open technology platform. To solve this specific problem, "patterns" have been used, and this project explains these general solution "patterns". We understand patterns as generic solutions to persistent design problems. General solutions will not only be useful for this specific problem, but will be useful in all kinds of similar migration projects as well. This thesis gives a detailed explanation of how to apply these patterns to the AEMI-specific problem and how they can be useful in the migration process. The solution of the problem, then, is to follow the advice of the "patterns" to achieve our goals; these goals are the requirements given by the company supervisor. After the migration process, a redesign process must be done in order to organise the information. This redesign consists of organising the migrated information as well as adding the new information in the correct place. This thesis focuses on the process of migrating from a legacy system to a MySQL system through the use of the generic solutions called "patterns". The final result is a MySQL database with all the old and new information together on a platform more adaptable to the company's scalability.
APA, Harvard, Vancouver, ISO, and other styles
4

Zkoumalová, Barbora. "Migrace systémové databáze elektronického obchodu." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2016. http://www.nusl.cz/ntk/nusl-234829.

Full text
Abstract:
The object of this master's thesis is the design and creation of a tool for migrating an e-commerce system database from the ZenCart platform to the PrestaShop platform. Both system databases will be described and analysed, and based on the information gained, the migration tool will be created according to the customer's requirements; the final data migration from the original to the new database will then be executed.
APA, Harvard, Vancouver, ISO, and other styles
5

Benitez, Javier. "Database schema migration in highly available services." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-290927.

Full text
Abstract:
Since the rise of DevOps, companies have been trying to provide rapid and agile cycles for the development of new products, reducing the time and cost of going from definition to production. However, whenever a MySQL system with a zero-downtime requirement is involved, the database deployment cycle is still manual, leaving it less evolved than other parts of the development cycle, especially when applying schema updates without affecting the availability of a MySQL system (online schema migrations) in Continuous Integration and Continuous Delivery pipelines. As a consequence, it delays the process. There is a strong belief that the affected technologies are ready to automate online schema migrations in production. In this thesis, we verify the viability and study the reliability of automating online schema migrations with a zero-downtime requirement. We achieved this by testing the most common errors in a setup without error detection mechanisms and then testing our proposed design. We conclude that it is viable and reliable to apply these techniques and strategies for providing Continuous Delivery. Our design prevents the most common errors when updating SQL schemas online, with some inherent implications and flaws in MySQL systems.
APA, Harvard, Vancouver, ISO, and other styles
6

Mendoza, Jayo Rubén G., Carlos Raymundo, Francisco Domínguez Mateos, and Rodríguez José María Alvarez. "Convergence model for the migration of a relational database to a NoSQL database." International Institute of Informatics and Systemics, IIIS, 2017. http://hdl.handle.net/10757/656362.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Asplund, Felicia. "Undersökning av migrationsmetoder för databaser : Jämförelse mellan Export/Import och ETL utifrån den mest lämpade metoden för att effektivisera en databas." Thesis, Mittuniversitetet, Institutionen för informationssystem och –teknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-39589.

Full text
Abstract:
Data migration means transferring data from one database to another. Companies may need to do this for various reasons, for example to change the language or to renew an existing database. A recurring question is how this process should take place and how data can be migrated in the most convenient way. One of the IT companies in need of answers is XLENT Sundsvall. XLENT has an online shop whose interface is outdated and which needs a better server solution from a management perspective. This study aims to review the possibilities of moving the existing website to a more modern e-commerce solution and to find the best way to migrate the data to a new database that is more suitable for the website. The methods compared are an export/import method and an Extract Transform Load (ETL) tool. The export/import method proved to be the most suitable process for a database with these properties, and a migration using the selected process was performed. The migration process also included data cleaning, an important step since the database contained redundant data. A comparison between the new and old databases showed that the cleaning was successful, as the redundant data was reduced by 24 percent. The export/import process was chosen because it best suited the characteristics of the existing database. Had the database been much larger or written in another SQL dialect, it might no longer have been the optimal solution; in that case the ETL tool would be preferable. For future work, it would be interesting to go beyond a theoretical comparison and try migrating databases with different properties in the different ways, to get a more comprehensive view of which method is best suited in each case.
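As a rough illustration of the export/import-style migration with data cleaning that this abstract describes, the sketch below copies rows from an old database to a new one and drops redundant duplicates along the way. The table, columns, and deduplication key are assumptions; the thesis compares this kind of approach against a full ETL tool.

```python
# Export/import sketch with a simple cleaning step; schema and key are hypothetical.
import sqlite3

def migrate_products(old_path: str, new_path: str) -> int:
    old = sqlite3.connect(old_path)
    new = sqlite3.connect(new_path)
    new.execute("CREATE TABLE IF NOT EXISTS products (sku TEXT PRIMARY KEY, name TEXT, price REAL)")
    seen, copied = set(), 0
    for sku, name, price in old.execute("SELECT sku, name, price FROM products"):
        if sku in seen:      # cleaning step: skip redundant duplicate rows
            continue
        seen.add(sku)
        new.execute("INSERT OR IGNORE INTO products VALUES (?, ?, ?)", (sku, name, price))
        copied += 1
    new.commit()
    return copied
```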
APA, Harvard, Vancouver, ISO, and other styles
8

Maatuk, Abdelsalam. "Migrating relational databases into object-based and XML databases." Thesis, Northumbria University, 2009. http://nrl.northumbria.ac.uk/3374/.

Full text
Abstract:
Rapid changes in information technology, the emergence of object-based and WWW applications, and the interest of organisations in securing benefits from new technologies have made information systems re-engineering in general and database migration in particular an active research area. In order to improve the functionality and performance of existing systems, the re-engineering process requires identifying and understanding all of the components of such systems. An underlying database is one of the most important components of information systems. A considerable body of data is stored in relational databases (RDBs), yet they have limitations in supporting the complex structures and user-defined data types provided by relatively recent databases such as object-based and XML databases. Instead of throwing away the large amount of data stored in RDBs, it is more appropriate to enrich and convert such data to be used by new systems. Most researchers into the migration of RDBs into object-based/XML databases have concentrated on schema translation, accessing and publishing RDB data using newer technology, while few have paid attention to the conversion of data and the preservation of data semantics, e.g., inheritance and integrity constraints. In addition, existing work does not appear to provide a solution for more than one target database. Thus, research on the migration of RDBs is not fully developed. We propose a solution that offers automatic migration of an RDB as a source into the recent database technologies as targets based on available standards such as ODMG 3.0, SQL4 and XML Schema. A canonical data model (CDM) is proposed to bridge the semantic gap between an RDB and the target databases. The CDM preserves and enhances the metadata of existing RDBs to fit in with the essential characteristics of the target databases. The adoption of standards is essential for increased portability, flexibility and constraints preservation. This thesis contributes a solution for migrating RDBs into object-based and XML databases. The solution takes an existing RDB as input, enriches its metadata representation with the required explicit semantics, and constructs an enhanced relational schema representation (RSR). Based on the RSR, a CDM is generated which is enriched with the RDB's constraints and data semantics that may not have been explicitly expressed in the RDB metadata. The CDM so obtained facilitates both schema translation and data conversion. We design sets of rules for translating the CDM into each of the three target schemas, and provide algorithms for converting RDB data into the target formats based on the CDM. A prototype of the solution has been implemented, which generates the three target databases. An experimental study has been conducted to evaluate the prototype. The experimental results show that the target schemas resulting from the prototype and those generated by existing manual mapping techniques were comparable. We have also shown that the source and target databases were equivalent, and demonstrated that the solution, conceptually and practically, is feasible, efficient and correct.
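One small, concrete piece of the data-conversion problem discussed in this abstract is serialising relational rows into an XML document. The sketch below shows only that step, with hypothetical table and element names; the canonical data model and the full schema-translation rules from the thesis are not reproduced here.

```python
# Minimal relational-rows-to-XML conversion; table and element names are hypothetical.
import sqlite3
import xml.etree.ElementTree as ET

def table_to_xml(conn: sqlite3.Connection, table: str) -> ET.Element:
    root = ET.Element(table)
    cursor = conn.execute(f"SELECT * FROM {table}")
    columns = [d[0] for d in cursor.description]
    for row in cursor:
        record = ET.SubElement(root, "row")
        for col, value in zip(columns, row):
            ET.SubElement(record, col).text = "" if value is None else str(value)
    return root

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE dept (id INTEGER, name TEXT)")
    db.execute("INSERT INTO dept VALUES (1, 'Research')")
    print(ET.tostring(table_to_xml(db, "dept"), encoding="unicode"))
```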
APA, Harvard, Vancouver, ISO, and other styles
9

Ellison, Martyn. "Evaluating cloud migration options for relational databases." Thesis, University of York, 2017. http://etheses.whiterose.ac.uk/20206/.

Full text
Abstract:
Migrating the database layer remains a key challenge when moving a software system to a new cloud provider. The database is often very large, poorly documented, and used to store business-critical information. Most cloud providers offer a variety of services for hosting databases and the most suitable choice depends on the database size, workload, performance requirements, cost, and future business plans. Current approaches do not support this decision-making process, leading to errors and inaccurate comparisons between database migration options. The heterogeneity of databases and clouds means organisations often have to develop their own ad-hoc process to compare the suitability of cloud services for their system. This is time consuming, error prone, and costly. This thesis contributes to addressing these issues by introducing a three-phase methodology for evaluating cloud database migration options. The first phase defines the planning activities, such as, considering downtime tolerance, existing infrastructure, and information sources. The second phase is a novel method for modelling the structure and the workload of the database being migrated. This addresses database heterogeneity by using a multi-dialect SQL grammar and annotated text-to-model transformations. The final phase consumes the models from the second and uses discrete-event simulation to predict migration cost, data transfer duration, and cloud running costs. This involved the extension of the existing CloudSim framework to simulate the data transfer to a new cloud database. An extensive evaluation was performed to assess the effectiveness of each phase of the methodology and of the tools developed to automate their main steps. The modelling phase was applied to 15 real-world systems, and compared to the leading approach there was a substantial improvement in: performance, model completeness, extensibility, and SQL support. The complete methodology was applied to four migrations of two real-world systems. The results from this showed that the methodology provided significantly improved accuracy over existing approaches.
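For a sense of the kind of estimate the simulation phase above produces, the back-of-the-envelope sketch below predicts a transfer duration and a running cost for a candidate cloud database service. The figures and the closed-form formula are made-up assumptions; the thesis uses a CloudSim-based discrete-event simulation rather than this calculation.

```python
# Toy estimate of migration duration and running cost; all numbers are hypothetical.
def estimate_migration(db_size_gb: float, bandwidth_mbps: float,
                       hourly_rate_usd: float, months: int = 12) -> dict:
    transfer_hours = (db_size_gb * 8 * 1024) / (bandwidth_mbps * 3600)  # GB -> megabits / link rate
    return {
        "transfer_hours": round(transfer_hours, 2),
        "running_cost_usd": round(hourly_rate_usd * 24 * 30 * months, 2),
    }

print(estimate_migration(db_size_gb=250, bandwidth_mbps=100, hourly_rate_usd=0.35))
```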
APA, Harvard, Vancouver, ISO, and other styles
10

Ericsson, Joakim. "Object Migration in a Distributed, Heterogeneous SQL Database Network." Thesis, Linköpings universitet, Databas och informationsteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-148181.

Full text
Abstract:
There are many different database management systems (DBMSs) on the market today. They all have different strengths and weaknesses. What if all of these different DBMSs could be used together in a heterogeneous network? The purpose of this thesis is to explore ways of connecting the many different DBMSs together. This thesis will explore suitable architectures, features, and performance of such a network. This is all done in the context of Ericsson's wireless communication network. This has not been done in this context before, and a big part of the thesis is exploring whether it is even possible. The result of this thesis shows that it is not possible to find a solution that fulfils the requirements of such a network in this context.
APA, Harvard, Vancouver, ISO, and other styles
11

PHIPPS, CASSANDRA J. "Migrating an Operational Database Schema to Data Warehouse Schemas." University of Cincinnati / OhioLINK, 2002. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1019667418.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Pulgatti, Leandro Duarte. "Data migration between different data models of NOSQL databases." reponame:Repositório Institucional da UFPR, 2017. http://hdl.handle.net/1884/49087.

Full text
Abstract:
Advisor: Marcos Didonet Del Fabro. Master's dissertation, Universidade Federal do Paraná, Setor de Ciências Exatas, Programa de Pós-Graduação em Informática; defended in Curitiba, 17/02/2017. Includes references (f. 76-79). Since their origin, NoSQL databases have achieved widespread use. Due to the lack of development standards for this new technology, great challenges emerge. There are heterogeneous data models, access languages and frameworks available, which makes data migration even more complex. Most of the solutions available today focus on providing an abstract and generic representation for all data models. These solutions focus on designing adapters to access the data homogeneously, but not on specifically implementing transformations between them. These approaches often need a framework to access the data, which may prevent their use in some scenarios. Among these challenges, data migration between the various solutions has proved particularly difficult. This dissertation proposes the creation of a metamodel and a series of rules capable of assisting in the data migration task. The data can be converted to various desired formats through an intermediate state. To validate the solution, several tests were performed with different systems using available real data. Keywords: NoSQL databases. Metamodel. Data migration.
APA, Harvard, Vancouver, ISO, and other styles
13

Leal, Fábio de Sousa. "SLA-Based Guidelines for Database Transitioning." Universidade Federal do Rio Grande do Norte, 2016. http://repositorio.ufrn.br/handle/123456789/21027.

Full text
Abstract:
Component-based Software Engineering (CBSE) and Service-Oriented Architecture (SOA) have become popular ways to develop software over the last years. During the life-cycle of a software system, several components and services can be developed, evolved and replaced. In production environments, the replacement of core components, such as databases, is often a risky and delicate operation, where several factors and stakeholders should be considered. A Service Level Agreement (SLA), according to the official ITIL v3 glossary, is "an agreement between an IT service provider and a customer. The agreement consists of a set of measurable constraints that a service provider must guarantee to its customers." In practical terms, an SLA is a document that a service provider delivers to its consumers with minimum quality of service (QoS) metrics. This work is intended to assess and improve the use of SLAs to guide the transitioning process of databases in production environments. In particular, we propose SLA-based guidelines and a process to support migrations from a relational database management system (RDBMS) to a NoSQL one. Our study is validated by case studies.
APA, Harvard, Vancouver, ISO, and other styles
14

Lorenzen, Knut. "Migrating a mobile application towards a distributed database for simplified synchronisation." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-211839.

Full text
Abstract:
As mobile applications are often dependent on cloud services and connected through unreliable radio networks, application developers may find themselves implementing custom caching and synchronisation algorithms if the application is to operate flawlessly while offline. Relational databases have been the predominant architecture for persistent storage for a long time. With the emergence of the real-time web, distributed schema-free databases known as NoSQL have gained widespread adoption in recent years. This thesis evaluates the benefits of a distributed, document-oriented database over a relational database for a mobile business application that needs to provide offline usage. A prototype of an existing building inspection application using an embedded NoSQL database has been developed for this purpose. While the NoSQL database provides built-in replication capabilities for the mobile application, it is clearly limited compared to SQL when it comes to modelling highly structured data.
APA, Harvard, Vancouver, ISO, and other styles
15

Fomby, Paula. "Starting points : households of origin and Mexico-U.S. migration /." U. of Wisconsin - Madison, 2001. http://www.library.wisc.edu/databases/connect/dissertations.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Strid, Erik, and Robin Wallin. "Migration av Excelbaserat system till webbsystem : Implementation av ett planeringsverktyg inom telekom." Thesis, Linköping University, Department of Computer and Information Science, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-54794.

Full text
Abstract:
The purpose of this thesis is to describe how a system based on Excel can be migrated to a web and database platform. The work was initiated by a department at Ericsson AB in Linköping. The department uses Excel for data storage and for the presentation of test environment descriptions, and wished to migrate this Excel-based system to a web-based one. Some of the problems that arise during such a migration are: new security risks; how data should be organised in the database to achieve searchability and version control; and how the presentation of data should be controlled. To answer these questions, we carried out a pre-study and an implementation of the system Node Network Plan Database (NNP DB), which is intended to replace the Excel-based system. The report describes how security risks have been minimised by introducing suitable safeguards, how the database has been designed to achieve searchability and version control, and how the presentation of data can be controlled. NNP DB turns out to be more efficient, from a time perspective, than the Excel-based system. Other results of the migration are discussed and presented in the concluding parts of the report.
APA, Harvard, Vancouver, ISO, and other styles
17

Papež, Zdeněk. "Datová integrace mezi databázovými systémy." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2010. http://www.nusl.cz/ntk/nusl-235552.

Full text
Abstract:
This master's thesis deals with data integration used for data transfer between various database systems in both directions - data migration and replication. We become familiar with the technologies of distributed databases. The system of health care providers is described and analysed in detail, and the particular tables involved in its data integration are explored. For the project execution, a proposal for the integration of this system is created, and the subsequent implementation is described.
APA, Harvard, Vancouver, ISO, and other styles
18

Macias, Miguel S. "The returns to human capital migration within the Department of Defense civilian internal labor market." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2005. http://library.nps.navy.mil/uhtbin/hyperion/05Sep%5FMacias.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Slaughter, Aaron Tory. "A relational database model and data migration plan for the student services department at the Marine Corps Institute." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1997. http://handle.dtic.mil/100.2/ADA361583.

Full text
Abstract:
Thesis (M.S. in Information Technology Management), Naval Postgraduate School, September 1997. Thesis advisor(s): Kamel, Magdi N. Appendices H, I, and P are located in the back pocket of this thesis. Includes bibliographical references (p. 103-104). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
20

Wejros, Albin. "Strategier för migration av klassiska servermiljöer till containermiljöer." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-99605.

Full text
Abstract:
Companies interested in moving from a traditional approach to deploying their application environment to a more modern container-based environment often lack the knowledge and experience needed. This work aims to give companies suggestions on how to proceed when moving their current traditional infrastructure solution, whether it consists of physical or virtual servers, to a container-based solution. The project also aims to give companies interested in such a migration a theoretical foundation to stand on, in order to avoid the pitfalls that are common to fall into. The result is based on a literature study, which in turn resulted in a summary of what should be included in a migration strategy as well as a number of practically applicable decision trees.
APA, Harvard, Vancouver, ISO, and other styles
21

Miiro, Fabian, and Mikael Nääs. "SQL and NoSQL databases : A CASE STUDY IN THE AZURE CLOUD." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-186398.

Full text
Abstract:
This paper compares Azure SQL Database to the Azure NoSQL solution DocumentDB. The case study is done in the scope of a turn-based multiplayer game created by Warmkitten. NoSQL is a newer type of database which, in contrast to the relational kind we are used to, promises increased performance and unlimited scalability; it is marketed as created for the problems of today and not bound by yesterday's limitations. Warmkitten would like to gain insight into whether the promises of NoSQL hold for the game they are developing, or if the current SQL solution fits their needs. They are most interested in the areas of scaling, performance, and .NET interoperability. The comparison was carried out by creating a test suite exercising the game's functionality. Both the SQL and NoSQL solutions were then optimised based on best practices and expert guidelines, and the tests were run under circumstances mimicking real-world scenarios. The paper shows that NoSQL is a valid replacement for SQL if enough time and thought is put into the implementation and NoSQL is a good fit for the problem at hand. Performance-wise, NoSQL is faster under normal load when there are periods with less load, while SQL is better for applications under continuously heavy load. The final verdict of this paper is a recommendation for Warmkitten to continue using SQL for their game, because the recorded performance improvements do not outweigh the transactional problems seen in this case study, which are created by the transactional nature of the game.
APA, Harvard, Vancouver, ISO, and other styles
22

Thulemark, Maria. "Moved by the mountains : migration into tourism dominated rural areas." Doctoral thesis, Örebro universitet, Institutionen för humaniora, utbildnings- och samhällsvetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-43914.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Starý, Jan. "Migrace databáze do prostředí Oracle Exadata." Master's thesis, Vysoká škola ekonomická v Praze, 2013. http://www.nusl.cz/ntk/nusl-191988.

Full text
Abstract:
This thesis concerns the question of data migration, specifically data migration into the Oracle Exadata database environment. A typical data migration project has been designed, which is then elaborated into a data migration operating guideline. The suggested process is then tested on a real database migration. Part of this work is also a detailed description of the Oracle Exadata technological solution, including an evaluation of its benefits and limitations in practical usage. The information and feedback necessary to accomplish these goals were gained by interviewing specialists in different positions throughout the organisation who have real experience with the system. This work can also serve as a valuable source of information for projects dealing with data migration, mainly data migration in the Oracle Exadata environment. Potential future users may also find this work helpful when considering a purchase of the Oracle Exadata system package.
APA, Harvard, Vancouver, ISO, and other styles
24

Claußnitzer, Ralf. "Effiziente Schemamigration in der modellgetriebenen Datenbankanwendungsentwicklung." Master's thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1212156557862-16452.

Full text
Abstract:
The term MDA (Model Driven Architecture) refers to a method of specifying applications within the framework of UML and producing executable program code through automatic generation. In this context, the database chair runs the GignoMDA project, which deals with the model-driven development of database applications. As an essential part of the respective application, however, data models, just like the application architecture itself, are subject to adaptation to changed objectives and environmental conditions. This gives rise to the need to transfer existing data into newly generated target systems as part of a fully model-driven approach. This thesis presents a concept for schema and data migration during the further development of application database models. Following the MDA approach, data migrations are expressed as models in UML and subsequently used for the automatic generation of platform-dependent migration models. From these migration models, database-technology-based programs (ETL, stored procedures) can be generated for the efficient execution of migrations.
APA, Harvard, Vancouver, ISO, and other styles
25

Alves, Miguel Bispo. "Integrated data model and DSL modifications." Master's thesis, 2013. http://hdl.handle.net/10362/14395.

Full text
Abstract:
Companies are increasingly dependent on distributed web-based software systems to support their businesses. This increases the need to maintain and extend software systems with up-to-date new features. Thus, the development process to introduce new features usually needs to be swift and agile, and the supporting software evolution process needs to be safe, fast, and efficient. However, this is usually a difficult and challenging task for a developer due to the lack of support offered by programming environments, frameworks, and database management systems. Changes needed at the code level, in the database model, and in the actual data contained in the database must be planned and developed together and executed in a synchronised way. Even under a careful development discipline, the impact of changing an application data model is hard to predict. The lifetime of an application comprises changes and updates designed and tested using data which is usually far from the real production data. So, coding DDL and DML SQL scripts to update database schema and data is the usual (and hard) approach taken by developers. Such a manual approach is error-prone and disconnected from the real data in production, because developers may not know the exact impact of their changes. This work aims to improve the maintenance process in the context of the Agile Platform by Outsystems. Our goal is to design and implement new data-model evolution features that ensure safe support for change and a sound migration process. Our solution includes impact analysis mechanisms targeting the data model and the data itself. This provides developers with a safe, simple, and guided evolution process.
APA, Harvard, Vancouver, ISO, and other styles
26

Huang, Chi-Ming, and 黃吉民. "Data Migration from Relational Database to Cloud Database." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/17579917130973306775.

Full text
Abstract:
Master's thesis, Shih Hsin University, Graduate Institute of Information Management. For enterprise applications that must provide 24-hour service, upgrading and expanding a traditional relational database is painful and often requires downtime for maintenance. As the amount of data and the number of visits gradually increase, manually adding machines or partitioning data across different machines becomes increasingly difficult and labour costs grow. This motivated the NoSQL concept: increasing the degree of automation of data storage, reducing human intervention, and balancing load more evenly, in order to solve the problems faced by traditional relational databases. How to migrate data from a traditional relational database to a cloud database is therefore the issue addressed in this thesis. We propose an effective method for migrating data from a traditional relational database to a cloud database, using MongoDB as an example. A messaging system is used as the research object to confirm the data model architecture, construct the data model, and complete the migration. Experiments comparing query, update, and modification efficiency between the traditional relational database and the cloud database confirm that the cloud database outperforms the traditional relational database.
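A hedged sketch of the relational-to-document migration this abstract refers to: each row becomes a MongoDB document, with child rows embedded as sub-documents. The connection string, table names, and messaging-system schema are assumptions, and the third-party pymongo driver must be installed; the thesis's own data-model mapping is not reproduced here.

```python
# Rows-to-documents sketch; schema, URI, and collection names are hypothetical.
import sqlite3
from pymongo import MongoClient

def migrate_messages(sqlite_path: str, mongo_uri: str) -> int:
    rdb = sqlite3.connect(sqlite_path)
    rdb.row_factory = sqlite3.Row
    collection = MongoClient(mongo_uri)["app"]["messages"]
    docs = []
    for msg in rdb.execute("SELECT id, sender, body, sent_at FROM messages"):
        doc = dict(msg)
        # Denormalise the 1:N relation into an embedded array of sub-documents.
        doc["recipients"] = [dict(r) for r in rdb.execute(
            "SELECT user_id, read_at FROM message_recipients WHERE message_id = ?", (msg["id"],))]
        docs.append(doc)
    if docs:
        collection.insert_many(docs)
    return len(docs)
```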
APA, Harvard, Vancouver, ISO, and other styles
27

Alves, Fernando Jorge Coelho Barreira Calheiros. "Tool for Incremental Database Migration." Dissertação, 2021. https://hdl.handle.net/10216/135480.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Alves, Fernando Jorge Coelho Barreira Calheiros. "Tool for Incremental Database Migration." Master's thesis, 2021. https://hdl.handle.net/10216/135480.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Chen, Chun-ting, and 陳俊廷. "A study of data migration from relational database to cloud database." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/94574912437002841166.

Full text
Abstract:
Master's thesis, Feng Chia University, Graduate Institute of Industrial Engineering and Systems Management. Cloud computing is the delivery of computing and storage capacity as a service to a community of end-users. In the Internet era of information explosion, applications frequently encounter network traffic problems, and most enterprises must install expensive servers and extend their network bandwidth. Cloud computing has thus become one of the feasible solutions, but the database schema differs greatly between relational databases and cloud databases, and how to migrate data from a traditional relational database to the cloud is a key problem. This thesis proposes an effective and feasible approach to migrate data from an RDBMS to a cloud database, with a case study of migration from MySQL to Apache HBase, considering how to preserve the relational tables and their attributes during migration. HBase is column-oriented and does not support SQL commands, so data cannot be retrieved from related tables using join operations. We designed a simple RDBMS application to generate a large amount of data and to simulate the database migration. In the experiments, we verify the integrity and correctness of the migrated data using the Hive and Sqoop tools. At the same time, we compare query performance between the relational database and the cloud database; the results show that the cloud database performs better than the traditional relational database. We therefore confirm that the proposed approach is effective and feasible, and that it can assist enterprises in migrating legacy database systems to cloud databases.
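As a hedged illustration of the column-oriented mapping this abstract discusses, the sketch below writes relational rows into an HBase table keyed by the primary key, with each attribute stored under a column family. Host, table, and column-family names are assumptions, and it uses the third-party happybase client rather than the Hive/Sqoop pipeline used in the thesis.

```python
# Relational-to-HBase sketch via happybase; names and schema are hypothetical.
import sqlite3
import happybase

def migrate_orders(sqlite_path: str, hbase_host: str) -> None:
    rdb = sqlite3.connect(sqlite_path)
    rdb.row_factory = sqlite3.Row
    table = happybase.Connection(hbase_host).table("orders")
    for row in rdb.execute("SELECT order_id, customer, total FROM orders"):
        # Row key = primary key; every attribute goes into the 'd' column family.
        table.put(str(row["order_id"]).encode(), {
            b"d:customer": str(row["customer"]).encode(),
            b"d:total": str(row["total"]).encode(),
        })
```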
APA, Harvard, Vancouver, ISO, and other styles
30

Yang, Chun-Chung, and 楊濬仲. "Migrating Relational Database Applications to HBase." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/16615508573010092444.

Full text
Abstract:
Master's thesis, National Chiao Tung University, Institute of Computer Science and Engineering. Distributed systems play an important role in large-scale data processing such as data mining, web data processing, and even scientific computing, and applying them to data processing can lead to impressive results. Traditional relational databases provide convenient commands for handling data, which allows programmers to ignore the complexity of data storage and processing when developing a new application. However, a traditional relational database system becomes costly when dealing with large-scale data, and this sometimes makes it hard for a company to decide whether to expand to a new database system. HBase is a distributed storage system that manages structured data with a column-based storage method. Compared to a traditional relational database system, HBase can scale data storage to very large sizes using a distributed, persistent, multidimensional sorted map indexed by a row key, column key, and timestamp. In this paper, we want to apply HBase to the functionality of a relational database system. However, the original database application contains legacy code that takes time to rewrite. To solve this problem, we propose an architecture that lets programmers replace their relational database step by step. In our study, we also provide a solution for some relational operations in HBase. In our experiments, the new architecture can handle large-scale data and shows good processing performance.
APA, Harvard, Vancouver, ISO, and other styles
31

Pi, Shao Kan, and 畢劭康. "Deterministic Crabbing - a non-transactional-blocking database live migration technique for distributed database systems." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/j28t2v.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Cheng, Wei Chung, and 鄭惟中. "Decision Analysis of Database Migration on Cloud Using AHP." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/74858224774321194960.

Full text
Abstract:
Master's thesis, Huafan University, Master's Program in Information Management. This study investigates the risks enterprises face when adopting cloud database services; these uncertain sources of risk can have a tremendous impact on enterprises and their results, so a decision analysis is needed to help companies evaluate them. Through a literature review and interviews with experts, scholars, corporate executives, and users, a decision analysis model for cloud database services was compiled, consisting of five dimensions - organisation, management, user involvement and training, technology planning, and market maturity - each with three to five evaluation criteria, for a total of 22 assessment criteria. The Delphi method and the Analytic Hierarchy Process were then used to analyse how different roles prioritise the choice of a cloud database service. According to the analysis of the collected data, the results provide a reference for improvement for cloud database service providers across the five dimensions of the evaluation framework; management and marketing are the most important dimensions to consider when building a cloud database system, followed by user participation and training, technology planning, and organisation. When planning a cloud database construction strategy, enterprises are advised to give special consideration to market and management factors. Keywords: database, cloud computing, Analytic Hierarchy Process, Delphi.
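The core AHP step behind the decision model summarised above is deriving priority weights from a pairwise-comparison matrix, commonly via its principal eigenvector. Below is a small worked sketch with numpy; the example matrix, the three criteria, and their judgements are hypothetical, not values taken from the thesis.

```python
# AHP priority weights from a pairwise-comparison matrix (principal eigenvector method).
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    values, vectors = np.linalg.eig(pairwise)
    principal = np.argmax(values.real)   # largest eigenvalue of a positive matrix
    w = np.abs(vectors[:, principal].real)
    return w / w.sum()                   # normalise so the weights sum to 1

# Hypothetical judgements on Saaty's 1-9 scale for three criteria,
# e.g. management vs. market maturity vs. technology planning.
matrix = np.array([[1.0, 3.0, 5.0],
                   [1/3, 1.0, 2.0],
                   [1/5, 1/2, 1.0]])
print(ahp_weights(matrix))
```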
APA, Harvard, Vancouver, ISO, and other styles
33

Yang, Hung-Yi, and 楊弘義. "Database Migration Analysis - From Oracle to Open Source PostgreSQL." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/2b739y.

Full text
Abstract:
Master's thesis, National Changhua University of Education, Department of Information Management. In the past, many enterprises used commercial database management systems. Today, with the rapid development of the Internet and community websites, all kinds of open source databases are widely used in different areas. Commercial database software has become increasingly expensive, so enterprises that can use an open source database are able to reduce maintenance costs. Because databases differ structurally, migrating between different databases is a complex process. This paper therefore studies how to migrate from the popular Oracle database to the PostgreSQL database. It first constructs the migration steps, uses the Oracle 10g HR sample schema as the migration case, carries out the migration process, and verifies the correctness of the transferred data. Finally, the performance of the INSERT statement, the COPY statement, and the pg_bulkload tool is measured using a large amount of data. The experimental results show that the pg_bulkload tool performs the fastest. We therefore confirm that the proposed approach is effective and can assist enterprises in migrating an Oracle database to a PostgreSQL database.
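The performance gap measured above comes largely from how rows reach PostgreSQL: one INSERT per row versus a streamed COPY (pg_bulkload goes further by bypassing parts of the normal write path). Below is a hedged sketch of the first two paths using psycopg2; the connection, table, and columns are assumptions, and pg_bulkload itself is an external tool not shown here.

```python
# Row-by-row INSERT vs. COPY FROM STDIN with psycopg2; table and columns are hypothetical.
import io
import psycopg2

def load_with_inserts(conn, rows):
    with conn.cursor() as cur:
        cur.executemany("INSERT INTO employees (id, name) VALUES (%s, %s)", rows)
    conn.commit()

def load_with_copy(conn, rows):
    buf = io.StringIO("".join(f"{i}\t{name}\n" for i, name in rows))
    with conn.cursor() as cur:
        cur.copy_expert("COPY employees (id, name) FROM STDIN", buf)
    conn.commit()

# conn = psycopg2.connect(...)  # connection parameters are site-specific assumptions
```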
APA, Harvard, Vancouver, ISO, and other styles
34

Huang, Mei-Chen, and 黃美甄. "The Analysis of Customer Value Migration Path in Database Marketing." Thesis, 2003. http://ndltd.ncl.edu.tw/handle/26342400066062650673.

Full text
Abstract:
Master's thesis, National Taiwan University, Graduate Institute of International Business. Database marketing is a useful method for implementing relationship marketing and one-to-one marketing. Because of changes in the overall marketing environment, businesses must adapt to customer heterogeneity. A company not only wants to sustain relationships with its customers but also attempts to find its most profitable customers. In this thesis, we use a credit card database and consider customer heterogeneity and dynamics to analyse customer value and customer migration paths. We apply Markov chains, hierarchical Bayes, and brand choice theory to develop three models: a Bayes multinomial model, a hierarchical Bayes probit model, and a hierarchical Bayes logit model. Through the data analysis, we can understand customer value and behaviour more efficiently and execute marketing strategy and customer relationship management more precisely.
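The Markov-chain building block used in this line of work can be illustrated numerically: estimate a customer's state-transition matrix from an observed sequence of purchase states, then read off the probability of moving between value states. The states and the example sequence below are hypothetical; the hierarchical Bayes probit/logit estimation from the thesis is not shown.

```python
# Estimate a purchase-state transition matrix from one customer's observed history.
import numpy as np

def transition_matrix(sequence, n_states):
    counts = np.zeros((n_states, n_states))
    for current, nxt in zip(sequence[:-1], sequence[1:]):
        counts[current, nxt] += 1
    counts += 1e-9                                    # avoid division by zero for unseen states
    return counts / counts.sum(axis=1, keepdims=True)

# States: 0 = inactive, 1 = occasional buyer, 2 = heavy buyer (monthly observations).
history = [0, 1, 1, 2, 2, 1, 0, 1, 2, 2]
P = transition_matrix(history, n_states=3)
print(P)   # row i gives P(next state | current state i)
```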
APA, Harvard, Vancouver, ISO, and other styles
35

Cabral, Andreia Filipa Gonçalves. "Data Profiling in Cloud Migration: Data Quality Measures while Migrating Data from a Data Warehouse to the Google Cloud Platform." Master's thesis, 2021. http://hdl.handle.net/10362/117609.

Full text
Abstract:
Internship report presented as a partial requirement for obtaining a Master's degree in Data Science and Advanced Analytics. Nowadays, corporations have gained a vast interest in data. More and more, companies have realised that the key to improving their efficiency and effectiveness, and to understanding their customers' needs and preferences better, is within reach by mining data. However, as the amount of data grows, so do companies' needs for storage capacity and for ensuring data quality for more accurate insights. As such, new data storage methods must be considered, evolving from old ones while keeping data integrity. Migrating a company's data from an old platform like a Data Warehouse to a new one, the Google Cloud Platform, is an elaborate task, even more so when data quality needs to be assured and sensitive data, such as Personal Identifiable Information, needs to be anonymised in a cloud computing environment. To ensure these points, profiling data, before or after it is migrated, has significant value: a profile is designed for the data available in each data source (e.g., databases, files, and others) based on statistics, metadata information, and pattern rules. This ensures that data quality is within reasonable standards through statistical metrics, and that all Personal Identifiable Information is identified and anonymised accordingly. This work reflects the process of how profiling Data Warehouse data can improve data quality for a better migration to the Cloud.
APA, Harvard, Vancouver, ISO, and other styles
36

Lai, Li-Ting, and 賴俐婷. "The analysis of Customer Migration Path in Credit Card Database Marketing." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/11922298419523257592.

Full text
Abstract:
Master's thesis, National Taiwan University, Graduate Institute of International Business. How an enterprise maintains relationships with its customers and fully understands what they really want has become a pressing question. The investment of marketing resources and the list of key customers determine the profits of the enterprise and the results of its management. Database marketing is a requisite and efficient weapon for firms to put relationship marketing into practice. The task in this study is to evaluate individual customer value through proper analyses of customers' purchase data. Enterprises need to be able to capture the migration patterns of a customer's value in advance, in order to explore potential customers and prevent customers from becoming inactive. A Markov chain model and a hierarchical Bayesian probit approach are applied to construct individual customers' purchasing transition probabilities in every state. With the transition matrix and the defined customer migration pathways, we can calculate customer correspondence. With this information, enterprises can identify key customers and allocate marketing resources more efficiently.
APA, Harvard, Vancouver, ISO, and other styles
37

Tang, Ding-Yao, and 唐定堯. "The Design and Implementation of XML based Database Data Migration System." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/jz7dcw.

Full text
Abstract:
Master's thesis, National Taipei University of Technology, Department of Computer Science and Information Engineering. The purpose of this research is to provide an XML-based database data migration system that simplifies data migration and helps database operators easily migrate from a previous system to an optimised database system. Improper database design causes low database performance; moreover, as the amount of data grows, the management issues involved become more complicated and the task of remaining cost-effective becomes increasingly challenging. The objective of the research is to develop a better database data migration framework for database optimisation. Furthermore, a centralised XML-based database data migration system was developed as a solution: a web application based on XML technologies that enables seamless integration. From the results of this research, we look forward to building a comprehensive database migration system and a platform for database management, and we believe the combination of these two can substantially improve information appliance quality and achieve the goal of cooperation between business and academia.
APA, Harvard, Vancouver, ISO, and other styles
38

Yang, Hsiu-Wei, and 楊修維. "Customer Value and Migration Analysis ─ An Example of Shopping Website Database." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/jpv962.

Full text
Abstract:
碩士<br>國立臺灣大學<br>國際企業學研究所<br>100<br>In this era of information, competition is fiercer and customer needs are more complex, so enterprises need to build competitive advantage carefully. It is important to use customer information efficiently to manage client relationships and to implement marketing strategy effectively. The database is the core resource of the shopping-website industry, so marketing staff devote themselves to extracting valuable information from historical transaction records. Database marketing techniques that integrate marketing knowledge with statistical methodology are useful for distinguishing the different needs of each customer. This thesis uses an activity index and a reliability index to analyze customer value. In addition, we use a hierarchical Bayesian methodology to correct the bias arising from the shortage of individual transaction records and to take customer heterogeneity into account. Migration analysis can further assist in predicting customer behavior and adopting dynamic marketing strategies. The final results of this thesis provide a systematic customer-research framework that helps shopping-website enterprises allocate marketing resources to beneficial customers and strengthen customer relationship management.
APA, Harvard, Vancouver, ISO, and other styles
39

Peng, Ming-Hsiu, and 彭銘修. "Customer Value and Migration Analysis–Take Database of Credit Card as example." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/mr2pur.

Full text
Abstract:
碩士<br>國立臺灣大學<br>國際企業學研究所<br>100<br>The main purpose of this research, which considers consumer dynamics and consumer heterogeneity using database marketing theory and statistical methods, is to recognize customer value based on a domestic bank's database of credit card customers. In the first part of this research, we analyze customer value with indices such as inter-purchase time, a reliability index, and an activity index constructed from the RFM model, a hierarchical Bayesian model, and maximum likelihood estimates. Second, we build each customer's Markov chain transition probability matrix to predict the customer's purchase state, and the average hit rate reaches 54%. Further, we define six meaningful paths and examine whether there are correlations between the transition paths and demographic variables. The contribution of this study is to help enterprises verify customer value, identify customers' transition paths, develop follow-up marketing strategies, and build long-term customer relationships.
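As a simple illustration of the RFM-style indices mentioned above, the following sketch computes recency, frequency, and monetary values from a toy transaction log. The fields, dates, and aggregation are assumptions for illustration only and are not the thesis's actual index definitions.

```python
from datetime import date

# Hypothetical credit card transactions: (customer_id, transaction_date, amount).
transactions = [
    (1, date(2011, 5, 1), 1200), (1, date(2011, 6, 20), 300),
    (2, date(2011, 1, 15), 80), (2, date(2011, 2, 3), 60), (2, date(2011, 2, 28), 40),
]
today = date(2011, 7, 1)  # assumed analysis date

# Aggregate last purchase date, purchase count, and total spend per customer.
rfm = {}
for cust, tx_date, amount in transactions:
    rec = rfm.setdefault(cust, {"last": tx_date, "frequency": 0, "monetary": 0})
    rec["last"] = max(rec["last"], tx_date)
    rec["frequency"] += 1
    rec["monetary"] += amount

for cust, rec in rfm.items():
    recency_days = (today - rec["last"]).days
    print(f"customer {cust}: recency={recency_days}d, "
          f"frequency={rec['frequency']}, monetary={rec['monetary']}")
```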
APA, Harvard, Vancouver, ISO, and other styles
40

Chen, Yudy, and 陳春瑞. "Simplifying Data Migration from Relational Database Management System to Google App Engine Datastore." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/26387994857379543693.

Full text
Abstract:
碩士<br>國立東華大學<br>資訊工程學系<br>101<br>Since cloud computing was introduced, it has become more and more popular. Cloud computing, which offers resources as services, has become one of the next-generation computing technologies. The first thing to do before migrating an application is to migrate its data. The Google App Engine (GAE) Datastore provides NoSQL data storage. To migrate data to the Datastore, we must prepare a configuration file containing the table schema and a CSV or XML file containing the data. Migrating data from a relational database management system (RDBMS) to Google App Engine can be overwhelming when there are many tables. In this research, we present a method for simplifying data migration from an RDBMS to GAE, including blob data migration. Our method leverages AppCfg to provide a convenient way to migrate data and, as a result, eliminates at least 75% of the user's data-migration effort.
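The thesis's tool automates the preparation described above; as a rough, hypothetical sketch of the manual step it replaces, the code below dumps a table of a relational source to a CSV file that a bulk loader (such as the one driven by AppCfg) could then ingest. The table name, contents, and in-memory SQLite source are assumptions, and the loader configuration that maps columns to Datastore entity properties is not shown.

```python
import csv
import sqlite3

# Assumed relational source, standing in for the production RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE book (id INTEGER PRIMARY KEY, title TEXT, pages INTEGER)")
conn.executemany("INSERT INTO book VALUES (?, ?, ?)",
                 [(1, "Intro to NoSQL", 320), (2, "Cloud Basics", 210)])

def export_table_to_csv(connection, table, path):
    """Write all rows of `table` (with a header row) to a CSV file."""
    cursor = connection.execute(f"SELECT * FROM {table}")
    columns = [d[0] for d in cursor.description]
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.writer(fh)
        writer.writerow(columns)
        writer.writerows(cursor.fetchall())

# One CSV per table; a separate loader configuration would map these columns
# to Datastore entity properties (that mapping is omitted here).
export_table_to_csv(conn, "book", "book.csv")
with open("book.csv", encoding="utf-8") as fh:
    print(fh.read())
```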
APA, Harvard, Vancouver, ISO, and other styles
41

Maghfirah, Intan, and 馬英妲. "Constraints Preserving in Schema Transformation to Enhance Database Migration from MySQL to HBase." Thesis, 2014. http://ndltd.ncl.edu.tw/handle/78676917365369695343.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Phan, Diep Ngoc. "The determinants and impacts of internal migration : the case of Vietnam /." 2008. http://www.library.wisc.edu/databases/connect/dissertations.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Woo, Seokjin. "Three essays on elderly migration and local fiscal policy /." 2006. http://www.library.wisc.edu/databases/connect/dissertations.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Siao, Chun-Chang, and 蕭峻昌. "Impulse Buying and Its Migration Path Analysis–Take Database of Credit Card as example." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/w336ry.

Full text
Abstract:
碩士<br>國立臺灣大學<br>國際企業學研究所<br>103<br>The main idea of this study is to estimate impulse-buying attitudes and migration behavior with database marketing and statistical methods, based on a domestic bank's database of credit card customers. First, we establish each customer's transition probability matrix with a Markov chain and a hierarchical Bayesian model to predict their future states, reaching an average hit rate of 66.5%. Second, we define six meaningful migration paths and analyze the relation between the transition paths and demographic variables. Through this analysis, we can understand impulse-buying behavior and help the bank plan its marketing strategy and evaluate the value of its customers.
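As a small illustration of how a hit rate like the 66.5% reported above can be evaluated, the sketch below picks the most probable next state from a transition matrix and compares it with the observed state in a hold-out period. The matrix, state labels, and observations are invented for illustration.

```python
# Hypothetical transition matrix estimated on the training period.
transition_matrix = {
    "impulsive": {"impulsive": 0.6, "planned": 0.3, "dormant": 0.1},
    "planned":   {"impulsive": 0.2, "planned": 0.7, "dormant": 0.1},
    "dormant":   {"impulsive": 0.1, "planned": 0.2, "dormant": 0.7},
}

# Hold-out observations: (state in period t, observed state in period t+1).
holdout = [
    ("impulsive", "impulsive"), ("planned", "planned"),
    ("dormant", "planned"), ("impulsive", "planned"),
]

def predict_next(state):
    """Predict the next state as the one with the highest transition probability."""
    row = transition_matrix[state]
    return max(row, key=row.get)

hits = sum(1 for current, actual in holdout if predict_next(current) == actual)
print(f"hit rate: {hits / len(holdout):.1%}")
```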
APA, Harvard, Vancouver, ISO, and other styles
45

林敬昇. "Using National Health Insurance Database to explore the Township-level Population and its Migration." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/47086417804311647385.

Full text
Abstract:
碩士<br>國立政治大學<br>統計學系<br>104<br>Although the population registration system is one of the distinctive features of Taiwan's official statistics, it does not necessarily reflect the demographic information of the usual residential (or de jure) population. This is one of the reasons why the Taiwan government still conducts a population census every 10 years in order to acquire appropriate information for policy planning. However, an interval of 10 years is too long for most countries, and thus there are attempts (especially in the Nordic countries) to integrate records from official statistics and surveys to construct de jure population information. Taiwan also experimented with a 16% sample survey to replace the traditional census in 2010. In this study, we propose an alternative data source for obtaining de jure population information: the National Health Insurance Research Database (NHIRD). Taiwan has had national health insurance for more than 20 years; everyone, rich or poor, is eligible to join the NHI, and more than 99% of Taiwan's population is now covered. In that sense, the NHIRD covers practically the whole population of Taiwan. In addition, every township of Taiwan has at least one medical institution, with more than 10,000 medical institutions in 371 townships. People tend to visit nearby medical institutions for minor sicknesses, such as upper respiratory tract infections (URTI) and skin diseases. Therefore, the URTI records are used to approximate the de jure population. These records can also be used to capture how people migrate, through changes in their usual place of outpatient visits. We applied the outpatient migration data to migration models to evaluate whether the behavior of outpatient visits can be described by these models. We found that about 70% of the people in the one-million-person sample database of 2005 (LHI2005) visited a medical institution at least once a year because of URTI, and 88% of these outpatient visits took place within one county. Also, about 90% of the people in LHI2005 went to the doctor at least once during 2005-2007 (a 3-year interval) because of URTI. In other words, URTI records, including the locations of the medical institutions, can be used to estimate the place of residence, but this information should be used with care since it is not identical to that from the census or population registration.
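As a rough sketch of the idea of inferring the usual place of residence from outpatient records, the code below assigns each insured person the township where they most often sought care in each year and flags those whose modal township changed between years. The record layout, person identifiers, and township names are assumptions, not the NHIRD's actual schema.

```python
from collections import Counter, defaultdict

# Hypothetical URTI outpatient records: (person_id, year, township of the clinic).
visits = [
    (1, 2005, "Daan"), (1, 2005, "Daan"), (1, 2006, "Banqiao"), (1, 2006, "Banqiao"),
    (2, 2005, "Hualien"), (2, 2006, "Hualien"),
]

# Most frequent township of visit per person and year, used as a proxy residence.
by_person_year = defaultdict(Counter)
for person, year, township in visits:
    by_person_year[(person, year)][township] += 1

residence = {key: counter.most_common(1)[0][0] for key, counter in by_person_year.items()}

# A person whose proxy residence changes between consecutive years is counted as a migrant.
migrants = [
    person for person in {p for p, _ in residence}
    if (person, 2005) in residence and (person, 2006) in residence
    and residence[(person, 2005)] != residence[(person, 2006)]
]
print("proxy residences:", residence)
print("migrants between 2005 and 2006:", migrants)
```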
APA, Harvard, Vancouver, ISO, and other styles
46

Chen, Wei-You, and 陳威佑. "The Analysis of Customer Value Migration Path in Database Marketing:An Application of Bayesian Multinomial Model." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/46071374170737473319.

Full text
Abstract:
碩士<br>國立臺灣大學<br>國際企業學研究所<br>99<br>Since database management system (DBMS) technologies have matured, enterprises can retain and process enormous amounts of transaction data more easily than before. When enterprises want to analyze these transaction data, they should do so with a rigorous methodology, so that they can extract value-added information from their customers' transaction data and develop customer-related strategies that let them communicate with customers directly and fulfill heterogeneous needs. Enterprises can predict customer lifetime value, customer value migration paths, brand choice, and the probability of different future purchase states. The main purpose of this thesis is to predict a customer's purchase state given his or her prior state. Markov chain theory helps us obtain the conditional probabilities of each state, and therefore to build each customer's Markov chain transition probability matrix and analyze it with a Bayesian multinomial model. We then verify the predictive power of the models on an out-of-sample set and examine whether there are correlations between the transition paths and demographic variables. The contribution of this study is to help enterprises identify customers' transition paths and develop follow-up marketing strategies, in order to put database marketing into practice.
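To give a flavor of a Bayesian multinomial treatment of transition probabilities, the snippet below updates a Dirichlet prior with observed transition counts and reports posterior mean transition probabilities out of one state. This conjugate shortcut is a simplification of the full model in the thesis, and the states, prior, and counts are invented.

```python
# States and a symmetric Dirichlet prior over next-state probabilities (assumed).
states = ["high_value", "low_value", "churned"]
prior_alpha = {s: 1.0 for s in states}

# Observed transitions out of the "high_value" state for one customer segment (invented).
observed_counts = {"high_value": 7, "low_value": 2, "churned": 1}

# Dirichlet-multinomial conjugacy: posterior alpha = prior alpha + counts,
# and the posterior mean probability of each state is alpha_s / sum(alpha).
posterior_alpha = {s: prior_alpha[s] + observed_counts.get(s, 0) for s in states}
total = sum(posterior_alpha.values())
posterior_mean = {s: posterior_alpha[s] / total for s in states}

print("posterior mean transition probabilities from 'high_value':")
for s, p in posterior_mean.items():
    print(f"  -> {s}: {p:.3f}")
```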
APA, Harvard, Vancouver, ISO, and other styles
47

Lin, Pei-Rou, and 林佩柔. "Using National Health Insurance Database to Explore the Relationship between Medical Usage and Domestic Migration." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/zvfcq8.

Full text
Abstract:
碩士<br>國立政治大學<br>統計學系<br>107<br>People migrate because of resource needs, climate change, and other factors. Large-scale population movements often reflect major environmental changes, such as the Wu Hu chaos in China and the Middle East refugee movements of recent years. However, human migration is not easy to track, and the official statistics from household registration do not reveal the actual movements. The population census collects records on the permanent resident population, but its ten-year frequency cannot provide up-to-date information. On the other hand, Taiwan implemented the National Health Insurance (NHI) in 1995, and more than 99% of Taiwan's residents participate in the program. The NHI has become a part of daily life, and every township has at least one hospital or clinic. According to past studies, people tend to have outpatient visits in their living areas for minor illnesses, and thus we can use these medical records to study Taiwan's domestic migration. In this study, we explore the relationship between medical usage and domestic migration using the NHI Research Database. We found that the migration rates derived from the NHI are higher than those from household registration. Young adults move more frequently than other groups, and most migrants move to nearby municipalities and cities. The mortality rates of migrants are higher than those of non-migrants. Those who migrate to cities have lower medical utilization but higher medical costs (i.e., more serious conditions), indicating that migrants are less healthy.
APA, Harvard, Vancouver, ISO, and other styles
48

Caviedes, Alexander A. "Cracks in the fortress Europe? : how sectoral needs shape labor migration policy /." 2006. http://www.library.wisc.edu/databases/connect/dissertations.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Simonson, William Thomas Nels. "The roles of talin1 and calpain in T cell adhesion and migration." 2007. http://www.library.wisc.edu/databases/connect/dissertations.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Verma, Rita. "Migration and memory : reflections on schooling and community by Sikh immigrant youth /." 2004. http://www.library.wisc.edu/databases/connect/dissertations.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles