Academic literature on the topic 'Enterprise Data Warehouse Projects'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Enterprise Data Warehouse Projects.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Enterprise Data Warehouse Projects"

1

Xu, B. Y., H. M. Cai, and C. Xie. "An Ontology Approach for Manufacturing Enterprise Data Warehouses Development." Advanced Materials Research 215 (March 2011): 77–82. http://dx.doi.org/10.4028/www.scientific.net/amr.215.77.

Abstract:
A data warehouse (DW) is a powerful and useful technology for decision making in manufacturing enterprises. Because operational data often comes from distributed units in manufacturing enterprises, there is an urgent need to study methods for integrating heterogeneous data in the data warehouse. In this paper, an ontology approach is proposed to eliminate data source heterogeneity. The approach applies domain ontology methods to data warehouse design, representing the semantic meaning of the data with ontologies at the database level and pushing the data as resources to manufacturing units at the data warehouse access level. The foundation of the approach is a meta-data model consisting of data, concept, ontology, and resource repositories. The model was used in a shipbuilding enterprise data warehouse development project; the results show that, guided by the meta-data model, the ontology approach can eliminate data heterogeneity.
2

Rahman, Nayem. "Lessons from a Successful Data Warehousing Project Management." International Journal of Information Technology Project Management 8, no. 4 (2017): 30–45. http://dx.doi.org/10.4018/ijitpm.2017100103.

Abstract:
This article provides an overview of the project management aspects of a data warehouse application implementation. More specifically, it discusses the project's implementation, the challenges faced, and the lessons learned. The project was initiated with the objective of redesigning the procurement data pipeline of a data warehouse, in which data flows from an enterprise resource planning (ERP) system to an enterprise data warehouse (EDW) and on to reporting environments. The project was challenged to deliver to consumers more quickly, with improved report performance and reduced total cost of ownership (TCO) and data latency in the EDW. The project's strategies included providing continuous business value and adopting new technologies for data extraction, transformation, and loading; the project was also implemented using some agile principles. The project team accomplished twice the scope of the previous project in the same duration with a relatively smaller team, improved product quality, and increased customer satisfaction by improving report response times for management.
3

Papiernik, Daniel K., Dhruv Nanda, Robert O. Cassada, and William H. Morris. "Data Warehouse Strategy to Enable Performance Analysis." Transportation Research Record: Journal of the Transportation Research Board 1719, no. 1 (2000): 175–83. http://dx.doi.org/10.3141/1719-23.

Abstract:
The Virginia Department of Transportation (VDOT) has undertaken the implementation of an enterprise data warehouse as part of a strategic investment in its information technology (IT) infrastructure. Data warehousing provides an information architecture that serves as the enterprisewide source of data for performance analysis and organizational reporting. To assist VDOT in achieving its strategic outcome area objectives, a programming and scheduling (P&S) data mart is being developed to track preconstruction project activities. This data mart and subsequent data marts function as departmental decision support platforms, enabling VDOT's operating divisions to perform their own enhanced analytical processing, visualization, and data mining for more informed business decisions. Presented is a case study based on the enterprise data warehouse and P&S data mart being developed and implemented for VDOT by TransCore. Explicitly described is how one VDOT division, Programming and Scheduling, will benefit from investing in IT to achieve its strategic goals. The design approach, methodology, and implementation procedure for the P&S decision support data mart are detailed. The methodology for capturing the performance measures defined by the P&S division in the context of its strategic outcome areas is highlighted. The recommended future direction and the technologies the agency should adopt to continue to maximize its IT investment are outlined.
4

Eschrich, Steven A., Jamie K. Teer, Phillip Reisman, et al. "Enabling Precision Medicine in Cancer Care Through a Molecular Data Warehouse: The Moffitt Experience." JCO Clinical Cancer Informatics, no. 5 (June 2021): 561–69. http://dx.doi.org/10.1200/cci.20.00175.

Abstract:
PURPOSE The use of genomics within cancer research and clinical oncology practice has become commonplace. Efforts such as The Cancer Genome Atlas have characterized the cancer genome and suggested a wealth of targets for implementing precision medicine strategies for patients with cancer. The data produced from research studies and clinical care have many potential secondary uses beyond their originally intended purpose. Effective storage, query, retrieval, and visualization of these data are essential to create an infrastructure to enable new discoveries in cancer research. METHODS Moffitt Cancer Center implemented a molecular data warehouse to complement the extensive enterprise clinical data warehouse (Health and Research Informatics). Seven different sequencing experiment types were included in the warehouse, with data from institutional research studies and clinical sequencing. RESULTS The implementation of the molecular warehouse involved the close collaboration of many teams with different expertise and a use case–focused approach. Cornerstones of project success included project planning, open communication, institutional buy-in, piloting the implementation, implementing custom solutions to address specific problems, data quality improvement, and data governance, unique aspects of which are featured here. We describe our experience in selecting, configuring, and loading molecular data into the molecular data warehouse. Specifically, we developed solutions for heterogeneous genomic sequencing cohorts (many different platforms) and integration with our existing clinical data warehouse. CONCLUSION The implementation was ultimately successful despite challenges encountered, many of which can be generalized to other research cancer centers.
5

Hall, James P., Rob Robinson, and Mary Ann Paulis. "Enterprisewide Spatial Data Integration of Legacy Systems for Asset Management." Transportation Research Record: Journal of the Transportation Research Board 1917, no. 1 (2005): 11–17. http://dx.doi.org/10.1177/0361198105191700102.

Abstract:
This paper describes the spatial information system infrastructure implemented by the Illinois Department of Transportation (IDOT) to enable delivery of information to management decision makers in asset management applications. This spatial data warehouse infrastructure makes extensive use of geographic information system (GIS) technologies to integrate information from a variety of database structures and formats. GIS products and tools have been developed to portray and analyze these data in useful combinations focused on practitioner needs. In June 1999 the Governmental Accounting Standards Board issued Statement 34 requiring governments to have a systematic approach to managing their assets. As a result, transportation agencies have placed an increased emphasis on developing mechanisms to integrate information from disparate management information systems and legacy databases. IDOT has used GIS to develop a spatial data warehouse to enable integration. A valuable characteristic of the department's information systems infrastructure is the embedding of the underlying link–node structure into roadway inventory databases to enable the direct linkage of data through various system identifiers, including differing milepost referencing and project numbering schemes. This direct linkage enables the complex integration of asset management–related data files across the enterprise and provides access to historical asset information. Changes to route referencing systems are readily accommodated, without loss of integrative capabilities. Outputs include a variety of user-developed analyses and output products with accessibility through networks, intranets, and the Internet.
6

Incalcaterra, James, Alexis B. Guzman, Yu-Ting Huang, et al. "Assessing the cost of cancer care delivery using a time-driven, activity-based costing software." Journal of Clinical Oncology 34, no. 7_suppl (2016): 25. http://dx.doi.org/10.1200/jco.2016.34.7_suppl.25.

Abstract:
Background: Poor costing systems and measurement have led to cross-subsidies and cost-shifting in health care. A large academic cancer center has adopted Robert Kaplan's bottom-up cost accounting methodology, time-driven activity-based costing (TDABC). TDABC has been proven an effective cost accounting tool in health care for measuring and improving care delivery by standardizing and creating transparency around patient care processes. The project aims to process-map care delivery, identify event triggers associated with each process map, and use a software application to compute costs and resource capacities. Methods: Information technology and financial subject-matter experts integrated clinical, resource, and financial data from the institution's enterprise information warehouse, general ledger, and resource and asset management systems into the software application. Clinical business managers, nurse managers, and other clinical content experts helped identify patient-level care processes. Results: The institution deployed a project team to integrate data from the institution's enterprise information warehouse and aid in process mapping across three multidisciplinary care centers. The team successfully costed both direct and overhead costs associated with 69 head and neck, 18 endocrine, and 15-20 proton therapy patient-level processes across 7 different business departments within 7 months. The resource capacity analysis was the most difficult to perform because of the lack of transparency around resources' clinical, administrative, and research responsibilities. Dashboards are currently being developed to help assess changes in patient care processes, cost, or resource utilization. Conclusions: This methodology can be used across health care organizations in all countries to analyze the true cost of care delivery.
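The TDABC arithmetic the abstract relies on reduces to two steps: derive a capacity cost rate per resource, then multiply each mapped process step's time by that rate. A minimal sketch follows; all dollar figures, capacities, and step times are illustrative assumptions, not data from the study:

```python
def capacity_cost_rate(total_cost, practical_capacity_minutes):
    # TDABC: dollars per minute of a resource's practical capacity
    return total_cost / practical_capacity_minutes

def process_cost(steps):
    # sum of (minutes consumed x resource cost rate) over each mapped step
    return sum(minutes * rate for minutes, rate in steps)

# Illustrative figures only: an infusion-nurse pool and a physician pool
nurse_rate = capacity_cost_rate(120_000, 100_000)  # 1.20 per minute
md_rate = capacity_cost_rate(400_000, 80_000)      # 5.00 per minute

# A hypothetical mapped clinic-visit process: 30 nurse-minutes, then 15 physician-minutes
visit_cost = process_cost([(30, nurse_rate), (15, md_rate)])
```

In practice the rates and step times would come from the general ledger and the process maps described above, not from hand-entered constants.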
7

He, Wenjun, Katie G. Kirchoff, Royce R. Sampson, et al. "Research Integrated Network of Systems (RINS): a virtual data warehouse for the acceleration of translational research." Journal of the American Medical Informatics Association 28, no. 7 (2021): 1440–50. http://dx.doi.org/10.1093/jamia/ocab023.

Abstract:
Objective Integrated, real-time data are crucial to evaluate translational efforts to accelerate innovation into care. Too often, however, needed data are fragmented in disparate systems. The South Carolina Clinical & Translational Research Institute at the Medical University of South Carolina (MUSC) developed and implemented a universal study identifier—the Research Master Identifier (RMID)—for tracking research studies across disparate systems and a data warehouse-inspired model—the Research Integrated Network of Systems (RINS)—for integrating data from those systems. Materials and Methods In 2017, MUSC began requiring the use of RMIDs in informatics systems that support human subject studies. We developed a web-based tool to create RMIDs and application programming interfaces to synchronize research records and visualize linkages to protocols across systems. Selected data from these disparate systems were extracted and merged nightly into an enterprise data mart, and performance dashboards were created to monitor key translational processes. Results Within 4 years, 5513 RMIDs were created. Among these, 726 (13%) bridged the systems needed to evaluate research study performance, and 982 (18%) were linked to the electronic health record, enabling patient-level reporting. Discussion Barriers posed by data fragmentation to assessment of program impact have largely been eliminated at MUSC through the requirement for an RMID, its distribution via RINS to disparate systems, and the mapping of system-level data to a single integrated data mart. Conclusion By applying data warehousing principles to federate data at the "study" level, the RINS project reduced data fragmentation and promoted research systems integration.
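The study-level federation described here — a shared RMID carried by every system, with selected fields merged nightly into one data mart — can be sketched roughly as follows. The system names and record fields below are hypothetical stand-ins, not the actual RINS schemas:

```python
# Hypothetical record extracts from two disparate research systems,
# each row tagged with the shared Research Master Identifier (RMID)
irb_records = [{"rmid": "R-0001", "protocol": "IRB-17-042"}]
ctms_records = [{"rmid": "R-0001", "enrolled": 48},
                {"rmid": "R-0002", "enrolled": 12}]

def federate_by_rmid(*systems):
    # Merge records from each system into one study-level view keyed by
    # RMID, mimicking the nightly extract-and-merge into the data mart.
    mart = {}
    for records in systems:
        for rec in records:
            row = mart.setdefault(rec["rmid"], {})
            row.update({k: v for k, v in rec.items() if k != "rmid"})
    return mart

mart = federate_by_rmid(irb_records, ctms_records)
```

The point of the sketch is the join key: because every system carries the same RMID, the merge is a dictionary update rather than fuzzy record linkage.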
8

Rahul, Kumar, and Rohitash Kumar Banyal. "Detection and Correction of Abnormal Data with Optimized Dirty Data: A New Data Cleaning Model." International Journal of Information Technology & Decision Making 20, no. 02 (2021): 809–41. http://dx.doi.org/10.1142/s0219622021500188.

Abstract:
Every business enterprise requires noise-free, clean data. There is a chance of an increase in dirty data as the data warehouse continuously loads and refreshes large quantities of data from various sources. Hence, to avoid wrong conclusions, the data cleaning process becomes vital in data-related projects. This paper introduces a novel data cleaning technique for the effective removal of dirty data. The process involves two steps: (i) dirty data detection and (ii) dirty data cleaning. Dirty data detection comprises data normalization, hashing, clustering, and finding the suspected data; in the clustering process, the optimal selection of centroids is carried out by employing an optimization concept. After dirty data prediction finishes, the dirty data cleaning step begins. The cleaning process comprises leveling, Huffman coding, and cleaning the suspected data, which is likewise performed based on the optimization concept. To solve these optimization problems, a new hybridized algorithm is proposed, the so-called Firefly Update Enabled Rider Optimization Algorithm (FU-ROA), a hybridization of the Rider Optimization Algorithm (ROA) and the Firefly (FF) algorithm. Finally, the performance of the implemented data cleaning method is scrutinized against traditional methods such as Particle Swarm Optimization (PSO), FF, Grey Wolf Optimizer (GWO), and ROA in terms of positive and negative measures. The results show that, at iteration 12, the performance of the proposed FU-ROA model on test case 1 was 0.013%, 0.7%, 0.64%, and 0.29% better than the extant PSO, FF, GWO, and ROA models, respectively.
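The detection half of the pipeline (normalization, hashing, then a clustering-style screen for suspect values) can be sketched in a much-simplified form. The functions and the z-score threshold below are illustrative stand-ins, not the paper's FU-ROA-optimized versions:

```python
import hashlib
import statistics

def normalize(record):
    # Lowercase and trim string fields so equivalent values compare equal
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in record.items()}

def bucket_key(record, fields):
    # Hash selected fields so likely-duplicate records land in one bucket
    joined = "|".join(str(record[f]) for f in fields)
    return hashlib.md5(joined.encode()).hexdigest()[:8]

def suspected_outliers(values, z=2.0):
    # Flag numeric values far from the cluster mean as suspected dirty data
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values) or 1.0
    return [v for v in values if abs(v - mean) / stdev > z]
```

Where the paper selects cluster centroids via FU-ROA, this sketch just uses the mean; the pipeline shape (normalize, hash into buckets, screen within buckets) is the part being illustrated.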
9

Ruckdeschel, John C., William T. Sause, Tom Belnap, Cory Jones, and Braden D. Rowley. "Oncology quality improvement as a cornerstone of the transition to accountable care." Journal of Clinical Oncology 30, no. 34_suppl (2012): 141. http://dx.doi.org/10.1200/jco.2012.30.34_suppl.141.

Abstract:
Background: Accountable care is defined as moving the incentives for health care from a system that rewards volume and procedures to one that rewards improvements in the quality of care for a defined population. To prevent this process from deteriorating into solely a cost reduction exercise, physicians and hospitals need to develop a valid, reproducible, and effective means of measuring quality and influencing behavior to reduce variation and improve the quality of care. The Intermountain Healthcare Oncology Clinical Program's (OCP) experience with Oncology Quality Improvement (OQI) offers several key lessons for enabling this process. Methods: OQI initiatives are developed by a multidisciplinary physician-based team tasked with directing standardization and ensuring optimal care delivery. The team uses clinical knowledge, peer-reviewed literature, and data from an enterprise data warehouse to develop goals. Performance is measured against a goal that focuses on variation between physicians and facilities. Individual physician data are compared to de-identified data of peers, facilities, and the system. A physician champion performs academic detailing for physician groups across the system and is critical to the success of the program. Results: Over the past decade, the OCP initiated over 30 projects designed to measure and improve the quality of oncology care delivery. Breast cancer projects included breast conservation in surgical management, reducing axillary dissection for ductal carcinoma in situ, and sentinel node biopsy rather than axillary dissection. The OCP also explored standardizing lymph node resection during colorectal cancer surgery and, subsequently, the utilization of adjuvant chemotherapy. Imaging-based goals included improving mammography callback rates and using PET/CT during preoperative assessment of lung cancer. In most instances the process resulted in significant, sustainable OQI.
Conclusions: The investment in program and clinician staff is significant, and the requirements and costs for a sophisticated data system are real. However, an OQI program can provide meaningful improvements in the quality of cancer care and is an important step in facilitating the transition to accountable care.
10

Bosch, Brandon, Scott Hartman, Lauren Caldarello, and Diane Denny, DBA. "Integrating patient-reported outcomes data into the electronic health record." Journal of Clinical Oncology 36, no. 30_suppl (2018): 186. http://dx.doi.org/10.1200/jco.2018.36.30_suppl.186.

Full text
Abstract:
Background: As a national network of hospitals specializing in the treatment of patients fighting complex or advanced-stage cancer, the network was an early adopter of patient-reported outcome (PRO) data as part of its routine patient assessment and treatment. Since 2012, an externally validated tool has been used to capture patients' perceived symptom burden for real-time clinical intervention, from the point of first visit throughout the course of treatment, at intervals of 21 days or greater. Research has demonstrated the use of PRO data as a valuable component of a patient's treatment plan, promoting improved quality and length of life. Methods: The use of these data across the network was expanded so that results once accessible only on paper and via electronically stored images have now been fully integrated into the electronic health record (EHR). A multidisciplinary project team formulated the specifications for a successful integration of PRO data into the EHR. Results: The project achieved its goal and went beyond data integration to include a solution that facilitates documentation of interventions against patients' symptoms. Provider workflow efficiency is greatly enhanced via single-system access and visual notification, with critical values flagged to focus providers' attention on severe symptoms. Incorporation of a unified EHR flowsheet provides a paperless, one-stop symptom assessment approach and a streamlined mechanism for intervention documentation. The documentation module leverages structured data fields and the linkage of PRO data with interventions, such as specialist referrals or medication orders, to support enhanced patient care and quality improvement. Conclusions: The ability to easily view an array of patient-reported concerns and document interventions against severe or significantly worsening symptoms gives clinicians an enhanced ability to address quality-of-life-related needs.
PRO data are now stored electronically in the enterprise warehouse, enabling aggregation with other data for population analysis and, eventually, opportunities for predictive modeling.

Dissertations / Theses on the topic "Enterprise Data Warehouse Projects"

1

Adamala, Szymon, and Linus Cidrin. "Key Success Factors in Business Intelligence." Thesis, Blekinge Tekniska Högskola, Sektionen för management, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-5773.

Abstract:
Business Intelligence can bring critical capabilities to an organization, but the implementation of such capabilities is often plagued with problems and issues. Why is it that certain projects fail while others succeed? The theoretical problem and the aim of this thesis is to identify the factors that are present in successful Business Intelligence projects and organize them into a framework of critical success factors. A survey was conducted during the spring of 2011 to collect primary data on Business Intelligence projects. It was directed to a number of different professionals operating in the Business Intelligence field in large enterprises, primarily located in Poland and primarily vendors, but given the similarity of Business Intelligence initiatives across countries and the increasing globalization of large enterprises, the conclusions from this thesis may well be relevant and applicable to projects conducted in other countries. Findings confirm that Business Intelligence projects wrestle with both technological and non-technological problems, but the non-technological problems are found to be harder to solve as well as more time-consuming than their technological counterparts. The thesis also shows that critical success factors for Business Intelligence projects are different from success factors for IS projects in general, and that Business Intelligence projects have critical success factors unique to the subject matter. Major differences can be found predominantly in the non-technological factors, such as the presence of a specific business need to be addressed by the project and a clear vision to guide the project. Results show that successful projects have specific factors present more frequently than unsuccessful ones. Factors showing the greatest differences are the type of project funding, the business value provided by each iteration of the project, and the alignment of the project to a strategic vision for Business Intelligence.
Furthermore, the thesis provides a framework of critical success factors that, according to the results of the study, explains 61% of the variability in project success. Given these findings, managers responsible for introducing Business Intelligence capabilities should focus on a number of non-technological factors to increase the likelihood of project success. Areas that should be given special attention are: making sure that the Business Intelligence solution is built with end users in mind, that it is closely tied to the company's strategic vision, and that the project is properly scoped and prioritized to concentrate on the best opportunities first. Keywords: Critical Success Factors, Business Intelligence, Enterprise Data Warehouse Projects, Success Factors Framework, Risk Management
2

Rifaie, Mohammad. "Strategy and methodology for enterprise data warehouse development : integrating data mining and social networking techniques for identifying different communities within the data warehouse." Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/4416.

Abstract:
Data warehouse technology has been successfully integrated into the information infrastructure of major organizations as a potential solution for eliminating redundancy and providing comprehensive data integration. Recognizing the importance of a data warehouse as the main data repository within an organization, this dissertation addresses different aspects of data warehouse architecture and performance. Many data warehouse architectures have been presented by industry analysts and research organizations. These architectures vary from independent, physical business-unit-centric data marts to the centralized two-tier hub-and-spoke data warehouse. The operational data store is a third tier, offered later to address business requirements for intra-day data loading. While the industry-available architectures are all valid, I found them to be suboptimal in efficiency (cost) and effectiveness (productivity). In this dissertation, I advocate a new architecture (the Hybrid Architecture) which encompasses the industry-advocated architectures. The Hybrid Architecture demands the acquisition, loading, and consolidation of enterprise atomic and detailed data into a single integrated enterprise data store (the Enterprise Data Warehouse), where business-unit-centric Data Marts and Operational Data Stores (ODS) are built in the same instance of the Enterprise Data Warehouse. To highlight the role of data warehouses in different applications, we describe an effort to develop a data warehouse for a geographical information system (GIS). We further study the importance of data practices, quality, and governance for financial institutions by commenting on the RBC Financial Group case. The development and deployment of the Enterprise Data Warehouse based on the Hybrid Architecture spawned its own issues and challenges.
Organic data growth and business requirements to load additional new data will significantly increase the amount of stored data, and the number of users will increase accordingly. Enterprise data warehouse obesity, performance degradation, and navigation difficulties are chief among these issues and challenges. Association rules mining and social networks have been adopted in this thesis to address them. We describe an approach that uses frequent pattern mining and social network techniques to discover different communities within the data warehouse. These communities include sets of tables frequently accessed together, sets of tables retrieved together most of the time, and sets of attributes that mostly appear together in queries. We concentrate on tables in the discussion; however, the model is general enough to discover other communities. We first build a frequent pattern mining model by considering each query as a transaction and the tables as items. Then, we mine closed frequent itemsets of tables; these itemsets include tables that are mostly accessed together and hence should be treated as one unit in storage and retrieval for better overall performance. We utilize social network construction and analysis to find maximum-sized sets of related tables; this is a more robust approach than a union of overlapping itemsets. We derive the Jaccard distance between the closed itemsets and construct the social network of tables by adding links between itemsets whose similarity exceeds a given threshold. The constructed network is analyzed to discover communities of tables that are mostly accessed together. The reported test results are promising and demonstrate the applicability and effectiveness of the developed approach.
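The community-discovery step the thesis describes — Jaccard distance between closed itemsets of tables, links where the itemsets are similar enough, then connected groups of tables — can be sketched roughly as follows. The table names and the distance threshold are illustrative, not taken from the thesis:

```python
from itertools import combinations

def jaccard_distance(a, b):
    # 1 - |A ∩ B| / |A ∪ B| for two itemsets of table names
    a, b = set(a), set(b)
    return 1.0 - len(a & b) / len(a | b)

def build_network(itemsets, threshold=0.5):
    # Link pairs of itemsets whose Jaccard distance is within the threshold
    return [(i, j) for i, j in combinations(range(len(itemsets)), 2)
            if jaccard_distance(itemsets[i], itemsets[j]) <= threshold]

def communities(n, edges):
    # Connected components of the network = communities of tables that are
    # mostly accessed together (simple union-find)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j in edges:
        parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), set()).add(i)
    return list(groups.values())

# Illustrative closed itemsets of tables mined from a query log
itemsets = [{"orders", "customers"},
            {"orders", "customers", "products"},
            {"ledger"}]
edges = build_network(itemsets)
groups = communities(len(itemsets), edges)
```

Here the first two itemsets overlap heavily (distance 1/3) and fall into one community, while `ledger` stands alone; in the thesis, the mined itemsets come from treating each warehouse query as a transaction.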
3

Fasano, Domenico. "progettazione e sviluppo in una enterprise data platform per l’analisi di fondi strutturali: il caso regione emilia-romagna." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020.

Abstract:
One of Iconsulting's clients is the Emilia-Romagna Region, with which several Data Analytics projects are carried out, among them the project named DatalakER, to which I contributed. It was created to give the regional public administration a more advanced tool than those previously available for analyzing the information assets at its disposal. The solution identified was to create a Data Platform with an architecture suited to an Enterprise context, in which security and correct data management are critical priorities. The first two chapters of the thesis are introductory, explaining the concepts underlying the work; the third chapter illustrates the technologies used in the DatalakER project. The fourth chapter then reports on the design and implementation of the Enterprise Data Platform architecture. Finally, the fifth chapter examines the structural-funds analysis activities that the architecture enabled.
4

Nilsson, Erik. "Knowledge transfer in enterprise resource planning (ERP) projects : Towards a framework for increased learning when implementing ERP Systems." Thesis, Växjö University, School of Mathematics and Systems Engineering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:vxu:diva-5905.

Abstract:
Companies spend considerable amounts of money on the implementation of enterprise resource planning (ERP) systems. The implementation of an ERP system is risky, since it involves the core administrative processes used to give good customer service, plan and monitor production, handle suppliers, and monitor the financial effectiveness of the company. It is quite clear that a badly managed ERP implementation can cause lower customer satisfaction and weakening trust from the market. These are effects that companies can't afford in most markets, where competition is very strong and customer service is the key to future improved business. One very important way to minimize the risk in such projects is to focus on change management and knowledge transfer to the end users. The end users need to be equipped with the right knowledge of the new ERP system from day one; otherwise the risks grow considerably. Missing knowledge can cost missed deliveries, customer complaints, financial claims, and, most importantly, lower competitiveness in the market. This thesis builds a framework with the main points to consider when building a positive learning environment, and explains how to break through the information wall so that the trainer can get the message across.
5

Burns, Jessie. "An Investigation of Multiple Integration Techniques for Information Systems: A Model for Integrating Data Warehousing, ERP, and SOA in Practice." NSUWorks, 2010. http://nsuworks.nova.edu/gscis_etd/110.

Abstract:
This study addressed issues associated with implementing multiple data integrators, whether implementing the same data integrator multiple times or implementing multiple different data integrators. Single data integrator implementations were shown to be failing at high rates. Environments were complicated because there were more acquisitions and mergers than in the past, companies were required to remain competitive while implementations were taking place, and implementation time frames could be short. The main goal of this study was to develop an agile, sturdy, and rigorous model that addressed the needs of companies that found themselves needing to implement two or more data integrators and possibly business initiatives. The researcher developed a model and a questionnaire. The questionnaire was used to gather data from managers and practitioners in the information technology industry, and the collected information was expected to show a consensus about the proposed JBurns Systems Implementation Model (JBSIM). Analysis showed that various industry participants used different combinations of the proposed data integrators and that they supported the proposed JBSIM. The model allowed for the successful implementation of the multiple data integrators used in this research. As the model matures, it has the potential to allow the use of other data integrators not used in this research, as well as corporate initiatives.
APA, Harvard, Vancouver, ISO, and other styles
6

Olsson, Johannes, and Mikael Sjöberg. "Fallgropar vid mjukvaruutveckling inom Enterprise Application Integration." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-186405.

Full text
Abstract:
Enterprise Application Integration (EAI) concerns connecting systems together to meet demands that could not be met by either system individually, or to establish new channels for communication within and between enterprises. As systems and their underlying technologies have become more advanced, the task of integrating them has become more complicated. Through a case study of an integration project at an enterprise, this thesis aims to identify pitfalls that could constitute risks for such projects. The main goal is for the analysis of these pitfalls to help developers avoid them in their own projects. Several pitfalls were identified by analyzing documentation and reflecting on the course of the project; the pitfall with the greatest impact during the project was unclear requirements. The pitfalls are analyzed with emphasis on the circumstances under which they were discovered, and contributing factors are discussed. For the pitfalls that posed actual challenges during the case study, the analysis also covers how they were avoided, while potential solutions are discussed for the remainder.
APA, Harvard, Vancouver, ISO, and other styles
7

Rahman, Shahbaaz. "The Impact of Adopting “Business Intelligence (BI)” in Organizations." Thesis, Uppsala universitet, Informationssystem, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-153204.

Full text
Abstract:
In economically turbulent times, Business Intelligence (BI) is increasingly being used to mitigate risk and increase certainty. This thesis analyzes the evolution of BI from a series of technologies into a platform for supporting strategy. It evaluates how BI can streamline manufacturing, quality assurance, marketing, and customer service, as well as the potential payoffs of increasing the level of insight an organization has. The thesis also analyzes how the more complex value chain processes, including build-to-order, configure-to-order, and quote-to-order, can be made more efficient and profitable through the inclusion of BI and its associated analytics and technologies. The use of the Delphi research technique makes this paper distinctive and strengthens its content. The role of BI has shifted from being used in specific functional areas of an organization to being strategic in scope. The intent of this thesis is to evaluate BI's contributions to the customer-facing processes that are the most complex and most challenging to sustain, making BI an indispensable platform for their successful execution, on the basis of theory and the practical experience of BI experts.
APA, Harvard, Vancouver, ISO, and other styles
8

Nilsson, Andreas, and Jonas Karlsson. "Förpackningshantering i det dolda : En studie om hur den interna hanteringen kan synliggöras och förbättras." Thesis, Linnéuniversitetet, Institutionen för datavetenskap, fysik och matematik, DFM, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-12900.

Full text
Abstract:
Handling materials and packaging, which is the focus of this study, requires efficient logistics, both internally and between companies. To achieve an efficient material flow, good information systems, whose role is to collect, store, process, and distribute information, are required. That packaging is handled efficiently is currently not a matter of course, since many companies do not prioritize the internal management of packaging. The purpose of this report was to investigate how medium-sized businesses working with the distribution and handling of spare parts packaging can improve their internal management of packaging. To answer this purpose, we used an abductive process and qualitative data collection. The study had Logistic Center Ljungby (LCL) at Electrolux Laundry Systems (ELS) as its case company and two other respondents, Nibe in Markaryd and Enertech in Ljungby, with two interviews conducted per respondent. The concepts linked to the internal management of packaging are described according to three main areas: inventory planning, supporting information systems, and Enterprise Resource Planning. The empirical report is divided into three chapters in which each respondent is treated separately. These chapters are structured in the same manner, but with greater emphasis on the case company, and cover facts about the company, inventory planning today, and the view of information systems. Through effective inventory planning supported by information systems, we believe that the internal management of packaging could be improved. The results of this study show that the following factors should be considered: article number, classification, reorder point, reduced separated inventory, call-off, and BOM terms.
APA, Harvard, Vancouver, ISO, and other styles
9

Zahradník, Jan. "Klient pro zobrazování OLAP kostek." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2009. http://www.nusl.cz/ntk/nusl-412816.

Full text
Abstract:
This work concerns an analytical tool supporting managerial decision making. The goal was to develop a reporting system that simplifies a company customer's orientation in key performance indicators. The implementation environment was .NET Framework 3.5 with the C# language and the database server Microsoft SQL Server 2008 with the Analysis Services extension; ASP.NET was used for the web interface.
APA, Harvard, Vancouver, ISO, and other styles
10

Castellani, Giovanni. "Data capturing: studio di fattibilità per l’acquisizione automatizzata delle informazioni lungo la catena produttiva." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2016.

Find full text
Abstract:
The manual data entry process is not only costly in terms of time and money; it is even more problematic as a source of error. For these reasons, the automated capture of information along the production chain is a goal strongly desired by the Group to improve its business. The technologies analyzed, now widespread and standardized on a large scale, such as barcodes, logistics labels, and radio-frequency terminals, can bring great benefits to business processes, all the more so when integrated to fit the company's ERP systems, allowing rapid and correct recording of information and its immediate distribution throughout the organization. Analysis of the processes and flows highlighted the critical points and made it possible to understand where and when to intervene with a design that would be the best possible fit. The release of requirements; goods receipt, mapping, and handling in the Warehouse; production status; component unloading and production loading in Packaging and Semi-finished processing; the establishment of a Customs interchange warehouse; and a precise, rapid traceability flow are all changes that will reshape the company's processes, streamlining them and freeing resources that can be reinvested in higher value-added activities. The potentially obtainable results, also confirmed by the external experience of suppliers and consultants, generated the conditions necessary for a rapid study and start of the work: the Group is enthusiastic and eager to complete the project as soon as possible and to go live with the new, streamlined, and optimized way of operating.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Enterprise Data Warehouse Projects"

1

Enterprise data warehouse. Prentice Hall PTR, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Messerli, Allen. A global enterprise data warehouse at 3M. Information Management Forum, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Bryan, Bergeron, Al-Daig Hamad, Alswailem Osama, UL Hoque Enam, and Saad AlBawardi Fadwa. Developing a Data Warehouse for the Healthcare Enterprise. Edited by Bryan P. Bergeron, MBA Al-Daig, MD Alswailem, MBA UL Hoque, and MS Saad AlBawardi. Productivity Press, 2018. http://dx.doi.org/10.4324/9781315145174.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bergeron, Bryan P., John Glaser, Hamad Al-Daig, et al. Developing a data warehouse for the healthcare enterprise: Lessons from the trenches. Himss, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Töpfer, Jochen. Active Enterprise Intelligence(TM): Unternehmensweite Informationslogistik als Basis einer wertorientierten Unternehmenssteuerung (German Edition). Springer, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Cappellini, Vito, ed. Electronic Imaging & the Visual Arts. EVA 2014 Florence. Firenze University Press, 2014. http://dx.doi.org/10.36253/978-88-6655-573-5.

Full text
Abstract:
Information Technologies of interest for Culture Heritage are presented: multimedia systems, data-bases, data protection, access to digital content, Virtual Galleries. Particular reference is reserved to digital images (Electronic Imaging & the Visual Arts), regarding Cultural Institutions (Museums, Libraries, Palace - Monuments, Archaeological Sites). The International Conference includes the following Sessions: Strategic Issues; EC Projects and Related Networks & Initiatives; 2D - 3D Technologies and Applications; Virtual Galleries - Museums and Related Initiatives; Access to the Culture Information. Three Workshops regard: International Cooperation; Innovation and Enterprise; e.Culture Cloud.
APA, Harvard, Vancouver, ISO, and other styles
7

Cappellini, Vito, ed. Electronic Imaging & the Visual Arts. EVA 2013 Florence. Firenze University Press, 2013. http://dx.doi.org/10.36253/978-88-6655-372-4.

Full text
Abstract:
Important Information Technology topics are presented: multimedia systems, data-bases, protection of data, access to the content. Particular reference is reserved to digital images (2D, 3D) regarding Cultural Institutions (Museums, Libraries, Palace – Monuments, Archaeological Sites). The main parts of the Conference Proceedings regard: Strategic Issues, EC Projects and Related Networks & Initiatives, International Forum on "Culture & Technology", 2D – 3D Technologies & Applications, Virtual Galleries – Museums and Related Initiatives, Access to the Culture Information. Three Workshops are related to: International Cooperation, Innovation and Enterprise, Creative Industries and Cultural Tourism.
APA, Harvard, Vancouver, ISO, and other styles
8

Redbooks, IBM. Planning a Tivoli Enterprise Data Warehouse Project. IBM.Com/Redbooks, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Redbooks, IBM. Introduction to Tivoli Enterprise Data Warehouse. Ibm, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Sperley. Planning, Building and Using the Enterprise Data Warehouse. Pearson Education, Limited, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Enterprise Data Warehouse Projects"

1

Morgan, David, Jai W. Kang, and James M. Kang. "Minable Data Warehouse." In Enterprise Information Systems. Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01347-8_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Jarke, Matthias, Maurizio Lenzerini, Yannis Vassiliou, and Panos Vassiliadis. "Data Warehouse Research: Issues and Projects." In Fundamentals of Data Warehouses. Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-662-05153-5_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Jarke, Matthias, Maurizio Lenzerini, Yannis Vassiliou, and Panos Vassiliadis. "Data Warehouse Research: Issues and Projects." In Fundamentals of Data Warehouses. Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/978-3-662-04138-3_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Al-Daig, Hamad. "Enterprise Environment." In Developing a Data Warehouse for the Healthcare Enterprise. Productivity Press, 2018. http://dx.doi.org/10.4324/9781315145174-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Hoque, Enam UL. "Data Warehouse Report Life Cycle." In Developing a Data Warehouse for the Healthcare Enterprise. Productivity Press, 2018. http://dx.doi.org/10.4324/9781315145174-11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Brown, Joe, and Paul Hill. "Data Marts: Key to Reviving the Enterprise Data Warehouse." In Data Warehousing. Vieweg+Teubner Verlag, 2000. http://dx.doi.org/10.1007/978-3-322-84964-9_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Oliveira, Rui, Fátima Rodrigues, Paulo Martins, and João Paulo Moura. "Dimensional Templates in Data Warehouses: Automating the Multidimensional Design of Data Warehouse Prototypes." In Enterprise Information Systems. Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01347-8_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Chevalier, Max, Mohammed El Malki, Arlind Kopliku, Olivier Teste, and Ronan Tournier. "How Can We Implement a Multidimensional Data Warehouse Using NoSQL?" In Enterprise Information Systems. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-29133-8_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Gluchowski, Peter. "Data Vault as a Modeling Concept for the Data Warehouse." In Engineering the Transformation of the Enterprise. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-84655-8_17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Brobst, Stephen A. "Enterprise Application Integration and Active Data Warehousing." In Vom Data Warehouse zum Corporate Knowledge Center. Physica-Verlag HD, 2002. http://dx.doi.org/10.1007/978-3-642-57491-7_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Enterprise Data Warehouse Projects"

1

van Wyngaarden, Robert, and Mel VanderWal. "Managing GIS and Spatial Data to Support Effective Decision Making Throughout the Pipeline Lifecycle." In 2006 International Pipeline Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/ipc2006-10472.

Full text
Abstract:
Many pipeline industry managers and senior officials intuitively understand that location is important to most aspects related to pipelines throughout the life-cycle — from project concept, through construction and operations, and finally to decommissioning. However, many organizations are not taking full advantage of location as a vital component to support business decision-making across the entire range of activities undertaken by pipeline companies. A Geographic Information System (GIS) is a tool that takes advantage of geography. GIS is ideally suited for the storage, display, and output of geographic data, and moreover, the analysis and modeling of geographic data. While GIS has been around as a technology for over 30 years, it is only in the last several years that it has started to be extensively used within the pipeline industry. Most managers have heard about GIS. Many organizations have already started to implement GIS and CAD-based solutions through individual projects, with a technical focus of automating work flows or business processes such as generating alignment sheets, regulatory compliance, integrity management, and land management, to name a few. Given that many of these applications tend to be stand-alone or isolated developments, pipeline companies need to look at the complete spatial environment of all potential tools and applications, and support this with a vision of a common spatial data warehouse in a holistic sense. Any company that embraces a continuous gathering of spatial data throughout the pipeline life-cycle will have a significant knowledge base whose value will increase over time. A spatial data warehouse of truly integrated environmental, engineering, and socioeconomic factors related to a pipeline during the entire life-cycle will have a total value that transcends the value of the individual factors.
The Return on Investment (ROI) of a properly developed GIS framework and spatial data warehouse looking at all operational demands and support applications will certainly be many times over the original expenditure as measured in cost savings as well as better decision making. This paper will present insights and approaches into how to properly and effectively leverage the spatial data asset and in deploying GIS throughout the enterprise. These include addressing all of the elements that are key in implementing GIS — hardware, software, data, people and methods — as well as considering some of the ROI and value-based measures for GIS success.
APA, Harvard, Vancouver, ISO, and other styles
2

"INCREMENTAL DATA QUALITY IN THE DATA WAREHOUSE." In 6th International Conference on Enterprise Information Systems. SciTePress - Science and Technology Publications, 2004. http://dx.doi.org/10.5220/0002600806340637.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

"COMBINING THE DATA WAREHOUSE AND OPERATIONAL DATA STORE." In 8th International Conference on Enterprise Information Systems. SciTePress - Science and Technology Publications, 2006. http://dx.doi.org/10.5220/0002490802820288.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

McGlothlin, James P., Amar Madugula, and Ilija Stojic. "The Virtual Enterprise Data Warehouse for Healthcare." In 10th International Conference on Health Informatics. SCITEPRESS - Science and Technology Publications, 2017. http://dx.doi.org/10.5220/0006253004690476.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Bagambiki, Erika. "Enterprise Data warehouse and Business Intelligence Solution." In ICEGOV '18: 11th International Conference on Theory and Practice of Electronic Governance. ACM, 2018. http://dx.doi.org/10.1145/3209415.3209420.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

"A DATA WAREHOUSE ARCHITECTURE FOR INTEGRATING FIELD-BASED DATA." In 9th International Conference on Enterprise Information Systems. SciTePress - Science and Technology Publications, 2007. http://dx.doi.org/10.5220/0002391105770580.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

"A NEW LOOK INTO DATA WAREHOUSE MODELLING." In 9th International Conference on Enterprise Information Systems. SciTePress - Science and Technology Publications, 2007. http://dx.doi.org/10.5220/0002347205400543.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

"OLAP AGGREGATION FUNCTION FOR TEXTUAL DATA WAREHOUSE." In 9th International Conference on Enterprise Information Systems. SciTePress - Science and Technology Publications, 2007. http://dx.doi.org/10.5220/0002364401510156.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

"MULTIDIMENSIONAL REFERENCE MODELS FOR DATA WAREHOUSE DEVELOPMENT." In 9th International Conference on Enterprise Information Systems. SciTePress - Science and Technology Publications, 2007. http://dx.doi.org/10.5220/0002366803470354.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Solodovnikova, Darja, Laila Niedrite, and Lauma Svilpe. "Managing Evolution of Heterogeneous Data Sources of a Data Warehouse." In 23rd International Conference on Enterprise Information Systems. SCITEPRESS - Science and Technology Publications, 2021. http://dx.doi.org/10.5220/0010496601050117.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Enterprise Data Warehouse Projects"

1

Kramer, Mitchell. DB2 Data Warehouse Enterprise Edition. Patricia Seybold Group, 2003. http://dx.doi.org/10.1571/pr9-11-03cc.

Full text
APA, Harvard, Vancouver, ISO, and other styles