To view other types of publications on this topic, follow the link: Microsoft SQL Server.

Dissertations on the topic "Microsoft SQL Server"


Consult the top 50 dissertations for your research on the topic "Microsoft SQL Server".

Next to each work in the list of references you will find an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever these are available in the record's metadata.

Browse dissertations across a wide range of disciplines and compile your bibliography correctly.

1

Badiozamany, Sobhan. "Microsoft SQL Server OLAP Solution - A Survey." Thesis, Uppsala University, Department of Information Technology, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-130636.

Full text of the source
Abstract:

Microsoft SQL Server 2008 offers technologies for performing On-Line Analytical Processing (OLAP) directly on data stored in data warehouses, instead of moving the data into a separate offline OLAP tool. This brings certain benefits, such as the elimination of data copying and better integration with the DBMS compared with offline OLAP tools. This report reviews SQL Server support for OLAP, solution architectures, and the tools and components involved. Standard storage options are discussed, but the focus of this report is relational storage of OLAP data. A scalability test is conducted to measure the performance of the Relational OLAP (ROLAP) storage option; it shows that when ROLAP storage mode is used, query response time grows linearly with dataset size. A tutorial is appended to demonstrate how to perform OLAP tasks using SQL Server in practice.
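
For readers who want a concrete picture of the relational OLAP workload described above, a rolled-up aggregation over a star-schema fact table is the kind of query a ROLAP configuration ultimately answers. This is only an illustrative sketch; the table and column names are assumptions, not objects from the thesis.

    -- Illustrative only: a ROLAP-style aggregation with subtotals and a grand total.
    SELECT d.CalendarYear,
           p.Category,
           SUM(f.SalesAmount) AS TotalSales
    FROM dbo.FactSales AS f
    JOIN dbo.DimDate AS d ON d.DateKey = f.DateKey
    JOIN dbo.DimProduct AS p ON p.ProductKey = f.ProductKey
    GROUP BY ROLLUP (d.CalendarYear, p.Category);
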

APA, Harvard, Vancouver, ISO, and other styles
2

Bergvall, Michael. "Utformning av databasdriven webbapplikation i ASP.NET C# och Microsoft SQL Server." Thesis, Umeå universitet, Institutionen för tillämpad fysik och elektronik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-58114.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
3

Andersson, Magnus. "Automatiserad dokumentation vid systemutveckling." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-98860.

Full text of the source
Abstract:
A well-known problem within the software development industry is the absence of qualitative system documentation. This problem can be found within the company Multisoft Consulting: too much time is spent by developers when they must familiarize themselves with an existing system. As part of the solution, the company would like to generate documentation automatically. The generation should be performed by the platform Softadmin®, which is used to develop all customer systems. The platform is based on C# and Microsoft's SQL Server and contains a number of ready-made components, each with its own functionality. In order to find out which documentation should be generated automatically, literature has been studied and developers have been interviewed; in order to know which documentation can be generated by Softadmin®, the platform has been analyzed. The conclusion is that documentation which provides a general overview of a system and demonstrates how the system is used is both desirable and possible to generate with Softadmin®. A kind of general overview, in the form of a tree structure, had already been implemented in Softadmin®, but some desired features were missing from it, so the priority of this study became to implement a prototype that complements the overview. The result is that information about the system's menu items, which are pages with different functionality, can now be displayed within the overview.
APA, Harvard, Vancouver, ISO, and other styles
4

Freitas, Cláudia Marisa Araújo. "Integração das funcionalidades do programa SQL Server R Services 2016 na empresa Quidgest." Master's thesis, Instituto Superior de Economia e Gestão, 2017. http://hdl.handle.net/10400.5/14989.

Full text of the source
Abstract:
Master's programme in Quantitative Methods for Economic and Business Decision Making
The R Services program is used by companies to develop statistical and predictive capabilities from their databases. Microsoft SQL Server, in turn, is database management software that allows programming code to be created and the data to be updated automatically. In this context, Microsoft SQL Server R Services 2016 was created, combining the integration of companies' database management systems with statistical analyses. The purpose of this thesis is to explore the potential of the integration of Microsoft SQL Server R Services 2016 using the Transact-SQL and R languages, in which computational and statistical skills allow the creation of procedures with R code that is executed within SQL Server. In order to explore these potentialities, an internship was carried out at Quidgest, where the necessary information was obtained from a database. From there, a case study was developed in which the success of this new integration of the two software products mentioned above was analysed.
APA, Harvard, Vancouver, ISO, and other styles
5

Jetter, Michael. "Semantische und logische Datenmodellierung multidimensionaler Strukturen am Beispiel Microsoft® SQL Server "Yukon"." [S.l. : s.n.], 2005. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB11759351.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
6

Pyszko, Pavel. "Technologie vysoké dostupnosti MS SQL Serveru." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-220540.

Full text of the source
Abstract:
The thesis contains a complete theoretical overview of the high availability technologies in Microsoft SQL Server. For each technology, deployment guidance is provided; the technologies are analyzed from a security perspective, their advantages and disadvantages in practice are identified, and the optimal way of using each of them is determined. The high availability technologies are compared with one another, and their availability across versions of MS SQL Server is given. The thesis contains three scenarios with practical examples of using high availability technology in practice. It also provides an analysis of the high availability features in Oracle and subsequently compares them with the high availability technologies in MS SQL Server.
APA, Harvard, Vancouver, ISO, and other styles
7

Mazáč, Pavel. "Návrh interaktivního WWW OLAP rozhraní pro analýzu produkce výrobních závodů." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2008. http://www.nusl.cz/ntk/nusl-235999.

Full text of the source
Abstract:
This work is focused on OLAP analysis. It presents the important theoretical background and compares available OLAP systems from several points of view. The main goal was to create our own OLAP system; the design and implementation of this system are described in the project.
APA, Harvard, Vancouver, ISO, and other styles
8

Holeček, Ivan. "Databázové řešení pro ukládání měřených dat." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2018. http://www.nusl.cz/ntk/nusl-382262.

Full text of the source
Abstract:
The diploma thesis is focused on the design of a database solution for storing measured data. The theoretical part analyses the database query language and the Microsoft SQL Server 2017 database management system, and further covers the programming environment for application development using C# .NET. The thesis includes a database solution for storing measured data, a service console application for saving data into the database, and a user application for creating new measurements, presenting the data, and administering users.
APA, Harvard, Vancouver, ISO, and other styles
9

Lundström, Anton. "Databasoptimering för användning med Power BI : Hur indexering och kompression kan förbättra prestanda vid datahämtning." Thesis, Högskolan i Gävle, Datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-32730.

Full text of the source
Abstract:
In the measurement room at Sandvik Coromant there is a solution for visualizing machine health, measurement history and service times for different measuring instruments. The data visualization solution uses Power BI and connects to Excel files. Once the data has been collected, a number of modifications are made on the tables to produce something that is possible to visualize. These modifications, in combination with many Excel sheets, result in very long lead times for updating a Power BI report. It is now desired to use a database solution for the data contained in the Excel files and thus improve these lead times. For this, a database was created based on the data that these Excel files contained. Power BI allows the user to import data from a database into the application in two ways, via Import Mode or DirectQuery. Import Mode loads all the requested tables and stores them in memory; DirectQuery runs queries directly against the database, based on what is requested. Due to this difference, there are methods to optimize the database from which the data is loaded. This study examines how different types of indexing and different types of compression affect the response time for queries run by Power BI, to answer the following two research questions: How do different types of indexing affect a database's data retrieval rate when using Power BI? How do different types of compression affect the data retrieval rate when using Power BI? This was done by studying execution plans and execution times for the queries issued against the database by Power BI. With the help of T-SQL, the execution time for a specific query was obtained. The execution time for different types of index and compression was then compared against a table without an index. This was performed on tables with varying numbers of rows, where the numbers of rows tested were 33 001, 50 081, 100 101, 500 017 and 1 000 217. The results of the study show that for Import Mode, the best type of index is a clustered rowstore index without compression, with the exception of tables with over 1 001 217 rows, where row compression performed better. For DirectQuery, a non-clustered rowstore index performed best, but the result for compression was ambiguous, because each type of compression performed best for a different number of rows in the table. For tables with more than 500 017 rows, however, no compression performed best.
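
The index and compression variants compared in this thesis can be expressed in a few T-SQL statements. The sketch below is only illustrative; the table and index names are assumptions, not objects from the study.

    -- Baseline clustered rowstore index without compression.
    CREATE CLUSTERED INDEX IX_Measurements_Id
        ON dbo.Measurements (MeasurementId)
        WITH (DATA_COMPRESSION = NONE);

    -- Rebuild the same index with row compression for comparison.
    ALTER INDEX IX_Measurements_Id ON dbo.Measurements
        REBUILD WITH (DATA_COMPRESSION = ROW);
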
APA, Harvard, Vancouver, ISO, and other styles
10

Шабовта, Г. С. "Інформаційно- комп'ютерна система реєстрації звернень громадян в електронному урядуванні". Thesis, Чернігів, 2020. http://ir.stu.cn.ua/123456789/23429.

Full text of the source
Abstract:
Shabovta, H. S. Information and computer system for registering citizens' appeals in e-government : graduation qualification thesis : 123 "Computer Engineering" / H. S. Shabovta ; supervisor A. I. Rohovenko ; Chernihiv Polytechnic National University, Department of Information and Computer Systems. – Chernihiv, 2020. – 73 p.
The object of development is a system for registering citizens' electronic appeals in e-government. The purpose of the development was to create a simplified appeal registration system. In the course of the qualification work, an automated system was developed that allows citizens to apply to local government bodies or authorities in electronic form. During the development, existing appeal registration systems were analysed; based on the analysis of the subject area, the basic requirements for the system were formulated and its architecture was designed. The software component implements the registration of citizens' electronic appeals in e-government. The system was developed using ASP.NET Core, the Microsoft Visual Studio development environment and the Microsoft SQL Server database management system, together with the Razor Pages and Entity Framework technologies. The system runs on Windows. The work was performed in accordance with the stated requirements. Further development of the information system is possible in the direction of adding the ability to create printable reports.
APA, Harvard, Vancouver, ISO, and other styles
11

Buřil, Marek. "Vývoj business intelligence na platformě Microsoft: komparace tradičního BI a selfservice BI." Master's thesis, Vysoká škola ekonomická v Praze, 2013. http://www.nusl.cz/ntk/nusl-194709.

Full text of the source
Abstract:
The first aim of this thesis is to map the Microsoft development tools for business intelligence systems. To this end, the tools for developing BI are divided into two groups: tools for traditional BI systems and tools for self-service BI systems. The second aim of this thesis is a critical evaluation of the two approaches based on criteria and scenarios focused on selected business areas. The thesis answers the question of whether one of the approaches is a better solution for business intelligence systems in general, or whether each approach is more suitable for business intelligence solutions in certain business areas.
APA, Harvard, Vancouver, ISO, and other styles
12

Řezníček, Jiří. "Analýza platformy SAP BI a porovnání vybraných nástrojů s řešením MS SQL Server BI." Master's thesis, Vysoká škola ekonomická v Praze, 2008. http://www.nusl.cz/ntk/nusl-12312.

Full text of the source
Abstract:
This diploma thesis deals with an analysis of the SAP Business Intelligence platform (SAP NetWeaver BI 7.0) and compares its chosen tools with Microsoft SQL Server BI. The main aim of the thesis is to evaluate the SAP BI system in a comprehensive way, with extra attention paid to certain areas such as the data warehouse. One of the goals is the comparison of approaches and methods for building the data warehouse with the MS SQL Server BI solution. The thesis begins with a theoretical insight into Business Intelligence, focused on data warehouse architecture and OLAP technologies. The role of SAP in the BI market is then evaluated, followed by an introduction to SAP NetWeaver, an application and integration platform. Subsequent sections analyse the data warehouse architecture, methods of storing data, the ETL process and other selected functions and tools (including presentation tools). A sample application in the final part gives the reader a better understanding of the different ETL approaches, which are analysed in detail and evaluated. A comparison of OLAP technologies and of the basic reporting tools concludes the thesis.
APA, Harvard, Vancouver, ISO, and other styles
13

Veselý, Tomáš. "Vývoj a provoz PHP aplikací v prostředí Microsoft Azure Platform." Master's thesis, Vysoká škola ekonomická v Praze, 2010. http://www.nusl.cz/ntk/nusl-124738.

Full text of the source
Abstract:
Windows Azure Platform is a Microsoft product intended for hosting web applications. With this product, Microsoft is reacting to the new trend in IT: cloud computing. This diploma thesis deals with all the aspects of developing and running PHP applications on Windows Azure Platform. The thesis is divided into two parts, theoretical and practical. In the theoretical part, Windows Azure Platform is described in detail: its main components, its position in the world of cloud computing, its price and its main competitors. In the practical part, the migration of a school PHP application developed during the 4IT445 course from the LAMP platform to Windows Azure Platform is described. Besides the migration itself, the benefits the new platform brings to the application are described. In the conclusion, developing and running applications on both platforms is compared. The goal of the thesis is to write the first complete Czech text about running PHP applications on Windows Azure Platform and to help PHP developers choose which way to go.
APA, Harvard, Vancouver, ISO, and other styles
14

Wheeler, Ryan. "BlindCanSeeQL: Improved Blind SQL Injection For DB Schema Discovery Using A Predictive Dictionary From Web Scraped Word Based Lists." Scholar Commons, 2015. http://scholarcommons.usf.edu/etd/6050.

Full text of the source
Abstract:
SQL injections are still a prominent threat on the web. Using a custom-built tool, BlindCanSeeQL (BCSQL), we explore how to automate blind SQL attacks to discover a database schema using fewer requests than the standard methods, thus helping to avoid the detection that comes from overloading a server with hits. The tool uses a web crawler to discover keywords that assist with autocompleting schema object names, along with improvements in ASCII bisection to lower the number of requests sent to the server. Along with this tool, we discuss ways to prevent and protect against such attacks.
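
The prevention the abstract mentions usually comes down to keeping untrusted input out of the query text. As a hedged illustration on the SQL Server side, dynamic SQL can be parameterised with sp_executesql; the table and column names here are assumptions for illustration only.

    -- Illustrative defence: binding input as a parameter instead of concatenating it
    -- removes the injection point that blind SQL injection tools probe.
    DECLARE @user nvarchar(128) = N'alice';   -- untrusted input
    EXEC sp_executesql
         N'SELECT * FROM dbo.Users WHERE UserName = @name',
         N'@name nvarchar(128)',
         @name = @user;
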
APA, Harvard, Vancouver, ISO, and other styles
15

Kocfelda, Jan. "Implementace Business Intelligence řešení ve společnosti TN Biofaktory." Master's thesis, Vysoká škola ekonomická v Praze, 2013. http://www.nusl.cz/ntk/nusl-198430.

Full text of the source
Abstract:
The thesis describes the implementation of a Business Intelligence solution in a mid-sized company which produces and sells animal nutrition products. Currently the company uses basic tools for reporting, mostly Excel. Considering the data growth, that solution is close to no longer being sufficient. In the first part of the thesis, the Business Intelligence principles are introduced, so that the reader has enough knowledge to understand data processing, transformation and storage. The basics of data visualization used for decision making in the company are introduced as well, and the chapter ends with a list of general rules for creating a dashboard. The second part describes the Trouw Nutrition Biofaktory s.r.o. company, where the implementation of the new Business Intelligence solution is going to take place; the reader will learn about the company's size, structure and history. The last part contains a new procedure for processing data for reporting purposes. The design includes loading data into an MS SQL 2012 database server and further processing it for analytical purposes. In addition, an OLAP cube is created, as well as three monthly dashboards.
APA, Harvard, Vancouver, ISO, and other styles
16

Poti, Allison Tamara S. "Building a multi-tier enterprise system utilizing visual Basic, MTS, ASP, and MS SQL." Virtual Press, 2001. http://liblink.bsu.edu/uhtbin/catkey/1221293.

Full text of the source
Abstract:
Multi-tier enterprise systems consist of more than two distributed tiers. The design of multi-tier systems is considerably more involved than that of two-tier systems. Not all systems should be designed as multi-tier, but if the decision to build a multi-tier system is made, there are benefits to this type of system design. CSCources is a system that tracks computer science course information. The requirements of this system indicate that it should be a multi-tier system, with three tiers: client, business and data. Microsoft tools are used, such as Visual Basic (VB), which was used to build the client tier that physically resides on the client machine. VB is also used to create the business tier. This tier consists of the business layer and the data layer: the business layer contains most of the business logic for the system, and the data layer communicates with the data tier. Microsoft SQL Server (MS SQL) is used for the data store. The database contains several tables and stored procedures; the stored procedures are used to add, edit, update and delete records in the database. Microsoft Transaction Server (MTS) is used to control modifications to the database, and the transaction and security features available in the MTS environment are used. The business tier and data tier may or may not reside on the same physical computer or server. An Active Server Pages (ASP) front end was built that accesses the business tier to retrieve the information needed for display on a web page. The cost of designing a distributed system, building a distributed system, upgrades to the system and error handling are examined.
Ball State University, Muncie, IN 47306
Department of Computer Science
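
As a rough idea of the data-tier procedures described above, the following sketch shows what an "add record" stored procedure might look like; the table and column names are hypothetical, not taken from the CSCources system.

    -- Hypothetical insert procedure of the kind called from the business tier.
    CREATE PROCEDURE dbo.AddCourse
        @CourseNumber nvarchar(20),
        @Title        nvarchar(200)
    AS
    BEGIN
        SET NOCOUNT ON;
        INSERT INTO dbo.Course (CourseNumber, Title)
        VALUES (@CourseNumber, @Title);
    END;
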
APA, Harvard, Vancouver, ISO, and other styles
17

Таран, Д. І. "Автоматизація кадрового обліку підприємства". Master's thesis, Сумський державний університет, 2018. http://essuir.sumdu.edu.ua/handle/123456789/69497.

Full text of the source
Abstract:
The thesis examines the company's personnel record-keeping activities, identifies its business processes, analyses the existing information systems for automating the work of the HR department, formulates the requirements for the system to be created, develops a design and a prototype of the automated system, and determines the economic effect of implementing the system.
APA, Harvard, Vancouver, ISO, and other styles
18

Počatko, Boris. "Dynamický definovatelný dashboard." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2012. http://www.nusl.cz/ntk/nusl-236436.

Full text of the source
Abstract:
This thesis deals with the design and implementation of a dynamic user-definable dashboard. The user will be able to define conditions dynamically, which will filter out and save only the data he needs. The application supports changing the condition definitions and displaying the graphs after they have been created. The implementations currently available on the internet are usually solutions designed to fit only one type of project rather than to meet general guidelines for a dashboard. The dashboard is designed to cooperate smoothly with high-load databases so that it does not slow down the overall solution.
APA, Harvard, Vancouver, ISO, and other styles
19

Lučan, Martin. "Analýza reportingu v podnikovém prostředí a jeho technologické pokrytí Microsoft BI." Master's thesis, Vysoká škola ekonomická v Praze, 2015. http://www.nusl.cz/ntk/nusl-261817.

Full text of the source
Abstract:
This master's thesis deals with an analysis of reporting in business and its technological coverage by the Microsoft Business Intelligence portfolio. The main objective of this work is to analyze individual organizations' requirements for reporting and the options for covering them; the work can serve as a tool for implementing reporting in a company or for increasing the efficiency of existing reporting. The first part is theoretical: it deals with Business Intelligence as an environment for reporting and defines the basic concepts. The next part covers the area of reporting. This section provides an insight into its history, defines the outputs and classifies reporting in detail from different perspectives. Furthermore, reporting users and reporting standards in companies are defined, and the chapter closes with the benefits of reporting for companies. The main part of this thesis is an analysis of the requirements for reporting in companies. The chapter defines five key perspectives, which are described in greater detail, and discusses how companies should methodically approach these requirements. It further looks at the requirements for a concrete report; the output of this chapter is a template for gathering requirements for a specific report. The last chapter focuses on an analysis of Microsoft's reporting portfolio and defines the Microsoft concept. It provides detailed information about the reporting products Microsoft offers and maps the individual characteristics of the products to the defined requirements.
APA, Harvard, Vancouver, ISO, and other styles
20

Rajnoch, Adam. "Implementace podpory studia v nástroji IBM Cognos." Master's thesis, Vysoká škola ekonomická v Praze, 2012. http://www.nusl.cz/ntk/nusl-150098.

Full text of the source
Abstract:
In recent years, Business Intelligence has been one of the fastest growing software sectors according to foreign analyses. More and more companies realize the importance of tools for Business Intelligence and dashboarding, which is why this topic was chosen for the thesis. The main part of the thesis focuses on study support for the subject 4IT314 Enterprise Information Systems. This course teaches students how to work with the ERP system Microsoft Dynamics AX. The main shortcoming of the course is the lack of feedback to students; therefore it was decided to integrate a BI tool that will be used for dashboarding over the Microsoft Dynamics AX database. The theoretical part explains the basic terms essential to understanding BI and dashboarding. On the basis of defined criteria, a BI tool was chosen for integration into the subject 4IT314. The practical part of the thesis describes the implementation of this tool and the user documentation created for it, which serves as study material for students.
APA, Harvard, Vancouver, ISO, and other styles
21

Blomberg, John, and Sebastian Östlund. "Optimering av databasinformation." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-52696.

Full text of the source
Abstract:
The purpose of this degree project is to streamline and restructure the procedures that handle the rule set in Snow Software's database. It does not aim to evaluate or modify the algorithms already built into Microsoft SQL Server. Several prototypes were produced and evaluated; these were not specific to this particular database and may therefore be of interest in other contexts as well. The prototype that proved most efficient was implemented so that it could easily be put into use in the current system. The prototype gave an approximate improvement in search time of 18.8%.
APA, Harvard, Vancouver, ISO, and other styles
22

Houssein, Hatem. "Are ORMs the end of stored procedures?" Thesis, Linnéuniversitetet, Institutionen för datavetenskap (DV), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-69426.

Full text of the source
Abstract:
Stored procedures are used as the current database logic for SAAB's data model of the JAS-39 Gripen fighter aircraft electrical schemas. Since the database model was developed in 2000, research and tests needed to be carried out to decide whether updating the database to today's technology is applicable. Therefore, Object-Relational Mapping (ORM) is researched, tested and compared to stored procedures using test-driven development (TDD), concerning an important factor that stored procedures are well known for: the querying performance of the database. The thesis also discusses, based on our subjective experience, how maintainability and flexibility [1] can affect the decision between keeping stored procedures or migrating to ORM. NHibernate and Entity Framework are the two ORM solutions considered, since SAAB uses C# in this project. The project was run using Scrum, an agile software development method, to maintain iterative progress throughout the project timeline. In this paper, the process and methodology are covered in detail, along with the comparison and the test results. These results eventually lead us to the answer that ORM is not a suitable technology, and stored procedures still dominate the querying performance for SAAB's current database.
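
Whichever side of the ORM versus stored procedure question one takes, the server-side cost of a call can be observed the same way; a minimal sketch, assuming a hypothetical procedure name:

    -- SQL Server reports parse/compile and execution times in the Messages output.
    SET STATISTICS TIME ON;
    EXEC dbo.GetElectricalSchema @SchemaId = 42;   -- hypothetical procedure under test
    SET STATISTICS TIME OFF;
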
APA, Harvard, Vancouver, ISO, and other styles
23

Madron, Lukáš. "Datové sklady a OLAP v prostředí MS SQL Serveru." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2008. http://www.nusl.cz/ntk/nusl-235916.

Full text of the source
Abstract:
This paper deals with data warehouses and OLAP. These technologies are defined and described here, followed by an introduction to the architecture of MS SQL Server and its tools for working with data warehouses and OLAP. The knowledge gained is used to create a sample application.
APA, Harvard, Vancouver, ISO, and other styles
24

Lebeda, Martin. "Analýza a návrh změn informačního systému MEDIMED." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2012. http://www.nusl.cz/ntk/nusl-223339.

Full text of the source
Abstract:
This work deals with modifications of the MeDiMed project, which had ceased to meet the requirements imposed on it. In the first part, the main objectives, targets and the methods used to achieve them are defined. The second part deals with the theory needed to understand the issue. The practical part, which builds on the theory, contains a SWOT analysis and a modified HOS-8 analysis of the current state, whose results are then examined. The next section presents the design of a new solution. The conclusion summarizes all the proposed solutions and the knowledge gained in this work.
APA, Harvard, Vancouver, ISO, and other styles
25

Lebeda, Martin. "Analýza a návrh změn informačního systému firmy." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2012. http://www.nusl.cz/ntk/nusl-374727.

Full text of the source
Abstract:
This work deals with modifications of the MeDiMed project, which had ceased to meet the requirements imposed on it. In the first part, the main objectives, targets and the methods used to achieve them are defined. The second part deals with the theory needed to understand the issue. The practical part, which builds on the theory, contains a SWOT analysis and a modified HOS-8 analysis of the current state, whose results are then examined. The next section presents the design of a new solution. The conclusion summarizes all the proposed solutions and the knowledge gained in this work.
APA, Harvard, Vancouver, ISO, and other styles
26

Vedral, Jakub. "KPI management - auto alerting na platformě MS SQL." Master's thesis, Vysoká škola ekonomická v Praze, 2009. http://www.nusl.cz/ntk/nusl-17180.

Full text of the source
Abstract:
This diploma thesis deals with the problem of automatic alerting on the Microsoft SQL Server 2008 platform. The thesis was elaborated with the Clever Decision s.r.o. company, which significantly contributed to the assignment. The main goal of this thesis is to provide a solution concept for the absence of alerting over multidimensional data in Microsoft's product. Alerting is aimed at critical business data: key performance indicators (KPIs). KPIs are used to deliver enhanced business performance through the Corporate Performance Management concept, and these indicators need to be watched frequently, for which an alerting solution is convenient. The solution concept is based on market research of the Business Intelligence (BI) platforms that dominate the market, examined for their alerting capabilities. A further goal is to provide an overall insight into KPI management: creating, managing, analyzing and monitoring KPIs. The thesis is divided into three sections. The first part creates the theoretical background for the solution and describes the field of KPI management. The second part consists of market research using selected criteria. The third part provides the solution concept, a web application filling the absence of alerting on the Microsoft SQL Server 2008 platform. This thesis is primarily intended for the Clever Decision company, but also for BI experts dealing with alerting problems on BI platforms. It also serves as a theoretical summary of KPI management, which is commonly neglected by the available technical literature.
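
One simple relational fallback for the missing alerting addressed here is a scheduled T-SQL check that notifies via Database Mail when a KPI falls below its target; the snapshot table, profile name and address below are assumptions for illustration only.

    -- Hedged sketch: raise an e-mail alert when a KPI drops below 90% of its target.
    IF EXISTS (SELECT 1
               FROM dbo.KpiSnapshot
               WHERE KpiName = N'Revenue'
                 AND ActualValue < TargetValue * 0.9)
        EXEC msdb.dbo.sp_send_dbmail
             @profile_name = N'Alerts',
             @recipients   = N'bi-team@example.com',
             @subject      = N'KPI below threshold';
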
APA, Harvard, Vancouver, ISO, and other styles
27

Brander, Thomas, and Christian Dakermandji. "En jämförelse mellan databashanterare med prestandatester och stora datamängder." Thesis, KTH, Data- och elektroteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-188199.

Full text of the source
Abstract:
The company Nordicstation handles large amounts of data for Swedbank, where data is stored using the relational database Microsoft SQL Server 2012 (SQL Server). The existence of other databases designed for handling large amounts of data makes it unclear whether SQL Server is the best solution for this situation. This degree project describes a comparison between databases using performance testing, with regard to the execution time of database queries. The chosen databases were SQL Server, Cassandra and NuoDB. Cassandra is a column-oriented database designed for handling large amounts of data; NuoDB is a database that uses main memory for data storage and is designed for scalability. The performance tests were executed in a virtual server environment with Windows Server 2012 R2 using an application written in Java. SQL Server was the database best suited for grouping, sorting and arithmetic operations. Cassandra had the shortest execution time for write operations, while NuoDB performed best in read operations. The project concludes that minimizing disk operations leads to shorter execution times, but that the scalable solution, NuoDB, suffers severe performance losses when configured as a single node. Nordicstation is recommended to upgrade to Microsoft SQL Server 2014, or later, because of the possibility of storing tables in main memory.
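
The recommendation to move to SQL Server 2014 or later refers to memory-optimized tables; a minimal sketch, assuming the database already has a MEMORY_OPTIMIZED_DATA filegroup and with purely illustrative object names:

    -- In-memory OLTP table; requires a memory-optimized filegroup in the database.
    CREATE TABLE dbo.Transactions
    (
        TransactionId bigint        NOT NULL PRIMARY KEY NONCLUSTERED,
        Payload       nvarchar(400) NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
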
APA, Harvard, Vancouver, ISO, and other styles
28

Matiášek, Tomáš. "Metodika provozu a údržby BI řešení." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-82004.

Full text of the source
Abstract:
This diploma thesis is focused on the operation and maintenance of BI systems based on Microsoft's SQL Server 2008 R2 platform. The work covers the regular activities related to BI system management: monitoring, performance tuning, maintenance, administration, and backup and recovery tasks. System management is based mainly on SQL Server's native management tools, along with Microsoft's System Center Essentials and System Center Data Protection Manager. The main goal of this work is to create a BI system operations and maintenance methodology in cooperation with Clever Decision, a company specialized in the BI area. Partial goals are to design a way of monitoring the components of a BI system and to outline backup and recovery possibilities, maintenance, and performance tuning of the SQL Server platform. Each chapter contains methods, best practices, summary tables and appropriate tools that help the DBA achieve the specified goals. The main contribution of this work is the operations and maintenance methodology created according to Clever Decision's needs. Among the other contributions of this thesis are the definition of the tools used for monitoring and maintenance and their relation to SQL Server's components, and a simple backup and performance tuning methodology for SQL Server. Finally, this diploma thesis represents a comprehensive resource strictly focused on the operation and maintenance of OLAP databases.
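
Among the maintenance tasks listed, backup is the most easily illustrated; a hedged sketch of a full plus log backup pair, with the database name and paths as placeholders:

    -- Full database backup followed by a transaction log backup.
    BACKUP DATABASE SalesDW
        TO DISK = N'D:\Backups\SalesDW_full.bak'
        WITH CHECKSUM, COMPRESSION;

    BACKUP LOG SalesDW
        TO DISK = N'D:\Backups\SalesDW_log.trn'
        WITH CHECKSUM;
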
APA, Harvard, Vancouver, ISO, and other styles
29

Kačur, Andrej. "Zavedenie BI riešenia do firmy pôsobiacej na finančných trhoch." Master's thesis, Vysoká škola ekonomická v Praze, 2015. http://www.nusl.cz/ntk/nusl-203873.

Full text of the source
Abstract:
This diploma thesis is focused on designing a prototype Business Intelligence solution for a company operating in financial markets. The company does not possess a full-scale system for measuring enterprise performance and performs this task by means of manual data collection; the current approach is not sufficient to cover the growing number of the company's dealings. The theoretical part of the thesis introduces the basic principles of Enterprise Performance Management, the Balanced Scorecard and, subsequently, Business Intelligence. The practical part covers the strategic goals of the company defined using the Balanced Scorecard method. The identified strategic goals are transposed into a strategy map, and metrics for measuring these goals are assigned to them. Only some of the goals are chosen for the implementation of the Business Intelligence solution. Source data for assessing the achieved values are defined for the given indicators. Data from the data source were loaded into the data hub and, subsequently, an OLAP cube connected to a reporting tool for displaying the calculated metrics was created.
APA, Harvard, Vancouver, ISO, and other styles
30

Björkman, Lucas, and Jimmy Erixon. "Produktkonfigurator." Thesis, Jönköping University, JTH, Computer and Electrical Engineering, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-12960.

Full text of the source
Abstract:

The report presents the development of a product configurator and several proposals for how it can be improved. The assignment was given by System Andersson to two students at Jönköping School of Engineering, who carried it out as their degree project.

The report discusses possible solutions for a product configurator and presents a result based on these solutions. The product configurator described in the report consists of two parts.

The first is the configuration part, which helps the user create new products from selected articles. The articles are sorted automatically by the product configurator so that only compatible articles can be combined, using a drag-and-drop interface.

The product configurator also contains a compatibility part, whose purpose is to simplify the definition of compatibility between articles. This can be done continuously while a product is being configured, if desired. The user can select two or more articles, or entire categories, that should be compatible with one another.

The report concludes by discussing the choices made during the work and possible improvements that could be made to the product configurator.

APA, Harvard, Vancouver, ISO, and other styles
31

Stahr, Michael C. "DESIGN AND IMPLEMENTATION OF AN EMPLOYEE JOB SCHEDULING AND PROGRESSION TRACKING SYSTEM." Miami University / OhioLINK, 2002. http://rave.ohiolink.edu/etdc/view?acc_num=miami1038847285.

Full text of the source
APA, Harvard, Vancouver, ISO, and other styles
32

Johansson, Tony. "Modernisering av webbaserat användargränssnitt för Skärblacka Bruk." Thesis, Mittuniversitetet, Institutionen för informationssystem och –teknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-42148.

Full text of the source
Abstract:
My independent project has been to modernize the user interface of a 10-year-old MVC Framework application used by the Skärblacka mill within the BillerudKorsnäs Group. Modernization here also means raising the application's quality: replacing older techniques with new, modern ones considered more relevant for their time, such as being able to click on the text of a checkbox. The framework chosen for the implementation was ASP.NET Core MVC, which is open source and runs on the most common platforms. The port meant that well-functioning server code was largely retained, and the JavaScript is also essentially unchanged except in a few places. To get a good and durable structure, the application is layered into three layers: Controller, BusinessLayer and Repository. The SQL Server database is basically the same except that Identity has been introduced. To communicate with the database, EF Core, a slimmed-down version of Entity Framework, was chosen as the ORM. A lot of what exists in the old MVC Framework has been redesigned because it is not supported in Core MVC. The application consists of an assembly with a logical development tree made up of the files included in the application. Ajax has been used to get smooth, desktop-like updates. The complexity of the application, with a number of complicated parts, meant that the work took more time than planned.
APA, Harvard, Vancouver, ISO, and other styles
33

Látal, Pavel. "IS – aplikace pro servisní management." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2010. http://www.nusl.cz/ntk/nusl-229004.

Full text of the source
Abstract:
This diploma thesis deals with electronic information system for recording and tracking service calls on pressing and cutting tools, the injection molds and assembly machines and assembly lines in manufacturing plant of Lear Corporation Czech Republic, s. r. o. in Vyškov.
APA, Harvard, Vancouver, ISO, and other styles
34

Fabian, Jaroslav. "Využití technik Data Mining v různých odvětvích." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2014. http://www.nusl.cz/ntk/nusl-224335.

Full text of the source
Abstract:
This master's thesis concerns the use of data mining techniques in the banking, insurance and shopping centre industries. The thesis theoretically describes algorithms and the CRISP-DM methodology dedicated to data mining processes. Using this theoretical knowledge and these methods, the thesis suggests possible business intelligence solutions for the various industries.
APA, Harvard, Vancouver, ISO, and other styles
35

Cimbaľák, Michal. "Metodika řešení transformačních úloh v BI (ETL)." Master's thesis, Vysoká škola ekonomická v Praze, 2010. http://www.nusl.cz/ntk/nusl-72511.

Full text of the source
Abstract:
An Extract, Transform, and Load (ETL) system is responsible for the extraction of data from several sources, and for its cleansing, conforming and insertion into a data warehouse. The thesis is focused on building an ETL methodology for a better understanding of the ETL process and its application in a real environment. I worked on the "BI Cookbook" project with Clever Decision, a software consulting company that specializes in Business Intelligence; the aim was to create the ETL part of the BI methodology. This methodology has been designed to provide easy-to-use, flexible and extensible material for building an ETL solution. The main goal of this work is to introduce the created ETL methodology, all of its components and the way in which it can be used. A further goal is to discuss the options for its implementation and modification for a wide range of uses. The main contribution of this work is to provide ideas for creating and implementing an ETL methodology, or for creating a methodology in general. The thesis is divided into three parts. The first, theoretical part deals with the basic definition, architecture and implementation of an ETL system. The second part presents the proposed methodology. The last part covers the possible options for implementing and modifying the methodology.
APA, Harvard, Vancouver, ISO, and other styles
36

Stríž, Rostislav. "Dolování periodických vzorů." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2012. http://www.nusl.cz/ntk/nusl-236473.

Full text of the source
Abstract:
Data collection and analysis are commonly used techniques in many sectors of today's business and science. The process called Knowledge Discovery in Databases presents itself as a great tool for finding new and interesting information that can be used in future development. This thesis deals with the basic principles of data mining and temporal data mining, as well as with the specifics of a concrete implementation of chosen algorithms for mining periodic patterns in time series. These algorithms have been developed in the form of managed plug-ins for Microsoft Analysis Services, the service that provides data mining features for Microsoft SQL Server. Finally, we discuss the results of experiments focused on the time complexity of the implemented algorithms.
APA, Harvard, Vancouver, ISO, and other styles
37

Uhlíř, Radek. "Měření výkonnosti obchodníků v softwarové společnosti." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-81966.

Full text of the source
Abstract:
This thesis is focused on a survey of available resources about applicable methods and approaches to the implementation of Business Intelligence in the sales department of a mid-size local company focused on information technologies. The solution is aimed at controlling and performance measurement. Local specifics and the company culture of innovation and creativeness are considered throughout the whole work. The next part of the work is an analysis of the company environment and of an implementation process appropriate to the company's current maturity level. The goal is to define an appropriate set of indicators for measuring the performance of sales representatives, so as to reflect reality and allow relative comparison of individuals. This solution is applied in the specific company; as a result, issues with the proposal are identified and recommendations for future modifications are suggested. The work is based on research of the available resources and identification of best-practice methods for the design and implementation of such a system. The next part contains a detailed analysis of the company, an application of the conclusions of the theoretical part, and a suggestion for an optimal process of successful adoption. As a result, the structure of metrics has been built, and it was verified that a detailed analysis is required for a relevant definition of the project scope, identification of risks and preparation of a realistic schedule. It has also been verified that the implementation of a performance measurement system requires complex changes in company culture and close coordination with other triggered changes in workflow and in the quality of the data recorded in the information systems.
APA, Harvard, Vancouver, ISO, and other styles
38

Pettersson, Jonnie. "Teknik för en flerskiktadwebbapplikation." Thesis, Högskolan Dalarna, Informatik, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:du-5055.

Full text of the source
Abstract:
The report analyses whether some common problems can be avoided by using modern technology. As a reference system, "Fartygsrapporteringssystemet" is used: an n-tier web application built with technology that was modern at the time, 2003-2004. The aim is to examine whether ASP.NET MVC, Windows Communication Foundation, Workflow Foundation and SQL Server 2005 Service Broker can be used to create an n-tier web application which also communicates with other systems and facilitates automated testing. The report describes the construction of a prototype in which the presentation layer uses ASP.NET MVC to separate presentation and business logic. Communication with the business layer is done through Windows Communication Foundation. Hard-coded processes are broken out and handled by Workflow Foundation. Asynchronous communication with other systems is done by using Microsoft SQL Server 2005 Service Broker. The conclusion of the analysis is that these techniques can be used to create an n-tier web application, but that ASP.NET MVC, which at the time was only available in a preview release, was not yet sufficiently mature.
APA, Harvard, Vancouver, ISO, and other styles
39

Hanuš, Václav. "Využití principů business intelligence v dotazníkových šetřeních." Master's thesis, Vysoká škola ekonomická v Praze, 2010. http://www.nusl.cz/ntk/nusl-72526.

Full text of the source
Abstract:
This thesis is oriented towards the practical use of tools for data mining and business intelligence. The main goals are to process the source data into a suitable form and to test a chosen tool on test cases. As input data I used a database created by processing questionnaires from research verifying the level of IT and economics knowledge among Czech universities. These data were transformed into a form that allows them to be processed by the data mining tools included in Microsoft SQL Server 2008. I chose two cases to verify the potential of these tools. The first case was focused on clustering using the Microsoft Clustering algorithm. The main task was to sort the universities into clusters by comparing their attributes, namely the numbers of credits in each knowledge group. I had to deal with two problems: it was necessary to reduce the number of subject groups, otherwise there was a danger of creating too many clusters that could not be meaningfully named, and the groups carried unequal credit values, which in turn caused a problem with their weights. The solution was in the end quite simple: I merged similar groups into larger, more general categories, and for the unequal values I used a parameter for each new group and transformed it to a 0-5 scale. The second case was focused on a prediction task using the Microsoft Logistic Regression algorithm and the Microsoft Neural Network algorithm. Here the goal was to predict the number of currently studying students. I had historical data from the years 2001-2009; a predictive model was built on them, and I could compare the prediction with the real data. In this case it was also necessary to transform the source data, otherwise they could not be processed by the tested tool: the original data were placed in a view instead of a table and contained not only the desired objects but several variants of them, for example divided by sex. The solution was to create a new table in the database containing only the objects relevant to the test case. A last problem came up when I tried to use the prediction model to predict data for the year 2010, for which there were no real data in the table; the software reported an error and could not make the prediction. During my search of Microsoft technical support I found threads referring to a similar problem, so it is possible that this is a product defect that will be fixed in a forthcoming update. Completing these cases gave me enough clues to assess the abilities of these Microsoft tools. After my earlier school experience with the data mining tools from IBM (formerly SPSS) and SAS, I can judge whether the tested tools can match the software from the major data mining suppliers on the market and whether they can be used for serious deployment.
APA, Harvard, Vancouver, ISO, and other styles
40

Třetina, Jan. "Analýza a optimalizace databázových systémů." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2008. http://www.nusl.cz/ntk/nusl-235927.

Повний текст джерела
Анотація:
As demands on the speed and availability of information technologies increase, the process of optimization gains more and more importance. Whether it concerns search engine optimization, operating system optimization or application (source code) optimization, the goal is a faster, smaller and more maintainable solution. In my thesis I deal with the optimization of database systems, which includes low-level database tuning (the physical organization of data and indices), database management system tuning and query optimization. I focused on the optimization of Microsoft SQL Server 2005 in an enterprise environment.
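Query optimization work of this kind usually starts from measurements rather than guesses. A minimal T-SQL sketch for inspecting I/O, timing and the cached execution plans on SQL Server 2005 follows; the queried table and columns are placeholders.

    -- Measure logical reads and CPU/elapsed time for one query.
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    SELECT OrderId, CustomerId, OrderDate    -- placeholder table/columns
    FROM dbo.Orders
    WHERE CustomerId = 42;

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;

    -- Cached plans and their aggregate cost, available since SQL Server 2005.
    SELECT TOP (10)
        qs.total_logical_reads,
        qs.total_worker_time,
        qs.execution_count,
        qp.query_plan
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
    ORDER BY qs.total_logical_reads DESC;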
Стилі APA, Harvard, Vancouver, ISO та ін.
41

Zahradník, Jan. "Klient pro zobrazování OLAP kostek." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2009. http://www.nusl.cz/ntk/nusl-412816.

Повний текст джерела
Анотація:
The purpose of this work is an analytical tool supporting managerial decision making. The goal was to develop a reporting system that makes it easier for the company's customers to find their way around key performance indicators. The implementation environment is the .NET Framework 3.5 with the C# language and Microsoft SQL Server 2008 with the Analysis Services extension; ASP.NET is used for the web interface.
Стилі APA, Harvard, Vancouver, ISO та ін.
42

Kukačka, Pavel. "Master Data Management a jeho využití v praxi." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-73506.

Повний текст джерела
Анотація:
This thesis deals with Master Data Management (MDM), specifically its implementation. The main objectives are to analyze and capture the general approaches to MDM implementation, including best practices; to describe and evaluate an MDM project implemented with Microsoft SQL Server 2008 R2 Master Data Services (MDS) in the Czech environment; and, on the basis of the above theoretical background, the experience from the implemented project and the available technical literature, to create a general procedure for implementing the MDS tool. To achieve these objectives the following procedures are used: study of information resources (printed and electronic sources, and personal meetings with consultants of Clever Decision), cooperation on a project realized by Clever Decision, and analysis of the Microsoft SQL Server 2008 R2 Master Data Services tool. The contributions of this work largely mirror its goals; the main contribution is the creation of a general procedure for implementing the MDS tool. The thesis is divided into two parts. The first (theoretically oriented) part deals with basic concepts (including how MDM is delimited against other systems), architecture, implementation styles, market trends and best practices. The second (practically oriented) part first deals with the implementation of the realized MDS project and then describes a general procedure for implementing the MDS tool.
Стилі APA, Harvard, Vancouver, ISO та ін.
43

Pohl, Ondřej. "Analýza veřejně dostupných dat Českého statistického úřadu." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2017. http://www.nusl.cz/ntk/nusl-363884.

Повний текст джерела
Анотація:
The aim of this thesis is the analysis of Czech Statistical Office data concerning foreign trade. First, the reader is familiarized with Business Intelligence and data warehousing. Next, the basics of OLAP analysis and data mining are explained. The following parts of the thesis describe and analyse the foreign trade data with the help of OLAP technology and data mining in MS SQL Server, including the implementation of selected analytical tasks.
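OLAP-style aggregation over such trade data can be prototyped on the relational engine before a cube is built. The T-SQL sketch below uses GROUP BY ROLLUP to produce subtotal rows per year and partner country; the ForeignTrade table and its columns are hypothetical names, not the Czech Statistical Office schema.

    -- Hypothetical fact table of foreign trade: one row per shipment record.
    -- ROLLUP yields totals per (year, country), per year, and a grand total,
    -- i.e. the kind of aggregation an OLAP cube would precompute.
    SELECT
        ft.TradeYear,
        ft.PartnerCountry,
        SUM(ft.ExportValueCZK) AS TotalExports,
        SUM(ft.ImportValueCZK) AS TotalImports
    FROM dbo.ForeignTrade AS ft
    GROUP BY ROLLUP (ft.TradeYear, ft.PartnerCountry)
    ORDER BY ft.TradeYear, ft.PartnerCountry;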
Стилі APA, Harvard, Vancouver, ISO та ін.
44

Al, King Raddad. "Localisation de sources de données et optimisation de requêtes réparties en environnement pair-à-pair." Toulouse 3, 2010. http://thesesups.ups-tlse.fr/912/.

Повний текст джерела
Анотація:
Despite their great success in the file-sharing domain, P2P systems support only simple queries, usually based on looking up a file by its name. Recently, several research works have been carried out to extend P2P systems so that they can share data with a fine granularity (i.e. atomic attributes) and process queries written in a highly expressive language (i.e. SQL). The characteristics of P2P systems (e.g. large scale, node autonomy and instability) make it impractical to have a global catalog, which usually stores information about data, schemas and data source hosts. Because of the absence of a global catalog, two problems become more difficult: (i) locating data sources while taking schema heterogeneity into account, and (ii) query optimization. In this thesis we propose an approach for processing SQL queries in a P2P environment. To solve the semantic heterogeneity between local schemas, the approach is based on a domain ontology and on similarity formulas. The structural heterogeneity of local schemas is solved by extending a query routing method (the Chord protocol) with Structure Indexes. Concerning the query optimization problem, we propose to take advantage of the data source localization phase to obtain all the metadata required for generating a close-to-optimal execution plan. Finally, in order to show the feasibility and validity of these propositions, we carry out performance evaluations and discuss the obtained results.
Стилі APA, Harvard, Vancouver, ISO та ін.
45

Buzo, Amir. "Intelligent Data Layer: : An approach to generating data layer from normalized database model." Thesis, Linnéuniversitetet, Institutionen för datavetenskap, fysik och matematik, DFM, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-22170.

Повний текст джерела
Анотація:
The Model View Controller (MVC) software architecture is widespread and commonly used in application development, so generating the data layer from the database model can reduce cost and time. Research on current Object Relational Mapping (ORM) tools showed that generating tools such as Data Access Object (DAO) generators and Hibernate exist, but their use causes problems such as inefficiency and slow performance due to the many database connections and the set-up time. Most of these tools try to solve specific problems rather than generating a data layer, which is an important component and the bottom layer of database-centred applications. The proposed solution is an engineering approach in which we designed a tool named Generated Intelligent Data Layer (GIDL). The GIDL tool generates small models which form the main data layer of the system according to the database model. The goal of the tool is to allow software developers to work only with objects, without deep knowledge of SQL. The tool handles transactions and commits, and filter objects are constructed for filtering the database. GIDL reduces the number of connections and also has a cache in which object lists are stored and modified. The tool was compared with Hibernate in the same environment and showed better performance in terms of execution time for the same functions. The GIDL tool is beneficial for software developers because it generates the entire data layer.
Стилі APA, Harvard, Vancouver, ISO та ін.
46

Jakubičková, Nela. "Návrh metodiky testování BI řešení." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-114404.

Повний текст джерела
Анотація:
This thesis deals with Business Intelligence and its testing. It seeks to highlight the differences from classical software testing and, finally, to design a methodology for testing BI solutions that could be used in practice on real projects of BI companies. The aim of the thesis is to design a methodology for testing BI solutions based on theoretical knowledge of Business Intelligence and software testing, with an emphasis on the specific characteristics and requirements of BI and in accordance with Clever Decision's requirements, and to test it in practice on a real project in this company. The paper is based on a study of the literature in the fields of Business Intelligence and software testing from Czech and foreign sources, as well as on the recommendations and experience of Clever Decision's employees. It is one of the few, if not the first, sources in the Czech language dealing with a methodology for testing BI solutions, and it could also serve as a basis for more comprehensive methodologies. The thesis is divided into a theoretical and a practical part. The theoretical part explains the purpose of using Business Intelligence in enterprises, elucidates the particular components of a BI solution, and then covers software testing itself and the various types of tests, with emphasis on the differences and specifics of Business Intelligence. The theoretical part is followed by the designed methodology for testing BI solutions, using a generic model for BI/DW solution testing. The highlight of the practical part is the description of testing a real BI project in Clever Decision according to the designed methodology.
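One common family of BI tests such a methodology has to cover is data reconciliation between the source system and the data warehouse. A minimal T-SQL sketch is shown below; the source and warehouse table names are illustrative assumptions, not taken from the thesis.

    -- Reconciliation check: row counts and amount totals must match between
    -- the staging source and the loaded fact table (illustrative names).
    SELECT
        src.RowCnt       AS SourceRows,
        dw.RowCnt        AS WarehouseRows,
        src.AmountTotal  AS SourceAmount,
        dw.AmountTotal   AS WarehouseAmount,
        CASE WHEN src.RowCnt = dw.RowCnt
              AND src.AmountTotal = dw.AmountTotal
             THEN 'PASS' ELSE 'FAIL' END AS TestResult
    FROM (SELECT COUNT(*) AS RowCnt, SUM(Amount) AS AmountTotal
          FROM staging.SalesOrders) AS src
    CROSS JOIN
         (SELECT COUNT(*) AS RowCnt, SUM(SalesAmount) AS AmountTotal
          FROM dw.FactSales) AS dw;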
Стилі APA, Harvard, Vancouver, ISO та ін.
47

Muller, Leslie. "'n Ondersoek na en bydraes tot navraaghantering en -optimering deur databasisbestuurstelsels / L. Muller." Thesis, North-West University, 2006. http://hdl.handle.net/10394/1181.

Повний текст джерела
Анотація:
The problems associated with the effective design and use of databases are increasing. The information contained in a database is becoming more complex and the size of the data is causing space problems. Technology must continually develop to accommodate this growing need. An inquiry was conducted in order to find effective guidelines that could support queries in general in terms of performance and productivity. Two database management systems were researched to compare the theoretical aspects with the techniques implemented in practice. Microsoft SQL Server and MySQL were chosen as the candidates and both were put under close scrutiny. The systems were researched to uncover the methods employed by each to manage queries. The query optimizer forms the basis of each of these systems and manages the parsing and execution of any query. The methods employed by each system for storing data were researched, as was the way each system manages table joins, uses indices and chooses optimal execution plans. Adjusted algorithms were introduced for various index structures such as B+ trees and hash indexes. Guidelines were compiled that are independent of the database management systems and help to optimize relational databases. Practical implementations of queries were used to acquire and analyse the execution plan for both MySQL and SQL Server. This plan, along with a few other variables such as execution time, is discussed for each system. A model is used for both database management systems in this experiment.
Thesis (M.Sc. (Computer Science))--North-West University, Potchefstroom Campus, 2007.
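Index choice is one of the DBMS-independent guidelines such an inquiry typically produces. The T-SQL sketch below creates a covering nonclustered index and captures the optimizer's estimated plan for a query it should serve; the Invoices table is a placeholder, not taken from the thesis.

    -- Covering nonclustered index: the query below can be answered from the
    -- index alone, avoiding lookups into the base table (placeholder schema).
    CREATE NONCLUSTERED INDEX IX_Invoices_Customer_Date
        ON dbo.Invoices (CustomerId, InvoiceDate)
        INCLUDE (TotalAmount);
    GO
    -- Return the estimated plan as XML instead of executing the query,
    -- so the effect of the index on the chosen plan can be compared.
    SET SHOWPLAN_XML ON;
    GO
    SELECT CustomerId, InvoiceDate, TotalAmount
    FROM dbo.Invoices
    WHERE CustomerId = 42
      AND InvoiceDate >= '2006-01-01';
    GO
    SET SHOWPLAN_XML OFF;
    GO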
Стилі APA, Harvard, Vancouver, ISO та ін.
48

Gabriš, Ondrej. "Software Projects Risk Management Support Tool." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2011. http://www.nusl.cz/ntk/nusl-412827.

Повний текст джерела
Анотація:
Project and risk management is currently a developing discipline that is gaining ever more attention and practical application. This thesis gives an introduction to the field of risk management and examines methods for identifying, evaluating and managing risks, preventing their consequences and coping with them. The next part of the thesis analyses samples of risks from real projects, describes methods for identifying and evaluating the consequences of risks in the early phases of a software project, and also describes risk attributes and proposes a way of documenting them. In the final part, a prototype of a model application for supporting risk management in software projects was designed and implemented.
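A risk-register table is the natural persistence model for such a support tool. The T-SQL sketch below is a minimal illustrative schema, assuming the common probability-times-impact exposure score; none of the names come from the thesis.

    -- Minimal risk register (illustrative): exposure is derived from
    -- probability and impact, a common scoring scheme in risk management.
    CREATE TABLE dbo.ProjectRisk (
        RiskId         INT IDENTITY(1,1) PRIMARY KEY,
        ProjectId      INT            NOT NULL,
        Title          NVARCHAR(200)  NOT NULL,
        Probability    DECIMAL(3,2)   NOT NULL CHECK (Probability BETWEEN 0 AND 1),
        Impact         TINYINT        NOT NULL CHECK (Impact BETWEEN 1 AND 5),
        Exposure       AS (Probability * Impact) PERSISTED,
        MitigationPlan NVARCHAR(1000) NULL,
        IdentifiedOn   DATE           NOT NULL DEFAULT (GETDATE())
    );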
Стилі APA, Harvard, Vancouver, ISO та ін.
49

Slunský, Stanislav. "Webová aplikace pro vizualizaci analýz podnikových informací." Master's thesis, 2019. http://www.nusl.cz/ntk/nusl-429001.

Повний текст джерела
Анотація:
This master's thesis focuses on the design and implementation of a web application for a company that shows users relevant data in prearranged dashboards using visual elements such as charts, information cards or tables, depending on the role of the logged-in user. The data used by the application are obtained from the QI system, where they are transformed into multidimensional cubes that are automatically uploaded to a Microsoft SQL Server Analysis Services server.
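A dashboard backed by such cubes can also be reached from the relational engine through a linked server, which keeps the application's data access in plain T-SQL. The sketch below is illustrative only; the linked server name, cube and MDX members are assumptions, not the QI system's actual model.

    -- Register the Analysis Services instance as a linked server (names assumed).
    EXEC sp_addlinkedserver
        @server     = N'SSAS_QI',
        @srvproduct = N'',
        @provider   = N'MSOLAP',
        @datasrc    = N'localhost',
        @catalog    = N'QICubes';

    -- Pass an MDX query through to the cube and read the result as a rowset.
    SELECT *
    FROM OPENQUERY(SSAS_QI,
        'SELECT [Measures].[Sales Amount] ON COLUMNS,
                [Date].[Calendar Year].MEMBERS ON ROWS
         FROM [Sales]');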
Стилі APA, Harvard, Vancouver, ISO та ін.
50

Syrový, Jiří. "Vývoj a implementace webové aplikace s podporou notace IFML." Master's thesis, 2015. http://www.nusl.cz/ntk/nusl-189543.

Повний текст джерела
Анотація:
Syrový, J. Development and implementation of web applications with the support of IFML notation. Diploma thesis. Brno: Mendel University in Brno, 2015. This diploma thesis deals with the development and implementation of web applications supported by the IFML notation. Processes in the sales department of a medium-sized company, focused on electronic auctions, are modelled with UML 2.0 and BPMN 2.0 notation using the CASE tool Enterprise Architect. The complete web application development process is then carried out across the product's life cycle using the IFML modelling language and the CASE tool WebRatio. The web application itself is implemented using ASP.NET technology and Microsoft SQL Server.
Стилі APA, Harvard, Vancouver, ISO та ін.
