To see the other types of publications on this topic, follow the link: Software re-engineering; Legacy systems.

Dissertations / Theses on the topic 'Software re-engineering; Legacy systems'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 31 dissertations / theses for your research on the topic 'Software re-engineering; Legacy systems.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

You, Danyu. "Re-engineering the Legacy Software Systems by using Object-Oriented Technologies." Ohio University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1386167451.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Williams, Julian R. "Re-engineering and prototyping a legacy software system-Janus version 6.X." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1999. http://handle.dtic.mil/100.2/ADA361419.

Full text
Abstract:
Thesis (M.S. in Computer Science) Naval Postgraduate School, March 1999.
Thesis advisor(s): Man-Tak Shing, Valdis Berzins. "March 1999". Includes bibliographical references (p. 181-182). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
3

Liu, Xiaodong. "Abstraction : a notion for reverse engineering." Thesis, De Montfort University, 1999. http://hdl.handle.net/2086/4214.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Naspinski, Stan William. "Selection and Implementation of Technologies for the Re-Engineering of an Existing Software System." Scholar Commons, 2011. http://scholarcommons.usf.edu/etd/3260.

Full text
Abstract:
A major hurdle for any company that wishes to stay relevant is re-engineering its software. Given the speed at which software and technology advance, no product can afford to stagnate. With that comes the inherent difficulty of choosing which of the older technologies to keep (if any) and which newer technologies to employ in the re-engineered solution. Once those choices are made, the actual implementation presents its own set of challenges to both the decision makers and the developers involved. This thesis describes a case study: the effort put forth to re-engineer a specific piece of software. While the software is quite capable, it becomes more outdated with every passing year, not to mention more difficult to maintain, upgrade and alter, making it an ideal example to explore. The focus of this thesis is the avenues of upgrading, and the methods of providing comparable or improved services to the end user, that our team chose and implemented. These include using a relational database with an advanced object-relational mapper in a modern environment to provide a REpresentational State Transfer (REST) web service that in turn supplies a rich interactive front-end. Taken together, these tools are quite powerful and capable.
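The stack sketched in the abstract (a relational database behind an object-relational mapper, exposed as a REST service that feeds a rich front-end) can be illustrated with a minimal, hedged Java sketch using standard JPA and JAX-RS annotations. This is not the thesis's actual environment; the entity, field and path names are invented, and a JPA provider, a JAX-RS runtime and a JSON serializer are assumed to be configured.

```java
// WorkOrder.java -- hypothetical ORM-mapped entity (names are illustrative only)
import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class WorkOrder {
    @Id
    private Long id;            // maps to the primary key column
    private String description; // maps to an ordinary column

    public Long getId() { return id; }
    public String getDescription() { return description; }
}
```

```java
// WorkOrderResource.java -- REST endpoint returning the ORM entities as JSON
import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("/workorders")
public class WorkOrderResource {
    @PersistenceContext
    private EntityManager em;   // injected by the container

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<WorkOrder> all() {              // GET /workorders -> JSON array
        return em.createQuery("SELECT w FROM WorkOrder w", WorkOrder.class)
                 .getResultList();
    }
}
```

A front-end can then consume `/workorders` directly, which is the kind of separation between persistence, service and presentation the abstract describes.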
APA, Harvard, Vancouver, ISO, and other styles
5

Waters, Robert Lee. "Obtaining Architectural Descriptions from Legacy Systems: The Architectural Synthesis Process (ASP)." Diss., Available online, Georgia Institute of Technology, 2004:, 2004. http://etd.gatech.edu/theses/available/etd-10272004-160115/unrestricted/waters%5Frobert%5Fl%5F200412%5Fphd.pdf.

Full text
Abstract:
Thesis (Ph. D.)--Computing, Georgia Institute of Technology, 2005.
Rick Kazman, Committee Member ; Colin Potts, Committee Member ; Mike McCracken, Committee Member ; Gregory Abowd, Committee Chair ; Spencer Rugaber, Committee Member. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
6

Ewer, John Andrew Clark. "An investigation into the feasibility, problems and benefits of re-engineering a legacy procedural CFD code into an event driven, object oriented system that allows dynamic user interaction." Thesis, University of Greenwich, 2000. http://gala.gre.ac.uk/6165/.

Full text
Abstract:
This research started with questions about how the overall efficiency, reliability and ease-of-use of Computational Fluid Dynamics (CFD) codes could be improved using any available software engineering and Human Computer Interaction (HCI) techniques. Much of this research has been driven by the difficulties experienced by novice CFD users in the area of Fire Field Modelling, where the introduction of performance-based building regulations has led to a situation where non-CFD experts are increasingly making use of CFD techniques, with varying degrees of effectiveness, for safety-critical research. Formerly, such modelling has not been helped by the mode of use, the high degree of expertise required from the user and the complexity of specifying a simulation case. Many of the early stages of this research were channelled by perceived limitations of the original legacy CFD software that was chosen as a framework for these investigations. These limitations included poor code clarity, bad overall efficiency due to the use of batch mode processing, poor assurance that the final results presented from the CFD code were correct and the requirement for considerable expertise on the part of users. The innovative incremental re-engineering techniques developed to reverse-engineer, re-engineer and improve the internal structure and usability of the software were arrived at as a by-product of the research into overcoming the problems discovered in the legacy software. The incremental re-engineering methodology was considered to be of enough importance to warrant inclusion in this thesis. Various HCI techniques were employed to attempt to overcome the efficiency and solution correctness problems. These investigations have demonstrated that the quality, reliability and overall run-time efficiency of CFD software can be significantly improved by the introduction of run-time monitoring and interactive solution control. It should be noted that the re-engineered CFD code is observed to run more slowly than the original FORTRAN legacy code due, mostly, to the changes in the calling architecture of the software and differences in compiler optimisation; but it is argued that the overall effectiveness, reliability and ease-of-use of the prototype software are all greatly improved. Investigations into dynamic solution control (made possible by the open software architecture and the interactive control interface) have demonstrated considerable savings when using solution control optimisation. Such investigations have also demonstrated the potential for improved assurance of correct simulation when compared with the batch mode of processing found in most legacy CFD software. Investigations have also been conducted into the efficiency implications of using unstructured group solvers. These group solvers are a derivation of the simple point-by-point Jacobi Over Relaxation (JOR) and Successive Over Relaxation (SOR) solvers [CROFT98], and using group solvers allows the computational processing to be more effectively targeted on regions or logical collections of cells that require more intensive computation. Considerable savings have been demonstrated for both static and dynamic group membership when using these group solvers for a complex 3-dimensional fire modelling scenario. Furthermore, the improvements in the system architecture (brought about as a result of software re-engineering) have helped to create an open framework that is both easy to comprehend and extend.
This is in spite of the underlying unstructured nature of the simulation mesh with all of the associated complexity that this brings to the data structures. The prototype CFD software framework has recently been used as the core processing module in a commercial Fire Field Modelling product (called "SMARTFIRE" [EWER99-1]). This CFD framework is also being used by researchers to investigate many diverse aspects of CFD technology including Knowledge Based Solution Control, Gaseous and Solid Phase Combustion, Adaptive Meshing and CAD file interpretation for ease of case specification.
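For readers unfamiliar with the point-by-point solvers mentioned above, the standard textbook update formulas (generic forms, not reproduced from the thesis or from [CROFT98]) for a linear system $A\mathbf{x}=\mathbf{b}$ with relaxation factor $\omega$ are:

$$x_i^{(k+1)} = (1-\omega)\,x_i^{(k)} + \frac{\omega}{a_{ii}}\Big(b_i - \sum_{j \ne i} a_{ij}\,x_j^{(k)}\Big) \qquad \text{(JOR)}$$

$$x_i^{(k+1)} = (1-\omega)\,x_i^{(k)} + \frac{\omega}{a_{ii}}\Big(b_i - \sum_{j < i} a_{ij}\,x_j^{(k+1)} - \sum_{j > i} a_{ij}\,x_j^{(k)}\Big) \qquad \text{(SOR)}$$

A group solver, as the abstract describes, restricts the index set $i$ over which these updates are swept to a region or logical collection of cells that needs further computation, rather than sweeping the whole mesh.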
APA, Harvard, Vancouver, ISO, and other styles
7

Malinauskienė, Eglė. "Rekonstrukcijos metodų analizė modernizuojant informacinę sistemą." Master's thesis, Lithuanian Academic Libraries Network (LABT), 2004. http://vddb.library.lt/obj/LT-eLABa-0001:E.02~2004~D_20040527_102923-81618.

Full text
Abstract:
This master's thesis covers re-engineering methods for legacy systems. A legacy system is an old system that is hardly compatible with modern technologies and is used only because it has become an integral part of the organization's business process support during its long period of maintenance. These systems are large, monolithic and difficult to modify, and the cost and risk of their replacement are difficult to predict. Software engineering offers incremental modernization of information systems through the re-engineering of legacy software. The main goal of software re-engineering is to transform the software so that it becomes easier to understand, maintain and re-use, while preserving its useful, time-trusted functions. The main re-engineering methods are source code translation, reverse engineering and data re-engineering. This thesis presents an analysis of these methods, carried out during the re-engineering of a wood production and sales accounting system. The time required to adopt and apply each method was examined, and the influence of the applied re-engineering methods on the system's reliability, efficiency, usability and other quality metrics is given.
APA, Harvard, Vancouver, ISO, and other styles
8

Braga, Rosana Teresinha Vaccare. "Padrões de software a partir da engenharia reversa de sistemas legados." Universidade de São Paulo, 1998. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-24012001-163455/.

Full text
Abstract:
A execução da engenharia reversa orientada a objetos de um sistema legado desenvolvido com orientação procedimental é usada como base para sua reengenharia, seguindo duas abordagens diferentes. Na primeira, o sistema passa por reengenharia com mudança de orientação, por meio de segmentação e, posteriormente, é transformado para uma linguagem orientada a objetos de forma semi-automática. Na segunda, é feito o reconhecimento de padrões recorrentes de software no modelo de objetos produzido pela engenharia reversa, para depois efetuar a reengenharia utilizando esses padrões. Os resultados obtidos por intermédio dessas duas abordagens podem ser comparados quanto à manutenibilidade, legibilidade e reuso. A versão original do sistema legado escolhido para a realização da experiência está implementado na linguagem Clipper e possui cerca de vinte mil linhas de código. Trata-se de uma oficina auto-elétrica e mecânica de veículos. Para a engenharia reversa foi escolhido o método Fusion/RE, sendo feita uma proposta para sua evolução, adicionando um maior detalhamento da etapa de abstração do modelo de análise do sistema. Para mudança de orientação do paradigma de desenvolviemnto, de procedimental para orientado a objetos, são propostas duas etapas adicionais a serem executadas após a aplicação do Fusion/RE: o projeto avante do sistema e a segmentação do programa legado. Indicações sobre como fazer a segmentação são fornecidas. A transformação do código segmentado em Clipper para Java é feita com auxílio da máquina Draco-Puc. Uma estratégia é proposta para o reconhecimento de padrões a partir do modelo de objetos do sistema obtido pela engenharia reversa. Por meio dela, instâncias dos padrões Type-Object, Association-Object, State Across a Collection e Behaviour Across a Collection podem ser reconhecidas. Experiências de implementação de alguns desses padrões, em Delphi, são feitas.
The object-oriented reverse engineering of a legacy system, originally developed using the procedural paradigm, is the basis for two different reengineering approaches. In the first, the reengineering is done to change the implementation paradigm by segmentation, followed by semi-automatic transformation to an object-oriented language. In the second, recurring patterns are first recognized in the object model produced by the reverse engineering, and then the reengineering is done adopting these patterns. Results obtained by these two approaches are compared to assess their maintainability, legibility and reuse. The original version of the legacy system used in this experiment has about twenty thousand lines of Clipper code and refers to an auto-electric and mechanical car repair shop. For the reverse engineering phase the Fusion/RE method is used, and a proposal is made for its evolution, adding features to detail its system analysis model abstraction phase. To change the system's development paradigm from procedural to object-oriented, two additional phases are proposed to be conducted after the application of the Fusion/RE method: the forward design of the system and the segmentation of the legacy code. Hints and rationales are supplied for conducting the code segmentation. The code transformation from segmented Clipper to Java is done with the support of the Draco-Puc machine. A strategy is proposed for pattern recognition based on the system object model obtained through reverse engineering. Through it, instances of the Type-Object, Association-Object, State Across a Collection and Behaviour Across a Collection patterns can be recognized. Implementation experiments with some of these patterns are carried out in Delphi.
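As a reminder of what one of the recognised patterns looks like, here is a small Java sketch of the generic Type-Object pattern (textbook form only; the class and field names are invented and this is not the Delphi or Clipper-to-Java code produced in the thesis). Category-specific data lives in a shared "type" object rather than in one subclass per category.

```java
// Generic Type-Object sketch; names are illustrative only.
import java.util.ArrayList;
import java.util.List;

class ServiceType {                 // shared, data-driven "type" object
    final String name;
    final double hourlyRate;
    ServiceType(String name, double hourlyRate) {
        this.name = name;
        this.hourlyRate = hourlyRate;
    }
}

class ServiceOrder {                // instance object pointing at its type
    final ServiceType type;
    final double hours;
    ServiceOrder(ServiceType type, double hours) {
        this.type = type;
        this.hours = hours;
    }
    double cost() { return hours * type.hourlyRate; }
}

public class TypeObjectDemo {
    public static void main(String[] args) {
        ServiceType electrical = new ServiceType("Electrical repair", 40.0);
        List<ServiceOrder> orders = new ArrayList<>();
        orders.add(new ServiceOrder(electrical, 2.5));
        // New categories are added as data (new ServiceType records), not new classes.
        System.out.println(orders.get(0).cost());   // 100.0
    }
}
```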
APA, Harvard, Vancouver, ISO, and other styles
9

Nilsson, Simon. "Automated Culling of Data in a Relational Database for Archiving." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18261.

Full text
Abstract:
Background. Archiving of legacy information systems is challenging. When no options exist for extracting the information in a structured way, the last resort is to save the database. Optimally, only the relevant information should be saved and the rest of the information could be removed. Objectives. The goal is to develop a method for assisting the archivist in the process of culling a database before archiving. The method should be described as rules defining how the tables can be identified. Methods. To get an overview of how the process works today and what archivists think can be improved, a number of interviews with experts in database archiving are conducted. The results from the interviews are then analysed, together with test databases, to define rules that can be used in the general case. The rules are then implemented in a prototype that is tested and evaluated to verify whether the method works. Results. The results point to the algorithm being both faster and able to exclude more irrelevant tables than a person could with the manual method. An algorithm for finding candidate keys has also been improved to decrease the number of tests and the worst-case execution time. Conclusions. The evaluation shows results indicating that the method works as intended while resulting in less work for the archivist. More work should be done on this method to improve it further.
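As a rough illustration of what "finding candidate keys" involves (the thesis improves an existing algorithm; the brute-force check below is only a naive baseline in Java, and the data is invented), a column set is a key candidate if no two rows share the same combination of values:

```java
// Naive uniqueness check for a prospective key; not the improved algorithm
// developed in the thesis.
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class CandidateKeyCheck {

    /** Returns true if the given column indices form a unique combination over all rows. */
    static boolean isCandidateKey(List<String[]> rows, int[] columns) {
        Set<List<String>> seen = new HashSet<>();
        for (String[] row : rows) {
            List<String> projection = new ArrayList<>();
            for (int c : columns) {
                projection.add(row[c]);
            }
            if (!seen.add(projection)) {
                return false;           // duplicate combination -> not a key
            }
        }
        return true;
    }

    public static void main(String[] args) {
        List<String[]> rows = List.of(
            new String[] {"1", "Ada", "1990"},
            new String[] {"2", "Bo",  "1990"});
        System.out.println(isCandidateKey(rows, new int[] {0}));     // true
        System.out.println(isCandidateKey(rows, new int[] {2}));     // false
    }
}
```

An improved algorithm prunes which column combinations are tested at all, which is where the reduction in tests and worst-case execution time mentioned in the abstract comes from.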
APA, Harvard, Vancouver, ISO, and other styles
10

Saffo, Farah, and Basma Saeed. "Modernisering av mjukvaruarkitektur för äldre mjukvarusystem." Thesis, KTH, Hälsoinformatik och logistik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-296562.

Full text
Abstract:
Flera företag använder sig än idag av mjukvarusystem som är uppbyggda med äldre mjukvaruarkitektur som den monolitiska. Ett av dessa företag är Consid vars personalsystem är uppbyggt med det utdaterade ramverket klassisk ASP och där användargränssnitt samt logik kan direkt kommunicera med varandra. Detta medför begränsningar som uppstår till följd av brister i modularitet på grund av valet av mjukvaruarkitektur, vilket försvårar vidareutveckling och ändringar i ett system. Dessa begränsningar påverkar i sin tur parametrar som prestanda, skalbarhet, säkerhet, robusthet samt integrering med modernare tekniker.  I denna rapport presenteras en litteraturstudie samt en semistrukturerad intervjustudie, i syfte att undersöka vilka mjukvaruarkitekturer som är lämpliga att implementera vid en modernisering av en monolitisk mjukvaruarkitektur. Arbetet diskuterade också vilka utmaningar som kan uppstå vid en sådan modernisering och hur de hanteras på ett effektivt sätt. Ett bedömningsschema med önskvärda parametrar, med avseende på skalbarhet, prestanda, säkerhet och robusthet, togs fram för att underlätta avgörandet vid val av mjukvaruarkitektur. Utifrån detta, beslutades det att en prototyp med en REST-baserad arkitektur skulle implementeras och utvärderas.  Resultatet av prototypen, till följd av re-architecting, visade en ökad modularisering av mjukvaruarkitekturen. I jämförelse mot med det tidigare systemet har den nya prototypen ingen större påverkan på prestanda i form av responstid. Däremot bidrog prototypen till förbättrad skalbarhet när det gäller vidareutvecklingen av systemet, eftersom det förenklar införandet av ny funktionalitet. Prototypen hade också högre säkerhet genom att isolera lager ifrån varandra samt dölja underliggande detaljer i implementationen. Dessutom blev prototypen inte bara mer robust till följd av modulariseringen, men även enklare att utföra integrationstester samt destruktiva tester mot.
Several companies still use software systems that are built with older software architectures such as the monolithic one. One of these companies is Consid, whose personnel system is built with the outdated framework Classic ASP and where the user interface and logic can communicate directly with each other. This entails limitations that arise because of shortcomings in modularity due to the choice of software architecture, which complicates further development and changes in a system. These limitations, in turn, affect parameters such as performance, scalability, security, robustness, and integration with modern technologies. In this work, a literature study was conducted as well as a semi-structured interview study in order to investigate which software architectures are suitable to implement when a modernization of a monolithic software architecture is carried out. The work also discussed the challenges that may arise in a modernization of the software architecture and how they can be handled efficiently. An assessment scheme with desirable parameters regarding scalability, performance, security and robustness was developed to facilitate the choice of software architecture. Based on this, it was decided that a prototype with a REST-based architecture would be implemented and evaluated. The result of the prototype, following re-architecting, showed an increased modularization of the software architecture. Compared to the previous system, the new prototype has no major impact on performance in terms of response time. However, the prototype contributed to better scalability for the further development of the system, as it simplifies the introduction of new functionality. The prototype also offered higher security by isolating layers from each other and hiding the underlying details of the implementation. In addition, the prototype not only became more robust as a result of the modularization, but also easier to run destructive tests against.
APA, Harvard, Vancouver, ISO, and other styles
11

Oliveira, Paulo Henrique Ribeiro de. "Engenharia de requisitos aplicada em sistema legado de gestão e custeio de propostas comerciais: pesquisa-ação em empresa do setor de estamparia." Universidade Nove de Julho, 2016. http://bibliotecadigital.uninove.br/handle/tede/1479.

Full text
Abstract:
The effort spent in maintaining systems regarded as legacies is relatively higher than the effort of developing new projects. Such systems must be kept running because, in most cases, they are difficult to replace, given the complexity of accommodating change and the impact on the functioning of processes; that is, the system cannot stop. Thus, maintenance or modifications represent a sign of success for a legacy system, because they mean that it is still useful and worth investing resources in to keep it updated and running. However, if changes are carried out urgently because of business dynamics and proper documentation is not produced, problems involving the control and management of future maintenance may arise. In this context, it is the responsibility of Requirements Engineering, as a sub-area of Software Engineering, to improve processes by proposing methods, tools and techniques that promote the development of requirements documentation, so that the requirements satisfy the stakeholders and meet the business attributes in question. The objective of this work was to apply Requirements Engineering to a legacy system for the management and costing of sales proposals in the stamping industry. Through a literature review, document analysis and action research, the study was divided into four phases, covering the development of the sales proposal management and costing system and three maintenance stages performed with the application of Requirements Engineering. In the first phase, a set of artifacts was generated expressing all system features. In the second phase, a progressive maintenance incorporated new features based on the system's backlog of requirements collected in the first phase. The third phase included a new stamping business area that was not present in the initial development. Lastly, the fourth phase comprised new maintenance that adjusted the system to the needs of the stamping business. The results of the study phases showed that the processes described in Requirements Engineering (RE) were present in the elicitation, analysis, documentation, and verification and validation of requirements, bringing academic and technical knowledge on topics related to legacy systems, RE and Software Engineering. It was therefore concluded that Requirements Engineering can be applied to a legacy system for the management and costing of sales proposals in a company in the stamping industry.
O esforço despendido para a manutenção de sistemas considerados como legado é relativamente maior que o esforço de desenvolvimento de novos projetos. Tais sistemas devem ser mantidos em funcionamento pois, em sua maioria, são de difícil substituição, dada a complexidade de convívio da mudança e o impacto no funcionamento dos processos, ou seja, o sistema não pode parar. Dessa forma, manutenções ou modificações representam um sinal de sucesso para um sistema legado, pois significam que ele ainda é útil e que vale a pena investir recursos para mantê-lo atualizado e em funcionamento. No entanto, se modificações são realizadas emergencialmente devido a dinâmica do negócio, e a devida documentação não é realizada, tem-se instaurado o caos para o controle e gerência de futuras manutenções. Neste contexto, cabe à Engenharia de Requisitos, como sub-área da Engenharia de Software, aperfeiçoar os processos para o gerenciamento do ciclo de vida dos requisitos propondo métodos, ferramentas e técnicas que promovam o desenvolvimento do documento de requisitos, para que os requisitos estejam em conformidade com a satisfação dos stakeholders, atendendo as características do negócio em questão. Assim, o objetivo deste trabalho foi aplicar a Engenharia de Requisitos em Sistema Legado de Gestão e Custeio de Propostas Comerciais em empresa do setor de estamparia. Por meio de levantamento bibliográfico, análise documental e pesquisa-ação, o estudo foi dividido em quatro fases considerando o desenvolvimento do Sistema de Gestão e Custeio de Propostas Comerciais e três manutenções realizadas com a aplicação da Engenharia de Requisitos. Na primeira fase um conjunto de artefatos foi gerado expressando todas as funcionalidades do sistema. Na segunda fase uma manutenção evolutiva incorporou novas funcionalidades no sistema baseada em requisitos de backlog coletados na primeira fase. A terceira fase incluiu uma nova área de negócios da estamparia que não esteve presente no desenvolvimento inicial e a quarta fase contemplou novas manutenções ajustando o sistema as necessidades de negócio da estamparia. Os resultados das fases do estudo possibilitaram identificar que os processos descritos na Engenharia de Requisitos (ER) se fizeram presentes nas ações de levantamento, análise, documentação e verificação e validação de requisitos trazendo conhecimento acadêmico e técnico nos temas relacionados a sistemas legados, ER e Engenharia de Software. Concluiu-se, então, que a Engenharia de Requisitos pode ser aplicada em Sistema Legado de Gestão e Custeio de Propostas Comerciais em empresa do setor de estamparia.
APA, Harvard, Vancouver, ISO, and other styles
12

Skoog, Jörgen. "Software re-engineering : a legacy application to create more manageable source code." Thesis, University West, Department of Technology, Mathematics and Computer Science, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:hv:diva-556.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Hart, Peter Bartholomew. "A plm implementation for aerospace systems engineering-conceptual rotorcraft design." Thesis, Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/28278.

Full text
Abstract:
The thesis will discuss the Systems Engineering phase of an original Conceptual Design Engineering Methodology for Aerospace Engineering-Vehicle Synthesis. This iterative phase is shown to benefit from digitization of Integrated Product & Process Design (IPPD) activities, through the application of Product Lifecycle Management (PLM) technologies. Requirements analysis through the use of Quality Function Deployment (QFD) and 7 MaP tools is explored as an illustration. A "Requirements Data Manager" (RDM) is used to show the ability to reduce the time and cost to design for both new and legacy/derivative designs. Here the COTS tool Teamcenter Systems Engineering (TCSE) is used as the RDM. The utility of the new methodology is explored through consideration of a legacy RFP based vehicle design proposal and associated aerospace engineering. The 2001 American Helicopter Society (AHS) 18th Student Design Competition RFP is considered as a starting point for the Systems Engineering phase. A Conceptual Design Engineering activity was conducted in 2000/2001 by graduate students (including the author) in Rotorcraft Engineering at the Daniel Guggenheim School of Aerospace Engineering at the Georgia Institute of Technology, Atlanta, GA. This resulted in the "Kingfisher" vehicle design, an advanced search and rescue rotorcraft capable of performing the "Perfect Storm" mission, from the movie of the same name. The associated requirements, architectures, and work breakdown structure data sets for the Kingfisher are used to relate the capabilities of the proposed Integrated Digital Environment (IDE). The IDE is discussed as a repository for legacy knowledge capture, management, and design template creation. A primary thesis theme is to promote the automation of the up-front conceptual definition of complex systems, specifically aerospace vehicles, while anticipating downstream preliminary and full spectrum lifecycle design activities. The thesis forms a basis for additional discussions of PLM tool integration across the engineering, manufacturing, MRO and EOL lifecycle phases to support business management processes.
APA, Harvard, Vancouver, ISO, and other styles
14

Jennings, Charles A. "Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools." Thesis, Monterey, California. Naval Postgraduate School, 1992. http://hdl.handle.net/10945/23761.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Fischer, Christian. "Functional Programming and Legacy Software Using PureScript to Extend a Legacy JavaScript System." Thesis, Umeå universitet, Institutionen för datavetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-147947.

Full text
Abstract:
Legacy systems are everywhere. Immense resources are spent on fixing problems caused by them and on legacy system maintenance and reverse engineering. After decades of research, a solution has yet to be found. In this thesis, the viability of using purely functional programming to mitigate the problems of legacy systems is investigated, as well as the possibility that purely functional programming leads to code that is less likely to cause legacy problems in the first place. This was done by developing a genome browser in PureScript that embeds, interfaces with, and extends an existing genome browser written in JavaScript. The resulting codebase is examined, and various characteristics of purely functional programming, and how they helped solve or avoid problems related to legacy systems, are presented. In conclusion, PureScript is found to be an excellent tool for working with legacy JavaScript, and while the nature of the project limits the conclusions that can be drawn, it appears likely that using purely functional programming, especially with a language such as PureScript that provides a powerful type system for ensuring program correctness, leads to code that is more easily understandable and thus avoids the problems of legacy code.
APA, Harvard, Vancouver, ISO, and other styles
16

Ramasubbu, Surendranath. "Reverse Software Engineering Large Object Oriented Software Systems using the UML Notation." Thesis, Virginia Tech, 2001. http://hdl.handle.net/10919/31960.

Full text
Abstract:
A common problem experienced by the software engineering community traditionally has been that of understanding legacy code. A decade ago, legacy code was used to refer to programs written in COBOL, typically for large mainframe systems. However, current software developers predominantly use Object Oriented languages like C++ and Java. The belief prevalent among software developers and object philosophers that comprehending object-oriented software will be relatively easier has turned out to be a myth. Tomorrow's legacy code is being written today, since object oriented programs are even more complex and difficult to comprehend, unless rigorously documented. Reverse Engineering is a methodology that greatly reduces the time, effort and complexity involved in solving the program comprehension problem. This thesis deals with Reverse Engineering complex object oriented software and the experiences with a sample case study. Extensive survey of literature and contemporary research on reverse engineering and program comprehension was undertaken as part of this thesis work. An Energy Information System (EIS) application created by a leading energy service provider and one that is being used extensively in the real world was chosen as a case study. Reverse engineering this industry strength Java application necessitated the definition of a formal process. An intuitive Reverse Engineering Process (REP) was defined and used for the reverse engineering effort. The learning experiences gained from this case study are discussed in this thesis.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
17

Svensson, Niclas. "Defining and Identifying Legacy Code in Software, A case study developing 3D visual camera surveillance software." Thesis, Malmö universitet, Fakulteten för teknik och samhälle (TS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-20423.

Full text
Abstract:
Legacykod är något som är svårt för utvecklare att förhindra. Medan det finns mycket forskning kring att motarbeta legacykod och förnya legacysystem som använder utdaterade teknologier och designer, så har inte mycket fokus lagts på hur legacykod kan identifieras i sitt tidigaste skede. Hur dessutom definierar man legacykod? I denna studie så görs en litteraturstudie för att ta reda på vad för forskningsbakgrund det finns bakom legacykod i allmänhet. Därefter så utvecklas två prototyper i syfte att upptäcka mönster under utvecklingen som kan vara egenskaper till legacykod. I slutändan så utförs en enkät riktade åt mjukvaruutvecklare för att få insyn och tankar på vad legacykod betyder för dem och för att testa idéerna som togs fram från implementationerna. Den rekommenderade definitionen för legacykod är kod som inte har testats, ingen eller för lite dokumentation och är allmänt svår att förstå, vare sig med dokumentation eller utan. Observationerna från implementation av de två prototyperna bildade 7 riktlinjer som tar upp hur man kan identifiera uppväxten av legacykod.
Legacy code is difficult for developers to prevent. While much research exists on combating legacy code and on renewing legacy systems that use outdated technologies and designs, not much focus has been put on how legacy code can be detected at its earliest stage. And how do you define legacy code? In this study, a literature review is done to establish the scientific background behind legacy code in general. Afterwards, two implementations of the same software are developed in order to observe any events during development that can be characteristic of legacy code. In the final phase, a questionnaire-based survey is handed out to developers to get their insights and thoughts on what legacy code means to them and to test the ideas brought up by the implementations. A recommended definition of legacy code is code that has no tests, little to no documentation, and is generally hard to understand, with or without documentation. From the observations made while implementing the prototypes, 7 guidelines are formed to help identify and discover the emergence of legacy code.
APA, Harvard, Vancouver, ISO, and other styles
18

Alawairdhi, Mohammed. "A re-engineering approach for software systems complying with the utilisation of ubiquitous computing technologies." Thesis, De Montfort University, 2009. http://hdl.handle.net/2086/3183.

Full text
Abstract:
The evident progression of ubiquitous technologies has put forward the introduction of new features which software systems can sustain. Several of the ubiquitous technologies available today are regarded as fundamental elements of many software applications in various domains. The utilisation of ubiquitous technologies has an apparent impact on business processes that can grant organisations a competitive advantage and improve their productivity. The change in the business processes in such organisations typically leads to a change in the underlying software systems. In addressing the need for change in the underlying software systems, this research is focused on establishing a general framework and methodology to facilitate the reengineering of software systems in order to allow the incorporation of new features which are introduced by the employment of ubiquitous technologies. Although this thesis aims to be general and not limited to a specific programming language or software development approach, the focus is on Object-Oriented software. The reengineering framework follows a systematic step-based approach, with greater focus on the reverse engineering aspect. The four stages of the framework are: program understanding, additional-requirement engineering, integration, and finally the testing and operation stage. In its first stage, the proposed reengineering framework regards the source code as the starting point to understand the system using a static-analysis based method. The second stage is concerned with the elicitation of the user functional requirements resulting from the introduction of ubiquitous technologies. In the third stage, the goal is to integrate the system’s components and hardware handlers using a developed integration algorithm and available integration techniques. In the fourth and final stage, which is discussed in a general manner only in this thesis, the reengineered system is tested and put in the operation phase. The proposed approach is demonstrated using a case study in Java to show that the proposed approach is feasible and promising in its domain. Conclusions are drawn based on analysis and further research directions are discussed at the end of the study.
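To give a flavour of what a "static-analysis based" first pass over source code might look like (a toy Java sketch only, with an invented regular expression; the thesis's own framework is not reproduced here), a minimal scanner could list the classes and methods declared in a Java file:

```java
// Toy static-analysis pass: list class and method names in a .java file.
// Illustrative sketch only, not the re-engineering framework from the thesis.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SourceScanner {
    private static final Pattern CLASS_DECL =
            Pattern.compile("\\bclass\\s+(\\w+)");
    private static final Pattern METHOD_DECL =
            Pattern.compile("\\b(public|protected|private)(?:\\s+\\w+)*\\s+[\\w<>\\[\\]]+\\s+(\\w+)\\s*\\(");

    public static void main(String[] args) throws IOException {
        String source = Files.readString(Path.of(args[0]));
        Matcher classes = CLASS_DECL.matcher(source);
        while (classes.find()) {
            System.out.println("class:  " + classes.group(1));
        }
        Matcher methods = METHOD_DECL.matcher(source);
        while (methods.find()) {
            System.out.println("method: " + methods.group(2));
        }
    }
}
```

A real program-understanding stage would of course work on a full parse tree rather than regular expressions, but the sketch shows the kind of inventory such a stage produces before the additional-requirement and integration stages take over.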
APA, Harvard, Vancouver, ISO, and other styles
19

Mehta, Alok. "Evolving legacy system's features into fine-grained components using regression test-cases." Link to electronic thesis, 2002. http://www.wpi.edu/Pubs/ETD/Available/etd-1211102-163800.

Full text
Abstract:
Dissertation (Ph. D.)--Worcester Polytechnic Institute.
Keywords: software maintenance; software evolution; regression test-cases; components; legacy system; incremental software evolution methodology; fine-grained components. Includes bibliographical references (p. 283-294).
APA, Harvard, Vancouver, ISO, and other styles
20

Adjoyan, Seza. "Describing Dynamic and Variable Software Architecture Based on Identified Services From Object-Oriented Legacy Applications." Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTS022/document.

Full text
Abstract:
L'Orienté Service (SOA) est un paradigme de conception qui facilite la construction d’applications extensibles et reconfigurables basées sur des artefacts réutilisables qui sont les services. Ceux-ci sont structurés via des interfaces bien définies et publiables et qui peuvent être dynamiquement découvertes.Beaucoup d’approches ont été proposées dans la littérature pour la réingénierie d’applications existantes développées dans des paradigmes pré-services, principalement l’orienté objet, vers SOA. L’objectif est de permettre de sauvegarder la valeur métier de ces d’applications tout en leur permettant de bénéficier des avantages de SOA. Le problème est que ces approches s'appuient sur des critères ad-hoc pour identifier correctement des services dans le code source des applications existantes.Par ailleurs, l'une des caractéristiques les plus distinctives d'une application orientée service est sa capacité de se reconfigurer dynamiquement et d'adapter son comportement en fonction de son contexte d'exécution. Cependant, dans les langages de description d'architecture (ADL) existants dont l’aspect de reconfiguration et pris en compte, les règles de reconfiguration sont représentées d'une manière ad-hoc; en général, elles ne sont pas modélisées d'une manière explicite mais enfouillées dans la description de l'architecture. D'une part, ceci engendre une difficulté de la gestion de la reconfiguration dynamique au niveau de l'architecture et d’autre part, la traçabilité de la description de la reconfiguration dynamique à travers les différents niveaux d'abstraction est difficile à représenter et à gérer.Afin de surmonter les problèmes précédents, nous proposons dans le cadre de cette thèse deux contributions. D'abord, nous proposons une approche d'identification de services basée sur un modèle de qualité où les caractéristiques des services sont étudiées, raffinées et réifiées en une fonction que nous utilisons pour mesurer la validité sémantique de ces services. La deuxième contribution consiste en une proposition d'un langage de description d'architecture orientée service (ADL) qui intègre la description de la variabilité architecturale. Dans cette ADL les services qui peuvent constituer l’architecture, les éléments de contexte dont les changements d’état sont à l’origine des changements architecturaux, les variantes des éléments architecturaux sélectionnées en fonction des états des éléments de contexte et le comportement architectural dynamique sont ainsi spécifiés de façon modulaire
Service Oriented Architecture (SOA) is an architectural design paradigm which facilitates building and composing flexible, extensible and reusable service-oriented assets. The latter are encapsulated behind well-defined, published interfaces that can be dynamically discovered by third-party services. Before the advent of SOA, several software systems were developed using older technologies. Many of these systems still provide business value, yet they suffer from evolution and maintenance problems, so it is advantageous to modernize them towards service-based systems. In this sense, several re-engineering techniques propose migrating object-oriented applications towards SOA. Nonetheless, these approaches rely on ad-hoc criteria to correctly identify services in object-oriented legacy source code. Besides, one of the most distinguishing features of a service-oriented application is its ability to dynamically reconfigure and adjust its behavior to cope with a changing environment during execution. However, in existing architecture description languages handling this aspect, reconfiguration rules are represented in an ad-hoc manner and reconfiguration scenarios are often implicit. This hinders full management of dynamic reconfiguration at the architecture level, and it makes it challenging to trace the description and management of dynamic reconfiguration across different levels of abstraction. In order to overcome the aforementioned problems, our contributions are presented along two axes. First, in the context of migrating legacy software towards SOA, we propose a service identification approach based on a quality measurement model, in which service characteristics are considered and refined into measurable metrics in order to measure the semantic correctness of identified services. The second axis is dedicated to the proposal of an Architecture Description Language (ADL) that describes a variant-rich service-based architecture. In this modular ADL, dynamic reconfigurations are specified at the architecture level, and the description is enriched with context and variability information in order to enable variability-based self-reconfiguration of the architecture in response to context changes at runtime.
APA, Harvard, Vancouver, ISO, and other styles
21

Huang, Jianchu. "A reengineering approach to reconciling requirements and implementation for context-aware web services systems." Thesis, De Montfort University, 2012. http://hdl.handle.net/2086/7578.

Full text
Abstract:
In modern software development, the gap between software requirements and implementation is not always reconciled. For Web services-based context-aware systems in particular, reconciling this gap is even harder. The aim of this research is to explore how software reengineering can facilitate the reconciliation between requirements and implementation for such systems. The underlying research in this thesis comprises the following three components. Firstly, the requirements recovery framework underpins the requirements elicitation approach of the proposed reengineering framework. This approach consists of three stages: 1) hypothesis generation, where a list of hypothesised source code information is generated; 2) segmentation, where the hypothesis list is grouped into segments; and 3) concept binding, where the segments are turned into a list of concept bindings linking regions of source code. Secondly, a derived viewpoints-based context-aware service requirements model is proposed to fully discover constraints, and a requirements evolution model is developed to maintain and specify the requirements evolution process in support of context-aware service evolution. Finally, inspired by context-oriented programming (COP) concepts and approaches, ContXFS is implemented as a COP-inspired conceptual library in F#, which enables developers to facilitate dynamic context adaptation. This library, along with the context-aware requirements analyses, eases the development of the said systems to a great extent, which in turn achieves reconciliation between requirements and implementation.
APA, Harvard, Vancouver, ISO, and other styles
22

Nyberg, Pontus, and Tim Elofsson. "Underhåll och Migrering av Legacy-System." Thesis, Linnéuniversitetet, Institutionen för datavetenskap (DV), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-28597.

Full text
Abstract:
Legacy-system sköter idag flera kritiska affärsprocesser hos många företag och banker. Dessa system är dyra att underhålla och uppdatera med nya funktioner. Legacy-systemenär också svåra att anpassa till en tjänsteorienterad arkitektur eller SOA (Service OrientedArchitechture). Därför vill företagen börja fasa ut dessa system. SOA är ett tankesätt i hur man strukturerar ett system. Allt ska vara uppbyggt av tjänster som inte är beroende avvarandra och därför i framtiden blir lättare att byta ut, ändra eller ta bort. Eftersom ingen tjänst ska vara beroende av någon annan skadas ingen annan del av systemet om en tjänst ändras. Eftersom fler och fler företag idag vill övergå till en SOA så letar de efter sätt att migrerasina legacy-system till modernare plattformar. Det finns flera olika sätt att migrerera legacy-system. Alla har olika fördelar och nackdelar. En av de säkrare metoderna är Chicken Little (steg-för-stegmetod), men den tar längre tid än att till exempel använda sig av metoden Cold Turkey som även kallas Big Bang. Big Bang för att man byter ut hela systemet på till exempel en helg eller liknande (drastisk metod). Flera företag har också specialiserat sig på att automatiskt översätta gammal kod till modernare, den tekniken heter transcoding. Det finns även företag som jobbar med att få bland annat Cobol att jobba ihop med modernare utvecklingsplattformar såsom Java. Ett av de mer aktiva företagen som arbetar med detta är Micro Focus, som har utvecklat ett bibliotek som gör att du kan starta Cobol-program från Java eller .NET. Författarna har med hjälp av deras Java-bibliotek skapat ett program åt Bluegarden som kan starta upp Cobol-program. Programmet skapades för att påvisa vad de kan använda för att slippa flera steg i uppstart av Cobol-program. För att undvika att man får legacy-system i framtiden har det även kommit fram flera underhållsmodeller. Underhållsmodellerna fungerar på olika sätt, men alla har som mål att undvika legacy-system.
Legacy systems today manage many critical business processes in companies and banks. These systems are expensive to maintain and to update with new features. Legacy systems are also difficult to adapt to a SOA (Service Oriented Architecture). Therefore, companies want to begin phasing out these systems. SOA is an approach to structuring a system: everything should be built from services that do not depend on each other and will therefore be easy to replace, update or delete in the future. Since no service depends on any other, one service can be changed without harming another. More and more companies today want to move to a SOA, so they are looking for ways to migrate their legacy systems to modern platforms. There are several ways to migrate legacy systems, each with its own advantages and disadvantages. One of the safer methods is called Chicken Little, but it takes longer than, for example, the Cold Turkey method. Several companies have also specialized in automatically translating old code into a modern programming language, a technique called transcoding. There are also companies working on getting COBOL to work with more modern languages such as Java. One of the bigger companies developing ways to combine COBOL and Java is Micro Focus, which has developed a library that allows COBOL programs to be started from Java. To avoid legacy systems in the future, several maintenance models have been created; they are structured in different ways, but all have the goal of avoiding legacy systems.
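The authors' demonstrator uses Micro Focus's own Java library, whose exact API is not reproduced here. As a purely generic illustration of the underlying idea (starting an existing compiled legacy program from Java and capturing its output), a plain ProcessBuilder call is enough; the executable name and arguments below are placeholders.

```java
// Generic sketch: start a compiled legacy (e.g. COBOL) program from Java.
// This does NOT use the Micro Focus library mentioned above; the binary
// name "payroll-batch" is a placeholder.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class LegacyLauncher {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder("./payroll-batch", "--run-date", "2013-05-01");
        pb.redirectErrorStream(true);              // merge stderr into stdout
        Process process = pb.start();
        try (BufferedReader out = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = out.readLine()) != null) {
                System.out.println(line);          // forward legacy output
            }
        }
        int exitCode = process.waitFor();
        System.out.println("legacy program exited with code " + exitCode);
    }
}
```

Dedicated interoperability libraries go further by letting Java call individual COBOL programs in-process and exchange typed data, which is what makes them attractive in a gradual, Chicken Little-style migration.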
APA, Harvard, Vancouver, ISO, and other styles
23

Ross, Timothy A. "Searching without SQL: Re-engineering a database-centric web application with open-source information retrieval software." Thesis, School of Information and Library Science, 2008. http://hdl.handle.net/1901/579.

Full text
Abstract:
This paper seeks to describe the process by which a database-centric web application was redesigned and rewritten to take advantage of Apache’s Lucene - an open-source information retrieval software library written in the Java programming language. After the implementation of a Lucene-based text index of “semi-structured data”, a college radio station's card catalog application was able to deliver higher-quality search results in significantly less time than it was able to do using just a relational database alone. Additionally, the dramatic improvements in speed and performance even allowed the search results interface to be redesigned and enhanced with an improved pagination system and new features such as faceted search/filtering.
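For readers unfamiliar with Lucene, a minimal indexing-and-search round trip in Java looks roughly like the sketch below. The field names and document values are illustrative and not taken from the card catalog application, and a Lucene 8/9 release on the classpath is assumed.

```java
// Minimal Lucene sketch: index two documents in memory, then search them.
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.queryparser.classic.QueryParser;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.store.ByteBuffersDirectory;
import org.apache.lucene.store.Directory;

public class CatalogSearchSketch {
    public static void main(String[] args) throws Exception {
        Directory index = new ByteBuffersDirectory();
        StandardAnalyzer analyzer = new StandardAnalyzer();

        try (IndexWriter writer = new IndexWriter(index, new IndexWriterConfig(analyzer))) {
            for (String title : new String[] {"Blue Train", "Giant Steps"}) {
                Document doc = new Document();
                doc.add(new TextField("title", title, Field.Store.YES));
                writer.addDocument(doc);
            }
        }

        try (DirectoryReader reader = DirectoryReader.open(index)) {
            IndexSearcher searcher = new IndexSearcher(reader);
            ScoreDoc[] hits = searcher.search(
                    new QueryParser("title", analyzer).parse("blue"), 10).scoreDocs;
            for (ScoreDoc hit : hits) {
                System.out.println(searcher.doc(hit.doc).get("title"));
            }
        }
    }
}
```

Because the index stores analysed, ranked terms rather than rows, relevance-ordered results and features such as faceting come largely for free, which is the shift away from pure SQL querying that the abstract describes.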
APA, Harvard, Vancouver, ISO, and other styles
24

Poolla, Venkata Sai Abhishek, and Bhargav Krishna Mandava. "Understanding the Challenges and Needs of Requirements Engineering for Data-centric Systems." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-21108.

Full text
Abstract:
Background: As technology advances day by day, people produce enormous volumes of data. This exceptional growth in data is leading to an increase in the development of intelligent systems that make use of the huge amount of data available. We group the development of such intelligent software systems under the term "Data-Centric Systems (DCS)". Such systems include the AI/ML components in a self-driving car, recommender systems, and many more. Developing DCS is complex within the software development life cycle; one of the main reasons behind this complexity is the ineffective handling of requirements. Moreover, the literature suggests that a large percentage (48%) of development problems begin during the requirements phase, and that fixing requirements-related problems consumes a high cost of rework in later stages of software development. To design DCS effectively, RE techniques are considered one of the essential factors, since they are required to bring together a system's functional and implementation expectations from two entirely different perspectives, those of customers and developers. Although RE frequently plays a critical role in DCS, little is known about how RE can effectively be incorporated into developing such systems. Objectives: This thesis aims to understand industry experience in the development of DCS, with the main emphasis on RE, to investigate the techniques/approaches used in designing DCS during the RE process, and to identify the challenges practitioners face during development. Methods: Two workshop-style interviews are conducted to identify the RE design process and the practitioners' challenges during DCS development. To complement the results from the workshops and to scale up the target population, an online survey is conducted. Results: From the workshops, we identified that no single explicit stakeholder is responsible during the RE phase of DCS; instead, decisions are taken collectively in agile development, and the role varies depending on the type of project the stakeholder is involved in. Four categories of requirements were identified, namely regulatory, infrastructure, data, and safety and security requirements. Techniques/approaches used to elicit, document, analyse and validate the requirements were identified. Based on the data received, we identified ten challenges faced by practitioners during DCS development. The categorisation and the techniques/approaches used for RE were prioritised by the number of responses received in the survey. A total of 15 themes were generated for the challenges based on the responses received from participants. Conclusions: To conclude, a specific RE architecture needs to be designed to help practitioners during the development of DCS. We believe that the analysis of these insights provides industry with a structured overview of the DCS development process in RE. In addition, this thesis allows the academic community to steer future research based on an understanding of industry needs in RE for DCS.
APA, Harvard, Vancouver, ISO, and other styles
25

Marcus, Elwin, and Lehti Emil. "Riktlinjer för Pensionering av IT - System." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-188913.

Full text
Abstract:
Retiring old IT systems in favour of newer ones has become an increasingly topical issue, due to increases in hardware performance and new programming languages. Nevertheless, it is not straightforward for companies to decide how this retirement process is best handled, and there are not many studies covering the phenomenon. The goal of this thesis is to find general guidelines regarding the retirement of IT systems. This is done through a case study on retiring the old (MAPPER/BIS) platform in favour of a more modern C# system currently used at Handelsbanken, one of Sweden's largest banks. With the help of a qualitative research method, a literature study and interviews at Handelsbanken and with external parties, we aim to understand and analyse what is important in a retirement process, in order to create recommendations for our guidelines, also drawing on parts of the EM3: Software Retirement Process Model. The result of the thesis is a total of 16 guidelines, presented as tables in the study, which companies can use in their retirement process. Ten guidelines concern the retirement of the platform and six concern conversion rules. The study has, however, shown that not all systems on the MAPPER platform were possible to retire.
Pensionering av IT-system har idag blivit en mer aktuell fråga än någonsin, till följd av att system blir gamla och inte är anpassade till dagens hårdvara eller moderna programmeringsspråk. Det är inte okomplicerat hur en sådan pensionering ska ske, speciellt då det inte gjorts så många studier kring det ämnet. I denna uppsats är målet att ta fram riktlinjer gällande pensionering, detta gjordes genom att utföra en fallstudie. Fallstudien gjordes på uppdrag av Handelsbanken, en av de största bankerna i Sverige, kring pensionering av (MAPPER/BIS) plattformen till en annan mer moderna plattform i C#. Med hjälp av en kvalitativ forskningsmetod, litteraturstudie samt intervjuer med personal på Handelsbanken och externa parter, ämnar vi att förstå och analysera vad som är viktigt vid en pensioneringsprocess. För att kunna skapa rekommendationer till de riktlinjer kring pensionering som detta arbete presenterar. Vissa utvalda delar av EM3: Software Retirement Process Model kommer också att ligga till grund för riktlinjerna. Uppsatsrapporten har resulterat i totalt 16 riktlinjerna som presenteras i tabeller som företag kan använda i sina pensioneringsprocesser. Tio stycken riktlinjer grundar sig i pensioneringen av en plattform och ytligare sex stycken gäller konvertering. Studien visade dock att alla system på MAPPER plattformen inte var möjliga att pensionera.
APA, Harvard, Vancouver, ISO, and other styles
26

Jonasson, Fredrik. "A system for GDPR-compliant collection of social media data: from legal to software requirements." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-397110.

Full text
Abstract:
As of 2018, there is a new regulation regarding data protection in the European Union. The legislation, often referred to as the General Data Protection Regulation (GDPR), has led to increased demands on organizations that process personal data. This thesis investigated the legal consequences of social media data collection, with a particular focus on the collection of tweets. The legal findings were then translated into possible enhancements of a tweet-collecting software tool. The tweet-collecting software was extended with a method for pseudonymization; however, it turned out that our implementation had some serious performance issues. Work was also done on an implementation of a method providing automatic tweet posting, with the purpose of repeatedly informing followers of a hashtag that a collection of tweets regarding that hashtag is taking place. Lastly, some findings about possible future enhancements that can be made to the software were laid out.
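The thesis's own pseudonymization method is not shown here. As a generic, hedged illustration of one common approach (keyed hashing of the user identifier, so the same account always maps to the same pseudonym without revealing it), a Java sketch using the standard HmacSHA256 primitive could look like this; key handling is deliberately simplified and the names are invented.

```java
// Generic pseudonymization sketch (keyed hash of a user id); not the
// implementation evaluated in the thesis. Requires Java 17+ for HexFormat.
import java.nio.charset.StandardCharsets;
import java.util.HexFormat;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class Pseudonymizer {
    private final Mac mac;

    public Pseudonymizer(byte[] secretKey) throws Exception {
        mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secretKey, "HmacSHA256"));
    }

    /** Same input + same key -> same pseudonym; the id is not recoverable without the key. */
    public String pseudonymize(String userId) {
        byte[] digest = mac.doFinal(userId.getBytes(StandardCharsets.UTF_8));
        return HexFormat.of().formatHex(digest);
    }

    public static void main(String[] args) throws Exception {
        Pseudonymizer p = new Pseudonymizer("demo-secret-key".getBytes(StandardCharsets.UTF_8));
        System.out.println(p.pseudonymize("@some_twitter_user"));
    }
}
```

Keeping the key separate from the stored pseudonyms is what makes this pseudonymization rather than plain hashing, and it is also where performance and key-management trade-offs of the kind the abstract hints at tend to appear.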
APA, Harvard, Vancouver, ISO, and other styles
27

Bürgel, Sven. "ADONIS -- A Case Study of a Legal Advisory System Using Adaptive Programming." Master's thesis, Universitätsbibliothek Chemnitz, 2002. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200200818.

Full text
Abstract:
Software evolution and maintenance have received great attention with the steadily increasing complexity of software systems. One recent approach in this field is adaptive programming, which focuses on the evolution of large class hierarchies. Its main objectives are to manage change in evolutionary systems and to keep the cost of adaptive maintenance low. In this thesis we present our experiences with applying adaptive programming to modeling and implementing the legal advisory system ADONIS. Unlike most other information systems, ADONIS does not simply process data but regulations. Since regulations and laws are frequently subject to change, we have chosen this domain as the basis for our practical research on adaptive programming.
Software evolution and maintenance have gained considerable attention owing to the steadily growing complexity of software systems. A current approach in this field is adaptive programming, which concentrates on the evolution of large class hierarchies. Its main goals are to cope with the changing requirements of evolutionary systems and to keep the cost of adaptive maintenance low. In this thesis we present our experiences with applying adaptive programming to the modeling and implementation of the legal advisory system ADONIS. In contrast to many other information systems, ADONIS does not simply process data but regulations. Because regulations and laws change frequently, we chose this domain as the basis for our practical research on adaptive programming.
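The abstract stays at the conceptual level, so the following Java sketch only illustrates the general idea behind adaptive programming's "structure-shy" traversals: client code names the target of a traversal instead of hard-coding every intermediate class, which keeps it stable as the class hierarchy evolves. The Regulation/Section/Paragraph classes and the reflective visitAll helper are hypothetical stand-ins, not ADONIS code; a real adaptive-programming toolchain would derive the traversal from a declarative strategy rather than generic reflection.

```java
import java.lang.reflect.RecordComponent;
import java.util.List;
import java.util.function.Consumer;

public class AdaptiveTraversalSketch {

    // Hypothetical slice of a regulation model, not taken from ADONIS.
    record Paragraph(String text) {}
    record Section(String title, List<Paragraph> paragraphs) {}
    record Regulation(String name, List<Section> sections) {}

    // Structure-dependent navigation: every class on the path is spelled out,
    // so inserting a new level (say, Chapter) between Regulation and Section
    // breaks this method.
    static void printHardCoded(Regulation r) {
        for (Section s : r.sections())
            for (Paragraph p : s.paragraphs())
                System.out.println(p.text());
    }

    // "Structure-shy" navigation in the spirit of adaptive programming:
    // the caller names only the target type; the path through the object
    // graph is discovered generically, so intermediate classes may change.
    static <T> void visitAll(Object node, Class<T> target, Consumer<T> visitor) {
        if (node == null) return;
        if (target.isInstance(node)) {            // reached the target; do not descend further
            visitor.accept(target.cast(node));
            return;
        }
        if (node instanceof List<?> list) {
            list.forEach(child -> visitAll(child, target, visitor));
        } else if (node.getClass().isRecord()) {
            for (RecordComponent rc : node.getClass().getRecordComponents()) {
                try {
                    visitAll(rc.getAccessor().invoke(node), target, visitor);
                } catch (ReflectiveOperationException e) {
                    throw new IllegalStateException(e);
                }
            }
        }
    }

    public static void main(String[] args) {
        Regulation r = new Regulation("Sample act",
                List.of(new Section("1", List.of(new Paragraph("First rule."),
                                                 new Paragraph("Second rule.")))));
        printHardCoded(r);
        visitAll(r, Paragraph.class, p -> System.out.println(p.text()));
    }
}
```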
APA, Harvard, Vancouver, ISO, and other styles
28

Höffl, Marc. "A new programming model for enterprise software : Allowing for rapid adaption and supporting maintainability at scale." Thesis, KTH, Elkraftteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-215103.

Full text
Abstract:
Companies are under constant pressure to adapt and improve their processes to stay competitive. Since most of their processes are handled by software, it also needs to change constantly. Those improvements and changes add up over time and increase the complexity of the system, which in turn prevents the company from further adaptation. In order to change and improve existing business processes and their implementation within software, several stakeholders have to go through a long process. Current IT methodologies are not suitable for such a dynamic environment. The analysis of this change process shows that four software characteristics are important to speed it up: transparency, adaptability, testability and reparability. Transparency refers to the user's capability to understand what the system is doing, where and why. Adaptability is a mainly technical characteristic that indicates the capability of the system to evolve or change. Testability allows automated testing and validation for correctness without requiring manual checks. The last characteristic is reparability, which describes the possibility to bring the system back into a consistent and correct state, even if erroneous software was deployed. An architecture and software development patterns are evaluated to build an overall programming model that provides these software characteristics. The overall architecture is based on microservices, which facilitates decoupling and maintainability for the software as well as the organizations. Command Query Responsibility Segregation decouples read from write operations and makes data changes explicit. With Event Sourcing, the system stores not only the current state, but all historic events. It provides a built-in audit trail and is able to reproduce different scenarios for troubleshooting and testing. A demo process is defined and implemented within multiple prototypes. The design of the prototypes is based on the programming model; they are built in Javascript and implement microservices, CQRS and Event Sourcing. The prototypes show and validate how the programming model provides the software characteristics. Software built with the programming model allows companies to iterate faster at scale. Since the programming model is suited for complex processes, the main limitation is that the validation is based on a demo process that is simpler, and the benefits are hard to quantify.
To remain competitive, companies are under constant pressure to adapt and improve their processes. Since most processes are handled by software, the software also needs to change continuously. Over time, these improvements and changes lead to increased system complexity, which in turn prevents the company from making further adaptations. Today, changing and improving existing business processes and their software typically requires several actors to take part in a long and time-consuming process. Current methods are not suited to such a dynamic environment. This work has focused on four software characteristics that are important for facilitating change processes: transparency, adaptability, testability and reparability. Transparency refers to the ability to understand why, where and what the system is doing. Adaptability is mainly a technical characteristic concerning the system's ability to evolve and change. Testability aims at automatic testing and validation of correctness with little or no manual checking. The last characteristic is reparability, which describes the ability to restore the system to a consistent and correct state even if faulty software has been deployed. A programming model that equips software with the characteristics described above is developed in this thesis. The architecture of the programming model is based on microservices, which give the software, as well as the organizations using it, good decoupling and maintainability. Command Query Responsibility Segregation (CQRS) decouples read operations from write operations and makes changes to data explicit. With Event Sourcing, the system stores not only the current state but all historic events. The model provides users with a built-in audit trail and can reproduce different scenarios for troubleshooting and testing. A demo process is defined and implemented in three prototypes whose design is based on the proposed programming model; they are built in Javascript and implement microservices, CQRS and Event Sourcing. The prototypes show and validate how the programming model gives the software the right characteristics. Software built with this programming model allows companies to iterate faster. The main limitations of the work are that the validation is based on a simpler demo process and that its benefits are hard to quantify.
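The thesis prototypes are written in JavaScript, so the Java sketch below is only a language-neutral illustration of the two patterns the abstract names: on the write side an aggregate validates commands against state rebuilt by replaying stored events, and on the read side a separate projection is updated from the same events and answers queries. The Order* names and the in-memory event handling are assumptions made for the example; a real system would persist events in an event store and update projections asynchronously.

```java
import java.util.ArrayList;
import java.util.List;

/** Minimal event-sourcing/CQRS sketch; names are illustrative only. */
public class OrderEventSourcingSketch {

    // Events are immutable facts; the event log is the source of truth.
    sealed interface OrderEvent permits ItemAdded, OrderSubmitted {}
    record ItemAdded(String sku, int quantity) implements OrderEvent {}
    record OrderSubmitted() implements OrderEvent {}

    /** Write side: commands are validated against state rebuilt from events. */
    static class OrderAggregate {
        private final List<OrderEvent> uncommitted = new ArrayList<>();
        private int itemCount;
        private boolean submitted;

        static OrderAggregate replay(List<OrderEvent> history) {
            OrderAggregate a = new OrderAggregate();
            history.forEach(a::apply);
            return a;
        }

        void addItem(String sku, int quantity) {        // command
            if (submitted) throw new IllegalStateException("order already submitted");
            record(new ItemAdded(sku, quantity));
        }

        void submit() {                                  // command
            if (itemCount == 0) throw new IllegalStateException("empty order");
            record(new OrderSubmitted());
        }

        private void record(OrderEvent e) { apply(e); uncommitted.add(e); }

        private void apply(OrderEvent e) {               // state transition
            if (e instanceof ItemAdded added) itemCount += added.quantity();
            else if (e instanceof OrderSubmitted) submitted = true;
        }

        List<OrderEvent> pendingEvents() { return List.copyOf(uncommitted); }
    }

    /** Read side: a projection updated from the same events, queried separately. */
    static class OrderSummaryProjection {
        private int totalItems;
        void on(OrderEvent e) {
            if (e instanceof ItemAdded added) totalItems += added.quantity();
        }
        int totalItems() { return totalItems; }
    }

    public static void main(String[] args) {
        OrderAggregate order = OrderAggregate.replay(List.of());
        order.addItem("book-42", 2);
        order.submit();

        OrderSummaryProjection summary = new OrderSummaryProjection();
        order.pendingEvents().forEach(summary::on);      // in practice: delivered via the event store
        System.out.println("items = " + summary.totalItems());
    }
}
```

Because the full event history is retained, the same replay mechanism doubles as the audit trail and as a way to rebuild projections or reproduce a faulty state during troubleshooting, which is the property the abstract highlights.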
APA, Harvard, Vancouver, ISO, and other styles
29

Pan, yi-ching, and 潘義清. "Re-engineering Legacy Web-based Java Systems to J2EE-based Scalable Systems." Thesis, 2002. http://ndltd.ncl.edu.tw/handle/83138641080830714491.

Full text
Abstract:
Master's
Tunghai University
Department of Computer Science and Information Engineering
90
With the rise of the Internet and the WWW, computerization and automation are flourishing in all kinds of industries. Web-based applications are an attractive way for enterprises to deliver services quickly. However, competition is fierce, marketing strategies change frequently, and the number of Internet users is growing rapidly, so enterprises must integrate and upgrade their existing systems in order to keep responding quickly to customers. Without a well-designed software architecture this is consistently expensive, and systems that took considerable time and money to develop may have to be discarded early. Software re-engineering extracts the useful information from a legacy system and uses it to rebuild the software; it not only reduces development cost but also helps the system keep up with a changing market. How to re-engineer a legacy system into a highly reliable, maintainable, scalable system whose services can be adjusted conveniently and flexibly is therefore an urgent question. Because component-based, N-tier software architectures have become the mainstream for large-scale software development, and because of Java's strengths in distributed, cross-platform environments, this thesis adopts the server-side component model of J2EE (Java 2 Enterprise Edition) and its N-tier architecture as the target architecture for re-engineering. Using an on-line book order management system as an example, the thesis applies a standard procedure we propose to extract the useful information from unorganized, hard-to-extend JSP and JavaBeans code and transform it into the corresponding J2EE components within the J2EE N-tier architecture. Because these components follow the J2EE standard, they can be distributed across the tiers according to their functions and roles and interact with the J2EE container, which then handles system-level concerns such as security, concurrent access, persistence, and transactions. In addition, we study J2EE technology for distributed computing in order to build a scalable system with component load balancing, high reliability, extensibility, and flexibility.
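The abstract describes extracting business logic that is tangled inside JSP pages and ad-hoc Java Beans and turning it into J2EE components. The following Java sketch shows only the first, container-independent step of that kind of refactoring: the ordering logic is moved behind a narrow OrderService interface that a JSP or servlet can call, and which could then be packaged as a session bean so the J2EE container takes over security, concurrency and transactions. The names and the toy order-id generation are illustrative assumptions, not the thesis's actual procedure.

```java
import java.util.List;

/**
 * Sketch of the extraction step when re-engineering a JSP-centric design:
 * business logic that used to live in page scriptlets is moved behind a
 * narrow service interface. In the thesis's target architecture such a
 * component would be deployed as a session bean so the container handles
 * security, concurrency and transactions; names here are illustrative.
 */
public class OrderReengineeringSketch {

    record OrderLine(String isbn, int quantity) {}

    /** The interface a JSP (or servlet/controller) now depends on. */
    interface OrderService {
        String placeOrder(String customerId, List<OrderLine> lines);
    }

    /** Plain implementation; persistence and transactions become container concerns. */
    static class SimpleOrderService implements OrderService {
        @Override
        public String placeOrder(String customerId, List<OrderLine> lines) {
            if (lines.isEmpty()) throw new IllegalArgumentException("empty order");
            // ... validate stock, compute the price, persist via the persistence tier ...
            return "ORD-" + Math.abs((customerId + lines).hashCode());  // toy order id
        }
    }

    public static void main(String[] args) {
        OrderService service = new SimpleOrderService();
        String orderId = service.placeOrder("c-7",
                List.of(new OrderLine("978-0131857261", 1)));
        System.out.println("placed " + orderId);
    }
}
```

Keeping the interface free of presentation and container concerns is what lets the same component later be exposed as an EJB and distributed across tiers, as the abstract describes.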
APA, Harvard, Vancouver, ISO, and other styles
30

(10706787), Tiantu Xu. "Software Systems for Large-Scale Retrospective Video Analytics." Thesis, 2021.

Find full text
Abstract:

Pervasive cameras are generating videos at an unprecedented pace, making videos the new frontier of big data. As processors, e.g., CPUs and GPUs, become increasingly powerful, cloud and edge nodes can generate useful insights from colossal video data. However, while research in computer vision (CV) develops vigorously, the systems side has remained a blind spot in CV research. With colossal video data generated by cameras every day and limited compute budgets, how should software systems be designed to generate insights from video data efficiently?


Designing cost-efficient video analytics software systems is challenged by the expensive computation of vision operators, the colossal data volume, and the precious wireless bandwidth of surveillance cameras. To address the above challenges, three software systems are proposed in this thesis. For the first system, we present VStore, a data store that supports fast, resource-efficient analytics over large archival videos. VStore manages video ingestion, storage, retrieval, and consumption and controls video formats through backward derivation of configuration: in the opposite direction along the video data path, VStore passes the video quantity and quality expected by analytics backward to retrieval, to storage, and to ingestion. VStore derives an optimal set of video formats, optimizes for different resources in a progressive manner, and runs queries as fast as 362x video realtime. For the second system, we present a camera/cloud runtime called DIVA that supports querying cold videos distributed on low-cost wireless cameras. DIVA is built upon a novel zero-streaming paradigm: to save wireless bandwidth, when capturing video frames, a camera builds sparse yet accurate landmark frames without uploading any video data; when executing a query, a camera processes frames in multiple passes with increasingly more expensive operators. On diverse queries over 15 videos, DIVA runs at more than 100x realtime and outperforms competitive alternatives by a wide margin. For the third system, we present Clique, a practical object re-identification (ReID) engine that builds upon two unconventional techniques. First, Clique assesses target occurrences by clustering unreliable object features extracted by ReID algorithms, with each cluster representing the general impression of a distinct object to be matched against the input. Second, to search across camera videos, Clique samples cameras to maximize spatiotemporal coverage and incrementally adds cameras for processing on demand. Through evaluation on 25 hours of traffic videos from 25 cameras, Clique reaches a recall@5 of 0.87 across 70 queries and runs at 830x video realtime while achieving high accuracy.
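DIVA's idea of processing frames "in multiple passes with increasingly more expensive operators" can be pictured with a small sketch: cheap checks prune most frames so that only the survivors reach the costly operator. The Frame fields and the two predicates below are hypothetical stand-ins for real landmark/motion checks and object detectors, not the system's actual operators.

```java
import java.util.List;
import java.util.function.Predicate;

/**
 * Illustration of multi-pass retrospective video querying: frames are run
 * through increasingly expensive filters, so the costly operator only sees
 * the small fraction of frames that earlier passes could not rule out.
 * The filters below are trivial stand-ins for real vision operators.
 */
public class MultiPassQuerySketch {

    record Frame(long timestamp, double motionScore, boolean hasVehicle) {}

    static List<Frame> runPasses(List<Frame> frames, List<Predicate<Frame>> passes) {
        List<Frame> surviving = frames;
        for (Predicate<Frame> pass : passes) {
            surviving = surviving.stream().filter(pass).toList();
            System.out.println("pass kept " + surviving.size() + " frames");
        }
        return surviving;
    }

    public static void main(String[] args) {
        List<Frame> frames = List.of(
                new Frame(0, 0.01, false),   // static scene, dropped by the cheap pass
                new Frame(1, 0.40, false),   // motion but no vehicle
                new Frame(2, 0.55, true));   // candidate that reaches the expensive pass

        List<Frame> hits = runPasses(frames, List.of(
                f -> f.motionScore() > 0.1,  // cheap pass: landmark/motion check stand-in
                Frame::hasVehicle));         // expensive pass: object detector stand-in

        System.out.println("matches: " + hits);
    }
}
```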

APA, Harvard, Vancouver, ISO, and other styles
31

Parnell, Paul P., University of Western Sydney, and Faculty of Law. "Information technology law : 'micro-agreements' in systems integration and outsourcing projects : recognising and managing the legal implications of day to day interactions between parties to large and complex information technology projects." 2000. http://handle.uws.edu.au:8081/1959.7/25573.

Full text
Abstract:
This work describes the concept of the 'micro-agreement', representing the many forms of interaction that occur between parties involved in large and complex information technology projects. Micro-agreements can bring benefits as well as disadvantages to such projects and need to be managed effectively. This work begins by describing the nature of information technology projects from an engineering perspective, particularly in light of the problems that may occur. The existing legal doctrines relevant to such projects are then described and expanded into the concept of a micro-agreement. The concept of the micro-agreement is supported through the analysis of a number of case studies relevant to the information technology industry, together with further analysis of legal relationship models. A number of key recommendations are made which provide support for gaining maximum benefit from micro-agreements. These recommendations include: linking information technology contracts to software engineering best practice; using an appropriate legal relationship model; and developing an industry-wide Information Technology Code of Conduct.
Master of Laws (Hons)
APA, Harvard, Vancouver, ISO, and other styles
