
Journal articles on the topic 'Legacy tools'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Legacy tools.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

MacDonald, Stuart Wyllie. "Tools for community: Ivan Illich's legacy." International Journal of Education Through Art 8, no. 2 (2012): 121–33. http://dx.doi.org/10.1386/eta.8.2.121_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Then, Matthias, Benjamin Wallenborn, Birgit R. Ianniello, Duc Binh Vu, Michael Fuchs, and Matthias L. Hemmje. "Innovative Authoring Tools for Online-Courses with Assignments - Integrating Heterogeneous Tools of e-Learning Platforms into Hybrid Application Solutions." International Journal of Emerging Technologies in Learning (iJET) 11, no. 02 (2016): 12. http://dx.doi.org/10.3991/ijet.v11i02.5108.

Abstract:
This paper is concerned with an essential topic in e-learning: course content authoring. Besides supporting the concept of Competence-Based Learning (CBL), our solution aims to make effective use of an open integration architecture fostering the interoperability of hybrid e-learning solutions. Modern scenarios ask for interoperable software solutions that seamlessly integrate existing e-learning infrastructures and legacy tools with innovative technologies while being cognitively efficient to handle. In this way, prospective users are enabled to use them seamlessly without learning overheads. At the same time, methods of Learning Design (LD) in combination with CBL are becoming more and more important to produce and maintain easy-to-facilitate solutions. Our approach of developing competence-based course-authoring and assignment-support software bridges the gaps between Moodle and established legacy infrastructures by embedding existing legacy tools via Learning Tools Interoperability (LTI). The underlying conceptual architecture for this integration approach and its components will be introduced; furthermore, a Moodle plugin will be outlined that enables Moodle for LD and CBL support, including corresponding data exchange with our course-authoring tool. The paper concludes with an outlook on future plans for our research and development.
3

Wright, Adam, Pamela M. Neri, Skye Aaron, et al. "Development and evaluation of a novel user interface for reviewing clinical microbiology results." Journal of the American Medical Informatics Association 25, no. 8 (2018): 1064–68. http://dx.doi.org/10.1093/jamia/ocy014.

Abstract:
Background: Microbiology laboratory results are complex and cumbersome to review. We sought to develop a new review tool to improve the ease and accuracy of microbiology results review. Methods: We observed and informally interviewed clinicians to determine areas in which existing microbiology review tools were lacking. We developed a new tool that reorganizes microbiology results by time and organism. We conducted a scenario-based usability evaluation to compare the new tool to existing legacy tools, using a balanced block design. Results: The average time-on-task decreased from 45.3 min for the legacy tools to 27.1 min for the new tool (P < .0001). Total errors decreased from 41 with the legacy tools to 19 with the new tool (P = .0068). The average Single Ease Question score was 5.65 (out of 7) for the new tool, compared to 3.78 for the legacy tools (P < .0001). The new tool scored 88 (“Excellent”) on the System Usability Scale. Conclusions: The new tool substantially improved efficiency, accuracy, and usability. It was subsequently integrated into the electronic health record and rolled out system-wide. This project provides an example of how clinical and informatics teams can innovate alongside a commercial Electronic Health Record (EHR).
4

Brito, Kellyton dos Santos, Vinícius Garcia, Eduardo Santana de Almeida, and Silvio Romero de Lemos Meira. "LIFT - A Legacy InFormation Retrieval Tool." JUCS - Journal of Universal Computer Science 14, no. 8 (2008): 1256–84. https://doi.org/10.3217/jucs-014-08-1256.

Abstract:
Nowadays software systems are essential to the environment of most organizations, and their maintenance is a key point to support business dynamics. Thus, reverse engineering legacy systems for knowledge reuse has become a major concern in the software industry. This article, based on a survey of reverse engineering tools, discusses a set of functional and non-functional requirements for an effective reverse engineering tool, and observes that current tools only partly support these requirements. In addition, we define new requirements, based on our group's experience and industry feedback, and present the architecture and implementation of LIFT: a Legacy InFormation retrieval Tool developed based on these demands. Furthermore, we discuss the compliance of LIFT with the defined requirements. Finally, we applied LIFT in a reverse engineering project of a 210 KLOC NATURAL/ADABAS system of a financial institution and analyzed its effectiveness and scalability, comparing data with previous similar projects performed by the same institution.
5

Harkins, Arthur, and George Kubik. "Legacy‐based Thinking II: Resisting New Tools and Competencies." On the Horizon 9, no. 5 (2001): 6–9. http://dx.doi.org/10.1108/10748120110424825.

6

Aakarsh Mavi. "Bridging the Gap: Cybersecurity Automation for Legacy Manufacturing Systems." Journal of Information Systems Engineering and Management 10, no. 30s (2025): 21–33. https://doi.org/10.52783/jisem.v10i30s.4768.

Abstract:
Legacy manufacturing systems play a big role in industrial production, but they usually don’t have strong cybersecurity measures in place, which makes them easy targets for modern cyber threats. Because these systems are often outdated, they present serious security risks, as they weren’t built to defend against today’s cyber-attacks. This study aims to fill the cybersecurity gap in legacy manufacturing environments by creating automated tools that boost the security of these systems without needing a lot of hands-on work. The framework includes automated patch management and vulnerability scanning tools, making sure that important security updates are consistently applied to legacy systems. This helps reduce the risk of hackers exploiting any unpatched vulnerabilities. What’s more, the research also uses network segmentation and anomaly detection technologies. These strategies help keep critical legacy systems separate from the wider network and watch for any unusual activity that could signal a cyber-attack. By combining these approaches, it helps stop legacy systems from being compromised and used as gateways for cybercriminals. Not only does this strengthen the overall security setup, but it also minimizes operational hiccups, making sure both system uptime and security stay strong. By automating these processes, this research offers a scalable, effective, and sustainable way to protect legacy manufacturing systems from evolving cyber threats.
7

Greene, Frederick L. "The Legacy of a Boston Curmudgeon." American Surgeon 80, no. 7 (2014): 631–34. http://dx.doi.org/10.1177/000313481408000714.

Abstract:
Ernest Amory Codman developed the End-Results System that has given rise to many quality tools and registries that surgeons are familiar with today. Although not appreciated by colleagues for his revolutionary concepts, he has now been recognized as a visionary and developer of many of the concepts that we equate with good surgical care and follow-up.
8

De Lucia, Andrea, Rita Francese, Giuseppe Scanniello, and Genoveffa Tortora. "Developing legacy system migration methods and tools for technology transfer." Software: Practice and Experience 38, no. 13 (2008): 1333–64. http://dx.doi.org/10.1002/spe.870.

9

Burrows, C. R. "Fluid Power Systems Design—Bramah's Legacy." Proceedings of the Institution of Mechanical Engineers, Part A: Journal of Power and Energy 210, no. 2 (1996): 105–20. http://dx.doi.org/10.1243/pime_proc_1996_210_018_02.

Abstract:
This paper gives a brief historical survey of fluid power systems. It shows that many current applications can be traced to ideas embedded in the pioneering work of Bramah and later contributors. The demand for high dynamic performance has been satisfied by advances in components and the fusion of technologies. A central theme is the importance of developing effective tools for system synthesis. It is shown that research in progress will ensure a future for fluid power in the twenty-first century.
10

Kinahan, Kelly L. "Historic preservation as a community development tool in legacy city neighbourhoods." Community Development Journal 54, no. 4 (2018): 581–604. http://dx.doi.org/10.1093/cdj/bsy035.

Abstract:
For legacy cities, population decline and economic restructuring contributed to the challenges facing their built environments including low demand, oversupply, and high rates of vacancy and abandonment. Amidst this backdrop, there is intense pressure for demolition, yet legacy cities also possess rich stocks of historic resources that can potentially serve as physical assets for community development. Market-based historic preservation incentives such as historic rehabilitation tax credit (RTC) programs are important tools for facilitating reinvestment in legacy cities. These tools are also criticized for primarily benefiting the real estate developers spearheading these projects or creating inequitable neighbourhood change. This research analyzes federal historic RTC projects in two St. Louis, Missouri neighbourhoods – Lafayette Square and Midtown Alley – between 1997 and 2010 and asks: in what ways do investments supported by historic tax credit programs function as a tool for legacy city community development? Through interviews and document analysis, I find that historic tax credit projects support neighbourhood stabilization by minimizing vacancies and shifting redevelopment approaches away from demolition and towards preservation. These projects help build capacity among real estate developers to take on historic preservation redevelopments in other neighbourhoods. However, residents and community-based organizations are often disconnected from these projects, limiting their usefulness as a community development tool.
11

Leonard, Allenna. "Stafford Beer and the legacy of Cybersyn: seeing around corners." Kybernetes 44, no. 6/7 (2015): 926–34. http://dx.doi.org/10.1108/k-02-2015-0045.

Abstract:
Purpose – The purpose of this paper is to reflect on the legacy of Stafford Beer and the continuing implications of his work on Cybersyn and the models and tools he used and explored during the project and in his later work. Design/methodology/approach – Description of Stafford Beer’s work on Cybersyn and examples of its present day applicability. Findings – The values and tools associated with the Cybersyn work in Chile continue to be relevant for the challenges of the present and an example of an approach to management structure and practice that serves both efficiency and humanity. Originality/value – The value of this work is to contribute to the history and future possibilities of the ideas and tools pioneered in the Cybersyn project by Stafford Beer and others and their broader context in organizational cybernetics.
12

Williams, Kenton, Steven A. Sader, Christopher Pryor, and Frank Reed. "Application of Geospatial Technology to Monitor Forest Legacy Conservation Easements." Journal of Forestry 104, no. 2 (2006): 89–93. http://dx.doi.org/10.1093/jof/104.2.89.

Abstract:
Remote sensing and other geospatial tools are being applied to monitor large working forest conservation easements and to assist office and field users to improve the efficiency of required monitoring. Two USDA-Forest Legacy Program conservation easements in northern New England are included as case studies to describe how to acquire geospatial data and apply these tools to monitor selected easement features. Much of the remote sensing and geographical information system (GIS) data are free or low cost, and functional software capable of viewing, querying, and making measurements on images and maps can be acquired from public sources. A high-resolution, digital image management system linked to satellite imagery and the GIS database is presented. Finally, the geospatial monitoring tools on a handheld personal data assistant device for field users are demonstrated.
13

Acuña, Ruben, Jacques Chomilier, and Zoé Lacroix. "Managing and Documenting Legacy Scientific Workflows." Journal of Integrative Bioinformatics 12, no. 3 (2015): 65–87. http://dx.doi.org/10.1515/jib-2015-277.

Abstract:
Scientific legacy workflows are often developed over many years, poorly documented and implemented with scripting languages. In the context of our cross-disciplinary projects we face the problem of maintaining such scientific workflows. This paper presents the Workflow Instrumentation for Structure Extraction (WISE) method used to process several ad-hoc legacy workflows written in Python and automatically produce their workflow structural skeleton. Unlike many existing methods, WISE does not assume input workflows to be preprocessed in a known workflow formalism. It is also able to identify and analyze calls to external tools. We present the method and report its results on several scientific workflows.
14

Woods, Kam, and Geoffrey Brown. "Assisted Emulation for Legacy Executables." International Journal of Digital Curation 5, no. 1 (2010): 160–71. http://dx.doi.org/10.2218/ijdc.v5i1.150.

Abstract:
Emulation is frequently discussed as a failsafe preservation strategy for born-digital documents that depend on contemporaneous software for access (Rothenberg, 2000). Yet little has been written about the contextual knowledge required to successfully use such software. The approach we advocate is to preserve necessary contextual information through scripts designed to control the legacy environment, and created during the preservation workflow. We describe software designed to minimize dependence on this knowledge by offering automated configuration and execution of emulated environments. We demonstrate that even simple scripts can reduce impediments to casual use of the digital objects being preserved. We describe tools to automate the remote use of preserved objects on local emulation environments. This can help eliminate both a dependence on physical reference workstations at preservation institutions, and provide users accessing materials over the web with simplified, easy-to-use environments. Our implementation is applied to examples from an existing collection of over 4,000 virtual CD-ROM images containing thousands of custom binary executables.
15

Vijayasekhar, Duvvur. "Modernizing Legacy Applications: Navigating Potential Issues and Roadblocks." European Journal of Advances in Engineering and Technology 9, no. 1 (2022): 26–30. https://doi.org/10.5281/zenodo.12770625.

Abstract:
Modernizing legacy applications is imperative for organizations to stay competitive in today's dynamic digital landscape. However, this process is fraught with challenges, including technical debt, scalability issues, security vulnerabilities, compatibility concerns, and knowledge gaps. This article explores these potential issues and offers strategies for navigating them effectively. By conducting comprehensive assessments, adopting incremental approaches, leveraging automated tools, investing in employee training, fostering collaboration, implementing robust security measures, and considering cloud-native architectures, organizations can successfully modernize their legacy applications and unlock new opportunities for innovation and growth.
16

Kliner, Dahv, and Alex Kingsbury. "The shape of things to come." PhotonicsViews 21, no. 2 (2024): 59–63. http://dx.doi.org/10.1002/phvs.202400012.

Abstract:
All‐fiber beam shaping is revolutionizing laser‐based manufacturing. This capability is a commercially accessible reality in cutting, welding, and additive manufacturing tools released by leading integrators worldwide. These advanced tools increase productivity and part quality and introduce entirely new production capabilities, driving the displacement of legacy lasers and non‐laser technologies in existing applications and spurring the development of new markets.
17

Clayer, François, Leah Jackson-Blake, Daniel Mercado-Bettín, et al. "Sources of skill in lake temperature, discharge and ice-off seasonal forecasting tools." Hydrology and Earth System Sciences 27, no. 6 (2023): 1361–81. http://dx.doi.org/10.5194/hess-27-1361-2023.

Abstract:
Despite high potential benefits, the development of seasonal forecasting tools in the water sector has been slower than in other sectors. Here we assess the skill of seasonal forecasting tools for lakes and reservoirs set up at four sites in Australia and Europe. These tools consist of coupled hydrological catchment and lake models forced with seasonal meteorological forecast ensembles to provide probabilistic predictions of seasonal anomalies in water discharge, temperature and ice-off. Successful implementation requires a rigorous assessment of the tools' predictive skill and an apportionment of the predictability between legacy effects and input forcing data. To this end, models were forced with two meteorological datasets from the European Centre for Medium-Range Weather Forecasts (ECMWF): the seasonal forecasting system SEAS5, with 3-month lead times, and the ERA5 reanalysis. Historical skill was assessed by comparing both model outputs, i.e. seasonal lake hindcasts (forced with SEAS5) and pseudo-observations (forced with ERA5). The skill of the seasonal lake hindcasts was generally low, although higher than the reference hindcasts, i.e. pseudo-observations, at some sites for certain combinations of season and variable. The SEAS5 meteorological predictions showed less skill than the lake hindcasts. In fact, skilful lake hindcasts identified for selected seasons and variables were not always synchronous with skilful SEAS5 meteorological hindcasts, raising questions on the source of the predictability. A set of sensitivity analyses showed that most of the forecasting skill originates from legacy effects, although during winter and spring in Norway some skill was coming from SEAS5 over the 3-month target season. When SEAS5 hindcasts were skilful, additional predictive skill originates from the interaction between legacy and SEAS5 skill. We conclude that lake forecasts forced with an ensemble of boundary conditions resampled from historical meteorology are currently likely to yield higher-quality forecasts in most cases.
18

Srinivas, Adilapuram. "The Roadmap to Legacy System Modernization: Phased Approach to Mainframe Migration and Cloud Adoption." Journal of Scientific and Engineering Research 7, no. 9 (2020): 252–57. https://doi.org/10.5281/zenodo.14770551.

Abstract:
Legacy systems often present significant challenges for organizations seeking to modernize their IT infrastructure. Migrating these systems to contemporary platforms requires careful planning and execution to mitigate risks and maximize benefits. This paper addressed the challenges of migrating legacy mainframe systems to modern cloud platforms. It proposed a phased approach to mitigate risks such as performance issues, skill gaps, and high costs. The study outlined three phases: assessment and planning, rehosting and refactoring, and application modernization. Recommendations included employing CI/CD pipelines, upskilling teams, and utilizing performance monitoring tools.
19

Md. Noh, Shahizan, Zurina Shafii, Ainulashikin Marzuki, and Ahmad Saruji Abdul Aziz. "ISLAMIC LEGACY PLANNING INDUSTRY IN MALAYSIA: VALIDATION ON COMPETENCY-BASED CERTIFICATION FOR ISLAMIC LEGACY PLANNERS." Advanced International Journal of Banking, Accounting and Finance 2, no. 2 (2020): 31–45. http://dx.doi.org/10.35631/aijbaf.22004.

Abstract:
Islamic legacy planning is poised to be the new growth area in the Islamic finance space. There is a need to professionalize this area to ensure that best practices are applied in assisting society with the right legacy planning solutions. The objective of the paper is to explore the industry's expectations of the content of a certification for practitioners in the Islamic legacy planning field in Malaysia. This study applies a qualitative research method involving an in-depth focus group discussion with subject matter experts, academicians, and personnel in the area of Islamic financial planning. The focus group discussion was conducted in order to keep abreast of the current trends of the industry. The input sought from the panels includes improving the learning process, suggesting the method of delivering the subject, and recommending relevant learning methodology. The suitability of the program structure was also discussed with participants. The experts agreed that the content of the training should be comprehensive enough to cover the required knowledge, skills and other characteristics (KSOC) of practitioners in the Islamic legacy planning field. The outcome of the program developed is to advise clients on Islamic legacy planning practices and to use the planning tools effectively. This study discovered the knowledge, skills and other characteristics (KSOC) required to practice in the Islamic legacy planning industry in Malaysia from the perspective of practitioners, and serves as a trajectory toward the development of a competency framework for Islamic legacy planners in Malaysia.
20

Singha, Roy Arabinda. "A legacy of used stone tools from Palaeolithic to Neolithic at Chuagara in the Suvarnarekha-Burahabanga complex." Journal of Heritage, Archaeology & Management (JHAM) 2, no. 1 (2022): 63–89. https://doi.org/10.5281/zenodo.7265475.

Abstract:
Chuagara (or Chau Gora) is a site located on plain land in the middle of a hilly area beside the river Suvarnarekha in the Suvarnarekha-Burahabanga complex. The entire drainage system runs over a peneplain surface that developed due to lava flow. The complex is known for occurrences of tools and materials of pre- and proto-historic culture, which scholars have collected in great numbers beside both rivers. A continuous cultural sequence of prehistoric culture, from the Palaeolithic to the Neolithic, has been collected. Details of the tools and their making technologies have been analysed. Though any postulation based on surface collections would be vague, the location of the site among the other prehistoric sites, and the absence of early historic and even late medieval materials, somehow let us postulate that the collected materials must be placed within the bracket of the prehistoric period.
21

LIU, ZHENG-YANG. "AUTOMATING SOFTWARE EVOLUTION." International Journal of Software Engineering and Knowledge Engineering 05, no. 01 (1995): 73–87. http://dx.doi.org/10.1142/s0218194095000058.

Abstract:
This paper presents a pragmatic knowledge-based approach to evolving and reengineering legacy business software systems. We describe a CASE tool for assisting legacy system evolution by automating the tedious and knowledge-intensive conversion process and show how we developed and maintained the tool and how it is used with payoff. This work demonstrates that timely technology upgrades of legacy systems are only possible with carefully engineered knowledge-based tools. Viewing software as an artifact with structure, behavior, and function, we can represent most program conversion knowledge explicitly for function-preserving transformation in an automated conversion system. The payoff for using the knowledge-based approach to software evolution is not only in terms of valuable resources saved, but also in terms of improved quality of the upgraded software systems.
22

Rohit Reddy Kommareddy. "Migration strategies for large-scale legacy applications to AWS cloud ecosystems." World Journal of Advanced Engineering Technology and Sciences 15, no. 3 (2025): 1673–81. https://doi.org/10.30574/wjaets.2025.15.3.0992.

Abstract:
As organizations strive to modernize their IT infrastructure and remain competitive in the digital economy, cloud migration has become a strategic necessity. Migrating large-scale legacy applications to cloud platforms like Amazon Web Services (AWS) offers advantages in scalability, resilience, performance, and cost optimization. However, legacy applications often present unique challenges due to outdated architectures, tight system coupling, and critical business dependencies. This review explores current migration strategies—including rehosting, replatforming, and refactoring—focusing on their effectiveness, decision frameworks, tools, performance outcomes, and risk mitigation. Through synthesis of academic literature, industrial case studies, and experimental evaluations, this paper provides a comprehensive overview of migration practices for legacy systems targeting AWS. Future research directions and open challenges are identified, encouraging a more automated, secure, and context-aware migration ecosystem.
23

Chu, Shaoping, Hari Viswanathan, and Nathan Moodie. "Legacy Well Leakage Risk Analysis at the Farnsworth Unit Site." Energies 16, no. 18 (2023): 6437. http://dx.doi.org/10.3390/en16186437.

Abstract:
This paper summarizes the results of the risk analysis and characterization of the CO2 and brine leakage potential of Farnsworth Unit (FWU) site wells. The study is part of the U.S. DOE’s National Risk Assessment Partnership (NRAP) program, which aims to quantitatively evaluate long-term environmental risks under conditions of significant geologic uncertainty and variability. To achieve this, NRAP utilizes risk assessment and computational tools specifically designed to quantify uncertainties and calculate the risk associated with geologic carbon dioxide (CO2) sequestration. For this study, we have developed a workflow that utilizes physics-based reservoir simulation results as input to perform leakage calculations using NRAP Tools, specifically NRAP-IAM-CS and RROM-Gen. These tools enable us to conduct leakage risk analysis based on ECLIPSE reservoir simulation results and to characterize wellbore leakage at the Farnsworth Unit Site. We analyze the risk of leakage from both individual wells and the entire field under various wellbore integrity distribution scenarios. The results of the risk analysis for the leakage potential of FWU wells indicate that, when compared to the total amount of CO2 injected, the highest cemented well integrity distribution scenario (FutureGen high flow rate) exhibits approximately 0.01% cumulative CO2 leakage for a 25-year CO2 injection duration at the end of a 50-year post-injection monitoring period. In contrast, the highest possible leakage scenario (open well) shows approximately 0.1% cumulative CO2 leakage over the same time frame.
24

Wang, Hai H., Danica Damljanovic, Terry Payne, Nicholas Gibbins, and Kalina Bontcheva. "Transition of legacy systems to semantically enabled applications: TAO method and tools." Semantic Web 3, no. 2 (2012): 157–68. http://dx.doi.org/10.3233/sw-2011-0039.

25

Leduc, D., J. England, and R. Rothermel. "Vulnerabilities of legacy fuel casks when evaluated using modern structural analysis tools." Packaging, Transport, Storage & Security of Radioactive Material 20, no. 2 (2009): 82–87. http://dx.doi.org/10.1179/174651009x443097.

26

Franklin, Cynthia G., Njoroge Njoroge, and Suzanna Reiss. "Tracing the Settler's Tools: A Forum on Patrick Wolfe's Life and Legacy." American Quarterly 69, no. 2 (2017): 235–47. http://dx.doi.org/10.1353/aq.2017.0017.

27

Ottenbreit-Leftwich, Anne, and Theresa A. Cullen. "Preserving the Legacy of PT3 Tools, Strategies & Resources: Knowledge Capture Artifacts." TechTrends 50, no. 3 (2006): 46–53. http://dx.doi.org/10.1007/s11528-006-7603-0.

28

Agarwal, Archita, Marilyn George, Aaron Jeyaraj, and Malte Schwarzkopf. "Retrofitting GDPR compliance onto legacy databases." Proceedings of the VLDB Endowment 15, no. 4 (2021): 958–70. http://dx.doi.org/10.14778/3503585.3503603.

Abstract:
New privacy laws like the European Union's General Data Protection Regulation (GDPR) require database administrators (DBAs) to identify all information related to an individual on request, e.g., to return or delete it. This requires time-consuming manual labor today, particularly for legacy schemas and applications. In this paper, we investigate what it takes to provide mostly-automated tools that assist DBAs in GDPR-compliant data extraction for legacy databases. We find that a combination of techniques is needed to realize a tool that works for the databases of real-world applications, such as web applications, which may violate strict normal forms or encode data relationships in bespoke ways. Our tool, GDPRizer, relies on foreign keys, query logs that identify implied relationships, data-driven methods, and coarse-grained annotations provided by the DBA to extract an individual's data. In a case study with three popular web applications, GDPRizer achieves 100% precision and 96–100% recall. GDPRizer saves work compared to hand-written queries, and while manual verification of its outputs is required, GDPRizer simplifies privacy compliance.
29

Thilmany, Jean. "Information Aging." Mechanical Engineering 130, no. 03 (2008): 22–25. http://dx.doi.org/10.1115/1.2008-mar-1.

Abstract:
This paper emphasizes the importance of keeping aging information in a format that can be readily used and understood by everyone. In order to get up-to-the-minute access to older engineering information, managers need to be ever vigilant about ensuring that legacy data exists in a format that can be easily understood and accessed. Today a number of new software applications and technologies can help even those companies with seemingly the most outdated of computers, the mainframe. Recently, software developers have introduced innovative ways in which companies can speedily retrieve legacy information, whether it is stored in a format for a desktop computer or a mainframe. These methods involve migrating or upgrading information or installing middleware, all at some cost. Some companies may choose to move legacy data to the open XML format and use a number of software tools, such as database query methods, to quickly retrieve information. Legacy documents that can be written into XML can include blueprints, CAD designs, change orders, materials specifications, assembly instructions, and cost estimates.
APA, Harvard, Vancouver, ISO, and other styles
30

Mahesh, Mokale. "Optimizing Legacy Data Systems: Techniques for Seamless Modernization." International Journal of Innovative Research in Engineering & Multidisciplinary Physical Sciences 7, no. 1 (2019): 1–6. https://doi.org/10.5281/zenodo.14900623.

Full text
Abstract:
Legacy data systems remain vital for many organizations, serving as the backbone for critical business operations. However, as technology evolves, these systems often become bottlenecks due to outdated architectures, inefficient processes, limited scalability, and security vulnerabilities. Organizations frequently encounter real-world challenges such as data silos, lack of interoperability with modern applications, high operational costs, and compliance issues that hinder agility and innovation. This paper examines comprehensive techniques for optimizing and modernizing legacy data systems while maintaining data integrity and minimizing disruptions. The discussion focuses on methodologies available on or before 2019, including database migration strategies, advanced data integration approaches, performance optimization techniques, and the adoption of hybrid cloud models. By leveraging schema refinements, indexing strategies, data partitioning, ETL processes, and real-time processing tools such as Apache Kafka, organizations can enhance system performance and interoperability. Additionally, the paper explores best practices for bridging legacy systems with modern applications through middleware solutions and hybrid cloud environments. These strategies collectively enable organizations to extend the longevity of their legacy infrastructures while improving efficiency, scalability, security, and compliance with evolving regulatory requirements.
APA, Harvard, Vancouver, ISO, and other styles
31

Vega-Gorgojo, Guillermo, Eduardo Gómez-Sánchez, Miguel Bote-Lorenzo, and Juan Asensio-Pérez. "RESTifying a Legacy Semantic Search System: Experience and Lessons Learned." JUCS - Journal of Universal Computer Science 18, no. (2) (2012): 286–311. https://doi.org/10.3217/jucs-018-02-0286.

Full text
Abstract:
The REST architectural style pursues scalability and decoupling of application components on target architectures, as opposed to the focus on distribution transparency of RPC-based middleware infrastructures. The ongoing debate between REST and RPC proponents evidences the need for comparisons of both approaches, as well as for case studies showing the implications of developing RESTful applications. With this aim, this paper presents a revamped RESTful version of a legacy RPC-based search system for educational tools named Ontoolsearch. The former version suffers from reduced interoperability with third-party clients, limited visibility of interactions, and some scalability issues due to the use of an RPC-based middleware. These limitations are addressed in the RESTful application as a result of applying REST constraints and using the Atom data format. Further, a benchmarking experiment showed that the scalability of the RESTful prototype is superior, measuring a ∼3 times increase in peak throughput. In addition, some lessons learned on RESTful design and implementation have been derived from this work that may be of interest for future developments.
APA, Harvard, Vancouver, ISO, and other styles
32

Vats, Satvik, Bharat Bhushan Sagar, Karan Singh, Ali Ahmadian, and Bruno A. Pansera. "Performance Evaluation of an Independent Time Optimized Infrastructure for Big Data Analytics that Maintains Symmetry." Symmetry 12, no. 8 (2020): 1274. http://dx.doi.org/10.3390/sym12081274.

Full text
Abstract:
Traditional data analytics tools are designed to deal with asymmetrical types of data, i.e., structured, semi-structured, and unstructured. The diverse behavior of data produced by different sources requires the selection of suitable tools. The restriction of resources available to deal with huge volumes of data is a challenge for these tools, which affects their execution time. Therefore, in the present paper, we propose a time-optimization model that shares a common HDFS (Hadoop Distributed File System) between three Name-nodes (Master Nodes), three Data-nodes, and one Client-node. These nodes work under a DeMilitarized Zone (DMZ) to maintain symmetry. Machine learning jobs are explored from an independent platform to realize this model. In the first node (Name-node 1), Mahout is installed with all machine learning libraries through the Maven repositories. In the second node (Name-node 2), R, connected to Hadoop, runs through Shiny Server. Splunk is configured in the third node (Name-node 3) and is used to analyze the logs. Experiments are performed between the proposed and legacy models to evaluate response time, execution time, and throughput. K-means clustering, Naïve Bayes, and recommender algorithms are run on three different data sets, i.e., the movie rating, newsgroup, and spam SMS data sets, representing structured, semi-structured, and unstructured data, respectively. The selection of tools defines data independence; e.g., the newsgroup data set runs on Mahout, as other tools are not compatible with this data. It is evident from the outcomes that the performance of the proposed model establishes the hypothesis that our model overcomes the resource limitations of the legacy model. In addition, the proposed model can process any kind of algorithm on different sets of data, which reside in their native formats.
APA, Harvard, Vancouver, ISO, and other styles
33

Gregersen, J. B., P. J. A. Gijsbers, S. J. P. Westen, and M. Blind. "OpenMI: the essential concepts and their implications for legacy software." Advances in Geosciences 4 (August 9, 2005): 37–44. http://dx.doi.org/10.5194/adgeo-4-37-2005.

Full text
Abstract:
Information & Communication Technology (ICT) tools such as computational models are very helpful in designing river basin management plans (rbmp-s). However, there is consensus in the scientific world that a single integrated modelling system to support, e.g., the implementation of the Water Framework Directive cannot be developed, and that integrated systems need to be very much tailored to the local situation. As a consequence, there is an urgent need to increase the flexibility of modelling systems, such that dedicated model systems can be developed from available building blocks. The HarmonIT project aims at precisely that. Its objective is to develop and implement a standard interface for modelling components and other relevant tools: the Open Modelling Interface (OpenMI) standard. The OpenMI standard has been completed and documented. It relies entirely on the "pull" principle, where data are pulled by one model from the previous model in the chain. This paper gives an overview of the OpenMI standard and explains the foremost concepts and the rationale behind them.
APA, Harvard, Vancouver, ISO, and other styles
34

Tanaka, Masayuki, Hiroyuki Ikeda, Kazumi Murata, et al. "Hyper Suprime-Cam Legacy Archive." Publications of the Astronomical Society of Japan 73, no. 3 (2021): 735–46. http://dx.doi.org/10.1093/pasj/psab034.

Full text
Abstract:
We present the launch of the Hyper Suprime-Cam Legacy Archive (HSCLA), a public archive of processed, science-ready data from the Hyper Suprime-Cam (HSC). HSC is an optical wide-field imager installed at the prime focus of the Subaru Telescope and has been in operation since 2014. While ∼1/3 of the total observing time of HSC has been used for the Subaru Strategic Program (SSP), the remainder of the time is used for Principal Investigator (PI)-based programs. We have processed the data from these PI-based programs and make the processed, high-quality data available to the community through HSCLA. The current version of HSCLA includes data taken in the first year of science operation, 2014. We provide both individual and coadd images as well as photometric catalogs. The photometric catalog from the coadd is loaded into a database, which offers fast access to the large catalog. Other online tools, such as an image browser and an image cutout tool, will be useful for science analyses. The coadd images reach 24–27th magnitude at 5σ for point sources and cover approximately 580 square degrees in at least one filter, with 150 million objects in total. We perform extensive quality assurance tests and verify that the photometric and astrometric quality of the data is good enough for most scientific explorations. However, the data are not without problems, and users are referred to the list of known issues before exploiting the data for science. All the data and documentation can be found at the data release site, 〈https://hscla.mtk.nao.ac.jp/〉.
APA, Harvard, Vancouver, ISO, and other styles
35

Pavan Kumar Boyapati. "Modernizing data migration from legacy systems using an intelligent interface powered by AI." World Journal of Advanced Engineering Technology and Sciences 15, no. 1 (2025): 163–74. https://doi.org/10.30574/wjaets.2025.15.1.0183.

Full text
Abstract:
This article explores how organizations can modernize data migration from legacy systems to cloud platforms using an intelligent interface powered by artificial intelligence. The approach combines a metadata-driven foundation that captures comprehensive information about source and target systems with intuitive visual mapping tools that enable collaboration between technical and business stakeholders. AI capabilities significantly enhance legacy data understanding through automated profiling, schema discovery, intelligent classification, and smart transformation suggestions. The execution phase leverages automated conversion through transformation engines, code generation, and incremental migration support. Robust validation mechanisms ensure data integrity through quality verification, reconciliation reporting, and automated testing. The intelligent interface also facilitates stakeholder engagement through intuitive dashboards, collaboration tools, and knowledge repositories. Cloud integration provides additional advantages including elastic scalability, secure data handling, cost optimization, and seamless integration with cloud data services. Organizations implementing this approach can expect accelerated timelines, reduced costs, improved data quality, lower risk, better documentation, and increased stakeholder satisfaction.
APA, Harvard, Vancouver, ISO, and other styles
36

LaPorte, Jody, and Danielle N. Lussier. "What Is the Leninist Legacy? Assessing Twenty Years of Scholarship." Slavic Review 70, no. 3 (2011): 637–54. http://dx.doi.org/10.5612/slavicreview.70.3.0637.

Full text
Abstract:
In this review essay, Jody LaPorte and Danielle N. Lussier examine the “legacies” paradigm dominating postcommunist scholarship in the social sciences. The legacies paradigm has produced a growing list of factors that qualify as historical antecedents to contemporary outcomes without establishing a set of shared standards to guide comparative analysis. Scholars have paid less attention to developing a conceptual definition of legacy, thereby limiting our ability to evaluate the importance of historical factors versus more proximate causes. This critique presents a thoughtful analysis of the communist legacy, develops a typology that can be used to categorize legacy variables for meaningful comparison, and brings the concept into discussion with the broader literature on historical institutions and path dependency. By suggesting tools to aid comparative study, LaPorte and Lussier’s goal is to stimulate both conceptual and empirical analysis in evaluating the effect of communism on contemporary outcomes.
APA, Harvard, Vancouver, ISO, and other styles
37

Boronat, Artur, Jennifer Pérez, Jose Carsí, and Isidro Ramos. "Two Experiences in Software Dynamics." JUCS - Journal of Universal Computer Science 10, no. (4) (2004): 428–53. https://doi.org/10.3217/jucs-010-04-0428.

Full text
Abstract:
This paper presents an outline of a formal model management framework that provides breakthroughs for legacy systems recovery (RELS) and for data migration (ADAM). To recover a legacy system, we use an algebraic approach, using algebras to represent the models and manipulate them. RELS also automatically generates a data migration plan that specifies a data transfer process to save all the legacy knowledge in the new recovered database. The data migration solution is also introduced in the ADAM tool as a support for the evolution of O-O conceptual schemas whose persistent layers are stored in relational databases. The contents and structure of the data migration plans are specified using an abstract data migration language. Our past experience in both projects has guided us towards the model management research field. We present a case study that illustrates the application of both tools.
APA, Harvard, Vancouver, ISO, and other styles
38

DJORDJEVIC, GORAN S. "NONCOMMUTATIVITY AND HUMANITY — JULIUS WESS AND HIS LEGACY." International Journal of Modern Physics: Conference Series 13 (January 2012): 66–85. http://dx.doi.org/10.1142/s2010194512006745.

Full text
Abstract:
A personal view on Julius Wess's human and scientific legacy in Serbia and the Balkan region is given. Motivation for using noncommutative and nonarchimedean geometry on very short distances is presented. In addition to some mathematical preliminaries, we present a short introduction in adelic quantum mechanics in a way suitable for its noncommutative generalization. We also review the basic ideas and tools embedded in q-deformed and noncommutative quantum mechanics. A rather fundamental approach, called deformation quantization, is noted. A few relations between noncommutativity and nonarchimedean spaces, as well as similarities between corresponding quantum theories, in particular, quantum cosmology are pointed out. An extended Moyal product in a frame of an adelic noncommutative quantum mechanics is also considered.
APA, Harvard, Vancouver, ISO, and other styles
39

Perley, Daniel A. "The Swift GRB Host Galaxy Legacy Survey." Proceedings of the International Astronomical Union 11, A29B (2015): 248. http://dx.doi.org/10.1017/s1743921316005172.

Full text
Abstract:
I will describe the Swift Host Galaxy Legacy Survey (SHOALS), a comprehensive multiwavelength program to characterize the demographics of the GRB host population and its redshift evolution from z=0 to z=7. Using unbiased selection criteria, we have designated a subset of 119 Swift gamma-ray bursts which are now being targeted with intensive observational follow-up. Deep Spitzer imaging of every field has already been obtained and analyzed, with major programs ongoing at Keck, GTC, Gemini, VLT, and Magellan to obtain complementary optical/NIR photometry and spectroscopy to enable full SED modeling and derivation of fundamental physical parameters such as mass, extinction, and star-formation rate. Using these data, I will present an unbiased measurement of the GRB host-galaxy luminosity and mass distributions and their evolution with redshift, compare GRB hosts to other star-forming galaxy populations, and discuss implications for the nature of the GRB progenitor and the ability of GRBs to serve as tools for measuring and studying cosmic star-formation in the distant universe.
APA, Harvard, Vancouver, ISO, and other styles
40

Gundla, Sandeep Reddy. "AI-Assisted Legacy Modernization: Automating Monolith-to-Microservice Decomposition." International journal of networks and security 05, no. 01 (2025): 147–73. https://doi.org/10.55640/ijns-05-01-09.

Full text
Abstract:
Legacy systems still support critical business operations in many industries, but they are becoming roadblocks to innovation, agility, and scalability. As enterprises come under increasing pressure to modernize their aging infrastructures, the strategic transition from monolithic to microservices architectures is gaining ground. Transforming a complex monolith into microservices is not a trivial task: it presents technical and organizational challenges, including service boundaries buried in legacy codebases whose functionality is tightly coupled. This article examines how artificial intelligence (AI) can help automate the decomposition of monolithic systems into decoupled, scalable microservices. By using machine learning, natural language processing, and clustering algorithms, AI tools can analyze source code, runtime data, and interactions between system components to determine intelligent service boundaries. A detailed methodology for AI-assisted decomposition is presented, along with real-world tools such as IBM Mono2Micro and AWS Microservice Extractor. A practical case study involving a global e-commerce company illustrates applied outcomes. Additionally, the article addresses key challenges such as data inconsistency, domain misalignment, and organizational resistance, and outlines best practices to support successful implementation, including incremental migration patterns, domain-driven design, and DevOps integration. The article concludes with strategic recommendations and a forward-looking perspective on how AI will further change the modernization process. When done right, AI improves organizations' ability to create agile, future-ready software ecosystems.
APA, Harvard, Vancouver, ISO, and other styles
41

Naga Murali Krishna Koneru. "Modernizing CI/CD Pipelines: Migrating from Legacy Tools to GitLab for Enterprise Applications." International Journal of Science and Research Archive 1, no. 2 (2021): 136–56. https://doi.org/10.30574/ijsra.2021.1.2.0027.

Full text
Abstract:
In modern software development, CI and CD pipelines are essential tools for faster and more reliable delivery of high-quality software. Yet many enterprises continue to cling to inefficient, fragmented legacy CI/CD tools that introduce complexity when scaling and integrating with modern technologies. This paper advocates building a modern CI/CD system with GitLab, a DevOps platform combining version control, CI/CD, monitoring, and security into a single unified platform. It studies the difficulties with current CI/CD systems, the advantages GitLab presents, and how to perform an enterprise migration of CI/CD. Scalability, automation features, and smooth integration with cloud platforms like Kubernetes and AWS are advantages of GitLab. A real-world case study of migration to GitLab shows empirically that it improved deployment frequency, lead time, and operational efficiency. The results indicate that GitLab provides a scalable, efficient, and integrated solution that supports the requirements of modern software architecture, including microservices and cloud-native applications. This research gives businesses a hands-on approach to modernizing their CI/CD pipelines, focusing on automated processes, toolchain convergence, and continuous improvement. Organizations adopting GitLab gain speed in software delivery, reduce operational costs, and keep a market edge as technology becomes more complex.
APA, Harvard, Vancouver, ISO, and other styles
42

Couretas, J. M. "System Architectures: Legacy Tools/Methods, DoDAF Descriptions and Design Through System Alternative Enumeration." Journal of Defense Modeling and Simulation: Applications, Methodology, Technology 3, no. 4 (2006): 227–37. http://dx.doi.org/10.1177/875647930600300404.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Hughes, Sara. "Principles, drivers, and policy tools for just climate change adaptation in legacy cities." Environmental Science & Policy 111 (September 2020): 35–41. http://dx.doi.org/10.1016/j.envsci.2020.05.007.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Chukhman, Ilya, Shuoxin Lin, William Plishker, Chung-Ching Shen, and Shuvra S. Bhattacharyya. "Instrumentation-Driven Model Detection and Actor Partitioning for Dataflow Graphs." International Journal of Embedded and Real-Time Communication Systems 4, no. 1 (2013): 1–21. http://dx.doi.org/10.4018/jertcs.2013010101.

Full text
Abstract:
Dataflow modeling offers a myriad of tools to improve optimization and analysis of signal processing applications, and is often used by designers to help design, implement, and maintain systems on chip for signal processing. However, maintaining and upgrading legacy systems that were not originally designed using dataflow methods can be challenging. Designers often convert legacy code to dataflow graphs by hand, a process that can be difficult and time consuming. In this paper, the authors develop a method to facilitate this conversion process by automatically detecting the dataflow models of the core functions from bodies of legacy code. They focus first on detecting static dataflow models, such as homogeneous and synchronous dataflow, and then present an extension that can also detect dynamic dataflow models. Building on their algorithms for dataflow model detection, they present an iterative actor partitioning process that can be used to partition complex actors into simpler sub-functions that are more amenable to analysis techniques.
APA, Harvard, Vancouver, ISO, and other styles
45

Tandon, Yaman. "How GenAI Agents Are Transforming Legacy Application Modernization." European Journal of Computer Science and Information Technology 13, no. 36 (2025): 62–71. https://doi.org/10.37745/ejcsit.2013/vol13n366271.

Full text
Abstract:
This article explores how Generative AI (GenAI) is revolutionizing legacy application modernization in enterprise environments. Legacy systems, with their outdated technologies and rigid architectures, represent significant technical debt and maintenance burdens for organizations. GenAI-powered agents are emerging as transformative tools that can analyze complex codebases, discover implicit knowledge, recommend customized modernization strategies, and automate code transformation. The article examines core capabilities of these AI agents, including automated code analysis, intelligent strategy formulation, code transformation, and API generation. It presents implementation approaches across assessment, execution, and governance phases, supported by case studies from financial services, healthcare, and manufacturing sectors that demonstrate substantial improvements in modernization speed, cost, and outcomes. As these technologies continue to evolve, they promise to fundamentally reimagine how organizations approach technical debt and enable more adaptive, innovative technology landscapes.
APA, Harvard, Vancouver, ISO, and other styles
46

Hoole, Samuel Ratnajeevan Herbert, Thiruchelvam Arudchelvam, and Janaka Wijayakulasooriya. "Reverse Engineering Legacy Finite Element Code." Materials Science Forum 721 (June 2012): 307–12. http://dx.doi.org/10.4028/www.scientific.net/msf.721.307.

Full text
Abstract:
The development of code for finite elements-based field computation has been going on at a pace since the 1970s, yielding code that was not put through the software lifecycle, in which code is developed through a sequential process of requirements elicitation from the user/client, design, analysis, implementation and testing, and release and maintenance. As a result, today we have legacy code running into millions of lines, implemented without planning and without proper state-of-the-art software design tools. It is necessary to redo this code to exploit new object-oriented facilities, make corrections, or run on the web with Java. Object-oriented code's principal advantage is reusability. Recent advances in software make such reverse engineering/re-engineering of this code into object-oriented form possible. The purpose of this paper is to show how existing finite element code can be reverse/re-engineered to improve it. Taking sections of working finite element code, especially matrix computation for equation solution, as examples, we put them through reverse engineering to arrive at the effective UML design by which development was done and then translate it to Java. This then is the starting point for analyzing the design and improving it without having to throw away any of the old code. Using auto-translators and then visually rewriting parts according to the design so revealed has no match in terms of speed and efficiency of re-engineering legacy code.
APA, Harvard, Vancouver, ISO, and other styles
47

Nascimento, Elisa Larkin. "The Ram’s Horns: Reflections on the Legacy of Abdias Nascimento." Journal of Black Studies 52, no. 6 (2021): 588–601. http://dx.doi.org/10.1177/00219347211006484.

Full text
Abstract:
Abdias Nascimento’s legacy is timely in a world experiencing exacerbated racial conflict and setbacks in public policy addressing inequality. This essay addresses two dimensions: on one hand, Nascimento’s life and work, and the tools he used to combat racism in the diverse realms of social and political activism as well as culture and the arts; on the other, IPEAFRO’s efforts and initiatives to make his legacy a living one, current with the needs of our time. My sources are my firsthand experience as Nascimento’s translator, co-author, and co-worker for 36 years; and his archives, the contents of which IPEAFRO is in the process of organizing, microfilming, digitizing, and making available via internet, a project that I coordinate.
APA, Harvard, Vancouver, ISO, and other styles
48

Zuraidi, Siti Nor Fatimah, Mohammad Ashraf Abdul Rahman, and Zainal Abidin Akasah. "The Development of Condition Assessment for Heritage Building." E3S Web of Conferences 65 (2018): 01007. http://dx.doi.org/10.1051/e3sconf/20186501007.

Full text
Abstract:
This study examines the criteria and attributes of the elements of heritage buildings. Using the Analytic Hierarchy Process (AHP), a new instrument is developed based on the criteria and attributes that have been identified for heritage building elements. The new instrument is given to industry professionals and academicians to obtain their opinions. The study reports the attribute scores for each criterion. The results show that the new instrument can be used as a tool for assessing the condition of heritage building elements. This new instrument can be proposed to the National Heritage Department as a guideline for assessing heritage buildings in the future.
APA, Harvard, Vancouver, ISO, and other styles
49

Pognan, François, Thomas Steger-Hartmann, Carlos Díaz, et al. "The eTRANSAFE Project on Translational Safety Assessment through Integrative Knowledge Management: Achievements and Perspectives." Pharmaceuticals 14, no. 3 (2021): 237. http://dx.doi.org/10.3390/ph14030237.

Full text
Abstract:
eTRANSAFE is a research project funded within the Innovative Medicines Initiative (IMI), which aims at developing integrated databases and computational tools (the eTRANSAFE ToxHub) that support the translational safety assessment of new drugs by using legacy data provided by the pharmaceutical companies that participate in the project. The project objectives include the development of databases containing preclinical and clinical data, computational systems for translational analysis including tools for data query, analysis and visualization, as well as computational models to explain and predict drug safety events.
APA, Harvard, Vancouver, ISO, and other styles
50

Majó-Vázquez, Sílvia, Ana S. Cardenal, Oleguer Segarra, and Pol Colomer De Simón. "Media Roles in the Online News Domain: Authorities and Emergent Audience Brokers." Media and Communication 8, no. 2 (2020): 98–111. http://dx.doi.org/10.17645/mac.v8i2.2741.

Full text
Abstract:
This article empirically tests the role of legacy and digital-born news media, mapping the patterns of audience navigation across news sources and the relationships between news providers. We borrow tools from network science to present evidence suggesting that legacy news media retain control of the most central positions in the online news domain. Great progress has been made in discussing theoretically the impact of the Internet on the news media ecology. Less research attention, however, has been given to empirically testing changes in the role of legacy media and the rising prominence of digital-born outlets. To fill this gap, in this study we use the hyperlink-induced topic search algorithm, which identifies authorities by means of a hyperlink network, to show that legacy media are still the most authoritative sources in the media ecology. To further substantiate their dominant role, we also examine the structural position of news providers in the audience network. We gather navigation data from a panel of 30,000 people and use it to reproduce the network of patterns of news consumption. While legacy news media retain control of the brokerage positions for the general population, our analysis, focused on the patterns of young news consumers, reveals that new digital outlets also occupy relevant positions to control the audience flow. The results of this study have substantive implications for our understanding of news organizations' roles and how they attain authority in the digital age.
APA, Harvard, Vancouver, ISO, and other styles