Academic literature on the topic 'Research data preservation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Research data preservation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Research data preservation"

1

Kowalczyk, Stacy T. "Modelling the Research Data Lifecycle." International Journal of Digital Curation 12, no. 2 (2018): 331–61. http://dx.doi.org/10.2218/ijdc.v12i2.429.

Full text
Abstract:
This paper develops and tests a lifecycle model for the preservation of research data by investigating the research practices of scientists. This research is based on a mixed-method approach. An initial study was conducted using case study analytical techniques; insights from these case studies were combined with grounded theory in order to develop a novel model of the Digital Research Data Lifecycle. A broad-based quantitative survey was then constructed to test and extend the components of the model. The major contribution of these research initiatives is the creation of the Digital Research Data Lifecycle, a data lifecycle that provides a generalized model of the research process to better describe and explain both the antecedents of and barriers to preservation. The antecedents and barriers to preservation are data management, contextual metadata, file formats, and preservation technologies. The availability of data management support and preservation technologies, the ability to create and manage contextual metadata, and the choice of file formats all significantly affect the preservability of research data.
APA, Harvard, Vancouver, ISO, and other styles
2

Subaveerapandiyan, A., and Anuradha Maurya. "Research Data Preservation Practices of Library and Information Science Faculties." DESIDOC Journal of Library & Information Technology 42, no. 4 (2022): 259–64. https://doi.org/10.14429/djlit.42.4.17538.

Full text
Abstract:
Digitisation of research data is increasing worldwide, driven by the rapid development of digital technologies, and many libraries are beginning to offer data curation services. Research data curation concerns the invaluable, reusable information collected by researchers, and preserving that collected data is essential. Most higher education institutes preserve research data for their students and researchers, storing it long-term in various formats; this is called research data preservation. Without a proper research data management plan and its implementation, research data cannot be curated. The aim of the study is to identify Asian Library and Information Science (LIS) faculties' experiences with research data preservation and curation during their research. Data management, curation, and preservation are all interlinked, and data curation plays an essential role in the reuse of research data. For this research, we adopted a survey method: an online questionnaire was shared with 1400 LIS professionals in the Asian region, of whom 125 university faculty members from various Asian countries completed the study. The findings show that 76.8 per cent of respondents generated statistical data, followed by 58.4 per cent who generated textual files. By far the most preferred data analysis tool is Microsoft Excel (82.4 per cent). Moreover, the results show that generated data is mostly stored on personal computers and laptop hard disks. The study concludes that LIS faculties have adequate skills and knowledge of data curation and preservation, though they expect more services from their academic institute libraries.
APA, Harvard, Vancouver, ISO, and other styles
3

Ganguly, Raman. "Digital Ecosystems for Data Preservation." Digital Platform: Information Technologies in Sociocultural Sphere 1, no. 1 (2018): 87–96. https://doi.org/10.31866/2617-796x.1.2018.151343.

Full text
Abstract:
The purpose of this investigation is to review and research digital ecosystems for data preservation. This paper addresses issues concerning the handling of complex data, such as research data, multimedia content, and e-learning content, and the use of repository infrastructures. At the University of Vienna, an ecosystem for digital data preservation and research data management has already been established and will subsequently be enlarged according to future needs and requirements. This living digital ecosystem is the foundation for research data management and was implemented from the beginning as a central service according to the FAIR principles, as stated in the first HLEG-EOSC report (https://ec.europa.eu/digitalsingle-market). Drawing on ten years of professional experience, a model for digital data preservation was established to address the complexity of heterogeneous data. This was necessary because of the different use cases assigned to the interdisciplinary data management team based in the Computer Centre and the Library. The source of the use cases is research projects, with their different approaches to research and their multifaceted requirements regarding the efficient re-use of data. This model may be considered the foundation on which an ecosystem for digital data preservation can be built. Research methods: the Phaidra management team created a model based on the Data Publication Pyramid and added data not directly included in publications, such as inconclusive and negative results. Scientific novelty: using three different models as a guide, the team redesigned the repository infrastructure, an important starting point for the transition from a simple repository concept to a living digital ecosystem concept. Conclusions: the ecosystem provides a good working infrastructure, connects with the research community, and maintains links to other infrastructure projects.
APA, Harvard, Vancouver, ISO, and other styles
4

Conway, Esther, Brian Matthews, David Giaretta, Simon Lambert, Michael Wilson, and Nick Draper. "Managing Risks in the Preservation of Research Data with Preservation Networks." International Journal of Digital Curation 7, no. 1 (2012): 3–15. http://dx.doi.org/10.2218/ijdc.v7i1.210.

Full text
Abstract:
Network modelling provides a framework for the systematic analysis of needs and options for preservation. A number of general strategies can be identified, characterised and applied to many situations; these strategies may be combined to produce robust preservation solutions tailored to the needs of the community and responsive to their environment. This paper provides an overview of this approach. We describe the components of a Preservation Network Model and go on to show how it may be used to plan preservation actions according to the requirements of the particular situation using illustrative examples from scientific archives.
APA, Harvard, Vancouver, ISO, and other styles
5

Navale, Vivek, and Matthew McAuliffe. "Long-term preservation of biomedical research data." F1000Research 7 (August 29, 2018): 1353. http://dx.doi.org/10.12688/f1000research.16015.1.

Full text
Abstract:
Genomics and molecular imaging, along with clinical and translational research, have transformed biomedical science into a data-intensive scientific endeavor. For researchers to benefit from Big Data sets, developing a long-term biomedical digital data preservation strategy is very important. In this opinion article, we discuss specific actions that researchers and institutions can take to make research data a continued resource even after research projects have reached the end of their lifecycle. The actions involve utilizing an Open Archival Information System model comprising six functional entities: Ingest, Access, Data Management, Archival Storage, Administration and Preservation Planning. We believe that involvement of data stewards early in the digital data life-cycle management process can significantly contribute towards long-term preservation of biomedical data. Developing data collection strategies consistent with institutional policies, and encouraging the use of common data elements in clinical research, patient registries and other human subject research, can be advantageous for data sharing and integration purposes. Specifically, data stewards at the onset of a research program should engage with established repositories and curators to develop data sustainability plans for research data. Placing equal importance on the requirements for initial activities (e.g., collection, processing, storage) and for subsequent activities (data analysis, sharing) can improve data quality, provide traceability and support reproducibility. Preparing and tracking data provenance, and using common data elements and biomedical ontologies, are important for standardizing the data description, making the interpretation and reuse of data easier. The Big Data biomedical community requires a scalable platform that can support the diversity and complexity of data ingest modes (e.g., machine, software or human entry).
Secure virtual workspaces to integrate and manipulate data, with shared software programs (e.g., bioinformatics tools), can facilitate the FAIR (Findable, Accessible, Interoperable and Reusable) use of data for near- and long-term research needs.
APA, Harvard, Vancouver, ISO, and other styles
6

Burgi, Pierre-Yves, Eliane Blumer, and Basma Makhlouf-Shabou. "Research data management in Switzerland." IFLA Journal 43, no. 1 (2017): 5–21. http://dx.doi.org/10.1177/0340035216678238.

Full text
Abstract:
In this article, the authors report on an ongoing data life cycle management national project realized in Switzerland, with a major focus on long-term preservation. Based on an extensive document analysis as well as semi-structured interviews, the project aims at providing national services to respond to the most relevant researchers’ data life cycle management needs, which include: guidelines for establishing a data management plan, active data management solutions, long-term preservation storage options, training, and a single point of access and contact to get support. In addition to presenting the different working axes of the project, the authors describe a strategic management and lean startup template for developing new business models, which is key for building viable services.
APA, Harvard, Vancouver, ISO, and other styles
7

Moore, Fred. "Long Term Data Preservation." EDPACS 27, no. 12 (2000): 1. http://dx.doi.org/10.1201/1079/43258.27.12.20000601/30347.7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Devouassoux, Marion, João Fernandes, Bob Jones, Ignacio Peluaga Lozada, and Jakub Urban. "ARCHIVER - Data archiving and preservation for research environments." EPJ Web of Conferences 251 (2021): 02044. http://dx.doi.org/10.1051/epjconf/202125102044.

Full text
Abstract:
Over the last decades, several data preservation efforts have been undertaken by the HEP community, as experiments are not repeatable and their data are consequently considered unique. ARCHIVER is a European Commission (EC) co-funded Horizon 2020 pre-commercial procurement project procuring R&D that combines multiple ICT technologies, including data-intensive scalability, networking, service interoperability and business models, in a hybrid cloud environment. The results will provide the European Open Science Cloud (EOSC) with archival and preservation services covering the full research lifecycle. The services are co-designed in partnership with four research organisations (CERN, DESY, EMBL-EBI and PIC/IFAE) deploying use cases from Astrophysics, HEP, Life Sciences and Photon-Neutron Sciences, creating an innovation ecosystem for specialist data archiving and preservation companies willing to introduce new services capable of supporting the expanding needs of research. The HEP use cases being deployed include the CERN Opendata portal, preserving a second copy of the completed BaBar experiment, and the CERN Digital Memory, digitising CERN's multimedia archive of the 20th century. In parallel, ARCHIVER has established an Early Adopter programme whereby additional use cases can be incorporated at each of the project phases, thereby expanding services to multiple research domains and countries.
APA, Harvard, Vancouver, ISO, and other styles
9

Mozzherin, Dmitry, and Deborah Paul. "Preservation strategies for biodiversity data." Biodiversity Information Science and Standards 7 (August 22, 2023): e111453. https://doi.org/10.3897/biss.7.111453.

Full text
Abstract:
We are witnessing a fast proliferation of biodiversity informatics projects. The data accumulated by these initiatives often grows rapidly, even exponentially. Most of these projects start small and do not foresee the data architecture challenges of their future. Organizations may lack the necessary expertise and/or money to strategically address the care and feeding of this expanding data pile. In other cases, individuals with the expertise to address these needs may be present but lack the power, status, or bandwidth to take effective action. Over time, the data may increase in size to such an extent that handling and preserving it becomes an almost insurmountable problem. The most common technical challenges include migrating data from one physical data storage to another, organizing backups, providing fast disaster recovery, and preparing data to be accessible for posterity. Sociotechnical and strategic hurdles to data stewardship include funding, data leadership (Stack and Stadolnik 2018) and vision (or lack thereof), and organizational structure and culture. The biodiversity data collected today will be indispensable for future research, and it is our collective responsibility to preserve it for current and future generations.
Some of the most common information-loss risk factors are the end of funding, the retirement of a researcher, or the departure of a critical researcher or programmer. Further risk factors, such as hardware malfunction, hurricanes, tornadoes, and severe magnetic storms, can destroy data carefully collected by large groups of people. The co-location of original data and backups creates a "Library of Alexandria", where a single disaster at that location can lead to permanent data loss and poses an existential threat to the project. Biodiversity data becomes more valuable over time and should survive for several centuries. However, SSD (solid-state drive) and HDD (hard disk drive) storage solutions have an expiration date of only a few years. We propose the following solutions (Fig. 1) to provide long-term data security.
Technical tactics. Use immutable file storage for everything that was not entered very recently: most biodiversity "big data" are files that are written once and never changed again, so we suggest separating storage into a read-only part and small read/write sections, with data moved from the read/write section to the read-only part often, for example, daily. Use a copy-on-write file system, such as ZFS (Zettabyte File System): ZFS is widely used in industry and is known for its robustness and error resistance; it allows efficient incremental backups and much faster data transfer than other systems, regular incremental backups can work even over slow internet connections, and it provides real-time data integrity checks and powerful tools for data healing. Split data and its backups into smaller chunks: dividing backups into cost-effective 2–8 terabyte chunks allows running backups on cheap hardware, reducing hardware costs from tens of thousands of dollars (US) to less than two thousand dollars; we recognize that data storage costs drop over time, so larger chunks will be used. Split the data even further, to the size of the largest available long-term storage unit (currently an optical M-DISC): the write-once optical M-DISC is analogous to a Sumerian clay tablet, as data written on such discs does not deteriorate for hundreds of years. This option addresses the need for last-resort backups, because the storage does not depend on magnetic properties and is impervious to electromagnetic disasters. Optical discs can be easily and cheaply copied and distributed to libraries worldwide. In the future, the discs' data can be transferred to a different long-term storage medium, and we trust these discs can be deciphered by those in the future, just like clay tablets.
Sociotechnical insights. The comprehensive strategy above epitomizes "LOCKSS" (lots of copies keep stuff safe) and makes it clear that these copies need to be on multiple media types. Our suggestions focus on projects that experience data growth pains; such projects often look to see how others address these needs. Recently, the Species File Group (SFG) did this exercise to evaluate and address our data growth needs (Mozzherin et al. 2023). We recognize and emphasize the need for personnel with the knowledge and skills to build, maintain, and evolve robust strategies and infrastructure to make data accessible and preserve it; for funding to back the most suitable architectural strategies; and for people with expertise in long-term data security to have a seat at the leadership table in our organizations. We encourage our colleagues to evaluate the status of data leadership at their organizations (Stack and Stadolnik 2018, Kalms 2012). Implementing these suggestions will help ensure the survival of the data and accompanying software for hundreds of years to come.
APA, Harvard, Vancouver, ISO, and other styles
10

Sesartic, Ana, and Matthias Töwe. "Research Data Services at ETH-Bibliothek." IFLA Journal 42, no. 4 (2016): 284–91. http://dx.doi.org/10.1177/0340035216674971.

Full text
Abstract:
The management of research data throughout its life-cycle is a key prerequisite for both effective data sharing and efficient long-term preservation of data. This article summarizes the data services and the overall approach to data management as currently practised at ETH-Bibliothek, the main library of ETH Zürich, the largest technical university in Switzerland. The services offered by service providers within ETH Zürich cover the entirety of the data life-cycle. The library provides support regarding conceptual questions and offers training and services concerning data publication and long-term preservation. As research data management continues to play a steadily more prominent part in the requirements of researchers and funders, as well as in curricula and good scientific practice, ETH-Bibliothek is establishing close collaborations with researchers in order to promote a mutual learning process and tackle new challenges.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Research data preservation"

1

Trisovic, Ana. "Data preservation and reproducibility at the LHCb experiment at CERN." Thesis, University of Cambridge, 2018. https://www.repository.cam.ac.uk/handle/1810/283607.

Full text
Abstract:
This dissertation presents the first study of data preservation and research reproducibility in data science at the Large Hadron Collider at CERN. In particular, provenance capture of the experimental data and the reproducibility of physics analyses at the LHCb experiment were studied. First, the preservation of the software and hardware dependencies of the LHCb experimental data and simulations was investigated. It was found that the links between the data processing information and the datasets themselves were obscure. In order to document these dependencies, a graph database was designed and implemented. The nodes in the graph represent the data with their processing information, software and computational environment, whilst the edges represent their dependence on the other nodes. The database provides a central place to preserve information that was previously scattered across the LHCb computing infrastructure. Using the developed database, a methodology to recreate the LHCb computational environment and to execute the data processing on the cloud was implemented with the use of virtual containers. It was found that the produced physics events were identical to the official LHCb data, meaning that the system can aid in data preservation. Furthermore, the developed method can be used for outreach purposes, providing a streamlined way for a person external to CERN to process and analyse the LHCb data. Following this, the reproducibility of data analyses was studied. A data provenance tracking service was implemented within the LHCb software framework Gaudi. The service allows analysts to capture, within the dataset itself, the data processing configurations that can be used to reproduce that dataset. Furthermore, to assess the current status of the reproducibility of LHCb physics analyses, the major parts of an analysis were reproduced by following methods described in publicly and internally available documentation.
This study allowed the identification of barriers to reproducibility and specific points where documentation is lacking. With this knowledge, one can specifically target areas that need improvement and encourage practices that would improve reproducibility in the future. Finally, contributions were made to the CERN Analysis Preservation portal, which is a general knowledge preservation framework developed at CERN to be used across all the LHC experiments. In particular, the functionality to preserve source code from git repositories and Docker images in one central location was implemented.
APA, Harvard, Vancouver, ISO, and other styles
2

Gaillard, Rémi. "De l'Open data à l'Open research data : quelle(s) politique(s) pour les données de recherche ?" Thesis, 2014. http://eprints.rclis.org/22746/1/VD%20relue.pdf.

Full text
Abstract:
The open access movement is increasingly expanding from scientific publications to research data. Initiatives to make research data broadly accessible and fully available for reuse are emerging from a variety of stakeholders at international and European scale: states, funding agencies, publishers and scientific communities themselves. Research data openness is achieved through different policies and technical and legal mechanisms, but also relies on good data management practices. As France begins to join the Open research data movement, it is important to consider that data policies are also needed at the university level. Librarians have a serious part to play in leading on institutional data policy, understanding researchers' needs and helping with metadata creation. Open access to research data is thus a unique opportunity for librarians to redesign their connection with scientific communities inside research institutions.
APA, Harvard, Vancouver, ISO, and other styles
3

Mason, Emma. "GIS on the Qualla Boundary: Data Management for the Eastern Band of Cherokee Indians Tribal Historic Preservation Office." 2017. http://scholarworks.gsu.edu/anthro_theses/112.

Full text
Abstract:
The use of Geographic Information Systems (GIS) has become increasingly important for the preservation of cultural resources by tribal entities. This project serves as a platform for the management of archaeological site data on the Qualla Boundary to be used by the Eastern Band of Cherokee Indians (EBCI) Tribal Historic Preservation Office (THPO) members. Over the course of a year, data was gathered from various agencies in order to export and create geospatial data that can be visualized, analyzed, and managed using ArcGIS software. A map and detailed data set were created to provide the user with the locations and attributes of archaeological sites, which can be used by the EBCI THPO as a tool for archaeological research and to protect sites on the Qualla Boundary. Additionally, a preliminary settlement pattern study was performed for the broader Qualla Boundary, along with a more in-depth analysis of sites along the Oconaluftee River.
APA, Harvard, Vancouver, ISO, and other styles
4

Pavlásková, Eliška. "Analýza výzkumných dat na základě fondu disertačních prací Univerzity Karlovy v Praze s ohledem na dlouhodobé uložení digitálních objektů." Doctoral thesis, 2016. http://www.nusl.cz/ntk/nusl-349662.

Full text
Abstract:
This dissertation thesis focuses on research data and their use in academics from the point of view of long-term preservation. It maps usage of research data at Charles University in Prague, analyses them and lays the foundation for further research. The first part of the text focuses on the theory of long term preservation and describes the most relevant concepts regarding users, storage and structure of research data. The second part is devoted directly to research data. It consists of a definition of research data, a short explanation of their importance and sources, and a model of their lifecycle. The pivotal part of the thesis is the description and the results of the research itself. The research was conducted during the year 2015 and was based on a sample of dissertation theses from the collections of Charles University in Prague. Collected data were analysed by the methods of content analysis and grounded theory. Results are presented in two main parts - content analysis results with regard to differences among science, social science and humanities, and qualitative analysis results.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Research data preservation"

1

National Research Council (U.S.). Committee on the Preservation of Geoscience Data and Collections. Geoscience data and collections: National resources in peril. National Academies Press, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

AIUCD Annual Conference (2nd, 2013, Padua, Italy). Collaborative research practices and shared infrastructures for humanities computing: 2nd AIUCD Annual Conference, AIUCD 2013, Padua, Italy, 11-12 December 2013: proceedings of revised papers. CLEUP, 2014.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Strasser, Carly A., ed. Data management for libraries: A LITA guide. ALA TechSource, an imprint of the American Library Association, 2014.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

DeWitt, Donald L., ed. Going digital: Strategies for access, preservation, and conversion of collections to a digital format. Haworth Press, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Guide to social science data preparation and archiving: Best practice through the data life cycle. 4th ed. ICPSR, Institute for Social Research, University of Michigan, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Ioannides, Marinos, Sander Münster, Mieke Pfarr-Harfst, and Piotr Kuroczyński. 3D Research Challenges in Cultural Heritage II: How to Manage Data and Knowledge Related to Interpretative Digital 3D Reconstructions of Cultural Heritage. Springer, 2016.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Institute of Medicine, Board on the Health of Select Populations, Committee on the Management of the Air Force Health Study Data and Specimens (Report to Congress). Air Force Health Study Assets Research Program. National Academies Press, 2015.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Johnson, Eric O. Working as a Data Librarian. ABC-CLIO, LLC, 2018. http://dx.doi.org/10.5040/9798216038719.

Full text
Abstract:
Many librarians’ job responsibilities increasingly require them to understand and handle data. Learn how to be an effective data librarian—even if you never expected to need data skills. The field of data librarianship is rapidly growing, and some librarians may feel that their training and experience do not cover data questions asked by patrons seeking advice. This gentle guide is for librarians moving, sometimes unexpectedly, into the world of data librarianship; all you need is a willingness to learn the skills required for this rapidly growing number of data-focused jobs. Working as a Data Librarian focuses on transferable skills and understanding and does not assume extensive knowledge. It introduces the tasks and concepts needed to be an effective data librarian, such as best practices for data reference interviewing, finding data sources, data visualization, data literacy, the data lifecycle, metadata design, database design, understanding data management, and preparing data management plans. Additional sections focus on supporting creativity (Makerspaces and Fablabs, 3-D modeling), supporting analysis (GIS, data visualization, text mining, statistical methods), supporting research (digital scholarship, digital preservation, institutional data repositories, scholarly communication), and outreach (data librarian liaisonship, hackathons, developing outreach programs).
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Research data preservation"

1

Voss, Alex, Ilia Lvov, and Sara Day Thomson. "Data Storage, Curation and Preservation." In The SAGE Handbook of Social Media Research Methods. SAGE Publications Ltd, 2016. http://dx.doi.org/10.4135/9781473983847.n11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bakos, Asztrik, Tomasz Miksa, and Andreas Rauber. "Research Data Preservation Using Process Engines and Machine-Actionable Data Management Plans." In Digital Libraries for Open Knowledge. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-00066-0_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ryan, Matthew J., Gerard Verkleij, and Vincent Robert. "Data resources: role and services of culture collections." In Trends in the systematics of bacteria and fungi. CABI, 2021. http://dx.doi.org/10.1079/9781789244984.0083.

Full text
Abstract:
Microbial culture collections (CCs) support taxonomy through the provision and preservation of microorganisms, underpinning the integrity of research. However, cultures are essentially worthless without associated data. Details of isolation (geographical location, date of collection, date of isolation), preservation, identification, characterization, growth conditions, properties and applications, etc., make up the key microbial data set that should be associated with every culture. This chapter summarizes the current status of data curation in fungal CCs, assesses the challenges of maintaining genomic and bioinformatic data, and investigates how collections need to evolve to meet the challenges of data acquisition and maintenance in the future in order to underpin taxonomy.
APA, Harvard, Vancouver, ISO, and other styles
4

Fullilove, Courtney, and Abdallah Alimari. "Baladi Seeds in the oPt: Populations as Objects of Preservation and Units of Analysis." In Towards Responsible Plant Data Linkage: Data Challenges for Agricultural Research and Development. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-13276-6_4.

Full text
Abstract:
This essay argues that shortcomings in our approaches to global agriculture and its data infrastructures are attributable in part to a constricted application of population concepts derived from biological sciences in the context of international development. Using Palestine as a case study, this chapter examines the category of baladi seeds as a community-generated characterization of population, and one which arguably defies reduction to data. Drawing on quantitative research on farmer participation in informal seed production for wheat in the occupied Palestinian territories (oPt) and oral histories of farmers in the West Bank, this chapter analyzes the relation between participatory plant breeding initiatives, heritage narratives, and international agricultural research in rendering baladi seeds legible for archiving. It considers the multiple technological practices through which these institutions characterize and manage access to cultivated seeds, and how they differently approach problems of standardization, scalability, and variability. Through case studies of national and local seed saving initiatives, it asks, in turn, whether baladi seeds can be reduced to data, how they might be reduced to data, and whether they should be reduced to data.
APA, Harvard, Vancouver, ISO, and other styles
5

Tamper, Minna, Petri Leskinen, Kasper Apajalahti, and Eero Hyvönen. "Using Biographical Texts as Linked Data for Prosopographical Research and Applications." In Digital Heritage. Progress in Cultural Heritage: Documentation, Preservation, and Protection. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-01762-0_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Li, Wenbin. "Research on Digital Preservation of Architectural Decoration Patterns Based on Virtual Reality Technology." In Lecture Notes on Data Engineering and Communications Technologies. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-96-0208-7_55.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Miyakita, Goki, Petri Leskinen, and Eero Hyvönen. "Using Linked Data for Prosopographical Research of Historical Persons: Case U.S. Congress Legislators." In Digital Heritage. Progress in Cultural Heritage: Documentation, Preservation, and Protection. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-01765-1_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kalshetty, Jagadevi N., and N. Nalini. "A Review on Design and Performance Evaluation of Privacy Preservation Techniques in Data Mining." In Emerging Research in Computing, Information, Communication and Applications. Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-5482-5_83.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Devare, Medha, Elizabeth Arnaud, Erick Antezana, and Brian King. "Governing Agricultural Data: Challenges and Recommendations." In Towards Responsible Plant Data Linkage: Data Challenges for Agricultural Research and Development. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-13276-6_11.

Full text
Abstract:
The biomedical domain has shown that in silico analyses over vast data pools enhance the speed and scale of scientific innovation. This can hold true in agricultural research as well, guiding similar multi-stakeholder action in service of global food security (Streich et al. Curr Opin Biotechnol 61:217–225. Retrieved from https://doi.org/10.1016/j.copbio.2020.01.010, 2020). However, entrenched research culture, together with governance issues around the data and standards needed to enable interoperability and ease of reuse, continues to be a roadblock in the agricultural research-for-development sector. Effective operationalization of the FAIR Data Principles towards Findable, Accessible, Interoperable, and Reusable data requires that agricultural researchers accept that their responsibilities in a digital age include the stewardship of data assets to assure long-term preservation, access and reuse. The development and adoption of common agricultural data standards are key to assuring good stewardship but face several challenges, including limited awareness of standards compliance; lagging data science capacity; emphasis on data collection rather than reuse; and limited fund allocation for data and standards management. Community-based hurdles around the development and governance of standards, and around fostering their adoption, also abound. This chapter discusses challenges and possible solutions to making FAIR agricultural data assets the norm rather than the exception, to catalyze a much-needed revolution towards “translational agriculture”.
APA, Harvard, Vancouver, ISO, and other styles
10

Brumana, R. "How to Measure Quality Models? Digitization into Informative Models Re-use." In 3D Research Challenges in Cultural Heritage III. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-35593-6_5.

Full text
Abstract:
3D models of passive, mute subjects, often used in books and in preservation design reports as powerful images dense with content, now have the opportunity to become ‘live gears’ leveraging knowledge, interpretation, and management into preservation objectives and through to better-informed fruition. To this aim, we need to build up reliable and re-usable 3D quality models. How do we shift from a 3D model toward a 3D quality model? This contribution focuses on the parameters defining a 3D quality model, catching the heritage complexity with its components in a holistic methodological and practical vision. A radar chart has been used to manage all the parameters. First of all, Geometry describes a quality model: parameters for data acquisition, on-site surveying, and model processing to obtain 2D-3D Geometry quality are defined. The concept of scale associated with measurable parameters defining the Grade of Accuracy is proposed and applied to the surveying and to the 3D models. 3D models can be considered tools to decode the complexity of cultural heritage shaped by the different transformations across the centuries, anthropic and natural hazards, and climate change threats and events (such as earthquakes, fires, and wars). Thus, Geometry is not enough to describe such complexity; it represents only the first step. Analysis of materials and construction technologies is the second pillar qualifying a quality model. The connection with indirect data sources (i.e., historical reports and archival documents) is the third pillar, to be reconnected to the Geometry and Material analysis in the quality definition. HBIM represents a multidisciplinary environment to convey the information related to geometry and models. Further, several parameters are identified to describe the quality of informative models, as in the case of Object Libraries and building archaeology progressively feeding such models.
BIM Levels of Development (phases) and Levels of Geometry (contents, not scale!) have been adapted to HBIM, introducing digitization, surveying, and HBIM modeling into the preservation process. Finally, a quality model is defined by its capability to be re-used, circulating information and models among end-users, as in the case of informed VR/AR through CDE and XR platforms.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Research data preservation"

1

Pillai, Sanjaikanth E. Vadakkethil Somanathan, and Wen-Chen Hu. "Federated Learning for Fake News Detection and Data Privacy Preservation." In 2024 Cyber Awareness and Research Symposium (CARS). IEEE, 2024. https://doi.org/10.1109/cars61786.2024.10778735.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Lydell, B. O. Y., O. Nevander, and J. Riznic. "The Nuclear Energy Agency Contribution to Nuclear Materials Performance Knowledge Preservation." In CORROSION 2018. NACE International, 2018. https://doi.org/10.5006/c2018-10530.

Full text
Abstract:
The goal of the Nuclear Energy Agency (NEA) (https://www.oecd-nea.org/) in the area of nuclear safety and regulation is to assist its member countries in ensuring high standards of safety in the use of nuclear energy, by supporting the development of effective and efficient regulation and oversight of nuclear installations, and by helping to maintain and advance the scientific and technological knowledge base. The Agency's joint projects and information exchange programs enable interested countries, on a cost-sharing basis, to pursue research and the sharing of data with respect to particular areas or problems. The projects are carried out under the auspices, and with the support, of the NEA. Such projects, primarily in the areas of nuclear safety and waste management, are one of the NEA's major strengths. Since 2002, the NEA has operated an international database on materials performance issues. The "Component Operational Experience, Degradation & Ageing Programme" (CODAP) has been established to encourage multilateral co-operation in the collection and analysis of data relating to degradation and failure of metallic piping and non-piping metallic passive components in commercial nuclear power plants. The Project is organised under the OECD/NEA Committee on the Safety of Nuclear Installations (CSNI).
APA, Harvard, Vancouver, ISO, and other styles
3

Taylor, James G. "Electronic Management of Paint Records – Paperless QA." In SSPC 2014 Greencoat. SSPC, 2014. https://doi.org/10.5006/s2014-00056.

Full text
Abstract:
The previous quality control process at Newport News Shipbuilding for recording surface preparation, paint application, and associated inspections used a paper system. These records required a vast amount of physical storage space and were difficult to research and track due to their complexity, often consisting of multiple sheets of data. To address these problems, along with others contributing to inadequate maintenance of objective quality evidence, the company developed a paperless QA system. This paperless system migrated the previous paper process to an electronic storage and tracking alternative in which multiple users are able to access the data and, at the point of an inspection, use the electronic signature process as verification of completion. The system maintains all business functionalities while minimizing manual entry errors and identifying out-of-spec data. The ability to generate electronic reports consisting of key data fields also eliminates the need for excessive physical storage space and the time spent physically maintaining often cumbersome records. The primary goal of the project was to electronically manage the preservation documentation processes to improve visibility, tracking and handling of coating preservation records.
APA, Harvard, Vancouver, ISO, and other styles
4

Yelistratova, Lesya, Alexander Apostolov, Artur Hodorovsky, Olha Tomchenko, and Maksym Tymchyshyn. "SATELLITE MONITORING OF ANTHROPOGENIC PROCESSES AND FACTORS OF LAND DEGRADATION IN UKRAINE." In 24th SGEM International Multidisciplinary Scientific GeoConference 2024. STEF92 Technology, 2024. https://doi.org/10.5593/sgem2024/2.1/s10.35.

Full text
Abstract:
The relevance of the research lies in the fact that the well-being of the Ukrainian nation, present and future generations, largely depends on the preservation of existing ecosystems, their support, and the enhancement of their quality and biodiversity. Particular attention should be paid to the problem of land cover degradation. Land cover currently holds a unique status with regard to food security and is a dominant source for renewing crucial biospheric functions, ensuring the diversity of living organisms. The anthropogenic and technological pressure on Ukraine's environment (land cover) exceeds the corresponding indicators in developed countries several times over. A remote sensing data methodology was proposed to assess the impact of human and technical activities (the anthropogenic predictor) on land cover degradation. The data obtained allowed for the assessment of Ukraine's land cover status and revealed the degree of its degradation caused by anthropogenic factors, namely irrational agricultural practices, inefficient long-term exploitation of mineral deposits, ineffective urban infrastructure design, irrational practices in forestry and park management, the influence of reservoirs, forest and steppe fires, as well as military activities.
APA, Harvard, Vancouver, ISO, and other styles
5

Prewitt, T. J. "Application of 3D Metrology Technology in Inspection, Corrosion Mapping, and Failure Analysis." In CORROSION 2016. NACE International, 2016. https://doi.org/10.5006/c2016-07672.

Full text
Abstract:
Three-dimensional (3D) optical measurement systems have seen ongoing development with regard to both technologies and applications. While previously targeted primarily toward manufacturing and quality control environments, significant benefits have been recognized for utilizing precision optical metrology systems across the oil and gas industry. Specific applications have developed in field and facility defect assessment and mapping, NDE services, failure analysis, evidence preservation, and forensic reconstruction. This paper presents recent work and research comparing 3D laser scanning technologies to traditional inspection techniques for mapping corrosion, dents, and dent/gouge combinations. Included is a comparison study of a typical corroded region of pipe, comparing accuracy, speed of measurement, and data processing applications for traditional pit gauge measurements with those processed from a 3D laser scanner. A review and comparison of alternative techniques and technologies, such as structured light and photogrammetry, is also presented.
APA, Harvard, Vancouver, ISO, and other styles
6

Hedges, Mark, Adil Hasan, and Tobias Blanke. "Management and preservation of research data with iRODS." In the ACM first workshop. ACM Press, 2007. http://dx.doi.org/10.1145/1317353.1317358.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Hedges, Mark, Adil Hasan, and Tobias Blanke. "Curation and Preservation of Research Data in an iRODS Data Grid." In Third IEEE International Conference on e-Science and Grid Computing (e-Science 2007). IEEE, 2007. http://dx.doi.org/10.1109/e-science.2007.26.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Vaghela, Alpesh, and Suthar Anil Kumar. "Medical health data privacy preservation using smart contract." In INTERNATIONAL CONFERENCE ON ADVANCES IN MULTI-DISCIPLINARY SCIENCES AND ENGINEERING RESEARCH: ICAMSER-2021. AIP Publishing, 2022. http://dx.doi.org/10.1063/5.0095361.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Gowri, Akshaya, Deepak Tyagi, P. Vignesh, and K. Manikandan. "Patient data privacy preservation using modified whale optimization." In PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON ENGINEERING RESEARCH AND APPLICATION 2022 (ICERA 2022). AIP Publishing, 2023. http://dx.doi.org/10.1063/5.0180169.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Raghav, Shivankar, and Ashish Kumar Saxena. "Mobile forensics: Guidelines and challenges in data preservation and acquisition." In 2009 IEEE Student Conference on Research and Development (SCOReD). IEEE, 2009. http://dx.doi.org/10.1109/scored.2009.5443431.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Research data preservation"

1

Marmar, Earl S. Preservation of Alcator C-Mod Data and Support of ITER Research Through ITPA Participation: Final Technical Report. Office of Scientific and Technical Information (OSTI), 2019. http://dx.doi.org/10.2172/1493304.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

García-Espinosa, J., and C. Soriano. Data management plan. Scipedia, 2021. http://dx.doi.org/10.23967/prodphd.2021.9.003.

Full text
Abstract:
This document presents the deliverable D8.1 – the Data Management Plan (DMP) of work package 8 of the prodPhD project. It presents the plan for the management, generation, collection, security, preservation and sharing of data generated through the prodPhD project. The DMP is a key element for organizing the project’s data. It provides an analysis of the data, which will be collected, processed and published by the prodPhD consortium. The project embraces the initiatives of the European Commission to promote open access to research data, aiming to improve and maximize access to and reuse of research data generated by Horizon 2020 projects. In this sense, prodPhD will adhere to the Open Research Data Pilot (ORD Pilot) fostered by the European Commission, and this DMP will be developed following the standards of data storage, access and management. The plan details what data will be generated through the project, whether and how it will be made accessible for verification and reuse, and how it will be curated and preserved. In this context, the term data applies to the information generated during the different experimental campaigns carried out in the project, and specifically to the data, including associated metadata, to be used to validate the computational models and the technical solutions to be developed in the project. This document is the first version of the DMP and may be updated throughout the project if significant changes (new data, changes in consortium policies, changes in consortium composition, etc.) arise.
APA, Harvard, Vancouver, ISO, and other styles
3

Melad, Kris Ann. Harmonizing Philippine Census Data across Decades (1970–2020). Philippine Institute for Development Studies, 2024. https://doi.org/10.62986/dp2024.44.

Full text
Abstract:
This research harmonizes Philippine Census of Population and Housing (CPH) data from 1970 to 2020 to address data consistency challenges across five decades. The study systematically reconciles evolving variable definitions, classification systems, and measurement scales to create a unified longitudinal dataset. Key harmonization challenges include accommodating changes in the education system, such as the K-12 reforms, tracking modifications to administrative boundaries over the years, managing the expanding data scope across census years, and addressing historical data preservation issues, particularly for the 1970 and 1980 censuses. The research involved the creation of translation tables and crosswalks for major classification systems, including the Philippine Standard Geographic Classification (PSGC), Philippine Standard Occupational Classification (PSOC), and Philippine Standard Industrial Classification (PSIC). Variable-specific harmonization protocols and guidelines for researchers using the harmonized data are also documented. The harmonization process standardized core demographic variables across all periods while preserving more detailed classifications where possible, though some variables necessarily lost granularity when harmonized to their lowest common denominator. Beyond producing a consistent dataset for longitudinal analysis, this study contributes to PIDS's agenda of strengthening statistical systems for evidence-based policymaking. The paper concludes with recommendations for improving future census data collection and harmonization practices to support effective policy development in the Philippines.
APA, Harvard, Vancouver, ISO, and other styles
4

Langlais, Pierre-Carl. Open Scientific Data. Comité pour la science ouverte, 2023. https://doi.org/10.52949/69.

Full text
Abstract:
Not opening scientific data is costly. It has been estimated that a significant share of scientific knowledge disappears every year. In a 2014 study, less than half of the biological datasets from the 1990s could be recovered, and where recovery was possible it required significant time and effort. In comparison, 98% of datasets published on PLOS with unique identifiers (data DOIs) are still available for future research. Open scientific data are fundamental resources for a large variety of scientific activities: meta-analysis, replication of research results, and access to primary sources. They also bring significant economic and social value, as scientific data is commonly used by non-academic professionals as well as public agencies and non-profit organizations. Yet open scientific data is not costless. Ensuring that data is not only downloadable but usable requires significant investment in documentation, data cleaning, licensing and indexation. Not all scientific data can be shared, and verification is frequently necessary to ensure that datasets do not incorporate copyrighted content or personal information. To be effective, data sharing has to be anticipated throughout the entire research lifecycle. New principles of scientific data management aim to formalize the preexisting cultures of data in scientific communities and apply common standards. First published in 2016, the FAIR Guiding Principles (findability, accessibility, interoperability, and reusability) are an influential framework for opening scientific data. Policies in support of data sharing have moved from general and broad encouragement to the concrete development of data sharing services. Early initiatives go back to the first computing infrastructures: in 1957 the World Data Center system aimed to make a large range of scientific data readily available. Open data programs were nevertheless severely limited by the lack of technical support and compatibility for data transfer.
After 1991, the web created a universal framework for data exchange and entailed a massive expansion of scientific databases. Yet numerous projects ran into critical issues of long-term sustainability. Open science infrastructures have recently become key stakeholders in the diffusion and management of open scientific data. Data repositories ensure the preservation of scientific resources as well as their discoverability. Data hosted on repositories are more frequently used and cited than data published in a supplementary file.
APA, Harvard, Vancouver, ISO, and other styles
5

Blanc, Isabelle, Roberto Di Cosmo, Mathieu Giraud, et al. Highlights of the "Software Pillar of Open Science" workshop. Ministère de l’enseignement supérieur et de la recherche, 2024. http://dx.doi.org/10.52949/53.

Full text
Abstract:
Software has become essential in all areas of scientific research: as a tool for research, a product of research, and a research object in itself. In the quest to make research results reproducible and pass knowledge on to future generations, we must preserve three main pillars: the research articles that describe the results, the data sets used or produced, and the software source code that embodies the logic of the data transformation. Indeed, the preservation of software source code is as essential as preserving research articles and data sets. The main aim of this in-person half-day event, organized by the French Committee for Open Science on November 29th 2023, was to bring together high-level stakeholders from a variety of backgrounds, including researchers, research software engineers, research evaluation bodies, infrastructures, academic open source program offices (OSPOs), and financial backers, to share their experience and views on research software. The workshop featured three panel sessions, in which speakers focused on major dimensions of relevance to software in open science: acknowledgment of software as a key pillar of reproducible research; recognition and support for the dissemination of software; and the social impact of software.
APA, Harvard, Vancouver, ISO, and other styles
6

Lischer-Katz, Zack, Rashida Braggs, and Bryan Carter. Investigating Volumetric Video Creation and Curation for the Digital Humanities: a White Paper Describing Findings from the Project: Preserving BIPOC Expatriates’ Memories During Wartime and Beyond. The University of Arizona Libraries, 2024. http://dx.doi.org/10.2458/10150.674673.

Full text
Abstract:
Volumetric video capture technologies offer humanities scholars and other researchers new, immersive ways of engaging with historical and cultural knowledge for research and pedagogical purposes; however, the high cost of this technology and a paucity of expert knowledge in the field have limited its adoption. In particular, volumetric video offers rich new possibilities for recording, preserving, and re-experiencing BIPOC (Black, indigenous, and other people of color) stories in immersive detail, which have been underrepresented in the historical record. This technology is still experimental and is typically limited to specialized labs at large research universities. To democratize the technology and ensure that the potential benefits of this new technology can be realized by digital humanities scholars more broadly, a group of researchers at the University of Arizona and Williams College, in collaboration with technical innovators from the world-renowned volumetric capture studio, VoluCap, GmbH, embarked on a project to explore the challenges and potential benefits of volumetric video capture for BIPOC storytelling. The team traveled to Berlin/Potsdam in June 2023 to visit VoluCap Studios and record several volumetric capture videos, including a video of Mike Russell, who told a story about his father’s experiences as an African-American servicemember during World War II. Recording these videos and observing their processing pipeline allowed us to consider the logistical and data curation challenges of this format. Dr. Bryan Carter, lead-PI on the project, is also director of UArizona’s Center for Digital Humanities, which houses a prosumer-level volumetric capture studio. Comparing the workflows at the Center for DH with what was observed at VoluCap allowed the project team to better understand the challenges and benefits of volumetric capture at different scales and levels of quality. 
Because volumetric videos are expensive and time-consuming to create, an important objective of this project was to examine the preservation and curation challenges associated with the digital objects created through the volumetric capture process. Planning for the preservation, access, and reuse of volumetric video assets is essential to realizing their full value. This report describes the creation challenges and pedagogical benefits of volumetric video, as well as its preservation and curation challenges.
APA, Harvard, Vancouver, ISO, and other styles
7

Løvschal, Mette, Havananda Ombashi, Marianne Høyem Andreasen, et al. The Protected Burial Mound ‘Store Vejlhøj’, Vinderup, Denmark: First Results. Det Kgl. Bibliotek, 2023. http://dx.doi.org/10.7146/aulsps-e.479.

Full text
Abstract:
An archaeological excavation of the protected burial mound Store Vejlhøj in northwestern Denmark was carried out in October-November 2021. The excavation formed part of the ERC-funded research project called ANTHEA, focusing on the deep history of anthropogenic heathlands. It was conducted by Aarhus University in collaboration with Holstebro Museum and Moesgaard Museum. The aim was to test a new method of sampling pollen data from different construction stages in a burial mound and comparing them with pollen data from nearby lake sediments with a view to improving our understanding of prehistoric anthropogenic heathland dynamics. Prior to the excavation, soil cores were collected from two nearby peat sediments as well as six burial mounds (including Store Vejlhøj) within a 1 km range of Lake Skånsø, where previous pollen analyses had been carried out. Based on these preliminary corings, Store Vejlhøj was selected for further archaeological investigation. A dispensation for excavating the protected mound was granted by the Danish Palaces and Culture Agency. The excavation was based on a 5 m long trench through the barrow, moving from its foot inwards. The surface vegetation and 40 cm topsoil were removed by an excavator, after which the remainder of the trench was manually dug in horizontal layers. Observation conditions were good. The excavation revealed a series of well-defined barrow construction stages, as well as unusually well-preserved turf structures. Only two archaeological finds could be related to the barrow, both of which were later than its initial construction: a secondary urn in the top layer, and the base of a second urn at the foot of the mound. The burial mound was constructed using a minimum of three shells, which could be observed in the trench profile. Turfs were most probably collected locally in a landscape dominated by grass pastures, where no previous turf cutting had taken place.
A total of 34 soil samples were collected for paleoecological analyses (pollen, non-pollen palynomorphs (NPPs), macrofossils) and geoarchaeological analyses (micromorphology, bulk samples). Preliminary pollen and macrofossil results from the burial mound revealed poor preservation conditions, which prompted a trench extension of 0.5 m by 0.2 m to find better preservation conditions. This extension resulted in the collection of a single final macrofossil sample, although there was no identifiable change in the in-situ preservation conditions. The dating of the mound has not yet been completed; the results will be included as appendices 4-6 in 2023.
APA, Harvard, Vancouver, ISO, and other styles
8

O'Neill, H. B., S. A. Wolfe, and C. Duchesne. Ground ice map of Canada. Natural Resources Canada/CMSS/Information Management, 2022. http://dx.doi.org/10.4095/330294.

Full text
Abstract:
This Open File presents national-scale mapping of ground ice conditions in Canada. The mapping depicts a first-order estimate of the combined volumetric percentage of excess ice in the top 5 m of permafrost from segregated, wedge, and relict ice. The estimates for the three ice types are based on modelling by O'Neill et al. (2019) (https://doi.org/10.5194/tc-13-753-2019), and informed by available published values of ground ice content and expert knowledge. The mapping offers an improved depiction of ground ice in Canada at a broad scale, incorporating current knowledge on the associations between geological and environmental conditions and ground ice type and abundance. It provides a foundation for hypothesis testing related to broad-scale controls on ground ice formation, preservation, and melt. Additional compilation of quantitative field data on ground ice and improvements to national-scale surficial geology mapping will allow further assessment and refinement of the representation of ground ice in Canada. Continued research will focus on improving the lateral and vertical representation of ground ice required for incorporation into Earth system models and decision-making. Spatial data files of the mapping are available as downloads with this Open File.
APA, Harvard, Vancouver, ISO, and other styles
9

Langlais, Pierre-Carl. Scientific Integrity. Comité pour la science ouverte, 2024. https://doi.org/10.52949/59.

Full text
Abstract:
Between 2-4% of researchers admit to have falsified or fabricated their data. The prevalence of such unethical behavior can be as high as 10% in some disciplines or countries. Data falsification is an extreme form of questionable research practices that are both less problematic and much more widespread: surveys on different disciplines have shown that more than half of researchers make some form of selective reporting or add new data until they obtain significant results. Unethical practices harm the global quality of research. Provided they have been validated by peer review, fabricated, distorted or selected data are featured in literature review or meta-analysis. They can in turn influence directions for future research or, even, policy decisions with wide range implications in health, economics or politics. Negative incentives have attracted significant attention: publishers, institutions and research evaluators tend to favor unprecedented research that does not simply confirm common hypotheses. The lack of proper tools, standards and workflows to deal efficiently with data is also a fundamental issue. In most disciplines, data collection is not well organized nor maintained: it has been estimated that as much as half of the datasets created from the 1990s in life science are already lost. Questionable research practices partly stem from common deficiencies in scientific data management. Open science and data sharing have recently emerged as a common framework to solve issues of research integrity. While initially focused on access to publications, the open science movement is more broadly concerned with transparency at all the stages of the research lifecycle. The diffusion of datasets in open repositories and infrastructures has already largely solved major issues of long term preservation. 
It also ensures that potential errors or adjustments to statistical indicators can be subsequently corrected, as later analyses and replications have access to the original data source. Open science is now increasingly integrated into ethical standards, following community-led initiatives such as the TOP Guidelines (2014). The European Code of Conduct for Research Integrity (2017) includes explicit requirements for open access, open data and reproducible workflows: “Researchers, research institutions and organizations ensure access to data is as open as possible, as closed as necessary”. The Hong Kong Principles for Assessing Researchers (2020) acknowledge open science as one of the five pillars of scientific integrity. Open science is changing the nature of the debate over research integrity, which had until now remained largely detached from the public space. Between 60% and 90% of the audience of open scientific platforms consists of non-academic professionals and private citizens. This increased diffusion creates new responsibilities but also new opportunities to involve non-academic stakeholders in the spirit of citizen science.
10

Pasupuleti, Murali Krishna. 2D Quantum Materials for Next-Gen Semiconductor Innovation. National Education Services, 2025. https://doi.org/10.62311/nesx/rrvi425.

Full text
Abstract:
The emergence of two-dimensional (2D) quantum materials is revolutionizing next-generation semiconductor technology, offering superior electronic, optical, and quantum properties compared to traditional silicon-based materials. 2D materials such as graphene, transition metal dichalcogenides (TMDs), hexagonal boron nitride (hBN), and black phosphorus exhibit high carrier mobility, tunable bandgaps, exceptional mechanical flexibility, and strong light-matter interactions, making them ideal candidates for ultra-fast transistors, spintronics, optoelectronic devices, and quantum computing applications. This research explores the fundamental properties, synthesis techniques, and integration challenges of 2D quantum materials in semiconductor innovation, focusing on scalability, stability, and industrial feasibility. Additionally, it examines emerging applications in energy-efficient electronics, flexible and wearable technologies, and photonic integration for high-speed data processing. The study also addresses current challenges, including material defect engineering, large-area fabrication, and quantum coherence preservation, providing a comprehensive roadmap for the adoption of 2D quantum materials in next-generation semiconductor architectures.
Keywords: 2D quantum materials, next-generation semiconductors, graphene electronics, transition metal dichalcogenides (TMDs), quantum transport, high carrier mobility, spintronics, optoelectronics, quantum computing, flexible electronics, nanoelectronics, photonic integration, tunable bandgap materials, electronic properties of 2D materials, synthesis of 2D semiconductors, scalable semiconductor fabrication, quantum coherence, energy-efficient transistors, van der Waals heterostructures, semiconductor innovation.
