Dissertations / Theses on the topic 'Digital Forensic investigations'

Consult the top 48 dissertations / theses for your research on the topic 'Digital Forensic investigations.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Marziale, Lodovico. "Advanced Techniques for Improving the Efficacy of Digital Forensics Investigations." ScholarWorks@UNO, 2009. http://scholarworks.uno.edu/td/1027.

Full text
Abstract:
Digital forensics is the science concerned with discovering, preserving, and analyzing evidence on digital devices. The intent is to be able to determine what events have taken place, when they occurred, who performed them, and how they were performed. In order for an investigation to be effective, it must exhibit several characteristics. The results produced must be reliable, or else the theory of events based on the results will be flawed. The investigation must be comprehensive, meaning that it must analyze all targets which may contain evidence of forensic interest. Since any investigation must be performed within the constraints of available time, storage, manpower, and computation, investigative techniques must be efficient. Finally, an investigation must provide a coherent view of the events under question using the evidence gathered. Unfortunately, the set of currently available tools and techniques used in digital forensic investigations does a poor job of supporting these characteristics. Many tools contain bugs which generate inaccurate results; there are many types of devices and data for which no analysis techniques exist; most existing tools are woefully inefficient, failing to take advantage of modern hardware; and the task of aggregating data into a coherent picture of events is largely left to the investigator to perform manually. To remedy this situation, we developed a set of techniques to facilitate more effective investigations. To improve reliability, we developed the Forensic Discovery Auditing Module, a mechanism for auditing and enforcing controls on accesses to evidence. To improve comprehensiveness, we developed ramparser, a tool for deep parsing of Linux RAM images, which provides previously inaccessible data on the live state of a machine. To improve efficiency, we developed a set of performance optimizations and applied them to the Scalpel file carver, achieving order-of-magnitude improvements in processing speed and storage requirements. Last, to facilitate more coherent investigations, we developed the Forensic Automated Coherence Engine, which generates a high-level view of a system from the data generated by low-level forensics tools. Together, these techniques significantly improve the effectiveness of digital forensic investigations conducted using them.
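The Scalpel tool named in the abstract above performs header/footer file carving. As a rough, hypothetical sketch of that underlying technique (not Scalpel's actual implementation, configuration format, or optimizations), a raw image can be scanned for known file signatures:

```python
# Minimal header/footer carving sketch: find byte ranges that look like
# complete JPEG files inside a raw disk image. Illustrative only.

JPEG_HEADER = b"\xff\xd8\xff"  # JPEG start-of-image marker (plus next marker byte)
JPEG_FOOTER = b"\xff\xd9"      # JPEG end-of-image marker

def carve_jpegs(image: bytes, max_size: int = 10_000_000):
    """Return candidate JPEG byte strings carved from a raw image."""
    results = []
    start = image.find(JPEG_HEADER)
    while start != -1:
        # Look for the closing marker within a bounded window after the header.
        end = image.find(JPEG_FOOTER, start + len(JPEG_HEADER), start + max_size)
        if end != -1:
            results.append(image[start:end + len(JPEG_FOOTER)])
        start = image.find(JPEG_HEADER, start + 1)
    return results
```

Real carvers must additionally handle fragmentation, overlapping candidates, and signatures that legitimately occur inside other files, which is where the performance and accuracy engineering lies.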
APA, Harvard, Vancouver, ISO, and other styles
2

De Souza, Pedro. "A Chain of findings for digital investigations." Diss., University of Pretoria, 2013. http://hdl.handle.net/2263/40842.

Full text
Abstract:
Digital Forensic investigations play a vital role in our technologically enhanced world, and they may incorporate a number of different types of evidence, ranging from digital to physical. During a Digital Forensic investigation an investigator may formulate a number of hypotheses, and in order to reason objectively about them, an investigator must take such evidence into account in its entirety, relying on multiple sources. When formulating such objective reasoning an investigator must take into account not only inculpatory evidence but also exculpatory evidence and evidence of tampering. In addition, the investigator must factor in the reliability of the evidence used, the potential for error (tool and human based), and the certainty with which they can make various claims. By doing so, and by creating a detailed audit trail of all actions performed, an investigator is better prepared for challenges to their work when it is presented. An investigator must also take into account the dynamic aspects of an investigation, such as certain evidence no longer being admissible, and must continuously factor these aspects into their reasoning to ensure that their conclusions still hold. Investigations may extend over a long period of time, and should the relevant information not be captured in detail, it may be lost or forgotten, affecting the reliability of an investigator’s findings and affecting future investigators’ ability to build on and continue an investigator’s work. In this dissertation we investigate whether it is possible to provide a formalised means for capturing and encoding an investigator’s reasoning process in a detailed and structured manner.
By this we mean we would like to capture and encode an investigator’s hypotheses, their arguments, their conclusions and the certainty with which they can make such claims, as well as the various pieces of evidence (digital and physical) that they use as a foundation for their arguments. We also want to capture the steps an investigator took when formulating these arguments and the steps an investigator took in order to get evidence into its intended form. Capturing such a detailed reasoning process allows for a more thorough reconstruction of an investigator’s findings, further improving the reliability that can be placed in them. By encoding the investigator’s reasoning process, an investigator can more easily receive feedback on the impact that the various dynamic aspects of an investigation have upon their reasoning. In order to achieve these goals, our dissertation presents a model, called the Chain of Findings, allowing investigators to formulate and capture their reasoning process throughout the investigation, using a combination of goal-driven and data-driven approaches. When formulating their reasoning, the model allows investigators to treat evidence, digital and physical, uniformly as building blocks for their arguments and to capture detailed information on how and why it serves its role in an investigator’s reasoning process. In addition, the Chain of Findings offers a number of other uses and benefits, including the training of investigators and Digital Forensic Readiness.
Dissertation (MSc)--University of Pretoria, 2013.
Computer Science
3

Hansen, Tone. "A Digital Tool to Improve the Efficiency of IT Forensic Investigations." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-40232.

Full text
Abstract:
The IT forensic process causing bottlenecks in investigations is a recognised issue with multiple underlying causes, one of the main causes being the lack of expertise among those responsible for ordering IT forensic investigations. The focus of the study is to create and evaluate a potential solution to this problem, aiming to answer research questions related to a suitable architecture, structure and design of a digital tool that would assist individuals in creating IT forensic orders. This work evaluates concepts for such a digital tool. This is done using a grounded theory approach, where a series of test sessions together with the answers from a survey have been examined and analyzed in an iterative process. A low-fidelity prototype is used in the process. The resulting conclusion of the study is a set of concepts, ideas and principles for a digital tool that would aid in the IT forensic ordering process, as well as improving the efficiency of the IT forensic process itself. Future work could involve developing the concept further into a finished product, or using it to improve already existing systems and tools, improving the efficiency and quality of the IT forensic process.
4

Patterson, Farrah M. "The implications of virtual environments in digital forensic investigations." Master's thesis, University of Central Florida, 2011. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4819.

Full text
Abstract:
This research paper discusses the role of virtual environments in digital forensic investigations. With virtual environments becoming more prevalent as an analysis tool in digital forensic investigations, it is becoming increasingly important for digital forensic investigators to understand the limitations and strengths of virtual machines. The study aims to expose limitations within commercial closed source virtual machines and open source virtual machines. The study provides a brief overview of the history of digital forensic investigations and virtual environments, and concludes with an experiment involving four common open and closed source virtual machines, examining the effects of the virtual machines on the host machine as well as the performance of the virtual machines themselves. The findings show that while the open source tools provided more control and freedom to the operator, the closed source tools were more stable and consistent in their operation. The significance of these findings can be further researched by applying them to demonstrating the reliability of forensic techniques when virtual machines are presented as analysis tools in litigation.
ID: 030646240; System requirements: World Wide Web browser and PDF reader.; Mode of access: World Wide Web.; Thesis (M.S.)--University of Central Florida, 2011.; Includes bibliographical references (p. 46).
M.S.
Masters
Computer Science
Engineering and Computer Science
Digital Forensics; Science/Computing Track
5

Liljekvist, Erika, and Oscar Hedlund. "Uncovering Signal: Simplifying Forensic Investigations of the Signal Application." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-44835.

Full text
Abstract:
The increasing availability of easy-to-use end-to-end encrypted messaging applications has made it possible for more people to conduct their conversations privately. Criminals have taken advantage of this, and it has proven to make digital forensic investigations more difficult, as methods of decrypting the data are needed. In this thesis, data from iOS and Windows devices is extracted and analysed, with a focus on the application Signal. Although the Signal application is also available on other operating systems, such as Android, these are outside the scope of this thesis. The results of this thesis provide access to data stored in the encrypted application Signal without the need for expensive analysis tools. This is done by developing and publishing the first open-source script for decryption and parsing of the Signal database. The script is available for anyone at https://github.com/decryptSignal/decryptSignal.
6

Hargreaves, C. J. "Assessing the Reliability of Digital Evidence from Live Investigations Involving Encryption." Thesis, Department of Informatics and Sensors, 2009. http://hdl.handle.net/1826/4007.

Full text
Abstract:
The traditional approach to a digital investigation when a computer system is encountered in a running state is to remove the power, image the machine using a write blocker and then analyse the acquired image. This has the advantage of preserving the contents of the computer’s hard disk at that point in time. However, the disadvantage of this approach is that the preservation of the disk is at the expense of volatile data such as that stored in memory, which does not remain once the power is disconnected. There are an increasing number of situations where this traditional approach of ‘pulling the plug’ is not ideal since volatile data is relevant to the investigation; one of these situations is when the machine under investigation is using encryption. If encrypted data is encountered on a live machine, a live investigation can be performed to preserve this evidence in a form that can be later analysed. However, there are a number of difficulties with using evidence obtained from live investigations that may cause the reliability of such evidence to be questioned. This research investigates whether digital evidence obtained from live investigations involving encryption can be considered to be reliable. To determine this, a means of assessing reliability is established, which involves evaluating digital evidence against a set of criteria; evidence should be authentic, accurate and complete. This research considers how traditional digital investigations satisfy these requirements and then determines the extent to which evidence from live investigations involving encryption can satisfy the same criteria. This research concludes that it is possible for live digital evidence to be considered to be reliable, but that reliability of digital evidence ultimately depends on the specific investigation and the importance of the decision being made. 
However, the research provides structured criteria that allow the reliability of digital evidence to be assessed, demonstrates the use of these criteria in the context of live digital investigations involving encryption, and shows the extent to which each can currently be met.
7

Hargreaves, Christopher James. "Assessing the reliability of digital evidence from live investigations involving encryption." Thesis, Cranfield University, 2009. http://dspace.lib.cranfield.ac.uk/handle/1826/4007.

Full text
Abstract:
The traditional approach to a digital investigation when a computer system is encountered in a running state is to remove the power, image the machine using a write blocker and then analyse the acquired image. This has the advantage of preserving the contents of the computer’s hard disk at that point in time. However, the disadvantage of this approach is that the preservation of the disk is at the expense of volatile data such as that stored in memory, which does not remain once the power is disconnected. There are an increasing number of situations where this traditional approach of ‘pulling the plug’ is not ideal since volatile data is relevant to the investigation; one of these situations is when the machine under investigation is using encryption. If encrypted data is encountered on a live machine, a live investigation can be performed to preserve this evidence in a form that can be later analysed. However, there are a number of difficulties with using evidence obtained from live investigations that may cause the reliability of such evidence to be questioned. This research investigates whether digital evidence obtained from live investigations involving encryption can be considered to be reliable. To determine this, a means of assessing reliability is established, which involves evaluating digital evidence against a set of criteria; evidence should be authentic, accurate and complete. This research considers how traditional digital investigations satisfy these requirements and then determines the extent to which evidence from live investigations involving encryption can satisfy the same criteria. This research concludes that it is possible for live digital evidence to be considered to be reliable, but that reliability of digital evidence ultimately depends on the specific investigation and the importance of the decision being made. 
However, the research provides structured criteria that allow the reliability of digital evidence to be assessed, demonstrates the use of these criteria in the context of live digital investigations involving encryption, and shows the extent to which each can currently be met.
8

Roussel, Bruno. "Les investigations numériques en procédure pénale." Thesis, Bordeaux, 2020. http://www.theses.fr/2020BORD0075.

Full text
Abstract:
In the context of the digitisation of our society, the criminal investigation must adapt to the dematerialisation of investigative acts, which must allow digital data to be accessed, collected and generated. In the current state of French criminal procedure, the digital information handled during investigative acts is scattered and compartmentalised, which undermines both the efficiency of its exploitation and the protection of the rights of the persons concerned by the data thus collected or generated. This study proposes an analysis of all the digital information gathered in the course of a proceeding, considered as a whole. The numerous processing operations on personal data implemented by the State, to which direct access is provided during the criminal investigation, are likewise scattered and physically separated. Far from guaranteeing the protection of the rights of the persons on file, this separation undermines the quality of the recorded data and neutralises the possibility of effective oversight of these processing operations. A measured pooling of certain identical data, such as identification or address details, is presented here as a source of significant improvement to criminal proceedings.
9

Montasari, Reza. "The Comprehensive Digital Forensic Investigation Process Model (CDFIPM) for digital forensic practice." Thesis, University of Derby, 2016. http://hdl.handle.net/10545/620799.

Full text
10

Sanyamahwe, Tendai. "Digital forensic model for computer networks." Thesis, University of Fort Hare, 2011. http://hdl.handle.net/10353/d1000968.

Full text
Abstract:
The Internet has become important since information is now stored in digital form and is transported both within and between organisations in large amounts through computer networks. Nevertheless, there are individuals or groups who utilise the Internet to harm other businesses because they can remain relatively anonymous. To prosecute such criminals, forensic practitioners have to follow a well-defined procedure to convict responsible cyber-criminals in a court of law. Log files provide significant digital evidence in computer networks when tracing cyber-criminals. Network log mining is an evolution of typical digital forensics utilising evidence from network devices such as firewalls, switches and routers. Network log mining is a process supported by prevailing South African laws such as the Computer Evidence Act, 57 of 1983; the Electronic Communications and Transactions (ECT) Act, 25 of 2002; and the Electronic Communications Act, 36 of 2005. International laws and regulations supporting network log mining include the Sarbanes-Oxley Act and the Foreign Corrupt Practices Act (FCPA) of the USA, and the Bribery Act of the UK. A digital forensic model for computer networks focusing on network log mining has been developed based on the literature reviewed and critical thought. The development of the model followed the Design Science methodology. However, this research project argues that there are some important aspects which are not fully addressed by the prevailing South African legislation supporting digital forensic investigations. With that in mind, this research project proposes some Forensic Investigation Precautions, developed as part of the proposed model. The Diffusion of Innovations (DOI) Theory is the framework underpinning the development of the model and how it can be assimilated into the community.
The model was sent to IT experts for validation, and this provided the qualitative element and the primary data of this research project. From these experts, the study found that the proposed model is unique and comprehensive, and that it adds new knowledge to the field of Information Technology. A paper was also written based on this research project.
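Network log mining, as described in the abstract above, can be illustrated with a small sketch. The log line format and field names below are assumptions for illustration, not any specific device's syntax:

```python
# Toy network log mining: parse firewall-style log lines into structured
# events and count DENY events per source IP. Format is illustrative only.
import re
from collections import Counter

LOG_PATTERN = re.compile(
    r"(?P<ts>\S+ \S+) (?P<action>ALLOW|DENY) (?P<proto>\w+) "
    r"(?P<src>[\d.]+):(?P<sport>\d+) -> (?P<dst>[\d.]+):(?P<dport>\d+)"
)

def mine_log(lines):
    """Return a Counter of DENY events keyed by source IP address."""
    denied = Counter()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m and m.group("action") == "DENY":
            denied[m.group("src")] += 1
    return denied
```

In a real investigation the parsed events would also be timestamp-correlated across firewalls, switches and routers, and preserved with integrity controls so the results remain admissible.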
11

Etow, Tambue Ramine. "IMPACT OF ANTI-FORENSICS TECHNIQUES ON DIGITAL FORENSICS INVESTIGATION." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-97116.

Full text
Abstract:
Computer crimes have become very complex in terms of investigation and prosecution. This is mainly because forensic investigations are based on artifacts left on computers and other digital devices. In recent times, perpetrators of computer crimes have kept abreast of digital forensics dynamics and are hence equipped to use anti-forensics measures and techniques to obfuscate the investigation process. In cases where such techniques are employed, it becomes extremely difficult, expensive and time-consuming to carry out an effective investigation, which might cause a digital forensics expert to abandon the investigation. This project work practically demonstrates how numerous anti-forensics techniques can be deployed by criminals to derail the smooth process of a digital forensic investigation, with a main focus on data hiding and encryption techniques; afterwards, a comparative study of the effectiveness of selected digital forensics tools in analyzing and reporting shreds of evidence is conducted.
12

Alqahtany, Saad. "A forensically-enabled IaaS cloud computing architecture." Thesis, University of Plymouth, 2017. http://hdl.handle.net/10026.1/9508.

Full text
Abstract:
Cloud computing has been advancing at an intense pace. It has become one of the most important research topics in computer science and information systems. Cloud computing offers enterprise-scale platforms in a short time frame with little effort. Thus, it delivers significant economic benefits to both commercial and public entities. Despite this, the security and subsequent incident management requirements are major obstacles to adopting the cloud. Current cloud architectures do not support digital forensic investigators, nor comply with today’s digital forensics procedures, largely due to the fundamentally dynamic nature of the cloud. When an incident has occurred, an organization-based investigation will seek to provide potential digital evidence while minimising the cost of the investigation. Data acquisition is the first and most important process within digital forensics, ensuring data integrity and admissibility. However, access to data and the control of resources in the cloud is still very much provider-dependent and complicated by the very nature of the multi-tenanted operating environment. Thus, investigators have no option but to rely on the Cloud Service Providers (CSPs) to acquire evidence for them. Due to the cost and time involved in acquiring a forensic image, some cloud providers will not provide evidence beyond 1TB even when a court order is served on them. Assuming they would be willing, or are required to by law, the evidence collected is still questionable, as there is no way to verify its validity or to establish whether evidence has already been lost. Therefore, dependence on the CSPs is considered one of the most significant challenges investigators face when they need to acquire evidence in a timely yet forensically sound manner from cloud systems. This thesis proposes a novel architecture to support forensic acquisition and analysis of IaaS cloud-based systems.
The approach, known as the Cloud Forensic Acquisition and Analysis System (Cloud FAAS), is based on a cluster analysis of non-volatile memory that achieves forensically reliable images at the same level of integrity as the normal “gold standard” computer forensic acquisition procedures, with the additional capability to reconstruct the image at any point in time. Cloud FAAS fundamentally shifts access to the data back to the data owner rather than relying on a third party. In this manner, organisations are free to undertake investigations at will, requiring no intervention or cooperation from the cloud provider. The novel architecture is validated through a proof-of-concept prototype. A series of experiments is undertaken to illustrate and model how Cloud FAAS is capable of providing a richer and more complete set of admissible evidence than current CSPs are able to provide. Using Cloud FAAS, investigators have the ability to obtain a forensic image of the system after, just prior to, or hours before the incident. Therefore, this approach can not only create images that are forensically sound but also provide access to deleted and, more importantly, overwritten files, which current computer forensic practices are unable to achieve. This results in an increased level of visibility for the forensic investigator and removes any limitations that data carving and fragmentation may introduce. In addition, an analysis of the economic overhead of operating Cloud FAAS is performed. This shows that the level of disk change that occurs is well within acceptable limits and is relatively small in comparison to the total volume of memory available. The results show that Cloud FAAS has both a technical and an economic basis for solving investigations involving cloud computing.
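The point-in-time reconstruction capability described in the abstract above can be illustrated in miniature. This sketch is an assumption-laden toy (retaining whole snapshots keyed by acquisition time), not the Cloud FAAS design itself:

```python
# Toy snapshot store: keep periodic disk images so the state at, or just
# before, any requested point in time can be retrieved. Illustrative only.
import bisect

class SnapshotStore:
    def __init__(self):
        self._times = []   # sorted acquisition timestamps
        self._images = {}  # timestamp -> image bytes

    def record(self, timestamp: float, image: bytes) -> None:
        """Store an acquired image under its acquisition timestamp."""
        bisect.insort(self._times, timestamp)
        self._images[timestamp] = image

    def image_at(self, when: float):
        """Return the most recent snapshot taken at or before `when`."""
        i = bisect.bisect_right(self._times, when)
        if i == 0:
            return None  # no snapshot exists that early
        return self._images[self._times[i - 1]]
```

A production system would store incremental block-level deltas rather than full copies, which is precisely where the economic overhead analysis mentioned above becomes relevant.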
13

Sonnekus, Michael Hendrik. "A comparison of open source and proprietary digital forensic software." Thesis, Rhodes University, 2015. http://hdl.handle.net/10962/d1017939.

Full text
Abstract:
Scrutiny of the capabilities and accuracy of computer forensic tools is increasing as the number of incidents relying on digital evidence, and the weight of that evidence, increase. This thesis describes the capabilities of the leading proprietary and open source digital forensic tools. The capabilities of the tools were tested separately on digital media that had been formatted using Windows and Linux. Experiments were carried out with the intention of establishing whether the capabilities of open source computer forensic tools are similar to those of proprietary computer forensic tools, and whether these tools could complement one another. The tools were tested with regard to their capabilities to make and analyse digital forensic images in a forensically sound manner. The tests were carried out on each media type after deleting data from the media, and then repeated after formatting the media. The results of the experiments demonstrate that both proprietary and open source computer forensic tools have superior capabilities in different scenarios, and that the toolsets can be used to validate and complement one another. The implication of these findings is that investigators have an affordable means of validating their findings and are able to investigate digital media more effectively.
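Cross-validation of imaging tools, as described in the abstract above, typically rests on comparing cryptographic hashes of independently acquired images. A minimal sketch (the function names are illustrative, and real images would be hashed in streamed chunks rather than held in memory):

```python
# If two tools independently image the same media and the hashes match,
# the acquisitions corroborate one another. Illustrative sketch only.
import hashlib

def image_digest(data: bytes, algorithm: str = "sha256") -> str:
    """Return the hex digest of an acquired image held as bytes."""
    h = hashlib.new(algorithm)
    h.update(data)
    return h.hexdigest()

def acquisitions_match(image_a: bytes, image_b: bytes) -> bool:
    """True when two acquisitions produce byte-identical images."""
    return image_digest(image_a) == image_digest(image_b)
```

The same digests also serve as the integrity record for each image: recomputing the hash later shows whether the evidence copy has changed since acquisition.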
14

Van Ramesdonk, Paul. "Continued forensic development - investigation into current trends and proposed model for digital forensic practitioners." Master's thesis, University of Cape Town, 2016. http://hdl.handle.net/11427/20707.

Full text
Abstract:
Continuous professional development has been looked at in many professions over the years, most notably in primary and secondary education and in the medical fields. With digital forensics being cast into the limelight due to the rapid advancements in technology, academic institutions have added courses to address the void created by the boom in the industry. Little research has been done to address the issues that have now become apparent concerning continued learning in this field. The purpose of this research was to investigate the kinds of frameworks and methods used in other professions, and how the practitioners themselves see career development, and to create a framework that could be used to keep abreast of developments in the field of digital forensics, be it changes in the law, case law, or changes in software. The data analysis showed quite a number of continued learning approaches that could be employed in the digital/computer forensic fields to achieve the objective of keeping abreast of changes in the field. Some, understandably, are due to the nature of the discipline. As part of practitioners' current approach to continued learning, they rely heavily on knowledge sharing in the form of learning from other professionals, through self-study by reading books, articles and research conducted in the forensic field, the use of Information and Communications Technology (ICT) for education, and the use of Internet sources such as user forums, Facebook groups, and web-blogs. The majority of the respondents had received formal training in digital forensics, and of the total number of participants, only six percent had not been involved in any form of continued learning activities in the past five years. 
When looking at the data obtained, and because there are no formal requirements to perform continued learning in the digital/computer forensic field, it becomes clear that individuals themselves need to be self-driven to keep up to date with changes in the field. As seen in studies focused on continued learning activities in other professions, the research shows that digital/computer forensic practitioners experience similar barriers to their own approaches to continued learning.
15

Rule, Samantha Elizabeth. "A Framework for using Open Source intelligence as a Digital Forensic Investigative tool." Thesis, Rhodes University, 2015. http://hdl.handle.net/10962/d1017937.

Full text
Abstract:
The proliferation of the Internet has amplified the use of social networking sites by creating a platform that encourages individuals to share information. As a result there is a wealth of information that is publicly and easily accessible. This research explores whether open source intelligence (OSINT), which is freely available, could be used as a digital forensic investigative tool. A survey was created and sent to digital forensic investigators to establish whether they currently use OSINT when performing investigations. The survey results confirm that OSINT is being used by digital forensic investigators when performing investigations, but there are currently no guidelines or frameworks available to support its use. Additionally, the survey results showed a belief amongst those surveyed that evidence gleaned from OSINT sources is considered supplementary rather than evidentiary. The findings of this research led to the development of a framework that identifies and recommends key processes to follow when conducting OSINT investigations. The framework can assist digital forensic investigators to follow a structured and rigorous process, which may lead to the unanimous acceptance of information obtained via OSINT sources as evidentiary rather than supplementary in the near future.
16

Wang, Mengmeng, and 王萌萌. "Temporal analysis on HFS+ and across file systems in digital forensic investigation." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hub.hku.hk/bib/B50900122.

Full text
Abstract:
In computer forensics, digital evidence related to time is both important and complex. The rules of changes in time associated with digital evidence, such as files or folders, can be used to analyze certain user behaviors like data access, modification or transfer. However, the format and the rules governing time information for user actions differ considerably between file systems, and even between versions of operating systems using the same file system. Some research on temporal analysis has already been done on the NTFS and FAT file systems, while few resources describe temporal analysis on the Hierarchical File System Plus (HFS+), the default file system on Apple computers. Moreover, removable devices like USB disks are used frequently, and transferring files and folders between devices with different file systems and operating systems happens more and more often, so the changes of times across different file systems are also crucial in digital forensics and investigations. In this research, the changes in time attributes of files and folders resulting from user actions on the HFS+ file system and across file systems are analyzed, and rules of time are generated by inductive reasoning to help reconstruct crime scenes in the digital forensic investigation. Since inductive reasoning, unlike deductive reasoning, does not guarantee true conclusions, experiments are performed to validate the rules. The usage of the rules is demonstrated by analyzing a case in detail. The methods proposed here are efficient, practical and easy to apply in real scenarios.
Published or final version. Computer Science. Master of Philosophy.
17

Barbosa, Akio Nogueira. "Método para ranqueamento e triagem de computadores aplicado à perícia de informática." Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/3/3141/tde-14062016-081553/.

Full text
Abstract:
One of the most common tasks for a forensic expert working in information technology is to search the contents of data storage devices (DADs) for evidence of interest, which in most cases consists of keywords. During the time needed to duplicate a DAD, the expert is practically unable to interact with the data it contains. This work verifies the hypothesis that it is possible, at the collection stage, to duplicate the DAD and simultaneously scan its raw data for keywords, without thereby significantly impacting the duplication time. The main objective of this thesis is to propose a method for identifying, at the end of the collection stage, the DADs most likely to contain evidence of interest for a given examination, based on the number of keyword occurrences found by a scanning mechanism that operates at the raw-data level. From these results a triage of the DADs is performed. With the results of the triage, a ranking process is carried out, indicating which DADs should be examined first at the analysis stage. The experimental results showed that the method can be applied without increasing the duplication time and with a good level of accuracy. In many cases, applying the method reduces the number of DADs that must be analyzed, helping to reduce the human effort required.
18

Al, Mutawa Noora Ahmad Khurshid. "Integrating behavioural analysis within the digital forensics investigation process." Thesis, University of Central Lancashire, 2018. http://clok.uclan.ac.uk/25412/.

Full text
Abstract:
This programme of research focused on incorporating Behavioural Analysis (BA) within the digital forensics investigation process. A review of previously developed digital forensics investigation models indicated a lack of sufficient consideration of the behavioural and motivational dimensions of offending, and of the way in which digital evidence can be used to address these issues during the investigation process. This programme of research aimed to build on previous work by scientific researchers and investigators by developing a digital forensics investigation model which incorporates greater consideration of the behavioural and motivational implications of case-related digital evidence, based on current theoretical understandings of these aspects of offending from forensic psychology. This can aid the understanding and reconstruction of crime events, and lead to the development of more detailed models and guidelines for examining computer-facilitated interpersonal crimes. The first study employed an abductive approach to forensically analyse individual cases (real cases obtained from the Dubai Police archives), applying BA to online Sexually Exploitative Imagery of Children (SEIC) and cyberstalking cases. Its aim was to investigate what BA could contribute to the digital forensics investigation of cases within these crime categories. It identified five benefits: (1) providing focus, speed and investigative directions, (2) inferring victim/offender behaviours, (3) inferring offender motivation(s), (4) identifying potential victims, and (5) eliminating suspects. This was followed by a survey study empirically examining the perceptions of national and international digital forensics practitioners regarding the use and utility of BA during the process of investigating SEIC and cyberstalking cases. 
The results indicated that while the majority believed that BA has potential to contribute to many aspects of digital forensics investigations, their daily investigative activities involved a limited use of this technique. The implications of the study were outlined, and emphasised the need to design a digital forensics investigation model that provides guiding steps and illustrations on how to utilise BA in digital forensics investigations. Based on the findings from the conducted studies, a digital forensics investigation model that incorporates aspects of BA was designed. It aimed to provide a pragmatic, structured, multidisciplinary approach to performing a post mortem examination, analysis, and interpretation of the content of the digital devices associated with computer-facilitated interpersonal crimes. Two comprehensive case studies were also used to illustrate the investigative importance of the model in investigating computer-facilitated interpersonal crimes.
19

LeRoi, Jack. "A forensic investigation of the electrical properties of digital audio recording." Thesis, University of Colorado at Denver, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=1556863.

Full text
Abstract:

In media forensics, devices (e.g. computers, smart phones, still/video cameras, and audio recorders) and software (e.g. video, audio, and graphics editors; file and disk utilities; and mathematical computation applications) are, for the most part, black boxes. The design specifications are usually proprietary, and the operating specifications may be incomplete, inaccurate, or unavailable. This makes it difficult to validate the technology, but using it without validation could discredit a practitioner's findings or testimony. The alternative is to test the device or program to determine relevant characteristics of its performance.

An important and common device in media forensics is the portable digital audio recorder used to record surveillance and interviews. This type of recorder can also be used to record the alternating current (AC) waveform from the mains power. While small variations in the AC frequency, known as the electrical network frequency (ENF), can be forensically important, distortion in the recording can affect its value in adjudication or investigation. A method is presented to evaluate aspects of a recorder's operation that can cause distortion. Specifically, the method measures the noise generated by the recorder's electronics in its input and amplifier circuits, and includes a procedure to isolate the recorder from environmental sources of noise. The method analyzes the broadband noise floor produced across the range of recording conditions and recorder settings. It also analyzes the noise amplitude at the harmonics of the mains frequency.

20

Ademu, Inikpi. "A comprehensive digital forensics investigation model and guidelines for establishing admissible digital evidence." Thesis, University of East London, 2013. http://roar.uel.ac.uk/3992/.

Full text
Abstract:
Technology systems are attacked by offenders using digital devices and networks to facilitate their crimes and hide their identities, creating new challenges for digital investigators. Malicious programs that exploit vulnerabilities also serve as threats to digital investigators. Since digital devices such as computers and networks are used by organisations and digital investigators, malicious programs and risky practices that may contaminate the integrity of digital evidence can lead to loss of evidence. For these reasons, digital investigators face a major challenge in preserving the integrity of digital evidence. Not only is there no definitive comprehensive model of digital forensic investigation for ensuring the reliability of digital evidence, but there has to date been no intensive research into methods of developing one. To address the issue of preserving the integrity of digital evidence, this research improves upon other digital forensic investigation models by creating a Comprehensive Digital Forensic Investigation Model (CDFIM), a model that results in an improvement in the investigation process, as well as in the security mechanism and guidelines used during investigation. The improvement is also effected by implementing Proxy Mobile Internet Protocol version 6 (PMIPv6) with improved buffering, based on the Open Air Interface PMIPv6 (OAI PMIPv6) implementation, to provide reliable services during handover in the Mobile Node (MN) and to improve performance measures that minimize loss of data, which this research identified as a factor affecting the integrity of digital evidence. This demonstrates that the integrity of digital evidence can be preserved if loss of data is prevented. This research supports the integration of security mechanisms and intelligent software in digital forensic investigation, which assist in preserving the integrity of digital evidence, by conducting two different attack experiments to test CDFIM. 
It found that when CDFIM used security mechanism and guidelines with the investigation process, it was able to identify the attack and also ensured that the integrity of the digital evidence was preserved. It was also found that the security mechanism and guidelines incorporated in the digital investigative process are useless when the security guidelines are ignored by digital investigators, thus posing a threat to the integrity of digital evidence.
21

Homem, Irvin. "Towards Automation in Digital Investigations : Seeking Efficiency in Digital Forensics in Mobile and Cloud Environments." Licentiate thesis, Stockholms universitet, Institutionen för data- och systemvetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-130742.

Full text
Abstract:
Cybercrime and related malicious activity in our increasingly digital world has become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with various challenges. Some of the most prominent of these challenges include the increasing amounts of data and the diversity of digital evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are an interesting specimen, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not aided by the fact that digital forensics today still involves manual, time-consuming tasks within the processes of identifying evidence, performing evidence acquisition and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, the industry-standard tools developed are largely evidence-oriented, have limited support for evidence integration and only automate certain precursory tasks, such as indexing and text searching. In this study, efficiency, in the form of reducing the time and human labour expended, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture is designed for an automated system that performs digital forensics in highly networked mobile and cloud environments. 
Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning is developed and tested. Finally the proposed architecture is reviewed and enhancements proposed in order to further automate the architecture by introducing decentralization particularly within the storage and processing functionality. This decentralization also improves machine to machine communication supporting several digital investigation processes enabled by the architecture through harnessing the properties of various peer-to-peer overlays. Remote evidence acquisition helps to improve the efficiency (time and effort involved) in digital investigations by removing the need for proximity to the evidence. Experiments show that a single TCP connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition and that a multi-TCP connection paradigm is required. The automated integration, correlation and reasoning on multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing the need for time-consuming manual correlation. Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication thereby enabling automation and reducing the need for manual human intervention.
22

Al-Morjan, Abdulrazaq Abdulaziz. "An investigation into a digital forensic model to distinguish between 'insider' and 'outsider'." Thesis, De Montfort University, 2010. http://hdl.handle.net/2086/4402.

Full text
Abstract:
IT systems are attacked using computers and networks to facilitate their crimes and hide their identities, creating new challenges for corporate security investigations. There are two main types of attacker: insiders and outsiders. Insiders are trusted users who have gained authorised access to an organisation's IT resources in order to execute their job responsibilities. However, they deliberately abuse their authorised (i.e. insider) access in order to contravene an organisation’s policies or to commit computer crimes. Outsiders gain insider access to an organisation's IT objects through their ability to bypass security mechanisms without prior knowledge of the insider’s job responsibilities, an advanced method of attacking an organisation’s resources in such a way as to prevent the abnormal behaviour typical of an outsider attack from being detected, and to hide the attacker’s identity. For a number of reasons, corporate security investigators face a major challenge in distinguishing between the two types of attack. Not only is there no definitive model of digital analysis for making such a distinction, but there has to date been no intensive research into methods of doing so. Identification of these differences is attempted by flawed investigative approaches to three aspects: location from which an attack is launched, attack from within the organisation's area of control, and authorised access. The results of such unsound investigations could render organisations subject to legal action and negative publicity. To address the issue of the distinction between insider and outsider attacks, this research improves upon the first academic forensic analysis model, Digital Forensic Research Workshop (DFRWS) [63]. The outcome of this improvement is the creation of a Digital Analysis Model for Distinction between Insider and Outsider Attacks (DAMDIOA), a model that results in an improvement in the analysis investigation process, as well as the process of decision. 
This improvement is effected by two types of proposed decision: fixed and tailored. The first is based on a predetermined logical condition, the second on the proportion of suspicious activity. The advantage of the latter is that an organisation can adjust its threshold of tolerance for such activity based on its level of concern for the type of attack involved. This research supports the possibility of distinguishing between insider and outsider attacks by running a network simulation which carried out a number of email attack experiments to test DAMDIOA. It found that, when DAMDIOA used predetermined decisions based on legitimate activities, it was able to differentiate the type of attack in seven of the eight experiments conducted. It was the tailored decisions with threshold levels Th=0.2 and 0.3 that conferred the ability to make such distinctions. When the researcher compared legitimate activities, including users' job responsibilities, with the current methods of distinguishing between insider and outsider attacks, the criterion of authorised access failed three times to make those distinctions. This method of distinction is useless when there is a blank or shared password. The researcher also discovered that both the location from which an attack was launched and attacks from areas within an organisation's control failed five times to differentiate between such attacks. There are no substantive differences between these methods. The single instance in which the proposed method failed to make these distinctions was because the number of legitimate activities equalled the number of suspicious ones. DAMDIOA has been used by two organisations for dealing with the misuse of their computers, in both cases located in open areas and weakly protected by easily guessed passwords. IT policy was breached and two accounts were moved from the restricted to the unlimited Internet policy group. 
This model was able to identify the insiders concerned by reviewing recorded activities and linking them with the insiders’ job responsibilities. This model also highlights users’ job responsibilities as a valuable source of forensic evidence that may be used to distinguish between insider and outsider attacks. DAMDIOA may help corporate security investigators identify suspects accurately and avoid incurring financial loss for their organisations. This research also recommends many improvements to the process by which user activities are collected before the attack takes place, thereby enabling distinctions to be better drawn. It also proposes the creation of a physical and logical log management system, a centralised database for all employee activities that will reduce organisations’ financial expenditures. Suggestions are also proposed for future research to classify legitimate and suspicious activities, evaluate them, identify the important ones and standardise the process of identifying and collecting users’ job responsibilities. This work will remove some of the limitations of the proposed model.
23

Nikooazm, Elina. "Validation des logiciels d'expertise judiciaire de preuves informatiques." Thesis, Paris 2, 2015. http://www.theses.fr/2015PA020021.

Full text
Abstract:
In criminal cases, judges confronted with technical questions in computer technology designate expert witnesses who put their expertise at the service of justice. Duly appointed by the courts, their mission is to assist the judge by providing evidence relevant to the investigation. They search the suspect's seized digital devices for elements relating to the alleged offence, while preserving the integrity of the data and avoiding any alteration of the original media. The evidence thus collected is analyzed by the expert, who documents their findings to the judge in an expert report. Technical investigations are conducted using highly sophisticated tools that reveal existing, deleted, hidden or encrypted data on the digital media examined. This requires perfect mastery of the equipment deployed and a clear identification of the discipline's best practices. This research project aims to highlight the technical challenges that experts face, the complexity of the tools used in technical investigations, and the importance of validation testing, which reveals the capabilities and limitations of each tool.
24

Mankantshu, Mninawe Albert. "Investigating the factors that influence digital forensic readiness in a South African organisation." Master's thesis, University of Cape Town, 2014. http://hdl.handle.net/11427/8504.

Full text
Abstract:
Includes bibliographical references.
Computer crimes affect the bottom line of organisations across the globe. The ability of criminals to exploit organisational systems and avoid prosecution is a concern for most organisations. This is due to the increased use of information and communication technology (ICT) by individuals and organisations. The rapid growth of ICT has affected our communication and information exchange. These advances have not only influenced the way we conduct our daily activities, but have also led to new opportunities, risks and challenges for technical and legal structures. Unfortunately, some individuals and groups have decided to use these ICT advances to engage in criminal activities, such as cybercrime. The increase in cyber-related crimes puts a lot of pressure on law enforcement agencies and organisations across the globe to produce credible digital forensic evidence.
25

AlMarri, Saeed. "A structured approach to malware detection and analysis in digital forensics investigation." Thesis, University of Bedfordshire, 2017. http://hdl.handle.net/10547/622529.

Full text
Abstract:
Within the World Wide Web (WWW), malware is considered one of the most serious threats to system security, with malware and spam causing complex system issues. Networks and systems can be accessed and compromised by various types of malware, such as viruses, worms, Trojans, botnets and rootkits, which compromise systems through coordinated attacks. Malware often uses anti-forensic techniques to avoid detection and investigation. Moreover, the results of investigating such attacks are often ineffective and can create barriers to obtaining clear evidence, due to the lack of sufficient tools and the immaturity of forensics methodology. This research addressed various complexities faced by investigators in the detection and analysis of malware. In this thesis, the author identified the need for a new approach towards malware detection that focuses on a robust framework, and proposed a solution based on an extensive literature review and market research analysis. The literature review focussed on the different trials and techniques in malware detection to identify the parameters for developing a solution design, while market research was carried out to understand the precise nature of the current problem. The author termed the new approaches and the development of the new framework the triple-tier centralised online real-time environment (tri-CORE) malware analysis (TCMA). The tiers come from three distinctive phases of detection and analysis, where the entire research pattern is divided into three different domains: the malware acquisition function; detection and analysis; and the database operational function. This framework design will contribute to the field of computer forensics by making the investigative process more effective and efficient. By integrating a hybrid method for malware detection, the limitations associated with both static and dynamic methods are eliminated. 
This helps forensics experts carry out quick investigatory processes to detect the behaviour of the malware and its related elements. The proposed framework will help to ensure system confidentiality, integrity, availability and accountability. The current research also focussed on a prototype (artefact) that was developed in favour of a different approach to digital forensics and malware detection methods. As such, a new Toolkit was designed and implemented, which is based on a simple architectural structure and built from open source software, and which can help investigators develop the skills to respond critically to current cyber incidents and analyses.
26

Hewling, Moniphia Orlease. "Digital forensics : an integrated approach for the investigation of cyber/computer related crimes." Thesis, University of Bedfordshire, 2013. http://hdl.handle.net/10547/326231.

Full text
Abstract:
Digital forensics has become a predominant field in recent times, and courts have had to deal with an influx of related cases over the past decade. As computer/cyber-related criminal attacks become more predominant in today's technologically driven society, the need for, and use of, digital evidence in courts has increased. There is an urgent need to hold perpetrators of such crimes accountable and to prosecute them successfully. The process used to acquire this digital evidence (to be used in cases in courts) is digital forensics. The procedures currently used in the digital forensic process were developed focusing on particular areas of the digital evidence acquisition process. This has resulted in very little regard being paid to the core components of the digital forensics field, for example the legal and ethical aspects, along with other integral aspects of investigations as a whole. These core facets are important for a number of reasons, including the fact that other forensic sciences have included them, and that to survive as a true forensics discipline digital forensics must ensure that they are accounted for. This is because digital forensics, like other forensics disciplines, must ensure that the evidence (digital evidence) produced from the process is able to withstand the rigors of a courtroom. Digital forensics is a new and developing field, still in its infancy when compared to traditional forensics fields such as botany or anthropology. Over the years, development in the field has been tool-centered, driven by commercial developers of the tools used in the digital investigative process. This, along with having no set standards to guide digital forensics practitioners operating in the field, has led to issues regarding the reliability, verifiability and consistency of digital evidence when presented in court cases. 
Additionally, some developers have neglected the fact that the mere mention of the word forensics suggests courts of law, and thus that legal practitioners will be intimately involved. Such omissions have resulted in the digital evidence acquired for use in various investigations facing major challenges when presented in a number of cases. Mitigation of such issues is possible with the development of a standard set of methodologies flexible enough to accommodate the intricacies of all fields to be considered when dealing with digital evidence. This thesis addresses issues regarding digital forensics frameworks, methods, methodologies and standards for acquiring digital evidence, using the grounded theory approach. Data was gathered using literature surveys, questionnaires and interviews conducted electronically. Collecting data electronically proved useful given the need to collect data from different jurisdictions worldwide. Initial surveys indicated that there were no existing standards in place and that the terms models/frameworks and methodologies were used interchangeably to refer to methodologies. A framework and methodology have been developed to address the identified issues, and they represent the major contribution of this research. The dissertation outlines solutions to the identified issues and presents the 2IR Framework of standards, which governs the 2IR Methodology supported by a mobile application and a curriculum of studies. These designs were developed using an integrated approach incorporating all four core facets of the digital forensics field. This research lays the foundation for a single integrated approach to digital forensics and can be further developed to ensure the robustness of the processes and procedures used by digital forensics practitioners worldwide.
APA, Harvard, Vancouver, ISO, and other styles
27

Waltereit, Marian [Verfasser], and Torben [Akademischer Betreuer] Weis. "On the Digital Forensic Investigation of Hit-And-Run Accidents / Marian Waltereit ; Betreuer: Torben Weis." Duisburg, 2021. http://d-nb.info/1236539818/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Nordin, Anton, and Felix Liffner. "Forensiska Artefakter hos Mobila Applikationer : Utvinning och Analys av Applikationen Snapchat." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-40207.

Full text
Abstract:
Today's smartphones and tablets use different applications and software for all sorts of purposes: communication, entertainment, fitness, sharing images with each other, keeping up to date with the news and lots of different daily tasks. With the heavy usage of all these apps, it is no wonder that issues arise: private data is stored in large quantities both on the local device and on the app creators' servers. It is also no wonder that applications advertising user secrecy and transient storage of user data have become popular. One of these applications is Snapchat, with over 500 million downloads on the Google Play store at the time of writing. Snapchat is a communication application with the niche feature that the images and messages sent disappear once opened or after 24 hours have passed. With the illusion of privacy behind Snapchat's niche, it has become a breeding ground for criminal activity, and the niche itself translates into a troublesome hurdle for law enforcement trying to retrieve evidence from the devices of Snapchat users. This paper investigates these issues and applies a methodology to retrieve potential evidence from a device on which Snapchat was used to send images and messages, by performing a physical acquisition on a test device and analyzing it to find artifacts pertaining to Snapchat and the test data that was created. The method was performed on a Samsung Galaxy S4 with Android 5.0.1 running Snapchat version 10.52.3.0. Test data, such as different images and messages, was created, and retrieval was attempted at three points in time: first, right after data creation; second, after a restart and 24 hours after the data was created; and third, after 48 hours had passed, with the Snapchat user logged out at the time of acquisition. The acquisition resulted in the extraction of several sent images and a full text conversation between the experimental device and another party.
A full video uploaded by the receiving user could be extracted even though the experimental device never actually viewed it. The second acquisition, made after 24 hours had passed, gave the same results as the first, meaning that the passage of up to a day after the initial creation of the data had no effect on the evidence. However, once the Snapchat user was logged out of the application, the data was unobtainable and had disappeared. Presumably Snapchat has a function which deletes personal data about the user upon logout. This function may become a hurdle in law enforcement investigations in which the application Snapchat is involved.
APA, Harvard, Vancouver, ISO, and other styles
29

Federici, Corrado <1965&gt. "The twofold role of Cloud Computing in Digital Forensics: target of investigations and helping hand to evidence analysis." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amsdottorato.unibo.it/6694/.

Full text
Abstract:
This PhD thesis discusses the impact of Cloud Computing infrastructures on Digital Forensics in the twofold role of target of investigations and helping hand to investigators. The Cloud offers cheap and almost limitless computing power and storage space, which can be leveraged to commit either new or old crimes and to host related traces. Conversely, the Cloud can help forensic examiners find clues better and earlier than traditional analysis applications can, thanks to its dramatically improved evidence-processing capabilities. In both cases, a new arsenal of software tools needs to be made available. The development of this novel weaponry, and its technical and legal implications from the point of view of the repeatability of technical assessments, is discussed throughout the following pages and constitutes the novel contribution of this work.
APA, Harvard, Vancouver, ISO, and other styles
30

Jones, Eric Douglas. "Forensic Investigation of Stamped Markings Using a Large-Chamber Scanning Electron Microscope and Computer Analysis for Depth Determination." TopSCHOLAR®, 2013. http://digitalcommons.wku.edu/theses/1237.

Full text
Abstract:
All firearms within the United States are required by the Gun Control Act to be physically marked with a serial number that is at least 0.003” in depth and 1/16” in height. The purpose of a serial number is to make each firearm uniquely identifiable and traceable. Intentional removal of a serial number is a criminal offense, used to hide the identity and movements of the criminal parties involved. The current standard for firearm serial number restoration is chemical etching, which is time- and labor-intensive as well as destructive to the physical evidence (the firearm). It is hypothesized that a new technique that is accurate, precise, and time-efficient will greatly aid law enforcement agencies in pursuing criminals. This thesis focuses on using a large-chamber scanning electron microscope to take secondary electron (SE) images of a stamped metal plate and analyzing them with the MIRA MX 7 UE image processing software for depth determination. An experimental peak luminance value of 77 (in pixel values) was correlated to the known depth (273 μm) at the bottom of the sample character. The results show that it is potentially possible to determine an unknown depth from an SEM image using luminance values obtained in the MIRA analysis.
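The luminance-to-depth correlation can be illustrated with a minimal linear calibration sketch. Only one calibration pair is reported in the abstract (luminance 77 at a depth of 273 μm), so the undeformed-surface luminance used below as the second reference point is a placeholder assumption, not a value from the thesis.

```python
def depth_from_luminance(luminance, surface_luminance=200.0,
                         cal_luminance=77.0, cal_depth_um=273.0):
    """Linearly interpolate stamp depth from SE-image pixel luminance.

    Assumes depth varies linearly between the undeformed surface
    (surface_luminance, depth 0) and a calibration point measured at
    the bottom of a stamped character (cal_luminance, cal_depth_um).
    The surface_luminance default is an illustrative placeholder.
    """
    slope = cal_depth_um / (cal_luminance - surface_luminance)
    return slope * (luminance - surface_luminance)
```

With a real calibration series, the two hard-coded reference points would be replaced by a least-squares fit over many known depths.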
APA, Harvard, Vancouver, ISO, and other styles
31

Homem, Irvin. "LEIA: The Live Evidence Information Aggregator : A Scalable Distributed Hypervisor‐based Peer‐2‐Peer Aggregator of Information for Cyber‐Law Enforcement I." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-177902.

Full text
Abstract:
The Internet in its most basic form is a complex information-sharing organism. There are billions of interconnected elements with varying capabilities that work together to support numerous activities (services) through this information sharing. In recent times, these elements have become portable, mobile, highly computationally capable and more than ever intertwined with human controllers and their activities. They are also rapidly being embedded into other everyday objects, sharing more and more information in order to facilitate automation and signaling that the rise of the Internet of Things is imminent. In every human society there are always miscreants who prefer to act against the common good and engage in illicit activity. It is no different within the society interconnected by the Internet (the Internet Society). Law enforcement in every society attempts to curb the perpetrators of such activities. However, this is immensely difficult when the Internet is the playing field: the amount of information that investigators must sift through is incredibly massive, and the prosecution timelines stated by law are prohibitively narrow. The main solution to this Big Data problem is seen to be the automation of the digital investigation process. This encompasses the entire process: from the detection of malevolent activity, through the seizure/collection of evidence and the analysis of the evidentiary data collected, to the presentation of valid postulates. This paper focuses mainly on the automation of the evidence capture process in an Internet of Things environment. However, in order to achieve this comprehensively, the subsequent and consequent procedures of detecting malevolent activity and analyzing the evidentiary data collected, respectively, are also touched upon. To this effect we propose the Live Evidence Information Aggregator (LEIA) architecture, which aims to be a comprehensive automated digital investigation tool.
LEIA is in essence a collaborative framework that hinges upon interactivity and the sharing of resources and information among participating devices in order to achieve the necessary efficiency in data collection in the event of a security incident. It makes use of a variety of technologies to achieve its goals: crowdsourcing among devices for more accurate detection of malicious events; hypervisors with built-in intrusion detection capabilities to facilitate efficient data capture; peer-to-peer networks to facilitate the rapid transfer of evidentiary data to a centralized data store; cloud storage to hold massive amounts of data; and the Resource Description Framework from Semantic Web technologies to facilitate the interoperability of data storage formats among the heterogeneous devices. Within the description of the LEIA architecture, a peer-to-peer protocol based on the BitTorrent protocol is proposed, corresponding data storage and transfer formats are developed, and network security protocols are also taken into consideration. In order to demonstrate the LEIA architecture developed in this study, a small-scale prototype with limited capabilities has been built and tested. The prototype functionality focuses only on the secure, remote acquisition of the hard disk of an embedded Linux device over the Internet and its subsequent storage on a cloud infrastructure. The successful implementation of this prototype goes to show that the architecture is feasible and that automating the evidence seizure process makes the otherwise arduous process easy and quick to perform.
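The BitTorrent-style transfer of evidentiary data relies on piece-wise hashing, so that peers can verify every chunk of a disk image independently of the rest. The sketch below follows common BitTorrent practice (fixed-size pieces, SHA-1 per piece); the piece size and function names are assumptions, not the LEIA specification.

```python
import hashlib

PIECE_SIZE = 256 * 1024  # 256 KiB pieces, a typical BitTorrent piece size

def piece_hashes(data, piece_size=PIECE_SIZE):
    """Split evidence data into fixed-size pieces and hash each one,
    so that receiving peers can verify every piece independently."""
    return [hashlib.sha1(data[i:i + piece_size]).hexdigest()
            for i in range(0, len(data), piece_size)]

def verify_piece(piece_index, piece_data, hashes):
    """Check a received piece against the published hash list."""
    return hashlib.sha1(piece_data).hexdigest() == hashes[piece_index]
```

Because each piece verifies on its own, a corrupted or tampered chunk can be re-requested from another peer without discarding the rest of the transfer, which is what makes this scheme attractive for large evidence images.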
APA, Harvard, Vancouver, ISO, and other styles
32

Breitinger, Frank [Verfasser], Stefan [Akademischer Betreuer] Katzenbeisser, Harald [Akademischer Betreuer] Baier, and Felix [Akademischer Betreuer] Freiling. "On the utility of bytewise approximate matching in computer science with a special focus on digital forensics investigations / Frank Breitinger. Betreuer: Stefan Katzenbeisser ; Harald Baier ; Felix Freiling." Darmstadt : Universitäts- und Landesbibliothek Darmstadt, 2014. http://d-nb.info/1110901852/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Bouchaud, François. "Analyse forensique des écosystèmes intelligents communicants de l'internet des objets." Thesis, Lille, 2021. http://www.theses.fr/2021LILUI014.

Full text
Abstract:
With the development of the Internet of Things, searching for data in a digital environment is an increasingly difficult task for the forensic investigator. It is a real challenge, especially given the heterogeneity of the connected objects: there is a lack of standardization in communication architectures and data management policies, accompanied by dependencies between connected ecosystems, notably through hidden links and fragmented information. In this thesis, we propose adapting the traditional approach of digital investigation to the constraints of the Internet of Things, and we develop methodologies and tools for judicial practitioners to apprehend and analyze the connected environment. We assume that the crime scene is a connected whole and not an aggregate of independent digital objects; it contains key data for understanding and contextualizing a past event or phenomenon, evidence for the criminal trial. Digital forensics is considered to be the "application of science to the identification, collection, examination, and analysis of data while preserving the integrity of the information and maintaining a strict chain of custody for the data" (National Institute of Standards and Technology). Faced with a crime scene, the investigator seeks to understand the criminal event by examining the traces stored in the physical medium and/or in a remote part of the cloud. Our work develops a process of rapid identification of the phenomenon according to four phases: detection, localization, object recognition and information cross-checking. It is enriched with radio-signature search tools: a single sensor and a multi-sensor mesh network.
This approach is built around the problem of apprehending a multiform connected environment containing devices that are not always visible or identifiable during a field approach. We integrate into our study the strategy of equipment collection. The challenge lies in the ability to extract one or more connected objects, without compromising the stored data, and to place them in a controlled and secure environment, with each object maintained in a state that guarantees no alteration or loss of data. The study includes a first phase of understanding the physical environment and its dependencies; it seeks to determine the mechanisms by which information migrates to online platforms and to isolate groups of objects by intelligently breaking the connections. Devices are extracted, then packaged and sealed according to their technical characteristics and the connected infrastructure. We then deepen the exploitation of the information collected using forensic methods, analyzing the data along temporal, spatial and contextual axes. We also propose a classification and prioritization of the connected structure according to the characteristics of the desired data. The work gives a reading of the life cycle of the data within the Internet of Things infrastructure. In a prospective approach, we deepen the question of the fine-grained identification of a connected object according to its hardware and software characteristics. The acoustic emission of electronics appears to be a relevant physical property in the study of such equipment; this attribute completes our range of tools for identifying connected objects.
APA, Harvard, Vancouver, ISO, and other styles
34

Lalanne, Vincent. "Gestion des risques appliquée aux systèmes d’information distribués." Thesis, Pau, 2013. http://www.theses.fr/2013PAUU3052/document.

Full text
Abstract:
In this thesis we address risk management applied to distributed information systems. We deal with problems of interoperability and of securing exchanges within DRM systems, and we propose implementing such a system for the enterprise: it must allow us to distribute self-protecting content. We then present our participation in the creation of an innovative company which emphasizes information security, in particular risk management through the ISO/IEC 27005:2011 standard. We present the risks related to the use of services, highlighting in particular those which are not technological: we address the risks inherent in the cloud (failure of a provider, etc.) but also the more insidious aspects of espionage and intrusion into personal data (the PRISM affair of June 2013). In the last section, we present a concept of Enterprise DRM which uses metadata to deploy contexts in usage control models. We propose a draft formalization of the metadata necessary for the implementation of a security policy, and we guarantee compliance with the applicable regulations and legislation.
APA, Harvard, Vancouver, ISO, and other styles
35

Arthur, Kweku Kwakye. "Considerations towards the development of a forensic evidence management system." Diss., 2010. http://hdl.handle.net/2263/26567.

Full text
Abstract:
The decentralized nature of the Internet forms its very foundation, yet it is this very nature that has opened networks and individual machines to a host of threats and attacks from malicious agents. Consequently, forensic specialists, tasked with the investigation of crimes commissioned through the use of computer systems, where evidence is digital in nature, are often unable to reach convincing conclusions pertaining to their investigations. Some of the challenges within reliable forensic investigations include the lack of a global view of the investigation landscape and the complexity and obfuscated nature of the digital world. A perpetual challenge within the evidence analysis process is the reliability and integrity associated with digital evidence, particularly from disparate sources. Given the ease with which digital evidence (such as metadata) can be created, altered, or destroyed, the integrity attributed to digital evidence is of paramount importance. This dissertation focuses on the challenges relating to the integrity of digital evidence within reliable forensic investigations. These challenges are addressed through the proposal of a model for the construction of a Forensic Evidence Management System (FEMS) to preserve the integrity of digital evidence within forensic investigations. The Biba Integrity Model is utilized to maintain the integrity of digital evidence within the FEMS. Casey's Certainty Scale is then employed as the integrity classification scheme for assigning integrity labels to digital evidence within the system. The FEMS model consists of a client layer, a logic layer and a data layer, with eight system components distributed amongst these layers. In addition to describing the FEMS system components, a finite state automaton is utilized to describe the system component interactions.
In so doing, we reason about the FEMS's behaviour and demonstrate how rules within the FEMS can be developed to recognize and profile various cyber crimes. Furthermore, we design fundamental algorithms for the processing of information by the FEMS's core system components; this provides further insight into the system component interdependencies and the input and output parameters for the system transitions and decision points influencing the value of inferences derived within the FEMS. Lastly, the completeness of the FEMS is assessed by comparing the constructs and operation of the FEMS against the published work of Brian D. Carrier. This approach provides a mechanism for critically analyzing the FEMS model, to identify similarities or impactful considerations within the solution approach, and more importantly, to identify shortcomings within the model. Ultimately, the greatest value of the FEMS is in its ability to serve as a decision support or enhancement system for digital forensic investigators.
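The Biba-based integrity control described above can be sketched with Biba's strict-integrity rules, "no read down" and "no write up", applied to evidence objects carrying integrity labels. Using Casey's Certainty Scale labels (C0 through C6) as the ordered levels is an illustrative assumption about how the FEMS might map the two.

```python
# Ordered integrity levels, low to high. Mapping Casey's Certainty
# Scale labels (C0..C6) onto Biba levels is an illustrative assumption.
LEVELS = {"C0": 0, "C1": 1, "C2": 2, "C3": 3, "C4": 4, "C5": 5, "C6": 6}

def can_read(subject_level, object_level):
    """Biba 'no read down': a subject may only read objects at or
    above its own integrity level, so low-integrity data cannot
    contaminate a high-integrity subject's conclusions."""
    return LEVELS[object_level] >= LEVELS[subject_level]

def can_write(subject_level, object_level):
    """Biba 'no write up': a subject may only write objects at or
    below its own integrity level, so a low-integrity subject cannot
    corrupt high-integrity evidence."""
    return LEVELS[subject_level] >= LEVELS[object_level]
```

In an evidence-management setting, these two checks are what stop a low-certainty inference from silently overwriting or tainting evidence labeled with high certainty.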
Dissertation (MSc)--University of Pretoria, 2010.
Computer Science
unrestricted
APA, Harvard, Vancouver, ISO, and other styles
36

Alharbi, Soltan Abed. "Proactive System for Digital Forensic Investigation." Thesis, 2014. http://hdl.handle.net/1828/5237.

Full text
Abstract:
Digital Forensics (DF) is defined as the ensemble of methods, tools and techniques used to collect, preserve and analyse digital data originating from any type of digital media involved in an incident with the purpose of extracting valid evidence for a court of law. DF investigations are usually performed as a response to a digital crime and, as such, they are termed Reactive Digital Forensic (RDF). An RDF investigation takes the traditional (or post-mortem) approach of investigating digital crimes after incidents have occurred. This involves identifying, preserving, collecting, analyzing, and generating the final report. Although RDF investigations are effective, they are faced with many challenges, especially when dealing with anti-forensic incidents, volatile data and event reconstruction. To tackle these challenges, Proactive Digital Forensic (PDF) is required. By being proactive, DF is prepared for incidents. In fact, the PDF investigation has the ability to proactively collect data, preserve it, detect suspicious events, analyze evidence and report an incident as it occurs. This dissertation focuses on the detection and analysis phase of the proactive investigation system, as it is the most expensive phase of the system. In addition, theories behind such systems will be discussed. Finally, implementation of the whole proactive system will be tested on a botnet use case (Zeus).
Graduate
APA, Harvard, Vancouver, ISO, and other styles
37

Grobler, Cornelia Petronella. "DFMF : a digital forensic management framework." Thesis, 2012. http://hdl.handle.net/10210/6365.

Full text
Abstract:
D.Phil.(Computer Science)
We are living in an increasingly complex world in which much of society is dependent on technology and its various offshoots and incarnations (Rogers & Siegfried, 2004). There is ample evidence of the influence of technology on our daily lives. We communicate via e-mail, use chat groups to interact and conduct business by using e-commerce. People relate each other’s existence to a presence on Facebook. The convergence of the products, systems and services of information technology is changing the way of living. The latest smart and cell phones have cameras, applications, and access to social networking sites. These phones contain sensitive information, for example photographs, e-mail, spread sheets, documents, and presentations. The loss of a cell phone therefore may pose a serious problem to an individual or an organisation, when considering privacy and intellectual property issues from an information security (Info Sec) perspective (Pieterse, 2006). Organisations have accepted the protection of information and information assets as a fundamental business requirement and managers are therefore implementing an increasing number of security counter measures, such as security policies, intrusion detection systems, access control mechanisms, and anti-virus products to protect the information and information assets from potential threats. However, incidents still occur, as no system is 100% secure. The incidents must be investigated to determine their root cause and potentially to prosecute the perpetrators (Louwrens, von Solms, Reeckie & Grobler, 2006b). Humankind has long been interested in the connection between cause and event, wishing to know what happened, what went wrong and why it happened. The need for computer forensics emerged when an increasing number of crimes were committed with the use of computers and the evidence required was stored on the computer. 
In 1984, a Federal Bureau of Investigation (FBI) laboratory began to examine computer evidence (Barayumureeba & Tushabe, 2004), and in 1991 the International Association of Computer Investigative Specialists (IACIS) in Portland, Oregon, coined the term 'computer forensics' during a training session.
APA, Harvard, Vancouver, ISO, and other styles
38

Adedayo, Oluwasola Mary. "Reconstruction in Database Forensics." Thesis, 2015. http://hdl.handle.net/2263/43777.

Full text
Abstract:
The increasing usage of databases for the storage of critical and sensitive information in many organizations has led to an increase in the rate at which databases are exploited in computer crimes. Databases are often manipulated to facilitate crimes and as such are usually of interest during many investigations, as useful information relevant to the investigation can be found therein. The branch of digital forensics that deals with the identification, preservation, analysis and presentation of digital evidence from databases is known as database forensics. Despite the large amount of information that can be retrieved from databases and the amount of research that has been done on various aspects of databases, database security and digital forensics in general, very little has been done on database forensics. Databases were also excluded from traditional digital investigations until very recently. This can be attributed to the inherent complexities of databases and the lack of knowledge on how the information contained in a database can be retrieved, especially in cases where such information has been modified or existed only in the past. This thesis addresses one major part of the challenges in database forensics: the reconstruction of the information stored in the database at some earlier time. The dimensions involved in a database forensics analysis problem are identified, and the thesis focuses on one of these dimensions. Concepts such as the relational algebra log and the inverse relational algebra are introduced as tools in the definition of a theoretical framework that can be used for database forensics. The thesis provides an algorithm for database reconstruction and outlines the correctness proof of the algorithm. Various techniques for a complete regeneration of deleted or lost data during a database forensics analysis are also described.
Due to the importance of having adequate logs in order to use the algorithm, specifications of an ideal log configuration for an effective reconstruction process are given, putting into consideration the various dimensions of the database forensics problem space. Throughout the thesis, practical situations that illustrate the application of the algorithms and techniques described are given. The thesis provides a scientific approach that can be used for handling database forensics analysis practice and research, particularly in the aspect of reconstructing the data in a database. It also adds to the field of digital forensics by providing insights into the field of database forensics reconstruction.
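The reconstruction idea, replaying a relational algebra log backwards through inverse operations to recover an earlier database state, can be sketched on a toy table. The log representation below is an assumption for illustration, not the thesis's formalism; it simply records enough (old values included) for each operation to be invertible.

```python
def invert(op):
    """Return the inverse of a logged operation. Deletes and updates
    are invertible only because the log records the old values."""
    kind = op["op"]
    if kind == "insert":
        return {"op": "delete", "row": op["row"]}
    if kind == "delete":
        return {"op": "insert", "row": op["row"]}
    if kind == "update":
        return {"op": "update", "row": op["new"], "new": op["row"]}
    raise ValueError(kind)

def apply_op(table, op):
    """Apply one logged operation to a table (a list of row tuples)."""
    if op["op"] == "insert":
        return table + [op["row"]]
    if op["op"] == "delete":
        t = table[:]
        t.remove(op["row"])
        return t
    # update: replace the old row with the new row
    t = table[:]
    t[t.index(op["row"])] = op["new"]
    return t

def reconstruct(current_table, log):
    """Undo the logged operations, newest first, to recover the table
    as it was before the log began."""
    table = current_table
    for op in reversed(log):
        table = apply_op(table, invert(op))
    return table
```

The key point the sketch makes is that reconstruction is only as good as the log: an update logged without its old value has no inverse, which is why the thesis specifies an ideal log configuration.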
Thesis (PhD)--University of Pretoria, 2015.
Computer Science
PhD
Unrestricted
APA, Harvard, Vancouver, ISO, and other styles
39

Fei, B. K. L. (Bennie Kar Leung). "Data visualisation in digital forensics." Diss., 2007. http://hdl.handle.net/2263/22996.

Full text
Abstract:
As digital crimes have risen, so has the need for digital forensics. Numerous state-of-the-art tools have been developed to assist digital investigators conduct proper investigations into digital crimes. However, digital investigations are becoming increasingly complex and time consuming due to the amount of data involved, and digital investigators can find themselves unable to conduct them in an appropriately efficient and effective manner. This situation has prompted the need for new tools capable of handling such large, complex investigations. Data mining is one such potential tool. It is still relatively unexplored from a digital forensics perspective, but the purpose of data mining is to discover new knowledge from data where the dimensionality, complexity or volume of data is prohibitively large for manual analysis. This study assesses the self-organising map (SOM), a neural network model and data mining technique that could potentially offer tremendous benefits to digital forensics. The focus of this study is to demonstrate how the SOM can help digital investigators to make better decisions and conduct the forensic analysis process more efficiently and effectively during a digital investigation. The SOM’s visualisation capabilities can not only be used to reveal interesting patterns, but can also serve as a platform for further, interactive analysis.
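A SOM of the kind assessed here can be sketched in a few lines: each input record pulls its best-matching unit, and early in training the unit's neighbours on the map, towards it, so similar records end up mapped to nearby units. This is a generic one-dimensional SOM sketch under simple decay schedules, not the study's implementation.

```python
import math
import random

def train_som(data, n_units=4, epochs=50, lr=0.5, seed=1):
    """Train a tiny one-dimensional SOM. Each sample updates its
    best-matching unit (BMU) and, with decaying influence, the BMU's
    neighbours on the map, so similar inputs cluster together."""
    rng = random.Random(seed)
    dim = len(data[0])
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        # neighbourhood radius and learning rate both shrink over time
        radius = (n_units / 2.0) * (1.0 - epoch / epochs) + 0.05
        alpha = lr * (1.0 - epoch / epochs) + 0.01
        for x in data:
            bmu = min(range(n_units),
                      key=lambda i: sum((u - v) ** 2
                                        for u, v in zip(units[i], x)))
            for i in range(n_units):
                influence = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                units[i] = [u + alpha * influence * (v - u)
                            for u, v in zip(units[i], x)]
    return units

def best_unit(units, x):
    """Map an input to its best-matching unit on the trained SOM."""
    return min(range(len(units)),
               key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))
```

In a forensic setting, the inputs would be feature vectors derived from evidence records, and the trained map would be visualised so that unusual records stand out as outlying units.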
Dissertation (MSc (Computer Science))--University of Pretoria, 2007.
Computer Science
unrestricted
APA, Harvard, Vancouver, ISO, and other styles
40

Delgado, Manuel. "Combater o crime económico com armas digitais: o papel do open-source." Master's thesis, 2012. http://hdl.handle.net/10071/8063.

Full text
Abstract:
Alongside the extraordinary economic development brought about by the evolution of Information and Communication Technologies, and boosted in recent years by the growth of the internet, an anarchic and obscure dimension has consolidated itself: under the cover of anonymity, and exploiting the same technology, it conveys a myriad of behaviours of a criminal nature that tend to subvert the basic principles of living in society, some of which contributed in large measure to the scale of the present economic crisis. Many of the difficulties that justice systems in general face in mastering this new type of crime are rooted in the technological gap between the means available to them and those used by organised crime. Taking the phenomenon of economic crime as its reference, this dissertation presents forensic computing as an indispensable tool for combating that scourge. The literature review highlights concrete, recurrent scenarios in the investigation of economic crime; these are subsequently processed using Open Source tools, and the results obtained are compared with those of identical processing carried out with a commercial reference tool. The aim is thus to demonstrate that tools are available which enable a qualitative leap in investigation processes, without jeopardising the budgetary balance that the current economic situation demands of justice departments.
APA, Harvard, Vancouver, ISO, and other styles
41

Valjarevic, Aleksandar. "A comprehensive and harmonised digital forensic investigation process model." Thesis, 2015. http://hdl.handle.net/2263/50812.

Full text
Abstract:
Recent decades have seen a significant increase in the importance of the field of digital forensics as a result of the rapid development of information and communication technologies and their penetration into every corner of our lives and society. Furthermore, information security incidents are not only becoming more versatile every year, but are also growing in number, thus emphasising the importance of digital forensic investigations. Performing a digital forensic investigation requires a standardised and formalised process in order to ensure the admissibility of digital evidence, as well as the effectiveness and efficiency of investigations and collaboration between stakeholders. When this thesis was being prepared, there existed neither an international standard for formalising the overarching digital forensic investigation process, nor a process model that was accepted as a harmonised model across different jurisdictions worldwide. The author studied existing state-of-the-art digital forensic investigation process (DFIP) models and concluded that there are significant disparities between them, pertaining to the number of processes, the scope, the hierarchical levels and concepts applied (for example, some of the models are based on the physical crime investigation processes, whereas others focus only on the digital aspects of the investigation process). This thesis proposes a comprehensive DFIP model that harmonises existing models for the purpose of establishing an international standard. An effort was made to incorporate all relevant types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness, while introducing a number of novelties. The author introduces a novel class of processes called concurrent processes. 
This is a novel contribution that should, together with the rest of the model, enable more efficient and effective digital forensic investigations, while ensuring the admissibility of digital evidence. The author also proposes a prototype that would guide the user through the implementation of a standardised and harmonised DFIP, and ultimately validate the use of a proper digital forensic investigation process. Both the proposed model and the prototype were tested and evaluated, and the results of these evaluations are presented in the thesis. The proposed model and the prototype contribute significantly to the field of digital forensics. The author believes its application would render benefits that range from the higher admissibility of digital evidence and more effective investigations to easier cross-border collaboration on international investigations, thus fulfilling the initial reasons for creating a harmonised model. The proposed model is intended to be used for different types of digital forensic investigation and should ultimately culminate in an international standard. In fact, while this thesis was being written, an international standard on the digital forensic investigation process model, as developed by the author, was published as a result of the research reported on in this thesis.
Thesis (PhD)--University of Pretoria, 2015.
tm2015
Computer Science
PhD
Unrestricted
APA, Harvard, Vancouver, ISO, and other styles
42

Pooe, El Antonio. "Developing a multidisciplinary digital forensic readiness model for evidentiary data handling." Thesis, 2018. http://hdl.handle.net/10500/25316.

Full text
Abstract:
There is a growing global recognition as to the importance of outlawing malicious computer related acts in a timely manner, yet few organisations have the legal and technical resources necessary to address the complexities of adapting criminal statutes to cyberspace. Literature reviewed in this study suggests that a coordinated, public-private partnership to produce a model approach can help reduce potential dangers arising from the inadvertent creation of cybercrime havens. It is against this backdrop that the study seeks to develop a digital forensic readiness model (DFRM) using a coordinated, multidisciplinary approach, involving both the public and private sectors, thus enabling organisations to reduce potential dangers arising from the inadvertent destruction and negating of evidentiary data which, in turn, results in the non-prosecution of digital crimes. The thesis makes use of 10 hypotheses to address the five research objectives, which are aimed at investigating the problem statement. This study constitutes qualitative research and adopts the post-modernist approach. The study begins by investigating each of the 10 hypotheses, utilising a systematic literature review and interviews, followed by a triangulation of findings in order to identify and explore common themes and strengthen grounded theory results. The output from the latter process is used as a theoretical foundation towards the development of a DFRM model which is then validated and verified against actual case law. Findings show that a multidisciplinary approach to digital forensic readiness can aid in preserving the integrity of evidentiary data within an organisation. The study identifies three key domains and their critical components. The research then demonstrates how the interdependencies between the domains and their respective components can enable organisations to identify and manage vulnerabilities which may contribute to the inadvertent destruction and negating of evidentiary data. 
The Multidisciplinary Digital Forensic Readiness Model (M-DiFoRe) provides a proactive approach to creating and improving organizational digital forensic readiness. This study contributes to the greater body of knowledge in digital forensics in that it reduces complexities associated with achieving digital forensic readiness and streamlines the handling of digital evidence within an organisation.
Information Science
Ph.D. (Information Systems)
APA, Harvard, Vancouver, ISO, and other styles
43

張緹柔. "A Study on Investigation and Digital Forensic for Wireless Cyber Crime." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/k5896p.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Ndara, Vuyani. "Computer seizure as technique in forensic investigation." Diss., 2014. http://hdl.handle.net/10500/13277.

Full text
Abstract:
The problem encountered by the researcher was that the South African Police Service Cyber-Crimes Unit is experiencing problems in seizing computer evidence. The following problems were identified by the researcher in practice: evidence is destroyed or lost because of mishandling by investigators; computer evidence is often not obtained or recognised, due to a lack of knowledge and skills on the part of investigators to properly seize computer evidence; difficulties in establishing authenticity and initiating a chain of custody for the seized evidence; the training currently offered does not cover critical steps in the seizure of computer evidence; and computer seizure as a technique requires specialised knowledge and continuous training, because the information technology industry is an ever-changing area. An empirical research design, followed by a qualitative research approach, allowed the researcher to also obtain information from practice. A thorough literature study, complemented by interviews, was done to collect the required data for the research. Members of the South African Police Cyber-crime Unit and prosecutors dealing with cyber-crime cases were interviewed to obtain their input into, and experiences on, the topic. The aim of the study was to explore the role of computers in the forensic investigation process, and to determine how computers can be seized without compromising evidence. The study therefore also aimed at creating an understanding and awareness of the slippery nature of computer evidence, and of how it can find its way into a court of law without being compromised. The research has revealed that computer crime is different from common law or traditional crimes. It is complicated, and therefore only skilled and qualified forensic experts should be used to seize computer evidence, to ensure that the evidence is not compromised. Training of cyber-crime technicians has to be a priority in order for computer seizures to be successful.
Department of Criminology
M.Tech. (Forensic Investigation)
APA, Harvard, Vancouver, ISO, and other styles
45

Bernardo, Bruno Miguel Vital. "Toolbox application to support and enhance the mobile device forensics investigation process - breaking through the techniques available." Master's thesis, 2021. http://hdl.handle.net/10362/113177.

Full text
Abstract:
Dissertation presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Knowledge Management and Business Intelligence
One of the main topics discussed today is how a person can leverage technology in a positive and secure way to enhance their daily life, making it healthier, more productive, more joyful and easier. However, improvements in technology bring challenges for which there is not yet a stable and safe solution. One of the greatest challenges people face concerns their privacy and the safeguarding of the sensitive information stored on the devices they use. In fact, one of the most widely used technologies is the mobile device, which can take several forms and features, shapes, and many other components. At the same time, cybercrime is growing rapidly, targeting the exploitation and retrieval of information from these gadgets. Mobile devices also bring several challenges of their own, including a rapidly changing landscape, an ever-increasing diversity of mobile phone forms, and the integration of the information on a mobile device into the Cloud and the IoT. As such, it is vital to have a stable and safe toolbox that enables a digital investigator to prevent, detect and solve any issue related to Mobile Device Forensics while carrying out investigations, be they criminal, civil, corporate or of any other kind.
APA, Harvard, Vancouver, ISO, and other styles
46

Joshi, Abhishek Shriram. "Image Processing and Super Resolution Methods for a Linear 3D Range Image Scanning Device for Forensic Imaging." 2013. http://hdl.handle.net/1805/3414.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
In the last few decades, forensic science has played a significant role in bringing criminals to justice. Shoe and tire track impressions found at the crime scene are important pieces of evidence since the marks and cracks on them can be uniquely tied to a person or vehicle respectively. We have designed a device that can generate a highly accurate 3-Dimensional (3D) map of an impression without disturbing the evidence. The device uses lasers to detect the changes in depth and hence it is crucial to accurately detect the position of the laser. Typically, the forensic applications require very high resolution images in order to be useful in prosecutions of criminals. Limitations of the hardware technology have led to the use of signal and image processing methods to achieve high resolution images. Super Resolution is the process of generating higher resolution images from multiple low resolution images using knowledge about the motion and the properties of the imaging geometry. This thesis presents methods for developing some of the image processing components of the 3D impression scanning device. In particular, the thesis describes the following two components: (i) methods to detect the laser stripes projected onto the impression surface in order to calculate the deformations of the laser stripes due to 3D surface shape being scanned, and (ii) methods to improve the resolution of the digitized color image of the impression by utilizing multiple overlapping low resolution images captured during the scanning process and super resolution techniques.
APA, Harvard, Vancouver, ISO, and other styles
47

Breitinger, Frank. "On the utility of bytewise approximate matching in computer science with a special focus on digital forensics investigations." Phd thesis, 2014. https://tuprints.ulb.tu-darmstadt.de/4055/1/20140717_Dissertation_Breitinger_final.pdf.

Full text
Abstract:
Handling hundreds of thousands of files is a major challenge in today’s digital forensics. In order to cope with this information overload, investigators often apply hash functions for automated input identification. Besides identifying exact duplicates, which is mostly solved by running cryptographic hash functions, it is also necessary to cope with similar inputs (e.g., different versions of files), embedded objects (e.g., a JPG within an office document), and fragments (e.g., network packets). Thus, the essential idea is to complement the use of cryptographic hash functions, which detect data objects with bytewise identical representations, with the capability to find objects with bytewise similar representations. Unlike cryptographic hash functions, which have a wide range of applications and have been studied and tested for a long time, approximate matching algorithms are still in their early development stages. More precisely, the community currently lacks a definition, an evaluation methodology and (additional) fields of application. Therefore, this thesis aims at establishing approximate matching in computer science, with a special focus on digital forensic investigations. One of our first steps was to develop a generic definition for approximate matching, in collaboration with the National Institute of Standards and Technology (NIST), which is applicable to the different levels of approximate matching, e.g., bytewise and semantic. A subsequent detailed analysis of existing approaches uncovers different strengths and weaknesses; we therefore present improvements. To extend the range of algorithms, this work introduces three new algorithms of our own that are based on well-known techniques of computer science. A core contribution of this thesis is the open source evaluation framework called FRASH, which assesses tools on different criteria.
Besides traditional properties (borrowed from hash functions) like generation efficiency and space efficiency (compression), we conceive methods to determine precision and recall rates based on synthetic as well as real-world data. Since digital investigations are often time critical, we improve the performance of automated file identification with a mechanism we call prefetching. Compared to a straightforward analysis, performance increases by almost 40% without additional hardware. In this context we also discuss the impact of different hashing/approximate matching algorithms on digital investigations, and conclude that it is absolutely reasonable to apply cryptographic hashing as well as bytewise/semantic approximate matching algorithms in a prosecution. To extend the fields of application, this thesis demonstrates the capabilities of applying approximate matching to network traffic analysis and biometric template protection. Our research shows that approximate matching is perfectly suited for data leakage prevention and can also be applied to biometric template protection, biometric data compression and efficient biometric identification.
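The contrast the abstract draws between cryptographic hashing (exact duplicates only) and bytewise approximate matching can be illustrated with a toy similarity measure. Jaccard similarity over byte n-grams is a deliberately simplified stand-in of our own choosing, not one of the algorithms evaluated in the thesis:

```python
import hashlib

def ngrams(data: bytes, n: int = 8) -> set:
    """All overlapping n-byte substrings of the input."""
    return {data[i:i + n] for i in range(len(data) - n + 1)}

def similarity(a: bytes, b: bytes, n: int = 8) -> float:
    """Jaccard similarity over byte n-grams: 1.0 for identical inputs,
    high for inputs sharing long common byte runs, near 0.0 for unrelated data."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga or not gb:
        return 0.0
    return len(ga & gb) / len(ga | gb)

def exact_match(a: bytes, b: bytes) -> bool:
    """Cryptographic hashing flags only bytewise-identical inputs."""
    return hashlib.sha256(a).digest() == hashlib.sha256(b).digest()
```

A one-byte edit defeats `exact_match` entirely, yet barely lowers `similarity`, which is precisely the motivation for complementing cryptographic hashes with approximate matching in an investigation.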
APA, Harvard, Vancouver, ISO, and other styles
48

Lochner, Hendrik Thomas. "Kartering van selfoontegnologie." Diss., 2007. http://hdl.handle.net/10500/554.

Full text
Abstract:
It is sincerely hoped that this work will motivate other researchers, and in particular my colleagues, to do further research in the field of cellphone technology, especially how it can be mapped so that it can be utilised as evidence in our courts. This research aims to develop the mapping of cellphone technology as an aid in the investigation of crime. The mapping of cellphone technology refers to how cellphone technology can be utilised in crime investigation and, in particular, how a criminal can be placed at the scene of a crime as a result of a cellphone call that was either made or received. To place the suspect at the scene of a crime as a result of a call made or received, the cellphone records and technology of the relevant cellphone company, as well as existing computer programmes, can be utilised. In short, a criminal can be placed geographically within a space somewhere on earth.
Criminology
M.Tech. (Forensic Investigation)
APA, Harvard, Vancouver, ISO, and other styles