
Dissertations / Theses on the topic 'Computer forensic investigation'

Consult the top 50 dissertations / theses for your research on the topic 'Computer forensic investigation.'


You can also download the full text of each academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Law, Yuet-wing, and 羅越榮. "Investigation models for emerging computer forensic challenges." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2011. http://hub.hku.hk/bib/B46971324.

2

Sanyamahwe, Tendai. "Digital forensic model for computer networks." Thesis, University of Fort Hare, 2011. http://hdl.handle.net/10353/d1000968.

Abstract:
The Internet has become important since information is now stored in digital form and is transported in large amounts, both within and between organisations, through computer networks. Nevertheless, there are individuals and groups who use the Internet to harm businesses because they can remain relatively anonymous. To prosecute such criminals, forensic practitioners have to follow a well-defined procedure to convict responsible cyber-criminals in a court of law. Log files provide significant digital evidence in computer networks when tracing cyber-criminals. Network log mining is an evolution of typical digital forensics that uses evidence from network devices such as firewalls, switches and routers. Network log mining is a process supported by prevailing South African laws such as the Computer Evidence Act 57 of 1983, the Electronic Communications and Transactions (ECT) Act 25 of 2002, and the Electronic Communications Act 36 of 2005. International laws and regulations supporting network log mining include the US Sarbanes-Oxley Act and Foreign Corrupt Practices Act (FCPA), and the UK Bribery Act. A digital forensic model for computer networks focusing on network log mining was developed based on the literature reviewed and critical thought, following the Design Science methodology. However, this research project argues that some important aspects are not fully addressed by the South African legislation supporting digital forensic investigations. With that in mind, this research project proposes a set of Forensic Investigation Precautions, developed as part of the proposed model. The Diffusion of Innovations (DOI) Theory is the framework underpinning the development of the model and its assimilation into the community. The model was sent to IT experts for validation, which provided the qualitative element and the primary data of this research project. From these experts, the study found that the proposed model is unique and comprehensive and adds new knowledge to the field of Information Technology. A paper was also produced from this research project.
3

Montasari, Reza. "The Comprehensive Digital Forensic Investigation Process Model (CDFIPM) for digital forensic practice." Thesis, University of Derby, 2016. http://hdl.handle.net/10545/620799.

4

Etow, Tambue Ramine. "IMPACT OF ANTI-FORENSICS TECHNIQUES ON DIGITAL FORENSICS INVESTIGATION." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-97116.

Abstract:
Computer crimes have become very complex in terms of investigation and prosecution. This is mainly because forensic investigations are based on artifacts left on computers and other digital devices. In recent times, perpetrators of computer crimes have become familiar with the dynamics of digital forensics and are thus equipped to use anti-forensic measures and techniques to obfuscate the investigation process. In cases where such techniques are employed, it becomes extremely difficult, expensive and time-consuming to carry out an effective investigation, which may force a digital forensics expert to abandon the investigation. This project work practically demonstrates how various anti-forensic techniques can be deployed by criminals to derail the digital forensic investigation process, with a main focus on data hiding and encryption techniques; a comparative study of the effectiveness of selected digital forensics tools in analyzing and reporting evidence is then conducted.
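Triage of suspected data hiding and encryption of this kind often starts with a simple entropy test, since encrypted or compressed content approaches eight bits of entropy per byte. The following is an illustrative sketch, not a technique from this thesis; the 7.9 threshold and 1 MiB sample size are arbitrary assumptions:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def looks_encrypted(path: str, threshold: float = 7.9) -> bool:
    """Heuristic: near-maximal entropy suggests encrypted or compressed content.
    The threshold is an illustrative choice, not a value from the thesis."""
    with open(path, "rb") as fh:
        sample = fh.read(1 << 20)  # first 1 MiB is usually enough for triage
    return shannon_entropy(sample) >= threshold
```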
5

Fairbanks, Kevin D. "Forensic framework for honeypot analysis." Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/33977.

Abstract:
The objective of this research is to evaluate and develop new forensic techniques for use in honeynet environments, in an effort to address areas where anti-forensic techniques defeat current forensic methods. The fields of Computer and Network Security have expanded with time to become inclusive of many complex ideas and algorithms. A student of these fields can easily fall into thinking of preventive measures as the only major thrust of the topics. It is equally important to be able to determine the cause of a security breach. Thus, the field of Computer Forensics has grown. In this field, there exist toolkits and methods that are used to forensically analyze production and honeypot systems. To counter the toolkits, anti-forensic techniques have been developed. Honeypots and production systems have several intrinsic differences. These differences can be exploited to produce honeypot data sources that are not currently available from production systems. This research seeks to examine possible honeypot data sources and cultivate novel methods to combat anti-forensic techniques. In this document, three parts of a forensic framework are presented which were developed specifically for honeypot and honeynet environments. The first, TimeKeeper, is an inode preservation methodology which utilizes the Ext3 journal. This is followed with an examination of dentry logging which is primarily used to map inode numbers to filenames in Ext3. The final component presented is the initial research behind a toolkit for the examination of the recently deployed Ext4 file system. Each respective chapter includes the necessary background information and an examination of related work as well as the architecture, design, conceptual prototyping, and results from testing each major framework component.
6

Bourg, Rachel. "Bloom Filters for Filesystem Forensics." ScholarWorks@UNO, 2006. http://scholarworks.uno.edu/td/1288.

Abstract:
Digital forensics investigations become more time-consuming as the amount of data to be investigated grows. Long-term growth trends in hard drive and memory capacity only exacerbate the problem. Bloom filters are space-efficient, probabilistic data structures that can represent data sets with quantifiable false positive rates, and they have the potential to alleviate the problem by reducing space requirements. We provide a framework using Bloom filters to allow fine-grained content identification that detects similarity instead of equality. We also provide a method to compare filters directly and a statistical means of interpreting the results. We developed a tool--md5bloom--that uses Bloom filters for standard queries and direct comparisons. We provide a performance comparison with a commonly used tool, md5deep, achieving a 50% performance gain that only increases with larger hash sets. We compared filters generated from different versions of KNOPPIX and detected similarities and relationships between the versions.
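For readers unfamiliar with the data structure, a Bloom filter can be sketched in a few lines of Python. This toy version derives its k bit positions by double hashing over one SHA-256 digest; it is illustrative only and is not md5bloom's implementation:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k bit indexes derived from two SHA-256 halves."""
    def __init__(self, num_bits: int, num_hashes: int):
        self.m = num_bits
        self.k = num_hashes
        self.bits = bytearray((num_bits + 7) // 8)

    def _indexes(self, item: bytes):
        digest = hashlib.sha256(item).digest()
        h1 = int.from_bytes(digest[:16], "big")
        h2 = int.from_bytes(digest[16:], "big")
        # Kirsch-Mitzenmacher double hashing: h1 + i*h2 simulates k hash functions
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, item: bytes):
        for idx in self._indexes(item):
            self.bits[idx // 8] |= 1 << (idx % 8)

    def __contains__(self, item: bytes):
        # False positives occur at a quantifiable rate; false negatives never do
        return all(self.bits[idx // 8] & (1 << (idx % 8))
                   for idx in self._indexes(item))
```

Two filters built this way can also be compared directly by counting the bits they share, which echoes the direct filter comparison the thesis proposes.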
7

Wang, Mengmeng, and 王萌萌. "Temporal analysis on HFS+ and across file systems in digital forensic investigation." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hub.hku.hk/bib/B50900122.

Abstract:
In computer forensics, digital evidence related to time is both important and complex. The rules governing changes in the time attributes of digital evidence, such as files or folders, can be used to analyze certain user behaviors like data access, modification or transfer. However, the format of, and the rules governing, time information differ between file systems, and even between versions of operating systems using the same file system. Some research on temporal analysis has already been done on the NTFS and FAT file systems, while few resources describe temporal analysis on the Hierarchical File System Plus (HFS+), the default file system on Apple computers. Moreover, removable devices like USB disks are used frequently, and transferring files and folders between devices with different file systems and operating systems happens more and more often, so changes of times across different file systems are also crucial in digital forensics and investigations. In this research, the changes in the time attributes of files and folders resulting from user actions on the HFS+ file system and across file systems are analyzed, and rules of time are generated by inductive reasoning to help reconstruct crime scenes in digital forensic investigations. Since inductive reasoning, unlike deductive reasoning, does not guarantee true conclusions, experiments are performed to validate the rules. The usage of the rules is demonstrated by analyzing a case in detail. The methods proposed here are efficient, practical and easy to put into practice in real scenarios.
Master of Philosophy, Computer Science.
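Rules like those induced in this thesis come from observing which timestamps a given user action changes. Below is a minimal sketch of that observation step, assuming a platform whose os.stat results expose a birth (creation) time, as macOS does for HFS+:

```python
import os

def timestamp_snapshot(path: str) -> dict:
    """Collect the POSIX timestamps used in temporal analysis.
    st_birthtime (creation time) is exposed on macOS; on other platforms
    it may be absent, so it is reported as None there."""
    st = os.stat(path)
    return {
        "accessed": st.st_atime,
        "modified": st.st_mtime,
        "changed": st.st_ctime,          # metadata (inode) change time
        "created": getattr(st, "st_birthtime", None),
    }

def diff_snapshots(before: dict, after: dict) -> dict:
    """Report which timestamps a user action altered -- the raw material from
    which rules such as 'copying resets creation time on the target' are induced."""
    return {k: (before[k], after[k]) for k in before if before[k] != after[k]}
```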
8

Sonnekus, Michael Hendrik. "A comparison of open source and proprietary digital forensic software." Thesis, Rhodes University, 2015. http://hdl.handle.net/10962/d1017939.

Abstract:
Scrutiny of the capabilities and accuracy of computer forensic tools is increasing as the number of incidents relying on digital evidence, and the weight of that evidence, increase. This thesis describes the capabilities of the leading proprietary and open source digital forensic tools. The capabilities of the tools were tested separately on digital media that had been formatted using Windows and Linux. Experiments were carried out with the intention of establishing whether the capabilities of open source computer forensic tools are similar to those of proprietary computer forensic tools, and whether these tools could complement one another. The tools were tested with regard to their capabilities to make and analyse digital forensic images in a forensically sound manner. The tests were carried out on each media type after deleting data from the media, and then repeated after formatting the media. The results of the experiments performed demonstrate that both proprietary and open source computer forensic tools have superior capabilities in different scenarios, and that the toolsets can be used to validate and complement one another. The implication of these findings is that investigators have an affordable means of validating their findings and are able to investigate digital media more effectively.
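The forensic soundness tested here is conventionally demonstrated with cryptographic hashes: the acquired image must hash to the same value as the source medium. A small illustrative sketch of that check, not taken from the thesis:

```python
import hashlib

def file_digest(path: str, algorithm: str = "sha256", block: int = 1 << 20) -> str:
    """Hash a file or device image in fixed-size blocks to bound memory use."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as fh:
        while chunk := fh.read(block):
            h.update(chunk)
    return h.hexdigest()

# An image is considered forensically sound when the hash of the acquired
# image matches the hash computed from the source medium at acquisition time.
def verify_image(source_hash: str, image_path: str) -> bool:
    return file_digest(image_path) == source_hash
```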
9

Marziale, Lodovico. "Advanced Techniques for Improving the Efficacy of Digital Forensics Investigations." ScholarWorks@UNO, 2009. http://scholarworks.uno.edu/td/1027.

Abstract:
Digital forensics is the science concerned with discovering, preserving, and analyzing evidence on digital devices. The intent is to be able to determine what events have taken place, when they occurred, who performed them, and how they were performed. In order for an investigation to be effective, it must exhibit several characteristics. The results produced must be reliable, or else the theory of events based on the results will be flawed. The investigation must be comprehensive, meaning that it must analyze all targets which may contain evidence of forensic interest. Since any investigation must be performed within the constraints of available time, storage, manpower, and computation, investigative techniques must be efficient. Finally, an investigation must provide a coherent view of the events under question using the evidence gathered. Unfortunately, the set of currently available tools and techniques used in digital forensic investigations does a poor job of supporting these characteristics. Many tools used contain bugs which generate inaccurate results; there are many types of devices and data for which no analysis techniques exist; most existing tools are woefully inefficient, failing to take advantage of modern hardware; and the task of aggregating data into a coherent picture of events is largely left to the investigator to perform manually. To remedy this situation, we developed a set of techniques to facilitate more effective investigations. To improve reliability, we developed the Forensic Discovery Auditing Module, a mechanism for auditing and enforcing controls on accesses to evidence. To improve comprehensiveness, we developed ramparser, a tool for deep parsing of Linux RAM images, which provides previously inaccessible data on the live state of a machine. To improve efficiency, we developed a set of performance optimizations, and applied them to the Scalpel file carver, creating order-of-magnitude improvements to processing speed and storage requirements. Last, to facilitate more coherent investigations, we developed the Forensic Automated Coherence Engine, which generates a high-level view of a system from the data generated by low-level forensics tools. Together, these techniques significantly improve the effectiveness of digital forensic investigations conducted using them.
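Scalpel, mentioned above, is a header/footer file carver. The core idea can be sketched in a few lines; this toy version, which carves JPEGs from a raw image by their well-known SOI/EOI markers and holds the whole image in memory, is illustrative and deliberately ignores the optimizations the thesis contributes:

```python
JPEG_HEADER = b"\xff\xd8\xff"
JPEG_FOOTER = b"\xff\xd9"
MAX_SIZE = 10 * 1024 * 1024  # give up on candidates larger than 10 MiB

def carve_jpegs(image_path: str, out_prefix: str = "carved") -> int:
    """Naive single-pass carver: find each header, then the nearest footer."""
    data = open(image_path, "rb").read()  # toy version: whole image in memory
    count = 0
    pos = data.find(JPEG_HEADER)
    while pos != -1:
        end = data.find(JPEG_FOOTER, pos, pos + MAX_SIZE)
        if end != -1:
            with open(f"{out_prefix}_{count:04d}.jpg", "wb") as out:
                out.write(data[pos:end + len(JPEG_FOOTER)])
            count += 1
        pos = data.find(JPEG_HEADER, pos + 1)
    return count
```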
10

Hashim, Noor Hayati. "An architecture for the forensic analysis of Windows system generated artefacts." Thesis, University of South Wales, 2011. https://pure.southwales.ac.uk/en/studentthesis/forensic-analysis-of-windows-system-generated-artefacts(be571569-2afe-4d52-8c99-9dbc8388b1db).html.

Abstract:
Computer forensic tools have been developed to enable forensic investigators to analyse software artefacts to help reconstruct possible scenarios for activity on a particular computer system. A number of these tools allow the examination and analysis of system generated artefacts such as the Windows registry. Examination and analysis of these artefacts is focussed on recovering the data and extracting information relevant to a digital investigation. This information is currently underused in most digital investigations. With this in mind, this thesis considers system generated artefacts that contain information concerning the activities that occur on a Windows system and that will often contain evidence relevant to a digital investigation. The objective of this research is to develop an architecture that simplifies and automates the collection of forensic evidence from system generated files where the data structures may be either known or in a structured but poorly understood (unknown) format. The hypothesis is that it should be feasible to develop an architecture that is able to integrate forensic data extracted from a range of system generated files, and to implement a proof of concept prototype tool capable of visualising Event logs and Swap files. This thesis presents an architecture to enable the forensic investigator to analyse and visualise a range of system generated artefacts for which the internal arrangement of data is either well structured and understood, or unclear and less publicised (known and unknown data structures). The architecture reveals methods to access, view and analyse system generated artefacts, and is intended to facilitate the extraction and analysis of operating system generated artefacts while being extensible, flexible and reusable. The architectural concepts are tested using a prototype implementation focussed on the Windows Event logs and Swap files. Event logs reveal evidence regarding logons, authentication, account and privilege use, and can address questions relating to which user accounts were being used and which machines were accessed. The Swap file contains fragments of data, remnants or entire documents, e-mail messages or results of internet browsing which reveal past user activities. Issues relating to understanding and visualising the data structures of artefacts are discussed and possible solutions are explored. The architecture is developed by examining the requirements and methods with respect to the needs of computer forensic investigations and forensic process models, with the intention of developing a new multiplatform tool to visualise the content of Event logs and Swap files. This tool is aimed at displaying the data contained in Event logs and Swap files in a graphical manner, which should enable the detection of information that may support the investigation. Visualisation techniques can also aid forensic investigators in identifying suspicious events and files, making such techniques more feasible for consideration in a wider range of cases and, in turn, improving standard procedures. The tool fills a gap between the capabilities of certain other open source tools, which visualise Event log and Swap file data in a text-based format only.
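One way to pull records of interest out of a Windows Event Log for this kind of analysis is the third-party python-evtx package. The sketch below is illustrative, assumes an EVTX-format log (Windows Vista and later), and uses a simple keyword filter:

```python
# Sketch using the third-party python-evtx package (pip install python-evtx);
# EVT logs from older Windows versions need a different parser.
import Evtx.Evtx as evtx

def dump_records(evtx_path: str, keyword: str = "Logon"):
    """Print the XML of every event record whose body mentions the keyword."""
    with evtx.Evtx(evtx_path) as log:
        for record in log.records():
            xml = record.xml()
            if keyword in xml:
                print(xml)
```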
11

Patterson, Farrah M. "The implications of virtual environments in digital forensic investigations." Master's thesis, University of Central Florida, 2011. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4819.

Abstract:
This research paper discusses the role of virtual environments in digital forensic investigations. With virtual environments becoming more prevalent as an analysis tool in digital forensic investigations, it is becoming more important for digital forensic investigators to understand the limitations and strengths of virtual machines. The study aims to expose limitations within commercial closed source virtual machines and open source virtual machines. The study provides a brief overview of the history of digital forensic investigations and virtual environments, and concludes with an experiment involving four common open and closed source virtual machines, examining the effects of the virtual machines on the host machine as well as the performance of the virtual machines themselves. The findings show that while the open source tools provided more control and freedom to the operator, the closed source tools were more stable and consistent in their operation. The significance of these findings can be further researched by applying them in the context of demonstrating the reliability of forensic techniques when presented as analysis tools in litigation.
Thesis (M.S.)--University of Central Florida, 2011. Includes bibliographical references (p. 46). M.S., Computer Science (Digital Forensics; Science/Computing Track).
12

LeRoi, Jack. "A forensic investigation of the electrical properties of digital audio recording." Thesis, University of Colorado at Denver, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=1556863.

Abstract:

In media forensics, the devices (e.g. computers, smart phones, still/video cameras, audio recorders) and software (e.g. video, audio, and graphics editors, file and disk utilities, mathematical computation applications) are, for the most part, black boxes. The design specifications are usually proprietary and the operating specifications may be incomplete, inaccurate, or unavailable. This makes it difficult to validate the technology, but using it without validation could discredit a practitioner's findings or testimony. The alternative is to test the device or program to determine relevant characteristics of its performance.

An important and common device in media forensics is the portable digital audio recorder used to record surveillance and interviews. This type can also be used to record the alternating current (AC) waveform from the mains power. While small variations in the AC frequency, known as the electric network frequency (ENF), can be forensically important, distortion in the recording can affect its value in adjudication or investigation. A method is presented to evaluate aspects of a recorder's operation that can cause distortion. Specifically, the method measures the noise generated by the recorder's electronics in its input and amplifier circuits, and includes a procedure to isolate the recorder from environmental sources of noise. The method analyzes the broadband noise floor produced by the range of recording conditions and recorder settings, and also analyzes the noise amplitude at the harmonics of the mains frequency.
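The ENF is typically estimated by tracking the strongest spectral peak near the nominal mains frequency over short windows. The following NumPy sketch is illustrative rather than the thesis's measurement method, and assumes a mono recording already loaded as a float array together with its sample rate:

```python
import numpy as np

def enf_track(signal, sample_rate, nominal=50.0, window_s=4.0, band=0.5):
    """Estimate the mains frequency in consecutive windows as the FFT peak
    within +/- band Hz of the nominal frequency (50 Hz here; 60 Hz in the US).
    Frequency resolution is 1/window_s Hz per bin; real ENF tools refine the
    peak further, e.g. with quadratic interpolation."""
    win = int(window_s * sample_rate)
    freqs = np.fft.rfftfreq(win, d=1.0 / sample_rate)
    mask = (freqs > nominal - band) & (freqs < nominal + band)
    track = []
    for start in range(0, len(signal) - win + 1, win):
        spectrum = np.abs(np.fft.rfft(signal[start:start + win] * np.hanning(win)))
        track.append(freqs[mask][np.argmax(spectrum[mask])])
    return np.array(track)
```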

13

McCallister, Ronald F. "Forensic Computing for Non-Profits: A Case Study for Consideration When Non-Profits Need to Determine if a Computer Forensic Investigation is Warranted." [Johnson City, Tenn. : East Tennessee State University], 2004. https://dc.etsu.edu/etd/940.

Thesis (M.S.)--East Tennessee State University, 2004. Title from electronic submission form. ETSU ETD database URN: etd-0831104-124226. Includes bibliographical references. Also available via the Internet at the UMI web site.
14

Jiang, Lin, and 蒋琳. "New cryptographic schemes with application in network security and computer forensics." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2010. http://hub.hku.hk/bib/B44753226.

15

Hovmark, Olle, and Emma Schüldt. "Towards Extending Probabilistic Attack Graphs with Forensic Evidence : An investigation of property list files in macOS." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280102.

Abstract:
Cyber-attacks against all types of systems are a growing problem in society. As the Mac operating systems become more common, so do the attacks against them. Probabilistic attack graphs are a way to model cyber-attacks. The Meta Attack Language is a language that can be used to create domain-specific languages that in turn can be used to model an attack on the specific domain with a probabilistic attack graph. This report investigates how the Meta Attack Language can be extended so that it can be used for creating attack graphs with forensic evidence, focusing on attacks on Mac operating systems that have left evidence in the form of property list files. The MITRE ATT&CK matrix is a knowledge base with information about cyber-attacks. A study of the matrix was made to examine what evidence has been found from attacks on a Mac operating system, and to motivate why this report focuses on evidence in the form of property list files. A study of grey literature was then made to investigate different types of attacks that have left evidence in the form of property list files. The studies showed that there is a multitude of evidence that can be left by an attack on a Mac operating system, and that most evidence in the form of property list files was used by the adversary as a persistence mechanism. They also showed that the property list files were often placed at root level in the file system, and that the adversary often tried to hide the files by giving them names that are common in a Mac operating system. After the studies were conducted, a list of requirements for extending the Meta Attack Language was created. This list was based on the results of the studies and includes requirements stating that there must be a way of expressing the names and locations of files, detection evasion methods, and connections between different types of evidence or between evidence and attack steps, among others.
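The launchd persistence locations identified by these studies can be enumerated with the Python standard library alone. The sketch below is an illustrative triage pass, not part of the thesis; the folder list is a common but non-exhaustive assumption:

```python
import plistlib
from pathlib import Path

# Common macOS launchd persistence locations; illustrative, not exhaustive.
LOCATIONS = [
    Path("/Library/LaunchDaemons"),
    Path("/Library/LaunchAgents"),
    Path.home() / "Library/LaunchAgents",
]

def scan_launch_plists():
    """Yield (path, label, program) for each readable launchd property list."""
    for folder in LOCATIONS:
        if not folder.is_dir():
            continue
        for plist_path in folder.glob("*.plist"):
            try:
                with open(plist_path, "rb") as fh:
                    data = plistlib.load(fh)
            except Exception:
                continue  # unreadable or malformed files are skipped, not fatal
            program = data.get("Program") or data.get("ProgramArguments")
            yield plist_path, data.get("Label"), program

if __name__ == "__main__":
    for path, label, program in scan_launch_plists():
        print(f"{path}\n  Label: {label}\n  Runs: {program}")
```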
16

Jones, Eric Douglas. "Forensic Investigation of Stamped Markings Using a Large-Chamber Scanning Electron Microscope and Computer Analysis for Depth Determination." TopSCHOLAR®, 2013. http://digitalcommons.wku.edu/theses/1237.

Abstract:
All firearms within the United States are required by the Gun Control Act to be physically marked with a serial number, which must be at least 0.003" deep and 1/16" high. The purpose of a serial number is to make each firearm uniquely identifiable and traceable. Intentional removal of a serial number is a criminal offense and is used to hide the identity and movements of the involved criminal parties. The current standard for firearm serial number restoration is chemical etching, which is time- and labor-intensive as well as destructive to the physical evidence (the firearm). It is hypothesized that a new technique that is accurate, precise, and time-efficient would greatly aid law enforcement agencies in pursuing criminals. This thesis focuses on using a large-chamber scanning electron microscope to take secondary electron (SE) images of a stamped metal plate and analyzing them using the MIRA MX 7 UE image processing software for purposes of depth determination. An experimental peak luminance value of 77 (in pixel values) was correlated to the known depth (273 μm) at the bottom of the sample character. The results show that it is potentially possible to determine an unknown depth from an SEM image using luminance values obtained in the MIRA analysis.
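Turning luminance into depth amounts to fitting a calibration curve from characters of known depth. In the NumPy sketch below, only the pair (77, 273 μm) comes from the thesis; the other calibration points and the linear model are invented for illustration:

```python
import numpy as np

# Hypothetical calibration pairs (luminance in pixel values, depth in microns);
# only (77, 273) comes from the thesis, the rest are illustrative.
luminance = np.array([40.0, 55.0, 77.0, 95.0])
depth_um = np.array([120.0, 190.0, 273.0, 350.0])

# Least-squares straight line through the calibration points
slope, intercept = np.polyfit(luminance, depth_um, 1)

def estimate_depth(peak_luminance: float) -> float:
    """Predict stamping depth (microns) from a peak luminance reading."""
    return slope * peak_luminance + intercept

print(f"depth at luminance 77: {estimate_depth(77.0):.0f} um")
```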
17

Hargreaves, C. J. "Assessing the Reliability of Digital Evidence from Live Investigations Involving Encryption." Thesis, Department of Informatics and Sensors, 2009. http://hdl.handle.net/1826/4007.

Abstract:
The traditional approach to a digital investigation when a computer system is encountered in a running state is to remove the power, image the machine using a write blocker and then analyse the acquired image. This has the advantage of preserving the contents of the computer’s hard disk at that point in time. However, the disadvantage of this approach is that the preservation of the disk is at the expense of volatile data such as that stored in memory, which does not remain once the power is disconnected. There are an increasing number of situations where this traditional approach of ‘pulling the plug’ is not ideal since volatile data is relevant to the investigation; one of these situations is when the machine under investigation is using encryption. If encrypted data is encountered on a live machine, a live investigation can be performed to preserve this evidence in a form that can be later analysed. However, there are a number of difficulties with using evidence obtained from live investigations that may cause the reliability of such evidence to be questioned. This research investigates whether digital evidence obtained from live investigations involving encryption can be considered to be reliable. To determine this, a means of assessing reliability is established, which involves evaluating digital evidence against a set of criteria; evidence should be authentic, accurate and complete. This research considers how traditional digital investigations satisfy these requirements and then determines the extent to which evidence from live investigations involving encryption can satisfy the same criteria. This research concludes that it is possible for live digital evidence to be considered to be reliable, but that reliability of digital evidence ultimately depends on the specific investigation and the importance of the decision being made. However, the research provides structured criteria that allow the reliability of digital evidence to be assessed, demonstrates the use of these criteria in the context of live digital investigations involving encryption, and shows the extent to which each can currently be met.
18

Hargreaves, Christopher James. "Assessing the reliability of digital evidence from live investigations involving encryption." Thesis, Cranfield University, 2009. http://dspace.lib.cranfield.ac.uk/handle/1826/4007.

19

Allinson, Caroline Linda. "Legislative and security requirements of audit material for evidentiary purpose." Thesis, Queensland University of Technology, 2004. https://eprints.qut.edu.au/36813/1/Caroline_Allinson_Thesis.pdf.

Abstract:
This research used the Queensland Police Service, Australia, as a major case study. Information on the principles, techniques and processes used, and the reasons for recording, storing and releasing audit information for evidentiary purposes, is reported. It is shown that law enforcement agencies have a two-fold interest in, and legal obligation pertaining to, audit trails: the first relates to situations where audit trails are actually used by criminals in the commission of crime, and the second to where audit trails are generated by the information systems used by the police themselves in support of the recording and investigation of crime. Eleven court cases involving Queensland Police Service audit trails used in evidence in Queensland courts were selected for further analysis. It is shown that, of the cases studied, none of the evidence presented was rejected or seriously challenged from a technical perspective. These results were further analysed and related to normal requirements for the trusted maintenance of audit trail information in sensitive environments, with discussion of the ability and/or willingness of courts to fully challenge, assess or value audit evidence presented. Managerial and technical frameworks are proposed for, firstly, what may be considered an environment where a computer system is operating "properly" and, secondly, what aspects of education, training, qualifications, expertise and the like may be considered appropriate for the persons responsible within that environment. Analysis was undertaken to determine whether the audit and control of information in a high security environment, such as law enforcement, could be judged to have improved, or not, in the transition from manual to electronic processes. Information collection, control of processing and audit in the manual processes used by the Queensland Police Service, Australia, in the period 1940 to 1980 were assessed against the current electronic systems essentially introduced to policing in the 1980s and 1990s. The results show that electronic systems do provide for faster communications, with centrally controlled and updated information readily available for use by large numbers of users connected across significant geographical distances. However, it is clearly evident that the price paid for this is a lack of ability and/or reluctance to provide improved audit and control processes. To compare the information systems audit and control arrangements of the Queensland Police Service with other government departments and agencies, an Australia-wide survey was conducted. The results of the survey were contrasted with those of a survey conducted by the Australian Commonwealth Privacy Commission four years previously, which showed that security in relation to the recording of activity against access to information held on Australian government computer systems had been poor and a cause for concern. However, within this four-year period there is evidence to suggest that government organisations are increasingly more inclined to generate audit trails. An attack on the overall security of audit trails in computer operating systems was initiated to further investigate findings reported in relation to the government systems survey. The survey showed that information systems audit trails in Microsoft Corporation's "Windows" operating system environments are relied on quite heavily.
An audit of the security of audit trails generated, stored and managed in the Microsoft "Windows 2000" operating system environment was undertaken and compared and contrasted with similar audit trail schemes in the "UNIX" and "Linux" operating systems. The strength of passwords and the exploitation of any security problems in access control were targeted using software tools that are freely available in the public domain. The results showed that security for the "Windows 2000" system is seriously flawed and the integrity of audit trails stored within these environments cannot be relied upon. A framework and set of guidelines for use by expert witnesses in the information technology (IT) profession are proposed. This is achieved by examining the current rules and guidelines related to the provision of expert evidence in a court environment, by analysing the rationale for the separation of distinct disciplines and corresponding bodies of knowledge used by the medical profession and forensic science, and then by analysing the bodies of knowledge within the discipline of IT itself. It is demonstrated that the accepted processes and procedures relevant to expert witnessing in a court environment are transferable to the IT sector. However, unlike some discipline areas, this analysis clearly identified two distinct aspects of the matter which appear particularly relevant to IT: expertise gained through the application of IT to information needs in a particular public or private enterprise; and expertise gained through accepted and verifiable education, training and experience in fundamental IT products and systems.
20

Nordin, Anton, and Felix Liffner. "Forensiska Artefakter hos Mobila Applikationer : Utvinning och Analys av Applikationen Snapchat." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-40207.

Abstract:
Today's smartphones and tablets use different applications and software for all sorts of purposes: communication, entertainment, fitness, sharing images, keeping up to date with the news and many other daily tasks. With the heavy usage of all these apps comes a number of issues: private data is stored in large quantities, both on the local device and on the app creators' servers. It is no wonder, then, that applications advertising user secrecy and transient storage of user data have become popular. One of these applications is Snapchat, with over 500 million downloads on the Google Play store at the time of writing. Snapchat is a communication application with the niche feature that the images and messages sent disappear once opened or after 24 hours have passed. With the illusion of privacy behind Snapchat's niche, it has become a breeding ground for criminal activity, and the niche itself translates into a troublesome hurdle for law enforcement trying to retrieve evidence from the devices of Snapchat users. This paper investigates these issues and applies a methodology to retrieve potential evidence from a device that used Snapchat to send images and messages, by performing a physical acquisition on a test device and analyzing it to find artifacts pertaining to Snapchat and the test data that was created. The method was performed on a Samsung Galaxy S4 with Android 5.0.1 running Snapchat version 10.52.3.0. Test data such as images and messages was created, and retrieval was attempted at three points in time: first, right after data creation; second, after a restart and 24 hours after the data was created; and third, after 48 hours had passed, with the Snapchat user logged out at the time of acquisition. The acquisition resulted in the extraction of several sent images and a full text conversation between the experimental device and another party. A full video which was uploaded by the receiving user could be extracted even though the experimental device never actually viewed it. The second acquisition, made after 24 hours had passed, gave the same results as the first, meaning that time, at least up to a day after the initial creation of the data, did not have any effect on the evidence. However, when the Snapchat user was logged out of the application, the data was unobtainable and had disappeared; presumably Snapchat has a function which deletes personal data about the user on logout. This function might become a hurdle in law enforcement investigations where the application Snapchat is involved.
21

Urrea, Jorge Mario. "An analysis of Linux RAM forensics." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2006. http://library.nps.navy.mil/uhtbin/hyperion/06Mar%5FUrrea.pdf.

Thesis (M.S. in Computer Science)--Naval Postgraduate School, March 2006. Thesis advisor: Christopher S. Eagle. Includes bibliographical references (p. 71-72). Also available online.
22

Barbosa, Akio Nogueira. "Método para ranqueamento e triagem de computadores aplicado à perícia de informática." Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/3/3141/tde-14062016-081553/.

Abstract:
One of the most common tasks for a forensic expert working in information technology is to search the contents of data storage devices (DADs) for evidence of interest, which in most cases consists of keywords. During the time necessary to duplicate a DAD, the expert is practically unable to interact with the data it contains. In this work we verify the following hypothesis: it is possible, at the collection stage, to run the duplication of the DAD and a scan for keywords in the raw data simultaneously, without significantly impacting the duplication time. The main objective of this thesis is to propose a method to identify, at the end of the collection stage, the DADs with the strongest chance of containing evidence of interest for a particular investigation, based on the keyword occurrences found by a scanning mechanism that operates at the raw data level. Based on these results, a triage of the DADs is established, and a ranking process then indicates which DADs should be examined first at the analysis stage. The results of our experiments showed that it is possible and feasible to apply the method without increasing the duplication time, and with a good level of accuracy. In most cases, applying the method reduces the number of DADs that must be analyzed, helping to reduce the human effort required.
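The raw-data scan described above can be sketched as a chunked search that keeps a small overlap between reads, so matches straddling a chunk boundary are not lost. This is an illustrative sketch rather than the author's implementation; counts are of non-overlapping occurrences:

```python
import collections

def scan_raw_image(path, keywords, chunk_size=4 * 1024 * 1024):
    """Count keyword occurrences in a raw device image, read in fixed-size chunks.

    A tail of (longest keyword - 1) bytes is carried between chunks so that
    matches straddling a chunk boundary are still seen; matches lying entirely
    inside the carried tail were already counted and are subtracted again.
    """
    needles = [k.encode() for k in keywords]
    overlap = max(len(n) for n in needles) - 1
    counts = collections.Counter()
    tail = b""
    with open(path, "rb") as img:
        while chunk := img.read(chunk_size):
            window = tail + chunk
            for needle in needles:
                counts[needle.decode()] += window.count(needle) - tail.count(needle)
            tail = window[-overlap:] if overlap else b""
    return counts

# Example: rank devices by total hits, mirroring the triage/ranking idea.
# scores = {dad: sum(scan_raw_image(dad, ["invoice", "password"]).values())
#           for dad in ["dad1.img", "dad2.img"]}
```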
23

Davidsson, Pontus, and Niklas Englund. "Docker forensics: Investigation and data recovery on containers." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-42498.

Abstract:
Container technology continuously grows in popularity, and the forensic area is less explored than other areas of research concerning containers. The aim of this thesis is, therefore, to explore Docker containers in a forensic investigation to test whether data can be recovered from deleted containers and how malicious processes can be detected in active containers. The results of the experiments show that, depending on which container is used, and how it is configured, data sometimes persists after the container is removed. Furthermore, file carving is tested and evaluated as a useful method of recovering lost files from deleted containers, should data not persist. Lastly, tests reveal that malicious processes running inside an active container can be detected by inspection from the host machine.
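Host-side inspection of the kind used in these experiments can be driven through the standard Docker CLI. The sketch below is illustrative, not the thesis's exact procedure, and assumes the docker client is installed and the daemon is reachable:

```python
import json
import subprocess

def list_containers():
    """All containers (running and stopped) as dicts, via `docker ps`."""
    out = subprocess.run(
        ["docker", "ps", "--all", "--format", "{{json .}}"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [json.loads(line) for line in out.splitlines()]

def container_processes(container_id: str):
    """Processes inside an active container, as reported by `docker top`."""
    out = subprocess.run(
        ["docker", "top", container_id],
        capture_output=True, text=True, check=True,
    ).stdout
    return out.splitlines()

if __name__ == "__main__":
    for c in list_containers():
        print(c["ID"], c["Image"], c["Status"])
```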
24

Hewling, Moniphia Orlease. "Digital forensics : an integrated approach for the investigation of cyber/computer related crimes." Thesis, University of Bedfordshire, 2013. http://hdl.handle.net/10547/326231.

Abstract:
Digital forensics has become a predominant field in recent times, and courts have had to deal with an influx of related cases over the past decade. As computer/cyber related criminal attacks become more predominant in today's technologically driven society, the need for, and use of, digital evidence in courts has increased. There is an urgent need to hold perpetrators of such crimes accountable and to prosecute them successfully. The process used to acquire this digital evidence (to be used in court cases) is digital forensics. The procedures currently used in the digital forensic process were developed with a focus on particular areas of the digital evidence acquisition process. This has resulted in very little regard being paid to the core components of the digital forensics field, for example the legal and ethical aspects, along with other integral aspects of investigations as a whole. These core facets are important for a number of reasons, including the fact that other forensic sciences have included them, and that to survive as a true forensics discipline digital forensics must ensure they are accounted for. Digital forensics, like other forensics disciplines, must ensure that the evidence (digital evidence) produced from the process is able to withstand the rigors of a courtroom. Digital forensics is a new and developing field, still in its infancy when compared to traditional forensics fields such as botany or anthropology. Over the years, development in the field has been tool-centered, driven by commercial developers of the tools used in the digital investigative process. This, along with the absence of set standards to guide digital forensics practitioners operating in the field, has led to issues regarding the reliability, verifiability and consistency of digital evidence when presented in court cases. Additionally, some developers have neglected the fact that the mere mention of the word forensics suggests courts of law, and thus legal practitioners will be intimately involved. Such omissions have resulted in the digital evidence acquired for use in various investigations facing major challenges when presented in a number of cases. Mitigation of such issues is possible with the development of a standard set of methodologies flexible enough to accommodate the intricacies of all fields to be considered when dealing with digital evidence. This thesis addresses issues regarding digital forensics frameworks, methods, methodologies and standards for acquiring digital evidence, using the grounded theory approach. Data was gathered electronically using literature surveys, questionnaires and interviews, which proved useful when collecting data from different jurisdictions worldwide. Initial surveys indicated that there were no existing standards in place and that the terms models/frameworks and methodologies were used interchangeably. A framework and methodology have been developed to address the identified issues, and they represent the major contribution of this research. The dissertation outlines solutions to the identified issues and presents the 2IR Framework of standards, which governs the 2IR Methodology, supported by a mobile application and a curriculum of studies. These designs were developed using an integrated approach incorporating all four core facets of the digital forensics field.
This research lays the foundation for a single integrated approach to digital forensics and can be further developed to ensure the robustness of the processes and procedures used by digital forensics practitioners worldwide.
25

Liljekvist, Erika, and Oscar Hedlund. "Uncovering Signal : Simplifying Forensic Investigations of the Signal Application." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-44835.

Abstract:
The increasing availability of easy-to-use end-to-end encrypted messaging applications has made it possible for more people to conduct their conversations privately. Criminals have taken advantage of this, and it has proven to make digital forensic investigations more difficult, as methods of decrypting the data are needed. In this thesis, data from iOS and Windows devices is extracted and analysed, with a focus on the application Signal. Although the Signal application also runs on other operating systems, such as Android, these are outside the scope of this thesis. The results of this thesis provide access to data stored in the encrypted application Signal without the need for expensive analysis tools. This is done by developing and publishing the first open-source script for decryption and parsing of the Signal database. The script is available to anyone at https://github.com/decryptSignal/decryptSignal.
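The decryption step such a script automates rests on SQLCipher. Below is an illustrative sketch using the third-party pysqlcipher3 package, assuming the hex database key has already been recovered (for Signal Desktop the key has historically been stored in the application's config.json); the example query is generic and not Signal's actual schema:

```python
# Requires the third-party pysqlcipher3 package and the SQLCipher library.
from pysqlcipher3 import dbapi2 as sqlcipher

def open_encrypted_db(db_path: str, hex_key: str):
    """Open an SQLCipher database; the PRAGMA key must be issued first."""
    conn = sqlcipher.connect(db_path)
    conn.execute(f'PRAGMA key = "x\'{hex_key}\'"')
    return conn

# Example (generic SQLite introspection, not Signal's actual schema):
# conn = open_encrypted_db("db.sqlite", recovered_key)
# for row in conn.execute("SELECT name FROM sqlite_master WHERE type='table'"):
#     print(row)
```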
26

Fjellström, Lisa. "The Contribution of Visual Explanations in Forensic Investigations of Deepfake Video : An Evaluation." Thesis, Umeå universitet, Institutionen för datavetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-184671.

Abstract:
Videos manipulated by machine learning have rapidly increased online in the past years. So-called deepfakes can depict people who never participated in a video recording by transposing their faces onto others in it. This raises concerns about the authenticity of media, which demands higher-performing detection methods in forensics. The introduction of AI detectors has been of interest, but is held back today by their lack of interpretability. The objective of this thesis was therefore to examine what the explainable AI method local interpretable model-agnostic explanations (LIME) could contribute to forensic investigations of deepfake video. An evaluation was conducted in which three multimedia forensics specialists assessed the contribution of visual explanations of classifications when investigating deepfake video frames. The estimated contribution was not significant, yet answers showed that LIME may be used to indicate areas in which to start an examination. LIME was, however, not considered to provide sufficient proof of why a frame was classified as 'fake', and if introduced would be used as one of several methods in the process. Issues were apparent regarding the interpretability of the explanations, as well as LIME's ability to indicate features of manipulation with superpixels.
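LIME's image explainer perturbs superpixels and fits a local surrogate model around one prediction. The sketch below uses the third-party lime and scikit-image packages; classifier_fn stands for any function mapping a batch of images to class probabilities, which is an assumption since the thesis's detector is not specified:

```python
import numpy as np
from lime import lime_image
from skimage.segmentation import mark_boundaries

def explain_frame(frame: np.ndarray, classifier_fn):
    """Highlight the superpixels that most pushed the detector's top label.
    classifier_fn: array of images (N, H, W, 3) -> class probabilities (N, C).
    The frame is assumed to be uint8, hence the division by 255 for display."""
    explainer = lime_image.LimeImageExplainer()
    explanation = explainer.explain_instance(
        frame, classifier_fn, top_labels=1, num_samples=1000)
    label = explanation.top_labels[0]
    image, mask = explanation.get_image_and_mask(
        label, positive_only=True, num_features=5, hide_rest=False)
    return mark_boundaries(image / 255.0, mask)
```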
27

AlMarri, Saeed. "A structured approach to malware detection and analysis in digital forensics investigation." Thesis, University of Bedfordshire, 2017. http://hdl.handle.net/10547/622529.

Abstract:
Within the World Wide Web (WWW), malware is considered one of the most serious threats to system security, with complex system issues caused by malware and spam. Networks and systems can be accessed and compromised by various types of malware, such as viruses, worms, Trojans, botnets and rootkits, which compromise systems through coordinated attacks. Malware often uses anti-forensic techniques to avoid detection and investigation. Moreover, the results of investigating such attacks are often ineffective and can create barriers to obtaining clear evidence, due to the lack of sufficient tools and the immaturity of forensics methodology. This research addressed various complexities faced by investigators in the detection and analysis of malware. In this thesis, the author identified the need for a new approach towards malware detection that focuses on a robust framework, and proposed a solution based on an extensive literature review and market research analysis. The literature review focussed on the different trials and techniques in malware detection to identify the parameters for developing a solution design, while market research was carried out to understand the precise nature of the current problem. The author termed the new approach and framework the triple-tier centralised online real-time environment (tri-CORE) malware analysis (TCMA). The tiers correspond to three distinctive phases of detection and analysis, dividing the entire research pattern into three different domains: the malware acquisition function, detection and analysis, and the database operational function. This framework design will contribute to the field of computer forensics by making the investigative process more effective and efficient. By integrating a hybrid method for malware detection, the limitations associated with both static and dynamic methods are eliminated. This aids forensics experts in carrying out quick investigatory processes to detect the behaviour of malware and its related elements. The proposed framework will help to ensure system confidentiality, integrity, availability and accountability. The research also produced a prototype (artefact) developed in support of a different approach to digital forensics and malware detection methods. A new Toolkit was designed and implemented, based on a simple architectural structure and built from open source software, which can help investigators develop the skills to respond critically to current cyber incidents and analyses.
28

Bouchaud, François. "Analyse forensique des écosystèmes intelligents communicants de l'internet des objets." Thesis, Lille, 2021. http://www.theses.fr/2021LILUI014.

Abstract:
With the development of the Internet of Things, searching for data in a digital environment is an increasingly difficult task for the forensic investigator. It is a real challenge, especially given the heterogeneity of the connected objects. There is a lack of standardization in communication architectures and data management policies. It is accompanied by dependencies between connected ecosystems, especially through hidden links and fragmented information. In this thesis, we suggest adjusting the traditional approach of digital investigation to the constraints of the Internet of Things. We develop methodologies and tools to understand and analyze the connected environment. We assume that the crime scene is a connected whole and not an aggregate of independent digital objects. It contains key data for understanding and contextualizing a past event or phenomenon as evidence for the criminal trial. Digital forensics is considered to be the "application of science to the identification, collection, examination, and analysis of data while preserving the integrity of the information and maintaining a strict chain of custody for the data" (National Institute of Standards and Technology). Faced with a crime scene, the investigator seeks to understand the criminal event. He examines the data stored in the physical medium and/or in a remote part of the cloud. Our work develops a process of rapid identification of the phenomenon according to four phases: detection, localization, object recognition and information cross-checking. It is enriched with radio signature search tools: a single sensor and a multi-sensor mesh network. This approach is built around the problem of apprehending a multiform connected environment, containing devices that are not always visible or identifiable during a field approach. We integrate in our study the strategy of equipment collection. The challenge lies in the ability to extract one or more connected objects, without compromising the stored data, to place them in a controlled and secure environment. The object is maintained in a state that guarantees the non-alteration or loss of data. The study includes a first phase of understanding the physical environment and dependencies. It seeks to determine the mechanisms of information migration to online platforms and to isolate groups of objects by intelligently breaking the connections. Devices are extracted, then packaged and sealed according to their technical characteristics and the connected infrastructure. We then deepen the exploitation of the information collected using forensic methods. The data is then analyzed according to temporal, spatial and contextual axes. We also propose a classification and a prioritization of the connected structure according to the characteristics of the desired data. The work gives a reading of the life cycle of the data within the Internet of Things infrastructure. In a prospective approach, we deepen the questions of the fine identification of the connected object according to its hardware and software characteristics. The acoustic signature of electronics appears as a relevant physical property in the study of equipment. This feature completes our range of tools in the identification of connected objects.
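The classification and prioritization step described above lends itself to a simple triage heuristic: rank the devices on a connected crime scene by data volatility and expected relevance before collection. The Python sketch below is a minimal illustration of that idea only; the device categories, weights and field names are hypothetical placeholders, not values taken from the thesis.

```python
# Illustrative triage sketch: rank seized IoT devices by volatility and
# expected evidential value before collection. Weights are hypothetical.
from dataclasses import dataclass

# Higher volatility = data is more likely to be lost or overwritten first.
VOLATILITY = {"ram_only": 3, "flash_local": 2, "cloud_synced": 1}

@dataclass
class Device:
    name: str
    storage: str             # one of VOLATILITY's keys
    battery_powered: bool    # risk of power loss before imaging
    holds_target_data: bool  # does it plausibly hold the data sought?

def triage_score(d: Device) -> int:
    score = VOLATILITY.get(d.storage, 0)
    if d.battery_powered:
        score += 1
    if d.holds_target_data:
        score += 3
    return score

scene = [
    Device("smart speaker", "cloud_synced", False, True),
    Device("fitness tracker", "ram_only", True, False),
    Device("ip camera", "flash_local", False, True),
]
for dev in sorted(scene, key=triage_score, reverse=True):
    print(f"{triage_score(dev)}  {dev.name}")
```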
APA, Harvard, Vancouver, ISO, and other styles
29

Morris, Sarah Louise Angela. "An investigation into the identification, reconstruction, and evidential value of thumbnail cache file fragments in unallocated space." Thesis, Cranfield University, 2013. http://dspace.lib.cranfield.ac.uk/handle/1826/8034.

Full text
Abstract:
This thesis establishes the evidential value of thumbnail cache file fragments identified in unallocated space. A set of criteria for evaluating the evidential value of thumbnail cache artefacts was created by researching the evidential constraints present in Forensic Computing. The criteria were used to evaluate the evidential value of live system thumbnail caches and thumbnail cache file fragments identified in unallocated space. Thumbnail caches can contain visual thumbnails and associated metadata which may be useful to an analyst during an investigation; the information stored in the cache may provide information on the contents of files and any user or system behaviour which interacted with the file. There is a standard definition of the purpose of a thumbnail cache, but not of its structure or implementation; this research has shown that this has led to some thumbnail caches storing a variety of other artefacts, such as network place names. The growing interest in privacy and security has led to an increase in users attempting to remove evidence of their activities; information removed by the user may still be available in unallocated space. This research adapted popular methods for the identification of contiguous files to enable the identification of single cluster-sized fragments in Windows 7, Ubuntu, and Kubuntu. Of the four methods tested, none was able to identify each of the classifications without false positive results; this result led to the creation of a new approach which improved the identification of thumbnail cache file fragments. After the identification phase, further research was conducted into the reassembly of file fragments; this reassembly was based solely on the potential thumbnail cache file fragments and structural and syntactical information. In both the identification and reassembly phases of this research, image-only file fragments proved the most challenging, resulting in a potential area of continued future research. Finally, this research compared the evidential value of live system thumbnail caches with identified and reassembled fragments. It was determined that both types of thumbnail cache artefacts can provide unique information which may assist with a digital investigation. This research has produced a set of criteria for determining the evidential value of thumbnail cache artefacts; it has also identified the structure and related user and system behaviour of popular operating system thumbnail cache implementations. This research has also adapted contiguous file identification techniques to single fragment identification and has developed an improved method for thumbnail cache file fragment identification. Finally, this research has produced a proof-of-concept software tool for the automated identification and reassembly of thumbnail cache file fragments.
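As an aside on how such identification typically starts: Windows thumbcache_*.db entries begin with a known ASCII signature, so a first carving pass can simply scan unallocated clusters for candidate offsets. The sketch below assumes the "CMMM" entry magic and a 4 KiB cluster size; the structural and syntactical validation the thesis develops is exactly what this naive pass lacks.

```python
# Naive first-pass carver: scan a raw image of unallocated space for the
# "CMMM" entry magic used by Windows thumbcache_*.db files. Real
# identification needs structural/syntactical validation of each hit.
MAGIC = b"CMMM"
CLUSTER = 4096            # assumed cluster size

def find_candidates(image_path):
    offsets = []
    with open(image_path, "rb") as f:
        cluster_no = 0
        while True:
            data = f.read(CLUSTER)
            if not data:
                break
            pos = data.find(MAGIC)
            while pos != -1:
                offsets.append(cluster_no * CLUSTER + pos)
                pos = data.find(MAGIC, pos + 1)
            cluster_no += 1   # note: magics spanning cluster edges are missed
    return offsets

# candidates = find_candidates("unallocated.dd")
```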
APA, Harvard, Vancouver, ISO, and other styles
30

Homem, Irvin. "Towards Automation in Digital Investigations : Seeking Efficiency in Digital Forensics in Mobile and Cloud Environments." Licentiate thesis, Stockholms universitet, Institutionen för data- och systemvetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-130742.

Full text
Abstract:
Cybercrime and related malicious activity in our increasingly digital world has become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with various challenges. Some of the most prominent of these challenges include the increasing amounts of data and the diversity of digital evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are an interesting specimen, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not helped by the fact that digital forensics today still involves manual, time-consuming tasks within the processes of identifying evidence, performing evidence acquisition and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, the industry-standard tools developed are largely evidence-oriented, have limited support for evidence integration and only automate certain precursory tasks, such as indexing and text searching. In this study, efficiency, in the form of reducing the time and human labour expended, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture designed for an automated system that performs digital forensics in highly networked mobile and cloud environments. Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements proposed in order to further automate the architecture by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves machine-to-machine communication, supporting several digital investigation processes enabled by the architecture through harnessing the properties of various peer-to-peer overlays. Remote evidence acquisition helps to improve the efficiency (time and effort involved) in digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition and that a multi-TCP-connection paradigm is required.
Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication thereby enabling automation and reducing the need for manual human intervention.
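To make the multi-connection finding concrete, here is a minimal sketch of a range-based acquisition client that pulls a remote image over several parallel TCP streams. The host name, port and one-line wire protocol are invented for illustration and are not the protocol developed in the thesis.

```python
import socket
from concurrent.futures import ThreadPoolExecutor

HOST, PORT = "device.local", 9000      # hypothetical acquisition agent
CHUNK = 4 * 1024 * 1024                # 4 MiB per connection/request

def fetch(offset, length):
    # One TCP connection per chunk; the ASCII request line is invented.
    with socket.create_connection((HOST, PORT)) as s:
        s.sendall(f"{offset} {length}\n".encode())
        buf = bytearray()
        while len(buf) < length:
            part = s.recv(65536)
            if not part:
                break
            buf.extend(part)
        return offset, bytes(buf)

def acquire(total_size, out_path, workers=8):
    ranges = [(o, min(CHUNK, total_size - o)) for o in range(0, total_size, CHUNK)]
    with open(out_path, "wb") as out, ThreadPoolExecutor(workers) as pool:
        for offset, data in pool.map(lambda r: fetch(*r), ranges):
            out.seek(offset)
            out.write(data)

# acquire(total_size=64 * 1024**3, out_path="device.dd")
```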
APA, Harvard, Vancouver, ISO, and other styles
31

Mikus, Nicholas A. "An analysis of disc carving techniques." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2005. http://library.nps.navy.mil/uhtbin/hyperion/05Mar%5FMikus.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Dath, Catrin. "Crime scenes in Virtual Reality : A user centered study." Thesis, KTH, Medieteknik och interaktionsdesign, MID, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-209952.

Full text
Abstract:
A crime scene is a vital part of an investigation. There are, however, depending on the situation and crime, issues connected to physically being at the scene; risk of contamination, destruction of evidence or other issues can prevent the criminal investigators from staying at, visiting or revisiting the scene. It is therefore important to visually capture the crime scene and any possible evidence in order to aid the investigation. This thesis aims to, with an initial research question, map out the main visual documentation needs, wishes and challenges that criminal investigators face during an investigation. In addition, with a second research question, it aims to address these in a Virtual Reality (VR) design and, with a third research question, explore how other professions in the investigation process could benefit from it. This was conducted through a literature review, interviews, workshops and iterations with the approach of the Double Diamond Model of Design. The results from the interviews were thematically analyzed and ultimately summarized into five key themes. These, together with various design criteria and principles, acted as design guidelines when creating a high-fidelity VR design. The first two research questions were answered through the key themes and the VR design. The results of the third research question indicated that, besides criminal investigators, both prosecutors and crime scene investigators may benefit from a VR design, although in different ways. A VR design can, in conclusion, address the needs, wishes and challenges of criminal investigators by being developed as a compiled visualization and collaboration tool.
APA, Harvard, Vancouver, ISO, and other styles
33

Quan, Weize. "Detection of computer-generated images via deep learning." Thesis, Université Grenoble Alpes, 2020. http://www.theses.fr/2020GRALT076.

Full text
Abstract:
With the advances in image editing and generation software tools, it has become easier to tamper with the content of images or create new images, even for novices. These generated images, such as computer graphics (CG) images and colorized images (CI), have high-quality visual realism and potentially pose serious threats in many important scenarios. For instance, judicial departments need to verify that pictures are not produced by computer graphics rendering technology, colorized images can cause recognition/monitoring systems to produce incorrect decisions, and so on. Therefore, the detection of computer-generated images has attracted widespread attention in the multimedia security research community. In this thesis, we study the identification of different computer-generated images, including CG images and CIs; that is, identifying whether an image was acquired by a camera or generated by a computer program. The main objective is to design an efficient detector with high classification accuracy and good generalization capability. Specifically, we consider dataset construction, network architecture, training methodology, and visualization and understanding, for the considered forensic problems. The main contributions are: (1) a colorized image detection method based on negative sample insertion, (2) a generalization method for colorized image detection, (3) a method for the identification of natural images (NI) and CG images based on a CNN (Convolutional Neural Network), and (4) a CG image identification method based on the enhancement of feature diversity and adversarial samples.
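For readers unfamiliar with contribution (3), a camera-vs-CG detector is structurally just a small binary convolutional classifier. The PyTorch sketch below shows the shape of such a model; the layer sizes are illustrative and do not reproduce the architecture proposed in the thesis.

```python
# A deliberately small CNN of the kind used for camera-vs-CG binary
# classification; layer sizes are illustrative only.
import torch
import torch.nn as nn

class CGDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling -> 128 features
        )
        self.classifier = nn.Linear(128, 2)   # 0 = natural, 1 = computer-generated

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = CGDetector()
logits = model(torch.randn(4, 3, 224, 224))   # batch of 4 RGB crops
print(logits.shape)                           # torch.Size([4, 2])
```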
APA, Harvard, Vancouver, ISO, and other styles
34

Breitinger, Frank [Verfasser], Stefan [Akademischer Betreuer] Katzenbeisser, Harald [Akademischer Betreuer] Baier, and Felix [Akademischer Betreuer] Freiling. "On the utility of bytewise approximate matching in computer science with a special focus on digital forensics investigations / Frank Breitinger. Betreuer: Stefan Katzenbeisser ; Harald Baier ; Felix Freiling." Darmstadt : Universitäts- und Landesbibliothek Darmstadt, 2014. http://d-nb.info/1110901852/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Homem, Irvin. "LEIA: The Live Evidence Information Aggregator : A Scalable Distributed Hypervisor‐based Peer‐2‐Peer Aggregator of Information for Cyber‐Law Enforcement I." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-177902.

Full text
Abstract:
The Internet in its most basic form is a complex information-sharing organism. There are billions of interconnected elements with varying capabilities that work together supporting numerous activities (services) through this information sharing. In recent times, these elements have become portable, mobile, highly computationally capable and more than ever intertwined with human controllers and their activities. They are also rapidly being embedded into other everyday objects and sharing more and more information in order to facilitate automation, signaling that the rise of the Internet of Things is imminent. In every human society there are always miscreants who work against the common good and engage in illicit activity. It is no different within the society interconnected by the Internet (the Internet Society). Law enforcement in every society attempts to curb perpetrators of such activities. However, it is immensely difficult when the Internet is the playing field. The amount of information that investigators must sift through is incredibly massive and prosecution timelines stated by law are prohibitively narrow. The main solution to this Big Data problem is seen to be the automation of the digital investigation process. This encompasses the entire process: from the detection of malevolent activity, through the seizure/collection of evidence and the analysis of the evidentiary data collected, to the presentation of valid postulates. This paper focuses mainly on the automation of the evidence capture process in an Internet of Things environment. However, in order to comprehensively achieve this, the subsequent and consequent procedures of detection of malevolent activity and analysis of the evidentiary data collected, respectively, are also touched upon. To this effect we propose the Live Evidence Information Aggregator (LEIA) architecture that aims to be a comprehensive automated digital investigation tool. LEIA is in essence a collaborative framework that hinges upon interactivity and the sharing of resources and information among participating devices in order to achieve the necessary efficiency in data collection in the event of a security incident. It makes use of a variety of technologies to achieve its goals. This is seen in the use of crowdsourcing among devices in order to achieve more accurate malicious event detection; hypervisors with inbuilt intrusion detection capabilities to facilitate efficient data capture; peer-to-peer networks to facilitate rapid transfer of evidentiary data to a centralized data store; cloud storage to facilitate storage of massive amounts of data; and the Resource Description Framework from Semantic Web technologies to facilitate the interoperability of data storage formats among the heterogeneous devices. Within the description of the LEIA architecture, a peer-to-peer protocol based on the BitTorrent protocol is proposed, corresponding data storage and transfer formats are developed, and network security protocols are also taken into consideration. In order to demonstrate the LEIA architecture developed in this study, a small-scale prototype with limited capabilities has been built and tested. The prototype functionality focuses only on the secure, remote acquisition of the hard disk of an embedded Linux device over the Internet and its subsequent storage on a cloud infrastructure.
The successful implementation of this prototype goes to show that the architecture is feasible and that the automation of the evidence seizure process makes the otherwise arduous process easy and quick to perform.
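The Resource Description Framework component is easy to picture with a few triples. Below is a minimal sketch using the rdflib library; the case URI and the ex: vocabulary are invented for illustration and are not LEIA's actual data format.

```python
# Sketch of the RDF idea in LEIA: describe an acquired evidence item in a
# machine-readable, device-independent form. The ex: vocabulary is made up.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/evidence#")
g = Graph()
item = URIRef("http://example.org/case42/disk-image-1")
g.add((item, RDF.type, EX.DiskImage))
g.add((item, EX.sourceDevice, Literal("embedded-linux-router")))
g.add((item, EX.sha256, Literal("9f86d081884c7d659a2feaa0c55ad015...")))
g.add((item, EX.acquiredBy, Literal("hypervisor-agent-07")))
print(g.serialize(format="turtle"))
```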
APA, Harvard, Vancouver, ISO, and other styles
36

Grobler, Cornelia Petronella. "DFMF : a digital forensic management framework." Thesis, 2012. http://hdl.handle.net/10210/6365.

Full text
Abstract:
D.Phil.(Computer Science)
We are living in an increasingly complex world in which much of society is dependent on technology and its various offshoots and incarnations (Rogers & Siegfried, 2004). There is ample evidence of the influence of technology on our daily lives. We communicate via e-mail, use chat groups to interact and conduct business by using e-commerce. People relate each other's existence to a presence on Facebook. The convergence of the products, systems and services of information technology is changing the way of living. The latest smartphones and cell phones have cameras, applications, and access to social networking sites. These phones contain sensitive information, for example photographs, e-mail, spreadsheets, documents, and presentations. The loss of a cell phone may therefore pose a serious problem to an individual or an organisation, when considering privacy and intellectual property issues from an information security (Info Sec) perspective (Pieterse, 2006). Organisations have accepted the protection of information and information assets as a fundamental business requirement, and managers are therefore implementing an increasing number of security countermeasures, such as security policies, intrusion detection systems, access control mechanisms, and anti-virus products to protect the information and information assets from potential threats. However, incidents still occur, as no system is 100% secure. The incidents must be investigated to determine their root cause and potentially to prosecute the perpetrators (Louwrens, von Solms, Reeckie & Grobler, 2006b). Humankind has long been interested in the connection between cause and event, wishing to know what happened, what went wrong and why it happened. The need for computer forensics emerged when an increasing number of crimes were committed with the use of computers and the evidence required was stored on the computer. In 1984, a Federal Bureau of Investigation (FBI) laboratory began to examine computer evidence (Barayumureeba & Tushabe, 2004), and in 1991 the International Association of Computer Investigation Specialists (IACIS) in Portland, Oregon coined the term 'computer forensics' during a training session.
APA, Harvard, Vancouver, ISO, and other styles
37

Ndara, Vuyani. "Computer seizure as technique in forensic investigation." Diss., 2014. http://hdl.handle.net/10500/13277.

Full text
Abstract:
The problem encountered by the researcher was that the South African Police Service Cyber-Crimes Unit is experiencing problems in seizing computer evidence. The following problems were identified by the researcher in practice: evidence is destroyed or lost because of mishandling by investigators; computer evidence is often not obtained or recognised, due to a lack of knowledge and skills on the part of investigators to properly seize computer evidence; difficulties in establishing authenticity and initiating a chain of custody for the seized evidence; current training that is offered does not cover critical steps in seizing computer evidence; and computer seizure as a technique requires specialised knowledge and continuous training, because the information technology industry is an ever-changing area. An empirical research design, followed by a qualitative research approach, allowed the researcher to also obtain information from practice. A thorough literature study, complemented by interviews, was done to collect the required data for the research. Members of the South African Police Cyber-crime Unit and prosecutors dealing with cyber-crime cases were interviewed to obtain their input into, and experiences on, the topic. The aim of the study was to explore the role of computers in the forensic investigation process, and to determine how computers can be seized without compromising evidence. The study therefore also aimed at creating an understanding and awareness of the slippery nature of computer evidence, and how it can find its way to the court of law without being compromised. The research has revealed that computer crime is different from common law or traditional crimes. It is complicated, and therefore only skilled and qualified forensic experts should be used to seize computer evidence, to ensure that the evidence is not compromised. The training of cyber-crime technicians has to be a priority, in order to be successful in seizing computers.
Department of Criminology
M.Tech. (Forensic Investigation)
APA, Harvard, Vancouver, ISO, and other styles
38

Ngomane, Amanda Refiloe. "The use of electronic evidence in forensic investigation." Diss., 2010. http://hdl.handle.net/10500/4200.

Full text
Abstract:
For millions of people worldwide the use of computers has become a central part of life. Criminals are exploiting these technological advances for illegal activities. This growth of technology has therefore produced a completely new source of evidence, referred to as 'electronic evidence'. In light of this, the researcher focused on the collection of electronic evidence and its admissibility at trial. The study intends to assist and give guidance to investigators to collect electronic evidence properly and legally, and to ensure that it is admitted as evidence in court. Electronic evidence is fragile and volatile by nature and therefore requires the investigator always to exercise reasonable care during its collection, preservation and analysis to protect its identity and integrity. The legal requirements that the collected electronic evidence must satisfy for it to be admissible in court are relevance, reliability and authenticity. When presenting the evidence in court, the investigator should always keep in mind that the judges are not specialists in the computing environment and that therefore the investigator must be able to explain how the chain of custody was maintained during the collection, preservation and analysis of electronic evidence. The complex technology behind electronic evidence must be clearly explained so that the court is able to understand the evidence in a way that an ordinary person, or someone who has never used a computer before, can. This is because the court always relies on the expertise of the investigator to understand electronic evidence and make a ruling on matters related to it.
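As a concrete illustration of the integrity and chain-of-custody practice described above, a common pattern is to hash the evidence at acquisition and append every custody event to a log. The sketch below shows that generic pattern; the file names, fields and officer name are examples only, not a procedure prescribed by the study.

```python
# Hash evidence at each custody event and append the event to a log, so
# the chain of custody can later be demonstrated in court.
import hashlib, json, datetime

def sha256_file(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def log_custody_event(logfile, evidence_path, action, officer):
    event = {
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "evidence": evidence_path,
        "sha256": sha256_file(evidence_path),  # unchanged hash = unchanged data
        "action": action,                      # e.g. "acquired", "transferred"
        "officer": officer,
    }
    with open(logfile, "a") as f:
        f.write(json.dumps(event) + "\n")

# log_custody_event("custody.jsonl", "image.dd", "acquired", "Inspector M. Dlamini")
```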
Police Practice
M. Tech. (Forensic Investigation)
APA, Harvard, Vancouver, ISO, and other styles
39

"Liforac - A Model For Life Forensic Acquisition." Thesis, 2010. http://hdl.handle.net/10210/3438.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Adedayo, Oluwasola Mary. "Reconstruction in Database Forensics." Thesis, 2015. http://hdl.handle.net/2263/43777.

Full text
Abstract:
The increasing usage of databases in the storage of critical and sensitive information in many organizations has led to an increase in the rate at which databases are exploited in computer crimes. Databases are often manipulated to facilitate crimes and as such are usually of interest during many investigations, as useful information relevant to the investigation can be found therein. The branch of digital forensics that deals with the identification, preservation, analysis and presentation of digital evidence from databases is known as database forensics. Despite the large amount of information that can be retrieved from databases and the amount of research that has been done on various aspects of databases, database security and digital forensics in general, very little has been done on database forensics. Databases have also been excluded from traditional digital investigations until very recently. This can be attributed to the inherent complexities of databases and the lack of knowledge on how the information contained in the database can be retrieved, especially in cases where such information has been modified or existed only in the past. This thesis addresses one major part of the challenges in database forensics, which is the reconstruction of the information stored in the database at some earlier time. The dimensions involved in a database forensics analysis problem are identified, and the thesis focuses on one of these dimensions. Concepts such as the relational algebra log and the inverse relational algebra are introduced as tools in the definition of a theoretical framework that can be used for database forensics. The thesis provides an algorithm for database reconstruction and outlines the correctness proof of the algorithm. Various techniques for a complete regeneration of deleted or lost data during a database forensics analysis are also described. Due to the importance of having adequate logs in order to use the algorithm, specifications of an ideal log configuration for an effective reconstruction process are given, taking into consideration the various dimensions of the database forensics problem space. Throughout the thesis, practical situations that illustrate the application of the algorithms and techniques described are given. The thesis provides a scientific approach that can be used for handling database forensics analysis practice and research, particularly in the aspect of reconstructing the data in a database. It also adds to the field of digital forensics by providing insights into the field of database forensics reconstruction.
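The core idea of reconstruction via inverse operators can be illustrated with a toy write log: replaying logged operations in reverse restores an earlier state of a relation. The Python sketch below mirrors only the spirit of the approach; the thesis's relational algebra log and reconstruction algorithm are considerably richer than this.

```python
# Toy reconstruction by inverting a write log: undo the most recent
# operation first until the desired earlier state is reached.
def invert(state, op):
    kind, row = op
    if kind == "insert":          # inverse of insert is delete
        state.discard(row)
    elif kind == "delete":        # inverse of delete is (re)insert
        state.add(row)
    elif kind == "update":        # inverse of update restores the old row
        old, new = row
        state.discard(new)
        state.add(old)
    return state

current = {("alice", 500), ("bob", 900)}
log = [
    ("insert", ("bob", 700)),
    ("update", (("bob", 700), ("bob", 900))),
    ("insert", ("alice", 500)),
]
state = set(current)
for op in reversed(log):
    state = invert(state, op)
print(state)                      # set() — the relation was empty beforehand
```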
Thesis (PhD)--University of Pretoria, 2015.
Computer Science
PhD
Unrestricted
APA, Harvard, Vancouver, ISO, and other styles
41

Whelpton, Juliette. "The psychological effects experienced by computer forensic examiners working with child pornography." Diss., 2012. http://hdl.handle.net/10500/6217.

Full text
Abstract:
Convergence of technology has made access to the Internet faster, easier and cheaper. Criminals, including paedophiles, child abusers and pornography traders, make use of this technology to commit criminal offences. Computer Forensic Examiners (CFEs) are members of the Cyber Crime Unit, a professional, specialised unit of the South African Police Service (SAPS), who are responsible for computer forensic examination, including the investigation of child pornographic images. The aim of the study was to seek understanding of the psychological effects the CFEs experienced when working with the images, and it was conducted from within the social constructionist and narrative frameworks. The images had a severe impact on the CFEs, as was clearly uncovered in the stories of the six CFEs who participated in this study. The participants' stories were recorded and transcribed, after which thematic content analysis found that the participants all suffered similar negative effects. These findings were integrated with the findings of a focus group, as well as with the findings of a similar study conducted during the same time by the Crimes against Children Research Center at the University of New Hampshire, and resulted in identifying symptoms of trauma and stress experienced by the CFEs. Based on these results, recommendations regarding support for the CFEs were made.
Psychology
M.A. (Psychology)
APA, Harvard, Vancouver, ISO, and other styles
42

Themeli, Aluwani Rufaroh. "Exploring the value of computer forensics in the investigation of procurement fraud." Diss., 2017. http://hdl.handle.net/10500/22400.

Full text
Abstract:
The research problem for this study was that forensic investigators in the Forensic Services (FS) of the City of Tshwane (CoT) are unable to successfully deal with procurement fraud as a result of the lack of knowledge, skills and resources required to conduct computer forensics during the investigation of procurement fraud. This research was conducted to ascertain the value of computer forensics in the investigation of procurement fraud. Further, the study sought to determine how to improve the CoT forensic investigators’ knowledge and competence regarding the application of computer forensics in the investigation of procurement fraud. The purpose of this study was to explore the procedures that should be followed by CoT forensic investigators when conducting computer forensics during the investigation of procurement fraud. The research also aimed to discover new information, not previously known to the researcher, related to computer forensics during the investigation of procurement fraud by exploring national and international literature. In addition, the study explored existing practices so as to use this information to improve the current CoT procedure, within the confines of the legislative requirements. The overall purpose of this study is to provide practical recommendations for best practices, based on the results of the data analysis, which address the problem and enhance the investigative skills of CoT forensic investigators. The study established that it is imperative and compulsory to apply computer forensics in any procurement fraud investigation in order to efficiently track down cyber criminals and solve complicated and complex computer crimes. It was also established that forensic investigators within the FS in the CoT lack the necessary computer skills to optimally investigate procurement fraud. It is therefore recommended that CoT forensic investigators acquire the necessary skills and essential training in computer forensics in order to improve their knowledge and competence regarding the application and understanding of the value of computer forensics in the investigation of procurement fraud.
School of Criminal Justice
M.Tech. (Forensic Investigation)
APA, Harvard, Vancouver, ISO, and other styles
43

Pooe, El Antonio. "Developing a multidisciplinary digital forensic readiness model for evidentiary data handling." Thesis, 2018. http://hdl.handle.net/10500/25316.

Full text
Abstract:
There is a growing global recognition as to the importance of outlawing malicious computer related acts in a timely manner, yet few organisations have the legal and technical resources necessary to address the complexities of adapting criminal statutes to cyberspace. Literature reviewed in this study suggests that a coordinated, public-private partnership to produce a model approach can help reduce potential dangers arising from the inadvertent creation of cybercrime havens. It is against this backdrop that the study seeks to develop a digital forensic readiness model (DFRM) using a coordinated, multidisciplinary approach, involving both the public and private sectors, thus enabling organisations to reduce potential dangers arising from the inadvertent destruction and negating of evidentiary data which, in turn, results in the non-prosecution of digital crimes. The thesis makes use of 10 hypotheses to address the five research objectives, which are aimed at investigating the problem statement. This study constitutes qualitative research and adopts the post-modernist approach. The study begins by investigating each of the 10 hypotheses, utilising a systematic literature review and interviews, followed by a triangulation of findings in order to identify and explore common themes and strengthen grounded theory results. The output from the latter process is used as a theoretical foundation towards the development of a DFRM model which is then validated and verified against actual case law. Findings show that a multidisciplinary approach to digital forensic readiness can aid in preserving the integrity of evidentiary data within an organisation. The study identifies three key domains and their critical components. The research then demonstrates how the interdependencies between the domains and their respective components can enable organisations to identify and manage vulnerabilities which may contribute to the inadvertent destruction and negating of evidentiary data. The Multidisciplinary Digital Forensic Readiness Model (M-DiFoRe) provides a proactive approach to creating and improving organizational digital forensic readiness. This study contributes to the greater body of knowledge in digital forensics in that it reduces complexities associated with achieving digital forensic readiness and streamlines the handling of digital evidence within an organisation.
Information Science
Ph.D. (Information Systems)
APA, Harvard, Vancouver, ISO, and other styles
44

"Forensic Computing for Non-profits: A Case Study for Consideration when Non-profits Need to Determine if a Computer Forensic Investigation is Warranted." East Tennessee State University, 2004. http://etd-submit.etsu.edu/etd/theses/available/etd-0831104-124226/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Ncube, Njabulo. "Procedures for searching evidence in the investigation of computer-related crime in Bulawayo, Zimbabwe." Diss., 2015. http://hdl.handle.net/10500/21021.

Full text
Abstract:
Text in English
The continued advancement in myriad technological, societal and legal issues has affected the investigation of computer-aided crimes. Investigators are confronted with tremendous impediments, as computer-aided and traditional crime scenes differ. The study sought to analyse the procedures for searching evidence in the investigation of computer-related crime, with the intention of improving the admissibility of such evidence. The researcher employed an empirical design to reach conclusions based upon evidence collected from observations and real-life experiences. This aided the researcher in obtaining information through face-to-face interviews. The study was qualitative in approach, as it consisted of a set of interpretive and material practices that make the real social world visible. The training curriculum for investigators should include aspects of computer-related crime investigation, and the search and seizure of computer evidence. The search for and collection of computer-related evidence should preferably be done by qualified forensic experts, so that evidence is accepted in court.
Police Practice
M. Tech. (Forensic Investigation)
APA, Harvard, Vancouver, ISO, and other styles
46

Broucek, Vlastimil. "Forensic computing : exploring paradoxes : an investigation into challenges of digital evidence and implications for emerging responses to criminal, illegal and inappropriate on-line behaviours." Thesis, 2009. https://eprints.utas.edu.au/19282/7/whole_BroucekVlastimil2009_thesis%20excluding%20pub%20mat.pdf.

Full text
Abstract:
This research thesis explores technical, legal and organisational challenges of digital evidence and the implications of their inter-relationships for responses to criminal, illegal and inappropriate on-line behaviours. From a forensic computing perspective, the solutions to these challenges have tended to focus on discrete sets of technical, legal or organisational issues individually. Lack of understanding of the inter-relationships between these issues is inhibiting the development of integrated and coordinated solutions that can effectively balance requirements for the generation of legally admissible digital evidence, e-security and privacy. More significantly, this research highlights that the fragmented nature of these discrete approaches may be impairing the overall effectiveness of the responses developed. The methodological framework underpinning this exploratory research adopts a subjective ontology and employs an interpretative epistemology. The research strategy involves the examination of three cases on technical, legal and organisational challenges of digital evidence respectively. Each case is analysed independently, and the interpretation and discussion adopt a forensic computing perspective to interpret and discuss the inter-relationships across these areas and to explore the implications for digital evidence and the underlying problematic on-line behaviours. Case A examines the validity of quantitative data collected by running the network intrusion detection system (NIDS) SNORT on a university network. Case B examines an Australian Federal Court case illustrating legal arguments applied to digital evidence, its discovery and presentation. Case C examines the Cyber Tools On-line Search for Evidence (CTOSE) project, highlighting the difficulties of developing and implementing organisational-level processes for digital evidence handling. Analysis of Case A involves descriptive statistical analysis of network data and reveals significant problems with the validity and quality of the data. The results of the case analysis show that data collected by SNORT are not sufficient to track and trace the sources of the attacks. The analysis also reveals that the data sets collected may be flawed or erroneous, or may already have been tampered with. Despite significant fine-tuning, SNORT continued to generate numerous false positive alerts and/or wrongly identified sources of attacks. This case highlights that intrusion detection systems can play an important role in protecting information systems infrastructure, but to be effective they require the attention of highly trained security personnel/system administrators. These personnel also need to engage in regular monitoring and analysis of alerts and other log files, and to ensure regular updating of the rule sets used by these systems. Analysis of Case B reveals the impact of legal misconceptualisations about the nature of digital systems on court decisions and on the generation of legal precedents that have potentially broader social implications. The results of the analysis reveal serious flaws in understanding, amongst all participants in the case, of the nature of digital evidence and how it should best be collected, analysed and presented. More broadly, the judgement also appears to have worrying implications for individual privacy and data protection. Analysis of Case C highlights the practical challenges faced at the organisational level in the implementation of models and tools for digital evidence handling.
The analysis highlights that models and tools that have been developed for handling digital evidence are by their very nature and complexity highly problematic to adopt and utilise in organisational settings. A key element that continues to inhibit their use is the lack of early and comprehensive end-user education. The results from this case highlight the critical need for organisations to have greater 'forensic readiness' for dealing with criminal, illegal or inappropriate on-line behaviours.
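Case A's fine-tuning problem, separating signal from false positives in SNORT output, is essentially a log-aggregation task. A minimal sketch, assuming the classic single-line "fast" alert format, ranks (source, signature) pairs by alert volume; the regex would need adjusting for other SNORT output modes.

```python
# Aggregate SNORT fast-format alerts per (source IP, signature message)
# to judge signal-to-noise before deeper analysis.
import re
from collections import Counter

LINE = re.compile(
    r"\[\*\*\]\s+\[\d+:\d+:\d+\]\s+(?P<msg>.*?)\s+\[\*\*\].*?"
    r"(?P<src>\d{1,3}(?:\.\d{1,3}){3})(?::\d+)?\s+->\s+"
    r"(?P<dst>\d{1,3}(?:\.\d{1,3}){3})"
)

def top_sources(alert_file, n=10):
    counts = Counter()
    with open(alert_file) as f:
        for line in f:
            m = LINE.search(line)
            if m:
                counts[(m.group("src"), m.group("msg"))] += 1
    return counts.most_common(n)

# for (src, msg), hits in top_sources("alert.fast"):
#     print(f"{hits:6d}  {src:15s}  {msg}")
```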
APA, Harvard, Vancouver, ISO, and other styles
47

Joshi, Abhishek Shriram. "Image Processing and Super Resolution Methods for a Linear 3D Range Image Scanning Device for Forensic Imaging." 2013. http://hdl.handle.net/1805/3414.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
In the last few decades, forensic science has played a significant role in bringing criminals to justice. Shoe and tire track impressions found at the crime scene are important pieces of evidence since the marks and cracks on them can be uniquely tied to a person or vehicle respectively. We have designed a device that can generate a highly accurate 3-Dimensional (3D) map of an impression without disturbing the evidence. The device uses lasers to detect the changes in depth and hence it is crucial to accurately detect the position of the laser. Typically, the forensic applications require very high resolution images in order to be useful in prosecutions of criminals. Limitations of the hardware technology have led to the use of signal and image processing methods to achieve high resolution images. Super Resolution is the process of generating higher resolution images from multiple low resolution images using knowledge about the motion and the properties of the imaging geometry. This thesis presents methods for developing some of the image processing components of the 3D impression scanning device. In particular, the thesis describes the following two components: (i) methods to detect the laser stripes projected onto the impression surface in order to calculate the deformations of the laser stripes due to 3D surface shape being scanned, and (ii) methods to improve the resolution of the digitized color image of the impression by utilizing multiple overlapping low resolution images captured during the scanning process and super resolution techniques.
APA, Harvard, Vancouver, ISO, and other styles
48

Arthur, Kweku Kwakye. "Considerations towards the development of a forensic evidence management system." Diss., 2010. http://hdl.handle.net/2263/26567.

Full text
Abstract:
The decentralized nature of the Internet forms its very foundation, yet it is this very nature that has opened networks and individual machines to a host of threats and attacks from malicious agents. Consequently, forensic specialists - tasked with the investigation of crimes commissioned through the use of computer systems, where evidence is digital in nature - are often unable to adequately reach convincing conclusions pertaining to their investigations. Some of the challenges within reliable forensic investigations include the lack of a global view of the investigation landscape and the complexity and obfuscated nature of the digital world. A perpetual challenge within the evidence analysis process is the reliability and integrity associated with digital evidence, particularly from disparate sources. Given the ease with which digital evidence (such as metadata) can be created, altered, or destroyed, the integrity attributed to digital evidence is of paramount importance. This dissertation focuses on the challenges relating to the integrity of digital evidence within reliable forensic investigations. These challenges are addressed through the proposal of a model for the construction of a Forensic Evidence Management System (FEMS) to preserve the integrity of digital evidence within forensic investigations. The Biba Integrity Model is utilized to maintain the integrity of digital evidence within the FEMS. Casey's Certainty Scale is then employed as the integrity classification scheme for assigning integrity labels to digital evidence within the system. The FEMS model consists of a client layer, a logic layer and a data layer, with eight system components distributed amongst these layers. In addition to describing the FEMS system components, a finite state automaton is utilized to describe the system component interactions. In so doing, we reason about the FEMS's behaviour and demonstrate how rules within the FEMS can be developed to recognize and profile various cyber crimes. Furthermore, we design fundamental algorithms for the processing of information by the FEMS's core system components; this provides further insight into the system component interdependencies and the input and output parameters for the system transitions and decision-points influencing the value of inferences derived within the FEMS. Lastly, the completeness of the FEMS is assessed by comparing the constructs and operation of the FEMS against the published work of Brian D. Carrier. This approach provides a mechanism for critically analyzing the FEMS model, to identify similarities or impactful considerations within the solution approach, and more importantly, to identify shortcomings within the model. Ultimately, the greatest value of the FEMS is in its ability to serve as a decision support or enhancement system for digital forensic investigators.
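The Biba rules the FEMS borrows are simple to state in code: a subject may not read objects of lower integrity ("no read down") nor write objects of higher integrity ("no write up"). The toy sketch below uses numeric labels standing in for Casey's certainty levels (C0 lowest to C6 highest); the API and the mapping are illustrative, not the FEMS design itself.

```python
# Toy Biba enforcement over labelled evidence items.
class IntegrityViolation(Exception):
    pass

class EvidenceStore:
    def __init__(self):
        self.items = {}   # name -> (integrity_level, payload)

    def write(self, subject_level, name, payload, item_level):
        if item_level > subject_level:         # no write up
            raise IntegrityViolation(f"write up blocked: {name}")
        self.items[name] = (item_level, payload)

    def read(self, subject_level, name):
        item_level, payload = self.items[name]
        if item_level < subject_level:         # no read down
            raise IntegrityViolation(f"read down blocked: {name}")
        return payload

store = EvidenceStore()
store.write(subject_level=4, name="disk.img.sha256", payload="9f86...", item_level=4)
print(store.read(subject_level=2, name="disk.img.sha256"))  # ok: reading up
try:
    store.read(subject_level=6, name="disk.img.sha256")     # blocked: read down
except IntegrityViolation as e:
    print(e)
```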
Dissertation (MSc)--University of Pretoria, 2010.
Computer Science
unrestricted
APA, Harvard, Vancouver, ISO, and other styles
49

Botes, Christo. "Utilising advanced accounting software to trace the reintegration of proceeds of crime, from underground banking into the formal banking system." Diss., 2008. http://hdl.handle.net/10500/791.

Full text
Abstract:
The aim of this paper is to research how advanced accounting software can be used by police detectives, financial risk specialists and forensic investigation specialists, who are responsible for the investigation and tracing of the reintegration of proceeds of crime from underground banking into the formal banking system (proactive and reactive money laundering investigation), with a view to criminal prosecution. The research started off by looking at the basic ways in which proceeds of crime are smuggled before being integrated into the formal banking system. In that context, the phenomenon of underground banking was researched. Currency smuggling, Hawala currency transfer schemes and the way in which they are used to move proceeds of crime were discussed in detail. Thereafter, formal banking and the way in which proceeds of crime are reintegrated from underground banking structures into formal banking systems were discussed. The use of advanced accounting software to trace the point where proceeds of crime are reintegrated into formal banking was researched extensively. Accounting software and investigative techniques for tracing financial transactions which might be tainted with proceeds of crime were discussed. Accounting software which can be used on office computers such as laptops was discussed, and more advanced automated systems which can be used to trace proceeds-of-crime transactions in the formal banking systems were also discussed. Specifically, the investigative techniques on how to use these systems as investigative tools were discussed in great detail. This research paper gives a unique perspective on the financial investigative and analytical angle on proceeds of crime and money laundering detection.
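One family of screening rules such systems automate is structuring detection: repeated deposits kept just under a cash reporting threshold. The sketch below is a generic illustration only; the threshold, margin and window are placeholders, not figures from the study or from any statute.

```python
# Flag accounts with repeated deposits just under a reporting threshold
# within a short window — a classic structuring pattern.
from collections import defaultdict
from datetime import timedelta

THRESHOLD = 25000          # hypothetical reporting threshold
MARGIN = 0.10              # "just under" = within 10% below the threshold
WINDOW = timedelta(days=7)
MIN_HITS = 3

def flag_structuring(transactions):
    """transactions: iterable of (account, timestamp, amount)."""
    by_account = defaultdict(list)
    for acct, ts, amount in transactions:
        if THRESHOLD * (1 - MARGIN) <= amount < THRESHOLD:
            by_account[acct].append(ts)
    flagged = []
    for acct, times in by_account.items():
        times.sort()
        for i in range(len(times) - MIN_HITS + 1):
            if times[i + MIN_HITS - 1] - times[i] <= WINDOW:
                flagged.append(acct)
                break
    return flagged
```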
Criminal Justice
M.Tech. (Forensic Investigation)
APA, Harvard, Vancouver, ISO, and other styles
50

Bem, Derek. "On the genesis of computer forensis." Thesis, 2009. http://hdl.handle.net/1959.7/uws:49292.

Full text
Abstract:
This thesis presents a coherent set of research contributions to the new discipline of computer forensis. It analyses the emergence of computer forensis and defines challenges facing this discipline, carries forward research advances in conventional methodology, introduces a novel approach to using virtual environments in forensis, and systemises the computer forensis body of knowledge, leading to the establishment of a tertiary curriculum. The emergence of computer forensis as a separate discipline of science was triggered by the evolution and growth of computer crime. Computer technology reached a stage when a conventional, mechanistic approach to collecting and analysing data is insufficient: the existing methodology must be formalised, and embrace technologies and methods that will enable the inclusion of transient data and live systems analysis. Further work is crucial to incorporate advances in related disciplines like computer security and information systems audit, as well as developments in operating systems, to make computer forensics issues inherent in their design. For example, it is proposed that some of the features offered by persistent systems could be built into conventional operating systems to make illicit activities easier to identify and analyse. The analysis of permanent data storage is fundamental to computer forensics practice. There is very little finalised, and a lot still to be discovered, in the conventional computer forensics methodology. This thesis contributes to the formalisation and improved integrity of forensic handling of data storage by: formalising methods for data collection and analysis in an NTFS (Microsoft file system) environment; presenting a safe methodology for handling data backups in order to avoid information loss where Alternate Data Streams (ADS) are present; and formalising methods of hiding and extracting hidden and encrypted data. A significant contribution of this thesis is in the field of application of virtualisation, or simulation of the computer in the virtual environment created by the underlying hardware and software, to computer forensics practice. Computer systems are not easily analysed for forensic purposes, and it is demonstrated that virtualisation applied in computer forensics allows for more efficient and accurate identification and analysis of the evidence. A new method is proposed where two environments used in parallel can bring faster and verifiable results not dependent on proprietary, closed-source tools, and may lead to a gradual shift from commercial Windows software to open source software (OSS). The final contribution of this thesis is systemising the body of knowledge in computer forensics, which is a necessary condition for it to become an established discipline of science. This systemisation led to the design and development of a tertiary curriculum in computer forensics, illustrated here with a case study of the computer forensics major for the Bachelor of Computer Science at the University of Western Sydney. All genesis starts as an idea. A natural part of the scientific research process is replacing previous assumptions, concepts, and practices with new ones which better approximate the truth. This thesis advances the computer forensis body of knowledge in the areas which are crucial to further development of this discipline. Please note that the appendices to this thesis consist of separately published items which cannot be made available due to copyright restrictions. These items are listed in the PDF attachment for reference purposes.
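The ADS pitfall mentioned above is easy to demonstrate: NTFS alternate data streams are invisible to ordinary directory listings and are silently dropped by naive copies to non-NTFS media. On Windows, the documented FindFirstStreamW/FindNextStreamW APIs enumerate them; the ctypes sketch below is a minimal stream lister an examiner might use to check a file before backup.

```python
# Windows/NTFS only: enumerate a file's data streams, including ADS.
import ctypes
from ctypes import wintypes

class WIN32_FIND_STREAM_DATA(ctypes.Structure):
    _fields_ = [("StreamSize", ctypes.c_longlong),
                ("cStreamName", ctypes.c_wchar * 296)]   # MAX_PATH + 36

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.FindFirstStreamW.argtypes = [wintypes.LPCWSTR, ctypes.c_int,
                                      ctypes.c_void_p, wintypes.DWORD]
kernel32.FindFirstStreamW.restype = wintypes.HANDLE
kernel32.FindNextStreamW.argtypes = [wintypes.HANDLE, ctypes.c_void_p]
kernel32.FindNextStreamW.restype = wintypes.BOOL
kernel32.FindClose.argtypes = [wintypes.HANDLE]

INVALID_HANDLE_VALUE = wintypes.HANDLE(-1).value

def list_streams(path):
    """Return [(stream_name, size)] for the file, including any ADS."""
    data = WIN32_FIND_STREAM_DATA()
    h = kernel32.FindFirstStreamW(path, 0, ctypes.byref(data), 0)
    if h == INVALID_HANDLE_VALUE:
        return []
    streams = []
    try:
        while True:
            streams.append((data.cStreamName, data.StreamSize))
            if not kernel32.FindNextStreamW(h, ctypes.byref(data)):
                break
    finally:
        kernel32.FindClose(h)
    return streams

# list_streams(r"C:\evidence\report.doc") might return [("::$DATA", 31744),
# (":hidden.txt:$DATA", 52)] — the second entry would vanish in a naive copy.
```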
APA, Harvard, Vancouver, ISO, and other styles
