Journal articles on the topic 'Forensic engineering - Data processing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Forensic engineering - Data processing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Prakash, Vijay, Alex Williams, Lalit Garg, Claudio Savaglio, and Seema Bawa. "Cloud and Edge Computing-Based Computer Forensics: Challenges and Open Problems." Electronics 10, no. 11 (May 21, 2021): 1229. http://dx.doi.org/10.3390/electronics10111229.

Abstract:
In recent years, there has been a dramatic change in attitude towards computers and the use of computer resources in general. Cloud and Edge computing have emerged as the most widely used technologies, alongside fog computing and the Internet of Things (IoT). There are several benefits to exploiting Cloud and Edge computing paradigms, such as lower costs and higher efficiency: they provide data computation and storage where the data are processed, enable better data control, faster understanding and action, and continuous operation. However, appealing as these benefits seem, their effects on computer forensics are somewhat undesirable. The complexity of Cloud and Edge environments and their key features present many technical challenges for multiple stakeholders. This paper seeks to establish an in-depth understanding of the impact of Cloud and Edge computing-based environmental factors on the software and hardware tools used in the digital forensic process, as well as on forensic methods for handling tampered sound files, hidden files, and images with steganography. The technical/legal challenges and the open design problems (such as distributed maintenance, multitasking and practicality) highlight the various difficulties facing the digital forensics process.
2

Thilmany, Jean. "Working Backward." Mechanical Engineering 127, no. 06 (June 1, 2005): 36–38. http://dx.doi.org/10.1115/1.2005-jun-3.

Abstract:
This article reviews how reverse engineering is used in forensic investigation and historic preservation. Engineers across many disciplines find reverse engineering an invaluable tool to discover and learn about a product’s structure and design. A good forensic engineer will glean relevant information through meticulous investigation and by taking a reverse-engineering approach. Texas Tech University, the National Park Service, and the Historic American Buildings Survey are now creating digital architectural drawings to detail the 120-year-old statue’s every curve, cranny, and dimension. They are doing this through reverse engineering. The university is capturing the statue's unique architecture with three-dimensional laser scanning technology tied to geometry processing software, which automatically generates an accurate digital model from the scan data. To help align the scans and to fix the holes, the team turned to technology that creates surface models from scanned data. The software is Geomagic Studio, from Raindrop Geomagic of Research Triangle Park, NC.
3

Kim, Hyungchan, Sungbum Kim, Yeonghun Shin, Wooyeon Jo, Seokjun Lee, and Taeshik Shon. "Ext4 and XFS File System Forensic Framework Based on TSK." Electronics 10, no. 18 (September 20, 2021): 2310. http://dx.doi.org/10.3390/electronics10182310.

Abstract:
Recently, the number of Internet of Things (IoT) devices, such as artificial intelligence (AI) speakers and smartwatches, using a Linux-based file system has increased. Moreover, these devices are connected to the Internet and generate vast amounts of data. To manage these data efficiently and improve processing speed, functionality is enhanced by updating the file system version or by using new file systems, such as the Extended File System (XFS), the B-tree file system (Btrfs), or the Flash-Friendly File System (F2FS). However, in the process of updating an existing file system, the metadata structure may change, or analysis of a newly released file system may be insufficient, making it impossible for existing commercial tools to extract and restore deleted files. In an actual forensic investigation, when deleted files become unrecoverable, important clues may be missed, making it difficult to identify the culprit. Accordingly, a framework for extracting and recovering files based on The Sleuth Kit (TSK) is proposed by deriving the metadata changed in Ext4 file system journal checksum v3 and XFS file system v5. Thereafter, by comparing the accuracy and recovery rate of the proposed framework with those of existing commercial tools on an experimental dataset, we conclude that sustained research on file systems should be conducted from the perspective of forensics.
4

Chaves, Deisy, Eduardo Fidalgo, Enrique Alegre, Rocío Alaiz-Rodríguez, Francisco Jáñez-Martino, and George Azzopardi. "Assessment and Estimation of Face Detection Performance Based on Deep Learning for Forensic Applications." Sensors 20, no. 16 (August 11, 2020): 4491. http://dx.doi.org/10.3390/s20164491.

Abstract:
Face recognition is a valuable forensic tool for criminal investigators since it certainly helps in identifying individuals in scenarios of criminal activity like fugitives or child sexual abuse. It is, however, a very challenging task as it must be able to handle low-quality images of real-world settings and fulfill real-time requirements. Deep learning approaches for face detection have proven to be very successful, but they require large computation power and processing time. In this work, we evaluate the speed–accuracy tradeoff of three popular deep-learning-based face detectors on the WIDER Face and UFDD data sets in several CPUs and GPUs. We also develop a regression model capable of estimating the performance, both in terms of processing time and accuracy. We expect this to become a very useful tool for the end user in forensic laboratories in order to estimate the performance of different face detection options. Experimental results showed that the best speed–accuracy tradeoff is achieved with images resized to 50% of the original size in GPUs and images resized to 25% of the original size in CPUs. Moreover, performance can be estimated using multiple linear regression models with a Mean Absolute Error (MAE) of 0.113, which is very promising for the forensic field.
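The runtime-estimation idea in the abstract above can be sketched in a few lines. The data below are synthetic and the feature set (resize factor, image size) is a simplification chosen for illustration; this is not the authors' model or their WIDER Face / UFDD measurements.

```python
# Hedged sketch: estimating detector runtime with multiple linear regression,
# fitted by least squares on invented (synthetic) benchmark data.
import numpy as np

def fit_runtime_model(scale, megapixels, runtime):
    """Fit runtime ~ b0 + b1*scale + b2*megapixels by least squares."""
    X = np.column_stack([np.ones_like(scale), scale, megapixels])
    coef, *_ = np.linalg.lstsq(X, runtime, rcond=None)
    return coef

def predict(coef, scale, megapixels):
    return coef[0] + coef[1] * scale + coef[2] * megapixels

# Synthetic benchmark: runtime grows with resize factor and image size.
rng = np.random.default_rng(0)
scale = rng.uniform(0.25, 1.0, 200)    # resize factor (25%..100% of original)
mpix = rng.uniform(0.5, 8.0, 200)      # image size in megapixels
runtime = 0.05 + 0.40 * scale + 0.02 * mpix + rng.normal(0, 0.01, 200)

coef = fit_runtime_model(scale, mpix, runtime)
mae = np.mean(np.abs(predict(coef, scale, mpix) - runtime))
```

With such a model in hand, a practitioner can predict processing time for a new resize factor and hardware setup before running the detector, which is the kind of estimate the paper reports with an MAE of 0.113.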
5

Salamh, Fahad E., Mohammad Meraj Mirza, and Umit Karabiyik. "UAV Forensic Analysis and Software Tools Assessment: DJI Phantom 4 and Matrice 210 as Case Studies." Electronics 10, no. 6 (March 19, 2021): 733. http://dx.doi.org/10.3390/electronics10060733.

Abstract:
Unmanned Aerial Vehicles (UAVs), also known as drones, have created many challenges for the digital forensic field. These challenges arise in all processes of the digital forensic investigation (i.e., identification, preservation, examination, documentation, and reporting). From identification of evidence to reporting, there are several challenges caused by the data type, the source of evidence, and the multiple components that operate UAVs. In this paper, we comprehensively review current UAV forensic investigative techniques from several perspectives. Moreover, the contributions of this paper are as follows: (1) discovery of personally identifiable information, (2) testing and evaluation of currently available forensic software tools, (3) discussion of the data storage mechanism and evidence structure in two DJI UAV models (i.e., the Phantom 4 and Matrice 210), and (4) exploration of flight trajectories recovered from UAVs using three-dimensional (3D) visualization software. The aforementioned contributions aim to aid digital investigators in addressing the challenges posed by UAVs. In addition, we apply our testing, evaluation, and analysis to the two selected models, including the DJI Matrice 210, which has not been examined in previous works.
6

Keim, Yansi, Yung Han Yoon, and Umit Karabiyik. "Digital Forensics Analysis of Ubuntu Touch on PinePhone." Electronics 10, no. 3 (February 1, 2021): 343. http://dx.doi.org/10.3390/electronics10030343.

Abstract:
New smartphones made by small companies enter the technology market every day. These new devices introduce new challenges for mobile forensic investigators, as they end up becoming pertinent evidence during an investigation. One such device is the PinePhone from Pine Microsystems (Pine64). These devices are sometimes also shipped with OSes that are developed by open source communities and are otherwise never seen by investigators. Ubuntu Touch is one of these OSes and is currently being developed for deployment on the PinePhone. There is little research on either the device or the OS regarding what methodology an investigator should follow to reliably and accurately extract data. This results in potentially flawed methodologies being used before any testing can occur and contributes to the backlog of devices that need to be processed. Therefore, in this paper, the first forensic analysis of the PinePhone device with the Ubuntu Touch OS is performed using Autopsy, an open source tool, to establish a framework that can be used to examine and analyze devices running the Ubuntu Touch OS. The findings include analysis of artifacts that could impact user privacy and data security, the organization of file storage, app storage, the OS, etc. Moreover, the locations within the device that store call logs, SMS messages, images, and videos are reported. Interesting findings include forensic artifacts that could be useful to investigators in understanding user activity and attribution. This research will provide a roadmap for digital forensic investigators to conduct their investigations efficiently and effectively when they have the Ubuntu Touch OS and/or the PinePhone as an evidence source.
7

Samsuryadi, Samsuryadi, Rudi Kurniawan, and Fatma Susilawati Mohamad. "Automated handwriting analysis based on pattern recognition: a survey." Indonesian Journal of Electrical Engineering and Computer Science 22, no. 1 (April 1, 2021): 196. http://dx.doi.org/10.11591/ijeecs.v22.i1.pp196-206.

Abstract:
Handwriting analysis has a wide scope, including recruitment, medical diagnosis, forensics, psychology, and human-computer interaction. Computerized handwriting analysis makes it easy to recognize human personality and can help graphologists to understand and identify it. The features of handwriting are used as input to classify a person’s personality traits. This paper discusses handwriting analysis from a pattern recognition point of view, in which the different stages are described. The stages studied are data collection and pre-processing techniques, feature extraction with associated personality characteristics, and the classification model. The purpose of this paper is therefore to present a review of the methods, and their achievements, used in the various stages of a pattern recognition system.
8

Moldovan, Nicanor I., and Mauro Ferrari. "Prospects for Microtechnology and Nanotechnology in Bioengineering of Replacement Microvessels." Archives of Pathology & Laboratory Medicine 126, no. 3 (March 1, 2002): 320–24. http://dx.doi.org/10.5858/2002-126-0320-pfmani.

Abstract:
Context.—Due to its anticipated curative potential, therapeutic angiogenesis recently became a major preoccupation for the biomedical research community. Most of the related work reported to date employs either biochemical or genetic tools. Objective.—To identify opportunities for application of the current developments in microtechnology and nanotechnology to the field of therapeutic angiogenesis. Data Sources.—Survey of recent English-language literature on microvascular tissue engineering in the context of therapeutic angiogenesis. We include our results regarding the role played by microtopographical cues in the progression of angiogenesis, such as those produced during processing of the extracellular matrix by chronic inflammatory cells. Conclusion.—While notable accomplishments have been identified in the field of tissue engineering of larger vessels, reports on purposeful assembly of microvascular structures with the ability to be transferred in vivo by implantation are still scarce. Under these circumstances, we suggest the development of a new class of implantable biomedical microdevices, that is, “angiogenesis assist devices” (or “angiochips”), and we indicate some of their conceivable applications.
9

Amin, Muhammad Sadiq, Siddiqui Muhammad Yasir, and Hyunsik Ahn. "Recognition of Pashto Handwritten Characters Based on Deep Learning." Sensors 20, no. 20 (October 17, 2020): 5884. http://dx.doi.org/10.3390/s20205884.

Abstract:
Handwritten character recognition is increasingly important in a variety of automation fields, for example, authentication of bank signatures, identification of ZIP codes on letter addresses, and forensic evidence. Despite improved object recognition technologies, Pashto handwritten character recognition (PHCR) remains largely unsolved due to the presence of many enigmatic handwritten characters, enormously cursive Pashto characters, and a lack of research attention. We propose a convolutional neural network (CNN) model for recognition of Pashto handwritten characters for the first time in an unrestricted environment. Firstly, a novel Pashto handwritten character data set, “Poha”, for 44 characters is constructed. For preprocessing, deep fusion image processing techniques and noise reduction for text optimization are applied. A CNN model optimized in the number of convolutional layers and their parameters outperformed common deep models in terms of accuracy. Moreover, a set of popular benchmark CNN models applied to Poha is evaluated and compared with the proposed model. The obtained experimental results show that the proposed model is superior to other models, with a test accuracy of 99.64 percent for PHCR. The results indicate that our model may be a strong candidate for handwritten character recognition and automated PHCR applications.
10

Vandenabeele, Peter, and Jan Jehlička. "Mobile Raman spectroscopy in astrobiology research." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 372, no. 2030 (December 13, 2014): 20140202. http://dx.doi.org/10.1098/rsta.2014.0202.

Abstract:
Raman spectroscopy has proved to be a very useful technique in astrobiology research. In particular, working with mobile instrumentation during fieldwork can provide useful experiences in this field. In this work, we provide an overview of some important aspects of this research and, apart from defining different types of mobile Raman spectrometers, we highlight different reasons for this research. These include gathering experience with and testing mobile instruments, the selection of target molecules, and the development of optimal data processing techniques for the identification of the spectra. We also identify the analytical techniques that would be most appropriate to combine with Raman spectroscopy to maximize the information obtained, and the synergy that exists with Raman spectroscopy research in other areas, such as archaeometry and forensics.
11

Sherrod, Laura, Heather Willever, Kim Shollenberger, Corey Potter, Roger Thorne, and Ann Kline. "Geophysical Investigations of United States Revolutionary War Era (1777–1778) Mass Burial Sites in Pennsylvania, USA." Journal of Environmental and Engineering Geophysics 25, no. 4 (December 2020): 477–96. http://dx.doi.org/10.32389/jeeg20-023.

Abstract:
The United States Revolutionary War (1775–1783) resulted in numerous mass burials in the eastern United States, with deaths resulting not only directly from the battles fought, but also from disease, starvation, and exposure. Current information relating to these mass burials is often gathered from myths and rumors, leaving the historical truth of that time period ambiguous. Geophysical techniques are increasingly utilized in archaeological and forensic studies to locate unmarked burials. GPR, magnetics, and electrical resistivity have been used to successfully identify burial locations around the world in a non-invasive manner. This paper aims to illustrate how different burials of the US Revolutionary War period can be detected and characterized with geophysics, contributing to a better historical understanding of that time period, as well as optimizing equipment instrumentation and processing procedures for such targeted investigations. Three case studies of Revolutionary War Era mass burial sites in Pennsylvania, USA are described here: the Paoli Battlefield Memorial, the Old Charlestown Cemetery, and Saint Peter's Church in the Great Valley. These sites are within 9 km of each other and have historic records that suggest mass burials during this period. Results show GPR to provide the most useful data overall, with supporting information gathered from the supplemental geophysical techniques of magnetometry and resistivity. 2D profiles tend to provide a more valuable image of the subsurface than 3D slices. Larger burials leave a greater footprint and have a higher chance of causing a geophysical disturbance that can be measured more than 200 years after the burial. Soil moisture content and vegetation type can affect the quality of results. The study demonstrates both the challenges and the usefulness of geophysical techniques for locating and characterizing mass burials of this time period.
12

Korniiko, S. M. "CONTENTS AND SYSTEM OF EXPERT ACTIVITY IN THE FIELD OF COMPUTER TECHNOLOGIES: DEFINITION PROBLEMS." Actual problems of native jurisprudence, no. 4 (August 30, 2019): 158–62. http://dx.doi.org/10.15421/391934.

Abstract:
The article is devoted to defining the content and the system of expert activity in the field of computer technologies, based on the results of determining the general system of expert activity. Expert activity should be understood as the carrying out, by authorized agents on the basis of special knowledge in the fields of science, technology, art, crafts, etc., of studies of objects, phenomena and processes in order to provide scientifically substantiated conclusions on the diverse issues that arise in the life of society. Such a definition of expert activity includes both judicial and non-judicial expert examination. At present, more than 500 laws have been adopted in Ukraine that in one way or another concern the conduct of expert assessments (most of them remained in force as of 2019), but none of these laws is directly devoted to expert work in the field of computer technology. The system and content of expert work in the field of computer technology should therefore be established on the basis of knowledge of the object of expertise: computer technology. It is considered synonymous with the concepts of “information technology” and “information and communication technologies”. Information technology is a purposefully organized set of information processes using computer facilities, which provide high-speed data processing, rapid information search, dispersal of data, and access to information sources regardless of their location. The system of expert activity in the field of computer technologies includes examinations belonging to the judicial group (engineering, commodity, forensic, etc.) and the non-judicial group (scientific and scientific-technical expertise; examination of the quality and conformity of goods (products) to certain requirements; examination of information security issues; examination of environmental impact and the human living environment, etc.), as well as different kinds and types of examinations that have different goals, focus on the study of computer technology in its various aspects, and provide solutions to diverse issues.
13

Rodriguez-Ortega, Yohanna, Dora M. Ballesteros, and Diego Renza. "Copy-Move Forgery Detection (CMFD) Using Deep Learning for Image and Video Forensics." Journal of Imaging 7, no. 3 (March 20, 2021): 59. http://dx.doi.org/10.3390/jimaging7030059.

Abstract:
With the exponential growth of high-quality fake images in social networks and media, it is necessary to develop recognition algorithms for this type of content. One of the most common types of image and video editing consists of duplicating areas of the image, known as the copy-move technique. Traditional image processing approaches manually look for patterns related to the duplicated content, limiting their use in mass data classification. In contrast, approaches based on deep learning have shown better performance and promising results, but they present generalization problems, with a high dependence on training data and the need for appropriate selection of hyperparameters. To overcome this, we propose two deep learning approaches: a model with a custom architecture and a model based on transfer learning. In each case, the impact of the depth of the network is analyzed in terms of precision (P), recall (R) and F1 score. Additionally, the problem of generalization is addressed with images from eight different open-access datasets. Finally, the models are compared in terms of evaluation metrics and training and inference times. The transfer learning model based on VGG-16 achieves metrics about 10% higher than the custom-architecture model; however, it requires approximately twice as much inference time as the latter.
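As a concrete illustration of the "traditional" block-matching approach that the abstract above contrasts with deep learning, here is a minimal exact-match sketch. It is a hypothetical toy, not the authors' models: practical detectors match robust block features (e.g., DCT coefficients) rather than raw pixel bytes, so they also catch compressed or slightly altered copies.

```python
# Toy copy-move detection: report pairs of distinct block positions whose
# pixel contents are byte-for-byte identical.
import numpy as np

def find_copy_move(img, block=8, stride=8):
    """Return pairs of distinct top-left positions with identical blocks."""
    seen = {}
    pairs = []
    h, w = img.shape
    for y in range(0, h - block + 1, stride):
        for x in range(0, w - block + 1, stride):
            key = img[y:y + block, x:x + block].tobytes()
            if key in seen:
                pairs.append((seen[key], (y, x)))
            else:
                seen[key] = (y, x)
    return pairs

rng = np.random.default_rng(1)
img = rng.integers(0, 256, (64, 64), dtype=np.uint8)
clean = find_copy_move(img)          # random image: duplicates are vanishingly unlikely
img[32:48, 32:48] = img[0:16, 0:16]  # simulate a copy-move forgery
matches = find_copy_move(img)        # the four 8x8 blocks of the copied region
```

Exact matching is why such methods struggle at scale and with post-processing, which motivates the learned detectors the paper proposes.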
14

Demertzis, Konstantinos, Konstantinos Tsiknas, Dimitrios Takezis, Charalabos Skianis, and Lazaros Iliadis. "Darknet Traffic Big-Data Analysis and Network Management for Real-Time Automating of the Malicious Intent Detection Process by a Weight Agnostic Neural Networks Framework." Electronics 10, no. 7 (March 25, 2021): 781. http://dx.doi.org/10.3390/electronics10070781.

Abstract:
Attackers are perpetually modifying their tactics to avoid detection and frequently leverage legitimate credentials with trusted tools already deployed in a network environment, making it difficult for organizations to proactively identify critical security risks. Network traffic analysis products have emerged in response to attackers’ relentless innovation, offering organizations a realistic path forward for combatting creative attackers. Additionally, thanks to the widespread adoption of cloud computing, Development and Operations (DevOps) processes, and the Internet of Things (IoT), maintaining effective network visibility has become a highly complex and overwhelming process. What makes network traffic analysis technology particularly meaningful is its ability to combine its core capabilities to deliver malicious intent detection. In this paper, we propose a novel darknet traffic analysis and network management framework that automates the malicious intent detection process in real time, using a weight agnostic neural networks architecture. It is an effective and accurate computational intelligent forensics tool for network traffic analysis, the demystification of malware traffic, and encrypted traffic identification in real time. Based on the weight agnostic neural networks (WANNs) methodology, we propose an automated strategy for searching neural net architectures that can perform various tasks, such as identifying zero-day attacks. By automating the malicious intent detection process from the darknet, the proposed solution reduces the skills and effort barrier that prevents many organizations from effectively protecting their most critical assets.
15

Książek, Kamil, Michał Romaszewski, Przemysław Głomb, Bartosz Grabowski, and Michał Cholewa. "Blood Stain Classification with Hyperspectral Imaging and Deep Neural Networks." Sensors 20, no. 22 (November 21, 2020): 6666. http://dx.doi.org/10.3390/s20226666.

Abstract:
In recent years, growing interest in deep learning neural networks has raised a question on how they can be used for effective processing of high-dimensional datasets produced by hyperspectral imaging (HSI). HSI, traditionally viewed as being within the scope of remote sensing, is used in non-invasive substance classification. One of the areas of potential application is forensic science, where substance classification on the scenes is important. An example problem from that area—blood stain classification—is a case study for the evaluation of methods that process hyperspectral data. To investigate the deep learning classification performance for this problem we have performed experiments on a dataset which has not been previously tested using this kind of model. This dataset consists of several images with blood and blood-like substances like ketchup, tomato concentrate, artificial blood, etc. To test both the classic approach to hyperspectral classification and a more realistic application-oriented scenario, we have prepared two different sets of experiments. In the first one, Hyperspectral Transductive Classification (HTC), both a training and a test set come from the same image. In the second one, Hyperspectral Inductive Classification (HIC), a test set is derived from a different image, which is more challenging for classifiers but more useful from the point of view of forensic investigators. We conducted the study using several architectures like 1D, 2D and 3D convolutional neural networks (CNN), a recurrent neural network (RNN) and a multilayer perceptron (MLP). The performance of the models was compared with baseline results of Support Vector Machine (SVM). We have also presented a model evaluation method based on t-SNE and confusion matrix analysis that allows us to detect and eliminate some cases of model undertraining. 
Our results show that in the transductive case, all models, including the MLP and the SVM, have comparable performance, with no clear advantage of deep learning models. The Overall Accuracy range across all models is 98–100% for the easier image set, and 74–94% for the more difficult one. However, in the more challenging inductive case, selected deep learning architectures offer a significant advantage; their best Overall Accuracy is in the range of 57–71%, improving the baseline set by the non-deep models by up to 9 percentage points. We present a detailed analysis of the results and a discussion, including a summary of conclusions for each tested architecture. An analysis of per-class errors shows that the score for each class is highly model-dependent. Considering this, and the fact that the best-performing models come from two different architecture families (3D CNN and RNN), our results suggest that tailoring the deep neural network architecture to hyperspectral data is still an open problem.
16

Gabryś, Marta, and Łukasz Ortyl. "Georeferencing of Multi-Channel GPR—Accuracy and Efficiency of Mapping of Underground Utility Networks." Remote Sensing 12, no. 18 (September 11, 2020): 2945. http://dx.doi.org/10.3390/rs12182945.

Abstract:
Due to its capability for non-destructive testing of inaccessible objects, GPR (Ground Penetrating Radar) is used in geology, archeology, forensics and, increasingly, in engineering tasks. The wide range of applications of the GPR method has been enabled by the advanced technological solutions of equipment manufacturers, including multi-channel units. Acquiring data along several profiles simultaneously saves time and collects quasi-continuous information about the subsurface situation. One of the most important aspects of data acquisition systems, including GPR, is the appropriate methodology and accuracy of georeferencing. This publication discusses the results of GPR measurements carried out using the multi-channel Leica Stream C GPR (IDS GeoRadar Srl, Pisa, Italy). Significant results of a test measurement are presented, the aim of which was to determine the achievable accuracy depending on the georeferencing method: a GNSS (Global Navigation Satellite System) receiver, optionally supported by PPS (Pulse Per Second) time synchronization, or a total station. Optimization of the methodology was another important aspect of the study: the effect of dynamic changes in the motion trajectory on the positioning accuracy of echograms and their vectorization products was also examined. The standard algorithms developed for the dedicated software were used for post-processing of the coordinates and filtration of echograms, while the vectorization was done manually. The obtained results provided the basis for comparing the material collected in urban conditions with the available cartographic data, in terms of verifying the actual location of underground utilities. The urban character of the area limited the movement of the Leica Stream C due to the large size of the instrument; however, it created the opportunity for additional analyses, including the accuracy of different location variants around high-rise buildings and the agreement of the amplitude distribution at the intersection of perpendicular profiles.
17

Jiang, Chao, Jinlin Wang, and Yang Li. "An Efficient Indexing Scheme for Network Traffic Collection and Retrieval System." Electronics 10, no. 2 (January 15, 2021): 191. http://dx.doi.org/10.3390/electronics10020191.

Abstract:
Historical network traffic retrieval, at both the packet and flow level, has been applied in many fields of network security, such as network traffic analysis and network forensics. To retrieve specific packets from a vast number of packet traces, building indexes for the query attributes is an effective solution. However, packet indexing brings challenges of storage consumption and construction time overhead. To address these challenges, we propose an efficient indexing scheme called IndexWM, based on the wavelet matrix data structure, for packet indexing. Moreover, we design a packet storage format based on the PcapNG format for our network traffic collection and retrieval system, which can speed up the extraction of index data from packet traces. Offline experiments on randomly generated network traffic and actual network traffic were performed to evaluate the performance of the proposed indexing scheme. We chose an open-source and widely used bitmap indexing scheme, FastBit, for comparison. Apart from its native bitmap compression method, Word-Aligned Hybrid (WAH), we implemented an efficient bitmap compression method, Scope-Extended COMPAX (SECOMPAX), in FastBit for performance evaluation. The comparison results show that our scheme outperforms the selected bitmap indexing schemes in terms of time consumption, storage consumption and retrieval efficiency.
APA, Harvard, Vancouver, ISO, and other styles
18

Jonson, C. S. L. "Amphetamine profiling — improvements of data processing." Forensic Science International 69, no. 1 (November 1994): 45–54. http://dx.doi.org/10.1016/0379-0738(94)90048-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Pu, Wenjing. "Standardized Data Processing." Transportation Research Record: Journal of the Transportation Research Board 2338, no. 1 (January 2013): 44–57. http://dx.doi.org/10.3141/2338-06.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Tibbitts, J., and Yibin Lu. "Forensic applications of signal processing." IEEE Signal Processing Magazine 26, no. 2 (March 2009): 104–11. http://dx.doi.org/10.1109/msp.2008.931099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Wiest, Joachim, Aldric Namias, Cornelia Pfister, Peter Wolf, Franz Demmel, and Martin Brischwein. "Data Processing in Cellular Microphysiometry." IEEE Transactions on Biomedical Engineering 63, no. 11 (November 2016): 2368–75. http://dx.doi.org/10.1109/tbme.2016.2533868.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Pahlm, Olle, and Leif Sornmo. "Data Processing of Exercise ECG's." IEEE Transactions on Biomedical Engineering BME-34, no. 2 (February 1987): 158–65. http://dx.doi.org/10.1109/tbme.1987.326040.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Passot, X., and O. La Marle. "CNES in GAIA data processing." EAS Publications Series 45 (2010): 89–94. http://dx.doi.org/10.1051/eas/1045014.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Busso, G. "The Gaia Photometric Data Processing." EAS Publications Series 45 (2010): 381–84. http://dx.doi.org/10.1051/eas/1045064.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Köhnemann, S., J. Kretzschmann, K. Mittmann, and H. Pfeiffer. "Import and direct processing of capillary electrophoresis analysis data in forensic casework." Forensic Science International: Genetics Supplement Series 3, no. 1 (December 2011): e459-e460. http://dx.doi.org/10.1016/j.fsigss.2011.09.091.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Lee, Loong Chuen, and Abdul Aziz Jemain. "An overview of PCA application strategy in processing high dimensionality forensic data." Microchemical Journal 169 (October 2021): 106608. http://dx.doi.org/10.1016/j.microc.2021.106608.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Gomez, A., A. Boronat, J. A. Carsi, I. Ramos, C. Taubner, and S. Eckstein. "Biological Data Processing using Model Driven Engineering." IEEE Latin America Transactions 6, no. 4 (August 2008): 324–31. http://dx.doi.org/10.1109/tla.2008.4815285.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Ullman, David S., and David Hebert. "Processing of Underway CTD Data." Journal of Atmospheric and Oceanic Technology 31, no. 4 (April 1, 2014): 984–98. http://dx.doi.org/10.1175/jtech-d-13-00200.1.

Full text
Abstract:
Abstract A processing methodology for computation of accurate salinity from measurements with an underway CTD (UCTD) is presented. The UCTD is a rapidly profiling sensor package lacking a pump that relies on instrument motion to produce flow through the conductivity cell. With variable instrument descent rate, the flow through the cell is not constant, and this has important implications for the processing. As expected, the misalignment of the raw temperature and conductivity is found to be a function of the instrument descent rate. Application of a constant temporal advance of conductivity or temperature as is done with pumped CTDs is shown to produce unacceptable salinity spiking. With the descent rate of the UCTD reaching upwards of 4 dbar s⁻¹, the effect of viscous heating of the thermistor is shown to produce a significant salinity error of up to 0.005 psu, and a correction based on previous laboratory work is applied. Correction of the error due to the thermal mass of the conductivity cell is achieved using a previously developed methodology with the correction parameters varying with instrument descent rate. Comparison of salinity from the UCTD with that from a standard shipboard, pumped CTD in side-by-side deployments indicates that the processed UCTD salinity is accurate to better than 0.01 psu.
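The alignment step described above — advancing conductivity by a lag that varies with descent rate — can be sketched as follows; the inverse-rate lag model, the constants, and the sampling interval are illustrative assumptions, not the paper's fitted correction:

```python
import numpy as np

def align_conductivity(cond, descent_rate, lag_at_unit_rate=0.35, dt=0.0625):
    """Advance the conductivity series by a descent-rate-dependent lag.
    The inverse-rate lag model and all constants here are illustrative
    placeholders, not the correction fitted in the paper."""
    lag_seconds = lag_at_unit_rate / max(descent_rate, 0.1)
    shift = int(round(lag_seconds / dt))  # lag expressed in samples
    # np.roll wraps values around the end; real processing would trim the tail.
    return np.roll(cond, -shift)

cond = np.arange(10.0)                   # synthetic conductivity samples
aligned = align_conductivity(cond, 1.0)  # 1 dbar/s descent -> 6-sample advance
print(aligned[0])  # 6.0
```

After alignment, temperature and conductivity refer to the same water parcel, which is what suppresses the salinity spiking the abstract describes.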
APA, Harvard, Vancouver, ISO, and other styles
29

Teller, J., F. Ozguner, and R. Ewing. "Data processing through optical interfaces." IEEE Aerospace and Electronic Systems Magazine 24, no. 10 (October 2009): 42–43. http://dx.doi.org/10.1109/maes.2009.5317786.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Hwang, J. S., K. C. Chang, and G. C. Lee. "Modified Frequency‐Domain Data Processing." Journal of Engineering Mechanics 115, no. 10 (October 1989): 2333–39. http://dx.doi.org/10.1061/(asce)0733-9399(1989)115:10(2333).

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Fowler, Kim. "Data processing in measurement instrumentation." IEEE Instrumentation and Measurement Magazine 9, no. 6 (December 2006): 36–42. http://dx.doi.org/10.1109/mim.2006.250649.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Sorokin, V. I. "Use of Data Processing Technologies." Chemistry and Technology of Fuels and Oils 40, no. 2 (March 2004): 85–89. http://dx.doi.org/10.1023/b:cafo.0000028952.73078.91.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Liu, Chang Hong, Run Yang Zhong, Yi Er Yan, and Xiao Hu. "CEP-Based Massive Data Processing Approach for RFID Data." Advanced Materials Research 317-319 (August 2011): 350–53. http://dx.doi.org/10.4028/www.scientific.net/amr.317-319.350.

Full text
Abstract:
Complex Event Processing (CEP) is proposed in this paper to tackle the changeability, temporal correlation, and association in massive data processing. The paper takes RFID (Radio Frequency Identification) data as an example to illustrate how CEP deals with massive data processing. First, semantic expressions are introduced as the basis of modeling for CEP. Then, a model solution based on semantic expressions is proposed. Finally, this methodology achieves good results in processing massive RFID data in terms of speed, efficiency and veracity. The experimental results demonstrate that, compared with traditional database-based management of massive data, the approach copes better with complex data.
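A sequence pattern over a temporal window is the classic CEP building block this abstract alludes to. The sketch below is a minimal illustration under invented assumptions — the event tuples, reader names, and window semantics are hypothetical, not the paper's model:

```python
def detect_sequence(events, first_reader, second_reader, window):
    """Minimal CEP-style sequence pattern: emit a complex event whenever a
    tag read at `first_reader` is followed by a read of the same tag at
    `second_reader` within `window` time units.
    Each event is a (timestamp, reader, tag_id) tuple."""
    pending = {}   # tag_id -> timestamp of latest first_reader read
    matches = []
    for ts, reader, tag in events:
        if reader == first_reader:
            pending[tag] = ts
        elif reader == second_reader and tag in pending:
            if ts - pending[tag] <= window:
                matches.append((tag, pending[tag], ts))
            del pending[tag]
    return matches

events = [(0, "A", "t1"), (3, "B", "t1"), (5, "A", "t2"), (20, "B", "t2")]
print(detect_sequence(events, "A", "B", window=10))  # [('t1', 0, 3)]
```

Production CEP engines generalize this single pattern to declarative rule sets evaluated incrementally over the stream, which is what makes them suitable for high-volume RFID feeds.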
APA, Harvard, Vancouver, ISO, and other styles
34

P Rao, Aishrith, Raghavendra J C, Sowmyarani C N, and Padmashree T. "Data Quality Associated with Big Data Processing: A Survey." Journal of University of Shanghai for Science and Technology 23, no. 06 (June 18, 2021): 1011–18. http://dx.doi.org/10.51201/jusst/21/05386.

Full text
Abstract:
With the advancement of technology and the large volume of data produced, processed, and stored, it is becoming increasingly important to maintain the quality of data in a cost-effective and productive manner. The most important aspects of Big Data (BD) are storage, processing, privacy, and analytics. The Big Data community has identified quality as a critical aspect of its maturity. Nonetheless, it is a critical concern that should be addressed early in the lifecycle and gradually extended to other primary processes. Companies rely heavily on, and drive profits from, the huge amounts of data they collect. When data consistency deteriorates, the ramifications are uncertain and may lead to completely undesirable conclusions. In the context of BD, determining data quality is difficult, but it is essential to uphold data quality before proceeding with any analytics. In this paper, we investigate data quality during the stages of data gathering, preprocessing, data repository, and evaluation/analysis of BD processing. Related solutions are also suggested based on the elaboration and review of the identified problems.
APA, Harvard, Vancouver, ISO, and other styles
35

Guo, Hong Tao, Zhi Guo Chang, and Shan Wei He. "Geostationary Meteorological Satellite Data Processing System." Advanced Materials Research 181-182 (January 2011): 257–60. http://dx.doi.org/10.4028/www.scientific.net/amr.181-182.257.

Full text
Abstract:
To build a geostationary meteorological satellite data processing system with common data processing, practical remote sensing products, and rich visual styles, the system was designed using VC++ 6.0 MFC, dynamic link libraries, and mature remote sensing product processing algorithms. Data from multiple geostationary meteorological satellite sources are integrated, and remote sensing products are generated, for example, cloud detection, cloud classification, and cloud-top height. The original cloud image and the remote sensing and geographic information products are displayed vividly. The three-dimensional cloud image retrieval, based on the cloud detection product, balances visual effect with a clear physical meaning and is highly practical. The system enables geostationary meteorological satellite information to play a more important role in current weather forecasting.
APA, Harvard, Vancouver, ISO, and other styles
36

Abellard, P., and B. Barbagelata. "A data flow numerical processing operator." Microprocessing and Microprogramming 25, no. 1-5 (January 1989): 133–38. http://dx.doi.org/10.1016/0165-6074(89)90185-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Kerckhoffs, E. J. H., and G. C. Vansteenkiste. "Advanced simulation: Advanced data/knowledge-processing." Annual Review in Automatic Programming 12 (January 1985): 13–23. http://dx.doi.org/10.1016/0066-4138(85)90323-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Stevens, N. "Processing of sar data: fundamentals, signal processing, interferometry." Photogrammetric Record 19, no. 108 (December 2004): 419–20. http://dx.doi.org/10.1111/j.0031-868x.2004.295_5.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Seiler, Pete, Michael Frenklach, Andrew Packard, and Ryan Feeley. "Numerical approaches for collaborative data processing." Optimization and Engineering 7, no. 4 (December 2006): 459–78. http://dx.doi.org/10.1007/s11081-006-0350-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Ganesan, Madhubala, Ah-Lian Kor, Colin Pattinson, and Eric Rondeau. "Green Cloud Software Engineering for Big Data Processing." Sustainability 12, no. 21 (November 7, 2020): 9255. http://dx.doi.org/10.3390/su12219255.

Full text
Abstract:
Internet of Things (IoT) coupled with big data analytics is emerging as the core of smart and sustainable systems which bolster economic, environmental and social sustainability. Cloud-based data centers provide high-performance computing power to analyze voluminous IoT data to provide invaluable insights to support decision making. However, multifarious servers in data centers appear to be the black hole of superfluous energy consumption that contributes to 23% of the global carbon dioxide (CO2) emissions of the ICT (Information and Communication Technology) industry. IoT-related energy research focuses on low-power sensors and enhanced machine-to-machine communication performance. To date, cloud-based data centers still face energy-related challenges which are detrimental to the environment. Virtual machine (VM) consolidation is a well-known approach to achieving energy-efficient cloud infrastructures. Although several research works demonstrate positive results for VM consolidation in simulated environments, there is a gap for investigations on real, physical cloud infrastructure for big data workloads. This research work addresses that gap by conducting experiments on real physical cloud infrastructure. The primary goal of setting up a real physical cloud infrastructure is the evaluation of dynamic VM consolidation approaches, which include integrated algorithms from existing relevant research. An open-source VM consolidation framework, OpenStack NEAT, is adopted and experiments are conducted on a multi-node OpenStack cloud with Apache Spark as the big data platform. Open-source OpenStack has been deployed because it enables rapid innovation and boosts scalability as well as resource utilization. Additionally, this research work investigates performance based on service level agreement (SLA) metrics and the energy usage of compute hosts. Relevant results concerning the best performing combination of algorithms are presented and discussed.
APA, Harvard, Vancouver, ISO, and other styles
41

Clark, Nigel. "Materializing informatics: From data processing to molecular engineering." Information, Communication & Society 1, no. 1 (March 1998): 70–90. http://dx.doi.org/10.1080/13691189809358954.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Badalyan, V. G., A. Kh Vopilkin, S. A. Dolenko, Yu V. Orlov, and I. G. Persiantsev. "Data-processing algorithms for automatic operation of ultrasonic systems with coherent data processing." Russian Journal of Nondestructive Testing 40, no. 12 (December 2004): 791–800. http://dx.doi.org/10.1007/s11181-005-0108-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Yang, Jun, Yan-dong Cao, Guang-cai Sun, Meng-dao Xing, and Liang Guo. "GF-3 data real-time processing method based on multi-satellite distributed data processing system." Journal of Central South University 27, no. 3 (March 2020): 842–52. http://dx.doi.org/10.1007/s11771-020-4335-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Alves da Silva, A. P., V. H. Quintana, and G. K. H. Pang. "Associative memory models for data processing." International Journal of Electrical Power & Energy Systems 14, no. 1 (February 1992): 23–32. http://dx.doi.org/10.1016/0142-0615(92)90005-t.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Flinn, Lucinda Charlotte, Charlotte Louise Hassett, and Louise Braham. "Development of forensic normative data for the WAIS-IV." Journal of Forensic Practice 20, no. 1 (February 12, 2018): 58–67. http://dx.doi.org/10.1108/jfp-08-2017-0029.

Full text
Abstract:
Purpose The Wechsler Adult Intelligence Scale – Fourth Edition (WAIS-IV) (Wechsler, 2008) is a cognitive assessment that is often used in secure forensic settings, however it has not been normed on this population. The purpose of this paper is to develop forensic normative data. Design/methodology/approach Patient files in a high secure forensic hospital were reviewed in order to obtain completed WAIS-IV (Wechsler, 2008) assessments and scores from the five indexes (verbal comprehension, perceptual reasoning, working memory, processing speed and full scale intelligence quotient (FSIQ)). This included reviewing patient files from all directorates, including male mental health, male learning disability, male personality disorder and the women’s service, yielding a sample size of n=86. Findings The qualitative descriptors obtained across the hospital ranged between extremely low and superior. The learning disability service scored significantly lower than the mental health and personality disorder services in verbal comprehension index, perceptual reasoning index, working memory index and FSIQ, and significantly lower than the mental health, personality disorder and women’s services in processing speed index. Mean scores from this study were significantly lower in comparison to those from the UK validation study (Wechsler, 2008). Practical implications The significant difference between scores from the current study and those from the UK validation study (Wechsler, 2008) highlights the need to have appropriate normative data for forensic populations. Clinicians should consider interventions that may serve to increase cognitive function, such as cognitive remediation therapy. Originality/value Whilst several special group studies have previously been conducted, this study is the first to develop forensic normative data for the WAIS-IV (Wechsler, 2008). 
Whilst the sample size was relatively small with limited female participants, the data collated will enable clinicians working in forensic establishments to interpret their assessments in light of this information.
APA, Harvard, Vancouver, ISO, and other styles
46

McDonnell, M. D., N. G. Stocks, C. E. M. Pearce, and D. Abbott. "Stochastic resonance and data processing inequality." Electronics Letters 39, no. 17 (2003): 1287. http://dx.doi.org/10.1049/el:20030792.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Felsen, L. B., and L. Carin. "Wave-orientated processing of scattering data." Electronics Letters 29, no. 22 (1993): 1930. http://dx.doi.org/10.1049/el:19931285.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Zaidel’, R. M. "New computer programs for data processing." Atomic Energy 103, no. 2 (August 2007): 657–59. http://dx.doi.org/10.1007/s10512-007-0105-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Gheshlaghi, F., and J. C. Santamarina. "Data Pre‐Processing in Cross‐Hole Geotomography." Journal of Environmental and Engineering Geophysics 3, no. 1 (March 1998): 41–47. http://dx.doi.org/10.4133/jeeg3.1.41.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Béjar, M. A., J. M. Gómez-Rodríguez, J. Gómez-Herrero, and A. Baró. "New developments in fast image processing and data acquisition for STM." Journal of Microscopy 152, no. 3 (December 1988): 619–26. http://dx.doi.org/10.1111/j.1365-2818.1988.tb01429.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles