Academic literature on the topic 'Mechanics, data processing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Mechanics, data processing.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Mechanics, data processing"

1

Hwang, J. S., K. C. Chang, and G. C. Lee. "Modified Frequency‐Domain Data Processing." Journal of Engineering Mechanics 115, no. 10 (October 1989): 2333–39. http://dx.doi.org/10.1061/(asce)0733-9399(1989)115:10(2333).

2

Tamura, Yoshiaki, Hiroyuki Furusawa, Hidenori Fujii, and Yuki Miyamoto. "Data Processing for Large-Scale Computational Mechanics Results." Journal of the Visualization Society of Japan 29, no. 1 (2009): 129. http://dx.doi.org/10.3154/jvs.29.129.

3

Chakrabarti, Satyananda, Donald E. Shaw, Dale E. Stephenson, and B. V. K. Vijaya Kumar. "Digital Signal Processing of Geotechnical Data." Journal of Engineering Mechanics 112, no. 1 (January 1986): 70–83. http://dx.doi.org/10.1061/(asce)0733-9399(1986)112:1(70).

4

McConnell, K. G., D. O. Adams, and P. D. Hoist. "Subtle Errors in Digital Experimental Data Processing." Experimental Techniques 21, no. 6 (November 1997): 15–18. http://dx.doi.org/10.1111/j.1747-1567.1997.tb00564.x.

5

Badalyan, V. G., A. Kh Vopilkin, S. A. Dolenko, Yu V. Orlov, and I. G. Persiantsev. "Data-processing algorithms for automatic operation of ultrasonic systems with coherent data processing." Russian Journal of Nondestructive Testing 40, no. 12 (December 2004): 791–800. http://dx.doi.org/10.1007/s11181-005-0108-7.

6

Likhachev, Aleksey V., and Marina V. Tabanyukhova. "A new processing algorithm for photoelasticity method data." Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mekhanika, no. 79 (2022): 100–110. http://dx.doi.org/10.17223/19988621/79/9.

Abstract:
The photoelasticity method is a reliable tool for studying the stress state of flat elements in building structures using models made of optically sensitive materials. In this paper, classical photoelasticity is considered. The experimental data obtained with the method are presented as interferograms. A decoding procedure yields normal and tangential stress values in the plane of the model. The polarization-projection installations used in optical methods are rather simple. However, the digital processing of the images obtained by transmission through the loaded model requires highly intelligent software. Nowadays, national and international laboratories working with polarization-optical methods strive to develop digital photoelasticity. For several reasons, the authors of the presented work needed to develop their own algorithms for decoding experimental data of the photoelasticity method. This work is mainly devoted to a formulation of the problems to be solved. Some of them have already been solved, and the results obtained are presented here. The authors place special emphasis on describing the algorithm for tracing interference fringes based on the analysis of the image gradient.
7

French, M., and M. Jay. "Instrumentation and Data Processing for Automotive NVH Testing." Experimental Techniques 22, no. 6 (November 1998): 43–44. http://dx.doi.org/10.1111/j.1747-1567.1998.tb02301.x.

8

Zhang, He Quan. "Data Processing in Dynamic Traffic Capacity of Urban Road Based on Squatted Lane." Advanced Materials Research 910 (March 2014): 415–18. http://dx.doi.org/10.4028/www.scientific.net/amr.910.415.

Abstract:
With more and more cars on the road, traffic accident rates are also increasing. When an accident occurs, lanes are occupied ('squatted'), so traffic capacity is affected. We establish a dynamic traffic capacity model, based on mechanics, in which basic road capacity is influenced by instantaneous traffic. First, considering the phase cycle of the traffic lights at the upstream crossing, we set a sampling point every 60 seconds to count the number of cars that pass through the cross-section of the accident in the video, and then calculate the traffic at every sampling point. Second, we analyze the actual traffic capacity using the line chart of cross-section traffic against the time needed to evacuate queuing vehicles within 120 meters. The analysis shows that the actual traffic capacity at the accident cross-section fluctuates. Based on the traditional model of road capacity and on mechanics, and by improving the compensation factors, we establish a model of dynamic traffic capacity at the accident point.
9

Xiao, Boxiang, Sheng Wu, and Xinyu Guo. "A physics-based approach to motion capture data processing for virtual plant modeling and simulation." International Journal of Modeling, Simulation, and Scientific Computing 9, no. 3 (May 24, 2018): 1840005. http://dx.doi.org/10.1142/s1793962318400056.

Abstract:
Dynamic virtual plant simulation is an attractive research issue in both botany and computer graphics. Data-driven methods are an efficient way to perform motion analysis and animation synthesis. As a widely used tool, motion capture has been applied to plant motion data acquisition and analysis. The most prominent problem in motion capture for plants is primary data processing, such as the reconstruction of missing markers. This paper presents a novel physics-based approach to motion capture data processing for plants. First, a physics-based mechanics model is formulated using Lagrangian mechanics for a motion-captured plant organ such as a leaf; its dynamic mechanical properties are then analyzed and the relevant model parameters are evaluated. Further, by using the physical model with the evaluated parameters, we can calculate the next positions of a marker to reconstruct the missing markers in a motion capture sequence. We take a maize leaf and a pachira leaf as examples to examine the proposed approach, and the results show that the physics-based method is feasible and effective for plant motion data processing.
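The missing-marker reconstruction step described in this abstract can be illustrated with a toy sketch: integrate an assumed second-order model forward through a gap in the trajectory. The damped-oscillator dynamics and all parameter values below are assumptions for demonstration, not the Lagrangian leaf model identified in the paper.

```python
import numpy as np

def predict_missing(positions, dt, omega=8.0, zeta=0.05):
    """Fill NaN gaps in a 1-D marker coordinate by integrating a damped
    harmonic oscillator (a stand-in for the paper's Lagrangian leaf model;
    omega and zeta are assumed here, not fitted). The first sample must be
    observed."""
    x = positions.copy()
    v = 0.0
    for k in range(1, len(x)):
        if np.isnan(x[k]):
            # semi-implicit Euler step of x'' = -2*zeta*omega*x' - omega**2 * x
            a = -2.0 * zeta * omega * v - omega**2 * x[k - 1]
            v += a * dt
            x[k] = x[k - 1] + v * dt
        else:
            v = (x[k] - x[k - 1]) / dt  # velocity from observed samples
    return x
```

In the paper's setting the equation of motion would come from the identified physical model and its evaluated parameters rather than the fixed coefficients assumed above.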
10

Shankar, Karthi, and Prabu Sevugan. "Spatial Data Indexing and Query Processing in GeoCloud." Journal of Testing and Evaluation 47, no. 6 (February 26, 2019): 20180502. http://dx.doi.org/10.1520/jte20180502.


Dissertations / Theses on the topic "Mechanics, data processing"

1

Xu, Lin. "Data modeling and processing in deregulated power system." PhD diss., Washington State University, 2005. http://www.dissertations.wsu.edu/Dissertations/Spring2005/l%5Fxu%5F022805.pdf.

2

Zhu, Tulong. "Meshless methods in computational mechanics." Diss., Georgia Institute of Technology, 1998. http://hdl.handle.net/1853/11795.

3

Wyatt, Timothy Robert. "Development and evaluation of an educational software tool for geotechnical engineering." Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/20225.

4

Leung, Tsui-shan, and 梁翠珊. "A functional analysis of GIS for slope management in Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31223072.

5

Fortier, Hélène. "AFM Indentation Measurements and Viability Tests on Drug Treated Leukemia Cells." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/34345.

Abstract:
A significant body of literature has reported strategies and techniques to assess the mechanical properties of biological samples such as proteins, cellular systems, and tissue systems. Atomic force microscopy has been used to detect elasticity changes of cancer cells. However, only a few studies have provided a detailed and complete protocol of the experimental procedures and data analysis methods for non-adherent blood cancer cells. In this work, the elasticity of NB4 cells derived from acute promyelocytic leukemia (APL) was probed by AFM indentation measurements to investigate the effects of the disease on cellular biomechanics. Understanding how leukemia influences the nanomechanical properties of cells is expected to provide a better understanding of the cellular mechanisms associated with cancer, and promises to become a valuable new tool for cancer detection and staging. In this context, the quantification of the mechanical properties of APL cells requires a systematic and optimized approach to data collection and analysis, in order to generate reproducible and comparable data. This Thesis elucidates the automated data analysis process that integrates programming, force curve collection, and analysis optimization to assess variations of cell elasticity in response to processing criteria. A processing algorithm was developed using the IGOR Pro software to automatically analyze large numbers of AFM data sets in an efficient and accurate manner. Since the analysis involves multiple steps that must be repeated for many individual cells, an automated and unbiased processing approach is essential to precisely determine cell elasticity. Different fitting models for extracting the Young's modulus have been systematically applied to validate the process, and the best fitting criteria, such as the contact point location and indentation length, have been determined in order to obtain consistent results.
The automated processing code described in this Thesis was used to correlate alterations in the cellular biomechanics of cancer cells as they undergo drug treatments. In order to fully assess drug effects on NB4 cells, viability assays were first performed using Trypan Blue staining for primary insights, before initiating thorough microplate fluorescence intensity readings using a LIVE/DEAD viability kit involving ethidium and calcein AM labelling components. After treatment with 30 µM arsenic trioxide, relative live cell populations increased until 36 h, while relative populations of dead cells increased until 24 h post-treatment; a drastic drop in the dead cell count was observed between 12 and 24 h. With respect to cell mechanics, trapping the non-adherent NB4 cells within fabricated SU8-10 microwell arrays allowed consistent AFM indentation measurements up to 48 h after treatment. Results revealed an increase in cell elasticity up to 12 h post-treatment and a drastic decrease between 12 and 24 h. The arsenic trioxide-induced alterations in the elasticity of NB4 cells can thus be correlated with the cell viability tests. In addition to these indentation and viability testing approaches, morphological appearance was monitored in order to track the apoptosis process of the affected cells. Relationships found between the viability and elasticity assays, in conjunction with morphological alterations, revealed distinct stages of apoptosis throughout treatment. By 24 h after initial treatment, most cells were observed to have burst or displayed obvious blebbing. These relations between different measurement methods may point to a potential drug-screening approach for understanding the specific physical and biological effects of drugs on cancer cells.
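The model-fitting step mentioned in this abstract is commonly based on the Hertz contact model. The sketch below assumes a spherical tip, a noiseless synthetic force curve, and illustrative parameter values; it is not the IGOR Pro pipeline the thesis describes.

```python
import numpy as np

def hertz_basis(delta, R=1.0e-6, nu=0.5):
    """Hertz force per unit Young's modulus for a spherical tip of radius R:
    F = 4/3 * E / (1 - nu**2) * sqrt(R) * delta**1.5."""
    return (4.0 / 3.0) / (1.0 - nu**2) * np.sqrt(R) * np.clip(delta, 0.0, None) ** 1.5

def fit_youngs_modulus(delta, force, R=1.0e-6, nu=0.5):
    """Least-squares estimate of E; the Hertz model is linear in E,
    so the fit is a single projection."""
    b = hertz_basis(delta, R, nu)
    return float(np.dot(b, force) / np.dot(b, b))

# Synthetic force curve: a 2 kPa "cell" indented up to 1 um.
delta = np.linspace(0.0, 1.0e-6, 200)
force = 2000.0 * hertz_basis(delta)
E_est = fit_youngs_modulus(delta, force)  # recovers 2000 Pa
```

In practice the contact point must be located first and noise handled, which is exactly why the thesis automates the analysis.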
6

Chiu, Cheng-Jung. "Data processing in nanoscale profilometry." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/36677.

Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1995.
Includes bibliographical references (p. 176-177).
New developments on the nanoscale are taking place rapidly in many fields. Instrumentation used to measure and understand the geometry and property of the small scale structure is therefore essential. One of the most promising devices to head the measurement science into the nanoscale is the scanning probe microscope. A prototype of a nanoscale profilometer based on the scanning probe microscope has been built in the Laboratory for Manufacturing and Productivity at MIT. A sample is placed on a precision flip stage and different sides of the sample are scanned under the SPM to acquire its separate surface topography. To reconstruct the original three dimensional profile, many techniques like digital filtering, edge identification, and image matching are investigated and implemented in the computer programs to post process the data, and with greater emphasis placed on the nanoscale application. The important programming issues are addressed, too. Finally, this system's error sources are discussed and analyzed.
7

Turel, Mesut. "Soft computing based spatial analysis of earthquake triggered coherent landslides." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/45909.

Abstract:
Earthquake triggered landslides cause loss of life, destroy structures, roads, powerlines, and pipelines and therefore they have a direct impact on the social and economic life of the hazard region. The damage and fatalities directly related to strong ground shaking and fault rupture are sometimes exceeded by the damage and fatalities caused by earthquake triggered landslides. Even though future earthquakes can hardly be predicted, the identification of areas that are highly susceptible to landslide hazards is possible. For geographical information systems (GIS) based deterministic slope stability and earthquake-induced landslide analysis, the grid-cell approach has been commonly used in conjunction with the relatively simple infinite slope model. The infinite slope model together with Newmark's displacement analysis has been widely used to create seismic landslide susceptibility maps. The infinite slope model gives reliable results in the case of surficial landslides with depth-length ratios smaller than 0.1. On the other hand, the infinite slope model cannot satisfactorily analyze deep-seated coherent landslides. In reality, coherent landslides are common and these types of landslides are a major cause of property damage and fatalities. In the case of coherent landslides, two- or three-dimensional models are required to accurately analyze both static and dynamic performance of slopes. These models are rarely used in GIS-based landslide hazard zonation because they are numerically expensive compared to one dimensional infinite slope models. Building metamodels based on data obtained from computer experiments and using computationally inexpensive predictions based on these metamodels has been widely used in several engineering applications. With these soft computing methods, design variables are carefully chosen using a design of experiments (DOE) methodology to cover a predetermined range of values and computer experiments are performed at these chosen points. 
The design variables and the responses from the computer simulations are then combined to construct functional relationships (metamodels) between the inputs and the outputs. In this study, Support Vector Machines (SVM) and Artificial Neural Networks (ANN) are used to predict the static and seismic responses of slopes. In order to integrate the soft computing methods with GIS for coherent landslide hazard analysis, an automatic slope profile delineation method from Digital Elevation Models is developed. The integrated framework is evaluated using a case study of the 1989 Loma Prieta, CA earthquake (Mw = 6.9). A seismic landslide hazard analysis is also performed for the same region for a future scenario earthquake (Mw = 7.03) on the San Andreas Fault.
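The metamodeling workflow summarized above can be sketched in a few lines of scikit-learn. The design variables, their ranges, and the closed-form "factor of safety" below are stand-ins chosen for illustration; the study itself trains SVM and ANN surrogates on finite-element slope simulations.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical design variables: slope angle (deg), cohesion (kPa),
# friction angle (deg) -- illustrative ranges, not the study's DOE.
X = rng.uniform([20.0, 5.0, 20.0], [45.0, 40.0, 40.0], size=(200, 3))

# Stand-in response: a cheap closed-form "factor of safety" playing the
# role of the expensive finite-element simulation.
y = X[:, 1] / 10.0 + np.tan(np.radians(X[:, 2])) / np.tan(np.radians(X[:, 0]))

# Metamodel: train on 100 "computer experiments", test on 100 unseen designs.
surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
surrogate.fit(X[:100], y[:100])
r2 = surrogate.score(X[100:], y[100:])  # predictive R^2 on held-out designs
```

Once trained, such a surrogate can be evaluated for thousands of GIS grid cells at negligible cost, which is the point of replacing the simulator with a metamodel.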
8

Roland, Jérémie. "Adiabatic quantum computation." Doctoral thesis, Universite Libre de Bruxelles, 2004. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/211148.

Abstract:
The development of the theory of quantum computation stems from the idea that a computer is first and foremost a physical system, so that the laws of Nature themselves constitute the ultimate limit on what can and cannot be computed. Interest in this discipline was stimulated by Peter Shor's discovery of a fast quantum algorithm for factoring a number, whereas no such algorithm is currently known in classical computation theory. Another important result was Lov Grover's construction of an algorithm able to retrieve an element in an unstructured database with a quadratic complexity gain over any classical algorithm. While these quantum algorithms are expressed in the 'standard' model of quantum computation, where the register evolves discretely in time under the successive application of quantum gates, a new type of algorithm was recently introduced, in which the register evolves continuously in time under the action of a Hamiltonian. The idea underlying adiabatic quantum computation, proposed by Edward Farhi and his collaborators, is thus to use a traditional tool of quantum mechanics, namely the Adiabatic Theorem, to design quantum algorithms in which the register evolves under the influence of a very slowly varying Hamiltonian, ensuring an adiabatic evolution of the system. In this thesis, we first show how to reproduce the quadratic gain of Grover's algorithm by means of an adiabatic quantum algorithm. We then show that this new adiabatic algorithm, as well as another search algorithm with Hamiltonian evolution, can be translated into the quantum-circuit formalism, yielding three quantum search algorithms that are closely related in principle. 
Subsequently, we use these results to construct an adiabatic algorithm for solving problems with structure, using a technique known as 'nesting' developed earlier in the context of circuit-type quantum algorithms. Finally, we analyze the noise resistance of these adiabatic algorithms by introducing a noise model based on random matrix theory and studying its effect through perturbation theory.
Doctorate in applied sciences
9

Cevikbas, Orcun. "Data Acquisition And Processing Interface Development For 3d Laser Rangefinder." Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/12607514/index.pdf.

Abstract:
In this study, the aim is to improve the previously developed data acquisition program, which ran under DOS, and the 2D surface reconstruction program for Windows. A new system is set up, and both data acquisition and processing software are developed to collect and process data within a single application running under Windows. The main goal of the thesis is to acquire and process the range data taken from the laser rangefinder in order to construct the 3D image map of simple objects in different positions in indoor environments. The data acquisition program collects data in a helical pattern. To do this, appropriate parameters for the data acquisition interface are determined. In the data processing step, basic triangulation algorithms and threshold conditions are used to calculate resolutions, detect noisy points, and segment objects from the environment for line fitting. The data acquisition and processing interfaces developed and implemented in the thesis are capable of creating a 3D image map and obtaining a view of the scanned environment in a short time with high accuracy.
10

Einstein, Noah. "SmartHub: Manual Wheelchair Data Extraction and Processing Device." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1555352793977171.


Books on the topic "Mechanics, data processing"

1

Chong, K. P., ed. Approximate solution methods in engineering mechanics. London: Elsevier Applied Science, 1991.

2

Adeli, Hojjat, ed. Parallel processing in computational mechanics. New York: M. Dekker, 1992.

3

Greated, Clive A., J. Cosgrove, and J. M. Buick, eds. Optical methods and data processing in heat and fluid flow. Bury St. Edmunds: Professional Engineering, 2002.

4

Sharp, J. J. BASIC fluid mechanics. London: Butterworths, 1988.

5

Papadrakakis, Manolis, and B. H. V. Topping, eds. Innovative computational methods for structural mechanics. Edinburgh: Saxe-Coburg Publications, 1999.

6

American Society of Mechanical Engineers. Winter Meeting. Small computers in fluid mechanics. New York: American Society of Mechanical Engineers, 1988.

7

Alekseev, A. S., and V. G. Dulov, eds. Chislennye metody [Numerical methods]. Novosibirsk: Vychislitelʹnyĭ t͡s︡entr SO AN SSSR, 1988.

8

Alekseev, A. S., and V. G. Dulov, eds. Chislennye metody [Numerical methods]. Novosibirsk: Vychislitelʹnyĭ t͡s︡entr SO AN SSSR, 1986.

9

Wriggers, P., W. Wagner, and Erwin Stein, eds. Nonlinear computational mechanics: State of the art. Berlin: Springer, 1991.

10

Dijksman, J. F. Topics in Applied Mechanics: Integration of Theory and Applications in Applied Mechanics. Dordrecht: Springer Netherlands, 1993.


Book chapters on the topic "Mechanics, data processing"

1

Gurfil, Pini, and P. Kenneth Seidelmann. "Orbit Data Processing." In Celestial Mechanics and Astrodynamics: Theory and Practice, 441–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-50370-6_16.

2

Nobach, Holger, and Cameron Tropea. "Fundamentals of Data Processing." In Springer Handbook of Experimental Fluid Mechanics, 1399–417. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-30299-5_23.

3

Haertig, Jacques, and Alain Boutier. "Post-Processing of LDV Data." In Laser Velocimetry in Fluid Mechanics, 305–87. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2013. http://dx.doi.org/10.1002/9781118569610.ch8.

4

Nobach, Holger, Cameron Tropea, Laurent Cordier, Jean-Paul Bonnet, Joël Delville, Jacques Lewalle, Marie Farge, Kai Schneider, and Ronald Adrian. "Review of Some Fundamentals of Data Processing." In Springer Handbook of Experimental Fluid Mechanics, 1337–98. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-30299-5_22.

5

Hübner, W., C. Tropea, and J. Volkert. "Wavenumber Spectrum Estimation from Irregularly Spaced Data: Application to PIV Data Processing." In Laser Techniques Applied to Fluid Mechanics, 3–18. Berlin, Heidelberg: Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/978-3-642-56963-0_1.

6

Zgurovsky, Mikhail Z., Pavlo O. Kasyanov, Oleksiy V. Kapustyan, José Valero, and Nina V. Zadoianchuk. "Auxiliary Properties of Evolution Inclusions Solutions for Earth Data Processing." In Advances in Mechanics and Mathematics, 37–118. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-28512-7_2.

7

Sun, Wenjun, and Kevin Lü. "Parallel Query Processing Algorithms for Semi-structured Data." In Notes on Numerical Fluid Mechanics and Multidisciplinary Design, 770–73. Cham: Springer International Publishing, 2002. http://dx.doi.org/10.1007/3-540-47961-9_64.

8

Zores, R., J. Trapp, S. Standfuß, and H. G. Pagendarm. "On-Line Data Processing for the DLR-F9 Windtunnel Experiment." In Notes on Numerical Fluid Mechanics (NNFM), 421–26. Wiesbaden: Vieweg+Teubner Verlag, 1997. http://dx.doi.org/10.1007/978-3-322-86573-1_53.

9

Riemer, Dominik, Nenad Stojanovic, and Ljiljana Stojanovic. "A Methodology for Designing Events and Patterns in Fast Data Processing." In Notes on Numerical Fluid Mechanics and Multidisciplinary Design, 133–48. Cham: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-642-38709-8_9.

10

Fest-Santini, Stephanie. "Image Processing of Two-Phase Data for Drop-Surface Interaction Obtained by X-Ray Microtomography." In Fluid Mechanics and Its Applications, 101–11. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-33338-6_8.


Conference papers on the topic "Mechanics, data processing"

1

Fan, Yigang, and Frederick Lutze. "Unsteady aerodynamic tests and data reductions using digital signal processing approach." In 23rd Atmospheric Flight Mechanics Conference. Reston, Virginia: American Institute of Aeronautics and Astronautics, 1998. http://dx.doi.org/10.2514/6.1998-4454.

2

Liu, Yongzan, Lin Liang, Olga Podgornova, Smaine Zeroug, Takashi Mizuno, and Joel Le Calvez. "Automated Microseismic Event Detection for Downhole Distributed Acoustic Sensing Data Processing." In 57th U.S. Rock Mechanics/Geomechanics Symposium. ARMA, 2023. http://dx.doi.org/10.56952/arma-2023-0797.

Abstract:
ABSTRACT Distributed acoustic sensing (DAS) can provide high-resolution measurements owing to the closely spaced sensing channels over a long distance and high sampling rate – attributes that are beneficial to seismic monitoring and analysis. However, DAS arrays generally suffer from a lower signal-to-noise ratio (SNR). Additionally, dense spatial and temporal measurements can result in extremely large data volumes that hinder efficient data storage, transmission, and processing. Automatically and accurately detecting seismic events in continuous DAS data is challenging. In this study, we present a microseismic event detection workflow that is based on coherency analysis of the waveforms and accommodates the high-resolution characteristics of DAS data. DAS data is transformed into an apparent velocity-time (v-t) domain by slant-stacking the data along different apparent velocities, i.e., slopes of the waveform, at each time sample. The coherency of the slant-stacked waveform is measured by the semblance coefficient. A coherent signal is identified if the semblance coefficient is larger than a threshold that is adaptively calculated. Then, the density-based spatial clustering (DBSCAN) algorithm is applied to group the coherent detections into different clusters. Finally, the microseismic events are detected by filtering out the clusters of coherent noise. The algorithm generates low-volume time-windowed data containing microseismic events that can be efficiently transferred and processed to estimate event location, magnitude, and source mechanism. The performance of the developed algorithm is tested using two field datasets from the Utah FORGE project. The results demonstrate the potential of the algorithm to achieve automatic real-time microseismic event detection for downhole DAS systems. 
INTRODUCTION Microseismic monitoring is a critical element for a variety of subsurface operations such as hydraulic fracturing in unconventional reservoirs, water circulation in Enhanced Geothermal Systems (EGS), as well as carbon injection and storage in geological reservoirs. In hydraulic fracturing treatments, microseismicity is important for mapping and characterizing created fracture networks. In fluid injection and storage operations, potential seismic hazard may occur due to the large volume of injected fluid. Seismic monitoring, being also a regulatory requirement, offers an early warning tool to avoid and mitigate seismic hazard, also known as traffic light monitoring system. Seismic event detection is the first step for subsequent processing encompassing event location, magnitude estimation, and source mechanism characterization. Recently, distributed acoustic sensing (DAS) has been increasingly used for microseismic monitoring. DAS makes use of Rayleigh scattering of laser pulses in fiber-optic cables to measure strain rate along a fiber (Hartog, 2017). One significant advantage of DAS is the high spatial-temporal resolution and large coverage. The fiber can be kilometers long and the channel spacing is in the order of meters, such that thousands of measurements can be obtained at each time sample. The sampling rate of DAS can be thousands of Hertz (Hz). Compared with the traditional geophone or accelerometer-type sensors, where there are only tens of sensing points with large inter-sensor spacing, DAS can provide a more complete picture of microseismic waveforms propagating in time and space. However, the dense spatial and temporal measurements generate extremely large data volumes. Furthermore, the signal-to-noise ratio of DAS data is generally lower than that of the geophone-acquired data. These challenges require the development of new methods for automated real-time microseismic monitoring and analysis.
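A minimal sketch of the slant-stack semblance test described in this abstract might look as follows; the channel count, pulse shape, and the two velocities compared are illustrative assumptions, not the authors' implementation or thresholding scheme.

```python
import numpy as np

def semblance(traces, dt, dx, velocity, t0, half_win=0.05):
    """Semblance of a linear moveout (slant stack) across DAS channels.

    traces: (n_ch, n_t) array; dt: sample interval (s); dx: channel
    spacing (m); velocity: apparent velocity to test (m/s); t0: candidate
    event time at channel 0 (s); half_win: window half-width (s)."""
    n_ch, n_t = traces.shape
    t = np.arange(n_t) * dt
    # Shift each channel back by its moveout so a wave travelling at this
    # apparent velocity lines up flat across channels.
    aligned = np.stack([np.interp(t + i * dx / velocity, t, traces[i],
                                  left=0.0, right=0.0) for i in range(n_ch)])
    mask = np.abs(t - t0) <= half_win
    num = np.square(aligned[:, mask].sum(axis=0)).sum()
    den = n_ch * np.square(aligned[:, mask]).sum()
    return float(num / max(den, 1e-12))

# Synthetic plane wave crossing 32 channels at 1500 m/s apparent velocity.
dt, dx = 1e-3, 10.0
t = np.arange(0.0, 0.5, dt)
traces = np.stack([np.exp(-(((t - 0.2 - i * dx / 1500.0) / 0.01) ** 2))
                   for i in range(32)])
s_match = semblance(traces, dt, dx, velocity=1500.0, t0=0.2)  # coherent
s_wrong = semblance(traces, dt, dx, velocity=5000.0, t0=0.2)  # incoherent
```

In the workflow above this scan would be repeated over a range of apparent velocities and times, with an adaptive threshold and DBSCAN clustering applied to the resulting detections.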
3

Faber, Weston R., Islam I. Hussein, John T. Kent, Shambo Bhattacharjee, and Moriba Jah. "Optical Data Processing Using Directional Statistics In a Multiple Hypothesis Framework With Maneuvering Objects." In 2018 Space Flight Mechanics Meeting. Reston, Virginia: American Institute of Aeronautics and Astronautics, 2018. http://dx.doi.org/10.2514/6.2018-1971.

4

Dupre, Jean-Christophe, and Alexis Lagarde. "Automatic data processing of speckle fringe pattern." In Second Intl Conf on Photomechanics and Speckle Metrology: Speckle Techniques, Birefringence Methods, and Applications to Solid Mechanics. SPIE, 1991. http://dx.doi.org/10.1117/12.49526.

5

Wang, Jingyan, and Wei Heng. "Parallel processing of blocks of data in the network." In International Conference on Experimental Mechanics 2008 and Seventh Asian Conference on Experimental Mechanics, edited by Xiaoyuan He, Huimin Xie, and YiLan Kang. SPIE, 2008. http://dx.doi.org/10.1117/12.839224.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Hurda, Lukáš, and Richard Matas. "Radial compressor test data processing with real gas equation of state." In 18TH CONFERENCE OF POWER SYSTEM ENGINEERING, THERMODYNAMICS AND FLUID MECHANICS. AIP Publishing, 2019. http://dx.doi.org/10.1063/1.5138621.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Liu, Yan-Sen, Fu-Qiang Ma, and Yang Wang. "An analysis model and data-processing method on spatial characteristics of underwater radiated noise of vessel targets." In 2015 International Conference on Mechanics and Mechatronics (ICMM2015). WORLD SCIENTIFIC, 2015. http://dx.doi.org/10.1142/9789814699143_0092.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Litvin, S. A., V. V. Nemchinov, V. V. Pavlov, V. S. Evsenjov, A. M. Skanavi, and I. V. Zhavoronok. "Automatization of measurement and processing of experimental data in photoelasticity." In Second Intl Conf on Photomechanics and Speckle Metrology: Speckle Techniques, Birefringence Methods, and Applications to Solid Mechanics. SPIE, 1991. http://dx.doi.org/10.1117/12.49557.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kumar Shahi, Anup, Anil Kumar, Kumar Hemant Singh, and Ranjith Pathegama Gamage. "Improved Upscaling Methods for Carbonate Rock Image Data." In 56th U.S. Rock Mechanics/Geomechanics Symposium. ARMA, 2022. http://dx.doi.org/10.56952/arma-2022-2204.

Full text
Abstract:
ABSTRACT Volumetric image data of rocks often need sophisticated image processing steps in any rock physics and petrophysics workflow. While the segmentation is highly dependent on the quality of data, a trade-off between resolution and field of view is inevitable. This work attempts to resolve this using multiple-point statistics, which have long been used for generating synthetic images, though mostly applied to sandstone rocks, where the heterogeneity is significantly less than that of carbonates. These algorithms work by sequentially populating a grid to emulate the observed image. However, finding the optimum kernel parameters is crucial to capturing the spatial characteristics of the data. Also, when dealing with multiple images, finding a single set of kernel parameters might not be a trivial task. Further, these methods work by computing a covariance kernel that scales as the third power of the number of training examples, and thus do not scale well with more data. Therefore, we seek to design a single-image-based upscaling method that would help alleviate these difficulties. We test the proposed methodology on carbonate rock sample data, which are known for their complexities at various scales. In this study, images of four samples are considered. An upsample-deblur method is developed that consistently works better than the conventional bicubic-interpolation-based upsampling technique. For this, a low-resolution 2D image sample is extracted from an X-ray microtomography dataset, which was then subjected to a Random Forest based upsampling algorithm. It is found that the low-resolution data could be improved to form a single super-resolution image. The algorithm produces an image that is always better than the bicubic algorithm. We anticipate this strategy would help design advanced algorithms where the number of training examples is small.
INTRODUCTION Digital Rock Physics workflow has gained significant attention from researchers due to its promising accuracy to characterize rocks and predict desired properties through numerical simulation (Andrä et al., 2013a, 2013b). Digital rock physics is a numerical workflow to compute and simulate various rock properties such as permeability, electrical conductivity, and elastic moduli based on high-resolution representations of the complex pore geometry obtained from imaging (Andrä et al., 2013a, 2013b; Arns et al., 2019; Devarapalli et al., 2017; Mehmani et al., 2020; Wildenschild & Sheppard, 2013).
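The single-image learned upsampling the abstract describes can be sketched compactly. Here a linear least-squares regressor stands in for the Random Forest regressor (scikit-learn's `RandomForestRegressor` could be dropped in), and the "micro-CT slice" is a synthetic smooth field; both are assumptions for illustration only:

```python
import numpy as np

# Sketch of single-image learned upsampling: learn a map from
# low-resolution 3x3 patches to the co-located high-resolution 2x2
# blocks of the same image. Linear least squares stands in for the
# Random Forest regressor described in the abstract.
rng = np.random.default_rng(0)

def make_pairs(hr, factor=2, k=1):
    """Build (LR patch, HR block) training pairs from a single image."""
    lr = hr[::factor, ::factor]                     # naive decimation
    X, y = [], []
    for i in range(k, lr.shape[0] - k):
        for j in range(k, lr.shape[1] - k):
            X.append(lr[i - k:i + k + 1, j - k:j + k + 1].ravel())
            y.append(hr[i * factor:(i + 1) * factor,
                        j * factor:(j + 1) * factor].ravel())
    return np.asarray(X), np.asarray(y)

# synthetic smooth "micro-CT slice" with a little acquisition noise
xx, yy = np.meshgrid(np.linspace(0, 3, 64), np.linspace(0, 3, 64))
hr = np.sin(xx * yy) + 0.05 * rng.standard_normal((64, 64))

X, y = make_pairs(hr)
W, *_ = np.linalg.lstsq(X, y, rcond=None)           # learn patch -> block map
rmse = np.sqrt(np.mean((X @ W - y) ** 2))
print(f"training RMSE of the learned upsampler: {rmse:.3f}")
```

The learned map is then applied to new low-resolution patches; the abstract's comparison against bicubic interpolation would be made on held-out regions of the image.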
APA, Harvard, Vancouver, ISO, and other styles
10

Bernardes, Joa˜o L., Lauro M. Y. Silveira, and Clo´vis A. Martins. "RiserView: An Open Source, Multiplatform Post-Processing Tool for the Visualization and Analysis of Riser Dynamics." In 25th International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2006. http://dx.doi.org/10.1115/omae2006-92476.

Full text
Abstract:
Offshore oil exploitation, given its importance, is the target of intense research throughout the world. Much of this research generates large amounts of raw data of difficult interpretation. This paper presents RiserView, a free, open-source and multiplatform post-processing tool that merges virtual reality and scientific visualization techniques to allow a three-dimensional interactive visualization of these data for the specific domain of riser dynamics. This tool, its use, the results of performance tests and how the tool may aid in the analysis of riser dynamics, in view of the visualization tools in commercial riser analysis software, are discussed.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Mechanics, data processing"

1

Leis. L51845 Database of Mechanical and Toughness Properties of Pipe. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), December 2000. http://dx.doi.org/10.55274/r0010150.

Full text
Abstract:
The lower-strength grades of steel used for transmission pipelines into the 1960s were much like those used in other steel construction in that era. These steels gained strength by traditional hardening mechanisms through chemistry changes, largely involving carbon and manganese additions. Improvement of these grades, primarily through control of ingot chemistry and steel processing, became necessary when running brittle fracture was identified as a failure mechanism in gas-transmission pipelines in the late 1950s. Eventually, this avenue to increasing strength was exhausted for pipeline applications because this approach causes increased susceptibility to hydrogen-related cracking mechanisms as strength increases. For this reason, modern steels differ significantly from their predecessors in several ways, with the transition from traditional C-Mn ferrite-pearlite steels beginning in the mid-1960s with the introduction of high-strength-low-alloy (HSLA) steels. This report presents the results of two projects, PR-3-9606 and PR-3-9737, both of which were planned as multi-year projects. The first of these projects initially was conceived to provide broad evaluation of the fitness-for-service of wrinkle bends, while the second was conceived to generate mechanical and fracture properties data for use in the integrity analysis of both the pipe body and weld seams in modern gas-transmission pipeline systems. As possible duplication between a joint industry project and the PRCI project became apparent, this project was scaled back to focus on properties of steels used in construction involving wrinkle bends. Consideration also was given to a more modern steel such as might be found in ripple bends, which are formed in bending machines that now have become widely used. The second project likewise was reduced in scope, with a focus on only the pipe body. Because both projects ended up being centered on mechanical and fracture properties, both are presented in this combination report.
APA, Harvard, Vancouver, ISO, and other styles
2

Leathers, Emily, Clayton Thurmer, and Kendall Niles. Encryption for edge computing applications. Engineer Research and Development Center (U.S.), May 2024. http://dx.doi.org/10.21079/11681/48596.

Full text
Abstract:
As smart sensors and the Internet of Things (IoT) exponentially expand, there is an increased need for effective processing solutions for sensor node data located in the operational arena where it can be leveraged for immediate decision support. Current developments reveal that edge computing, where processing and storage are performed close to data generation locations, can meet this need (Ahmed and Ahmed 2016). Edge computing imparts greater flexibility than that experienced in cloud computing architectures (Khan et al. 2019). Despite these benefits, the literature highlights open security issues in edge computing, particularly in the realm of encryption. A prominent limitation of edge devices is the hardware’s ability to support the computational complexity of traditional encryption methodologies (Alwarafy et al. 2020). Furthermore, encryption on the edge poses challenges in key management, the process by which cryptographic keys are transferred and stored among devices (Zeyu et al. 2020). Though edge computing provides reduced latency in data processing, encryption mechanism utilization reintroduces delay and can hinder achieving real-time results (Yu et al. 2018). The IoT is composed of a wide range of devices with a diverse set of computational capabilities, rendering a homogeneous solution for encryption impractical (Dar et al. 2019). Edge devices are often deployed in operational locations that are vulnerable to physical tampering and attacks. Sensitive data may be compromised if not sufficiently encrypted or if keys are not managed properly. Furthermore, the distributed nature and quantity of edge devices create a vast attack surface that can be compromised in other ways (Xiao et al. 2019). Understanding established mechanisms and exploring emerging methodologies for encryption reveals potential solutions for developing a robust solution for edge computing applications. 
The purpose of this document is to detail the current research for encryption methods in the edge computing space and highlight the major challenges associated with executing successful encryption on the edge.
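The key-management challenge highlighted above can be illustrated with only the Python standard library. The sketch below derives per-session keys from one provisioned master secret via HKDF-style extract-and-expand (RFC 5869), so the long-term key never leaves the device; it is an illustration of the idea, not a vetted production scheme, and all names and parameters are assumptions:

```python
import hashlib
import hmac
import os

# HKDF-style key derivation (RFC 5869) for an edge node: derive
# short-lived session keys from a single provisioned master secret.
# Illustrative sketch only -- not a vetted production scheme.

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """Condense input keying material into a pseudorandom key."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """Expand the pseudorandom key into `length` bytes bound to `info`."""
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

master = os.urandom(32)           # provisioned once per device
salt = os.urandom(16)             # public, exchanged per session
session_key = hkdf_expand(hkdf_extract(salt, master), b"sensor-v1")
print(len(session_key))           # → 32
```

Because derivation is deterministic given the same salt and context string, two parties sharing the master secret can agree on a session key without transmitting it, which is one way to limit the key-distribution surface on constrained edge hardware.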
APA, Harvard, Vancouver, ISO, and other styles
3

Zhao, George, Grang Mei, Bulent Ayhan, Chiman Kwan, and Venu Varma. DTRS57-04-C-10053 Wave Electromagnetic Acoustic Transducer for ILI of Pipelines. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), March 2005. http://dx.doi.org/10.55274/r0012049.

Full text
Abstract:
In this project, Intelligent Automation, Incorporated (IAI) and Oak Ridge National Lab (ORNL) propose a novel and integrated approach to inspect the mechanical dents and metal loss in pipelines. It combines the state-of-the-art SH-wave Electromagnetic Acoustic Transducer (EMAT) technique with detailed numerical modeling, data collection instrumentation, and advanced signal processing and pattern classification to detect and characterize mechanical defects in underground pipeline transportation infrastructures. The technique has four components: (1) thorough guided-wave modal analysis; (2) a recently developed three-dimensional (3-D) Boundary Element Method (BEM) for best operational condition selection and defect feature extraction; (3) ultrasonic Shear Horizontal (SH) wave EMAT sensor design and data collection; and (4) advanced signal-processing algorithms, such as a nonlinear split-spectrum filter, Principal Component Analysis (PCA), and Discriminant Analysis (DA), for signal-to-noise-ratio enhancement, crack signature extraction, and pattern classification. This technology not only addresses the problems with the existing methods, namely detecting mechanical dents and metal loss in pipelines consistently and reliably, but is also able to determine the defect shape and size to a certain extent.
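The PCA-based SNR enhancement named in component (4) can be sketched in a few lines: repeated noisy A-scans share one coherent echo, so a rank-1 (leading principal component) reconstruction suppresses the incoherent noise. The synthetic echo, shot count, and noise level below are illustrative assumptions, not values from the report:

```python
import numpy as np

# PCA-based SNR enhancement sketch: project a stack of repeated noisy
# ultrasonic traces onto their leading principal component, which
# captures the coherent echo and rejects incoherent noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 500)
echo = 2.0 * np.exp(-((t - 0.4) / 0.02) ** 2) * np.sin(2 * np.pi * 50.0 * t)
traces = echo + 0.3 * rng.standard_normal((32, t.size))    # 32 noisy shots

# rank-1 approximation via SVD = projection onto the first PC
U, s, Vt = np.linalg.svd(traces, full_matrices=False)
denoised = (s[0] * np.outer(U[:, 0], Vt[0])).mean(axis=0)

def snr(x):
    """Echo energy over residual-noise energy (ground truth known here)."""
    return np.linalg.norm(echo) / np.linalg.norm(x - echo)

print(snr(denoised) > snr(traces[0]))  # → True: the PC trace beats a raw shot
```

In practice the ground-truth echo is unknown and the number of retained components is chosen from the singular-value spectrum rather than fixed at one.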
APA, Harvard, Vancouver, ISO, and other styles
4

Bates, C. Richards, Melanie Chocholek, Clive Fox, John Howe, and Neil Jones. Scottish Inshore Fisheries Integrated Data System (SIFIDS): Work package (3) final report development of a novel, automated mechanism for the collection of scallop stock data. Edited by Mark James and Hannah Ladd-Jones. Marine Alliance for Science and Technology for Scotland (MASTS), 2019. http://dx.doi.org/10.15664/10023.23449.

Full text
Abstract:
[Extract from Executive Summary] This project, aimed at the development of a novel, automated mechanism for the collection of scallop stock data, was a sub-part of the Scottish Inshore Fisheries Integrated Data Systems (SIFIDS) project. The project reviewed the state-of-the-art remote sensing (geophysical and camera-based) technologies available from industry and compared these to inexpensive, off-the-shelf equipment. Sea trials were conducted on scallop dredge sites and also hand-dived scallop sites. Data were analysed manually, and tests conducted with automated processing methods. It was concluded that geophysical acoustic technologies cannot presently detect individual scallops, but the remote sensing technologies can be used for broad-scale habitat mapping of scallop harvest areas. Further, the techniques allow for monitoring these areas in terms of scallop dredging impact. Camera (video and still) imagery is effective for scallop counts and provides data that compare favourably with diver-based ground truth information for recording scallop density. Deployment of cameras is possible through inexpensive drop-down camera frames, which it is recommended be deployed on a wide-area basis for further trials. In addition, implementation of a ‘citizen science’ approach to wide-area recording is suggested to increase the stock assessment across the widest possible variety of seafloor types around Scotland. Armed with such data, a full statistical analysis could be completed and the data used with automated processing routines for future long-term monitoring of stock.
APA, Harvard, Vancouver, ISO, and other styles
5

Ridgard, Chris. Complex Structures for Manned/Unmanned Aerial Vehicles. Delivery Order 0019: Low Temp Composite Processing Mechanical Property Data. Fort Belvoir, VA: Defense Technical Information Center, January 2008. http://dx.doi.org/10.21236/ada477586.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Stern, David, and Gadi Schuster. Manipulation of Gene Expression in the Chloroplast. United States Department of Agriculture, September 2000. http://dx.doi.org/10.32747/2000.7575289.bard.

Full text
Abstract:
The steady-state level of a given mRNA is determined by its rates of transcription and degradation. The stabilities of chloroplast mRNAs vary during plant development, in part regulating gene expression. Furthermore, the fitness of the organelle depends on its ability to destroy non-functional transcripts. In addition, there is a resurgent interest by the biotechnology community in chloroplast transformation due to the public concerns over pollen transmission of introduced traits or foreign proteins. Therefore, studies into basic gene expression mechanisms in the chloroplast will open the door to take advantage of these opportunities. This project was aimed at gaining mechanistic insights into mRNA processing and degradation in the chloroplast and to engineer transcripts of varying stability in Chlamydomonas reinhardtii cells. This research uncovered new and important information on chloroplast mRNA stability, processing, degradation and translation. In particular, the processing of the 3' untranslated regions of chloroplast mRNAs was shown to be important determinants in translation. The endonucleolytic site in the 3' untranslated region was characterized by site directed mutagensis. RNA polyadenylation has been characterized in the chloroplast of Chlamydomonas reinhardtii and chloroplast transformants carrying polyadenylated sequences were constructed and analyzed. Data obtained to date suggest that chloroplasts have gene regulatory mechanisms which are uniquely adapted to their post-endosymbiotic environment, including those that regulate RNA stability. An exciting point has been reached, because molecular genetic studies have defined critical RNA-protein interactions that participate in these processes. However, much remains to be learned about these multiple pathways, how they interact with each other, and how many nuclear genes are consecrated to overseeing them. 
Chlamydomonas is an ideal model system to extend our understanding of these areas, given its ease of manipulation and the existing knowledge base, some of which we have generated.
APA, Harvard, Vancouver, ISO, and other styles
7

Modlo, Yevhenii O., Serhiy O. Semerikov, Stanislav L. Bondarevskyi, Stanislav T. Tolmachev, Oksana M. Markova, and Pavlo P. Nechypurenko. Methods of using mobile Internet devices in the formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects. [n.p.], February 2020. http://dx.doi.org/10.31812/123456789/3677.

Full text
Abstract:
An analysis of the experience of the professional training of bachelors of electromechanics in Ukraine and abroad made it possible to determine that one of the leading trends in its modernization is the synergistic integration of various engineering branches (mechanical, electrical, electronic engineering and automation) in mechatronics for the purpose of the design, manufacture, operation and maintenance of electromechanical equipment. Teaching mechatronics provides for the meaningful integration of various disciplines of the professional and practical training of bachelors of electromechanics based on the concept of modeling and the technological integration of various organizational forms and teaching methods based on the concept of mobility. Within this approach, the leading learning tools for bachelors of electromechanics are mobile Internet devices (MID) – multimedia mobile devices that provide wireless access to information and communication Internet services for collecting, organizing, storing, processing, transmitting, and presenting all kinds of messages and data. The authors reveal the main possibilities of using MID in learning to ensure equal access to education, personalized learning, instant feedback and evaluation of learning outcomes, mobile learning, productive use of time spent in classrooms, the creation of mobile learning communities, support for situated learning, the development of continuous seamless learning, bridging the gap between formal and informal learning, minimizing educational disruption in conflict and disaster areas, assisting learners with disabilities, improving the quality of communication and the management of the institution, and maximizing cost-efficiency. 
Bachelor of electromechanics competency in modeling of technical objects is a personal and vocational ability, which includes a system of knowledge, skills, experience in learning and research activities on modeling mechatronic systems, and a positive value attitude towards it; a bachelor of electromechanics should be ready and able to use methods and software/hardware modeling tools for process analysis, system synthesis, and evaluating their reliability and effectiveness for solving practical problems in the professional field. The competency structure of the bachelor of electromechanics in the modeling of technical objects is reflected in three groups of competencies: general scientific, general professional and specialized professional. The implementation of the technique of using MID in teaching bachelors of electromechanics the modeling of technical objects is the corresponding methodic, a component of which is partial methods for using MID in the formation of the general scientific component of the bachelor of electromechanics competency in modeling of technical objects, disclosed through the example of the academic disciplines “Higher mathematics”, “Computers and programming”, “Engineering mechanics”, “Electrical machines”. The leading tools for the formation of the general scientific component of the bachelor in electromechanics competency in modeling of technical objects are augmented reality mobile tools (to visualize the objects’ structure and modeling results), mobile computer mathematical systems (universal tools used at all stages of modeling learning), cloud-based spreadsheets (as modeling tools) and text editors (to write the program description of a model), mobile computer-aided design systems (to create and view the physical properties of models of technical objects) and mobile communication tools (to organize joint activity in modeling).
APA, Harvard, Vancouver, ISO, and other styles
8

Weeks and Dash Weeks. L52336 Weld Design Testing and Assessment Procedures for High-strength Pipelines Curved Wide Plate Tests. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), December 2011. http://dx.doi.org/10.55274/r0010452.

Full text
Abstract:
A variety of mechanical property tests are performed in the design, construction and maintenance phase of a pipeline. Most of the tests are performed by use of small-scale specimens with size typically in the range of a few inches to tens of inches (1 in = 25.4 mm). There are numerous test labs capable of performing most small-scale tests. These tests can be performed effectively under a variety of conditions, e.g., test temperature, strain rate, and loading configuration. More importantly, most routine small-scale tests are performed in accordance with national and international standards, ensuring the consistency of testing procedures. To confirm pipeline designs and validate material performance, it is desirable to test girth welds under realistic service conditions. Full-scale tests can incorporate certain realistic features that small-scale specimens cannot. However, these tests can be time-consuming and expensive to conduct. Very few labs can perform the tests, even with months of start-up and preparation time. There are no generally accepted, consistent test procedures among different test labs. The data acquisition and post-processing may differ from lab to lab, creating difficulties in data comparison. Full-scale tests can only be performed under selected conditions as a supplemental tool to the small-scale tests. The work described in this report focuses on the development of test procedures and instrumentation requirements for curved-wide-plate (CWP) tests. 
The results of this work can be used for: Developing a test methodology to measure the physical response of a finite-length surface-breaking flaw to axial loads applied to a girth welded line pipe section, Determining the appropriate instrumentation to fully characterize the global stress/strain response of the CWP specimen during loading, Evaluating the applicability of the test methodology for sub-ambient temperatures, and Developing a standardized test procedure for CWP testing with a wide range of test parameters.
APA, Harvard, Vancouver, ISO, and other styles
9

Engel, Bernard, Yael Edan, James Simon, Hanoch Pasternak, and Shimon Edelman. Neural Networks for Quality Sorting of Agricultural Produce. United States Department of Agriculture, July 1996. http://dx.doi.org/10.32747/1996.7613033.bard.

Full text
Abstract:
The objectives of this project were to develop procedures and models, based on neural networks, for quality sorting of agricultural produce. Two research teams, one at Purdue University and the other in Israel, coordinated their research efforts on different aspects of each objective, utilizing both melons and tomatoes as case studies. At Purdue: An expert system was developed to measure variances in human grading. Data were acquired from eight sensors: vision, two firmness sensors (destructive and nondestructive), chlorophyll from fluorescence, color sensor, electronic sniffer for odor detection, refractometer and a scale (mass). Data were analyzed and provided input for five classification models. Chlorophyll from fluorescence was found to give the best estimate of ripeness stage, while the combination of machine vision and firmness from impact performed best for quality sorting. A new algorithm was developed to estimate and minimize training size for supervised classification. A new criterion was established to choose a training set such that a recurrent auto-associative memory neural network is stabilized. Moreover, this method provides for rapid and accurate updating of the classifier over growing seasons, production environments and cultivars. Different classification approaches (parametric and non-parametric) for grading were examined. Statistical methods were found to be as accurate as neural networks in grading. Classification models by voting did not enhance the classification significantly. A hybrid model that incorporated heuristic rules and either a numerical classifier or neural network was found to be superior in classification accuracy, with half the processing required by the numerical classifier or neural network alone. In Israel: A multi-sensing approach utilizing non-destructive sensors was developed. Shape, color, stem identification, surface defects and bruises were measured using a color image processing system. 
Flavor parameters (sugar, acidity, volatiles) and ripeness were measured using a near-infrared system and an electronic sniffer. Mechanical properties were measured using three sensors: drop impact, resonance frequency and cyclic deformation. Classification algorithms for quality sorting of fruit based on multi-sensory data were developed and implemented. The algorithms included a dynamic artificial neural network, a back propagation neural network and multiple linear regression. Results indicated that classification based on multiple sensors may be applied in real-time sorting and can improve overall classification. Advanced image processing algorithms were developed for shape determination, bruise and stem identification and general color and color homogeneity. An unsupervised method was developed to extract necessary vision features. The primary advantage of the algorithms developed is their ability to learn to determine the visual quality of almost any fruit or vegetable with no need for specific modification and no a-priori knowledge. Moreover, since there is no assumption as to the type of blemish to be characterized, the algorithm is capable of distinguishing between stems and bruises. This enables sorting of fruit without knowing the fruits' orientation. A new algorithm for on-line clustering of data was developed. The algorithm's adaptability is designed to overcome some of the difficulties encountered when incrementally clustering sparse data and preserves information even with memory constraints. Large quantities of data (many images) of high dimensionality (due to multiple sensors) and new information arriving incrementally (a function of the temporal dynamics of any natural process) can now be processed. Furthermore, since the learning is done on-line, it can be implemented in real-time. The methodology developed was tested to determine external quality of tomatoes based on visual information. 
An improved model for color sorting which is stable and does not require recalibration for each season was developed for color determination. Excellent classification results were obtained for both color and firmness classification. Results indicted that maturity classification can be obtained using a drop-impact and a vision sensor in order to predict the storability and marketing of harvested fruits. In conclusion: We have been able to define quantitatively the critical parameters in the quality sorting and grading of both fresh market cantaloupes and tomatoes. We have been able to accomplish this using nondestructive measurements and in a manner consistent with expert human grading and in accordance with market acceptance. This research constructed and used large databases of both commodities, for comparative evaluation and optimization of expert system, statistical and/or neural network models. The models developed in this research were successfully tested, and should be applicable to a wide range of other fruits and vegetables. These findings are valuable for the development of on-line grading and sorting of agricultural produce through the incorporation of multiple measurement inputs that rapidly define quality in an automated manner, and in a manner consistent with the human graders and inspectors.
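The on-line clustering idea mentioned in the abstract can be sketched with sequential (MacQueen-style) k-means: each incoming sample nudges its nearest centroid with a decaying step size, so the stream is processed incrementally and never held in memory. The feature vectors, cluster count, and synthetic "quality classes" below are illustrative assumptions, and this generic algorithm stands in for the authors' adaptive method:

```python
import random

# Sketch of on-line (sequential) k-means for incremental clustering
# of streamed sensor data; a generic stand-in for the adaptive
# on-line clustering algorithm described in the abstract.
class OnlineKMeans:
    def __init__(self, k, dim):
        self.centroids = [[random.gauss(0, 1) for _ in range(dim)]
                          for _ in range(k)]
        self.counts = [0] * k

    def update(self, x):
        """Assign x to its nearest centroid and move that centroid toward x."""
        dists = [sum((c - v) ** 2 for c, v in zip(cen, x))
                 for cen in self.centroids]
        j = dists.index(min(dists))
        self.counts[j] += 1
        lr = 1.0 / self.counts[j]                  # decaying step size
        self.centroids[j] = [c + lr * (v - c)
                             for c, v in zip(self.centroids[j], x)]
        return j

random.seed(0)
km = OnlineKMeans(k=2, dim=2)
# two well-separated synthetic "quality classes" arriving as a stream
stream = [(random.gauss(0, 0.1), random.gauss(0, 0.1)) for _ in range(200)]
stream += [(random.gauss(5, 0.1), random.gauss(5, 0.1)) for _ in range(200)]
random.shuffle(stream)
for x in stream:
    km.update(x)
print(sum(km.counts))  # → 400: every streamed sample was assigned once
```

Because each update touches only one centroid and one counter, memory use is constant in the stream length, which is the property the abstract highlights for real-time sorting.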
APA, Harvard, Vancouver, ISO, and other styles
10

Fluhr, Robert, and Volker Brendel. Harnessing the genetic diversity engendered by alternative gene splicing. United States Department of Agriculture, December 2005. http://dx.doi.org/10.32747/2005.7696517.bard.

Full text
Abstract:
Our original objectives were to assess the unexplored dimension of alternative splicing as a source of genetic variation. In particular, we sought to initially establish an alternative splicing database for Arabidopsis, the only plant for which a near-complete genome has been assembled. Our goal was to then use the database, in part, to advance plant gene prediction programs that are currently a limiting factor in annotating genomic sequence data and thus will facilitate the exploitation of the ever-increasing quantity of raw genomic data accumulating for plants. Additionally, the database was to be used to generate probes for establishing high-throughput alternative transcriptome analysis in the form of a splicing-specific oligonucleotide microarray. We achieved the first goal and established a database and web site termed Alternative Splicing In Plants (ASIP, http://www.plantgdb.org/ASIP/). We also thoroughly reviewed the extent of alternative splicing in plants (Arabidopsis and rice) and proposed mechanisms for transcript processing. We noted that the repertoire of plant alternative splicing differs from that encountered in animals. For example, intron retention turned out to be the major type. This surprising development was proven by direct RNA isolation techniques. We further analyzed EST databases available from many plants and developed a process to assess their alternative splicing rate. Our results show that plant species with larger genomes have enhanced rates of alternative splicing. We did advance gene prediction accuracy in plants by incorporating scoring for non-canonical introns. Our data and programs are now being used in the continuing annotation of plant genomes of agronomic importance, including corn, soybean, and tomato. 
Based on the gene annotation data developed in the early part of the project, it turned out that specific probes for different exons could not be scaled up to a large array because no uniform hybridization conditions could be found. Therefore, we modified our original objective to design and produce an oligonucleotide microarray for probing alternative splicing and realized that it may be reasonable to investigate the extent of alternative splicing using novel commercial whole genome arrays. This possibility was directly examined by establishing algorithms for the analysis of such arrays. The predictive value of the algorithms was then shown by isolation and verification of alternative splicing predictions from the published whole genome array databases. The BARD-funded work provides a significant advance in understanding the extent and possible roles of alternative splicing in plants as well as a foundation for advances in computational gene prediction.
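The intron-retention calls central to the ASIP analysis reduce to a simple interval test: a transcript retains an annotated intron if one of its aligned exonic blocks spans that intron completely. The sketch below illustrates the idea; the coordinates and the exact boundary convention are invented for illustration, not taken from the ASIP pipeline:

```python
# Sketch of calling intron retention from aligned transcript blocks:
# an annotated intron is "retained" if some aligned exonic block of
# the transcript contains it entirely. Coordinates are invented.

def retained_introns(transcript_blocks, introns):
    """Return the annotated introns fully contained in an aligned block."""
    hits = []
    for istart, iend in introns:
        for bstart, bend in transcript_blocks:
            if bstart < istart and bend > iend:   # block spans the intron
                hits.append((istart, iend))
                break
    return hits

# gene model with two introns; the EST alignment keeps the first unspliced
introns = [(100, 180), (260, 330)]
est_blocks = [(40, 250), (330, 420)]              # hypothetical alignment
print(retained_introns(est_blocks, introns))      # → [(100, 180)]
```

Scaling this check across EST libraries from many species is essentially how per-genome alternative-splicing rates, like those the abstract reports, can be tabulated.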
APA, Harvard, Vancouver, ISO, and other styles