
Journal articles on the topic 'Aeronautics, data processing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Aeronautics, data processing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Sin Win, Ei Phyu. "Urban Road Change Detection using Morphological Processing." Qubahan Academic Journal 1, no. 1 (February 28, 2021): 57–61. http://dx.doi.org/10.48161/qaj.v1n1a29.

Abstract:
The primary aim of this research is to design a road-extraction algorithm for processing National Aeronautics and Space Administration satellite images. Road network detection is an important task for disaster emergency response, intelligent transportation systems, and real-time road network updating, and such a system is useful for urban and rural development planning. The development of a town or village depends not only on its building and population density, but also on the systematic development of its roads. The research focuses primarily on morphological image processing. As an application area, we use National Aeronautics and Space Administration imagery of Monywa, Upper Myanmar, acquired from 2009 to 2020, to examine how the roads and the city have developed. Extracting roads from satellite images is a hard problem with many practical applications. The main steps in the model are image enhancement, segmentation, application of morphological operators, and finally detection of the road network. Google Earth Pro is used to obtain the necessary images and to search for road improvements. After collecting images from different seasons and years, precise answers can be found by combining them with accurate algorithms. In addition to the significant benefits of Google Earth Pro, this research demonstrates the ability to make good use of satellite imagery and to integrate it with outside expertise to save money, save time, and provide accurate answers. The method is simulated in the MATLAB programming language.
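The paper's pipeline (enhance, segment, apply morphological operators, extract the network) is implemented in MATLAB; the sketch below is a hypothetical Python analogue using scikit-image, with an assumed input file name and illustrative structuring-element sizes.

```python
# Hypothetical Python analogue of the MATLAB pipeline described above:
# enhance -> segment -> morphological cleanup -> road-network skeleton.
import numpy as np
from skimage import io, exposure, filters, morphology

image = io.imread("monywa_2020.png", as_gray=True)  # assumed input file

enhanced = exposure.equalize_adapthist(image)          # contrast enhancement
binary = enhanced > filters.threshold_otsu(enhanced)   # segmentation

# Morphological operators: remove speckle, close small gaps in road segments.
cleaned = morphology.opening(binary, morphology.disk(2))
cleaned = morphology.closing(cleaned, morphology.disk(5))
cleaned = morphology.remove_small_objects(cleaned, min_size=200)

roads = morphology.skeletonize(cleaned)  # one-pixel-wide road network
io.imsave("roads_2020.png", (roads * 255).astype(np.uint8))
```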
2

Mitrescu, Cristian, Steven Miller, Jeffrey Hawkins, Tristan L’Ecuyer, Joseph Turk, Philip Partain, and Graeme Stephens. "Near-Real-Time Applications of CloudSat Data." Journal of Applied Meteorology and Climatology 47, no. 7 (July 1, 2008): 1982–94. http://dx.doi.org/10.1175/2007jamc1794.1.

Abstract:
Abstract Within 2 months of its launch in April 2006 as part of the Earth Observing System A-Train satellite constellation, the National Aeronautics and Space Administration Earth System Science Pathfinder (ESSP) CloudSat mission began making significant contributions toward broadening the understanding of detailed cloud vertical structures around the earth. Realizing the potential benefit of CloudSat to both the research objectives and operational requirements of the U.S. Navy, the Naval Research Laboratory coordinated early on with the CloudSat Data Processing Center to receive and process first-look 94-GHz Cloud Profiling Radar datasets in near–real time (4–8 h latency), thereby making the observations more relevant to the operational community. Applications leveraging these unique data, described herein, include 1) analysis/validation of cloud structure and properties derived from conventional passive radiometers, 2) tropical cyclone vertical structure analysis, 3) support of research field programs, 4) validation of numerical weather prediction model cloud fields, and 5) quantitative precipitation estimation in light rainfall regimes.
3

Lewis, Tracy R., and Huseyin Yildirim. "Managing Dynamic Competition." American Economic Review 92, no. 4 (August 1, 2002): 779–97. http://dx.doi.org/10.1257/00028280260344461.

Abstract:
In many important high-technology markets, including software development, data processing, communications, aeronautics, and defense, suppliers learn through experience how to provide better service at lower cost. This paper examines how a buyer designs dynamic competition among rival suppliers to exploit learning economies while minimizing the costs of becoming locked in to one producer. Strategies for controlling dynamic competition include the handicapping of more efficient suppliers in procurement competitions, the protection and allocation of intellectual property, and the sharing of information among rival suppliers.
4

Monika, Putri, Budi Nurani Ruchjana, and Atje Setiawan Abdullah. "GSTARI-X-ARCH Model with Data Mining Approach for Forecasting Climate in West Java." Computation 10, no. 12 (November 23, 2022): 204. http://dx.doi.org/10.3390/computation10120204.

Abstract:
Spatiotemporal models for stationary and non-stationary data are known, respectively, as the Generalized Space–Time Autoregressive (GSTAR) model and the Generalized Space–Time Autoregressive Integrated (GSTARI) model. The application of these models to forecasting climate with rainfall variables is also influenced by exogenous variables such as humidity, and the assumption of constant error variance is often violated. Therefore, this study aims to design a spatiotemporal model that includes exogenous variables and handles non-constant error variance. The proposed model is named GSTARI-X-ARCH. The model is used to predict climate phenomena in West Java with data obtained from the National Aeronautics and Space Administration Prediction of Worldwide Energy Resources (NASA POWER) service. Climate data are big data, so we used knowledge discovery in databases (KDD) in this study. The pre-processing step covers collecting and cleaning data. Then, the data mining process with the GSTARI-X-ARCH model follows the Box–Jenkins procedure: model identification, parameter estimation, and diagnostic checking. Finally, a post-processing step for visualization and interpretation of the forecast results was conducted. This research is expected to contribute to the development of spatiotemporal models, with the forecast results serving as recommendations to the relevant agencies.
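The full GSTARI-X-ARCH model is beyond a short example, but the space-time autoregressive core can be illustrated. The sketch below is a heavily simplified first-order GSTAR-style regression fitted by least squares on synthetic data; the differencing (I), the specific exogenous series, and the ARCH error model of the paper are all omitted.

```python
# Minimal sketch of a first-order space-time autoregression in the spirit of
# GSTAR with one exogenous regressor; data and weights are illustrative.
import numpy as np

rng = np.random.default_rng(0)
T, N = 120, 4                      # months, stations (synthetic data)
z = rng.normal(size=(T, N))        # rainfall anomalies per station
x = rng.normal(size=(T, N))        # exogenous variable, e.g. humidity
W = np.full((N, N), 1 / (N - 1))   # uniform spatial weight matrix
np.fill_diagonal(W, 0.0)

# Stack regressors: own lag, spatially weighted lag, exogenous input.
y = z[1:].ravel()
X = np.column_stack([z[:-1].ravel(),
                     (z[:-1] @ W.T).ravel(),
                     x[1:].ravel()])
phi0, phi1, beta = np.linalg.lstsq(X, y, rcond=None)[0]

# One-step-ahead forecast for all stations.
z_hat = phi0 * z[-1] + phi1 * (W @ z[-1]) + beta * x[-1]
print(z_hat)
```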
5

Nugroho, Jalu Tejo, Zylshal Zylshal, and Dony Kushardono. "LAPAN-A3 SATELLITE DATA ANALYSIS FOR LAND COVER CLASSIFICATION (CASE STUDY: TOBA LAKE AREA, NORTH SUMATRA)." International Journal of Remote Sensing and Earth Sciences (IJReSES) 15, no. 1 (July 6, 2018): 71. http://dx.doi.org/10.30536/j.ijreses.2018.v15.a2782.

Abstract:
LAPAN-A3 is the 3rd-generation remote sensing satellite developed by the National Institute of Aeronautics and Space (LAPAN). Its camera provides imagery with 15 m spatial resolution and is able to image a swath 120 km wide. This research analyzes the performance of LAPAN-A3 satellite data for classifying land cover in the Toba Lake area, North Sumatra. Data processing runs from the selection of the region of interest to the accuracy assessment. Supervised classification with a maximum likelihood approach was applied, and the results were evaluated with the confusion matrix method. The land cover is classified into five classes: water, bare land, agriculture, forest, and secondary forest. The resulting accuracy is 93.71%, which shows that LAPAN-A3 data can classify land cover accurately. The data are expected to complement the need for satellite data with medium spatial resolution.
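As a small illustration of the evaluation step named above, the following sketch computes a confusion matrix and overall accuracy for a five-class map with scikit-learn; the reference and predicted labels are invented stand-ins, not the study's data.

```python
# Confusion matrix and overall accuracy for a five-class land-cover map.
import numpy as np
from sklearn.metrics import confusion_matrix, accuracy_score

classes = ["water", "bare land", "agriculture", "forest", "secondary forest"]
reference = np.array([0, 0, 1, 2, 3, 3, 4, 2, 1, 0])   # ground-truth labels
predicted = np.array([0, 0, 1, 2, 3, 4, 4, 2, 1, 0])   # classifier output

cm = confusion_matrix(reference, predicted)
print(cm)
print(f"Overall accuracy: {accuracy_score(reference, predicted):.2%}")
```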
6

Li, Xiao Ping, Hui Jie Cui, Si Chen, Chun Hui Liang, Feng Liu, Jian Qiang Xu, and Xu Mao. "The Research on Behavioral Status Judgment Method Based on Skeleton Modeling and Vector Processing." Applied Mechanics and Materials 190-191 (July 2012): 321–28. http://dx.doi.org/10.4028/www.scientific.net/amm.190-191.321.

Abstract:
Fast, simple, and reliable methods for judging the behavioral status of video objects are a common research basis for many application fields, including public safety monitoring, national defense, aeronautics, sports, and so on. This paper focuses on fast parametric skeleton modeling and a simple vectorization process. It proposes an improved thinning algorithm for fast skeleton modeling, locates the key points with a new scanning and marking method, and, from the coordinates of the key points, constructs a pose-estimation matrix and formulates the behavioral status judgment as the problem of calculating the angle between vectors. This method not only resolves the problems of large data volumes and high algorithmic complexity in traditional approaches, but also lays the foundation for extended research and applications in multi-field intelligent video behavior analysis.
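The final judgment step reduces to the angle between vectors built from skeleton key points; a minimal numpy version of that calculation, with invented key-point vectors, might look like this:

```python
# Angle between two limb vectors derived from skeleton key points.
import numpy as np

def angle_between(u, v):
    """Angle in degrees between two key-point vectors."""
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

torso = np.array([0.0, 1.0])   # illustrative key-point differences
arm = np.array([1.0, 1.0])
print(f"{angle_between(torso, arm):.1f} degrees")  # 45.0
```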
7

Ćwiklak, Janusz, Marek Grzegorzewski, and Kamil Krasuski. "The Application of the BSSD Iono-Free Linear Combination Method in the Processing of Aircraft Positioning." Journal of KONES 26, no. 3 (September 1, 2019): 15–21. http://dx.doi.org/10.2478/kones-2019-0052.

Abstract:
Abstract The article presents the results of research into the use of the differencing technique of BSSD (Between Satellite Single Difference) observations for the Iono-Free Linear Combination (LC) in the GPS system for the needs of aircraft positioning. As part of the investigation, a positioning algorithm for the BSSD Iono-Free LC method is presented. In addition, an experimental test was conducted in which raw observational data and GPS navigation data were exploited in order to recover the aircraft position. The examination was conducted for a Cessna 172 with an on-board dual-frequency Topcon HiperPro receiver. The experiment reports the average errors of the Cessna 172 position in the XYZ geocentric frame and in the ellipsoidal BLh frame. Furthermore, the article presents the results of DOP (Dilution of Precision) coefficients, the Chi-square internal reliability test, and the HPL and VPL confidence levels for a GNSS precision approach (PA) in air transport. The calculations were performed in the original APS software (APS, Aircraft Positioning Software) developed in the Department of Air Navigation of the Faculty of Aeronautics at the Polish Air Force University.
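The first-order ionosphere-free combination and the between-satellite differencing are standard and can be sketched compactly; the pseudorange values below are invented, and this is not the APS software's code.

```python
# Iono-Free linear combination and a between-satellite single difference
# (BSSD) for dual-frequency GPS pseudoranges.
F1, F2 = 1575.42e6, 1227.60e6  # standard GPS L1 and L2 frequencies [Hz]

def iono_free(p1, p2):
    """First-order ionosphere-free pseudorange combination [m]."""
    return (F1**2 * p1 - F2**2 * p2) / (F1**2 - F2**2)

# Pseudoranges to two satellites on both frequencies [m] (illustrative).
sat_a = iono_free(p1=21_000_123.4, p2=21_000_129.1)
sat_b = iono_free(p1=23_456_789.0, p2=23_456_795.2)

bssd = sat_a - sat_b  # differencing removes the receiver clock error
print(f"BSSD iono-free observable: {bssd:.3f} m")
```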
8

Corcoran, Forrest, and Christopher E. Parrish. "Diffuse Attenuation Coefficient (Kd) from ICESat-2 ATLAS Spaceborne Lidar Using Random-Forest Regression." Photogrammetric Engineering & Remote Sensing 87, no. 11 (November 1, 2021): 831–40. http://dx.doi.org/10.14358/pers.21-00013r2.

Abstract:
This study investigates a new method for measuring water turbidity—specifically, the diffuse attenuation coefficient of downwelling irradiance Kd—using data from a spaceborne, green-wavelength lidar aboard the National Aeronautics and Space Administration's ICESat-2 satellite. The method enables us to fill nearshore data voids in existing Kd data sets and provides a more direct measurement approach than methods based on passive multispectral satellite imagery. Furthermore, in contrast to other lidar-based methods, it does not rely on extensive signal processing or the availability of the system impulse response function, and it is designed to be applied globally rather than at a specific geographic location. The model was tested using Kd measurements from the National Oceanic and Atmospheric Administration's Visible Infrared Imaging Radiometer Suite sensor at 94 coastal sites spanning the globe, with Kd values ranging from 0.05 to 3.6 m−1. The results demonstrate the efficacy of the approach and serve as a benchmark for future machine-learning regression studies of turbidity using ICESat-2.
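A minimal sketch of the regression setup, assuming synthetic stand-in data (the study itself maps ICESat-2 ATLAS photon features to VIIRS-derived Kd targets):

```python
# Random-forest regression of Kd from lidar-derived features (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.random((500, 4))                               # e.g. photon features
kd = 0.05 + 3.55 * X[:, 0] + rng.normal(0, 0.1, 500)   # target in 0.05-3.6 1/m

X_tr, X_te, y_tr, y_te = train_test_split(X, kd, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out sites: {model.score(X_te, y_te):.3f}")
```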
9

Ramasso, Emmanuel, Benoît Verdin, and Gaël Chevallier. "Monitoring a Bolted Vibrating Structure Using Multiple Acoustic Emission Sensors: A Benchmark." Data 7, no. 3 (March 2, 2022): 31. http://dx.doi.org/10.3390/data7030031.

Abstract:
The dataset presented in this work, called ORION-AE, is made of raw AE data streams collected by three different AE sensors and a laser vibrometer during five campaigns of measurements by varying the tightening conditions of two bolted plates submitted to harmonic vibration tests. With seven different operating conditions, this dataset was designed to challenge supervised and unsupervised machine/deep learning as well as signal processing methods which are developed for material characterization or structural health monitoring (SHM). One motivation of this work was to create a common benchmark for comparing data-driven methods dedicated to AE data interpretation. The dataset is made of time series collected during an experiment designed to reproduce the loosening phenomenon observed in aeronautics, automotive, or civil engineering structures where parts are assembled together by means of bolted joints. Monitoring loosening in jointed structures during operation remains challenging because contact and friction in bolted joints induce a nonlinear stochastic behavior.
10

Nugroho, Jalu Tejo, Zylshal, Nurwita Mustika Sari, and Dony Kushardono. "A COMPARISON OF OBJECT-BASED AND PIXEL-BASED APPROACHES FOR LAND USE/LAND COVER CLASSIFICATION USING LAPAN-A2 MICROSATELLITE DATA." International Journal of Remote Sensing and Earth Sciences (IJReSES) 14, no. 1 (June 21, 2017): 27. http://dx.doi.org/10.30536/j.ijreses.2017.v14.a2680.

Abstract:
In recent years, the small satellite industry has grown rapidly and become important, especially with regard to operational cost, technology adaptation, and mission design. One mission of LAPAN-A2, the 2nd-generation microsatellite developed by the Indonesian National Institute of Aeronautics and Space (LAPAN), is Earth observation using a digital camera that provides imagery with 3.5 m spatial resolution. The aim of this research is to compare object-based and pixel-based classification of land use/land cover (LU/LC) in order to determine the appropriate classification method for LAPAN-A2 data processing (case study: Semarang, Central Java). The LU/LC was classified into eleven classes: sea, river, fish pond, tree, grass, road, building 1, building 2, building 3, building 4, and rice field. The accuracy of the classification outputs was assessed using a confusion matrix. The object-based and pixel-based classification methods yield overall accuracies of 31.63% and 61.61%, respectively. Based on these results, the blurring effect in LAPAN-A2 data is thought to be the main cause of the accuracy decrease. Pixel-based classification is therefore suggested for LAPAN-A2 data processing.
11

Palaseanu-Lovejoy, Monica, Marina Bisson, Claudia Spinetti, Maria Fabrizia Buongiorno, Oleg Alexandrov, and Thomas Cecere. "High-Resolution and Accurate Topography Reconstruction of Mount Etna from Pleiades Satellite Data." Remote Sensing 11, no. 24 (December 12, 2019): 2983. http://dx.doi.org/10.3390/rs11242983.

Abstract:
The areas characterized by dynamic and rapid morphological changes need accurate topography information with frequent updates, especially if these are populated and involve infrastructures. This is particularly true in active volcanic areas such as Mount (Mt.) Etna, located in the northeastern portion of Sicily, Italy. The Mt. Etna volcano is periodically characterized by explosive and effusive eruptions and represents a potential hazard for several thousands of local people and hundreds of tourists present on the volcano itself. In this work, a high-resolution, high vertical accuracy digital surface model (DSM) of Mt. Etna was derived from Pleiades satellite data using the National Aeronautics and Space Administration (NASA) Ames Stereo Pipeline (ASP) tool set. We believe that this is the first time that the ASP using Pleiades imagery has been applied to Mt. Etna with sub-meter vertical root mean square error (RMSE) results. The model covers an area of about 400 km2 with a spatial resolution of 2 m and centers on the summit portion of the volcano. The model was validated by using a set of reference ground control points (GCP) obtaining a vertical RMSE of 0.78 m. The described procedure provides an avenue to obtain DSMs at high spatial resolution and elevation accuracy in a relatively short amount of processing time, making the procedure itself suitable to reproduce topographies often indispensable during the emergency management case of volcanic eruptions.
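The GCP-based validation named above can be illustrated compactly; the sketch below samples a DSM at ground-control-point locations with rasterio and computes the vertical RMSE. File name, coordinates, and heights are placeholders, and this is not the authors' processing code.

```python
# Vertical RMSE of a DSM against surveyed ground control points.
import numpy as np
import rasterio

gcp_xy = [(500100.0, 4177800.0), (501250.0, 4178900.0)]  # map coordinates
gcp_z = np.array([2890.4, 2756.1])                        # surveyed heights [m]

with rasterio.open("etna_dsm_2m.tif") as dsm:
    dsm_z = np.array([z[0] for z in dsm.sample(gcp_xy)])  # first-band values

rmse = np.sqrt(np.mean((dsm_z - gcp_z) ** 2))
print(f"Vertical RMSE: {rmse:.2f} m")
```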
12

Ahmed, M., Quazi Hassan, Masoud Abdollahi, and Anil Gupta. "Processing of Near Real Time Land Surface Temperature and Its Application in Forecasting Forest Fire Danger Conditions." Sensors 20, no. 4 (February 12, 2020): 984. http://dx.doi.org/10.3390/s20040984.

Abstract:
Near-real-time (NRT) remote sensing derived land surface temperature (Ts) data are of utmost importance in various applications concerning natural hazards and disasters. NRT Ts data products acquired by the space-based MODIS (Moderate Resolution Imaging Spectroradiometer) instrument are made available to users free of cost by NASA's (National Aeronautics and Space Administration) Land, Atmosphere Near real-time Capability for Earth Observing System (LANCE for EOS). These Ts products are swath data with 5 min temporal increments of satellite acquisition, and the average latency until public availability is 60-125 min. The swath data of Ts require a specialized tool, i.e., HEG (HDF-EOS to GeoTIFF conversion tool), to process the data and make them useful for further analysis. However, the file naming convention of the swath data files available in LANCE is not suitable for downloading data over an area of interest (AOI) to be processed by HEG. In this study, we developed a method/algorithm to overcome these issues by identifying the appropriate swath data files for an AOI that can then be processed by HEG. We used Terra MODIS NRT swath data of Ts and applied them to an existing framework for forecasting forest fires (as a case study) to evaluate the performance of the processed Ts. We successfully selected appropriate swath data files of Ts for our study area, processed them with HEG, and finally generated a fire danger map with the existing forecasting model. Our proposed method/algorithm can be applied to any swath data product available in LANCE for any location in the world.
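The granule-selection idea can be sketched from the file names alone. The example below assumes the usual MODIS swath naming pattern (product.AYYYYDDD.HHMM...), parses the acquisition start time, and keeps granules in a time window of interest; it is an illustration, not the paper's algorithm, which also matches the AOI spatially.

```python
# Select MODIS NRT swath granules by acquisition time parsed from file names.
from datetime import datetime, timedelta

def granule_time(name):
    """Acquisition start time parsed from a MODIS swath file name."""
    _, adate, hhmm = name.split(".")[:3]
    return datetime.strptime(adate[1:] + hhmm, "%Y%j%H%M")

files = ["MOD11_L2.A2020041.2030.061.NRT.hdf",
         "MOD11_L2.A2020041.2035.061.NRT.hdf",
         "MOD11_L2.A2020042.0455.061.NRT.hdf"]

start = datetime(2020, 2, 10, 20, 0)      # window of interest over the AOI
end = start + timedelta(hours=1)
selected = [f for f in files if start <= granule_time(f) < end]
print(selected)
```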
13

Nugroho, J. T., S. Sulma, K. I. N. Rahmi, and S. Harini. "Development of Drought Information System at Remote Sensing Application Center, National Institute of Aeronautics and Space (LAPAN)." IOP Conference Series: Earth and Environmental Science 893, no. 1 (November 1, 2021): 012080. http://dx.doi.org/10.1088/1755-1315/893/1/012080.

Abstract:
Abstract The National Earth Monitoring System (SPBN) is a natural resource and disaster information system developed by the Remote Sensing Application Center, National Institute of Aeronautics and Space (LAPAN), Indonesia. The drought information system is one of the SPBN disaster information products, consisting of the Standardized Precipitation Index (SPI), Vegetation Greenness Level (TKV), and monthly accumulated rainfall information. The quality of the information products is being improved toward automated data processing as well as the provision of user-oriented products. The purpose of our research is to report on the existing drought information products at SPBN-LAPAN, to present briefly the automation process, and to analyze the resulting products. In this study, a "new" drought index, developed by blending two datasets (a TKV dataset characterizing agricultural drought and a monthly rainfall dataset characterizing meteorological drought) using a threshold method, is introduced. The level of the drought index is divided into five classes, namely cloud/water, severely dry, dry and normal
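To make the threshold-blending idea concrete, here is a toy numpy rule combining a greenness dataset with monthly rainfall; the thresholds and class cut-offs are invented placeholders, not LAPAN's operational values.

```python
# Toy threshold rule blending vegetation greenness (TKV) and monthly rainfall.
import numpy as np

tkv = np.array([0.1, 0.3, 0.45, 0.7])        # vegetation greenness level
rain = np.array([30.0, 60.0, 120.0, 220.0])  # monthly rainfall [mm]
cloud_water = np.array([True, False, False, False])

index = np.full(tkv.shape, "normal", dtype=object)
index[(tkv < 0.2) & (rain < 50)] = "severely dry"
index[(tkv >= 0.2) & (tkv < 0.5) & (rain < 100)] = "dry"
index[cloud_water] = "cloud/water"
print(index)
```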
14

Álvarez, Esther, Luis Enrique Sánchez, and Santiago Molins. "New technological surveillance and benchmarking theories and tools applied to sustainable and strategic planning of the naval industry." Ciencia y tecnología de buques 5, no. 10 (January 9, 2012): 81. http://dx.doi.org/10.25043/19098642.61.

Abstract:
Since their beginnings, companies have established procedures to observe their competitors. Methods for obtaining this kind of information have evolved with the internet era; a plethora of tools is nowadays available for this job. As a consequence, a new problem has emerged: documentary noise, which keeps companies from being able to process and benefit from the huge amount of information gathered. Strategic planning relies mainly on obtaining environmental knowledge, so companies need help in dealing with this documentary noise; technological surveillance and benchmarking are the preferred methodologies for achieving this objective, coping with data produced by automatic internet tools such as search engines. Better-qualified results are produced by bringing new theories on information gathering and processing into both tools. This article presents empirical results on the application of a demonstrative technological surveillance system based on different R&D management structures, relying on benchmarking indicators for the naval and aeronautics industries.
15

Lawrence, Z. D., G. L. Manney, K. Minschwaner, M. L. Santee, and A. Lambert. "Comparisons of polar processing diagnostics from 34 years of the ERA-Interim and MERRA reanalyses." Atmospheric Chemistry and Physics 15, no. 7 (April 13, 2015): 3873–92. http://dx.doi.org/10.5194/acp-15-3873-2015.

Abstract:
Abstract. We present a comprehensive comparison of polar processing diagnostics derived from the National Aeronautics and Space Administration (NASA) Modern Era Retrospective-analysis for Research and Applications (MERRA) and the European Centre for Medium-Range Weather Forecasts (ECMWF) Interim Reanalysis (ERA-Interim). We use diagnostics that focus on meteorological conditions related to stratospheric chemical ozone loss based on temperatures, polar vortex dynamics, and air parcel trajectories to evaluate the effects these reanalyses might have on polar processing studies. Our results show that the agreement between MERRA and ERA-Interim changes significantly over the 34 years from 1979 to 2013 in both hemispheres and in many cases improves. By comparing our diagnostics during five time periods when an increasing number of higher-quality observations were brought into these reanalyses, we show how changes in the data assimilation systems (DAS) of MERRA and ERA-Interim affected their meteorological data. Many of our stratospheric temperature diagnostics show a convergence toward significantly better agreement, in both hemispheres, after 2001 when Aqua and GOES (Geostationary Operational Environmental Satellite) radiances were introduced into the DAS. Other diagnostics, such as the winter mean volume of air with temperatures below polar stratospheric cloud formation thresholds (VPSC) and some diagnostics of polar vortex size and strength, do not show improved agreement between the two reanalyses in recent years when data inputs into the DAS were more comprehensive. The polar processing diagnostics calculated from MERRA and ERA-Interim agree much better than those calculated from earlier reanalysis data sets. We still, however, see fairly large differences in many of the diagnostics in years prior to 2002, raising the possibility that the choice of one reanalysis over another could significantly influence the results of polar processing studies. After 2002, we see overall good agreement among the diagnostics, which demonstrates that the ERA-Interim and MERRA reanalyses are equally appropriate choices for polar processing studies of recent Arctic and Antarctic winters.
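Two of the diagnostics compared in this and the following entries, the polar minimum temperature (Tmin) and the area below a polar stratospheric cloud threshold (APSC), can be illustrated schematically. The grid, temperature field, and the flat 195 K threshold below are simplified stand-ins; the papers use reanalysis fields and composition-dependent PSC thresholds.

```python
# Schematic Tmin and APSC diagnostics from a lat-lon temperature field.
import numpy as np

T_NAT = 195.0                                  # nominal PSC threshold [K]
lats = np.linspace(-90, 90, 73)
lons = np.linspace(0, 357.5, 144)
temp = 185.0 + 30.0 * np.cos(np.radians(lats))[:, None] * np.ones((73, 144))

polar = lats <= -60                            # SH polar cap rows
t_min = temp[polar].min()

# Area-weight each cell by cos(latitude); sum SH cells below the threshold.
weights = np.cos(np.radians(lats))[:, None] * np.ones((73, 144))
sh = (lats <= 0)[:, None]                      # Southern Hemisphere mask
frac_psc = weights[(temp < T_NAT) & sh].sum() / weights[lats <= 0].sum()

print(f"SH Tmin: {t_min:.1f} K, APSC: {100 * frac_psc:.1f}% of the hemisphere")
```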
16

Lawrence, Z. D., G. L. Manney, K. Minschwaner, M. L. Santee, and A. Lambert. "Comparisons of polar processing diagnostics from 34 years of the ERA-Interim and MERRA reanalyses." Atmospheric Chemistry and Physics Discussions 14, no. 22 (December 12, 2014): 31361–408. http://dx.doi.org/10.5194/acpd-14-31361-2014.

Abstract:
Abstract. We present a comprehensive comparison of polar processing diagnostics derived from the National Aeronautics and Space Administration (NASA) Modern Era Retrospective-analysis for Research and Applications (MERRA) and the European Centre for Medium-Range Weather Forecasts (ECMWF) Interim Reanalysis (ERA-Interim). We use diagnostics that focus on meteorological conditions related to stratospheric chemical ozone loss based on temperatures, polar vortex dynamics, and air parcel trajectories to evaluate the effects these reanalyses might have on polar processing studies. Our results show that the agreement between MERRA and ERA-Interim changes significantly over the 34 years from 1979 through 2013 in both hemispheres, and in many cases improves. By comparing our diagnostics during five time periods when an increasing number of higher quality observations were brought into these reanalyses, we show how changes in the data assimilation systems (DAS) of MERRA and ERA-Interim affected their meteorological data. Many of our stratospheric temperature diagnostics show a convergence toward significantly better agreement, in both hemispheres, after 2001 when Aqua and GOES (Geostationary Operational Environmental Satellite) radiances were introduced into the DAS. Other diagnostics, such as the winter mean volume of air with temperatures below polar stratospheric cloud formation thresholds (VPSC) and some diagnostics of polar vortex size and strength, do not show improved agreement between the two reanalyses in recent years when data inputs into the DAS were more comprehensive. The polar processing diagnostics calculated from MERRA and ERA-Interim agree much better than those calculated from earlier reanalysis datasets. We still, however, see fairly large relative biases in many of the diagnostics in years prior to 2002, raising the possibility that the choice of one reanalysis over another could significantly influence the results of polar processing studies. After 2002, we see overall good agreement among the diagnostics, which demonstrates that the ERA-Interim and MERRA reanalyses are equally appropriate choices for polar processing studies of recent Arctic and Antarctic winters.
17

Zweifel, Peter, Davor Mance, Jan ten Pierick, Domenico Giardini, Cedric Schmelzbach, Thomas Haag, Tobias Nicollier, et al. "Seismic High-Resolution Acquisition Electronics for the NASA InSight Mission on Mars." Bulletin of the Seismological Society of America 111, no. 6 (October 19, 2021): 2909–23. http://dx.doi.org/10.1785/0120210071.

Abstract:
ABSTRACT The Seismic Experiment for Interior Structures (SEIS) was deployed on Mars in November 2018 and began science operations in March 2019. SEIS is the primary instrument of the Interior Exploration using Seismic Investigations, Geodesy and Heat Transport (InSight) mission, which was launched by the National Aeronautics and Space Administration (NASA). The acquisition and control (AC) electronics is a key element of SEIS. The AC acquires the seismic signals of the two sets of seismic sensors with high resolution, stores the data in its local nonvolatile memory for later transmission by the lander, and controls the numerous functions of SEIS. In this article, we present an overview of the AC with its connections to the sensors and to the lander, as well as its functionality. We describe the elements of the acquisition chains and filters, and discuss the performance of the seismic and temperature channels. Furthermore, we outline the safety functions and health monitoring, which are of paramount importance for reliable operation on Mars. In addition, we analyze an artefact affecting the seismic data referred to as the “tick-noise” and provide a method to remove this artefact by post-processing the data.
18

Jatlaoui, Mohamed Mehdi, Daniela Dragomirescu, Mariano Ercoli, Michael Krämer, Samuel Charlot, Patrick Pons, Hervé Aubert, and Robert Plana. "Wireless communicating nodes at 60 GHz integrated on flexible substrate for short-distance instrumentation in aeronautics and space." International Journal of Microwave and Wireless Technologies 4, no. 1 (November 17, 2011): 109–17. http://dx.doi.org/10.1017/s1759078711000961.

Abstract:
This paper presents research done at LAAS-CNRS in the context of the "NANOCOMM" project. This project aims to demonstrate the potential of nanotechnology for the development of reconfigurable, ultra-sensitive, low-consumption, easy-to-install sensor networks with high performance in terms of reliability, in line with the requirements of aeronautics and space. Each node of the sensor network is composed of nano-sensors, a transceiver, and a planar antenna. In this project, three-dimensional (3D) heterogeneous integration of these different components on a flexible polyimide substrate is planned. Two types of sensors were selected: strain gauges are used for the structural health monitoring (SHM) application, and electrochemical cells are used to demonstrate the ability to detect frost. Sensor data are processed and transmitted to the reader unit using an ultra-wide band (UWB) transceiver (digital baseband and radiofrequency (RF) head). The design and implementation of reconfigurable wireless communication architectures are provided according to the application requirements using nanoscale 65 nm CMOS technology. The transceiver is integrated on the flexible substrate using the flip-chip technique, and a 60 GHz planar antenna is connected to it for wireless data transmission. This paper focuses on the 3D integration techniques and the technological process used to realize such communicating nano-objects on a polyimide substrate. The first assembly tests were carried out, and tests of interconnection quality and electrical contacts (Daisy Chain, calibration kit, etc.) were also performed with good results. A bump contact resistance of 15 mΩ was measured.
19

Kostinski, Alexander B., Brian D. James, and Wolfgang-M. Boerner. "Polarimetric matched filter for coherent imaging." Canadian Journal of Physics 66, no. 10 (October 1, 1988): 871–77. http://dx.doi.org/10.1139/p88-144.

Abstract:
In this paper we focus on image-contrast optimization between two rough-surface classes. Our approach is based strictly on polarimetric filtering, and therefore, no digital image-processing techniques are employed. The approach is tested on a complete polarimetric synthetic aperture radar (POL-SAR) image of the San Francisco Bay area. The data have been taken with the National Aeronautics and Space Administration – Jet Propulsion Laboratory CV-990 L-band POL-SAR system, where eight real numbers (complex elements of a 2 × 2 polarization scattering matrix) are associated with each image pixel. Optimal transmitted polarizations (corresponding to maxima or minima of reflected energy) are found for each image pixel, and the results are analyzed statistically via a set of joint two-dimensional histograms. This is done for both of the rough-surface classes. The image response to the "optimal" incident polarization is then simulated digitally by adjusting the receiver polarization according to the modes of the histograms. The corresponding images are computed and displayed with significant image-contrast improvement.
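As a pointer to how the per-pixel power optimization can be computed, here is a numpy sketch under stated assumptions: for a 2 × 2 complex scattering matrix S, the transmit polarizations that extremize backscattered power are the eigenvectors of the Graves power matrix S^H S. The matrix values are illustrative, and this is a textbook formulation rather than the paper's exact filter.

```python
# Power-optimal transmit polarizations for one POL-SAR pixel via the
# Graves power matrix G = S^H S.
import numpy as np

S = np.array([[1.0 + 0.2j, 0.1 - 0.05j],
              [0.1 - 0.05j, 0.6 + 0.4j]])   # illustrative scattering matrix

G = S.conj().T @ S                          # Hermitian Graves power matrix
powers, polarizations = np.linalg.eigh(G)   # min/max power and optimal states

print(f"Power extrema: {powers}")
print(f"Optimal transmit polarization (max): {polarizations[:, -1]}")
```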
20

Lawrence, Zachary D., Gloria L. Manney, and Krzysztof Wargan. "Reanalysis intercomparisons of stratospheric polar processing diagnostics." Atmospheric Chemistry and Physics 18, no. 18 (September 25, 2018): 13547–79. http://dx.doi.org/10.5194/acp-18-13547-2018.

Abstract:
Abstract. We compare herein polar processing diagnostics derived from the four most recent “full-input” reanalysis datasets: the National Centers for Environmental Prediction Climate Forecast System Reanalysis/Climate Forecast System, version 2 (CFSR/CFSv2), the European Centre for Medium-Range Weather Forecasts Interim (ERA-Interim) reanalysis, the Japanese Meteorological Agency's 55-year (JRA-55) reanalysis, and the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2). We focus on diagnostics based on temperatures and potential vorticity (PV) in the lower-to-middle stratosphere that are related to formation of polar stratospheric clouds (PSCs), chlorine activation, and the strength, size, and longevity of the stratospheric polar vortex. Polar minimum temperatures (Tmin) and the area of regions having temperatures below PSC formation thresholds (APSC) show large persistent differences between the reanalyses, especially in the Southern Hemisphere (SH), for years prior to 1999. Average absolute differences of the reanalyses from the reanalysis ensemble mean (REM) in Tmin are as large as 3 K at some levels in the SH (1.5 K in the Northern Hemisphere – NH), and absolute differences of reanalysis APSC from the REM up to 1.5 % of a hemisphere (0.75 % of a hemisphere in the NH). After 1999, the reanalyses converge toward better agreement in both hemispheres, dramatically so in the SH: average Tmin differences from the REM are generally less than 1 K in both hemispheres, and average APSC differences less than 0.3 % of a hemisphere. The comparisons of diagnostics based on isentropic PV for assessing polar vortex characteristics, including maximum PV gradients (MPVGs) and the area of the vortex in sunlight (or sunlit vortex area, SVA), show more complex behavior: SH MPVGs showed convergence toward better agreement with the REM after 1999, while NH MPVGs differences remained largely constant over time; differences in SVA remained relatively constant in both hemispheres. While the average differences from the REM are generally small for these vortex diagnostics, understanding such differences among the reanalyses is complicated by the need to use different methods to obtain vertically resolved PV for the different reanalyses. We also evaluated other winter season summary diagnostics, including the winter mean volume of air below PSC thresholds, and vortex decay dates. For the volume of air below PSC thresholds, the reanalyses generally agree best in the SH, where relatively small interannual variability has led to many winter seasons with similar polar processing potential and duration, and thus low sensitivity to differences in meteorological conditions among the reanalyses. In contrast, the large interannual variability of NH winters has given rise to many seasons with marginal conditions that are more sensitive to reanalysis differences. For vortex decay dates, larger differences are seen in the SH than in the NH; in general, the differences in decay dates among the reanalyses follow from persistent differences in their vortex areas. Our results indicate that the transition from the reanalyses assimilating Tiros Operational Vertical Sounder (TOVS) data to advanced TOVS and other data around 1998–2000 resulted in a profound improvement in the agreement of the temperature diagnostics presented (especially in the SH) and to a lesser extent the agreement of the vortex diagnostics. 
We present several recommendations for using reanalyses in polar processing studies, particularly related to the sensitivity to changes in data inputs and assimilation. Because of these sensitivities, we urge great caution for studies aiming to assess trends derived from reanalysis temperatures. We also argue that one of the best ways to assess the sensitivity of scientific results on polar processing is to use multiple reanalysis datasets.
21

de Barros, Everaldo, Fernando Juliani, and Leandro Ribeiro de Camargo. "Experimental facilities for modal testing." Aircraft Engineering and Aerospace Technology 89, no. 2 (March 6, 2017): 358–63. http://dx.doi.org/10.1108/aeat-04-2015-0099.

Abstract:
Purpose The experimental modal analysis requires good knowledge of various engineering fields, such as mechanical vibrations, transducers used in vibration measurement, transducers and system calibration methods, data acquisition systems, digital signal processing and system identification. Test facilities constitute a key factor for improving the quality of the estimated modal model. This paper aims to describe the experimental facilities at the Institute of Aeronautics and Space (IAE) Modal Testing Laboratory in terms of associated instrumentation and data acquisition system, metrological aspects and computational resources. The discussion is completed with a practical application showing a ground vibration testing (GVT) of an unmanned aerial vehicle (UAV). Design/methodology/approach The experimental facilities were evaluated in a typical GVT, using three shakers in both vertical and horizontal excitations and 88 response measurement points. The global excitation method was used to excite all desired modes. The reliability of the experimental modal model was validated by an auto modal assurance criterion matrix for the measured modes of the structure. Findings The experimental facilities were successfully used for validating the dynamical characteristics of the UAV under testing. Originality/value The modal test facilities of the Modal Testing Laboratory at the IAE, the main research center of the Brazilian Air Force, are described in this paper.
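The auto MAC validation mentioned above has a standard closed form; the sketch below computes a Modal Assurance Criterion matrix with numpy, using random 88-point, 5-mode shapes as stand-ins for the measured modes.

```python
# Auto Modal Assurance Criterion (MAC) matrix for a set of mode shapes.
import numpy as np

def mac(phi_a, phi_b):
    """MAC matrix between the columns of two mode-shape matrices."""
    num = np.abs(phi_a.conj().T @ phi_b) ** 2
    den = np.outer(np.sum(np.abs(phi_a) ** 2, axis=0),
                   np.sum(np.abs(phi_b) ** 2, axis=0))
    return num / den

rng = np.random.default_rng(1)
modes = rng.normal(size=(88, 5))       # 88 measurement points, 5 modes
print(np.round(mac(modes, modes), 2))  # ones on the diagonal, small elsewhere
```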
22

AISYAH, SITI, SRI WAHYUNINGSIH, and FDT AMIJAYA. "PERAMALAN JUMLAH TITIK PANAS PROVINSI KALIMANTAN TIMUR MENGGUNAKAN METODE RADIAL BASIS FUNCTION NEURAL NETWORK." Jambura Journal of Probability and Statistics 2, no. 2 (November 11, 2021): 64–74. http://dx.doi.org/10.34312/jjps.v2i2.10292.

Abstract:
Radial Basis Function Neural Network (RBFNN) is a neural network that uses radial basis functions in its hidden layer for classification and forecasting purposes. The neural network is developed into a radial-basis-function network with an information processing system whose characteristics are similar to those of biological neural networks, consisting of an input layer, a hidden layer, and an output layer. The data used in this study are the numbers of hotspots in East Kalimantan Province obtained from the official website of the National Aeronautics and Space Administration (NASA). The purpose of this research is to obtain the RBFNN model and forecasts of the number of hotspots for the period January 2020 to March 2020. The radial basis function used is the local Gaussian function, with a linear activation function at the output. This study uses training-to-testing data proportions of 70:30, 80:20, and 90:10. The results show that the network input uses the significant Partial Autocorrelation Function (PACF) at lag 1 and lag 2, so the resulting RBFNN model involves Xt-1 and Xt-2. The best (minimum) Mean Absolute Percentage Error (MAPE) is obtained with the 80:20 data proportion and 2 hidden units. The resulting RBFNN architecture has 2 input units, 2 hidden units, and 1 output unit. The forecasts of the number of hotspots in East Kalimantan Province show a decline from January 2020 to February 2020 and an increase in March 2020.
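A minimal Gaussian-RBF forecaster in the spirit of the 2-input, 2-hidden-unit architecture described above, on a synthetic series (not the hotspot data); the output layer is fitted by least squares, a common simplification.

```python
# Tiny Gaussian-RBF network on lag-1/lag-2 inputs, linear output, plus MAPE.
import numpy as np

rng = np.random.default_rng(7)
series = 50 + 10 * np.sin(np.arange(100) / 6) + rng.normal(0, 2, 100)

X = np.column_stack([series[1:-1], series[:-2]])   # X_{t-1}, X_{t-2}
y = series[2:]

centers = X[rng.choice(len(X), size=2, replace=False)]  # 2 hidden units
width = X.std()

def hidden(X):
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-(d ** 2) / (2 * width ** 2))        # local Gaussian basis

H = np.column_stack([hidden(X), np.ones(len(X))])      # plus bias
w = np.linalg.lstsq(H, y, rcond=None)[0]               # linear output layer

pred = H @ w
mape = np.mean(np.abs((y - pred) / y)) * 100
print(f"In-sample MAPE: {mape:.2f}%")
```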
23

Brooke, Samantha, David Graham, Todd Jacobs, Charles Littnan, Mark Manuel, and Robert O’Conner. "Testing marine conservation applications of unmanned aerial systems (UAS) in a remote marine protected area." Journal of Unmanned Vehicle Systems 3, no. 4 (December 1, 2015): 237–51. http://dx.doi.org/10.1139/juvs-2015-0011.

Abstract:
In 2014, the United States National Oceanic and Atmospheric Administration (NOAA) utilized unique partnerships with the National Aeronautics and Space Administration (NASA), and the US Coast Guard for the first comparative testing of two unmanned aircraft systems (UAS): the Ikhana (an MQ-9 Predator B) and a Puma All-Environment (Puma AE). A multidisciplinary team of scientists developed missions to explore the application of the two platforms to maritime surveillance and marine resource monitoring and assessment. Testing was conducted in the Papahānaumokuākea Marine National Monument, a marine protected area in the Northwest Hawaiian Islands. Nearly 30 h of footage were collected by the test platforms, containing imagery of marine mammals, sea turtles, seabirds, marine debris, and coastal habitat. Both platforms proved capable of collecting usable data, although imagery collected using the Puma was determined to be more useful for resource monitoring purposes. Lessons learned included the need for increased camera resolution, co-location of mission scientists and UAS operators, the influence of weather on the quality of imagery collected, post-processing resource demands, and the need for pre-planning of mission targets and approach to maximize efficiency.
24

MAYAUX, PHILIPPE, FRÉDÉRIC ACHARD, and JEAN-PAUL MALINGREAU. "Global tropical forest area measurements derived from coarse resolution satellite imagery: a comparison with other approaches." Environmental Conservation 25, no. 1 (March 1998): 37–52. http://dx.doi.org/10.1017/s0376892998000083.

Abstract:
Definition of appropriate tropical forest policies must be supported by better information about forest distribution. New information technologies make possible the development of advanced systems which can accurately report on tropical forest area issues. The European Commission TREES (Tropical Ecosystem Environment observation by Satellite) project has produced a consistent map of the humid tropical forest cover based on 1 km resolution satellite data. This base-line reference information can be further calibrated using a sample of high-resolution data, in order to produce accurate forest area estimates. There is good general agreement with other pantropical inventories (Food & Agriculture Organization of the United Nations Forest Resources Assessment 90, World Conservation Union Conservation Atlas of Tropical Forests, National Aeronautics & Space Administration [USA] Landsat Pathfinder) using different approaches (compilation of existing data, statistical sampling, exhaustive survey with satellite data). However, for some countries, large differences appear among the assessments. Discrepancies arising from this comparison are here analysed in terms of limitations associated with each approach and they are generally associated with differences in forest definition, data source and processing methodology. According to the different inventories, the total area of closed tropical forest is estimated at 1090–1220 million hectares with the following continental distribution: 185–215 million hectares in Africa, 235–275 million hectares in Asia, and 670–730 million hectares in Latin America. A proposal for improving the current state of forest statistics by combining the contribution of the various methods under review is made.
25

Amir, Richky Faizal, Irwan Agus Sobari, and Rousyati Rousyati. "Penerapan PSO Over Sampling Dan Adaboost Random Forest Untuk Memprediksi Cacat Software." Indonesian Journal on Software Engineering (IJSE) 6, no. 2 (December 2, 2020): 230–39. http://dx.doi.org/10.31294/ijse.v6i2.9258.

Abstract:
Abstract: Software metrics datasets are generally imbalanced. Class imbalance in a dataset can reduce the performance of software defect prediction models, because they tend to predict the majority class at the expense of the minority class. The dataset used in this study is the National Aeronautics and Space Administration (NASA) Metrics Data Program (MDP) dataset. For the pre-processing stage, the Particle Swarm Optimization (PSO) method is proposed to address attribute selection in the training data, and the Random Over Sampling (ROS) resampling method to handle class imbalance. This study proposes that the Random Forest method combined with Adaboost can estimate the defect level of software from training data. The results indicate that the Resampling + Adaboost + Random Forest algorithm can be used to predict software defects with an average accuracy of 94.70% and an AUC of 0.939, while the PSO + Random Forest algorithm only achieves an average accuracy of 89.60% and an AUC of 0.636; the differences between the two models are 5.10% in accuracy and 0.303 in AUC. Statistical tests show a significant difference between the proposed model and the Random Forest model, with a p-value (0.036) smaller than the alpha value (0.05).

Keywords: Imbalanced Class, Resample, Particle Swarm Optimization, Random Forest, Adaboost, Software Defect
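A sketch of the resampling-plus-ensemble part of the pipeline (the PSO feature-selection step is omitted) using imbalanced-learn and scikit-learn ≥ 1.2, on synthetic imbalanced data rather than the NASA MDP dataset:

```python
# Random over-sampling followed by AdaBoost with a random-forest base learner.
from imblearn.over_sampling import RandomOverSampler
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

X, y = make_classification(n_samples=1000, weights=[0.9], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

X_bal, y_bal = RandomOverSampler(random_state=0).fit_resample(X_tr, y_tr)
model = AdaBoostClassifier(estimator=RandomForestClassifier(n_estimators=50),
                           n_estimators=10, random_state=0).fit(X_bal, y_bal)

proba = model.predict_proba(X_te)[:, 1]
print(f"Accuracy: {accuracy_score(y_te, model.predict(X_te)):.2%}, "
      f"AUC: {roc_auc_score(y_te, proba):.3f}")
```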
26

Marchetti, Yuliya, Robert Rosenberg, and David Crisp. "Classification of Anomalous Pixels in the Focal Plane Arrays of Orbiting Carbon Observatory-2 and -3 via Machine Learning." Remote Sensing 11, no. 24 (December 5, 2019): 2901. http://dx.doi.org/10.3390/rs11242901.

Abstract:
A machine learning approach was developed to improve the bad pixel maps that mask damaged or unusable pixels in the imaging spectrometers of the National Aeronautics and Space Administration (NASA) Orbiting Carbon Observatory-2 (OCO-2) and Orbiting Carbon Observatory-3 (OCO-3). The OCO-2 and OCO-3 instruments use nearly 500,000 pixels to record high resolution spectra in three infrared wavelength ranges. These spectra are analyzed to retrieve estimates of the column-average carbon dioxide (XCO2) concentration in Earth's atmosphere. To meet mission requirements, these XCO2 estimates must have accuracies exceeding 0.25%, and small uncertainties in the bias or gain of even one detector pixel can add significant error to the retrieved XCO2 estimates. Thus, anomalous pixels are identified and removed from the data stream by applying a bad pixel map prior to further processing. To develop these maps, we first characterize each pixel's behavior through a collection of interpretable and statistically well-defined metrics. These features and a prior map are then used as inputs to a Random Forest classifier to assign a likelihood that a given pixel is bad. The likelihoods are then analyzed, and thresholds are chosen to produce a new bad pixel map. The machine learning approach adopted here has improved data quality by identifying hundreds of new bad pixels in each detector. Such an approach can be generalized to other instruments that require independent calibration of many individual elements.
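The classify-then-threshold idea can be sketched as follows; note that training directly on the prior labels, as done here for brevity, is a simplification of the paper's workflow, and all features and labels are synthetic.

```python
# Random-forest bad-pixel likelihoods thresholded into a new mask.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
features = rng.random((10_000, 5))            # per-pixel behaviour metrics
prior_map = rng.random(10_000) < 0.01         # prior bad-pixel labels

X = np.column_stack([features, prior_map])    # prior map as an extra input
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, prior_map)

likelihood = clf.predict_proba(X)[:, 1]       # probability a pixel is bad
new_bad_map = likelihood > 0.5                # threshold chosen by analysis
print(f"{new_bad_map.sum()} pixels flagged bad")
```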
27

Hassan, Aqeel Abboud Abdul. "Accuracy Assessment of Open Source Digital Elevation Models." Journal of University of Babylon for Engineering Sciences 26, no. 3 (February 1, 2018): 23–33. http://dx.doi.org/10.29196/jub.v26i3.601.

Abstract:
A Digital Elevation Model (DEM) is a three-dimensional representation of the earth's surface, which is essential for geoscience and hydrological applications. DEMs can be created using photogrammetry techniques, radar interferometry, laser scanning, and land surveying. Some world agencies provide open source digital elevation models that are freely available to all users, such as the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). ALOS, SRTM, and ASTER are satellite-based DEMs that are open source products. The technologies used for obtaining the raw data, the methods used for processing it, and the characteristics of the natural terrain and land cover type are among the factors causing the unavoidable errors embedded in a digital elevation model. In this paper, ground control points observed with the differential global positioning system (DGPS) were used to compare the validity and performance of different satellite-based digital elevation models. For validation, standard statistical measures were applied, namely the Mean Error (ME) and Root Mean Square Error (RMSE): the ALOS DEM had an ME of -1.262 m and an RMSE of 1.988 m, the SRTM DEM had an ME of -0.782 m with an RMSE of 2.276 m, and the ASTER DEM had 4.437 m and 6.241 m, respectively. These outcomes can be very helpful for analysts utilizing such models in different areas of work.
28

Ren, Guanghao, Yun Wang, Zhenyun Shi, Guigang Zhang, Feng Jin, and Jian Wang. "Aero-Engine Remaining Useful Life Estimation Based on CAE-TCN Neural Networks." Applied Sciences 13, no. 1 (December 20, 2022): 17. http://dx.doi.org/10.3390/app13010017.

Abstract:
With the rapid growth of the aviation field, remaining useful life (RUL) estimation of aero-engines has become a focus of the industry. Owing to the shortcomings of existing prediction methods, life prediction is stuck in a bottleneck. Aiming at the low efficiency of traditional estimation algorithms, a more efficient neural network is proposed by using Convolutional Neural Networks (CNN) to replace Long Short-Term Memory (LSTM). Firstly, multi-sensor degradation information fusion coding is realized with a convolutional autoencoder (CAE). Then, a temporal convolutional network (TCN) is applied to achieve efficient prediction from the obtained degradation code. It does not depend on iteration along time, but learns causality through a mask. Moreover, the data processing is improved to further increase the application efficiency of the algorithm. An ExtraTreesClassifier is applied to recognize when a failure first develops. This step not only assists labelling, but also realizes feature filtering combined with tree-model interpretation. For multiple operating conditions, new features are clustered by K-means++ to encode historical condition information. Finally, an experiment is carried out to evaluate the effectiveness of the approach on the Commercial Modular Aero-Propulsion System Simulation (CMAPSS) datasets provided by the National Aeronautics and Space Administration (NASA). The results show that the proposed algorithm ensures high-precision prediction and effectively improves efficiency.
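The "causality through a mask" point can be made concrete with the standard TCN building block: left-only padding followed by trimming the right edge keeps a 1-D convolution causal. The PyTorch sketch below is a generic illustration under assumed channel counts, not the paper's architecture.

```python
# Minimal causal dilated convolution block of the kind used in a TCN.
import torch
import torch.nn as nn

class CausalConvBlock(nn.Module):
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation        # pad only the past
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              dilation=dilation, padding=self.pad)
        self.relu = nn.ReLU()

    def forward(self, x):                              # x: (batch, ch, time)
        out = self.conv(x)[..., :x.size(-1)]           # chomp future samples
        return self.relu(out + x)                      # residual connection

x = torch.randn(8, 14, 100)   # batch of 14 fused sensor channels, 100 cycles
block = CausalConvBlock(channels=14, dilation=2)
print(block(x).shape)         # torch.Size([8, 14, 100])
```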
29

Petropavlovskikh, I., R. Evans, G. Mcconville, S. Oltmans, D. Quincy, K. Lantz, P. Disterhoft, M. Stanek, and L. Flynn. "Sensitivity of Dobson and Brewer Umkehr ozone profile retrievals to ozone cross-sections and stray light effects." Atmospheric Measurement Techniques Discussions 4, no. 2 (March 30, 2011): 2007–35. http://dx.doi.org/10.5194/amtd-4-2007-2011.

Abstract:
Abstract. Remote sounding methods are used to derive ozone profile and column information from various ground-based and satellite measurements. Vertical ozone profiles measured in Dobson units (DU) are currently retrieved based on laboratory measurements of the ozone absorption cross-section spectrum between 270 and 400 nm published in 1985 by Bass and Paur (BP). Recently, the US National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA) proposed using the set of ozone cross-section measurements made at the Daumont laboratory in 1992 (BDM) for revising the Aura Ozone Monitoring Instrument (OMI) and Global Ozone Monitoring Experiment (GOME) satellite ozone profiles and total ozone column retrievals. Dobson Umkehr zenith sky data have been collected by NOAA ground-based stations at Boulder, CO (BDR) and Mauna Loa Observatory, HI (MLO) since the 1980s. The UMK04 algorithm is based on the BP ozone cross-section data. It is currently used for all Dobson Umkehr data processing submitted to the World Ozone and Ultraviolet radiation Data Centre (WOUDC) under the Global Atmosphere Watch (GAW) program of the World Meteorological Organization (WMO). Ozone profiles are also retrieved from measurements by the Mark IV Brewers operated by the NOAA-EPA Brewer Spectrophotometer UV and Ozone Network (NEUBrew) using a modified UMK04 algorithm (O3BUmkehr v.2.6, Martin Stanek). Records from Dobson and Brewer instruments located at MLO and BDR were used to produce Umkehr ozone retrievals using BDM ozone cross-sections and compared to profiles produced using the BP ozone cross sections. Additional effects of the out-of-band stray light and stratospheric temperature variability on Umkehr profile retrievals are also discussed in this paper. This paper describes the sensitivity of the Umkehr retrievals with respect to the proposed ozone cross-section changes.
30

Hällqvist, Robert, Raghu Chaitanya Munjulury, Robert Braun, Magnus Eek, and Petter Krus. "Realizing Interoperability between MBSE Domains in Aircraft System Development." Electronics 11, no. 18 (September 13, 2022): 2901. http://dx.doi.org/10.3390/electronics11182901.

Abstract:
Establishing interoperability is an essential aspect of the often-pursued shift towards Model-Based Systems Engineering (MBSE) in, for example, aircraft development. If models are to be the primary information carriers during development, the applied methods to enable interaction between engineering domains need to be modular, reusable, and scalable. Given the long life cycles and often large and heterogeneous development organizations in the aircraft industry, a piece to the overall solution could be to rely on open standards and tools. In this paper, the standards Functional Mock-up Interface (FMI) and System Structure and Parameterization (SSP) are exploited to exchange data between the disciplines of systems simulation and geometry modeling. A method to export data from the 3D Computer Aided Design (CAD) Software (SW) CATIA in the SSP format is developed and presented. Analogously, FMI support of the Modeling & Simulation (M&S) tools OMSimulator, OpenModelica, and Dymola is utilized along with the SSP support of OMSimulator. The developed technology is put into context by means of integration with the M&S methodology for aircraft vehicle system development deployed at Saab Aeronautics. Finally, the established interoperability is demonstrated on two different industrially relevant application examples addressing varying aspects of complexity. A primary goal of the research is to prototype and demonstrate functionality, enabled by the SSP and FMI standards, that could improve on MBSE methodology implemented in industry and academia.
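Purely as an illustration of the FMI side of such a tool chain, the sketch below uses the open-source FMPy library (not one of the tools named in the paper) to inspect and simulate a Functional Mock-up Unit exported from some modeling tool; the FMU file name is a placeholder.

```python
# Load and simulate an FMU exported via the FMI standard, using FMPy.
from fmpy import read_model_description, simulate_fmu

fmu = "fuel_system.fmu"                      # assumed FMU file

md = read_model_description(fmu)             # inspect interface variables
print([v.name for v in md.modelVariables][:5])

result = simulate_fmu(fmu, start_time=0.0, stop_time=10.0)
print(result["time"][-1])                    # structured array of trajectories
```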
APA, Harvard, Vancouver, ISO, and other styles
31

Taylor, Thomas E., Christopher W. O'Dell, Christian Frankenberg, Philip T. Partain, Heather Q. Cronk, Andrey Savtchenko, Robert R. Nelson, et al. "Orbiting Carbon Observatory-2 (OCO-2) cloud screening algorithms: validation against collocated MODIS and CALIOP data." Atmospheric Measurement Techniques 9, no. 3 (March 8, 2016): 973–89. http://dx.doi.org/10.5194/amt-9-973-2016.

Full text
Abstract:
Abstract. The objective of the National Aeronautics and Space Administration's (NASA) Orbiting Carbon Observatory-2 (OCO-2) mission is to retrieve the column-averaged carbon dioxide (CO2) dry air mole fraction (XCO2) from satellite measurements of reflected sunlight in the near-infrared. These estimates can be biased by clouds and aerosols, i.e., contamination, within the instrument's field of view. Screening of the most contaminated soundings minimizes unnecessary calls to the computationally expensive Level 2 (L2) XCO2 retrieval algorithm. Hence, robust cloud screening methods have been an important focus of the OCO-2 algorithm development team. Two distinct, computationally inexpensive cloud screening algorithms have been developed for this application. The A-Band Preprocessor (ABP) retrieves the surface pressure using measurements in the 0.76 µm O2 A band, neglecting scattering by clouds and aerosols, which introduce photon path-length differences that can cause large deviations between the expected and retrieved surface pressure. The Iterative Maximum A Posteriori (IMAP) Differential Optical Absorption Spectroscopy (DOAS) Preprocessor (IDP) retrieves independent estimates of the CO2 and H2O column abundances using observations taken at 1.61 µm (weak CO2 band) and 2.06 µm (strong CO2 band), while neglecting atmospheric scattering. The CO2 and H2O column abundances retrieved in these two spectral regions differ significantly in the presence of cloud and scattering aerosols. The combination of these two algorithms, which are sensitive to different features in the spectra, provides the basis for cloud screening of the OCO-2 data set. To validate the OCO-2 cloud screening approach, collocated measurements from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS), aboard the Aqua platform, were compared to results from the two OCO-2 cloud screening algorithms. With tuning of algorithmic threshold parameters that allows for processing of ≃ 20–25 % of all OCO-2 soundings, agreement between the OCO-2 and MODIS cloud screening methods is found to be ≃ 85 % over four 16-day orbit repeat cycles in both the winter (December) and spring (April–May) for OCO-2 nadir-land, glint-land and glint-water observations. No major systematic spatial or temporal dependencies were found, although slight differences in the seasonal data sets do exist, and validation is more problematic with increasing solar zenith angle and when surfaces are covered in snow and ice and have complex topography. To further analyze the performance of the cloud screening algorithms, an initial comparison of OCO-2 observations was made to collocated measurements from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) aboard the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) platform. These comparisons highlight the strength of the OCO-2 cloud screening algorithms in identifying high, thin clouds but suggest some difficulty in identifying clouds near the surface, even when their optical thicknesses are greater than 1.
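The two preprocessors can be caricatured in a few lines: an ABP-style test flags soundings whose retrieved surface pressure deviates from the expected value, and an IDP-style test flags soundings whose weak- and strong-band CO2 columns disagree. The arrays and thresholds below are illustrative assumptions, not the operational OCO-2 values.

    import numpy as np

    # Hypothetical per-sounding inputs (illustrative values only)
    p_retrieved = np.array([982.0, 941.0, 1005.0])   # ABP surface pressure [hPa]
    p_expected  = np.array([984.0, 990.0, 1003.0])   # prior/meteorological [hPa]
    co2_weak    = np.array([398.0, 371.0, 401.0])    # IDP 1.61 um column [ppm]
    co2_strong  = np.array([397.0, 402.0, 400.5])    # IDP 2.06 um column [ppm]

    # ABP-style test: path lengthening by cloud/aerosol biases the pressure
    abp_clear = np.abs(p_retrieved - p_expected) < 10.0      # assumed threshold

    # IDP-style test: scattering makes the two band retrievals diverge
    idp_clear = np.abs(co2_weak / co2_strong - 1.0) < 0.01   # assumed threshold

    passes_screening = abp_clear & idp_clear   # only these go to the L2 retrieval
    print(passes_screening)                    # -> [ True False  True]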
APA, Harvard, Vancouver, ISO, and other styles
32

Petropavlovskikh, I., R. Evans, G. McConville, S. Oltmans, D. Quincy, K. Lantz, P. Disterhoft, M. Stanek, and L. Flynn. "Sensitivity of Dobson and Brewer Umkehr ozone profile retrievals to ozone cross-sections and stray light effects." Atmospheric Measurement Techniques 4, no. 9 (September 9, 2011): 1841–53. http://dx.doi.org/10.5194/amt-4-1841-2011.

Full text
Abstract:
Abstract. Remote sounding methods are used to derive ozone profile and column information from various ground-based and satellite measurements. Vertical ozone profiles measured in Dobson units (DU) are currently retrieved based on laboratory measurements of the ozone absorption cross-section spectrum between 270 and 400 nm published in 1985 by Bass and Paur (BP). Recently, the US National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA) proposed using the set of ozone cross-section measurements made at the Daumont laboratory in 1992 (BDM) for revising the Aura Ozone Monitoring Instrument (OMI) and Global Ozone Monitoring Experiment (GOME) satellite ozone profiles and total ozone column retrievals. Dobson Umkehr zenith sky data have been collected by NOAA ground-based stations at Boulder, CO (BDR) and Mauna Loa Observatory, HI (MLO) since the 1980s. The UMK04 algorithm is based on the BP ozone cross-section data. It is currently used for all Dobson Umkehr data processing submitted to the World Ozone and Ultraviolet radiation Data Centre (WOUDC) under the Global Atmosphere Watch (GAW) program of the World Meteorological Organization (WMO). Ozone profiles are also retrieved from measurements by the Mark IV Brewers operated by the NOAA-EPA Brewer Spectrophotometer UV and Ozone Network (NEUBrew) using a modified UMK04 algorithm (O3BUmkehr v.2.6, Martin Stanek). This paper describes the sensitivity of the Umkehr retrievals with respect to the proposed ozone cross-section changes. It is found that the ozone cross-section choice only minimally (within the retrieval accuracy) affects the Dobson and the Brewer Umkehr retrievals. On the other hand, significantly larger errors were found in the MLO and Boulder Umkehr ozone data (−8 and +5% bias in stratosphere and troposphere respectively) when the out-of-band (OOB) stray light contribution to the Umkehr measurement is not taken into account (correction is currently not included in the UMK04). The vertical distribution of OOB effect in the retrieved profile can be related to the local ozone climatology, instrument degradation, and optical characteristics of the instrument. Nonetheless, recurring OOB errors do not contribute to the long-term ozone trends.
APA, Harvard, Vancouver, ISO, and other styles
33

Lolli, Simone, Gemine Vivone, Jasper R. Lewis, Michaël Sicard, Ellsworth J. Welton, James R. Campbell, Adolfo Comerón, et al. "Overview of the New Version 3 NASA Micro-Pulse Lidar Network (MPLNET) Automatic Precipitation Detection Algorithm." Remote Sensing 12, no. 1 (December 24, 2019): 71. http://dx.doi.org/10.3390/rs12010071.

Full text
Abstract:
Precipitation modifies atmospheric column thermodynamics through the process of evaporation and serves as a proxy for latent heat modulation. For this reason, a correct precipitation parameterization (especially for low-intensity precipitation) within global scale models is crucial. In addition to improving our modeling of the hydrological cycle, this will reduce the associated uncertainty of global climate models in correctly forecasting future scenarios, and will enable the application of mitigation strategies. In this manuscript we present a proof-of-concept algorithm to automatically detect precipitation from lidar measurements obtained from the National Aeronautics and Space Administration Micro-Pulse Lidar Network (MPLNET). The algorithm, once tested and validated against other remote sensing instruments, will be operationally implemented into the network to deliver a near-real-time (latency < 1.5 h) rain-masking variable that will be publicly available on the MPLNET website as part of the new Version 3 data products. The methodology, based on an image processing technique, detects only light precipitation events (defined by intensity and duration) such as light rain, drizzle, and virga. During heavy rain events, the lidar signal is completely extinguished after a few meters in the precipitation, or it is unusable because of water accumulated on the receiver optics. Results from the algorithm, in addition to filling a gap in light rain, drizzle, and virga detection by radars, are of particular interest to the scientific community as they help to fully characterize the aerosol cycle, from emission to deposition, since precipitation is a crucial meteorological phenomenon accelerating atmospheric aerosol removal through the scavenging effect. Algorithm results will also help the understanding of long-term aerosol–cloud interactions, exploiting the multi-year database from several MPLNET permanent observational sites across the globe. The algorithm is also applicable to other lidar and/or ceilometer network infrastructures in the framework of the Global Atmosphere Watch (GAW) aerosol lidar observation network (GALION).
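A toy version of such an image-processing precipitation mask is sketched below: the time-height backscatter record is treated as an image, despeckled, and thresholded, keeping only vertically coherent streaks. All values are synthetic and the thresholds are assumptions; the operational MPLNET Version 3 algorithm is more elaborate.

    import numpy as np
    from scipy import ndimage

    # Synthetic time-height attenuated backscatter (rows: range, cols: time)
    rng = np.random.default_rng(0)
    beta = rng.gamma(2.0, 0.05, size=(120, 300))
    beta[20:60, 100:140] += 1.0          # synthetic virga / light-rain streak

    # Image-processing steps (illustrative thresholds, not MPLNET's):
    smooth = ndimage.median_filter(beta, size=(5, 3))     # despeckle
    candidate = smooth > 0.6                              # enhanced signal
    # keep only vertically coherent structures (falling precipitation streaks)
    opened = ndimage.binary_opening(candidate, structure=np.ones((15, 1)))
    labels, n = ndimage.label(opened)
    rain_mask = opened.any(axis=0)                        # per-profile rain flag
    print(n, rain_mask.sum())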
APA, Harvard, Vancouver, ISO, and other styles
34

De Lemos, Hugo, Michel M. Verstraete, and Mary Scholes. "Parametric Models to Characterize the Phenology of the Lowveld Savanna at Skukuza, South Africa." Remote Sensing 12, no. 23 (November 30, 2020): 3927. http://dx.doi.org/10.3390/rs12233927.

Full text
Abstract:
Mathematical models, such as the logistic curve, have been extensively used to model the temporal evolution of biological processes, though other similarly shaped functions could be (and sometimes have been) used for this purpose. Most previous studies focused on agricultural regions in the Northern Hemisphere and were based on the Normalized Difference Vegetation Index (NDVI). This paper compares the capacity of four parametric double S-shaped models (Gaussian, Hyperbolic Tangent, Logistic, and Sine) to represent the seasonal phenology of an unmanaged, protected savanna biome in South Africa's Lowveld, using the Fraction of Absorbed Photosynthetically Active Radiation (FAPAR) generated by the Multi-angle Imaging SpectroRadiometer-High Resolution (MISR-HR) processing system on the basis of data originally collected by the National Aeronautics and Space Administration's (NASA) Multi-angle Imaging SpectroRadiometer (MISR) instrument since 24 February 2000. FAPAR time series are automatically split into successive vegetative seasons, and the models are inverted against those irregularly spaced data to provide a description of the seasonal fluctuations despite the presence of noise and missing values. The performance of these models is assessed by quantifying their ability to account for the variability of remote sensing data and to evaluate the Gross Primary Productivity (GPP) of vegetation, as well as by evaluating their numerical efficiency. Simulated results retrieved from remote sensing are compared to GPP estimates derived from field measurements acquired at the Skukuza flux tower in the Kruger National Park, which has likewise been operational since 2000. Preliminary results indicate that (1) all four models considered can be adjusted to fit an FAPAR time series when the temporal distribution of the data is sufficiently dense in both the growing and the senescence phases of the vegetative season, (2) the Gaussian and especially the Sine models are more sensitive than the Hyperbolic Tangent and Logistic ones to the temporal distribution of FAPAR values during the vegetative season and, in particular, to the presence of long temporal gaps in the observational data, and (3) the performance of these models in simulating the phenology of plants is generally quite sensitive to the presence of unexpectedly low FAPAR values during the peak period of activity and to the presence of long gaps in the observational data. Consequently, efforts to screen out outliers and to minimize those gaps, especially during the rainy season (vegetation's growth phase), would go a long way towards improving the capacity of the models to account for the evolution of the canopy cover and to better assess the relation between FAPAR and GPP.
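For concreteness, a double-logistic season of the kind compared in the paper can be fitted to irregularly spaced FAPAR samples with scipy; the parameterization below (base level, amplitude, green-up and senescence inflection dates and slopes) is a common form and an assumption, not necessarily the authors' exact formulation.

    import numpy as np
    from scipy.optimize import curve_fit

    def double_logistic(t, base, amp, t_up, k_up, t_down, k_down):
        """Rise minus fall: one vegetative season."""
        return base + amp * (1.0 / (1.0 + np.exp(-k_up * (t - t_up)))
                             - 1.0 / (1.0 + np.exp(-k_down * (t - t_down))))

    # Irregularly spaced, noisy synthetic FAPAR samples (day of season)
    t = np.sort(np.random.default_rng(1).uniform(0, 365, 40))
    y = double_logistic(t, 0.15, 0.45, 80, 0.08, 260, 0.06)
    y += np.random.default_rng(2).normal(0, 0.02, t.size)

    p0 = [0.1, 0.4, 90, 0.05, 250, 0.05]         # rough initial guess
    popt, pcov = curve_fit(double_logistic, t, y, p0=p0, maxfev=10000)
    print(np.round(popt, 3))                     # recovered season parameters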
APA, Harvard, Vancouver, ISO, and other styles
35

Dudek, Ewa, and Michał Kozłowski. "The Concept of a Method Ensuring Aeronautical Data Quality." Journal of KONBiN 37, no. 1 (July 1, 2016): 319–40. http://dx.doi.org/10.1515/jok-2016-0015.

Full text
Abstract:
Abstract The paper presents the concept of a method for ensuring the quality of aeronautical data. European Union regulations (among others, EU 73/2010) as well as international ones (among others, ICAO Annex 15) introduce a number of requirements regarding the quality and safety of aeronautical data. Those directives set up a complementary regulatory system; however, their objective and scope mainly determine the specifications and requirements that are to be implemented and complied with. The mentioned regulations also refer to selected international standards (e.g., ISO 19157) focused on the quality and safety of geographic data and information. Nevertheless, neither the regulations nor the norms define algorithms or methods for ensuring the required quality in the process of aeronautical data collection and processing. Taking the identified needs into account, the authors propose the application of a statistical method of process quality management: Six Sigma.
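To make the Six Sigma proposal concrete, the standard defects-per-million-opportunities computation and the corresponding sigma level (with the conventional 1.5-sigma shift) can be sketched as follows; the counts are invented for illustration.

    from scipy.stats import norm

    # Hypothetical audit of an aeronautical data set
    records = 25_000        # data records inspected
    opportunities = 12      # checked attributes per record (lat, lon, elev, ...)
    defects = 870           # attribute-level errors found

    dpmo = defects / (records * opportunities) * 1_000_000
    # Conventional long-term sigma level with the 1.5-sigma shift
    sigma_level = norm.ppf(1.0 - dpmo / 1_000_000) + 1.5
    print(f'DPMO = {dpmo:.0f}, sigma level = {sigma_level:.2f}')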
APA, Harvard, Vancouver, ISO, and other styles
36

Eshmuradov, D. E., T. D. Elmuradov, and N. M. Turayeva. "Automation of aeronautical information processing based on multi-agent technologies." Civil Aviation High Technologies 25, no. 1 (February 28, 2022): 65–76. http://dx.doi.org/10.26467/2079-0619-2022-25-1-65-76.

Full text
Abstract:
Progress in computer engineering makes it possible to address an ever wider variety of challenges with software systems. Automatic processing of aeronautical navigation information is one such challenge, and it calls for new approaches to the design and development of such systems. One of these approaches is based on the idea of the collective activity of a set of agents, that is, multi-agent technologies. The purpose of the article is therefore to consider how automated aeronautical navigation information processing can be implemented on the basis of multi-agent technologies. To achieve this goal, the problem-structural methodology of hybrid systems synthesis was selected, which allows self-organizing models to be created in which each element develops by obtaining data and knowledge from other elements. A formal definition of the multi-agent system for automatic aeronautical information processing is presented, covering the set of agents, the environment in which the agents operate, the permissible relations between agents, the rules for forming a network of agents, the sets of individual and joint actions, communication interactions, behavior and action strategies, and the possibility of system evolution. Furthermore, an emphasis is placed on the description of each agent, for which the authors propose four elements: a set of variables, inputs, outputs, and an autonomous procedure that performs the appropriate changes over the set of variables. As agents, the paper proposes the following: a Pilots Notification Agent, Preflight Information Bulletin Agent, Data Generation Agent, Aviation Processes Agent, Aviation Database Generation Agent, Aeronautical Maps Creation Agent, Aeronautical Data Set Export/Import Agent, and Publications and References Agent. In addition, the article presents a diagram of the multi-agent system for automated aeronautical information processing and describes in detail, by means of a mathematical expression, how an application is processed by an agent. The results obtained in the course of the investigation can be used to improve the effectiveness of the analytical component of the system and to form direct and reverse coordination relationships while solving air navigation problems.
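The four-element agent description (variables, inputs, outputs, and an autonomous procedure over the variables) maps naturally onto a small class. The sketch below, with invented message content and agent names taken from the list above, only illustrates the idea and is not the authors' implementation.

    class Agent:
        """Four elements: variables, inputs, outputs, autonomous update step."""
        def __init__(self, name):
            self.name, self.variables, self.inbox, self.outbox = name, {}, [], []

        def step(self):                    # autonomous procedure over variables
            while self.inbox:
                msg = self.inbox.pop(0)
                self.variables[msg['key']] = msg['value']
                self.outbox.append({'from': self.name, **msg})

    def deliver(sender, receiver):         # a permissible relation between agents
        receiver.inbox.extend(sender.outbox)
        sender.outbox.clear()

    notam_agent = Agent('PilotsNotification')
    bulletin_agent = Agent('PreflightBulletin')
    notam_agent.inbox.append({'key': 'RWY_09L', 'value': 'CLOSED'})  # invented
    notam_agent.step()
    deliver(notam_agent, bulletin_agent)
    bulletin_agent.step()
    print(bulletin_agent.variables)        # {'RWY_09L': 'CLOSED'}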
APA, Harvard, Vancouver, ISO, and other styles
37

Dattoma, Vito, Francesco Panella, Alessandra Pirinu, and Andrea Saponaro. "Advanced NDT Methods and Data Processing on Industrial CFRP Components." Applied Sciences 9, no. 3 (January 24, 2019): 393. http://dx.doi.org/10.3390/app9030393.

Full text
Abstract:
In this work, enhanced thermal data processing is developed together with experimental procedures, improving the visualization algorithm for sub-surface defect detection on industrial composites. These materials lend themselves to infrared nondestructive investigation, since defects are easily characterized by the temperature response under thermal pulses, with reliable results. Better defect characterization is achieved by analyzing the data with refined processing and experimental procedures, providing detailed contrast maps in which defects are better distinguished. Thermal data are analyzed for different CFRP specimens with artificial defects, and the experimental procedures are verified on a real structural aeronautical component with internal anomalies due to impact simulation. An improved computation method proves useful for simultaneous defect detection by means of automatic mapping of absolute contrast, optimized to identify defect boundaries.
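The absolute-contrast mapping at the heart of this processing can be sketched in a few lines: subtract the thermal decay of a known sound (defect-free) region from every pixel and map the peak contrast. The synthetic sequence, sound-region location, and threshold below are assumptions.

    import numpy as np

    # Synthetic post-pulse thermal sequence (frames, rows, cols); in practice
    # this comes from the IR camera.
    rng = np.random.default_rng(3)
    T = 25.0 + 10.0 * np.exp(-np.arange(50)[:, None, None] / 12.0) \
        + rng.normal(0, 0.05, (50, 64, 64))
    T[:, 30:38, 30:38] += 0.8 * np.exp(-np.arange(50)[:, None, None] / 30.0)

    sound = T[:, :10, :10].mean(axis=(1, 2))     # assumed defect-free corner
    contrast = T - sound[:, None, None]          # absolute contrast per frame
    peak_map = contrast.max(axis=0)              # automatic contrast mapping
    background = peak_map[:10, :10]
    defect_mask = peak_map > background.mean() + 5 * background.std()  # assumed
    print(defect_mask[30:38, 30:38].mean())      # ~1.0 over the seeded defect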
APA, Harvard, Vancouver, ISO, and other styles
38

Sathyendranath, Shubha, Robert Brewin, Carsten Brockmann, Vanda Brotas, Ben Calton, Andrei Chuprin, Paolo Cipollini, et al. "An Ocean-Colour Time Series for Use in Climate Studies: The Experience of the Ocean-Colour Climate Change Initiative (OC-CCI)." Sensors 19, no. 19 (October 3, 2019): 4285. http://dx.doi.org/10.3390/s19194285.

Full text
Abstract:
Ocean colour is recognised as an Essential Climate Variable (ECV) by the Global Climate Observing System (GCOS); spectrally resolved water-leaving radiances (or remote-sensing reflectances) in the visible domain and the chlorophyll-a concentration are identified as the required ECV products. Time series of the products at the global scale and at high spatial resolution, derived from ocean-colour data, are key to studying the dynamics of phytoplankton at seasonal and inter-annual scales; their role in marine biogeochemistry; the global carbon cycle; the modulation of how phytoplankton distribute solar-induced heat in the upper layers of the ocean; and the response of the marine ecosystem to climate variability and change. However, generating a long time series of these products from ocean-colour data is not a trivial task: algorithms that are best suited for climate studies have to be selected from a number that are available for atmospheric correction of the satellite signal and for retrieval of chlorophyll-a concentration; since satellites have a finite life span, data from multiple sensors have to be merged to create a single time series, and any uncorrected inter-sensor biases could introduce artefacts in the series, e.g., different sensors monitor radiances at different wavebands such that producing a consistent time series of reflectances is not straightforward. Another requirement is that the products have to be validated against in situ observations. Furthermore, the uncertainties in the products have to be quantified, ideally on a pixel-by-pixel basis, to facilitate applications and interpretations that are consistent with the quality of the data. This paper outlines an approach that was adopted for generating an ocean-colour time series for climate studies, using data from the MERIS (MEdium Resolution Imaging Spectrometer) sensor of the European Space Agency; the SeaWiFS (Sea-viewing Wide-Field-of-view Sensor) and MODIS-Aqua (Moderate-resolution Imaging Spectroradiometer-Aqua) sensors from the National Aeronautics and Space Administration (USA); and VIIRS (Visible and Infrared Imaging Radiometer Suite) from the National Oceanic and Atmospheric Administration (USA). The time series now covers the period from late 1997 to the end of 2018. To ensure that the products meet, as well as possible, the requirements of the user community, marine-ecosystem modellers and remote-sensing scientists were consulted at the outset on their immediate and longer-term requirements as well as on their expectations of ocean-colour data for use in climate research. Taking the user requirements into account, a series of objective criteria were established, against which available algorithms for processing ocean-colour data were evaluated and ranked. The algorithms that performed best with respect to the climate user requirements were selected to process data from the satellite sensors. Remote-sensing reflectance data from MODIS-Aqua, MERIS, and VIIRS were band-shifted to match the wavebands of SeaWiFS. Overlapping data were used to correct for mean biases between sensors at every pixel. The remote-sensing reflectance data derived from the sensors were merged, and the selected in-water algorithm was applied to the merged data to generate maps of chlorophyll concentration, inherent optical properties at SeaWiFS wavelengths, and the diffuse attenuation coefficient at 490 nm. The merged products were validated against in situ observations. The uncertainties established on the basis of comparisons with in situ data were combined with an optical classification of the remote-sensing reflectance data using a fuzzy-logic approach, and were used to generate uncertainties (root mean square difference and bias) for each product at each pixel.
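The per-pixel inter-sensor bias correction described above reduces, in caricature, to estimating the mean difference between two sensors over their overlap period and removing it before merging; the arrays below are invented stand-ins for band-shifted remote-sensing reflectances.

    import numpy as np

    # Invented stand-ins: band-shifted reflectance series on a common grid,
    # dimensions (time, lat, lon).
    rng = np.random.default_rng(4)
    truth   = rng.uniform(0.001, 0.01, (24, 18, 36))
    seawifs = truth + rng.normal(0, 2e-4, truth.shape)
    modis   = truth + 5e-4 + rng.normal(0, 2e-4, truth.shape)  # biased sensor

    overlap = slice(6, 18)                   # assumed common-operation period
    bias = np.nanmean(modis[overlap] - seawifs[overlap], axis=0)  # per pixel
    modis_adj = modis - bias                 # remove mean inter-sensor bias

    merged = np.nanmean(np.stack([seawifs, modis_adj]), axis=0)   # simple merge
    print(float(np.nanmean(modis[overlap] - seawifs[overlap])),
          float(np.nanmean(modis_adj - seawifs)))  # ~5e-4 before, ~0 after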
APA, Harvard, Vancouver, ISO, and other styles
39

Dudek, Ewa, and Michał Kozłowski. "The Concept of Risk Tolerability Matrix Determination for Aeronautical Data and Information Chain." Journal of KONBiN 43, no. 1 (October 1, 2017): 69–94. http://dx.doi.org/10.1515/jok-2017-0040.

Full text
Abstract:
Abstract This article is a continuation of the authors' study on ways to ensure the quality and safety of aeronautical data and information in the entire process (considered as a supply chain) of those data and information's creation, collection, processing, and publication. Attention was paid to proactive air traffic safety management as well as to the need to manage identified incompatibilities. The risk assessment and tolerability matrices arising from ICAO specifications were presented, and on their basis the concept of determining such matrices for the aeronautical data and information chain was developed. In addition, criteria for assessing the consequences/effects of incompatibilities, related strictly to air transport, were elaborated. In the summary, directions for further analysis were pointed out, leading to a full risk assessment of the discussed chain with the use of the FMEA method.
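An ICAO-style tolerability lookup of the kind the paper adapts can be encoded directly as a severity-likelihood matrix; the cell assignment below is a common illustrative variant, not the matrix derived in the article.

    # ICAO SMM-style 5x5 matrix: severity A (catastrophic) .. E (negligible),
    # likelihood 5 (frequent) .. 1 (extremely improbable). The cell assignment
    # is illustrative only.
    SEVERITY = ['A', 'B', 'C', 'D', 'E']
    INTOLERABLE = {'5A', '5B', '5C', '4A', '4B', '3A'}

    def tolerability(severity: str, likelihood: int) -> str:
        if f'{likelihood}{severity}' in INTOLERABLE:
            return 'intolerable'
        if likelihood + (5 - SEVERITY.index(severity)) >= 6:   # middle band
            return 'tolerable (mitigation required)'
        return 'acceptable'

    # e.g. an erroneous runway threshold coordinate in published data:
    print(tolerability('A', 2))   # tolerable (mitigation required)
    print(tolerability('E', 1))   # acceptable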
APA, Harvard, Vancouver, ISO, and other styles
40

Takács, B., Z. Siki, and R. Markovits-Somogyi. "EXTENSION OF RTKLIB FOR THE CALCULATION AND VALIDATION OF PROTECTION LEVELS." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-4/W2 (July 5, 2017): 161–66. http://dx.doi.org/10.5194/isprs-archives-xlii-4-w2-161-2017.

Full text
Abstract:
System integrity (i.e., the capability of self-monitoring) and the reliability of the positions provided need to be ensured within all safety-critical applications of GPS technology. For such applications, GPS augmentations, for example Space Based Augmentation Systems (SBAS), are to be applied to achieve the required level of integrity. SBAS provides integrity in a multi-step procedure that is laid out in the Radio Technical Commission for Aeronautics (RTCA) Minimum Operational Performance Standards (MOPS) for airborne navigation equipment using GPS. Besides integrity, SBAS also improves positioning accuracy by broadcasting corrections that reduce the most important systematic errors of standalone positioning. To quantify integrity, the protection level is defined, which is calculated from the standard deviations of the models broadcast in SBAS.

Air Navigation Service Providers, airspace users, and aviation authorities need to evaluate the performance of GPS systems and their augmentations. This is a necessary step to define the conditions under which GPS systems can be operationally used and which operations can be supported. For this evaluation, two proprietary software packages are widely used in Europe: Pegasus from Eurocontrol (Butzmühlen et al., 2001) and magicGemini from GMV. Both tools provide several functionalities, such as computation of position simulating MOPS-compliant receivers and determination of GNSS augmentation attributes like accuracy, integrity, continuity, and availability.

RTKLIB is an open source GNSS data processing and analysis tool (Takasu, 2009). The current version (2.4.3) of RTKLIB has an SBAS augmented positioning mode, but no protection level calculation is included. There is an open source project on GitHub, a fork of the RTKLIB 2.4.2 version, with an option for WAAS MOPS compliant position calculation, including protection level calculation. It was developed by Houghton Associates, Inc. and tested on the Cygwin platform; that development was finished in 2014. We have merged the WAAS MOPS position calculation into the newer RTKLIB release (2.4.3 beta) and integrated it more closely with the original RTKLIB utility program RNX2RTKP. Our enhanced RTKLIB version is also available on GitHub as a fork of the original RTKLIB project of Tomoji Takasu. This enhanced version was developed and tested on Ubuntu Linux 14.04 and 16.04.

Raw static and kinematic data were post-processed with our enhanced RTKLIB version. The calculated SBAS positions and protection levels were compared to the results of Pegasus and magicGemini. Although the RTCA standard defines the exact formula for calculating protection levels, the numerical results of the tested software differ slightly. Detailed tests regarding the possible sources of these discrepancies were carried out in order to validate our open source solution.

The aim of our work is to provide an open source alternative to the available proprietary software. The open source solution might be a good basis for the evaluation of GPS and SBAS performance monitoring.
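The MOPS protection-level computation merged into RTKLIB reduces to a weighted least-squares position covariance and fixed multipliers from DO-229 (K_H = 6.0 and K_V = 5.33 for precision approach); the satellite geometry and variances below are invented, but the formulas follow the standard.

    import numpy as np

    # Invented example: 6 satellites; rows of the geometry matrix G are
    # (east, north, up, clock) direction cosines in a local frame.
    G = np.array([[ 0.3,  0.5, 0.81, 1], [-0.6,  0.2, 0.77, 1],
                  [ 0.1, -0.7, 0.71, 1], [ 0.8, -0.1, 0.59, 1],
                  [-0.4, -0.5, 0.77, 1], [ 0.2,  0.9, 0.39, 1]])
    sigma2 = np.array([1.2, 0.9, 1.5, 1.1, 1.0, 2.0])  # SBAS variances [m^2]

    W = np.diag(1.0 / sigma2)
    cov = np.linalg.inv(G.T @ W @ G)       # position/clock error covariance
    d_e2, d_n2, d_u2 = cov[0, 0], cov[1, 1], cov[2, 2]
    d_en = cov[0, 1]

    # DO-229 semi-major axis of the horizontal error ellipse
    d_major = np.sqrt((d_e2 + d_n2) / 2
                      + np.sqrt(((d_e2 - d_n2) / 2) ** 2 + d_en ** 2))
    HPL = 6.0 * d_major            # K_H for precision approach
    VPL = 5.33 * np.sqrt(d_u2)     # K_V
    print(f'HPL = {HPL:.1f} m, VPL = {VPL:.1f} m')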
APA, Harvard, Vancouver, ISO, and other styles
41

Eshmuradov, D. E., and T. D. Elmuradov. "Mathematical modelling of aeronautical environment." Civil Aviation High Technologies 23, no. 5 (October 28, 2020): 67–75. http://dx.doi.org/10.26467/2079-0619-2020-23-5-67-75.

Full text
Abstract:
On the basis of a study of air traffic control, and given the need to take into account the regional parameters of the airspace, the expediency of and need for mathematical modeling of the air navigation environment were identified. The article deals with the features of a model of the air navigation environment in which the basic information used to describe a flight plan is adopted as the basis of the study. It was found that the main abstraction in navigation is geodesic information; however, such information alone is not sufficient for a flight plan. The study emphasized the role of taking into account all the restrictions by which prohibitions on the use of specific volumes of airspace are imposed. Based on the results obtained, a graph giving a geometric representation of the airspace with constraints was constructed. The results of the study allow us to conclude that the construction of a mathematical model of the air navigation environment makes it possible to achieve the following: optimization of the distribution of the EAP loads by sectors, visualization of the air navigation situation in the region, identification of critical load directions, collection of data on the load, and study of the factors affecting the regularity and safety of aircraft movement. The mathematical model of the aeronautical situation was built with the help of a hierarchical composition; as a result of such a construction, the model can be transformed by adding new models. The proposed mathematical model, used within the framework of discrete-event systems, can simulate a complex air navigation environment with a large number of aircraft. The formalism presented in this study allows a clear distinction to be drawn, in the modeling process, between the mechanism of information processing and the information itself.
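The geometric representation of airspace with constraints can be prototyped as a graph whose edges are removed wherever they cross a prohibited volume; the waypoints and restricted zone below are invented, and the intersection test is deliberately coarse.

    import itertools
    import networkx as nx

    # Invented waypoints (name -> (x, y) in arbitrary map units)
    waypoints = {'A': (0, 0), 'B': (4, 0), 'C': (2, 3), 'D': (6, 3), 'E': (8, 0)}
    restricted = (3, -1, 5, 1)    # axis-aligned prohibited box (x0, y0, x1, y1)

    def crosses_box(p, q, box):
        """Coarse test: sample the segment, reject if any sample is inside."""
        x0, y0, x1, y1 = box
        for t in [i / 20 for i in range(21)]:
            x, y = p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])
            if x0 <= x <= x1 and y0 <= y <= y1:
                return True
        return False

    G = nx.Graph()
    for (a, pa), (b, pb) in itertools.combinations(waypoints.items(), 2):
        if not crosses_box(pa, pb, restricted):
            dist = ((pa[0] - pb[0]) ** 2 + (pa[1] - pb[1]) ** 2) ** 0.5
            G.add_edge(a, b, weight=dist)

    print(nx.shortest_path(G, 'A', 'E', weight='weight'))  # detours around box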
APA, Harvard, Vancouver, ISO, and other styles
42

Prokhorov, A. V. "Impact of NOTAM on security and efficiency performance of flights (overview)." Civil Aviation High Technologies 25, no. 1 (February 28, 2022): 21–34. http://dx.doi.org/10.26467/2079-0619-2022-25-1-21-34.

Full text
Abstract:
The purpose of this work is to analyze and assess the impact of NOTAM on flight safety and efficiency. The main problems associated with NOTAM were considered: the number of NOTAMs, the practical use of NOTAM information, and the technical limitations of the current NOTAM system. Examples of the negative impact of NOTAM on the quality of air navigation support and on the safety and efficiency of flights are presented. The best practice for solving NOTAM-related problems around the world is also presented, using the Q-code and flight planning systems (with Lido Flight 4D as an example). The concept of the European AIS Database (EAD), developed on the basis of the Aeronautical Information Exchange Model (AIXM), is presented. The concept of Digital NOTAM, implemented on the basis of AIXM and intended for the exchange, automatic processing, and interpretation of dynamic aeronautical data, is considered. The research studies the new, modernized NOTAM system, the Federal NOTAM System (FNS), developed by the Federal Aviation Administration (FAA) of the United States, which allows Digital NOTAMs to be encoded. ICAO's plans for the transition from the concept of the Aeronautical Information Service (AIS) to Aeronautical Information Management (AIM), applying the principles of the System Wide Information Management (SWIM) concept, were analyzed. As a result of the analysis of the current NOTAM system and the modernized one (the FAA's FNS), it is concluded that the implementation of Digital NOTAM should solve the technical part of the problems associated with NOTAM, thanks to the use of modern communications (the internet) and new data exchange standards (AIXM), which in turn will lead to an increase in the safety and efficiency of flights. At the same time, the problems with NOTAM caused by human factors, namely the incorrect use of the NOTAM instrument, remain unresolved.
APA, Harvard, Vancouver, ISO, and other styles
43

Mi, Baigang, Yi Fan, and Yu Sun. "NOTAM Text Analysis and Classification Based on Attention Mechanism." Journal of Physics: Conference Series 2171, no. 1 (January 1, 2022): 012042. http://dx.doi.org/10.1088/1742-6596/2171/1/012042.

Full text
Abstract:
Abstract As one of the important carriers, the Notice to Air Men (NOTAM) provides key aeronautical security information during flight operations. However, a large number of NOTAM texts lack systematic utilization and organization. To improve the application of NOTAM texts, natural language processing can be used to classify and analyze them, and their effective processing is significant for monitoring air security. In this paper, a corpus of NOTAMs has been built with web crawler technology to generate raw data. Based on the aeronautical intelligence knowledge system, the corpus is systematically pre-processed and analyzed, while the key information in NOTAMs and corpus features are also extracted. Considering the high dimensionality and sparsity of the corpus, the traditional bi-directional Recurrent Neural Network (Bi-RNN) architecture is improved by adding an attention layer, which not only allows the Bi-RNN to capture hidden dependencies to improve sparse features, but also allows the attention mechanism to weigh the key information by its perceived importance to optimize the text representation. Experiments show that the attention-based Bi-RNN network outperforms the other four models, achieving a precision of 94.17% and an F1-score of 93.66%. The model rapidly increases the utilization of text features while reducing the loss, which demonstrates that it is a useful tool for NOTAM text classification and air security surveillance.
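A minimal PyTorch rendition of the described architecture, a bidirectional recurrent network whose hidden states are pooled by a learned attention layer before classification, is sketched below; the vocabulary size, dimensions, and number of NOTAM classes are invented.

    import torch
    import torch.nn as nn

    class AttnBiRNN(nn.Module):
        """Bi-LSTM + attention pooling for NOTAM text classification."""
        def __init__(self, vocab=8000, emb=128, hidden=64, n_classes=6):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb, padding_idx=0)
            self.rnn = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
            self.attn = nn.Linear(2 * hidden, 1)     # scores each time step
            self.out = nn.Linear(2 * hidden, n_classes)

        def forward(self, tokens):                   # tokens: (batch, seq)
            h, _ = self.rnn(self.embed(tokens))      # (batch, seq, 2*hidden)
            weights = torch.softmax(self.attn(h).squeeze(-1), dim=1)
            context = (weights.unsqueeze(-1) * h).sum(dim=1)  # weighted pooling
            return self.out(context)                 # class logits

    model = AttnBiRNN()
    fake_batch = torch.randint(1, 8000, (4, 40))     # 4 invented NOTAMs
    print(model(fake_batch).shape)                   # torch.Size([4, 6])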
APA, Harvard, Vancouver, ISO, and other styles
44

Lv, Sheng Li, Lei Jiang Yao, Xiao Yan Tong, and Zheng Li. "A Database for Material Design of C/SiC Composites." Applied Mechanics and Materials 80-81 (July 2011): 444–47. http://dx.doi.org/10.4028/www.scientific.net/amm.80-81.444.

Full text
Abstract:
Continuous carbon fiber reinforced silicon carbide composite (C/SiC) is one of the most effective candidate materials for hot structures in aeronautic and aerospace applications, and its performance in complicated service environments is of wide concern. A database aimed at the optimized design of C/SiC was developed. The database collects original data on the fabrication and microstructure of C/SiC, as well as abundant data from performance experiments including tension, compression, shear, fatigue, creep, oxidation, high-temperature fatigue, and so on. The logical structure of the database, modeled in the Unified Modeling Language, provides a data link connecting the processing, microstructure, and performance of C/SiC, so that users can conveniently create a test result set to build a mathematical model for material design. Efficient software was developed for the management, browsing, and extension of the database.
APA, Harvard, Vancouver, ISO, and other styles
45

Brzozowski, Marek, Mariusz Pakowski, Mirosław Myszka, Mirosław Michalczewski, and Urszula Winiarska. "The Research of Modern Radar Equipment Conducted in the Air Force Institute of Technology by the Application of Military Aircrafts." Aviation Advances & Maintenance 40, no. 1 (August 1, 2017): 27–65. http://dx.doi.org/10.1515/afit-2017-0002.

Full text
Abstract:
Abstract The publication describes selected issues from the research area of modern radar equipment produced by Polish industrial plants, using military aircraft of various types. Technical development has substantially improved the detection parameters of flying objects by radar sensors, which forces changes in the research methods used to verify the tactical and technical parameters of these devices. The article covers research methods for radar equipment and pilots' assistive devices, as well as methods of logging and processing measurement data applied in the Research Laboratory for Radar Equipment and Aeronautical Engineering of the AFIT (pol. Laboratorium Badań Urządzeń Radarowych i Techniki Lotniczej ITWL).
APA, Harvard, Vancouver, ISO, and other styles
46

Song, Hong Bin, Patrice Peyre, Vincent Ji, and Hervé Pelletier. "Residual Stress Gradient Study of Laser Shocked Aluminum Alloy by GIXRD Analysis and FEM Simulation." Materials Science Forum 614 (March 2009): 61–66. http://dx.doi.org/10.4028/www.scientific.net/msf.614.61.

Full text
Abstract:
Laser shock processing (LSP) is a new and competitive technology in comparison with classical mechanical surface treatments for strengthening alloys. Compressive residual stress can be induced by LSP, which in turn significantly affects the surface properties of the material. Aluminum-based alloy 6056 is used for aeronautic components, and its surface strengthening is very important for increasing component durability. The main factors of LSP that affect the surface properties of alloys, such as surface roughness and the residual stress gradient, are studied and analyzed in the present study. The sin²ψ method with pseudo-grazing incidence X-ray diffraction (GIXRD) is used to determine the residual stress gradient after LSP treatment. In addition, an FEM simulation of the residual stress distribution induced by LSP is carried out, and the results are compared with experimental data.
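The sin²ψ evaluation itself is a linear regression: for a biaxial surface stress the measured lattice spacing varies as d(ψ) = d0 + d0·(1+ν)/E·σ·sin²ψ, so the slope of d versus sin²ψ yields the residual stress. The values below are invented for an aluminum alloy.

    import numpy as np

    E, nu = 70e9, 0.33          # nominal elastic constants of an Al alloy [Pa]
    d0 = 2.3380e-10             # assumed unstressed lattice spacing [m]
    sigma_true = -250e6         # compressive residual stress to recover [Pa]

    psi = np.deg2rad([0, 15, 25, 35, 45])
    x = np.sin(psi) ** 2
    d = d0 + d0 * (1 + nu) / E * sigma_true * x     # synthetic measurements
    d += np.random.default_rng(5).normal(0, 1e-15, x.size)

    slope, intercept = np.polyfit(x, d, 1)
    sigma = slope * E / (intercept * (1 + nu))      # stress from the slope
    print(f'{sigma / 1e6:.0f} MPa')                 # ~ -250 MPa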
APA, Harvard, Vancouver, ISO, and other styles
47

Gudo, Adam Juma Abdallah, Marye Belete, Ghali Abdullahi Abubakar, and Jinsong Deng. "Spatio-Temporal Analysis of Solar Energy Potential for Domestic and Agricultural Utilization to Diminish Poverty in Jubek State, South Sudan, Africa." Energies 13, no. 6 (March 17, 2020): 1399. http://dx.doi.org/10.3390/en13061399.

Full text
Abstract:
The study aimed to generate informative data on solar radiation in order to establish sustainable solar energy that will support domestic needs as well as agricultural production and processing industries in Jubek State, South Sudan. Solar radiation intensity, temporal data variation, site landscape, and environment were considered. The input data were remotely sensed data, a digital elevation model, and land use/land cover (LULC), processed with ArcGIS. The spatio-temporal distribution analysis results show that 62% (11,356.7 km²) of the study area is suitable for a solar energy farm, with an annual potential of about 6.05 × 10⁹ GWh/year, out of which only 69.0158 GWh/year is required to meet the local demand of the 492,970 people residing in the study area, i.e., 0.11% (1249.2 km²) of Jubek State. The solar energy required for producing and processing 1 ton of different crops ranges between 58.39 × 10⁻⁶ and 1477.9 × 10⁻⁶ GWh and the area size between 10.7 and 306.3 km², whereas 1 ton of animal production requires solar energy ranging between 750.1 × 10⁻⁶ and 8334 × 10⁻⁶ GWh and an area of about 137.8 to 1531.5 km². These findings will assist in the establishment of agro-processing industries, which will eventually lead to poverty reduction through job creation and improvement of food quantity and quality. The simple approach applied in this study is unique, especially for the study area; thus it can be applied to other locations following the same steps.
APA, Harvard, Vancouver, ISO, and other styles
48

Thorning, L., M. Bower, C. D. Hardwick, and P. Hood. "Greenland ice cap aeromagnetic survey 1987: completion of the survey over the southern end of the Greenland ice cap." Rapport Grønlands Geologiske Undersøgelse 140 (December 31, 1988): 70–72. http://dx.doi.org/10.34194/rapggu.v140.8039.

Full text
Abstract:
The Geological Survey of Greenland (GGU), the Geological Survey of Canada (GSC), and the National Aeronautical Establishment (NAE) of the National Research Council of Canada are cooperating in the GICAS project. The objective of the project is to achieve regional aeromagnetic coverage of the Greenland ice cap and to produce magnetic anomaly maps for use in research on large-scale geological structures. Field work was carried out in 1983, 1984, and 1985 (Thorning et al., 1984, 1985, 1986). The work reported in this note was originally planned for April 1986, but for technical reasons it had to be postponed until April 1987. Consequently, the subsequent processing of all the data into a regional magnetic anomaly map has been correspondingly delayed.
APA, Harvard, Vancouver, ISO, and other styles
49

Rébillat, Marc, and Nazih Mechbal. "Damage localization in geometrically complex aeronautic structures using canonical polyadic decomposition of Lamb wave difference signal tensors." Structural Health Monitoring 19, no. 1 (April 24, 2019): 305–21. http://dx.doi.org/10.1177/1475921719843453.

Full text
Abstract:
Monitoring the health state of aeronautic structures in real time and autonomously is referred to as structural health monitoring (SHM) and is a process decomposed into four steps: damage detection, localization, classification, and quantification. In this work, the structures under study are geometrically complex aeronautic structures equipped with a bonded piezoelectric network. When interrogating such a structure, the resulting data lie along three dimensions (namely the "actuator", "sensor", and "time" dimensions) and can thus be interpreted as three-way tensors. The fact that Lamb-wave SHM data are naturally three-way tensors is investigated here for damage localization purposes. In this article, it is demonstrated that, under classical assumptions regarding wave propagation, the rank-2 canonical polyadic decomposition of the tensors built from the phase and amplitude of the difference signals between the healthy and damaged states provides direct access to the distances between the piezoelectric elements and the damage. This property is used to propose an original tensor-based damage localization algorithm. The algorithm is successfully validated on experimental data coming from a full-scale part of an airplane nacelle (1.5 m in height for a semi-circumference of 4 m) equipped with 30 piezoelectric elements and many stiffeners. The results demonstrate that the tensor-based localization algorithm can locate damage within this structure with an average precision of 10 cm, and with a precision better than 1 cm at best. In comparison with standard damage localization algorithms (delay-and-sum, reconstruction algorithm for probabilistic inspection of defects, and ellipse- or hyperbola-based algorithms), the proposed algorithm appears more precise and robust on the investigated cases. Furthermore, the algorithm takes only the raw signals as inputs; no specific pre-processing steps or finely tuned external parameters are needed. It is thus very appealing, as reliable and easy-to-set-up damage localization with low false alarm rates is one of the keys to shortening the gap between research and industrial deployment of structural health monitoring.
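The central idea, that a rank-2 canonical polyadic decomposition of the actuator × sensor × time tensor exposes the damage-related structure, can be tried in a few lines with the tensorly library; the synthetic tensor below merely mimics the assumed separable structure and is not the experimental nacelle data.

    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    # Synthetic rank-2 separable structure over actuators, sensors, and time
    rng = np.random.default_rng(6)
    A = rng.uniform(0.5, 2.0, (8, 2))      # actuator factors (distance-related)
    S = rng.uniform(0.5, 2.0, (8, 2))      # sensor factors
    F = rng.standard_normal((200, 2))      # time/feature signatures

    T = tl.tensor(np.einsum('ir,jr,kr->ijk', A, S, F))
    weights, factors = parafac(T, rank=2, n_iter_max=500, tol=1e-12)

    recon = tl.cp_to_tensor((weights, factors))
    print(float(tl.norm(T - recon) / tl.norm(T)))   # ~ 0: structure captured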
APA, Harvard, Vancouver, ISO, and other styles
50

Aliane, Nourdine, Carlos Quiterio Gomez Muñoz, and Javier Sánchez-Soriano. "Web and MATLAB-Based Platform for UAV Flight Management and Multispectral Image Processing." Sensors 22, no. 11 (June 2, 2022): 4243. http://dx.doi.org/10.3390/s22114243.

Full text
Abstract:
The deployment of any UAV application in precision agriculture involves the development of several tasks, such as path planning and route optimization, image acquisition, handling emergencies, and mission validation, to cite a few. UAV applications are also subject to common constraints, such as weather conditions, zonal restrictions, and so forth. The development of such applications requires the advanced software integration of different utilities, and this situation may discourage undertaking projects in the field of precision agriculture. This paper proposes a Web and MATLAB-based application that integrates several services in the same environment. The first group of services deals with UAV mission creation and management: it provides flight-condition information such as weather conditions, the Kp index, air navigation maps, and aeronautical information services including Notices to Airmen (NOTAM). The second group deals with route planning: it converts selected field areas on the map into an optimized UAV route, handling sub-routes for long journeys. The third group deals with multispectral image processing and the calculation and visualization of vegetation indexes. From a software development point of view, the app integrates several monolithic and independent programs around the MATLAB Runtime package with an automated and transparent data flow. Its main feature is a collection of executable MATLAB programs, especially for UAV route planning and optimization and for image processing and vegetation index calculations, which can be run remotely.
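Of the vegetation indexes such a platform computes, NDVI is the canonical example: (NIR − Red)/(NIR + Red) per pixel of the registered multispectral image. The band arrays and threshold below are placeholders.

    import numpy as np

    # Placeholder registered bands from a multispectral UAV camera
    rng = np.random.default_rng(7)
    red = rng.uniform(0.02, 0.20, (512, 512)).astype(np.float32)
    nir = rng.uniform(0.20, 0.60, (512, 512)).astype(np.float32)

    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)  # avoid divide-by-zero
    print(float(ndvi.min()), float(ndvi.max()))          # within [-1, 1]

    # A simple agronomic threshold (illustrative, crop-dependent)
    vigorous = ndvi > 0.6
    print(f'{100 * vigorous.mean():.1f}% of pixels above NDVI 0.6')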
APA, Harvard, Vancouver, ISO, and other styles