
Journal articles on the topic 'Remote Sensing Data Fusion (RSDF)'


Consult the top 50 journal articles for your research on the topic 'Remote Sensing Data Fusion (RSDF).'


1

Ghaffar, M. A. A., T. T. Vu, and T. H. Maul. "MULTI-MODAL REMOTE SENSING DATA FUSION FRAMEWORK." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-4/W2 (July 5, 2017): 85–89. http://dx.doi.org/10.5194/isprs-archives-xlii-4-w2-85-2017.

Abstract:
The inconsistency, from a resolution perspective, between freely available remote sensing datasets and crowd-sourced data poses a major challenge for data fusion. In classical classification problems, crowd-sourced data are represented as points that may or may not be located within the same pixel. This discrepancy can produce mixed pixels that are then misclassified, and it prevents data inferences from retaining a sufficient level of detail. In this paper we propose a method that preserves detailed inferences from remote sensing datasets accompanied by crowd-sourced data, and we show that advanced machine learning techniques can be applied towards this objective. The proposed method consists of two steps: first, we enhance the spatial resolution of the satellite image using convolutional neural networks (CNNs); second, we fuse the crowd-sourced data with the upscaled version of the satellite image. The scope of this paper covers the first step. Results show that a CNN can enhance the resolution of Landsat 8 scenes both visually and quantitatively.
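As a rough illustration of the first step, the sketch below stands in for the trained CNN with naive upsampling followed by a single fixed sharpening convolution; the functions and kernel are illustrative assumptions, not the authors' network:

```python
import numpy as np

def upscale_nearest(img, factor):
    """Naive upsampling stand-in for the learned upscaling step."""
    return np.kron(img, np.ones((factor, factor)))

def conv2d_same(img, kernel):
    """Plain 2-D convolution with zero padding (one CNN layer, no learning)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

# A fixed sharpening kernel stands in for the trained filters.
sharpen = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=float)

coarse = np.arange(16, dtype=float).reshape(4, 4)  # toy "Landsat" patch
up = upscale_nearest(coarse, 2)                    # step 1a: upsample
enhanced = conv2d_same(up, sharpen)                # step 1b: convolve
```

A trained network would replace the fixed kernel with many learned filters, but the data flow (coarse patch in, enhanced fine patch out) is the same.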
2

Butini, Francesco, Vito Cappellini, and Stefano Fini. "Remote Sensing Data Fusion on Intelligent Terminals." European Transactions on Telecommunications 3, no. 6 (November 1992): 555–63. http://dx.doi.org/10.1002/ett.4460030608.

3

Belgiu, Mariana, and Alfred Stein. "Spatiotemporal Image Fusion in Remote Sensing." Remote Sensing 11, no. 7 (April 4, 2019): 818. http://dx.doi.org/10.3390/rs11070818.

Abstract:
In this paper, we discuss spatiotemporal data fusion methods in remote sensing. These methods fuse temporally sparse fine-resolution images with temporally dense coarse-resolution images. The review reveals that existing spatiotemporal data fusion methods are mainly dedicated to blending optical images; only a limited number of studies focus on fusing microwave data, or on fusing microwave and optical images to address gaps in the optical data caused by the presence of clouds. Future efforts are therefore required to develop spatiotemporal data fusion methods flexible enough to accomplish different data fusion tasks under different environmental conditions and with data from different sensors as input. The review also shows that additional investigation is required to account for temporal changes occurring during the observation period when predicting spectral reflectance values at a fine scale in space and time. More sophisticated machine learning methods such as convolutional neural networks (CNNs) represent a promising solution for spatiotemporal fusion, especially due to their capability to fuse images with different spectral values.
4

Nguyen, Hai, Noel Cressie, and Amy Braverman. "Spatial Statistical Data Fusion for Remote Sensing Applications." Journal of the American Statistical Association 107, no. 499 (September 2012): 1004–18. http://dx.doi.org/10.1080/01621459.2012.694717.

5

Wei, Chao, Dong Mei Liu, Fan Wang, and Ling Yan Chen. "Fusion Research of Remote Sensing Image Based on Compressive Sensing." Applied Mechanics and Materials 380-384 (August 2013): 3637–42. http://dx.doi.org/10.4028/www.scientific.net/amm.380-384.3637.

Abstract:
Compressive sensing provides a new approach to signal processing: when an image signal is sparse or compressible, it can be sampled at a rate substantially below the Nyquist rate and then restored by a recovery algorithm. The theory can greatly reduce the amount of data involved in storing, processing and transmitting image signals. Based on it, this paper presents a method for remote sensing image fusion in the compressive sensing domain. First, each image undergoes a fast Fourier transform and measurement sampling to obtain its compressive-domain data; the data are then combined by weighted fusion, and the final fused image is obtained by solving the optimisation problem that reconstructs the image. Experiments show that this fusion method processes less data while achieving a good fusion effect.
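The pipeline the abstract describes — FFT measurement sampling, weighted fusion in the measurement domain, then reconstruction — can be sketched roughly as below, with the optimisation-based recovery replaced by a simple zero-filled inverse FFT (a simplifying assumption, not the paper's solver):

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(img, mask):
    """Sample Fourier coefficients at the masked positions (the 'measurement')."""
    return np.fft.fft2(img) * mask

# Two toy remote sensing images of the same scene.
a = rng.random((16, 16))
b = rng.random((16, 16))

# Shared random sampling mask (keep roughly 40% of Fourier coefficients).
mask = rng.random((16, 16)) < 0.4

# Weighted fusion directly in the measurement (Fourier) domain.
w = 0.5
fused_coeffs = w * measure(a, mask) + (1 - w) * measure(b, mask)

# Stand-in for the optimisation-based reconstruction: zero-filled inverse FFT.
fused = np.real(np.fft.ifft2(fused_coeffs))
```

A real compressive sensing recovery would solve a sparsity-promoting optimisation (e.g. l1 minimisation) over the sampled coefficients instead of zero-filling.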
6

Cao, Lei, Jun Liu, and Shu Guang Liu. "Remote Sensing Image Fusion of Worldview-2 Satellite Data." Applied Mechanics and Materials 333-335 (July 2013): 1159–63. http://dx.doi.org/10.4028/www.scientific.net/amm.333-335.1159.

Abstract:
In view of the fact that most image fusion methods introduce some degree of spectral distortion, this paper proposes a spatial projection method based on Gaussian scale-space theory. Following the mechanism of the human visual system represented by the Gaussian scale space, spatial detail features are extracted from the original panchromatic (PAN) and multispectral (MS) images, and the feature differences are then projected into the original MS image to obtain the fused image. Experimental results with WorldView-2 images show that the proposed method can effectively improve the spatial resolution of the fused MS image while introducing little spectral distortion, so that the great majority of the spectral information is preserved.
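The detail-injection idea — extract the spatial details of PAN at one Gaussian scale and project them into the MS bands — can be sketched as follows; the kernel size, sigma, and single-scale simplification are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Normalised 2-D Gaussian kernel (one level of the scale space)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def blur(img, kernel):
    """2-D convolution with edge padding."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(1)
pan = rng.random((32, 32))      # panchromatic band (co-registered)
ms = rng.random((32, 32, 3))    # one multispectral patch, 3 bands

# Details = PAN minus its Gaussian-smoothed version.
details = pan - blur(pan, gaussian_kernel())

# Inject the spatial details into every MS band.
fused = ms + details[..., None]
```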
7

Yao, X. L., S. Y. Sun, X. J. Li, and R. Liu. "A new continuous fusion method of remote sensing data." IOP Conference Series: Earth and Environmental Science 191 (November 5, 2018): 012130. http://dx.doi.org/10.1088/1755-1315/191/1/012130.

8

Chen, Yushi, Chunyang Li, Pedram Ghamisi, Xiuping Jia, and Yanfeng Gu. "Deep Fusion of Remote Sensing Data for Accurate Classification." IEEE Geoscience and Remote Sensing Letters 14, no. 8 (August 2017): 1253–57. http://dx.doi.org/10.1109/lgrs.2017.2704625.

9

Zhang, Jixian. "Multi-source remote sensing data fusion: status and trends." International Journal of Image and Data Fusion 1, no. 1 (March 2010): 5–24. http://dx.doi.org/10.1080/19479830903561035.

10

Schmitt, Michael, and Xiao Xiang Zhu. "Data Fusion and Remote Sensing: An ever-growing relationship." IEEE Geoscience and Remote Sensing Magazine 4, no. 4 (December 2016): 6–23. http://dx.doi.org/10.1109/mgrs.2016.2561021.

11

Swatantran, Anu, Ralph Dubayah, Scott Goetz, Michelle Hofton, Matthew G. Betts, Mindy Sun, Marc Simard, and Richard Holmes. "Mapping Migratory Bird Prevalence Using Remote Sensing Data Fusion." PLoS ONE 7, no. 1 (January 3, 2012): e28922. http://dx.doi.org/10.1371/journal.pone.0028922.

12

De Benedetto, Daniela, Annamaria Castrignano, Mariangela Diacono, Michele Rinaldi, Sergio Ruggieri, and Rosanna Tamborrino. "Field partition by proximal and remote sensing data fusion." Biosystems Engineering 114, no. 4 (April 2013): 372–83. http://dx.doi.org/10.1016/j.biosystemseng.2012.12.001.

13

Li, Xiao Jing, Tong Pan, Ting Ting Liu, and Hao Peng Wang. "Research on Remote-Sensing Data Syncretizing of Vegetation-Virtual-Reality-Simulation." Applied Mechanics and Materials 336-338 (July 2013): 1426–29. http://dx.doi.org/10.4028/www.scientific.net/amm.336-338.1426.

Abstract:
This paper discusses the key effects and basic principles of remote sensing data fusion for realising virtual reality simulation of plants. It focuses on the general methods of remote sensing data fusion, the application interfaces, and the 3D visual display of virtual plants. Through this research, the visual display of virtual plants can be realised with full remote sensing interfaces for real-time transfer and intervention.
14

Ezhili, G. "BUILDING EXTRACTION FROM REMOTE SENSING IMAGERIES BY DATA FUSION TECHNIQUES." International Journal of Research in Engineering and Technology 02, no. 03 (March 25, 2013): 347–50. http://dx.doi.org/10.15623/ijret.2013.0203021.

15

Pena, Jose, Yumin Tan, and Wuttichai Boonpook. "Semantic Segmentation Based Remote Sensing Data Fusion on Crops Detection." Journal of Computer and Communications 07, no. 07 (2019): 53–64. http://dx.doi.org/10.4236/jcc.2019.77006.

16

Nguyen, Hai, Noel Cressie, and Amy Braverman. "Multivariate Spatial Data Fusion for Very Large Remote Sensing Datasets." Remote Sensing 9, no. 2 (February 9, 2017): 142. http://dx.doi.org/10.3390/rs9020142.

17

FU, Wei, Huan PEI, Xiao-Yu LIAO, Chao BAI, Xian-Wen GAO, He TIAN, and Qiong-Yao ZHU. "Data Assimilation Algorithm of Multi-fountain Remote Sensing Image Fusion." Acta Automatica Sinica 37, no. 3 (June 7, 2011): 309–15. http://dx.doi.org/10.3724/sp.j.1004.2011.00309.

18

Nguyen, Hai, Matthias Katzfuss, Noel Cressie, and Amy Braverman. "Spatio-Temporal Data Fusion for Very Large Remote Sensing Datasets." Technometrics 56, no. 2 (April 3, 2014): 174–85. http://dx.doi.org/10.1080/00401706.2013.831774.

19

Cheng, Xu, Yuhui Zheng, Jianwei Zhang, and Zhangjing Yang. "Multitask Multisource Deep Correlation Filter for Remote Sensing Data Fusion." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 13 (2020): 3723–34. http://dx.doi.org/10.1109/jstars.2020.3002885.

20

Zou, Huanxin, Hao Sun, Kefeng Ji, Chun Du, and Chunyan Lu. "Multimodal Remote Sensing Data Fusion via Coherent Point Set Analysis." IEEE Geoscience and Remote Sensing Letters 10, no. 4 (July 2013): 672–76. http://dx.doi.org/10.1109/lgrs.2012.2217936.

21

Sun, Heng, Jian Weng, Guangchuang Yu, and Richard H. Massawe. "A DNA-Based Semantic Fusion Model for Remote Sensing Data." PLoS ONE 8, no. 10 (October 8, 2013): e77090. http://dx.doi.org/10.1371/journal.pone.0077090.

22

Ashraf, Salman, Lars Brabyn, and Brendan J. Hicks. "Image data fusion for the remote sensing of freshwater environments." Applied Geography 32, no. 2 (March 2012): 619–28. http://dx.doi.org/10.1016/j.apgeog.2011.07.010.

23

Shi, Hongxiang, and Emily L. Kang. "Spatial data fusion for large non-Gaussian remote sensing datasets." Stat 6, no. 1 (2017): 390–404. http://dx.doi.org/10.1002/sta4.165.

24

Pena, J. A., T. Yumin, H. Liu, B. Zhao, J. A. Garcia, and J. Pinto. "REMOTE SENSING DATA FUSION TO DETECT ILLICIT CROPS AND UNAUTHORIZED AIRSTRIPS." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-3 (April 30, 2018): 1363–68. http://dx.doi.org/10.5194/isprs-archives-xlii-3-1363-2018.

Abstract:
Remote sensing data fusion has been playing an increasingly important role in monitoring crop planting areas, especially for acquiring information on crop extent. Multi-temporal data and multi-spectral time series are two major means of improving crop identification accuracy. Remote sensing fusion provides high-quality multi-spectral and panchromatic images in terms of spectral and spatial information, respectively. In this paper we go one step further and demonstrate the application of remote sensing data fusion to the detection of illicit crops, through LSMM, GOBIA, and MCE analysis of strategic information. This methodology emerges as a complementary and effective strategy for controlling and eradicating illicit crops.
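The LSMM (linear spectral mixture model) step decomposes each mixed pixel into endmember abundance fractions via least squares. A minimal sketch of that idea — the endmember spectra, band count, and class labels below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical endmember spectra (rows: bands, columns: endmembers),
# e.g. vegetation, illicit crop, bare soil -- purely illustrative values.
E = np.array([[0.10, 0.30, 0.50],
              [0.40, 0.35, 0.45],
              [0.60, 0.20, 0.40],
              [0.80, 0.55, 0.35]])

def unmix(pixel, endmembers):
    """Least-squares abundance estimate, clipped and renormalised to sum to 1."""
    f, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    f = np.clip(f, 0.0, None)
    return f / f.sum()

# A pixel that is a 70/30 mix of the first two endmembers.
pixel = 0.7 * E[:, 0] + 0.3 * E[:, 1]
fractions = unmix(pixel, E)
```

In practice the abundance maps produced this way would feed the object-based and multi-criteria stages of the analysis.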
25

Moselhi, Osama, Hassan Bardareh, and Zhenhua Zhu. "Automated Data Acquisition in Construction with Remote Sensing Technologies." Applied Sciences 10, no. 8 (April 20, 2020): 2846. http://dx.doi.org/10.3390/app10082846.

Abstract:
Near real-time tracking of construction operations and timely progress reporting are essential for effective management of construction projects. This not only mitigates the potential negative impact of schedule delays and cost overruns but also helps to improve safety on site. Such timely tracking circumvents the drawbacks of conventional methods for data acquisition, which are manual, labor-intensive, and not reliable enough for various construction purposes. To address these issues, a wide range of automated site data acquisition technologies, including remote sensing (RS), has been introduced. This review article describes the capabilities and limitations of various scenarios employing RS-enabling technologies for localization, with a focus on multi-sensor data fusion models. In particular, we consider the integration of real-time location systems (RTLSs), including GPS and UWB, with other sensing technologies such as RFID, WSN, and digital imaging for use in construction. This integrated use of technologies, along with information models (e.g., BIM models), is expected to enhance the efficiency of automated site data acquisition. It is also hoped that this review will prompt researchers to investigate fusion-based data capturing and processing.
26

Cai, Y. R., J. H. Zheng, M. J. Du, C. Mu, and J. Peng. "GRASSLAND NPP MONITORING BASED ON MULTI-SOURCE REMOTE SENSING DATA FUSION." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-3 (April 30, 2018): 113–18. http://dx.doi.org/10.5194/isprs-archives-xlii-3-113-2018.

Abstract:
Vegetation is an important part of the terrestrial ecosystem. It plays an important role in the energy and material exchange of the ground-atmosphere system and is a key part of the global carbon cycle, on which climate change has an important influence. Net primary productivity (NPP) is an important parameter for evaluating global terrestrial ecosystems. In the Xinjiang region, the study of grassland NPP has gradually become a central issue for the ecological environment, and increasing the accuracy of NPP estimation is of great significance to the development of Xinjiang's ecosystem. Based on the third-generation GIMMS AVHRR NDVI global vegetation dataset and the monthly MODIS NDVI product (MOD13A3), and combining the advantages of the different remotely sensed datasets, this paper derives a maximum-value-composite fused normalised difference vegetation index (NDVI) time series for 2006–2015 and uses it in an improved CASA model to analyse the net primary productivity of grassland vegetation in Xinjiang. The method demonstrates the feasibility of the data processing, and the accuracy of the NPP calculated from the fused NDVI is greatly improved. The results show that: (1) the NPP calculated from the new NDVI obtained by fusing GIMMS AVHRR NDVI and MODIS NDVI is significantly higher than the NPP calculated from either raw dataset; (2) the interannual change of grassland NPP in Xinjiang shows an overall increasing trend, and interannual changes in NPP are related to precipitation.
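The maximum-value synthesis used to build the fused NDVI series can be sketched per pixel as below; the NDVI values are invented for illustration:

```python
import numpy as np

# Toy monthly NDVI series for one pixel from the two sensors (values in [-1, 1]).
gimms = np.array([0.21, 0.35, 0.52, 0.48])
modis = np.array([0.25, 0.31, 0.55, 0.50])

# Maximum-value composite: keep the larger (less cloud-affected) value per month.
fused_ndvi = np.maximum(gimms, modis)
```

The fused series would then drive the light-use-efficiency terms of a CASA-style NPP model.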
27

Zeng, Chuiqing, Douglas J. King, Murray Richardson, and Bo Shan. "Fusion of Multispectral Imagery and Spectrometer Data in UAV Remote Sensing." Remote Sensing 9, no. 7 (July 6, 2017): 696. http://dx.doi.org/10.3390/rs9070696.

28

Xie, Wei. "Adaptive remote sensing image fusion under the framework of data assimilation." Optical Engineering 50, no. 6 (June 1, 2011): 067006. http://dx.doi.org/10.1117/1.3584839.

29

Yusuf, Yuhendra, Josaphat Tetuko Sri Sumantyo, and Hiroaki Kuze. "Spectral information analysis of image fusion data for remote sensing applications." Geocarto International 28, no. 4 (July 2013): 291–310. http://dx.doi.org/10.1080/10106049.2012.692396.

30

Dalla Mura, M., S. Prasad, F. Pacifici, P. Gamba, J. Chanussot, and J. A. Benediktsson. "Challenges and Opportunities of Multimodality and Data Fusion in Remote Sensing." Proceedings of the IEEE 103, no. 9 (September 2015): 1585–601. http://dx.doi.org/10.1109/jproc.2015.2462751.

31

Du, Xiaoxiao, and Alina Zare. "Multiresolution Multimodal Sensor Fusion for Remote Sensing Data With Label Uncertainty." IEEE Transactions on Geoscience and Remote Sensing 58, no. 4 (April 2020): 2755–69. http://dx.doi.org/10.1109/tgrs.2019.2955320.

32

Gur, Eran, and Zeev Zalevsky. "Resolution-enhanced remote sensing via multi spectral and spatial data fusion." International Journal of Image and Data Fusion 2, no. 2 (June 2011): 149–65. http://dx.doi.org/10.1080/19479832.2010.551520.

33

Xu, Guoyin, Zhongjing Wang, and Ting Xia. "Mapping Areal Precipitation with Fusion Data by ANN Machine Learning in Sparse Gauged Region." Applied Sciences 9, no. 11 (June 4, 2019): 2294. http://dx.doi.org/10.3390/app9112294.

Abstract:
Focusing on water resources assessment in ungauged or sparsely gauged areas, a comparative evaluation of areal precipitation was conducted using remote sensing data, limited gauge data, and a machine learning fusion of gauge and remote sensing data. An artificial neural network (ANN) model was used to fuse the remote sensing precipitation and ground gauge precipitation, and the correlation coefficient, root mean square deviation, relative deviation and a consistency principle were used to evaluate the reliability of the remote sensing precipitation. A case study in the Qaidam Basin, northwest China, shows that the precision of the original remote sensing precipitation products of the Tropical Rainfall Measuring Mission (TRMM), 3B42RT and 3B43, was 0.61, 72.25 mm, 36.51%, 27% and 0.70, 64.24 mm, 31.63%, 32%, respectively, compared with gauged precipitation. The precision of the corrected TRMM-3B42RT and TRMM-3B43 improved to 0.89, 37.51 mm, –0.08%, 41% and 0.91, 34.22 mm, 0.11%, 42%, respectively, which indicates that data mining with elevation, longitude and latitude as the main influencing factors of precipitation is efficient and effective. The evaluation of areal precipitation in the Qaidam Basin shows a mean annual precipitation of 104.34 mm, 186.01 mm and 174.76 mm based on the gauge data, corrected TRMM-3B42RT and corrected TRMM-3B43, respectively. The results reveal large differences between areal precipitation based on sparse gauge data and on fused remote sensing data.
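The correction step learns a mapping from the satellite estimate plus location terms to the gauge value. As a deliberately simplified stand-in for the ANN, the sketch below fits ordinary least squares on the same predictors (satellite estimate, elevation, longitude, latitude); all numbers are synthetic, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200

# Hypothetical predictors per gauge site.
sat = rng.uniform(50, 300, n)      # satellite precipitation estimate (mm)
elev = rng.uniform(2600, 4500, n)  # elevation (m)
lon = rng.uniform(90, 99, n)
lat = rng.uniform(35, 39, n)

# Synthetic "true" gauge precipitation with an elevation-dependent bias.
gauge = 0.8 * sat + 0.01 * elev - 20 + rng.normal(0, 2, n)

# Least-squares stand-in for the ANN fusion model.
X = np.column_stack([sat, elev, lon, lat, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, gauge, rcond=None)
corrected = X @ coef

rmse_before = np.sqrt(np.mean((sat - gauge) ** 2))
rmse_after = np.sqrt(np.mean((corrected - gauge) ** 2))
```

An ANN replaces the linear map with a nonlinear one, but the inputs, target, and evaluation are the same.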
34

Luo, Xiao Qing, and Xiao Jun Wu. "Fusing Remote Sensing Images Using a Statistical Model." Applied Mechanics and Materials 263-266 (December 2012): 416–20. http://dx.doi.org/10.4028/www.scientific.net/amm.263-266.416.

Abstract:
Enhancing spectral fusion quality is one of the most significant targets in the field of remote sensing image fusion. In this paper, a fusion method based on a statistical model is proposed; it improves on fusing remote sensing images within the framework of principal component analysis (PCA) and wavelet-decomposition-based image fusion. PCA is applied to the source images, and in order to retain the entropy information of the data, the principal component axes are selected by entropy contribution (ECA). The first entropy component and the panchromatic (PAN) image undergo a multiresolution decomposition using the wavelet transform; the low-frequency subband is fused by a weighted aggregation approach and the high-frequency subband by the statistical model. The high-resolution multispectral image is then obtained by an inverse wavelet and ECA transform. Experimental results demonstrate that the proposed method retains both the spectral and the spatial information when fusing PAN and multispectral (MS) images.
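A minimal sketch of PCA-based component substitution, a simplified relative of the ECA-plus-wavelet scheme the abstract describes (no wavelet stage, and entropy ordering replaced by variance ordering — both simplifying assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
h = w = 16
ms = rng.random((h * w, 3))  # multispectral pixels as rows, 3 bands
pan = rng.random(h * w)      # panchromatic band, same pixels

# PCA of the MS bands.
mean = ms.mean(axis=0)
centered = ms - mean
cov = centered.T @ centered / (h * w - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]   # sort by variance, largest first
eigvecs = eigvecs[:, order]
pcs = centered @ eigvecs

# Component substitution: histogram-match PAN to PC1, then swap it in.
pc1 = pcs[:, 0]
pan_matched = (pan - pan.mean()) / pan.std() * pc1.std() + pc1.mean()
pcs[:, 0] = pan_matched

# Inverse PCA gives the fused MS image.
fused = pcs @ eigvecs.T + mean
```

The paper's method additionally fuses low- and high-frequency wavelet subbands of the selected component before inverting the transform.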
35

Mahyoub, S., A. Fadil, E. M. Mansour, H. Rhinane, and F. Al-Nahmi. "FUSING OF OPTICAL AND SYNTHETIC APERTURE RADAR (SAR) REMOTE SENSING DATA: A SYSTEMATIC LITERATURE REVIEW (SLR)." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-4/W12 (February 21, 2019): 127–38. http://dx.doi.org/10.5194/isprs-archives-xlii-4-w12-127-2019.

Abstract:
Remote sensing and image fusion have seen many important improvements in recent years, especially the fusion of optical and synthetic aperture radar (SAR) data: many published papers fuse optical and SAR data for remote sensing applications such as land use mapping and monitoring. The goal of this survey is to summarise and synthesise the articles published from 2013 to 2018 that focus on the fusion of optical and SAR remote sensing data, in a systematic literature review (SLR) based on articles in indexed databases, outlining the latest techniques as well as the most used methods. In addition, the paper highlights the most popular image fusion methods for this type of blending. After searching the indexed databases with different keywords related to the topic of fusing optical and SAR data in remote sensing, 83 of 705 articles were chosen that match our inclusion criteria and research questions; all of the systematic study's questions have been answered and discussed.
36

Wilkie, Craig J., E. Marian Scott, Claire Miller, Andrew N. Tyler, Peter D. Hunter, and Evangelos Spyrakos. "Data Fusion of Remote-sensing and In-lake chlorophylla Data Using Statistical Downscaling." Procedia Environmental Sciences 26 (2015): 123–26. http://dx.doi.org/10.1016/j.proenv.2015.05.014.

37

Wang, Xiaofei, and Xiaoyi Wang. "Spatiotemporal Fusion of Remote Sensing Image Based on Deep Learning." Journal of Sensors 2020 (June 29, 2020): 1–11. http://dx.doi.org/10.1155/2020/8873079.

Abstract:
High spatial and temporal resolution remote sensing data play an important role in monitoring rapid changes of the Earth's surface. However, there is an irreconcilable contradiction between the spatial and temporal resolutions of remote sensing images acquired from the same sensor, and spatiotemporal fusion of remote sensing data is an effective way to resolve it. In this paper, we study a spatiotemporal fusion method based on convolutional neural networks, which fuses Landsat data (high spatial but low temporal resolution) with MODIS data (low spatial but high temporal resolution) to generate time series data with high spatial resolution. To improve the accuracy of the spatiotemporal fusion, a residual convolutional neural network is proposed: a MODIS image is used as the input to predict the residual image between MODIS and Landsat, and the sum of the predicted residual image and the MODIS data is used as the predicted Landsat-like image. The residual network not only increases the depth of the super-resolution network but also avoids the problem of vanishing gradients in a deep network structure. Experimental results show that the prediction accuracy of our method is greater than that of several mainstream methods.
38

Peng, Mingyuan, Lifu Zhang, Xuejian Sun, Yi Cen, and Xiaoyang Zhao. "A Fast Three-Dimensional Convolutional Neural Network-Based Spatiotemporal Fusion Method (STF3DCNN) Using a Spatial-Temporal-Spectral Dataset." Remote Sensing 12, no. 23 (November 27, 2020): 3888. http://dx.doi.org/10.3390/rs12233888.

Abstract:
With the growing development of remote sensors, huge volumes of remote sensing data are being utilized in related applications, bringing new challenges to the efficiency and capability of processing huge datasets. Spatiotemporal remote sensing data fusion can restore high spatial and high temporal resolution remote sensing data from multiple remote sensing datasets. However, the current methods require long computing times and are of low efficiency, especially the newly proposed deep learning-based methods. Here, we propose a fast three-dimensional convolutional neural network-based spatiotemporal fusion method (STF3DCNN) using a spatial-temporal-spectral dataset. This method is able to fuse low-spatial high-temporal resolution data (HTLS) and high-spatial low-temporal resolution data (HSLT) in a four-dimensional spatial-temporal-spectral dataset with increasing efficiency, while simultaneously ensuring accuracy. The method was tested using three datasets, and discussions of the network parameters were conducted. In addition, this method was compared with commonly used spatiotemporal fusion methods to verify our conclusion.
39

Sun, Ling, and Ze Sheng Zhu. "An Images-Fusion Method of Malaria Epidemic Remote Sensing." Advanced Materials Research 790 (September 2013): 583–86. http://dx.doi.org/10.4028/www.scientific.net/amr.790.583.

Abstract:
This paper addresses the problem of improving the precision of malaria epidemic remote sensing by developing an optimal image fusion system. The system's implementation is analysed through an interpretation lattice, taking into account that the benefits of image fusion should be maximised while the risk of erroneous recognition of malaria epidemics from remote sensing is minimised. We tested our RS image fusion method in an application monitoring a malaria epidemic, based on several TM and SPOT images and local statistical data. The method estimates the malaria epidemic level better than a single TM or SPOT image, the main approach previously applied in this context.
40

Sun, Yue, and Hua Zhang. "A Two-Stage Spatiotemporal Fusion Method for Remote Sensing Images." Photogrammetric Engineering & Remote Sensing 85, no. 12 (December 1, 2019): 907–14. http://dx.doi.org/10.14358/pers.85.12.907.

Abstract:
This paper presents a two-stage spatiotemporal fusion method for obtaining dense remote sensing images with both high spatial and high temporal resolution. Given the large resolution difference between fine- and coarse-resolution images, the proposed method is implemented in two stages. In the first stage, the input fine- and coarse-resolution images are each preprocessed to the same intermediate resolution, and a linear interpolation model fuses these resampled images into preliminary fusion results. In the second stage, a residual dense network learns the nonlinear mapping between the preliminary fusion results and the real fine-resolution data to reconstruct the final fine-resolution data. Two datasets with different land surface types are employed to test the performance of the proposed method. Experimental results show that the proposed method is advantageous in areas with phenological changes, and even for datasets dominated by land cover change it retains a good ability to predict the spatial structure of images.
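The first-stage linear interpolation amounts to adding the coarse-resolution temporal change to the known fine image. A toy sketch under that reading (all arrays are synthetic, and the stage-two network refinement is omitted):

```python
import numpy as np

rng = np.random.default_rng(4)

# Fine image at time t1 and coarse images at t1 and t2, assumed already
# resampled to the same intermediate grid (the stage-one preprocessing).
fine_t1 = rng.random((8, 8))
coarse_t1 = fine_t1 + rng.normal(0, 0.01, (8, 8))  # coarse view of the same date
coarse_t2 = coarse_t1 + 0.1                         # scene brightens by t2

# Linear interpolation model: add the coarse temporal change to the fine image.
prelim_t2 = fine_t1 + (coarse_t2 - coarse_t1)
```

Stage two would then map `prelim_t2` to the true fine-resolution image with a residual dense network.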
41

Zhang, Li Ming, and Xi Qing Zhao. "The Fusion Process of RS(Remote Sensing) Data Based on Wavelet Neural Network." Advanced Materials Research 605-607 (December 2012): 2171–74. http://dx.doi.org/10.4028/www.scientific.net/amr.605-607.2171.

Abstract:
A wavelet network is constructed through multispectral, high-resolution wavelet decomposition of a remote sensing image, replacing the traditional neuron activation function with a nonlinear wavelet system; this yields a data fusion model. Multidimensional information fusion integrates the characteristics of multi-source information into a high-dimensional feature space, and a genetic search based on natural selection is combined with an effective evaluation of fusion accuracy.
42

Alizadeh, Mohammad Reza, and Mohammad Reza Nikoo. "A fusion-based methodology for meteorological drought estimation using remote sensing data." Remote Sensing of Environment 211 (June 2018): 229–47. http://dx.doi.org/10.1016/j.rse.2018.04.001.

43

Xie, Wei. "Errata: Adaptive remote sensing image fusion under the framework of data assimilation." Optical Engineering 50, no. 6 (June 1, 2011): 069802. http://dx.doi.org/10.1117/1.3600346.

44

García-Gutiérrez, Jorge, Daniel Mateos-García, and José C. Riquelme-Santos. "EVOR-STACK: A label-dependent evolutive stacking on remote sensing data fusion." Neurocomputing 75, no. 1 (January 2012): 115–22. http://dx.doi.org/10.1016/j.neucom.2011.02.020.

45

Wu, Bo, and Shengjun Tang. "Review of geometric fusion of remote sensing imagery and laser scanning data." International Journal of Image and Data Fusion 6, no. 2 (March 30, 2015): 97–114. http://dx.doi.org/10.1080/19479832.2015.1024175.

46

Abdi, Ghasem, Farhad Samadzadegan, and Peter Reinartz. "Deep learning decision fusion for the classification of urban remote sensing data." Journal of Applied Remote Sensing 12, no. 01 (March 13, 2018): 1. http://dx.doi.org/10.1117/1.jrs.12.016038.

47

Chandrakanth, R., J. Saibaba, Geeta Varadan, and P. Ananth Raj. "A Novel Image Fusion System for Multisensor and Multiband Remote Sensing Data." IETE Journal of Research 60, no. 2 (March 4, 2014): 168–82. http://dx.doi.org/10.1080/03772063.2014.914697.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Khodadadzadeh, Mahdi, Jun Li, Saurabh Prasad, and Antonio Plaza. "Fusion of Hyperspectral and LiDAR Remote Sensing Data Using Multiple Feature Learning." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 8, no. 6 (June 2015): 2971–83. http://dx.doi.org/10.1109/jstars.2015.2432037.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Geng, W., W. Zhou, and S. Jin. "FEATURE FUSION FOR CROSS-MODAL SCENE CLASSIFICATION OF REMOTE SENSING IMAGE." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIV-M-3-2021 (August 10, 2021): 63–66. http://dx.doi.org/10.5194/isprs-archives-xliv-m-3-2021-63-2021.

Full text
Abstract:
Abstract. Scene classification plays an important role in the remote sensing field. Traditional approaches use high-resolution remote sensing images as the data source to extract powerful features. Although such methods are common, model performance is severely affected by the image quality of the dataset, and a single modality (source) of images tends to omit some scene semantic information, which ultimately degrades classification accuracy. Nowadays, multi-modal remote sensing data have become easy to obtain thanks to the development of remote sensing technology, and how to carry out scene classification of cross-modal data has become an interesting topic in the field. To address these problems, this paper proposes feature fusion for cross-modal scene classification of remote sensing imagery, i.e., aerial and ground street-view images, expecting the advantages of aerial images and ground street-view data to complement each other. Our cross-modal model is based on a Siamese network. Specifically, we first train the cross-modal model by pairing data from the different sources, i.e., aerial images with ground data. The trained model is then used to extract deep features from each aerial/ground image pair, and the features of the two perspectives are fused to train an SVM classifier for scene classification. Our approach has been demonstrated on two public benchmark datasets, AiRound and CV-BrCT. The preliminary results show that the proposed method achieves state-of-the-art performance compared with traditional methods, indicating that information from ground data can contribute to aerial image classification.
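The pipeline described in this abstract (per-view deep feature extraction, concatenation-based fusion, then an SVM) can be sketched as follows. The two "branches" here are stand-in random projections, not the paper's trained CNNs, and the toy paired dataset is synthetic; only the overall extract-fuse-classify structure follows the abstract:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Stand-ins for the two Siamese branches: in the paper each branch is a
# deep network; here a fixed random projection plus ReLU plays that role.
W_aerial = rng.normal(size=(128, 64))
W_ground = rng.normal(size=(128, 64))

def extract(branch_weights, image_vec):
    # "Deep feature" of one view (hypothetical projection + ReLU)
    return np.maximum(image_vec @ branch_weights, 0.0)

def fuse(aerial_vec, ground_vec):
    # Late fusion: concatenate the features of the two perspectives
    return np.concatenate([extract(W_aerial, aerial_vec),
                           extract(W_ground, ground_vec)])

# Toy paired dataset: 40 aerial/ground pairs from two scene classes,
# with class 1 shifted so the classes are separable.
aerial = rng.normal(size=(40, 128))
ground = rng.normal(size=(40, 128))
labels = np.repeat([0, 1], 20)
aerial[labels == 1] += 1.0
ground[labels == 1] += 1.0

X = np.array([fuse(a, g) for a, g in zip(aerial, ground)])
clf = SVC(kernel="linear").fit(X, labels)
acc = clf.score(X, labels)
```

The key design choice mirrored here is late fusion: each modality keeps its own feature extractor, and only the extracted representations are joined before the classifier.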
APA, Harvard, Vancouver, ISO, and other styles
50

Zhang, Xiaoyong, Zhengchao Chen, Yuemin Yue, Xiangkun Qi, and Charlie H. Zhang. "Fusion of Remote Sensing and Internet Data to Calculate Urban Floor Area Ratio." Sustainability 11, no. 12 (June 19, 2019): 3382. http://dx.doi.org/10.3390/su11123382.

Full text
Abstract:
The floor area ratio is a comprehensive index that plays an important role in urban planning and sustainable development. Remote sensing data are widely used in floor area ratio calculations because they can produce both two-dimensional planar and three-dimensional stereo information on buildings. However, remote sensing alone is not adequate for determining the number of floors in a building. In this paper, a simple and practical pixel-level model is established by defining a quantitative relationship among the floor area ratio, building density (BD), and average number of floors (ANF). Floor area ratios are calculated by combining remote sensing data with publicly available Internet data: the method incorporates supplemental map data and street-level views from Internet maps to confirm building types and floor counts, thereby enabling more accurate floor area ratio calculations. The proposed method is tested in the Tiantongyuan neighborhood, Changping District, Beijing, and the results show that it can accurately approximate the number of floors in buildings. Inaccuracies in the floor area ratio were found to be primarily due to uncertainties in the building density calculations. After systematic error correction, the building density and floor area ratio were each calculated with a relative accuracy exceeding 90%. Moreover, the experiments verified that fusing Internet map data with remote sensing data has innate advantages for floor area ratio calculations.
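The quantitative relationship the abstract names, floor area ratio as the product of building density and average number of floors, reduces to a one-line formula. The numbers below are a hypothetical example, not values from the Tiantongyuan study:

```python
# Pixel-level floor-area-ratio relation from the abstract:
#   FAR = building density (BD) * average number of floors (ANF),
# where BD = building footprint area / plot area.

def floor_area_ratio(footprint_area, plot_area, avg_floors):
    """FAR = BD * ANF; the area units cancel, so any consistent unit works."""
    building_density = footprint_area / plot_area
    return building_density * avg_floors

# Hypothetical block: 12,000 m2 of building footprints on a 40,000 m2 plot,
# with an average of 6 floors per building (e.g. counted from street-level
# views in an Internet map).
far = floor_area_ratio(12_000.0, 40_000.0, 6.0)
# BD = 0.30, so FAR = 1.8
```

The decomposition also explains the paper's error analysis: since FAR is the product of BD and ANF, a relative error in the building density propagates directly into the floor area ratio.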
APA, Harvard, Vancouver, ISO, and other styles