Academic literature on the topic 'Multisensory fusion'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Multisensory fusion.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Multisensory fusion"

1

de Winkel, Ksander N., Mikhail Katliar, and Heinrich H. Bülthoff. "Forced Fusion in Multisensory Heading Estimation." PLOS ONE 10, no. 5 (May 4, 2015): e0127104. http://dx.doi.org/10.1371/journal.pone.0127104.

2

Prsa, Mario, Steven Gale, and Olaf Blanke. "Self-motion leads to mandatory cue fusion across sensory modalities." Journal of Neurophysiology 108, no. 8 (October 15, 2012): 2282–91. http://dx.doi.org/10.1152/jn.00439.2012.

Abstract:
When perceiving properties of the world, we effortlessly combine multiple sensory cues into optimal estimates. Estimates derived from the individual cues are generally retained once the multisensory estimate is produced and discarded only if the cues stem from the same sensory modality (i.e., mandatory fusion). Does multisensory integration differ in that respect when the object of perception is one's own body, rather than an external variable? We quantified how humans combine visual and vestibular information for perceiving own-body rotations and specifically tested whether such idiothetic cues are subjected to mandatory fusion. Participants made extensive size comparisons between successive whole body rotations using only visual, only vestibular, and both senses together. Probabilistic descriptions of the subjects' perceptual estimates were compared with a Bayes-optimal integration model. Similarity between model predictions and experimental data echoed a statistically optimal mechanism of multisensory integration. Most importantly, size discrimination data for rotations composed of both stimuli was best accounted for by a model in which only the bimodal estimator is accessible for perceptual judgments as opposed to an independent or additive use of all three estimators (visual, vestibular, and bimodal). Indeed, subjects' thresholds for detecting two multisensory rotations as different from one another were, in pertinent cases, larger than those measured using either single-cue estimate alone. Rotations different in terms of the individual visual and vestibular inputs but quasi-identical in terms of the integrated bimodal estimate became perceptual metamers. This reveals an exceptional case of mandatory fusion of cues stemming from two different sensory modalities.
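As background for the Bayes-optimal integration model tested here, the following is a minimal sketch of the standard reliability-weighted (maximum-likelihood) combination of two Gaussian cues; the function name and the numerical values are illustrative assumptions, not taken from the study.

```python
def fuse_gaussian_cues(mu_a, var_a, mu_b, var_b):
    """Reliability-weighted (maximum-likelihood) fusion of two Gaussian cue estimates.

    Each cue is weighted by its inverse variance; the fused variance is never
    larger than that of either single cue.
    """
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    mu_fused = w_a * mu_a + (1.0 - w_a) * mu_b
    var_fused = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return mu_fused, var_fused

# Illustrative visual and vestibular estimates of a rotation size (degrees)
mu, var = fuse_gaussian_cues(mu_a=32.0, var_a=4.0, mu_b=28.0, var_b=9.0)
print(f"fused estimate: {mu:.1f} deg, variance: {var:.2f}")
```

Under this rule the bimodal estimate is at least as reliable as the better single cue, which is the Bayes-optimal benchmark against which the study's discrimination data are compared.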
3

Fang, Chaoming, Bowei He, Yixuan Wang, Jin Cao, and Shuo Gao. "EMG-Centered Multisensory Based Technologies for Pattern Recognition in Rehabilitation: State of the Art and Challenges." Biosensors 10, no. 8 (July 26, 2020): 85. http://dx.doi.org/10.3390/bios10080085.

Abstract:
In the field of rehabilitation, the electromyography (EMG) signal plays an important role in interpreting patients’ intentions and physical conditions. Nevertheless, utilizing merely the EMG signal suffers from difficulty in recognizing slight body movements, and the detection accuracy is strongly influenced by environmental factors. To address the above issues, multisensory integration-based EMG pattern recognition (PR) techniques have been developed in recent years, and fruitful results have been demonstrated in diverse rehabilitation scenarios, such as achieving high locomotion detection and prosthesis control accuracy. Owing to the importance and rapid development of the EMG centered multisensory fusion technologies in rehabilitation, this paper reviews both theories and applications in this emerging field. The principle of EMG signal generation and the current pattern recognition process are explained in detail, including signal preprocessing, feature extraction, classification algorithms, etc. Mechanisms of collaborations between two important multisensory fusion strategies (kinetic and kinematics) and EMG information are thoroughly explained; corresponding applications are studied, and the pros and cons are discussed. Finally, the main challenges in EMG centered multisensory pattern recognition are discussed, and a future research direction of this area is prospected.
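To make the pattern-recognition steps named in this abstract (preprocessing, feature extraction, classification) concrete, here is a minimal sketch using synthetic signal windows, common time-domain EMG features, and an off-the-shelf classifier; the feature set and classifier are generic choices assumed for illustration, not necessarily those surveyed in the review.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def time_domain_features(window):
    """Four common time-domain EMG features for one analysis window."""
    mav = np.mean(np.abs(window))                # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))          # root mean square
    zc = np.sum(np.diff(np.sign(window)) != 0)   # zero crossings
    wl = np.sum(np.abs(np.diff(window)))         # waveform length
    return [mav, rms, zc, wl]

# Synthetic stand-in for windowed EMG from two movement classes
rng = np.random.default_rng(0)
windows = np.concatenate([rng.normal(0.0, 0.5, (50, 200)),
                          rng.normal(0.0, 1.5, (50, 200))])
labels = np.array([0] * 50 + [1] * 50)

features = np.array([time_domain_features(w) for w in windows])
classifier = LinearDiscriminantAnalysis().fit(features, labels)
print("training accuracy:", classifier.score(features, labels))
```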
4

Song, Il Young, Vladimir Shin, Seokhyoung Lee, and Won Choi. "Multisensor Estimation Fusion of Nonlinear Cost Functions in Mixed Continuous-Discrete Stochastic Systems." Mathematical Problems in Engineering 2014 (2014): 1–12. http://dx.doi.org/10.1155/2014/218381.

Abstract:
We propose centralized and distributed fusion algorithms for estimation of a nonlinear cost function (NCF) in multisensory mixed continuous-discrete stochastic systems. The NCF represents a nonlinear multivariate functional of the state variables. For polynomial NCFs, we propose a closed-form estimation procedure based on recursive formulas for the high-order moments of a multivariate normal distribution. In the general case, the unscented transformation is used to calculate nonlinear estimates of a cost function. To fuse local state estimates, mixed differential-difference equations for the error cross-covariance between local estimates are derived. The subsequent application of the proposed fusion estimators in a multisensory environment demonstrates their effectiveness.
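The unscented transformation mentioned above can be sketched generically: sigma points drawn from the state mean and covariance are propagated through the nonlinear functional and reweighted to approximate its expectation. The example cost function and parameter values below are illustrative assumptions, not those of the paper.

```python
import numpy as np

def unscented_expectation(mean, cov, phi, alpha=1.0, kappa=0.0):
    """Approximate E[phi(x)] for x ~ N(mean, cov) with the unscented transform."""
    n = mean.size
    lam = alpha ** 2 * (n + kappa) - n
    sqrt_mat = np.linalg.cholesky((n + lam) * cov)   # scaled matrix square root
    sigma_points = [mean] + [mean + sqrt_mat[:, i] for i in range(n)] \
                          + [mean - sqrt_mat[:, i] for i in range(n)]
    weights = [lam / (n + lam)] + [1.0 / (2.0 * (n + lam))] * (2 * n)
    return sum(w * phi(p) for w, p in zip(weights, sigma_points))

# Illustrative quadratic cost phi(x) = x'Qx for a two-dimensional Gaussian state
Q = np.array([[2.0, 0.0], [0.0, 1.0]])
mean = np.array([1.0, -1.0])
cov = np.array([[0.5, 0.1], [0.1, 0.3]])
print(unscented_expectation(mean, cov, lambda x: x @ Q @ x))   # exact value is 4.3
```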
5

Kiemel, Tim, Kelvin S. Oie, and John J. Jeka. "Multisensory fusion and the stochastic structure of postural sway." Biological Cybernetics 87, no. 4 (October 1, 2002): 262–77. http://dx.doi.org/10.1007/s00422-002-0333-2.

6

Chen, Cheng, and Hong Hua Wang. "Research on Signal Detection Method of High Precision Based on Bayesian Fusion of Multisensory System." Advanced Materials Research 945-949 (June 2014): 1962–67. http://dx.doi.org/10.4028/www.scientific.net/amr.945-949.1962.

Abstract:
Faced with the low detection rate and low credibility of a single sensor in the presence of noise, this paper proposes Bayesian fusion based on a multisensory system and analyzes the resulting detection rate and credibility. Simulation results show that Bayesian fusion is a feasible, high-precision detection method that improves both detection rate and credibility.
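As a minimal illustration of Bayesian fusion across sensors, the sketch below combines binary detection decisions into a posterior probability of target presence under a conditional-independence assumption; the hit and false-alarm rates are made-up values, not those used in the paper's simulations.

```python
def bayes_fuse_detections(decisions, p_detect, p_false_alarm, prior=0.5):
    """Fuse binary sensor decisions with Bayes' rule, assuming the sensors are
    conditionally independent given the true state.

    decisions[i] is 1 if sensor i reported a detection, else 0; p_detect[i] and
    p_false_alarm[i] are that sensor's hit and false-alarm probabilities.
    """
    like_present, like_absent = 1.0, 1.0
    for d, pd, pfa in zip(decisions, p_detect, p_false_alarm):
        like_present *= pd if d else (1.0 - pd)
        like_absent *= pfa if d else (1.0 - pfa)
    numerator = like_present * prior
    return numerator / (numerator + like_absent * (1.0 - prior))

# Three sensors, two of which report a detection (illustrative numbers)
posterior = bayes_fuse_detections(decisions=[1, 1, 0],
                                  p_detect=[0.9, 0.8, 0.7],
                                  p_false_alarm=[0.05, 0.10, 0.10])
print(f"posterior probability that a target is present: {posterior:.3f}")
```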
7

Wang, Jinjiang, Junyao Xie, Rui Zhao, Laibin Zhang, and Lixiang Duan. "Multisensory fusion based virtual tool wear sensing for ubiquitous manufacturing." Robotics and Computer-Integrated Manufacturing 45 (June 2017): 47–58. http://dx.doi.org/10.1016/j.rcim.2016.05.010.

8

Stevenson, Ryan A., and Mark T. Wallace. "The Multisensory Temporal Binding Window: Perceptual Fusion, Training, and Autism." i-Perception 2, no. 8 (October 2011): 760. http://dx.doi.org/10.1068/ic760.

9

Makarau, Aliaksei, Gintautas Palubinskas, and Peter Reinartz. "Alphabet-Based Multisensory Data Fusion and Classification Using Factor Graphs." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 6, no. 2 (April 2013): 969–90. http://dx.doi.org/10.1109/jstars.2012.2219507.

10

Ernst, M. O. "From independence to fusion: A comprehensive model for multisensory integration." Journal of Vision 5, no. 8 (March 17, 2010): 650. http://dx.doi.org/10.1167/5.8.650.


Dissertations / Theses on the topic "Multisensory fusion"

1

Hospedales, Timothy. "Bayesian multisensory perception." Thesis, University of Edinburgh, 2008. http://hdl.handle.net/1842/2156.

Abstract:
A key goal for humans and artificial intelligence systems is to develop an accurate and unified picture of the outside world based on the data from any sense(s) that may be available. The availability of multiple senses presents the perceptual system with new opportunities to fulfil this goal, but exploiting these opportunities first requires the solution of two related tasks. The first is how to make the best use of any redundant information from the sensors to produce the most accurate percept of the state of the world. The second is how to interpret the relationship between observations in each modality; for example, the correspondence problem of whether or not they originate from the same source. This thesis investigates these questions using ideal Bayesian observers as the underlying theoretical approach. In particular, the latter correspondence task is treated as a problem of Bayesian model selection or structure inference in Bayesian networks. This approach provides a unified and principled way of representing and understanding the perceptual problems faced by humans and machines and their commonality. In the domain of machine intelligence, we exploit the developed theory for practical benefit, developing a model to represent audio-visual correlations. Unsupervised learning in this model provides automatic calibration and user appearance learning, without human intervention. Inference in the model involves explicit reasoning about the association between latent sources and observations. This provides audio-visual tracking through occlusion with improved accuracy compared to standard techniques. It also provides detection, verification and speech segmentation, ultimately allowing the machine to understand "who said what, where?" in multi-party conversations. In the domain of human neuroscience, we show how a variety of recent results in multimodal perception can be understood as the consequence of probabilistic reasoning about the causal structure of multimodal observations. We show this for a localisation task in audio-visual psychophysics, which is very similar to the task solved by our machine learning system. We also use the same theory to understand results from experiments in the completely different paradigm of oddity detection using visual and haptic modalities. These results begin to suggest that the human perceptual system performs, or at least approximates, sophisticated probabilistic reasoning about the causal structure of observations under the hood.
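The correspondence problem treated in the thesis is often illustrated with a two-model Gaussian scheme: compare the evidence that two observations share one latent source against the evidence that they come from independent sources. The sketch below is such an illustration under assumed priors and noise levels; it is not the thesis's actual model.

```python
import numpy as np

def gauss_pdf(x, var):
    """Zero-mean univariate Gaussian density."""
    return np.exp(-0.5 * x ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def p_common_source(x_a, x_v, sigma_a, sigma_v, sigma_prior, p_common=0.5):
    """Posterior probability that two noisy observations share one latent source.

    Common-source model: both observations are noisy readings of s ~ N(0, sigma_prior^2),
    which makes them correlated. Independent model: each observation has its own
    latent source drawn from the same prior. The marginal likelihoods (model
    evidences) are compared via Bayes' rule.
    """
    var_a = sigma_a ** 2 + sigma_prior ** 2
    var_v = sigma_v ** 2 + sigma_prior ** 2
    cov = sigma_prior ** 2
    det = var_a * var_v - cov ** 2
    quad = (var_v * x_a ** 2 - 2.0 * cov * x_a * x_v + var_a * x_v ** 2) / det
    evidence_common = np.exp(-0.5 * quad) / (2.0 * np.pi * np.sqrt(det))
    evidence_indep = gauss_pdf(x_a, var_a) * gauss_pdf(x_v, var_v)
    numerator = evidence_common * p_common
    return numerator / (numerator + evidence_indep * (1.0 - p_common))

# Nearby auditory and visual location estimates favour a common source (illustrative numbers)
print(p_common_source(x_a=2.0, x_v=2.5, sigma_a=2.0, sigma_v=1.0, sigma_prior=10.0))
```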
2

Ding, Yuhua. "An integrated approach to real-time multisensory inspection with an application to food processing." Diss., Available online, Georgia Institute of Technology, 2003:, 2003. http://etd.gatech.edu/theses/available/etd-11242003-180728/unrestricted/dingyuhu200312.pdf.

Abstract:
Thesis (Ph. D.)--Electrical and Computer Engineering, Georgia Institute of Technology, 2004.
Vachtsevanos, George J., Committee Chair; Dorrity, J. Lewis, Committee Member; Egerstedt, Magnus, Committee Member; Heck-Ferri, Bonnie S., Committee Co-Chair; Williams, Douglas B., Committee Member; Yezzi, Anthony J., Committee Member. Includes bibliography.
3

Axenie, Cristian. "Synthesis of Distributed Cognitive Systems: Interacting Computational Maps for Multisensory Fusion." Supervisor: Jörg Conradt; examiners: Jörg Conradt and Jeffrey Krichmar. München: Universitätsbibliothek der TU München, 2016. http://d-nb.info/1100689036/34.

4

Filippidis, Arthur. "Multisensor data fusion." Title page, contents and abstract only, 1993. http://web4.library.adelaide.edu.au/theses/09ENS/09ensf482.pdf.

5

Petrovic, Vladimir. "Multisensor pixel-level image fusion." Thesis, University of Manchester, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.715412.

6

Pradhan, Pushkar S. "Multiresolution based, multisensor, multispectral image fusion." Diss., Mississippi State : Mississippi State University, 2005. http://library.msstate.edu/etd/show.asp?etd=etd-07082005-140541.

7

Berg, Timothy Martin. "Model distribution in decentralized multisensor data fusion." Thesis, University of Oxford, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.317852.

8

Wellington, Sean. "Algorithms for sensor validation and multisensor fusion." Thesis, Southampton Solent University, 2002. http://ssudl.solent.ac.uk/398/.

Abstract:
Existing techniques for sensor validation and sensor fusion are often based on analytical sensor models. Such models can be arbitrarily complex and consequently Gaussian distributions are often assumed, generally with a detrimental effect on overall system performance. A holistic approach has therefore been adopted in order to develop two novel and complementary approaches to sensor validation and fusion based on empirical data. The first uses the Nadaraya-Watson kernel estimator to provide competitive sensor fusion. The new algorithm is shown to reliably detect and compensate for bias errors, spike errors, hardover faults, drift faults and erratic operation, affecting up to three of the five sensors in the array. The inherent smoothing action of the kernel estimator provides effective noise cancellation and the fused result is more accurate than the single 'best sensor'. A Genetic Algorithm has been used to optimise the Nadaraya-Watson fuser design. The second approach uses analytical redundancy to provide the on-line sensor status output μH∈[0,1], where μH=1 indicates the sensor output is valid and μH=0 when the sensor has failed. This fuzzy measure is derived from change detection parameters based on spectral analysis of the sensor output signal. The validation scheme can reliably detect a wide range of sensor fault conditions. An appropriate context dependent fusion operator can then be used to perform competitive, cooperative or complementary sensor fusion, with a status output from the fuser providing a useful qualitative indication of the status of the sensors used to derive the fused result. The operation of both schemes is illustrated using data obtained from an array of thick film metal oxide pH sensor electrodes. An ideal pH electrode will sense only the activity of hydrogen ions, however the selectivity of the metal oxide device is worse than the conventional glass electrode. The use of sensor fusion can therefore reduce measurement uncertainty by combining readings from multiple pH sensors having complementary responses. The array can be conveniently fabricated by screen printing sensors using different metal oxides onto a single substrate.
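For readers unfamiliar with the Nadaraya-Watson kernel estimator used in the first approach, here is a generic sketch of the estimator as a locally weighted average applied to a noisy signal; how the thesis wires it into competitive fusion of a pH-electrode array (and the genetic-algorithm tuning) is more involved, so treat this only as a pointer to the basic formula.

```python
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, bandwidth=1.0):
    """Nadaraya-Watson kernel regression: a locally weighted average of y_train,
    with Gaussian weights that decay with distance from the query point."""
    weights = np.exp(-0.5 * ((x_query - x_train) / bandwidth) ** 2)
    return np.sum(weights * y_train) / np.sum(weights)

# Smooth a noisy reading sequence (stand-in for a sensor signal)
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 200)
readings = np.sin(t) + rng.normal(0.0, 0.3, t.size)
smoothed = np.array([nadaraya_watson(ti, t, readings, bandwidth=0.4) for ti in t])
print("residual standard deviation after smoothing:", np.std(smoothed - np.sin(t)))
```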
9

Prajitno, Prawito. "Neuro-fuzzy methods in multisensor data fusion." Thesis, University of Sheffield, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.251258.

10

Tannous, Halim Elie. "Interactive and connected rehabilitation systems for e-health." Thesis, Compiègne, 2018. http://www.theses.fr/2018COMP2436/document.

Abstract:
Conventional musculoskeletal rehabilitation consists of therapeutic sessions, home exercise assignment, and movement execution with or without the assistance of therapists. This classical approach suffers from many limitations, due to the expert’s inability to follow the patient’s home sessions, and the patient’s lack of motivation to repeat the same exercises without feedback. Serious games have been presented as a possible solution for these problems. This thesis was carried out in the eBioMed experimental platform of the Université de technologie de Compiège, and in the framework of the Labex MS2T. The aim of this thesis is to develop a real-time, serious gaming system for home-based musculoskeletal rehabilitation. First, exergames were developed, using a codesign methodology, where the patients, experts and developers took part in the design and implementation procedures. The Kinect sensor was used to capture real-time kinematics during each exercise. Next, data fusion was implemented between the Kinect sensor and inertial measurement units, to increase the accuracy of joint angle estimation, using a system of systems approach. In addition, graphical user interfaces were developed, for experts and patients, to suit the needs of different end-users, based on the results of an end-user acceptability study. The system was evaluated by patients with different pathologies through multiple evaluation campaigns. Obtained results showed that serious games can be a good solution for specific types of pathologies. Moreover, experts were convinced of the clinical relevance of this device, and found that the estimated data was more than enough to assess the patient’s situation during their home-based exercise sessions. Finally, during these three years, we have set the base for a home-based rehabilitation system that can be deployed at home or in a clinical environment. The implementation of such systems would maximize the efficiency of rehabilitation program, while saving the patient’s and expert’s time and money. On the other hand, this system would also reduce the limitation that are currently present in classical rehabilitation programs, allowing the patients to visualize their movements, and the experts to follow the home exercise execution
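The abstract does not spell out the fusion algorithm, so the sketch below shows one common, generic option: a per-frame variance-weighted average of the camera-derived and IMU-derived joint-angle estimates. This is an illustrative assumption, not the thesis's system-of-systems method.

```python
def fuse_joint_angle(angle_kinect, var_kinect, angle_imu, var_imu):
    """Variance-weighted fusion of two joint-angle estimates (degrees).

    The weight on each source is proportional to the other source's variance,
    so the less noisy estimate dominates the fused value.
    """
    w_kinect = var_imu / (var_kinect + var_imu)
    return w_kinect * angle_kinect + (1.0 - w_kinect) * angle_imu

# Illustrative per-frame estimates of a knee flexion angle
print(fuse_joint_angle(angle_kinect=42.0, var_kinect=9.0, angle_imu=45.0, var_imu=4.0))
```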

Books on the topic "Multisensory fusion"

1

NATO Advanced Study Institute on Multisensor Data Fusion (2000 Pitlochry, Scotland). Multisensor fusion. Dordrecht: Kluwer Academic Publishers, 2002.

2

Hyder, A. K., E. Shahbazian, and E. Waltz, eds. Multisensor Fusion. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2.

3

Llinas, James, ed. Multisensor data fusion. Boston: Artech House, 1990.

4

Zhu, Yunmin. Multisensor Decision And Estimation Fusion. Boston, MA: Springer US, 2003.

5

Multisensor decision and estimation fusion. Boston: Kluwer Academic Publishers, 2003.

6

Zhu, Yunmin. Multisensor Decision And Estimation Fusion. Boston, MA: Springer US, 2003. http://dx.doi.org/10.1007/978-1-4615-1045-1.

7

Aggarwal, J. K., ed. Multisensor Fusion for Computer Vision. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/978-3-662-02957-2.

8

Foresti, Gian Luca. Multisensor Surveillance Systems: The Fusion Perspective. Boston, MA: Springer US, 2003.

9

Sanderson, A. C., ed. Multisensor fusion: A minimal representation framework. Singapore: World Scientific, 1999.

10

Multisensor surveillance systems: The fusion perspective. Boston, MA: Kluwer Academic, 2003.


Book chapters on the topic "Multisensory fusion"

1

Bystritsky, V. M. "Multisensory Experiments on the Meson Facilities." In Multisensor Fusion, 815–37. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_41.

2

Ge, Weimin, and Zuoliang Cao. "Mobile Robot Navigation Based on Multisensory Fusion." In Lecture Notes in Computer Science, 984–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11539902_125.

3

Pungor, E. "The New Theory About Ion-Selective Electrodes." In Multisensor Fusion, 865–78. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_44.

4

Valin, P. "Reasoning Frameworks." In Multisensor Fusion, 223–45. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_9.

5

Hanwehr, R. "Information Fusion in the Human Brain." In Multisensor Fusion, 1–36. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_1.

6

Valin, P. "Random Sets and Unification." In Multisensor Fusion, 247–66. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_10.

7

Bloch, I. "Fusion of Information under Imprecision and Uncertainty, Numerical Methods, and Image Information Fusion." In Multisensor Fusion, 267–93. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_11.

8

Rao, N. S. V. "Multisensor Fusion under Unknown Distributions Finite-Sample Performance Guarantees." In Multisensor Fusion, 295–329. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_12.

9

Cadre, J. P. "Data Association and Multitarget Tracking." In Multisensor Fusion, 331–49. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_13.

10

Ranchin, T. "Wavelets for Modeling and Data Fusion in Remote Sensing." In Multisensor Fusion, 351–63. Dordrecht: Springer Netherlands, 2002. http://dx.doi.org/10.1007/978-94-010-0556-2_14.


Conference papers on the topic "Multisensory fusion"

1

Jayaratne, Madhura, Damminda Alahakoon, Daswin De Silva, and Xinghuo Yu. "Bio-Inspired Multisensory Fusion for Autonomous Robots." In IECON 2018 - 44th Annual Conference of the IEEE Industrial Electronics Society. IEEE, 2018. http://dx.doi.org/10.1109/iecon.2018.8592809.

2

Geiger, Ray W., and J. T. Snell. "Interdisciplinary multisensory fusion: design lessons from professional architects." In Applications in Optical Science and Engineering, edited by Paul S. Schenker. SPIE, 1992. http://dx.doi.org/10.1117/12.131645.

3

Tomasik, Jerzy A. "Discrete dynamic approach to the multisensory multitrack fusion." In AeroSense 2000, edited by Belur V. Dasarathy. SPIE, 2000. http://dx.doi.org/10.1117/12.381651.

4

Gendron, Denis J., Mohamad Farooq, and Kaouthar Benameur. "Track-to-track fusion in a multisensory environment." In Aerospace/Defense Sensing, Simulation, and Controls, edited by Ivan Kadar. SPIE, 2001. http://dx.doi.org/10.1117/12.436995.

5

Siva, Sriram, and Hao Zhang. "Omnidirectional Multisensory Perception Fusion for Long-Term Place Recognition." In 2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2018. http://dx.doi.org/10.1109/icra.2018.8461042.

6

Mengde, Liu, He Haijing, and Du Libin. "Acquisition of expandable current profiler based on multisensory data fusion." In 2011 IEEE 2nd International Conference on Computing, Control and Industrial Engineering (CCIE 2011). IEEE, 2011. http://dx.doi.org/10.1109/ccieng.2011.6008065.

7

Zheng, Yufeng, Kwabena Agyepong, and Ognjen Kuljaca. "Multisensory data exploitation using advanced image fusion and adaptive colorization." In SPIE Defense and Security Symposium, edited by Ivan Kadar. SPIE, 2008. http://dx.doi.org/10.1117/12.784043.

8

Makarau, Aliaksei, Gintautas Palubinskas, and Peter Reinartz. "Discrete Graphical Models for Alphabet-Based Multisensory Data Fusion and Classification." In 2011 International Symposium on Image and Data Fusion (ISIDF). IEEE, 2011. http://dx.doi.org/10.1109/isidf.2011.6024235.

9

Zhang, Zhifen, Guangrui Wen, and Shanben Chen. "Multisensory data fusion technique and its application to welding process monitoring." In 2016 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO). IEEE, 2016. http://dx.doi.org/10.1109/arso.2016.7736298.

10

Abdulghafour, Muhamad, T. Chandra, and Mongi A. Abidi. "Data fusion through fuzzy reasoning applied to segmentation of multisensory images." In San Diego '92, edited by Su-Shing Chen. SPIE, 1992. http://dx.doi.org/10.1117/12.130843.


Reports on the topic "Multisensory fusion"

1

Bar-Shalom, Yaakov, K. R. Pattipati, and P. K. Willett. Estimation With Multisensor Fusion. Fort Belvoir, VA: Defense Technical Information Center, July 2003. http://dx.doi.org/10.21236/ada416565.

2

Bar-Shalom, Y., and K. R. Pattipati. Multisensor/Multiscan Detection Fusion. Fort Belvoir, VA: Defense Technical Information Center, April 1997. http://dx.doi.org/10.21236/ada336763.

3

Yocky, D. A., M. D. Chadwick, S. P. Goudy, and D. K. Johnson. Multisensor data fusion algorithm development. Office of Scientific and Technical Information (OSTI), December 1995. http://dx.doi.org/10.2172/172138.

4

Hall, David L., and Alan Steinberg. Dirty Secrets in Multisensor Data Fusion. Fort Belvoir, VA: Defense Technical Information Center, January 2001. http://dx.doi.org/10.21236/ada394631.

5

Bar-Shalom, Y., and K. R. Pattipati. Estimation with Multisensor/Multiscan Detection Fusion. Fort Belvoir, VA: Defense Technical Information Center, March 1992. http://dx.doi.org/10.21236/ada250496.

6

Santosa, Fadil. Estimation With Multisensor/Multiscan Detection Fusion. Fort Belvoir, VA: Defense Technical Information Center, February 1993. http://dx.doi.org/10.21236/ada265673.

7

Vann, Laura D., Kevin M. Cuomo, Jean E. Piou, and Joseph T. Mayhan. Multisensor Fusion Processing for Enhanced Radar Imaging. Fort Belvoir, VA: Defense Technical Information Center, April 2000. http://dx.doi.org/10.21236/ada376545.

8

Pao, Lucy Y. Distributed Multisensor Fusion Algorithms for Tracking Applications. Fort Belvoir, VA: Defense Technical Information Center, May 2000. http://dx.doi.org/10.21236/ada377900.

9

Nasrabadi, Nasser M. Nonlinear Joint Fusion and Detection of Mines Using Multisensor Data. Fort Belvoir, VA: Defense Technical Information Center, May 2008. http://dx.doi.org/10.21236/ada484809.

10

Jeong, Soonho, and Jitendra K. Tugnait. Multisensor Tracking of a Maneuvering Target in Clutter with Asynchronous Measurements using IMMPDA Filtering and Parallel Detection Fusion. Fort Belvoir, VA: Defense Technical Information Center, September 2003. http://dx.doi.org/10.21236/ada417405.
