Academic literature on the topic 'Optical-inertial data fusion'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Optical-inertial data fusion.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Optical-inertial data fusion"

1

Soroush, Ali, Farzam Farahmand, and Hassan Salarieh. "Design and Implementation of an Improved Real-Time Tracking System for Navigation Surgery by Fusion of Optical and Inertial Tracking Methods." Applied Mechanics and Materials 186 (June 2012): 273–79. http://dx.doi.org/10.4028/www.scientific.net/amm.186.273.

Abstract:
Fusing optical and inertial tracking systems is an attractive way to overcome the shadowing problem of optical trackers and the time-integration drift of inertial sensors. We developed a Kalman-filter-based fusion algorithm for this purpose and examined its efficacy in improving the position and orientation data obtained by each individual system. Experimental results indicated that the proposed fusion algorithm could effectively estimate 2 seconds of missing optical-tracker data.
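The abstract above describes bridging optical dropouts with a Kalman filter. As a rough illustration of the idea (a 1-D toy model of our own, not the authors' implementation), inertial acceleration can drive the prediction step while optical position measurements, when visible, drive the update step; during an occlusion the filter simply keeps predicting:

```python
class OpticalInertialKF:
    """Scalar-position Kalman filter: inertial predict, optical update."""

    def __init__(self, dt, q=1e-3, r=1e-2):
        self.dt, self.q, self.r = dt, q, r
        self.x = [0.0, 0.0]                 # state: [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]   # state covariance

    def predict(self, accel):
        """Propagate state with measured acceleration (inertial sensor)."""
        dt = self.dt
        p, v = self.x
        self.x = [p + v * dt + 0.5 * accel * dt * dt, v + accel * dt]
        # P <- F P F^T + Q with F = [[1, dt], [0, 1]]
        a, b = self.P[0]
        c, d = self.P[1]
        self.P = [[a + dt * (b + c) + dt * dt * d + self.q, b + dt * d],
                  [c + dt * d, d + dt * dt * self.q]]

    def update(self, z):
        """Correct with an optical position fix; z=None means occluded."""
        if z is None:                        # marker occluded: predict only
            return
        s = self.P[0][0] + self.r            # innovation covariance
        k0 = self.P[0][0] / s                # Kalman gain, position row
        k1 = self.P[1][0] / s                # Kalman gain, velocity row
        y = z - self.x[0]                    # innovation
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        self.P = [[(1 - k0) * self.P[0][0], (1 - k0) * self.P[0][1]],
                  [self.P[1][0] - k1 * self.P[0][0],
                   self.P[1][1] - k1 * self.P[0][1]]]
```

With an exact motion model the prediction bridges the gap without error; in practice, accelerometer noise and bias make the covariance grow during the dropout, which is what bounds how long a gap (here, roughly 2 s) can be bridged.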
2

Ursel, Tomasz, and Michał Olinski. "Displacement Estimation Based on Optical and Inertial Sensor Fusion." Sensors 21, no. 4 (2021): 1390. http://dx.doi.org/10.3390/s21041390.

Abstract:
This article aims to develop a system capable of estimating the displacement of a moving object using relatively cheap and easy-to-apply sensors. There is a growing need for such systems, not only for robots but also, for instance, for pedestrian navigation. In this paper the theory behind this idea is presented, including data post-processing algorithms for a MEMS accelerometer and an optical flow sensor (OFS), as well as the complementary filter developed for sensor fusion. In addition, a vital part of the accelerometer's algorithm, zero-velocity state detection, is …
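The complementary-filter idea in this abstract can be sketched in a few lines (parameter values and the zero-velocity test below are illustrative assumptions, not the paper's algorithm): blend the integrated accelerometer velocity, which is responsive but drifts, with the optical-flow velocity, which is drift-free but noisy, and clamp the velocity to zero when both sensors report near-stillness:

```python
def fuse_displacement(accel, ofs_vel, dt=0.01, alpha=0.98, zv_thresh=0.05):
    """Displacement estimates from accelerometer + optical-flow samples.

    accel   : acceleration samples (m/s^2)
    ofs_vel : optical-flow velocity samples (m/s)
    alpha   : complementary-filter weight toward the inertial path
    """
    v, x, out = 0.0, 0.0, []
    for a, v_ofs in zip(accel, ofs_vel):
        if abs(a) < zv_thresh and abs(v_ofs) < zv_thresh:
            v = 0.0      # zero-velocity update: cancel integration drift
        else:
            # high-frequency path: integrate acceleration;
            # low-frequency path: trust the optical-flow velocity
            v = alpha * (v + a * dt) + (1.0 - alpha) * v_ofs
        x += v * dt
        out.append(x)
    return out
```

The zero-velocity branch is what keeps a small accelerometer bias from accumulating into position drift while the object is standing still.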
3

Sun, Hui Qin. "Design of Indoor Large-Scale Multi-Target Precise Positioning and Tracking System." Advanced Materials Research 1049-1050 (October 2014): 1233–36. http://dx.doi.org/10.4028/www.scientific.net/amr.1049-1050.1233.

Abstract:
This paper builds an indoor large-scale multi-target precise positioning and tracking system. It presents in-depth research on vision-based optical tracking, inertial tracking, and multi-sensor data fusion, addressing key technologies in graphics, imaging, data fusion, and tracking, in order to develop a highly versatile, real-time, robust, wide-range, high-precision optical tracking system for indoor large-scale multi-target precise positioning and tracking.
4

Chen, Jie, Can-jun Yang, Jens Hofschulte, Wan-li Jiang, and Cha Zhang. "A robust optical/inertial data fusion system for motion tracking of the robot manipulator." Journal of Zhejiang University SCIENCE C 15, no. 7 (2014): 574–83. http://dx.doi.org/10.1631/jzus.c1300302.

5

Boronakhin, A. M., Yu V. Filatov, D. Yu Larionov, et al. "Fusion of inertial and optical data for monitoring the geometry of the rail track." IOP Conference Series: Materials Science and Engineering 984 (November 28, 2020): 012009. http://dx.doi.org/10.1088/1757-899x/984/1/012009.

6

Soroush, Ali, Mohammad Akbar, and Farzam Farahmand. "How to Synchronize and Register an Optical-Inertial Tracking System." Applied Mechanics and Materials 332 (July 2013): 130–36. http://dx.doi.org/10.4028/www.scientific.net/amm.332.130.

Abstract:
Multi-sensor tracking is widely used to augment tracking accuracy through data fusion. A basic requirement for such applications is real-time temporal synchronization and spatial registration of the two sensors' data. This study presents a new method for coordinating two tracking sensors' measurements in time and space. For spatial registration, we used a body coordinate system and then applied the lever-arm effect. Time synchronization was done with a least-mean-square (LMS) error method. This method was implemented to synchronize the position and orientation of a …
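The least-mean-square synchronization described here amounts to sliding one sensor's time series against the other and keeping the sample offset with the smallest mean squared error over the overlap. A hedged sketch (the grid-search formulation and names are ours, not the paper's):

```python
def lms_time_offset(optical, inertial, max_lag=50):
    """Offset (in samples) by which `inertial` lags `optical`,
    chosen by minimizing the mean squared error over the overlap."""
    best_lag, best_err = 0, float("inf")
    for lag in range(-max_lag, max_lag + 1):
        # Pair up samples that overlap at this candidate lag.
        pairs = [(optical[i], inertial[i + lag])
                 for i in range(len(optical))
                 if 0 <= i + lag < len(inertial)]
        if not pairs:
            continue
        err = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if err < best_err:
            best_lag, best_err = lag, err
    return best_lag
```

Once the offset is known, one stream is shifted (or its timestamps corrected) before the spatial-registration step.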
7

Cutolo, Fabrizio, Virginia Mamone, Nicola Carbonaro, Vincenzo Ferrari, and Alessandro Tognetti. "Ambiguity-Free Optical–Inertial Tracking for Augmented Reality Headsets." Sensors 20, no. 5 (2020): 1444. http://dx.doi.org/10.3390/s20051444.

Abstract:
The increasing capability of computing power and mobile graphics has made possible the release of self-contained augmented reality (AR) headsets featuring efficient head-anchored tracking solutions. Ego motion estimation based on well-established infrared tracking of markers ensures sufficient accuracy and robustness. Unfortunately, wearable visible-light stereo cameras with short baseline and operating under uncontrolled lighting conditions suffer from tracking failures and ambiguities in pose estimation. To improve the accuracy of optical self-tracking and its resiliency to marker occlusions …
8

Chatzitofis, Anargyros, Dimitrios Zarpalas, Stefanos Kollias, and Petros Daras. "DeepMoCap: Deep Optical Motion Capture Using Multiple Depth Sensors and Retro-Reflectors." Sensors 19, no. 2 (2019): 282. http://dx.doi.org/10.3390/s19020282.

Abstract:
In this paper, a marker-based, single-person optical motion capture method (DeepMoCap) is proposed using multiple spatio-temporally aligned infrared-depth sensors and retro-reflective straps and patches (reflectors). DeepMoCap explores motion capture by automatically localizing and labeling reflectors on depth images and, subsequently, in 3D space. Introducing a non-parametric representation to encode the temporal correlation among pairs of colorized depthmaps and 3D optical flow frames, a multi-stage Fully Convolutional Network (FCN) architecture is proposed to jointly learn reflector locations …
9

Wada, Tomohito, Ryu Nagahara, Sam Gleadhill, Tatsuro Ishizuka, Hayato Ohnuma, and Yuji Ohgi. "Measurement of Pelvic Orientation Angles during Sprinting Using a Single Inertial Sensor." Proceedings 49, no. 1 (2020): 10. http://dx.doi.org/10.3390/proceedings2020049010.

Abstract:
The purpose of this study was to elucidate pelvic orientation angles using a single lower back-mounted inertial sensor during sprinting. A single inertial sensor was attached to each sprinter's lower back and used to measure continuous pelvic movements, including pelvic obliquity (roll), anterior-posterior tilt (pitch), and rotation (yaw), during sprinting from a straight to a bend section. The pelvic orientation angles were estimated from the three-dimensional sensor orientation using a sensor fusion algorithm. Absolute angles derived from the sensor were compared with angles obtained from an optical …
10

Oh, Hyun Min, and Min Young Kim. "Kalman Filter Based Pose Data Fusion with Optical Tracking System and Inertial Navigation System Networks for Image Guided Surgery." Transactions of The Korean Institute of Electrical Engineers 66, no. 1 (2017): 121–26. http://dx.doi.org/10.5370/kiee.2017.66.1.121.


Dissertations / Theses on the topic "Optical-inertial data fusion"

1

Josses, Roxane. "Data Fusion between Inertial and Optical Sensors for Earth Observation Satellite Line of Sight Estimation and Stabilization." Thesis, KTH, Rymdteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-299359.

Abstract:
The line-of-sight (LoS) pointing and stability requirements of Earth observation satellites are becoming increasingly stringent. The microvibrations that disturb the LoS are therefore no longer negligible, and many studies are focusing on the estimation of these microvibrations and on mitigation strategies, at platform and payload levels. Disturbances of the LoS can be seen by image processing algorithms that provide high-frequency information. In this paper, a novel method is proposed to fuse these optical data with inertial data using a Kalman filter to obtain the best LoS estimation. …
2

Claasen, Göntje Caroline. "Capteur de mouvement intelligent pour la chirurgie prothétique naviguée." PhD thesis, École Nationale Supérieure des Mines de Paris, 2012. http://pastel.archives-ouvertes.fr/pastel-00691192.

Abstract:
We present an optical-inertial tracking system consisting of two stationary cameras and a Sensor Unit with optical markers and an inertial measurement unit. The Sensor Unit is attached to the tracked object, and its position and orientation are determined by a data-fusion algorithm. The tracking system is intended to servo-control a handheld tool in a navigated or computer-assisted surgery system. The data-fusion algorithm integrates the data from the different sensors, that is, the optical data from the cameras and the inertial data from the accelerometers …
3

"3D human gesture tracking and recognition by MENS inertial sensor and vision sensor fusion." 2013. http://library.cuhk.edu.hk/record=b5884339.

Abstract:
Zhou, Shengli. Thesis (Ph.D.), Chinese University of Hong Kong, 2013. Includes bibliographical references (leaves 133–140). Electronic reproduction. Hong Kong: Chinese University of Hong Kong, [2012]. System requirements: Adobe Acrobat Reader. Available via World Wide Web. Abstracts also in Chinese.

Book chapters on the topic "Optical-inertial data fusion"

1

Troll, Péter, Károly Szipka, and Andreas Archenti. "Indoor Localization of Quadcopters in Industrial Environment." In Advances in Transdisciplinary Engineering. IOS Press, 2020. http://dx.doi.org/10.3233/atde200183.

Abstract:
The research work in this paper was carried out to reach advanced positioning capabilities of unmanned aerial vehicles (UAVs) for indoor applications. The paper covers the design of a quadcopter and the implementation of a control system able to position the quadcopter indoors using an onboard visual pose-estimation system, without the help of GPS. The project also covered the design and implementation of the quadcopter hardware and the control software. The developed hardware enables the quadcopter to lift at least 0.5 kg of additional payload. The system was developed on a Raspberry Pi single-board computer in combination with a PixHawk flight controller. The OpenCV library was used to implement the necessary computer vision. The open-source solution was developed in the Robot Operating System (ROS) environment, which performs sensor reading and communication with the flight controller while recording data about its operation and transmitting it to the user interface. For vision-based position estimation, pre-positioned printed markers were used. The markers were generated with ArUco coding, which exactly defines the current position and orientation of the quadcopter with the help of computer vision. The resulting data were processed in the ROS environment. A LiDAR with the Hector SLAM algorithm was used to map the objects around the quadcopter. The project also covers the necessary camera calibration. The fusion of signals from the camera and from the IMU (Inertial Measurement Unit) was achieved using an Extended Kalman Filter (EKF). The evaluation of the completed positioning system was performed with an OptiTrack optical external multi-camera measurement system. The introduced evaluation method is precise enough to investigate enhancements in quadcopter positioning performance, as well as to fine-tune the parameters of the controller and the filtering approach.
The payload capacity allows autonomous material handling indoors. Based on the experiments, the positioning system is accurate enough for industrial application.
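The core of the EKF fusion step mentioned in this chapter reduces, in the scalar case, to inverse-variance weighting of two estimates of the same quantity, for example a marker-based yaw and a gyro-integrated yaw (a toy formulation of ours, not the chapter's ROS implementation):

```python
def fuse(est_a, var_a, est_b, var_b):
    """Combine two noisy scalar estimates by inverse-variance weighting."""
    w = var_b / (var_a + var_b)            # weight toward the less noisy source
    est = w * est_a + (1.0 - w) * est_b
    var = var_a * var_b / (var_a + var_b)  # fused variance <= either input
    return est, var
```

The key property is that the fused variance is always smaller than both inputs, which is why adding even a noisy second sensor improves the pose estimate.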

Conference papers on the topic "Optical-inertial data fusion"

1

Zhang, Jun-wei, Hai Zhou, Dong-hui Lin, et al. "Analysis of the Effect of Ambient Loads on the SG-III TIL Structure Stability." In ASME 2008 International Mechanical Engineering Congress and Exposition. ASMEDC, 2008. http://dx.doi.org/10.1115/imece2008-66046.

Abstract:
ICF (Inertial Confinement Fusion) drivers are large-scale precision optical facilities. Structure stability is an important design index for ICF drivers and has a direct influence on the quality of beam shooting. The total positioning error budget designed for the SG-III Technical Integration Experiment Line (TIL) built in China is 30 micrometers (μm) at the target plane or its equivalent, of which the allocated stability budget is 27 μm, and the translation of a single optical element should be less than 1 μm under the effect of ambient loads. Based on the previous evaluation of the TIL …
2

Hou, Zhi, Juntong Qi, and Mingming Wang. "Fusing Optical Flow and Inertial Data for UAV Motion Estimation in GPS-denied Environment." In 2019 Chinese Control Conference (CCC). IEEE, 2019. http://dx.doi.org/10.23919/chicc.2019.8866551.
