Academic literature on the topic 'Vision-aided INS'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Vision-aided INS.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Vision-aided INS"

1

Lee, Kyuman, and Eric N. Johnson. "Robust Outlier-Adaptive Filtering for Vision-Aided Inertial Navigation." Sensors 20, no. 7 (April 4, 2020): 2036. http://dx.doi.org/10.3390/s20072036.

Abstract:
With the advent of unmanned aerial vehicles (UAVs), a major area of interest in the research field of UAVs has been vision-aided inertial navigation systems (V-INS). In the front-end of V-INS, image processing extracts information about the surrounding environment and determines features or points of interest. With the extracted vision data and inertial measurement unit (IMU) dead reckoning, the most widely used algorithm for estimating vehicle and feature states in the back-end of V-INS is an extended Kalman filter (EKF). An important assumption of the EKF is Gaussian white noise. In fact, measurement outliers that arise in various realistic conditions are often non-Gaussian. A lack of compensation for unknown noise parameters often leads to a serious impact on the reliability and robustness of these navigation systems. To compensate for uncertainties of the outliers, we require modified versions of the estimator or the incorporation of other techniques into the filter. The main purpose of this paper is to develop accurate and robust V-INS for UAVs, in particular, those for situations pertaining to such unknown outliers. Feature correspondence in the image-processing front-end rejects vision outliers, and then a statistical test in the filtering back-end detects the remaining outliers of the vision data. For frequent outlier occurrences, variational approximation for Bayesian inference derives a way to compute the optimal noise precision matrices of the measurement outliers. The overall process of outlier removal and adaptation is referred to here as “outlier-adaptive filtering”. Even though almost all approaches to V-INS remove outliers by some method, few researchers have treated outlier adaptation in V-INS in much detail. Here, results from flight datasets validate the improved accuracy of V-INS employing the proposed outlier-adaptive filtering framework.
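The "statistical test" in the filtering back end is commonly a chi-square gate on the extended Kalman filter innovation. The minimal Python sketch below illustrates that generic idea only; it is not the authors' implementation, and the measurement function h, Jacobian H, noise covariance R, and gate probability are illustrative assumptions.

    import numpy as np
    from scipy.stats import chi2

    def gated_ekf_update(x, P, z, h, H, R, alpha=0.99):
        """EKF measurement update that skips updates failing a chi-square gate.

        x, P : prior state vector and covariance
        z    : measurement vector
        h    : function mapping the state to the predicted measurement
        H    : measurement Jacobian evaluated at x
        R    : measurement noise covariance
        """
        y = z - h(x)                                   # innovation
        S = H @ P @ H.T + R                            # innovation covariance
        d2 = float(y @ np.linalg.solve(S, y))          # squared Mahalanobis distance
        if d2 > chi2.ppf(alpha, df=len(y)):            # gate: treat as outlier, reject
            return x, P
        K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
        x_new = x + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new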
2

Xiao, Ming, Liang Pan, Tianjiang Hu, and Lincheng Shen. "A Novel Fusion Scheme for Vision Aided Inertial Navigation of Aerial Vehicles." Mathematical Problems in Engineering 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/819565.

Abstract:
Vision-aided inertial navigation is an important and practical mode of integrated navigation for aerial vehicles. In this paper, a novel fusion scheme is proposed and developed using the information from the inertial navigation system (INS) and the vision matching subsystem. This scheme differs from the conventional Kalman filter (CKF), which treats these two information sources equally even though vision-aided navigation is subject to uncertainty and inaccuracy. By concentrating on the reliability of vision matching, the fusion scheme of integrated navigation is upgraded: not only are matching positions used, but their degrees of reliability are also considered. Moreover, a fusion algorithm is designed and proved to be optimal in the sense that it minimizes the mean-square estimation error. Simulations are carried out to validate the effectiveness of this novel navigation fusion scheme. Results show the new fusion scheme outperforms the CKF and the adaptive Kalman filter (AKF) in vision/INS estimation under the given scenarios and specifications.
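The minimum-variance weighting behind such fusion schemes can be illustrated with textbook inverse-variance weighting of two independent estimates. The one-dimensional Python sketch below is an assumed simplification of the principle, not the paper's full scheme.

    def fuse_min_variance(x_ins, var_ins, x_vis, var_vis):
        """Fuse two unbiased scalar estimates with inverse-variance weights.

        These weights minimize the variance of the fused estimate, which is the
        mean-square-error-optimal linear combination for independent estimates.
        """
        w_ins = 1.0 / var_ins
        w_vis = 1.0 / var_vis
        x_fused = (w_ins * x_ins + w_vis * x_vis) / (w_ins + w_vis)
        var_fused = 1.0 / (w_ins + w_vis)
        return x_fused, var_fused

    # Example: a noisy vision match (large variance) contributes less than the INS.
    print(fuse_min_variance(100.0, 4.0, 103.0, 25.0))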
3

Lee, Dongjin, Youngjoo Kim, and Hyochoong Bang. "Vision-aided terrain referenced navigation for unmanned aerial vehicles using ground features." Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering 228, no. 13 (January 6, 2014): 2399–413. http://dx.doi.org/10.1177/0954410013517804.

Abstract:
A vision-aided terrain referenced navigation (VATRN) approach is addressed for autonomous navigation of unmanned aerial vehicles (UAVs) under GPS-denied conditions. A typical terrain referenced navigation (TRN) algorithm blends inertial navigation data with measured terrain information to estimate the vehicle’s position. In this paper, a low-cost inertial navigation system (INS) for UAVs is supplemented with a monocular vision-aided navigation system and terrain height measurements. A point mass filter based on Bayesian estimation is employed as the TRN algorithm. Homographies are established to estimate the vehicle’s relative translational motion using ground features with simple assumptions, and the error analysis in homography estimation is explored to estimate the error covariance matrix associated with the visual odometry data. The estimated error covariance is delivered to the TRN algorithm for robust estimation. Furthermore, multiple ground features tracked by image observations are utilized as multiple height measurements to improve the performance of the VATRN algorithm.
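A point mass filter keeps the position posterior on a grid of candidate positions and reweights each cell with every terrain-height measurement. The minimal Python sketch below shows one such Bayesian update under an assumed Gaussian height-measurement model; the grid layout and noise model are illustrative, not the paper's.

    import numpy as np

    def point_mass_update(weights, terrain_map, measured_height, sigma):
        """One Bayesian measurement update of a grid-based (point mass) filter.

        weights         : prior probability mass per grid cell (2-D array, sums to 1)
        terrain_map     : terrain height of each grid cell (same shape as weights)
        measured_height : terrain height measured beneath the vehicle
        sigma           : standard deviation of the height-measurement error
        """
        residual = measured_height - terrain_map
        likelihood = np.exp(-0.5 * (residual / sigma) ** 2)   # Gaussian likelihood per cell
        posterior = weights * likelihood
        return posterior / posterior.sum()                     # renormalize the probability mass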
4

Fang, Qiang, and Xin Sheng Huang. "Global Observability Analysis IMU/Camera Integration." Applied Mechanics and Materials 380-384 (August 2013): 1069–72. http://dx.doi.org/10.4028/www.scientific.net/amm.380-384.1069.

Abstract:
Vision-aided inertial navigation systems can provide precise state estimates for the 3-D motion of a vehicle. This is achieved by combining inertial measurements from an inertial measurement unit (IMU) with visual observations from a camera. Observability is a key aspect of the INS/Camera state estimation problem. In most previous research, conservative observability concepts based on Lie derivatives have extensively been used to characterize the estimability properties. In this paper, we present a novel approach to investigate the observability of INS/Camera: global observability. The global observability method starts directly from the basic observability definition. The global observability analysis approach is not only straightforward and comprehensive but also provides new insights compared with conventional methods. Some sufficient conditions for the global observability of the system are provided.
5

Choi, Ji-Hoon, Yong-Woon Park, Jae-Bok Song, and In-So Kweon. "Localization using GPS and VISION aided INS with an image database and a network of a ground-based reference station in outdoor environments." International Journal of Control, Automation and Systems 9, no. 4 (August 2011): 716–25. http://dx.doi.org/10.1007/s12555-011-0413-y.

6

Rahman, Muhammed Tahsin, Tashfeen Karamat, Sidney Givigi, and Aboelmagd Noureldin. "Improving Multisensor Positioning of Land Vehicles with Integrated Visual Odometry for Next-Generation Self-Driving Cars." Journal of Advanced Transportation 2018 (2018): 1–12. http://dx.doi.org/10.1155/2018/6513970.

Abstract:
For their complete realization, autonomous vehicles (AVs) fundamentally rely on the Global Navigation Satellite System (GNSS) to provide positioning and navigation information. However, in areas such as urban cores, parking lots, and under dense foliage, which are all commonly frequented by AVs, GNSS signals suffer from blockage, interference, and multipath. These effects cause high levels of error and long durations of service discontinuity that mar the performance of current systems. The prevalence of vision and low-cost inertial sensors provides an attractive opportunity to further increase the positioning and navigation accuracy in such GNSS-challenged environments. This paper presents enhancements to existing multisensor integration systems utilizing the inertial navigation system (INS) to aid in Visual Odometry (VO) outlier feature rejection. A scheme called Aided Visual Odometry (AVO) is developed and integrated with a high performance mechanization architecture utilizing vehicle motion and orientation sensors. The resulting solution exhibits improved state covariance convergence and navigation accuracy, while reducing computational complexity. Experimental verification of the proposed solution is illustrated through three real road trajectories, over two different land vehicles, and using two low-cost inertial measurement units (IMUs).
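The idea of letting the inertial solution screen visual-odometry features can be reduced, for illustration, to a gate on the gap between each feature's measured image displacement and the displacement predicted from the INS motion. The Python sketch below assumes such a predicted flow has already been computed through the camera model; it is a hedged caricature, not the authors' AVO algorithm.

    import numpy as np

    def reject_vo_outliers(matches, predicted_flow, gate_px=5.0):
        """Keep only feature matches whose displacement agrees with the INS prediction.

        matches        : array of shape (N, 4) holding (u0, v0, u1, v1) pixel pairs
        predicted_flow : array of shape (N, 2) with the per-feature displacement
                         predicted by propagating the INS pose change (assumed given)
        gate_px        : maximum allowed discrepancy in pixels
        """
        measured_flow = matches[:, 2:4] - matches[:, 0:2]
        error = np.linalg.norm(measured_flow - predicted_flow, axis=1)
        return matches[error < gate_px]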
7

Da Silva, Eduardo Sant'Ana, Anderson Santos, and Helio Pedrini. "Metrics for Image Surface Approximation Based on Triangular Meshes." Image Analysis & Stereology 37, no. 1 (April 12, 2018): 71. http://dx.doi.org/10.5566/ias.1591.

Abstract:
Surface approximation plays an important role in several application fields, such as computer-aided design, computer graphics, remote sensing, computer vision, robotics, architecture, and manufacturing. A common problem present in these areas is to develop efficient methods for generating, processing, analyzing, and visualizing large amount of 3D data. Triangular meshes constitute a flexible representation of sampled points that are not regularly distributed in space, such that the model can be adaptively adjusted to the data density. The choice of metrics for building the triangular meshes is crucial to produce high quality models. This paper proposes and evaluates different measures to incrementally refine a Delaunay triangular mesh for image surface approximation until either a certain accuracy is obtained or a maximum number of iterations is achieved. Experiments on several data sets are performed to compare the quality of the resulting meshes.
8

Shapey, Jonathan, Thomas Dowrick, Rémi Delaunay, Eleanor C. Mackle, Stephen Thompson, Mirek Janatka, Roland Guichard, et al. "Integrated multi-modality image-guided navigation for neurosurgery: open-source software platform using state-of-the-art clinical hardware." International Journal of Computer Assisted Radiology and Surgery 16, no. 8 (May 3, 2021): 1347–56. http://dx.doi.org/10.1007/s11548-021-02374-5.

Abstract:
Purpose: Image-guided surgery (IGS) is an integral part of modern neuro-oncology surgery. Navigated ultrasound provides the surgeon with reconstructed views of ultrasound data, but no commercial system presently permits its integration with other essential non-imaging-based intraoperative monitoring modalities such as intraoperative neuromonitoring. Such a system would be particularly useful in skull base neurosurgery. Methods: We established functional and technical requirements of an integrated multi-modality IGS system tailored for skull base surgery with the ability to incorporate: (1) preoperative MRI data and associated 3D volume reconstructions, (2) real-time intraoperative neurophysiological data and (3) live reconstructed 3D ultrasound. We created an open-source software platform to integrate with readily available commercial hardware. We tested the accuracy of the system’s ultrasound navigation and reconstruction using a polyvinyl alcohol phantom model and simulated the use of the complete navigation system in a clinical operating room using a patient-specific phantom model. Results: Experimental validation of the system’s navigated ultrasound component demonstrated accuracy of <4.5 mm and a frame rate of 25 frames per second. Clinical simulation confirmed that system assembly was straightforward, could be achieved in a clinically acceptable time of <15 min and performed with a clinically acceptable level of accuracy. Conclusion: We present an integrated open-source research platform for multi-modality IGS. The present prototype system was tailored for neurosurgery and met all minimum design requirements focused on skull base surgery. Future work aims to optimise the system further by addressing the remaining target requirements.
9

"SLAM Aided GPS/INS/Vision Navigation System for Helicopter." Journal of Institute of Control, Robotics and Systems 14, no. 8 (August 1, 2008): 745–51. http://dx.doi.org/10.5302/j.icros.2008.14.8.745.


Dissertations / Theses on the topic "Vision-aided INS"

1

Wu, Allen David. "Vision-based navigation and mapping for flight in GPS-denied environments." Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/37281.

Abstract:
Traditionally, the task of determining aircraft position and attitude for automatic control has been handled by the combination of an inertial measurement unit (IMU) with a Global Positioning System (GPS) receiver. In this configuration, accelerations and angular rates from the IMU can be integrated forward in time, and position updates from the GPS can be used to bound the errors that result from this integration. However, reliance on the reception of GPS signals places artificial constraints on aircraft such as small unmanned aerial vehicles (UAVs) that are otherwise physically capable of operation in indoor, cluttered, or adversarial environments. Therefore, this work investigates methods for incorporating a monocular vision sensor into a standard avionics suite. Vision sensors possess the potential to extract information about the surrounding environment and determine the locations of features or points of interest. Having mapped out landmarks in an unknown environment, subsequent observations by the vision sensor can in turn be used to resolve aircraft position and orientation while continuing to map out new features. An extended Kalman filter framework for performing the tasks of vision-based mapping and navigation is presented. Feature points are detected in each image using a Harris corner detector, and these feature measurements are corresponded from frame to frame using a statistical Z-test. When GPS is available, sequential observations of a single landmark point allow the point's location in inertial space to be estimated. When GPS is not available, landmarks that have been sufficiently triangulated can be used for estimating vehicle position and attitude. Simulation and real-time flight test results for vision-based mapping and navigation are presented to demonstrate feasibility in real-time applications. These methods are then integrated into a practical framework for flight in GPS-denied environments and verified through the autonomous flight of a UAV during a loss-of-GPS scenario. The methodology is also extended to the application of vehicles equipped with stereo vision systems. This framework enables aircraft capable of hovering in place to maintain a bounded pose estimate indefinitely without drift during a GPS outage.
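The front end described here detects Harris corners in each frame as candidate landmarks. For readers who want to prototype that step, the Python snippet below uses OpenCV's corner detector; OpenCV is an assumed tool for illustration, not the dissertation's code, and the detector parameters are illustrative.

    import cv2

    def detect_harris_corners(image_bgr, max_corners=200):
        """Detect strong Harris corners to use as candidate landmark observations."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        corners = cv2.goodFeaturesToTrack(
            gray,
            maxCorners=max_corners,
            qualityLevel=0.01,
            minDistance=10,
            useHarrisDetector=True,   # use the Harris response rather than Shi-Tomasi
            k=0.04,
        )
        return corners.reshape(-1, 2) if corners is not None else None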
2

Ellingson, Gary James. "Cooperative Navigation of Fixed-Wing Micro Air Vehicles in GPS-Denied Environments." BYU ScholarsArchive, 2019. https://scholarsarchive.byu.edu/etd/8706.

Abstract:
Micro air vehicles have recently gained popularity due to their potential as autonomous systems. Their future impact, however, will depend in part on how well they can navigate in GPS-denied and GPS-degraded environments. In response to this need, this dissertation investigates a potential solution for GPS-denied operations called relative navigation. The method utilizes keyframe-to-keyframe odometry estimates and their covariances in a global back end that represents the global state as a pose graph. The back end is able to effectively represent nonlinear uncertainties and incorporate opportunistic global constraints. The GPS-denied research community has, for the most part, neglected to consider fixed-wing aircraft. This dissertation enables fixed-wing aircraft to utilize relative navigation by accounting for their sensing requirements. The development of an odometry-like, front-end, EKF-based estimator that utilizes only a monocular camera and an inertial measurement unit is presented. The filter uses the measurement model of the multi-state-constraint Kalman filter and regularly performs relative resets in coordination with keyframe declarations. In addition to the front-end development, a method is provided to account for front-end velocity bias in the back-end optimization. Finally a method is presented for enabling multiple vehicles to improve navigational accuracy by cooperatively sharing information. Modifications to the relative navigation architecture are presented that enable decentralized, cooperative operations amidst temporary communication dropouts. The proposed framework also includes the ability to incorporate inter-vehicle measurements and utilizes a new concept called the coordinated reset, which is necessary for optimizing the cooperative odometry and improving localization. Each contribution is demonstrated through simulation and/or hardware flight testing. Simulation and Monte-Carlo testing is used to show the expected quality of the results. Hardware flight-test results show the front-end estimator performance, several back-end optimization examples, and cooperative GPS-denied operations.
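The keyframe-to-keyframe odometry estimates discussed here are chained into a global trajectory before the pose-graph back end refines them with loop closures or opportunistic GPS. A minimal 2-D composition sketch in Python follows; the (dx, dy, dtheta) odometry format is an assumption made for illustration.

    import numpy as np

    def compose_keyframe_odometry(relative_poses):
        """Chain keyframe-to-keyframe odometry (dx, dy, dtheta) into global poses.

        This is the naive dead-reckoning composition that a pose-graph back end
        would later refine once loop-closure or GPS constraints become available.
        """
        x, y, theta = 0.0, 0.0, 0.0
        global_poses = [(x, y, theta)]
        for dx, dy, dtheta in relative_poses:
            # Rotate the body-frame increment into the global frame, then accumulate.
            x += dx * np.cos(theta) - dy * np.sin(theta)
            y += dx * np.sin(theta) + dy * np.cos(theta)
            theta += dtheta
            global_poses.append((x, y, theta))
        return global_poses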
3

Wheeler, David Orton. "Relative Navigation of Micro Air Vehicles in GPS-Degraded Environments." BYU ScholarsArchive, 2017. https://scholarsarchive.byu.edu/etd/6609.

Abstract:
Most micro air vehicles rely heavily on reliable GPS measurements for proper estimation and control, and therefore struggle in GPS-degraded environments. When GPS is not available, the global position and heading of the vehicle is unobservable. This dissertation establishes the theoretical and practical advantages of a relative navigation framework for MAV navigation in GPS-degraded environments. This dissertation explores how the consistency, accuracy, and stability of current navigation approaches degrade during prolonged GPS dropout and in the presence of heading uncertainty. Relative navigation (RN) is presented as an alternative approach that maintains observability by working with respect to a local coordinate frame. RN is compared with several current estimation approaches in a simulation environment and in hardware experiments. While still subject to global drift, RN is shown to produce consistent state estimates and stable control. Estimating relative states requires unique modifications to current estimation approaches. This dissertation further provides a tutorial exposition of the relative multiplicative extended Kalman filter, presenting how to properly ensure observable state estimation while maintaining consistency. The filter is derived using both inertial and body-fixed state definitions and dynamics. Finally, this dissertation presents a series of prolonged flight tests, demonstrating the effectiveness of the relative navigation approach for autonomous GPS-degraded MAV navigation in varied, unknown environments. The system is shown to utilize a variety of vision sensors, work indoors and outdoors, run in real-time with onboard processing, and not require special tuning for particular sensors or environments. Despite leveraging off-the-shelf sensors and algorithms, the flight tests demonstrate stable front-end performance with low drift. The flight tests also demonstrate the onboard generation of a globally consistent, metric, and localized map by identifying and incorporating loop-closure constraints and intermittent GPS measurements. With this map, mission objectives are shown to be autonomously completed.
4

Jackson, James Scott. "Enabling Autonomous Operation of Micro Aerial Vehicles Through GPS to GPS-Denied Transitions." BYU ScholarsArchive, 2019. https://scholarsarchive.byu.edu/etd/8709.

Abstract:
Micro aerial vehicles and other autonomous systems have the potential to truly transform life as we know it; however, much of the potential of autonomous systems remains unrealized because reliable navigation is still an unsolved problem with significant challenges. This dissertation presents solutions to many aspects of autonomous navigation. First, it presents ROSflight, a software and hardware architecture that allows for rapid prototyping and experimentation of autonomy algorithms on MAVs with lightweight, efficient flight control. Next, this dissertation presents improvements to the state of the art in optimal control of quadrotors by utilizing the error-state formulation frequently used in state estimation. It is shown that performing optimal control directly over the error state results in a vastly more computationally efficient system than competing methods while also dealing with the non-vector rotation components of the state in a principled way. In addition, real-time robust flight planning is considered with a method to navigate cluttered, potentially unknown scenarios with real-time obstacle avoidance. Robust state estimation is a critical component of reliable operation, and this dissertation focuses on improving the robustness of visual-inertial state estimation in a filtering framework by extending the state of the art to include better modeling and sensor fusion. Further, this dissertation takes concepts from the visual-inertial estimation community and applies them to tightly coupled GNSS/visual-inertial state estimation. This method is shown to demonstrate significantly more reliable state estimation than visual-inertial or GNSS-inertial state estimation alone in a hardware experiment through a GNSS to GNSS-denied transition, flying under a building and back out into open sky. Finally, this dissertation explores a novel method to combine measurements from multiple agents into a coherent map. Traditional approaches to this problem attempt to solve for the position of multiple agents at specific times in their trajectories. This dissertation instead attempts to solve the problem in a relative context, resulting in a much more robust approach that is able to handle much greater initial error than traditional approaches.

Book chapters on the topic "Vision-aided INS"

1

Mo, Shanhui, Zebo Zhou, Shuang Du, Changgan Xiang, Changhong Kuang, and Jin Wu. "Implementation of Mixed Sequential Kalman Filters for Vision-Aided GNSS/INS Integrated Navigation System." In Lecture Notes in Electrical Engineering, 629–41. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-0029-5_54.

2

Hesch, Joel A., Dimitrios G. Kottas, Sean L. Bowman, and Stergios I. Roumeliotis. "Towards Consistent Vision-Aided Inertial Navigation." In Springer Tracts in Advanced Robotics, 559–74. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-36279-8_34.

3

Southall, B., T. Hague, J. A. Marchant, and B. F. Buxton. "Vision-Aided Outdoor Navigation of an Autonomous Horticultural Vehicle." In Lecture Notes in Computer Science, 37–50. Berlin, Heidelberg: Springer Berlin Heidelberg, 1999. http://dx.doi.org/10.1007/3-540-49256-9_3.

4

Li, Hai, Xianmin Zhang, Lei Zeng, and Heng Wu. "Vision-Aided Online Kinematic Calibration of a Planar 3RRR Manipulator." In Lecture Notes in Electrical Engineering, 963–72. Singapore: Springer Singapore, 2016. http://dx.doi.org/10.1007/978-981-10-2875-5_79.

5

Patil, Rahul B., Ron Chaudhuri, VTNS Pavan Kumar, and S. Ramya. "A Vision-Based GPS-Aided Quadcopter for Emergency Vehicle Assistance." In Lecture Notes in Electrical Engineering, 395–404. Singapore: Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-16-0336-5_32.

6

Nguyen, Tuan T., Sang T. T. Nguyen, and Luu C. Nguyen. "Learning to Solve Sudoku Problems with Computer Vision Aided Approaches." In Advances in Intelligent Systems and Computing, 539–48. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-10-7563-6_56.

7

Fan, Linkun, Congshuai Guo, and Xuchuan Li. "Vehicle Aided Navigation and Positioning Method Based on Computer Vision." In Advances in Intelligent Systems and Computing, 906–10. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-69999-4_131.

8

DSouza, Ashlyn Selena, Rifah Mohamed, Mishika, and V. Kalaichelvi. "Real-Time Fog Removal Using Google Maps Aided Computer Vision Techniques." In Lecture Notes in Electrical Engineering, 571–79. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-4775-1_62.

9

Relyea, Andrew, and Meir Pachter. "A Covariance Analysis of Vision-Aided Inertial Navigation: Free Fall Case." In Advances in Estimation, Navigation, and Spacecraft Control, 309–28. Berlin, Heidelberg: Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-44785-7_17.

10

Milanova, M., L. Nikolov, and Svetlin Fotev. "Three dimensional computer vision for computer aided design and manufacturing applications." In Advances in Structural and Syntactical Pattern Recognition, 279–88. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/3-540-61577-6_29.


Conference papers on the topic "Vision-aided INS"

1

Aouf, Nabil, Vasko Sazdovski, Antonios Tsourdos, and Brian White. "Low Altitude Airborne SLAM with INS Aided Vision System." In AIAA Guidance, Navigation and Control Conference and Exhibit. Reston, Virginia: American Institute of Aeronautics and Astronautics, 2007. http://dx.doi.org/10.2514/6.2007-6790.

2

Youan, Zhang, Kou Kunhu, Chen Yu, and Wang Xibin. "Vision aided INS error estimation based on EKF for cruise missile." In 2012 IEEE International Conference on Computer Science and Automation Engineering (CSAE). IEEE, 2012. http://dx.doi.org/10.1109/csae.2012.6272591.

3

Ming, Cai, Sun Xiu-Xia, Xu Song, and Liu Xi. "Vision aided INS for UAV auto landing navigation using SR-UKF based on two-view homography." In 2014 IEEE Chinese Guidance, Navigation and Control Conference (CGNCC). IEEE, 2014. http://dx.doi.org/10.1109/cgncc.2014.7007276.

4

Cappelle, Cindy, Maan El Badaoui El Najjar, Denis Pomorski, and Francois Charpillet. "Localisation in urban environment using GPS and INS aided by monocular vision system and 3D geographical model." In 2007 IEEE Intelligent Vehicles Symposium. IEEE, 2007. http://dx.doi.org/10.1109/ivs.2007.4290216.

5

Eberwien, U., S. D. Bade, W. Grafe, R. Stute, and P.-O. Nilsson. "Computer Aided Approval – From Vision to Reality." In International Conference on Computer Applications in Shipbuilding. RINA, 2007. http://dx.doi.org/10.3940/rina.iccas.2007.15.

6

Ballew, Aaron, Shiva Srivastava, Nikolay Valtchanov, C. C. Lee, and Aleksandar Kuzmanovic. "Characterization of Vision-Aided Indoor Localization and Landmark Routing." In 2012 Sixth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing (IMIS). IEEE, 2012. http://dx.doi.org/10.1109/imis.2012.27.

7

Viggers, Jens. "Technology Enablers for Computer-Aided Fleet Operation CAFO a Vision." In International Conference on Computer Applications in Shipbuilding. RINA, 2007. http://dx.doi.org/10.3940/rina.iccas.2007.59.

8

Smieschek, Manfred, Gregor Kobsik, Andre Stollenwerk, Stefan Kowalewski, Thorsten Orlikowsky, and Mark Schoberer. "Aided Hand Detection in Thermal Imaging Using RGB Stereo Vision." In 2019 41st Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC). IEEE, 2019. http://dx.doi.org/10.1109/embc.2019.8856990.

9

Khleif, Ali Abbar, and Ahmed Abdulraheem Shnawa. "Vision system aided 3D object reconstruction and machining using CNC milling machine." In 2018 International Conference on Advance of Sustainable Engineering and its Application (ICASEA). IEEE, 2018. http://dx.doi.org/10.1109/icasea.2018.8370990.

10

Fowers, Spencer G., Dah-Jye Lee, Beau J. Tippetts, Kirt D. Lillywhite, Aaron W. Dennis, and James K. Archibald. "Vision Aided Stabilization and the Development of a Quad-Rotor Micro UAV." In 2007 International Symposium on Computational Intelligence in Robotics and Automation. IEEE, 2007. http://dx.doi.org/10.1109/cira.2007.382886.
