Journal articles on the topic 'Camera guidance for robot'

Consult the top 50 journal articles for your research on the topic 'Camera guidance for robot.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Sai, Hesin, and Yoshikuni Okawa. "Structured Sign for Guidance of Mobile Robot." Journal of Robotics and Mechatronics 3, no. 5 (1991): 379–86. http://dx.doi.org/10.20965/jrm.1991.p0379.

Abstract:
As part of a guidance system for mobile robots operating on a wide, flat floor, such as an ordinary factory or a gymnasium, we have proposed a special-purpose sign. It consists of a cylinder with four slits and a fluorescent light placed on the axis of the cylinder. Two of the slits are parallel to each other, and the other two are angled. A robot obtains an image of the sign with a TV camera. After thresholding, we have four bright sets of pixels which correspond to the four slits of the cylinder. By measuring the relative distances between the four points, we compute the dista…
2

Yang, Long, and Nan Feng Xiao. "Robot Stereo Vision Guidance System Based on Attention Mechanism." Applied Mechanics and Materials 385-386 (August 2013): 708–11. http://dx.doi.org/10.4028/www.scientific.net/amm.385-386.708.

Abstract:
An attention mechanism is added to a traditional robot stereo vision system, so that a likely workpiece position is obtained quickly from a saliency image, greatly accelerating the computation. First, stereo camera calibration is performed to obtain the camera intrinsic and extrinsic matrices. These parameter matrices are then used to rectify the newly captured images; a disparity map is computed using the OpenCV library, while the saliency image is computed with the Itti algorithm. The workpiece's spatial pose in the left-camera coordinates is obtained via the triangulation measurement principle. After a series of coordinates…
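For background on the "triangulation measurement principle" mentioned in this abstract: for a rectified stereo pair, the depth of a matched pixel is Z = f·B/d (focal length × baseline ÷ disparity), and the lateral coordinates follow from the pinhole model. A minimal illustrative sketch with hypothetical calibration values, not code from the paper:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point from a rectified stereo pair (pinhole model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def triangulate(focal_px, cx, cy, baseline_m, x_left, y_left, disparity_px):
    """3D point in the left-camera frame from a matched pixel and its disparity."""
    z = depth_from_disparity(focal_px, baseline_m, disparity_px)
    x = (x_left - cx) * z / focal_px  # back-project through the principal point
    y = (y_left - cy) * z / focal_px
    return (x, y, z)

# Hypothetical values: 700 px focal length, 12 cm baseline, 35 px disparity
print(triangulate(700.0, 320.0, 240.0, 0.12, 390.0, 240.0, 35.0))
# → approximately (0.24, 0.0, 2.4) metres
```

In practice the focal length, principal point (cx, cy), and baseline come from the stereo calibration step the abstract describes.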
3

Golkowski, Alexander Julian, Marcus Handte, Peter Roch, and Pedro J. Marrón. "An Experimental Analysis of the Effects of Different Hardware Setups on Stereo Camera Systems." International Journal of Semantic Computing 15, no. 03 (2021): 337–57. http://dx.doi.org/10.1142/s1793351x21400080.

Abstract:
For many application areas such as autonomous navigation, the ability to accurately perceive the environment is essential. For this purpose, a wide variety of well-researched sensor systems are available that can be used to detect obstacles or navigation targets. Stereo cameras have emerged as a very versatile sensing technology in this regard due to their low hardware cost and high fidelity. Consequently, much work has been done to integrate them into mobile robots. However, the existing literature focuses on presenting the concepts and algorithms used to implement the desired robot functions…
4

Blais, François, Marc Rioux, and Jacques Domey. "Compact three-dimensional camera for robot and vehicle guidance." Optics and Lasers in Engineering 10, no. 3-4 (1989): 227–39. http://dx.doi.org/10.1016/0143-8166(89)90039-0.

5

Imasato, Akimitsu, and Noriaki Maru. "Guidance and Control of Nursing Care Robot Using Gaze Point Detector and Linear Visual Servoing." International Journal of Automation Technology 5, no. 3 (2011): 452–57. http://dx.doi.org/10.20965/ijat.2011.p0452.

Abstract:
The gaze guidance and control we propose for a nursing robot uses a gaze point detector (GPD) and linear visual servoing (LVS). The robot captures stereo camera images, presents them to the user via a head-mounted display (HMD), calculates the user’s gaze point tracked by the camera, and moves toward it using LVS. Since, in the proposal, persons requiring nursing share the robot’s field of view via the GPD, the closer the robot gets to the target, the more accurate the control becomes. The GPD, worn on the user’s head, comprises an HMD and a CCD camera.
6

Yang, Chun Hui, and Fu Dong Wang. "Trajectory Recognition and Navigation Control in the Mobile Robot." Key Engineering Materials 464 (January 2011): 11–14. http://dx.doi.org/10.4028/www.scientific.net/kem.464.11.

Abstract:
Fast and accurate acquisition of navigation information is the key premise for robot guidance. In this paper, a robot trajectory guidance system composed of a camera, a digital signal controller, and a mobile platform driven by stepper motors is presented. First, the JPEG (Joint Photographic Experts Group) image taken by the camera is decoded into the corresponding pixel image. The image is then transformed into a binary image by binarization. A fast line-extraction algorithm based on the Column Elementary Line Segment method is presented. Furthermore, the trajectory direction deviation parameters and…
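The binarization step described in this abstract is a standard fixed-threshold operation on a grayscale image; a minimal pure-Python sketch (threshold value hypothetical, not from the paper):

```python
def binarize(gray, threshold=128):
    """Map a grayscale image (rows of 0-255 intensities) to a 0/1 binary image.

    Pixels at or above the threshold become 1 (foreground), the rest 0.
    """
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

# Tiny 2x3 example image
image = [[ 12, 200, 130],
         [255,  40, 127]]
print(binarize(image))  # → [[0, 1, 1], [1, 0, 0]]
```

Line-extraction algorithms such as the one the abstract names then operate on runs of foreground pixels in this binary image.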
7

Belmonte, Álvaro, José Ramón, Jorge Pomares, Gabriel Garcia, and Carlos Jara. "Optimal Image-Based Guidance of Mobile Manipulators using Direct Visual Servoing." Electronics 8, no. 4 (2019): 374. http://dx.doi.org/10.3390/electronics8040374.

Abstract:
This paper presents a direct image-based controller to perform the guidance of a mobile manipulator using image-based control. An eye-in-hand camera is employed to perform the guidance of a mobile differential platform with a seven degrees-of-freedom robot arm. The presented approach is based on an optimal control framework and it is employed to control mobile manipulators during the tracking of image trajectories taking into account robot dynamics. The direct approach allows us to take both the manipulator and base dynamics into account. The proposed image-based controllers consider the optim…
8

Achour, K., and A. O. Djekoune. "Localization and guidance with an embarked camera on a mobile robot." Advanced Robotics 16, no. 1 (2002): 87–102. http://dx.doi.org/10.1163/156855302317413754.

9

Bazeille, Stephane, Emmanuel Battesti, and David Filliat. "A Light Visual Mapping and Navigation Framework for Low-Cost Robots." Journal of Intelligent Systems 24, no. 4 (2015): 505–24. http://dx.doi.org/10.1515/jisys-2014-0116.

Abstract:
We address the problems of localization, mapping, and guidance for robots with limited computational resources by combining vision with the metrical information given by the robot odometry. We propose in this article a novel light and robust topometric simultaneous localization and mapping framework using appearance-based visual loop-closure detection enhanced with the odometry. The main advantage of this combination is that the odometry makes the loop-closure detection more accurate and reactive, while the loop-closure detection enables the long-term use of odometry for guidance by co…
10

Xue, Jin Lin, and Tony E. Grift. "Agricultural Robot Turning in the Headland of Corn Fields." Applied Mechanics and Materials 63-64 (June 2011): 780–84. http://dx.doi.org/10.4028/www.scientific.net/amm.63-64.780.

Abstract:
This article discusses the development of a variable field-of-view (FOV) camera to realize headland turning of an agricultural robot in corn fields. The variable FOV was implemented by changing the camera’s direction of view with two DC motors rotating separately in the vertical and horizontal planes. Headland turning is executed in six steps: end-of-row detection and guidance, going blind for a distance, first 90° turn, position calculation, backing control, and second 90° turn. Mathematical morphological operations were chosen to segment crops, and fuzzy logic control was applied to guid…
11

Villagran, Carlos R. Tercero, Seiichi Ikeda, Toshio Fukuda, et al. "Robot Manipulation and Guidance Using Magnetic Motion Capture Sensor and a Rule-Based Controller." Journal of Robotics and Mechatronics 20, no. 1 (2008): 151–58. http://dx.doi.org/10.20965/jrm.2008.p0151.

Abstract:
Magnetic motion capture sensors (MMCS) are not commonly used for robot control due to the need for complex, resource-consuming calibration to correct error introduced by the magnetic sensor. We propose avoiding such calibration using a rule-based controller that only uses spatial coordinates from the magnetic sensor. This controller uses a sparse look-up table of spatial coordinates and actions conducted by the robot and reacts to the presence of the sensor near reference points. The control method was applied to manipulate a robotic camera to track a catheter-shaped sensor inside vessels sili…
12

HAN, LONG, XINYU WU, YONGSHENG OU, YEN-LUN CHEN, CHUNJIE CHEN, and YANGSHENG XU. "HOUSEHOLD SERVICE ROBOT WITH CELLPHONE INTERFACE." International Journal of Information Acquisition 09, no. 02 (2013): 1350009. http://dx.doi.org/10.1142/s0219878913500095.

Abstract:
In this paper, an efficient and low-cost cellphone-commandable mobile manipulation system is described. Aimed at home and elderly care, this system can easily be commanded through a common cellphone network to grasp objects efficiently in a household environment, using several low-cost off-the-shelf devices. Unlike visual servoing technology that relies on a high-quality, high-cost vision system, a household service robot may not be able to afford such a system, so it is essential to use low-cost devices. However, it is extremely challenging to have the said vision…
13

Lei, Wentai, Mengdi Xu, Feifei Hou, et al. "Calibration Venus: An Interactive Camera Calibration Method Based on Search Algorithm and Pose Decomposition." Electronics 9, no. 12 (2020): 2170. http://dx.doi.org/10.3390/electronics9122170.

Abstract:
Cameras are widely used in many scenarios such as robot positioning and unmanned driving, in which camera calibration is a major task. The interactive camera calibration method based on a plane board is becoming popular due to its stability and handleability. However, most methods choose suggestions subjectively from a fixed pose dataset, which is error-prone and limited for different camera models. In addition, these methods do not provide clear guidelines on how to place the board in the specified pose. This paper proposes a new interactive calibration method, named ‘Calibrati…
14

Jinlin, Xue. "Guidance of an agricultural robot with variable angle-of-view camera arrangement in cornfield." African Journal of Agricultural Research 9, no. 18 (2014): 1378–85. http://dx.doi.org/10.5897/ajar2013.7670.

15

Sreenivasan, S. V., and K. J. Waldron. "A Drift-Free Navigation System for a Mobile Robot Operating on Unstructured Terrain." Journal of Mechanical Design 116, no. 3 (1994): 894–900. http://dx.doi.org/10.1115/1.2919466.

Abstract:
The orientation and the angular rates of the body of a robotic vehicle are required for the guidance and control of the vehicle. In the current robotic systems these quantities are obtained by the use of inertial sensing systems. Inertial sensing systems involve drift errors which can be significant even after the vehicle has traversed only short distances on the terrain. A different approach is suggested here which guarantees accurate, drift-free sensing of the angular position and rates of the vehicle body. A camera system consisting of two cameras in fixed relationship to one another is mad…
16

Ravankar, Abhijeet, Ankit Ravankar, Yukinori Kobayashi, and Takanori Emaru. "Intelligent Robot Guidance in Fixed External Camera Network for Navigation in Crowded and Narrow Passages." Proceedings 1, no. 2 (2016): 37. http://dx.doi.org/10.3390/ecsa-3-d008.

17

Ponnambalam, Vignesh Raja, Marianne Bakken, Richard J. D. Moore, Jon Glenn Omholt Gjevestad, and Pål Johan From. "Autonomous Crop Row Guidance Using Adaptive Multi-ROI in Strawberry Fields." Sensors 20, no. 18 (2020): 5249. http://dx.doi.org/10.3390/s20185249.

Abstract:
Automated robotic platforms are an important part of precision agriculture solutions for sustainable food production. Agri-robots require robust and accurate guidance systems in order to navigate between crops and to and from their base station. Onboard sensors such as machine vision cameras offer a flexible guidance alternative to more expensive solutions for structured environments such as scanning lidar or RTK-GNSS. The main challenges for visual crop row guidance are the dramatic differences in appearance of crops between farms and throughout the season and the variations in crop spacing a…
18

Kent, Ernest W., Thomas Wheatley, and Marilyn Nashman. "Real-time cooperative interaction between structured-light and reflectance ranging for robot guidance." Robotica 3, no. 1 (1985): 7–11. http://dx.doi.org/10.1017/s0263574700001417.

Abstract:
SUMMARY: When applied to rapidly moving objects with complex trajectories, the information-rate limitation imposed by video-camera frame rates impairs the effectiveness of structured-light techniques in real-time robot servoing. To improve the performance of such systems, the use of fast infra-red proximity detectors to augment visual guidance in the final phase of target acquisition was explored. It was found that this approach was limited by the necessity of employing a different range/intensity calibration curve for the proximity detectors for every object and for every angle of approach to c…
19

PRETLOVE, J. R. G., and G. A. PARKER. "THE SURREY ATTENTIVE ROBOT VISION SYSTEM." International Journal of Pattern Recognition and Artificial Intelligence 07, no. 01 (1993): 89–107. http://dx.doi.org/10.1142/s0218001493000066.

Abstract:
This paper presents the design and development of a real-time eye-in-hand stereo-vision system to aid robot guidance in a manufacturing environment. The stereo vision head comprises a novel camera arrangement with servo-vergence, focus, and aperture that continuously provides high-quality images to a dedicated image processing system and parallel processing array. The stereo head has four degrees of freedom but it relies on the robot end-effector for all remaining movement. This provides the robot with exploratory sensing abilities allowing it to undertake a wider variety of less constrained t…
20

Hamdy ElGohary, Sherif, Yomna Sabah Mohamed, Mennatallah Hany Elkhodary, Omnya Ahmed, and Mennatallah Hesham. "Photodynamic Therapy Using Endoscopy Capsule Robot." Academic Journal of Life Sciences, no. 67 (September 10, 2020): 93–100. http://dx.doi.org/10.32861/ajls.67.93.100.

Abstract:
Among the photosensitizers used in the photodynamic therapy (PDT) technique for cancer treatment, methylene blue and glycoconjugated chlorin have been found to be the best for this purpose. In this paper, it is suggested to use an active wireless capsule endoscopy robot instead of the traditional endoscope. The capsule has many valuable features. It uses LEDs as the light source in PDT to kill colon cancer cells. Thus, the doctor can take advantage of applying the LED light locally at the tumor, which was previously injected with the photosensitizer; the light activates t…
21

Li, Hang, Andrey V. Savkin, and Branka Vucetic. "Autonomous Area Exploration and Mapping in Underground Mine Environments by Unmanned Aerial Vehicles." Robotica 38, no. 3 (2019): 442–56. http://dx.doi.org/10.1017/s0263574719000754.

Abstract:
Summary: In this paper, we propose a method of using an autonomous flying robot to explore an underground tunnel environment and build a 3D map. The robot model we use is an extension of a 2D non-holonomic robot. The measurements and sensors we considered in the presented method are simple and valid in practical unmanned aerial vehicle (UAV) engineering. The proposed safe exploration algorithm belongs to a class of probabilistic area search, and with a mathematical proof, the performance of the algorithm is analysed. Based on the algorithm, we also propose a sliding control law to apply the algo…
22

Bayro-Corrochano, Eduardo. "Editorial." Robotica 26, no. 4 (2008): 415–16. http://dx.doi.org/10.1017/s0263574708004785.

Abstract:
Robotic sensing is a relatively new field of activity compared with the design and control of robot mechanisms. In both areas the role of geometry is natural and necessary for the development of devices, their control and use in challenging environments. At the very beginning odometry, tactile and touch sensors dominated robot sensing. More recently, due to the fall in the price of laser devices, they have become more attractive to the community. On the other hand, progress in photogrammetry, particularly during the nineties as the n-view geometry in projective geometry matured, boot-strapped t…
23

Rivas-Blanco, Irene, Carlos Perez-del-Pulgar, Carmen López-Casado, Enrique Bauzano, and Víctor Muñoz. "Transferring Know-How for an Autonomous Camera Robotic Assistant." Electronics 8, no. 2 (2019): 224. http://dx.doi.org/10.3390/electronics8020224.

Abstract:
Robotic platforms are taking their place in the operating room because they provide more stability and accuracy during surgery. Although most of these platforms are teleoperated, a lot of research is currently being carried out to design collaborative platforms. The objective is to reduce the surgeon workload through the automation of secondary or auxiliary tasks, which would benefit both surgeons and patients by facilitating the surgery and reducing the operation time. One of the most important secondary tasks is the endoscopic camera guidance, whose automation would allow the surgeon to be c…
24

Ferland, François, Aurélien Reveleau, Francis Leconte, Dominic Létourneau, and François Michaud. "Coordination mechanism for integrated design of Human-Robot Interaction scenarios." Paladyn, Journal of Behavioral Robotics 8, no. 1 (2017): 100–111. http://dx.doi.org/10.1515/pjbr-2017-0006.

Abstract:
The ultimate long-term goal in Human-Robot Interaction (HRI) is to design robots that can act as a natural extension to humans. This requires the design of robot control architectures to provide structure for the integration of the necessary components into HRI. This paper describes how HBBA, a Hybrid Behavior-Based Architecture, can be used as a unifying framework for integrated design of HRI scenarios. More specifically, we focus here on HBBA’s generic coordination mechanism of behavior-producing modules, which allows addressing a wide range of cognitive capabilities ranging from as…
25

Choi, Keun Ha, Sang Kwon Han, Kwang-Ho Park, Kyung-Soo Kim, and Soohyun Kim. "Guidance Line Extraction Algorithm using Central Region Data of Crop for Vision Camera based Autonomous Robot in Paddy Field." Journal of Korea Robotics Society 11, no. 1 (2016): 1–8. http://dx.doi.org/10.7746/jkros.2016.11.1.001.

26

Vaida, C., D. Pisla, and N. Plitea. "Graphical simulation of a new concept of low sized surgical parallel robot for camera guidance in minimally invasive surgery." PAMM 7, no. 1 (2007): 2090005–6. http://dx.doi.org/10.1002/pamm.200700132.

27

Li, Xingdong, Hewei Gao, Fusheng Zha, et al. "Learning the Cost Function for Foothold Selection in a Quadruped Robot." Sensors 19, no. 6 (2019): 1292. http://dx.doi.org/10.3390/s19061292.

Abstract:
This paper is focused on designing a cost function of selecting a foothold for a physical quadruped robot walking on rough terrain. The quadruped robot is modeled with Denavit–Hartenberg (DH) parameters, and then a default foothold is defined based on the model. Time of Flight (TOF) camera is used to perceive terrain information and construct a 2.5D elevation map, on which the terrain features are detected. The cost function is defined as the weighted sum of several elements including terrain features and some features on the relative pose between the default foothold and other candidates. It…
28

Bárdosi, Zoltán, Christian Plattner, Yusuf Özbek, et al. "CIGuide: in situ augmented reality laser guidance." International Journal of Computer Assisted Radiology and Surgery 15, no. 1 (2019): 49–57. http://dx.doi.org/10.1007/s11548-019-02066-1.

Abstract:
Purpose: A robotic intraoperative laser guidance system with hybrid optic-magnetic tracking for skull base surgery is presented. It provides in situ augmented reality guidance for microscopic interventions at the lateral skull base with minimal mental and workload overhead on surgeons working without a monitor and dedicated pointing tools. Methods: Three components were developed: a registration tool (Rhinospider), a hybrid magneto-optic-tracked robotic feedback control scheme and a modified robotic end-effector. Rhinospider optimizes registration of patient and preoperative CT data by…
29

Liu, Yi, Ming Cong, Hang Dong, and Dong Liu. "Human skill integrated motion planning of assembly manipulation for 6R industrial robot." Industrial Robot: the international journal of robotics research and application 46, no. 1 (2019): 171–80. http://dx.doi.org/10.1108/ir-09-2018-0189.

Abstract:
Purpose: The purpose of this paper is to propose a new method based on three-dimensional (3D) vision technologies and human-skill-integrated deep learning to solve assembly positioning tasks such as peg-in-hole. Design/methodology/approach: A hybrid camera configuration was used to provide global and local views. Eye-in-hand mode guided the peg to be in contact with the hole plate using 3D vision in the global view. When the peg was in contact with the workpiece surface, eye-to-hand mode provided the local view to accomplish peg-hole positioning based on a trained CNN. Findings: The results of assembl…
30

Zhang, Le, Rui Li, Zhiqiang Li, et al. "A Quadratic Traversal Algorithm of Shortest Weeding Path Planning for Agricultural Mobile Robots in Cornfield." Journal of Robotics 2021 (February 19, 2021): 1–19. http://dx.doi.org/10.1155/2021/6633139.

Abstract:
In order to improve weeding efficiency and protect farm crops, accurate and fast weed-removal guidance for agricultural mobile robots is a highly important topic. Based on this motivation, we propose a time-efficient quadratic traversal algorithm for the removal guidance of weeds around the recognized corn in the field. To recognize the weeds and corn, a Faster R-CNN neural network is implemented for real-time recognition. Then, an ultra-green characterization (EXG) hyperparameter is used for grayscale image processing. An improved OTSU (IOTSU) algorithm is proposed to accurately generate…
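The EXG referred to in this abstract is commonly the excess-green vegetation index, computed per pixel as 2G − R − B (sometimes on normalized channels) to make plant material stand out before thresholding. A minimal illustrative sketch with made-up pixel values, not code from the paper:

```python
def excess_green(r, g, b):
    """Excess-green (ExG) index for one RGB pixel: 2G - R - B.

    Green vegetation yields large positive values; soil and background
    tend toward zero or negative values.
    """
    return 2 * g - r - b

# A green-leaf-like pixel vs. a gray soil-like pixel (illustrative values)
print(excess_green(60, 180, 40))    # → 260 (vegetation)
print(excess_green(100, 100, 100))  # → 0 (neutral gray)
```

A threshold on this grayscale index (e.g. via the OTSU-style method the abstract mentions) then separates plants from background.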
31

Fernández, Ignacio, Manuel Mazo, José L. Lázaro, et al. "Guidance of a mobile robot using an array of static cameras located in the environment." Autonomous Robots 23, no. 4 (2007): 305–24. http://dx.doi.org/10.1007/s10514-007-9049-4.

32

Golparvar, Ata Jedari, and Murat Kaya Yapici. "Toward graphene textiles in wearable eye tracking systems for human–machine interaction." Beilstein Journal of Nanotechnology 12 (February 11, 2021): 180–89. http://dx.doi.org/10.3762/bjnano.12.14.

Abstract:
The study of eye movements and the measurement of the resulting biopotential, referred to as electrooculography (EOG), may find increasing use in applications within the domain of activity recognition, context awareness, mobile human–computer and human–machine interaction (HCI/HMI), and personal medical devices; provided that, seamless sensing of eye activity and processing thereof is achieved by a truly wearable, low-cost, and accessible technology. The present study demonstrates an alternative to the bulky and expensive camera-based eye tracking systems and reports the development of a graph…
33

Wang, Xiaoguang, Yunbo Hu, and Qi Lin. "Workspace analysis and verification of cable-driven parallel mechanism for wind tunnel test." Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering 231, no. 6 (2016): 1012–21. http://dx.doi.org/10.1177/0954410016646601.

Abstract:
Cable-driven parallel mechanism is a special kind of parallel robot in which traditional rigid links are replaced by actuated cables. This provides a new suspension method for wind tunnel test, in which an aircraft model is driven by a number of parallel cables to fulfil 6-DOF motion. The workspace of such a cable robot is limited due to the geometrical and unilateral force constraints, the investigation of which is important for applications requiring large flight space. This paper focuses on the workspace analysis and verification of a redundant constraint 6-DOF cable-driven parallel suspens…
34

Zhang, Jincheng, Prashant Ganesh, Kyle Volle, Andrew Willis, and Kevin Brink. "Low-Bandwidth and Compute-Bound RGB-D Planar Semantic SLAM." Sensors 21, no. 16 (2021): 5400. http://dx.doi.org/10.3390/s21165400.

Abstract:
Visual simultaneous location and mapping (SLAM) using RGB-D cameras has been a necessary capability for intelligent mobile robots. However, when using point-cloud map representations as most RGB-D SLAM systems do, limitations in onboard compute resources, and especially communication bandwidth can significantly limit the quantity of data processed and shared. This article proposes techniques that help address these challenges by mapping point clouds to parametric models in order to reduce computation and bandwidth load on agents. This contribution is coupled with a convolutional neural network…
35

Nebylov, A. V., V. V. Perliouk, and T. S. Leontieva. "Investigation of the technology of mutual navigation and orientation of small space vehicles flying in formation." VESTNIK of Samara University. Aerospace and Mechanical Engineering 18, no. 1 (2019): 88–93. http://dx.doi.org/10.18287/2541-7533-2019-18-1-88-93.

Abstract:
The paper presents the problem of ensuring support of the flight of a group of small spacecraft (microsatellites) taking into account the small mutual distances between them. The purpose of using the orbital constellation specified is to create a radio communication system to control remote objects like unmanned aerial vehicles and ground robots located in hard-to-reach areas of the Earth from the Central ground station. To reduce the cost of microsatellite design, it was decided to rigidly fix the receiving and transmitting antennas on their housings and use the spatial orientation of the ent…
36

Hunter, Mark, Kevin Kremer, and Kristen Wymore. "A first report of robotic-assisted sentinel lymph node mapping and inguinal lymph node dissection, using near-infared fluorescence, in vulvar cancer." Journal of Clinical Oncology 35, no. 15_suppl (2017): e17027-e17027. http://dx.doi.org/10.1200/jco.2017.35.15_suppl.e17027.

Abstract:
Background: Vulvar carcinoma is a rare gynecologic malignancy which has seen a considerable evolution in surgical techniques over the last several decades. However, the morbidity associated with inguinal lymph node dissections remains significant. The majority of patients undergoing full lymphadenectomy will have some complication, with wound breakdown being the most common. In males, robotic inguinal lymph node dissection has been described for penile cancer. This report represents a first use of near-infrared fluorescence for sentinel inguinal lymph node mapping, and the first descript…
37

Ito, Minoru. "Robot vision modelling-camera modelling and camera calibration." Advanced Robotics 5, no. 3 (1990): 321–35. http://dx.doi.org/10.1163/156855391x00232.

38

Tobita, Kazuteru, Katsuyuki Sagayama, Mayuko Mori, et al. "Guidance Robot LIGHBOT and Robot Town Sagami." Journal of the Robotics Society of Japan 38, no. 7 (2020): 604–10. http://dx.doi.org/10.7210/jrsj.38.604.

39

Wallster, Ola. "Optimaster robot guidance system." Sensor Review 15, no. 2 (1995): 23–26. http://dx.doi.org/10.1108/eum0000000004263.

40

TAKAHASHI, Hironobu, and Fumiaki TOMITA. "Camera Calibration for Robot Vision." Journal of the Robotics Society of Japan 10, no. 2 (1992): 177–84. http://dx.doi.org/10.7210/jrsj.10.177.

41

Lubis, Abdul Jabbar, Yuyun Dwi Lestari, Haida Dafitri, and Azanuddin. "ROBOT TRACER WITH VISUAL CAMERA." Journal of Physics: Conference Series 930 (December 2017): 012017. http://dx.doi.org/10.1088/1742-6596/930/1/012017.

42

Webb, P., I. Gibson, and C. Wykes. "Robot guidance using ultrasonic arrays." Journal of Robotic Systems 11, no. 8 (1994): 681–92. http://dx.doi.org/10.1002/rob.4620110802.

43

Shyi, C. N., J. Y. Lee, and C. H. Chen. "Robot guidance using standard mark." Electronics Letters 24, no. 21 (1988): 1326. http://dx.doi.org/10.1049/el:19880901.

44

Deveza, Reimundo, David Thiel, Andrew Russell, and Alan Mackay-Sim. "Odor Sensing for Robot Guidance." International Journal of Robotics Research 13, no. 3 (1994): 232–39. http://dx.doi.org/10.1177/027836499401300305.

45

Kumar, Arun V. Rejus, and A. Sagai Francis Britto. "Robot Controlled Six Degree Freedom Camera." International Journal of Psychosocial Rehabilitation 23, no. 4 (2019): 243–53. http://dx.doi.org/10.37200/ijpr/v23i4/pr190183.

46

CHRISTENSEN, HENRIK I. "A LOW-COST ROBOT CAMERA HEAD." International Journal of Pattern Recognition and Artificial Intelligence 07, no. 01 (1993): 69–87. http://dx.doi.org/10.1142/s0218001493000054.

Abstract:
Active vision involving the exploitation of controllable cameras and camera heads is an area which has received increased attention over the last few years. At LIA/AUC a binocular robot camera head has been constructed for use in geometric modelling and interpretation. In this manuscript the basic design of the head is outlined and a first prototype is described in some detail. Detailed specifications for the components used are provided together with a section on lessons learned from construction and initial use of this prototype.
47

Pour Yousefian Barfeh, D., and E. Ramos. "Color Detection in Autonomous Robot-Camera." Journal of Physics: Conference Series 1169 (February 2019): 012048. http://dx.doi.org/10.1088/1742-6596/1169/1/012048.

48

Zhuang, Hanqi, Zvi S. Roth, and Kuanchih Wang. "Robot calibration by mobile camera systems." Journal of Robotic Systems 11, no. 3 (1994): 155–67. http://dx.doi.org/10.1002/rob.4620110303.

49

Wang, Long, Ying Sheng, Yuan Ying Qiu, and Guang Da Chen. "Design and Implementation of the Control Device for a Cable-Driven Camera Robot." Applied Mechanics and Materials 373-375 (August 2013): 225–30. http://dx.doi.org/10.4028/www.scientific.net/amm.373-375.225.

Abstract:
Due to the advantages of strong bearing capacity, compact structure and large working space, cable-driven camera robots are being used increasingly. This paper focuses on the design and implementation of the control device for a four-cable-driven camera robot. First, a kinematics model is built and the variations of the cable lengths along with the camera robot's motion are analyzed; then, the console and actuator of the camera robot are designed; finally, PC software transferring command codes from the camera robot console to the actuator and displaying the camera robot's motion state is devel…
50

Herpe, G., M. Chevreton, J. C. Robin, P. Prat, B. Servan, and F. Gex. "An intensified C.C.D. camera for telescope guidance." Experimental Astronomy 2, no. 3 (1991): 163–77. http://dx.doi.org/10.1007/bf00566684.
