Academic literature on the topic 'Vision based sensors'


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Vision based sensors.'


Journal articles on the topic "Vision based sensors"

1

Hou, Lei, Shingo Kagami, and Koichi Hashimoto. "2P2-B24 Illumination-based Synchronization of High-Speed Vision Sensors." Proceedings of the JSME Annual Conference on Robotics and Mechatronics (Robomec) 2008 (2008): _2P2-B24_1–_2P2-B24_4. http://dx.doi.org/10.1299/jsmermd.2008._2p2-b24_1.

2

Casciati, Fabio, Sara Casciati, and Li Jun Wu. "Vision-Based Sensing in Dynamic Tests." Key Engineering Materials 569-570 (July 2013): 767–74. http://dx.doi.org/10.4028/www.scientific.net/kem.569-570.767.

Abstract:
The availability of a suitable data acquisition sensor network is a key implementation issue to link models with real world structures. Non-contact displacement sensors should be preferred since they do not change the system properties. A two-dimensional vision-based displacement measurement sensor is the focus of this contribution. In particular, the perspective distortion introduced by the angle between the optic axis of the camera and the normal to the plane in which the structural system deforms is considered. A two-dimensional affine transformation is utilized to eliminate the distortion
3

Kim, H., K. Choi, and I. Lee. "IMPROVING CAR NAVIGATION WITH A VISION-BASED SYSTEM." ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences II-3/W5 (August 20, 2015): 459–65. http://dx.doi.org/10.5194/isprsannals-ii-3-w5-459-2015.

Abstract:
The real-time acquisition of the accurate positions is very important for the proper operations of driver assistance systems or autonomous vehicles. Since the current systems mostly depend on a GPS and map-matching technique, they show poor and unreliable performance in blockage and weak areas of GPS signals. In this study, we propose a vision oriented car navigation method based on sensor fusion with a GPS and in-vehicle sensors. We employed a single photo resection process to derive the position and attitude of the camera and thus those of the car. These image georeferencing results are combi
4

Hasegawa, Hiroaki, Yosuke Suzuki, Aiguo Ming, Masatoshi Ishikawa, and Makoto Shimojo. "Robot Hand Whose Fingertip Covered with Net-Shape Proximity Sensor - Moving Object Tracking Using Proximity Sensing -." Journal of Robotics and Mechatronics 23, no. 3 (2011): 328–37. http://dx.doi.org/10.20965/jrm.2011.p0328.

Abstract:
Occlusion in several millimeters from an object to be grasped made it difficult for a vision-sensor-based approach to detect relative positioning between this object and robot fingers joint grasping. The proximity sensor we proposed detects the object at a near range very effectively. We developed a thin proximity sensor sheet to cover the 3 fingers of a robot hand. Integrating sensors and hand control, we implemented an object-tracking controller. Using proximity sensory signals, the controller coordinates wrist positioning based on palm proximity sensors and grasping from fingertip sensors, e
5

Song, Cimoo, and Meenam Shinn. "Commercial vision of silicon-based inertial sensors." Sensors and Actuators A: Physical 66, no. 1-3 (1998): 231–36. http://dx.doi.org/10.1016/s0924-4247(98)00048-x.

6

Feng, Yang, Hengyi Lv, Hailong Liu, Yisa Zhang, Yuyao Xiao, and Chengshan Han. "Event Density Based Denoising Method for Dynamic Vision Sensor." Applied Sciences 10, no. 6 (2020): 2024. http://dx.doi.org/10.3390/app10062024.

Abstract:
Dynamic vision sensor (DVS) is a new type of image sensor, which has application prospects in the fields of automobiles and robots. Dynamic vision sensors are very different from traditional image sensors in terms of pixel principle and output data. Background activity (BA) in the data will affect image quality, but there is currently no unified indicator to evaluate the image quality of event streams. This paper proposes a method to eliminate background activity, and proposes a method and performance index for evaluating filter performance: noise in real (NIR) and real in noise (RIN). The low
7

Hou, Lei, Shingo Kagami, and Koichi Hashimoto. "Performance Evaluation of Illumination-based Synchronization of High-Speed Vision Sensors in Dynamic Scenes." Abstracts of the international conference on advanced mechatronics : toward evolutionary fusion of IT and mechatronics : ICAM 2010.5 (2010): 573–78. http://dx.doi.org/10.1299/jsmeicam.2010.5.573.

8

Mueggler, Elias, Henri Rebecq, Guillermo Gallego, Tobi Delbruck, and Davide Scaramuzza. "The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM." International Journal of Robotics Research 36, no. 2 (2017): 142–49. http://dx.doi.org/10.1177/0278364917691115.

Abstract:
New vision sensors, such as the dynamic and active-pixel vision sensor (DAVIS), incorporate a conventional global-shutter camera and an event-based sensor in the same pixel array. These sensors have great potential for high-speed robotics and computer vision because they allow us to combine the benefits of conventional cameras with those of event-based sensors: low latency, high temporal resolution, and very high dynamic range. However, new algorithms are required to exploit the sensor characteristics and cope with its unconventional output, which consists of a stream of asynchronous brightnes
9

Wan, Jixiang, Ming Xia, Zunkai Huang, et al. "Event-Based Pedestrian Detection Using Dynamic Vision Sensors." Electronics 10, no. 8 (2021): 888. http://dx.doi.org/10.3390/electronics10080888.

Abstract:
Pedestrian detection has attracted great research attention in video surveillance, traffic statistics, and especially in autonomous driving. To date, almost all pedestrian detection solutions are derived from conventional framed-based image sensors with limited reaction speed and high data redundancy. Dynamic vision sensor (DVS), which is inspired by biological retinas, efficiently captures the visual information with sparse, asynchronous events rather than dense, synchronous frames. It can eliminate redundant data transmission and avoid motion blur or data leakage in high-speed imaging applic
10

Kolar, Prasanna, Patrick Benavidez, and Mo Jamshidi. "Survey of Datafusion Techniques for Laser and Vision Based Sensor Integration for Autonomous Navigation." Sensors 20, no. 8 (2020): 2180. http://dx.doi.org/10.3390/s20082180.

Abstract:
This paper focuses on data fusion, which is fundamental to one of the most important modules in any autonomous system: perception. Over the past decade, there has been a surge in the usage of smart/autonomous mobility systems. Such systems can be used in various areas of life like safe mobility for the disabled, senior citizens, and so on and are dependent on accurate sensor information in order to function optimally. This information may be from a single sensor or a suite of sensors with the same or different modalities. We review various types of sensors, their data, and the need for fusion

Dissertations / Theses on the topic "Vision based sensors"

1

Sharkasi, Adam Tawfik. "Stereo Vision Based Aerial Mapping Using GPS and Inertial Sensors." Thesis, Virginia Tech, 2008. http://hdl.handle.net/10919/32263.

Abstract:
The robotics field has grown in recent years to a point where unmanned systems are no longer limited by their capabilities. As such, the mission profiles for unmanned systems are becoming more and more complicated, and a demand has risen for the deployment of unmanned systems into the most complex of environments. Additionally, the objectives for unmanned systems are once more complicated by the necessity for beyond line of sight teleoperation, and in some cases complete vehicle autonomy. Such systems require adequate sensory devices for appropriate situational awareness. Additionally, a larg
2

Bowers, Roshawn Elizabeth. "Estimation algorithm for autonomous aerial refueling using a vision based relative navigation system." Texas A&M University, 2005. http://hdl.handle.net/1969.1/2700.

Abstract:
A new impetus to develop autonomous aerial refueling has arisen out of the growing demand to expand the capabilities of unmanned aerial vehicles (UAVs). With autonomous aerial refueling, UAVs can retain the advantages of being small, inexpensive, and expendable, while offering superior range and loiter-time capabilities. VisNav, a vision based sensor, offers the accuracy and reliability needed in order to provide relative navigation information for autonomous probe and drogue aerial refueling for UAVs. This thesis develops a Kalman filter to be used in combination with the VisNav sensor to imp
3

Yu, Honglu. "Development of vision-based inferential sensors for process monitoring and control." McMaster University, 2003.

4

Rubtsov, Vasily. "A high speed scanning system for vision based navigation/control of mobile robots." Thesis, De Montfort University, 1998. http://hdl.handle.net/2086/10684.

Abstract:
One of the main problems in the design of mobile robots is the development of smart integrated information systems. These systems may include different types of sensors. Usually a CCD vision system is a necessary part of these systems. This thesis considers the design of a fast mechanical scanning system for a CCD vision system and the synthesis of the optimal control for this system. The mathematical model of the transport subsystem for mobile robots subjected to an external disturbance is created. The correctness of this model is proved on the basis of simulation and experimental res
5

Haessig, Germain. "Neuromorphic computation using event-based sensors : from algorithms to hardware implementations." Thesis, Sorbonne université, 2018. http://www.theses.fr/2018SORUS422/document.

Abstract:
This thesis concerns the implementation of event-based algorithms, using, at first, data from an artificial retina that mimics the functioning of the human retina, and later extending to event-based signals of all types. These event-based signals stem from a paradigm shift in signal representation, offering a wide dynamic operating range, high temporal resolution, and native signal compression. In particular, the thesis studies the realization of a device for producing monocular depth maps
6

Cheng, Yongqiang. "Wireless mosaic eyes based robot path planning and control : autonomous robot navigation using environment intelligence with distributed vision sensors." Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/4421.

Abstract:
As an attempt to steer away from developing an autonomous robot with complex centralised intelligence, this thesis proposes an intelligent environment infrastructure where intelligences are distributed in the environment through collaborative vision sensors mounted in a physical architecture, forming a wireless sensor network, to enable the navigation of unintelligent robots within that physical architecture. The aim is to avoid the bottleneck of centralised robot intelligence that hinders the application and exploitation of autonomous robot. A bio-mimetic snake algorithm is proposed to coordi
7

Dippold, Amanda. "Vision-Based Obstacle Avoidance for Multiple Vehicles Performing Time-Critical Missions." Diss., Virginia Tech, 2009. http://hdl.handle.net/10919/27830.

Abstract:
This dissertation discusses vision-based static obstacle avoidance for a fleet of nonholonomic robots tasked to arrive at a final destination simultaneously. Path generation for each vehicle is computed using a single polynomial function that incorporates the vehicle constraints on velocity and acceleration and satisfies boundary conditions by construction. Furthermore, the arrival criterion and a preliminary obstacle avoidance scheme is incorporated into the path generation. Each robot is equipped with an inertial measurement unit that provides measurements of the vehicle's position and vel
8

Isoz, Wilhelm. "Calibration of Multispectral Sensors." Thesis, Linköping University, Department of Electrical Engineering, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-5202.

Abstract:
This thesis describes and evaluates a number of approaches and algorithms for nonuniform correction (NUC) and suppression of fixed pattern noise in an image sequence. The main task for this thesis work was to create a general NUC for infrared focal plane arrays. To create a radiometrically correct NUC, reference-based methods using polynomial approximation are used instead of the more common scene-based methods, which create a cosmetic NUC. The pixels that cannot be adjusted to give a correct value for the incoming radiation are defined as dead. Four separate methods of identifying d
9

Everding, Lukas. "Event-based Depth Reconstruction Using Stereo Dynamic Vision Sensors." München: Universitätsbibliothek der TU München, 2018. http://d-nb.info/1178672212/34.

10

Zhou, Dingfu. "Vision-based moving pedestrian recognition from imprecise and uncertain data." Thesis, Compiègne, 2014. http://www.theses.fr/2014COMP2162/document.

Abstract:
Implementing vision-based advanced driver assistance systems (ADAS) is a complex and difficult task, especially with respect to robustness under real-world operating conditions. One function of ADAS is to perceive and understand the environment of the ego vehicle and to provide the driver with the assistance needed to react to emergency situations. In this thesis, we focus on the detection and recognition of moving objects, because their dynamics make them more unpredictable and therefore more dangerous. The detection of these objects, the estimation of their

Books on the topic "Vision based sensors"

1

Bagley, Martha. Community based confident living programs for at-risk elderly with sensory losses. Helen Keller National Center for Youths and Adults with Deaf-Blindness, 1994.

2

López, Antonio M., and David Gerónimo. Vision-based Pedestrian Protection Systems for Intelligent Vehicles. Springer, 2013.

3

Dodd, C. H. Computer Vision and Sensor-Based Robots. Springer, 2011.

4

Tang, Yuan-Liang, Sadashiva Devadiga, and United States National Aeronautics and Space Administration, eds. A model-based approach for detection of objects in low resolution passive millimeter wave images: An interim report for NASA grant NAG-1-1371, "analysis of image sequences from sensors for restricted visibility operations", for the period January 24, 1992 to January 23, 1993. Dept. of Electrical and Computer Engineering, Pennsylvania State University, 1993.

5

United States National Aeronautics and Space Administration, ed. A model-based approach for detection of runways and other objects in image sequences acquired using an on-board camera: Final technical report for NASA grant NAG-1-1371, "analysis of image sequences from sensors for restricted visibility operations", period of the grant January 24, 1992 to May 31, 1994. National Aeronautics and Space Administration, 1994.

Book chapters on the topic "Vision based sensors"

1

Aranda, Miguel, Gonzalo López-Nicolás, and Carlos Sagüés. "Vision-Based Control for Nonholonomic Vehicles." In Control of Multiple Robots Using Vision Sensors. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-57828-6_3.

2

Schaub, Alexander. "Vision-Based Reactive Controllers." In Robust Perception from Optical Sensors for Reactive Behaviors in Autonomous Robotic Vehicles. Springer Fachmedien Wiesbaden, 2017. http://dx.doi.org/10.1007/978-3-658-19087-3_3.

3

Ramana, Lovedeep, Wooram Choi, and Young-Jin Cha. "Automated Vision-Based Loosened Bolt Detection Using the Cascade Detector." In Sensors and Instrumentation, Volume 5. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-54987-3_4.

4

Płaczek, Bartłomiej. "A Real Time Vehicle Detection Algorithm for Vision-Based Sensors." In Computer Vision and Graphics. Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15907-7_26.

5

Aranda, Miguel, Gonzalo López-Nicolás, and Carlos Sagüés. "Angle-Based Navigation Using the 1D Trifocal Tensor." In Control of Multiple Robots Using Vision Sensors. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-57828-6_2.

6

Zhang, Xue-bo, Cong-yuan Wang, Yong-chun Fang, and Ke-xin Xing. "An Extended Kalman Filter-Based Robot Pose Estimation Approach with Vision and Odometry." In Wearable Sensors and Robots. Springer Singapore, 2016. http://dx.doi.org/10.1007/978-981-10-2404-7_41.

7

Kim, Jungho, Youngbae Hwang, and In So Kweon. "Key-Frame SLAM Based on Motion Estimation and Stochastic Filtering Using Stereo Vision." In Smart Sensors and Systems. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-42234-9_2.

8

Gruber, Sheldon. "Neural Network Based Inspection of Machined Surfaces Using Multiple Sensors." In Multisensor Fusion for Computer Vision. Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/978-3-662-02957-2_24.

9

Coughlan, James, Roberto Manduchi, and Huiying Shen. "Computer Vision-Based Terrain Sensors for Blind Wheelchair Users." In Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11788713_186.

10

Guo, Jing, Cui-lian Zhao, Yu Li, Lin-hui Luo, and Kun-feng Zhang. "The Design of E Glove Hand Function Evaluation Device Based on Fusion of Vision and Touch." In Wearable Sensors and Robots. Springer Singapore, 2016. http://dx.doi.org/10.1007/978-981-10-2404-7_1.


Conference papers on the topic "Vision based sensors"

1

Yasin, Jawad N., Sherif A. S. Mohamed, Mohammad-hashem Haghbayan, et al. "Night vision obstacle detection and avoidance based on Bio-Inspired Vision Sensors." In 2020 IEEE SENSORS. IEEE, 2020. http://dx.doi.org/10.1109/sensors47125.2020.9278914.

2

Olumodeji, Olufemi A., Alessandro Paolo Bramanti, and Massimo Gottardi. "Memristor-based pixel for event-detection vision sensor." In 2015 IEEE Sensors. IEEE, 2015. http://dx.doi.org/10.1109/icsens.2015.7370688.

3

Henderson, Thomas C., and Chuck Hansen. "CAGD-Based Computer Vision." In 1988 Technical Symposium on Optics, Electro-Optics, and Sensors, edited by Richard D. Juday. SPIE, 1988. http://dx.doi.org/10.1117/12.976620.

4

Douillard, Bertrand, Alex Brooks, and Fabio Ramos. "A 3D laser and vision based classifier." In 2009 International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP). IEEE, 2009. http://dx.doi.org/10.1109/issnip.2009.5416828.

5

Delbruck, Tobi, Bernabe Linares-Barranco, Eugenio Culurciello, and Christoph Posch. "Activity-driven, event-based vision sensors." In 2010 IEEE International Symposium on Circuits and Systems - ISCAS 2010. IEEE, 2010. http://dx.doi.org/10.1109/iscas.2010.5537149.

6

Mkhitaryan, Artashes, and Darius Burschka. "Vision based haptic multisensor for manipulation of soft, fragile objects." In 2012 IEEE Sensors. IEEE, 2012. http://dx.doi.org/10.1109/icsens.2012.6411035.

7

Zhang, Chao, Jiangli Yu, Takakazu Ishimatsu, Naoya Shiraishi, and Murray Lawn. "Vision-based interface for people with serious spinal cord injury." In 2015 IEEE Sensors. IEEE, 2015. http://dx.doi.org/10.1109/icsens.2015.7370686.

8

Azimi, Ehsan, Baichuan Jiang, Ethan Tang, Peter Kazanzides, and Iulian Iordachita. "Teleoperative control of intraocular robotic snake: Vision-based angular calibration." In 2017 IEEE SENSORS. IEEE, 2017. http://dx.doi.org/10.1109/icsens.2017.8234072.

9

Miro, J. V., G. Dissanayake, and Weizhen Zhou. "Vision-based SLAM using natural features in indoor environments." In 2005 International Conference on Intelligent Sensors, Sensor Networks and Information Processing. IEEE, 2005. http://dx.doi.org/10.1109/issnip.2005.1595571.

10

Simlinger, Benedict, and Guillaume Ducard. "Vision-based Gyroscope Fault Detection for UAVs." In 2019 IEEE Sensors Applications Symposium (SAS). IEEE, 2019. http://dx.doi.org/10.1109/sas.2019.8705965.


Reports on the topic "Vision based sensors"

1

Surana, Amit, and Allen Tannenbaum. Vision-Based Autonomous Sensor-Tasking in Uncertain Adversarial Environments. Defense Technical Information Center, 2015. http://dx.doi.org/10.21236/ada619641.

2

Baral, Aniruddha, Jeffery Roesler, and Junryu Fu. Early-age Properties of High-volume Fly Ash Concrete Mixes for Pavement: Volume 2. Illinois Center for Transportation, 2021. http://dx.doi.org/10.36501/0197-9191/21-031.

Abstract:
High-volume fly ash concrete (HVFAC) is more cost-efficient, sustainable, and durable than conventional concrete. This report presents a state-of-the-art review of HVFAC properties and different fly ash characterization methods. The main challenges identified for HVFAC for pavements are its early-age properties such as air entrainment, setting time, and strength gain, which are the focus of this research. Five fly ash sources in Illinois have been repeatedly characterized through x-ray diffraction, x-ray fluorescence, and laser diffraction over time. The fly ash oxide compositions from the sam