Academic literature on the topic 'Camera synchronisation'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Camera synchronisation.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Camera synchronisation"
Schmitt, Robert, and Yu Cai. "Single camera-based synchronisation within a concept of robotic assembly in motion." Assembly Automation 34, no. 2 (April 1, 2014): 160–68. http://dx.doi.org/10.1108/aa-04-2013-040.
Tay, Cheryl Sihui, and Pui Wah Kong. "A Video-Based Method to Quantify Stroke Synchronisation in Crew Boat Sprint Kayaking." Journal of Human Kinetics 65, no. 1 (December 31, 2018): 45–56. http://dx.doi.org/10.2478/hukin-2018-0038.
Adorna, Marcel, Petr Zlámal, Tomáš Fíla, Jan Falta, Markus Felten, Michael Fries, and Anne Jung. "TESTING OF HYBRID NICKEL-POLYURETHANE FOAMS AT HIGH STRAIN-RATES USING HOPKINSON BAR AND DIGITAL IMAGE CORRELATION." Acta Polytechnica CTU Proceedings 18 (October 23, 2018): 72. http://dx.doi.org/10.14311/app.2018.18.0072.
Crump, Duncan A., Janice M. Dulieu-Barton, and Marco L. Longana. "Approaches to Synchronise Conventional Measurements with Optical Techniques at High Strain Rates." Applied Mechanics and Materials 70 (August 2011): 75–80. http://dx.doi.org/10.4028/www.scientific.net/amm.70.75.
Szymczyk, Tomasz, and Stanisław Skulimowski. "The study of human behaviour in a laboratory set-up with the use of innovative technology." MATEC Web of Conferences 252 (2019): 02010. http://dx.doi.org/10.1051/matecconf/201925202010.
Iseli, C., and A. Lucieer. "TREE SPECIES CLASSIFICATION BASED ON 3D SPECTRAL POINT CLOUDS AND ORTHOMOSAICS ACQUIRED BY SNAPSHOT HYPERSPECTRAL UAS SENSOR." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W13 (June 4, 2019): 379–84. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w13-379-2019.
Dissertations / Theses on the topic "Camera synchronisation"
Nguyen, Thanh-Tin. "Auto-calibration d'une multi-caméra omnidirectionnelle grand public fixée sur un casque." Thesis, Université Clermont Auvergne (2017-2020), 2017. http://www.theses.fr/2017CLFAC060/document.
360-degree and spherical multi-cameras, built by fixing together several consumer cameras, have become popular and are convenient for recent applications such as immersive video, 3D modelling and virtual reality. This type of camera captures the whole scene in a single view. When the goal is to merge the monocular videos into one cylindrical video or to obtain 3D information about the environment, several basic steps must be performed beforehand. Among these tasks, we consider the synchronisation between cameras; the calibration of the multi-camera system, including intrinsic and extrinsic parameters (i.e. the relative poses between cameras); and the rolling-shutter calibration. The goal of this thesis is to develop and apply user-friendly methods. Our approach does not require a calibration pattern. First, the multi-camera system is initialised using assumptions suited to an omnidirectional camera without a privileged direction: the cameras have the same settings (frequency, image resolution, field of view) and are roughly equiangular. Second, a frame-accurate synchronisation is estimated from the instantaneous angular velocities of each camera provided by monocular Structure-from-Motion. Third, both the inter-camera poses and the intrinsic parameters are refined using multi-camera Structure-from-Motion and bundle adjustment. Last, we introduce a bundle adjustment that estimates not only the usual parameters but also a subframe-accurate synchronisation and the rolling shutter. We experiment in a context that we believe is useful for applications (3D modelling and 360 videos): several consumer cameras or a spherical camera mounted on a helmet and moving along trajectories of several hundred metres or kilometres.
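The frame-accurate synchronisation step described in this abstract (comparing the instantaneous angular speeds that monocular Structure-from-Motion reports for each camera) can be pictured with a short sketch. This is not the thesis code: the function name and the per-frame angular-speed arrays are our own assumptions, and the sketch simply cross-correlates the two signals with NumPy to recover an integer frame offset.

```python
# Illustrative sketch only, not the method published with the thesis.
import numpy as np

def estimate_frame_offset(speed_a, speed_b):
    """Integer frame shift between two cameras.

    speed_a, speed_b: 1-D arrays of per-frame angular speeds, e.g. the
    rotation angle between consecutive camera poses returned by monocular
    Structure-from-Motion for each camera.
    """
    a = (speed_a - speed_a.mean()) / (speed_a.std() + 1e-9)   # normalise so amplitudes
    b = (speed_b - speed_b.mean()) / (speed_b.std() + 1e-9)   # do not bias the peak
    corr = np.correlate(a, b, mode="full")                    # score every integer lag
    lags = np.arange(-len(b) + 1, len(a))
    return lags[np.argmax(corr)]                              # best-matching lag

# Two noisy copies of the same motion, one shifted by 7 frames:
rng = np.random.default_rng(0)
motion = np.abs(np.sin(np.linspace(0.0, 20.0, 500))) + 0.05 * rng.standard_normal(500)
print(estimate_frame_offset(motion, np.roll(motion, 7)))  # recovers the 7-frame shift (sign depends on the lag convention)
```

As the abstract indicates, the subframe-accurate refinement would then be handled inside bundle adjustment rather than by this cross-correlation.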
Zara, Henri. "Système d'acquisition vidéo rapide : application à la mécanique des fluides." Saint-Etienne, 1997. http://www.theses.fr/1997STET4012.
Pooley, Daniel William. "The automated synchronisation of independently moving cameras." 2008. http://hdl.handle.net/2440/49461.
Full textThesis (Ph.D.) - University of Adelaide, School of Computer Science, 2008
Benrhaiem, Rania. "Méthodes d’analyse de mouvement en vision 3D : invariance aux délais temporels entre des caméras non synchronisées et flux optique par isocontours." Thèse, 2016. http://hdl.handle.net/1866/18469.
In this thesis we focused on two computer vision subjects, both concerning motion analysis in a dynamic scene seen by one or more cameras. The first subject is motion capture using unsynchronised cameras, which causes many correspondence and 3D reconstruction errors. In contrast with existing hardware solutions that try to minimise the temporal delay between the cameras, we propose a software solution that is invariant to this delay. We developed a method that finds the correct correspondence between points regardless of the temporal delay; it resolves the resulting spatial shift and recovers the correct position of the shifted points. In the second subject, we focused on the optical flow problem using an approach different from those in the state of the art. In most applications, optical flow is used for real-time motion analysis, so it must be computed quickly. Existing optical flow methods generally fall into two main categories: either precise and dense but computationally intensive, or fast but less precise and less dense. In this work, we propose an alternative solution that is both fast and precise. To do this, we extract intensity isocontours and find corresponding points on them, which yields the associated optical flow. By addressing these problems we made two major contributions.
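The isocontour idea in the second contribution can also be sketched. The snippet below is not the author's implementation: it assumes OpenCV and NumPy and a few hand-picked grey levels, and only shows the extraction of iso-intensity contours; matching points between the contour sets of consecutive frames, which is where the flow itself comes from, is left out.

```python
# Minimal sketch of iso-intensity contour extraction, not the thesis implementation.
import cv2
import numpy as np

def intensity_isocontours(gray, levels=(64, 128, 192)):
    """Return, for each grey level, the contours bounding the region gray >= level."""
    gray = cv2.GaussianBlur(gray, (5, 5), 0)              # suppress noise before thresholding
    contours_per_level = []
    for level in levels:
        mask = (gray >= level).astype(np.uint8) * 255     # binary region above the level
        # OpenCV >= 4 returns (contours, hierarchy)
        contours, _ = cv2.findContours(mask, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
        contours_per_level.append(contours)
    return contours_per_level

# Hypothetical usage on two consecutive grey-scale frames:
# prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
# curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
# c_prev = intensity_isocontours(prev)
# c_curr = intensity_isocontours(curr)
# Correspondences between points of matched isocontours would give a sparse flow field.
```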
Conference papers on the topic "Camera synchronisation"
Imre, Evren, and Adrian Hilton. "Through-the-Lens Synchronisation for Heterogeneous Camera Networks." In British Machine Vision Conference 2012. British Machine Vision Association, 2012. http://dx.doi.org/10.5244/c.26.97.
Duckworth, Tobias, and David J. Roberts. "Camera Image Synchronisation in Multiple Camera Real-Time 3D Reconstruction of Moving Humans." In 2011 IEEE/ACM 15th International Symposium on Distributed Simulation and Real Time Applications (DS-RT). IEEE, 2011. http://dx.doi.org/10.1109/ds-rt.2011.15.
Imre, Evren, Jean-Yves Guillemaut, and Adrian Hilton. "Through-the-Lens Multi-camera Synchronisation and Frame-Drop Detection for 3D Reconstruction." In 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization and Transmission (3DIMPVT). IEEE, 2012. http://dx.doi.org/10.1109/3dimpvt.2012.31.
Aerts, Peter, and Eric Demeester. "Time Synchronisation of Low-cost Camera Images with IMU Data based on Similar Motion." In 16th International Conference on Informatics in Control, Automation and Robotics. SCITEPRESS - Science and Technology Publications, 2019. http://dx.doi.org/10.5220/0007837602920299.
Ainsworth, Roger W., and Steven J. Thorpe. "The Development of a Doppler Global Velocimeter for Transonic Turbine Applications." In ASME 1994 International Gas Turbine and Aeroengine Congress and Exposition. American Society of Mechanical Engineers, 1994. http://dx.doi.org/10.1115/94-gt-146.