Journal articles on the topic "Linear camera"

To see the other types of publications on this topic, follow the link: Linear camera.

Create a reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic "Linear camera".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read the abstract online, if the corresponding data are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Chu, Jun, Li Wang, Ruina Feng, and Guimei Zhang. "Linear camera calibration and pose estimation from vanishing points." Chinese Optics Letters 10, s1 (2012): S11007–S11011. http://dx.doi.org/10.3788/col201210.s11007.

2

KINOSITA, Jr., Kazuhiko, Katsuyuki SHIROGUCHI, Tetsuaki OKAMOTO, Kengo ADACHI, Yasuhiro ONOUE, and Hiroyasu ITOH. "Is Your Video Camera Linear?" Seibutsu Butsuri 45, no. 4 (2005): 216–18. http://dx.doi.org/10.2142/biophys.45.216.

3

Potapov, A. I., V. E. Makhov, Ya G. Smorodinskii, and E. Ya Manevich. "Smart-Camera–Based Linear Sizing." Russian Journal of Nondestructive Testing 55, no. 7 (July 2019): 524–32. http://dx.doi.org/10.1134/s1061830919070064.

4

Purnama, Sevia Indah, Irmayatul Hikmah, Mas Aly Afandi, and Elsa Sri Mulyani. "OPTIMASI PEMBACAAN SUHU KAMERA TERMAL MENGGUNAKAN REGRESI LINIER." BAREKENG: Jurnal Ilmu Matematika dan Terapan 15, no. 1 (March 1, 2021): 127–36. http://dx.doi.org/10.30598/barekengvol15iss1pp127-136.

Abstract:
Fever is one of the symptoms of a person with Covid-19. Body temperature must be checked before entering crowded areas such as schools, offices, shops, and hospitals; it is a mandatory protocol. One of the tools that can be used to check body temperature is a thermal camera. Thermal cameras have the disadvantage of a high temperature-reading error, because the cameras used have low resolution. This study aims to reduce the temperature-reading error of the thermal camera using the linear regression method. Linear regression reduced the error rate of temperature readings by 5.27% at a 36 °C reading, by 5.27% at 37 °C, and by 6.44% at 38 °C. Based on these results, the study shows that linear regression can be applied to thermal cameras and decreases the error rate of their temperature readings.
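The correction described in the abstract is an ordinary least-squares fit from raw camera readings to reference temperatures. A minimal sketch follows; the paired readings are hypothetical stand-ins for the paper's calibration data, not values from the study:

```python
import numpy as np

# Hypothetical paired readings: raw thermal-camera output vs. a reference
# thermometer (stand-in data, not from the study).
camera_reading = np.array([34.1, 34.9, 35.8, 36.7, 37.5])  # °C
reference_temp = np.array([36.0, 36.5, 37.0, 37.5, 38.0])  # °C

# Fit T_ref ≈ a * T_cam + b by ordinary least squares.
a, b = np.polyfit(camera_reading, reference_temp, 1)

def correct(t_cam):
    """Map a raw camera reading to a regression-corrected temperature."""
    return a * t_cam + b
```

An OLS line always passes through the mean of the calibration pairs, which gives a quick sanity check on the fit.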
5

Zwanenberg, Oliver van, Sophie Triantaphillidou, Robin Jenkin, and Alexandra Psarrou. "Camera System Performance Derived from Natural Scenes." Electronic Imaging 2020, no. 9 (January 26, 2020): 241–1. http://dx.doi.org/10.2352/issn.2470-1173.2020.9.iqsp-241.

Abstract:
The Modulation Transfer Function (MTF) is a well-established measure of camera system performance, commonly employed to characterize optical and image capture systems. It is a measure based on Linear System Theory; thus, its use relies on the assumption that the system is linear and stationary. This is not the case with modern-day camera systems that incorporate non-linear image signal processes (ISP) to improve the output image. Nonlinearities result in variations in camera system performance, which are dependent upon the specific input signals. This paper discusses the development of a novel framework, designed to acquire MTFs directly from images of natural complex scenes, thus making the use of traditional test charts with set patterns redundant. The framework is based on extraction, characterization and classification of edges found within images of natural scenes. Scene-derived performance measures aim to characterize non-linear image processes incorporated in modern cameras more faithfully. Further, they can produce ‘live’ performance measures, acquired directly from camera feeds.
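The core pipeline behind edge-based MTF estimation (edge spread function → line spread function → Fourier magnitude) can be sketched on synthetic data. The Gaussian blur below is an assumption standing in for a real scene edge, not the authors' framework:

```python
import numpy as np

# Synthetic edge: an ideal step blurred by a Gaussian point spread function,
# standing in for an edge profile extracted from a natural-scene image.
x = np.arange(-32, 33)
sigma = 2.0                                  # assumed system blur
psf = np.exp(-x**2 / (2 * sigma**2))
esf = np.cumsum(psf)                         # edge spread function (ESF)
esf /= esf[-1]

lsf = np.gradient(esf)                       # line spread function = d(ESF)/dx
mtf = np.abs(np.fft.rfft(lsf))               # MTF = |Fourier transform of LSF|
mtf /= mtf[0]                                # normalize to 1 at zero frequency
```

For a Gaussian blur the resulting MTF decays smoothly with frequency; a sharper system would hold contrast out to higher frequencies.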
6

Fritz, Gerhard, and Alexander Bergmann. "SAXS instruments with slit collimation: investigation of resolution and flux." Journal of Applied Crystallography 39, no. 1 (January 12, 2006): 64–71. http://dx.doi.org/10.1107/s002188980503966x.

Abstract:
Six small-angle X-ray cameras with block collimation systems were simulated, namely the original Kratky camera, a high-flux version of the Kratky camera, a SAXSess (Anton Paar) camera with a focusing mirror in a linear collimation setup and in a pin-hole setup, as well as a similar camera with a parallelizing mirror in a linear and a pin-hole setup. Their performance was examined using Monte Carlo ray-tracing. The Kratky and the SAXSess camera gave resolutions of 64–65 nm, the high-flux Kratky camera gave a resolution of 44 nm, and the camera with parallelizing mirror gave a resolution of 32 nm. The flux of the camera with parallelizing mirror was 1.47 times higher than for the SAXSess camera, and 18.6 times the flux of the Kratky camera. On changing the alignment, the camera with parallelizing mirror exhibited the best performance up to a resolution of 44 nm; the SAXSess camera was better for higher resolutions. Experimental flux measurements agree if no collimation system is added. Measurements of beam profiles and flux including collimation systems show only qualitative agreement because of user-dependent factors during alignment.
7

Moussa, Carol, Louis Hardan, Cynthia Kassis, Rim Bourgi, Walter Devoto, Gilbert Jorquera, Saurav Panda, Roy Abou Fadel, Carlos Enrique Cuevas-Suárez, and Monika Lukomska-Szymanska. "Accuracy of Dental Photography: Professional vs. Smartphone’s Camera." BioMed Research International 2021 (December 15, 2021): 1–7. http://dx.doi.org/10.1155/2021/3910291.

Abstract:
There is a scant literature on the accuracy of dental photographs captured by Digital Single-Lens Reflex (DSLR) and smartphone cameras. The aim was to compare linear measurements of plaster models photographed with DSLR and smartphone cameras against digital models. Thirty maxillary casts were prepared. Vertical and horizontal reference lines were marked on each tooth, with the exception of the molars. Then, models were scanned with the TRIOS 3 Basic intraoral dental scanner (control). Six photographs were captured for each model: one using a DSLR camera (Canon EOS 700D) and five with a smartphone (iPhone X) (distance range 16-32 cm). Teeth heights and widths were measured on scans and photographs. The following conclusions could be drawn: (1) the measurements of teeth by means of DSLR and smartphone cameras (at distances of at least 24 cm) and the scan did not differ. (2) The measurements of anterior teeth by means of DSLR and smartphone cameras (at all distances tested) and the scan exhibited no difference. For documentational purposes, the distortion is negligible, and both camera devices can be applied. Dentists can rely on DSLR and smartphone cameras (at distances of at least 24 cm) for smile designs, providing comparable and reliable linear measurements.
8

Antuña-Sánchez, Juan C., Roberto Román, Victoria E. Cachorro, Carlos Toledano, César López, Ramiro González, David Mateos, Abel Calle, and Ángel M. de Frutos. "Relative sky radiance from multi-exposure all-sky camera images." Atmospheric Measurement Techniques 14, no. 3 (March 22, 2021): 2201–17. http://dx.doi.org/10.5194/amt-14-2201-2021.

Abstract:
All-sky cameras are frequently used to detect cloud cover; however, this work explores the use of these instruments for the more complex purpose of extracting relative sky radiances. An all-sky camera (SONA202-NF model) with three colour filters narrower than usual for this kind of camera is configured to capture raw images at seven exposure times. A detailed camera characterization of the black level, readout noise, hot pixels and linear response is carried out. A methodology is proposed to obtain a linear high dynamic range (HDR) image and its uncertainty, which represents the relative sky radiance (in arbitrary units) maps at three effective wavelengths. The relative sky radiances are extracted from these maps and normalized by dividing every radiance of one channel by the sum of all radiances at this channel. Then, the normalized radiances are compared with the sky radiance measured at different sky points by a sun and sky photometer belonging to the Aerosol Robotic Network (AERONET). The camera radiances correlate with the photometer ones except for scattering angles below 10°, which is probably due to some light reflections on the fisheye lens and camera dome. Camera and photometer wavelengths are not coincident; hence, camera radiances are also compared with sky radiances simulated by a radiative transfer model at the same camera effective wavelengths. This comparison reveals an uncertainty on the normalized camera radiances of about 3.3%, 4.3% and 5.3% for 467, 536 and 605 nm, respectively, if specific quality criteria are applied.
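For a linear sensor, the multi-exposure HDR idea reduces to scaling each unsaturated raw value by its exposure time and averaging. A single-pixel sketch follows; the black level, bit depth, and counts are assumptions for illustration, not SONA202-NF data:

```python
import numpy as np

# Hypothetical raw counts of one pixel at several exposure times, for a
# sensor with a linear response (assumed values, not from the paper).
exposures = np.array([0.001, 0.004, 0.016])        # seconds
counts = np.array([120.0, 290.0, 970.0])           # raw digital numbers
black_level = 64.0                                 # assumed black level
saturation = 4095.0                                # assumed 12-bit full scale

valid = counts < 0.95 * saturation                 # drop near-saturated samples
radiance = (counts[valid] - black_level) / exposures[valid]
hdr_value = radiance.mean()                        # relative radiance, a.u.
```

Dividing each channel's radiances by their per-channel sum, as the paper does, then removes the remaining arbitrary scale.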
9

Gao Junchai, Lei Zhiyong, and Wang Zemin. "Image Correction of Linear Array Camera." Laser & Optoelectronics Progress 47, no. 9 (2010): 091501. http://dx.doi.org/10.3788/lop47.091501.

10

Long Quan and Zhongdan Lan. "Linear N-point camera pose determination." IEEE Transactions on Pattern Analysis and Machine Intelligence 21, no. 8 (1999): 774–80. http://dx.doi.org/10.1109/34.784291.

11

Zhang, Jian Chuan, Wu Bin Li, and Chang Hou Lu. "Design of Automatic Detection Device for Steel Bar Surface Defects." Advanced Materials Research 532-533 (June 2012): 390–93. http://dx.doi.org/10.4028/www.scientific.net/amr.532-533.390.

Abstract:
To inspect the surface quality of steel bars, we designed an automatic system comprising a linear camera and a laser. After comparing several kinds of cameras, we selected a linear CCD for the system, and we chose the laser for its high luminance and performance. Through a series of computations, we selected an appropriate camera lens for the device. Finally, we drew up the whole detection system. The device has performed well in use and provides a good foundation for subsequent image processing.
12

Rendon-Mancha, Juan Manuel, Antonio Cardenas, Marco A. Garcia, Emilio Gonzalez-Galvan, and Bruno Lara. "Robot Positioning Using Camera-Space Manipulation With a Linear Camera Model." IEEE Transactions on Robotics 26, no. 4 (August 2010): 726–33. http://dx.doi.org/10.1109/tro.2010.2050518.

13

Zhang, Bao Long, and Jing Yun Wang. "Design of High-Resolution FPD System Based on Doglegged Sampling." Applied Mechanics and Materials 666 (October 2014): 93–97. http://dx.doi.org/10.4028/www.scientific.net/amm.666.93.

Abstract:
To overcome the shortcomings of existing methods for improving the image spatial resolution of linear charge-coupled device (CCD) FPD systems, a new sampling method is proposed and a high-resolution FPD system is designed. Two identical linear CCD cameras are fixed in a specific spatial arrangement, with both camera 1's and camera 2's CCDs scanning the object at a tilt angle β; image correction and pixel interpolation are also used to obtain a high-resolution image. The experimental results show that, compared with the regular scanning mode using a single camera at β = 0°, cameras at a slant angle of β = 60° double the image spatial resolution while keeping the field of view unchanged. The designed system is easy to realize in engineering, economical, and convenient to maintain, and it uses only existing imaging devices to obtain higher-resolution images.
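One way to read the resolution gain, under the assumption that tilting the line sensor shrinks its effective sampling pitch by the projection factor cos β (an interpretation for illustration, not the paper's derivation):

```python
import math

pixel_pitch = 10e-6            # metres; assumed CCD pixel size
beta = math.radians(60.0)      # slant angle used in the paper's experiment

# Projected onto the sampling direction, the pitch shrinks by cos(beta):
# at beta = 60°, cos(beta) = 0.5, halving the pitch and doubling resolution.
effective_pitch = pixel_pitch * math.cos(beta)
```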
14

Riza, Nabeel A., and Nazim Ashraf. "First Demonstration of Calibrated Color Imaging by the CAOS Camera." Photonics 8, no. 12 (November 28, 2021): 538. http://dx.doi.org/10.3390/photonics8120538.

Abstract:
The Coded Access Optical Sensor (CAOS) camera is a novel, single unit, full spectrum (UV to short-wave IR bands), linear, high dynamic range (HDR) camera. In this paper, calibrated color target imaging using the CAOS camera and a comparison to a commercial HDR CMOS camera is demonstrated for the first time. The first experiment using a calibrated color check chart indicates that although the CMOS sensor-based camera has an 87 dB manufacturer-specified HDR range, unrestricted usage of this CMOS camera’s output range greatly fails quality color recovery. On the other hand, the intrinsically linear full dynamic range operation CAOS camera color image recovery generally matches the restricted linear-mode commercial CMOS sensor-based camera recovery for the presented 39.5 dB non-HDR target that also matches the near 40 dB linear camera response function (CRF) range of the CMOS camera. Specifically, compared to the color checker chart manufacturer provided XYZ values for the calibrated target, percentage XYZ mean errors of 8.3% and 10.9% are achieved for the restricted linear range CMOS camera and CAOS camera, respectively. An alternate color camera assessment gives CIE ΔE00 mean values of 4.59 and 5.7 for the restricted linear range CMOS camera and CAOS camera, respectively. Unlike the CMOS camera lens optics and its photo-detection electronics, no special linear response optics and photo-detector designs were used for the experimental CAOS camera, nevertheless, a good and equivalent color recovery was achieved. Given the limited HDR linear range capabilities of a CMOS camera and the intrinsically wide linear HDR capability of a CAOS camera, a combined CAOS-CMOS mode of the CAOS smart camera is prudent and can empower HDR color imaging. 
Applications for such a hybrid camera include still photography, especially quantitative imaging of biological samples, valuable artworks, and archaeological artefacts that require authentic color data generation for reliable medical decisions as well as forgery-preventing verification.
15

Steger, Carsten, and Markus Ulrich. "A Multi-view Camera Model for Line-Scan Cameras with Telecentric Lenses." Journal of Mathematical Imaging and Vision 64, no. 2 (October 13, 2021): 105–30. http://dx.doi.org/10.1007/s10851-021-01055-x.

Abstract:
We propose a novel multi-view camera model for line-scan cameras with telecentric lenses. The camera model supports an arbitrary number of cameras and assumes a linear relative motion with constant velocity between the cameras and the object. We distinguish two motion configurations. In the first configuration, all cameras move with independent motion vectors. In the second configuration, the cameras are mounted rigidly with respect to each other and therefore share a common motion vector. The camera model can model arbitrary lens distortions by supporting arbitrary positions of the line sensor with respect to the optical axis. We propose an algorithm to calibrate a multi-view telecentric line-scan camera setup. To facilitate a 3D reconstruction, we prove that an image pair acquired with two telecentric line-scan cameras can always be rectified to the epipolar standard configuration, in contrast to line-scan cameras with entocentric lenses, for which this is possible only under very restricted conditions. The rectification allows an arbitrary stereo algorithm to be used to calculate disparity images. We propose an efficient algorithm to compute 3D coordinates from these disparities. Experiments on real images show the validity of the proposed multi-view telecentric line-scan camera model.
16

Do, Yongtae. "A New Linear Explicit Camera Calibration Method." Journal of Sensor Science and Technology 23, no. 1 (January 29, 2014): 66–71. http://dx.doi.org/10.5369/jsst.2014.23.1.66.

17

Mo Delin, Zhang Yongsheng, Wang Tao, and Zhang Yan. "Imaging Simulation of Airborne Linear Whiskbroom Camera." Acta Optica Sinica 38, no. 7 (2018): 0728002. http://dx.doi.org/10.3788/aos201838.0728002.

18

Vedel, M., N. Lechocinski, and S. Breugnot. "Compact and robust linear Stokes polarization camera." EPJ Web of Conferences 5 (2010): 01005. http://dx.doi.org/10.1051/epjconf/20100501005.

19

Hawkins, Amy, and Cindy M. Grimm. "Camera Keyframing using Linear Interpolation of Matrices." Journal of Graphics Tools 12, no. 3 (January 2007): 55–69. http://dx.doi.org/10.1080/2151237x.2007.11674149.

20

Anjum, Nadeem. "Camera Localization in Distributed Networks Using Trajectory Estimation." Journal of Electrical and Computer Engineering 2011 (2011): 1–13. http://dx.doi.org/10.1155/2011/604647.

Abstract:
This paper presents an algorithm for camera localization using trajectory estimation (CLUTE) in a distributed network of nonoverlapping cameras. The algorithm recovers the extrinsic calibration parameters, namely, the relative position and orientation of the camera network on a common ground plane coordinate system. We first model the observed trajectories in each camera's field of view using Kalman filtering, then we use this information to estimate the missing trajectory information in the unobserved areas by fusing the results of a forward and backward linear regression estimation from adjacent cameras. These estimated trajectories are then filtered and used to recover the relative position and orientation of the cameras by analyzing the estimated and observed exit and entry points of an object in each camera's field of view. The final configuration of the network is established by considering one camera as a reference and by adjusting the remaining cameras with respect to this reference. We demonstrate the algorithm on both simulated and real data and compare the results with state-of-the-art approaches. The experimental results show that the proposed algorithm is more robust to noisy and missing data and in the case of camera failure.
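The forward/backward regression fusion across an unobserved gap can be sketched in one dimension. The constant-velocity track and the distance-based weights below are assumptions for illustration, not CLUTE itself:

```python
import numpy as np

# Hypothetical 1D track seen by two non-overlapping cameras, with a gap
# (unobserved region) between frames 10 and 14.
t_a = np.arange(0, 10)                  # frames in camera A's field of view
p_a = 2.0 * t_a + 1.0                   # assumed constant-velocity positions
t_b = np.arange(15, 25)                 # frames in camera B's field of view
p_b = 2.0 * t_b + 1.0

gap = np.arange(10, 15)

# Forward fit from camera A, backward fit from camera B, over the gap.
forward = np.polyval(np.polyfit(t_a, p_a, 1), gap)
backward = np.polyval(np.polyfit(t_b, p_b, 1), gap)

# Fuse with weights that favour the nearer camera (an assumed scheme).
w = np.linspace(1.0, 0.0, len(gap))
estimate = w * forward + (1.0 - w) * backward
```

With noise-free constant-velocity data, both fits agree and the fused estimate recovers the missing positions exactly.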
21

Riza, Nabeel A. "Thinking Camera – Powered by the CAOS Camera Platform." EPJ Web of Conferences 238 (2020): 06012. http://dx.doi.org/10.1051/epjconf/202023806012.

Abstract:
Introduced is the system level design of a Thinking Camera powered by the CAOS camera platform. The proposed camera provides features such as extreme linear dynamic range and full spectrum operations with selectable space-time-frequency CAOS pixel modes using active and passive light. The imager has triple output ports design and is controlled via application-specific classical image processing, machine learning techniques, and human/user feedback. Imaging capabilities cover multi-dimensional mappings allowing diverse full spectrum (350 nm to 2700 nm) applications from industrial metrology to quantitative medical imaging to artefact preserving colour photography.
22

Li, Lin, and Li Xu. "A Linear Camera Calibration Applying in Servo Mechanical Arm System with One Camera." Advanced Materials Research 748 (August 2013): 704–7. http://dx.doi.org/10.4028/www.scientific.net/amr.748.704.

Abstract:
An easy method for camera intrinsic parameters and one order radial distortion parameter is studied by the mathematical model of camera on the basis of projective theory. It is not necessary to translate or rotate camera in this method, but it only uses several specially points to derive the equations and get the parameters. The results show that this method could quickly and conveniently calibrate camera intrinsic parameters and one-degree radial distortion parameter with good stability and precision, which can be used in servo mechanical arm system.
23

Gracanin, Ana, and Katarina M. Mikac. "The Use of Selfie Camera Traps to Estimate Home Range and Movement Patterns of Small Mammals in a Fragmented Landscape." Animals 12, no. 7 (April 2, 2022): 912. http://dx.doi.org/10.3390/ani12070912.

Abstract:
The use of camera traps to track individual mammals to estimate home range and movement patterns has not previously been applied to small mammal species. Our aim was to evaluate the use of camera trapping, using the selfie trap method, to record movements of small mammals within and between fragments of habitat. In a fragmented landscape, 164 cameras were set up across four survey areas, with cameras left to record continuously for 28 nights. Live trapping was performed beforehand to ear-mark animals and facilitate individual identification on camera. Four small mammal species (sugar glider, Petaurus breviceps; brown antechinus, Antechinus stuartii; bush rat, Rattus fuscipes; and brown rat, Rattus norvegicus) were recorded on camera (N = 284 individuals). The maximum distance travelled by an individual sugar glider was 14.66 km, antechinus 4.24 km, bush rat 1.90 km and brown rat 1.28 km. Movements of both female and male sugar gliders in linear fragments were recorded at much higher rates than in larger patches of forest sampled in grids. Short-term core home ranges (50% KDE) of 34 sugar gliders ranged from 0.3 ha to 4.2 ha. Sugar glider core home ranges were on average 1.2 ha (±0.17) for females and 2.4 ha (±0.28) for males. The selfie trap is an efficient camera trapping method for estimating home ranges and movements due to its ability to obtain high recapture rates for multiple species and individuals. In our study landscape, linear strips of habitat were readily utilised by all small mammals, highlighting their importance as wildlife corridors in a fragmented landscape.
24

Yang, Ling Hui, Li Jun Wang, Hai Qing Liu, Yong Jie Ren, Jia Rui Lin, and Yin Guo. "Design and Calibration of Real-Time 3D Coordinate Measurement System Based on Multi-Angle Intersection and Cylindrical Imaging." Applied Mechanics and Materials 870 (September 2017): 147–52. http://dx.doi.org/10.4028/www.scientific.net/amm.870.147.

Abstract:
This paper presents a high-resolution real-time 3D coordinate measurement system based on multi-angle intersection and cylindrical imaging. The measuring angle is detected by a linear camera equipped with cylindrical lenses, whose field of view is a 3D space rather than a 2D plane. This camera has prominent advantages in precise coordinate measurement and dynamic position tracking due to the high resolution and outstanding frame rate of the linear CCD. Each camera is a 1D angle-measuring unit that determines an angle, and thereby a plane, passing through the light spot. With three cameras arranged in front of the measurement field, the 3D coordinate of the light spot can be reconstructed by multi-angle intersection. An accurate and generic calibration method is introduced to calibrate this camera. The proposed calibration method is based on nonparametric ideas to find the mapping from incoming scene rays to photo-sensitive elements, and this method (black-box calibration) remains effective even if the lens distortion is high and asymmetric. It is applicable to any central (single-viewpoint) camera, whatever its lenses. The proposed calibration method is applied to the 3D coordinate measurement system. The coordinate measurement accuracy of the designed system is better than 0.49 mm.
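Each 1D camera fixes one plane through the light spot, so three cameras yield a 3×3 linear system whose solution is the spot's 3D position. The plane normals and offsets below are arbitrary illustrative values, not calibration data from the paper:

```python
import numpy as np

# Three planes n_i · x = d_i, one per linear camera (assumed, non-degenerate).
normals = np.array([[1.0, 0.0, 0.2],
                    [0.0, 1.0, 0.1],
                    [0.3, 0.2, 1.0]])
offsets = np.array([1.0, 2.0, 3.0])

# The light spot is the common intersection of the three planes.
spot = np.linalg.solve(normals, offsets)
```

With more than three cameras the system is overdetermined and `np.linalg.lstsq` would give the least-squares intersection instead.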
25

Buil, Christian, Guylaine Prat, and Eric Thouvenot. "An Amateur CCD Camera." International Astronomical Union Colloquium 98 (1988): 105–7. http://dx.doi.org/10.1017/s0252921100092344.

Abstract:
A CCD camera was built in 1984 using a linear detector (Thomson TH7810A) with 1728 pixels. The simplicity of this sort of detector enabled us to gain experience with CCDs. With a linear detector it is possible to carry out spectroscopy or even two-dimensional imaging by shifting the observed field at right angles to the photosensitive strip and by acquiring data in a regular manner (the scanning technique). The limitations of this equipment soon became apparent when trying to image deep-sky objects, so a second camera was built in 1985 using a CCD array, a Thomson TH7851 with 208 × 144 pixels. Numerous improvements were later made and the most advanced model is described.
26

Xi, K., and Y. Duan. "AMS-3000 LARGE FIELD VIEW AERIAL MAPPING SYSTEM: BASIC PRINCIPLES AND THE WORKFLOW." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B1-2020 (August 6, 2020): 79–84. http://dx.doi.org/10.5194/isprs-archives-xliii-b1-2020-79-2020.

Abstract:
The three-line-array stereo aerial survey camera is typical mapping equipment of aerial photogrammetry. As airborne equipment, it can quickly obtain a large range of basic geographic information with high precision. At present, typical three-line-array stereoscopic aerial survey cameras, such as the Leica ADS40 and ADS80, have the disadvantages of a small field of view and low resolution, which makes it difficult to meet the demands of large-scale topographic mapping for economic construction. To meet the urgent need for a domestic three-line-array aerial mapping camera in our project, we developed the AMS-3000 camera system. Its features include a large field of view, high resolution, low distortion and high environmental adaptability. The AMS-3000 system has reached the international advanced level in both software and hardware.
27

Jiang, Haomiao, Qiyuan Tian, Joyce Farrell, and Brian Wandell. "Local Linear Approximation for Camera Image Processing Pipelines." Electronic Imaging 2016, no. 18 (February 14, 2016): 1–4. http://dx.doi.org/10.2352/issn.2470-1173.2016.18.dpmi-248.

28

Yao Qingyuan, Pu Jiexin, and Liu Zhonghua. "Concentric Circles Calibration Method of Linear CCD Camera." International Journal of Digital Content Technology and its Applications 7, no. 4 (February 28, 2013): 643–51. http://dx.doi.org/10.4156/jdcta.vol7.issue4.77.

29

YANG, Sen, and Fu-Chao WU. "Weighted Linear Methods for the Camera Pose Estimation." Journal of Software 22, no. 10 (October 25, 2011): 2476–87. http://dx.doi.org/10.3724/sp.j.1001.2011.03916.

30

SUN Jun-feng, DING Shao-wen, ZHANG Xiao-hu, and ZHANG Yue-qiang. "Optimization of camera parameters based on linear model." Optics and Precision Engineering 25, no. 10 (2017): 2767–77. http://dx.doi.org/10.3788/ope.20172510.2767.

31

Zhao, Xia, Bin Liu, and Chunhui Yang. "A new calibration method for linear CCD camera." Procedia Engineering 7 (2010): 290–96. http://dx.doi.org/10.1016/j.proeng.2010.11.047.

32

Eytan, Giora E. "Airborne camera based on one-dimensional linear array." Optical Engineering 30, no. 4 (1991): 483. http://dx.doi.org/10.1117/12.55809.

33

Zheng, Yinqiang. "Camera calibration using two concentric circles: linear approach." Optical Engineering 48, no. 5 (May 1, 2009): 053602. http://dx.doi.org/10.1117/1.3130213.

34

Zhang, Zhuang, Rujin Zhao, Enhai Liu, Kun Yan, and Yuebo Ma. "A single-image linear calibration method for camera." Measurement 130 (December 2018): 298–305. http://dx.doi.org/10.1016/j.measurement.2018.07.085.

35

Usamentiaga, R., D. F. Garcia, and F. J. de la Calle. "Line-scan camera calibration: a robust linear approach." Applied Optics 59, no. 30 (October 16, 2020): 9443. http://dx.doi.org/10.1364/ao.404774.

36

Paik, Dong-Soo, Kyoung-Ho Yoo, Chong-Yun Kang, Bong-Hee Cho, Sahn Nam, and Seok-Jin Yoon. "Multilayer piezoelectric linear ultrasonic motor for camera module." Journal of Electroceramics 22, no. 1-3 (June 18, 2008): 346–51. http://dx.doi.org/10.1007/s10832-008-9513-3.

37

Melanitis, Nikos, and Petros Maragos. "A linear method for camera pair self-calibration." Computer Vision and Image Understanding 210 (September 2021): 103223. http://dx.doi.org/10.1016/j.cviu.2021.103223.

38

Gong, Xuanrui, Yaowen Lv, Xiping Xu, Yuxuan Wang, and Mengdi Li. "Pose Estimation of Omnidirectional Camera with Improved EPnP Algorithm." Sensors 21, no. 12 (June 10, 2021): 4008. http://dx.doi.org/10.3390/s21124008.

Abstract:
The omnidirectional camera, with the advantage of a broad field of view, realizes 360° imaging in the horizontal direction. Due to light reflection from the mirror surface, the collinearity relation is altered and the imaged scene has severe nonlinear distortions, which makes it more difficult to estimate the pose of the omnidirectional camera. To solve this problem, we derive the mapping from the omnidirectional camera to the traditional camera and propose an omnidirectional camera linear imaging model. Based on the linear imaging model, we improve the EPnP algorithm to calculate the omnidirectional camera pose. To validate the proposed solution, we conducted simulations and physical experiments. Results show that the algorithm performs well in resisting noise.
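EPnP itself is more involved, but the general idea of a linear pose solution can be sketched with a plain Direct Linear Transform (DLT) on noise-free synthetic points in normalized (intrinsics-removed) coordinates. Everything below — points, pose, scale and sign handling — is an illustrative assumption, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (6, 3)) + np.array([0.0, 0.0, 5.0])  # 3D points

# Ground-truth pose to recover (assumed for the demo).
c, s = np.cos(0.1), np.sin(0.1)
R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t = np.array([0.2, -0.1, 0.3])
cam = X @ R.T + t
uv = cam[:, :2] / cam[:, 2:3]            # normalized image coordinates

# Stack the 2n x 12 homogeneous system A p = 0 for P = [R | t].
Xh = np.hstack([X, np.ones((6, 1))])
rows = []
for Xi, (u, v) in zip(Xh, uv):
    rows.append(np.concatenate([Xi, np.zeros(4), -u * Xi]))
    rows.append(np.concatenate([np.zeros(4), Xi, -v * Xi]))
P = np.linalg.svd(np.array(rows))[2][-1].reshape(3, 4)

P /= np.linalg.norm(P[2, :3])            # fix scale: third rotation row has unit norm
if (Xh @ P[2]).mean() < 0:               # fix sign: point depths must be positive
    P = -P
```

On clean data the recovered P matches [R | t] to numerical precision; with noise, methods like EPnP are preferred because they handle the redundancy more robustly.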
39

Elsheshtawy, Amr M., and Larisa A. Gavrilova. "Improving Linear Projects Georeferencing to Create Digital Models Using UAV Imagery." E3S Web of Conferences 310 (2021): 04001. http://dx.doi.org/10.1051/e3sconf/202131004001.

Abstract:
The Global Positioning System (GPS) on an Unmanned Aerial Vehicle (UAV) platform relies on Micro-Electro-Mechanical Systems (MEMS) technology, with a precision of about 10 m for the UAV camera station positions at shooting time. Moreover, obstacles to the GPS signal at the chosen flight altitude can prevent accurate retrieval of camera station positions. In this research, three different georeferencing techniques were compared for geometric precision. The first is Direct Georeferencing (DG), which depends mainly on using the navigation GPS onboard without any Ground Control Points (GCPs). The second is Indirect Georeferencing (IG), which depends mainly on three GCPs used to assist Aero-Triangulation (AT). The third is a Modified technique that depends on the same three GCPs used in the second method and on camera station locations enhanced with a Linear Relation Model (LR Model). The study area was in the south of the Moscow Region, Russia. Three imaging strips were acquired using the DJI PHANTOM 4 PRO UAV. The accuracy assessment was carried out using image-derived coordinates and checkpoint (CP) residuals. This study shows that the Modified methodology using enhanced camera station positions gave better accuracy than using the drone's GPS camera station positions.
40

Kelly, Julia, Natascha Kljun, Per-Ola Olsson, Laura Mihai, Bengt Liljeblad, Per Weslien, Leif Klemedtsson, and Lars Eklundh. "Challenges and Best Practices for Deriving Temperature Data from an Uncalibrated UAV Thermal Infrared Camera." Remote Sensing 11, no. 5 (March 8, 2019): 567. http://dx.doi.org/10.3390/rs11050567.

Анотація:
Miniaturized thermal infrared (TIR) cameras that measure surface temperature are increasingly available for use with unmanned aerial vehicles (UAVs). However, deriving accurate temperature data from these cameras is non-trivial since they are highly sensitive to changes in their internal temperature, and low-cost models are often not radiometrically calibrated. We present the results of laboratory and field experiments that tested the extent of the temperature dependency of a non-radiometric FLIR Vue Pro 640. We found that a simple empirical line calibration using at least three ground calibration points was sufficient to convert camera digital numbers to temperature values for images captured during UAV flight. Although the camera performed well under stable laboratory conditions (accuracy ±0.5 °C), the accuracy declined to ±5 °C under the changing ambient conditions experienced during UAV flight. The poor performance resulted from the non-linear relationship between camera output and sensor temperature, which was affected by wind and temperature drift during flight. The camera's automated non-uniformity correction (NUC) could not sufficiently correct for these effects. Prominent vignetting was also visible in images captured under both stable and changing ambient conditions. The inconsistencies in camera output over time and across the sensor will affect camera applications based on relative temperature differences as well as user-generated radiometric calibration. Based on our findings, we present a set of best practices for UAV TIR camera sampling to minimize the impacts of the temperature dependency of these systems.
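The empirical line calibration described above — a linear map from camera digital numbers (DN) to surface temperature fitted through at least three ground calibration points — can be sketched as follows; the DN and temperature values here are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical ground calibration points: camera digital numbers (DN)
# paired with reference surface temperatures (deg C) measured on the ground.
dn_ref = np.array([21500.0, 23200.0, 25100.0])
temp_ref = np.array([10.2, 18.5, 27.9])

# Least-squares linear fit: temperature = gain * DN + offset
gain, offset = np.polyfit(dn_ref, temp_ref, 1)

def dn_to_temperature(dn):
    """Convert camera digital numbers to temperature (deg C)."""
    return gain * np.asarray(dn) + offset

# Apply the same two coefficients to a whole thermal image
# (a random array stands in for real imagery here).
image_dn = np.random.default_rng(0).uniform(21000, 26000, size=(480, 640))
image_temp = dn_to_temperature(image_dn)
```

With ground targets chosen to span the scene's temperature range, the two fitted coefficients are applied to every pixel of the flight's imagery.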
41

Li, Xiang Zhi. "A Novel Image Motion Compensation Method for the Pendular Aeronautical Camera." Advanced Materials Research 760-762 (September 2013): 1402–5. http://dx.doi.org/10.4028/www.scientific.net/amr.760-762.1402.

Анотація:
This paper puts forward an improved image motion compensation method to address the mechanical complexity and high cost that arise when a pendular aeronautical camera uses multiple sensors to compensate image motion. The method uses only a single linear position sensor to obtain a linear output voltage signal and a sinusoidal output voltage signal, so one linear position sensor can perform both image motion compensation and pendulum sweep-range positioning. Experiments have shown that the chosen method not only simplifies the mechanical system design and reduces the aerial camera's cost, but also meets the accuracy and real-time requirements of various pendular aeronautical and space cameras.
42

Appelt, T., J. van der Lucht, M. Bleier, and A. Nüchter. "CALIBRATION AND VALIDATION OF THE INTEL T265 FOR VISUAL LOCALISATION AND TRACKING UNDERWATER." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLIII-B2-2021 (June 28, 2021): 635–41. http://dx.doi.org/10.5194/isprs-archives-xliii-b2-2021-635-2021.

Анотація:
Abstract. Localization and navigation for autonomous underwater vehicles (AUVs) has always been a major challenge, and in many situations complex solutions have had to be devised. One of the main approaches is visual odometry using a stereo camera. In this study, the Intel T265 fisheye stereo camera was calibrated and tested to determine its usability for localisation and navigation under water as an alternative to more complex systems. First, the camera was calibrated inside a water-filled container; the resulting camera and distortion parameters were programmed onto the T265 to account for the differences between land and underwater usage. Following the calibration, the accuracy and precision of the camera were tested using a linear, a circular, and finally a chaotic motion. This includes a review of the localisation and tracking of the camera's visual odometry compared to a ground truth provided by an OptiTrack V120:Trio, to account for scaling, accuracy, and precision. Experiments to determine the usability with fast chaotic motions were also performed and analysed. Finally, conclusions are drawn concerning the applicability of the Intel T265 fisheye stereo camera, the challenges of using this model, the possibilities for low-cost operations, and the main challenges for future work.
43

Oda, K. "Quasi-five point algorithm with non-linear minimization." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XL-5 (June 6, 2014): 473–78. http://dx.doi.org/10.5194/isprsarchives-xl-5-473-2014.

Анотація:
The five-point algorithm is a powerful tool for relative orientation because it requires no initial assumption about camera position. The algorithm determines an essential matrix from five point correspondences between two calibrated cameras, but it yields multiple solutions, so some selection process is required. This paper proposes a quasi-five-point algorithm: a non-linear solver seeded with the solution of the eight-point algorithm. The method aims to calculate the appropriate essential matrix without a selection process among multiple solutions. It is a non-linear approach, but it first finds an appropriate seed before the non-linear calculation: using correspondences of three or more additional points, seed values for the solution are computed. The paper discusses the relationship between traditional parametric relative orientation and the essential matrix, and then introduces the quasi-five-point algorithm.
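For context, the essential matrix that both the five-point and eight-point algorithms estimate satisfies the epipolar constraint x₂ᵀEx₁ = 0 for normalized image coordinates. A minimal numerical check, with an assumed relative pose (the rotation angle, translation, and 3D point below are arbitrary):

```python
import numpy as np

def skew(t):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Assumed relative pose between two calibrated cameras.
theta = np.deg2rad(10.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.0, 0.2, 0.0])

# Essential matrix for the convention Xc2 = R @ X + t.
E = skew(t) @ R

# Verify the epipolar constraint on a synthetic 3D point.
X = np.array([0.5, -0.3, 4.0])
x1 = X / X[2]            # normalized projection in camera 1
Xc2 = R @ X + t          # point expressed in the camera-2 frame
x2 = Xc2 / Xc2[2]        # normalized projection in camera 2
residual = float(x2 @ E @ x1)   # should be ~0
```

Candidate solutions from a five-point solver can be screened the same way: correspondences that violate the constraint rule a candidate out.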
44

Hammond, Joshua E., Cory A. Vernon, Trent J. Okeson, Benjamin J. Barrett, Samuel Arce, Valerie Newell, Joseph Janson, Kevin W. Franke, and John D. Hedengren. "Survey of 8 UAV Set-Covering Algorithms for Terrain Photogrammetry." Remote Sensing 12, no. 14 (July 16, 2020): 2285. http://dx.doi.org/10.3390/rs12142285.

Анотація:
Remote sensing with unmanned aerial vehicles (UAVs) facilitates photogrammetry for environmental and infrastructural monitoring. Models are created with less computational cost by reducing the number of photos required. Optimal camera locations for reducing the number of photos needed for structure-from-motion (SfM) are determined through eight mathematical set-covering algorithms as constrained by solve time. The algorithms examined are: traditional greedy, reverse greedy, carousel greedy (CG), linear programming, particle swarm optimization, simulated annealing, genetic, and ant colony optimization. Coverage and solve time are investigated for these algorithms. CG is the best method for choosing optimal camera locations, as it balances the number of photos required against the time required to calculate camera positions, as shown through an analysis similar to a Pareto front. CG obtains a statistically significant 3.2 fewer cameras per modeled area than the base greedy algorithm while requiring just one additional order of magnitude of solve time. For comparison, linear programming is capable of fewer cameras than base greedy but takes at least three orders of magnitude longer to solve. A grid independence study serves as a sensitivity analysis of the CG algorithm's α (iteration number) and β (percentage to be recalculated) parameters that adjust traditional greedy heuristics, and a case study at the Rock Canyon collection dike in Provo, UT, USA, compares the results of all eight algorithms and the uniqueness (in terms of percentage comparisons based on location/angle metadata and qualitative visual comparison) of each selected set. Though this specific study uses SfM, the principles could apply to other instruments such as multi-spectral cameras or aerial LiDAR.
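The baseline (traditional) greedy heuristic that carousel greedy modifies is simple to state: repeatedly choose the candidate camera position that covers the most still-uncovered targets. A toy sketch with invented coverage sets, not the paper's terrain model:

```python
def greedy_set_cover(universe, candidates):
    """Pick candidate sets until the universe is covered.

    universe: set of target points to cover.
    candidates: dict mapping camera id -> set of points it covers.
    Returns the list of chosen camera ids.
    """
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # Pick the camera covering the most still-uncovered targets.
        best = max(candidates, key=lambda c: len(candidates[c] & uncovered))
        gain = candidates[best] & uncovered
        if not gain:
            raise ValueError("remaining targets cannot be covered")
        chosen.append(best)
        uncovered -= gain
    return chosen

# Toy example: 6 terrain points, 4 candidate camera positions.
points = {1, 2, 3, 4, 5, 6}
cams = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6}, "D": {1, 6}}
print(greedy_set_cover(points, cams))  # → ['A', 'C']
```

Carousel greedy then revisits and re-decides a percentage (β) of the early greedy choices over α extra iterations, which is where its camera-count savings over this baseline come from.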
45

Maguire, Mitchell S., Christopher M. U. Neale, and Wayne E. Woldt. "Improving Accuracy of Unmanned Aerial System Thermal Infrared Remote Sensing for Use in Energy Balance Models in Agriculture Applications." Remote Sensing 13, no. 9 (April 22, 2021): 1635. http://dx.doi.org/10.3390/rs13091635.

Анотація:
Unmanned aerial system (UAS) remote sensing has rapidly expanded in recent years, leading to the development of several multispectral and thermal infrared sensors suitable for UAS integration. Remotely sensed thermal infrared imagery has been used to detect crop water stress and manage irrigation by leveraging the increased thermal signatures of water-stressed plants. Thermal infrared cameras suitable for UAS remote sensing are often uncooled microbolometers. This type of thermal camera is subject to inaccuracies not typically present in cooled thermal cameras. In addition, atmospheric interference may also introduce inaccuracies in measuring surface temperature. In this study, a UAS with an integrated FLIR Duo Pro R (FDPR) thermal camera was used to collect thermal imagery over a maize and soybean field that contained twelve infrared thermometers (IRTs) measuring surface temperature. Surface temperature measurements from the UAS FDPR thermal imagery and field IRTs, corrected for emissivity and atmospheric interference, were compared to determine the accuracy of the FDPR thermal imagery. The comparison of the atmospheric-interference-corrected UAS FDPR and IRT surface temperature measurements yielded an RMSE of 2.24 °C and an R2 of 0.85. Additional approaches for correcting UAS FDPR thermal imagery explored linear, second-order polynomial, and artificial neural network models. These models simplified the process of correcting UAS FDPR thermal imagery. All three models performed well, with the linear model yielding an RMSE of 1.27 °C and an R2 of 0.93. Laboratory experiments were also completed to test the measurement stability of the FDPR thermal camera over time. These experiments found that the thermal camera required a warm-up period to achieve stability in thermal measurements, with increased warm-up duration likely improving the accuracy of thermal measurements.
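The linear correction model and the RMSE/R² accuracy metrics quoted above can be reproduced on synthetic data; the temperature pairs below are simulated with an arbitrary bias, gain error, and noise, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic paired measurements: IRT reference temperatures and raw
# UAS thermal-camera temperatures with gain error, bias, and noise.
t_irt = rng.uniform(15.0, 40.0, size=200)
t_uas = 1.08 * t_irt - 1.5 + rng.normal(0.0, 1.0, size=200)

# Linear correction model fitted against the reference temperatures.
a, b = np.polyfit(t_uas, t_irt, 1)
t_corr = a * t_uas + b

# Accuracy metrics: root-mean-square error and coefficient of determination.
rmse = float(np.sqrt(np.mean((t_corr - t_irt) ** 2)))
ss_res = float(np.sum((t_irt - t_corr) ** 2))
ss_tot = float(np.sum((t_irt - t_irt.mean()) ** 2))
r2 = 1.0 - ss_res / ss_tot
```

The same fit-then-score pattern applies unchanged to the polynomial and neural-network corrections the study compares; only the model being fitted differs.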
46

Lai, Dongming, Linghua Kong, Guofu Lian, Dingrong Yi, and Xingxing Zhu. "Differential confocal measurement method of linear scanning based on dual linear array camera." Journal of Applied Optics 43, no. 2 (2022): 298–303. http://dx.doi.org/10.5768/jao202243.0203004.

47

Baek, Ji-Yeon, Jong-Dae Kim, Chan-Young Park, Yu-Seop Kim, and Ji-Soo Hwang. "Evaluation System of Open Platform Cameras for Bio-Imaging." Engineering Proceedings 6, no. 1 (May 17, 2021): 44. http://dx.doi.org/10.3390/i3s2021dresden-10093.

Анотація:
With the development of smartphones, ultra-small, high-definition cameras based on open platforms have been mass-produced. In this paper, we outline how we built an emulation system to verify the bio-imaging performance of the bulky and expensive high-performance cameras previously used in bio-imaging devices, as well as various smartphone cameras. Four camera types were tested in the emulator, and gel image analysis results were compared for the three cameras with the most linear slope response, which matched the performance evaluation in the emulator.
48

Jin, Dian Chuan, Xiao Li Meng, and Hai Ming Wu. "Research on the Model of Positioning System and its Application." Applied Mechanics and Materials 50-51 (February 2011): 463–67. http://dx.doi.org/10.4028/www.scientific.net/amm.50-51.463.

Анотація:
Digital camera positioning has a wide range of applications in traffic monitoring (e-police). By introducing a spatial Cartesian coordinate system, together with the imaging principle of a digital camera and the physical relations among the image, the camera, and the original scene, a general linear space model is built with the Gaussian lens formula and physical optics imaging as constraint conditions, yielding a theoretical formula for the coordinates of a circle's center in the target plane as projected onto the camera's image plane. From this it follows that a circle in the target plane forms an elliptical track on the image plane. Graphic tools are then used to extract the edge of the track and fit elliptic equations, giving the coordinates of the five centers.
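The final fitting step — recovering an ellipse, and hence its centre, from extracted edge points — can be sketched with an algebraic least-squares conic fit; the edge points below are generated from a known ellipse for illustration:

```python
import numpy as np

def fit_ellipse(x, y):
    """Algebraic least-squares fit of the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1."""
    A = np.column_stack([x * x, x * y, y * y, x, y])
    coeffs, *_ = np.linalg.lstsq(A, np.ones_like(x), rcond=None)
    return coeffs

# Synthetic edge points on an ellipse centred at (2, 1) with semi-axes 3 and 1.5.
t = np.linspace(0.0, 2.0 * np.pi, 100)
x = 2.0 + 3.0 * np.cos(t)
y = 1.0 + 1.5 * np.sin(t)

a, b, c, d, e = fit_ellipse(x, y)
# The conic's centre is where the gradient of the quadratic form vanishes.
cx, cy = np.linalg.solve([[2.0 * a, b], [b, 2.0 * c]], [-d, -e])
```

Real edge pixels are noisy, so in practice the same fit is run on many extracted edge points and the least-squares solution averages the noise out.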
49

Xiong, Jiu Long, Jun Ying Xia, Xian Quan Xu, and Zhen Tian. "A Novel Method of Stereo Camera Calibration Using BP Neural Network." Applied Mechanics and Materials 29-32 (August 2010): 2692–97. http://dx.doi.org/10.4028/www.scientific.net/amm.29-32.2692.

Анотація:
Camera calibration establishes the relationship between 2D coordinates in the image and 3D coordinates in the world. A BP neural network can model non-linear relationships and was therefore used in this paper to calibrate the camera, avoiding explicit treatment of the camera's non-linear factors. The calibration results are compared with those of Tsai's two-stage method; the comparison shows that the calibration method based on a BP neural network improves calibration accuracy.
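A minimal sketch of the idea — training a small back-propagation network to absorb a non-linear lens mapping that a purely linear calibration would miss — might look like this (toy radial-distortion-like mapping and made-up network sizes, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic calibration data: normalized image coordinates mapped to
# world-plane coordinates by a mildly non-linear (distorted) transform.
img = rng.uniform(-1.0, 1.0, size=(256, 2))
r2 = np.sum(img ** 2, axis=1, keepdims=True)
world = img * (1.0 + 0.1 * r2)

# One-hidden-layer BP network trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, size=(16, 2)); b2 = np.zeros(2)
lr = 0.05

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, out0 = forward(img)
loss0 = float(np.mean((out0 - world) ** 2))

for _ in range(3000):
    h, out = forward(img)
    err = (out - world) / len(img)          # gradient of the MSE w.r.t. output
    dh = (err @ W2.T) * (1.0 - h ** 2)      # back-propagate through tanh
    W2 -= lr * (h.T @ err); b2 -= lr * err.sum(axis=0)
    W1 -= lr * (img.T @ dh); b1 -= lr * dh.sum(axis=0)

_, out1 = forward(img)
loss1 = float(np.mean((out1 - world) ** 2))  # should drop well below loss0
```

The network learns image-to-world coordinates end to end, so lens distortion never has to be modelled explicitly — which is the appeal of the neural-network calibration over parametric methods such as Tsai's.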
50

Lai Hanxuan, Zhang Zhengyu, Zhu Long, Peng Zhangguo, and Mao Ji. "Nonlinear Distortion Correction of Camera Based on Linear Characteristic." Laser & Optoelectronics Progress 53, no. 2 (2016): 021502. http://dx.doi.org/10.3788/lop53.021502.

