
Journal articles on the topic "2D Eye Gaze Estimation"


Check out the top 50 scholarly journal articles on the topic "2D Eye Gaze Estimation."

An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the publication as a .pdf file and read the work's abstract online, whenever these details are available in the metadata.

Browse journal articles from a wide variety of disciplines and compile an accurate bibliography.

1

Ince, Ibrahim Furkan, and Jin Woo Kim. "A 2D Eye Gaze Estimation System with Low-Resolution Webcam Images." EURASIP Journal on Advances in Signal Processing 2011, no. 40 (2011): 1–11. https://doi.org/10.1186/1687-6180-2011-40.

Abstract:
In this article, a low-cost system for 2D eye gaze estimation with low-resolution webcam images is presented. Two algorithms are proposed for this purpose, one for the eye-ball detection with stable approximate pupil-center and the other one for the eye movements' direction detection. Eyeball is detected using deformable angular integral search by minimum intensity (DAISMI) algorithm. Deformable template-based 2D gaze estimation (DTBGE) algorithm is employed as a noise filter for deciding the stable movement decisions. While DTBGE employs binary images, DAISMI employs gray-scale images. Ri
2

Erdem, Yusuf Sait, Ibrahim Furkan Ince, Huseyin Kusetogullari, and Md Haidar Sharif. "Computer Game Controlled by Eye Movements." International Journal of Scientific Research in Information Systems and Engineering (IJSRISE) 1, no. 2 (2015): 97–102. https://doi.org/10.5281/zenodo.836155.

Abstract:
Sundry years ago people played video games for fun merely. Nowadays, video games are correlated to education, medicine, and researches. In this paper, we have addressed a computer game which takes input from a video camera by detecting user looking direction as well as eye gestures. Since webcam is easily accessible, we have carefully weighed it as video input device. We have tried to use the eye movements as the human computer interaction (HCI) tool, which would be used instead of a mouse. In general, this furnishes much easier and faster interaction with computer for everyone especially elde
3

Huynh, Sinh, Rajesh Krishna Balan, and JeongGil Ko. "iMon: Appearance-Based Gaze Tracking System on Mobile Devices." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5, no. 4 (2021): 1–26. http://dx.doi.org/10.1145/3494999.

Abstract:
Gaze tracking is a key building block used in many mobile applications including entertainment, personal productivity, accessibility, medical diagnosis, and visual attention monitoring. In this paper, we present iMon, an appearance-based gaze tracking system that is both designed for use on mobile phones and has significantly greater accuracy compared to prior state-of-the-art solutions. iMon achieves this by comprehensively considering the gaze estimation pipeline and then overcoming three different sources of errors. First, instead of assuming that the user's gaze is fixed to a single 2D coo
4

Kim, Jin-Woo. "Webcam-Based 2D Eye Gaze Estimation System by Means of Binary Deformable Eyeball Templates." Journal of Information and Communication Convergence Engineering 8, no. 5 (2010): 575–80. http://dx.doi.org/10.6109/jicce.2010.8.5.575.

5

Pomianek, Mateusz, Marek Piszczek, Marcin Maciejewski, and Piotr Krukowski. "The stability of the MEMS 2D mirror's operating point in terms of eye tracking systems." Photonics Letters of Poland 12, no. 2 (2020): 43. http://dx.doi.org/10.4302/plp.v12i2.1016.

Abstract:
This paper describes research on the stability of the MEMS mirror for use in eye tracking systems. MEMS mirrors are the main element in scanning methods (one of the methods of eye tracking). Due to changes in the mirror pitch, the system can scan the area of the eye with a laser and collect the reflected signal. However, this method works on the assumption that the inclinations are constant in each period; instability here causes errors. The aim of this work is to examine the error level caused by pitch instability at different operating points.
6

Kim, Seunghyun, Seungkeon Lee, and Eui Chul Lee. "Advancements in Gaze Coordinate Prediction Using Deep Learning: A Novel Ensemble Loss Approach." Applied Sciences 14, no. 12 (2024): 5334. http://dx.doi.org/10.3390/app14125334.

Abstract:
Recent advancements in deep learning have enabled gaze estimation from images of the face and eye areas without the need for precise geometric locations of the eyes and face. This approach eliminates the need for complex user-dependent calibration and the issues associated with extracting and tracking geometric positions, making further exploration of gaze position performance enhancements challenging. Motivated by this, our study focuses on an ensemble loss function that can enhance the performance of existing 2D-based deep learning models for gaze coordinate (x, y) prediction. We propose a n
7

Zhu, Bo, Peng Yun Zhang, Jian Nan Chi, and Tian Xia Zhang. "Gaze Estimation Based on Single Camera." Advanced Materials Research 655-657 (January 2013): 1066–76. http://dx.doi.org/10.4028/www.scientific.net/amr.655-657.1066.

Abstract:
A new gaze tracking method used in single camera gaze tracking system is proposed. The method can be divided into human face and eye location, human features detection and gaze parameters extraction, and ELM based gaze point estimation. In face and eye location, a face detection method which combines skin color model with Adaboost method is used for fast human face detection. In eye features and gaze parameters extraction, many image processing methods are used to detect eye features such as iris center, inner eye corner and so on. And then gaze parameter which is the vector from iris center t
8

Govind, Surya. "Webcam Based Eye-Gaze Estimation." International Journal of Engineering Applied Sciences and Technology 4, no. 6 (2019): 144–49. http://dx.doi.org/10.33564/ijeast.2019.v04i06.025.

9

Wang, Jian-Gang, and E. Sung. "Study on Eye Gaze Estimation." IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics) 32, no. 3 (2002): 332–50. http://dx.doi.org/10.1109/tsmcb.2002.999809.

10

Mohan, Susmitha, and Manoj Phirke. "Eye Gaze Estimation in Visible and IR Spectrum for Driver Monitoring System." Signal & Image Processing: An International Journal 11, no. 5 (2020): 1–20. http://dx.doi.org/10.5121/sipij.2020.11501.

Abstract:
Driver monitoring systems have gained a lot of popularity in the automotive sector to ensure safety while driving. Collisions due to driver inattentiveness, driver fatigue, or over-reliance on autonomous driving features are the major reasons for road accidents and fatalities. Driver monitoring systems aim to monitor various aspects of driving and provide appropriate warnings whenever required. Eye gaze estimation is a key element in almost all driver monitoring systems. Gaze estimation aims to find the point of gaze, which is basically "where is the driver looking". This helps in understanding i
11

Nadella, Bhargavi. "Eye Detection and Tracking and Eye Gaze Estimation." Asia-pacific Journal of Convergent Research Interchange 1, no. 2 (2015): 25–42. http://dx.doi.org/10.21742/apjcri.2015.06.04.

12

Kwon, Yong-Moo, Kyeong-Won Jeon, Jeongseok Ki, Qonita M. Shahab, Sangwoo Jo, and Sung-Kyu Kim. "3D Gaze Estimation and Interaction to Stereo Display." International Journal of Virtual Reality 5, no. 3 (2006): 41–45. http://dx.doi.org/10.20870/ijvr.2006.5.3.2697.

Abstract:
There are several researches on 2D gaze tracking techniques to the 2D screen for the Human-Computer Interaction. However, the researches for the gaze-based interaction to the stereo images or 3D contents are not reported. The stereo display techniques are emerging now for the reality service. Moreover, the 3D interaction techniques are needed in the 3D contents service environments. This paper presents 3D gaze estimation technique and its application to gaze-based interaction in the parallax barrier stereo display
13

Cheng, Hong, Yaqi Liu, Wenhao Fu, et al. "Gazing point dependent eye gaze estimation." Pattern Recognition 71 (November 2017): 36–44. http://dx.doi.org/10.1016/j.patcog.2017.04.026.

14

Narcizo, Fabricio Batista, Fernando Eustáquio Dantas dos Santos, and Dan Witzner Hansen. "High-Accuracy Gaze Estimation for Interpolation-Based Eye-Tracking Methods." Vision 5, no. 3 (2021): 41. http://dx.doi.org/10.3390/vision5030041.

Abstract:
This study investigates the influence of the eye-camera location associated with the accuracy and precision of interpolation-based eye-tracking methods. Several factors can negatively influence gaze estimation methods when building a commercial or off-the-shelf eye tracker device, including the eye-camera location in uncalibrated setups. Our experiments show that the eye-camera location combined with the non-coplanarity of the eye plane deforms the eye feature distribution when the eye-camera is far from the eye’s optical axis. This paper proposes geometric transformation methods to reshape th
15

Kar, Anuradha, and Peter Corcoran. "Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations." Sensors 18, no. 9 (2018): 3151. http://dx.doi.org/10.3390/s18093151.

Abstract:
An eye tracker’s accuracy and system behavior play critical roles in determining the reliability and usability of eye gaze data obtained from them. However, in contemporary eye gaze research, there exists a lot of ambiguity in the definitions of gaze estimation accuracy parameters and lack of well-defined methods for evaluating the performance of eye tracking systems. In this paper, a set of fully defined evaluation metrics are therefore developed and presented for complete performance characterization of generic commercial eye trackers, when they operate under varying conditions on desktop or
16

Golard, Andre, and Sachin S. Talathi. "Ultrasound for Gaze Estimation—A Modeling and Empirical Study." Sensors 21, no. 13 (2021): 4502. http://dx.doi.org/10.3390/s21134502.

Abstract:
Most eye tracking methods are light-based. As such, they can suffer from ambient light changes when used outdoors, especially for use cases where eye trackers are embedded in Augmented Reality glasses. It has been recently suggested that ultrasound could provide a low power, fast, light-insensitive alternative to camera-based sensors for eye tracking. Here, we report on our work on modeling ultrasound sensor integration into a glasses form factor AR device to evaluate the feasibility of estimating eye-gaze in various configurations. Next, we designed a benchtop experimental setup to collect em
17

Ansari, Mohd Faizan, Pawel Kasprowski, and Marcin Obetkal. "Gaze Tracking Using an Unmodified Web Camera and Convolutional Neural Network." Applied Sciences 11, no. 19 (2021): 9068. http://dx.doi.org/10.3390/app11199068.

Abstract:
Gaze estimation plays a significant role in understating human behavior and in human–computer interaction. Currently, there are many methods accessible for gaze estimation. However, most approaches need additional hardware for data acquisition which adds an extra cost to gaze tracking. The classic gaze tracking approaches usually require systematic prior knowledge or expertise for practical operations. Moreover, they are fundamentally based on the characteristics of the eye region, utilizing infrared light and iris glint to track the gaze point. It requires high-quality images with particular
18

Cheng, Yihua, Yiwei Bao, and Feng Lu. "PureGaze: Purifying Gaze Feature for Generalizable Gaze Estimation." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 1 (2022): 436–43. http://dx.doi.org/10.1609/aaai.v36i1.19921.

Abstract:
Gaze estimation methods learn eye gaze from facial features. However, among rich information in the facial image, real gaze-relevant features only correspond to subtle changes in eye region, while other gaze-irrelevant features like illumination, personal appearance and even facial expression may affect the learning in an unexpected way. This is a major reason why existing methods show significant performance degradation in cross-domain/dataset evaluation. In this paper, we tackle the cross-domain problem in gaze estimation. Different from common domain adaption methods, we propose a domain ge
19

Rathi, K., and K. Srinivasan. "Lulu Filterized Lin’s correlative Theil-Sen regression-based fully connected deep multilayer perceptive neural network for eye gaze pattern recognition." Yugoslav Journal of Operations Research, no. 00 (2024): 27. http://dx.doi.org/10.2298/yjor240215027r.

Abstract:
Gaze estimation is the process of finding the point of gaze along the visual axis of the eye. Gaze tracking schemes are mainly employed in HCI and the study of visual scanning patterns. Traditional tracking schemes usually need an accurate personal calibration procedure to evaluate the particular eye metrics. In order to improve accurate gaze estimation, Lulu Filterized Lin's Correlative Theil-Sen Regression-based Fully Connected Deep Multilayer Perceptive Neural Network (LFLCTR-FCDMPNN) is designed for accurate gaze pattern identification with lower time consumption. Fully Connected Deep Multilayer Perceptive
20

Putra, I. Ketut Gede Darma, Agung Cahyawan, and Yandi Perdana. "Low-Cost Based Eye Tracking and Eye Gaze Estimation." TELKOMNIKA (Telecommunication Computing Electronics and Control) 9, no. 2 (2011): 377. http://dx.doi.org/10.12928/telkomnika.v9i2.710.

21

Son, Jake, Lei Ai, Ryan Lim, et al. "Evaluating fMRI-Based Estimation of Eye Gaze During Naturalistic Viewing." Cerebral Cortex 30, no. 3 (2019): 1171–84. http://dx.doi.org/10.1093/cercor/bhz157.

Abstract:
The collection of eye gaze information during functional magnetic resonance imaging (fMRI) is important for monitoring variations in attention and task compliance, particularly for naturalistic viewing paradigms (e.g., movies). However, the complexity and setup requirements of current in-scanner eye tracking solutions can preclude many researchers from accessing such information. Predictive eye estimation regression (PEER) is a previously developed support vector regression-based method for retrospectively estimating eye gaze from the fMRI signal in the eye’s orbit using a 1.5-min cal
22

Chinsatit, Warapon, and Takeshi Saitoh. "CNN-Based Pupil Center Detection for Wearable Gaze Estimation System." Applied Computational Intelligence and Soft Computing 2017 (2017): 1–10. http://dx.doi.org/10.1155/2017/8718956.

Abstract:
This paper presents a convolutional neural network- (CNN-) based pupil center detection method for a wearable gaze estimation system using infrared eye images. Potentially, the pupil center position of a user’s eye can be used in various applications, such as human-computer interaction, medical diagnosis, and psychological studies. However, users tend to blink frequently; thus, estimating gaze direction is difficult. The proposed method uses two CNN models. The first CNN model is used to classify the eye state and the second is used to estimate the pupil center position. The classification mod
23

Lian, Dongze, Ziheng Zhang, Weixin Luo, et al. "RGBD Based Gaze Estimation via Multi-Task CNN." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 2488–95. http://dx.doi.org/10.1609/aaai.v33i01.33012488.

Abstract:
This paper tackles RGBD based gaze estimation with Convolutional Neural Networks (CNNs). Specifically, we propose to decompose gaze point estimation into eyeball pose, head pose, and 3D eye position estimation. Compared with RGB image-based gaze tracking, having depth modality helps to facilitate head pose estimation and 3D eye position estimation. The captured depth image, however, usually contains noise and black holes which noticeably hamper gaze tracking. Thus we propose a CNN-based multi-task learning framework to simultaneously refine depth images and predict gaze points. We utilize a ge
24

Kompatsiari, Kyveli, Francesca Ciardo, and Agnieszka Wykowska. "Embodiment matters when establishing eye contact with a robot." Interaction Studies. Social Behaviour and Communication in Biological and Artificial Systems 25, no. 2 (2024): 167–89. https://doi.org/10.1075/is.22060.kom.

Abstract:
Eye contact constitutes a strong social signal in humans and affects various attentional processes. However, eye contact with another human evokes different responses compared with a direct gaze of an image on a screen. The question of interest is whether this holds also for eye contact with a robot. Previous experiments with a physically present iCub humanoid robot showed that eye contact affects participants’ orienting of attention. In the present study, we investigated whether a robot’s eye contact on the screen could show similar effects. Specifically, in two experiments we examined
25

Wang, Yafei, Xueyan Ding, Guoliang Yuan, and Xianping Fu. "Dual-Cameras-Based Driver’s Eye Gaze Tracking System with Non-Linear Gaze Point Refinement." Sensors 22, no. 6 (2022): 2326. http://dx.doi.org/10.3390/s22062326.

Abstract:
The human eye gaze plays a vital role in monitoring people’s attention, and various efforts have been made to improve in-vehicle driver gaze tracking systems. Most of them build the specific gaze estimation model by pre-annotated data training in an offline way. These systems usually tend to have poor generalization performance during the online gaze prediction, which is caused by the estimation bias between the training domain and the deployment domain, making the predicted gaze points shift from their correct location. To solve this problem, a novel driver’s eye gaze tracking method with non
26

Sangeetha, S. K. B. "A Survey on Deep Learning Based Eye Gaze Estimation Methods." Journal of Innovative Image Processing 3, no. 3 (2021): 190–207. http://dx.doi.org/10.36548/jiip.2021.3.003.

Abstract:
In recent years, deep-learning systems have made great progress, particularly in the disciplines of computer vision and pattern recognition. Deep-learning technology can be used to enable inference models to do real-time object detection and recognition. Using deep-learning-based designs, eye tracking systems could determine the position of eyes or pupils, regardless of whether visible-light or near-infrared image sensors were utilized. For growing electronic vehicle systems, such as driver monitoring systems and new touch screens, accurate and successful eye gaze estimates are critical. In de
27

Zhan, Xiangyi, and Changyuan Wang. "Research on Gaze Estimation Method Combined with Head Motion Changes." International Journal of Advanced Network, Monitoring and Controls 7, no. 4 (2022): 89–96. http://dx.doi.org/10.2478/ijanmc-2022-0040.

Abstract:
The line of sight reflects the focus of human attention. Gaze estimation technology has a wide range of application prospects in human-computer interaction, human emotion analysis, commercial advertising, and so on. Gaze estimation needs to be jointly determined by eye movement and head movement because the human gaze often moves together with the head. In this paper, a gaze estimation system with head movement is implemented by a monocular camera, using eye movement features and head movement posture changes to estimate the gaze point. A camera is used as the information acquisition dev
28

Shehu, Ibrahim Shehi, Yafei Wang, Athuman Mohamed Athuman, and Xianping Fu. "Remote Eye Gaze Tracking Research: A Comparative Evaluation on Past and Recent Progress." Electronics 10, no. 24 (2021): 3165. http://dx.doi.org/10.3390/electronics10243165.

Abstract:
Several decades of eye related research has shown how valuable eye gaze data are for applications that are essential to human daily life. Eye gaze data in a broad sense has been used in research and systems for eye movements, eye tracking, and eye gaze tracking. Since early 2000, eye gaze tracking systems have emerged as interactive gaze-based systems that could be remotely deployed and operated, known as remote eye gaze tracking (REGT) systems. The drop point of visual attention known as point of gaze (PoG), and the direction of visual attention known as line of sight (LoS), are important tas
29

Mihalache, Diana, Peter Sokol-Hessner, Huanghao Feng, et al. "Gaze perception from head and pupil rotations in 2D and 3D: Typical development and the impact of autism spectrum disorder." PLOS ONE 17, no. 10 (2022): e0275281. http://dx.doi.org/10.1371/journal.pone.0275281.

Abstract:
The study of gaze perception has largely focused on a single cue (the eyes) in two-dimensional settings. While this literature suggests that 2D gaze perception is shaped by atypical development, as in Autism Spectrum Disorder (ASD), gaze perception is in reality contextually-sensitive, perceived as an emergent feature conveyed by the rotation of the pupils and head. We examined gaze perception in this integrative context, across development, among children and adolescents developing typically or with ASD with both 2D and 3D stimuli. We found that both groups utilized head and pupil rotations t
30

Cheng, Yihua, Shiyao Huang, Fei Wang, Chen Qian, and Feng Lu. "A Coarse-to-Fine Adaptive Network for Appearance-Based Gaze Estimation." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 07 (2020): 10623–30. http://dx.doi.org/10.1609/aaai.v34i07.6636.

Abstract:
Human gaze is essential for various appealing applications. Aiming at more accurate gaze estimation, a series of recent works propose to utilize face and eye images simultaneously. Nevertheless, face and eye images only serve as independent or parallel feature sources in those works, the intrinsic correlation between their features is overlooked. In this paper we make the following contributions: 1) We propose a coarse-to-fine strategy which estimates a basic gaze direction from face image and refines it with corresponding residual predicted from eye images. 2) Guided by the proposed strategy,
31

Cheng, Yihua, Xucong Zhang, Feng Lu, and Yoichi Sato. "Gaze Estimation by Exploring Two-Eye Asymmetry." IEEE Transactions on Image Processing 29 (2020): 5259–72. http://dx.doi.org/10.1109/tip.2020.2982828.

32

Wöhle, Lukas, and Marion Gebhard. "Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface." Sensors 21, no. 5 (2021): 1798. http://dx.doi.org/10.3390/s21051798.

Abstract:
This paper presents a lightweight, infrastructureless head-worn interface for robust and real-time robot control in Cartesian space using head- and eye-gaze. The interface comes at a total weight of just 162 g. It combines a state-of-the-art visual simultaneous localization and mapping algorithm (ORB-SLAM 2) for RGB-D cameras with a Magnetic Angular rate Gravity (MARG)-sensor filter. The data fusion process is designed to dynamically switch between magnetic, inertial and visual heading sources to enable robust orientation estimation under various disturbances, e.g., magnetic disturbances or de
33

Brousseau, Braiden, Jonathan Rose, and Moshe Eizenman. "Hybrid Eye-Tracking on a Smartphone with CNN Feature Extraction and an Infrared 3D Model." Sensors 20, no. 2 (2020): 543. http://dx.doi.org/10.3390/s20020543.

Abstract:
This paper describes a low-cost, robust, and accurate remote eye-tracking system that uses an industrial prototype smartphone with integrated infrared illumination and camera. Numerous studies have demonstrated the beneficial use of eye-tracking in domains such as neurological and neuropsychiatric testing, advertising evaluation, pilot training, and automotive safety. Remote eye-tracking on a smartphone could enable the significant growth in the deployment of applications in these domains. Our system uses a 3D gaze-estimation model that enables accurate point-of-gaze (PoG) estimation with free
34

Li, Changli, Enrui Tong, Kao Zhang, Nenglun Cheng, Zhongyuan Lai, and Zhigeng Pan. "Gaze Estimation Based on a Multi-Stream Adaptive Feature Fusion Network." Applied Sciences 15, no. 7 (2025): 3684. https://doi.org/10.3390/app15073684.

Abstract:
Recently, with the widespread application of deep learning networks, appearance-based gaze estimation has made breakthrough progress. However, most methods focus on feature extraction from the facial region while neglecting the critical role of the eye region in gaze estimation, leading to insufficient eye detail representation. To address this issue, this paper proposes a multi-stream multi-input network architecture (MSMI-Net) based on appearance. The model consists of two independent streams designed to extract high-dimensional eye features and low-dimensional features, integrating both eye
35

Aygun, Ayca, Thuan Nguyen, Zachary Haga, Shuchin Aeron, and Matthias Scheutz. "Investigating Methods for Cognitive Workload Estimation for Assistive Robots." Sensors 22, no. 18 (2022): 6834. http://dx.doi.org/10.3390/s22186834.

Abstract:
Robots interacting with humans in assistive contexts have to be sensitive to human cognitive states to be able to provide help when it is needed and not overburden the human when the human is busy. Yet, it is currently still unclear which sensing modality might allow robots to derive the best evidence of human workload. In this work, we analyzed and modeled data from a multi-modal simulated driving study specifically designed to evaluate different levels of cognitive workload induced by various secondary tasks such as dialogue interactions and braking events in addition to the primary driving
36

Wang, Yafei, Guoliang Yuan, Zetian Mi, et al. "Continuous Driver’s Gaze Zone Estimation Using RGB-D Camera." Sensors 19, no. 6 (2019): 1287. http://dx.doi.org/10.3390/s19061287.

Abstract:
The driver gaze zone is an indicator of a driver’s attention and plays an important role in the driver’s activity monitoring. Due to the bad initialization of point-cloud transformation, gaze zone systems using RGB-D cameras and ICP (Iterative Closet Points) algorithm do not work well under long-time head motion. In this work, a solution for a continuous driver gaze zone estimation system in real-world driving situations is proposed, combining multi-zone ICP-based head pose tracking and appearance-based gaze estimation. To initiate and update the coarse transformation of ICP, a particle filter
37

Serchi, V., A. Peruzzi, A. Cereatti, and U. Della Croce. "Use of a Remote Eye-Tracker for the Analysis of Gaze during Treadmill Walking and Visual Stimuli Exposition." BioMed Research International 2016 (2016): 1–6. http://dx.doi.org/10.1155/2016/2696723.

Abstract:
The knowledge of the visual strategies adopted while walking in cognitively engaging environments is extremely valuable. Analyzing gaze when a treadmill and a virtual reality environment are used as motor rehabilitation tools is therefore critical. Being completely unobtrusive, remote eye-trackers are the most appropriate way to measure the point of gaze. Still, the point of gaze measurements are affected by experimental conditions such as head range of motion and visual stimuli. This study assesses the usability limits and measurement reliability of a remote eye-tracker during treadmill walki
38

Li, Changli, Fangfang Li, Kao Zhang, Nenglun Chen, and Zhigeng Pan. "Gaze Estimation Network Based on Multi-Head Attention, Fusion, and Interaction." Sensors 25, no. 6 (2025): 1893. https://doi.org/10.3390/s25061893.

Abstract:
Gaze is an externally observable indicator of human visual attention, and thus, recording the gaze position can help to solve many problems. Existing gaze estimation models typically utilize separate neural network branches to process data streams from both eyes and the face, failing to fully exploit their feature correlations. This study presents a gaze estimation network that integrates multi-head attention mechanisms, fusion, and interaction strategies to fuse facial features with eye features, as well as features from both eyes, separately. Specifically, multi-head attention and channel at
39

Khedkar, Shilpa. "Eye Gaze Controlled Virtual Keyboard." International Journal of Scientific Research in Engineering and Management 9, no. 6 (2025): 1–9. https://doi.org/10.55041/ijsrem49646.

Abstract:
Eye gaze technology has emerged as a transformative tool within Human-Computer Interaction (HCI), offering innovative solutions for individuals with physical disabilities. A review of the development and application of an eye gaze-controlled virtual keyboard demonstrates its potential to facilitate typing without the use of hands or fingers, empowering users with limited motor abilities. Key challenges in gaze detection, eye-blink differentiation, and interaction reliability are addressed, contributing to the fields of assistive technology and inclusive design. Experimental results v
40

Mokatren, Moayad, Tsvi Kuflik, and Ilan Shimshoni. "3D Gaze Estimation Using RGB-IR Cameras." Sensors 23, no. 1 (2022): 381. http://dx.doi.org/10.3390/s23010381.

Abstract:
In this paper, we present a framework for 3D gaze estimation intended to identify the user’s focus of attention in a corneal imaging system. The framework uses a headset that consists of three cameras, a scene camera and two eye cameras: an IR camera and an RGB camera. The IR camera is used to continuously and reliably track the pupil and the RGB camera is used to acquire corneal images of the same eye. Deep learning algorithms are trained to detect the pupil in IR and RGB images and to compute a per user 3D model of the eye in real time. Once the 3D model is built, the 3D gaze direction is co
Style APA, Harvard, Vancouver, ISO itp.
41

Wan, Zijing, Xiangjun Wang, Lei Yin, and Kai Zhou. "A Method of Free-Space Point-of-Regard Estimation Based on 3D Eye Model and Stereo Vision." Applied Sciences 8, no. 10 (2018): 1769. http://dx.doi.org/10.3390/app8101769.

Full text source
Abstract:
This paper proposes a 3D point-of-regard estimation method based on a 3D eye model and a corresponding head-mounted gaze tracking device. Firstly, a head-mounted gaze tracking system is presented. The gaze tracking device uses two pairs of stereo cameras to capture the left and right eye images, respectively, and a pair of scene cameras to capture the scene images. Secondly, a 3D eye model and the calibration process are established. Common eye features are used to estimate the eye model parameters. Thirdly, a 3D point-of-regard estimation algorithm is proposed. Three main parts of this method […]
Styles: APA, Harvard, Vancouver, ISO etc.
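A free-space point of regard like the one described above is typically triangulated from the two per-eye gaze rays. A minimal NumPy sketch (variable names are assumptions, not the paper's notation) takes each eyeball center and gaze direction and returns the midpoint of the rays' closest approach, since noisy rays rarely intersect exactly:

```python
import numpy as np

def point_of_regard(o_l, d_l, o_r, d_r):
    """Midpoint of the closest approach between the left and right gaze rays.

    o_l, o_r : 3D ray origins (eyeball centers)
    d_l, d_r : gaze directions (need not be unit length)
    """
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b          # approaches 0 when the rays are parallel
    s = (b * e - c * d) / denom    # parameter along the left ray
    t = (a * e - b * d) / denom    # parameter along the right ray
    p_l = o_l + s * d_l            # closest point on the left ray
    p_r = o_r + t * d_r            # closest point on the right ray
    return (p_l + p_r) / 2
```

A production system would additionally reject near-parallel ray pairs (tiny `denom`), which correspond to gaze at effectively infinite depth.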
42

Shah, Sayyed Mudassar, Zhaoyun Sun, Khalid Zaman, Altaf Hussain, Muhammad Shoaib, and Lili Pei. "A Driver Gaze Estimation Method Based on Deep Learning." Sensors 22, no. 10 (2022): 3959. http://dx.doi.org/10.3390/s22103959.

Full text source
Abstract:
Car crashes are among the top ten leading causes of death; they can mainly be attributed to distracted drivers. An advanced driver-assistance technique (ADAT) is a procedure that can notify the driver about a dangerous scenario, reduce traffic crashes, and improve road safety. The main contribution of this work involved utilizing the driver’s attention to build an efficient ADAT. To obtain this “attention value”, the gaze tracking method is proposed. The gaze direction of the driver is critical toward understanding/discerning fatal distractions, pertaining to when it is obligatory to notify […]
Styles: APA, Harvard, Vancouver, ISO etc.
43

Kum, Junyeong, Sunghun Jung, and Myungho Lee. "The Effect of Eye Contact in Multi-Party Conversations with Virtual Humans and Mitigating the Mona Lisa Effect." Electronics 13, no. 2 (2024): 430. http://dx.doi.org/10.3390/electronics13020430.

Full text source
Abstract:
The demand for kiosk systems with embodied conversational agents has increased with the development of artificial intelligence. There have been attempts to utilize non-verbal cues, particularly virtual human (VH) eye contact, to enable human-like interaction. Eye contact with VHs can affect satisfaction with the system and the perception of VHs. However, when rendered in 2D kiosks, the gaze direction of a VH can be incorrectly perceived due to a lack of stereo cues. A user study was conducted to examine the effects of the gaze behavior of VHs in multi-party conversations in a 2D display setting […]
Styles: APA, Harvard, Vancouver, ISO etc.
44

Shen, Kuanxin, Yingshun Li, Zhannan Guo, Jintao Gao, and Yingjian Wu. "Model-Based 3D Gaze Estimation Using a TOF Camera." Sensors 24, no. 4 (2024): 1070. http://dx.doi.org/10.3390/s24041070.

Full text source
Abstract:
Among the numerous gaze-estimation methods currently available, appearance-based methods predominantly use RGB images as input and employ convolutional neural networks (CNNs) to detect facial images to regressively obtain gaze angles or gaze points. Model-based methods require high-resolution images to obtain a clear eyeball geometric model. These methods face significant challenges in outdoor environments and practical application scenarios. This paper proposes a model-based gaze-estimation algorithm using a low-resolution 3D TOF camera. This study uses infrared images instead of RGB images […]
Styles: APA, Harvard, Vancouver, ISO etc.
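In a model-based pipeline like the one above, once the eyeball center and pupil center are located in 3D, the gaze direction reduces to the angles of the vector between them. A sketch under an assumed camera coordinate convention (x right, y down, z toward the scene); the function name and convention are illustrative, not taken from the paper:

```python
import numpy as np

def gaze_angles(eyeball_center, pupil_center):
    """Yaw/pitch (radians) of the optical axis running from the eyeball
    center through the pupil center, in camera coordinates."""
    v = np.asarray(pupil_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    v = v / np.linalg.norm(v)
    yaw = np.arctan2(v[0], v[2])   # left/right rotation about the vertical axis
    pitch = np.arcsin(-v[1])       # up/down; y points down, hence the negation
    return yaw, pitch
```

Note this yields the optical axis only; the visual axis differs from it by a small per-user kappa angle, which is what the calibration step estimates.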
45

Wojciechowski, A., and K. Fornalczyk. "Single web camera robust interactive eye-gaze tracking method." Bulletin of the Polish Academy of Sciences Technical Sciences 63, no. 4 (2015): 879–86. http://dx.doi.org/10.1515/bpasts-2015-0100.

Full text source
Abstract:
Eye-gaze tracking is an aspect of human-computer interaction still growing in popularity. Tracking the human gaze point can help control user interfaces and may help evaluate graphical user interfaces. At the same time, professional eye-trackers are very expensive and thus unavailable to most user-interface researchers and small companies. The paper presents a very effective, low-cost, computer-vision-based, interactive eye-gaze tracking method. In contrast to other authors' results, the method achieves very high precision (about 1.5 deg horizontally and 2.5 deg vertically) at 20 fps […]
Styles: APA, Harvard, Vancouver, ISO etc.
46

Li, Ting-Hao, Hiromasa Suzuki, and Yutaka Ohtake. "Visualization of user’s attention on objects in 3D environment using only eye tracking glasses." Journal of Computational Design and Engineering 7, no. 2 (2020): 228–37. http://dx.doi.org/10.1093/jcde/qwaa019.

Full text source
Abstract:
Eye tracking technology is widely applied to detect the user’s attention in 2D fields, such as web page design, package design, and shooting games. However, because our surroundings primarily consist of 3D objects, applications will expand if there is an effective method to obtain and display the user’s 3D gaze fixation. In this research, a methodology is proposed to demonstrate the user’s 3D gaze fixation on a digital model of a scene using only a pair of eye tracking glasses. The eye tracking glasses record the user’s gaze data and scene video. Thus, using image-based 3D reconstruction […]
Styles: APA, Harvard, Vancouver, ISO etc.
47

Ansari, Mohd Faizan, Pawel Kasprowski, and Peter Peer. "Person-Specific Gaze Estimation from Low-Quality Webcam Images." Sensors 23, no. 8 (2023): 4138. http://dx.doi.org/10.3390/s23084138.

Full text source
Abstract:
Gaze estimation is an established research problem in computer vision. It has various applications in real life, from human–computer interaction to health care and virtual reality, making it more viable for the research community. Due to the significant success of deep learning techniques in other computer vision tasks—for example, image classification, object detection, object segmentation, and object tracking—deep learning-based gaze estimation has also received more attention in recent years. This paper uses a convolutional neural network (CNN) for person-specific gaze estimation. The person-specific […]
Styles: APA, Harvard, Vancouver, ISO etc.
48

Watanabe, Junji, Hideyuki Ando, Taro Maeda, and Susumu Tachi. "Gaze-Contingent Visual Presentation Based on Remote Saccade Detection." Presence: Teleoperators and Virtual Environments 16, no. 2 (2007): 224–34. http://dx.doi.org/10.1162/pres.16.2.224.

Full text source
Abstract:
Pursuing new display techniques based on insights into human visual perception can reveal new possibilities for visual information devices. Here, we propose a novel information presentation technique that exploits the perceptual features of rapid eye movements called saccades, using a fast remote eye-measuring method. When light sources are fixed on a vertical line and the flashing pattern is changed quickly during a horizontal saccade, 2D images can be perceived due to spatio-temporal integration in the human visual system. We use this phenomenon to present 2D images with only one-dimensional […]
Styles: APA, Harvard, Vancouver, ISO etp.
49

Rashid, Maria, Wardah Mehmood, and Aliya Ashraf. "Techniques Used for Eye Gaze Interfaces and Survey." International Journal of Advances in Scientific Research 1, no. 6 (2015): 276. http://dx.doi.org/10.7439/ijasr.v1i6.2125.

Full text source
Abstract:
Eye movement tracking is a method nowadays used for checking usability problems in the context of human-computer interaction (HCI). First, we present eye tracking technology and its key elements. We evaluate user behavior when using an eye-gaze interface. Different techniques are used, i.e., electro-oculography, infrared oculography, video oculography, image processing techniques, scrolling techniques, different models, and probable approaches, i.e., shape-based approaches, appearance-based methods, 2D and 3D model-based approaches, and different software algorithms for […]
Styles: APA, Harvard, Vancouver, ISO etc.
50

Huang, Longzhao, Yujie Li, Xu Wang, Haoyu Wang, Ahmed Bouridane, and Ahmad Chaddad. "Gaze Estimation Approach Using Deep Differential Residual Network." Sensors 22, no. 14 (2022): 5462. http://dx.doi.org/10.3390/s22145462.

Full text source
Abstract:
Gaze estimation, a method to determine where a person is looking given the person’s full face, is a valuable clue for understanding human intention. As in other domains of computer vision, deep learning (DL) methods have gained recognition in the gaze estimation domain. However, there are still gaze calibration problems in the gaze estimation domain, preventing existing methods from further improving their performance. An effective solution is to directly predict the difference information of two human eyes, as in the differential network (Diff-Nn). However, this solution […]
Styles: APA, Harvard, Vancouver, ISO etc.
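The differential idea summarized above (predict the gaze offset between a calibration eye image and a query eye image, then add it to the known calibration gaze) can be sketched as follows. Here `diff_net` is an assumed stand-in for the trained differential network, and all names are illustrative rather than the paper's API:

```python
import numpy as np

def differential_gaze(diff_net, calib_eye, calib_gaze, query_eye):
    """Differential inference: known calibration gaze plus predicted offset.

    diff_net : callable (img_a, img_b) -> 2D gaze difference (assumed trained)
    """
    delta = np.asarray(diff_net(calib_eye, query_eye), dtype=float)
    return np.asarray(calib_gaze, dtype=float) + delta

def differential_gaze_multi(diff_net, calib_pairs, query_eye):
    """Average the estimate over several (image, gaze) calibration samples,
    which reduces the noise of any single difference prediction."""
    estimates = [differential_gaze(diff_net, img, gaze, query_eye)
                 for img, gaze in calib_pairs]
    return np.mean(estimates, axis=0)
```

The appeal of the scheme is that per-user bias cancels out of the difference, so only a handful of calibration samples are needed at deployment time.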