Academic literature on the topic 'Interface tracking'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Interface tracking.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Interface tracking"

1

Goldberg, Joseph H. "Eye Movement-Based Interface Evaluation: What can and Cannot be Assessed?" Proceedings of the Human Factors and Ergonomics Society Annual Meeting 44, no. 37 (July 2000): 625–28. http://dx.doi.org/10.1177/154193120004403721.

Abstract:
Interface evaluation by eye tracking-derived data is discussed in this review and synthesis paper. While analysis of eye movements during interface use is becoming more popular, there is little basis for justifying eye tracking methods. A review of traditional interface assessment methods and criteria is provided to establish areas where eye tracking may potentially impact interface evaluations. Studies that have used eye tracking-derived measures for performance assessment as interfaces are manipulated are then reviewed. A synthesis is then provided, suggesting that eye tracking-based analysis could have a positive impact in evaluations of consistency, resources, visual clarity, and flexibility, but should have difficulty in ascertaining interface compatibility and locus of control.
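Many of the eye tracking-derived measures surveyed above, such as fixation counts and fixation durations, can be computed from raw gaze samples with a simple dispersion-threshold fixation detector. The sketch below is a generic illustration of that kind of metric extraction, not code from the cited paper; the sample format (timestamped x/y gaze points in pixels) and the thresholds are assumptions.

```python
# Illustrative dispersion-threshold (I-DT style) fixation detector; not from the cited paper.
# Gaze samples are assumed to be (t_seconds, x_px, y_px) tuples; thresholds are arbitrary choices.

def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """Return (start_t, end_t, centroid_x, centroid_y) tuples for detected fixations."""
    fixations, window = [], []

    def emit(win):
        # Keep the window as a fixation only if it lasted long enough.
        if len(win) > 1 and win[-1][0] - win[0][0] >= min_duration:
            fixations.append((win[0][0], win[-1][0],
                              sum(s[1] for s in win) / len(win),
                              sum(s[2] for s in win) / len(win)))

    for sample in samples:
        window.append(sample)
        xs, ys = [s[1] for s in window], [s[2] for s in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            emit(window[:-1])          # dispersion limit exceeded; flush the stable part
            window = [sample]
    emit(window)
    return fixations

# Example metrics over a synthetic 1-second gaze trace sampled at 50 Hz.
fixes = detect_fixations([(i * 0.02, 100.0 + (i % 3), 200.0) for i in range(50)])
fixation_count = len(fixes)
mean_duration = sum(e - s for s, e, _, _ in fixes) / max(fixation_count, 1)
```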
2

Miyata, Kohei, Yuhki Kitazono, Shi Yuan Yang, and Seiichi Serikawa. "Development of Stationary User Interface Using Head-Tracking." Applied Mechanics and Materials 103 (September 2011): 711–16. http://dx.doi.org/10.4028/www.scientific.net/amm.103.711.

Abstract:
Recently, various user interfaces have been developed. However, operating a user interface is very difficult for physically handicapped persons who cannot move their hands. The stationary user interface we propose uses head tracking via a camera and a display. It is portable, can operate household appliances, and is operated intuitively through head tracking.
3

Wojciechowski, A. "Hand’s poses recognition as a mean of communication within natural user interfaces." Bulletin of the Polish Academy of Sciences: Technical Sciences 60, no. 2 (October 1, 2012): 331–36. http://dx.doi.org/10.2478/v10175-012-0044-3.

Abstract:
Natural user interface (NUI) is a successor of the command line interfaces (CLI) and graphical user interfaces (GUI) so well known to computer users. The new natural approach is based on extensive tracking of human behaviour, where hand tracking and gesture recognition seem to play the main roles in communication. This paper reviews common approaches to hand feature tracking and proposes a very effective contour-based hand pose recognition method that can be used directly for a hand-based natural user interface. Its possible applications range from interaction with medical systems, through games, to communication support for impaired people.
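A contour-based pose recogniser of the general kind described above can be prototyped in a few lines with OpenCV: segment skin by colour, take the largest contour as the hand, and count deep convexity defects as a proxy for extended fingers. The sketch below is only an illustration of the approach, not the cited paper's method; the HSV skin thresholds and the single-hand, plain-background assumption are choices made for this example (OpenCV 4.x API).

```python
# Illustrative contour-based hand pose sketch (not the cited paper's method).
# Assumes a single hand on a plain background; HSV skin thresholds are rough guesses.
import cv2
import numpy as np

def count_extended_fingers(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))       # crude skin segmentation
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)                   # largest contour = hand
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Deep convexity defects roughly correspond to valleys between extended fingers.
    deep = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] / 256.0 > 20)
    return min(deep + 1, 5)
```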
4

Rele, Rachana S., and Andrew T. Duchowski. "Using Eye Tracking to Evaluate Alternative Search Results Interfaces." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 49, no. 15 (September 2005): 1459–63. http://dx.doi.org/10.1177/154193120504901508.

Abstract:
Surveys have shown that 75% of users get frustrated with search engines and only 21% find relevant information. The inability to find relevant results can be partially attributed to cluttered results pages and to failures in constructing Boolean queries. This research used sixteen subjects to evaluate two types of search results interfaces on four tasks while measuring performance and studying ocular behavior with a Tobii 1750 eye tracker. The two interfaces were a list interface, commonly seen on many search engines, and a tabular interface presenting information in discrete categories or elements of the result's abstract. Quantitative comparisons of the two interfaces are made on performance metrics such as time and errors, and on process metrics such as fixation durations, number of fixations, and eye movement transitions from one element or category of the abstract to another. Subjective data was collected through post-task and post-test questionnaires. The results did not show any significant difference in performance between the two interfaces; however, eye movement analysis provides some insight into the importance of search result abstract elements such as the title, summary, and URL while searching.
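Transition counts between areas of interest (AOIs) such as the title, summary, and URL, of the kind reported above, can be tabulated directly from a fixation sequence once each fixation is assigned to an AOI. The sketch below is a generic illustration, not the study's analysis code; the AOI names and rectangles are assumptions for a hypothetical result layout.

```python
# Illustrative AOI transition-matrix computation (not the study's analysis code).
# AOI rectangles (x, y, w, h) for a hypothetical result abstract layout are assumptions.
AOIS = {
    "title":   (50, 100, 500, 30),
    "summary": (50, 140, 500, 60),
    "url":     (50, 210, 500, 20),
}

def aoi_of(x, y):
    for name, (ax, ay, aw, ah) in AOIS.items():
        if ax <= x <= ax + aw and ay <= y <= ay + ah:
            return name
    return None

def transition_matrix(fixations):
    """fixations: iterable of (x, y) fixation centroids, in temporal order."""
    counts = {src: {dst: 0 for dst in AOIS} for src in AOIS}
    prev = None
    for x, y in fixations:
        cur = aoi_of(x, y)
        if prev is not None and cur is not None and cur != prev:
            counts[prev][cur] += 1
        if cur is not None:
            prev = cur
    return counts

# Example: three fixations moving title -> summary -> url.
print(transition_matrix([(100, 110), (120, 160), (140, 215)]))
```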
5

Lu, Haitian, Ning Zhao, and Donghong Wang. "A Front Tracking Method for the Simulation of Compressible Multimedium Flows." Communications in Computational Physics 19, no. 1 (January 2016): 124–42. http://dx.doi.org/10.4208/cicp.260314.310315a.

Abstract:
A front tracking method combined with the real ghost fluid method (RGFM) is proposed for simulations of fluid interfaces in two-dimensional compressible flows. In this paper the Riemann problem is constructed along the normal direction of the interface and the corresponding Riemann solutions are used to track fluid interfaces. The interface boundary conditions are defined by the RGFM, and the fluid interfaces are explicitly tracked by several connected marker points. The Riemann solutions are also used directly to update the flow states on both sides of the interface in the RGFM. In order to validate the accuracy and capability of the new method, extensive numerical tests, including bubble advection, the Sod tube, shock-bubble interaction, the Richtmyer-Meshkov instability and a gas-water interface, are simulated using the Euler equations. The computational results are compared with earlier computational studies and show good agreement, including for the compressible gas-water system with large density differences.
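In explicit front tracking of this general kind, the interface is represented by connected marker points that are advected with a velocity evaluated at (or interpolated to) their positions. The fragment below is a minimal, generic sketch of that idea (forward-Euler advection of markers in a prescribed 2-D velocity field), not an implementation of the cited RGFM-based scheme; the vortex test field and time step are arbitrary choices.

```python
# Minimal generic sketch of marker-point front tracking (not the cited RGFM scheme).
# The interface is a polyline of marker points advected by a prescribed velocity field.
import math

def velocity(x, y):
    # Hypothetical single-vortex test field, used only for illustration.
    return (-math.sin(math.pi * x) * math.cos(math.pi * y),
             math.cos(math.pi * x) * math.sin(math.pi * y))

def advect_front(markers, dt, steps):
    """markers: ordered list of (x, y) points along the interface."""
    for _ in range(steps):
        moved = []
        for x, y in markers:
            u, v = velocity(x, y)
            moved.append((x + dt * u, y + dt * v))   # forward-Euler marker update
        markers = moved
    return markers

# Initial front: a short horizontal segment discretised by 11 markers.
front = [(0.25 + 0.05 * i, 0.5) for i in range(11)]
front = advect_front(front, dt=0.01, steps=100)
```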
6

Nielsen, Jakob. "Usability metrics: tracking interface improvements." IEEE Software 13, no. 6 (November 1996): 1–2. http://dx.doi.org/10.1109/ms.1996.8740869.

7

Popovic, Jelena, and Olof Runborg. "Adaptive fast interface tracking methods." Journal of Computational Physics 337 (May 2017): 42–61. http://dx.doi.org/10.1016/j.jcp.2017.02.017.

8

Glimm, James, John W. Grove, and Yongmin Zhang. "Interface Tracking for Axisymmetric Flows." SIAM Journal on Scientific Computing 24, no. 1 (January 2002): 208–36. http://dx.doi.org/10.1137/s1064827500366690.

9

SOU, Akira, Kosuke HAYASHI, and Akio TOMIYAMA. "Interface Tracking Method based on Simplified Interface Reconstruction Method." Proceedings of The Computational Mechanics Conference 2003.16 (2003): 111–12. http://dx.doi.org/10.1299/jsmecmd.2003.16.111.

10

Tezduyar, Tayfun E. "Interface-tracking and interface-capturing techniques for finite element computation of moving boundaries and interfaces." Computer Methods in Applied Mechanics and Engineering 195, no. 23-24 (April 2006): 2983–3000. http://dx.doi.org/10.1016/j.cma.2004.09.018.


Dissertations / Theses on the topic "Interface tracking"

1

Sims, Paul. "Interface tracking using Lagrangian-Eulerian methods." Thesis, University of Reading, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.298640.

2

Andersson, Anders Tobias. "Facial Feature Tracking and Head Pose Tracking as Input for Platform Games." Thesis, Blekinge Tekniska Högskola, Institutionen för kreativa teknologier, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-12924.

Abstract:
Modern facial feature tracking techniques can automatically extract and accurately track multiple facial landmark points from faces in video streams in real time. Facial landmark points are defined as points distributed on a face in relation to certain facial features, such as eye corners and the face contour. This opens up the possibility of using facial feature movements as a hands-free human-computer interaction technique. These alternatives to traditional input devices can give a more interesting gaming experience. They also allow more intuitive controls and can possibly give greater access to computers and video game consoles for certain disabled users with difficulties using their arms and/or fingers. This research explores using facial feature tracking to control a character's movements in a platform game. The aim is to interpret facial feature tracker data and convert facial feature movements to game input controls. The facial feature input is compared with other hands-free input methods, as well as traditional keyboard input. The other hands-free input methods explored are head pose estimation and a hybrid between the facial feature and head pose estimation input. Head pose estimation is a method where the application extracts the angles at which the user's head is tilted. The hybrid input method utilises both head pose estimation and facial feature tracking. The input methods are evaluated by user performance and subjective ratings from voluntary participants playing a platform game using the input methods. Performance is measured by the time, the number of jumps and the number of turns it takes for a user to complete a platform level. Jumping is an essential part of platform games: to reach the goal, the player has to jump between platforms, and an inefficient input method might make this a difficult task. Turning is the action of changing the direction of the player character from facing left to facing right or vice versa; this measurement is intended to pick up difficulties in controlling the character's movements. If the player makes many turns, it is an indication that it is difficult to use the input method to control the character's movements efficiently. The results suggest that keyboard input is the most effective input method, while it is also the least entertaining of the input methods. There is no significant difference in performance between facial feature input and head pose input. The hybrid input version has the best results overall of the alternative input methods: it achieved significantly better performance than the head pose input and facial feature input methods, while its results were not statistically significantly different from the keyboard input method. Keywords: Computer Vision, Facial Feature Tracking, Head Pose Tracking, Game Control
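As a rough illustration of how head pose estimates might be turned into the kind of game commands discussed above, the sketch below maps yaw and pitch angles to left/right movement and a jump action using dead-zone thresholds. It is a generic example, not the thesis's implementation; the angle thresholds, units, and the source of the pose estimates are assumptions.

```python
# Generic sketch: map head-pose angles to platform-game commands (not the thesis's code).
# Yaw/pitch are assumed to be in degrees from any head-pose estimator; thresholds are arbitrary.
from dataclasses import dataclass

@dataclass
class Command:
    move_left: bool
    move_right: bool
    jump: bool

def pose_to_command(yaw_deg, pitch_deg, yaw_dead_zone=10.0, jump_pitch=15.0):
    """Turn the head left/right past the dead zone to move; tilt it up to jump."""
    return Command(
        move_left=yaw_deg < -yaw_dead_zone,
        move_right=yaw_deg > yaw_dead_zone,
        jump=pitch_deg > jump_pitch,
    )

# Example: head turned 20 degrees right and tilted 18 degrees up -> move right and jump.
print(pose_to_command(20.0, 18.0))
```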
3

Khairat, Saif. "Clinical content tracking system an efficient request tracking via a graphical user interface /." Diss., Columbia, Mo. : University of Missouri-Columbia, 2007. http://hdl.handle.net/10355/4892.

Abstract:
Thesis (M.S.)--University of Missouri-Columbia, 2007.
4

Tornberg, Anna-Karin. "Interface tracking methods with application to multiphase flows." Doctoral thesis, KTH, Numerical Analysis and Computer Science, NADA, 2000. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-2953.

5

Jan, Muhammad Asghar, and Syed Majid Ali Shah Bukhari. "Eye Tracking Interface Design for Controlling Mobile Robot." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-5030.

Abstract:
This thesis provides a baseline study for eye tracking user interface design for controlling a mobile robot. The baseline study is an experiment involving the use of a radio controller (RC) to drive the robot, while gaze data is collected from each subject monitoring the position of the robot on a remote screen that displays the view from the turret-mounted video camera on the robot. Initial data from the experiment provides a foundation for the interface design of actual control of the mobile robot by gaze interaction. Such an interface may provide telepresence for the disabled. Patients with motor disabilities cannot use their hands and legs but can still use their eye movements. Such applications of an eye tracking system can provide patients with much flexibility and freedom for the search and identification of objects.
6

Shaw, Daniel. "An Eye-Tracking Evaluation of Multicultural Interface Designs." Thesis, Boston College, 2005. http://hdl.handle.net/2345/390.

Abstract:
Thesis advisor: James Gips
This paper examines the impact of a multicultural approach on the usability of web and software interface designs. Through the use of an eye-tracking system, the study compares the ability of American users to navigate traditional American and Japanese websites. The ASL R6 eye-tracking system recorded user search latency and the visual scan path in locating specific items on the American and Japanese pages. Experimental results found statistically significant differences in latency when searching for left- or right-oriented navigation menus. Among the participants, visual observations of scan paths indicated a strong preference for initial movements toward the left. These results demonstrate the importance of adapting web layouts and navigation menus for American and Japanese users. This paper further discusses the potential strengths of modifying interface designs to correspond with such cultural search tendencies, and offers suggestions for further research.
Thesis (BA) — Boston College, 2005
Submitted to: Boston College. College of Arts and Sciences
Discipline: Computer Science
Discipline: College Honors Program
7

Maharaj, Robin. "Design of asset tracking device with GPRS Interface." Thesis, Cape Peninsula University of Technology, 2019. http://hdl.handle.net/20.500.11838/2815.

Abstract:
Thesis (Master of Engineering in Electrical Engineering)--Cape Peninsula University of Technology, 2018.
IoT devices have the potential to improve asset lifecycle optimization because of their ability to provide relevant real-time data to high-level applications. This data, delivered with minimal latency, can assist asset managers in controlling the behaviour of assets and asset users to optimize asset lifecycle cost. Many environments require asset tracking devices, but this design focussed on motor vehicles with auxiliary functions and apparatus. This research work documents the design of an asset tracking device built and tested on a 32-bit microcontroller platform with a built-in CAN peripheral. The design resolves the handling of multiple serial interfaces collating data simultaneously, concatenating this data and transmitting it via the GPRS interface as a single UDP sentence. Apart from interfacing various serial peripherals to the STM32F4, the design also implemented a wireless module as well as a multichannel ADC module. This was accomplished by researching and implementing software techniques as well as the hardware/firmware features of the STM32 devices, namely DMA and the nested vectored interrupt controller. The outcome is an asset-tracking device for industry with data-capturing functionality, capable of meeting the above needs at a reasonable data cost. The device designed in this thesis is the client device of an asset tracking network. The design was carried out on a proof-of-concept basis, delivering built hardware in the form of various application modules interfaced to a 32-bit microcontroller via UART, SPI and CAN.
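To illustrate the "single UDP sentence" idea described above, the sketch below concatenates a few hypothetical sensor fields into one comma-separated sentence and sends it to a server over UDP. It is a generic Python illustration, not the thesis's firmware; the field names, sentence layout, and server address are assumptions made for this example.

```python
# Generic illustration of packing tracker data into one UDP "sentence" (not the thesis firmware).
# Field names, sentence layout, and the server address are assumptions made for this example.
import socket
import time

SERVER = ("198.51.100.10", 9000)  # hypothetical tracking server (documentation address)

def build_sentence(device_id, lat, lon, speed_kmh, adc_volts, can_rpm):
    """Concatenate readings from several interfaces into a single comma-separated sentence."""
    fields = [device_id, f"{int(time.time())}", f"{lat:.6f}", f"{lon:.6f}",
              f"{speed_kmh:.1f}", f"{adc_volts:.2f}", f"{can_rpm}"]
    return ",".join(fields)

def send_sentence(sentence):
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(sentence.encode("ascii"), SERVER)

# Example: one position/ADC/CAN snapshot sent as a single datagram.
send_sentence(build_sentence("TRK001", -33.918861, 18.423300, 54.2, 12.46, 2150))
```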
8

Zhang, Yanxia. "Eye tracking and gaze interface design for pervasive displays." Thesis, Lancaster University, 2015. http://eprints.lancs.ac.uk/76906/.

Abstract:
Eye tracking for pervasive displays in everyday computing is an emerging area of research. There is an increasing number of pervasive displays in our surroundings, such as large displays in public spaces, digital boards in offices and smart televisions at home. Gaze is an attractive input modality for these displays, as people naturally look at objects of interest and use their eyes to seek information. Existing research has applied eye tracking in a variety of fields, but tends to work in constrained environments for lab applications. This thesis investigates how to enable robust gaze sensing in pervasive contexts and how eye tracking can be applied to pervasive displays that we encounter in our daily life. To answer these questions, we identify the technical and design challenges posed by using gaze for pervasive displays. Firstly, in out-of-lab environments, interactions are usually spontaneous, where users and systems are unaware of each other beforehand. This poses the technical problem that gaze sensing should not need prior user training and should be robust in unconstrained environments. We develop novel vision-based systems that require only off-the-shelf RGB cameras to address this issue. Secondly, in pervasive contexts, users are usually unaware of the gaze interactivity of pervasive displays and of the technical restrictions of gaze sensing systems. However, there is little knowledge about how to enable people to use gaze interactive systems in daily life. Thus, we design novel interfaces that allow novice users to interact with content on pervasive displays, and we study the usage of our systems through field deployments. We demonstrate that people can walk up to a gaze interactive system and start to use it immediately without human assistance. Lastly, pervasive displays could also support multi-user co-located collaboration. We explore the use of gaze for collaborative tasks. Our results show that sharing gaze information on shared displays can ease communication and improve collaboration. Although we demonstrate the benefits of using gaze for pervasive displays, open challenges remain in enabling gaze interaction in everyday computing and require further investigation. Our research provides a foundation for the rapidly growing field of eye tracking for pervasive displays.
9

Mahajan, Onkar. "Multimodal interface integrating eye gaze tracking and speech recognition." University of Toledo / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1430494171.

10

Oyekoya, Oyewole Kayode. "Eye tracking : a perceptual interface for content based image retrieval." Thesis, University College London (University of London), 2007. http://discovery.ucl.ac.uk/5413/.

Abstract:
In this thesis, visual search experiments are devised to explore the feasibility of an eye gaze driven search mechanism. The thesis first explores gaze behaviour on images possessing different levels of saliency. Eye behaviour was predominantly attracted by salient locations, but appears also to require frequent reference to non-salient background regions, which indicated that information from scan paths might prove useful for image search. The thesis then specifically investigates the benefits of eye tracking as an image retrieval interface in terms of speed relative to selection by mouse, and in terms of the efficiency of eye tracking mechanisms in the task of retrieving target images. Results are analysed using ANOVA and significant findings are discussed. Results show that eye selection was faster than a computer mouse and that experience gained during visual tasks carried out using a mouse would benefit users if they were subsequently transferred to an eye tracking system. Results of the image retrieval experiments show that users are able to navigate to a target image within a database, confirming the feasibility of an eye gaze driven search mechanism. Additional histogram analysis of the fixations, saccades and pupil diameters in the human eye movement data revealed a new method of extracting intentions from gaze behaviour for image search, of which the user was not aware, and which promises even quicker search performance. The research has two implications for content based image retrieval: (i) improvements in query formulation for visual search and (ii) new methods for visual search using attentional weighting. Furthermore, it was demonstrated that users are able to find target images at sufficient speeds, indicating that pre-attentive activity is playing a role in visual search. A review of current eye tracking technology, applications, visual perception research, and models of visual attention is provided, and the potential of the technology for commercial exploitation is also reviewed.

Books on the topic "Interface tracking"

1

Henault, German A. A computer simulation study and component evaluation for a quaternion filter for sourceless tracking of human limb segment motion. Monterey, Calif: Naval Postgraduate School, 1997.

2

Mark, Joseph F. A prototype multi-media data base for tracking interface relationships and performing cost tradeoffs for the Sea Launch and Recovery (SEALAR) Space Launch System. Monterey, Calif: Naval Postgraduate School, 1991.

3

Madritsch, Franz. Optical beacon tracking for human-computer interfaces. Wien: R. Oldenbourg, 1997.

4

Karasulu, Bahadir. Performance Evaluation Software: Moving Object Detection and Tracking in Videos. New York, NY: Springer New York, 2013.

5

Michael, Kelley, and Institute for Computer Applications in Science and Engineering., eds. Tracking a turbulent spot in an immersive environment. Hampton, VA: Institute for Computer Applications in Science and Engineering, NASA Langley Research Center, 1995.

6

Axiomatix (Firm), and Lyndon B. Johnson Space Center, eds. Shuttle communication and tracking systems signal design and interface compatibility analysis: Final report. Los Angeles, Calif: Axiomatix, 1986.

7

Patt, Frederick S., Watson W. Gregg, and United States National Aeronautics and Space Administration, Scientific and Technical Information Program, eds. CATLAC: Calibration and validation Analysis Tool of Local Area Coverage for the SeaWIFS mission. [Washington, D.C.?]: National Aeronautics and Space Administration, Scientific and Technical Information Program, 1993.

8

Analysis and Modeling of the Virtual Human Interface for the MARG Body Tracking System Using Quaternions. Storming Media, 2002.


Book chapters on the topic "Interface tracking"

1

Vincent, Stéphane, Jean-Luc Estivalézes, and Ruben Scardovelli. "Interface Tracking." In Small Scale Modeling and Simulation of Incompressible Turbulent Multi-Phase Flow, 51–109. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-09265-7_3.

2

Duchowski, Andrew T. "Using an Open Source Application Program Interface." In Eye Tracking Methodology, 131–40. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-57883-5_12.

3

Tornberg, A. K., and B. Engquist. "Interface Tracking in Multiphase Flows." In Multifield Problems, 58–65. Berlin, Heidelberg: Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/978-3-662-04015-7_7.

4

Kosikowski, Łukasz, Piotr Dalka, Piotr Odya, and Andrzej Czyżewski. "Multimedia Interface Using Head Movements Tracking." In Advances in Intelligent and Soft Computing, 41–47. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-23169-8_5.

5

Dierich, Frank, and Kay Wittig. "Interface Tracking During Char Particle Gasification." In Gasification Processes, 171–204. Weinheim, Germany: Wiley-VCH Verlag GmbH & Co. KGaA, 2014. http://dx.doi.org/10.1002/9783527673186.ch7.

6

Cicek, Muratcan, and Roberto Manduchi. "Learning a Head-Tracking Pointing Interface." In Lecture Notes in Computer Science, 399–406. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08648-9_46.

7

Lowengrub, John. "Modeling Coarsening Dynamics Using Interface Tracking Methods." In Handbook of Materials Modeling, 2205–22. Dordrecht: Springer Netherlands, 2005. http://dx.doi.org/10.1007/1-4020-3286-2_114.

8

Shin, Yunhee, and Eun Yi Kim. "Welfare Interface Using Multiple Facial Features Tracking." In Lecture Notes in Computer Science, 453–62. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11941439_49.

9

Bai, Bondili Kohitha, Ankita Mittal, and Sanchita Mittal. "Infrared Source Tracking Robot with Computer Interface." In Advances in Parallel Distributed Computing, 86–90. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-24037-9_9.

10

Abdeljaoued, Yousri, David Marimon i Sanjuan, and Touradj Ebrahimi. "Tracking and User Interface for Mixed Reality." In 3D Videocommunication, 315–34. Chichester, UK: John Wiley & Sons, Ltd, 2006. http://dx.doi.org/10.1002/0470022736.ch17.


Conference papers on the topic "Interface tracking"

1

Tezduyar, Tayfun. "Finite Element Interface-Tracking and Interface-Capturing Techniques for Flows With Moving Boundaries and Interfaces." In ASME 2001 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/imece2001/htd-24206.

Abstract:
We provide an overview of the interface-tracking and interface-capturing techniques we have developed in recent years for the computation of flow problems with moving boundaries and interfaces, including two-fluid interfaces. The interface-tracking techniques are based on the Deforming-Spatial-Domain/Stabilized Space-Time formulation, where the mesh moves to track the interface. The interface-capturing techniques, which were developed for two-fluid flows, are based on the stabilized formulation, over non-moving meshes, of both the flow equations and the advection equation governing the time-evolution of an interface function marking the location of the interface. For interface-capturing techniques, the Enhanced-Discretization Interface-Capturing Technique (EDICT) can be used to increase the accuracy in representing the interface. We also provide an overview of some of the additional ideas developed to increase the scope and accuracy of these two classes of techniques.
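In the interface-capturing formulation summarized above, the "advection equation governing the time-evolution of an interface function" takes the standard transport form shown below, written here in generic level-set/volume-of-fluid notation rather than quoted from the paper; φ is the interface (marker) function and u the flow velocity.

```latex
% Standard transport equation for an interface (marker) function advected by the flow.
\frac{\partial \phi}{\partial t} + \mathbf{u} \cdot \nabla \phi = 0,
\qquad \phi(\mathbf{x}, 0) = \phi_0(\mathbf{x})
```

The interface is then located at a fixed level of the advected function, e.g. φ = 0 for a signed-distance field or φ = 0.5 for a volume fraction.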
2

Zhang, Qiaohui, Atsumi Imamiya, Kentaro Go, and Xiaoyang Mao. "Resolving ambiguities of a gaze and speech interface." In the Eye tracking research & applications symposium. New York, New York, USA: ACM Press, 2004. http://dx.doi.org/10.1145/968363.968383.

3

Estrany, B., P. Fuster, A. Garcia, and Y. Luo. "Human computer interface by EOG tracking." In the 1st ACM international conference. New York, New York, USA: ACM Press, 2008. http://dx.doi.org/10.1145/1389586.1389694.

4

Rider, William, and Douglas Kothe. "Stretching and tearing interface tracking methods." In 12th Computational Fluid Dynamics Conference. Reston, Virigina: American Institute of Aeronautics and Astronautics, 1995. http://dx.doi.org/10.2514/6.1995-1717.

5

Kaufman, A. E., A. Bandopadhay, and B. D. Shaviv. "An eye tracking computer user interface." In 1993 IEEE Research Properties in Virtual Reality Symposium. IEEE Comput. Soc. Press, 1993. http://dx.doi.org/10.1109/vrais.1993.378254.

6

Decker, Daniel, and Jenelle Armstrong Piepmeier. "Gaze Tracking Interface for Robotic Control." In 2008 40th Southeastern Symposium on System Theory (SSST). IEEE, 2008. http://dx.doi.org/10.1109/ssst.2008.4480236.

7

Kumar, S. Venkatesa Karthick, and Anitha Julian. "Oscitancy tracking using Brain-Computer Interface." In 2012 IEEE Students' Conference on Electrical, Electronics and Computer Science (SCEECS). IEEE, 2012. http://dx.doi.org/10.1109/sceecs.2012.6184783.

8

Savio, Domnic, and Thomas Ludwig. "Smart Carpet: A Footstep Tracking Interface." In 21st International Conference on Advanced Information Networking and Applications Workshops. IEEE, 2007. http://dx.doi.org/10.1109/ainaw.2007.338.

9

Davanzo, Nicola, Piercarlo Dondi, Mauro Mosconi, and Marco Porta. "Playing music with the eyes through an isomorphic interface." In ETRA '18: 2018 Symposium on Eye Tracking Research and Applications. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3206343.3206350.

10

Hartescu, Dan, and Andreas Oikonomou. "Gaze tracking as a game input interface." In Serious Games (CGAMES). IEEE, 2011. http://dx.doi.org/10.1109/cgames.2011.6000327.


Reports on the topic "Interface tracking"

1

Erickson, Lindsay. An interface tracking model for droplet electrocoalescence. Office of Scientific and Technical Information (OSTI), September 2013. http://dx.doi.org/10.2172/1096257.

2

O'Brien, M. Material Interface Reconstruction for Monte Carlo Particle Tracking. Office of Scientific and Technical Information (OSTI), March 2006. http://dx.doi.org/10.2172/895426.

3

Garimella, Rao Veerabhadra. Introduction to Interface Tracking in Multi-Material Flow Simulations. Office of Scientific and Technical Information (OSTI), June 2017. http://dx.doi.org/10.2172/1367800.

4

Garimella, Rao Veerabhadra. Introduction to Interface Tracking in Multi-Material Flow Simulations. Office of Scientific and Technical Information (OSTI), June 2018. http://dx.doi.org/10.2172/1457283.

5

Glimm, James, Valmor de Almeida, Xiangmin Jiao, Brett Sims, and Xiaolin Li. Sharp Interface Tracking in Rotating Microflows of Solvent Extraction. Office of Scientific and Technical Information (OSTI), January 2013. http://dx.doi.org/10.2172/1063992.

6

Perzanowski, Dennis, Alan C. Schultz, William Adams, and Elaine Marsh. Goal Tracking in a Natural Language Interface: Towards Achieving Adjustable Autonomy. Fort Belvoir, VA: Defense Technical Information Center, January 1999. http://dx.doi.org/10.21236/ada435151.

7

Bane, K. L. F. LiTrack: A Fast Longitudinal Phase Space Tracking Code with Graphical User Interface. Office of Scientific and Technical Information (OSTI), March 2005. http://dx.doi.org/10.2172/839868.

8

Barry, R. E., G. A. Armstrong, and B. L. Burks. Position and Orientation Tracking System three-dimensional graphical user interface. CRADA final report. Office of Scientific and Technical Information (OSTI), September 1997. http://dx.doi.org/10.2172/629427.

9

Sharp, D. H., J. W. Grove, Y. Yang, B. Boston, R. Holmes, Q. Zhang, and J. Glimm. The application of front tracking to the simulation of shock refractions and shock accelerated interface mixing. Office of Scientific and Technical Information (OSTI), August 1993. http://dx.doi.org/10.2172/10175723.

10

Klymenko, Mykola V., and Andrii M. Striuk. Development of software and hardware complex of GPS-tracking. CEUR Workshop Proceedings, March 2021. http://dx.doi.org/10.31812/123456789/4430.

Abstract:
The paper considers the typical technical features of GPS-tracking systems and their development, as well as an analysis of existing solutions to the problem. Mathematical models for the operation of the hardware and software of this complex have been created. An adaptive user interface has been developed that allows the complex to be used from a smartphone or personal computer. Methods for displaying the distance traveled by a moving object on an electronic map have been developed, together with Atmega162-16PU microcontroller software for controlling the GSM module and GPS receiver, and a method of data transfer from a GPS tracker to a web server. Two working experimental samples of GPS trackers were made and tested. The GPS-tracking software and hardware can be used to monitor the movement of moving objects that are within the coverage of GSM cellular networks.
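On the web-server side, data transfer of the kind described above can be received with a small UDP listener that parses each position report and keeps the latest fix per device. The sketch below is a generic illustration, not the paper's protocol; the datagram format "device_id,unix_time,lat,lon" and the port are assumptions made for this example.

```python
# Generic sketch of a server-side receiver for tracker position reports (not the paper's protocol).
# The datagram format "device_id,unix_time,lat,lon" is an assumption made for this illustration.
import socket

def run_receiver(host="0.0.0.0", port=9000):
    positions = {}  # latest known position per device id
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind((host, port))
        while True:
            data, addr = sock.recvfrom(1024)
            try:
                device_id, ts, lat, lon = data.decode("ascii").strip().split(",")
                positions[device_id] = (int(ts), float(lat), float(lon))
                print(f"{device_id} @ {lat},{lon} (t={ts}) from {addr}")
            except ValueError:
                print(f"ignoring malformed report from {addr}: {data!r}")

if __name__ == "__main__":
    run_receiver()
```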