Academic literature on the topic 'Intuitive interface'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Intuitive interface.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Intuitive interface"

1

Nakanishi, Mikiko, and Tsutomu Horikoshi. "Intuitive substitute interface." Personal and Ubiquitous Computing 17, no. 8 (2013): 1797–805. http://dx.doi.org/10.1007/s00779-013-0651-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Álvarez Reyes, Julio César. "Design of intuitive user interfaces for virtual assistants in university education." Journal of Scientific and Technological Research Industrial 4, no. 1 (2023): 17–20. http://dx.doi.org/10.47422/jstri.v4i1.34.

Full text
Abstract:
Designing an intuitive user interface for virtual assistants in higher education presents several challenges, chief among them natural language understanding, virtual assistant customization, user-centered design, integration with existing technology, and consideration of the educational context. The success of virtual assistants in higher education depends on the ability of designers and developers to understand the needs and preferences of users, and on their ability to design intuitive and effective user interfaces that enhance the learning and teaching experience. To address these challenges, various methods can be used, such as creating a natural language-based user interface, including visual elements, and designing a custom, adaptable user interface.
APA, Harvard, Vancouver, ISO, and other styles
3

Goyzueta, Denilson V., Joseph Guevara M., Andrés Montoya A., et al. "Analysis of a User Interface Based on Multimodal Interaction to Control a Robotic Arm for EOD Applications." Electronics 11, no. 11 (2022): 1690. http://dx.doi.org/10.3390/electronics11111690.

Full text
Abstract:
A global human–robot interface that meets the needs of Technical Explosive Ordnance Disposal Specialists (TEDAX) for the manipulation of a robotic arm is of the utmost importance to make the task of handling explosives safer and more intuitive, and to provide high usability and efficiency. This paper aims to evaluate the performance of a multimodal system for a robotic arm that is based on a Natural User Interface (NUI) and a Graphical User Interface (GUI). The mentioned interfaces are compared to determine the best configuration for the control of the robotic arm in Explosive Ordnance Disposal (EOD) applications and to improve the user experience of TEDAX agents. Tests were conducted with the support of police agents of the Explosive Ordnance Disposal Unit-Arequipa (UDEX-AQP), who evaluated the developed interfaces to find a more intuitive system that generates the least stress load for the operator; our proposed multimodal interface showed better results than traditional interfaces. The evaluation of the laboratory experiences was based on measuring the workload and usability of each evaluated interface.
APA, Harvard, Vancouver, ISO, and other styles
4

Ciora, Radu Adrian, Daniela Gifu, and Carmen Mihaela Simion. "A Novel User Interface for Knowledge Base Browsing." Balkan Region Conference on Engineering and Business Education 1, no. 1 (2019): 377–82. http://dx.doi.org/10.2478/cplbu-2020-0045.

Full text
Abstract:
Intuitive user interfaces have been of great concern for GUI developers. Current research on their design constantly confronts the term 'intuitive'; the main question is how an interface can be made intuitive. At present, researchers try to provide a highly intuitive generic user interface that can be used in a variety of applications. In this paper we provide a solution that can model any applied ontology into a honeycomb menu. The hexagonal shape of the honeycomb has attracted human attention for centuries. As a relevant consequence, the final user can browse any knowledge base very easily with the aid of this interface. Another useful feature is that programmers can take full advantage of semantic web technologies, which can tailor results to any knowledge base that is fed as input, without any need for code changes, thus leading towards a panacea system.
APA, Harvard, Vancouver, ISO, and other styles
5

Ardiansyah, Hafizd, and Agung Fatwanto. "Application Design for Registration of Civil Appeals with Intuitive District Courts." IJID (International Journal on Informatics for Development) 9, no. 1 (2020): 45. http://dx.doi.org/10.14421/ijid.2020.09107.

Full text
Abstract:
Designing user interfaces that are intuitive, or easy to understand, poses its own challenges: designers must keep developing their skills so that users can easily use existing applications. Many user interface designs are less intuitive not only in the layout but also in the text and colors used. This paper aims to create an intuitive user interface design by interviewing potential users at the site where the application will be implemented. The method used combines a literature study approach and interviews. The results of this research are a user interface design and a system flow design. An intuitive display makes it easier for users to use the application.
APA, Harvard, Vancouver, ISO, and other styles
6

Abrahamian, Edmond, Jerry Weinberg, Michael Grady, and C. Stanton. "The Effect of Personality-Aware Computer-Human Interfaces on Learning." JUCS - Journal of Universal Computer Science 10, no. 1 (2004): 27–37. https://doi.org/10.3217/jucs-010-01-0027.

Full text
Abstract:
Traditional software used for student-centered learning typically provides for a uniform user interface through which the student can interact with the software, and through which the information is delivered in a uniformly identical fashion to all users without regard to their learning style. This research classifies personality types of computer science undergraduate students using the Myers-Briggs Type Indicator, relates these types of personalities to defined learning preferences, and tests if a given user interface designed for a given learning preference enhances learning. The general approach of this study is as follows: given a set of user interfaces designed to fit personality types, provide a given user interface to participants with the matching personality type. In the control group, provide participants with a randomly chosen user interface. Observe the performance of all participants in a post-test. Additionally, observe if the test group had an enhanced learning experience. Quantitative results indicate that personality-aware user interfaces have a significant effect on learning. Qualitative results show that in most cases, users preferred user interfaces designed for their own personality type. Preliminary results show that for introverted intuitive persons and extraverted intuitive persons, the effect of a personality-aware human-computer interface on learning is significant.
APA, Harvard, Vancouver, ISO, and other styles
7

Takahashi, Yasutake, Kyohei Yoshida, Fuminori Hibino, and Yoichiro Maeda. "Human Pointing Navigation Interface for Mobile Robot with Spherical Vision System." Journal of Advanced Computational Intelligence and Intelligent Informatics 15, no. 7 (2011): 869–77. http://dx.doi.org/10.20965/jaciii.2011.p0869.

Full text
Abstract:
Human-robot interaction requires an intuitive interface, something not possible with devices such as the joystick or teaching pendant, which also require some training. Instruction by gesture is one example of an intuitive interface requiring no training, and pointing is one of the simplest gestures. We propose simple pointing recognition for a mobile robot having an upward-directed camera system. Using this, the robot recognizes pointing and navigates through simple visual feedback control to where the user points. This paper explores the feasibility and utility of our proposal, as shown by the results of a questionnaire on the proposed and conventional interfaces.
APA, Harvard, Vancouver, ISO, and other styles
8

Jang, Yongseok. "Hand Haptic Interface for Intuitive 3D Interaction." Journal of the HCI Society of Korea 2, no. 2 (2007): 53. http://dx.doi.org/10.17210/jhsk.2007.11.2.2.53.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Strauss, Howard. "The Good Interface: Friendly, Forgiving, and Intuitive." Campus-Wide Information Systems 10, no. 2 (1993): 17–20. http://dx.doi.org/10.1108/eb027522.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bang, Seungbae, and Sung-Hee Lee. "Spline Interface for Intuitive Skinning Weight Editing." ACM Transactions on Graphics 37, no. 5 (2018): 1–14. http://dx.doi.org/10.1145/3186565.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Intuitive interface"

1

Hofmann, Hansjörg. "Intuitive speech interface technology for information exchange tasks / Hansjörg Hofmann." Ulm: Universität Ulm, Fakultät für Ingenieurwissenschaften und Informatik, 2015. http://d-nb.info/1065309414/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Rang-Roslund, Pontus, and Guillermo Munguía Velazquez. "Development of an Intuitive Interface Structure for Ergonomic Evaluation Software." Thesis, Högskolan i Skövde, Institutionen för ingenjörsvetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-15702.

Full text
Abstract:
During the spring semester of 2018, a product development project was carried out at the University of Skövde by two Design Engineering students, Pontus Rang-Roslund and Guillermo Munguía Velazquez, in cooperation with the project group for Smart Textiles for Sustainable Work Life at the University of Skövde, which is now focusing on developing a web-based software for ergonomists and work leaders/coaches. The aim of the project was to design the interface for the software. The project carried out a literature review focused on basic principles of usability, cognition, user interaction, human-computer interaction, user experience, and ergonomic evaluation methods. In order to uncover user needs, interviews and observations were performed, and the inputs and outputs of the management information were analyzed. Based on the gathered information, concepts were generated and evaluated through formative evaluation. The final iteration produced a flexible and usable interface for ergonomic evaluations.
APA, Harvard, Vancouver, ISO, and other styles
3

Richards, Mark Andrew. "An intuitive motion-based input model for mobile devices." Thesis, Queensland University of Technology, 2006. https://eprints.qut.edu.au/16556/1/Mark_Richards_Thesis.pdf.

Full text
Abstract:
Traditional methods of input on mobile devices are cumbersome and difficult to use. Devices have become smaller, while their operating systems have become more complex, to the extent that they are approaching the level of functionality found on desktop computer operating systems. The buttons and toggle-sticks currently employed by mobile devices are a relatively poor replacement for the keyboard and mouse style user interfaces used on their desktop computer counterparts. For example, when looking at a screen image on a device, we should be able to move the device to the left to indicate we wish the image to be panned in the same direction. This research investigates a new input model based on the natural hand motions and reactions of users. The model developed by this work uses the generic embedded video cameras available on almost all current-generation mobile devices to determine how the device is being moved and maps this movement to an appropriate action. Surveys using mobile devices were undertaken to determine both the appropriateness and efficacy of such a model as well as to collect the foundational data with which to build the model. Direct mappings between motions and inputs were achieved by analysing users' motions and reactions in response to different tasks. Upon the framework being completed, a proof of concept was created on the Windows Mobile Platform. This proof of concept leverages both DirectShow and Direct3D to track objects in the video stream, maps these objects to a three-dimensional plane, and determines device movements from this data. This input model holds the promise of being a simpler and more intuitive method for users to interact with their mobile devices, and has the added advantage that no hardware additions or modifications are required to existing mobile devices.
APA, Harvard, Vancouver, ISO, and other styles
4

Blackler, Alethea Liane. "Intuitive interaction with complex artefacts." Thesis, Queensland University of Technology, 2006. https://eprints.qut.edu.au/16219/1/Alethea_Blackler_Thesis.pdf.

Full text
Abstract:
This thesis examines the role of intuition in the way that people operate unfamiliar devices, and the importance of this for designers. Intuition is a type of cognitive processing that is often non-conscious and utilises stored experiential knowledge. Intuitive interaction involves the use of knowledge gained from other products and/or experiences. Therefore, products that people use intuitively are those with features they have encountered before. This position has been supported by two initial experimental studies, which revealed that prior exposure to products employing similar features helped participants to complete set tasks more quickly and intuitively, and that familiar features were intuitively used more often than unfamiliar ones. Participants who had a higher level of familiarity with similar technologies were able to use significantly more of the features intuitively the first time they encountered them, and were significantly quicker at doing the tasks. Those who were less familiar with relevant technologies required more assistance. A third experiment was designed to test four different interface designs on a remote control in order to establish which of two variables - a feature's appearance or its location - was more important in making a design intuitive to use. As with the previous experiments, the findings of Experiment 3 suggested that performance is affected by a person's level of familiarity with similar technologies. Appearance (shape, size and labelling of buttons) seems to be the variable that most affects time spent on a task and intuitive uses. This suggests that the cues that people store in memory about a product's features depend on how the features look, rather than where on the product they are placed. Three principles of intuitive interaction have been developed. A conceptual tool has also been devised to guide designers in their planning for intuitive interaction. 
Designers can work with these in order to make interfaces intuitive to use, and thus help users to adapt more easily to new products and product types.
APA, Harvard, Vancouver, ISO, and other styles
5

Blackler, Alethea Liane. "Intuitive interaction with complex artefacts." Queensland University of Technology, 2006. http://eprints.qut.edu.au/16219/.

Full text
Abstract:
This thesis examines the role of intuition in the way that people operate unfamiliar devices, and the importance of this for designers. Intuition is a type of cognitive processing that is often non-conscious and utilises stored experiential knowledge. Intuitive interaction involves the use of knowledge gained from other products and/or experiences. Therefore, products that people use intuitively are those with features they have encountered before. This position has been supported by two initial experimental studies, which revealed that prior exposure to products employing similar features helped participants to complete set tasks more quickly and intuitively, and that familiar features were intuitively used more often than unfamiliar ones. Participants who had a higher level of familiarity with similar technologies were able to use significantly more of the features intuitively the first time they encountered them, and were significantly quicker at doing the tasks. Those who were less familiar with relevant technologies required more assistance. A third experiment was designed to test four different interface designs on a remote control in order to establish which of two variables - a feature's appearance or its location - was more important in making a design intuitive to use. As with the previous experiments, the findings of Experiment 3 suggested that performance is affected by a person's level of familiarity with similar technologies. Appearance (shape, size and labelling of buttons) seems to be the variable that most affects time spent on a task and intuitive uses. This suggests that the cues that people store in memory about a product's features depend on how the features look, rather than where on the product they are placed. Three principles of intuitive interaction have been developed. A conceptual tool has also been devised to guide designers in their planning for intuitive interaction. 
Designers can work with these in order to make interfaces intuitive to use, and thus help users to adapt more easily to new products and product types.
APA, Harvard, Vancouver, ISO, and other styles
6

Gustafson-Pearce, Olinkha. "The application of the information architecture method to design an intuitive haptic interface." Thesis, Brunel University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.429234.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Pinkney, James Bassey. "The design of an intuitive teaching interface for robot programming by human demonstration." Thesis, Massachusetts Institute of Technology, 1993. http://hdl.handle.net/1721.1/42822.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1993. Includes bibliographical references (leaves 99-100). This thesis deals with the design and implementation of an intuitive, lightweight, compact, low-cost human interface for robot programming by human demonstration. The key feature of this robotic teaching device is its ability to allow the operator to transfer manual manipulation skills to a robot for the completion of contact tasks. The prototype incorporates 6-degree-of-freedom force and position sensing with tactile and grip position sensing. Total mass was a low 850 grams. Preliminary experimental results proved ease of use and very low error: 20.3 grf average force error for a 1 kgf applied load, and 16.6 grf average force error for a 3 kgf grip force.
APA, Harvard, Vancouver, ISO, and other styles
8

Nawrot, Michael Thomas. "Design of a robust, intuitive piston interface for a needle free injection system." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/93008.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2014. Cataloged from the PDF version of the thesis. Includes bibliographical references (pages 193-195). The MIT BioInstrumentation Lab's linear Lorentz force actuator based needle-free injection system has been shown to have numerous benefits over needle-based and other needle-free drug delivery systems in a research environment. While the device has been used extensively on post mortem tissue and live animals, its use on humans has been restricted in large part because of an ineffective drug delivery ampoule interface which compromises sterility. A new ampoule interface has been developed to allow sterility to be maintained, while also improving robustness against manufacturing tolerances and user error. The new ampoule interface has been tested and compared to the previous ampoule interface and shown to have comparable performance during normal use, while also surviving misuse. An intuitive user interface has also been developed which eases the process of mounting and dismounting ampoules.
APA, Harvard, Vancouver, ISO, and other styles
9

Ramsamy, Priscilla. "An interface for intuitive & natural forms of human computer interaction in virtual environments." Thesis, University of Reading, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.533771.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Richards, Mark Andrew. "An intuitive motion-based input model for mobile devices." Queensland University of Technology, 2006. http://eprints.qut.edu.au/16556/.

Full text
Abstract:
Traditional methods of input on mobile devices are cumbersome and difficult to use. Devices have become smaller, while their operating systems have become more complex, to the extent that they are approaching the level of functionality found on desktop computer operating systems. The buttons and toggle-sticks currently employed by mobile devices are a relatively poor replacement for the keyboard and mouse style user interfaces used on their desktop computer counterparts. For example, when looking at a screen image on a device, we should be able to move the device to the left to indicate we wish the image to be panned in the same direction. This research investigates a new input model based on the natural hand motions and reactions of users. The model developed by this work uses the generic embedded video cameras available on almost all current-generation mobile devices to determine how the device is being moved and maps this movement to an appropriate action. Surveys using mobile devices were undertaken to determine both the appropriateness and efficacy of such a model as well as to collect the foundational data with which to build the model. Direct mappings between motions and inputs were achieved by analysing users' motions and reactions in response to different tasks. Upon the framework being completed, a proof of concept was created on the Windows Mobile Platform. This proof of concept leverages both DirectShow and Direct3D to track objects in the video stream, maps these objects to a three-dimensional plane, and determines device movements from this data. This input model holds the promise of being a simpler and more intuitive method for users to interact with their mobile devices, and has the added advantage that no hardware additions or modifications are required to existing mobile devices.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Intuitive interface"

1

Brown, John N. A., Anton Josef Fercher, and Gerhard Leitner. Building an Intuitive Multimodal Interface for a Smart Home. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-56532-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Grieser, Gunter, and Yuzuru Tanaka, eds. Intuitive Human Interfaces for Organizing and Accessing Intellectual Assets: International Workshop, Dagstuhl Castle, Germany, March 1-5, 2004: Revised Selected Papers. Springer, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Grieser, Gunter, and Yuzuru Tanaka, eds. Intuitive Human Interfaces for Organizing and Accessing Intellectual Assets. Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/b104697.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Meeker, Cassie. Intuitive Human-Machine Interfaces for Non-Anthropomorphic Robotic Hands. [publisher not identified], 2020.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Brenner, Everett H. Beyond Boolean: New approaches to information retrieval : the quest for intuitive online search systems past, present & future. National Federation of Abstracting and Information Services, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Leitner, Gerhard, John N. A. Brown, and Anton Josef Fercher. Building an Intuitive Multimodal Interface for a Smart Home: Hunting the SNARK. Springer, 2017.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Grieser, Gunter, and Yuzuru Tanaka. Intuitive Human Interfaces for Organizing and Accessing Intellectual Assets: International Workshop, Dagstuhl Castle, Germany, March 1-5, 2004, Revised Selected Papers. Springer London, Limited, 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Blackler, Alethea. Intuitive Interaction. Taylor & Francis Group, 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Dwivedi, Rakesh Kumar. Improving User Experience with Intuitive Interfaces. Alexis Press LLC, 2022.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Intuition: The Amiga User Interface. Addison-Wesley, 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Intuitive interface"

1

Pirta, Raghubir Singh. "Interface between folk mind and intuitive cognition." In Intuitive Cognition. Routledge India, 2025. https://doi.org/10.4324/9781003377757-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Yoshida, Yuichi, Kento Miyaoku, Takashi Satou, and Suguru Higashino. "Mobile Reacher Interface for Intuitive Information Navigation." In Human-Computer Interaction - INTERACT 2005. Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11555261_112.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Takashina, Tomomi, Hitoshi Kawai, and Yuji Kokumai. "Tangible Microscope with Intuitive Stage Control Interface." In Human-Computer Interaction – INTERACT 2015. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-22723-8_71.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Christmann, Simon, Marvin Löhr, Imke Busboom, Volker K. S. Feige, and Hartmut Haehnel. "Towards Real-Time Human-Machine Interfaces for Robot Cells Using Open Standard Web Technologies." In Technologien für die intelligente Automation. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-662-64283-2_7.

Full text
Abstract:
Screen-based human-machine interfaces are one of the most important elements of industrial automation technologies since modern production lines became too complex to be controlled by a simple start/stop button. While web-based user interfaces have been used in non-industrial areas for many years, they have only been used in industrial applications since the beginning of the Industry 4.0 movement. However, commercially available solutions do not yet have the intuitive operation that customers from non-industrial sectors are accustomed to. In this work, we present a proof-of-concept development that aims to create an intuitive web-based user interface for displaying robot movements. The robot is displayed in a WebGL-based visualization, which also allows user interaction. The user interface is created purely from open standard web technologies so that it works without plugins and is immediately usable in all modern browsers.
APA, Harvard, Vancouver, ISO, and other styles
5

Sarkar, Rahul, Chrishnika de Almeida, Noureen Syed, Sheliza Jamal, and Jeff Orchard. "Intuitive Interface for the Exploration of Volumetric Datasets." In Advances in Computer and Information Sciences and Engineering. Springer Netherlands, 2008. http://dx.doi.org/10.1007/978-1-4020-8741-7_76.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Higel, Steffen, Tony O’Donnell, Dave Lewis, and Vincent Wade. "Towards an Intuitive Interface for Tailored Service Compositions." In Distributed Applications and Interoperable Systems. Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-40010-3_24.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Islam, Muhammad Nazrul. "Towards Determinants of User-Intuitive Web Interface Signs." In Design, User Experience, and Usability. Design Philosophy, Methods, and Tools. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39229-0_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Claassen, Lennart, Simon Aden, Johannes Gaa, Jens Kotlarski, and Tobias Ortmaier. "Intuitive Robot Control with a Projected Touch Interface." In Social Robotics. Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11973-1_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Morita, Satoru. "Eye Movement Navigation Interface Supporting Reading." In Intuitive Human Interfaces for Organizing and Accessing Intellectual Assets. Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/978-3-540-32279-5_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Spath, Dieter, Matthias Peissner, Lorenz Hagenmeyer, and Brigitte Ringbauer. "New Approaches to Intuitive Auditory User Interfaces." In Human Interface and the Management of Information. Methods, Techniques and Tools in Information Design. Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-73345-4_110.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Intuitive interface"

1

Pandhare, Atharva, Ravi Raghavan, Shreyas Ramachandran, Aryan Patil, Vijay Chandhar, Maria Striki, and Sasan Haghani. "Cloud-Connected Human-Drone Interface for Intuitive Navigation." In 2024 IEEE International Symposium on Consumer Technology (ISCT). IEEE, 2024. https://doi.org/10.1109/isct62336.2024.10791117.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Lee, Seo-Hyun, Ji-Ha Park, and Deok-Seon Kim. "Imagined Speech and Visual Imagery as Intuitive Paradigms for Brain-Computer Interfaces." In 2025 13th International Conference on Brain-Computer Interface (BCI). IEEE, 2025. https://doi.org/10.1109/bci65088.2025.10931355.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Chaudhary, Akash, Tiago Nascimento, and Martin Saska. "Intuitive Human-Robot Interface: A 3-Dimensional Action Recognition and UAV Collaboration Framework." In 21st International Conference on Informatics in Control, Automation and Robotics. SCITEPRESS - Science and Technology Publications, 2024. http://dx.doi.org/10.5220/0012921300003822.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Rohith, B., RV Rohith, and V. Surya. "Boosting Social Media Security Fake Account Detection With An Intuitive Stream Lite Interface." In 2025 International Conference on Computing and Communication Technologies (ICCCT). IEEE, 2025. https://doi.org/10.1109/iccct63501.2025.11019223.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Huang, Jinmiao, and Rahul Rai. "Hand Gesture Based Intuitive CAD Interface." In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-34070.

Full text
Abstract:
A key objective of gesture based computer aided design (CAD) interface is to enable humans to manipulate 3D models in virtual environments in a manner similar to how such objects are manipulated in real-life. In this paper, we outline the development of a novel real-time gesture based conceptual computer aided design tool which enables intuitive hand gesture based interaction with a given design interface. Recognized hand gestures along with hand position information are converted into commands for rotating, scaling, and translating 3D models. In the presented system, gestures are identified based solely on the depth information obtained via inexpensive depth sensing cameras (SoftKinetics DepthSense 311). Since the gesture recognition system is entirely based on using depth images, the developed system is robust and insensitive to variations in lighting conditions, hand color, and background noise. The difference between the input hand shape and the nearest neighboring point in the database is employed as the criterion to recognize different gestures. Extensive experiments with a design interface are also presented to demonstrate the accuracy, robustness, and effectiveness of the presented system.
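The matching criterion described in this abstract, the distance between an observed hand shape and its nearest neighbor in a template database, can be sketched in a few lines. The code below is an illustrative nearest-neighbor classifier only; the feature vectors, template database, and distance threshold are invented for the example and are not taken from the paper's actual depth-image pipeline.

```python
import numpy as np

def recognize_gesture(hand_features, templates, labels, max_distance=50.0):
    """Return the label of the closest template, or None if nothing is close.

    hand_features : 1-D feature vector for the observed hand shape
    templates     : 2-D array, one stored hand-shape vector per row
    labels        : gesture name for each template row
    """
    # Euclidean distance from the observed shape to every stored template.
    distances = np.linalg.norm(templates - hand_features, axis=1)
    best = int(np.argmin(distances))
    if distances[best] > max_distance:
        return None  # unknown shape: too far from every stored gesture
    return labels[best]

# Toy database: three made-up "depth feature" templates.
templates = np.array([
    [0.0, 0.0, 0.0, 0.0],   # "fist"
    [9.0, 9.0, 9.0, 9.0],   # "open palm"
    [0.0, 9.0, 0.0, 9.0],   # "two fingers"
])
labels = ["fist", "open palm", "two fingers"]

print(recognize_gesture(np.array([8.5, 9.2, 8.9, 9.1]), templates, labels))
# → open palm
```

In the paper's system, the feature vectors come from depth images rather than hand-crafted numbers, which is what makes the recognition robust to lighting, hand color, and background noise.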
APA, Harvard, Vancouver, ISO, and other styles
6

Islam, Muhammad Nazrul. "Towards Designing Users' Intuitive Web Interface." In 2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems (CISIS). IEEE, 2012. http://dx.doi.org/10.1109/cisis.2012.129.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Couse, Mary Margaret. "Creating an elegant, intuitive, user interface." In the 13th annual international conference. ACM Press, 1995. http://dx.doi.org/10.1145/223984.223991.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Wang, Qianxiang. "An Intuitive Approach for Specifying Interface Constraint." In 2009 9th International Conference on Quality Software (QSIC). IEEE, 2009. http://dx.doi.org/10.1109/qsic.2009.62.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kang, Sangseung, Jaehong Kim, and Jaeyeon Lee. "Intuitive control using a mediated interface module." In the International Conference. ACM Press, 2009. http://dx.doi.org/10.1145/1690388.1690487.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Negi, Suraj, Shakhaf Joseph, Cydnelle Alemao, and Vincy Joseph. "Intuitive User Interface for Enhanced Search Experience." In 2020 3rd International Conference on Communication System, Computing and IT Applications (CSCITA). IEEE, 2020. http://dx.doi.org/10.1109/cscita47329.2020.9137806.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Intuitive interface"

1

Rohde, Mitchell M., Victor E. Perlin, Karl D. Iagnemma, et al. PointCom: Semi-Autonomous UGV Control With Intuitive Interface. Defense Technical Information Center, 2009. http://dx.doi.org/10.21236/ada510465.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Stroumtsos, Nicholas, Gary Gilbreath, and Scott Przybylski. An Intuitive Graphical User Interface for Small UAS. Defense Technical Information Center, 2013. http://dx.doi.org/10.21236/ada587349.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Shepherd, Adam, Dana Gerlach, Taylor Heyl, et al. Biological and Chemical Oceanography Data Management Office: Supporting a New Vision for Adaptive Management of Oceanographic Data [poster]. Woods Hole Oceanographic Institution, 2022. http://dx.doi.org/10.1575/1912/29047.

Full text
Abstract:
An unparalleled data catalog of well-documented, interoperable oceanographic data and information, openly accessible to all end-users through an intuitive web-based interface for the purposes of advancing marine research, education, and policy. Conference Website: https://web.whoi.edu/ocb-workshop/
APA, Harvard, Vancouver, ISO, and other styles
4

Tsidylo, Ivan M., Hryhoriy V. Tereshchuk, Serhiy V. Kozibroda, et al. Methodology of designing computer ontology of subject discipline by future teachers-engineers. [б. в.], 2019. http://dx.doi.org/10.31812/123456789/3249.

Full text
Abstract:
The article addresses the methodology of designing a computer ontology of a subject discipline by future teacher-engineers in the field of computer technologies. A scheme of the ontology of the subject discipline is presented, representing the set of concepts of the future computer ontology and the set of relations between them. The main criteria for choosing a computer ontology system for designing the ontology of a subject discipline are established: software architecture and development tools; interoperability; and an intuitive interface. Techniques for designing ontologies using computer ontology systems are selected, and an algorithm for designing the computer ontology of a subject discipline by future teacher-engineers in the field of computer technologies is proposed.
APA, Harvard, Vancouver, ISO, and other styles
5

Yip, Eugene, and Gerald Lüttgen. Concurrency, Shared Variables, Compositionality : An Unlikely Triple. Otto-Friedrich-Universität, 2024. http://dx.doi.org/10.20378/irb-97637.

Full text
Abstract:
This paper reports our experiences with extending Interface Automaton (IA) with shared variables by lifting IA’s intuitive notion of refinement and compositionality to shared variables. Although there are existing works that introduce shared variables to IA, they typically support a very restricted notion of sharing, e.g., the value of a shared variable is only defined for the duration of an atomic operation, with no ability to persist values for subsequent operations. When attempting to formulate the semantics of shared variables that could persist their values across operations, we encountered numerous challenges when defining a notion of refinement that respected compositionality. We conjecture that, even for a basic notion of variable persistence, concurrent shared variable accesses between automata create a tight data dependency that prevents a compositional reasoning. We discuss the generality of this negative result in relation to other concurrency theories.
APA, Harvard, Vancouver, ISO, and other styles
6

Tarko, Andrew P., Jose Thomaz, and Mario Romero. SNIP Light User Manual. Purdue University, 2020. http://dx.doi.org/10.5703/1288284317136.

Full text
Abstract:
A systemic approach to identifying road locations that exhibit safety problems was provided by the Safety Needs Identification Program (SNIP and SNIP2) developed by the Purdue University Center for Road Safety (CRS). The new version, SNIP Light, has been developed to provide other users with planning-level traffic safety analysis capability for a wider range of uses, including Metropolitan Planning Organizations (MPOs) that want the tool for planning cost-effective safety programs in their metropolitan areas. SNIP Light reduces the demand on computing and data storage resources and replaces the SQL Server database system with an integrated module coded in-house that is considerably faster than the original component. Furthermore, the proficiency required to install and use the old version is no longer needed thanks to the intuitive single-window interface and the execution of file operations in the background without the user's involvement. Some operations, such as optimizing the funding of safety projects, have been removed to simplify the tool.
APA, Harvard, Vancouver, ISO, and other styles
7

Marshak, Ronni. User Interfaces Shouldn’t Just Be Intuitive. Patricia Seybold Group, 2010. http://dx.doi.org/10.1571/psgp12-16-10cc.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Pasupuleti, Murali Krishna. Augmented Human Intelligence: Converging Generative AI, Quantum Computing, and XR for Enhanced Human-Machine Synergy. National Education Services, 2025. https://doi.org/10.62311/nesx/rrv525.

Full text
Abstract:
Augmented Human Intelligence (AHI) represents a paradigm shift in human-AI collaboration, leveraging Generative AI, Quantum Computing, and Extended Reality (XR) to enhance cognitive capabilities, decision-making, and immersive interactions. Generative AI enables real-time knowledge augmentation, automated creativity, and adaptive learning, while Quantum Computing accelerates AI optimization, pattern recognition, and complex problem-solving. XR technologies provide intuitive, immersive environments for AI-driven collaboration, bridging the gap between digital and physical experiences. The convergence of these technologies fosters hybrid intelligence, where AI amplifies human potential rather than replacing it. This research explores AI-augmented cognition, quantum-enhanced simulations, and AI-driven spatial computing, addressing ethical, security, and societal implications of human-machine synergy. By integrating decentralized AI governance, privacy-preserving AI techniques, and brain-computer interfaces, this study outlines a scalable framework for next-generation augmented intelligence applications in healthcare, enterprise intelligence, scientific discovery, and immersive learning. The future of AHI lies in hybrid intelligence systems that co-evolve with human cognition, ensuring responsible and transparent AI augmentation to unlock new frontiers in human potential. Keywords: Augmented Human Intelligence, Generative AI, Quantum Computing, Extended Reality, XR, AI-driven Cognition, Hybrid Intelligence, Brain-Computer Interfaces, AI Ethics, AI-enhanced Learning, Spatial Computing, Quantum AI, Immersive AI, Human-AI Collaboration, Ethical AI Frameworks.
APA, Harvard, Vancouver, ISO, and other styles
9

Martinez, Kimberly D., and Gaojian Huang. Exploring the Effects of Meaningful Tactile Display on Perception and Preference in Automated Vehicles. Mineta Transportation Institute, 2022. http://dx.doi.org/10.31979/mti.2022.2164.

Full text
Abstract:
There is an existing issue in human-machine interaction: drivers of semi-autonomous vehicles are still required to take over control of the vehicle during system limitations. A possible solution may lie in tactile displays, which can present status, direction, and position information while avoiding overload of the sensory (e.g., visual and auditory) channels, reliably helping drivers make timely decisions and execute actions to successfully take over. However, limited work has investigated the effects of meaningful tactile signals on takeover performance. This study synthesizes the literature investigating the effects of tactile displays on takeover performance in automated vehicles and conducts a human-subject study to design and test the effects of six meaningful tactile signal types and two pattern durations on drivers' perception and performance during automated driving. The research team performed a literature review of 18 articles reporting human-subjects experiments on takeover performance that utilized tactile displays as takeover requests; takeover performance measures in these studies, such as response times, workload, and accuracy, were highlighted. The team then conducted a human-subject experiment in which 16 participants used a driving simulator that presented 30 meaningful vibrotactile signals randomly across four driving sessions, measuring reaction times (RTs), interpretation accuracy, and subjective ratings. Results from the literature suggest that tactile displays can present meaningful vibrotactile patterns via various in-vehicle locations to help improve drivers' performance during takeover and can be used to assist in the design of human-machine interfaces (HMIs) for automated vehicles. The experiment yielded results illustrating that higher-urgency patterns were associated with shorter RTs and higher intuitiveness ratings. Also, pedestrian-status and headway-reduction signals produced shorter RTs and higher confidence ratings than other tactile signal types. Finally, the signal types that yielded the highest accuracy were the surrounding-vehicle and navigation signal types. Implications of these findings may lie in informing the design of next-generation in-vehicle HMIs and future human factors studies on human-automation interactions.
APA, Harvard, Vancouver, ISO, and other styles