
Dissertations / Theses on the topic 'Intuitive user interface'


Consult the top 18 dissertations / theses for your research on the topic 'Intuitive user interface.'


1

Rang-Roslund, Pontus, and Velazquez Guillermo Munguia. "Development of an Intuitive Interface Structure for Ergonomic Evaluation Software." Thesis, Högskolan i Skövde, Institutionen för ingenjörsvetenskap, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-15702.

Full text
Abstract:
During the spring semester of 2018, a product development project was carried out at the University of Skövde by two design engineering students, Pontus Rang-Roslund and Guillermo Munguía Velazquez, in cooperation with the Smart Textiles for Sustainable Work Life project group at the university, which is developing a web-based software tool for ergonomists and work leaders/coaches. The aim of the project was to design the interface for this software. The project included a literature review focused on basic principles of usability, cognition, user interaction, human-computer interaction, user experience and ergonomic evaluation methods. To uncover user needs, interviews and observations were performed, and the inputs and outputs of the management information were analyzed. Based on the gathered information, concepts were generated and evaluated through formative evaluation. The final iteration produced a flexible and usable interface for ergonomic evaluations.
Smart Textiles for Sustainable Work Life
APA, Harvard, Vancouver, ISO, and other styles
2

Wu, Naomi. "A LONG-DISTANCE RELATIONSHIP : RECONNECTING HOTELS WITH THEIR GUESTS VIA INTUITIVE DESIGN." Thesis, Umeå universitet, Institutionen för psykologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-150792.

Abstract:
Currently, when planning travel, guests research via hotel websites while still preferring to book through third-party sites, which leads to a disconnect between hotels and their guests. A chat widget artifact that is added onto the hotel's website and linked through messaging applications was created by a start-up company, Bookboost, to bridge this gap. The current intuitiveness of the artifact, and future improvements that may increase intuitiveness, were investigated through a case study of user and expert analysis. 10 participants – 5 hotel staff users and 5 guest users – were sampled at hotel lobbies via systematic and non-random sampling. Participants ranged in age from 18 to 65 years old, with 30% being millennials. Task analysis, an interview, and a questionnaire were used for user analysis. The researcher acted as an evaluator and examined the artifact for flaws and possible improvements using activity theory's human-artifact model (HAM). Analyses suggest that current intuitiveness is fairly high, but there is room for improvement. There seems to be a difference between millennials and non-millennials, especially regarding the amount of time taken and preference for the artifact (versus more familiar methods for communicating with others). Interest and comfort in technology usage were factors in intuitiveness. Generally, those more comfortable with technology had higher zone of proximal development (ZPD) scores. Improvements have been suggested that may increase artifact intuitiveness, although these were not tested due to the scope of the study. Future research can continue to examine whether the suggested improvements have indeed increased intuitiveness in the artifact for users of all ages.
3

Horn, Carolin, and Christoph-Philipp Schreiber. "Augmented Reality als intuitive Benutzungsschnittstelle für das Roboterprogrammieren." Thelem Universitätsverlag & Buchhandlung GmbH & Co. KG, 2021. https://tud.qucosa.de/id/qucosa%3A75884.

Abstract:
Programming robot motion paths requires expert knowledge and is a time-consuming, laborious process. This contribution examines the use of augmented reality (AR), in the form of an AR head-mounted display (HMD), as an intuitive interface for robot programming. First, an overview of current, relevant research on AR applications in robotics is given. Current work in the field is devoted primarily to the technical implementation of individual functionalities. In this contribution from practice, the technical possibilities are instead adapted to the problems of potential users. The focus is thus on the added value for specific user groups and on the simple, intuitive operation of the AR interface itself. Following a user-centred development process, the challenges that experts and laypersons encounter in robot programming are first surveyed. On this basis, requirements are derived, and a tangible prototype is developed and designed that enables further studies. A planned study concept addressing aspects of user experience (UX) is outlined in the outlook.
4

Richards, Mark Andrew. "An intuitive motion-based input model for mobile devices." Thesis, Queensland University of Technology, 2006. https://eprints.qut.edu.au/16556/1/Mark_Richards_Thesis.pdf.

Abstract:
Traditional methods of input on mobile devices are cumbersome and difficult to use. Devices have become smaller while their operating systems have become more complex, to the extent that they are approaching the level of functionality found in desktop computer operating systems. The buttons and toggle-sticks currently employed by mobile devices are a relatively poor replacement for the keyboard-and-mouse style user interfaces of their desktop counterparts. For example, when looking at a screen image on a device, we should be able to move the device to the left to indicate that we wish the image to be panned in the same direction. This research investigates a new input model based on the natural hand motions and reactions of users. The model developed by this work uses the generic embedded video cameras available on almost all current-generation mobile devices to determine how the device is being moved, and maps this movement to an appropriate action. Surveys using mobile devices were undertaken to determine both the appropriateness and efficacy of such a model, and to collect the foundational data with which to build it. Direct mappings between motions and inputs were achieved by analysing users' motions and reactions in response to different tasks. Once the framework was completed, a proof of concept was created on the Windows Mobile platform. This proof of concept leverages both DirectShow and Direct3D to track objects in the video stream, maps these objects to a three-dimensional plane, and determines device movements from this data. This input model holds the promise of being a simpler and more intuitive method for users to interact with their mobile devices, with the added advantage that no hardware additions or modifications are required to existing mobile devices.
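The thesis's proof of concept used DirectShow and Direct3D on Windows Mobile; as a rough, hypothetical illustration of the underlying idea only, estimating the dominant camera shift between two frames and mapping it to a pan command might be sketched like this in NumPy (the function names, brute-force search, and sign convention are assumptions, not the thesis's implementation):

```python
import numpy as np

def estimate_shift(prev: np.ndarray, curr: np.ndarray, max_shift: int = 4):
    """Brute-force search for the (dx, dy) content translation that best
    aligns two small grayscale frames (sum of absolute differences)."""
    best, best_err = (0, 0), float("inf")
    h, w = prev.shape
    m = max_shift
    a = prev[m:h - m, m:w - m]          # interior window of the previous frame
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            b = curr[m + dy:h - m + dy, m + dx:w - m + dx]
            err = np.abs(a.astype(int) - b.astype(int)).sum()
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

def shift_to_action(dx: int, dy: int, dead_zone: int = 1) -> str:
    """Map the dominant shift to a pan command. Moving the device left makes
    scene content drift right in the frame (dx > 0), i.e. 'pan left'."""
    if abs(dx) <= dead_zone and abs(dy) <= dead_zone:
        return "none"
    if abs(dx) >= abs(dy):
        return "pan_left" if dx > 0 else "pan_right"
    return "pan_up" if dy > 0 else "pan_down"
```

A real implementation would track features rather than exhaustively search shifts, but the mapping layer (motion estimate in, input action out) is the essence of the model the abstract describes.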
5

Richards, Mark Andrew. "An intuitive motion-based input model for mobile devices." Queensland University of Technology, 2006. http://eprints.qut.edu.au/16556/.

Abstract:
Traditional methods of input on mobile devices are cumbersome and difficult to use. Devices have become smaller while their operating systems have become more complex, to the extent that they are approaching the level of functionality found in desktop computer operating systems. The buttons and toggle-sticks currently employed by mobile devices are a relatively poor replacement for the keyboard-and-mouse style user interfaces of their desktop counterparts. For example, when looking at a screen image on a device, we should be able to move the device to the left to indicate that we wish the image to be panned in the same direction. This research investigates a new input model based on the natural hand motions and reactions of users. The model developed by this work uses the generic embedded video cameras available on almost all current-generation mobile devices to determine how the device is being moved, and maps this movement to an appropriate action. Surveys using mobile devices were undertaken to determine both the appropriateness and efficacy of such a model, and to collect the foundational data with which to build it. Direct mappings between motions and inputs were achieved by analysing users' motions and reactions in response to different tasks. Once the framework was completed, a proof of concept was created on the Windows Mobile platform. This proof of concept leverages both DirectShow and Direct3D to track objects in the video stream, maps these objects to a three-dimensional plane, and determines device movements from this data. This input model holds the promise of being a simpler and more intuitive method for users to interact with their mobile devices, with the added advantage that no hardware additions or modifications are required to existing mobile devices.
6

Fu, Hongbo. "Differential methods for intuitive 3D shape modeling /." View abstract or full-text, 2007. http://library.ust.hk/cgi/db/thesis.pl?CSED%202007%20FU.

Full text
7

Blackler, Alethea Liane. "Intuitive interaction with complex artefacts." Thesis, Queensland University of Technology, 2006. https://eprints.qut.edu.au/16219/1/Alethea_Blackler_Thesis.pdf.

Abstract:
This thesis examines the role of intuition in the way that people operate unfamiliar devices, and the importance of this for designers. Intuition is a type of cognitive processing that is often non-conscious and utilises stored experiential knowledge. Intuitive interaction involves the use of knowledge gained from other products and/or experiences. Therefore, products that people use intuitively are those with features they have encountered before. This position has been supported by two initial experimental studies, which revealed that prior exposure to products employing similar features helped participants to complete set tasks more quickly and intuitively, and that familiar features were intuitively used more often than unfamiliar ones. Participants who had a higher level of familiarity with similar technologies were able to use significantly more of the features intuitively the first time they encountered them, and were significantly quicker at doing the tasks. Those who were less familiar with relevant technologies required more assistance. A third experiment was designed to test four different interface designs on a remote control in order to establish which of two variables - a feature's appearance or its location - was more important in making a design intuitive to use. As with the previous experiments, the findings of Experiment 3 suggested that performance is affected by a person's level of familiarity with similar technologies. Appearance (shape, size and labelling of buttons) seems to be the variable that most affects time spent on a task and intuitive uses. This suggests that the cues that people store in memory about a product's features depend on how the features look, rather than where on the product they are placed. Three principles of intuitive interaction have been developed. A conceptual tool has also been devised to guide designers in their planning for intuitive interaction. 
Designers can work with these in order to make interfaces intuitive to use, and thus help users to adapt more easily to new products and product types.
8

Blackler, Alethea Liane. "Intuitive interaction with complex artefacts." Queensland University of Technology, 2006. http://eprints.qut.edu.au/16219/.

Abstract:
This thesis examines the role of intuition in the way that people operate unfamiliar devices, and the importance of this for designers. Intuition is a type of cognitive processing that is often non-conscious and utilises stored experiential knowledge. Intuitive interaction involves the use of knowledge gained from other products and/or experiences. Therefore, products that people use intuitively are those with features they have encountered before. This position has been supported by two initial experimental studies, which revealed that prior exposure to products employing similar features helped participants to complete set tasks more quickly and intuitively, and that familiar features were intuitively used more often than unfamiliar ones. Participants who had a higher level of familiarity with similar technologies were able to use significantly more of the features intuitively the first time they encountered them, and were significantly quicker at doing the tasks. Those who were less familiar with relevant technologies required more assistance. A third experiment was designed to test four different interface designs on a remote control in order to establish which of two variables - a feature's appearance or its location - was more important in making a design intuitive to use. As with the previous experiments, the findings of Experiment 3 suggested that performance is affected by a person's level of familiarity with similar technologies. Appearance (shape, size and labelling of buttons) seems to be the variable that most affects time spent on a task and intuitive uses. This suggests that the cues that people store in memory about a product's features depend on how the features look, rather than where on the product they are placed. Three principles of intuitive interaction have been developed. A conceptual tool has also been devised to guide designers in their planning for intuitive interaction. 
Designers can work with these in order to make interfaces intuitive to use, and thus help users to adapt more easily to new products and product types.
9

Appl, Martin. "Intuitivní kreslení na platformě Android." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2012. http://www.nusl.cz/ntk/nusl-236437.

Abstract:
This master's thesis deals with the design and implementation of a finger-painting application for mobile devices with the Android operating system. The main focus is on a well-designed, intuitive and friendly user interface. The problems solved include spline interpolation of touch points, zoom and pinch gestures implemented with transformation matrices, an extensive history for action reversal, and a few basic tools.
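The spline interpolation the abstract mentions can be illustrated with a small, hypothetical sketch (not the thesis's actual Android code): a uniform Catmull-Rom spline passes through every sampled finger position, which is what makes a sparse, jittery sequence of touch events render as a smooth stroke.

```python
def catmull_rom(points, samples_per_segment=8):
    """Interpolate a smooth curve through raw 2D touch points using a
    uniform Catmull-Rom spline; the curve passes through every input point."""
    if len(points) < 2:
        return list(points)
    # Duplicate the endpoints so every segment has four control points.
    pts = [points[0]] + list(points) + [points[-1]]
    out = []
    for i in range(len(pts) - 3):
        p0, p1, p2, p3 = pts[i], pts[i + 1], pts[i + 2], pts[i + 3]
        for j in range(samples_per_segment):
            t = j / samples_per_segment
            t2, t3 = t * t, t * t * t
            out.append(tuple(
                0.5 * ((2 * p1[k])
                       + (-p0[k] + p2[k]) * t
                       + (2 * p0[k] - 5 * p1[k] + 4 * p2[k] - p3[k]) * t2
                       + (-p0[k] + 3 * p1[k] - 3 * p2[k] + p3[k]) * t3)
                for k in range(2)))
    out.append(tuple(points[-1]))
    return out
```

At t = 0 each segment evaluates exactly to its second control point, so the interpolated stroke hits every recorded touch position and fills in smooth curvature between them.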
10

McEwan, Mitchell W. "The influence of naturally mapped control interfaces for video games on the player experience and intuitive interaction." Thesis, Queensland University of Technology, 2017. https://eprints.qut.edu.au/107983/2/Mitchell_McEwan_Thesis.pdf.

Abstract:
This thesis empirically explores the influence of different types of naturally mapped control interfaces (NMCIs) for video games on the player experience and intuitive interaction. Across two repeated-measures experiments on racing and tennis games, more naturally mapped controls were shown to have largely positive effects, with some differences associated with player characteristics. The compensatory effects of natural mapping for casual players are revealed, along with some aversion to NMCIs amongst hardcore players. Overall implications are discussed, and a new NMCI Dimensions Framework is presented to aid future academic and design work leveraging NMCIs to improve video game accessibility and experiences.
11

Hedges, Mitchell Lawrence. "An investigation into the use of intuitive control interfaces and distributed processing for enhanced three dimensional sound localization." Thesis, Rhodes University, 2016. http://hdl.handle.net/10962/d1020615.

Abstract:
This thesis investigates the feasibility of using gestures as a means of control for localizing three-dimensional (3D) sound sources in a distributed immersive audio system. A prototype system was implemented and tested which uses state-of-the-art technology to achieve the stated goals. A Windows Kinect is used for gesture recognition, translating human gestures into control messages for the prototype system, which in turn performs actions based on the recognized gestures. The term distributed in the context of this system refers to the audio processing capacity: the prototype system partitions and allocates the processing load between a number of endpoints. The reallocated processing load consists of the mixing of audio samples according to a specification. The endpoints used in this research are XMOS AVB endpoints, whose firmware was modified to include the audio mixing capability, controlled via a state-of-the-art audio distribution networking standard, Ethernet AVB. The hardware used for the implementation of the prototype system is relatively cost-efficient in comparison to professional audio hardware, and is also commercially available to end users. The successful implementation and the results from user testing of the prototype system demonstrate that it is a feasible option for recording the localization of a sound source. The ability to partition the processing provides a modular approach to building immersive sound systems, removing the constraint of a centralized mixing console with a predetermined speaker configuration.
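The property that lets mixing be partitioned across endpoints is that a mix is a linear sum of gain-weighted sample buffers, so partial mixes computed on separate endpoints can themselves be summed without changing the result. A toy NumPy sketch of that equivalence (the function name and round-robin split are illustrative assumptions, not the thesis's XMOS firmware):

```python
import numpy as np

def distributed_mix(sources, gains, n_endpoints=2):
    """Partition (gain, source) pairs across endpoints round-robin, let each
    endpoint compute a partial mix, then sum the partial mixes. Because
    mixing is linear, the result equals a single centralized mix."""
    pairs = list(zip(gains, sources))
    groups = [pairs[i::n_endpoints] for i in range(n_endpoints)]
    partials = [
        sum((g * s for g, s in grp), np.zeros_like(sources[0], dtype=float))
        for grp in groups
    ]
    return sum(partials)
```

This is why the thesis can trade a centralized mixing console for a modular set of endpoints: each endpoint only needs its assigned subset of channels plus the mixing specification.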
12

Hedges, M. L. "An investigation into the use of intuitive control interfaces and distributed processing for enhanced three dimensional sound localization." Thesis, Rhodes University, 2016. http://hdl.handle.net/10962/2992.

Abstract:
This thesis investigates the feasibility of using gestures as a means of control for localizing three-dimensional (3D) sound sources in a distributed immersive audio system. A prototype system was implemented and tested which uses state-of-the-art technology to achieve the stated goals. A Windows Kinect is used for gesture recognition, translating human gestures into control messages for the prototype system, which in turn performs actions based on the recognized gestures. The term distributed in the context of this system refers to the audio processing capacity: the prototype system partitions and allocates the processing load between a number of endpoints. The reallocated processing load consists of the mixing of audio samples according to a specification. The endpoints used in this research are XMOS AVB endpoints, whose firmware was modified to include the audio mixing capability, controlled via a state-of-the-art audio distribution networking standard, Ethernet AVB. The hardware used for the implementation of the prototype system is relatively cost-efficient in comparison to professional audio hardware, and is also commercially available to end users. The successful implementation and the results from user testing of the prototype system demonstrate that it is a feasible option for recording the localization of a sound source. The ability to partition the processing provides a modular approach to building immersive sound systems, removing the constraint of a centralized mixing console with a predetermined speaker configuration.
13

De, Martini Alessandro. "Intuitive programming of mobile manipulation applications : A functional and modular GUI architecture for End-User robot programming." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-303006.

Abstract:
Mobile manipulators are changing the way companies and industries complete their work. Untrained end users risk facing non-functional and non-user-friendly graphical user interfaces. Recently, there have been shortages of people and talent in the healthcare industry, where such applications could be used to accomplish simple, low-level tasks. All these reasons contribute to the need to find functional ways for robots and users to communicate, allowing mobile manipulation applications to expand. This thesis addresses the problem of finding an intuitive way to deploy a mobile manipulator in a laboratory environment. It analyzes whether a functional graphical user interface can let the user work with a manipulator efficiently and without excessive effort. Creating a modular interface based on user needs is the innovative value of this work: it allows the expansion of mobile manipulator applications and increases the number of possible users. To accomplish this purpose, a graphical user interface application is proposed using an explanatory research strategy. First, user data were acquired using an ad hoc research survey and combined with implementations from the literature to create the right application design. Then, an iterative implementation based on code creation and tests was used to design a valuable solution. Finally, the results from an observational user study with non-roboticist programmers are presented. The results were validated with the help of 10 potential end users and a validation matrix, demonstrating that the system is both functional and user-friendly for novices, but also expressive for experts.
14

Akgun, Mahir. "The Effect Of Apologetic Error Messages And Mood States On Computer Users." Master's thesis, METU, 2007. http://etd.lib.metu.edu.tr/upload/3/12608479/index.pdf.

Abstract:
The main aim of this study, in which 310 university students participated, is to investigate whether or not computer interfaces offering human-like apologetic error messages influence users' self-appraisals of performance and actual performances in the computerized environment. For the study, an online instructional material which includes deliberate design problems leading to user frustration was developed. The study is comprised of three phases. In the first phase, based on the CCSARP (Cross-Cultural Study of Speech Act Realization Patterns) coding manual and the studies conducted with the framework provided by the manual, apology strategy sequences were elicited from Turkish participants. Two of these apology strategy sequences were selected for producing two apology error messages. In addition to these apology messages, one plain computer error message was also developed for experimental control. The second phase of the study was conducted to determine whether these three messages were perceived as apologies. It was found that the two apology messages were perceived as apologies and the plain computer message was not. In the third phase these three messages were used to investigate the relationship between mood, self-appraisal of performance and actual performance after the transmission of the apologetic error messages. The findings of this study show that the frequencies of apology strategies preferred in the computerized environment are similar to those utilized in the social context. Statistical analyses also reveal that the influence of apology messages on self-appraisal of performance depends on participants' mood state and the contents of the apology messages.
15

Chen, Hui-Tsz (陳慧慈). "Digital Brush with Intuitive Operation Design for Children on Tangible User Interface." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/05679091741937534894.

Abstract:
Master's thesis, National Cheng Kung University, Department of Industrial Design, 2010.
This study was based on intuitive operation. Combined with a tangible user interface (TUI), it investigated forms of digital brush whose intuitive operation suits children aged 4-7. Considering the needs and cognitive abilities of children at this age, the design applies their tactile and visual abilities to learning and conveys use through form: by identifying different patterns and observing the shape, children can change the brush intuitively and produce more varied lines, extending the value of traditional drawing into digital drawing. The first phase of the study surveyed the painting habits of 4-7-year-old children; through a questionnaire and user survey, the children's drawing needs and operating characteristics were summarized, and system functions and control mechanisms were defined. The second stage was brainstorming, diverging into several intuitive-operation ideas and converging them into a workable program. The third stage was a preliminary experiment, allowing users to practice with and test those concepts. The final stage was a system performance test and subjective evaluation. The experimental results showed that the intuitive-operation performance was superior to a graphical user interface. First, children could use touch and vision to feel the texture and form, and draw strokes smoothly by changing grip position and angle. Second, they used more varied strokes and improved significantly when using the digital brush with intuitive operation. Third, children had more fun in use, which promoted their willingness to keep using it. Therefore, an intuitively operated TUI digital brush for children aged 4-7 is a feasible way to promote fluency, fun and learning in digital drawing.
16

Sly, Ian M. P. "A new vision interface : "defining what instead of how" : making image analysis functions transparent to the user by coupling them to handling tasks in an intuitive interface for materials handling applications." 1997. http://hdl.handle.net/2292/2506.

Abstract:
This thesis addresses the need for adaptability in vision systems that measure system state information in a sensory feedback role for the control and coordination of flexible discreteitem materials handling operations, such as those performed by a robotic palletising system. In addition, this thesis addresses the need for vision systems that are more easily configured by users, such as factory technicians and operators, who have lower skill levels than those generally required to (re-)configure a machine vision system. In response, a unique coupling mechanism and intuitive human-computer interface have been developed, hiding the complexity of image analysis from the end-user and simplifying the way that a machine vision system is configured. The mechanism couples machine vision-related "visual checks” to materials handling tasks in a generic framework of materials handling activities. Visual checks which define what control information is required are implicitly linked to image analysis functions which define how that information is extracted from digitised images of a materials handling system. Consequently, this research has developed a set of task - visual check "building blocks" that can be used in various combinations to define the sequence of actions and image analysis required to perform a variety of materials handling operations. In addition, a number of pre-defined task – visual check combinations and mechanisms for manipulating them have been developed, providing solution templates that can be used immediately or modified to suit application-specific requirements. These developments have been realised together with several aesthetic, ergonomic and functional features in a machine vision configuration interface, known as SlyVision. 
SlyVision's modularity, extensibility and upgradeability, expressed to both the end-user and the system developer through its underlying object-oriented architecture and intuitive user interface design, make important contributions to its overall adaptability. Demonstrations involving a typical palletising and a de-palletising operation have shown how SlyVision is used to specify visual checks and configure the associated machine vision components without requiring the end-user to select or apply image analysis techniques or functions. In addition, the relative simplicity of the configuration process is demonstrated. Consequently, these developments assist people with limited understanding of machine vision technology to set up and maintain a vision system, thereby improving their ability to keep pace with frequent changes in their materials handling operations, while limiting the cost in time, money and effort required to (re-)configure a vision system.<br>Whole document restricted, but available by request; use the feedback form to request access.
17

Cheng, Ya-Wen, and 鄭雅文. "Intuitive Interface Design For Elderly-demented Users." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/52106175688370556926.

Full text
Abstract:
Master's thesis<br>Fu Jen Catholic University<br>Master's Program, Department of Applied Arts<br>103<br>The elderly experience deteriorating physical conditions and gradual memory decline. Dementia is a progressive degenerative disease in which executive dysfunction is a core symptom, leaving patients unable to sustain a normal daily life. If products are designed to strengthen the intuitive interaction of the product interface, they will help elderly patients with dementia to live more independently. This study proposes a product interface design that supports intuitive operation by elderly patients with dementia. The requirements for intuitive operation of product interfaces were identified from duration records, video observations of a simulated interface, and observations of remote control prototypes. Twenty-seven subjects were recorded in the simulated interface study, and six subjects were recorded in the remote control prototype testing. The results indicated that number keys presented as separate buttons were more likely to afford intuitive pressing operations for elderly patients with dementia. For example, when designing the volume and channel buttons on a remote control, the up and down keys should be separate. Additionally, the shapes should be simple and the buttons should be free of text and represented by symbols instead, as this most readily allows elderly patients with dementia to perceive the affordance of intuitive operation.
18

Anderson, Nathan. "Intuitive interaction in mobile application interfaces and the role animation has on information integration: an empirical user study." Thesis, 2020. https://hdl.handle.net/10539/30674.

Full text
Abstract:
A dissertation submitted in fulfilment of the requirements for the degree of Master of Digital Arts at the University of the Witwatersrand, Johannesburg, 2020<br>In the design and development industry, animation in the mobile interface is regarded as making interaction with mobile apps more intuitive. This study investigates that claim from the perspective of intuitive interaction research in Human-Computer Interaction (HCI) and Judgment and Decision Making (JDM). The hypothesis is that animation in the mobile interface can influence how individuals integrate information, which is an underlying process of intuition. A wholly between-subjects design was used to test the relationship between animation, information integration, and judgmental evaluation. One hundred and fifty-two (152) participants were randomly assigned to either the experimental or the control condition. The control condition is a replication of an experiment in automatic processing (Betsch, Plessner, Schwieren, & Gütig, 2001), and the experimental condition is an extension of this earlier work in which animation is introduced as the independent variable. The results suggest that animation has a significant effect on how information is integrated and on the resulting judgmental evaluations formed by participants.<br>CK2021