
Dissertations / Theses on the topic 'Y2K problem (Computer systems)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Y2K problem (Computer systems).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Johansson, Andreas. "Projektplaner vid y2k-projekt." Thesis, University of Skövde, Department of Computer Science, 1999. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-355.

Full text
Abstract:

This study examines how companies have worked to adapt their operations so that they get through the millennium shift without problems arising. I carried out the investigation by interviewing five respondents from companies that I consider interesting in this context.

In the investigation I focused on how the companies drew up their project plans for projects of this kind. As a starting point, I used a proposed project plan produced by the 2000-delegationen (the Swedish Year 2000 Delegation), which comprises eight phases that they consider valuable in a project of this kind.

My conclusion is that the companies taking part in the study used well-functioning project plans, even though these are not as extensive and complete as the 2000-delegationen's proposal. My assessment is that the companies are generally on schedule, and I believe they have good prospects of coping with the complications of the millennium shift.

APA, Harvard, Vancouver, ISO, and other styles
2

Hassebroek, Pamela Burns. "Institutionalized Environments and Information Security Management: Learning from Y2K." Diss., Georgia Institute of Technology, 2007. http://etd.gatech.edu/theses/available/etd-06192007-111256/.

Full text
Abstract:
Thesis (Ph. D.)--Public Policy, Georgia Institute of Technology, 2008.
Rogers, Juan D., Committee Chair; Klein, Hans K., Committee Member; Bolter, Jay David, Committee Member; Nelson-Palmer, Mike, Committee Member; Kingsley, Gordon, Committee Member.
3

Chen, Hsinchun. "Collaborative Systems: Solving the vocabulary problem." IEEE, 1994. http://hdl.handle.net/10150/105966.

Full text
Abstract:
Artificial Intelligence Lab, Department of MIS, University of Arizona
Can on-line information retrieval systems negotiate the diverse vocabularies of different users? This article suggests a robust algorithmic solution to the vocabulary problem in collaborative systems.
4

Lemons, Seth N. "Potential problem areas of design metrics for object oriented systems." Virtual Press, 2007. http://liblink.bsu.edu/uhtbin/catkey/1380103.

Full text
Abstract:
This study provides information on the effectiveness of design metrics when used on object oriented systems and explores means of increasing metric usefulness in regard to the problem areas identified. Evidence shows that current metrics are weak in assessing some qualities when faced with concepts central to object orientation. It describes practices in design and implementation that cause complications in calculating metrics and what effects those practices may have on various types of metrics by examining specific examples as well as discussing the theory involved. It examines assumptions which can be made in the formulation of metrics to avoid the issues caused by these practices and what effect these assumptions will have on metric results.
Department of Computer Science
5

Martin, Olga J. "Retranslation: a problem in computing with perceptions." Diss., Online access via UMI:, 2008.

Find full text
Abstract:
Thesis (Ph. D.)--State University of New York at Binghamton, Thomas J. Watson School of Engineering and Applied Science, Department of Systems Science and Industrial Engineering, 2008.
Includes bibliographical references.
6

Guertler, Siegfried. "Large scale computer-simulations of many-body Bose and Fermi systems at low temperature." Thesis, Click to view the E-thesis via HKUTO, 2008. http://sunzi.lib.hku.hk/hkuto/record/B40887741.

Full text
7

Bieszczad, Andrzej. "Neuromorphic distributed general problem solvers." Dissertation, Systems and Computer Engineering, Carleton University, Ottawa, 1996.

Find full text
8

Callvik, Johan, and Alva Liu. "Using Demographic Information to Reduce the New User Problem in Recommender Systems." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-209332.

Full text
Abstract:
Recommender systems rely heavily on user data to make accurate recommendations. This presents a problem for new users, for whom no such data is available. This study investigated whether this problem could be reduced by basing recommendations solely on users' demographic information. Experiments were conducted using a framework that employs K-means clustering. To evaluate the framework, the MovieLens 100K dataset was applied to a set of experiments. While the results did not exhibit any correlation between ratings and demographic features in the MovieLens 100K dataset, this does not rule out that the framework may be effective on other datasets with more demographic features.
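As an illustrative aside, not code from the thesis, the demographic cold-start idea described above can be sketched with a plain K-means step: cluster existing users on demographic features, then give a new user the item best rated inside the nearest cluster. All data, feature encodings, and the recommendation rule below are hypothetical.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: returns (centroids, cluster labels)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign every point to its nearest centroid, then recompute means.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Hypothetical users: columns are (age, gender code, occupation code).
demo = np.array([[25, 0, 1], [22, 0, 1], [60, 1, 2], [58, 1, 2]], dtype=float)
# Their ratings for two items (rows align with `demo`).
ratings = np.array([[5, 1], [4, 2], [1, 5], [2, 4]], dtype=float)

centroids, labels = kmeans(demo, k=2)

def recommend(new_user_demo):
    """Place a new user in the nearest demographic cluster and return
    the index of the item with the highest mean rating in that cluster."""
    c = np.linalg.norm(centroids - new_user_demo, axis=1).argmin()
    return int(ratings[labels == c].mean(axis=0).argmax())

print(recommend(np.array([24.0, 0.0, 1.0])))  # a young user gets item 0
```

Note that no ratings from the new user are needed, which is exactly the cold-start situation the thesis targets.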
9

Jansson, Andreas. "Web-baserade informationssystem : problem och möjligheter." Thesis, University of Skövde, Department of Computer Science, 1998. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-174.

Full text
Abstract:

The purpose of this report is to evaluate the problems and possibilities of using Web-based information systems, i.e. using the World Wide Web as the platform for an information system.

The report covers the development of organisations and the importance of their using a flexible information system, as well as the advantages and disadvantages for organisations of using a Web-based information system.

Usability is an important part of a system, and a case study was carried out on an existing hypermedia system. The overview problem is today a major problem in the use of hypermedia, and it is discussed specifically.

The result is that Web-based information systems have many advantages but at the same time also major disadvantages, which means that these systems cannot today replace traditional information systems. They can, however, serve as a complement. There are also usability problems that must be solved before hypermedia systems can become truly effective.

10

Aidemark, Jan. "Strategic Planning of Knowledge Management Systems : A Problem Exploration Approach." Doctoral thesis, Kista : Department of Computer and Systems Sciences, Stockholm University, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-6836.

Full text
11

Berggren, Robert, and Timmy Nielsen. "Investigating the Reliability of Known University Course Timetabling Problem Solving Algorithms with Updated Constraints." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229695.

Full text
Abstract:
Scheduling lectures, exams, seminars, etc. for a university turns out to be a harder task than it seems at first glance. This problem is known as the University Course Timetabling Problem (UCTP). The UCTP has been the subject of a number of competitions throughout the years, hosted by an organization called Practice and Theory of Automated Timetabling (PATAT). Because of these competitions, the problem has been given a standard description and set of constraints, as well as standard problem instances, for easier comparison of research and work on the subject. However, setting a standard like this has a major drawback: no variety is introduced, since new research on finding the best method to solve the UCTP is forced to focus on a specific set of constraints, and the algorithms developed will only be optimized with these constraints in consideration. In this research we compared five well-known UCTP algorithms on the standard set of constraints and on a different set of constraints. The comparisons showed a difference in the performance ranking of the algorithms when some constraints were changed to fit a certain need. The differences were not great, but big enough to state that previous research declaring which algorithms are best for the UCTP cannot be relied upon unless a near-identical set of constraints is used. If the goal is to find the best algorithm for a new set of constraints, one should not rely on a single previously defined algorithm but instead take two or three of the top performers for the greatest chance of finding the most optimized solution possible.
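As a hedged illustration of the abstract's central point, rather than any algorithm from the thesis, the sketch below shows how the ranking of two candidate timetables can flip when the soft-constraint weighting changes. The constraint names, violation counts, and weights are invented:

```python
# Hypothetical soft-constraint violation counts for two candidate timetables.
viol = {
    "tt1": {"late_slot": 2, "student_gap": 0},
    "tt2": {"late_slot": 0, "student_gap": 3},
}

def cost(name, weights):
    """Weighted sum of soft-constraint violations for one timetable."""
    return sum(weights[c] * n for c, n in viol[name].items())

def best(weights):
    """The timetable an optimizer would prefer under this constraint set."""
    return min(viol, key=lambda name: cost(name, weights))

# The same two timetables, ranked under two different constraint weightings:
print(best({"late_slot": 5, "student_gap": 1}))  # tt2 wins
print(best({"late_slot": 1, "student_gap": 4}))  # tt1 wins
```

An algorithm tuned to minimize one weighting can thus look inferior under another, which is the reliability concern the thesis raises.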
12

Morehead, Leslie Anne. "Determining the Factors Influential in the Validation of Computer-based Problem Solving Systems." PDXScholar, 1996. https://pdxscholar.library.pdx.edu/open_access_etds/1245.

Full text
Abstract:
Examination of the literature on methodologies for verifying and validating complex computer-based Problem Solving Systems led to a general hypothesis that there exist measurable features of systems that are correlated with the best testing methods for those systems. Three features (Technical Complexity, Human Involvement, and Observability) were selected as the basis of the current study. A survey of systems currently operating in over a dozen countries explored relationships between these system features, test methods, and the degree to which systems were considered valid. Analysis of the data revealed that certain system features and certain test methods are indeed related to reported levels of confidence in a wide variety of systems. A set of hypotheses was developed, focused in such a way that they correspond to linear equations that can be estimated and tested for significance using statistical regression analysis. Of 24 tested hypotheses, 17 were accepted, resulting in 49 significant models predicting validation and verification percentages, using 37 significant variables. These models explain between 28% and 86% of total variation. Interpretation of these models (equations) leads directly to useful recommendations regarding system features and types of validation methods that are most directly associated with the verification and validation of complex computer systems. The key result of the study is the identification of a set of sixteen system features and test methods that are multiply correlated with reported levels of verification and validation. Representative examples are:
• People are more likely to trust a system if it models a real-world event that occurs frequently.
• A system is more likely to be accepted if users were involved in its design.
• Users prefer systems that give them a large choice of output.
• The longer the code, or the greater the number of modules, or the more programmers involved on the project, the less likely people are to believe a system is error-free and reliable.
From these results, recommendations are developed that bear strongly on proper resource allocation for testing computer-based Problem Solving Systems. Furthermore, they provide useful guidelines on what should reasonably be expected from the validation process.
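As an aside, the kind of regression modelling the dissertation describes (linear equations predicting reported validation levels from system features) can be sketched with ordinary least squares. The feature names and survey numbers below are invented, not the study's data:

```python
import numpy as np

# Hypothetical survey rows: one system each.
# Features (0-1 scale): technical complexity, human involvement, observability.
X = np.array([[0.9, 0.2, 0.3],
              [0.4, 0.8, 0.7],
              [0.2, 0.9, 0.9],
              [0.7, 0.5, 0.4],
              [0.3, 0.6, 0.8]])
y = np.array([40.0, 75.0, 90.0, 55.0, 80.0])  # reported validation %

A = np.column_stack([np.ones(len(X)), X])     # prepend an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares fit

pred = A @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(round(float(r2), 3))  # fraction of variation explained by the model
```

The coefficients indicate how each feature moves the predicted validation percentage, and the R-squared plays the same role as the "28% to 86% of total variation" figures in the abstract.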
13

Vasandani, Vijay. "Intelligent tutoring for diagnostic problem solving in complex dynamic systems." Diss., Georgia Institute of Technology, 1991. http://hdl.handle.net/1853/24934.

Full text
14

Pirovolou, Dimitrios K. "The tracking problem using fuzzy neural networks." Diss., Georgia Institute of Technology, 1996. http://hdl.handle.net/1853/14824.

Full text
15

Andersson, Marcus. "Complexity and problem solving : A tale of two systems." Thesis, Umeå universitet, Institutionen för psykologi, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-150937.

Full text
Abstract:
The purpose of this thesis is to investigate whether increasing the complexity of a problem makes a difference for a learning system with dual parts. The dual parts of the learning system are modelled after the Actor and Critic parts of the Actor-Critic algorithm, within the reinforcement learning framework. The results show that no difference can be found in the relative performance of the Actor and Critic parts when the complexity of a problem is increased. These results could depend on technical difficulties in comparing the environments and the algorithms: the difference in complexity would then be non-uniform in an unknowable way and uncertain to use as a comparison. If, on the other hand, the change in complexity is uniform, this could indicate that there is an actual difference in how the Actor and the Critic handle different types of complexity. Further studies with a controlled increase in complexity are needed to establish which of the scenarios is most likely to be true. In the discussion, an idea is presented of using the Actor-Critic framework as a model to better understand the success rate of psychological treatments.
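For readers unfamiliar with the algorithm the thesis builds on, here is a minimal Actor-Critic sketch on a single-state, two-action problem: the critic maintains a value estimate, and the actor adjusts softmax action preferences along the TD error. The toy environment and learning rates are assumptions, not the thesis's setup.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
prefs = np.zeros(2)   # actor: softmax preferences over two actions
value = 0.0           # critic: value estimate for the single state
alpha_actor, alpha_critic = 0.1, 0.1

for _ in range(2000):
    probs = softmax(prefs)
    a = rng.choice(2, p=probs)
    reward = 1.0 if a == 1 else 0.0   # toy environment: action 1 is better
    td_error = reward - value         # critic's evaluation of the outcome
    value += alpha_critic * td_error  # critic update
    grad = -probs                     # actor update: policy-gradient step
    grad[a] += 1.0
    prefs += alpha_actor * td_error * grad

print(softmax(prefs)[1])  # probability of the better action, close to 1
```

The split is the point of interest in the thesis: the critic's error signal drives both its own update and the actor's, so the two parts can in principle respond differently to problem complexity.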
16

Fent, Darla. "An Examination Of The Variation In Information Systems Project Cost Estimates: The Case Of Year 2000 Compliance Projects." Thesis, University of North Texas, 2000. https://digital.library.unt.edu/ark:/67531/metadc2535/.

Full text
Abstract:
The year 2000 (Y2K) problem presented a fortuitous opportunity to explore the relationship between estimated costs of software projects and five cost influence dimensions described by the Year 2000 Enterprise Cost Model (Kappelman, et al., 1998) -- organization, problem, solution, resources, and stage of completion. This research was a field study survey of (Y2K) project managers in industry, government, and education and part of a joint project that began in 1996 between the University of North Texas and the Y2K Working Group of the Society for Information Management (SIM). Evidence was found to support relationships between estimated costs and organization, problem, resources, and project stage but not for the solution dimension. Project stage appears to moderate the relationships for organization, particularly IS practices, and resources. A history of superior IS practices appears to mean lower estimated costs, especially for projects in larger IS organizations. Acquiring resources, especially external skills, appears to increase costs. Moreover, projects apparently have many individual differences, many related to size and to project stage, and their influences on costs appear to be at the sub-dimension or even the individual variable level. A Revised Year 2000 Enterprise Model is presented incorporating this granularity. Two primary conclusions can be drawn from this research: (1) large software projects are very complex and thus cost estimating is also; and (2) the devil of cost estimating is in the details of knowing which of the many possible variables are the important ones for each particular enterprise and project. This points to the importance of organizations keeping software project metrics and the historical calibration of cost-estimating practices. 
Project managers must understand the relevant details and their interaction and importance in order to successfully develop a cost estimate for a particular project, even when rational cost models are used. This research also indicates that software cost estimating has political as well as rational influences at play.
17

Kyhlbäck, Hans. "The Problem of Objects in Design of Health Care Information Systems." Licentiate thesis, Karlskrona : Blekinge Institute of Technology, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-00293.

Full text
Abstract:
This thesis is about two different theoretical interpretations of objects and object-orientation in the design of health care information systems: the interpretations of Activity Theory/Developmental Work Research (AT/DWR) and of Computer Science (CS), respectively. One motive for my interest in objects of work and software is to better understand the problems and possibilities in an interdisciplinary research project. With its origin in 2001, a Wound Care Project began as a joint R & D endeavour with the initial idea of utilizing digital photos. Soon, an information system ("Hedvig") was developed for the purpose of managing digital photos and related treatment records on wounds. Later, this work expanded into the creation of a distributed information system ("Helar"), a digital prototype for the support of wound care treatment. Finally, the thesis sums up reflections related to the object concepts. AT/DWR has its strength in the analysis and design of required change in a work practice, but is still weak in methods and techniques for supporting the construction of specific computational systems. In a way this shortcoming is thought to be balanced by the technological CS discipline, one of whose main strengths is to develop theory and practice for the construction of computational information systems. In the interdisciplinary field of health care information systems design, this thesis suggests a further developed object concept, and related scenarios and use cases, as a way of taking advantage of a combination of those two different strengths.
18

Li, Vincent. "Knowledge representation and problem solving for an intelligent tutoring system." Thesis, University of British Columbia, 1990. http://hdl.handle.net/2429/29657.

Full text
Abstract:
As part of an effort to develop an intelligent tutoring system, a set of knowledge representation frameworks was proposed to represent expert domain knowledge. A general representation of time points and temporal relations was developed to facilitate temporal concept deductions as well as the explanation capabilities vital in an intelligent advisor system. Conventional representations of time use a single-referenced timeline and assign a single unique value to the time of occurrence of an event. They fail to capture the notion of events, such as changes in signal states in microcomputer systems, which do not occur at precise points in time, but rather over a range of time with some probability distribution. Time is, fundamentally, a relative quantity. In conventional representations, this relative relation is implicitly defined with a fixed reference, "time-zero", on the timeline. This definition is insufficient if an explanation of the temporal relations is to be constructed. The proposed representation of time solves these two problems by representing a time point as a time-range and making the reference point explicit. An architecture of the system was also proposed to provide a means of integrating various modules as the system evolves, as well as a modular development approach. A production-rule EXPERT, based on the rule framework used in the Graphic Interactive LISP tutor (GIL) [44, 45], an intelligent tutor for LISP programming, was implemented to demonstrate the inference process using this time point representation. The EXPERT is goal-driven and is intended to be an integral part of a complete intelligent tutoring system.
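As an illustrative sketch (not the thesis's implementation), a time point represented as a range with an explicit reference event might look like this; the class name, fields, and signal-timing example are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TimePoint:
    """A time point as a range measured from an explicit reference event,
    instead of a single value on an implicit 'time-zero' axis."""
    reference: str   # the event this point is measured from (made explicit)
    earliest: float  # lower bound of the occurrence range, e.g. in ns
    latest: float    # upper bound of the occurrence range

    def definitely_before(self, other: "TimePoint") -> bool:
        """True only when the two ranges cannot overlap."""
        assert self.reference == other.reference
        return self.latest < other.earliest

    def may_overlap(self, other: "TimePoint") -> bool:
        assert self.reference == other.reference
        return not (self.latest < other.earliest or other.latest < self.earliest)

# A signal that settles 5-12 ns after a clock edge, latched at exactly 10 ns:
data_valid = TimePoint("clock_edge", 5.0, 12.0)
latch = TimePoint("clock_edge", 10.0, 10.0)

print(data_valid.definitely_before(latch))  # False: the ranges overlap
print(data_valid.may_overlap(latch))        # True: a potential timing hazard
```

Because the reference event is stored rather than implied, an explanation facility can report not only that two events may coincide but relative to what they were measured.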
Faculty of Applied Science
Department of Electrical and Computer Engineering
Graduate
19

Johnson, Todd Richard. "Generic tasks in the problem-space paradigm : building flexible knowledge systems while using task-level constraints /." The Ohio State University, 1991. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487688507503976.

Full text
20

Adlerborn, Björn. "Parallel Algorithms and Library Software for the Generalized Eigenvalue Problem on Distributed Memory Computer Systems." Licentiate thesis, Umeå universitet, Institutionen för datavetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-119439.

Full text
Abstract:
We present and discuss algorithms and library software for solving the generalized non-symmetric eigenvalue problem (GNEP) on high performance computing (HPC) platforms with distributed memory. Such problems occur frequently in computational science and engineering, and our contributions make it possible to solve GNEPs fast and accurately in parallel using state-of-the-art HPC systems. A generalized eigenvalue problem corresponds to finding scalars y and vectors x such that Ax = yBx, where A and B are real square matrices. A nonzero x that satisfies the GNEP equation is called an eigenvector of the ordered pair (A,B), and the scalar y is the associated eigenvalue. Our contributions include parallel algorithms for transforming a matrix pair (A,B) to a generalized Schur form (S,T), where S is quasi upper triangular and T is upper triangular. The eigenvalues are revealed from the diagonals of S and T. Moreover, for a specified set of eigenvalues an associated pair of deflating subspaces can be computed, which is typically requested in various applications. In the first stage the matrix pair (A,B) is reduced to a Hessenberg-triangular form (H,T), where H is upper triangular with one nonzero subdiagonal and T is upper triangular, in a finite number of steps. The second stage reduces the matrix pair further to generalized Schur form (S,T) using an iterative QZ-based method. Starting from a one-stage method for the reduction from (A,B) to (H,T), a novel parallel algorithm is developed. In brief, a delayed update technique is applied to several partial steps, involving low-level operations, before the associated accumulated transformations are applied in a blocked fashion, which together with a wave-front task scheduler makes the algorithm scale when running in a parallel setting. The potential presence of infinite eigenvalues makes a generalized eigenvalue problem ill-conditioned.
Therefore the parallel algorithm for the second stage, reduction to (S,T) form, continuously scans for and robustly deflates infinite eigenvalues. This reduces their impact, so that they neither interfere with other real eigenvalues nor are misinterpreted as real eigenvalues. In addition, our parallel iterative QZ-based algorithm makes use of multiple implicit shifts and an aggressive early deflation (AED) technique, which radically speeds up convergence. The multi-shift strategy is based on independent chains of so-called coupled bulges and computational windows, which are an important source of the algorithm's scalability. The parallel algorithms have been implemented in state-of-the-art library software. The performance is demonstrated and evaluated using up to 1600 CPU cores for problems with matrices as large as 100000 x 100000. Our library software is described in a User Guide. The software is, optionally, tunable via a set of parameters for various thresholds, buffer sizes, etc. These parameters are discussed, and recommended values are specified which should result in reasonable performance on HPC systems similar to the ones we have been running on.
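As a small-scale illustration of the problem being solved (not the parallel library itself), SciPy exposes serial LAPACK routines for the same two objects the abstract describes: the generalized eigenvalues of a pair (A, B) and its generalized Schur (QZ) form. The matrices below are arbitrary examples.

```python
import numpy as np
from scipy.linalg import eig, qz

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[1.0, 0.0],
              [0.0, 2.0]])

# Generalized eigenvalues/vectors of the pair (A, B): A x = y B x
w, v = eig(A, B)
for val, vec in zip(w, v.T):
    assert np.allclose(A @ vec, val * (B @ vec))

# Generalized (real) Schur form via QZ: A = Q S Z^T, B = Q T Z^T,
# with S quasi upper triangular and T upper triangular.
S, T, Q, Z = qz(A, B)
assert np.allclose(Q @ S @ Z.T, A) and np.allclose(Q @ T @ Z.T, B)

# The eigenvalues are revealed by the diagonals of S and T.
print(sorted(np.diag(S) / np.diag(T)))  # eigenvalues 1.5 and 2.0
```

An infinite eigenvalue would show up here as a zero on the diagonal of T, which is why the thesis's parallel algorithm must detect and deflate such entries robustly.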
21

Kim, Yoonsoo. "Addressing the data recency problem in collaborative filtering systems." Link to electronic thesis, 2004. http://www.wpi.edu/Pubs/ETD/Available/etd-09244-024515/.

Full text
Abstract:
Thesis (M.S.)--Worcester Polytechnic Institute.
Keywords: Data recency problem; Recommender system; Time-based forgetting function; Time-based forgetting strategy; Collaborative filtering system. Includes bibliographical references (p. 73-74).
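As a sketch of what a time-based forgetting function can look like (the thesis's exact function is not reproduced here), an exponential decay that halves a rating's weight every fixed number of days could be written as follows; the half-life constant and sample ratings are invented:

```python
def time_weight(age_days, half_life=30.0):
    """Exponential forgetting: a rating loses half its influence every
    `half_life` days (the decay shape and constant are assumptions)."""
    return 0.5 ** (age_days / half_life)

def predicted_rating(rated):
    """Time-weighted average over (rating, age_in_days) pairs
    gathered from similar users in a collaborative filtering system."""
    num = sum(r * time_weight(age) for r, age in rated)
    den = sum(time_weight(age) for _, age in rated)
    return num / den

# An old 5-star rating is nearly forgotten next to a fresh 2-star one:
print(round(predicted_rating([(5.0, 120.0), (2.0, 1.0)]), 2))
```

Down-weighting stale ratings this way is one direct response to the data recency problem named in the keywords.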
22

Kotnour, Timothy G. "Design, development, and testing of an automated knowledge-acquisition tool to aid problem solving, decision making, and planning." Thesis, This resource online, 1992. http://scholar.lib.vt.edu/theses/available/etd-12302008-063800/.

Full text
23

Smith, Jill Yvonne. "Communication Quality in Information Systems Development: The Effect of Computer-Mediated Communication on Task-Oriented Problem Solving." Thesis, North Texas State University, 1986. https://digital.library.unt.edu/ark:/67531/metadc331600/.

Full text
Abstract:
The problem motivating this research is that ineffective communication may hamper systems development efforts. Specifically germane to this study are development efforts characterized as task-oriented, which require information-sharing and problem-solving activities. This research problem motivated an analysis of the communication process and led to the development of a temporal framework that delineates variables associated with task-oriented, end user/systems analyst communication interactions. Several variables within this framework are depicted in two theoretical models. The first model shows the theoretical relationship between an independent variable, communication mode (represented by asynchronous computer conferencing and face-to-face conferencing), and five dependent variables: (1) the amount of information shared, (2) the significance of the information shared, (3) the comprehensiveness of the information shared, (4) the perception of progress toward the goal, and (5) the perception of freedom to participate. The second model depicts the assumed interaction between communication mode, the five variables cited above (now acting as independent variables), and a dependent variable, communication quality. There are two theoretical components of communication quality: (1) deviation from an optimal set of user requirements, and (2) the degree of convergence (unity based on mutual understanding and mutual agreement) emanating from a communication interaction. Using the theoretical models as a guide, an experiment was designed and performed to test the relationships among the variables. The experimental results led to the rejection of all null hypotheses; the results strongly favored face-to-face conferencing for solving task-oriented, information-sharing problems analogous to the case used in the present study.
The findings indicate that asynchronous computer conferencing may have a detrimental effect on the thoroughness of information exchange, on the relevance of the information shared in terms of making the correct decision, and on the completeness of the consideration given to all problem dimensions.
APA, Harvard, Vancouver, ISO, and other styles
24

Roberts, T. Dale (Terrance Dale). "Learning automata solutions to the capacity assignment problem." Carleton University Dissertation, Computer Science. Ottawa, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
25

PARK, SEUNG YIL. "A GENERALIZED INTELLIGENT PROBLEM SOLVING SYSTEM BASED ON A RELATIONAL MODEL FOR KNOWLEDGE REPRESENTATION (SUPPORT SYSTEMS, EXPERT, DECISION AIDS)." Diss., The University of Arizona, 1986. http://hdl.handle.net/10150/183779.

Full text
Abstract:
Over the past decade, two types of decision aids, i.e., decision support systems (DSS) and expert systems (ES), have been developed along parallel paths, showing some significant differences in their software architectures, capabilities, limitations, and other characteristics. The synergy of DSS and ES, however, has great potential for helping make possible a generalized approach to developing a decision aid that is powerful, intelligent, and friendly. This research establishes a framework for such decision aids in order to determine the elementary components and their interactions. Based on this framework, a generalized intelligent problem solving system (GIPSS) is developed as a decision aid generator. A relational model is designed to provide a unified logical view of each type of knowledge, including factual data, modeling knowledge, and heuristic rules. In this knowledge model, a currently existing relational DBMS, with some extension, is utilized to manage each type of knowledge. For this purpose a relational resolution inference mechanism has been devised. A prototype GIPSS has been developed based on this framework. Two domain specific decision aids, COCOMO which estimates software development effort and cost, and CAPO which finds optimal process organization, have been implemented by using the GIPSS as a decision aid generator, demonstrating such features as its dynamic modeling capabilities and learning capabilities.
APA, Harvard, Vancouver, ISO, and other styles
26

Arikenbi, Temitayo. "Decision Support for Multi-Criteria Energy Generation Problem." Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-6073.

Full text
Abstract:
In this study, an attempt is made to apply Decision Support Systems (DSS) in planning for the expansion of energy generation infrastructure in Nigeria. There is an increasing demand for energy in that country, and the study will try to show that DSS modelling, using A Mathematical Programming Language (AMPL) as the modelling tool, can offer satisficing results that would serve as good decision support for deciding how to allocate investment in energy generation.
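The kind of generation-planning optimisation the abstract describes can be illustrated with a toy exhaustive search. This is a hedged sketch only: the technologies, capacities, and costs below are invented, and plain Python stands in for the AMPL model used in the thesis.

```python
from itertools import product

def plan_generation(techs, demand):
    """Exhaustive search over discrete build counts for each generation
    technology: meet `demand` (MW) at minimum total cost.  A toy stand-in
    for an AMPL optimisation model; all figures are invented."""
    best = None
    ranges = [range(t["max_units"] + 1) for t in techs]
    for counts in product(*ranges):
        capacity = sum(c * t["mw"] for c, t in zip(counts, techs))
        cost = sum(c * t["cost"] for c, t in zip(counts, techs))
        if capacity >= demand and (best is None or cost < best[1]):
            best = (counts, cost)
    return best

techs = [{"mw": 100, "cost": 90, "max_units": 5},   # gas
         {"mw": 50,  "cost": 40, "max_units": 6}]   # hydro
print(plan_generation(techs, demand=380))  # → ((1, 6), 330)
```

A real planning model would replace the exhaustive loop with an integer-programming solver, which is what a modelling language like AMPL delegates to.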
APA, Harvard, Vancouver, ISO, and other styles
27

Troy, Matthew A. "Producing Online Software Documentation at Ontario Systems, LLC." Miami University / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=miami1114178834.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Tanga, Vikas Reddy. "The Chief Security Officer Problem." Thesis, University of North Texas, 2018. https://digital.library.unt.edu/ark:/67531/metadc1404557/.

Full text
Abstract:
The Chief Security Officer Problem (CSO) consists of a CSO, a group of agents trying to communicate with the CSO, and a group of eavesdroppers trying to listen to the conversations between the CSO and its agents. Several information-theoretic questions are answered through a series of lemmas and theorems.
APA, Harvard, Vancouver, ISO, and other styles
29

Rindeback, Henry. "Att leva som man lär : problem och möjligheter med upper-CASE i systemutvecklingen." Thesis, University of Skövde, Department of Computer Science, 1997. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-259.

Full text
Abstract:

This report deals with upper-CASE, its effects and possibilities, and the obstacles to its introduction. Five people in five organisations were asked, by questionnaire or interview, about their experiences and views. Three of the organisations used upper-CASE tools in systems development and two did not.

· The former all considered that easier documentation, improved communication between project members, improved quality of the finished system, and a certain time saving in systems development were effects achieved through the use of upper-CASE. The developed system also sometimes becomes easier to maintain. Among negative effects, they perceived a certain dependence on the CASE vendor's support, and the user interface appears to be lacking in some tools. The use of upper-CASE in an organisation can also affect the users' knowledge: in one organisation, the number of people with the same competence, or skill set, grew, which made it easier to match tasks to competence in systems development.

· One of the consulting firms that did not use upper-CASE usually worked according to the customer's method. Since upper-CASE tools often have a method built in, using them became awkward. A sign was found that factors such as economy and usability mean that upper-CASE may contribute poorly to systems development in small organisations, compared with lower-CASE.

· Easier documentation, improved communication, and increased quality were advantages seen by all organisations in this study.

APA, Harvard, Vancouver, ISO, and other styles
30

Jacobsson, Magnus. "Identifiering av typsituationer som kan skapa problem vid design och implementering av relationsdatabaser." Thesis, University of Skövde, Department of Computer Science, 1998. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-166.

Full text
Abstract:

Fast access to correct information is today an important competitive tool for companies. Databases make this possible by letting several users share the organisation's information, so that the information can be kept up to date and consistent. This study deals with the relational database, characterised by data being stored in tables. The advantages of the database mean that it forms the core of many information systems, and it is therefore important that the database fulfils the requirements of the customer and the users.

The study identifies typical situations that can cause problems during the design and implementation of relational databases. The aim is for database developers to be able to use the study to know which typical situations can cause problems in a database project, and thus be prepared. The identified problems are exemplified and suggested solutions are presented.

The ER model is a data model often used in the design phase of an information system's relational database. Today there are situations that must be modelled which are more complex than those considered when the ER model was developed in the early seventies. As a result, the concepts of traditional ER models are insufficient for modelling in many situations. According to the study, the presence of inheritance, complex data types, and the use of CASE tools are situations that can cause problems in database design if a traditional ER model is used.

The problem situations concerning the implementation of a relational database are divided into two parts: those derived from functional requirements and those derived from non-functional requirements. Implementing complex data types and business rules, transferring an object model to a relational database, and comparing data and searching tree structures in SQL are examples of situations that can cause problems during implementation and that are derived from functional requirements in the study. Performance and scalability, machine dependence, and cost are the non-functional requirements the study deals with.

APA, Harvard, Vancouver, ISO, and other styles
31

Kuivinen, Fredrik. "Tight Approximability Results for the Maximum Solution Equation Problem over Abelian Groups." Thesis, Linköping University, Department of Computer and Information Science, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-3240.

Full text
Abstract:

In the maximum solution equation problem, a collection of equations is given over some algebraic structure. The objective is to find an assignment to the variables in the equations such that all equations are satisfied and the sum of the variables is maximised. We give tight approximability results for the maximum solution equation problem when the equations are given over finite abelian groups. We also prove that the weighted and unweighted versions of this problem have asymptotically equal approximability thresholds.

Furthermore, we show that the problem is equally hard to solve as the general problem even if each equation is restricted to contain at most three variables and solvable in polynomial time if the equations are restricted to contain at most two variables each. All of our results also hold for the generalised version of maximum solution equation where the elements of the group are mapped arbitrarily to non-negative integers in the objective function.
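For intuition about the problem being approximated, here is a brute-force solver for tiny instances over the cyclic group Z_k. This is an illustrative sketch only, not the thesis's approximation algorithms, and the example equations are invented.

```python
from itertools import product

def max_solution_zk(k, equations, n):
    """Brute-force MAX-SOL-EQN over the cyclic group Z_k.

    equations: list of (coeffs, rhs) meaning sum(c*x) ≡ rhs (mod k).
    Returns an assignment maximising sum(x) among those satisfying
    every equation, or None if no assignment satisfies them all.
    Exponential in n -- usable only for toy instances."""
    best = None
    for x in product(range(k), repeat=n):
        if all(sum(c * xi for c, xi in zip(coeffs, x)) % k == rhs
               for coeffs, rhs in equations):
            if best is None or sum(x) > sum(best):
                best = x
    return best

# x1 + x2 ≡ 1 (mod 3)  and  x2 + x3 ≡ 2 (mod 3)
eqs = [([1, 1, 0], 1), ([0, 1, 1], 2)]
print(max_solution_zk(3, eqs, 3))  # → (2, 2, 0), with sum 4
```

The hardness results in the thesis concern exactly this objective when n grows and exhaustive search becomes infeasible.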

APA, Harvard, Vancouver, ISO, and other styles
32

Andersson, Christofer. "Webfarming : En studie av möjligheter och problem." Thesis, University of Skövde, Department of Computer Science, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-501.

Full text
Abstract:

For companies to be able to compete in constantly changing markets, they need knowledge about, for example, customer needs, competitors, and new products. Sources of this knowledge exist both internally within organisations and externally in their environment. Both the internal and the external knowledge sources are important, but in recent years increasing interest has been directed at the external sources, as they have become ever more important for organisations to manage their business processes effectively.

One of the knowledge sources often mentioned is the Internet and the data stored on web pages around the world. However, it is difficult to find relevant data on the Internet because of its size and dynamic character. New techniques have therefore begun to emerge for collecting, structuring, and integrating data from the Internet. One of these techniques is Webfarming, which is used to automatically collect data from the Internet and integrate it with internal data stored in so-called data warehouses.

However, the amount of literature on Webfarming is limited, and what exists is largely written by the founder of Webfarming, Richard Hackathorn. This report therefore aims to investigate what possibilities and problems companies see with Webfarming. The results of this investigation have then been compared with the existing literature, in order to confirm or reinforce the ideas that Hackathorn expresses.

APA, Harvard, Vancouver, ISO, and other styles
33

Nafkha, Amor. "A geometrical approach detector for solving the combinatorial optimisation problem: Application to wireless communication systems." Phd thesis, Université de Bretagne Sud, 2006. http://tel.archives-ouvertes.fr/tel-00106708.

Full text
Abstract:
This thesis addresses the classical problem of decoding a linear mixture corrupted by additive Gaussian noise. Given a noisy observation y = Hx + b of a signal x ∈ {±1}^n linearly mixed by a known matrix H, where b is a noise vector, we seek the vector x minimising the Euclidean distance between y and Hx. This problem is known to be NP-complete and arises in many telecommunication systems (CDMA, MIMO, MC-CDMA, etc.). In this thesis we propose a quasi-optimal algorithm for this problem that is well suited to hardware implementation.

Our approach builds on classical operations-research methods: find initial points spread over the space of possible solutions and potentially close to the optimal solution (diversification), then perform a local search in the neighbourhood of these points (intensification). In this work, diversification is based on a geometrical approach using the dominant axes of noise concentration (the singular vectors associated with the smallest singular values of the matrix H). The bit-error-rate performance of the proposed method is close to optimal while keeping a constant complexity and a high degree of parallelism (even for very large matrices H, of order 100). We extended this method to the 16-QAM constellation on the one hand, and to the generation of soft decisions on the other.

We also studied the proposed algorithm from a hardware-implementation point of view. Its sensitivity to finite precision and to sub-optimal norms is described. A complexity study of the algorithm is presented, as well as the effects of a poor estimate of the matrix H. The proposed algorithm thus offers both a new alternative for quasi-optimal decoding of a noisy linear mixture and a high degree of parallelism enabling an efficient and fast hardware implementation.
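The decoding problem the thesis attacks can be stated compactly in code. The exhaustive maximum-likelihood detector below is a hedged illustration: it is exponential in n and is precisely what the thesis's geometrical diversification/intensification search is designed to avoid. The matrix and noise values are invented.

```python
from itertools import product

def ml_detect(H, y):
    """Exhaustive maximum-likelihood detection: find x in {-1,+1}^n
    minimising the squared Euclidean distance ||y - Hx||^2.
    Exponential in n; only feasible for toy sizes."""
    n = len(H[0])
    def dist2(x):
        return sum((yi - sum(hij * xj for hij, xj in zip(row, x))) ** 2
                   for row, yi in zip(H, y))
    return min(product((-1, 1), repeat=n), key=dist2)

H = [[1.0, 0.5], [0.3, 1.0]]
x_true = (1, -1)
# y = H @ x_true + small noise
y = [sum(h * x for h, x in zip(row, x_true)) + e
     for row, e in zip(H, [0.05, -0.1])]
print(ml_detect(H, y))  # → (1, -1): the transmitted vector is recovered
```

The thesis replaces this 2^n enumeration with a constant-complexity search seeded along the singular vectors of H, trading exactness for scalability.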
APA, Harvard, Vancouver, ISO, and other styles
34

Yang, Quangang, Mechanical & Manufacturing Engineering, Faculty of Engineering, UNSW. "The development of an integrated design system and its embedded frameworks for information handling, design space characterization and problem solving." Awarded by: University of New South Wales, School of Mechanical and Manufacturing Engineering, 2007. http://handle.unsw.edu.au/1959.4/29545.

Full text
Abstract:
In today's highly competitive landscape, new product development strategies are imperatives for companies to create and sustain competitive advantages. The objective of this research is to develop an integrated approach to automate, or aid, the design problem solving process. An Integrated Design System (IDS) is proposed focusing on the parametric and detail design. In this system, generation and evaluation of new design problems occur quickly and easily by changing the inputs for the design model. The IDS provides an integrated platform to incorporate available application programs such as CAD and FEM tools into a single system. Four major frameworks, namely information handling, problem decomposition, design space characterization, and problem solving, are proposed and embedded in it to implement the product development process. The information handling includes five aspects. A naming protocol is devised to organize the historical design cases. A search algorithm is proposed to retrieve a design case. A system-generated report is used to distribute the design information. A constraint definition frame is presented to define the relationships between design parameters. Two schemas, information matrix and constraint tree, are developed to represent information in the IDS. A diagonal-centered decomposition scheme is developed utilizing a Genetic Algorithm to decompose a complex design problem. In addition to the conventional genetic operators, two novel genetic operators, unequal position crossover and insertion mutation, are proposed. To characterize the design space, two methods, Incremental Response Method (IRM) and Artificial Neural Network (ANN), are presented. The IRM is derived from response surface method, while the back-propagated ANN is coded to be self-evaluated. The presented problem solving algorithm constitutes the solving mechanism of the IDS. 
Based on the assessment of the design objectives, all design parameters are given a priority index to facilitate the parameter selection. An independent recursive method is introduced in this algorithm to handle the design constraints. The case studies are performed on two design problems: a hard disk drive actuator arm and a shaft. The results show that the system can automatically align parameter values towards the objective values in a reasonable manner, and thus verify the feasibility of the embedded frameworks.
APA, Harvard, Vancouver, ISO, and other styles
35

Ibidunmoye, Olumuyiwa. "Performance problem diagnosis in cloud infrastructures." Licentiate thesis, Umeå universitet, Institutionen för datavetenskap, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-120287.

Full text
Abstract:
Cloud datacenters comprise hundreds or thousands of disparate application services, each with stringent performance and availability requirements, sharing a finite set of heterogeneous hardware and software resources. The implication of such a complex environment is that the occurrence of performance problems, such as slow application response and unplanned downtime, has become the norm rather than the exception, resulting in decreased revenue, damaged reputation, and huge human effort in diagnosis. Though causes can be as varied as application issues (e.g. bugs), machine-level failures (e.g. a faulty server), and operator errors (e.g. misconfigurations), recent studies have attributed capacity-related issues, such as resource shortage and contention, as the cause of most performance problems on the Internet today. As cloud datacenters become increasingly autonomous, there is a need for automated performance diagnosis systems that can adapt their operation to reflect the changing workload and topology of the infrastructure. In particular, such systems should be able to detect anomalous performance events, uncover manifestations of capacity bottlenecks, localize actual root cause(s), and possibly suggest or actuate corrections. This thesis investigates approaches for diagnosing performance problems in cloud infrastructures. We present the outcome of an extensive survey of existing research contributions addressing performance diagnosis in diverse systems domains. We also present models and algorithms for detecting anomalies in real-time application performance and for identifying anomalous datacenter resources based on operational metrics and spatial dependency across datacenter components. Empirical evaluations of our approaches show how they can be used to improve end-user experience and service assurance, and to support root-cause analysis.
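As a point of reference for the kind of anomaly detection discussed, a minimal sliding-window z-score detector can be sketched as follows. This is a baseline only, not the thesis's models, and the latency series is invented.

```python
from statistics import mean, stdev

def zscore_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the mean of the preceding `window` observations.
    The simplest real-time baseline for metric anomaly detection."""
    flagged = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Per-second response-time samples (ms); the spike at index 7 is anomalous.
latency_ms = [20, 21, 19, 20, 22, 21, 20, 95, 21, 20]
print(zscore_anomalies(latency_ms))  # → [7]
```

A production system would additionally have to adapt `window` and `threshold` to changing workloads, which is part of what motivates the thesis.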
APA, Harvard, Vancouver, ISO, and other styles
36

Andersson, Linda. "Problem i samband med processmodellering vid utveckling av branschspecifika affärssystem." Thesis, University of Skövde, Department of Computer Science, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-513.

Full text
Abstract:

A trend in the market for enterprise resource planning (ERP) systems is to develop industry-specific ERP systems, that is, systems adapted to the business processes of a particular industry. Since it is the business processes of an organisation that the ERP system is to support, mapping these processes is important when developing ERP systems.

This work aims to investigate what problems can arise in connection with process mapping during the development of industry-specific ERP systems. The work is further limited to problems concerning the selection, division, and naming of business processes. The work was carried out partly through a case study and partly through interviews with ERP system vendors.

The results of this work point to problems in deciding how many and which business processes the ERP system should cover. Furthermore, problems can arise concerning, among other things, the identification of main processes and the number of process levels to be applied. The problems concerning the naming of business processes turned out to be of lesser extent, although there is a risk that the names reflect departments within organisations rather than business processes.

APA, Harvard, Vancouver, ISO, and other styles
37

Kaya, Hamza. "Using Social Graphs In One-class Collaborative Filtering Problem." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/12611131/index.pdf.

Full text
Abstract:
One-class collaborative filtering is a special type of collaborative filtering that aims to deal with datasets that lack counter-examples. In this work, we introduced social networks as a new data source for one-class collaborative filtering (OCCF) methods and sought ways to benefit from them when dealing with OCCF problems. We divided our research into two parts. In the first part, we proposed different weighting schemes based on social graphs for some well-known OCCF algorithms. One of the weighting schemes we proposed outperformed our baselines for some of the datasets we used. In the second part, we focused on the dataset differences in order to find out why our algorithm performed better on some of the datasets. We compared social graphs with the graphs of users and their neighbors generated by the k-NN algorithm. Our research showed that social graphs generated from a specialized domain improve recommendation performance more than social graphs generated from a more generic domain.
APA, Harvard, Vancouver, ISO, and other styles
38

Cobb, Nicholas. "A Problem Solving Approach to Enterprise FileVault 2 Management and Integration." TopSCHOLAR®, 2013. http://digitalcommons.wku.edu/theses/1296.

Full text
Abstract:
Consumer technology adoption into large enterprise environments is occurring at an unprecedented rate. Employees require the flexibility and efficiency of using operating systems, computers, and mobility products they are familiar with and that enable their productivity. Due to this industry phenomenon, one large shipping enterprise must work to create solutions to integrate Apple’s OS X operating system into its traditional Windows-based operating environment. This level of integration must take place carefully to enable usability and foster the continued data security of enterprise assets. This paper describes the steps and methodology taken, as well as the rationale used, to accomplish the task of integrating Apple’s FileVault 2 full disk encryption technology into existing McAfee management infrastructure and traditional deployment and support workflows. Using a combination of industry and community solutions and techniques, a low-cost software solution named EscrowToEPO is created to facilitate the secure and user-friendly adoption of FileVault 2 as a full disk encryption solution. This paper also includes the success/failure rate of adoption and implications as to how the adoption of similar solutions can occur to support future operating systems or other environments.
APA, Harvard, Vancouver, ISO, and other styles
39

Wojnowski, Christine. "Reasoning with visual knowledge in an object recognition system /." Online version of thesis, 1990. http://hdl.handle.net/1850/10596.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Bajaj, Manas. "Knowledge composition methodology for effective analysis problem formulation in simulation-based design." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/26639.

Full text
Abstract:
Thesis (Ph.D)--Mechanical Engineering, Georgia Institute of Technology, 2009.
Committee Co-Chair: Dr. Christiaan J. J. Paredis; Committee Co-Chair: Dr. Russell S. Peak; Committee Member: Dr. Charles Eastman; Committee Member: Dr. David McDowell; Committee Member: Dr. David Rosen; Committee Member: Dr. Steven J. Fenves. Part of the SMARTech Electronic Thesis and Dissertation Collection.
APA, Harvard, Vancouver, ISO, and other styles
41

Pennada, Venkata Sai Teja. "Solving Multiple Objective Optimization Problem using Multi-Agent Systems: A case in Logistics Management." Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-20745.

Full text
Abstract:
Background: Multiple Objective Optimization Problems (MOOPs) are common and evident in every field, and container port terminals are one of the fields in which they occur. In this research, we have taken a case in logistics management and modelled multi-agent systems to solve the MOOP using the Non-dominated Sorting Genetic Algorithm II (NSGA-II).

Objectives: The purpose of this study is to build AI-based models for solving a multiple objective optimization problem occurring in port terminals. We first develop a port agent with an objective function of maximizing throughput and a customer agent with an objective function of maximizing business profit. We then solve the problem with both a single-objective optimization model and a multi-objective optimization model, and compare the results of the two models to assess their performance.

Methods: A literature review was conducted to choose the best algorithm among those previously used to solve other multiple objective optimization problems. An experiment was conducted to determine how well the models solved the problem so that all participants benefit simultaneously.

Results: The results show that all three participants, the port, customer one, and customer two, gained profit when the problem was solved with the multi-objective optimization model, whereas in the single-objective optimization model a single participant achieved earnings at a time, leaving the rest of the participants either at a loss or with minimal profits.

Conclusion: We can conclude that the multi-objective optimization model performed better than the single-objective optimization model because of the impartial results among the participants.
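The core of NSGA-II that this work relies on is non-dominated sorting, which partitions candidate solutions into Pareto fronts. Here is a minimal sketch for a maximisation problem; the candidate plan values are invented, and NSGA-II additionally uses crowding distance and genetic operators to evolve the population.

```python
def non_dominated_sort(points):
    """Partition objective vectors (maximisation) into Pareto fronts.
    Front 0 holds points dominated by nobody; front 1 holds points
    dominated only by front 0; and so on."""
    def dominates(a, b):
        # a dominates b: at least as good in every objective, not equal.
        return all(x >= y for x, y in zip(a, b)) and a != b

    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# (port throughput, customer profit) for four candidate plans
plans = [(10, 3), (8, 6), (9, 5), (7, 2)]
print(non_dominated_sort(plans))  # → [[0, 1, 2], [3]]
```

Plans 0, 1, and 2 trade throughput against profit without any one beating another on both objectives, which is exactly the "impartial result among participants" the abstract highlights.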
APA, Harvard, Vancouver, ISO, and other styles
42

Kavirayani, Srikanth. "Classical and neural net control and identification of non-linear systems with application to the two-joint inverted pendulum control problem." Diss., Columbia, Mo. : University of Missouri-Columbia, 2005. http://hdl.handle.net/10355/5835.

Full text
Abstract:
Thesis (M.S.)--University of Missouri-Columbia, 2005.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file, viewed on January 23, 2007. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
43

Glossenger, John Kenneth. "The role of planning in two artificial intelligence architectures." Instructions for remote access. Click here to access this electronic resource. Access available to Kutztown University faculty, staff, and students only, 1991. http://www.kutztown.edu/library/services/remote_access.asp.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Cirett, Galan Federico M. "Using Real-Time Physiological and Behavioral Data to Predict Students' Engagement during Problem Solving: A Machine Learning Approach." Diss., The University of Arizona, 2012. http://hdl.handle.net/10150/241971.

Full text
Abstract:
The goal of this study was to evaluate whether Electroencephalography (EEG) estimates of attention and cognitive workload, captured as students solved math problems, could be used to predict success or failure at solving the problems. Students solved a series of SAT math problems while wearing an EEG headset that generated estimates of sustained attention and cognitive workload each second. Students also reported on their level of frustration and the perceived difficulty of each problem. Results from training a Support Vector Machine (SVM) indicated that problem outcomes could be correctly predicted from the combination of attention and workload signals at rates better than chance. The EEG data also correlated with students' self-reports of problem difficulty. The findings suggest that relatively non-intrusive EEG technologies could be used to improve the efficacy of tutoring systems.
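As a rough illustration of predicting problem outcomes from per-second (attention, workload) features, here is a nearest-centroid classifier standing in for the study's SVM. All feature values and labels below are invented for illustration.

```python
def train_centroids(samples):
    """Compute one centroid per class from (features, label) pairs.
    A simple stand-in for an SVM on low-dimensional EEG features."""
    grouped = {}
    for features, label in samples:
        grouped.setdefault(label, []).append(features)
    return {label: tuple(sum(dim) / len(dim) for dim in zip(*feats))
            for label, feats in grouped.items()}

def predict(centroids, features):
    """Assign the class whose centroid is nearest in squared distance."""
    return min(centroids,
               key=lambda lab: sum((f - c) ** 2
                                   for f, c in zip(features, centroids[lab])))

# (attention, workload) -> did the student solve the problem?
train = [((0.8, 0.4), "success"), ((0.7, 0.5), "success"),
         ((0.3, 0.9), "failure"), ((0.4, 0.8), "failure")]
model = train_centroids(train)
print(predict(model, (0.75, 0.45)))  # → success
```

An SVM would instead learn a maximum-margin separating boundary, but for well-separated two-feature data the two approaches make similar predictions.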
APA, Harvard, Vancouver, ISO, and other styles
45

Drott, Ingemar. "Identifiering av problem och möjligheter i RE-processen : med avseende på intressenterna." Thesis, University of Skövde, Department of Computer Science, 2000. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-448.

Full text
Abstract:

The problem area of this thesis is found in the early part of information systems development. This development phase, called Requirements Engineering (RE), comprises activities for understanding, documenting, and validating the needs and requirements that exist for a future information system. RE is a decisive activity in the process of developing an information system that satisfies the expectations and needs of users and customers.

The various stakeholders involved in the RE process have different roles, backgrounds, and knowledge. For this reason, different problems and possibilities exist with respect to the stakeholders. This thesis investigates what these possibilities and problems may be. To answer the question, a small literature study and five interviews with analysts were carried out.

Stakeholder profiles have been produced in which the role and contribution of each stakeholder type are described, and for these, difficulties and problems that can arise in the RE process have also been identified. To give a clear picture of the identified possibilities and problems, and to show the interaction between the stakeholders, summarising figures have also been created.

APA, Harvard, Vancouver, ISO, and other styles
46

Chambart, Pierre. "On Post's embedding problem and the complexity of lossy channels." Phd thesis, École normale supérieure de Cachan - ENS Cachan, 2011. http://tel.archives-ouvertes.fr/tel-00777541.

Full text
Abstract:
Lossy channel systems were originally introduced to model communication protocols. They gave rise to a complexity class which remained poorly understood for a long time. In this thesis we study some of the most important gaps. In particular, we establish matching upper and lower bounds on the time complexity. We then describe a new proof tool: the Post Embedding Problem (PEP), a simple problem closely related to the Post Correspondence Problem and complete for this complexity class. Finally, we study PEP, its variants, and the languages of solutions of PEP, for which we provide complexity results and proof tools such as pumping lemmas.
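PEP asks whether there is a nonempty word w such that u(w) embeds as a scattered subword of v(w), for given morphisms u and v. A bounded brute-force search illustrates the statement; the morphisms below are invented, and bounded search is of course not a decision procedure for PEP in general.

```python
from itertools import product

def is_scattered_subword(a, b):
    """True iff a can be obtained from b by deleting letters
    (i.e., a is a subsequence of b)."""
    it = iter(b)
    return all(ch in it for ch in a)   # `in` consumes the iterator

def pep_solution(u, v, alphabet, max_len=6):
    """Search for a nonempty word w, |w| <= max_len, with u(w) a
    scattered subword of v(w).  Illustrative bounded search only."""
    def apply(m, w):
        return "".join(m[c] for c in w)
    for n in range(1, max_len + 1):
        for w in product(alphabet, repeat=n):
            if is_scattered_subword(apply(u, w), apply(v, w)):
                return "".join(w)
    return None

u = {"a": "ab", "b": "bb"}
v = {"a": "aabb", "b": "b"}
print(pep_solution(u, v, "ab"))  # → "a", since "ab" embeds in "aabb"
```

The subtlety the thesis exploits is that "scattered subword" (losing letters, as a lossy channel does) makes the problem decidable yet of very high complexity, unlike the undecidable Post Correspondence Problem.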
APA, Harvard, Vancouver, ISO, and other styles
47

Lopez-Rojas, Edgar Alonso. "Applying Simulation to the Problem of Detecting Financial Fraud." Doctoral thesis, Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-12932.

Full text
Abstract:
This thesis introduces a financial simulation model covering two related financial domains: Mobile Payments and Retail Stores systems.   The problem we address in these domains is different types of fraud. We limit ourselves to isolated cases of relatively straightforward fraud. However, in this thesis the ultimate aim is to introduce our approach towards the use of computer simulation for fraud detection and its applications in financial domains. Fraud is an important problem that impact the whole economy. Currently, there is a lack of public research into the detection of fraud. One important reason is the lack of transaction data which is often sensitive. To address this problem we present a mobile money Payment Simulator (PaySim) and Retail Store Simulator (RetSim), which allow us to generate synthetic transactional data that contains both: normal customer behaviour and fraudulent behaviour.    These simulations are Multi Agent-Based Simulations (MABS) and were calibrated using real data from financial transactions. We developed agents that represent the clients and merchants in PaySim and customers and salesmen in RetSim. The normal behaviour was based on behaviour observed in data from the field, and is codified in the agents as rules of transactions and interaction between clients and merchants, or customers and salesmen. Some of these agents were intentionally designed to act fraudulently, based on observed patterns of real fraud. We introduced known signatures of fraud in our model and simulations to test and evaluate our fraud detection methods. The resulting behaviour of the agents generate a synthetic log of all transactions as a result of the simulation. This synthetic data can be used to further advance fraud detection research, without leaking sensitive information about the underlying data or breaking any non-disclose agreements.   
Using statistics and social network analysis (SNA) on real data, we calibrated the relations between our agents and generated realistic synthetic data sets that were verified against the domain and validated statistically against the original source. We then used the simulation tools to model common fraud scenarios in order to ascertain how effective fraud detection techniques such as simple statistical threshold detection, perhaps the most common in use, really are. The preliminary results show that threshold detection is effective enough at keeping fraud losses at a set level, which means there seems to be little economic room for improved fraud detection techniques. We also implemented other applications of the simulation tools, such as a triage model and a measure of the cost of fraud. These proved to be an important aid for managers who aim to prioritise fraud detection and want to know how much to invest in it to keep losses below a desired limit under different experimental and expected fraud scenarios.
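The "simplest form of statistical threshold detection" evaluated in the abstract can be sketched as follows. This is a minimal illustration on a hand-made log; the threshold value and field names are assumptions, not taken from the thesis:

```python
def threshold_detector(transactions, threshold=500.0):
    """Flag every transaction whose amount exceeds a fixed threshold."""
    return [t for t in transactions if t["amount"] > threshold]

# Tiny hand-made synthetic log with ground-truth fraud labels.
log = [
    {"amount": 42.50,  "fraud": False},
    {"amount": 980.00, "fraud": True},
    {"amount": 120.00, "fraud": False},
    {"amount": 640.00, "fraud": True},
]

flagged = threshold_detector(log)
precision = sum(t["fraud"] for t in flagged) / len(flagged)
recall = sum(t["fraud"] for t in flagged) / sum(t["fraud"] for t in log)
# On this toy log both precision and recall are 1.0, but a fraudster who
# keeps each transaction just below the threshold evades this detector.
```

A labelled synthetic log like the ones the simulators produce makes it straightforward to measure such detectors against known fraud, which is the evaluation the abstract describes.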
APA, Harvard, Vancouver, ISO, and other styles
48

Svanberg, Maria. "Heuristisk utvärdering eller användbarhetstestning : Vilken är mer effektiv på att identifiera kognitiva problem?" Thesis, University of Skövde, Department of Computer Science, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-723.

Full text
Abstract:

Methods for evaluating whether a product's interface is usable are of great interest to today's researchers in the field of human-computer interaction. Because usability testing is generally time-consuming and can involve large costs, methods such as heuristic evaluation are used instead. However, the usability problems identified by inspection methods (including heuristic evaluation) have been questioned, partly because these methods identify different kinds of usability problems than those found when real users perform usability tests. If the results of such evaluations are used to guide the redesign of existing systems, they may provide a faulty basis for decisions, with potentially serious consequences for a design process.

The purpose of this work was to investigate whether heuristic evaluation is a more effective method than usability testing, with respect to validity, at identifying cognitive usability problems. To answer the research question, techniques such as interviews, scenarios, and "think aloud" were used. Although no significant results were obtained, the findings still suggest that heuristic evaluation is the more effective method. The results also showed that the two methods complement each other.

APA, Harvard, Vancouver, ISO, and other styles
49

Madraki, Golshan. "Efficient Algorithm to Find Performance Measures in Systems under Structural Perturbations." Ohio University / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1501838878410153.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Braun, Susan Lynn. "Exploration of the functionality requirements associated with development of a problem generation facility to supplement an intelligent tutoring system." Diss., Georgia Institute of Technology, 1993. http://hdl.handle.net/1853/21268.

Full text
APA, Harvard, Vancouver, ISO, and other styles