Dissertations / Theses on the topic 'Automated Design Framework'

Below are the top 29 dissertations and theses on the topic 'Automated Design Framework.'

1

Hwang, Yves. "An automated software design synthesis framework." University of Western Australia. School of Electrical, Electronic and Computer Engineering, 2009. http://theses.library.uwa.edu.au/adt-WU2009.0157.

Full text
Abstract:
This thesis presents an automated software design synthesis framework known as Project Calliope. This framework aligns with Harel's automated software development process, as it addresses the aspect of automating design and implementation. Project Calliope is based on a Statecharts synthesis approach from the literature. The main goal of Project Calliope is to automatically generate testable Unified Modeling Language (UML) Statecharts that are deterministic, visually manageable and UML compliant. In order to minimise design errors in the generated UML Statecharts, Project Calliope supports model checking through Statecharts execution. In addition, executable code is automatically generated based on the synthesised UML Statecharts. This framework seeks to provide a pragmatic design framework that can be readily incorporated into software development methodologies that leverage UML. In this thesis, Project Calliope is applied to three simple applications from Whittle and Schumann's examples and to a case study based on a commercial application: an automatic teller machine, a coffee dispenser, an agent application, and a groupware application, respectively.
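For readers unfamiliar with executable Statecharts, the sketch below (not taken from the thesis) shows, in Python, the kind of deterministic state machine such a synthesis step might emit for a coffee-dispenser example; the state and event names are invented.

```python
# Minimal deterministic state machine of the kind a Statecharts synthesis
# step might emit (illustrative names, not Project Calliope output).
TRANSITIONS = {
    ("idle", "coin_inserted"): "ready",
    ("ready", "button_pressed"): "dispensing",
    ("dispensing", "cup_filled"): "idle",
}

def step(state: str, event: str) -> str:
    """Advance the machine; determinism = at most one target per (state, event)."""
    return TRANSITIONS.get((state, event), state)  # ignore unexpected events

state = "idle"
for event in ["coin_inserted", "button_pressed", "cup_filled"]:
    state = step(state, event)
print(state)  # -> idle
```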
2

Young, Jared Matthew. "Nesting Automated Design Modules In An Interconnected Framework." BYU ScholarsArchive, 2005. https://scholarsarchive.byu.edu/etd/636.

Full text
Abstract:
This thesis seeks to extend the PDG methodology by developing a generalized formal method for nesting PDGs in an interconnected system. A procedure for decomposing an individual PDG into reusable modules will be defined and a software architecture will be presented which takes advantage of these reusable modules. This method breaks the PDG structure into discrete elements known as PDG objects, PDG modules and PDG services. Each of these elements forms a distinct unit of reuse and each can be seen as a "little" PDG. Two different industrial implementations of this method are presented. These examples show that it is possible to share PDG services amongst multiple PDGs and provide a mechanism to create a PDG for a complicated system.
3

Young, Jared M. "Nesting automated design modules in an interconnected framework /." Diss., Brigham Young University, 2005. http://contentdm.lib.byu.edu/ETD/image/etd973.pdf.

Full text
4

Deas, Alexander Roger. "An idiomatic framework for the automated synthesis of topographical information from behavioural specifications." Thesis, University of Edinburgh, 1985. http://hdl.handle.net/1842/13604.

Full text
5

Syrko, Ariel. "Development and evaluation of a framework for semi-automated formalization of automotive requirements." Thesis, Blekinge Tekniska Högskola, Institutionen för tillämpad signalbehandling, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-11644.

Full text
Abstract:
The quantity and intricacy of the features implemented in vehicles have expanded rapidly over the past few years. The vision of an autonomous vehicle is no longer a dream or a science-fiction movie, but a coming reality. In order to reach better quality and high safety, advanced verification techniques are required. Simulink Design Verifier is a model checking tool based on formal verification, which can be used effectively to solve problems concerning error detection and testing at earlier stages of a project. The transformation of requirements written in traditional form into Simulink Design Verifier objectives can be time-consuming and requires knowledge of the system model and the verification tools. In order to reduce the time required and to guide a user through the system model and the verification tool, a semi-automated framework has been developed. An implementation of restricted English grammar patterns as Simulink objects makes the patterns easier for engineers to understand and reduces the time required. The developed framework is flexible and intuitive, and hence could be a solution for other branches of industry, but further tests and verification would be required. This thesis highlights the whole process of transforming system requirements written in natural language into Simulink Design Verifier objectives. The Fuel Level Display System model, currently used in almost all of Scania's vehicles, is analysed. Limitations and errors encountered during the development process, such as the flexibility of Simulink Design Verifier to capture requirements, the behaviour of the patterns, and the ambiguity of system requirements, are analysed and described in this thesis.
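As a rough illustration of the pattern idea (the grammar and the output syntax below are invented, not the thesis's patterns or Simulink Design Verifier notation), a restricted-English requirement can be matched and mapped to a formal objective mechanically:

```python
import re

# Illustrative only: one restricted-English pattern and the kind of formal
# objective text it could be mapped to.
PATTERN = re.compile(
    r"[Ww]henever (?P<trig>\w+) is true, (?P<resp>\w+) shall be true "
    r"within (?P<n>\d+) steps"
)

def formalize(requirement: str) -> str:
    m = PATTERN.match(requirement)
    if m is None:
        raise ValueError("requirement does not match a known pattern")
    return f"implies(delay_at_most({m['trig']}, {m['resp']}, {m['n']}))"

print(formalize("Whenever low_fuel is true, warning_lamp shall be true within 5 steps"))
```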
6

Yim, Sungshik. "A Retrieval Method (DFM Framework) for Automated Retrieval of Design for Additive Manufacturing Problems." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/14553.

Full text
Abstract:
Problem: The process planning task for a given design problem in additive manufacturing can be greatly enhanced by referencing previously developed process plans. However, identifying appropriate process plans for the given design problem requires appropriate mapping between the design domain and the process planning domain. Hence, the objective of this research is to establish a mathematical mapping between the design domain and the process planning domain such that previously developed appropriate process plans can be identified for the given design task. Furthermore, the identification of an appropriate mathematical theory that enables computational mapping between the two domains is of interest. Through such computational mapping, previously developed process plans are expected to be shared in a distributed environment using an open repository. Approach: The design requirements and process plans are discretized using empirical models that compute exact values of process variables for the given design requirements. Through this discretization, subsumption relations among the discretized design requirements and process plans are identified. Appropriate process plans for a given design requirement are identified by subsumption relations in the design requirements. Also, the design requirements that can be satisfied by the given process plans are identified by subsumption relations among the process plans. To computationally realize such mapping, a description logic (ALE) is identified and justified to represent and compute the subsumption relation. Based on this investigation, a retrieval method (DFM framework) is realized that enables storage and retrieval of process plans. Validation: Theoretical and empirical validations are performed using the validation square method. For the theoretical validation, an appropriate description logic (ALE) is identified and justified. Also, the use of subsumption in mapping the two domains and realizing the DFM framework is justified. For the empirical validation, the storage and retrieval performance of the DFM framework is tested to demonstrate its theoretical validity. Contribution: In this research, two areas of contribution are identified: DFM and engineering information management. In DFM, the retrieval method that relates the design problem to appropriate process plans through mathematical mapping between the design and process planning domains is the major contribution. In engineering information management, the major contributions are the development of information models and the identification of their characteristics. Based on this investigation, an appropriate description logic (ALE) is selected and justified. Also, the corresponding computational feasibility (non-deterministic polynomial time) of subsumption is identified.
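A minimal sketch of the subsumption idea, assuming requirements and process capabilities discretized into attribute ranges; the ALE description-logic reasoning is not reproduced here, and the attribute names are stand-ins:

```python
# A capability "subsumes" a need if every discretized attribute range of the
# need is contained in the capability's range (toy stand-in for ALE subsumption).
def subsumes(general: dict, specific: dict) -> bool:
    return all(
        attr in general and general[attr][0] <= lo and hi <= general[attr][1]
        for attr, (lo, hi) in specific.items()
    )

plan_capability = {"accuracy_mm": (0.05, 0.5), "wall_thickness_mm": (0.4, 10.0)}
design_need     = {"accuracy_mm": (0.1, 0.2),  "wall_thickness_mm": (1.0, 2.0)}

# The plan can serve the design if its capability subsumes the need.
print(subsumes(plan_capability, design_need))  # True -> retrieve this plan
```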
7

Yim, Sungshik. "A retrieval method (DFM framework) for automated retrieval of design for additive manufacturing problems." Available online, Georgia Institute of Technology, 2007. http://etd.gatech.edu/theses/available/etd-03012007-113030/.

Full text
Abstract:
Thesis (Ph.D.)--Mechanical Engineering, Georgia Institute of Technology, 2007.
Nelson Baker, Committee Member ; Charles Eastman, Committee Member ; Christiaan Paredis, Committee Member ; Janet Allen, Committee Member ; David Rosen, Committee Chair.
8

Moodley, Anand. "Development of a unified mass and heat integration framework for sustainable design: an automated approach." Diss., Pretoria : [s.n.], 2007. http://upetd.up.ac.za/thesis/available/etd-04222008-094925/.

Full text
9

Aram, Shiva. "A Knowledge-based system framework for semantic enrichment and automated detailed design in the AEC projects." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/53496.

Full text
Abstract:
Adoption of a streamlined BIM workflow throughout the lifecycle of AEC projects will provide the project stakeholders with the rich information embedded in parametric design models. Users can incorporate this rich information in various activities, improving the efficiency and productivity of project activities and potentially enhancing accuracy and reducing errors and rework. Two main challenges to such a streamlined information flow throughout AEC projects, which have not been sufficiently addressed by previous research efforts, are the lack of semantic interoperability and the large gap and misalignment between the BIM information provided by design activities and the information required for performing preconstruction and construction activities. This research effort proposes a framework for a knowledge-based system (KBS) that encapsulates domain experts' knowledge and represents it through modularized rule set libraries as well as connected design automation and optimization solutions. The research attempts to provide a methodology for automatic semantic enrichment of design models as well as automated detailed design to fill the information gap between design and preconstruction project activities, streamlining the BIM workflow and enhancing its value in AEC projects.
10

Moura, César. "Conceiving and Implementing a language-oriented approach for the design of automated learning scenarios." PhD thesis, Université des Sciences et Technologie de Lille - Lille I, 2007. http://tel.archives-ouvertes.fr/tel-00156874.

Full text
Abstract:
This thesis addresses the design of learning scenarios for e-learning. To facilitate the exchange of materials describing teaching strategies, the community has recently mobilized to propose a standard language generic enough to represent any scenario, independently even of the underlying educational paradigm. Generically called an Educational Modeling Language (EML), this type of language brings a new way of designing technology-enhanced learning systems, moving away from traditional Instructional System Design: instead of delivering a finished application, EMLs provide a standard conceptual model, a notation to express it, and editors and frameworks, leaving the final designers the task of creating their own "applications". EMLs thus allow the creation and execution of scenario instances in a more open and flexible approach, increasing the possibilities of adapting the resulting applications to users' needs.
This flexibility nevertheless remains limited and, after a few years of research, EMLs have begun to show their weaknesses. The language chosen to become the standard of the domain, IMS-LD, has proved to be generic, certainly, but not very expressive, and does not allow a faithful representation of the various existing scenarios. In other words, it is the users who must adapt to the syntax and semantics of this standard.
This thesis starts from an observation of the difficulties of the design process itself, and of the risk of a disconnect between educators and software developers. To improve the ability of teaching teams to specify, and even implement, learning scenarios, we propose an approach in which it is the EML that adapts to the user's needs. Users can create their own language (or languages) if they need to. Moreover, the same scenario can be described simultaneously by different EMLs (or models) reflecting the different perspectives - and even paradigms - of each stakeholder.
This approach, called multi-EML, is made possible by recent advances in software engineering, such as Model-Driven Architecture - the best-known implementation of a new programming paradigm named Language-Oriented Programming (LOP), which also includes other implementations.
Our proposal is the design of an authoring environment that rests on the principles of Language-Oriented Programming, using the open ECLIPSE platform and, more particularly, its LOP implementation, the Eclipse Modeling Framework (EMF). Designers thus have a tool that allows them to create formal specifications describing the envisaged scenarios and to automatically generate the corresponding applications, in a process that starts from the informal descriptions of domain experts.
Recognizing that education experts - those who understand the domain best - are not necessarily computer scientists, the proposed environment, called MDEduc, also provides an editor for describing a scenario in an informal notation, namely the pedagogical pattern, from which formal models can be derived. In addition, we propose to keep these informal descriptions side by side with the more formal and prescriptive descriptions, and to offer the possibility of going back and forth between them at all phases of the life cycle of the learning system.
11

Soliman, Junior João. "Framework para suporte à verificação automatizada de requisitos regulamentares em projetos hospitalares." Biblioteca Digital de Teses e Dissertações da UFRGS, 2018. http://hdl.handle.net/10183/182369.

Full text
Abstract:
Healthcare facilities are recognized for the complexity associated with all phases of their lifecycle: design, construction and operation. The design of healthcare projects is highly influenced by local healthcare regulations. These legislations usually contain prescriptive information and play an important role, as design specifications should be defined based on the criteria defined therein. In the design phase, during the product development process, requirements extracted from legal regulations must be verified against design specifications. This process, if done manually, tends to be time-consuming and error-prone. Attempts to develop automated rule checking systems for healthcare projects have not been fully successful. Most flaws appear to be related to the way new approaches are conceived, being mostly developed according to hard-coded and fragmented approaches, and to the typology of information bounded by the regulations. The methodological approach adopted in this investigation was Design Science Research. The main outcome of this research is a semantic-based framework, devised to support the development of automated rule checking systems, focused on the regulatory requirements of healthcare building design. The main theoretical contributions of this work concern taxonomies and information transformation, as well as the relationships among the constructs involved. The results indicate that the nature of regulations has a major impact on the possibility of translating them into logic rules. Even though automation is desirable, the findings of this study also indicate that currently not all requirements can be fully translated into rules for automated processing and checking. Although this decreases the overall degree of automation in the process, it may provide benefits in the healthcare context. The fulfilment of some requirements relies, to some extent, on subjective criteria that depend on human interpretation and creativity.
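A toy example of what a parameterizable logic rule can look like once a prescriptive clause has been translated; the rule content, thresholds and attribute names are invented, not drawn from any real regulation or from the framework itself:

```python
# Invented rules standing in for prescriptive healthcare-regulation clauses.
RULES = [
    {"applies_to": "patient_room", "attribute": "area_m2", "op": ">=", "value": 10.0},
    {"applies_to": "corridor", "attribute": "width_m", "op": ">=", "value": 2.0},
]

OPS = {">=": lambda a, b: a >= b, "<=": lambda a, b: a <= b}

def check(element: dict) -> list[str]:
    """Return the rule violations for one BIM-like element record."""
    return [
        f"{element['id']}: {r['attribute']} {r['op']} {r['value']} failed"
        for r in RULES
        if r["applies_to"] == element["type"]
        and not OPS[r["op"]](element[r["attribute"]], r["value"])
    ]

print(check({"id": "R-101", "type": "patient_room", "area_m2": 8.5}))
```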
12

Ganapathy, Priya. "Development and Evaluation of a Flexible Framework for the Design of Autonomous Classifier Systems." Wright State University / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=wright1261335392.

Full text
13

Acimovic, Aleksandar, and Aleksandar Bajceta. "Test script design approaches supporting reusability, maintainability and review process." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-44724.

Full text
Abstract:
Software testing is widely considered to be one of the most important parts of the software development life-cycle. In this research, we investigated potential improvements in the testing process and in the design of automated test scripts inside Bombardier Transportation (BT). For the creation of automated test scripts, BT uses a group of programs called TAF (Test Automation Framework). These scripts are used for testing the Train Control Management System (TCMS), the software that manages the train. TAF can export its test scripts in XML format. The XML scripts were analyzed in order to identify the most frequent changes. To better understand the life cycle of automated test scripts, the official documentation that defines the verification and validation process inside BT was analyzed, and an interview was conducted with one of the persons responsible for testing. We believe that we have found a possible solution for improving the testing process and the creation of automated test scripts in BT, and to evaluate it a proof-of-concept tool was developed. The main idea behind the tool is to write test scripts using keywords, which are based on an analysis of the test specification documentation. These keywords represent frequent actions that are tested on the train. Storing those actions in keywords increases the reusability of the test scripts. Because the keywords are based on natural language, they also have a positive effect on the readability and maintainability of the test scripts.
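A minimal sketch of the keyword-driven idea described above, with invented keywords and actions; BT's TAF and TCMS interfaces are not reproduced here:

```python
# Each keyword hides a reusable, reviewable action (names are illustrative).
def set_signal(name, value):    print(f"set {name} = {value}")
def expect_signal(name, value): print(f"check {name} == {value}")

KEYWORDS = {
    "ApplyBrakes":  lambda: set_signal("brake_request", True),
    "CheckBraking": lambda: expect_signal("brake_applied", True),
}

script = ["ApplyBrakes", "CheckBraking"]  # a test script is just keyword names
for kw in script:
    KEYWORDS[kw]()
```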
14

Koprnicky, Miroslav. "Towards a Versatile System for the Visual Recognition of Surface Defects." Thesis, University of Waterloo, 2005. http://hdl.handle.net/10012/888.

Full text
Abstract:
Automated visual inspection is an emerging multi-disciplinary field with many challenges; it combines different aspects of computer vision, pattern recognition, automation, and control systems. There does not exist a large body of work dedicated to the design of generalized visual inspection systems; that is, those that might easily be made applicable to different product types. This is an important oversight, in that many improvements in design and implementation times, as well as costs, might be realized with a system that could easily be made to function in different production environments.

This thesis proposes a framework for generalizing and automating the design of the defect classification stage of an automated visual inspection system. It involves using an expandable set of features which are optimized along with the classifier operating on them in order to adapt to the application at hand. The particular implementation explored involves optimizing the feature set in disjoint sets logically grouped by feature type to keep search spaces reasonable. Operator input is kept at a minimum throughout this customization process, since it is limited only to those cases in which the existing feature library cannot adequately delineate the classes at hand, at which time new features (or pools) may have to be introduced by an engineer with experience in the domain.

Two novel methods are put forward which fit well within this framework: cluster-space and hybrid-space classifiers. They are compared in a series of tests against standard benchmark classifiers, as well as mean and majority vote multi-classifiers, on feature sets comprising just the logical feature subsets, as well as the entire feature sets formed by their union. The proposed classifiers and the benchmarks are optimized with both a progressive combinatorial approach and a genetic algorithm. Experimentation was performed on true-colour industrial lumber defect images, as well as binary hand-written digits.

Based on the experiments conducted in this work, it was found that the sequentially optimized multi hybrid-space methods are capable of matching the performances of the benchmark classifiers on the lumber data, with the exception of the mean-rule multi-classifiers, which dominated most experiments by approximately 3% in classification accuracy. The genetic-algorithm-optimized hybrid-space multi-classifier achieved the best performance, however: an accuracy of 79.2%.

The numeral dataset results were less promising; the proposed methods could not equal benchmark performance. This is probably because the numeral feature-sets were much more conducive to good class separation, with standard benchmark accuracies approaching 95% not uncommon. This indicates that the cluster-space transform inherent to the proposed methods appears to be most useful in highly dependent or confusing feature-spaces, a hypothesis supported by the outstanding performance of the single hybrid-space classifier in the difficult texture feature subspace: 42.6% accuracy, a 6% increase over the best benchmark performance.

The generalized framework proposed appears promising, because classifier performance over feature sets formed by the union of independently optimized feature subsets regularly met and exceeded those classifiers operating on feature sets formed by the optimization of the feature set in its entirety. This finding corroborates earlier work with similar results [3, 9], and is an aspect of pattern recognition that should be examined further.
15

Arief, Leonardus Budiman. "A framework for supporting automatic simulation generation from design." Thesis, University of Newcastle Upon Tyne, 2001. http://hdl.handle.net/10443/1816.

Full text
Abstract:
Building a new software system requires careful planning and investigation in order to avoid problems in the later stages of development. By using a universally accepted design notation such as the Unified Modeling Language (UML), ambiguities in the system specification can be eliminated or minimised. The aspect that frequently needs to be investigated before the implementation stage begins is the proposed system's performance: it is necessary to predict whether a particular design will meet the performance requirements - i.e., whether the system is worth implementing - or not. One way to obtain this performance prediction is to use simulation programs to mimic the execution of the system. Unfortunately, it is often difficult to transform a design into a simulation program without sound knowledge of simulation techniques. In addition, new simulation programs need to be built each time for different systems, which can be tedious, time-consuming and error-prone. The currently available UML tools do not provide any facilities for generating simulation programs automatically from UML specifications. This shortcoming is the main motivation for this research. The work involved here includes an investigation of which UML design notations can be used; of the available simulation languages or environments for running the simulation; and, more importantly, of a framework that can capture the simulation information from UML design notation. Using this framework, we have built tools that enable an automatic transformation of a UML design notation into a simulation program. Two tools (parsers) that can perform such a transformation have been constructed. We provide case studies to demonstrate the applicability of these tools and the usefulness of our simulation framework in general.
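As a rough sketch of the target of such a transformation (not the thesis's parsers), a generated simulation program can be as simple as a discrete-event loop; the event names here are invented:

```python
import heapq

# A toy discrete-event loop of the kind a UML-to-simulation transformer
# could emit as a target.
def simulate(events, horizon=10.0):
    queue = list(events)           # (time, name) pairs
    heapq.heapify(queue)
    while queue and queue[0][0] <= horizon:
        time, name = heapq.heappop(queue)
        print(f"t={time:4.1f}  {name}")
        if name == "request":      # a state-machine transition could
            heapq.heappush(queue, (time + 1.5, "response"))  # schedule follow-ups

simulate([(0.0, "request"), (2.0, "request")])
```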
16

Lacroix, René. "A framework for the design of simulation-based greenhouse control." Thesis, McGill University, 1994. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=41652.

Full text
Abstract:
The main objectives were: (1) to develop tools to aid in the design of enclosed agro-ecosystems, and (2) to use these tools to develop a prototype simulation-based control system. Three tools were developed: (1) a conceptual framework, (2) a (simulated) greenhouse system and (3) a simulation approach within OS/2.
Part of the conceptual framework was dedicated to "conscious control", defined as a form of control practised by an entity that uses models of itself in its decision-making processes. The greenhouse system was composed of six modules (a simulation manager, a weather generator, a greenhouse model, a crop model, a Pavlovian controller and a cognitive controller), which were implemented under OS/2 as separate processes.
The greenhouse system was used to develop a prototype simulation-based controller. Primarily, the role of the controller was to determine temperature setpoints that would minimize the heating load. The simulation model used by the controller was an artificial neural network. The controller adapted temperature setpoints to anticipated meteorological conditions and reduced greenhouse energy consumption, in comparison with a more traditional controller.
Generally, the results showed the feasibility and illustrated some of the advantages of using simulation-based control. The research resulted in the definition of elements that will allow the creation of a methodological framework for the design of simulation-based control and, eventually, a theory of conscious control.
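A minimal sketch of simulation-based setpoint selection, with an invented linear surrogate standing in for the thesis's trained neural-network greenhouse model:

```python
# Evaluate candidate temperature setpoints with a surrogate model and keep
# the cheapest one that respects a crop constraint (all numbers invented).
def predicted_heating_load(setpoint_c, outside_c):
    return max(0.0, setpoint_c - outside_c) * 1.8   # invented linear surrogate

def choose_setpoint(outside_c, candidates=(16, 18, 20, 22), min_ok_c=17):
    feasible = [s for s in candidates if s >= min_ok_c]
    return min(feasible, key=lambda s: predicted_heating_load(s, outside_c))

print(choose_setpoint(outside_c=5.0))  # -> 18: cheapest feasible setpoint
```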
17

Höflinger, Kilian. "Design of an Automatic Specification-based Test-framework for On-board Software of Satellites." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-175864.

Full text
Abstract:
Satellites are sophisticated and therefore complicated constructs that require the interdisciplinary teamwork of various experts from different academic disciplines. The integration of specific payload components, such as scientific experiments, into the on-board software of a satellite is very challenging. The domain expert, as the owner of the payload component, possesses detailed insights into his or her component, but lacks sufficient programming skills to implement it in the on-board software. The programmer is able to write proper code for the on-board software, but is inexperienced with the payload component of the domain expert. This report describes the design and implementation of an automatic specification-based test-framework for on-board software of satellites to bridge the knowledge and communication gap between the programmer and the domain expert. Model- and test-driven development are the focus of the test-framework. With the help of a domain-specific language, the domain expert is able to model a specification in formal notation, representing potential use-case scenarios of the component. These scenarios are automatically translated into compilable C++ test cases, which help the programmer to verify the functional correctness of the on-board software implementation of the payload component while he or she is programming it.
18

Chandrasekar, Maheshwar. "Search State Extensibility based Learning Framework for Model Checking and Test Generation." Diss., Virginia Tech, 2010. http://hdl.handle.net/10919/28978.

Full text
Abstract:
The increasing design complexity and shrinking feature size of hardware designs have created resource-intensive design verification and manufacturing test phases in the product life-cycle of a digital system. On the contrary, time-to-market constraints require faster verification and test phases; otherwise the result may be a buggy design or a defective product. This trend in the semiconductor industry has considerably increased the complexity and importance of the Design Verification, Manufacturing Test and Silicon Diagnosis phases of a digital system's production life-cycle. In this dissertation, we present a generalized learning framework, which can be customized to the common solving technique for problems in these three phases. During Design Verification, the conformance of the final design to its specifications is verified. Simulation-based and formal verification are the two widely known techniques for design verification. Although the former technique can increase confidence in the design, only the latter can ensure the correctness of a design with respect to a given specification. Originally, design verification techniques were based on Binary Decision Diagrams (BDDs), but such techniques are now based on branch-and-bound procedures to avoid space explosion. However, branch-and-bound procedures may explode in time; thus efficient heuristics and intelligent learning techniques are essential. In this dissertation, we propose a novel extensibility relation between search states and a learning framework that aids in identifying non-trivial redundant search states during the branch-and-bound search procedure. Further, we also propose a probability-based heuristic to guide our learning technique. First, we utilize this framework in a branch-and-bound based preimage computation engine. Next, we show that it can be used to perform an upper-approximation based state space traversal, which is essential to handle industrial-scale hardware designs. Finally, we propose a simple but elegant image extraction technique that utilizes our learning framework to compute an over-approximate image space. This image computation is later leveraged to create an abstraction-refinement based model checking framework. During Manufacturing Test, test patterns are applied to the fabricated system, in a test environment, to check for the existence of fabrication defects. Such patterns are usually generated by Automatic Test Pattern Generation (ATPG) techniques, which assume certain fault types to model arbitrary defects. The size of the fault list and the test set has a major impact on the economics of manufacturing test. Towards this end, we propose a fault collapsing approach to compact the size of the target fault list for ATPG techniques. Further, from the very beginning, ATPG techniques have been based on branch-and-bound procedures that model the problem in a Boolean domain. However, ATPG is a problem in the multi-valued domain; thus we propose a multi-valued ATPG framework to utilize this underlying nature. We also employ our learning technique for branch-and-bound procedures in this multi-valued framework. To improve the yield of high-volume manufacturing, silicon diagnosis identifies a set of candidate defect locations in a faulty chip. Subsequently, physical failure analysis - an extremely time-consuming step - utilizes these candidates as an aid to locate the defects. To reduce the number of candidates returned to the physical failure analysis step, efficient diagnostic patterns are essential.
Towards this objective, we propose an incremental framework that utilizes our learning technique for a branch-and-bound procedure. Further, it learns from the ATPG phase where detection-patterns are generated and utilizes this information during diagnostic-pattern generation. Finally, we present a probability based heuristic for X-filling of detection-patterns with the objective of enhancing the diagnostic resolution of such patterns. We unify these techniques into a framework for test pattern generation with good detection and diagnostic ability. Overall, we propose a learning framework that can speed up design verification, test and diagnosis steps in the life cycle of a hardware system.
Ph.D.
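A toy analogue of the learning idea (memoizing failed search states so they are pruned when revisited), using subset-sum as a stand-in problem; this is not the dissertation's preimage or ATPG engine:

```python
# Branch-and-bound with memoized pruning of revisited search states.
def reachable(values, target):
    seen = set()                              # "learned" redundant states

    def search(i, remaining):
        if remaining == 0:
            return True
        if i == len(values) or remaining < 0 or (i, remaining) in seen:
            return False
        if search(i + 1, remaining - values[i]) or search(i + 1, remaining):
            return True
        seen.add((i, remaining))              # record the failed state once
        return False

    return search(0, target)

print(reachable([3, 34, 4, 12, 5, 2], 9))    # True (4 + 5)
```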
19

Rudraiah, Dakshinamurthy Amruth. "A Compiler-based Framework for Automatic Extraction of Program Skeletons for Exascale Hardware/Software Co-design." Master's thesis, University of Central Florida, 2013. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/5695.

Full text
Abstract:
The design of high-performance computing architectures requires performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of performance analysis and benchmarking an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a "program skeleton" that we discuss in this paper is an abstracted program that is derived from a larger program where source code that is determined to be irrelevant is removed for the purposes of the skeleton. In this work, we develop a semi-automatic approach for extracting program skeletons based on compiler program analysis. We demonstrate correctness of our skeleton extraction process by comparing details from communication traces, as well as show the performance speedup of using skeletons by running simulations in the SST/macro simulator. Extracting such a program skeleton from a large-scale parallel program requires a substantial amount of manual effort and often introduces human errors. We outline a semi-automatic approach for extracting program skeletons from large-scale parallel applications that reduces cost and eliminates errors inherent in manual approaches. Our skeleton generation approach is based on the use of the extensible and open-source ROSE compiler infrastructure that allows us to perform flow and dependency analysis on larger programs in order to determine what code can be removed from the program to generate a skeleton.
M.S., Electrical Engineering and Computer Science
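The kernel of skeleton extraction is a dependence slice: keep only the statements that the communication calls transitively depend on. A toy, list-based sketch follows; the statement tuples are invented, and the thesis operates on real source code via the ROSE compiler infrastructure:

```python
# Each statement is (defined_name, names_it_depends_on, source_text).
program = [
    ("a", set(), "a = read_input()"),
    ("b", {"a"}, "b = heavy_compute(a)"),      # irrelevant to communication
    ("n", {"a"}, "n = len(a)"),
    (None, {"n"}, "mpi_send(n)"),              # the call we must preserve
]

needed, skeleton = set(), []
for target, deps, text in reversed(program):   # backward slice
    if target is None or target in needed:     # statement is (or feeds) a keeper
        needed |= deps
        skeleton.append(text)

print("\n".join(reversed(skeleton)))           # a = ...; n = ...; mpi_send(n)
```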
20

Teonacio, Bezerra Leonardo. "A component-wise approach to multi-objective evolutionary algorithms: From flexible frameworks to automatic design." Doctoral thesis, Universite Libre de Bruxelles, 2016. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/232586.

Full text
Abstract:
Multi-objective optimization is a growing field of interest for both theoretical and applied research, mostly due to the higher accuracy with which multi-objective problems (MOPs) model real-world scenarios. While single-objective models simplify real-world problems, MOPs can contain several (and often conflicting) objective functions to be optimized at once. This increased accuracy, however, comes at the expense of the higher difficulty that MOPs pose for optimization algorithms in general, and so a significant research effort has been dedicated to the development of approximate and heuristic algorithms. In particular, a number of proposals concerning the adaptation of evolutionary algorithms (EAs) for multi-objective problems can be seen in the literature, evidencing the interest they have received from the research community. This large number of proposals, however, does not mean that the full search power offered by multi-objective EAs (MOEAs) has been properly exploited. For instance, in an attempt to propose significantly novel algorithms, many authors propose a number of algorithmic components at once, but evaluate their proposed algorithms as monolithic blocks. As a result, each time a novel algorithm is proposed, several questions that should be addressed are left unanswered, such as (i) the effectiveness of individual components, (ii) the benefits and drawbacks of their interactions, and (iii) whether a better algorithm could be devised if some of the selected/proposed components were replaced by alternative options available in the literature. This component-wise view of MOEAs becomes even more important when tackling a new application, since one cannot anticipate how they will perform on the target scenario, nor predict how their components may interact. To avoid the expensive experimental campaigns that this analysis would require, many practitioners choose algorithms that in the end present suboptimal performance on the application they intend to solve, wasting much of the potential MOEAs have to offer. In this thesis, we take several significant steps towards redefining the existing algorithmic engineering approach to MOEAs. The first step is the proposal of a flexible and representative algorithmic framework that assembles components originally used by many different MOEAs from the literature, providing a way of seeing algorithms as instantiations of a unified template. In addition, the components of this framework can be freely combined to devise novel algorithms, offering the possibility of tailoring MOEAs to the given application. We empirically demonstrate the efficacy of this component-wise approach by designing effective MOEAs for different target applications, ranging from continuous to combinatorial optimization. In particular, we show that the MOEAs one can tailor from a collection of algorithmic components are able to outperform the algorithms from which those components were originally gathered. More importantly, the improved MOEAs we present have been designed without manual assistance by means of automatic algorithm design. This algorithm engineering approach considers the algorithmic components of flexible frameworks as parameters of a tuning problem, and automatically selects the component combinations that lead to better performance on a given application. In fact, this thesis also represents significant advances in this research direction.
Primarily, this is the first work in the literature to investigate this approach for problems with any number of objectives, as well as the first to apply it to MOEAs. Secondarily, our efforts have led to a significant number of improvements in the automatic design methodology applied to multi-objective scenarios, as we have refined several aspects of this methodology to be able to produce better-quality algorithms. A second significant contribution of this thesis concerns understanding the effectiveness of MOEAs (and in particular of their components) on the application domains we consider. Concerning combinatorial optimization, we have conducted several investigations on the multi-objective permutation flowshop problem (MO-PFSP) with four variants differing as to the number and nature of their objectives. Through thorough experimental campaigns, we have shown that some components are only effective when used jointly. In addition, we have demonstrated that well-known algorithms could easily be improved by replacing some of their components with other existing proposals from the literature. Regarding continuous optimization, we have conducted a thorough and comprehensive performance assessment of MOEAs and their components, a concrete first step towards clearly defining the state of the art for this field. In particular, this assessment also encompasses many-objective optimization problems (MaOPs), a sub-field within multi-objective optimization that has recently stirred the MOEA community given its theoretical and practical demands. In fact, our analysis is instrumental to a better understanding of the application of MOEAs to MaOPs, as we have discussed a number of important insights for this field. Among the most relevant, we highlight the empirical verification of performance metric correlations, and also the interactions between structural problem characteristics and the difficulty increase incurred by the high number of objectives. The last significant contribution of this thesis concerns the previously mentioned automatically generated MOEAs. In an initial feasibility study, we showed that MOEAs automatically generated from our framework are able to consistently outperform the original MOEAs from which their components were gathered, both for the MO-PFSP and for MOPs/MaOPs. The major contribution of this subset, however, regards continuous optimization, as we significantly advance the state of the art for this field. To accomplish this goal, we have extended our framework to encompass approaches that are primarily used for continuous problems, although the conceptual modeling we use is general enough to be applied to any domain. From this extended framework we have then automatically designed state-of-the-art MOEAs for a wide range of experimental scenarios. Moreover, we have conducted an in-depth analysis to explain their effectiveness, correlating the role of algorithmic components with experimental factors such as the stopping criterion or the performance metric adopted. Finally, we highlight that the contributions of this thesis have been increasingly recognized by the scientific community. In particular, the contributions to the research on MOEAs applied to continuous optimization are remarkable given that this is the primary application domain for MOEAs, having been extensively studied for a couple of decades now. As a result, chapters from this work have been accepted for publication in some of the best conferences and journals in our field.
Doctorate in Engineering Sciences and Technology
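A miniature of the component-wise view: if selection and variation are plain functions, an "algorithm" is just a combination of components that can be swapped or tuned automatically. The sketch below is single-objective to stay short; the thesis's framework is multi-objective, and all names here are invented:

```python
import random

# Swappable components: an algorithm is a combination of these functions.
def tournament(pop, fit, k=2):
    return min(random.sample(pop, k), key=fit)

def gaussian_move(x, sigma=0.3):
    return [xi + random.gauss(0, sigma) for xi in x]

def evolve(fit, select=tournament, vary=gaussian_move, n=20, gens=200):
    pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(n)]
    for _ in range(gens):
        child = vary(select(pop, fit))
        pop.sort(key=fit)
        pop[-1] = child                      # replace the worst individual
    return min(pop, key=fit)

sphere = lambda x: sum(v * v for v in x)     # single-objective stand-in
print(evolve(sphere))                        # near [0, 0]
```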
21

Chenini, Hanen. "A rapid design methodology for generating of parallel image processing applications and parallel architectures for smart camera." Thesis, Clermont-Ferrand 2, 2014. http://www.theses.fr/2014CLF22459.

Full text
Abstract:
Due to the complexity of image processing algorithms and the restrictions imposed by MPSoC designs to reach their full potential, automatic design methodologies are needed to guide the programmer in generating efficient parallel programs. In this dissertation, we present an MPSoC-based design methodology supporting automatic design space exploration, automatic performance evaluation, and automatic hardware/software system generation. To facilitate parallel programming, the presented MPSoC approach is based on the CubeGen framework, which permits the expression of different scenarios for architectural and algorithmic design exploration to reach the desired level of performance, resulting in short development times. The generated design can be implemented in FPGA technology with an expected improvement in application performance and power consumption. Starting from the application, our methodology provides several parameterizable algorithmic skeletons, matched to varying application characteristics, to exploit all types of parallelism in real algorithms. Implementing such applications on our parallel embedded system shows that these methods achieve increased efficiency with respect to the computational and communication requirements. The experimental results demonstrate that the designed multiprocessing architecture can be programmed efficiently and can match the performance of more powerful designs based on hard-core processors, while outperforming traditional ASIC solutions, which are too slow and too expensive.
22

Wong, Brandon Gei-Chin. "Design, development and application of an automated framework for cell growth and laboratory evolution." Thesis, 2018. https://hdl.handle.net/2144/30737.

Full text
Abstract:
Precise control over microbial cell growth conditions could enable detection of minute phenotypic changes, which would improve our understanding of how genotypes are shaped by adaptive selection. Although automated cell-culture systems such as bioreactors offer strict control over liquid culture conditions, they often do not scale to high throughput or require cumbersome redesign to alter growth conditions. I report the design and validation of eVOLVER, a scalable DIY framework that can be configured to carry out high-throughput growth experiments in molecular evolution, systems biology, and microbiology. I perform high-throughput evolution of yeast across systematically varied population density niches to show how eVOLVER can precisely characterize adaptive niches. I describe growth selection using time-varying temperature programs on a genome-wide yeast knockout library to identify strains with altered sensitivity to changes in temperature magnitude or frequency. Inspired by large-scale integration of electronics and microfluidics, I also demonstrate millifluidic multiplexing modules that enable multiplexed media routing, cleaning, vial-to-vial transfers and automated yeast mating.
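A minimal sketch of the feedback loop behind automated continuous culture (a turbidostat): dilute whenever optical density exceeds a setpoint. The function names are placeholders, not the eVOLVER API:

```python
# Turbidostat-style feedback: read density, dilute above the setpoint.
def control_vial(read_od, dilute, od_setpoint=0.6, steps=4):
    for _ in range(steps):            # in reality: one reading every few seconds
        od = read_od()
        if od > od_setpoint:
            dilute()                  # pump fresh media in, waste out

readings = iter([0.40, 0.55, 0.70, 0.62])
control_vial(lambda: next(readings), lambda: print("dilution triggered"))
```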
23

"Design, Simulation and Testing of a Controller And Software Framework for Automated Construction by a Robotic Manipulator." Master's thesis, 2019. http://hdl.handle.net/2286/R.I.53883.

Full text
Abstract:
The construction industry is mundane and tiring for workers without the assistance of machines. This challenge has changed the trend of the construction industry tremendously by motivating the development of robots that can replace human workers. This thesis presents a computed torque controller that is designed to produce movements by a small-scale, 5 degree-of-freedom (DOF) robotic arm that are useful for construction operations, specifically bricklaying. A software framework for the robotic arm with motion and path planning features and different control capabilities has also been developed using the Robot Operating System (ROS). First, a literature review of the bricklaying construction activity and the performance of existing robots is discussed. After an overview of the required robot structure, a mathematical model is presented for the 5-DOF robotic arm. A model-based computed torque controller is designed for the nonlinear dynamic robotic arm, taking into consideration the dynamic and kinematic properties of the arm. For sustainable growth of this technology, so that it is affordable to the masses, it is important that the energy consumption of the robot is optimized. In this thesis, the trajectory of the robotic arm is optimized using sequential quadratic programming. The results of the energy optimization procedure are also analyzed for different possible trajectories. A construction testbed setup is simulated in the ROS platform to validate the designed controllers and optimized robot trajectories in different experimental scenarios. A commercially available 5-DOF robotic arm is modeled in the ROS simulators Gazebo and Rviz. The path and motion planning is performed using the MoveIt-ROS interface and also implemented on a physical small-scale robotic arm. A Matlab-ROS framework for the execution of different controllers on the physical robot is described. Finally, the results of the controller simulations and experiments are discussed in detail.
Master's thesis, Mechanical Engineering, 2019
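The computed torque law itself is compact; a two-joint stand-in with invented inertia and Coriolis/gravity terms (not the thesis's 5-DOF model) looks like this:

```python
import numpy as np

# Computed torque control: tau = M(q) (qdd_d + Kd*ed + Kp*e) + h(q, qd),
# which cancels the nonlinear dynamics and imposes linear error dynamics.
def computed_torque(q, qd, q_des, qd_des, qdd_des, M, h, Kp, Kd):
    e, ed = q_des - q, qd_des - qd
    return M(q) @ (qdd_des + Kd @ ed + Kp @ e) + h(q, qd)

M = lambda q: np.diag([2.0, 1.0])                 # invented inertia matrix
h = lambda q, qd: 0.1 * qd                        # invented Coriolis/gravity term
Kp, Kd = np.diag([25.0, 25.0]), np.diag([10.0, 10.0])

tau = computed_torque(np.zeros(2), np.zeros(2),
                      np.array([0.5, -0.2]), np.zeros(2), np.zeros(2),
                      M, h, Kp, Kd)
print(tau)    # joint torques that drive the tracking error to zero
```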
24

Herber, Paula [Verfasser]. "A framework for automated HW/SW co-verification of SystemC designs using timed automata / vorgelegt von Paula Herber." 2010. http://d-nb.info/1004208499/34.

Full text
25

Liao, Tso-Hua, and 廖佐華. "Automatic Program Design Framework in PC-BASE Equipment Research." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/25159688479280111068.

Full text
Abstract:
Master's thesis, I-Shou University, Department of Information Management, 2008.
The initial stage of an automation-equipment development project includes selecting a manager, system analysts, and programmers. After that, the members of the project need to communicate and cooperate to complete it. They usually first encounter problems such as an undetermined kinematic process for the mechanism, no electronic control hardware on which to test the software, or a programming language that does not match the requirements. These uncertainties make software development difficult. As a result, before writing the program, the programmer has to think about the problems that may arise, the solutions to them, the workability of the program and, most importantly, how to revise the finished program rapidly. The worst case is to abort the unfinished program and start a new one. This wastes corporate resources, increases labor costs and lowers the gross margin. Moreover, it may result in distrust among the members of the project, increase communication costs, and even dissolve the project, ending in failure. Therefore the manager, system analysts and programmers must each be responsible for their own parts, and a flexible framework is necessary to solve the problems that may occur in the future. The underlying architecture can be very complex when software is developed by multiple programmers. This research proposes a best mode for developing a flexible programming framework. Based on a questionnaire answered by the members of the project, we apply FMCDM (Fuzzy Multiple Criteria Decision-Making) to evaluate the automatic program framework, and then construct the framework for equipment manufacturers to use in future software development. Owing to personal perspectives, a common consensus on a decision is hard to reach, so a systematic way to select a decision is an important issue for a decision-making group. This research applies the structure of Fuzzy PROMETHEE (Fuzzy Preference Ranking Organization Method for Enrichment Evaluation) within FMCDM to resolve conflicts in the decision-making process and to systematically rationalize decisions.
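A minimal crisp PROMETHEE ranking for intuition (the thesis uses a fuzzy variant; the candidate scores and criterion weights below are invented):

```python
# Crisp PROMETHEE: net outranking flow per alternative, usual criterion
# (any positive difference counts as full preference on that criterion).
def promethee_net_flows(scores, weights):
    names = list(scores)
    def pref(a, b):     # weighted fraction of criteria where a beats b
        return sum(w for w, sa, sb in zip(weights, scores[a], scores[b]) if sa > sb)
    n = len(names)
    return {
        a: sum(pref(a, b) - pref(b, a) for b in names if b != a) / (n - 1)
        for a in names
    }

frameworks = {"A": [7, 5, 9], "B": [8, 4, 6], "C": [6, 8, 7]}
print(promethee_net_flows(frameworks, weights=[0.5, 0.3, 0.2]))
```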
26

HUANG, CHENG-LONG, and 黃承龍. "A conceptual framework for automatic design for assembly evaluation." Thesis, 1990. http://ndltd.ncl.edu.tw/handle/88427519545431081734.

Full text
Abstract:
Master's thesis, National Tsing Hua University, Institute of Industrial Engineering, 1989.
The idea of DFA (Design for Assembly) evaluation is to analyze the assemblability of a product during the design stage, identify design flaws, lower product cost, and obtain the benefit of improved productivity.

However, entering the data required for a DFA evaluation involves judging a large amount of geometric and assembly information and filling it into forms or answering questions; the human time and effort spent is considerable, placing an extra burden on the designer. Automatic DFA evaluation can overcome this time-consuming input, lower the cost of product design, and shorten the time needed to complete a design.

Taking the automatic assembly of products as its subject, this thesis integrates a CAD system with DFA evaluation to establish a conceptual framework for automatic DFA evaluation, and discusses solution methods for each module of the framework. In particular, for the module in which many difficulties remain - the assembly-feature recognition module - several algorithms and rules for automatically recognizing assembly features are proposed.

Finally, a prototype system for automatic DFA evaluation is developed to verify the proposed conceptual framework and the assembly-feature recognition algorithms and rules.
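A toy of the kind of geometric/assembly judgment a DFA evaluation automates; the attribute names and rules are invented illustrations, not the thesis's feature-recognition algorithms:

```python
# Flag parts whose attributes make handling or insertion hard (invented rules).
parts = [
    {"name": "base", "symmetric": True,  "insertion": "straight_down", "fastener": False},
    {"name": "clip", "symmetric": False, "insertion": "sideways",      "fastener": True},
]

def dfa_warnings(part):
    w = []
    if not part["symmetric"]:
        w.append("asymmetric: orienting the part costs handling time")
    if part["insertion"] != "straight_down":
        w.append("non-vertical insertion: consider redesign")
    if part["fastener"]:
        w.append("separate fastener: candidate for elimination")
    return w

for p in parts:
    print(p["name"], dfa_warnings(p))
```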
27

Marriage, Christopher. "Automatic Implementation of Multidisciplinary Design Optimization Architectures Using piMDO." Thesis, 2008. http://hdl.handle.net/1807/17199.

Full text
Abstract:
Multidisciplinary Design Optimization (MDO) provides optimal solutions to complex, coupled, multidisciplinary problems. MDO seeks to manage the interactions between disciplinary simulations to produce an optimum, and feasible, design with a minimum of computational effort. Many MDO architectures and approaches have been developed, but usually in isolated situations with little chance for comparison. piMDO was developed to provide a unified framework for the solution of coupled optimization problems and refinement of MDO approaches. The initial implementation of piMDO showed the benefits of a modular, object oriented, approach and laid the groundwork for future development of MDO architectures. This research furthered the development of piMDO by expanding the suite of available problems, incorporating additional MDO architectures, and extending the object oriented approach to all of the required components for MDO. The end result is a modular, flexible software framework which is user friendly and intuitive to the practitioner. It allows complex problems to be quickly implemented and optimized with a variety of powerful numerical tools and MDO architectures. Importantly, it allows any of its components to be reorganized and sets the stage for future researchers to continue the development of MDO methods.
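piMDO's own API is not shown here; as a generic illustration of one architecture such a framework implements, the following sketch runs the MDF (Multidisciplinary Feasible) approach on the classic Sellar test problem, converging the coupled disciplines with a Gauss-Seidel multidisciplinary analysis inside each function evaluation.

```python
# A minimal MDF sketch on the Sellar problem: a Gauss-Seidel MDA converges the
# coupling variables y1, y2 before the objective and constraints are evaluated.
# This is illustrative code, not piMDO's actual interface.
import numpy as np
from scipy.optimize import minimize

def mda(x, tol=1e-10, max_iter=100):
    """Fixed-point iteration over the coupling variables of the two disciplines."""
    x1, z1, z2 = x
    y1, y2 = 1.0, 1.0
    for _ in range(max_iter):
        y1_new = z1**2 + z2 + x1 - 0.2 * y2           # discipline 1
        y2_new = np.sqrt(max(y1_new, 0.0)) + z1 + z2  # discipline 2
        if abs(y1_new - y1) < tol and abs(y2_new - y2) < tol:
            y1, y2 = y1_new, y2_new
            break
        y1, y2 = y1_new, y2_new
    return y1, y2

def objective(x):
    x1, z1, z2 = x
    y1, y2 = mda(x)
    return x1**2 + z2 + y1 + np.exp(-y2)

constraints = [
    {"type": "ineq", "fun": lambda x: mda(x)[0] - 3.16},  # y1 >= 3.16
    {"type": "ineq", "fun": lambda x: 24.0 - mda(x)[1]},  # y2 <= 24
]
bounds = [(0, 10), (-10, 10), (0, 10)]  # x1, z1, z2

result = minimize(objective, x0=[1.0, 5.0, 2.0], method="SLSQP",
                  bounds=bounds, constraints=constraints)
print("optimum design:", result.x, "objective:", result.fun)
```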
APA, Harvard, Vancouver, ISO, and other styles
28

DeHart, Brandon James. "Design and Implementation of a Framework for the Interconnection of Cellular Automata in Software and Hardware." Thesis, 2011. http://hdl.handle.net/10012/6264.

Full text
Abstract:
There has been a move recently in academia, industry, and the consumer space towards the use of unsupervised parallel computation and distributed networks (i.e., networks of computing elements working together to achieve a global outcome with only local knowledge). To fully understand the types of problems that these systems are applied to regularly, a representative member of this group of unsupervised parallel and distributed systems is needed to allow the development of generalizable results. Although not the only potential candidate, the field of cellular automata is an excellent choice for analyzing how these systems work as it is one of the simplest members of this group in terms of design specification. The current ability of the field of cellular automata to represent the realm of unsupervised parallel and distributed systems is limited to only a subset of the possible systems, which leads to the main goal of this work of finding a method of allowing cellular automata to represent a much larger range of systems.

To achieve this goal, a conceptual framework has been developed that allows the definition of interconnected systems of cellular automata that can represent most, if not all, unsupervised parallel and distributed systems. The framework introduces the concept of allowing the boundary conditions of a cellular automaton to be defined by a separately specified system, which can be any system that is capable of producing the information needed, including another cellular automaton. Using this interconnection concept, two forms of computational simplification are enabled: the deconstruction of a large system into smaller, modular pieces; and the construction of a large system built from a heterogeneous set of smaller pieces. This framework is formally defined using an interconnection graph, where edges signify the flow of information from one node to the next and the nodes are the various systems involved.

A library has been designed which implements the interconnection graphs defined by the framework for a subset of the possible nodes, primarily to allow an exploration of the field of cellular automata as a potential representational member of unsupervised parallel and distributed systems. This library has been developed with a number of criteria in mind that will allow it to be instantiated on both hardware and software using an open and extendable architecture to enable interaction with external systems and future expansion to take into account novel research. This extendability is discussed in terms of combining the library with genetic algorithms to find an interconnected system that will satisfy a specific computational goal. There are also a number of novel components of the library that further enhance the capabilities of potential research, including methods for automatically building interconnection graphs from sets of cellular automata and the ability to skip over static regions of a given cellular automaton in an intelligent way to reduce computation time. With a particular set of cellular automaton parameters, the use of this feature reduced the computation time by 75%.

As a demonstration of the usefulness of both the library and the framework that it implements, a hardware application has been developed which makes use of many of the novel aspects that have been introduced to produce an interactive art installation named 'Aurora'. This application has a number of design requirements that are directly achieved through the use of library components and framework definitions. These design requirements included a lack of centralized control or data storage, a need for visibly dynamic behaviour in the installation, and the desire for the visitors to the installation to be able to affect the visible movement of patterns across the surface of the piece. The success of the library in this application was heavily dependent on its instantiation on a mixture of hardware and software, as well as the ability to extend the library to suit particular needs and aspects of the specific application requirements.

The main goal of this thesis research, finding a method that allows cellular automata to represent a much larger range of unsupervised parallel and distributed systems, has been partially achieved in the creation of a novel framework which defines the concept of interconnection, and the design of an interconnection graph using this concept. This allows the field of cellular automata, in combination with the framework, to be an excellent representational member of an extended set of unsupervised parallel and distributed systems when compared to the field alone. A library has been developed that satisfies a broad set of design criteria that allow it to be used in any future research built on the use of cellular automata as this representational member. A hardware application was successfully created that makes use of a number of novel aspects of both the framework and the library to demonstrate their applicability in a real world situation.
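The interconnection concept at the heart of this abstract, one cellular automaton's boundary conditions supplied by a separately specified system, can be sketched in a few lines. The rules, sizes, and wiring below are illustrative and do not reproduce the thesis library's design.

```python
# A toy sketch of the interconnection idea: the boundary cells of one
# elementary cellular automaton are fed by the edge cells of another,
# so the second CA acts as the first one's boundary-condition "system".

def step(cells, rule, left, right):
    """One update of an elementary CA with externally supplied boundary values."""
    padded = [left] + cells + [right]
    return [
        (rule >> (padded[i - 1] * 4 + padded[i] * 2 + padded[i + 1])) & 1
        for i in range(1, len(padded) - 1)
    ]

a = [0] * 15 + [1] + [0] * 15   # CA "A", rule 110
b = [1, 0, 1, 0, 1, 0, 1, 0]    # CA "B", rule 90, drives A's boundaries

for _ in range(12):
    # Interconnection: A's boundary condition is defined by B's edge cells,
    # while B wraps around on itself (periodic boundary).
    a = step(a, 110, left=b[0], right=b[-1])
    b = step(b, 90, left=b[-1], right=b[0])
    print("".join(".#"[c] for c in a))
```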
APA, Harvard, Vancouver, ISO, and other styles
29

Lin, Kuan-Fu, and 林冠甫. "A Framework for Hardware-Software Co-Design for Real Time and Automatic Spike Sorting of Multichannel Neuronal Activity." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/47317710648413240500.

Full text
Abstract:
Master's thesis
National Chiao Tung University
Institute of Electrical and Control Engineering
ROC academic year 100 (2011/12)
Spike sorting is a primary and essential procedure for understanding the brain in neuroscience, providing a connection between neural activity and the external behavior of an animal for further applications such as movement prediction and brain-machine interfaces (BMI). With varied research objectives and improvements in implantable device technology, multichannel recording has become a standard tool in neurophysiological research. Moreover, the accuracy of spike sorting is crucial to the stability of such advanced applications, and unreliable sorting could have serious consequences for applications involving humans. A real-time and automatic spike sorting system for 16-channel neural recording, based on hardware-software co-design, is proposed in this study. A two-stage spike detection, combining the benefits of the threshold method and the nonlinear energy operator (NEO), is presented as the initial step of the spike sorting process. The feature extraction stage uses the discrete derivative method to improve spike separation and principal component analysis to select a few dominant features, reducing indistinctive data. The single-linkage method, with the Mahalanobis distance as the distance metric, is used for spike clustering. A cross-electrode validation is presented to check whether a single neuron has been recorded by two or more electrodes. The algorithms for this multichannel spike sorting system were verified and evaluated through simulations and experiments. The two-stage spike detection, cooperating with a feedback rule, decreases the probability of false detection. The discrete derivative method significantly improves spike separation in the feature space and thus enhances sorting accuracy on otherwise indistinguishable data sets. After spike sorting, the cross-electrode validation reduces the amount of redundant neuronal information recorded. The proposed framework for real-time and automatic spike sorting of multichannel neuronal activity is a feasible first step for neuroscientists investigating brain function in animals.
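The pipeline this abstract describes (NEO-based detection, discrete-derivative features, PCA, and single-linkage clustering with the Mahalanobis distance) can be sketched as follows. The synthetic signal, thresholds, and window sizes are illustrative assumptions rather than the thesis's parameters.

```python
# A compact single-channel sketch of the sorting pipeline; the thesis's
# two-stage detection, feedback rule, and cross-electrode validation are
# not reproduced here.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# --- synthetic trace with spikes from two hypothetical neurons ---
n = 48000
signal = 0.1 * rng.standard_normal(n)
templates = [np.exp(-np.arange(32) / 6.0) * np.sin(np.arange(32) / 3.0),
             -np.exp(-np.arange(32) / 9.0) * np.sin(np.arange(32) / 2.0)]
spike_times = rng.choice(np.arange(100, n - 100), size=60, replace=False)
for t in spike_times:
    signal[t:t + 32] += templates[rng.integers(2)] * rng.uniform(0.8, 1.2)

# --- detection: nonlinear energy operator psi[k] = x[k]^2 - x[k-1]*x[k+1] ---
neo = signal[1:-1] ** 2 - signal[:-2] * signal[2:]
threshold = 8.0 * np.mean(np.abs(neo))  # illustrative scaling factor
peaks = np.flatnonzero(neo > threshold) + 1
detected = []
for p in peaks:                          # simple refractory rule: one hit per spike
    if not detected or p - detected[-1] > 32:
        detected.append(p)

# --- features: first differences (discrete derivative), then PCA to 3 dims ---
waveforms = np.array([signal[p - 8:p + 24] for p in detected
                      if 8 <= p <= n - 24])
features = np.diff(waveforms, axis=1)
features -= features.mean(axis=0)
_, _, vt = np.linalg.svd(features, full_matrices=False)
scores = features @ vt[:3].T             # three dominant components

# --- clustering: single linkage with Mahalanobis distance ---
vi = np.linalg.inv(np.cov(scores.T))
dist = pdist(scores, metric="mahalanobis", VI=vi)
labels = fcluster(linkage(dist, method="single"), t=2, criterion="maxclust")
print(f"{len(waveforms)} spikes sorted, cluster sizes:", np.bincount(labels)[1:])
```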
APA, Harvard, Vancouver, ISO, and other styles
