Academic literature on the topic 'Component Description Framework'

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Component Description Framework.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Component Description Framework"

1

Pârv, Bazil, Ioan Lazăr, and Simona Motogna. "COMDEVALCO framework - the modeling language for procedural paradigm." International Journal of Computers Communications & Control 3, no. 2 (June 1, 2008): 183. http://dx.doi.org/10.15837/ijccc.2008.2.2386.

Full text
Abstract:
This work is part of a series referring to COMDEVALCO - a framework for Software Component Definition, Validation, and Composition. Its constituents are a modeling language, a component repository and a set of tools. This is the first paper describing the object-oriented modeling language, which contains fine-grained constructions aimed at giving a precise description of software components. The current status of the language reflects the constructs of the procedural paradigm.
APA, Harvard, Vancouver, ISO, and other styles
2

Li, Shao Bo, Yao Hu, and Qing Sheng Xie. "Heterogeneous System Integration Based on Service Component." Applied Mechanics and Materials 20-23 (January 2010): 1305–10. http://dx.doi.org/10.4028/www.scientific.net/amm.20-23.1305.

Full text
Abstract:
To address the problem of heterogeneous system integration, a service-oriented component integration model is proposed and a component-based integration framework is constructed in this paper, based on research into service components and the integrated structure of SOA. The structure of the service components is designed, and their working principle is analyzed. The Web service and registration description criteria for the system are defined. The paper further studies the key techniques of heterogeneous information integration and the mapping model. Finally, a service-component framework for integrating heterogeneous systems is designed and implemented with examples.
3

Spekman, Nancy J., and Froma P. Roth. "An Intervention Framework for Learning Disabled Students with Communication Disorders." Learning Disability Quarterly 11, no. 3 (August 1988): 248–56. http://dx.doi.org/10.2307/1510769.

Full text
Abstract:
This paper presents an intervention framework for the management of communication disorders in learning disabled children. The model is comprised of three major components: communicative intentions, presupposition, and the social organization of discourse. A description of each component is provided along with a review of relevant research. Finally, a set of general instructional guidelines and principles is presented.
4

Alonso, Diego, Francisco Sánchez-Ledesma, Pedro Sánchez, Juan A. Pastor, and Bárbara Álvarez. "Models and Frameworks: A Synergistic Association for Developing Component-Based Applications." Scientific World Journal 2014 (2014): 1–17. http://dx.doi.org/10.1155/2014/687346.

Full text
Abstract:
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model-driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies, which use an existing framework and an ad hoc framework respectively, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.
5

KUZEMSKY, A. L. "TWO-COMPONENT ALLOY MODEL FOR BISMUTHATE CERAMICS." Modern Physics Letters B 10, no. 14 (June 20, 1996): 627–33. http://dx.doi.org/10.1142/s0217984996000699.

Full text
Abstract:
The disordered binary substitutional A1−xBx alloy model has been proposed for the description of the normal and superconducting properties of bismuthate ceramics Ba(Pb,Bi)O3. The Eliashberg-type equations for the strong coupling superconductivity in strongly disordered alloys have been used to describe the superconducting properties. The relevant configurational averaging has been performed in the framework of the CPA. The concentration dependence of the electron-phonon coupling constant λ(x) and transition temperature Tc(x) has been calculated.
6

KITCHAROENSAKKUL, SUPANAT, and VILAS WUWONGSE. "TOWARDS A UNIFIED VERSION MODEL USING THE RESOURCE DESCRIPTION FRAMEWORK (RDF)." International Journal of Software Engineering and Knowledge Engineering 11, no. 06 (December 2001): 675–701. http://dx.doi.org/10.1142/s0218194001000748.

Full text
Abstract:
Version control, a fundamental component of Software Configuration Management (SCM) — a key mechanism for software development especially in large-scale systems — lacks a unified foundation for interchangeability, interoperability, and integrability. A practical solution of such problems is not readily achieved because version controls of different SCM systems employ various version models. Hence a single unified version model is required. The work develops a single formalism to model major version control approaches as well as a prototype system. Its architecture is based on a federated SCM architecture and introduces a new SCM foundation which contains two layers: unified SCM representation and computation, based on the Resource Description Framework (RDF) and the Equivalent Transformation computational model, respectively. The evolution of software objects, their relationships as well as team cooperation strategies can be uniformly expressed under the representation layer. The computational layer is briefly mentioned for framework demonstration. Finally, a simple prototype system is presented.
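As a side illustration of the unified representation layer this abstract describes, version objects and their relationships can be written down as RDF-style (subject, predicate, object) triples. The sketch below is a minimal, hypothetical Python rendering; the `ex:` names and predicates are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch: a tiny in-memory triple store describing version
# history in the RDF spirit. All identifiers (ex:, rdf:type, etc.) are
# illustrative assumptions, not the paper's actual vocabulary.

triples = {
    ("ex:file.c_v1", "rdf:type", "ex:Version"),
    ("ex:file.c_v2", "rdf:type", "ex:Version"),
    ("ex:file.c_v2", "ex:successorOf", "ex:file.c_v1"),
    ("ex:file.c_v2", "ex:author", "alice"),
}

def objects(subject, predicate):
    """Query the triple store: all objects matching (subject, predicate, ?)."""
    return {o for s, p, o in triples if s == subject and p == predicate}

# Which version does v2 directly succeed?
print(objects("ex:file.c_v2", "ex:successorOf"))  # {'ex:file.c_v1'}
```

Because every fact is a uniform triple, different version models (branching, variants, cooperation strategies) can share one storage and query mechanism, which is the interoperability point the abstract makes.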
7

Goddard, Nigel H., Michael Hucka, Fred Howell, Hugo Cornelis, Kavita Shankar, and David Beeman. "Towards NeuroML: Model Description Methods for Collaborative Modelling in Neuroscience." Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences 356, no. 1412 (August 29, 2001): 1209–28. http://dx.doi.org/10.1098/rstb.2001.0910.

Full text
Abstract:
Biological nervous systems and the mechanisms underlying their operation exhibit astonishing complexity. Computational models of these systems have been correspondingly complex. As these models become ever more sophisticated, they become increasingly difficult to define, comprehend, manage and communicate. Consequently, for scientific understanding of biological nervous systems to progress, it is crucial for modellers to have software tools that support discussion, development and exchange of computational models. We describe methodologies that focus on these tasks, improving the ability of neuroscientists to engage in the modelling process. We report our findings on the requirements for these tools and discuss the use of declarative forms of model description—equivalent to object-oriented classes and database schema—which we call templates. We introduce NeuroML, a mark-up language for the neurosciences which is defined syntactically using templates, and its specific component intended as a common format for communication between modelling-related tools. Finally, we propose a template hierarchy for this modelling component of NeuroML, sufficient for describing models ranging in structural levels from neuron cell membranes to neural networks. These templates support both a framework for user-level interaction with models, and a high-performance framework for efficient simulation of the models.
8

PEÑA-MORA, FENIOSKY, SANJEEV VADHAVKAR, and SIVA KUMAR DIRISALA. "Component-based software development for integrated construction management software applications." Artificial Intelligence for Engineering Design, Analysis and Manufacturing 15, no. 2 (April 2001): 173–87. http://dx.doi.org/10.1017/s0890060401152054.

Full text
Abstract:
This paper presents a framework and a prototype for designing Integrated Construction Management (ICM) software applications using reusable components. The framework supports the collaborative development of ICM software applications by a group of ICM application developers from a library of software components. The framework focuses on the use of an explicit software development process to capture and disseminate specialized knowledge that augments the description of the ICM software application components in a library. The importance of preserving and using this knowledge has become apparent with the recent trend of combining the software development process with the software application code. There are three main components in the framework: design patterns, a design rationale model, and intelligent search algorithms. Design patterns have been chosen to represent, record, and reuse the recurring design structures and associated design experience in object-oriented software development. The Design Recommendation and Intent Model (DRIM) was extended in the current research effort to capture the specific implementation of reusable software components. DRIM provides a method by which design rationale from multiple ICM application designers can be partially generated, stored, and later retrieved by a computer system. To address the issues of retrieval, the paper presents a unique representation of a software component and a search mechanism, based on Reggia's set-cover algorithm, that retrieves a set of components which can be combined to provide the required functionality. This paper also details an initial, proof-of-concept prototype based on the framework. By supporting non-obtrusive capture as well as effective access of vital design rationale information regarding the ICM application development process, the framework described in this paper is expected to provide a strong information base for designing ICM software.
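The retrieval mechanism the abstract attributes to a set-cover formulation can be pictured with a standard greedy approximation of set cover: repeatedly pick the component that covers the most still-uncovered required features. The Python sketch below is an illustration only; the component names and feature labels are invented, and the paper's actual algorithm may differ in detail.

```python
# Illustrative greedy set-cover sketch (hypothetical library contents):
# choose components until the required functionality is fully covered.

def select_components(required, components):
    """Greedily pick components until every required feature is covered."""
    uncovered = set(required)
    chosen = []
    while uncovered:
        # Pick the component covering the most uncovered features.
        best = max(components, key=lambda c: len(components[c] & uncovered))
        if not components[best] & uncovered:
            raise ValueError("required features cannot be covered")
        chosen.append(best)
        uncovered -= components[best]
    return chosen

library = {
    "Scheduler": {"task-planning", "resource-allocation"},
    "CostModel": {"cost-estimation"},
    "Reporter":  {"progress-reports", "cost-estimation"},
}
print(select_components({"task-planning", "cost-estimation"}, library))
```

The greedy rule does not always find the smallest component set, but it is a simple, well-understood approximation of the covering idea described above.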
9

Malyshev, V. P., and A. M. Makasheva. "Description of dynamic viscosity depending on the alloys composition and temperature using state diagrams." Izvestiya Visshikh Uchebnykh Zavedenii. Chernaya Metallurgiya = Izvestiya. Ferrous Metallurgy 61, no. 9 (October 21, 2018): 743–49. http://dx.doi.org/10.17073/0368-0797-2019-9-743-749.

Full text
Abstract:
The equilibrium nature of viscosity and fluidity is discovered on the basis of the Boltzmann distribution within the framework of the concept of randomized particles, as a result of the virtual presence of crystal-mobile, liquid-mobile and vapor-mobile particles. It allows one to consider the viscosity and fluidity of solutions, in particular melts of metal alloys, from the point of view of the equilibrium partial contributions of each component to the total viscosity and fluidity, despite the kinetic interpretation of the natural expressions for these properties of the liquid. A linearly additive partial expression of viscosity is possible only for perfect solutions, in this case for alloys with unrestricted mutual solubility of the components. Alloys with eutectics, chemical compounds and other features of the state diagram are characterized by viscosity dependencies that repeat the shape of the liquidus curve over the entire range of the alloy composition at different temperatures, with an increase in the smoothness and convergence of these curves at increasing temperature. It was established that these features of the temperature dependence of viscosity are completely revealed within the framework of the concept of randomized particles and the virtual cluster model of viscosity when calculating the fraction of clusters that determines the viscosity of the alloy. The viscosity of the alloy is found by a formula in which the thermal energy RTcr at the liquidus temperature is the thermal barrier of chaotization, characterizing the crystallization temperature of the melt Tcr, as well as the melting point of pure substances. On this basis, a method is proposed for calculating the viscosity of alloys from phase diagrams, using the temperature dependences of the viscosity of the pure components and changing the alloy's viscosity in proportion to the ratio of the cluster fraction at any temperature above the liquidus line to that for the pure component, taking into account the mole fraction of each component.
As a result, a three-factor model of liquid alloy viscosity has been obtained, in which the thermal barrier of chaotization RTcr is used as a variable for the first time. It determines the fraction of clusters both for pure substances (at RTcr = RTm) and for alloys. This thermal barrier reflects the essence of the virtual cluster theory of liquid and the adequacy of the concept of randomized particles.
10

Zatsman, Igor, and Pavel Buntman. "Theoretical Framework and Denotatum-Based Models of Knowledge Creation for Monitoring and Evaluating R&D Program Implementation." International Journal of Software Science and Computational Intelligence 5, no. 1 (January 2013): 15–31. http://dx.doi.org/10.4018/ijssci.2013010102.

Full text
Abstract:
The paper presents two semiotic models for a description of the development stages of indicators, including the generation processes of expert knowledge about the developed indicators. For the description of the stages of these processes, a new notion of “Frege's space” is introduced. The authors describe two applied semiotic models of knowledge acquisition. These processes were studied in the context of forming an expert knowledge base named the proactive dictionary, which represents a component of an evaluation system. This dictionary enables experts to fix the stages of indicator development, to present the results of developing different variants of indicators in graphic form, and to compare and evaluate these variants.
More sources

Dissertations / Theses on the topic "Component Description Framework"

1

Dinger, Ulrich. "Integrierte und hybride Konstruktion von Software-Produktlinien." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2009. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-24105.

Full text
Abstract:
The concepts for constructing software product lines serve the engineering-style, company-internal reuse of existing software artifacts. Existing approaches use hand-written and hand-maintained composition programs to assemble products according to a selection of variants. The use of an automatic planning component and a simple, extensible component meta-model helps to process the resulting data with computer support. Integrating both concepts into a hybrid approach makes it possible to build new products that were not designed as a product line from the outset, without unnecessarily complicating later rework that uses the automatic planning component.

Books on the topic "Component Description Framework"

1

Corradini, Antonella. Essence and Necessity. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198796299.003.0008.

Full text
Abstract:
In this chapter I try to show what an ontology for normative nonnaturalism could look like. Firstly, I inquire into the modal nature of the supervenience relation within a normative nonnaturalist framework. I surmise that the necessity preceding a strong normative supervenience relation is neither analytical nor nomological, but metaphysical. Borrowing from E. J. Lowe’s essentialist theory of necessity I argue that among the metaphysical determinations of an object’s essence there is also the property of possessing a certain telic structure. Secondly, I analyze the normative constitution relation, whose function is to identify what grounds supervenience. This analysis highlights the fact that end-directedness is a component of constitution along with descriptive properties. Thirdly, I show that supervenience follows from constitution and that the supervenience relation between descriptive and normative properties is obtained by means of the constitution analysis of the telic structure of the object displaying such properties.
2

Martins, Ana Maria, and Adriana Cardoso. Word order change from a diachronic generative syntax perspective. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198747307.003.0001.

Full text
Abstract:
This introductory overview chapter focuses on the relation between movement operations and word order by assembling the pieces of information offered by the book’s authors. It shows how the essays published in the book indicate, when considered together, that word order change is mainly the effect of the interaction between clause structure and syntactic movement, thus identifying these two components of grammar as the main factors behind word order variation. It also demonstrates that the study of word order change (set within the framework of diachronic generative syntax) is a means to test the descriptive adequacy and explanatory potential of competing analyses of word order phenomena not restricted to historical change, and identifies (theoretical and empirical) research issues that emerge from the type of approach to word order change envisaged in the book.
3

Bäck, Thomas. Evolutionary Algorithms in Theory and Practice. Oxford University Press, 1996. http://dx.doi.org/10.1093/oso/9780195099713.001.0001.

Full text
Abstract:
This book presents a unified view of evolutionary algorithms: the exciting new probabilistic search tools inspired by biological models that have immense potential as practical problem-solvers in a wide variety of settings, academic, commercial, and industrial. In this work, the author compares the three most prominent representatives of evolutionary algorithms: genetic algorithms, evolution strategies, and evolutionary programming. The algorithms are presented within a unified framework, thereby clarifying the similarities and differences of these methods. The author also presents new results regarding the role of mutation and selection in genetic algorithms, showing how mutation seems to be much more important for the performance of genetic algorithms than usually assumed. The interaction of selection and mutation, and the impact of the binary code are further topics of interest. Some of the theoretical results are also confirmed by performing an experiment in meta-evolution on a parallel computer. The meta-algorithm used in this experiment combines components from evolution strategies and genetic algorithms to yield a hybrid capable of handling mixed integer optimization problems. As a detailed description of the algorithms, with practical guidelines for usage and implementation, this work will interest a wide range of researchers in computer science and engineering disciplines, as well as graduate students in these fields.
4

Gisborne, Nikolas, and Andrew Hippisley, eds. Defaults in Morphological Theory. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198712329.001.0001.

Full text
Abstract:
Default-based analyses of linguistic data are most prevalent in morphological descriptions because morphology is pervaded by idiosyncrasy and irregularity, and defaults allow for a representation of the facts by construing regularity not as all or nothing but as a matter of degree. Defaults manifest themselves in a variety of ways in a group of morphological theories that have received much attention in the last few years, and whose main ideas and claims have recently been consolidated as important monographs. In May 2012 a workshop was convened at the University of Kentucky in Lexington to showcase default usage in four prominent theories of morphology. The presenters were key proponents of the theories, in most cases a theory's author. The role of defaults was outlined in Construction Morphology, Network Morphology, Paradigm Function Morphology, and Word Grammar. With reference to these theories, as well as the lexical syntactic framework of HPSG, this book addresses questions about the role of defaults in the lexicon, including: (1) Does a defaults-based account of language have implications for the architecture of the grammar, particularly the proposal that morphology is an autonomous component? (2) How does a default differ from the canonical or prototypical in morphology? (3) Do defaults have a psychological basis? (4) How do defaults help us understand language as a sign-based system that is flawed, where the one-to-one association of form and meaning breaks down in the morphology?
5

Ricci, Edmund M., Ernesto A. Pretto, Jr., and Knut Ole Sundnes. Disaster Evaluation Research. Oxford University Press, 2019. http://dx.doi.org/10.1093/med/9780198796862.001.0001.

Full text
Abstract:
The ultimate hope and great challenge undertaken by the authors of this volume is to improve disaster preparedness and response efforts globally by providing a standardized way to conduct rigorous and comprehensive scientific evaluative studies of the medical and public health response to these horrific events. It is our strongly held belief that the framework for the conduct of evaluative studies, as developed by specialists in scientific evaluation, offers the most appropriate and comprehensive structure for such studies. Our ‘eight-step approach’ is based upon a conceptual framework that is now widely used by health organizations globally as a basis for the evaluation of community-based medical and public health programs. We contend that many more disaster-related injuries and deaths can be prevented if the concepts and methods of evaluation science are applied to disaster events. In Part 1 of this book we describe the basic concepts and scientific methods used by program evaluation scientists to assess the structure, process, and outcomes of medical and public health interventions. In addition, a detailed description of a comprehensive medical and public health response system is provided. In Part 2 we present an eight-step model for conducting an evaluative study of the response, again with a focus on the medical and public health components. Ethical issues that come into play in the conduct of evaluative disaster research, and how these should be addressed, are the focus of Chapter 13. The final chapter offers a look to the future as new technology for data collection becomes available. We are not so naïve as to believe that disaster preparedness and response will change as a direct result of the availability of scientifically conducted assessments.
Change requires a double-pronged commitment—leaders from both the ranks of government and of the health professions must carefully consider, fund, and adopt policy positions and programs that are based upon the findings and recommendations that emerge from scientific evaluation studies. That is the most certain pathway to a better future.

Book chapters on the topic "Component Description Framework"

1

Windmann, Stefan, and Christian Kühnert. "Information modeling and knowledge extraction for machine learning applications in industrial production systems." In Machine Learning for Cyber Physical Systems, 73–81. Berlin, Heidelberg: Springer Berlin Heidelberg, 2020. http://dx.doi.org/10.1007/978-3-662-62746-4_8.

Full text
Abstract:
In this paper, a new information model for machine learning applications is introduced, which allows for a consistent acquisition and semantic annotation of process data, structural information and domain knowledge from industrial production systems. The proposed information model is based on Industry 4.0 components and IEC 61360 component descriptions. To model sensor data, components of the OGC SensorThings model such as data streams and observations have been incorporated in this approach. Machine learning models can be integrated into the information model in terms of existing model-serving frameworks like PMML or TensorFlow graphs. Based on the proposed information model, a tool chain for automatic knowledge extraction is introduced, and the automatic classification of unstructured text is investigated as a particular application case for the proposed tool chain.
2

Bhatti, Rafae, Daniel Sanz, Elisa Bertino, and Arif Ghafoor. "A Policy-Based Authorization Framework for Web Services." In Securing Web Services, 138–61. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-639-6.ch006.

Full text
Abstract:
This chapter describes a policy-based authorization framework to apply fine-grained access control on Web services. The framework is designed as a profile of the well-known WS-Policy specification tailored to meet the access control requirements in Web services by integrating WS-Policy with an access control policy specification language, X-GTRBAC. The profile is aimed at bridging the gap between available policy standards for Web services and existing policy specification languages for access control. The profile supports the WS-Policy Attachment specification, which allows separate policies to be associated with multiple components of a Web service description, and one of our key contributions is an algorithm to compute the effective policy for the Web service given the multiple policy attachments. To allow Web service applications to use our solution, we have adopted a component-based design approach based on well-known UML notations. We have also prototyped our architecture in a loosely coupled Web services environment.
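The effective-policy computation highlighted above can be pictured, in a much simplified form, as merging the assertions of every policy attached to the components of a Web service description. The Python sketch below is a hedged illustration only: each policy is reduced to a flat set of required assertions whose union is the effective requirement, and the scope names and assertion labels are invented, not taken from the chapter or the WS-Policy specification.

```python
# Hypothetical sketch: merging policies attached at several scopes of a
# service description into one effective requirement set. Real WS-Policy
# effective policies combine alternatives, which this simplification omits.

attachments = {
    "service":   {"wsse:SecurityToken"},
    "endpoint":  {"wsrm:ReliableMessaging"},
    "operation": {"wsse:SignedParts", "wsse:SecurityToken"},
}

def effective_policy(scopes):
    """Union the assertion sets attached at every scope (conjunction)."""
    merged = set()
    for assertions in scopes.values():
        merged |= assertions
    return merged

print(sorted(effective_policy(attachments)))
```

A request to the operation would then have to satisfy all three merged requirements at once, which is the intuition behind computing one effective policy from multiple attachments.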
3

Dahanayake, Ajantha. "Computer Aided Method Engineering." In Successful Software Reengineering, 1–15. IGI Global, 2002. http://dx.doi.org/10.4018/978-1-931777-12-4.ch001.

Full text
Abstract:
The relationship between information systems development methods, organizational information systems engineering requirements, and the advantage of flexible automated support environments is presented. CASE technology is presented as a possible solution to provide flexible automated support. The major topic of this chapter is a conceptual model to specify the functionality of a support environment. First, a review of a number of basic concepts and approaches for deriving models for CASE environments is given. An informal description of the service component concepts used to derive a generic framework is presented. Further, a configuration of service components to support Computer Aided Method Engineering (CAME) is outlined.
4

Li, Yinsheng, Hamada Ghenniwa, and Weiming Shen. "Service-Oriented Agents and Meta-Model Driven Implementation." In Service-Oriented Software System Engineering, 270–91. IGI Global, 2005. http://dx.doi.org/10.4018/978-1-59140-426-2.ch013.

Full text
Abstract:
Current efforts have not enforced Web services as loosely coupled and autonomous entities. Web services and software agents have gained different focuses and accomplishments due to their development and application backgrounds. This chapter proposes service-oriented agents (SOAs) to unify Web services and software agents. Web services features can be well realized by introducing software agents' sophisticated software modeling and interaction behaviors. We present a natural framework to integrate their related technologies into a cohesive body. Several critical challenges with SOAs have been addressed. The concepts, system and component structures, a meta-model-driven semantic description, agent-oriented knowledge representation, and an implementation framework are proposed and investigated. They address the identified shortcomings of Web services technologies, such as dynamic composition, semantic description, and implementation frameworks. A prototype of the proposed SOAs implementation framework has been implemented. Several economic services are running on it.
5

Radenkovic, Sonya, Nenad Krdžavac, and Vladan Devedžic. "Towards More Intelligent Assessment Systems." In Technology Enhanced Learning, 258–83. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-600-6.ch011.

Full text
Abstract:
This chapter presents a framework for intelligent analysis of the students’ knowledge in assessment systems, using description logics (DLs) reasoning techniques. The framework is based on Model Driven Architecture (MDA) software engineering standards. It starts from the IMS Question and Test Interoperability (QTI) standard and includes MDA-based metamodel and model transformations for QTI assessment systems. It also specifies an architecture for QTI assessment systems that is reusable, extensible, and facilitates interoperability between its component systems. An implementation of the QTI metamodel and the relevant example of transformations is provided in order to support developments according to the proposed framework.
6

Lu, Gehao, and Joan Lu. "Neural Trust Model." In Examining Information Retrieval and Image Processing Paradigms in Multidisciplinary Contexts, 296–316. IGI Global, 2017. http://dx.doi.org/10.4018/978-1-5225-1884-6.ch017.

Full text
Abstract:
The problems found in the existing models push the researcher to look for a better solution for computational trust and computational reputation. Given the problems exposed earlier, the newly proposed model should be a systematic model that supports both trust and reputation. The model should also take the learning capability of agents into consideration, because agents cannot quickly adapt to changes without learning. The model also needs the ability to make decisions according to its recognition of trust. Before actually building the model, it is necessary to analyze the concept of trust. Usually when people say trust they mean human trust; in this research, however, trust refers to computational trust. How human trust differs from computational trust is a very interesting question. The answers to this question helped the researcher uncover many features of computational trust and build a solid theoretical foundation for the proposed model. The definitions of trust in different disciplines such as economics, sociology and psychology will be compared. A possible definition of computational trust will be made, and such trust will be analyzed from several different perspectives. The description of the model is important. As a whole, it is represented as a framework that defines components and component relationships. For the concrete components, the purposes and responsibilities of each specific component are explained. This is to illustrate the static structure of the model. The dynamic structure of the model is described as the process of executing the model.
APA, Harvard, Vancouver, ISO, and other styles
7

Nowack, Kenneth M. "From Insight to Successful Behavior Change." In Handbook of Strategic 360 Feedback, 175–92. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780190879860.003.0011.

Full text
Abstract:
This chapter provides an integrated and theoretically derived individual change framework within strategic 360 Feedback interventions to facilitate successful behavioral change in the face of realistic issues and potential challenges. A brief description of a new individual change model is introduced and issues specific to each stage are discussed. The importance of this individual behavior change model is that it highlights the diverse roles of the practitioner, employee, and organization that appear throughout the 360 Feedback literature to facilitate accurate self-awareness, self-directed learning, goal setting processes, deliberate practice, and evaluation. Practitioners are provided with practical tips and suggestions to maximize understanding, acceptance, motivation, and successful goal accomplishment—the real impact of strategic 360 Feedback interventions that also has a development component.
APA, Harvard, Vancouver, ISO, and other styles
8

Bouras, Athanasios, Panagiotis Gouvas, and Gregoris Mentzas. "A Semantic Service-Oriented Architecture for Business Process Fusion." In Electronic Business, 504–32. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-056-1.ch032.

Full text
Abstract:
Most enterprises contain several heterogeneous systems, creating a fuzzy network of interconnected applications, services, and data sources. In this emerging business context, a clear need appears to link these formerly incompatible systems by using enterprise application integration (EAI) solutions. We propose a semantically enriched service-oriented business applications (SE-SOBA) framework that will provide a dynamically reconfigurable architecture enabling enterprises to respond quickly and flexibly to market changes. We also propose the development of a pure semantic-based implementation of the universal description, discovery, and integration (UDDI) specification, called pure semantic registry (PSR), which provides a flexible, extendable core architectural component allowing the deployment and business exploitation of Semantic Web services. The implementation of PSR involves the development of a semantic-based repository and an embedded Resource Description Framework (RDF)-based reasoning engine, providing strong query and inference capabilities to support effective service discovery and composition. We claim that when SE-SOBAs are combined with PSR and rule-based formalizations of business scenarios and processes, they constitute a holistic business-driven semantic integration framework, called FUSION, applied to intra- and inter-organizational EAI scenarios.
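The service lookup a semantic registry performs can be illustrated, in miniature, as pattern matching over RDF-style triples. This is only a sketch of the general idea; the predicate names and matching rules below are illustrative, not PSR's actual schema or query language.

```python
# Minimal triple-pattern matcher, illustrating the kind of query a
# semantic service registry answers. Predicate names are hypothetical.

def match(triples, pattern):
    """Return all triples matching a (subject, predicate, object)
    pattern, where None acts as a wildcard."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]
```

For example, `match(triples, (None, "hasCategory", "cat:finance"))` would retrieve every registered service tagged with the (hypothetical) finance category.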
APA, Harvard, Vancouver, ISO, and other styles
9

Bouras, Athanasios, Panagiotis Gouvas, and Gregoris Mentzas. "A Semantic Service-Oriented Architecture for Business Process Fusion." In Semantic Web Technologies and E-Business, 40–76. IGI Global, 2007. http://dx.doi.org/10.4018/978-1-59904-192-6.ch002.

Full text
Abstract:
Most enterprises contain several heterogeneous systems, creating a fuzzy network of interconnected applications, services, and data sources. In this emerging business context, a clear need appears to link these formerly incompatible systems by using enterprise application integration (EAI) solutions. We propose a semantically enriched service-oriented business applications (SE-SOBA) framework that will provide a dynamically reconfigurable architecture enabling enterprises to respond quickly and flexibly to market changes. We also propose the development of a pure semantic-based implementation of the universal description, discovery, and integration (UDDI) specification, called pure semantic registry (PSR), which provides a flexible, extendable core architectural component allowing the deployment and business exploitation of Semantic Web services. The implementation of PSR involves the development of a semantic-based repository and an embedded Resource Description Framework (RDF)-based reasoning engine, providing strong query and inference capabilities to support effective service discovery and composition. We claim that when SE-SOBAs are combined with PSR and rule-based formalizations of business scenarios and processes, they constitute a holistic business-driven semantic integration framework, called FUSION, applied to intra- and inter-organizational EAI scenarios.
APA, Harvard, Vancouver, ISO, and other styles
10

"Pacific Salmon Environmental and Life History Models: Advancing Science for Sustainable Salmon in the Future." In Pacific Salmon Environmental and Life History Models: Advancing Science for Sustainable Salmon in the Future, edited by Mark D. Scheuerell and Ray Hilborn. American Fisheries Society, 2009. http://dx.doi.org/10.47886/9781934874097.ch11.

Full text
Abstract:
The 1996 Sustainable Fisheries Act states that all federal fisheries management plans should contain a description of essential fish habitat (EFH). While much emphasis has been placed on estimating EFH for marine stocks, very little attention has been paid to doing so for Pacific salmon Oncorhynchus spp., in part due to their complex life histories. An earlier assessment of EFH for Pacific salmon across the west coast of the United States focused on the freshwater component of EFH due to limited knowledge about marine distributions. That analysis concluded that a more in-depth and smaller-scale examination was needed to assess how freshwater habitat affects the various life stages. Here we use a detailed life history model for Pacific salmon to estimate the freshwater component of EFH for two threatened populations of Chinook salmon within a large watershed draining into Puget Sound, Washington, USA. By accounting for proposed harvest rates, hatchery practices, and habitat structure, we identified 23 of 50 subbasins as EFH for ensuring no significant decrease in the total number of spawners relative to current average escapement. Our analytical framework could be easily applied to other populations or species of salmon to aid in developing recovery and management plans.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Component Description Framework"

1

Dani, Tushar H., and Rajit Gadh. "A Framework for Designing Component Shapes in a Virtual Reality Environment." In ASME 1997 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1997. http://dx.doi.org/10.1115/detc97/dfm-4372.

Full text
Abstract:
Despite advances in Computer-Aided Design (CAD) and the evolution of graphical user interfaces, rapid creation, editing, and visualization of three-dimensional (3D) shapes remains a tedious task. Though the availability of Virtual Reality (VR)-based systems allows enhanced three-dimensional interaction and visualization, the use of VR for ab initio shape design, as opposed to 'importing' models from existing CAD systems, is a relatively new area of research. Of interest are computer-human interaction issues and the design and geometric tools for shape modeling in a Virtual Environment (VE). The focus of this paper is on the latter, i.e., defining the geometric tools required for a VR-CAD system and describing a framework that meets those requirements. This framework, the Virtual Design Software Framework (VDSF), consists of the interaction and design tools and an underlying geometric engine that provides the representation and algorithms required by these tools. The geometric engine, called the Virtual Modeler, uses a graph-based representation (Shape-Graph) for modeling the shapes created by the user. The Shape-Graph facilitates interactive editing by localizing the effect of editing operations and, in addition, provides constraint-based design and editing mechanisms that are useful in a 3D interactive virtual environment. The paper concludes with a description of the prototype system, called the Virtual Design Studio (VDS), that is currently being implemented.
APA, Harvard, Vancouver, ISO, and other styles
2

Yoo, John Jung-Woon, and Anirudh Aryasomayajula. "Branch-and-Bound Algorithm for Interface-Based Modular Product Design." In ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/detc2012-70712.

Full text
Abstract:
In our earlier work, we proposed a collaboration system for modular product design. One of the main components of the system is a design repository to which suppliers can upload their component descriptions using a machine-readable, interface-based component description language, so that manufacturers can refer to the descriptions during product design phases. A mathematical formulation for modular product design has been proposed based on an Artificial Intelligence planning framework. The proposed Binary Integer Programming formulation generates the optimal design of a product. The optimal design consists of multiple components that are compatible with each other in terms of input and output interfaces. However, the mathematical approach faces a scalability issue. The development of a heuristic algorithm that generates a high-quality solution within a reasonable amount of time is the final goal of the research. In this paper, we propose an algorithmic approach based on the branch-and-bound method as an intermediate step toward that goal. This paper describes the details of the proposed branch-and-bound algorithm using a case study, and experimental results are discussed.
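The core idea of branch-and-bound for interface-compatible component selection can be sketched as follows. This is a toy illustration under simplifying assumptions, not the paper's Binary Integer Programming formulation: each module slot gets one component, a component is feasible when its required interfaces are provided by components already chosen, and total cost is minimized.

```python
# Hypothetical branch-and-bound over a component catalog. Each catalog
# entry is (name, cost, provides, requires); the catalog, slot names,
# and cost model are illustrative only.

def branch_and_bound(slots, catalog):
    """slots: ordered slot names; catalog: slot -> list of
    (name, cost, provides, requires). Returns (design, cost)."""
    best = {"cost": float("inf"), "design": None}

    def feasible(design, cand):
        provided = {p for _, _, provides, _ in design for p in provides}
        return all(r in provided for r in cand[3])

    def recurse(i, design, cost):
        if cost >= best["cost"]:      # bound: prune dominated branches
            return
        if i == len(slots):           # leaf: every slot is filled
            best["cost"], best["design"] = cost, [c[0] for c in design]
            return
        for cand in sorted(catalog[slots[i]], key=lambda c: c[1]):
            if feasible(design, cand):    # branch only on compatible parts
                recurse(i + 1, design + [cand], cost + cand[1])

    recurse(0, [], 0.0)
    return best["design"], best["cost"]
```

On a tiny hypothetical catalog with a "motor" and a "gearbox" slot, the search fills slots in order, prunes any partial design whose cost already exceeds the best complete design, and returns the cheapest interface-compatible combination.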
APA, Harvard, Vancouver, ISO, and other styles
3

Sonneville, Valentin, Olivier A. Bauchau, and Olivier Brüls. "A Motion Formalism Approach to Modal Reduction for Flexible Multibody System Applications." In ASME 2018 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/detc2018-86143.

Full text
Abstract:
Multibody systems are often modeled as interconnected multibody and modal components: multibody components, such as rigid bodies, beams, plates, and kinematic joints, are treated via multibody techniques, whereas the modal components are handled via a modal reduction approach based on the small strain assumption. In this work, the problem is formulated within the framework of the motion formalism. The kinematic description involves simple, straightforward frame transformations and leads naturally to consistent deformation measures. Derivatives are expressed in local frames, which results in the remarkable property that the tangent matrices are independent of the position and orientation of the modal component with respect to an inertial frame. This implies a reduced level of geometric non-linearity as compared to the standard description. In particular, geometrically non-linear problems can be solved with the tangent matrices of the reference configuration, without re-evaluation and re-factorization.
APA, Harvard, Vancouver, ISO, and other styles
4

Scapolan, Matteo, Minghe Shan, and Olivier A. Bauchau. "Modal Reduction Procedures for Flexible Multibody System Applications." In ASME 2020 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/detc2020-22149.

Full text
Abstract:
The comprehensive simulation of flexible multibody systems calls for the ability to model various types of structural components such as rigid bodies, beams, plates, and kinematic joints. Modal components offer additional modeling versatility by enabling the treatment of complex, three-dimensional structures via modal reduction procedures based on the small deformation assumption. In this paper, the problem is formulated within the framework of the motion formalism. The kinematic description involves simple, straightforward frame transformations and leads to deformation measures that are both objective and tensorial. Derivatives are expressed in the material frame, which results in the remarkable property that the tangent matrices are independent of the configuration of the modal component with respect to an inertial frame. This implies a reduced level of geometric nonlinearity as compared to the standard description. In particular, geometrically nonlinear problems can be solved with the constant tangent matrices of the reference configuration, without re-evaluation and re-factorization.
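In generic form, the modal reduction step mentioned in the two abstracts above projects the full finite element coordinates onto a small modal basis; the specific basis choice and frame conventions of these papers are not reproduced here:

```latex
% Generic modal reduction: project the n full-order DOFs u onto a
% reduced basis \Phi \in \mathbb{R}^{n \times m}, with m \ll n:
u \approx \Phi\, q, \qquad
M_r = \Phi^{T} M \Phi, \qquad
K_r = \Phi^{T} K \Phi,
```

so that the reduced equations of motion read $M_r \ddot{q} + K_r q = \Phi^{T} f$, with far fewer unknowns $q$ than the original displacements $u$.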
APA, Harvard, Vancouver, ISO, and other styles
5

Liu, Xiaodong, and Richard A. Wysk. "The Technology of Components in Feature-Based Design and Process Planning." In ASME 1999 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1999. http://dx.doi.org/10.1115/detc99/cie-9100.

Full text
Abstract:
In current feature-based design and process planning research, two major problems are the application-system development mode and product information description. In particular, under the current development mode, programs from other researchers cannot be reused: each researcher has to redevelop a new set of feature-application code, almost from scratch, to create features and/or produce process plans. As a result, although countless feature-application programs have been developed, the problem of integrating CAD and CAM applications remains largely unsolved. From the viewpoint of component technology (a new, powerful software development technology), a large amount of time is being wasted on repetitive and ineffective work, orders of magnitude greater than necessary. By solving several key obstacles that prevent the development of small, reusable components in advanced CAD/CAM, this paper introduces component technology, previously unexplored in these fields, and gives a concrete example (MFIC) in the machinable-feature application domain to demonstrate how to apply the technology in a specific field. The framework in the paper is the first component framework for advanced CAD/CAM applications. Because component technology provides developers with greater reusability, extensibility, reliability, and compatibility, it has been widely accepted as the future of software development. We believe component technology is the future of software development in advanced CAD/CAM as well, and that the component framework will change the development mode of advanced CAD/CAM applications, just as interchangeability changed the manufacturing mode in the mechanical industry.
APA, Harvard, Vancouver, ISO, and other styles
6

Elkady, Ayssam, Jovin Joy, and Tarek Sobh. "A Plug and Play Middleware for Sensory Modules, Actuation Platforms and Task Descriptions in Robotic Manipulation Platforms." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-47185.

Full text
Abstract:
We are developing a framework (RISCWare) for the modular design and integration of sensory modules, actuation platforms, and task descriptions that will be implemented as a tool to reduce the effort of designing and utilizing robotic platforms. The framework is used to customize robotic platforms by simply defining the available sensing devices, actuation platforms, and required tasks. The main purpose of designing this framework is to reduce the time and complexity of robotic software development and maintenance costs and to improve code and component reusability. Use of the proposed framework prevents the need to redesign or rewrite algorithms or applications due to changes in the robot's platform or operating system, or the introduction of new functionalities. In this paper, the RISCWare framework is developed and described. RISCWare is a robotic middleware used for the integration of heterogeneous robotic components. It consists of three modules. The first is the sensory module, which represents sensors that collect information about the remote or local environment. The platform module defines the robotic platforms and actuation methods. The last module is the task-description module, which defines the tasks and applications that the platforms will perform, such as teleoperation, navigation, obstacle avoidance, manipulation, 3-D reconstruction, and map building. The plug-and-play approach is one of the key features of RISCWare; it allows auto-detection and auto-reconfiguration of the attached standardized components (hardware and software) according to the current system configuration. These components can be dynamically available or unavailable. Dynamic reconfiguration provides the facility to modify a system during its execution and can be used to apply patches and updates, to implement adaptive systems, or to support third-party modules. This automatic detection and reconfiguration of devices and driver software makes it easier and more efficient for end users to add and use new devices and software applications. In addition, software components should be written flexibly to make better use of hardware resources and should be easy to install and uninstall. Several experiments, performed on the RISCbot II mobile manipulation platform, are described and implemented to evaluate the RISCWare framework with respect to applicability and resource utilization.
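The plug-and-play registration idea described above can be sketched as a small in-process registry that notifies listeners when modules attach or detach. RISCWare's actual interfaces are not given in the abstract, so the module categories and method names below are hypothetical.

```python
# Minimal sketch of plug-and-play module registration with change
# notification. Categories and callback signature are hypothetical,
# not RISCWare's actual API.

class ModuleRegistry:
    CATEGORIES = ("sensor", "platform", "task")

    def __init__(self):
        self._modules = {c: {} for c in self.CATEGORIES}
        self._listeners = []

    def on_change(self, callback):
        """Register a callback invoked on every attach/detach event."""
        self._listeners.append(callback)

    def attach(self, category, name, driver):
        if category not in self._modules:
            raise ValueError(f"unknown category: {category}")
        self._modules[category][name] = driver
        self._notify("attached", category, name)

    def detach(self, category, name):
        self._modules[category].pop(name, None)
        self._notify("detached", category, name)

    def available(self, category):
        """Names of currently attached modules in a category, sorted."""
        return sorted(self._modules[category])

    def _notify(self, event, category, name):
        for cb in self._listeners:
            cb(event, category, name)
```

A task module could subscribe via `on_change` and reconfigure itself whenever a sensor appears or disappears, which is the essence of the dynamic reconfiguration the authors describe.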
APA, Harvard, Vancouver, ISO, and other styles
7

Kaleel, Ibrahim, Marco Petrolo, Erasmo Carrera, and Anthony M. Waas. "Micromechanical Progressive Failure Analysis of Fiber-Reinforced Composite Using Refined Beam Models." In ASME 2017 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/imece2017-71304.

Full text
Abstract:
An efficient and novel micromechanical computational platform for progressive failure analysis of fiber reinforced composites is presented. The numerical framework is based on a class of refined beam models called Carrera Unified Formulation (CUF), a generalized hierarchical formulation which yields a refined structural theory via variable kinematic description. The crack band theory is implemented in the framework to capture the damage propagation within the constituents of composite materials. A representative volume element (RVE) containing randomly distributed fibers is modeled using the Component-Wise approach (CW), an extension of CUF beam model based on Lagrange type polynomials. The efficiency of the proposed numerical framework is achieved through the ability of the CUF models to provide accurate three-dimensional displacement and stress fields at a reduced computational cost.
APA, Harvard, Vancouver, ISO, and other styles
8

Iacob, Robert, Peter Mitrouchev, and Jean-Claude Le´on. "A Simulation Framework for Assembly/Disassembly Process Modeling." In ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/detc2007-34804.

Full text
Abstract:
Simulations of Assembly/Disassembly (A/D) processes cover a large range of objectives, i.e., A/D sequencing, path finding, ergonomic analysis …, where the 3D shape description of the component plays a key role. In addition, A/D simulations can be performed either from an automated or an interactive point of view, using standard computer equipment or immersive, real-time simulation schemes. In order to address this diversity of configurations, this paper presents a simulation framework for A/D analysis based on a new simulation preparation process that allows a simulation process to address up to two types of shape representations at the same time, i.e., B-Rep NURBS and polyhedral ones, thus handling efficiently the configurations where 3D shape representations of assemblies play a key role. In order to illustrate the simulation preparation process, some specific steps are addressed. To this end, the automatic identification of contacts in a 3D product model and their corresponding list is described. After this first stage of identification, an interpretation of the results is needed in order to obtain the complete list of mechanical contacts for a product. During the preparation process, three major stages of the framework are detailed: model tessellation, surface merging, and contact identification. Our framework is based on the STEP exchange format. The contacts are related to basic geometric surfaces such as planes, cylinders, cones, and spheres. Some examples are provided to illustrate the contributions of the proposed framework. This software environment can assist designers in achieving a satisfactory assembly analysis rapidly and can reduce the lead time of product development. A further consequence of the present work is its ability to produce models and treatments that improve the integration of assembly models in immersive environments, taking into account the haptic and visual models needed.
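The contact-identification stage described above can be illustrated with purely geometric checks between primitive surfaces. This is a toy sketch under a hypothetical tolerance: only two of the surface pairs the authors mention (plane-plane contact and coaxial cylinder fits) are covered, and real CAD contact detection also checks trimmed surface extents, not just the underlying geometry.

```python
# Toy contact classification between primitive surfaces, illustrating
# the kind of test the contact-identification stage performs.
TOL = 1e-6


def planes_in_contact(p1, p2):
    """Each plane: (point, unit normal). Contact = coplanar, opposed normals."""
    (o1, n1), (o2, n2) = p1, p2
    opposed = abs(sum(a * b for a, b in zip(n1, n2)) + 1.0) < TOL
    gap = sum((a - b) * n for a, b, n in zip(o2, o1, n1))
    return opposed and abs(gap) < TOL


def cylinders_in_contact(c1, c2):
    """Each cylinder: (axis point, unit axis dir, radius).
    Contact = coaxial axes with matching radii (a fit)."""
    (a1, d1, r1), (a2, d2, r2) = c1, c2
    parallel = abs(abs(sum(u * v for u, v in zip(d1, d2))) - 1.0) < TOL
    off = tuple(p - q for p, q in zip(a2, a1))
    # the axis-point offset must have no component perpendicular to the axis
    cross = (off[1] * d1[2] - off[2] * d1[1],
             off[2] * d1[0] - off[0] * d1[2],
             off[0] * d1[1] - off[1] * d1[0])
    coaxial = parallel and all(abs(c) < TOL for c in cross)
    return coaxial and abs(r1 - r2) < TOL
```

In a full pipeline, such pairwise tests run after tessellation and surface merging, and their hits are then interpreted to build the product's mechanical contact list.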
APA, Harvard, Vancouver, ISO, and other styles
9

Khalifeh, Elias, Elsa Piollet, Antoine Millecamps, and Alain Batailly. "Non-Linear Modeling of Centrifugal Stiffening Effects for Accurate Bladed Component Reduced-Order Models." In ASME Turbo Expo 2017: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/gt2017-63629.

Full text
Abstract:
The modeling of centrifugal stiffening effects on bladed components is of primary importance in order to accurately capture their dynamics as a function of the rotor angular speed. Centrifugal effects impact both the stiffness of the component and its geometry. In the context of the small perturbation framework, when considering a linear finite element model of the component, an assumption typically made in the scientific literature involves a fourth-order polynomial development of the stiffness matrix in terms of the angular speed. This polynomial development may fail to provide an accurate representation of the geometry evolution of a blade. Indeed, the error on the blade-tip displacement associated with the use of a linear finite element model quickly reaches the same order of magnitude as the blade-tip/casing clearance itself, thus yielding a 100% error on the blade-tip/casing clearance configuration. This article presents a methodology for creating accurate reduced-order models of a 3D finite element model accounting for centrifugal stiffening, with a very precise description of the blade-tip/casing clearance configuration throughout a given angular speed range. The quality of the obtained reduced-order model is underlined before its numerical behaviour in the context of non-linear dynamic simulations is investigated. It is evidenced that the new reduced-order model features specific interactions that could not be predicted with a linear model. In addition, results highlight the limitations of numerical predictions made for high angular speeds with a linear model. Finally, particular attention is paid to the numerical sensitivity of the proposed model. As a downside of its increased accuracy, it is underlined that its computation must be done carefully in order to avoid numerical instabilities.
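The polynomial development of the stiffness matrix mentioned in the abstract is commonly written as an even expansion in the angular speed; the specific matrices used by the authors are not given here, so this is only the generic form:

```latex
% Generic fourth-order polynomial development of the stiffness matrix
% in the rotor angular speed \Omega (even powers only, since the
% centrifugal loading is invariant under reversal of rotation):
K(\Omega) \approx K_0 + \Omega^{2} K_2 + \Omega^{4} K_4
```

where $K_0$ is the stiffness matrix at rest and $K_2$, $K_4$ gather the speed-dependent stiffening and softening contributions.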
APA, Harvard, Vancouver, ISO, and other styles
10

Myers, Michael R. "Rapid Qualification of Additive Manufactured Parts Using OpenMETA." In ASME 2019 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/imece2019-10981.

Full text
Abstract:
This paper focuses on a method to integrate additive manufacturing (AM) structure, processing, and property-geometry modeling methods to facilitate the qualification and certification of AM-fabricated metal parts and enable their rapid deployment. The conventional approach to qualifying AM parts is to destructively evaluate a significant number of parts, measure the properties of interest, and look for anomalies in each part. This approach also requires statistical sampling of parts for destructive testing to verify the process is still operating correctly over time. This is costly, time consuming, and negates much of the benefits offered by AM. The approach outlined in this paper leverages OpenMETA, a suite of tools that provide a unified design space, a unified system representation across engineering specialties, and multidisciplinary workflow for optimization, design-of-experiments, and trade-off studies. OpenMETA provides the ability to conduct high-fidelity, low-fidelity, and hybrid analyses within one framework. Test benches are built within OpenMETA that capture the requirements for the component in an executable manner to support automated analysis such as thermal analysis, finite element analysis, and high-fidelity physics-based (HFPB) analysis. Test benches also describe the intended environment in which the component will be used after manufacture. This description of the environment might include details of the surrounding components, the interfaces to those components, and corrosive agents. Test bench results are used to estimate and assess component performance related to requirements. This paper focuses on one technology for manufacturing the parts, a wire-fed, robotic, pulsed-arc AM process, although the OpenMETA platform can be applied to other AM technologies.
APA, Harvard, Vancouver, ISO, and other styles