To see the other types of publications on this topic, follow the link: Component Description Framework.

Journal articles on the topic 'Component Description Framework'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Component Description Framework.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Pârv, Bazil, Ioan Lazăr, and Simona Motogna. "COMDEVALCO framework - the modeling language for procedural paradigm." International Journal of Computers Communications & Control 3, no. 2 (June 1, 2008): 183. http://dx.doi.org/10.15837/ijccc.2008.2.2386.

Full text
Abstract:
This work is part of a series referring to COMDEVALCO - a framework for Software Component Definition, Validation, and Composition. Its constituents are a modeling language, a component repository, and a set of tools. This is the first paper describing the object-oriented modeling language, which contains fine-grained constructs aimed at giving a precise description of software components. The current status of the language reflects the constructs of the procedural paradigm.
APA, Harvard, Vancouver, ISO, and other styles
2

Li, Shao Bo, Yao Hu, and Qing Sheng Xie. "Heterogeneous System Integration Based on Service Component." Applied Mechanics and Materials 20-23 (January 2010): 1305–10. http://dx.doi.org/10.4028/www.scientific.net/amm.20-23.1305.

Full text
Abstract:
To address the problem of heterogeneous system integration, a service-oriented component integration model is proposed and a component-based integration framework is constructed in this paper, based on research into service components and the integrated structure of SOA. The structure of the service components is designed, and their working principle is analyzed. A WEB service and a registration description criterion for the system are defined. The paper further studies the key techniques of heterogeneous information integration and the mapping model. Finally, a heterogeneous-system integration framework based on service components is designed and implemented with examples.
APA, Harvard, Vancouver, ISO, and other styles
3

Spekman, Nancy J., and Froma P. Roth. "An Intervention Framework for Learning Disabled Students with Communication Disorders." Learning Disability Quarterly 11, no. 3 (August 1988): 248–56. http://dx.doi.org/10.2307/1510769.

Full text
Abstract:
This paper presents an intervention framework for the management of communication disorders in learning disabled children. The model is comprised of three major components: communicative intentions, presupposition, and the social organization of discourse. A description of each component is provided along with a review of relevant research. Finally, a set of general instructional guidelines and principles is presented.
APA, Harvard, Vancouver, ISO, and other styles
4

Alonso, Diego, Francisco Sánchez-Ledesma, Pedro Sánchez, Juan A. Pastor, and Bárbara Álvarez. "Models and Frameworks: A Synergistic Association for Developing Component-Based Applications." Scientific World Journal 2014 (2014): 1–17. http://dx.doi.org/10.1155/2014/687346.

Full text
Abstract:
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability, either of designs or of component-based implementations. This paper, which is based on the model-driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications at a higher level of abstraction than objects with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.
APA, Harvard, Vancouver, ISO, and other styles
5

KUZEMSKY, A. L. "TWO-COMPONENT ALLOY MODEL FOR BISMUTHATE CERAMICS." Modern Physics Letters B 10, no. 14 (June 20, 1996): 627–33. http://dx.doi.org/10.1142/s0217984996000699.

Full text
Abstract:
The disordered binary substitutional A1−xBx alloy model has been proposed for the description of the normal and superconducting properties of the bismuthate ceramics Ba(Pb,Bi)O3. The Eliashberg-type equations for strong-coupling superconductivity in strongly disordered alloys have been used to describe the superconducting properties. The relevant configurational averaging has been performed in the framework of the CPA. The concentration dependence of the electron-phonon coupling constant λ(x) and the transition temperature Tc(x) has been calculated.
APA, Harvard, Vancouver, ISO, and other styles
6

KITCHAROENSAKKUL, SUPANAT, and VILAS WUWONGSE. "TOWARDS A UNIFIED VERSION MODEL USING THE RESOURCE DESCRIPTION FRAMEWORK (RDF)." International Journal of Software Engineering and Knowledge Engineering 11, no. 06 (December 2001): 675–701. http://dx.doi.org/10.1142/s0218194001000748.

Full text
Abstract:
Version control, a fundamental component of Software Configuration Management (SCM) — a key mechanism for software development especially in large-scale systems — lacks a unified foundation for interchangeability, interoperability, and integrability. A practical solution of such problems is not readily achieved because version controls of different SCM systems employ various version models. Hence a single unified version model is required. The work develops a single formalism to model major version control approaches as well as a prototype system. Its architecture is based on a federated SCM architecture and introduces a new SCM foundation which contains two layers: unified SCM representation and computation, based on the Resource Description Framework (RDF) and the Equivalent Transformation computational model, respectively. The evolution of software objects, their relationships as well as team cooperation strategies can be uniformly expressed under the representation layer. The computational layer is briefly mentioned for framework demonstration. Finally, a simple prototype system is presented.
APA, Harvard, Vancouver, ISO, and other styles
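The Resource Description Framework used in the paper above represents all facts as subject-predicate-object triples. As a minimal, stdlib-only sketch, versioned software objects could be expressed and queried like this; the predicates and identifiers (`scm:successorOf`, `file.c#v1`, etc.) are hypothetical, not taken from the paper's actual model:

```python
# Illustrative RDF-style triple store for version-control facts.
# Each fact is a (subject, predicate, object) triple.
triples = {
    ("file.c#v1", "rdf:type",        "scm:Version"),
    ("file.c#v2", "rdf:type",        "scm:Version"),
    ("file.c#v2", "scm:successorOf", "file.c#v1"),
    ("file.c#v2", "scm:author",      "alice"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return sorted(
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )

# Which version does v2 succeed?
print(query("file.c#v2", "scm:successorOf", None))
# All versions in the store:
print(query(None, "rdf:type", "scm:Version"))
```

Because every relationship (evolution of objects, authorship, team cooperation) is reduced to the same triple shape, heterogeneous version models can be merged into one store and queried uniformly, which is the interchangeability argument the abstract makes.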
7

Goddard, Nigel H., Michael Hucka, Fred Howell, Hugo Cornelis, Kavita Shankar, and David Beeman. "Towards NeuroML: Model Description Methods for Collaborative Modelling in Neuroscience." Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences 356, no. 1412 (August 29, 2001): 1209–28. http://dx.doi.org/10.1098/rstb.2001.0910.

Full text
Abstract:
Biological nervous systems and the mechanisms underlying their operation exhibit astonishing complexity. Computational models of these systems have been correspondingly complex. As these models become ever more sophisticated, they become increasingly difficult to define, comprehend, manage and communicate. Consequently, for scientific understanding of biological nervous systems to progress, it is crucial for modellers to have software tools that support discussion, development and exchange of computational models. We describe methodologies that focus on these tasks, improving the ability of neuroscientists to engage in the modelling process. We report our findings on the requirements for these tools and discuss the use of declarative forms of model description—equivalent to object-oriented classes and database schema—which we call templates. We introduce NeuroML, a mark-up language for the neurosciences which is defined syntactically using templates, and its specific component intended as a common format for communication between modelling-related tools. Finally, we propose a template hierarchy for this modelling component of NeuroML, sufficient for describing models ranging in structural levels from neuron cell membranes to neural networks. These templates support both a framework for user-level interaction with models, and a high-performance framework for efficient simulation of the models.
APA, Harvard, Vancouver, ISO, and other styles
8

PEÑA-MORA, FENIOSKY, SANJEEV VADHAVKAR, and SIVA KUMAR DIRISALA. "Component-based software development for integrated construction management software applications." Artificial Intelligence for Engineering Design, Analysis and Manufacturing 15, no. 2 (April 2001): 173–87. http://dx.doi.org/10.1017/s0890060401152054.

Full text
Abstract:
This paper presents a framework and a prototype for designing Integrated Construction Management (ICM) software applications using reusable components. The framework supports the collaborative development of ICM software applications by a group of ICM application developers drawing on a library of software components. The framework focuses on the use of an explicit software development process to capture and disseminate the specialized knowledge that augments the description of the ICM software application components in a library. The importance of preserving and using this knowledge has become apparent with the recent trend of combining the software development process with the software application code. There are three main components in the framework: design patterns, a design rationale model, and intelligent search algorithms. Design patterns have been chosen to represent, record, and reuse the recurring design structures and associated design experience in object-oriented software development. The Design Recommendation and Intent Model (DRIM) was extended in the current research effort to capture the specific implementation of reusable software components. DRIM provides a method by which design rationale from multiple ICM application designers can be partially generated, stored, and later retrieved by a computer system. To address the issues of retrieval, the paper presents a unique representation of a software component and a search mechanism, based on Reggia's set-cover algorithm, for retrieving a set of components that can be combined to provide the required functionality. This paper also details an initial, proof-of-concept prototype based on the framework. By supporting non-obtrusive capture as well as effective access of vital design rationale information regarding the ICM application development process, the framework described in this paper is expected to provide a strong information base for designing ICM software.
APA, Harvard, Vancouver, ISO, and other styles
9

Malyshev, V. P., and A. M. Makasheva. "Description of dynamic viscosity depending on the alloys composition and temperature using state diagrams." Izvestiya Visshikh Uchebnykh Zavedenii. Chernaya Metallurgiya = Izvestiya. Ferrous Metallurgy 61, no. 9 (October 21, 2018): 743–49. http://dx.doi.org/10.17073/0368-0797-2019-9-743-749.

Full text
Abstract:
The equilibrium nature of viscosity and fluidity is discovered on the basis of the Boltzmann distribution within the framework of the concept of randomized particles, as a result of the virtual presence of crystal-mobile, liquid-mobile and vapor-mobile particles. It allows one to consider the viscosity and fluidity of solutions, in particular melts of metal alloys, from the point of view of the equilibrium partial contributions of each component to the total viscosity and fluidity, despite the kinetic interpretation of the natural expressions for these properties of the liquid. A linearly additive partial expression of viscosity is possible only for perfect solutions, in this case for alloys with unrestricted mutual solubility of the components. Alloys with eutectics, chemical compounds and other features of the state diagram are characterized by viscosity dependencies that repeat the shape of the liquidus curve over the entire range of the alloy composition at different temperatures, with an increase in the smoothness and convergence of these curves at increasing temperature. It was established that these features of the temperature dependence of viscosity are completely revealed within the framework of the concept of randomized particles and the virtual cluster model of viscosity when calculating the fraction of clusters determining the viscosity of the alloy. The viscosity of the alloy is found by a formula in which the thermal energy RTcr at the liquidus temperature is the thermal barrier of chaotization, characterizing the crystallization temperature of the melt Tcr, as well as the melting point of pure substances. On this basis, a method is proposed for calculating alloy viscosity from phase diagrams, using the temperature dependences of the pure components' viscosity and changing the alloy's viscosity in proportion to the ratio of the cluster fractions at any temperature above the liquidus line and for the pure component, taking into account the mole fraction of each component.
As a result, a three-factor model of liquid alloy viscosity has been obtained in which the thermal barrier of chaotization RTcr is used as a variable for the first time. It determines the fraction of clusters both for pure substances (at RTcr = RTm) and for alloys. This thermal barrier reflects the essence of the virtual cluster theory of liquids and the adequacy of the concept of randomized particles.
APA, Harvard, Vancouver, ISO, and other styles
10

Zatsman, Igor, and Pavel Buntman. "Theoretical Framework and Denotatum-Based Models of Knowledge Creation for Monitoring and Evaluating R&D Program Implementation." International Journal of Software Science and Computational Intelligence 5, no. 1 (January 2013): 15–31. http://dx.doi.org/10.4018/ijssci.2013010102.

Full text
Abstract:
The paper presents two semiotic models for describing the development stages of indicators, including the processes by which expert knowledge about the developed indicators is generated. For the description of the stages of these processes, a new notion of "Frege's space" is introduced. The authors describe two applied semiotic models of knowledge acquisition. These processes were studied in the context of forming an expert knowledge base, named a proactive dictionary, which is a component of an evaluation system. This dictionary enables experts to fix the stages of indicator development, to present the results of developing different variants of indicators in graphic form, and to compare and evaluate these variants.
APA, Harvard, Vancouver, ISO, and other styles
11

Horvath, Imre. "Combining Unmergeables: A Methodological Framework for Axiomatic Fusion of Qualitative Design Theories." Proceedings of the Design Society: International Conference on Engineering Design 1, no. 1 (July 2019): 3591–600. http://dx.doi.org/10.1017/dsi.2019.366.

Full text
Abstract:
The proposed methodological framework concerns axiomatic theory fusion (ATF) of non-additive engineering design theories. ATF includes seven steps: (i) semantic discretization of the composite theories, (ii) deriving epistemological entities by logical/semantic analysis, (iii) establishing and representing relations among all relevant epistemological entities, (iv) combining the inter-theoretical epistemological entities of the component theories, (v) deriving propositions based on the combined set of epistemological entities, (vi) transcription of the epistemological entities and propositions into a textual/visual theory description, and (vii) validation of the resultant theory in application contexts. The proposed framework makes ATF an effective, content-independent methodology for fusing component theories, no matter whether they are descriptive, explanatory, predictive or controlling in nature. The ATF methodology requires professional comprehension and rigor from researchers. It is necessary to justify the logical correctness and practical validity of the target theory in the specific application context.
APA, Harvard, Vancouver, ISO, and other styles
12

FRANK, MICHAEL, and MICHAEL CODISH. "Logic Programming with Graph Automorphism: Integrating nauty with Prolog (Tool Description)." Theory and Practice of Logic Programming 16, no. 5-6 (September 2016): 688–702. http://dx.doi.org/10.1017/s1471068416000223.

Full text
Abstract:
This paper presents the pl-nauty library, a Prolog interface to the nauty graph-automorphism tool. Adding the capabilities of nauty to Prolog combines the strength of the "generate and prune" approach that is commonly used in logic programming and constraint solving with the ability to reduce symmetries while reasoning over graph objects. Moreover, it enables the integration of nauty in existing tool-chains, such as SAT-solvers or finite-domain constraint compilers, which exist for Prolog. The implementation consists of two components: pl-nauty, an interface connecting nauty's C library with Prolog, and pl-gtools, a Prolog framework integrating the software component of nauty, called gtools, with Prolog. The complete tool is available as a SWI-Prolog module. We provide a series of usage examples, including two that apply to generate Ramsey graphs.
APA, Harvard, Vancouver, ISO, and other styles
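The "generate and prune" idea behind pl-nauty can be illustrated without nauty itself. The brute-force, stdlib-only Python sketch below enumerates all graphs on a small vertex set and prunes isomorphic duplicates by comparing a canonical form computed over every vertex permutation; nauty computes such canonical forms vastly more efficiently, so this is only a toy demonstration of the principle:

```python
from itertools import combinations, permutations

def canonical_form(n, edges):
    """Smallest relabelling of the edge set over all vertex permutations."""
    best = None
    for perm in permutations(range(n)):
        relabelled = frozenset(
            tuple(sorted((perm[u], perm[v]))) for u, v in edges
        )
        key = tuple(sorted(relabelled))
        if best is None or key < best:
            best = key
    return best

def nonisomorphic_graphs(n):
    """Yield one representative per isomorphism class of n-vertex graphs."""
    all_edges = list(combinations(range(n), 2))
    seen = set()
    for k in range(len(all_edges) + 1):          # generate ...
        for subset in combinations(all_edges, k):
            form = canonical_form(n, subset)
            if form not in seen:                 # ... and prune duplicates
                seen.add(form)
                yield subset

print(len(list(nonisomorphic_graphs(3))))  # 4 non-isomorphic graphs on 3 vertices
```

Pruning by canonical form is exactly what makes symmetry reduction pay off in combinatorial search such as the Ramsey-graph generation mentioned in the abstract: each isomorphism class is explored once instead of once per relabelling.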
13

Crofts, S. B., S. M. Smith, and P. S. L. Anderson. "Beyond Description: The Many Facets of Dental Biomechanics." Integrative and Comparative Biology 60, no. 3 (July 11, 2020): 594–607. http://dx.doi.org/10.1093/icb/icaa103.

Full text
Abstract:
Teeth lie at the interface between an animal and its environment and, with some exceptions, act as a major component of resource procurement through food acquisition and processing. Therefore, the shape of a tooth is closely tied to the type of food being eaten. This tight relationship is of use to biologists describing the natural history of species and, given the high instance of tooth preservation in the fossil record, is especially useful for paleontologists. However, correlating gross tooth morphology to diet is only part of the story, and much more can be learned through the study of dental biomechanics. We can explore the mechanics of how teeth work, how different shapes evolved, and the underlying forces that constrain tooth shape. This review aims to provide an overview of the research on dental biomechanics, in both mammalian and non-mammalian teeth, and to synthesize two main approaches to dental biomechanics to develop an integrative framework for classifying and evaluating dental functional morphology. This framework relates food material properties to the dynamics of food processing, in particular how teeth transfer energy to food items, and how these mechanical considerations may have shaped the evolution of tooth morphology. We also review advances in technology and new techniques that have allowed more in-depth studies of tooth form and function.
APA, Harvard, Vancouver, ISO, and other styles
14

Baldini, Gianmarco, José L. Hernandez-Ramos, Slawomir Nowak, Ricardo Neisse, and Mateusz Nowak. "Mitigation of Privacy Threats due to Encrypted Traffic Analysis through a Policy-Based Framework and MUD Profiles." Symmetry 12, no. 9 (September 22, 2020): 1576. http://dx.doi.org/10.3390/sym12091576.

Full text
Abstract:
It has been proven in the research literature that the analysis of encrypted traffic with statistical analysis and machine learning can reveal the type of activities performed by a user accessing the network, thus leading to privacy risks. In particular, different types of traffic (e.g., skype, web access) can be identified by extracting time-based features and using them in a classifier. Such privacy attacks are asymmetric because a limited amount of resources (e.g., machine learning algorithms) can extract information from encrypted traffic generated by cryptographic systems implemented with a significant amount of resources. To mitigate privacy risks, studies in the research literature have proposed a number of techniques, but in most cases only a single technique is applied, which can lead to limited effectiveness. This paper proposes a mitigation approach for privacy risks related to the analysis of encrypted traffic which is based on the integration of three main components: (1) a machine learning component which proactively analyzes the encrypted traffic in the network to identify potential privacy threats and evaluate the effectiveness of various mitigation techniques (e.g., obfuscation), (2) a policy-based component where policies are used to enforce privacy mitigation solutions in the network, and (3) a network node profile component based on the Manufacturer Usage Description (MUD) standard to enable changes in the network nodes in the cases where the first two components are not effective in mitigating the privacy risks. This paper describes the different components and how they interact in a potential deployment scenario.
The approach is evaluated on the public dataset ISCXVPN2016, and the results show that the privacy threat can be mitigated significantly by completely removing the identification of specific types of traffic, or by decreasing the probability of their identification, as in the case of VOIP by 50%, Chat by 40% and Browsing by 33%, thus significantly reducing the privacy risk.
APA, Harvard, Vancouver, ISO, and other styles
15

Connolly, John H. "The Contextual Component within a dynamic implementation of the FDG model." Pragmatics. Quarterly Publication of the International Pragmatics Association (IPrA) 24, no. 2 (June 1, 2014): 229–48. http://dx.doi.org/10.1075/prag.24.2.03con.

Full text
Abstract:
The central issue addressed in this paper concerns the design of an appropriate contextual framework to support a dynamic implementation of FDG. The first part of the paper is concerned with the internal structure of the contextual framework. A particular hierarchical structure for the analysis and description of context, articulated in Connolly (2007a) and termed the Extended Model of Context (EMC), is presented as the starting-point. Alternative frameworks are considered, but all are found to have shortcomings. However, the original version of the EMC has also received some criticism. Consequently, a revised model of the EMC is proposed, in which the treatment of context is enhanced, and which is appropriate to a dynamic implementation of FDG. The application of the revised EMC not only to the grammatical model, but also to a broader discourse model, is also discussed. The next part of the paper is concerned with the interaction between the EMC and the FDG Grammatical and Conceptual Components. It is contended that all of the main types of context recognised within the EMC have a significant effect upon grammar. However, the only way in which contextual factors may directly influence the production and interpretation of discourse is through their presence in the minds of the discourse-participants. Consequently, the Conceptual Component plays a vital, mediating role in the handling of interactions between the EMC and the Grammatical Component. This point is particularly salient when considering a dynamic implementation, in which the flow of information around the model is of crucial importance. It is contended that this flow is essentially cyclic in nature.
APA, Harvard, Vancouver, ISO, and other styles
16

OGAWA, YOKO, HIROSHI TOKI, SETSUO TAMENAGA, and AKIHIRO HAGA. "THE ROLE OF PION ON NUCLEI WITH CHARGE AND PARITY PROJECTED CHIRAL MEAN FIELD MODEL." Modern Physics Letters A 23, no. 27n30 (September 30, 2008): 2585–88. http://dx.doi.org/10.1142/s021773230802985x.

Full text
Abstract:
We construct a relativistic framework, the charge and parity projected chiral mean field model, which explicitly takes into account a pion-exchange interaction. We take the chiral σ model Lagrangian for the construction of finite nuclei. We discuss several essential points of this framework with respect to the explicit description of pionic correlations, especially the high-momentum component, in the nuclear ground-state wave function. We apply this framework to several even-even nuclei. We find that the pion contribution brings a significant change to the nuclear ground state, and that the pion plays a role in the formation of the jj-magic structure.
APA, Harvard, Vancouver, ISO, and other styles
17

Agop, Maricel, Tudor-Cristian Petrescu, Dumitru Filipeanu, Claudia Elena Grigoraș-Ichim, Ana Iolanda Voda, Andrei Zala, Lucian Dobreci, Constantin Baciu, and Decebal Vasincu. "Toward Complex Systems Dynamics through Flow Regimes of Multifractal Fluids." Symmetry 13, no. 5 (April 27, 2021): 754. http://dx.doi.org/10.3390/sym13050754.

Full text
Abstract:
In the framework of the Multifractal Theory of Motion, which is expressed by means of the multifractal hydrodynamic model, complex system dynamics are explained through uniform and non-uniform flow regimes of multifractal fluids. Thus, in the case of the uniform flow regime of the multifractal fluid, the dynamics’ description is “supported” only by the differentiable component of the velocity field, the non-differentiable component being null. In the case of the non-uniform flow regime of the multifractal fluid, the dynamics’ description is “supported” by both components of the velocity field, their ratio specifying correlations through homographic transformations. Since these transformations imply metric geometries explained, for example, by means of Killing–Cartan metrics of the SL(2R)-type algebra, of the set of 2 × 2 matrices with real elements, and because these metrics can be “produced” as Cayleyan metrics of absolute geometries, the dynamics’ description is reducible, based on a minimal principle, to harmonic mappings from the usual space to the hyperbolic space. Such a conjecture highlights not only various scenarios of dynamics’ evolution but also the types of interactions “responsible” for these scenarios. Since these types of interactions become fundamental in the self-structuring processes of polymeric-type materials, finally, the theoretical model is calibrated based on the author’s empirical data, which refer to controlled drug release applications.
APA, Harvard, Vancouver, ISO, and other styles
18

Lv, Xin Xin. "Implementation on Data Layer Component for Music Equipment Accounting Treatment System." Applied Mechanics and Materials 543-547 (March 2014): 4569–72. http://dx.doi.org/10.4028/www.scientific.net/amm.543-547.4569.

Full text
Abstract:
With increased national investment in quality education and growing enrollment of music students, colleges and universities need more music equipment, and music equipment management remains one of their persistent challenges. The accounting treatment system is the core part of the music equipment management system, and this paper studies it based on the SQL Server database and the ADO.NET framework. First, the database design, which includes conceptual structure design and logical structure design, is conducted; then, the ADO.NET architecture is examined, including a description of its structural model and components; finally, the implementation of the data layer component is studied, covering the process of creating solutions and the three main processes of implementing the data layer component. The successful use of these research results will improve the development efficiency of the music equipment management accounting treatment system and promote information construction in colleges and universities.
APA, Harvard, Vancouver, ISO, and other styles
19

Gaede, Frank, Markus Frank, Marko Petric, and Andre Sailer. "DD4hep a community driven detector description for HEP." EPJ Web of Conferences 245 (2020): 02004. http://dx.doi.org/10.1051/epjconf/202024502004.

Full text
Abstract:
Detector description is an essential component in the simulation, reconstruction and analysis of data resulting from particle collisions in high energy physics experiments, and in detector development studies for future experiments. Current detector description implementations of running experiments are mostly experiment-specific. DD4hep [1] is an open-source toolkit created in 2012 to serve as a generic detector description solution. The main motivation behind DD4hep is to provide the community with an integrated solution for all these stages and to address detector description in a broad sense, including the geometry and the materials used in the device, as well as additional parameters describing, e.g., the detection techniques, constants required for alignment and calibration, the description of the readout structures, and conditions data. In these proceedings, we give an overview of the project and discuss recent developments in DD4hep, as well as showcase adoptions of the framework by LHC and upcoming accelerator projects, together with the road map of future developments.
APA, Harvard, Vancouver, ISO, and other styles
20

Li, Jianbo, Yiping Yao, Wenjie Tang, and Feng Zhu. "Research on the Development Approach for Reusable Model in Parallel Discrete Event Simulation." Discrete Dynamics in Nature and Society 2015 (2015): 1–11. http://dx.doi.org/10.1155/2015/580328.

Full text
Abstract:
Model reuse is an essential means of meeting the demand for model development in complex simulation. An effective approach to realizing model reusability is to establish a standard model specification comprising an interface specification and a representation specification. By standardizing models' external interfaces, the Reusable Component Model Framework (RCMF) achieves model reusability, acting as an interface specification. However, RCMF models are presently developed only through manual programming. Besides implementing a model's business logic, the modeler must also ensure that the model strictly follows the reusable framework, which is very distracting. In addition, there is no model description information for instructing model reuse or integration. To address these issues, we first explored an XML-based model description file that complements RCMF as the model representation, and then proposed a RCMF model development tool, SuKit. The model description file describes a RCMF model and can be used for regenerating a model and instructing model integration. SuKit can generate a skeleton RCMF model together with a model-customized description file containing the configured information. The modeler then just needs to concentrate on the model's processing logic. The case study indicates that SuKit is well suited to developing RCMF models and that the well-formed description file can be used for model reuse and integration.
APA, Harvard, Vancouver, ISO, and other styles
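An XML-based model description file of the kind the abstract describes could be sketched as below; the element and attribute names (`model`, `port`, `framework="RCMF"`, the `RadarModel` example) are invented for illustration and are not taken from SuKit:

```python
# Hypothetical RCMF-style model description: declare a model's external
# interface (input/output ports) in XML, then parse it back, which is
# the kind of information a tool could use for model reuse/integration.
import xml.etree.ElementTree as ET

description = """\
<model name="RadarModel" framework="RCMF">
  <inputs>
    <port name="targetPosition" type="Vector3"/>
  </inputs>
  <outputs>
    <port name="detection" type="DetectionReport"/>
  </outputs>
</model>
"""

root = ET.fromstring(description)
print(root.get("name"))                          # RadarModel
ports = [p.get("name") for p in root.iter("port")]
print(ports)                                     # ['targetPosition', 'detection']
```

Because the interface lives in a declarative file rather than in code, a skeleton model can be regenerated from it and an integration tool can match ports between models without reading their business logic.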
21

Schut, Peter, Scott Smith, Walter Fraser, Xiaoyuan Geng, and David Kroetsch. "Soil Landscapes of Canada: Building a National Framework for Environmental Information." GEOMATICA 65, no. 3 (September 2011): 293–309. http://dx.doi.org/10.5623/cig2011-045.

Full text
Abstract:
The Soil Landscapes of Canada (SLC) is a national soil map and accompanying database of environmental information for all of Canada, produced and maintained by the Canadian Soil Information Service (CanSIS) which is a part of Agriculture and Agri-Food Canada. The SLC maps were originally published as a set of paper products for individual provinces and regions. The maps were digitized in CanSIS, using one of the first geographic information systems in the world, and linked to soil and landscape attribute tables to serve an evolving variety of spatial modelling applications. The SLCs form the lowest level of the National Ecological Framework for Canada. The latest public release of the SLC is version 3.2, which provides updated soil and landscape information for the agricultural areas of Canada. The SLC v3.2 digital coverage includes an extensive set of relational data tables. The component table lists the soil components in each agricultural polygon along with their predicted dominant slope, class, and extent. The soil component codes are also linked to soil attribute tables which define fundamental soil properties, such as classification and parent material, as well as a description of the soil horizons and key soil attributes to a depth of 100 cm. SLC v3.2 adds a new set of landform tables which identify the major landform type in each polygon and indicates the most likely soil components in the upper, mid slope, lower slope, and depressional positions. These soil and landform attributes are designed to support a wide variety of national and international environmental modelling applications, such as the prediction of soil quality change, soil carbon sequestration, and land productivity for different agricultural crops in response to agricultural policy, land management, and climate change scenarios. 
Future versions of the SLC, currently under development, will have improved spatial resolution and include soils data for areas beyond the present agricultural zone of Canada.
APA, Harvard, Vancouver, ISO, and other styles
22

Beeken, Jeannine. "Const." Schrijven in moedertaal en vreemde taal 40 (January 1, 1991): 169–75. http://dx.doi.org/10.1075/ttwia.40.16bee.

Full text
Abstract:
The proposed research and application programme can be seen as a text-constitutive programme, which enables the user to appeal to computerized information, while planning, formulating and revising a particular writing task. The computerized information is stored in interactive modules, i.e. 1) a CONST-specific thesaurus and an adapted version of the electronic Van DALE 12 dictionary, 2) an algorithmic framework, actualized by a number of question-packages regarding business texts and memos, and 3) a set of corresponding predefined and ad hoc constructed structural descriptions. Important topics of research are the symmetric internal and external electronic communication links inside and between the various modules, as for example the internal link between the two thesauri or the external links between structural description, question-package and thesauri. What can be demonstrated, is a prototypical framework, in which each CONST-component is represented properly and in which the various communication links have been actualized.
APA, Harvard, Vancouver, ISO, and other styles
23

Lin, Xi, Hehua Zhang, and Ming Gu. "OntCheck: An Ontology-Driven Static Correctness Checking Tool for Component-Based Models." Journal of Applied Mathematics 2013 (2013): 1–11. http://dx.doi.org/10.1155/2013/934349.

Full text
Abstract:
Component-based models are widely used for embedded systems. The models consist of components with input and output ports linked to each other. However, mismatched links or assumptions among components may cause many failures, especially for large-scale models. Binding semantic knowledge into models can enable domain-specific checking and help expose modeling errors at an early stage. Ontology is known as the formalization of semantic knowledge. In this paper we propose an ontology-driven tool for static correctness checking of domain-specific errors. Two kinds of important static checking, semantic types and domain-restricted rules, are fulfilled in a unified framework. We first propose a formal way to precisely describe the checking requirements by ontology and then separately check them by a lattice-based constraint solver and a description logic reasoner. Compared with other static checking methods, the proposed ontology-based method is model-externally configurable and thus flexible and adaptable to changes in requirements. The case study demonstrates the effectiveness of our method.
APA, Harvard, Vancouver, ISO, and other styles
24

Jiang, Jingchao, A.-Xing Zhu, Cheng-Zhi Qin, and Junzhi Liu. "A knowledge-based method for the automatic determination of hydrological model structures." Journal of Hydroinformatics 21, no. 6 (September 3, 2019): 1163–78. http://dx.doi.org/10.2166/hydro.2019.029.

Full text
Abstract:
Abstract To determine a suitable hydrological model structure for a specific application context using integrated modelling frameworks, modellers usually need to manually select the required hydrological processes, identify the appropriate algorithm for each process, and couple the algorithms' software components. However, these modelling steps are difficult and require corresponding knowledge. It is not easy for modellers to master all of the required knowledge. To alleviate this problem, a knowledge-based method is proposed to automatically determine hydrological model structures. First, modelling knowledge for process selection, algorithm identification, and component coupling is formalized in the formats of the Rule Markup Language (RuleML) and Resource Description Framework (RDF). Second, the formalized knowledge is applied to an inference engine to determine model structures. The method is applied to three hypothetical experiments and a real experiment. These experiments show how the knowledge-based method could support modellers in determining suitable model structures. The proposed method has the potential to reduce the knowledge burden on modellers and would be conducive to the promotion of integrated modelling frameworks.
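The three modelling steps the abstract formalizes (process selection, algorithm identification, component coupling) can be sketched as chained rule lookups. Everything below is an invented illustration: the rule contents, process names, and variable names are hypothetical stand-ins for the paper's actual RuleML/RDF knowledge base.

```python
# Step 1: rules mapping an application context to required hydrological processes.
PROCESS_RULES = {
    "snow_dominated": ["snowmelt", "infiltration", "runoff_routing"],
    "humid_lowland": ["interception", "infiltration", "runoff_routing"],
}

# Step 2: rules identifying one algorithm per selected process.
ALGORITHM_RULES = {
    "snowmelt": "degree_day",
    "interception": "rutter",
    "infiltration": "green_ampt",
    "runoff_routing": "muskingum",
}

# Step 3: coupling knowledge: each algorithm's required input and produced output.
COUPLING = {
    "degree_day": ("temperature", "meltwater"),
    "rutter": ("rainfall", "throughfall"),
    "green_ampt": ("meltwater", "soil_moisture"),
    "muskingum": ("soil_moisture", "discharge"),
}

def determine_structure(context):
    """Chain the three rule sets into an ordered, couplable model structure."""
    processes = PROCESS_RULES[context]
    algorithms = [ALGORITHM_RULES[p] for p in processes]
    # Verify that each component's output feeds the next component's input.
    for upstream, downstream in zip(algorithms, algorithms[1:]):
        if COUPLING[upstream][1] != COUPLING[downstream][0]:
            raise ValueError(f"cannot couple {upstream} -> {downstream}")
    return algorithms

print(determine_structure("snow_dominated"))
# ['degree_day', 'green_ampt', 'muskingum']
```

An inference engine over formalized knowledge would play the role of `determine_structure` here; the point is that a mismatch (for example, `rutter` producing `throughfall` where `green_ampt` expects `meltwater`) is caught automatically instead of by the modeller.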
APA, Harvard, Vancouver, ISO, and other styles
25

Kuchina, S. A. "Electronic Literary Text in the Framework of Post-Structuralism Textology." Bulletin of Kemerovo State University 21, no. 3 (October 5, 2019): 821–29. http://dx.doi.org/10.21603/2078-8975-2019-21-3-821-829.

Full text
Abstract:
The article features the phenomenon of the electronic literary text. The research objective was to identify the structural and semantic features of electronic literary texts within the framework of post-structuralism. The electronic literary text resulted not only from the development of information technology: it is also the product of the development of the philosophical and linguistic ideas of post-structuralism. The post-structuralist perspective was not repeated exactly on the technological level of electronic text representation. However, post-structuralist text theory was reflected in the structure of the electronic literary text, i.e. its rhizomatic, decentered, fragmentary, intertextual, and simulation character. In particular, the attempt to build cohesion of semiotically diverse components in the electronic environment reflects the post-structuralist idea of chaos and disorganization. The attempt to provide navigation in this multi-component unity by keyword hyperlinks reflects the idea of the total interconnectivity of all structural components, i.e. intertextuality. The phenomenon of intertextuality defines the culmination of decenterment and nomadism in text theory. It is connected with the rhizomatic concept and hypertextuality. The research used electronic literary texts based on Adobe Flash and HTML. The research employed general scientific methods, such as observation and description, in conjunction with the method of comparative linguistic analysis. The author concludes that text perception and immersion in the electronic virtual world become much more important than the artifacts in the electronic literature of the 21st century. The electronic literary text embodies the post-structuralist concept of a new esthetic object: it has lost its integrity and compositional stability and opened itself to external input.
APA, Harvard, Vancouver, ISO, and other styles
26

Mehrabian, Albert, and Catherine A. Stefl. "BASIC TEMPERAMENT COMPONENTS OF LONELINESS, SHYNESS, AND CONFORMITY." Social Behavior and Personality: an international journal 23, no. 3 (January 1, 1995): 253–63. http://dx.doi.org/10.2224/sbp.1995.23.3.253.

Full text
Abstract:
The PAD Temperament Model provides a general framework for personality description and consists of three nearly independent dimensions: Trait Pleasure-displeasure, Trait Arousability, and Trait Dominance-submissiveness. The present study analyzed the Shyness, Loneliness, and Conformity scales within the PAD Model. Shyness resembled trait anxiety and neuroticism and was composed of unpleasant, arousable, and submissive characteristics, with submissiveness the strongest and arousability the weakest components. Loneliness, like depression, was composed of unpleasant and submissive qualities, although submissiveness was a much weaker component of loneliness than of depression. A new Conformity Scale evidenced a strong negative correlation with Trait Dominance and was independent of Trait Pleasure and Trait Arousability. Implications of the findings for clinical intervention were discussed.
APA, Harvard, Vancouver, ISO, and other styles
27

KARHUNEN, JUHA, SIMONA MĂLĂROIU, and MIKA ILMONIEMI. "LOCAL LINEAR INDEPENDENT COMPONENT ANALYSIS BASED ON CLUSTERING." International Journal of Neural Systems 10, no. 06 (December 2000): 439–51. http://dx.doi.org/10.1142/s0129065700000429.

Full text
Abstract:
In standard Independent Component Analysis (ICA), a linear data model is used for a global description of the data. Even though linear ICA yields meaningful results in many cases, it can provide a crude approximation only for general nonlinear data distributions. In this paper a new structure is proposed, where local ICA models are used in connection with a suitable grouping algorithm clustering the data. The clustering part is responsible for an overall coarse nonlinear representation of the data, while linear ICA models of each cluster are used for describing local features of the data. The goal is to represent the data better than in linear ICA while avoiding computational difficulties related to nonlinear ICA. Several data grouping methods are considered, including standard K-means clustering, self-organizing maps, and neural gas. Connections to existing methods are discussed, and experimental results are given for artificial data and natural images. Furthermore, a general theoretical framework encompassing a large number of methods for representing data is introduced. These range from global, dense representation methods to local, very sparse coding methods. The proposed local ICA methods lie between these two extremes.
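The two-stage structure described above, a clustering front end followed by a local linear model in each cluster, can be sketched in plain Python. As a simplification, the per-cluster linear ICA step is replaced here by mean-centering relative to the cluster center; a real implementation would run FastICA (or a similar linear ICA algorithm) on each cluster's centered data.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means: the coarse nonlinear partition of the data."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center (squared Euclidean distance).
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # Recompute centers; keep the old center if a cluster went empty.
        centers = [
            tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers, clusters

def local_models(points, k):
    """Fit a 'local linear model' per cluster (here: centering as a stand-in for ICA)."""
    centers, clusters = kmeans(points, k)
    return [
        [tuple(a - b for a, b in zip(p, c)) for p in cl]
        for c, cl in zip(centers, clusters)
    ]

# Deterministic toy 2-D data set.
data = [(random.Random(i).uniform(0, 1), random.Random(i + 99).uniform(0, 1))
        for i in range(60)]
parts = local_models(data, 3)
print([len(cl) for cl in parts])  # cluster sizes; all 60 points are assigned
```

Swapping the clustering routine (self-organizing maps, neural gas) changes only `kmeans`, which mirrors how the paper treats the grouping method as interchangeable.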
APA, Harvard, Vancouver, ISO, and other styles
28

Radtke, Sarah. "A Conceptual Framework for Clinical Education in Athletic Training." Athletic Training Education Journal 3, no. 2 (April 1, 2008): 36–42. http://dx.doi.org/10.4085/1947-380x-3.2.36.

Full text
Abstract:
Objective: To develop a model for clinical education in athletic training education based on integration of various allied health professional clinical education models. Background: Clinical education is a critical component of allied health education programs. It allows for the transfer of knowledge and skills from classroom to practical application. Clinical education needs to be structured. In addition, the Clinical Instructor (CI) needs to facilitate athletic training students' development into effective, evidence-based practitioners. Description: A brief discussion on the need for transfer of knowledge in athletic training education is provided. A review of the various clinical education models from allied health professional education is presented. Finally, a model for athletic training clinical education is presented with implications for practice. Clinical Advantages: As athletic training education continues to develop, a need to formalize clinical education and develop a clinical education model for athletic training is warranted. Focusing on the structure and function of clinical education will continue to move athletic training education forward and will align athletic training education with other allied health professional education programs.
APA, Harvard, Vancouver, ISO, and other styles
29

Chen, Ang. "Meaningfulness in Physical Education: A Description of High School Students’ Conceptions." Journal of Teaching in Physical Education 17, no. 3 (April 1998): 285–306. http://dx.doi.org/10.1123/jtpe.17.3.285.

Full text
Abstract:
A theoretical framework distinguishing meaning and meaningfulness guided this study of high school students’ conceptions of meaningfulness in physical education. A 9-dimensional meaningfulness construct was developed through analyzing former high school students’ (N = 35) oral reflections on physical education. A 9-dimensional meaningfulness scale was prepared and administered to high school students (N = 698). The principal component analysis reduced the students’ responses to a 6-dimensional construct: Social Bonding, Cultural Appreciation, Challenge, Tension Release, Fitness Development, and Self-Expression. The construct was modified through confirmatory factor analyses and had a Goodness of Fit Index of .91. The reconstruction demonstrated sophisticated internalization of perceived meaning by students. A MANOVA revealed that the students’ conceptions of meaningfulness differed (p < .05) based on gender, grade, and socioeconomic status. The findings suggest that a pluralistic perspective be considered in curriculum design, given the sophistication and differentiation of students’ conceptions of meaningfulness in physical education.
APA, Harvard, Vancouver, ISO, and other styles
30

Taflanidis, Alexandros Angelos, Andrew B. Kennedy, Joannes J. Westerink, Jane McKee Smith, Tracy Kijewski-Correa, and Kwok Fai Cheung. "REAL-TIME ASSESSMENT OF WAVE AND SURGE RISK DUE TO LANDFALLING HURRICANES." Coastal Engineering Proceedings 1, no. 33 (December 15, 2012): 17. http://dx.doi.org/10.9753/icce.v33.management.17.

Full text
Abstract:
In this work, a probabilistic framework is presented for real-time assessment of wave and surge risk for hurricanes approaching landfall. This framework has two fundamental components. The first is the development of a surrogate model for the rapid evaluation of hurricane waves, water levels, and runup based on a small number of parameters describing each hurricane: hurricane landfall location and heading, central pressure, forward speed, and radius of maximum winds. This surrogate model is developed using a response surface methodology fed by information from hundreds of pre-computed, high-fidelity model runs. For a specific set of hurricane parameters (i.e., a specific landfalling hurricane), the surrogate model is able to evaluate the maximum wave height, water level, and runup during the storm at a cost that is more than seven orders of magnitude less than the high fidelity models and thus meet time constraints imposed by emergency managers and decision makers. The second component to this framework is a description of the uncertainty in the parameters used to characterize the hurricane, through appropriate probability models, which then leads to quantification of hurricane-risk in terms of a probabilistic integral. This integral is then efficiently computed using the already established surrogate model by analyzing thousands of different scenarios (based on the aforementioned probabilistic description). Finally, by leveraging the computational simplicity and efficiency of the surrogate model, a simple stand-alone PC-based risk assessment tool is developed that allows non-expert end users to take advantage of the full potential of the framework. An illustrative example is presented that considers applications of these tools for hurricane risk estimation for Oahu. The development of cyber-infrastructure at the University of Notre Dame to further support these initiatives is also discussed.
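The second component of the framework, a probabilistic risk integral evaluated by running a cheap surrogate over thousands of parameter scenarios, can be illustrated with a toy Monte Carlo estimate. The linear "surrogate" function and the parameter distributions below are invented placeholders, not the paper's fitted response surface or its actual hurricane statistics.

```python
import random

def surrogate_max_surge(pressure_deficit_hpa, forward_speed_ms, radius_km):
    """Stand-in response surface: surge (m) from three hurricane parameters."""
    return (0.05 * pressure_deficit_hpa
            + 0.02 * radius_km
            - 0.03 * forward_speed_ms)

def risk_of_exceedance(threshold_m, n=100_000, seed=1):
    """Monte Carlo approximation of P(max surge > threshold) under
    an assumed probabilistic description of the hurricane parameters."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        dp = rng.gauss(60.0, 15.0)      # hPa central pressure deficit
        speed = rng.gauss(6.0, 2.0)     # m/s forward speed
        radius = rng.gauss(40.0, 10.0)  # km radius of maximum winds
        if surrogate_max_surge(dp, speed, radius) > threshold_m:
            hits += 1
    return hits / n

print(risk_of_exceedance(4.0))  # probability that surge exceeds 4 m
```

Because each surrogate call is trivially cheap, the hundred thousand evaluations above run in well under a second, which is the property that lets the real framework meet emergency-management time constraints.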
APA, Harvard, Vancouver, ISO, and other styles
31

Urešová, Zdeňka, Eva Fučíková, Eva Hajičová, and Jan Hajič. "Meaning and Semantic Roles in CzEngClass Lexicon." Journal of Linguistics/Jazykovedný casopis 70, no. 2 (December 1, 2019): 403–11. http://dx.doi.org/10.2478/jazcas-2019-0069.

Full text
Abstract:
Abstract This paper focuses on Semantic Roles, an important component of studies in lexical semantics, as they are captured as part of a bilingual (Czech-English) synonym lexicon called CzEngClass. This lexicon builds upon the existing valency lexicons included within the framework of the annotation of the various Prague Dependency Treebanks. The present analysis of Semantic Roles is being approached from the Functional Generative Description point of view and supported by the textual evidence taken specifically from the Prague Czech-English Dependency Treebank.
APA, Harvard, Vancouver, ISO, and other styles
32

Sneed, Joseph D. "Particle Theories. A Sketch of a Structuralist Reconstruction." Metatheoria – Revista de Filosofía e Historia de la Ciencia 11, no. 1 (October 31, 2020): 33–52. http://dx.doi.org/10.48160/18532330me11.263.

Full text
Abstract:
Particle theories intend to describe the fundamental constituents from which all matter is constructed and the interactions among them. These constituents include atoms and molecules as well as their subatomic constituents, nuclei and their component parts including elementary particles. We consider an alternative to the usual particle theories (PT’s), but dealing with the same phenomena. We call these theories ‘QT’s’. This is an attempt to provide a formal description of the essential features of elementary particle theories within the framework of metatheoretical structuralism.
APA, Harvard, Vancouver, ISO, and other styles
33

Narayana Bhat, Talapady, and John Barkley. "Development of a use Case for Chemical Resource Description Framework for Acquired Immune Deficiency Syndrome Drug Discovery." Open Bioinformatics Journal 2, no. 1 (July 2, 2008): 20–27. http://dx.doi.org/10.2174/1875036200802010020.

Full text
Abstract:
There is considerable interest in RDF (Resource Description Framework) as a data representation standard for the growing information technology needs of drug discovery. Though several efforts towards this goal have been reported, most have focused on text-based data. Structural data on chemicals are a key component of drug discovery, and molecular images may offer certain advantages over text-based representations for them. Here we discuss the steps that we used to develop and search chemical Resource Description Framework (RDF) using text and images for structures relevant to Acquired Immune Deficiency Syndrome (AIDS). These steps are (a) acquisition of the data on drugs, (b) definition of the framework to establish RDF on drugs using commonly asked questions during a drug discovery effort, (c) annotation of the structural data on drugs into RDF using the framework established in step (b), (d) validation of the annotation methods using Semantic Web concepts and tools, (e) design and development of a public Web site to distribute the data, (f) generation and distribution of data using OWL (Web Ontology Language). This paper describes this effort, discusses our observations, and announces the availability of the OWL model at the W3C Web site (http://esw.w3.org/topic/HCLS/ChemicalTaxonomiesUseCase). The style of this paper is chosen so as to cover a broad audience, including structural biologists, medicinal chemists, and information technologists, and may at times appear to state the obvious to certain experts. A full discussion of our method and its comparison to other published methods is beyond the scope of this publication.
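The subject-predicate-object triples at the heart of an RDF store, and the kind of commonly asked drug-discovery question used in step (b), can be sketched as follows. The compound names and predicate vocabulary are illustrative stand-ins and are not taken from the paper's actual OWL model.

```python
# A tiny in-memory triple store. Real RDF would use URIs and a library
# such as rdflib; plain tuples keep the idea visible.
TRIPLES = [
    ("zidovudine", "rdf:type", "chem:NucleosideAnalog"),
    ("zidovudine", "chem:inhibits", "hiv:ReverseTranscriptase"),
    ("ritonavir", "rdf:type", "chem:ProteaseInhibitor"),
    ("ritonavir", "chem:inhibits", "hiv:Protease"),
]

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None = wildcard),
    mimicking a basic SPARQL triple pattern."""
    return [
        t for t in TRIPLES
        if (subject is None or t[0] == subject)
        and (predicate is None or t[1] == predicate)
        and (obj is None or t[2] == obj)
    ]

# "Which compounds inhibit an HIV target?" -- a typical drug-discovery question.
inhibitors = [s for s, _, o in query(predicate="chem:inhibits")]
print(inhibitors)
```

The framework in step (b) amounts to fixing this predicate vocabulary so that each commonly asked question maps onto one triple pattern.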
APA, Harvard, Vancouver, ISO, and other styles
34

BRODA, BOGUSLAW. "QUANTUM THEORY OF NON-ABELIAN DIFFERENTIAL FORMS AND LINK POLYNOMIALS." Modern Physics Letters A 09, no. 07 (March 7, 1994): 609–21. http://dx.doi.org/10.1142/s0217732394003841.

Full text
Abstract:
A topological quantum field theory of non-Abelian differential forms is investigated from the point of view of its possible applications to description of polynomial invariants of higher-dimensional two-component links. A path-integral representation of the partition function of the theory, which is a highly on-shell reducible system, is obtained in the framework of the antibracket-antifield formalism of Batalin and Vilkovisky. The quasi-monodromy matrix, giving rise to corresponding skein relations, is formally derived in a manifestly covariant non-perturbative manner.
APA, Harvard, Vancouver, ISO, and other styles
35

Nowicki, Adam, Michał Grochowski, and Kazimierz Duzinkiewicz. "Data-driven models for fault detection using kernel PCA: A water distribution system case study." International Journal of Applied Mathematics and Computer Science 22, no. 4 (December 28, 2012): 939–49. http://dx.doi.org/10.2478/v10006-012-0070-1.

Full text
Abstract:
Kernel Principal Component Analysis (KPCA), an example of machine learning, can be considered a non-linear extension of the PCA method. While various applications of KPCA are known, this paper explores the possibility of using it for building a data-driven model of a non-linear system: the water distribution system of the town of Chojnice (Poland). This model is utilised for fault detection with an emphasis on water leakage detection. A systematic description of the system’s framework is followed by an evaluation of its performance. Simulations prove that the presented approach is both flexible and efficient.
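The KPCA front end such a data-driven model relies on can be sketched in a few lines: compute an RBF kernel matrix over training samples and double-center it in feature space. A full detector would then take the top eigenvectors of the centered matrix and monitor a reconstruction-error statistic (e.g. SPE) against a threshold learned from fault-free data; the kernel width and toy data below are assumptions for illustration.

```python
import math

def rbf_kernel_matrix(X, gamma=1.0):
    """Pairwise RBF (Gaussian) kernel values k(x_i, x_j) = exp(-gamma * ||x_i - x_j||^2)."""
    n = len(X)
    return [[math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
             for j in range(n)] for i in range(n)]

def center_kernel(K):
    """Double-center K so it corresponds to mean-centered feature-space vectors."""
    n = len(K)
    row = [sum(r) / n for r in K]
    total = sum(row) / n
    return [[K[i][j] - row[i] - row[j] + total for j in range(n)]
            for i in range(n)]

# Toy 2-D measurements (e.g. pressure/flow readings from two sensors).
X = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0)]
Kc = center_kernel(rbf_kernel_matrix(X))
# The trace of the centered kernel equals the total feature-space variance.
print(round(sum(Kc[i][i] for i in range(len(X))), 3))
```

Every row (and column) of the centered kernel sums to zero, which is the property the eigendecomposition step of KPCA depends on.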
APA, Harvard, Vancouver, ISO, and other styles
36

Shafighi, Nesa, and Babak Shirazi. "Ontological Map of Service Oriented Architecture Based on Zachman." Research in Economics and Management 2, no. 4 (July 18, 2017): 33. http://dx.doi.org/10.22158/rem.v2n4p33.

Full text
Abstract:
Service orientation is an approach in the field of enterprise architecture, business information systems, and software applications whose main element is the service. Shared services is an organizational model for sharing across an organization; it enables collaboration among functions/departments. The main motivations for shared services are sharing, promoting efficiency, reducing cost, and supporting scalability. Despite the widespread use of these two approaches in information technology, there is no tool to optimize their management. The aim of this study is an ontological map of service-oriented architecture based on the Zachman framework, adapting it to the reference enterprise architecture framework through the implementation of ontology views in the System Architect software, as well as mapping ontology components to equivalent UML diagrams. After the implementation of the suggested model, the results showed that an ontology is a formal description and explicit representation of objects, concepts, and other entities and the relationships between them. In other words, it is a model that describes everything that exists in fact in a language understandable to the system. Thus the proposed approach establishes associations among all aspects of the Zachman framework, creates a clear description of business concepts in the management of shared services, and is effective in providing a unified platform for enterprise modeling.
APA, Harvard, Vancouver, ISO, and other styles
37

Kedrin, V. S., and A. V. Rodyukov. "System technologies for the formation of a data control contour for the personal account of an applicant based on the 1C:Enterprise 8.3 platform." Informatics and education, no. 2 (April 27, 2021): 12–23. http://dx.doi.org/10.32517/0234-0453-2021-36-2-12-23.

Full text
Abstract:
The article considers current information technologies for organizing a component-based distributed system for the distance enrollment of applicants within the framework of the 1C:Enterprise 8.3 platform. The concept of a modified component architecture for interaction with a web contour is proposed, which implements a module for dynamically designing the web contour interface, as well as automated data-management functionality directly on standard 1C software products. System principles for organizing the contour of the website “Personal account of an applicant” have been formulated; they allow management of the web contour to be integrated into the standard software solutions both for higher education (1C:University PROF) and for secondary vocational education (1C:College PROF) in terms of mechanisms for dynamically designing interfaces and the contour for processing site users’ data. The implemented contour for designing site interfaces allows the components of the site’s web forms to be changed dynamically, as well as defining the details displayed in the web user interface. A description is given of the mechanisms of dynamic interaction between the interfaces of the “Personal account of an applicant” site and the dynamic data-management contour within the framework of the 1C:Enterprise 8.3 platform. The information components of the interaction contour of the site interfaces have been determined. The elements of the site control contour and their purpose have been specified. A conceptual universal scheme for developing a web-based data-moderation contour has been formulated, and technologies for interaction with an internal accounting system for automating an admission campaign have been determined.
APA, Harvard, Vancouver, ISO, and other styles
38

Norton, Alexander J., Peter J. Rayner, Ernest N. Koffi, and Marko Scholze. "Assimilating solar-induced chlorophyll fluorescence into the terrestrial biosphere model BETHY-SCOPE v1.0: model description and information content." Geoscientific Model Development 11, no. 4 (April 17, 2018): 1517–36. http://dx.doi.org/10.5194/gmd-11-1517-2018.

Full text
Abstract:
Abstract. The synthesis of model and observational information using data assimilation can improve our understanding of the terrestrial carbon cycle, a key component of the Earth's climate–carbon system. Here we provide a data assimilation framework for combining observations of solar-induced chlorophyll fluorescence (SIF) and a process-based model to improve estimates of terrestrial carbon uptake or gross primary production (GPP). We then quantify and assess the constraint SIF provides on the uncertainty in global GPP through model process parameters in an error propagation study. By incorporating 1 year of SIF observations from the GOSAT satellite, we find that the parametric uncertainty in global annual GPP is reduced by 73 % from ±19.0 to ±5.2 Pg C yr−1. This improvement is achieved through strong constraint of leaf growth processes and weak to moderate constraint of physiological parameters. We also find that the inclusion of uncertainty in shortwave down-radiation forcing has a net-zero effect on uncertainty in GPP when incorporated into the SIF assimilation framework. This study demonstrates the powerful capacity of SIF to reduce uncertainties in process-based model estimates of GPP and the potential for improving our predictive capability of this uncertain carbon flux.
APA, Harvard, Vancouver, ISO, and other styles
39

Benavides Segura, Bianchinetta. "El tratamiento completo: alternativa para la traducción del verso inserto en prosa no literaria." LETRAS, no. 39 (January 30, 2006): 183–203. http://dx.doi.org/10.15359/rl.1-39.9.

Full text
Abstract:
Se describen aspectos teóricos y la aplicación del denominado tratamiento completo, en la traducción de fragmentos poéticos. Se examinan los potenciales efectos en el texto terminal, a partir de un cuidadoso análisis del original; por tanto, el componente cultural cobra importancia y valor metodológicos. Se profundiza en los rasgos particulares del discurso poético como género, desde los cuales se extraen propuestas de trabajo (doce pasos) y varias soluciones traductológicas. A description is provided of the theoretical framework and application of the so-called complete treatment, in the translation of poetic fragments. Their potential effects on the target text are examined, based on a careful analysis of the source text, thus giving the cultural component methodological importance and value. Specific aspects of poetry as a genre are discussed in depth, as a basis for the proposal of twelve steps to follow and for different solutions for translation problems.
APA, Harvard, Vancouver, ISO, and other styles
40

Hu, Jie. "The Aircraft Airworthiness and Safety Standards Analysis." Applied Mechanics and Materials 533 (February 2014): 371–74. http://dx.doi.org/10.4028/www.scientific.net/amm.533.371.

Full text
Abstract:
The safety of the aircraft is assured by the airworthiness conditions laid down by national and international bodies. This chapter starts with a brief description of the framework in which these regulations are managed. The airworthiness regulations govern the ethos of the design. It is important to understand these aspects prior to the detailed consideration of aircraft component design as they may present constraints to the layout and performance of the aircraft. Although airworthiness is described under the separate headings of structural integrity, system integrity, operation integrity and crashworthiness there is considerable interdependence involved in the overall aircraft configuration.
APA, Harvard, Vancouver, ISO, and other styles
41

Nunez, Paul L. "Toward a quantitative description of large-scale neocortical dynamic function and EEG." Behavioral and Brain Sciences 23, no. 3 (April 2000): 371–98. http://dx.doi.org/10.1017/s0140525x00003253.

Full text
Abstract:
A general conceptual framework for large-scale neocortical dynamics based on data from many laboratories is applied to a variety of experimental designs, spatial scales, and brain states. Partly distinct, but interacting local processes (e.g., neural networks) arise from functional segregation. Global processes arise from functional integration and can facilitate (top down) synchronous activity in remote cell groups that function simultaneously at several different spatial scales. Simultaneous local processes may help drive (bottom up) macroscopic global dynamics observed with electroencephalography (EEG) or magnetoencephalography (MEG). A local/global dynamic theory that is consistent with EEG data and the proposed conceptual framework is outlined. This theory is neutral about properties of neural networks embedded in macroscopic fields, but its global component makes several qualitative and semiquantitative predictions about EEG measures of traveling and standing wave phenomena. A more general “metatheory” suggests what large-scale quantitative theories of neocortical dynamics may be like when more accurate treatment of local and nonlinear effects is achieved. The theory describes the dynamics of excitatory and inhibitory synaptic action fields. EEG and MEG provide large-scale estimates of modulation of these synaptic fields around background levels. Brain states are determined by neuromodulatory control parameters. Purely local states are dominated by local feedback gains and rise and decay times of postsynaptic potentials. Dominant local frequencies vary with brain region. Other states are purely global, with moderate to high coherence over large distances. Multiple global mode frequencies arise from a combination of delays in corticocortical axons and neocortical boundary conditions. Global frequencies are identical in all cortical regions, but most states involve dynamic interactions between local networks and the global system.
EEG frequencies may involve a “matching” of local resonant frequencies with one or more of the many, closely spaced global frequencies.
APA, Harvard, Vancouver, ISO, and other styles
42

Song, Cheeyang, and Eunsook Cho. "An Integrated Design Method for SOA-Based Business Modeling and Software Modeling." International Journal of Software Engineering and Knowledge Engineering 26, no. 02 (March 2016): 347–77. http://dx.doi.org/10.1142/s0218194016500157.

Full text
Abstract:
Service-oriented architecture (SOA)-based system development requires a systematic integration technique for software modeling and business modeling methods that approaches the implementation component from the perspective of a business service. We propose an integrated design method (architecture, metamodel, framework, process) for the integration of component software modeling in business process modeling notation (BPMN) business modeling to service-oriented modeling based on model-driven architecture (MDA) and model view controller (MVC) patterns according to SOA. The integrated architecture is composed of a metamodel and a process framework. The integrated metamodel is mapped to the core modeling elements of the SOA-based extended layered (XL)-BPMN/business process execution language (BPEL)/web service description language (WSDL)/component models, and the conversion profile is defined. For the establishment of the integrated process between business and software modeling, the framework is first defined; using this framework, we apply MDA (CIM: Computation Independent Modeling, PIM: Platform Independent Modeling, PSM: Platform Specific Modeling) and MVC patterns to define the integrated modeling process for the three development phases. The proposed modeling process was applied to the design of an online shopping mall system (OSMS). The design models were described on the basis of MDA/MVC according to the layered modeling elements defined in the individual/integrated metamodel and the three modeling phases of the integrated modeling process. The case study demonstrated that the conversion modeling task maintains consistency and practicality between SOA-based business and software modeling. The use of this method will make consistent conversion modeling between business and software convenient with a service orientation, will make it easy to change a business process, and will maximize the number of established reuse models.
APA, Harvard, Vancouver, ISO, and other styles
43

LuperFoy, Susann. "Machine interpretation of bilingual dialogue." Interpreting. International Journal of Research and Practice in Interpreting 1, no. 2 (January 1, 1996): 213–33. http://dx.doi.org/10.1075/intp.1.2.03lup.

Full text
Abstract:
This paper examines the role of the dialogue manager component of a machine interpreter. It is a report on one project to design the discourse module for such a voice-to-voice machine translation (MT) system known as the Interpreting Telephone. The theoretical discourse framework that underlies the proposed dialogue manager supports the job of extracting and collecting information from the context, and facilitating human-machine language interaction in a multi-user environment. Empirical support for the dialogue theory and the implementation described herein comes from an observational study of one human interpreter engaged in a three-way, bilingual telephone conversation. We begin with a brief description of the interpreting telephone research endeavor, then examine the discourse requirements of such a language-processing system, and finally report on the application of the discourse processing framework to this voice-to-voice machine translation task.
APA, Harvard, Vancouver, ISO, and other styles
44

Egerton, Thorlene, Rana S. Hinman, David J. Hunter, Jocelyn L. Bowden, Philippa J. A. Nicolson, Lou Atkins, Marie Pirotta, and Kim L. Bennell. "PARTNER: a service delivery model to implement optimal primary care management of people with knee osteoarthritis: description of development." BMJ Open 10, no. 10 (October 2020): e040423. http://dx.doi.org/10.1136/bmjopen-2020-040423.

Full text
Abstract:
Objective: Implementation strategies, such as new models of service delivery, are needed to address evidence practice gaps. This paper describes the process of developing and operationalising a new model of service delivery to implement recommended care for people with knee osteoarthritis (OA) in a primary care setting. Methods: Three development stages occurred concurrently and iteratively. Each stage considered the healthcare context and was informed by stakeholder input. Stage 1 involved the design of a new model of service delivery (PARTNER). Stage 2 developed a behavioural change intervention targeting general practitioners (GPs) using the behavioural change wheel framework. In stage 3, the ‘Care Support Team’ component of the service delivery model was operationalised. Results: The focus of PARTNER is to provide patients with education, exercise and/or weight loss advice, and facilitate effective self-management through behavioural change support. Stage 1 model design: based on clinical practice guidelines, known evidence practice gaps in current care, chronic disease management frameworks, input from stakeholders and the opportunities and constraints afforded by the Australian primary care context, we developed the PARTNER service-delivery model. The key components are: (1) an effective GP consultation and (2) follow-up and ongoing care provided remotely (telephone/email/online resources) by a ‘Care Support Team’. Stage 2 GP behavioural change intervention: a multimodal behavioural change intervention was developed comprising a self-audit/feedback activity, online professional development and desktop software to provide decision support, patient information resources and a referral mechanism to the ‘Care Support Team’.
Stage 3 operationalising the ‘Care Support Team’: staff recruited and trained in evidence-based knee OA management and behavioural change methodology. Conclusion: The PARTNER model is the result of a comprehensive implementation strategy development process using evidence, behavioural change theory and intervention development guidelines. Technologies for scalable delivery were harnessed and new primary evidence was generated as part of the process. Trial registration number: ACTRN12617001595303 (UTN U1111-1197-4809)
APA, Harvard, Vancouver, ISO, and other styles
45

Fried, Eliot, and Michel Jabbour. "Dynamical equations for the contact line of an evaporating or condensing sessile drop." Journal of Fluid Mechanics 703 (June 13, 2012): 204–37. http://dx.doi.org/10.1017/jfm.2012.209.

Full text
Abstract:
The equations that govern, away from equilibrium and accounting for dissipation, the evolution of the contact line of an evaporating or condensing sessile drop on a rigid, planar substrate are derived. Aside from the normal and tangential components of the standard (Newtonian) force balance, these include a configurational force balance. At equilibrium, the normal component of the standard force balance reduces to the modified Young’s equation first mentioned by Gibbs. The remaining balances are purely dissipative and hence are vacuous in equilibrium. A complete description of contact-line dynamics generally involves all three equations. The theory is embedded in a thermodynamic framework that ensures consistency of all constitutive relations with the second law. In the linearly dissipative case, these involve six contact-line viscosities. When viscous coupling is neglected, only three viscosities remain. One is associated with stretching of the fluid along the contact line. The remaining two are related to dissipation that accompanies mass transfer between liquid and vapour phases during evaporation or condensation.
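For reference, the modified Young equation attributed to Gibbs, which the abstract says the equilibrium normal force balance reduces to, is commonly written with a line-tension correction (a standard textbook form assuming a circular contact line of radius r and line tension τ; the paper's exact expression may differ):

```latex
% Young's equation with Gibbs's line-tension correction:
% \gamma_{SV}, \gamma_{SL}, \gamma_{LV} are the solid-vapour, solid-liquid
% and liquid-vapour interfacial tensions, \theta the contact angle,
% \tau the line tension and r the contact-line radius.
\gamma_{SV} - \gamma_{SL} = \gamma_{LV}\cos\theta + \frac{\tau}{r}
```

Setting τ = 0 recovers the classical Young equation for the equilibrium contact angle.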
APA, Harvard, Vancouver, ISO, and other styles
46

HENSERUK, HALYNA, and MARTYNIUK SERHII. "METHODICAL COMPONENT OF THE SYSTEM OF DEVELOPMENT OF DIGITAL COMPETENCE OF FUTURE TEACHERS OF THE HUMANITARIAN PROFILE." Scientific Issues of Ternopil Volodymyr Hnatiuk National Pedagogical University. Series: pedagogy 1, no. 1 (July 7, 2021): 123–31. http://dx.doi.org/10.25128/2415-3605.21.1.15.

Full text
Abstract:
The development of digital competence of new generation teachers is a priority of a modern higher education institution. It is important in this context to develop a methodological component of the system of digital competence development of future teachers of humanities, which will determine the goals, content, methods, forms and means of digital competence development. Therefore, the aim of the article is to substantiate the methodological component of the model of development of digital competence of future teachers of humanities. The study analyzes the international frameworks of digital competence for citizens, in particular the DigCompEdu digital competence framework and the UNESCO standard “UNESCO ICT Competency Framework for Teachers. Version 3”, which reflects the latest technological and pedagogical advances in the use of ICT in education. It is important to describe the digital competence of the pedagogical worker, which includes the requirements for a humanities teacher and a set of skills to acquire, as well as the levels of digital competence of a pedagogical worker. A new Digital Education Action Plan (2021–2027) has been substantiated, outlining the European Commission's vision for high-quality, inclusive and accessible digital education in Europe. In the context of the study, the analysis of the Concept of development of digital competencies adopted in Ukraine in 2020, the ways and means of solving the problems outlined in the document, and the expected results are very important. The action plan for the implementation of the Concept of development of digital competencies includes regulatory, scientific and methodological, and information support, deadlines and performance indicators.
The described methodical component of the system of digital competence development of future teachers of humanities is developed in accordance with the European frameworks of digital competences, the concept of digital competence development and the description of digital competence of a pedagogical worker. This system is focused on application in the system of higher education. The methodological component of the system of development of digital competence of future teachers of humanities, which is carried out in the digital educational environment of Ternopil Volodymyr Hnatiuk National Pedagogical University, has also been substantiated.
APA, Harvard, Vancouver, ISO, and other styles
47

Sun, Xiu Juan, Chuan Jiang Wang, and Jian Wu. "Forward Kinematics Analysis and Experiment of Small-Caliber Well Rescue System." Advanced Materials Research 945-949 (June 2014): 1426–29. http://dx.doi.org/10.4028/www.scientific.net/amr.945-949.1426.

Full text
Abstract:
For the newly designed small-caliber well rescue system for a person who has fallen into a deep well, we give a detailed description of the rescue mechanism, which is composed of a framework, an anchorage set, a stretching arm and a clasp arm, and illustrate the function of each mechanical component. Moreover, we derive the forward kinematics matrices of the clasp arm and the stretching arm, and analyze and provide the parameters of each degree of freedom. Meanwhile, an experimental prototype of the rescue system was designed and a simulated rescue experiment with a doll was performed. The experimental results show that the rescue system is feasible for small-caliber deep-well rescue.
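Forward kinematics of the kind mentioned here is usually obtained by chaining per-joint homogeneous transforms. A minimal sketch follows; the Denavit-Hartenberg parameters, joint count and link lengths are invented for a two-joint planar example and are not the paper's clasp-arm or stretching-arm geometry:

```python
import math

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform (4x4 nested lists)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_params):
    """Chain the per-joint transforms; returns the base-to-end-effector transform."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for params in dh_params:
        T = mat_mul(T, dh_transform(*params))
    return T

# Two revolute joints stretched out in a plane (theta = 0 for both),
# with hypothetical link lengths 0.3 m and 0.2 m.
T = forward_kinematics([(0.0, 0.0, 0.3, 0.0), (0.0, 0.0, 0.2, 0.0)])
# With the arm straight, the end-effector x position is the sum of the link lengths.
```

The translation column of T (entries T[0][3], T[1][3], T[2][3]) gives the end-effector position; the upper-left 3x3 block is its orientation.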
APA, Harvard, Vancouver, ISO, and other styles
48

Zhang, Li, and Gang Zhang. "The Commercial Bank Card System Based on SOA." Advanced Engineering Forum 4 (June 2012): 172–76. http://dx.doi.org/10.4028/www.scientific.net/aef.4.172.

Full text
Abstract:
SOA has been an effective technical solution for enterprise application integration and business process rebuilding. The paper presents a reference implementation scheme for a Commercial Bank Card System based on SOA. In the requirement tier, the implementation plan first establishes the full concept model and logical model of the Commercial Bank Card System, from business request (business use case - business step - atomic business) to system request (system use case - system step - atomic component); then, in the semantics tier, we give a full description of the model through the combination of OWL and BPEL. At last, an SOA three-tier framework of “requirement-semantics-service” has been built for the Commercial Bank Card System.
APA, Harvard, Vancouver, ISO, and other styles
49

Zelenikin, Aleksandr Y. "Children’s Academy of Fractal Geometry as a Means of Forming Design and Research Competence of High School Students." Volga Region Pedagogical Search 2, no. 36 (2021): 115–21. http://dx.doi.org/10.33065/2307-1052-2021-2-36-115-121.

Full text
Abstract:
The article proposes the structure and content of the project of a children’s academy in the framework of additional mathematical education in «Fractal geometry for senior students». The system of building the educational process in the children’s academy is described, and a table of thematic planning for one academic year is given. The purpose of this article is to present a new form of the educational process with the use of design and research activities in the additional education of high school students through fractal geometry, for implementation in educational organizations. The objectives of the article are to scientifically substantiate the need to include the humanitarian component in teaching fractal geometry; to study the theoretical foundations of fractal geometry in order to use it in additional mathematical education in the organization of design and research activities; to describe the types of organization of mathematical activities and identify methodological techniques for students; and to develop a methodological apparatus and describe the design and research form of education.
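As an illustration of the kind of construction a fractal-geometry elective might have students program (the Koch curve is a standard first example; the article does not specify which fractals the academy covers):

```python
import math

def koch_curve(p0, p1, depth):
    """Recursively replace each segment with four shorter ones; returns the point list."""
    if depth == 0:
        return [p0, p1]
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = (x1 - x0) / 3.0, (y1 - y0) / 3.0
    a = (x0 + dx, y0 + dy)            # end of the first third
    b = (x0 + 2 * dx, y0 + 2 * dy)    # start of the last third
    # Apex of the bump: the middle third rotated by 60 degrees about point a.
    s60 = math.sin(math.pi / 3)
    peak = (a[0] + 0.5 * dx - s60 * dy,
            a[1] + 0.5 * dy + s60 * dx)
    pts = []
    for s, e in ((p0, a), (a, peak), (peak, b), (b, p1)):
        seg = koch_curve(s, e, depth - 1)
        pts.extend(seg[:-1])  # drop the shared endpoint to avoid duplicates
    pts.append(p1)
    return pts

points = koch_curve((0.0, 0.0), (1.0, 0.0), 3)
# After n iterations one segment becomes 4**n segments, i.e. 4**n + 1 points,
# which motivates the similarity dimension log(4)/log(3) ≈ 1.26.
```

Students can plot the returned points to see the curve emerge, and compare the segment count growth 4**n against the length shrink 3**n to derive the fractal dimension.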
APA, Harvard, Vancouver, ISO, and other styles
50

de Figueiredo, Leandro Passos, Dario Grana, Mauro Roisenberg, and Bruno B. Rodrigues. "Gaussian mixture Markov chain Monte Carlo method for linear seismic inversion." GEOPHYSICS 84, no. 3 (May 1, 2019): R463—R476. http://dx.doi.org/10.1190/geo2018-0529.1.

Full text
Abstract:
We have developed a Markov chain Monte Carlo (MCMC) method for joint inversion of seismic data for the prediction of facies and elastic properties. The solution of the inverse problem is defined by the Bayesian posterior distribution of the properties of interest. The prior distribution is a Gaussian mixture model, and each component is associated to a potential configuration of the facies sequence along the seismic trace. The low frequency is incorporated by using facies-dependent depositional trend models for the prior means of the elastic properties in each facies. The posterior distribution is also a Gaussian mixture, for which the Gaussian component can be analytically computed. However, due to the high number of components of the mixture, i.e., the large number of facies configurations, the computation of the full posterior distribution is impractical. Our Gaussian mixture MCMC method allows for the calculation of the full posterior distribution by sampling the facies configurations according to the acceptance/rejection probability. The novelty of the method is the use of an MCMC framework with multimodal distributions for the description of the model properties and the facies along the entire seismic trace. Our method is tested on synthetic seismic data, applied to real seismic data, and validated using a well test.
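The acceptance/rejection sampling described in this abstract is the core of any Metropolis-style MCMC. Below is a minimal one-dimensional sketch of random-walk Metropolis targeting a multimodal Gaussian-mixture density; it is purely illustrative (the mixture weights, means and scalar state are invented), whereas the paper's sampler explores facies configurations along an entire seismic trace:

```python
import math
import random

def log_mixture_pdf(x, weights, means, stds):
    """Log density of a 1-D Gaussian mixture (a stand-in multimodal target)."""
    total = 0.0
    for w, m, s in zip(weights, means, stds):
        total += w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    return math.log(total)

def metropolis(log_pdf, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis: the acceptance/rejection rule makes the chain
    sample from the distribution with density exp(log_pdf)."""
    rng = random.Random(seed)
    x, lp = x0, log_pdf(x0)
    samples = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, step)          # symmetric proposal
        lpp = log_pdf(xp)
        if math.log(rng.random()) < lpp - lp:  # accept with prob min(1, ratio)
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Two well-separated modes at -1 and +1 (hypothetical values).
target = lambda x: log_mixture_pdf(x, [0.5, 0.5], [-1.0, 1.0], [0.6, 0.6])
chain = metropolis(target, 0.0, 20000)
```

With enough steps the chain visits both modes, which is the behaviour the abstract relies on when sampling over the many facies configurations of the mixture posterior.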
APA, Harvard, Vancouver, ISO, and other styles
