Academic literature on the topic 'Ontology language standard'

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Ontology language standard.'


Journal articles on the topic "Ontology language standard"

1

Pease, Adam, and Ian Niles. "IEEE Standard Upper Ontology: A Progress Report." Knowledge Engineering Review 17, no. 1 (2002): 65–70. http://dx.doi.org/10.1017/s0269888902000395.

Abstract:
The IEEE Standard Upper Ontology (IEEE, 2001) is an effort to create a large, general-purpose, formal ontology. The ontology will be an open standard that can be reused for both academic and commercial purposes without fee, and it will be designed to support additional domain-specific ontologies. The effort is targeted for use in automated inference, semantic interoperability between heterogeneous information systems and natural language processing applications. The effort was begun in May 2000 with an e-mail discussion list, and since then there have been over 6000 e-mail messages among 170 subscribers. These subscribers include representatives from government, academia and industry in various countries. The effort was officially approved as an IEEE standards project in December 2000. Recently a successful workshop was held at IJCAI 2001 to discuss progress and proposals for this project (IJCAI, 2001).
2

Bassiliades, Nick. "A Tool for Transforming Semantic Web Rule Language to SPARQL Inferencing Notation." International Journal on Semantic Web and Information Systems 16, no. 1 (2020): 87–115. http://dx.doi.org/10.4018/ijswis.2020010105.

Abstract:
Semantic web rule language (SWRL) combines web ontology language (OWL) ontologies with horn logic rules of the rule markup language (RuleML) family. Being supported by ontology editors, rule engines and ontology reasoners, it has become a very popular choice for developing rule-based applications on top of ontologies. However, SWRL is probably not going to become a WWW Consortium standard, prohibiting industrial acceptance. On the other hand, SPARQL Inferencing Notation (SPIN) has become a de-facto industry standard to represent SPARQL rules and constraints on semantic web models, building on the widespread acceptance of SPARQL (SPARQL Protocol and RDF Query Language). In this article, we argue that the life of existing SWRL rule-based ontology applications can be prolonged by converting them to SPIN. To this end, we have developed the SWRL2SPIN tool in Prolog that transforms SWRL rules into SPIN rules, considering the object-orientation of SPIN, i.e. linking rules to the appropriate ontology classes and optimizing them, as derived by analysing the rule conditions.
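SPIN represents a rule as a SPARQL CONSTRUCT query, so the core of a SWRL-to-SPIN translation can be pictured as rewriting the rule's body into a WHERE clause and its head into a CONSTRUCT template. The sketch below is a toy illustration of that idea, not SWRL2SPIN's actual code or API; prefix handling, built-ins, and SPIN's object-orientation are all omitted, and the atom encoding is invented:

```python
# Toy sketch: rewrite a SWRL-style Horn rule into a SPIN-style
# SPARQL CONSTRUCT template. Illustrative only, not SWRL2SPIN's API.

def atom_to_pattern(atom):
    """Render one rule atom as a SPARQL triple pattern."""
    pred, subj, obj = atom
    if pred == "class":                    # C(?x)  ->  ?x rdf:type C .
        return f"?{subj} rdf:type {obj} ."
    return f"?{subj} {pred} ?{obj} ."      # p(?x, ?y)

def swrl_to_spin(body, head):
    """Body atoms become the WHERE clause, head atoms the CONSTRUCT
    template -- the way SPIN encodes a rule as a SPARQL query."""
    construct = "\n  ".join(atom_to_pattern(a) for a in head)
    where = "\n  ".join(atom_to_pattern(a) for a in body)
    return f"CONSTRUCT {{\n  {construct}\n}}\nWHERE {{\n  {where}\n}}"

# hasParent(?x,?y) ^ hasBrother(?y,?z) -> hasUncle(?x,?z)
rule = swrl_to_spin(
    body=[("hasParent", "x", "y"), ("hasBrother", "y", "z")],
    head=[("hasUncle", "x", "z")])
print(rule)
```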
3

Zhang, Tie Feng, Shu Juan Han, and Jian Wei Gu. "Modeling and Mapping Implementation of Substation Knowledge Ontology Based on Protégé." Applied Mechanics and Materials 198-199 (September 2012): 786–89. http://dx.doi.org/10.4028/www.scientific.net/amm.198-199.786.

Abstract:
Based on the basic knowledge of ontology and Protégé, and on the deficiency of semantic expression in the IEC 61850 and IEC 61970 standards, this paper puts forward a mapping method from SCL to CIM, adopting the Web Ontology Language (OWL) to build the semantic information model of SCL and CIM for a substation knowledge ontology. In the substation model, this mapping method could solve the problem of information sharing and interoperation between a digitized substation and the dispatch master station, and lay a foundation for further research on fusion of the two standards.
4

Wang, Gang, Jie Lin, Qing Qi Long, and Zhi Juan Hu. "OWL-Based Description for Agent." Advanced Materials Research 217-218 (March 2011): 1218–23. http://dx.doi.org/10.4028/www.scientific.net/amr.217-218.1218.

Abstract:
This paper presents a detailed formal specification of agents and their properties and abilities, based on the Web Ontology Language (OWL). It allows an agent to be specified entirely using standard mark-up languages from the Semantic Web community, namely RDF, RDF Schema, and OWL. The basic agent components are identified and their implementation using ontology development tools is described. The description improves the consistency, interoperability, and maintainability of agent programs. Therefore, design errors in the early development stages can be efficiently detected and avoided.
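The idea of specifying agent components as Semantic Web statements can be pictured with plain triples. The sketch below is a minimal, hypothetical analogue (the ex: vocabulary is invented, and real OWL tooling is replaced by two trivial queries), hinting at how an ontology-backed description lets a design error such as an undeclared ability be caught early:

```python
# Minimal triple-based description of an agent, in the spirit of an
# OWL/RDF agent specification. The ex: vocabulary is invented.
triples = {
    ("ex:broker", "rdf:type", "ex:Agent"),
    ("ex:broker", "ex:hasAbility", "ex:Negotiation"),
    ("ex:broker", "ex:hasAbility", "ex:Matchmaking"),
    ("ex:Negotiation", "rdf:type", "ex:Ability"),
}

def abilities(agent):
    """All abilities asserted for the given agent."""
    return sorted(o for s, p, o in triples
                  if s == agent and p == "ex:hasAbility")

def undeclared_abilities():
    """Abilities used but never typed as ex:Ability -- the kind of
    design error a formal description lets a checker flag early."""
    used = {o for _, p, o in triples if p == "ex:hasAbility"}
    declared = {s for s, p, o in triples
                if p == "rdf:type" and o == "ex:Ability"}
    return sorted(used - declared)

print(abilities("ex:broker"))      # ['ex:Matchmaking', 'ex:Negotiation']
print(undeclared_abilities())      # ['ex:Matchmaking']
```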
5

Pan, Wen Lin. "A Formal EXPRESS-to-OWL Mapping Algorithm." Key Engineering Materials 419-420 (October 2009): 689–92. http://dx.doi.org/10.4028/www.scientific.net/kem.419-420.689.

Abstract:
Network-based collaborative product development has become a trend in the manufacturing industry, and it depends on two key information technologies: the Semantic Web and the standard for the exchange of product model data (STEP). EXPRESS is the STEP product data modeling language, and OWL is the standard ontology representation language used in the Semantic Web. Only when EXPRESS models are converted to OWL models can product information be exchanged on the Web. Ontology meta-modeling theory was employed to analyze the ontology definition metamodels of EXPRESS and OWL, in order to build up the mapping relationship between them. A formal EXPRESS-to-OWL mapping algorithm is then proposed, represented in abstract syntax.
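The flavour of such a mapping can be sketched with a toy translator that turns one EXPRESS ENTITY (attributes with simple types) into Turtle-like OWL axioms. The rules below are simplified illustrations, not the paper's actual algorithm, and all names are invented:

```python
# Toy EXPRESS-to-OWL translator: one EXPRESS ENTITY becomes an OWL
# class, each typed attribute a datatype property. Simplified rules.

XSD = {"STRING": "xsd:string", "REAL": "xsd:double", "INTEGER": "xsd:integer"}

def entity_to_owl(name, attrs, supertype=None):
    """Emit Turtle-style axioms for one EXPRESS entity declaration."""
    axioms = [f"ex:{name} rdf:type owl:Class ."]
    if supertype:  # EXPRESS "SUBTYPE OF" -> rdfs:subClassOf
        axioms.append(f"ex:{name} rdfs:subClassOf ex:{supertype} .")
    for attr, express_type in attrs:
        axioms += [f"ex:{attr} rdf:type owl:DatatypeProperty .",
                   f"ex:{attr} rdfs:domain ex:{name} .",
                   f"ex:{attr} rdfs:range {XSD[express_type]} ."]
    return axioms

# ENTITY product; id : STRING; mass : REAL; END_ENTITY;
axioms = entity_to_owl("product", [("id", "STRING"), ("mass", "REAL")])
print("\n".join(axioms))
```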
6

Staykova, Kamenka, Petya Osenova, and Kiril Simov. "New Applications of “Ontology-to-Text Relation” Strategy for Bulgarian Language." Cybernetics and Information Technologies 12, no. 4 (2012): 43–51. http://dx.doi.org/10.2478/cait-2012-0029.

Abstract:
The paper presents new applications of the Ontology-to-Text Relation Strategy to the Bulgarian iconographic domain. First the strategy itself is discussed within the triple of ontology, terminological lexicon, and annotation grammars; then the related work is surveyed. The specifics of semantic annotation and evaluation over iconographic data are also presented. A family of domain ontologies over the iconographic domain is created and used. Evaluation against a gold standard shows that this strategy is good enough for precise but shallow results, and can be supported further by deep parsing techniques.
7

Vacura, Miroslav. "Modeling artificial agents’ actions in context – a deontic cognitive event ontology." Applied Ontology 15, no. 4 (2020): 493–527. http://dx.doi.org/10.3233/ao-200236.

Abstract:
Although there have been efforts to integrate Semantic Web technologies and artificial agents related AI research approaches, they remain relatively isolated from each other. Herein, we introduce a new ontology framework designed to support the knowledge representation of artificial agents’ actions within the context of the actions of other autonomous agents and inspired by standard cognitive architectures. The framework consists of four parts: 1) an event ontology for information pertaining to actions and events; 2) an epistemic ontology containing facts about knowledge, beliefs, perceptions and communication; 3) an ontology concerning future intentions, desires, and aversions; and, finally, 4) a deontic ontology for modeling obligations and prohibitions which limit agents’ actions. The architecture of the ontology framework is inspired by deontic cognitive event calculus as well as epistemic and deontic logic. We also describe a case study in which the proposed DCEO ontology supports autonomous vehicle navigation.
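The role of the deontic part, obligations and prohibitions limiting an agent's actions, can be pictured with a plain-Python analogue. This is only an illustration of the modelling intent; the framework itself is an OWL ontology, and the action names here are invented:

```python
# Toy analogue of the deontic layer: obligations and prohibitions
# constrain which of an agent's candidate actions are admissible.
obligations = {"stop_at_red_light"}
prohibitions = {"exceed_speed_limit", "cross_solid_line"}

def screen(candidate_actions):
    """Drop prohibited actions and report any unmet obligations."""
    allowed = [a for a in candidate_actions if a not in prohibitions]
    unmet = sorted(obligations - set(candidate_actions))
    return allowed, unmet

allowed, unmet = screen(["overtake", "exceed_speed_limit",
                         "stop_at_red_light"])
print(allowed, unmet)  # ['overtake', 'stop_at_red_light'] []
```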
8

Li, Yan Ling, Yi Duo Liang, and Jun Zhai. "Fuzzy Knowledge Representation Based on Fuzzy Linguistic Variable Ontology and SWRL on the Semantic Web." Applied Mechanics and Materials 58-60 (June 2011): 1707–11. http://dx.doi.org/10.4028/www.scientific.net/amm.58-60.1707.

Abstract:
Ontology is adopted as a standard for knowledge representation on the Semantic Web, and the Web Ontology Language (OWL) is used to add structure and meaning to web applications. In order to share and reuse fuzzy knowledge on the Semantic Web, we propose the fuzzy linguistic variable ontology (FLVO), which utilizes ontology to formally represent fuzzy linguistic variables and defines the semantic relationships between fuzzy concepts. Fuzzy rules are then described in the Semantic Web Rule Language (SWRL) on the basis of the FLVO model. Taking students’ performance in physics as a sample case, a fuzzy rule management system is built using Protégé and its SWRLTab, which shows that this research enables distributed fuzzy applications on the Semantic Web.
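The notion of a fuzzy linguistic variable underlying FLVO can be sketched with triangular membership functions over a numeric score. The term boundaries below are invented for illustration; a SWRL rule conditioned on "performance is high" would then fire to the degree returned by the fuzzification:

```python
# Toy fuzzy linguistic variable "performance": each linguistic term
# is a triangular membership function over a 0-100 score.

def triangular(a, b, c):
    """Membership function: 0 outside [a, c], rising to 1 at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Linguistic terms with invented cut-off points.
terms = {"low": triangular(0, 25, 50),
         "medium": triangular(25, 50, 75),
         "high": triangular(50, 75, 100)}

def fuzzify(score):
    """Degree of membership of a crisp score in each linguistic term."""
    return {t: round(mu(score), 2) for t, mu in terms.items()}

print(fuzzify(60))  # a score of 60 is 0.6 "medium" and 0.4 "high"
```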
9

Adel, Ebtsam, Shaker El-Sappagh, Sherif Barakat, Jong-Wan Hu, and Mohammed Elmogy. "An Extended Semantic Interoperability Model for Distributed Electronic Health Record Based on Fuzzy Ontology Semantics." Electronics 10, no. 14 (2021): 1733. http://dx.doi.org/10.3390/electronics10141733.

Abstract:
Semantic interoperability of distributed electronic health record (EHR) systems is a crucial problem for querying EHR and machine learning projects. The main contribution of this paper is to propose and implement a fuzzy ontology-based semantic interoperability framework for distributed EHR systems. First, a separate standard ontology is created for each input source. Second, a unified ontology is created that merges the previously created ontologies. However, this crisp ontology is not able to answer vague or uncertain queries. We thirdly extend the integrated crisp ontology into a fuzzy ontology by using a standard methodology and fuzzy logic to handle this limitation. The dataset used includes identified data of 100 patients. The resulting fuzzy ontology includes 27 classes, 58 properties, 43 fuzzy data types, 451 instances, 8376 axioms, 5232 logical axioms, 1216 declarative axioms, 113 annotation axioms, and 3204 data property assertions. The resulting ontology is tested using real data from the MIMIC-III intensive care unit dataset and real archetypes from openEHR. This fuzzy ontology-based system helps physicians accurately query any required data about patients from distributed locations using near-natural language queries. Domain specialists validated the accuracy and correctness of the obtained results.
10

Espinoza, Angelina, Ernesto Del-Moral, Alfonso Martínez-Martínez, and Nour Alí. "A validation & verification driven ontology: An iterative process." Applied Ontology 16, no. 3 (2021): 297–337. http://dx.doi.org/10.3233/ao-210251.

Abstract:
Designing an ontology that meets the needs of end-users, e.g., a medical team, is critical to support the reasoning with data. Therefore, an ontology design should be driven by the constant and efficient validation of end-users needs. However, there is not an existing standard process in knowledge engineering that guides the ontology design with the required quality. There are several ontology design processes, which range from iterative to sequential, but they fail to ensure the practical application of an ontology and to quantitatively validate end-user requirements through the evolution of an ontology. In this paper, an ontology design process is proposed, which is driven by end-user requirements, defined as Competency Questions (CQs). The process is called CQ-Driven Ontology DEsign Process (CODEP) and it includes activities that validate and verify the incremental design of an ontology through metrics based on defined CQs. CODEP has also been applied in the design and development of an ontology in the context of a Mexican Hospital for supporting Neurologist specialists. The specialists were involved, during the application of CODEP, in collecting quality measurements and validating the ontology increments. This application can demonstrate the feasibility of CODEP to deliver ontologies with similar requirements in other contexts.

Dissertations / Theses on the topic "Ontology language standard"

1

Dippold, Mindi M. "A Biological and Bioinformatics Ontology for Service Discovery and Data Integration." Thesis, 2006. http://hdl.handle.net/1805/621.

Abstract:
Submitted to the faculty of Indiana University in partial fulfillment of the requirements for the degree of Master of Science in the School of Informatics, Indiana University, December 2005. This project addresses the need for increased expressivity and robustness of the ontologies already supporting BACIIS and SIBIOS, two systems for data and service integration in the life sciences. The previous ontology solutions, as global schema and facilitator of service discovery, sustained the purposes for which they were built, but needed updating in order to keep up with more recent standards in ontology description and utilization, as well as to increase the breadth of the domain and the expressivity of the content. Thus, several tasks were undertaken to increase the worth of the system ontologies. These include an upgrade to a more recent ontology language standard, increased domain coverage, and increased expressivity via additions of relationships and hierarchies within the ontology, as well as increased ease of maintenance through a distributed design.

Book chapters on the topic "Ontology language standard"

1

Brezany, Peter, Ivan Janciak, and A. Min Tjoa. "Ontology-Based Construction of Grid Data Mining Workflows." In Data Warehousing and Mining. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-951-9.ch054.

Abstract:
This chapter introduces an ontology-based framework for automated construction of complex interactive data mining workflows as a means of improving productivity of Grid-enabled data exploration systems. The authors first characterize existing manual and automated workflow composition approaches and then present their solution called GridMiner Assistant (GMA), which addresses the whole life cycle of the knowledge discovery process. GMA is specified in the OWL language and is being developed around a novel data mining ontology, which is based on concepts of industry standards like the Predictive Model Markup Language, Cross Industry Standard Process for Data Mining and Java Data Mining API. The ontology introduces basic data mining concepts like data mining elements, tasks, services, etc. In addition, conceptual and implementation architectures of the framework are presented and its application to an example taken from the medical domain is illustrated. The authors hope that the further research and development of this framework can lead to productivity improvements, which can have significant impact on many real-life spheres. For example, it can be a crucial factor in achievement of scientific discoveries, optimal treatment of patients, productive decision making, cutting costs, etc.
2

Dietrich, Jens, and Chris Elgar. "An Ontology Based Representation of Software Design Patterns." In Software Applications. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-060-8.ch036.

Abstract:
This chapter introduces an approach to defining design patterns using Semantic Web technologies. For this purpose, a vocabulary based on the Web Ontology Language (OWL) is developed. Design patterns can be defined as RDF documents instantiating this vocabulary, and can be published as resources on standard Web servers. This facilitates the use of patterns as knowledge artefacts shared by the software engineering community. The instantiation of patterns in programs is discussed, and the design of a tool is presented that can x-ray programs for pattern instances based on their formal definitions.
3

Nair, Aditi, Syed Sibte Raza Abidi, William Van Woensel, and Samina Abidi. "Ontology-Based Personalized Cognitive Behavioural Plans for Patients with Mild Depression." In Studies in Health Technology and Informatics. IOS Press, 2021. http://dx.doi.org/10.3233/shti210268.

Abstract:
Cognitive Behavioural Therapy (CBT) is an action-oriented psychotherapy that combines cognitive and behavioural techniques for psychosocial treatment of depression, and is considered by many to be the gold standard in psychotherapy. More recently, computerized CBT (CCBT) has been deployed to help increase availability and access to this evidence-based therapy. In this vein, a CBT ontology, as a shared common understanding of the domain, can facilitate the aggregation, verification, and operationalization of computerized CBT knowledge. Moreover, as opposed to black-box applications, ontology-enabled systems allow recommended, evidence-based treatment interventions to be traced back to the corresponding psychological concepts. We used a Knowledge Management approach to synthesize and computerize CBT knowledge from multiple sources into a CBT ontology, which allows generating personalized action plans for treating mild depression, using the Web Ontology Language (OWL) and Semantic Web Rule Language (SWRL). We performed a formative evaluation of the CBT ontology in terms of its completeness, consistency, and conciseness.
4

Abdoullaev, Azamat. "Ways to View the World." In Reality, Universal Ontology and Knowledge Systems. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-966-3.ch002.

Abstract:
As a general science of the universe dealing with the most universal truths about all existence, an account of what there is in the world and the study of reality with its content, ontology has long had a critical import only for a small set of professionals. For the majority of researchers, not mentioning the non-committed people, it has been the realm of the occult, an esoteric domain of discourse, the region of most abstract reasoning, or the sphere of philosophical opinion. Today the field is rapidly increasing its audience and practical value. Maybe only a few people will call into question the importance of world models for such advanced knowledge domains as information sciences and computing technology. Increasingly, the universal science is identified with the life and soul of knowledge technologies and intelligent systems. The search for the world description standard as an exhaustive theoretical account and model of generic entities is becoming a research activity promising unprecedented advancements in the new cross area of science and technology. Particularly, this is important for creating knowledge systems of extraordinary performance, such as the emerging Semantic Web, implying the entire gamut of novel applications: knowledge management, intelligent databases, conceptual/semantic search and retrieval, software agents, speech and natural language understanding, e-commerce, and ubiquitous computing (Semantic Web Topic Hierarchy, 2007; Semantic Web Technology, 2007). Considering these design purposes, there are several technical requirements for building an ontology standard: expressivity, efficiency, scalability, compatibility, extensibility, and relative simplicity. But the most significant prerequisite appears to be its consistency with existent commonsense models of the world and scientific learning.
5

Arenas, Marcelo, Georg Gottlob, and Andreas Pieris. "Querying the Semantic Web via Rules." In Applications and Practices in Ontology Design, Extraction, and Reasoning. IOS Press, 2020. http://dx.doi.org/10.3233/ssw200044.

Abstract:
The problem of querying RDF data is a central issue for the development of the Semantic Web. The query language SPARQL has become the standard language for querying RDF since its W3C standardization in 2008. However, the 2008 version of this language missed some important functionalities: reasoning capabilities to deal with RDFS and OWL vocabularies, navigational capabilities to exploit the graph structure of RDF data, and a general form of recursion much needed to express some natural queries. To overcome those limitations, a new version of SPARQL, called SPARQL 1.1, was released in 2013, which includes entailment regimes for RDFS and OWL vocabularies, and a mechanism to express navigation patterns through regular expressions. Nevertheless, there are useful navigation patterns that cannot be expressed in SPARQL 1.1, and the language lacks a general mechanism to express recursive queries. This chapter is a gentle introduction to a tractable rule-based query language, in fact, an extension of Datalog with value invention, stratified negation, and falsum, that is powerful enough to define SPARQL queries enhanced with the desired functionalities focussing on a core fragment of the OWL 2 QL profile of OWL 2.
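The kind of recursion at issue can be made concrete with naive bottom-up Datalog evaluation. Plain reachability is in fact expressible with a SPARQL 1.1 property path (ex:edge+), but the same fixpoint loop, sketched here with invented data, also handles the more general recursive queries the chapter targets:

```python
# Naive bottom-up (fixpoint) evaluation of a recursive Datalog-style
# program over an edge relation. Data and predicate names invented.
edges = {("a", "b"), ("b", "c"), ("c", "d")}

def reachable(edges):
    """reach(X,Y) :- edge(X,Y).
       reach(X,Z) :- reach(X,Y), edge(Y,Z)."""
    reach = set(edges)
    while True:
        derived = {(x, z) for (x, y) in reach
                          for (y2, z) in edges if y == y2}
        if derived <= reach:    # fixpoint: nothing new derivable
            return reach
        reach |= derived

print(sorted(reachable(edges)))
```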
6

Mishra, Sanju, and Sarika Jain. "An Intelligent Knowledge Treasure for Military Decision Support." In Research Anthology on Military and Defense Applications, Utilization, Education, and Ethics. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-9029-4.ch027.

Abstract:
Information integration is great for military operations because the range of pertinent information sources is significantly distinct and dynamic. This article develops an intelligent knowledge treasure comprised of military resource ontology and procedures, as a learning model for better interoperability of heterogeneous resources of the Indian military. This model can interpret and learn the context of military information automatically, thereby facilitating the military commanders with decision making in several operations, such as command and control, teaching and training, military coalition, situation awareness and many more. To design the military resource ontology, this article specifies the core concepts of the ontology based on terms derived from heterogeneous resources. WWW standard ontology language, OWL has been used to codify the ontology. This article develops an intelligent tool—“QueryOnto”—as an interface to the military resource ontology that provides a commander decision support service and demonstrates how to apply the military ontology in practice. The developed ontology has been verified and validated with the best known approaches and metrics available. The presented model is helpful for military commanders to train their juniors in a systematic way and will provide an efficient web-based learning of different military tasks in future.
7

Abdoullaev, Azamat. "The World Code." In Reality, Universal Ontology and Knowledge Systems. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-966-3.ch003.

Abstract:
Formalizing the world in rigorous mathematical terms is no less significant than its fundamental understanding and modeling in terms of ontological constructs. Like black and white, opposite sexes or polarity signs, ontology and mathematics stand complementary to each other, making up the unique and unequaled knowledge domain or knowledge base, which involves two parts:

• Ontological (real) mathematics, which defines the real significance of mathematical entities, studying the real status of mathematical objects, functions, and relationships in terms of ontological categories and rules.

• Mathematical (formal) ontology, which defines the mathematical structures of real-world features, and is thus concerned with a meaningful representation of the universe in terms of mathematical language.

The combination of ontology and mathematics and substantial knowledge of the sciences is likely the only true road to understanding, modeling and representing reality. Ontology on its own cannot specify the fabric, design, architecture, and laws of the universe. Nor can theoretical physics with its conceptual tools and models: general relativity, quantum physics, Lagrangians, Hamiltonians, conservation laws, symmetry groups, quantum field theory, string and M theory, twistor theory, loop quantum gravity, the big bang, the standard model, or a theory of everything material. Nor is mathematics alone, with its abstract tools, complex number calculus, differential calculus, differential geometry, analytical continuation, higher algebras, Fourier series and hyperfunctions, the real path to reality (Penrose, 2005).
8

Kotinurmi, Paavo, Armin Haller, and Eyal Oren. "Ontologically Enhanced RosettaNet B2B Integration." In Semantic Web for Business. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-066-0.ch010.

Abstract:
RosettaNet is an industry-driven e-business process standard that defines common inter-company public processes and their associated business documents. RosettaNet is based on the Service-oriented architecture (SOA) paradigm and all business documents are expressed in DTD or XML Schema. Our “ontologically-enhanced RosettaNet” effort translates RosettaNet business documents into a Web ontology language, allowing business reasoning based on RosettaNet message exchanges. This chapter describes our extension to RosettaNet and shows how it can be used in business integrations for better interoperability. The usage of a Web ontology language in RosettaNet collaborations can help accommodate partner heterogeneity in the setup phase and can ease the back-end integration, enabling for example more competition in the purchasing processes. It provides also a building block to adopt a semantic SOA with richer discovery, selection and composition capabilities.
9

Walton, Christopher. "Web knowledge." In Agency and the Semantic Web. Oxford University Press, 2006. http://dx.doi.org/10.1093/oso/9780199292486.003.0008.

Abstract:
In the introductory chapter of this book, we discussed the means by which knowledge can be made available on the Web. That is, the representation of the knowledge in a form by which it can be automatically processed by a computer. To recap, we identified two essential steps that were deemed necessary to achieve this task: 1. We discussed the need to agree on a suitable structure for the knowledge that we wish to represent. This is achieved through the construction of a semantic network, which defines the main concepts of the knowledge, and the relationships between these concepts. We presented an example network that contained the main concepts to differentiate between kinds of cameras. Our network is a conceptualization, or an abstract view of a small part of the world. A conceptualization is defined formally in an ontology, which is in essence a vocabulary for knowledge representation. 2. We discussed the construction of a knowledge base, which is a store of knowledge about a domain in machine-processable form; essentially a database of knowledge. A knowledge base is constructed through the classification of a body of information according to an ontology. The result will be a store of facts and rules that describe the domain. Our example described the classification of different camera features to form a knowledge base. The knowledge base is expressed formally in the language of the ontology over which it is defined. In this chapter we elaborate on these two steps to show how we can define ontologies and knowledge bases specifically for the Web. This will enable us to construct Semantic Web applications that make use of this knowledge. The chapter is devoted to a detailed explanation of the syntax and pragmatics of the RDF, RDFS, and OWL Semantic Web standards. The resource description framework (RDF) is an established standard for knowledge representation on the Web. 
Taken together with the associated RDF Schema (RDFS) standard, we have a language for representing simple ontologies and knowledge bases on the Web.
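The sort of inference an RDFS knowledge base licenses can be illustrated with the entailment rule that propagates rdf:type along rdfs:subClassOf (rule rdfs9 in the RDFS semantics). The camera vocabulary echoes the chapter's running example, but the class and instance names here are invented:

```python
# RDFS-style inference: propagate rdf:type along rdfs:subClassOf,
# following subclass chains transitively.
subclass = {("ex:SLR", "ex:Camera"), ("ex:Camera", "ex:Device")}
types = {("ex:nikon_d6", "ex:SLR")}

def infer_types(types, subclass):
    """Close the rdf:type facts under rdfs:subClassOf."""
    facts = set(types)
    changed = True
    while changed:
        changed = False
        for (inst, cls) in list(facts):
            for (sub, sup) in subclass:
                if cls == sub and (inst, sup) not in facts:
                    facts.add((inst, sup))
                    changed = True
    return facts

print(sorted(infer_types(types, subclass)))
```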
10

Rodrigues, Mário, Gonçalo Paiva Dias, and António Teixeira. "Towards E-Government Information Platforms for Enterprise 2.0." In Advances in Business Information Systems and Analytics. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-4373-4.ch033.

Full text
Abstract:
Enterprise 2.0 aims to help employees, customers, and suppliers collaborate, share, and organize information. As governments are relevant partners for enterprises (legislation, contracts, etc.), e-government platforms need to be ready for Enterprise 2.0 in what concerns e-government interactions. The public sector holds huge quantities of information, and only a small proportion is relevant to each enterprise. Enterprises should only be confronted with relevant information and not flooded with data. This implies data organization with semantic description and services using open standards. The goal is to build a durable information infrastructure for government that can be readily accessed by enterprises. The authors propose a conceptual model for government information provisioning. The rationale for this proposal is to motivate the creation of durable, standard, and open government information infrastructures. The model acquires information from natural language documents and represents it using an ontology. A proof-of-concept prototype and its preliminary results are presented.

Conference papers on the topic "Ontology language standard"

1

Lee, Jae-Hyun, and Hyo-Won Suh. "OWL-Based Product Ontology Architecture and Representation for Sharing Product Knowledge on a Web." In ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/detc2007-35312.

Abstract:
A collaborative product development environment requires sharing of product information among its participants. Product knowledge models and international standards have been developed for product information sharing. However, these models have limitations in representing their semantics explicitly, so a computer cannot understand their semantics properly, which results in a lack of information sharing. Recently, ontology has gained attention for automatic knowledge sharing because it can specify semantics explicitly and logically. In addition, an ontology-related standard language, the Web Ontology Language (OWL), has also been proposed. In this paper, we propose an architecture for ontology-based product knowledge and a product web ontology language (POWL) based on OWL. The architecture consists of three levels of ontologies: meta-, generic, and particular product ontology. The meta-product ontology is derived from previous top-level ontologies such as SUMO, DOLCE and Guarino’s ontology. The generic product ontology is developed to provide comprehensive knowledge primitives representing product knowledge. A particular product ontology specifies knowledge about a specific product such as a car, telephone, or ship, and it is defined based on the generic product ontology. Meanwhile, POWL has product knowledge primitives defined in the generic product ontology, and it can be transformed to OWL, so users can define specific product knowledge based on POWL. We implement the transformation logic with XSLT and demonstrate POWL usage with an example.
2

Zhu, Lijuan, Uma Jayaram, Sankar Jayaram, and OkJoon Kim. "Ontology-Driven Integration of CAD/CAE Applications: Strategies and Comparisons." In ASME 2009 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/detc2009-87768.

Abstract:
In this paper we present a detailed exploration of ontology-driven approaches and strategies for integrating product data between CAD/CAE applications. We structure the ontology model into three layers: General Domain Ontology, Domain Specific Ontology, and Application Specific Ontology. In particular, Application Specific Ontologies are built for Pro/E, CATIA, and a virtual assembly design tool called VADE. This allows the integration processes to be demonstrated for a) two applications in the common domain of product design, and b) two applications in different domains, one in the product design domain and the other in an assembly simulation domain. In addition, these ontology-driven strategies are compared with two other approaches. The first study focuses on the knowledge modeling aspect and compares the ontology approach with a standard modeling language, UML. The second study focuses on the data integration and translation aspect and compares the ontology-driven approach with a traditional one. We conclude that an ontology-driven approach is superior for solving heterogeneous data problems involving multiple applications because it manages data at the semantic level.
3

Ortiz, Magdalena. "Improving Data Management using Domain Knowledge." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/814.

Abstract:
The development of tools and techniques for flexible and reliable data management is a long-standing challenge, ever more pressing in today’s data-rich world. We advocate using domain knowledge expressed in ontologies to tackle it, and summarize some research efforts to this aim that follow two directions. First, we consider the problem of ontology-mediated query answering (OMQA), where queries in a standard database query language are enriched with an ontology expressing background knowledge about the domain of interest, used to retrieve more complete answers when querying incomplete data. We discuss some of our contributions to OMQA, focusing on (i) expressive languages for OMQA, with emphasis on combining the open- and closed-world assumptions to reason about partially complete data; and (ii) OMQA algorithms based on rewriting techniques. The second direction we discuss proposes to use ontologies to manage evolving data. In particular, we use ontologies to model and reason about constraints on datasets, effects of operations that modify data, and the integrity of the data as it evolves.
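As a toy illustration of the OMQA idea summarized above (the facts, rule, and names are invented, not from the talk), a query over incomplete data returns more answers once a simple ontology rule is applied before querying:

```python
# Incomplete data: "ada" is recorded as a Professor but not as an Employee.
facts = {("Professor", "ada"), ("Employee", "eve")}
# Ontology rule: every Professor is an Employee.
rules = [("Professor", "Employee")]

def saturate(facts, rules):
    """Apply the rules to a fixpoint (a tiny forward-chaining step)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            for pred, const in list(facts):
                if pred == body and (head, const) not in facts:
                    facts.add((head, const))
                    changed = True
    return facts

def query(facts, pred):
    return sorted(c for p, c in facts if p == pred)

print(query(facts, "Employee"))                   # ['eve']  -- incomplete
print(query(saturate(facts, rules), "Employee"))  # ['ada', 'eve']
```

Production OMQA systems rewrite the query rather than materialize the data, but the effect on the answer set is the same in this tiny case.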
4

Patil, Lalit, Debasish Dutta, and Ram Sriram. "Ontology Formalization of Product Semantics for Product Lifecycle Management." In ASME 2005 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2005. http://dx.doi.org/10.1115/detc2005-85121.

Abstract:
Product Lifecycle Management (PLM) is a concept that takes into account that the development of a product is influenced by knowledge from various stakeholders throughout its lifecycle. Computing environments in the PLM framework are expected to have several independent information resources. This requires a meaningful formal representation of product data semantics throughout the product's lifecycle. This paper presents an ontological approach to formalizing product semantics into a Product Semantic Representation Language (PSRL). Building blocks to develop the explicit, extensible, and comprehensive PSRL are described. The PSRL is open and based on standard W3C OWL constructs. Its extensibility is demonstrated by considering an example product. The representation and the method of its development are expected to support several applications in the context of PLM. The use of OWL will enable the provision of application software and information resources as Web services in the context of the Semantic Web.
5

Vucinic, Dean, Marina Pesut, Franjo Jovic, and Chris Lacor. "Exploring Ontology-Based Approach to Facilitate Integration of Multi-Physics and Visualization for Numerical Models." In ASME 2009 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/detc2009-86477.

Abstract:
Today, within the engineering design process, different design teams interact; each team has its own design objective and a continuous need to present and share results with other groups. Common engineering environments are equipped with advanced modeling and simulation tools specially designed to improve engineers' productivity. In this paper we propose the use of ontologies, semantic metadata descriptors, to facilitate the software development process in building such multidisciplinary engineering environments. The important development task is to integrate several numerical simulation components (models of data and processes) together with interactive visualization of the engineering models in a unified 3D scene. In addition, we explore how the prototyped ontologies can become standard components in such software systems, where the presence of an inference engine enables continuous semantic integration of the involved data and processes. The semantic integration is based on: 1) mapping discovery between two or more ontologies, 2) declarative formal representation of mappings, and 3) reasoning with mappings and finding what types of reasoning are involved; we have explored these three dimensions. The proposed solution involves two web-based software standards: the Semantic Web and X3D. The developed prototype makes use of the latest available XML-based software technologies, such as X3D (eXtensible 3D) and OWL (Web Ontology Language), and demonstrates the modeling approach to integrating heterogeneous data sources, their interoperability, and 3D visual representations to enhance end users' interactions with the engineering content.
We demonstrate that our ontology-based approach is appropriate for the reuse, sharing, and exchange of software constructs that implement differential-geometric algorithms used in multidisciplinary numerical simulations, by applying adopted ontologies that are used in knowledge-based systems. The selected engineering test case represents a complex multi-physics problem, FSI (Fluid-Structure Interaction): numerical simulation of a multi-component box structure used for a drop test in still water. The numerical simulations of the drop test are performed through the combined use of FEM (Finite Element Method) and CFD (Computational Fluid Dynamics) solvers. An important aspect is the design of a common X3D graphics model, which couples the FEM data model with the CFD data model in order to preserve all the relationships between CFD and FEM data. Our ultimate vision is to build intelligent and powerful mechanical engineering software by developing infrastructure that enables efficient data sharing and process integration mechanisms. We see our current work in exploring the ontology-based approach as a first step towards semantic interoperability of numerical simulation and visualization components for designing complex multi-physics solutions.
6

Huang, Ting, Gehui Shen, and Zhi-Hong Deng. "Leap-LSTM: Enhancing Long Short-Term Memory for Text Categorization." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/697.

Abstract:
Recurrent Neural Networks (RNNs) are widely used in the field of natural language processing (NLP), ranging from text categorization to question answering and machine translation. However, RNNs generally read the whole text from beginning to end, or sometimes vice versa, which makes it inefficient to process long texts. When reading a long document for a categorization task, such as topic categorization, large quantities of words are irrelevant and can be skipped. To this end, we propose Leap-LSTM, an LSTM-enhanced model which dynamically leaps between words while reading texts. At each step, we utilize several feature encoders to extract messages from preceding texts, following texts and the current word, and then determine whether to skip the current word. We evaluate Leap-LSTM on several text categorization tasks: sentiment analysis, news categorization, ontology classification and topic classification, with five benchmark data sets. The experimental results show that our model reads faster and predicts better than standard LSTM. Compared to previous models which can also skip words, our model achieves better trade-offs between performance and efficiency.
7

Bourhis, Pierre, Michael Morak, and Andreas Pieris. "Making Cross Products and Guarded Ontology Languages Compatible." In Twenty-Sixth International Joint Conference on Artificial Intelligence. International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/122.

Abstract:
Cross products form a useful modelling tool that allows us to express natural statements such as "elephants are bigger than mice", or, more generally, to define relations that connect every instance in a relation with every instance in another relation. Despite their usefulness, cross products cannot be expressed using existing guarded ontology languages, such as description logics (DLs) and guarded existential rules. The question that comes up is whether cross products are compatible with guarded ontology languages, and, if not, whether there is a way of making them compatible. This has already been studied for DLs, but for guarded existential rules it remains unanswered. Our goal is to give an answer to the above question. To this end, we focus on the guarded fragment of first-order logic (which serves as a unifying framework that subsumes many of the aforementioned ontology languages) extended with cross products, and we investigate the standard tasks of satisfiability and query answering. Interestingly, we isolate relevant fragments that are compatible with cross products.
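A minimal illustration of the kind of statement the abstract describes (the relation and instance names are our own example): a cross-product rule such as Elephant(x), Mouse(y) -> BiggerThan(x, y) relates every instance of one relation to every instance of another, and is not guarded because no single atom of the rule body mentions both x and y.

```python
from itertools import product

# Extensions of the two unary relations in the toy example.
elephants = ["dumbo", "babar"]
mice = ["jerry"]

# Materializing the rule yields the full cross product of the extensions.
bigger_than = sorted(product(elephants, mice))
print(bigger_than)  # [('babar', 'jerry'), ('dumbo', 'jerry')]
```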
8

Braun, German, Giuliano Marinelli, Emiliano Rios Gavagnin, Laura Cecchi, and Pablo Fillottrani. "Web Interoperability for Ontology Development and Support with crowd 2.0." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/707.

Abstract:
In this work, we treat web interoperability in terms of interchanging ontologies (as knowledge models) within user-centred ontology engineering environments, involving visual and serialised representations of ontologies. To do this, we deal with the tool interoperability problem by reusing a sufficiently expressive ontology-driven metamodel, named KF, proposed as a bridge for interchanging both kinds of knowledge models. We provide an extensible web framework, named crowd 2.0, unifying the standard conceptual data modelling languages for generating OWL 2 ontologies from semantic visualisations. Visual models are designed as UML, ER or ORM 2 diagrams, represented as KF instances, and finally formalised as DL-based models. Reasoning results may then be incorporated into the shared KF instance and visualised in any of the provided languages.
9

Whitsitt, Sean, Sonia Vohnout, Timothy Wilmering, Disha Mathad, and Eric Smith. "A Visual Ontological Language for Technical Standards (VOLTS)." In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-59594.

Abstract:
Research shows that failures in the standardization process often result from communication and organizational issues between those involved in the committee and the user community. This is mainly caused by two issues: first, a lack of integration of available standards development tools with communication and social interfaces; and second, the difficulties inherent in organizing and collating information in a semantically meaningful manner. To this end, the authors present a Visual Ontological Language for Technical Standards (VOLTS). VOLTS is a prototype environment that seeks to address the latter problem introduced above. In VOLTS, standards developers visually create standards within a network of information. VOLTS builds upon a tool developed by the National Institute of Standards and Technology (NIST) called the NIST Ontological Visualization Interface for Standards (NOVIS), which presented a novel method for visualizing the content and connections of standards but lacked the ability to allow users to alter that information. VOLTS focuses on providing users with a process that allows for verification and validation at all stages of development. To that effect, VOLTS incorporates research done by NIST on building a Framework for Analysis, Comparison, and Test of Standards (FACTS). The examples presented herein use the openly available standards World Wide Web Consortium's (W3C) Web Ontology Language (OWL) 2 and the Data Mining Group's (DMG) Predictive Model Markup Language (PMML) to demonstrate the VOLTS process and methodology. Future work discussed will seek to address the former problem introduced above.
10

Tessier, Sean, and Yan Wang. "Ontology-Based Representation and Verification to Enable Feature Interoperability Between CAD Systems." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-48537.

Abstract:
Data interoperability between computer-aided design (CAD) systems remains a major obstacle to information integration and exchange in a collaborative engineering environment. The standards for CAD data exchange have remained largely restricted to geometric representations, causing the design intent portrayed through construction history, features, parameters, and constraints to be discarded in the exchange process. In this research paper, an ontology-based framework is proposed to allow for the full exchange of semantic feature data. The Web Ontology Language (OWL) is used to represent feature types as well as the concepts and properties that define features, which allows the use of existing ontology reasoning tools to infer new relationships and information between heterogeneous data. Boundary representation (B-Rep) data corresponding to the output of the feature operation is also stored for purposes of feature identification and translation verification. The base ontology and a small feature library are built in OWL, and a combination of OWL and SWRL (Semantic Web Rule Language) rules are developed to allow a feature from an arbitrary source system to be automatically classified and translated into the target system through the use of a reasoner. These rules relate input parameter and reference types to expected B-Rep objects, allowing classification even when feature definitions vary or when little is known about the source system. In cases when the source system is well known, this approach also permits direct translation rules to be implemented. With such a flexible framework, a neutral feature exchange format could be developed.
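The rule-based classification step described in this abstract can be sketched as follows. The feature classes, B-Rep entity names, and target-system names below are hypothetical placeholders, not taken from the paper; the snippet only mirrors the general pattern of relating expected B-Rep outputs to a neutral feature class and then to a target-system feature.

```python
# Each rule pairs the B-Rep entities a feature operation is expected to
# produce with a neutral feature class (names are illustrative only).
RULES = [
    ({"cylindrical_face"}, "Hole"),
    ({"planar_face", "side_faces"}, "Pocket"),
]

# Mapping from the neutral class to a (hypothetical) target-system name.
TARGET_NAMES = {"Hole": "hole_feature", "Pocket": "pocket_feature"}

def classify(brep_outputs):
    """Return the first neutral feature class whose expected B-Rep
    entities are all present in the observed outputs."""
    for required, feature in RULES:
        if required <= brep_outputs:
            return feature
    return None

feature = classify({"cylindrical_face", "bottom_face"})
print(feature, "->", TARGET_NAMES.get(feature))  # Hole -> hole_feature
```

In the paper this role is played by OWL/SWRL rules evaluated by a reasoner; the set-inclusion test here is only a stand-in for that inference.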