Journal articles on the topic 'GRAPE (Computer file)'

Consult the top 50 journal articles for your research on the topic 'GRAPE (Computer file).'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Wang, Yi, Han Ding, and Ge Ging Xu. "Computer Aided Organic Synthesis Based on Graph Grammars." Applied Mechanics and Materials 411-414 (September 2013): 227–30. http://dx.doi.org/10.4028/www.scientific.net/amm.411-414.227.

Abstract:
Traditionally, computer-aided organic synthesis has been based on a one-dimensional string model that employs string grammars to handle molecular structures, the processing of organic reactions, and the construction of knowledge bases and file systems. Because one-dimensional methods are limited when applied to two-dimensional problems such as organic synthesis, this paper presents a method for computer-aided organic synthesis based on two-dimensional graph grammars. The method applies the basic principles of graph grammars to solve organic synthesis problems effectively and efficiently.
2

Richter, Tobias, Johannes Naumann, and Stephan Noller. "LOGPAT: A semi-automatic way to analyze hypertext navigation behavior." Swiss Journal of Psychology 62, no. 2 (June 2003): 113–20. http://dx.doi.org/10.1024//1421-0185.62.2.113.

Abstract:
In hypertext research, log files represent a useful source of information about users’ navigational behavior. Since log files can contain enormous amounts of data, methods for data reduction with a minimum loss of information are needed. In this paper, LOGPAT (Log file Pattern Analysis) is presented, a Web-based tool for analyzing log files. With LOGPAT, single-unit, sequential, and graph-theoretic measures (including distance matrices) for the description of user navigation can be computed. The paper gives an overview of these methods and discusses their value for psychological research on hypertext. Components and analysis options of LOGPAT are described in detail. The program’s basic options are illustrated by data from a study on learning with hypertext.
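As a minimal illustration of the graph-theoretic measures LOGPAT computes, the sketch below (plain Python, not LOGPAT's actual code; the session format is an assumption) builds a navigation graph from one user's ordered page visits and derives an all-pairs distance matrix by breadth-first search:

```python
from collections import defaultdict, deque

def distance_matrix(visits):
    """Build a navigation graph from an ordered list of visited pages
    (one hypothetical user session) and return the all-pairs
    shortest-path distance matrix via BFS."""
    graph = defaultdict(set)
    for a, b in zip(visits, visits[1:]):
        if a != b:
            graph[a].add(b)
            graph[b].add(a)  # treat traversed links as undirected for distances
    nodes = sorted(set(visits))
    dist = {u: {v: None for v in nodes} for u in nodes}  # None = unreachable
    for src in nodes:
        dist[src][src] = 0
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if dist[src][v] is None:
                    dist[src][v] = dist[src][u] + 1
                    queue.append(v)
    return nodes, dist

nodes, d = distance_matrix(["home", "toc", "page1", "toc", "page2"])
```

From such a matrix, sequential and graph-theoretic navigation measures (eccentricities, diameter, compactness) follow directly.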
3

Mustikasari, Dyah. "Ibm Smk Muhammadiyah Sumoroto Pelatihan Pembuatan Film Animas." Adimas : Jurnal Pengabdian Kepada Masyarakat 1, no. 1 (April 12, 2017): 31. http://dx.doi.org/10.24269/adi.v1i1.415.

Abstract:
Sekolah Menengah Kejuruan (vocational school) aims to produce graduates who are ready for employment, so students need additional competences that give them an edge in the workplace. SMK Muhammadiyah Sumoroto is a vocational school in Kecamatan Kauman, Kabupaten Ponorogo, with two majors: accounting and computer networking. To improve students' skill and creativity in graphic design, a training course in graphic design, particularly animation design, was carried out at SMK Muhammadiyah 1 Sumoroto with 20 participants from the first and second grades. The software used was Alice3, because it is fairly easy for beginners in animation design to use. The training was conducted twice, with an additional meeting to review and discuss the results. The biggest obstacle was a lack of computers: the number of machines was inadequate, so more than one student had to share a computer, which made the training less effective. Overall, the students were able to use the software, though the results were not yet perfect.
4

White, Charles S. "Developing Information-Processing Skills through Structured Activities with a Computerized File-Management Program." Journal of Educational Computing Research 3, no. 3 (August 1987): 355–75. http://dx.doi.org/10.2190/mj4b-mh97-rr32-yf29.

Abstract:
Considerable attention has been focused recently on the development of thinking skills among pre-college students. Advocates of computer-based education have suggested that computerized file-management programs can enhance thinking skills, especially those involving the identification, retrieval, organization and evaluation of information required for effective problem solving. Employing a randomized block design, a two-treatment experiment to test this claim was devised, involving fourteen paired classrooms and 665 seventh- through twelfth-grade students. The treatments and treatment materials were adapted from commercially produced social studies curriculum data bases. The computer-using/structured-activities treatment group achieved significantly higher mean scores than the non-computer-using/non-structured-activities group on a 14-item power test of selected information-processing skills (effect size = .27). The difference persisted when verbal ability and grade level were controlled. Suggestions for further research are proposed, and implications for instructional methodology, curriculum development, and conceptions of pre-college computer literacy are discussed.
5

HOLZRICHTER, MICHAEL, and SUELY OLIVEIRA. "A GRAPH BASED DAVIDSON ALGORITHM FOR THE GRAPH PARTITIONING PROBLEM." International Journal of Foundations of Computer Science 10, no. 02 (June 1999): 225–46. http://dx.doi.org/10.1142/s0129054199000162.

Abstract:
The problem of partitioning a graph such that the number of edges incident to vertices in different partitions is minimized arises in many contexts. Examples include its recursive application for minimizing fill-in in matrix factorizations and load-balancing for parallel algorithms. Spectral graph partitioning algorithms partition a graph using the eigenvector associated with the second smallest eigenvalue of a matrix called the graph Laplacian. The focus of this paper is the use of graph theory to compute this eigenvector more quickly.
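The problem setup the abstract describes, counting edges that cross a partition and forming the graph Laplacian whose second eigenvector spectral methods use, can be sketched in a few lines. This is only the setup (pure Python on a small dense example), not the Davidson-type eigensolver the paper develops:

```python
def laplacian(n, edges):
    """Graph Laplacian L = D - A as a dense list-of-lists matrix."""
    L = [[0] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1   # degree on the diagonal
        L[v][v] += 1
        L[u][v] -= 1   # minus adjacency off the diagonal
        L[v][u] -= 1
    return L

def edge_cut(edges, part):
    """Number of edges whose endpoints lie in different partitions
    (the quantity a graph partitioner minimizes)."""
    return sum(1 for u, v in edges if part[u] != part[v])

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
L = laplacian(4, edges)
cut = edge_cut(edges, {0: "A", 1: "A", 2: "B", 3: "B"})
```

A spectral partitioner would split vertices by the sign of the components of the eigenvector for the second smallest eigenvalue of `L` (the Fiedler vector).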
6

Wang, Xin Jie. "Research and Application of Lost Foam Technology of Goods Train Low-Carbon Alloy Steel Casting." Advanced Materials Research 284-286 (July 2011): 2340–49. http://dx.doi.org/10.4028/www.scientific.net/amr.284-286.2340.

Abstract:
By applying an integrated set of technologies (computer-aided three-dimensional design, computer-aided numerical simulation of the solidification process, and computer-aided NC-programming manufacture), and by optimizing the product pattern structure, the foundry process parameters, the die auxiliary equipment and tooling, the riser and gating system, and the NC cutter-location source file for the die cavity, a lost foam casting process for the Z8A goods-train low-carbon alloy adapter has been developed. Compared with ordinary sand mold casting, lost foam technology reduced the weight of each casting by 3 kg; casting weight precision reached grade MT7, casting dimensional precision reached grade CT8, and the foundry processing yield reached 65%. Casting defects that easily appear in volume production are analysed and solutions are offered in this article.
7

Ali, Faridah, and Imtiaz Ahmad. "Register-file allocation via graph coloring." Microprocessors and Microsystems 21, no. 9 (April 1998): 523–32. http://dx.doi.org/10.1016/s0141-9331(98)00047-7.

8

Jiang, Ying, and Jie Liu. "Computer Aided Design on Thin-Film Evaporator in Parametric Drawing." Applied Mechanics and Materials 201-202 (October 2012): 267–70. http://dx.doi.org/10.4028/www.scientific.net/amm.201-202.267.

Abstract:
This article introduces parameterized drawing technology for complete thin-film evaporators through a classification of parameterized drawing strategies and, taking the drawing of evaporator parts as an example, describes in detail the strategies for parametric part drawings and for the assembly drawing. Aided-design software for thin-film evaporators helps users achieve integrated process design, can greatly shorten the product development cycle, improve design quality, and enhance the competitiveness of their products.
9

Diomande, Bre Moussa, and Luc Laperrière. "AUTOMATIC LIAISON MODEL GENERATION FROM 3-D SOLID MODELS." Transactions of the Canadian Society for Mechanical Engineering 20, no. 4 (December 1996): 333–47. http://dx.doi.org/10.1139/tcsme-1996-0019.

Abstract:
Computer Aided Assembly Planning (CAAP) is characterized by the manual generation of a liaison (or graph) model used to represent explicitly mating information among parts. This paper describes an automatic method for generating such relational information. It is based on an exhaustive and systematic surface-level analysis of the Boundary Representation (B-Rep) file of the solid-modeled product. Simple mathematical tests performed on pairs of surfaces, each on a different part, enable the identification of mating surfaces. The system which performs this analysis also enables a visual display of the mating information in the form of a graph model. This paper describes how mating surfaces are identified from more elemental information found in the B-Rep file and also provides examples of the automatically generated graph model for some products.
10

Rowley, Peter, and Paul Taylor. "Involutions in Janko’s simple group J4." LMS Journal of Computation and Mathematics 14 (November 1, 2011): 238–53. http://dx.doi.org/10.1112/s1461157009000540.

Abstract:
In this paper we determine the suborbits of Janko's largest simple group in its conjugation action on each of its two conjugacy classes of involutions. We also provide matrix representatives of these suborbits in an accompanying computer file. These representatives are used to investigate a commuting involution graph for J4. Supplementary materials are available with this article.
11

Orsi, Nicolas M., Tathagata Dasgupta, Satabhisa Mukhopadhyay, Michele Cummings, and Angelene Berwick. "A novel, computer-aided, scanning platform agnostic solution for grading carcinomas in breast biopsy whole slide images." Journal of Clinical Oncology 39, no. 15_suppl (May 20, 2021): e12575-e12575. http://dx.doi.org/10.1200/jco.2021.39.15_suppl.e12575.

Abstract:
e12575 Background: The determination of breast cancer grade remains a problematic diagnostic issue in biopsies due to limited tissue sampling and the inherent inter-observer variability associated with the Nottingham (histologic) score. Significantly, biopsy grading corresponds only moderately with that based on excision specimens, with discordance rates of 21-30%, which is a concern for cases managed neoadjuvantly or with minimally ablative therapy. Although the advent of digital pathology has encouraged endeavors to automate breast cancer detection and grading, no method has yet received regulatory approval for clinical use or proven to be a platform-agnostic solution. Methods: This study was the blinded clinical validation of a novel, FDA breakthrough-designation-approved automated device-based predictive solution which uses a surrogate scale to determine breast carcinoma grade in core biopsies using H&E whole slide images (WSIs) only. Non-preselected malignant breast core biopsy clinical cases (n=173 WSIs; 107 cases) covering a broad spectrum of breast cancer morphological subtypes were scanned at x20 magnification on both Aperio high-throughput T2/T3 systems (.SVS files) and Roche-Ventana DP200 scanners (.BIF files). The diagnostic gold-standard reference was the reports of tertiary referral center breast subspecialty consultant histopathologists. Diagnostic outputs were also compared to case-matched resection specimen grades. Agreement with the pathologists' gold standard was assessed using inter-observer agreement (Cohen's kappa) with 95% CI and a statistically significant chi-squared test p-value for the two scanner platforms. Results: Although biopsy-based diagnostic concordance with pathologists was very good, the device delivered much better concordance between case-matched biopsy and resection specimens (accuracy: 95%, Cohen's kappa: 0.91) than the pathologists' assessments (accuracy: 80%). Conclusions: This indicates that the use of a single continuous surrogate grading scale is much less affected by limited tissue sampling than conventional ordinal morphological scales. This level of performance was independent of the file format analyzed, demonstrating that the device is platform agnostic.
12

Waters, Austin, and Risto Miikkulainen. "GRADE: Machine Learning Support for Graduate Admissions." AI Magazine 35, no. 1 (March 21, 2014): 64. http://dx.doi.org/10.1609/aimag.v35i1.2504.

Abstract:
This article describes GRADE, a statistical machine learning system developed to support the work of the graduate admissions committee at the University of Texas at Austin Department of Computer Science (UTCS). In recent years, the number of applications to the UTCS PhD program has become too large to manage with a traditional review process. GRADE uses historical admissions data to predict how likely the committee is to admit each new applicant. It reports each prediction as a score similar to those used by human reviewers, and accompanies each by an explanation of what applicant features most influenced its prediction. GRADE makes the review process more efficient by enabling reviewers to spend most of their time on applicants near the decision boundary and by focusing their attention on parts of each applicant’s file that matter the most. An evaluation over two seasons of PhD admissions indicates that the system leads to dramatic time savings, reducing the total time spent on reviews by at least 74 percent.
13

Yang, Xue-Jun, Yu Deng, Li Wang, Xiao-Bo Yan, Jing Du, Ying Zhang, Gui-Bin Wang, and Tao Tang. "SRF Coloring: Stream Register File Allocation via Graph Coloring." Journal of Computer Science and Technology 24, no. 1 (January 2009): 152–64. http://dx.doi.org/10.1007/s11390-009-9211-x.

14

Zhang, Yipin, Xiaolin Chang, Yuzhou Lin, Jelena Misic, and Vojislav B. Misic. "Exploring Function Call Graph Vectorization and File Statistical Features in Malicious PE File Classification." IEEE Access 8 (2020): 44652–60. http://dx.doi.org/10.1109/access.2020.2978335.

15

Rowley, Peter, and Ben Wright. "Structure of the maximal 2-local geometry point-line collinearity graph." LMS Journal of Computation and Mathematics 19, no. 1 (2016): 105–54. http://dx.doi.org/10.1112/s1461157016000036.

Abstract:
The point-line collinearity graph ${\mathcal{G}}$ of the maximal 2-local geometry for the largest simple Fischer group, $Fi_{24}^{\prime}$, is extensively analysed. For an arbitrary vertex $a$ of ${\mathcal{G}}$, the $i$th disc of $a$ is described in detail. As a consequence, it follows that ${\mathcal{G}}$ has diameter $5$. The collapsed adjacency matrix of ${\mathcal{G}}$ is given, as well as accompanying computer files which contain a wealth of data about ${\mathcal{G}}$. Supplementary materials are available with this article.
16

Yang, Qian Qian, Ying Jun Chen, and Huang Ping. "A Novel Method to Determine EHL Film Thickness with Optical Interference." Applied Mechanics and Materials 456 (October 2013): 549–54. http://dx.doi.org/10.4028/www.scientific.net/amm.456.549.

Abstract:
In this work, a new method to determine the interference grade of the central film is put forward. The interference figure is generated between a steel ball and a glass disc with green monochromatic two-beam interferometry. As the rotational speed of the glass disc is gradually increased, the corresponding interference intensities of the central contact area are continually collected and sent to the computer. From these data, the periodic curve of the central interference grey value against the sliding speed is drawn. The curve is then compared with the theoretical result so that the central interference grade of each figure can be determined. It was found that while the sliding speed varies from 0.09 m/s to 0.21 m/s, the central area of the interference figure is at the first interference grade under the working conditions of the present paper. As the sliding speed increases from 0.21 m/s to 0.4 m/s, the second grade of the central interference figure is reached. Once the central interference grade is known, the film thickness and film shape can easily be determined from the interference picture of the EHL film and the relative optical intensity principle.
17

Gyimesi, Péter. "Automatic Calculation of Process Metrics and their Bug Prediction Capabilities." Acta Cybernetica 23, no. 2 (2017): 537–59. http://dx.doi.org/10.14232/actacyb.23.2.2017.7.

Abstract:
Identifying fault-prone code parts is useful for developers because it helps reduce the time required for locating bugs. It is usually done by characterizing the already known bugs with certain kinds of metrics and building a predictive model from the data. For the characterization of bugs, software product and process metrics are the most popular ones. The calculation of product metrics is supported by many free and commercial software products; however, tools that are capable of computing process metrics are quite rare. In this study, we present a method of computing software process metrics in a graph database. We describe the schema of the database created and present a way to readily obtain the process metrics from it. With this technique, process metrics can be calculated at the file, class and method levels. We used GitHub as the source of the change history and selected 5 open-source Java projects for processing. To retrieve positional information about the classes and methods, we used SourceMeter, a static source code analyzer tool. We used Neo4j as the graph database engine, and its query language, Cypher, to obtain the process metrics. We published the tools we created as open-source projects on GitHub. To demonstrate the utility of our tools, we selected 25 release versions of the 5 Java projects and calculated the process metrics for all of the source code elements (files, classes and methods) in these versions. Using our previously published bug database, we built bug databases for the selected projects that contain the computed process metrics and the corresponding bug numbers for files and classes. (We published these databases as an online appendix.) We then applied 13 machine learning algorithms to the resulting database to find out whether it is feasible for bug prediction purposes. We achieved F-measure values on average of around 0.7 at the class level, and slightly better values of between 0.7 and 0.75 at the file level. The best-performing algorithm was the RandomForest method in both cases.
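To illustrate what a process metric is, the sketch below computes two simple ones, modification count and distinct-author count per file, from an in-memory change history. This is a toy stand-in with an assumed commit format; the paper itself derives its metrics with Cypher queries over a Neo4j graph:

```python
from collections import defaultdict

def process_metrics(commits):
    """Compute two simple process metrics per file from a change
    history: number of modifications and number of distinct authors.
    `commits` is a list of (author, [changed_files]) pairs."""
    mods = defaultdict(int)
    authors = defaultdict(set)
    for author, files in commits:
        for f in files:
            mods[f] += 1          # every touch counts as a modification
            authors[f].add(author)
    return {f: {"modifications": mods[f], "authors": len(authors[f])}
            for f in mods}

history = [
    ("alice", ["src/A.java", "src/B.java"]),
    ("bob",   ["src/A.java"]),
    ("alice", ["src/A.java"]),
]
metrics = process_metrics(history)
```

Per-file values like these, joined with bug counts, form exactly the kind of table a bug prediction model is trained on.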
18

Cho, H., A. Derebail, T. Hale, and R. A. Wysk. "A Formal Approach to Integrating Computer-Aided Process Planning and Shop Floor Control." Journal of Engineering for Industry 116, no. 1 (February 1, 1994): 108–16. http://dx.doi.org/10.1115/1.2901800.

Abstract:
A formal approach for integrating Computer-Aided Design (CAD), Computer-Aided Process Planning (CAPP), and shop floor control for rotational components is presented in this paper. It is assumed that this approach will be implemented within the framework of a three level hierarchical CIM architecture that consists of the following levels in the hierarchy: shop floor, workstation and equipment (Joshi et al., 1991). Our approach to CAPP consists of machining feature identification, definition, classification, representation, and reasoning, provided through a CAD model of a product. Geometric entities are identified from a Drawing Exchange Format (DXF) file. The identified entities form the basis for the construction of primitive manufacturing features. The primitive features are assembled together, based upon the precedence among features, into a graph called a feature graph. However, the primitive features may or may not be manufacturable in terms of depth of cut, tool geometry, surface finish, and material handling required. Hence it is necessary to convert the feature graph into a manufacturing task graph, which consists of specifications of alternative functional tasks that are manufacturable. The task graph may be converted into a hierarchical set of process plans, based on the planning criteria at each level in the control hierarchy, to reflect the processing requirements at each level. The shop planning function decomposes the task graph into a set of workstation level plans. Each workstation level plan is aggregated into a set of equipment level process plans by the workstation planning function. The equipment level plan is converted into a unique task sequence by the equipment planning function. This sequence is then executed according to specifications by the equipment level execution function. Provision of alternative routes in process plans provides for flexible means of on-line planning and control.
19

Yang, Kang, Huiqun Yu, Guisheng Fan, and Xingguang Yang. "Graph embedding code prediction model integrating semantic features." Computer Science and Information Systems 17, no. 3 (2020): 907–26. http://dx.doi.org/10.2298/csis190908027y.

Abstract:
With the advent of Big Code, code prediction has received widespread attention. However, state-of-the-art code prediction techniques are inadequate in terms of accuracy, interpretability and efficiency. Therefore, in this paper, we propose a graph embedding model that integrates code semantic features. The model extracts the structural paths between the nodes in a source code file's Abstract Syntax Tree (AST). We then convert the paths into a training graph and extract interdependent semantic structural features from the context of the AST. Semantic structure features can filter predicted candidate values and effectively solve the Out-of-Word (OoV) problem. The graph embedding model converts the structural features of nodes into vectors, which facilitates quantitative calculations. Finally, the vector similarity of the nodes is used to complete the prediction tasks of TYPE and VALUE. Experimental results show that, compared with the existing state-of-the-art method, our method has higher prediction accuracy and lower time consumption.
20

Wang, Chunfeng, Caigang Peng, Yepo Hou, and Minmin Chen. "Augmented Reality Research of Measuring X-Ray Dental Film Alveolar Bone Based on Computer Image Analysis System." Journal of Healthcare Engineering 2021 (March 17, 2021): 1–11. http://dx.doi.org/10.1155/2021/5571862.

Abstract:
An important application of computer imaging technology in the medical field is as a necessary auxiliary method for clinical diagnosis and treatment. At present, many people are affected by various factors and have various problems caused by the dental alveolar bone. Traditional treatment methods are complex and lengthy and can damage body tissue. Given this problem, this paper takes the augmented reality measurement of X-ray dental film as its research object. Based on an in-depth measurement algorithm in a computer image analysis system, two three-dimensional reconstruction methods are proposed, one based on the center of gravity and one on the matching of the frontal and lateral views. These two methods need only two X-rays of the front and side of the dental film; the three-dimensional parameters are obtained through calculation and analysis of the X-ray films, and these parameters are used to fit the dental alveolar bone model. The experimental results prove that the computer-based image analysis system is highly effective for the measurement of X-ray dental film alveolar bone, with a positive correlation coefficient of 0.87. Compared with complications caused by other methods, the proportion of people with dental film alveolar bone injury is about 15%; after treatment, the functional recovery rate reaches more than 80%. Studies have found a large difference in the ages of the population that needs treatment of dental slices and alveolar bone: patients are generally aged under 20 or over 60. This shows that the measurement of X-ray dental film alveolar bone based on a computer image analysis system can play an important role in protecting people's oral health.
21

Oshima, Jun. "Differences in Knowledge-Building between Two Types of Networked Learning Environments: An Information-Flow Analysis." Journal of Educational Computing Research 19, no. 3 (October 1998): 329–51. http://dx.doi.org/10.2190/yllx-m9cw-15x9-bjj9.

Abstract:
The main purpose of the present study was to examine how elementary school students improve their scientific discourse in a computer-networked database environment called Computer-Supported Intentional Learning Environments (CSILE). Students in two combined fifth- and sixth-grade classrooms taught by the same teacher engaged in computer-mediated collaborative learning for five weeks using different system configurations: single-note based (S-CSILE) vs. discussion-note based (D-CSILE). Tracking files automatically recorded by the system were used for an analysis of students' learning activities. Results showed that the system affordance specially designed for joint written discourse in D-CSILE significantly facilitated students' joint knowledge-transformation activities as well as maintaining each individual's activity in pursuing her own agenda. The importance of such a system affordance is discussed from the perspective of distributed cognition.
22

Dereniowski, Dariusz, and Adam Stański. "On Tradeoffs Between Width- and Fill-like Graph Parameters." Theory of Computing Systems 63, no. 3 (July 28, 2018): 450–65. http://dx.doi.org/10.1007/s00224-018-9882-1.

23

Allison, Tanine. "Race and the digital face: Facial (mis)recognition in Gemini Man." Convergence: The International Journal of Research into New Media Technologies 27, no. 4 (July 29, 2021): 999–1017. http://dx.doi.org/10.1177/13548565211031041.

Abstract:
Ang Lee’s 2019 film Gemini Man features the most realistic digital human to grace the cinematic screen, specifically a computer-generated version of young Will Smith who battles his more aged self throughout the film. And for the first time in film history, this photorealistic digital human is Black. This essay explores why this groundbreaking achievement has not been acknowledged or celebrated by the film's production or publicity teams. I argue that Will Smith’s particular “post-racial” identity mediates contemporary concerns related to the racialized implications of facial recognition and other digital imaging technologies, as well as to the future of the film industry in the digital age. In the second half of the essay, I examine how the appearance of Will Smith in deepfake parody videos illustrates how race circulates on screens of various media formats. I conclude with a call to use digital visual effects, deepfake tools, and other advanced technologies to further racial justice instead of repeating the problematic usage of the past.
24

Choi, Sunoh. "Malicious Powershell Detection Using Graph Convolution Network." Applied Sciences 11, no. 14 (July 12, 2021): 6429. http://dx.doi.org/10.3390/app11146429.

Abstract:
The internet's rapid growth has resulted in an increase in the number of malicious files. Recently, PowerShell scripts and Windows portable executable (PE) files have been used in malicious behaviors. To address these problems, artificial intelligence (AI) based malware detection methods have been widely studied. Among AI techniques, the graph convolution network (GCN) was recently introduced. Here, we propose a malicious PowerShell detection method using a GCN. To use the GCN, we needed an adjacency matrix, so we propose an adjacency matrix generation method using the Jaccard similarity. In addition, we show that the malicious PowerShell detection rate is increased by approximately 8.2% using the GCN.
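A minimal sketch of the adjacency-matrix idea, under the assumption that each script sample is represented by a set of extracted tokens (the paper's actual feature extraction is not reproduced here): two samples are connected when their Jaccard similarity reaches a threshold.

```python
def jaccard(a, b):
    """Jaccard similarity of two token sets: |a ∩ b| / |a ∪ b|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def adjacency(samples, threshold=0.5):
    """Adjacency matrix over samples: an edge connects two samples
    whose token-set Jaccard similarity reaches the threshold."""
    n = len(samples)
    A = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j and jaccard(samples[i], samples[j]) >= threshold:
                A[i][j] = 1
    return A

# Hypothetical token sets extracted from three scripts.
scripts = [
    {"Invoke-WebRequest", "IEX", "DownloadString"},
    {"Invoke-WebRequest", "IEX", "Start-Process"},
    {"Get-ChildItem", "Sort-Object"},
]
A = adjacency(scripts, threshold=0.5)
```

A matrix like `A`, together with per-node feature vectors, is exactly the input a graph convolution layer consumes.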
25

Yang, Xuejun, Li Wang, Jingling Xue, Yu Deng, and Ying Zhang. "Comparability graph coloring for optimizing utilization of stream register files in stream processors." ACM SIGPLAN Notices 44, no. 4 (February 14, 2009): 111–20. http://dx.doi.org/10.1145/1594835.1504195.

26

Wang, Baoji, Rongqing Zhang, Chen Chen, Xiang Cheng, Liuqing Yang, Hang Li, and Ye Jin. "Graph-Based File Dispatching Protocol With D2D-Enhanced UAV-NOMA Communications in Large-Scale Networks." IEEE Internet of Things Journal 7, no. 9 (September 2020): 8615–30. http://dx.doi.org/10.1109/jiot.2020.2994549.

27

Wang, Jihua, and Huayu Wang. "A study of 3D model similarity based on surface bipartite graph matching." Engineering Computations 34, no. 1 (March 6, 2017): 174–88. http://dx.doi.org/10.1108/ec-10-2015-0315.

Abstract:
Purpose: This study aims to compute 3D model similarity by extracting and comparing shape features from neutral files.
Design/methodology/approach: In this work, the clear-text encoding document STEP (Standard for the Exchange of Product model data) of 3D models was analysed, and the models were characterized by two-depth trees consisting of both surface and shell nodes. All surfaces in the STEP files can be subdivided into three kinds: free, analytical and loop surfaces. Surface similarity is defined by the variation coefficients of distances between data points on two surfaces; the shell similarity and 3D model similarity are then determined using an optimal algorithm for bipartite graph matching.
Findings: This approach is used to experimentally verify the effectiveness of the 3D model similarity algorithm.
Originality/value: The novelty of this study lies in the computation of 3D model similarity by comparison of all surfaces. The study also makes several key observations: surfaces carry the most information concerning the functions and attributes of a 3D model, so similarity between surfaces yields more comprehensive content (both external and internal); semantic-based 3D retrieval can be obtained under the premise of comparing surface semantics; and more accurate similarity of 3D models can be obtained using the optimal bipartite graph matching algorithm over all surfaces.
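The final matching step can be sketched as follows, assuming a precomputed surface-similarity matrix (the variation-coefficient computation itself is not reproduced). This brute-force version enumerates permutations, whereas a real system would use an optimal bipartite matching algorithm such as the Hungarian method:

```python
from itertools import permutations

def best_matching(sim):
    """Exhaustive maximum-weight bipartite matching: sim[i][j] is the
    similarity of surface i of model A to surface j of model B.
    Returns the matched pairs and the average matched similarity.
    Only practical for small surface counts."""
    n = len(sim)
    best, best_score = None, -1.0
    for perm in permutations(range(n)):
        score = sum(sim[i][perm[i]] for i in range(n))
        if score > best_score:
            best, best_score = perm, score
    return list(enumerate(best)), best_score / n

# Hypothetical similarity matrix for two models with three surfaces each.
sim = [
    [0.9, 0.1, 0.3],
    [0.2, 0.8, 0.4],
    [0.1, 0.5, 0.7],
]
pairs, model_similarity = best_matching(sim)
```

The average matched similarity then serves as the overall 3D model similarity score.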
APA, Harvard, Vancouver, ISO, and other styles
28

Cui, Zhihao, and Chaobing Yan. "Deep Integration of Health Information Service System and Data Mining Analysis Technology." Applied Mathematics and Nonlinear Sciences 5, no. 2 (December 21, 2020): 443–52. http://dx.doi.org/10.2478/amns.2020.2.00063.

Full text
Abstract:
The scale and complexity of health information service systems have increased dramatically, and their development activities and management are difficult to control. In this field, traditional methods and simple mathematical statistics cannot cope with the explosive growth of data and information, which ultimately harms health information service system management. So it is particularly important to find valuable information in the source code, design documents and collected software datasets and to use it to guide the development and maintenance of software engineering. Therefore, some experts and scholars want to use mature data mining technologies to study the large amount of data generated in software engineering projects (commonly referred to as the software knowledge base), and further explore the potential and valuable information hidden behind the software data. This article first gives a brief overview of the relevant knowledge of data mining technology and computer software technology, using a decision tree graph mining algorithm to mine the function adjustment graph of the software system's defined classes, after which source code annotations are added to the relevant calling relationships. Data mining technology and computer software technology are deeply integrated, and the decision tree algorithm in data mining is used to mine the knowledge base of computer software. Potential defect changes are listed as key maintenance objects. The historical versions of source code change files with defects are found dynamically and corrected in time, to avoid increased maintenance costs in the future.
APA, Harvard, Vancouver, ISO, and other styles
29

Rush, Perry O., and William R. Boone. "The Implementation of Virtual Instruction in Relation to X-ray Anatomy and Positioning in a Chiropractic Degree Program: A Descriptive Paper." Journal of Chiropractic Education 23, no. 1 (April 1, 2009): 40–46. http://dx.doi.org/10.7899/1042-5055-23.1.40.

Full text
Abstract:
This article provides information regarding the introduction of virtual education into classroom instruction, wherein a method of classroom instruction was developed with the use of a computer, digital camera, and various software programs. This approach simplified testing procedures, thus reducing institutional costs substantially by easing the demand for manpower, and seemed to improve average grade performance. Organized files with hundreds of digital pictures have created a range of instructor resources. Much of the new course material was organized onto compact disks to complement course notes. Customizing presentations with digital technology holds potential benefits for students, instructors and the institution.
APA, Harvard, Vancouver, ISO, and other styles
30

Edwards, Nicholas Jain, David Tonny Brain, Stephen Carinna Joly, and Mariana Karry Masucato. "Hadoop distributed file system mechanism for processing of large datasets across computers cluster using programming techniques." International research journal of management, IT and social sciences 6, no. 6 (September 7, 2019): 1–16. http://dx.doi.org/10.21744/irjmis.v6n6.739.

Full text
Abstract:
In this paper, we show that the performance of HDFS I/O operations is increased by integrating set associativity into the cache design and by changing the pipeline topology to a fully connected digraph network topology. In the read operation, since the cache offers a huge number of candidate locations (words) compared to direct mapping, the miss ratio is very low, reducing the swapping of data between main memory and cache memory. This increases the performance of memory I/O operations. In the write operation, instead of using the sequential pipeline, we construct the fully connected graph using the data blocks listed in the NameNode metadata. In the sequential pipeline, the data is first copied to the source node in the pipeline. The source node then copies the data to the next data block in the pipeline, and the same copy process continues until the last data block in the pipeline. The acknowledgment process follows the same path back from the last block to the source block. The time required to transfer the data to all the data blocks in the pipeline plus the acknowledgment process is therefore almost 2n times the time to copy data from one data block to another (if the replication factor is n).
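The timing argument in this abstract can be sketched numerically (an illustrative model only, not HDFS code; the unit copy time and the assumption that an acknowledgment costs the same as a copy are made up for the example):

```python
# Compare the sequential replication pipeline with a fully connected
# fan-out, per the abstract's reasoning. t_copy is the time to copy
# one block between two nodes; acks are assumed to cost one hop each.

def pipeline_time(n, t_copy=1.0):
    """Sequential pipeline: data hops through n replicas one by one,
    then acknowledgments travel back, so total is roughly 2*n*t_copy."""
    return 2 * n * t_copy

def fanout_time(n, t_copy=1.0):
    """Fully connected fan-out: the source transfers to all n replicas
    in parallel and acks return in parallel (one hop each way)."""
    return 2 * t_copy

for n in (3, 5, 10):
    print(n, pipeline_time(n), fanout_time(n))
```

For the default HDFS replication factor of 3, the model gives 6 time units for the pipeline versus 2 for the parallel fan-out.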
APA, Harvard, Vancouver, ISO, and other styles
31

Ravdin, P. M., and G. J. Davis. "A computer program designed to assist in NSCLC adjuvant therapy decision making." Journal of Clinical Oncology 24, no. 18_suppl (June 20, 2006): 7230. http://dx.doi.org/10.1200/jco.2006.24.18_suppl.7230.

Full text
Abstract:
7230 Background: The goal of the computer program, Adjuvant! for NSCLC, is to provide the health professional and the NSCLC patient with information that might be helpful when deciding about adjuvant chemotherapy for completely resected Stage 1 to Stage 3A disease. The 4 elements of this decision are: estimates of prognosis, estimates of the efficacy of adjuvant chemotherapy, estimates of competing mortality, and estimates of toxicity. Methods: Estimates of prognosis are based on an analysis of 17,130 patients in the SEER registry. Estimates of 5 year disease specific mortality for Stage 1A, 1B, 2A, 2B, and 3A disease are 22%, 37%, 51%, 63%, and 70%. Multivariate analysis justifies using tumor size, histologic grade, and BAC histology for refinement of prognostic estimates for Stage 1 patients. A small subgroup of Stage 1 patients (with tumors < 10mm, low grade, BAC histology) could be identified with a 5 year disease specific mortality risk of <10%. Estimates of the efficacy of chemotherapy were derived from meta-analyses. A proportional risk reduction of 20% is used by the program as the default efficacy estimate for platinum-based therapy. Pop-ups and help files discuss areas of uncertainty and controversy. The user has the option to adjust the default efficacy estimate to the more optimistic efficacy estimates of some of the recent trials, or to take into account some of the uncertainty as to whether adjuvant chemotherapy works as well in Stage 1 disease. SEER data show the average NSCLC patient has competing mortality rates higher than would be suggested by chronologic age. Regimen specific risks of treatment related toxicity are given by the program. In the over 100 pages of help files there is a detailed discussion of the clinical evidence, adjuvant therapy guidelines, ongoing clinical trials, and corollary areas (radiotherapy, neoadjuvant therapy, biomarkers, etc.).
Results: The program projects that for patients with favorable Stage 1 tumors the absolute OS benefit of therapy may be as low as 1% (of the same size as the risk of treatment related mortality), but for patients with Stage 2 and 3A disease the 5 yr OS benefit may be ∼10%. Conclusions: The impact of this tool on physician and patient knowledge about the adjuvant therapy of NSCLC will be presented. [Table: see text]
APA, Harvard, Vancouver, ISO, and other styles
32

M. Ramos, Maria Cristina, and Janice E. Velasquez. "DESIGN AND DEVELOPMENT OF AN ONLINE EXAM MAKER AND CHECKER." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 10, no. 5 (August 20, 2013): 1598–640. http://dx.doi.org/10.24297/ijct.v10i5.4151.

Full text
Abstract:
The study is an online, computer aided tool that was designed primarily for the conduct of online examinations. The system was created using PHP, a web based scripting language, and MySQL as the database software. The system focuses on the automation of students' examinations: preparation, scheduling, checking and grading. A database is provided for the storage of exam questions, answers to questions and students' records. The system allows instructors to create an exam by entering questions with their corresponding answers into the database. Instructors are provided with three options on the type of exam; these include True or False, Multiple Choice and Fill in the Blanks. There are three account types based on the intended users. One is the Administrator Account; this can be used to create instructor accounts. It can also be used to delete or suspend other accounts based on activity status. The Instructor Account allows teachers to create student accounts and enroll the same. This account can also be used to create, activate, edit and delete exams and to monitor students' performances. The Student Account is for officially enrolled students, who can take exams and view scores even from previous examinations. This software allows instructors to keep track of students' performances from all exams, since the results are stored in a database linked to the online system. While taking the online exam, students can choose the number of exam questions that will be displayed on the screen at a given time. A student can take the exam only on the specified date and time set by the instructor. Ideally, a particular exam should be taken only once. In cases of retakes due to valid reasons and special exam considerations, the instructor is given the option to administer the previously activated exam, edit it, or create a new set of questions. One limitation, though: this online system is not meant to compute class performance for the final grade, since this requires other components such as seat work, graded recitations, laboratory activities, etc. It only computes and shows the scores from previous exams and the average.
APA, Harvard, Vancouver, ISO, and other styles
33

Sari, Sari Ali, and Kamaruddin Malik Mohamad. "Recent research in finding the optimal path by ant colony optimization." Bulletin of Electrical Engineering and Informatics 10, no. 2 (April 1, 2021): 1015–23. http://dx.doi.org/10.11591/eei.v10i2.2690.

Full text
Abstract:
The computation of the optimal path is one of the critical problems in graph theory. It has been utilized in a wide range of practical real-world applications, including image processing, file carving and classification problems. Numerous techniques have been proposed for finding optimal path solutions, including ant colony optimization (ACO). This is a nature-inspired metaheuristic algorithm, inspired by the foraging behavior of ants in nature. Thus, this paper studies the improvements many researchers have made to ACO in finding optimal path solutions. Finally, this paper also identifies the recent trends and explores potential future research directions in file carving.
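A minimal ACO sketch for shortest paths conveys the technique this survey covers (the graph, parameters, and function names here are invented for illustration, not any specific surveyed algorithm):

```python
import random

# Tiny weighted digraph; shortest A->D path is A-B-C-D with cost 4.
GRAPH = {
    "A": {"B": 2.0, "C": 5.0},
    "B": {"C": 1.0, "D": 4.0},
    "C": {"D": 1.0},
    "D": {},
}

def aco_shortest_path(graph, start, goal, n_ants=20, n_iters=30,
                      evaporation=0.5, q=1.0, seed=0):
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}  # pheromone
    best_path, best_cost = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            node, path, cost = start, [start], 0.0
            while node != goal:
                # Desirability = pheromone / edge length; avoid revisits.
                choices = [(v, tau[(node, v)] / graph[node][v])
                           for v in graph[node] if v not in path]
                if not choices:
                    break  # dead end, abandon this ant
                total = sum(w for _, w in choices)
                r, acc = rng.uniform(0, total), 0.0
                for v, w in choices:  # roulette-wheel selection
                    acc += w
                    if r <= acc:
                        break
                cost += graph[node][v]
                node = v
                path.append(v)
            if node == goal:
                tours.append((path, cost))
                if cost < best_cost:
                    best_path, best_cost = path, cost
        # Evaporate, then deposit pheromone proportional to tour quality.
        for e in tau:
            tau[e] *= (1.0 - evaporation)
        for path, cost in tours:
            for u, v in zip(path, path[1:]):
                tau[(u, v)] += q / cost
    return best_path, best_cost

path, cost = aco_shortest_path(GRAPH, "A", "D")
```

Pheromone reinforcement steers later ants toward the shorter tours sampled earlier, which is the core feedback loop the surveyed variants refine.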
APA, Harvard, Vancouver, ISO, and other styles
34

Bougoffa, Lazhar, Jun-Sheng Duan, and Randolph Rach. "Exact and approximate analytic solutions of the thin film flow of fourth-grade fluids by the modified Adomian decomposition method." International Journal of Numerical Methods for Heat & Fluid Flow 26, no. 8 (November 7, 2016): 2432–40. http://dx.doi.org/10.1108/hff-07-2015-0278.

Full text
Abstract:
Purpose The purpose of this paper is to first deduce a new form of the exact analytic solution of the well-known nonlinear second-order differential equation subject to a set of mixed nonlinear Robin and Neumann boundary conditions that model the thin film flows of fourth-grade fluids, and second to compare the approximate analytic solutions by the Adomian decomposition method (ADM) with the new exact analytic solution to validate its accuracy for parametric simulations of the thin film fluid flows, even for more complex models of non-Newtonian fluids in industrial applications. Design/methodology/approach The approach to calculating a new form of the exact analytic solution of thin film fluid flows rests upon a sequence of transformations including the modification of the classic technique due to Scipione del Ferro and Niccolò Fontana Tartaglia. Next the authors establish a lemma that justifies the new expression of the exact analytic solution for thin film fluid flows of fourth-grade fluids. Second, the authors apply a modification of the systematic ADM to quickly and easily calculate the sequence of analytic approximate solutions for this strongly nonlinear model of thin film flow of fourth-grade fluids. The ADM has been previously demonstrated to be eminently practical with widespread applicability to frontier problems arising in scientific and engineering applications. Herein, the authors seek to establish the relative merits of the ADM in the context of the thin film flows of fourth-grade fluids. Findings The ADM is shown to closely agree with the new expression of the exact analytic solution. The authors have calculated the error remainder functions and the maximal error remainder parameters in the error analysis to corroborate the solutions. 
The error analysis demonstrates the rapid rate of convergence and that we can approximate the exact solution as closely as we please; furthermore the rate of convergence is shown to be approximately exponential, and thus only a low-stage approximation will be adequate for engineering simulations as previously documented in the literature. Originality/value This paper presents an accurate work for solving thin film flows of fourth-grade fluids. The authors have compared the approximate analytic solutions by the ADM with the new expression of the exact analytic solution for this strongly nonlinear model. The authors commend this technique for more complex thin film fluid flow models.
APA, Harvard, Vancouver, ISO, and other styles
35

HOSAIN, MD SHAZZAD, and MUHAMMAD ABDUL HAKIM NEWTON. "MULTI-KEY INDEX FOR DISTRIBUTED DATABASE SYSTEM." International Journal of Software Engineering and Knowledge Engineering 15, no. 02 (April 2005): 433–38. http://dx.doi.org/10.1142/s0218194005002075.

Full text
Abstract:
In this paper we present a multi-key index model that enables us to search for a record with more than one attribute value in distributed database systems. Indices provide fast and efficient access to data and so are a major aspect of centralized database systems. Most centralized database systems use B+-trees or other types of index structures such as bit vectors, graph structures, grid files, etc. But in distributed database systems no index model is found in the literature. Therefore, efficient access is a major problem in distributed databases. Our proposed index model avoids the query-flooding problem of existing systems and thus optimizes network bandwidth.
APA, Harvard, Vancouver, ISO, and other styles
36

Wang, Zhanquan, Taoli Han, and Huiqun Yu. "Research of MDCOP mining based on time aggregated graph for large spatio-temproal data sets." Computer Science and Information Systems 16, no. 3 (2019): 891–914. http://dx.doi.org/10.2298/csis180828032w.

Full text
Abstract:
Discovering mixed-drove spatiotemporal co-occurrence patterns (MDCOPs) is important for network security, for example against distributed denial of service (DDoS) attacks. A DDoS attack usually exhibits many features, such as the server CPU being heavily occupied for a long time, bandwidth being consumed, and so on. In distributed cooperative intrusion, the feature information from multiple intrusion detection sources should be analyzed simultaneously to find the spatial correlation among the feature information. In addition to spatial correlation, intrusion also has temporal correlation: some invasions are gradually penetrating, and attacks are the result of cumulative effects over a period of time. So it is necessary to discover MDCOPs in network security. However, it is difficult to mine MDCOPs from large attack event data sets because mining MDCOPs is computationally very expensive. In information security, the set of candidate co-occurrence attack event data sets is exponential in the number of object-types, and the spatiotemporal data sets are too large to be managed in memory. To reduce the number of candidate co-occurrence instances, we present a computationally efficient MDCOP Graph Miner algorithm using a Time Aggregated Graph, which can deal with large attack event data sets by means of a file index. The correctness, completeness and efficiency of the proposed methods are analyzed.
APA, Harvard, Vancouver, ISO, and other styles
37

Rasheed, Amer, Rab Nawaz, Sohail Ahmed Khan, Hanifa Hanif, and Abdul Wahab. "Numerical study of a thin film flow of fourth grade fluid." International Journal of Numerical Methods for Heat & Fluid Flow 25, no. 4 (May 5, 2015): 929–40. http://dx.doi.org/10.1108/hff-06-2014-0188.

Full text
Abstract:
Purpose – The purpose of this paper is to study the thin film flow of a fourth grade fluid subject to slip conditions in order to understand its velocity profile. Design/methodology/approach – An exact expression for flow velocity is derived in terms of hyperbolic sine functions. The practical usage of the exact flow velocity is restrictive as it involves very complicated integrals. Therefore, an approximate solution is also derived using a Galerkin finite element method and numerical error analysis is performed. Findings – The behavior of fluid velocity with respect to various flow parameters is discussed. The results are not restrictive to small values of flow parameters unlike those obtained earlier using homotopy analysis method and homotopy perturbation method. Originality/value – An approximate solution based on finite element technique is derived.
APA, Harvard, Vancouver, ISO, and other styles
38

Liu, Yuansheng, and Jinyan Li. "Hamming-shifting graph of genomic short reads: Efficient construction and its application for compression." PLOS Computational Biology 17, no. 7 (July 19, 2021): e1009229. http://dx.doi.org/10.1371/journal.pcbi.1009229.

Full text
Abstract:
Graphs such as de Bruijn graphs and OLC (overlap-layout-consensus) graphs have been widely adopted for the de novo assembly of genomic short reads. This work studies another important problem in the field: how graphs can be used for high-performance compression of large-scale sequencing data. We present a novel graph definition named the Hamming-Shifting graph to address this problem. The definition originates from the technological characteristics of next-generation sequencing machines, aiming to link all pairs of distinct reads that have a small Hamming distance or a small shifting offset or both. We compute multiple lexicographically minimal k-mers to index the reads for an efficient search of the lightest-weight edges, and we prove a very high probability of successfully detecting these edges. The resulting graph creates a full mutual reference of the reads to cascade a code-minimized transfer of every child-read for an optimal compression. We conducted compression experiments on the minimum spanning forest of this extremely sparse graph, and achieved a 10-30% greater file size reduction compared to the best compression results using existing algorithms. As future work, the separation and connectivity degrees of these giant graphs can be used as economical measurements or protocols for quick quality assessment of wet-lab machines, for sufficiency control of genomic library preparation, and for accurate de novo genome assembly.
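The indexing idea can be sketched as follows (a hedged reconstruction from the abstract, not the authors' code; the reads and parameters are invented): bucketing each read under its few lexicographically smallest k-mers makes reads that differ by a small shift or a few substitutions likely to share a bucket, so buckets supply the candidate endpoints for light edges.

```python
from collections import defaultdict

def minimal_kmers(read, k, m=2):
    """The m lexicographically smallest k-mers of a read."""
    kmers = {read[i:i + k] for i in range(len(read) - k + 1)}
    return sorted(kmers)[:m]

def bucket_reads(reads, k=4, m=2):
    """Index each read under its m minimal k-mers."""
    buckets = defaultdict(list)
    for r in reads:
        for km in minimal_kmers(r, k, m):
            buckets[km].append(r)
    return buckets

# A shifted pair and a 1-substitution pair end up sharing a bucket.
reads = ["ACGTTGCA", "CGTTGCAA", "ACGTTGGA"]
buckets = bucket_reads(reads)
```

Only reads sharing a bucket need to be compared pairwise, avoiding the quadratic all-pairs edge search.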
APA, Harvard, Vancouver, ISO, and other styles
39

Lin, Yung Chin, Kuo Lan Su, and Chih Hung Chang. "Development of the Searching Algorithm with Complexity Environment for Mobile Robots." Applied Mechanics and Materials 284-287 (January 2013): 1826–30. http://dx.doi.org/10.4028/www.scientific.net/amm.284-287.1826.

Full text
Abstract:
The article addresses shortest path searching problems for a mobile robot in a complex unknown environment, and uses the mobile robot to present the movement scenario from the start point to the target point in a collision-free space. The complex environment contains various obstacles, such as roads, trees, rivers, gravel, grass, highways and unknown obstacles. We set a relative danger grade for each type of obstacle. The mobile robot searches for the target point, locates the positions of unknown obstacles, and avoids these obstacles while moving on the motion platform. We develop a user interface to help users fill in the positions of the mobile robot and the obstacles on the supervisory computer, such as the initial point of the mobile robot, the start point and the target point. The supervisory computer programs the motion paths of the mobile robot according to the A* searching algorithm, the flood-fill algorithm and the 2-opt exchange algorithm. The simulation results show that the proposed algorithms program the shortest motion paths from the initial point to the target point for the mobile robot. The supervisory computer controls the mobile robot, which follows the programmed motion path to the target point in the complex environment via a wireless radio frequency (RF) interface.
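An A* search over a grid of terrain danger grades, as the paper describes, can be sketched like this (the grid, cost values, and function names are invented for illustration; this is not the authors' implementation):

```python
import heapq

def a_star(grid, start, goal):
    """A* on a grid where each cell's value is its terrain danger
    grade (e.g. road=1, grass=2, gravel=3) and None is impassable."""
    rows, cols = len(grid), len(grid[0])
    def h(p):  # Manhattan distance, admissible when every cost >= 1
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start, [start])]
    seen = {}
    while open_heap:
        f, g, pos, path = heapq.heappop(open_heap)
        if pos == goal:
            return path, g
        if seen.get(pos, float("inf")) <= g:
            continue  # already expanded with an equal or better cost
        seen[pos] = g
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] is not None:
                ng = g + grid[nr][nc]  # cost of entering the cell
                heapq.heappush(open_heap,
                               (ng + h((nr, nc)), ng, (nr, nc), path + [(nr, nc)]))
    return None, float("inf")

grid = [[1, 1, 1],
        [1, None, 1],   # None = obstacle to route around
        [1, 1, 1]]
path, cost = a_star(grid, (0, 0), (2, 2))
```

Raising a cell's danger grade steers the planner around it without forbidding it outright, which matches the graded-obstacle idea above.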
APA, Harvard, Vancouver, ISO, and other styles
40

Voicu, Adrian Catalin, Ion Gheorghe Gheorghe, Liliana Laura Badita, and Adriana Cirstoiu. "3D Measuring of Complex Automotive Parts by Multiple Laser Scanning." Applied Mechanics and Materials 371 (August 2013): 519–23. http://dx.doi.org/10.4028/www.scientific.net/amm.371.519.

Full text
Abstract:
Three-dimensional scanning has been available for more than 15 years; however, few have heard of it and fewer still know the applications of this technology. 3D scanning is also known as 3D digitizing, the name coming from the fact that this is a process that uses a contact or non-contact digitizing probe to capture an object's form and recreate it in a virtual workspace through a very dense network of points (x, y, z) as a 3D graph representation. Based on this information, many new applications have been developed in many fields, such as the computer games industry, prosthetics, forensic medicine, and the arts and culture, but the most common area where scanning systems are used remains the automotive, aircraft and consumer goods industries. Most automotive manufacturers currently use 3D metrology based on optical or laser systems to validate product quality. The pieces are initially measured by 3D scanning and are then compared with the designed model (CAD file) using specialized software. Through this comparison the producer can intervene very quickly in the manufacturing process to remove the cause of defects, this technique being called Reverse Engineering (RE). The overall accuracy of a 3D acquisition system depends above all on the sensors' precision and on the acquisition device (acquisition with contact) or acquisition structure (acquisition without contact). This accuracy may vary from a micron to a millimeter, and the acquisition size from a few points to several thousand points per second. In a perfect world, or in an integrated production environment, 3D measuring systems should be able to measure all the necessary parameters in a single step without errors, and to deliver the results in the same way to manufacturing networks equipped with computers, in formats useful for machine control and process management.
APA, Harvard, Vancouver, ISO, and other styles
41

Nosrati, Masoud, and Mahmood Fazlali. "Community-based replica management in distributed systems." International Journal of Web Information Systems 14, no. 1 (April 16, 2018): 41–61. http://dx.doi.org/10.1108/ijwis-01-2017-0006.

Full text
Abstract:
Purpose One of the techniques for improving the performance of distributed systems is data replication, wherein new replicas are created to provide more accessibility, fault tolerance and lower access cost of the data. In this paper, the authors propose a community-based solution for the management of data replication, based on the graph model of communication latency between computing and storage nodes. Communities are the clusters of nodes within which the communication latencies are minimal. The purpose of this study is to use this method to minimize the latency and access cost of the data. Design/methodology/approach This paper used the Louvain algorithm for finding the best communities. In the proposed algorithm, when a node requests a file, the cost of accessing a file located outside the applicant's community is calculated and the results are accumulated. When the accumulated costs exceed a specified threshold, a new replica of the file is created in the applicant's community. Besides, the number of replicas of each file should be limited to prevent the system from creating useless and redundant data. Findings To evaluate the method, four metrics were introduced and measured, including communication latency, response time, data access cost and data redundancy. The results indicated acceptable improvement in all of them. Originality/value So far, this is the first research that aims at managing replicas via community detection algorithms. It opens many opportunities for further studies in this area.
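The accumulation rule described above can be sketched as follows (a hedged reconstruction from the abstract; community detection itself, e.g. Louvain, is assumed to have been run already, and all names, thresholds, and costs here are illustrative):

```python
from collections import defaultdict

class ReplicaManager:
    """Accumulate out-of-community access costs per (file, community);
    once a threshold is exceeded, create a replica in the requesting
    community, subject to a cap on replicas per file."""

    def __init__(self, threshold, max_replicas):
        self.threshold = threshold
        self.max_replicas = max_replicas
        self.replicas = defaultdict(set)    # file -> communities holding it
        self.acc_cost = defaultdict(float)  # (file, community) -> cost so far

    def request(self, file, community, access_cost):
        if community in self.replicas[file]:
            return False                     # served locally, no cost accrued
        key = (file, community)
        self.acc_cost[key] += access_cost
        if (self.acc_cost[key] > self.threshold
                and len(self.replicas[file]) < self.max_replicas):
            self.replicas[file].add(community)  # replicate here
            self.acc_cost[key] = 0.0
            return True
        return False

rm = ReplicaManager(threshold=10.0, max_replicas=2)
rm.replicas["report.dat"].add("C1")          # initial replica lives in C1
created = [rm.request("report.dat", "C2", access_cost=4.0) for _ in range(3)]
```

After three remote accesses from community C2 the accumulated cost (12.0) crosses the threshold and a local replica is created; later requests from C2 are then served locally.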
APA, Harvard, Vancouver, ISO, and other styles
42

Zinchenko, Mykhailo, Oleh Potap, Maria Rybalchenko, and Ivan Manachyn. "Modernization of laboratory rolling mill 150 for the formation of students 'studies of automation object research." System technologies 3, no. 134 (April 5, 2021): 87–98. http://dx.doi.org/10.34185/1562-9945-3-134-2021-10.

Full text
Abstract:
Studying the operation of automated control systems using computers significantly reduces the time required, but does not give a complete picture of the system on a real object. Therefore, the use of real objects for the study of control systems in the educational process is appropriate and useful. The purpose of the study is to modernize the laboratory rolling mill 150, designed for rolling lead, tin and plasticine, and to equip it with sensors and actuators. The manual pressure device of the rolling stand was replaced by an automated one, for which a worm gearbox was additionally installed, which allowed the total gear ratio to be increased to 94.5. The thrust screws are moved by an AC motor, which is controlled by the DOP-103BQ operator panel and the MS-300 frequency converter with built-in PLC. A PDF-3 photopulse sensor is used as the displacement sensor. The installed equipment and the developed software for the operator panel and the PLC provided high accuracy of installation of the pressure screws in the set position. Additionally, software was developed to measure the power parameters of the rolling process: the rolling force and the electrical parameters of the DC motor of the drive of the rolling stand. The software allows you to configure the board, i.e. select the type of board, the channels used to measure voltage signals, the measurement ranges, the signal color on the graph, the signal polling frequency, the number of points to display on the graph, and the parameters of the graph coordinate axes. During measurement, the signals are output to the monitor screen simultaneously. Before rolling, the measurement process is started using the keyboard or mouse and the change of parameters is displayed on the screen in real time. At the end of the rolling process, the measurement stops, and the graphs of parameter changes over time remain on the monitor screen, which allows you to quickly analyze the process. The measurement results can be saved in an Excel file, which can then be viewed; the file stores the time of each measurement and the values of the measured parameters. Measurement of the power parameters and the sizes of the rolled products before and after rolling made it possible to determine the stiffness of the stand and the rolled metal, which is necessary to calculate the transmission coefficients of the automated tuning system of the rolling stand.
APA, Harvard, Vancouver, ISO, and other styles
43

Hayat, T., M. Waqas, Sabir Ali Shehzad, and A. Alsaedi. "Mixed convection flow of viscoelastic nanofluid by a cylinder with variable thermal conductivity and heat source/sink." International Journal of Numerical Methods for Heat & Fluid Flow 26, no. 1 (January 4, 2016): 214–34. http://dx.doi.org/10.1108/hff-02-2015-0053.

Full text
Abstract:
Purpose – The purpose of this paper is to examine the effects of variable thermal conductivity in mixed convection flow of a viscoelastic nanofluid due to a stretching cylinder with heat source/sink. Design/methodology/approach – The authors have computed the existence of the solution for Walter's B and second grade fluids corresponding to Pr=0.5 and Pr=1.5. The skin-friction coefficient and local Nusselt and Sherwood numbers are computed numerically for different values of the emerging parameters. Findings – A comparative study with the existing solutions in a limiting sense is made and analyzed. The authors found that the dimensionless velocity field and momentum boundary layer thickness increase when the values of the viscoelastic parameter increase. The present non-Newtonian fluid flow reduces to the viscous flow in the absence of the viscoelastic parameter. Larger values of the viscoelastic parameter correspond to higher values of the local Nusselt and Sherwood numbers. Originality/value – No such analysis exists in the literature yet.
APA, Harvard, Vancouver, ISO, and other styles
44

Zheng, Weiwei, Huimin Yu, and Wei Huang. "Visual tracking via Graph Regularized Kernel Correlation Filer and Multi-Memory Voting." Journal of Visual Communication and Image Representation 55 (August 2018): 688–97. http://dx.doi.org/10.1016/j.jvcir.2018.08.004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Sun, Qitong, Jun Han, and Dianfu Ma. "A Framework for Service Semantic Description Based on Knowledge Graph." Electronics 10, no. 9 (April 24, 2021): 1017. http://dx.doi.org/10.3390/electronics10091017.

Full text
Abstract:
Constructing a large-scale service knowledge graph is necessary. We propose a method, namely semantic information extension, for service knowledge graphs. We rely on the service information described by the Web Services Description Language (WSDL); we design the ontology layer of the web service knowledge graph and construct the service graph, and, using the WSDL document data set, the generated service knowledge graph contains 3738 service entities. In particular, our method is especially effective in service discovery. To evaluate our approach, we conducted two sets of experiments, one exploring the relationships between services and one classifying services by their service descriptions. We constructed two experimental data sets, then designed and trained two different deep neural networks for the two tasks to extract the semantics of the natural language used in the service discovery task. In the prediction task of exploring the relationships between services, the prediction accuracy reached 95.1%, and in the service classification experiment, the TOP5 accuracy reached 60.8%. Our experience shows that the service knowledge graph has advantages over traditional file storage when additional semantic information must be managed, and the new service representation method is helpful for service discovery and composition tasks.
APA, Harvard, Vancouver, ISO, and other styles
46

Katsoufis, Petros, Maria Katsaiti, Christos Mourelas, Tatiana Santos Andrade, Vassilios Dracopoulos, Constantin Politis, George Avgouropoulos, and Panagiotis Lianos. "Study of a Thin Film Aluminum-Air Battery." Energies 13, no. 6 (March 20, 2020): 1447. http://dx.doi.org/10.3390/en13061447.

Full text
Abstract:
A thin film aluminum-air battery has been constructed using a commercial grade Al-6061 plate as anode electrode, an air-breathing carbon cloth carrying an electrocatalyst as cathode electrode, and a thin porous paper soaked with aqueous KOH as electrolyte. This type of battery demonstrates a promising behavior under ambient conditions of 20 °C temperature and around 40% humidity. It presents good electric characteristics when plain nanoparticulate carbon (carbon black) is used as electrocatalyst but it is highly improved when MnO2 particles are mixed with carbon black. Thus, the open-circuit voltage was 1.35 V, the short-circuit current density 50 mA cm−2, and the maximum power density 20 mW cm−2 in the absence of MnO2 and increased to 1.45 V, 60 mA cm−2, and 28 mW cm−2, respectively, in the presence of MnO2. The corresponding maximum energy yield during battery discharge was 4.9 mWh cm−2 in the absence of MnO2 and increased to 5.5 mWh cm−2 in the presence of MnO2. In the second case, battery discharge lasted longer under the same discharge conditions. The superiority of the MnO2-containing electrocatalyst is justified by electrode electrochemical characterization data demonstrating reduction reactions at higher potential and charge transfer with much smaller resistance.
APA, Harvard, Vancouver, ISO, and other styles
47

Kamola, Mariusz. "Sensitivity of Importance Metrics for Critical Digital Services Graph to Service Operators’ Self-Assessment Errors." Security and Communication Networks 2019 (September 23, 2019): 1–8. http://dx.doi.org/10.1155/2019/7510809.

Full text
Abstract:
Interdependency of critical digital services can be modeled as a graph with an exactly known structure but with edge weights subject to estimation errors. We use standard and custom centrality indexes to measure each service's vulnerability. The vulnerabilities of all nodes in the graph are aggregated in a number of ways into a single network vulnerability index for services whose operation is critical for the state. This study compares the sensitivity of various centralities, combined with various aggregation methods, to errors in the edge weights reported by service operators. We find that many of these combinations are quite robust and can be used interchangeably to reflect various perceptions of network vulnerability. We use graphs of source-file dependencies from a number of open-source projects as a good analogy for the real critical-services graph, which will remain confidential.
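A minimal sketch of the sensitivity experiment described above, under stated assumptions: a toy dependency graph, weighted-degree centrality standing in for the paper's custom indexes, and the maximum as the aggregation method. Edge weights are randomly perturbed to mimic operators' self-assessment errors:

```python
import random

# Per-node vulnerability: sum of incident edge weights (weighted degree).
def weighted_degree(edges, nodes):
    c = {n: 0.0 for n in nodes}
    for u, v, w in edges:
        c[u] += w
        c[v] += w
    return c

# Aggregate node vulnerabilities into one network index (here: the maximum).
def network_index(edges, nodes):
    return max(weighted_degree(edges, nodes).values())

nodes = ["dns", "auth", "billing", "portal"]
edges = [("portal", "auth", 0.9), ("portal", "dns", 0.7), ("auth", "billing", 0.4)]

# Perturb each weight by up to +/-10% to model self-assessment errors.
random.seed(0)
noisy = [(u, v, w * (1 + random.uniform(-0.1, 0.1))) for u, v, w in edges]
print(network_index(edges, nodes), network_index(noisy, nodes))
```

Comparing the exact and perturbed indexes over many random draws is one way to quantify the robustness the paper investigates.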
APA, Harvard, Vancouver, ISO, and other styles
48

Su, Ming-Yang, Hong-Siou Wei, Xin-Yu Chen, Po-Wei Lin, and Ding-You Qiu. "Using Ad-Related Network Behavior to Distinguish Ad Libraries." Applied Sciences 8, no. 10 (October 9, 2018): 1852. http://dx.doi.org/10.3390/app8101852.

Full text
Abstract:
Mobile app ads pose a far greater security threat to users than adverts in computer browsers. This is because app developers must embed a Software Development Kit (SDK) provided by ad networks (i.e., ad companies), called an ad library or ad lib for short, into their app program and then merge and compile it into an Android PacKage (APK) execution file. The ad lib thus becomes part of the entire app and shares all of the permissions granted to the app. Unfortunately, this has also resulted in many security issues, such as ad libs abusing permissions to collect and leak private data, or ad servers redirecting ad requests so that malicious JavaScript is downloaded from unknown servers and executed in the background of the mobile operating system without the user's consent. The more well-known an embedded ad lib, the safer the app may be, and vice versa. Importantly, while decompiling an APK to inspect its source code may not identify the ad lib(s), executing the app on a simulator can reveal the network behavior of the embedded ad lib(s). Ad libs exhibit different behavior patterns when communicating with ad servers. This study uses a dynamic analysis method to inspect an executing app and plots the advertisement-related behavior patterns of its ad libs into a graph. Comparisons of graph similarities then determine whether or not an ad lib comes from a trusted ad network.
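One simple stand-in for the graph-similarity comparison mentioned in this abstract is Jaccard similarity over edge sets; the paper's actual similarity measure may differ, and the edge sets below are invented:

```python
# Each ad library's network behavior is reduced to a set of (src, dst)
# edges; two behavior graphs are compared by Jaccard similarity.
def jaccard(edges_a, edges_b):
    a, b = set(edges_a), set(edges_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

known_lib = [("app", "ads.example.com"), ("ads.example.com", "cdn.example.com")]
observed  = [("app", "ads.example.com"), ("ads.example.com", "track.example.net")]
print(jaccard(known_lib, observed))  # ~0.33: one shared edge out of three
```

A similarity below some threshold against every known ad network's behavior graph would flag the embedded ad lib as untrusted.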
APA, Harvard, Vancouver, ISO, and other styles
49

Durak, Hatice Yildiz, and Tolga Guyer. "Programming with Scratch in primary school, indicators related to effectiveness of education process and analysis of these indicators in terms of various variables." Gifted Education International 35, no. 3 (June 6, 2019): 237–58. http://dx.doi.org/10.1177/0261429419854223.

Full text
Abstract:
Since programming involves different thinking skills and different fields of knowledge, it is especially important for children to acquire 21st-century skills. Although programming education activities are being intensively applied, there is a gap in quantitative research revealing the direct or indirect effectiveness of the learning-teaching processes used in programming education. This study, conducted to fill that gap, aims to examine the degree to which students learn programming concepts (PC) and to identify the variables that affect this process, using a curriculum developed for gifted students in the second to fourth grades of primary school. For this purpose, a 15-week application was carried out and each student developed an individual project. A criterion list based on PC, observation forms, and peer evaluations were used to examine the projects and the learning process. The scores obtained from these tools were used to examine each participant's work and to assess the effective variables and the adequacy of the teaching process. The evidence from this study suggests that female participants obtained higher scores than male participants in programming education, and that scores were higher among 9- and 10-year-old students than among others. Students who had no Internet access, had never used a computer, or had not taken any computer courses obtained lower scores than others. The upshot is that students' previous experience with computer technology may have affected the scores they obtained during the programming education process.
APA, Harvard, Vancouver, ISO, and other styles
50

Davis, M. J., J. Wythe, J. S. Rozum, and R. W. Gore. "Use of World Wide Web server and browser software to support a first-year medical physiology course." Advances in Physiology Education 272, no. 6 (June 1997): S1. http://dx.doi.org/10.1152/advances.1997.272.6.s1.

Full text
Abstract:
We describe the use of a World Wide Web (Web) server to support a team-taught physiology course for first-year medical students. Our objectives were to reduce the number of formal lecture hours and enhance student enthusiasm by using more multimedia materials and creating opportunities for interactive learning. On-line course materials, consisting of administrative documents, lecture notes, animations, digital movies, practice tests, and grade reports, were placed on a departmental computer with an Internet connection. Students used Web browsers to access on-line materials from a variety of computing platforms on campus, at home, and at remote sites. To assess use of the materials and their effectiveness, we analyzed 1) log files from the server, and 2) the results of a written course evaluation completed by all students. Lecture notes and practice tests were the most-used documents. The students' evaluations indicated that computer use in class made the lecture material more interesting, while the on-line documents helped reinforce lecture materials and the textbook. We conclude that the effectiveness of on-line materials depends on several different factors, including 1) the number of instructors that provide materials; 2) the quantity of other materials handed out; 3) the degree to which computer use is demonstrated in class and integrated into lectures; and 4) the ease with which students can access the materials. Finally, we propose that additional implementation of Internet-based resources beyond what we have described would further enhance a physiology course for first-year medical students.
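The server-log analysis mentioned in this abstract can be illustrated by counting document requests in Common Log Format lines, which is how one would find that lecture notes and practice tests were the most-used documents; the sample entries below are fabricated:

```python
from collections import Counter

# Count how often each document path appears in Common Log Format lines.
def count_documents(log_lines):
    hits = Counter()
    for line in log_lines:
        try:
            request = line.split('"')[1]   # e.g. 'GET /notes/lecture1.html HTTP/1.0'
            path = request.split()[1]
        except IndexError:
            continue                       # skip malformed lines
        hits[path] += 1
    return hits

log = [
    '127.0.0.1 - - [01/Jun/1997:10:00:00] "GET /notes/lecture1.html HTTP/1.0" 200 5120',
    '127.0.0.1 - - [01/Jun/1997:10:05:00] "GET /tests/practice1.html HTTP/1.0" 200 2048',
    '127.0.0.1 - - [01/Jun/1997:10:07:00] "GET /notes/lecture1.html HTTP/1.0" 200 5120',
]
print(count_documents(log).most_common(1))  # [('/notes/lecture1.html', 2)]
```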
APA, Harvard, Vancouver, ISO, and other styles