Academic literature on the topic 'DYNAMO (Computer program language)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'DYNAMO (Computer program language).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "DYNAMO (Computer program language)"

1

Irshad, Mufeeda, Merel Keijzer, Martijn Wieling, and Marjolijn Verspoor. "Effectiveness of a dynamic usage based computer assisted language program." Dutch Journal of Applied Linguistics 8, no. 2 (April 11, 2019): 137–62. http://dx.doi.org/10.1075/dujal.16018.irs.

Abstract:
The current paper explores whether a Dynamic Usage Based (DUB) approach – which takes authentic meaningful language use with repetition and scaffolding for comprehension as its basis – can also be implemented in a CALL environment. The effectiveness of the DUB-CALL program was tested in a semester-long experiment, comparing it with a teacher-fronted DUB program (using the same materials as the CALL program) and a traditional CLT program; 228 university undergraduates in Sri Lanka participated. Language gains were assessed in a pre-post design with an objective General English Proficiency (GEP) test and a writing task. The results show that the students in the DUB-CALL condition performed significantly better on the GEP test than the students in the two teacher-fronted classes. The results of the writing tests show that all groups improved significantly, but here there were no differences among groups.
2

Leu, M. C., and N. Hemati. "Automated Symbolic Derivation of Dynamic Equations of Motion for Robotic Manipulators." Journal of Dynamic Systems, Measurement, and Control 108, no. 3 (September 1, 1986): 172–79. http://dx.doi.org/10.1115/1.3143765.

Abstract:
A general computer program for deriving the dynamic equations of motion for robotic manipulators using the symbolic language MACSYMA has been developed. The program, developed based on the Lagrange formalism, is applicable to manipulators of any number of degrees of freedom. Examples are given to illustrate how to use this program for dynamic equation generation. Advantages of expanding the dynamic equations into symbolic form are presented. Techniques for improving efficiency of equation generation, overcoming computer memory limitation, and approximating manipulator dynamics are discussed.
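To give a concrete feel for the kind of symbolic derivation the paper above describes, here is a minimal sketch for the smallest possible case: a one-link arm modeled as a point mass on a massless link. It uses Python with SymPy rather than MACSYMA, so it only illustrates the Lagrangian approach, not the authors' program; the model, symbol names and energy expressions are simplifying assumptions chosen for brevity.

    import sympy as sp
    from sympy.calculus.euler import euler_equations

    t = sp.symbols('t')
    m, l, g = sp.symbols('m l g', positive=True)   # link mass, link length, gravity
    q = sp.Function('q')(t)                        # joint angle as a function of time

    # Kinetic and potential energy of a point mass at the tip of the link
    T = sp.Rational(1, 2) * m * l**2 * sp.diff(q, t)**2
    V = -m * g * l * sp.cos(q)
    L = T - V                                      # the Lagrangian

    # Euler-Lagrange equation of the unforced link, equivalent to m*l**2*q'' + m*g*l*sin(q) = 0
    print(euler_equations(L, [q], t))

Extending the same pattern to several joints mainly means adding generalized coordinates and building the energies from the full kinematics, which is exactly the bookkeeping that symbolic-algebra programs like the one described above automate.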
3

CLARK, DAVID, ROBERTO GIACOBAZZI, and CHUNYAN MU. "Foreword: programming language interference and dependence." Mathematical Structures in Computer Science 21, no. 6 (October 27, 2011): 1109–10. http://dx.doi.org/10.1017/s0960129511000168.

Abstract:
Interference and dependence are closely related concepts: interference being the observable phenomenon connected with dependence. Essentially, interference means that the behaviour of some parts of a dynamic system may influence the behaviour of other parts of the same system, while dependence specifies how the semantics of sub-components of a dynamic system are related. Identifying, measuring and controlling interference is essential in many aspects of modern computer science, in particular, in security, program analysis and verification, debugging, systems specification, model checking, program manipulation, program slicing, reverse engineering, data mining, distributed databases and systems biology. In all these fields, dependency and interference play a key role in designing suitable abstractions or in partitioning complex systems into simpler ones. Reasoning about dependency and interference requires theories, models and semantics, as well as algorithms and tools for their analysis. Beginning in 2004, the series of Programming Language Interference and Dependence (PLID) workshops has been devoted to promoting and spreading cutting-edge research in this field, with a particular emphasis on unpublished results with great impact on the theoretical basis. PLID2007, which was held at the Technical University of Denmark on 21 August 2007, was particularly successful, and constituted the ideal forum for announcing a call for papers for a special journal issue on programming language interference and dependence, which would not necessarily be restricted to PLID2007 contributions. From the many expressions of interest, we selected six contributions by leading researchers in the field, some of which had been presented at the PLID2007 workshop. The selected papers focus on foundational aspects of dependency and interference, with applications in language-based security, data-base management systems and program slicing.
4

JARZABEK, STAN, HONGYU ZHANG, SHEN RU, VU TUNG LAM, and ZHENXIN SUN. "ANALYSIS OF META-PROGRAMS: AN EXAMPLE." International Journal of Software Engineering and Knowledge Engineering 16, no. 01 (February 2006): 77–101. http://dx.doi.org/10.1142/s0218194006002689.

Abstract:
Meta-programs are generic, incomplete, adaptable programs that are instantiated at construction time to meet specific requirements. Templates and generative techniques are examples of meta-programming techniques. Understanding of meta-programs is more difficult than understanding of concrete, executable programs. Static and dynamic analysis methods have been applied to ease understanding of programs — can similar methods be used for meta-programs? In our projects, we build meta-programs with a meta-programming technique called XVCL. Meta-programs in XVCL are organized into a hierarchy of meta-components from which the XVCL processor generates concrete, executable programs that meet specific requirements. We developed an automated system that analyzes XVCL meta-programs, and presents developers with information that helps them work with meta-programs more effectively. Our system conducts both static and dynamic analysis of a meta-program. An integral part of our solution is a query language, FQL in which we formulate questions about meta-program properties. An FQL query processor automatically answers a class of queries. The analysis method described in the paper is specific to XVCL. However, the principle of our approach can be applied to other meta-programming systems. We believe readers interested in meta-programming in general will find some of the lessons from our experiment interesting and useful.
5

Hrushko, Svitlana. "MODELS OF TRANSLATION EQUIVALENCE IN MACHINE TRANSLATION: PRAGMATIC ASPECT." Naukovy Visnyk of South Ukrainian National Pedagogical University named after K. D. Ushynsky: Linguistic Sciences 2020, no. 30 (March 2020): 58–74. http://dx.doi.org/10.24195/2616-5317-2020-30-4.

Abstract:
The purpose of the article is to study problems of translation equivalence in machine translation, which is based on a sequence of invariable actions (algorithms) with a text to identify linguistic equivalents in a pair of languages at a given direction of translation by means of a computer, in respect of the pragmatic aspect. Translation equivalence is understood as a specific type of equivalence, which is fundamentally different from other types, since it does not correlate with the phenomena that have a special place in the structure of a language, but the phenomena that currently exist in a language correlation or are equivalent to the text content. The translation is formalized, but allows getting an idea of the text content at the introductory level, since it is not an accurate, adequate translation, but performs the function of rendering basic information. Machine translation is not able to render nuances of an original text, not only at the lexical level. When translating, it is necessary to take peculiarities of syntax and semantics into account. Adequate computer translation is almost impossible in this case. This fact is recognized by all scholars who study possibilities of this type of translation only when rendering main content of a document without taking language nuances and features into account. Machine translation can be carried out on a basis of the translation equivalence (objective and dynamic) model. The model in terms of linguistic technology provides an optimal solution of problems of independent linguistic description and algorithm. The system of translation equivalence, which can be implemented within the model of translation equivalence, allows providing sufficient quality of machine translation at the pre-editing stage. When creating a machine translation program, in addition to solving linguistic problems, a program of their implementation is also necessary, since a translation program is a tool for studying and finding information in a foreign language, and the prospects of a machine translation are related to the further development of translation theory and practice in general.
6

VENNEKENS, JOOST, MARC DENECKER, and MAURICE BRUYNOOGHE. "CP-logic: A language of causal probabilistic events and its relation to logic programming." Theory and Practice of Logic Programming 9, no. 3 (May 2009): 245–308. http://dx.doi.org/10.1017/s1471068409003767.

Abstract:
This paper develops a logical language for representing probabilistic causal laws. Our interest in such a language is two-fold. First, it can be motivated as a fundamental study of the representation of causal knowledge. Causality has an inherent dynamic aspect, which has been studied at the semantical level by Shafer in his framework of probability trees. In such a dynamic context, where the evolution of a domain over time is considered, the idea of a causal law as something which guides this evolution is quite natural. In our formalization, a set of probabilistic causal laws can be used to represent a class of probability trees in a concise, flexible and modular way. In this way, our work extends Shafer's by offering a convenient logical representation for his semantical objects. Second, this language also has relevance for the area of probabilistic logic programming. In particular, we prove that the formal semantics of a theory in our language can be equivalently defined as a probability distribution over the well-founded models of certain logic programs, rendering it formally quite similar to existing languages such as ICL or PRISM. Because we can motivate and explain our language in a completely self-contained way as a representation of probabilistic causal laws, this provides a new way of explaining the intuitions behind such probabilistic logic programs: we can say precisely which knowledge such a program expresses, in terms that are equally understandable by a non-logician. Moreover, we also obtain an additional piece of knowledge representation methodology for probabilistic logic programs, by showing how they can express probabilistic causal laws.
7

Pan, D., and R. S. Sharp. "Automatic Formulation of Dynamic Equations of Motion of Robot Manipulators." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 202, no. 6 (November 1988): 397–404. http://dx.doi.org/10.1243/pime_proc_1988_202_141_02.

Abstract:
Based on the use of homogeneous transformation matrices with Denavit-Hartenberg notation and the Lagrangian formulation method, a general computer program ROBDYN.RED for the symbolic derivation of dynamic equations of motion for robot manipulators has been developed and is discussed in this paper. The program is developed by using REDUCE, an algebraic manipulation language, and is versatile for open-chain structure robot manipulators with any number of degrees of freedom and with any combination of types of joint. Considerations are also given to saving computer memory space required for execution and to minimizing the runtime. Several examples are included to demonstrate the use of the program. Equations of motion in scalar form can be automatically transferred to FORTRAN format for later numerical simulations. The efficiency of the resulting equations in terms of numerical integration is also discussed and some further developments to improve the efficiency are suggested.
8

Ahmadi, A., P. H. Robinson, F. Elizondo, and P. Chilibroste. "Implementation of CTR Dairy Model Using the Visual Basic for Application Language of Microsoft Excel." International Journal of Agricultural and Environmental Information Systems 9, no. 3 (July 2018): 74–86. http://dx.doi.org/10.4018/ijaeis.2018070105.

Abstract:
The goal of the article is to implement the CTR Dairy model using the Visual Basic for Application (VBA) language of Microsoft Excel. CTR Dairy is a dynamic simulation model for grazing lactating dairy cows that predicts milk production and profit over feeding based on ruminal digestion and absorption of nutrients under discontinuous feeding schedules. The CTR Dairy model was originally developed as a research tool using a proprietary computer simulation software called SMART that required the SMART client to run the program. As SMART software is now discontinued, and its client is no longer available, rewriting the model in the VBA language using Microsoft Excel for inputs and outputs makes the program available to a broad range of users including dairy farmers, extension advisors, dairy nutrition consultants and researchers. Dairy farmers can use the new version of the CTR Dairy program to manipulate the herbage allowance and the access time to the grazing paddock, as well as the timing of supplemental feeding, to improve the utilization of the pasture and to increase the production of the milk.
9

Lopes, Bruno, Cláudia Nalon, and Edward Hermann Haeusler. "Reasoning about Petri Nets: A Calculus Based on Resolution and Dynamic Logic." ACM Transactions on Computational Logic 22, no. 2 (May 15, 2021): 1–22. http://dx.doi.org/10.1145/3441655.

Abstract:
Petri Nets are a widely used formalism to deal with concurrent systems. Dynamic Logics (DLs) are a family of modal logics where each modality corresponds to a program. Petri-PDL is a logical language that combines these two approaches: it is a dynamic logic where programs are replaced by Petri Nets. In this work we present a clausal resolution-based calculus for Petri-PDL. Given a Petri-PDL formula, we show how to obtain its translation into a normal form to which a set of resolution-based inference rules are applied. We show that the resulting calculus is sound, complete, and terminating. Some examples of the application of the method are also given.
10

Amyot, Joseph R., and Gerard van Blokland. "Parameter optimization with ACSL models." SIMULATION 49, no. 5 (November 1987): 213–18. http://dx.doi.org/10.1177/003754978704900505.

Abstract:
A method whereby a parameter optimization program, written in FORTRAN, can be used in conjunction with ACSL (Advanced Continuous Simulation Language) models of dynamic systems is described. The optimization of a projectile's trajectory is used as an example.
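As a rough, self-contained illustration of the idea of wrapping a parameter optimizer around a continuous simulation, the sketch below integrates a projectile with quadratic drag using explicit Euler steps and searches for the launch angle that maximizes range. It is plain Python rather than FORTRAN or ACSL, and the drag coefficient, step size and grid search are illustrative assumptions, not values from the article.

    import math

    def flight_range(angle_deg, v0=300.0, k=0.0005, dt=0.01, g=9.81):
        """Horizontal distance travelled before the projectile returns to the ground."""
        a = math.radians(angle_deg)
        x, y = 0.0, 0.0
        vx, vy = v0 * math.cos(a), v0 * math.sin(a)
        while y >= 0.0:
            v = math.hypot(vx, vy)
            vx -= k * v * vx * dt                # quadratic drag opposes the velocity
            vy -= (g + k * v * vy) * dt          # gravity plus the vertical drag component
            x += vx * dt
            y += vy * dt
        return x

    # Crude one-parameter optimization: a coarse grid search over the launch angle
    best = max(range(10, 81), key=flight_range)
    print(best, round(flight_range(best), 1))    # with drag, the optimum lies below 45 degrees

A gradient-based optimizer could replace the grid search; the point is only that the simulation is treated as a function of its parameters, in the spirit of coupling an optimization program to an ACSL model.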

Dissertations / Theses on the topic "DYNAMO (Computer program language)"

1

McMahon, James S. "DYNAMO systems model of the roll-response of semisubmersibles." Master's thesis, This resource online, 1991. http://scholar.lib.vt.edu/theses/available/etd-01262010-020130/.

2

Rushton, Matthew V. "Static and dynamic type systems." Diss., Connect to the thesis, 2004. http://hdl.handle.net/10066/1483.

3

Deitz, Steven J. "High-level programming language abstractions for advanced and dynamic parallel computations /." Thesis, Connect to this title online; UW restricted, 2005. http://hdl.handle.net/1773/6967.

4

Beyer, Eric W. "Design, testing, and performance of a hybrid micro vehicle - the Hopping Rotochute." Diss., Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/29661.

5

Cain, Andrew Angus. "Dynamic data flow analysis for object oriented programs." Swinburne University of Technology, 2005. http://adt.lib.swin.edu.au./public/adt-VSWT20060904.161506.

Abstract:
There are many tools and techniques to help developers debug and test their programs. Dynamic data flow analysis is such a technique. Existing approaches for performing dynamic data flow analysis for object oriented programs have tended to be data focused and procedural in nature. An approach to dynamic data flow analysis that used object oriented principles would provide a more natural solution to analysing object oriented programs. Dynamic data flow analysis approaches consist of two primary aspects; a model of the data flow information, and a method for collecting action information from a running program. The model for data flow analysis presented in this thesis uses a meta-level object oriented approach. To illustrate the application of this meta-level model, a model for the Java programming language is presented. This provides an instantiation of the meta-level model provided. Finally, several methods are presented for collecting action information from Java programs. The meta-level model contains elements to represent both data items and scoping components (i.e. methods, blocks, objects, and classes). At runtime the model is used to create a representation of the executing program that is used to perform dynamic data flow analysis. The structure of the model is created in such a way that locating the appropriate meta-level entity follows the scoping rules of the language. In this way actions that are reported to the meta-model are routed through the model to their corresponding meta-level elements. The Java model presented contains classes that can be used to create the runtime representation of the program under analysis. Events from the program under analysis are then used to update the model. Using this information developers are able to locate where data items are incorrectly used within their programs. Methods for collecting action information from Java programs include source code instrumentation, as used in earlier approaches, and approaches that use Java byte code transformation, and the facilities of the Java Platform Debugger Architecture. While these approaches aimed to achieve a comprehensive analysis, there are several issues that could not be resolved using the approaches covered. Of the approaches presented byte code transformation is the most practical.
6

Gordon, Donald James. "Encapsulation enforcement with dynamic ownership : a thesis submitted to the Victoria University of Wellington in fulfilment of the requirements for the degree of Master of Science in Computer Science /." ResearchArchive@Victoria e-Thesis, 2008. http://hdl.handle.net/10063/624.

7

Zhang, Kun. "Dynamic pointer tracking and its applications." Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/33936.

Abstract:
Due to the significant limitations of static analysis and the dynamic nature of pointers in weakly typed programming languages like C and C++, the points-to sets obtained at compile time are quite conservative. Most static pointer analysis methods trade the precision for the analysis speed. The methods that perform the analysis in a reasonable amount of time are often context and/or flow insensitive. Other methods that are context, flow, and field sensitive have to perform the whole program inter-procedural analysis, and do not scale with respect to the program size. A large class of problems involving optimizations such as instruction prefetching, control and data speculation, redundant load/store instructions removal, instruction scheduling, and memory disambiguation suffer due to the imprecise and conservative points-to sets computed statically. One could possibly live without optimizations, but in domains involving memory security and safety, lack of the precise points-to sets can jeopardize the security and safety. In particular, the lack of dynamic points-to sets drastically reduce the ability to reason about a program's memory access behavior, and thus illegal memory accesses can go unchecked leading to bugs as well as security holes. On the other hand, the points-to sets can be very useful for other domains such as the heap shape analysis and garbage collection. The knowledge of precise points-to sets is therefore becoming very important, but has received little attention so far beyond a few studies, which have shown that the pointers exhibit very interesting behaviors during execution. How to track such behaviors dynamically and benefit from them is the topic covered by this research. In this work, we propose a technique to compute the precise points-to sets through dynamic pointer tracking. First, the compiler performs the pointer analysis to obtain the static points-to sets. Then, the compiler analyzes the program, and inserts the necessary instructions to refine the points-to sets. At runtime, the inserted instructions automatically update the points-to sets. Dynamic pointer tracking in software can be expensive and can be a barrier to the practicality of such methods. Several optimizations including removal of redundant update, post-loop update, special pattern driven update removal, pointer initialization update removal, update propagation, invariant removal, and on demand update optimization are proposed. Our experimental results demonstrate that our mechanism is able to compute the points-to sets dynamically with tolerable overheads. Finally, the memory protection and garbage collection work are presented as the consumers of dynamic pointer tracking to illustrate its importance. In particular, it is shown how different memory properties can be easily tracked using the dynamic points-to sets opening newer possibilities.
8

Spoon, Steven Alexander. "Demand-Driven Type Inference with Subgoal Pruning." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7486.

Abstract:
Highly dynamic languages like Smalltalk do not have much static type information immediately available before the program runs. Static types can still be inferred by analysis tools, but historically, such analysis is only effective on smaller programs of at most a few tens of thousands of lines of code. This dissertation presents a new type inference algorithm, DDP, that is effective on larger programs with hundreds of thousands of lines of code. The approach of the algorithm borrows from the field of knowledge-based systems: it is a demand-driven algorithm that sometimes prunes subgoals. The algorithm is formally described, proven correct, and implemented. Experimental results show that the inferred types are usefully precise. A complete program understanding application, Chuck, has been developed that uses DDP type inferences. This work contributes the DDP algorithm itself, the most thorough semantics of Smalltalk to date, a new general approach for analysis algorithms, and experimental analysis of DDP including determination of useful parameter settings. It also contributes an implementation of DDP, a general analysis framework for Smalltalk, and a complete end-user application that uses DDP.
9

Gupta, Pankaj. "The Design and Implementation of a Prolog Parser Using Javacc." Thesis, University of North Texas, 2002. https://digital.library.unt.edu/ark:/67531/metadc3251/.

Abstract:
Operatorless Prolog text is LL(1) in nature and any standard LL parser generator tool can be used to parse it. However, the Prolog text that conforms to the ISO Prolog standard allows the definition of dynamic operators. Since Prolog operators can be defined at run-time, operator symbols are not present in the grammar rules of the language. Unless the parser generator allows for some flexibility in the specification of the grammar rules, it is very difficult to generate a parser for such text. In this thesis we discuss the existing parsing methods and their modified versions to parse languages with dynamic operator capabilities. Implementation details of a parser using Javacc as a parser generator tool to parse standard Prolog text is provided. The output of the parser is an “Abstract Syntax Tree” that reflects the correct precedence and associativity rules among the various operators (static and dynamic) of the language. Empirical results are provided that show that a Prolog parser that is generated by the parser generator like Javacc is comparable in efficiency to a hand-coded parser.
10

Vaswani, Kapil. "An Adaptive Recompilation Framework For Rotor And Architectural Support For Online Program Instrumentation." Thesis, Indian Institute of Science, 2003. http://hdl.handle.net/2005/174.

Abstract:
Although runtime systems and the dynamic compilation model have revolutionized the process of application development and deployment, the associated performance overheads continue to be a cause for concern and much research. In the first part of this thesis, we describe the design and implementation of an adaptive recompilation framework for Rotor, a shared source implementation of the Common Language Infrastructure (CLI) that can increase program performance through intelligent recompilation decisions and optimizations based on the program's past behavior. Our extensions to Rotor include a low overhead runtime-stack based sampling profiler that identifies program hotspots. A recompilation controller oversees the recompilation process and generates recompilation requests. At the first-level of a multi-level optimizing compiler, code in the intermediate language is converted to an internal intermediate representation and optimized using a set of simple transformations. The compiler uses a fast yet effective linear scan algorithm for register allocation. Hot methods can be instrumented in order to collect basic-block, edge and call-graph profile information. Profile-guided optimizations driven by online profile information are used to further optimize heavily executed methods at the second level of recompilation. An evaluation of the framework using a set of test programs shows that performance can improve by a maximum of 42.3% and by 9% on average. Our results also show that the overheads of collecting accurate profile information through instrumentation to an extent outweigh the benefits of profile-guided optimizations in our implementation, suggesting the need for implementing techniques that can reduce such overheads. A flexible and extensible framework design implies that additional profiling and optimization techniques can be easily incorporated to further improve performance. As previously stated, fine-grained and accurate profile information must be available at low cost for advanced profile-guided optimizations to be effective in online environments. In this second part of this thesis, we propose a generic framework that makes it possible for instrumentation based profilers to collect profile data efficiently, a task that has traditionally been associated with high overheads. The essence of the scheme is to make the underlying hardware aware of instrumentation using a special set of profile instructions and tuned microarchitecture. This not only allows the hardware to provide the runtime with mechanisms to control the profiling activity, but also makes it possible for the hardware itself to optimize the process of profiling in a manner transparent to the runtime. We propose selective instruction dispatch as one possible controlling mechanism that can be used by the runtime to manage the execution of profile instructions and keep profiling overheads under check. We propose profile flag prediction, a hardware optimization that complements the selective dispatch mechanism by not fetching profile instructions when the runtime has turned profiling off. The framework is light-weight and flexible. It eliminates the need for expensive book-keeping, recompilation or code duplication. Our simulations with benchmarks from the SPEC CPU2000 suite show that overheads for call-graph and basic block profiling can be reduced by 72.7% and 52.4% respectively with a negligible loss in accuracy.

Books on the topic "DYNAMO (Computer program language)"

1

Merminod, Bertrand. The use of Kalman filters in GPS navigation. Kensington, N.S.W., Australia: School of Surveying, University of New South Wales, 1989.

2

Endler, Markus. A language for high-level programming of dynamic reconfiguration. München: R. Oldenbourg Verlag, 1993.

3

Daconta, Michael C. C++ pointers and dynamic memory management. New York: Wiley, 1995.

4

C pointers and dynamic memory management. Boston: QED Pub. Group, 1993.

5

Feinberg, Neal, ed. Dylan programming: An object-oriented and dynamic language. Reading, Mass: Addison-Wesley, 1997.

6

Tansley, David. Create dynamic Web pages using PHP and MYSQL. Boston: Addison-Wesley, 2002.

7

Tansley, David. Creating dynamic Web pages with PHP and MYSQL. Harlow, England: Addison-Wesley, 2001.

8

Colomb, Robert M. Implementing persistent Prolog: Large, dynamic, shared procedures in Prolog. New York: Ellis Horwood, 1990.

9

PHP and MySQL for dynamic Web sites. 4th ed. Berkeley, CA: Peachpit Press, 2012.

10

Harrison, Graham Paul. Dynamic web programming: Using Java, JavaScript, and Informix. Upper Saddle River, NJ: Prentice Hall PTR, 2000.


Book chapters on the topic "DYNAMO (Computer program language)"

1

Eilers, Marco, Severin Meier, and Peter Müller. "Product Programs in the Wild: Retrofitting Program Verifiers to Check Information Flow Security." In Computer Aided Verification, 718–41. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81685-8_34.

Abstract:
Most existing program verifiers check trace properties such as functional correctness, but do not support the verification of hyperproperties, in particular, information flow security. In principle, product programs allow one to reduce the verification of hyperproperties to trace properties and, thus, apply standard verifiers to check them; in practice, product constructions are usually defined only for simple programming languages without features like dynamic method binding or concurrency and, consequently, cannot be directly applied to verify information flow security in a full-fledged language. However, many existing verifiers encode programs from source languages into simple intermediate verification languages, which opens up the possibility of constructing a product program on the intermediate language level, reusing the existing encoding and drastically reducing the effort required to develop new verification tools for information flow security. In this paper, we explore the potential of this approach along three dimensions: (1) Soundness: We show that the combination of an encoding and a product construction that are individually sound can still be unsound, and identify a novel condition on the encoding that ensures overall soundness. (2) Concurrency: We show how sequential product programs on the intermediate language level can be used to verify information flow security of concurrent source programs. (3) Performance: We implement a product construction in Nagini, a Python verifier built upon the Viper intermediate language, and evaluate it on a number of challenging examples. We show that the resulting tool offers acceptable performance, while matching or surpassing existing tools in its combination of language feature support and expressiveness.
2

Weik, Martin H. "language program." In Computer Science and Communications Dictionary, 871. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/1-4020-0613-6_9931.

3

Weik, Martin H. "program design language." In Computer Science and Communications Dictionary, 1347. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/1-4020-0613-6_14835.

4

Craven, Paul Vincent. "What Is a Computer Language?" In Program Arcade Games, 33–40. Berkeley, CA: Apress, 2016. http://dx.doi.org/10.1007/978-1-4842-1790-0_3.

5

Volpano, Dennis, and Geoffrey Smith. "Language Issues in Mobile Program Security." In Lecture Notes in Computer Science, 25–43. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/3-540-68671-1_3.

6

Schmidt, Jonas, Thomas Schwentick, Till Tantau, Nils Vortmeier, and Thomas Zeume. "Work-sensitive Dynamic Complexity of Formal Languages." In Lecture Notes in Computer Science, 490–509. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-71995-1_25.

Abstract:
Which amount of parallel resources is needed for updating a query result after changing an input? In this work we study the amount of work required for dynamically answering membership and range queries for formal languages in parallel constant time with polynomially many processors. As a prerequisite, we propose a framework for specifying dynamic, parallel, constant-time programs that require small amounts of work. This framework is based on the dynamic descriptive complexity framework by Patnaik and Immerman.
7

Harf, Mait, Kristiina Kindel, Vahur Kotkas, Peep Küngas, and Enn Tyugu. "Automated Program Synthesis for Java Programming Language." In Lecture Notes in Computer Science, 157–64. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-45575-2_17.

8

Munksgaard, Philip, Svend Lund Breddam, Troels Henriksen, Fabian Cristian Gieseke, and Cosmin Oancea. "Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties." In Lecture Notes in Computer Science, 3–23. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-83978-9_1.

Abstract:
Functional languages allow rewrite-rule systems that aggressively generate a multitude of semantically-equivalent but differently-optimized code versions. In the context of GPGPU execution, this paper addresses the important question of how to compose these code versions into a single program that (near-)optimally discriminates them across different datasets. Rather than aiming at a general autotuning framework reliant on stochastic search, we argue that in some cases, a more effective solution can be obtained by customizing the tuning strategy for the compiler transformation producing the code versions. We present a simple and highly-composable strategy which requires that the (dynamic) program property used to discriminate between code versions conforms with a certain monotonicity assumption. Assuming the monotonicity assumption holds, our strategy guarantees that if an optimal solution exists it will be found. If an optimal solution doesn't exist, our strategy produces human tractable and deterministic results that provide insights into what went wrong and how it can be fixed. We apply our tuning strategy to the incremental-flattening transformation supported by the publicly-available Futhark compiler and compare with a previous black-box tuning solution that uses the popular OpenTuner library. We demonstrate the feasibility of our solution on a set of standard datasets of real-world applications and public benchmark suites, such as Rodinia and FinPar. We show that our approach shortens the tuning time by a factor of 6× on average, and more importantly, in five out of eleven cases, it produces programs that are (as high as 10×) faster than the ones produced by the OpenTuner-based technique.
9

Roșu, Grigore. "From Rewriting Logic, to Programming Language Semantics, to Program Verification." In Lecture Notes in Computer Science, 598–616. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-23165-5_28.

10

Kobayashi, Naoki. "Higher-Order Program Verification and Language-Based Security." In Advances in Computer Science - ASIAN 2009. Information Security and Privacy, 17–23. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-10622-4_2.


Conference papers on the topic "DYNAMO (Computer program language)"

1

Roßmann, Jürgen, Michael Schluse, and Thomas Jung. "Introducing Intuitive and Versatile Multi Modal Graphical Programming Means to Enhance Virtual Environments." In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-49517.

Abstract:
Usability and versatility are two of the most important issues when using modern 3D simulation systems within the field of automation technology and virtual environments. 3D simulations and virtual worlds proved to be versatile tools to program, supervise and command complex robotic and automation systems. For industrial robots, 3D simulation systems like COSIMIR® introduced the so called Native Language Programming (NLP) concept enabling the automation expert to program each robot using its native programming language. But what about programming other automation components or other dynamic components in virtual environments, what about user friendly, intuitive graphical programming languages, what about easy-to-use worker oriented programming languages? When talking about graphical programming languages to model dynamic behavior, questions like “which graphical modeling languages should be supported?”, “which are the most powerful ones?” and “which one matches the most to my concrete application?” have to be answered. Each graphical programming language has its own advantages and disadvantages, so that the answer to all these questions has to be: Offer a choice of graphical modeling languages to the user and leave the decision to him. The advantage of this strategy is obvious: Instead of learning how to use a concrete modeling language or worrying about programming details, the user can focus on his individual automation task and so quickly build efficient solutions. Therefore this paper extends the NLP approach to graphical programming languages using a new kind of object oriented Petri Nets as an intermediate language. This enables the user to use — at the same time — finite automata like mealy machines or extended automata, activity diagrams as defined in UML 2, flowchart like diagrams (e. g. icon-based programming) and many more to model the dynamics or the behavior of dynamic components.
2

Shumilov, Konstantin Avgustovich, and Yuliana Alexandrovna Guryeva. "Plastic Forms of Architecture in Dynamo – Revit." In 32nd International Conference on Computer Graphics and Vision. Keldysh Institute of Applied Mathematics, 2022. http://dx.doi.org/10.20948/graphicon-2022-878-891.

Abstract:
The paper presents the results of research on working with the Dynamo-Revit bundle when creating plastic architectural forms of complex geometry. The Lotus, Canopy and Parametric Pavilion objects were chosen as models for the research. For the presented architectural objects, the least resource-intensive nodes and node bundles were selected, in such a way as to make optimal use of the programs' capabilities without overloading computer resources. The scripts developed in the Dynamo program for creating the presented models are briefly described. Explanations are given for the most significant fragments and for the full scripts used to work with the models. The possibilities of the Dynamo program for visual (parametric) programming are briefly analyzed, and some possibilities of its work in a bundle with Revit are studied. The two-way operation of the Dynamo-Revit bundle for model import and export was analyzed, including when changing the code to correct the model. It is advisable to continue working in this direction in order to obtain more concise and universal algorithms (chains of nodes) that allow varying the initial data and options for the shape of plastic architectural objects.
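Dynamo graphs are visual node networks and cannot be reproduced here, but the parametric logic behind a model like the Canopy can be sketched in ordinary Python. The function below generates a rectangular grid of (x, y, z) points whose heights follow a sinusoidal surface controlled by a few input parameters; in Dynamo the same role would be played by number sliders feeding geometry nodes. The names and formulas are hypothetical illustrations, not taken from the paper's scripts.

    import math

    def canopy_points(width=20.0, depth=12.0, nu=21, nv=13, amp=2.5, waves=1.5):
        """Return rows of (x, y, z) control points for a wavy canopy surface."""
        rows = []
        for i in range(nu):
            u = i / (nu - 1)
            row = []
            for j in range(nv):
                v = j / (nv - 1)
                z = amp * math.sin(math.pi * waves * u) * math.cos(math.pi * v)
                row.append((u * width, v * depth, z))
            rows.append(row)
        return rows

    grid = canopy_points()
    print(len(grid), len(grid[0]), grid[10][6])   # a 21 x 13 grid and one sample point

In a Dynamo workflow such point rows would typically feed a surface-by-points node and then be pushed to Revit; here the grid is simply printed.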
3

Titova, Maria, and Tatyana Tomchinskaya. "Simulation of Contractile Heart Function in the Autodesk Maya Environment Based on Muscle Fiber Macro-Structure." In 29th International Conference on Computer Graphics, Image Processing and Computer Vision, Visualization Systems and the Virtual Environment GraphiCon'2019. Bryansk State Technical University, 2019. http://dx.doi.org/10.30987/graphicon-2019-2-227-331.

Abstract:
A dynamic simulation model of the contractile function of the heart is presented. The contractile function simulation is based on the modeling of the muscle fibers' structure according to the Atlas of human anatomy and the use of parameters of their geometric shape as parameters that control the contraction. The basic concepts of the architecture of muscle fibers of the myocardium and the structure of the blood supply to the heart are investigated. An algorithm is developed for local parameterization of the contractile function of the heart, which mimics blood flow and conduction disturbances via special control functions. The algorithm of the simulation model is shown in the example of only the left ventricle of the heart but is embedded in the full three-dimensional model of the ventricular complex of the heart. The simulation model is implemented as a solid-state parameterized model in the Autodesk Maya tool environment, managed by a program in the embedded Python language. The result is compared with the results of the OpenCMISS software in favor of the latter. It is planned to continue work with the implementation of the most advanced concept of the myocardial architecture of Torrent-Guasp together with the networks of electrical excitation and blood supply.
4

Taapopi, E. E., H. Wang, and J. Zhou. "Equal Forced Time Step Approach to PSA for a Dynamic System – A Case of the Holdup Tank." In 2021 28th International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/icone28-64081.

Abstract:
This work presents a methodology of the equal forced time step for generating accident scenarios in a time-dependent dynamic sequence. The aim is to model DPSA and system reliability of a simple hold-up tank which is a good representative of a steam generator — a component in the secondary loop of a nuclear power plant. This was achieved based on some predicted theoretical calculations on node generating conditions and the cumulative probability of partial failure mode. The computer language Python 3.0 was employed to establish a dynamic event tree of 20 equal time steps (h), the total number of 7367 nodes, 4882 failed nodes for various types of potential accidents. Also, various types of failures and the cumulative probabilities of 0.13 were generated. The program modeled cumulative probabilities and associated errors. However, there is a need to improve and refine the methodology to consider physical parameters such as temperature, pressure, etc. as these are factors that can lead to calculation termination. This will boost the program efficiency and utilize the information extracted from the dynamic event tree in the analysis of the dynamic characteristics. Although extra effort and improvement are still needed to refine the produced accident scenario results, the presented methodology has satisfactorily handled the example.
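To make the equal-forced-time-step idea concrete, here is a deliberately simplified Python sketch (not the authors' code): at every fixed time step each still-working branch of the event tree forks into a 'works' child and a 'fails' child, and the cumulative failure probability is summed over the failed nodes. The per-step failure probability, step size and step count are illustrative assumptions only.

    from dataclasses import dataclass

    @dataclass
    class Node:
        time: float      # time of the branching, in hours
        prob: float      # probability of reaching this node
        failed: bool     # whether the scenario has failed by this node

    def grow_tree(steps=20, dt=1.0, p_fail=0.007):
        root = Node(0.0, 1.0, False)
        nodes, frontier = [root], [root]
        for k in range(1, steps + 1):
            nxt = []
            for n in frontier:                          # only working branches keep evolving
                ok = Node(k * dt, n.prob * (1 - p_fail), False)
                bad = Node(k * dt, n.prob * p_fail, True)
                nodes += [ok, bad]
                nxt.append(ok)
            frontier = nxt
        cum_fail = sum(n.prob for n in nodes if n.failed)
        return nodes, cum_fail

    nodes, cum_fail = grow_tree()
    print(len(nodes), round(cum_fail, 3))               # 41 nodes, cumulative failure of about 0.13 here

The much larger node counts reported in the abstract presumably come from tracking several components and failure modes at every step, but the bookkeeping of probabilities along branches is the same.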
5

Yang, Xiaoli, Rong Ge, and Charles Tseng. "Visualizing genetic recombination with interactive computer program." In 2010 International Conference on Audio, Language and Image Processing (ICALIP). IEEE, 2010. http://dx.doi.org/10.1109/icalip.2010.5685128.

6

Sormaz, Dusan N., and Prashant A. Borse. "Visualization of Milling Processes on Internet." In ASME 2001 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2001. http://dx.doi.org/10.1115/detc2001/dfm-21201.

Abstract:
In the current environment of simultaneous product development, the Internet is being recognized as the best platform for implementing a virtual manufacturing organization located at different global places. This paper represents a step towards the integration of product development teams and introduces a novel method for visualization of manufacturing processes in order to verify the process plans generated. The paper presents a pilot implementation in VRML, with JavaScript as a scripting language for the dynamic behavior of the program. Using the concepts of geometric representation of manufacturing processes, a program in the C++ language is developed, which supplies data for animation. The results are presented on the Internet through a web page for visualization of manufacturing processes.
7

Chou, Yu-Cheng, David Ko, and Harry H. Cheng. "Mobile Agent Based Autonomic Dynamic Parallel Computing." In ASME 2009 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/detc2009-87750.

Abstract:
Parallel computing is widely adopted in scientific and engineering applications to enhance the efficiency. Moreover, there are increasing research interests focusing on utilizing distributed networked computers for parallel computing. The Message Passing Interface (MPI) standard was designed to support portability and platform independence of a developed parallel program. However, the procedure to start an MPI-based parallel computation among distributed computers lacks autonomicity and flexibility. This article presents an autonomic dynamic parallel computing framework that provides autonomicity and flexibility that are important and necessary to some parallel computing applications involving resource constrained and heterogeneous platforms. In this framework, an MPI parallel computing environment consisting of multiple computing entities is dynamically established through inter-agent communications using the IEEE Foundation for Intelligent Physical Agents (FIPA) compliant Agent Communication Language (ACL) messages. For each computing entity in the MPI parallel computing environment, a load-balanced MPI program C source code along with the MPI environment configuration statements are dynamically composed as a mobile agent code. A mobile agent wrapping the mobile agent code is created and sent to the computing entity where the mobile agent code is retrieved and interpretively executed. An example of autonomic parallel matrix multiplication is given to demonstrate the self-configuration and self-optimization properties of the presented framework.
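The parallel matrix multiplication mentioned at the end of the abstract can be approximated, outside any agent framework, with a few lines of mpi4py. The sketch below scatters row blocks of the left matrix, broadcasts the right matrix, multiplies locally and gathers the result; it assumes mpi4py, NumPy and an MPI runtime are installed, and the 'load balancing' is just an even row split rather than the dynamically composed mobile-agent code the paper describes.

    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n = 8                                        # matrix size, chosen for the example
    if rank == 0:
        A = np.random.rand(n, n)
        B = np.random.rand(n, n)
        row_blocks = np.array_split(A, size)     # one block of rows per process
    else:
        A = B = row_blocks = None

    local_rows = comm.scatter(row_blocks, root=0)    # distribute the row blocks
    B = comm.bcast(B, root=0)                        # every process needs all of B
    local_C = local_rows @ B                         # local partial product
    blocks = comm.gather(local_C, root=0)

    if rank == 0:
        C = np.vstack(blocks)
        print(np.allclose(C, A @ B))                 # sanity check against the serial product

It would typically be launched with something like mpirun -n 4 python matmul_sketch.py; the script name is of course arbitrary.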
8

Kim, Sung-Hun, Jin-Tak Choi, and Kil-Hong Joo. "Development of Cyber Sign Language Interpreting App Program for Deaf." In Next Generation Computer and Information Technology 2017. Science & Engineering Research Support soCiety, 2017. http://dx.doi.org/10.14257/astl.2017.145.03.

9

Karpoff, Marcus, Jose Nelson Amaral, Kai-Ting Amy Wang, Rayson Ho, and Brice Dobry. "PSU: A Framework for Dynamic Software Updates in Multi-threaded C-Language Programs." In 2020 IEEE 32nd International Symposium on Computer Architecture and High Performance Computing (SBAC-PAD). IEEE, 2020. http://dx.doi.org/10.1109/sbac-pad49847.2020.00040.

10

Ling, Xiang, Guoqing Wu, and Bo Huang. "Comparing program to requirement and design using language acceptance." In 2012 2nd International Conference on Computer Science and Network Technology (ICCSNT). IEEE, 2012. http://dx.doi.org/10.1109/iccsnt.2012.6525961.


Reports on the topic "DYNAMO (Computer program language)"

1

Makhachashvili, Rusudan K., Svetlana I. Kovpik, Anna O. Bakhtina, and Ekaterina O. Shmeltser. Technology of presentation of literature on the Emoji Maker platform: pedagogical function of graphic mimesis. [б. в.], July 2020. http://dx.doi.org/10.31812/123456789/3864.

Abstract:
The article deals with the technology of visualizing fictional text (poetry) with the help of emoji symbols in the Emoji Maker platform that not only activates students’ thinking, but also develops creative attention, makes it possible to reproduce the meaning of poetry in a succinct way. The application of this technology has yielded the significance of introducing a computer being emoji in the study and mastering of literature is absolutely logical: an emoji, phenomenologically, logically and eidologically installed in the digital continuum, is separated from the natural language provided by (ethno)logy, and is implicitly embedded into (cosmo)logy. The technology application object is the text of the twentieth century Cuban poet José Ángel Buesa. The choice of poetry was dictated by the appeal to the most important function of emoji – the expression of feelings, emotions, and mood. It has been discovered that sensuality can reconstructed with the help of this type of meta-linguistic digital continuum. It is noted that during the emoji design in the Emoji Maker program, due to the technical limitations of the platform, it is possible to phenomenologize one’s own essential-empirical reconstruction of the lyrical image. Creating the image of the lyrical protagonist sign, it was sensible to apply knowledge in linguistics, philosophy of language, psychology, psycholinguistics, literary criticism. By constructing the sign, a special emphasis was placed on the facial emogram, which also plays an essential role in the transmission of a wide range of emotions, moods, feelings of the lyrical protagonist. Consequently, the Emoji Maker digital platform allowed to create a new model of digital presentation of fiction, especially considering the psychophysiological characteristics of the lyrical protagonist. Thus, the interpreting reader, using a specific digital toolkit – a visual iconic sign (smile) – reproduces the polylaterial metalinguistic multimodality of the sign meaning in fiction. The effectiveness of this approach is verified by the poly-functional emoji ousia, tested on texts of fiction.
2

Striuk, Andrii M., and Serhiy O. Semerikov. The Dawn of Software Engineering Education. [б. в.], February 2020. http://dx.doi.org/10.31812/123456789/3671.

Abstract:
Designing a mobile-oriented environment for professional and practical training requires determining the stable (fundamental) and mobile (technological) components of its content and determining the appropriate model for specialist training. In order to determine the ratio of fundamental and technological in the content of software engineers’ training, a retrospective analysis of the first model of training software engineers developed in the early 1970s was carried out and its compliance with the current state of software engineering development as a field of knowledge and a new the standard of higher education in Ukraine, specialty 121 “Software Engineering”. It is determined that the consistency and scalability inherent in the historically first training program are largely consistent with the ideas of evolutionary software design. An analysis of its content also provided an opportunity to identify the links between the training for software engineers and training for computer science, computer engineering, cybersecurity, information systems and technologies. It has been established that the fundamental core of software engineers’ training should ensure that students achieve such leading learning outcomes: to know and put into practice the fundamental concepts, paradigms and basic principles of the functioning of language, instrumental and computational tools for software engineering; know and apply the appropriate mathematical concepts, domain methods, system and object-oriented analysis and mathematical modeling for software development; put into practice the software tools for domain analysis, design, testing, visualization, measurement and documentation of software. It is shown that the formation of the relevant competencies of future software engineers must be carried out in the training of all disciplines of professional and practical training.
3

Markova, Oksana, Serhiy Semerikov, and Maiia Popel. СoCalc as a Learning Tool for Neural Network Simulation in the Special Course “Foundations of Mathematic Informatics”. Sun SITE Central Europe, May 2018. http://dx.doi.org/10.31812/0564/2250.

Abstract:
The role of neural network modeling in the learning content of the special course “Foundations of Mathematic Informatics” was discussed. The course was developed for the students of technical universities – future IT specialists – and directed to bridging the gap between theoretic computer science and its applied applications: software, system and computing engineering. CoCalc was justified as a learning tool of mathematical informatics in general and neural network modeling in particular. The elements of the technique of using CoCalc at studying the topic “Neural network and pattern recognition” of the special course “Foundations of Mathematic Informatics” are shown. The program code was presented in the CoffeeScript language, which implements the basic components of an artificial neural network: neurons, synaptic connections, activation functions (tangential, sigmoid, stepped) and their derivatives, methods of calculating the network's weights, etc. The features of the Kolmogorov–Arnold representation theorem application were discussed for determining the architecture of multilayer neural networks. The implementation of the disjunctive logical element and the approximation of an arbitrary function using a three-layer neural network were given as examples. According to the simulation results, a conclusion was made as to the limits of use of the constructed networks, in which they retain their adequacy. The framework topics of individual research on artificial neural networks are proposed.
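A tiny NumPy sketch of the ingredients named in the abstract, the activation functions with their derivatives and the 'disjunctive logical element', might look as follows. It is written in Python rather than the CoffeeScript used in the course materials, and the OR neuron's weights and bias are the usual textbook choice, not values taken from the article.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def dsigmoid(x):                      # derivative of the sigmoid
        s = sigmoid(x)
        return s * (1.0 - s)

    def dtanh(x):                         # derivative of the hyperbolic tangent
        return 1.0 - np.tanh(x) ** 2

    def step(x):                          # the "stepped" activation
        return (x >= 0).astype(float)

    # Disjunctive logical element (OR): a single neuron with weights (1, 1) and bias -0.5
    W, b = np.array([1.0, 1.0]), -0.5
    inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    print(step(inputs @ W + b))           # [0. 1. 1. 1.], i.e. x1 OR x2

Approximating an arbitrary continuous function, as discussed via the Kolmogorov–Arnold representation theorem, would additionally require a hidden layer and a training loop for the weights.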