Academic literature on the topic 'IPL (Computer program language)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'IPL (Computer program language).'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "IPL (Computer program language)"

1

van Rensburg, Henriette Janse, and Jeong-Bae Son. "Improving English Language and Computer Literacy Skills in an Adult Refugee Program." International Journal of Pedagogies and Learning 6, no. 1 (2010): 69–81. http://dx.doi.org/10.5172/ijpl.6.1.69.

2

Chen, Zhenmin, and Fang Zhao. "Determining Minimum Survey Sample Size for Multi-Cell Case." International Journal of Reliability, Quality and Safety Engineering 17, no. 06 (2010): 579–86. http://dx.doi.org/10.1142/s0218539310003962.

Abstract:
Survey analysis methods are widely used in areas such as social studies, marketing research, economics, public health, clinical trials, and transportation data analysis. The minimum sample size must be determined before a survey is conducted to avoid excessive cost. Several statistical methods for finding the minimum required sample size can be found in the literature. This paper proposes a method for finding the minimum total sample size needed for a survey when the population is divided into cells. The proposed method can be used for both the infinite-population and the finite-population case. A computer program is needed to carry out the sample size calculation; the authors used SAS/IML, the integrated matrix language (IML) procedure of the Statistical Analysis System (SAS) software.
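The authors' SAS/IML multi-cell computation is not reproduced on this page. As a rough illustration only, the classic single-proportion formula that the survey sample-size literature builds on (not the paper's multi-cell method) can be sketched in Python; the function names and the numbers below are hypothetical.

```python
import math

def min_sample_size(margin_of_error, confidence_z=1.96, p=0.5):
    """Classic minimum sample size for estimating a proportion in an
    infinite population: n = z^2 * p * (1 - p) / e^2."""
    n = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

def finite_population_correction(n, population_size):
    """Adjust n for a finite population: n' = n / (1 + (n - 1) / N)."""
    return math.ceil(n / (1 + (n - 1) / population_size))

n = min_sample_size(0.05)                     # 5% margin, 95% confidence
print(n)                                      # 385
print(finite_population_correction(n, 1000))  # 279
```

A multi-cell method such as the one the abstract describes would apply a requirement of this kind per cell and minimize the total across cells.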
3

Vennekens, Joost, Marc Denecker, and Maurice Bruynooghe. "CP-logic: A language of causal probabilistic events and its relation to logic programming." Theory and Practice of Logic Programming 9, no. 3 (2009): 245–308. http://dx.doi.org/10.1017/s1471068409003767.

Abstract:
This paper develops a logical language for representing probabilistic causal laws. Our interest in such a language is two-fold. First, it can be motivated as a fundamental study of the representation of causal knowledge. Causality has an inherent dynamic aspect, which has been studied at the semantical level by Shafer in his framework of probability trees. In such a dynamic context, where the evolution of a domain over time is considered, the idea of a causal law as something which guides this evolution is quite natural. In our formalization, a set of probabilistic causal laws can be used to represent a class of probability trees in a concise, flexible and modular way. In this way, our work extends Shafer's by offering a convenient logical representation for his semantical objects. Second, this language also has relevance for the area of probabilistic logic programming. In particular, we prove that the formal semantics of a theory in our language can be equivalently defined as a probability distribution over the well-founded models of certain logic programs, rendering it formally quite similar to existing languages such as ICL or PRISM. Because we can motivate and explain our language in a completely self-contained way as a representation of probabilistic causal laws, this provides a new way of explaining the intuitions behind such probabilistic logic programs: we can say precisely which knowledge such a program expresses, in terms that are equally understandable by a non-logician. Moreover, we also obtain an additional piece of knowledge representation methodology for probabilistic logic programs, by showing how they can express probabilistic causal laws.
4

Ito, Akihiro, and Junko Yamashita. "A corpus-based validation study of the universal processing hypothesis in English relative clause formation." ITL - International Journal of Applied Linguistics 149-150 (2005): 77–91. http://dx.doi.org/10.2143/itl.150.0.2004373.

Abstract:
The present study focuses on spoken and written data in the British National Corpus (BNC). Based on a review of recent studies on English relative clauses, we formulated a Universal Processing Hypothesis (OS > OO > SS > SO) as the target hypothesis to be validated with corpus data. A computer program was designed to calculate the frequency of occurrence of the four types of relative clauses (OS, OO, SS, and SO). The results indicated that the hypothesis is a valid predictor of the frequency of relative clauses in written corpus texts; however, it is not supported in context-governed spoken material. Limitations of the present investigation and directions for future research are also discussed.
5

Wong, Man Leung, and Kwong Sak Leung. "Evolutionary Program Induction Directed by Logic Grammars." Evolutionary Computation 5, no. 2 (1997): 143–80. http://dx.doi.org/10.1162/evco.1997.5.2.143.

Abstract:
Program induction generates a computer program that can produce the desired behavior for a given set of situations. Two of the approaches in program induction are inductive logic programming (ILP) and genetic programming (GP). Since their formalisms are so different, these two approaches cannot be integrated easily, although they share many common goals and functionalities. A unification will greatly enhance their problem-solving power. Moreover, they are restricted in the computer languages in which programs can be induced. In this paper, we present a flexible system called LOGENPRO (The LOgic grammar-based GENetic PROgramming system) that uses some of the techniques of GP and ILP. It is based on a formalism of logic grammars. The system applies logic grammars to control the evolution of programs in various programming languages and represent context-sensitive information and domain-dependent knowledge. Experiments have been performed to demonstrate that LOGENPRO can emulate GP and GP with automatically defined functions (ADFs). Moreover, LOGENPRO can employ knowledge such as argument types in a unified framework. The experiments show that LOGENPRO has superior performance to that of GP and GP with ADFs when more domain-dependent knowledge is available. We have applied LOGENPRO to evolve general recursive functions for the even-n-parity problem from noisy training examples. A number of experiments have been performed to determine the impact of domain-specific knowledge and noise in training examples on the speed of learning.
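LOGENPRO itself is not reproduced here. As a loose, hypothetical sketch of the general idea of grammar-guided program generation (not LOGENPRO's logic-grammar formalism), the following Python toy derives random expressions from a hand-written grammar; the grammar, names, and depth limit are invented for illustration.

```python
import random

# Hypothetical toy grammar: each nonterminal maps to a list of
# productions; a production is a list of terminals and nonterminals.
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<expr>"],
               ["<expr>", "*", "<expr>"],
               ["x"], ["1"], ["2"]],
}

def generate(symbol="<expr>", depth=0, max_depth=4, rng=random):
    """Derive a random string from the grammar; once max_depth is
    reached, only terminal productions are allowed, bounding the
    derivation tree (as grammar-guided GP systems typically do)."""
    if symbol not in GRAMMAR:
        return symbol
    choices = GRAMMAR[symbol]
    if depth >= max_depth:
        choices = [p for p in choices if all(s not in GRAMMAR for s in p)]
    production = rng.choice(choices)
    return " ".join(generate(s, depth + 1, max_depth, rng) for s in production)

expr = generate()
print(expr)                 # e.g. "x * 2 + 1"
print(eval(expr, {"x": 3}))
```

In a full GP system, a population of such grammar-derived programs would then be evolved by fitness-guided selection, crossover, and mutation.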
6

Rahimi, Meisam. "Second language articulatory training and computer-generated feedback in L2 pronunciation improvement." ITL - International Journal of Applied Linguistics 167, no. 2 (2016): 190–209. http://dx.doi.org/10.1075/itl.167.2.04rah.

Abstract:
This paper investigates the efficacy of articulatory training and acoustic feedback on Persian L2 learners' production of the English segment /ɒ/. A sample of 30 Persian ESL learners was recruited: 10 learners were randomly assigned to experimental group 1, 10 to experimental group 2, and 10 to the control group. Over a five-week period, experimental group 1 received training on the manner of articulation of the segment, experimental group 2 received acoustic-articulatory training and used CALL software that provided feedback, and the control group was only exposed to auditory input. The groups were given a pretest, an immediate posttest, and a generalization test. The results showed a significant improvement in the performance of experimental group 2 in both the posttest and the generalization test. These findings suggest that mere knowledge of the manner of articulation of the segment is insufficient, and they lend support to the use of acoustic features of sounds and computer-based, learner-centred programs for second language segmental acquisition.
7

Poreh, Davod, Antonio Iodice, Antonio Natale, and Daniele Riccio. "Software Tool for Soil Surface Parameters Retrieval from Fully Polarimetric Remotely Sensed SAR Data." Sensors 20, no. 18 (2020): 5085. http://dx.doi.org/10.3390/s20185085.

Abstract:
The retrieval of soil surface parameters, in particular soil moisture and roughness, from Synthetic Aperture Radar (SAR) data has been the subject of a large number of studies, whose results are available in the scientific literature. However, although refined methods based on theoretical/analytical scattering models have been proposed and successfully applied in experimental studies, at the operational level very simple empirical models with a number of adjustable parameters are usually employed. One reason for this situation is that retrieval methods based on analytical scattering models are not easy to implement or to use for non-expert users. Relatedly, commercially and freely available software tools for processing SAR data, although they include routines for basic manipulation of polarimetric SAR data (e.g., coherency and covariance matrix calculation, Pauli decomposition, etc.), do not implement easy-to-use methods for surface parameter retrieval. To fill this gap, in this paper we present a user-friendly computer program for the retrieval of soil surface parameters from Polarimetric Synthetic Aperture Radar (PolSAR) imagery. The program evaluates soil permittivity, soil moisture, and soil roughness based on the theoretical predictions of electromagnetic scattering provided by the Polarimetric Two-Scale Model (PTSM) and the Polarimetric Two-Scale Two-Component Model (PTSTCM). In particular, nine different retrieval methodologies, whose applicability depends on both the polarimetric data used (dual- or full-pol) and the characteristics of the observed scene (e.g., its topography and vegetation cover), as well as their implementation on the Interactive Data Language (IDL) platform, are discussed. One specific example from Germany's Demmin test site is presented in detail to provide a first guide to the use of the tool. The retrieval results obtained are in agreement with what was expected according to the available literature.
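The tool itself is IDL-based and not shown on this page. As a minimal illustration of one of the basic polarimetric routines the abstract mentions, the Pauli decomposition of a reciprocal 2x2 scattering matrix can be sketched in Python; the sample matrix values below are hypothetical.

```python
import numpy as np

def pauli_vector(S):
    """Pauli scattering vector for a 2x2 complex scattering matrix S
    (reciprocal case, S_hv == S_vh):
    k = (1/sqrt(2)) * [S_hh + S_vv, S_hh - S_vv, 2 * S_hv]."""
    return np.array([S[0, 0] + S[1, 1],
                     S[0, 0] - S[1, 1],
                     2 * S[0, 1]]) / np.sqrt(2)

# Hypothetical scattering matrix
S = np.array([[1.0 + 0j, 0.1j],
              [0.1j, 0.5 + 0j]])
k = pauli_vector(S)
# Powers in the three Pauli channels; their sum equals the span
# |S_hh|^2 + |S_vv|^2 + 2|S_hv|^2 of the scattering matrix.
print(np.abs(k) ** 2)
```

The channel powers are commonly interpreted as rough proxies for surface, double-bounce, and volume scattering contributions.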
8

Lange, Fiete, Tiemen W. Van Weerden, and Johannes H. Van Der Hoeven. "A new surface electromyography analysis method to determine spread of muscle fiber conduction velocities." Journal of Applied Physiology 93, no. 2 (2002): 759–64. http://dx.doi.org/10.1152/japplphysiol.00594.2001.

Abstract:
Muscle fiber conduction velocity (MFCV) estimation from surface signals is widely used to study muscle function, e.g., in neuromuscular disease and in fatigue studies. However, most analysis methods do not yield information about the velocity distribution of the various motor unit action potentials. We have developed a new method, the interpeak latency (IPL) method, to calculate both the mean MFCV and the spread of conduction velocities in vivo from the bipolar surface electromyogram (sEMG) during isometric contractions. sEMG was analyzed in the biceps brachii muscle of 15 young male volunteers. The motor unit action potential peaks are automatically detected by a computer program. Associated peaks are used to calculate the mean MFCV and its SD, and the SD is taken as a measure of MFCV spread. The main finding is that the IPL method can derive a measure of MFCV spread at different contraction levels. In conclusion, the IPL method provides accurate values for the MFCV and additionally gives information about the scatter of conduction velocities.
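The IPL software is not available on this page. Assuming each velocity is obtained as electrode distance divided by interpeak latency, a minimal Python sketch of the mean-and-SD computation the abstract describes might look like this; the distance and latency values are hypothetical.

```python
import statistics

def mfcv_estimates(interpeak_latencies_s, electrode_distance_m):
    """Per-peak conduction velocities v = d / latency; returns the
    mean MFCV and the sample SD, the SD being taken as the measure
    of MFCV spread (as in the interpeak latency method)."""
    velocities = [electrode_distance_m / t for t in interpeak_latencies_s]
    return statistics.mean(velocities), statistics.stdev(velocities)

# Hypothetical example: 20 mm inter-electrode distance, latencies in s
latencies = [0.0048, 0.0050, 0.0052, 0.0046]
mean_v, spread = mfcv_estimates(latencies, 0.020)
print(round(mean_v, 2), round(spread, 2))   # 4.09 0.22 (m/s)
```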
9

Saikia, Manob Jyoti, Rajan Kanhirodan, and Ram Mohan Vasu. "High-Speed GPU-Based Fully Three-Dimensional Diffuse Optical Tomographic System." International Journal of Biomedical Imaging 2014 (2014): 1–13. http://dx.doi.org/10.1155/2014/376456.

Abstract:
We have developed a graphics processing unit (GPU)-based high-speed fully 3D system for diffuse optical tomography (DOT). The reduction in execution time of the 3D DOT algorithm, a severely ill-posed problem, is made possible through the use of (1) an algorithmic improvement that uses the Broyden approach for updating the Jacobian matrix, and thereby the parameter matrix, and (2) the multinode multithreaded GPU and CUDA (Compute Unified Device Architecture) software architecture. Two different GPU implementations of the DOT programs are developed in this study: (1) a conventional C language program augmented by GPU CUDA and CULA routines (C GPU), and (2) a MATLAB program supported by the MATLAB parallel computing toolkit for GPU (MATLAB GPU). The computation time of the algorithm on the host CPU and on the GPU system is presented for the C and MATLAB implementations. The forward computation uses the finite element method (FEM), and the problem domain is discretized into 14610, 30823, and 66514 tetrahedral elements. The reconstruction time achieved for one iteration of the DOT reconstruction with 14610 elements is 0.52 seconds for the C-based GPU program with 2-plane measurements; the corresponding MATLAB-based GPU program took 0.86 seconds. The maximum number of reconstructed frames achieved is 2 frames per second.
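The authors' CUDA/MATLAB implementations are not reproduced here. As a small, hypothetical NumPy sketch of the standard Broyden rank-one update the abstract credits for part of the speedup (not the authors' code), the update replaces a full Jacobian recomputation at each iteration:

```python
import numpy as np

def broyden_update(J, dx, df):
    """Broyden's rank-one Jacobian update:
    J_new = J + ((df - J @ dx) @ dx^T) / (dx^T @ dx).
    The updated Jacobian satisfies the secant condition
    J_new @ dx == df, avoiding a full Jacobian recomputation."""
    dx = dx.reshape(-1, 1)
    df = df.reshape(-1, 1)
    return J + (df - J @ dx) @ dx.T / (dx.T @ dx)

# Toy check on a 2x2 system with hypothetical step and residual change
J = np.eye(2)
dx = np.array([1.0, 0.0])
df = np.array([2.0, 1.0])
J_new = broyden_update(J, dx, df)
print(np.allclose(J_new @ dx, df))   # True: secant condition holds
```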
10

Karn, Helen E., and MacEnglish. "Pronunciation Plus (Computer Program)." TESOL Quarterly 30, no. 1 (1996): 176. http://dx.doi.org/10.2307/3587618.
