Academic literature on the topic 'Lotus 1-2-3 (Computer program)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Lotus 1-2-3 (Computer program).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Lotus 1-2-3 (Computer program)"

1

Sundqvist, Christer, Artur Mayerhofer, and Sherie Hodges. "A radioimmunoassay program for Lotus 1–2–3." Computers in Biology and Medicine 19, no. 2 (January 1989): 145–50. http://dx.doi.org/10.1016/0010-4825(89)90007-3.

2

Pellerin, Luc. "A protein assay program for Lotus 1-2-3." Computers in Biology and Medicine 20, no. 5 (January 1990): 373–78. http://dx.doi.org/10.1016/0010-4825(90)90017-j.

3

Friberg, LaVerne M. "Garnet stoichiometry program using a Lotus 1-2-3 spreadsheet." Computers & Geosciences 15, no. 7 (1989): 1169–72. http://dx.doi.org/10.1016/0098-3004(89)90129-5.

4

Mulyono, S. "A Computer Program to Predict Scaling Tendency of Oil-Field Well Water Composite." Scientific Contributions Oil and Gas 22, no. 1 (April 5, 2022): 7–20. http://dx.doi.org/10.29017/scog.22.1.1093.

Abstract:
A simple computer program based on LOTUS 1-2-3 Release 2.3 or 2.4 has been prepared for predicting the scaling tendency of an oilfield well-water composite. Up to four individual well waters can make up the composite. The scaling materials considered are calcium carbonate, calcium sulfate, and barium sulfate.
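
For orientation, scale prediction of this kind usually reduces to a saturation index, SI = log10(IAP/Ksp): a positive SI means the water is supersaturated in the mineral and tends to deposit scale. Below is a minimal Python sketch of that arithmetic, not Mulyono's actual worksheet; concentrations stand in for activities, the Ksp value is illustrative, and temperature and ionic-strength corrections are ignored:

    import math

    KSP_CALCITE = 10 ** -8.48  # CaCO3 solubility product at 25 C (illustrative value)

    def saturation_index(ca_molar, co3_molar, ksp=KSP_CALCITE):
        """SI = log10(IAP / Ksp); SI > 0 suggests a scaling tendency."""
        return math.log10((ca_molar * co3_molar) / ksp)

    # A composite of two well waters, mixed by volume fraction:
    waters = [(0.25, 2e-3, 1e-5), (0.75, 1e-3, 4e-6)]  # (fraction, [Ca2+], [CO3 2-])
    ca = sum(frac * c for frac, c, _ in waters)
    co3 = sum(frac * c for frac, _, c in waters)
    print(f"SI(calcite) = {saturation_index(ca, co3):+.2f}")
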
5

McCarthy, Gregory J. "Laboratory Note. A LOTUS 1-2-3 Spreadsheet to Aid in Data Reduction for Publication of X-Ray Powder Diffraction Data." Powder Diffraction 3, no. 1 (March 1988): 39–40. http://dx.doi.org/10.1017/s0885715600013105.

Abstract:
My students and I have developed a LOTUS 1-2-3 spreadsheet to aid in data reduction tasks associated with preparing powder diffraction data for publication in the Powder Diffraction File (PDF) (1987) and this journal. Portions of a sample spreadsheet and the formulae in each of the computational cells are given in Table 1. The concept of this spreadsheet should apply to any of the available computer spreadsheet programs, although the specific codes for the mathematical functions may differ. The user enters data only into columns C, D and F-H. All other entries will be calculated from the input data. Observed 2θ angles are entered into column D. The corresponding d-spacing is calculated in column A. The Miller indices of these peaks are entered into column C. Prior to use of the spreadsheet, the observed 2θ angles and hkl's had been used to refine unit cell parameters using the Appleman and Evans (1973) least squares unit cell parameter refinement computer program implemented for the IBM PC by Garvey (1986).
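
The core of the column-A computation is Bragg's law, nλ = 2d sin θ (with n = 1). A minimal sketch of that conversion; the Cu Kα1 wavelength here is an assumption, since the published sheet may target another radiation:

    import math

    WAVELENGTH = 1.5406  # Cu K-alpha1 in angstroms (assumed)

    def d_spacing(two_theta_deg, wavelength=WAVELENGTH):
        """Bragg's law with n = 1: d = lambda / (2 sin(theta))."""
        theta = math.radians(two_theta_deg / 2.0)
        return wavelength / (2.0 * math.sin(theta))

    for tt in (20.85, 26.64, 36.54):  # illustrative 2-theta peaks (quartz-like)
        print(f"2-theta = {tt:6.2f} deg  ->  d = {d_spacing(tt):.4f} A")
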
6

Hindriks, F. R., A. Bosman, and P. F. Rademaker. "The significance of indirect costs—application to clinical laboratory test economics using computer facilities." Journal of Automatic Chemistry 11, no. 4 (1989): 174–78. http://dx.doi.org/10.1155/s1463924689000374.

Abstract:
The significance of indirect costs in the cost-price calculation of clinical chemistry laboratory tests by way of the production centres method has been investigated. A cost structure model based on the ‘production centres’ method, the Academisch Ziekenhuis Groningen (AZG) 1-2-3 model, is used for the calculation of costs and cost prices as an add-in tool to the spreadsheet program Lotus 1-2-3. The system specifications of the AZG 1-2-3 cost structure model have been extended with facilities to impute all relevant indirect costs to cost centres with the aid of freely chosen allocation rules. The inference is that, as indirect costs play a more important part in decision-making processes concerning planning and control, their relation to the cost centres should be specified in more detail. The AZG 1-2-3 cost structure model has therefore been extended to increase its significance as a management tool for laboratory management.
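
In miniature, imputing indirect costs to cost centres with freely chosen allocation rules looks like the following sketch; the centres, amounts, and weights are invented for illustration and are not taken from the AZG 1-2-3 model:

    # Hypothetical indirect costs and allocation rules (weights per cost centre).
    indirect_costs = {"housing": 120_000.0, "administration": 80_000.0}
    rules = {
        "housing": {"chemistry": 0.6, "haematology": 0.4},         # e.g., by floor area
        "administration": {"chemistry": 0.5, "haematology": 0.5},  # e.g., by headcount
    }

    allocated = {}
    for cost_name, amount in indirect_costs.items():
        for centre, weight in rules[cost_name].items():
            allocated[centre] = allocated.get(centre, 0.0) + amount * weight

    for centre, total in sorted(allocated.items()):
        print(f"{centre}: {total:,.0f} in allocated indirect cost")
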
7

Brill, R. W., and P. G. Bushnell. "CARDIO—A Lotus 1-2-3 based computer program for rapid calculation of cardiac output from dye or thermal dilution curves." Computers in Biology and Medicine 19, no. 5 (January 1989): 361–66. http://dx.doi.org/10.1016/0010-4825(89)90057-7.

8

Nageswara Rao, G., and K. Anand Kumar. "A microcomputer program to analyze the CD spectrum of proteins and nucleic acids—Use of LOTUS 1-2-3 spread sheet." Computers in Biology and Medicine 21, no. 6 (January 1991): 443–50. http://dx.doi.org/10.1016/0010-4825(91)90046-c.

9

Thomas, B. G., and B. Ho. "Spread Sheet Model of Continuous Casting." Journal of Engineering for Industry 118, no. 1 (February 1, 1996): 37–44. http://dx.doi.org/10.1115/1.2803646.

Abstract:
Spreadsheet programs, such as Microsoft Excel, Informix Wingz, and Lotus 1-2-3, provide a framework for very fast and easy development of simple engineering models. The present paper describes a model of the continuous casting process that has been developed using a spreadsheet program, Microsoft Excel, running on a Macintosh II personal computer. The model consists of two-dimensional (2-D) steady-state finite-difference heat-conduction calculations within a continuous casting mold coupled to a one-dimensional (1-D) transient solidification heat-transfer model of the solidifying shell. The model structure and equations are described, and the model predictions are compared with previous solutions. Practical examples using the model are discussed and sample results are provided. Spreadsheet programs running on personal computers are capable of relatively complex calculations that would require extensive effort using conventional programming languages.
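
To give a flavour of what such spreadsheet heat-transfer models compute cell by cell, here is a generic 1-D explicit finite-difference (FTCS) update for transient conduction; the material constants and boundary values are placeholders, not the authors' casting parameters:

    # T_i(new) = T_i + Fo * (T_{i+1} - 2*T_i + T_{i-1}),  Fo = alpha*dt/dx^2 <= 0.5
    alpha = 5.5e-6        # thermal diffusivity, m^2/s (placeholder for steel)
    dx, dt = 1e-3, 0.05   # grid spacing (m), time step (s)
    fo = alpha * dt / dx**2
    assert fo <= 0.5, "explicit scheme unstable: reduce dt"

    T = [1500.0] * 21     # initial shell temperature, deg C
    T[0] = 250.0          # cold, mold-side boundary (illustrative)

    for _ in range(200):  # march 10 s forward in time
        Tn = T[:]
        for i in range(1, len(T) - 1):
            Tn[i] = T[i] + fo * (T[i + 1] - 2.0 * T[i] + T[i - 1])
        T = Tn

    print(" ".join(f"{t:6.1f}" for t in T[:6]))  # temperatures near the cold face
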
10

Wilt, Suzanne Lorenzo, and Jeffrey L. Silvershein. "Applications of a Laptop Microcomputer During Cardiopulmonary Bypass." Journal of ExtraCorporeal Technology 20 (1988): 32–36. http://dx.doi.org/10.1051/ject/198820s032.

Abstract:
A “user friendly” microcomputer-based data management system designed for on-site documentation and immediate analysis of patient parameters during cardiopulmonary bypass is presented. The program is a macro-driven template for use with Lotus 1-2-3 (ver. 2.0) on an IBM PC-XT or compatible, such as the Zenith Z-183 laptop computer used in this report. Automatic calculation of data requiring minimal input includes kilos, centimeters, body surface area, blood flow, fluid balance, heparin, protamine and mannitol doses, and systemic vascular resistance. Also included is autocomputation of documented minimum, maximum and average pressure, blood flow, oxygen flow/percent, temperature, activated clotting time and SVR. Basic file functions such as saving, retrieving, deleting and report generation are easily performed. Database management capabilities include sorting by the contents of fields, searching for specific records and graphing of selected parameters. The computer-generated perfusion record has become a valuable tool in the tracking and evaluation of important patient parameters during cardiopulmonary bypass.
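
Two of the standard perfusion formulas a template of this kind automates, shown as a sketch; the abstract does not say which variants the authors used, so the DuBois body-surface-area formula and the usual SVR expression are assumptions:

    def bsa_dubois(weight_kg, height_cm):
        """DuBois & DuBois: BSA (m^2) = 0.007184 * W^0.425 * H^0.725."""
        return 0.007184 * weight_kg**0.425 * height_cm**0.725

    def svr(map_mmhg, cvp_mmhg, co_l_min):
        """Systemic vascular resistance, dyn*s/cm^5: 80 * (MAP - CVP) / CO."""
        return 80.0 * (map_mmhg - cvp_mmhg) / co_l_min

    bsa = bsa_dubois(80.0, 175.0)
    # Cardiac index of 2.4 L/min/m^2 is a typical target, used here for illustration.
    print(f"BSA = {bsa:.2f} m^2; pump flow at CI 2.4 = {2.4 * bsa:.2f} L/min")
    print(f"SVR = {svr(65.0, 8.0, 4.8):.0f} dyn*s/cm^5")
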

Dissertations / Theses on the topic "Lotus 1-2-3 (Computer program)"

1

Johannes, Elisabeth. "DEUTSCH 1, 2, 3!! : an interactive, multimedia, web-based program for the German foreign language classroom." Thesis, 2007. http://hdl.handle.net/10019/741.


Books on the topic "Lotus 1-2-3 (Computer program)"

1

Waller, Dick. Lotus 1-2-3. London: Pitman, 1985.

2

Shelton, Nelda. Using Lotus 1-2-3. Boston: Houghton Mifflin Co., 1988.

3

Herzlich, Paul. Illustrated Lotus 1-2-3. London: Fulton, 1990.

4

Herzlich, Paul. Illustrated Lotus 1-2-3. London: D. Fulton, 1990.

5

Smith, Gaylord N. Lotus 1-2-3 QUICK. 3rd ed. Cincinnati, Ohio: College Division, South-Western Pub. Co., 1993.

6

Smith, Gaylord N. Lotus 1-2-3 QUICK. Cincinnati: South-Western Pub. Co., 1986.

7

Godin, Seth. Foolproof Lotus 1-2-3. New York: Putnam Pub. Group, 1993.

8

Bond, Gary C. Lotus 1-2-3 Quick Reference Handbook. New York: Wiley, 1986.

9

Electronic Learning Facilitators, ed. The Lotus 1-2-3 Book. San Diego: Harcourt Brace Jovanovich, 1989.

10

Nelson, Kay Yarborough, ed. Lotus 1-2-3 Instant Reference. San Francisco: SYBEX, 1988.


Book chapters on the topic "Lotus 1-2-3 (Computer program)"

1

Weik, Martin H. "Lotus 1-2-3." In Computer Science and Communications Dictionary, 938. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/1-4020-0613-6_10717.

2

Wen, Cheng, Jialun Cao, Jie Su, Zhiwu Xu, Shengchao Qin, Mengda He, Haokun Li, Shing-Chi Cheung, and Cong Tian. "Enchanting Program Specification Synthesis by Large Language Models Using Static Analysis and Program Verification." In Computer Aided Verification, 302–28. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-65630-9_16.

Abstract:
Formal verification provides a rigorous and systematic approach to ensure the correctness and reliability of software systems. Yet, constructing specifications for the full proof relies on domain expertise and non-trivial manpower. In view of such needs, an automated approach for specification synthesis is desired. Existing automated approaches, however, are limited in their versatility: they either focus only on synthesizing loop invariants for numerical programs, or are tailored for specific types of programs or invariants. Programs involving multiple complicated data types (e.g., arrays, pointers) and code structures (e.g., nested loops, function calls) are often beyond their capabilities. To help bridge this gap, we present AutoSpec, an automated approach to synthesize specifications for automated program verification. It overcomes the shortcomings of existing work in specification versatility, synthesizing satisfiable and adequate specifications for full proof. It is driven by static analysis and program verification, and is empowered by large language models (LLMs). AutoSpec addresses the practical challenges in three ways: (1) driven by static analysis and program verification, LLMs serve as generators of candidate specifications; (2) programs are decomposed to direct the attention of LLMs; and (3) candidate specifications are validated in each round to avoid error accumulation during the interaction with LLMs. In this way, AutoSpec can incrementally and iteratively generate satisfiable and adequate specifications. The evaluation shows its effectiveness and usefulness, as it outperforms existing works by successfully verifying 79% of programs through automatic specification synthesis, a significant improvement of 1.592x. It can also be successfully applied to verify the programs in a real-world X509-parser project.
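
The round-by-round structure described above can be pictured as the following loop. This is a schematic sketch only, with llm_propose, verifier_validates, and proof_complete as hypothetical stand-ins for the LLM generator and the program verifier:

    def synthesize_specs(program, llm_propose, verifier_validates, proof_complete,
                         max_rounds=5):
        """Keep only verifier-validated candidates each round, so errors
        cannot accumulate across interactions with the LLM."""
        validated = []
        for _ in range(max_rounds):
            for cand in llm_propose(program, validated):   # candidate specifications
                if verifier_validates(program, validated + [cand]):
                    validated.append(cand)                 # grow the spec set incrementally
            if proof_complete(program, validated):         # adequate for the full proof?
                return validated
        return validated
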
3

Eilers, Marco, Severin Meier, and Peter Müller. "Product Programs in the Wild: Retrofitting Program Verifiers to Check Information Flow Security." In Computer Aided Verification, 718–41. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81685-8_34.

Abstract:
Most existing program verifiers check trace properties such as functional correctness, but do not support the verification of hyperproperties, in particular, information flow security. In principle, product programs allow one to reduce the verification of hyperproperties to trace properties and, thus, apply standard verifiers to check them; in practice, product constructions are usually defined only for simple programming languages without features like dynamic method binding or concurrency and, consequently, cannot be directly applied to verify information flow security in a full-fledged language. However, many existing verifiers encode programs from source languages into simple intermediate verification languages, which opens up the possibility of constructing a product program on the intermediate language level, reusing the existing encoding and drastically reducing the effort required to develop new verification tools for information flow security. In this paper, we explore the potential of this approach along three dimensions: (1) Soundness: We show that the combination of an encoding and a product construction that are individually sound can still be unsound, and identify a novel condition on the encoding that ensures overall soundness. (2) Concurrency: We show how sequential product programs on the intermediate language level can be used to verify information flow security of concurrent source programs. (3) Performance: We implement a product construction in Nagini, a Python verifier built upon the Viper intermediate language, and evaluate it on a number of challenging examples. We show that the resulting tool offers acceptable performance, while matching or surpassing existing tools in its combination of language feature support and expressiveness.
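
For intuition, the simplest product construction is self-composition: run two copies of the program on inputs that agree on public (low) data and check that the public outputs also agree. A toy sketch of that idea (not Nagini's Viper-level encoding):

    from itertools import product

    def prog(low, high):
        # Deliberately leaky: the public result depends on the secret.
        return low + (1 if high > 0 else 0)

    def noninterferent(f, lows, highs):
        """Equal low inputs must give equal low outputs for all secret pairs."""
        return all(f(l, h1) == f(l, h2)
                   for l in lows for h1, h2 in product(highs, repeat=2))

    print(noninterferent(prog, lows=range(3), highs=[-1, 0, 1]))  # False: leak detected
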
4

Wallace, Tracey D., and John T. Morris. "SwapMyMood: User-Centered Design and Development of a Mobile App to Support Executive Function." In Lecture Notes in Computer Science, 259–65. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58805-2_31.

Abstract:
This paper describes the research and development of the SwapMyMood smartphone application designed to support the use of evidence-based executive function strategies by people with traumatic brain injury. Executive dysfunction is a common sequela of traumatic brain injury (TBI) resulting in diminished cognitive-behavioral functioning. Problem-solving and emotion regulation are cognitive-behavioral functions that are often disrupted by changes in the executive control system. SwapMyMood is an electronic version of the Executive Plus/STEP program, a set of clinical techniques taught to people living with brain injury to help them 1) identify and implement solutions to problems encountered in daily life and 2) utilize the emotion cycle to understand and regulate emotional responses to these problems. The Executive Plus/STEP program has until now relied on paper-based instruction and use. Input from target users – people with brain injury and clinical professionals who teach this program to their patients – has contributed to key refinements of the features and functioning of the mobile app. Data gathered from target user participation in the user-centered design process are presented. Future directions for the ongoing development of technologies to support executive function strategies are also discussed.
5

Kura, Satoshi, Hiroshi Unno, and Ichiro Hasuo. "Decision Tree Learning in CEGIS-Based Termination Analysis." In Computer Aided Verification, 75–98. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81688-9_4.

Abstract:
We present a novel decision tree-based synthesis algorithm of ranking functions for verifying program termination. Our algorithm is integrated into the workflow of CounterExample Guided Inductive Synthesis (CEGIS). CEGIS is an iterative learning model where, at each iteration, (1) a synthesizer synthesizes a candidate solution from the current examples, and (2) a validator accepts the candidate solution if it is correct, or rejects it providing counterexamples as part of the next examples. Our main novelty is in the design of a synthesizer: building on top of a usual decision tree learning algorithm, our algorithm detects cycles in a set of example transitions and uses them for refining decision trees. We have implemented the proposed method and obtained promising experimental results on existing benchmark sets of (non-)termination verification problems that require synthesis of piecewise-defined lexicographic affine ranking functions.
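
The validator half of such a CEGIS loop is easy to sketch: a candidate ranking function f is rejected, with a counterexample, if some example transition (s, s') violates f(s) >= 0 or f(s) > f(s'). A minimal illustration with linear candidates (the paper's synthesizer learns piecewise decision-tree candidates instead):

    def check_ranking(f, transitions):
        """Return a violating transition to feed back as an example, else None."""
        for s, s_next in transitions:
            if not (f(s) >= 0 and f(s) > f(s_next)):
                return (s, s_next)
        return None

    # Example transitions of `while x > 0: x -= 1`; a state is just x.
    transitions = [(3, 2), (2, 1), (1, 0)]
    print(check_ranking(lambda x: x, transitions))   # None: f(x) = x proves termination
    print(check_ranking(lambda x: -x, transitions))  # (3, 2): candidate rejected
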
6

Golia, Priyanka, Brendan Juba, and Kuldeep S. Meel. "A Scalable Shannon Entropy Estimator." In Computer Aided Verification, 363–84. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-13185-1_18.

Abstract:
Quantified information flow (QIF) has emerged as a rigorous approach to quantitatively measure confidentiality; the information-theoretic underpinning of QIF allows the end-users to link the computed quantities with the computational effort required on the part of the adversary to gain access to desired confidential information. In this work, we focus on the estimation of Shannon entropy for a given program Π. As a first step, we focus on the case wherein a Boolean formula φ(X, Y) captures the relationship between inputs X and output Y of Π. Such formulas φ(X, Y) have the property that for every valuation to X, there exists exactly one valuation to Y such that φ is satisfied. The existing techniques require O(2^m) model counting queries, where m = |Y|. We propose the first efficient algorithmic technique, called EntropyEstimation, to estimate the Shannon entropy of φ with PAC-style guarantees, i.e., the computed estimate is guaranteed to lie within a (1 ± ε)-factor of the ground truth with confidence at least 1 − δ. Furthermore, EntropyEstimation makes only O(min(m, n)/ε²) counting and sampling queries, where m = |Y| and n = |X|, thereby achieving a significant reduction in the number of model counting queries. We demonstrate the practical efficiency of our algorithmic framework via a detailed experimental evaluation. Our evaluation demonstrates that the proposed framework scales to formulas beyond the reach of previously known approaches.
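
To see what the exhaustive baseline amounts to: computing H(Y) exactly means recovering the full output distribution, here by brute-force enumeration of all inputs, which is precisely the exponential work the paper's counting-and-sampling queries avoid. A toy sketch:

    from collections import Counter
    from itertools import product
    from math import log2

    def out(x):  # toy circuit: output bits as a function of 3 input bits
        return (x[0] & x[1], x[0] ^ x[2])

    inputs = list(product((0, 1), repeat=3))       # every valuation of X
    counts = Counter(out(x) for x in inputs)
    entropy = -sum((c / len(inputs)) * log2(c / len(inputs))
                   for c in counts.values())
    print(f"H(Y) = {entropy:.3f} bits")            # exact, but exponential in |X|
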
7

Abohaia, Zina Ahmed, and Yousef Mamdouh Hassan. "Summarising a Twitter Feed Using Weighted Frequency." In Lecture Notes in Civil Engineering, 179–87. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-27462-6_17.

Abstract:
Data is growing exponentially every day, with 500 million tweets sent on Twitter alone (Desjardins 2021). Twitter feeds are long, take time to understand, are multilingual, and contain multimedia. This makes them difficult to analyse in raw form, so the data needs to be extracted, cleaned, and structured before it can be used in research. This paper proposes summarising Twitter feeds as a manner of structuring them. The objectives we sought to achieve are: (1) use the Twitter API to retrieve tweets successfully; (2) efficiently detect the language of the text and tokenize it to then analyse its content (in its own language); (3) use live tweets as the input instead of a database of tweets; (4) create the interface as a plugin to make it accessible for computer scientists and others alike. We also aimed to test whether using weighted frequency to construct summaries of tweets would be successful, and by conducting a survey to evaluate our results, we found that our program is seen to be useful, accessible, and efficient at summarising Twitter accounts. Weighted frequency also proved to be good at summarising text in any language.
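
The scoring rule being tested is simple to state: weight each word by its frequency normalized by the maximum frequency, score each sentence by the sum of its word weights, and keep the top-scoring sentences. A compact sketch (sample text invented for illustration):

    import re
    from collections import Counter

    def summarize(text, k=2):
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        freq = Counter(re.findall(r"\w+", text.lower()))
        top = max(freq.values())
        weight = {w: c / top for w, c in freq.items()}   # normalized weighted frequency
        def score(s):
            return sum(weight.get(w, 0.0) for w in re.findall(r"\w+", s.lower()))
        best = set(sorted(sentences, key=score, reverse=True)[:k])
        return " ".join(s for s in sentences if s in best)  # preserve original order

    feed = ("Spreadsheets changed offices. Lotus 1-2-3 drove PC sales. "
            "Lotus spreadsheets defined an era. The weather was nice.")
    print(summarize(feed, k=2))
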
8

Meyer-Peyton, Lore. "Elements of a Successful Distributed Learning Program." In Distance Learning Technologies, 82–90. IGI Global, 2000. http://dx.doi.org/10.4018/978-1-878289-80-3.ch007.

Abstract:
Global connectivity has opened up a new dimension in education, namely, the concept of delivering education via technology to students who may never see their classmates or their instructor face to face. The typical school with its traditional classrooms does not exist in this new scenario, and many of the professionals responsible for developing distributed learning courses are new to the task. This chapter will guide the reader through the process of planning and implementing a distributed learning program. The model for this chapter is the distributed learning program provided by the Department of Defense Education Activity to schools serving the family members of U.S. military personnel at home and abroad. The DoDEA Electronic School (DES) offers sixteen courses to over six hundred students at 56 high schools in fourteen countries, spanning twelve time zones. The program has been in existence for over twelve years, evolving from a two-teacher program to a worldwide school headed by an administrative staff and employing 23 instructors and four technical support staff members. Courses currently available through the DES include seven advanced placement courses (Calculus AB and BC, Physics B, German, United States History, and Computer Science A and AB); five computer programming courses (Pascal I and II, Q-BASIC, Visual BASIC, and C++); economics; health; humanities; and science research seminar. In addition to offering student courses, the DES is in the process of adding an extensive staff development component. With teachers and staff based worldwide, the system can save a significant amount of travel money by providing staff development opportunities that are accessible at the local site. The DoDEA Electronic School grew up with technology. During those first years, students used an acoustic coupler and a telephone to call a central computer in the United States, where they accessed a text based conferencing program to communicate with their classmates and instructors. Today’s DES instructors develop their courses in Lotus Notes, and students can use either the Lotus Notes client or a Web browser. Domino servers at each school send and receive information via the Internet, resulting in efficient transfer of data. In today’s environment, rich with technology but short on hours in the day, there is no time afforded for the luxury of “evolving.” Professionals tasked with developing distributed learning programs for their organizations are given a staff, a budget and a mandate— and certainly a challenge. The goal of this chapter is to help those professionals meet the challenge by examining the key elements of a successful distributed learning program.
9

Sekerinski, Emil. "Exceptions for Dependability." In Dependability and Computer Engineering, 11–35. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-60960-747-0.ch002.

Abstract:
Exception handling allows (1) a program to be structured such that the original design is preserved in presence of possibly failing components; (2) rare or undesired cases to be treated in an unobtrusive manner; and (3) imperfections to be handled systematically. This chapter develops a theory of exception handling with try-catch statements, and demonstrates its use in the design of dependable systems by giving a formal account of the patterns of masking, propagating, flagging, rollback, degraded service, recovery block, repeated attempts, and conditional retry. The theory is based on weakest exceptional preconditions, which are used for both defining statements and proofs. Proof outlines are introduced and used to establish the correctness of the patterns.
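
One of the patterns listed, repeated attempts, in its most familiar executable form; this is a generic sketch rather than the chapter's weakest-exceptional-precondition formalization:

    import time

    def with_retries(action, attempts=3, delay_s=0.1):
        """Repeated attempts: mask transient failures; propagate the last one."""
        for i in range(attempts):
            try:
                return action()
            except Exception:
                if i == attempts - 1:
                    raise            # propagating: all attempts exhausted
                time.sleep(delay_s)  # then retry after a transient failure

    calls = {"n": 0}
    def flaky():
        calls["n"] += 1
        if calls["n"] < 3:
            raise RuntimeError("transient fault")
        return "ok"

    print(with_retries(flaky))  # "ok" on the third attempt
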
10

Jensen, Sisse Siggaard, and Simon B. Heilesen. "Time, Place and Identity in Project Work on the Net." In Computer-Supported Collaborative Learning in Higher Education, 51–69. IGI Global, 2005. http://dx.doi.org/10.4018/978-1-59140-408-8.ch003.

Abstract:
This chapter identifies some of the fundamental conditions and factors that affect collaborative project work on the Net. Understanding them is fundamental to developing key qualities in Net-based collaborative learning such as confidence, reliability, and trust. We argue that: (1) Collaboration and social interaction develop in continuous oscillations between abstract and meaningful frames of reference as to time and place. (2) Such oscillations condition the creation of a double identity of writer and author modes in social interaction. (3) Collaborative work creates an ever-increasing complexity of interwoven texts that we have to develop strategies for organizing. (4) One such important strategy is the negotiation of roles among the participants. Having established this theoretical framework, we discuss how to deal with these conditions in an actual Net-based learning environment, the Master of Computer-Mediated Communication program at Roskilde University, Denmark.

Conference papers on the topic "Lotus 1-2-3 (Computer program)"

1

Sinha, Dipendra K., and Michael T. McDonald. "Automated Design of Optimum Belt Drives." In ASME 1991 Design Technical Conferences. American Society of Mechanical Engineers, 1991. http://dx.doi.org/10.1115/detc1991-0082.

Abstract:
The paper describes a belt design package which works from within a commercial Computer Aided Design and Drafting package (AutoCAD) environment and utilizes FORTRAN programs for the design and selection of the lowest-weight components for the drive system. The components used in the process are available as stock items in the U.S.A. The relevant information on these products is stored in commercial database management systems such as EXCEL and LOTUS 1-2-3. Output from the package consists of scaled drawings and tabular specifications.
2

Joo, Sung-Hwan. "Development of the New Drafting and Computer Aided Design Standards for Mechanical Engineering Program." In ASME 2012 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/imece2012-88758.

Abstract:
The paper presents a method for developing drafting and CAD standards for a mechanical engineering program based on a systematic approach. Drafting has been used since the early stages of engineering, and many drafting techniques have been developed and standardized. More recently, Computer Aided Design (CAD) software has come into wide use in academia and industry as well. For these reasons, every mechanical engineering program offers drafting and CAD courses to its students, and some programs even have their own drafting and CAD standards. Developing drafting and CAD standards for a whole program, however, is not easy; it needs a careful plan, and the standards must meet certain requirements: 1) they should follow the ASME Y14/ANSI Y14 standard as much as possible; 2) students should be able to understand the standards and apply the rules to their own drawings and CAD models; 3) any instructor should be able to give students proper feedback on their drawings using the standards; and 4) graduating students should be able to adopt the standards of their company easily. To meet these requirements, some preliminary work must be done: 1) an understanding of ASME Y14 is needed; 2) expertise in one or more CAD software packages is required; 3) students' level of understanding of the ASME Y14 standards needs to be measured; and 4) feedback from industry is required. Each step of the development of the drafting and CAD standards is explained using real examples of student work.
3

Khosrowjerdi, Mohammad, and Gary A. Sniezak. "Design and Analysis of Mechanical Systems Using Electronic Spreadsheet Packages." In ASME 1993 International Computers in Engineering Conference and Exposition. American Society of Mechanical Engineers, 1993. http://dx.doi.org/10.1115/cie1993-0073.

Abstract:
In the past decade, sophisticated mathematical, statistical and graphical routines have been added to spreadsheet packages, thus converting them into serious analytical tools. The engineering community has adopted these packages to perform various sensitivity analyses on design problems which lend themselves to these types of applications. Design engineers can now rapidly and conveniently utilize these tools to perform a wide variety of analyses on design problems having simple geometries and loading conditions, without having to develop any FORTRAN, Basic or C code. This paper is concerned with the applications of spreadsheet packages in the areas of steady and transient thermal analysis, design optimization and the solution of initial-value differential equations. Lotus 1-2-3, in conjunction with the explicit finite difference method (Gauss-Seidel method), has been used to predict the transient temperature distribution in an axisymmetric model of the barrel of a handgun. The Borland International Quattro Pro spreadsheet program, along with Runge-Kutta and Newmark numerical integration techniques, has been employed to optimally design the recoil system of a handgun. Also, the use of Quattro Pro for constructing contour plots of temperature in a Very Large Scale Integrated (VLSI) chip and the IBM 3081 TCM cold plate is demonstrated. Finally, the built-in Error and Bessel functions in Microsoft Excel have been used to find the temperature distribution in semi-infinite and cylindrical objects.
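
The in-cell iteration such papers exploit is easy to picture: for steady-state conduction, each interior node relaxes to the average of its four neighbours, which a spreadsheet performs via circular references with iteration enabled. A generic Gauss-Seidel sketch on a small plate (boundary values are illustrative, not the paper's gun-barrel model):

    # Gauss-Seidel iteration for the steady-state Laplace equation on a plate.
    nx, ny = 6, 6
    T = [[0.0] * nx for _ in range(ny)]
    T[0] = [100.0] * nx  # hot top edge; remaining edges held at 0

    for _ in range(500):  # sweep until effectively converged
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                T[i][j] = 0.25 * (T[i - 1][j] + T[i + 1][j] + T[i][j - 1] + T[i][j + 1])

    print(f"centre temperature ~ {T[ny // 2][nx // 2]:.1f}")
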
4

Stephenson, Dave. "Using Spreadsheets to Recalculate Airspaces for Diffraction-Limited Assemblies." In International Optical Design Conference. Washington, D.C.: Optica Publishing Group, 1994. http://dx.doi.org/10.1364/iodc.1994.mid.202.

Abstract:
The use of spreadsheet programs (Lotus [1], Excel [2], etc.) as an optical design and production fabrication tool is reviewed. The common problem of melt and thickness recalculation prior to assembly is demonstrated using two approaches: (1) the spreadsheet preprocesses and writes a file for batch processing by a commercial optical design program, and (2) the spreadsheet uses matrix algebra and a partial derivative matrix to compute the respacing without any additional ray tracing.
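
The second approach amounts to a first-order correction: if the vector f collects the assembly's aberrations and J = ∂f/∂x their sensitivities to the airspaces x, solve J Δx = −Δf for the respacing. A minimal numpy sketch with a made-up sensitivity matrix (real entries would come from the optical design program):

    import numpy as np

    # Hypothetical partial-derivative matrix: rows = aberrations, cols = airspaces.
    J = np.array([[0.8, -0.2],
                  [0.1,  0.5]])
    delta_f = np.array([0.04, -0.01])  # change induced by the measured melt/thicknesses

    # Least-squares solve of J @ dx = -delta_f gives the compensating airspace moves.
    dx, *_ = np.linalg.lstsq(J, -delta_f, rcond=None)
    print("airspace adjustments:", np.round(dx, 4))
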
5

Hanawa, Kirk. "Thermodynamic Performance Analyses of Mixed Gas-Steam Cycle (1): Performance Prediction Method." In ASME 1998 International Gas Turbine and Aeroengine Congress and Exhibition. American Society of Mechanical Engineers, 1998. http://dx.doi.org/10.1115/98-gt-117.

Abstract:
There are various papers on ideas for improving gas turbine cycles, which in general discuss only optimum one-point cycle analysis.*1,*2,*6 It is, accordingly, unclear whether such improvement concepts can be applied to existing gas turbines or not. It might be difficult to incorporate such ideas where they would yield significant changes to operating modes, and it may be essential to assess improvement ideas from the viewpoint of their applicability to existing gas turbine models.*3 This paper introduces a performance analysis method based on a simplified small-perturbation procedure, showing thermodynamic behaviors based upon the component characteristics and the resultant influences of set operating parameters, such as ambient temperature and pressure, turbine inlet temperature, etc. The established method might be used as a rule of thumb for performance prediction when introducing water and/or steam injection into gas turbines, where the changes in operational parameters are defined by multi-linear differential equations. This is easy to implement on a computer in Lotus 1-2-3 or Excel to evaluate whether every parameter is within its limit or not, offering a very helpful performance evaluation tool for the conceptual design stage.
6

Zaazaa, Khaled E., Brian Whitten, Brian Marquis, Erik Curtis, Magdy El-Sibaie, Ali Tajaddini, and Ahmed A. Shabana. "A Nonlinear Rail Vehicle Dynamics Computer Program SAMS/Rail: Part 1—Theory and Formulations." In 2009 Joint Rail Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/jrc2009-63045.

Abstract:
Accurate prediction of railroad vehicle performance requires detailed formulations of wheel-rail contact models. In the past, most dynamic simulation tools used an offline wheel-rail contact element based on look-up tables that are used by the main simulation solver. Nowadays, the use of an online nonlinear three-dimensional wheel-rail contact element is necessary in order to accurately predict the dynamic performance of high speed trains. Recently, the Federal Railroad Administration, Office of Research and Development has sponsored a project to develop a general multibody simulation code that uses an online nonlinear three-dimensional wheel-rail contact element to predict the contact forces between wheel and rail. In this paper, several nonlinear wheel-rail contact formulations are presented, each using the online three-dimensional approach. The methods presented are divided into two contact approaches. In the first Constraint Approach, the wheel is assumed to remain in contact with the rail. In this approach, the normal contact forces are determined by using the technique of Lagrange multipliers. In the second Elastic Approach, wheel/rail separation and penetration are allowed, and the normal contact forces are determined by using Hertz’s Theory. The advantages and disadvantages of each method are presented in this paper. In addition, this paper discusses future developments and improvements for the multibody system code. Some of these improvements are currently being implemented by the University of Illinois at Chicago (UIC). In the accompanying “Part 2” and “Part 3” to this paper, numerical examples are presented in order to demonstrate the results obtained from this research.
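
In the Elastic Approach, the normal force grows with penetration according to Hertz's theory; for the textbook sphere-on-plane case, F = (4/3) E* √R δ^1.5, and separation simply yields zero force. A minimal sketch (E* and R are rough steel-on-steel placeholders, and real wheel-rail analysis uses the more general elliptical-contact form):

    def hertz_normal_force(delta_m, e_star_pa, radius_m):
        """Sphere-on-plane Hertz contact: F = (4/3) * E* * sqrt(R) * delta^1.5."""
        if delta_m <= 0.0:
            return 0.0  # separation allowed: no tensile contact force
        return (4.0 / 3.0) * e_star_pa * radius_m**0.5 * delta_m**1.5

    E_STAR = 1.15e11  # effective modulus for steel on steel, Pa (approximate)
    R_WHEEL = 0.46    # wheel rolling radius, m (placeholder)
    for delta in (0.0, 2e-5, 5e-5):
        force = hertz_normal_force(delta, E_STAR, R_WHEEL)
        print(f"delta = {delta:.0e} m -> F = {force:,.0f} N")
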
7

Li, Christine, and Yujia Zhang. "An Intelligent Program to Monitor 3D Printing and Detect Failures using Computer Vision and Machine Learning." In 9th International Conference on Artificial Intelligence. Academy and Industry Research Collaboration Center (AIRCC), 2023. http://dx.doi.org/10.5121/csit.2023.130713.

Abstract:
This paper proposes a novel solution for tracking the 3D printing process using an application that provides users with real-time updates on its progress [1]. The approach involves taking pictures of the 3D printer during the printing process, which are then analyzed by an AI model trained on thousands of labeled images to detect print failures [2]. The system is implemented using a Raspberry Pi and a camera, which capture images of the 3D printer and upload them to an online database [3]. The proposed application accesses this database to keep the user informed of the printer's current state, ensuring a seamless printing experience.
8

Kang, Jian, and J. T. Wang. "ISP: An Airbag Inflator Simulation Program." In ASME 1998 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 1998. http://dx.doi.org/10.1115/imece1998-0975.

Abstract:
An inflator simulation computer program, ISP, has been developed to facilitate the assessment of transient heat transfer models. In addition to providing for the incorporation of new transient heat transfer models, this program has several other useful and unique features: 1) it allows the gas to have multiple chemical components; 2) it allows the gas to use temperature-dependent thermodynamic properties; 3) it can self-generate gas thermodynamic properties through linkage to the chemical kinetics package CHEMKIN; and 4) it allows users to apply either cubic-spline or polynomial curve fitting to smooth noisy tank test data. The program can simulate three basic tasks: inflator tank test analysis using either the average temperature method or the dual pressure method, tank test simulation, and bag deployment analysis.
9

Brault, James W., and Mark C. Abrams. "DECOMP: A Fourier Transform Spectra Decomposition Program." In High Resolution Fourier Transform Spectroscopy. Washington, D.C.: Optica Publishing Group, 1989. http://dx.doi.org/10.1364/hrfts.1989.pdp2.

Abstract:
Current techniques for processing high resolution Fourier transform spectra revolve around interactive graphical display of the spectrum on a computer. The DECOMP spectrum decomposition program is designed explicitly for the reduction of Fourier transform spectra and focuses on reducing a spectrum into a list of line parameters. Basic methods of spectrum manipulation will be demonstrated, and an IBM PC-compatible computer will be available for hands-on demonstrations of the process of spectrum analysis. Figures 1 and 2 illustrate the process of background subtraction: in Figure 1 a low resolution spectrum is generated by binning the high resolution spectrum, and beneath the spectrum is a background correction function generated by creating a low resolution "minima" spectrum and smoothing it. The results of the background correction are given in Figure 2. Figure 3 illustrates a common problem in Fourier transform spectroscopy: the finite length of the interferogram introduces "ringing" into the spectrum. Using a filtered fitting routine in DECOMP, the ringing can be effectively removed, yielding the spectrum illustrated in Figure 4, in which several new spectral features appear that had been hidden beneath the ringing. An example of the atlas plots that can be generated using DECOMP is given in Figure 5.
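
The background-correction recipe described for Figures 1 and 2 (bin the spectrum, keep per-bin minima, smooth, subtract) is generic enough to sketch; DECOMP's actual smoothing surely differs in detail:

    def subtract_background(y, bin_size=50, smooth=5):
        """Estimate background from binned minima, smooth it, subtract point-wise."""
        mins = [min(y[i:i + bin_size]) for i in range(0, len(y), bin_size)]
        bg = [mins[i // bin_size] for i in range(len(y))]      # expand to full length
        half = smooth // 2
        sm = [sum(bg[max(0, i - half):i + half + 1]) /
              len(bg[max(0, i - half):i + half + 1]) for i in range(len(bg))]
        return [yi - bi for yi, bi in zip(y, sm)]

    # Synthetic spectrum: sloping baseline plus sparse peaks (illustrative only).
    spectrum = [10.0 + 0.01 * i + (5.0 if i % 200 == 0 else 0.0) for i in range(1000)]
    corrected = subtract_background(spectrum)
    print(f"max before: {max(spectrum):.2f}, after: {max(corrected):.2f}")
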
10

Đukić, Aleksandra, Jelena Marić, Eva Vaništa Lazarević, Biserka Mitrović, and Emilija Jović. "METHODOLOGICAL APPROACH: STUDENTS’ VISIONS AND CONCEPTS FOR PLANING THE SUSTAINABLE EXPO CENTER IN BELGRADE." In 20th SCIENTIFIC-PROFESSIONAL CONFERENCE WITH INTERNATIONAL PARTICIPATION “URBANISM AND SUSTAINABLE DEVELOPMENT”. Serbian Town Planner Association, 2024. http://dx.doi.org/10.46793/urbanizam24.177dj.

Abstract:
The "EXPO" exhibition is a famous world exhibition with a tradition of over 150 years. This event, led by the French Bureau International des Expositions, will be organized in Belgrade in 2027. A large influx of tourists, new construction, and recognizable architectural mega-projects are an integral part of large events like this one. Therefore, planning an alternative version of the "EXPO 2027" exhibition is an integral part of the task for fourth-year students of integrated academic studies at the University of Belgrade Faculty of Architecture in the academic year 2023-24. The students engaged in this task span two "studio-projects" with over 40 students who cooperate in group work with mentors and associates during 14 weeks of active teaching. The methodological basis of the students' approach is the methodology of RIBA (The Royal Institute of British Architects), which includes the following phases: (1) developing a vision and information base; (2) analysis and evaluation; (3) conceptual solution; and (4) design. These steps are creatively applied in working with students through innovative methods and techniques such as the superhero and lotus blossom methods. With their help, students begin the work by defining problems and potentials and proceed all the way to developing a new conceptual solution. The student works represent original visions of sustainable concepts, principles, elements, and content features adequate for an exhibition like "EXPO 2027", presented in the form of an attractive design and extensive program.

Reports on the topic "Lotus 1-2-3 (Computer program)"

1

Brown, R. J. L51598 Tow Methods Design Guide for the Installation of Offshore Pipelines. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), January 1989. http://dx.doi.org/10.55274/r0010093.

Abstract:
In an effort to progress the development of hydrocarbon fields found in water depths beyond 1000 feet, the oil and gas industry is considering cost-effective methods of subsea pipeline installation. As an established and reliable oil and gas transportation system, the pipeline will be an important factor in deepwater development. Surface installation methods are expected to become costly in deepwater due to the necessary equipment modifications and vessel requirements which are synonymous with increased tension capability and deepwater station-keeping ability. This estimated increase in installation costs has spurred interest in alternative pipeline installation methods that possess the potential for economic competitiveness. Limiting parameters for single as well as bundled pipeline configurations are evaluated for surface, near-surface, mid-depth, off-bottom and bottom tow methods of pipeline installation. The evaluation shows that the viable towing methods for construction of offshore pipelines include the mid-depth and bottom tow methods. Buckle collapse criteria should be utilized in the design of deepwater pipelines to achieve low submerged weights with minimum buoyancy requirements. Low submerged weight provides the maximum pipe tow string length, reducing the number of mid-line connections. A cost estimating program in LOTUS 1-2-3 format is included with this study. The costing program generates costs for each tow method based on user-specified pipeline parameters. The cost program also includes J-lay estimating capabilities for single pipelines.
2

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Cripple Wall Small-Component Test Program: Comparisons (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/lohh5109.

Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4 (WG4): Testing, whose central focus was to experimentally investigate the seismic performance of retrofit and existing cripple walls. Within the body of reports from WG4, the present report cross-compares a suite of four small cripple wall test phases, 28 specimens in total, with varied exterior finishes, namely stucco (wet) and non-stucco (dry) exterior finishes. Details representative of era-specific construction, specifically the most vulnerable pre-1960s construction, are of predominant focus in the present effort. Experiments involved the imposition of combined vertical loading and quasi-static reversed cyclic lateral load onto cripple walls 12 ft in length and 2 ft or 6 ft in height. All specimens in this report were constructed with the same boundary conditions and tested with the same vertical load. Parameters addressed in this report include: wet exterior finishes (stucco over framing, stucco over horizontal lumber sheathing, and stucco over diagonal lumber sheathing); and dry exterior finishes (horizontal siding, horizontal siding over diagonal sheathing, and T1-11 wood structural panels), with attention towards cripple wall height and the retrofit condition. The present report provides only a brief overview of the test program and setup, whereas a series of three prior reports present results of test groupings nominally by exterior finish type (wet versus dry). As such, the focus herein is to cross-compare key measurements and observations of the in-plane seismic behavior of all 28 specimens.
3

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Cripple Wall Small-Component Test Program: Wet Specimens I (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/dqhf2112.

Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4: Testing and focuses on the first phase of an experimental investigation to study the seismic performance of retrofitted and existing cripple walls with sill anchorage. Paralleled by a large-component test program conducted at the University of California [Cobeen et al. 2020], the present study involves the first of multiple phases of small-component tests conducted at UC San Diego. Details representative of era-specific construction, specifically the most vulnerable pre-1960s construction, are of predominant focus in the present effort. Parameters examined are cripple wall height, finish materials, gravity load, boundary conditions, anchorage, and deterioration. This report addresses the first phase of testing, which consisted of quasi-static reversed cyclic lateral load testing of six 12-ft-long, 2-ft-high cripple walls. All specimens in this phase were finished on their exterior with stucco over horizontal sheathing (referred to as a “wet” finish), a finish noted to be common for dwellings built in California before 1945. Parameters addressed in this first phase include: boundary conditions on the top, bottom, and corners of the walls; attachment of the sill to the foundation; and the retrofitted condition. Details of the test specimens, testing protocol, and instrumentation, as well as measured responses and physical observations, are summarized in this report. In addition, this report discusses the rationale and scope of subsequent small-component test phases. Companion reports present these test phases considering, amongst other variables, the impacts of dry finishes and cripple wall height (Phases 2–4).
Results from these experiments are intended to provide an experimental basis to support numerical modeling used to develop loss models, which are intended to quantify the reduction of loss achieved by applying state-of-practice retrofit methods as identified in FEMA P-1100, Vulnerability-Based Seismic Assessment and Retrofit of One- and Two-Family Dwellings.
4

Schiller, Brandon, Tara Hutchinson, and Kelly Cobeen. Cripple Wall Small-Component Test Program: Wet Specimens II (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/ldbn4070.

Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 4 (WG4): Testing, whose central focus was to experimentally investigate the seismic performance of retrofitted and existing cripple walls. This report focuses on stucco or “wet” exterior finishes. Paralleled by a large-component test program conducted at the University of California, Berkeley (UC Berkeley) [Cobeen et al. 2020], the present study involves two of multiple phases of small-component tests conducted at the University of California San Diego (UC San Diego). Details representative of era-specific construction, specifically the most vulnerable pre-1960s construction, are of predominant focus in the present effort. Parameters examined are cripple wall height, finish style, gravity load, boundary conditions, anchorage, and deterioration. This report addresses the third phase of testing, which consisted of eight specimens, as well as half of the fourth phase of testing, which consisted of six specimens, three of which are discussed here. Although conducted in different phases, their results are combined here to co-locate observations regarding the behavior of the wet (stucco) finished specimens. The results of the first phase of wet-specimen tests were presented in Schiller et al. [2020(a)]. Experiments involved the imposition of combined vertical loading and quasi-static reversed cyclic lateral load onto ten cripple walls 12 ft long and 2 or 6 ft high. One cripple wall was tested with a monotonic loading protocol. All specimens in this report were constructed with the same boundary conditions on the top and corners of the walls and were tested with the same vertical load.
Parameters addressed in this report include: wet exterior finishes (stucco over framing, stucco over horizontal lumber sheathing, and stucco over diagonal lumber sheathing), cripple wall height, loading protocol, anchorage condition, boundary condition at the bottom of the walls, and the retrofitted condition. Details of the test specimens and testing protocol, including instrumentation, as well as measured responses and physical observations, are summarized in this report. Companion reports present phases of the tests considering, amongst other variables, impacts of various boundary conditions, stucco (wet) and non-stucco (dry) finishes, vertical load, cripple wall height, and anchorage condition. Results from these experiments are intended to support advancement of numerical modeling tools, which ultimately will inform seismic loss models capable of quantifying the reduction of loss achieved by applying state-of-practice retrofit methods as identified in FEMA P-1100, Vulnerability-Based Seismic Assessment and Retrofit of One- and Two-Family Dwellings.
5

Chapman and Keshavarz-Valian. L51988 Development of Turbocharger-Reciprocating Engine Simulation (T-RECS). Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), December 2002. http://dx.doi.org/10.55274/r0010947.

Abstract:
The objective of this project was to develop a numerical simulation system that can be used to optimize the performance of a large-bore reciprocating engine. The simulation system includes sub-models of all major components between the inlet filter and the exhaust pipe of the engine. The deliverable product of this project is the software program Turbocharger-Reciprocating Engine Computer Simulation (T-RECS), which was developed at the National Gas Machinery Laboratory at Kansas State University. The simulation program calls upon a database of system components to allow the user to specify a particular system. The database in this edition of T-RECS contains nominal components, but will be expanded under the Populate T-RECS project that has been funded by the Gas Research Institute. This report contains 1) a discussion of the methodology utilized to develop T-RECS; 2) a user's guide; and 3) an example problem.
6

Maxey. L51537 Power Line Fault Current Coupling to Nearby Natural Gas Pipelines. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), November 1988. http://dx.doi.org/10.55274/r0010412.

Abstract:
Electric and natural gas utilities often find it advantageous to share rights-of-way. Available methods for evaluating electrical effects on gas pipelines have been difficult to use at best, and at worst, incorrect. A generalized approach that addresses inductive and conductive interference has not been available. Initiated to fill that need, this work is part of a research effort cosponsored by EPRI and the Pipe Line Research Council International, Inc. (PRCI). A generalized approach to the analysis of the effects of transmission line faults on natural gas transmission pipelines has been developed and is presented in this report. A state-of-the-art, user-friendly computational tool has been developed and verified for the analysis of interference between electrical power lines and nearby buried or aboveground pipelines. This computer program, ECCAPP, is distinguished by its ability to accurately model and analyze complex, realistic interactions between pipelines and power lines, using easily obtained input data. The final report consists of three volumes. An independent fourth volume was also developed to simplify the installation of the ECCAPP software. Volume 1 contains the theory upon which the ECCAPP computer program is based. A parametric analysis and graphical charts have been formulated using ECCAPP to permit estimates to be made in the field or during preliminary analyses for situations that are not too complex. A discussion of various useful mitigation methods is included. The discussion is based on previous research work and on the results of the parametric analysis. Volume 2 is a detailed user's manual which describes not only how to use the program itself, but also which engineering data must be sought during an analysis and how to assimilate it into a computer model. A detailed sample problem is included. A detailed "Glossary of Terms" used by ECCAPP, as well as suitable input data forms to be filled in by power line and pipeline engineers, are provided in the appendices. Volume 3 discusses the modeling and performance of pipeline insulation or coating.