Academic literature on the topic 'Latex (computer program)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Latex (computer program).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Latex (computer program)"

1

Stevenson, P. D. "Automatic Generation of Vacuum Amplitude Many-Body Perturbation Series." International Journal of Modern Physics C 14, no. 08 (2003): 1135–41. http://dx.doi.org/10.1142/s0129183103005236.

Full text
Abstract:
An algorithm and a computer program in Fortran 95 are presented which enumerate the Hugenholtz diagram representation of the many-body perturbation series for the ground state energy with a two-body interaction. The output is in a form suitable for post-processing such as automatic code generation. The result of a particular application, generation of LaTeX code to draw the diagrams, is shown.
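The generated drawing code is not reproduced in the abstract. As a rough illustration only, the second-order Hugenholtz vacuum diagram (two interaction vertices joined by four propagator lines) could be rendered by LaTeX code along the following lines; this is a modern TikZ sketch, not the Fortran program's actual output:

\documentclass{standalone}
\usepackage{tikz}
\begin{document}
\begin{tikzpicture}
  % two interaction vertices of the second-order diagram
  \fill (0,0) circle (2pt);
  \fill (0,2) circle (2pt);
  % four propagator lines joining the vertices
  \draw (0,0) .. controls (-1,1)    .. (0,2);
  \draw (0,0) .. controls (-0.35,1) .. (0,2);
  \draw (0,0) .. controls (0.35,1)  .. (0,2);
  \draw (0,0) .. controls (1,1)     .. (0,2);
\end{tikzpicture}
\end{document}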
APA, Harvard, Vancouver, ISO, and other styles
2

Kusworo, Kusworo, Nasmal Hamda, Purwati Yuni Rahayu, Heri Indra Gunawan, and Fitra Jaya. "Pelatihan Manajemen Referensi Mendeley Bagi Peneliti di Provinsi Banten." Indonesian Journal of Society Engagement 1, no. 2 (2021): 19–29. http://dx.doi.org/10.33753/ijse.v1i2.12.

Full text
Abstract:
The community service undertaken aims to provide training to researchers in the Province of Banten on the importance of producing quality scientific papers. Specifically, this training gives participants the opportunity to practise directly how to write scientific papers using the Mendeley software to manage the references used in the paper. Mendeley is a computer and web program developed by Elsevier to manage and share research papers, search for research data, and work together online. Documents written with Microsoft Word, OpenOffice, or LaTeX can be linked to the Mendeley software so that citations and reference lists (bibliographies) can be arranged automatically. The activities involved researchers in the Province of Banten. The response from the participants was positive, as could be seen from their active participation during the activity. They also found the training genuinely useful in helping them become more productive researchers by writing more scientific papers using Mendeley. Keywords: Mendeley; references; community service; scientific papers.
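For readers unfamiliar with the linkage the abstract describes: Mendeley can export its library as a BibTeX file, which a LaTeX document then cites in the usual way. A minimal sketch, assuming the exported file is named library.bib and contains an entry keyed kusworo2021 (both names are assumptions for illustration):

\documentclass{article}
\begin{document}
Reference-management training is described by Kusworo et al.~\cite{kusworo2021}. % key as assigned in the .bib file
\bibliographystyle{apalike}
\bibliography{library} % Mendeley keeps this exported file up to date
\end{document}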
APA, Harvard, Vancouver, ISO, and other styles
3

Rougny, Adrien. "sbgntikz—a TikZ library to draw SBGN maps." Bioinformatics 35, no. 21 (2019): 4499–500. http://dx.doi.org/10.1093/bioinformatics/btz287.

Full text
Abstract:
Abstract Summary The systems biology graphical notation (SBGN) has emerged as the main standard for representing biological maps graphically. It comprises three complementary languages: Process Description, for detailed biomolecular processes; Activity Flow, for influences of biological activities; and Entity Relationship, for independent relations shared among biological entities. TikZ, on the other hand, is one of the most commonly used packages to ‘program’ graphics within TeX/LaTeX. Here, we present sbgntikz, a TikZ library that allows drawing and customizing SBGN maps directly in TeX/LaTeX documents, using the TikZ language. sbgntikz supports all glyphs of the three SBGN languages and offers options that facilitate the drawing of complex glyph assemblies within TikZ. Furthermore, sbgntikz is provided together with a converter that allows transforming any SBGN map stored in the SBGN Markup Language into a TikZ picture, or rendering it directly into a PDF file. Availability and implementation sbgntikz, the SBGN-ML to sbgntikz converter, as well as a complete documentation can be freely downloaded from https://github.com/Adrienrougny/sbgntikz/. The library and the converter are compatible with all recent operating systems, including Windows, macOS, and all common Linux distributions. Supplementary information Supplementary material is available at Bioinformatics online.
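The library's actual glyph styles are documented at the GitHub repository above. The sketch below uses only plain TikZ to suggest what drawing a minimal Process Description fragment involves, with hand-drawn rounded rectangles standing in for the ready-made SBGN macromolecule glyphs that sbgntikz provides; it is an illustration, not sbgntikz's actual syntax:

\documentclass{standalone}
\usepackage{tikz}
\usetikzlibrary{positioning,arrows.meta}
\begin{document}
\begin{tikzpicture}
  % macromolecule glyphs drawn by hand; sbgntikz supplies these as styles
  \node[draw, rounded corners=6pt, minimum width=2.2cm, minimum height=1cm] (erk) {ERK};
  \node[draw, rounded corners=6pt, minimum width=2.2cm, minimum height=1cm, right=2cm of erk] (perk) {ERK-P};
  % an arc between the two states
  \draw[-{Stealth}] (erk) -- (perk);
\end{tikzpicture}
\end{document}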
APA, Harvard, Vancouver, ISO, and other styles
4

Thorning, L. "Introduction of new computing facilities at the Geological Survey of Greenland." Rapport Grønlands Geologiske Undersøgelse 140 (December 31, 1988): 7–9. http://dx.doi.org/10.34194/rapggu.v140.8023.

Full text
Abstract:
From a cautious start in the use of computers in the early 1970s, the Geological Survey of Greenland has developed complex and varied uses of modern computer facilities for both scientific and administrative tasks. GGU's first computer installation, a noisy TTY connected to the Computing Centre of Copenhagen University by a 110 baud telephone modem, was a self-service facility which was not easy to use. Over the years, first with use of a PDP-10 with just one Tektronix 4014 graphic terminal and later a succession of increasingly powerful PDP-11s with many terminals, GGU's in-house facilities just kept ahead of the ever increasing demand for computer services. At the same time a number of programs for special tasks were developed on external facilities, because they required larger computers or special facilities. In the 1980s the demands on the computer facilities, requiring many different types of programs, including word processing, had grown so large that GGU's in-house system could no longer handle them satisfactorily. A major reorganisation was required, and consequently activities were divided between personal computers (PCs; mainly administrative) and a new central computer (mainly scientific). This development took place in late 1986 with the purchase of 17 new personal computers and a new central computer with accessory peripheral equipment. This has allowed an increasing integration of computer methods into GGU's activities. A brief summary is given below.
APA, Harvard, Vancouver, ISO, and other styles
5

Pahmi, Pahmi, Ardiya Ardiya, Wandi Syahfutra, Agung Prasetyo Wibowo, Siti Niah, and Prih Febtiningsih. "PELATIHAN PENGGUNAAN MENDELEY UNTUK REFERENSI DALAM MENULIS KARYA ILMIAH BAGI GURU SMA HANDAYANI PEKANBARU." Jurnal Pengabdian UntukMu NegeRI 2, no. 2 (2018): 35–39. http://dx.doi.org/10.37859/jpumri.v2i2.849.

Full text
Abstract:
The community service undertaken aims to provide training to the teachers at Handayani Senior High School on the importance of producing quality scientific papers. Specifically, this training gives participants the opportunity to practise directly how to write scientific papers using the Mendeley software to manage the references used in the paper. Mendeley is a computer and web program developed by Elsevier to manage and share research papers, search for research data, and work together online. Documents written with Microsoft Word, OpenOffice, or LaTeX can be linked to the Mendeley software so that citations and reference lists (bibliographies) can be arranged automatically. To achieve the goals and targets of this community service activity (PPM), the implementers used lecture, discussion, and practicum methods so that the trainees could easily understand the material provided in the training. The activities involved the teachers at Handayani Pekanbaru Senior High School as participants. The response from the participants was positive, as could be seen from their active participation during the activity. They also found the training genuinely useful in helping them become more productive teachers by writing more scientific papers using Mendeley.

 Keywords: Community Service, Scientific Paper, Reference, Mendeley
APA, Harvard, Vancouver, ISO, and other styles
6

Galassi, Giuseppe, and Richard V. Mattessich. "Some Clarification to the Evolution of the Electronic Spreadsheet." Journal of Emerging Technologies in Accounting 11, no. 1 (2014): 99–104. http://dx.doi.org/10.2308/jeta-51114.

Full text
Abstract:
ABSTRACT As early as 1961, Mattessich suggested (in an article in The Accounting Review) using budget simulation in the form of a computerized spreadsheet. He followed this up with a mathematical model, outlined in his book Accounting and Analytical Methods (Mattessich 1964a), with a corresponding computer program (in FORTRAN IV on mainframe computers), including illustrations in a companion volume (Simulation of the Firm through a Budget Computer Program, Mattessich 1964b). Five years later (in 1969) Rene Pardo and Remy Landau co-presented “LANPAR” (LANguage for Programming Arrays at Random) at Random Corporation. This electronic spreadsheet type was also used on mainframe computers for budgeting at Bell Canada, AT&T, Bell operating companies, and General Motors. In 1978, Dan Bricklin and Robert Frankston introduced VisiCalc, the first commercialized spreadsheet program for personal desktop (Apple) computers. This program became the trailblazer for future developments of electronic spreadsheets.
APA, Harvard, Vancouver, ISO, and other styles
7

Mukhanov, S. A., A. A. Mukhanova, and V. V. Britvina. "Overview of some technologies for designing a task generator in higher mathematics for distance learning systems." SHS Web of Conferences 141 (2022): 03009. http://dx.doi.org/10.1051/shsconf/202214103009.

Full text
Abstract:
Successful teaching of mathematics to students is impossible without their independent performance of practical tasks, the manual compilation of which is very laborious. This article provides an overview of some technologies for designing task generators in higher mathematics, covering both the algorithms that can be used to generate tasks and the technologies used to build the generator. It also presents several options for implementing task generators: one based on the Microsoft Office suite, a Python program, and an online JavaScript generator. In all the proposed variants of the generator, typesetting systems (or a translator) are responsible for rendering the mathematical formulas, written in TeX (or LaTeX), into readable form. The generator based on the Microsoft Office suite uses such features of the office suite as random number generation, branching using the “IF” function of Excel, and the merge tool in Word. The proposed generator written in Python uses recursive functions to generate tasks of various types on the topic “Derivative of a function”. Tasks on the topic “Integrals” are generated taking into account the selected canonical forms. The proposed online JavaScript generator uses similar concepts. The generator can be effectively integrated into the Moodle LMS, which allows it to be used for building distance courses.
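To make the pipeline concrete: whatever the generator technology (Excel, Python, or JavaScript), its output is a TeX/LaTeX fragment per student. A hypothetical generated item on the “Derivative of a function” topic might look like the following, with the coefficients standing in for the generator's random values:

\documentclass{article}
\begin{document}
\begin{enumerate}
  \item Differentiate the function
        \[ f(x) = 3x^{5} + \sin(2x). \]
  % the generator can emit the answer key alongside, e.g.
  % f'(x) = 15x^{4} + 2\cos(2x)
\end{enumerate}
\end{document}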
APA, Harvard, Vancouver, ISO, and other styles
8

Campbell-Kelly, Martin. "Sir Maurice Vincent Wilkes. 26 June 1913 — 29 November 2010." Biographical Memoirs of Fellows of the Royal Society 60 (January 2014): 433–54. http://dx.doi.org/10.1098/rsbm.2013.0020.

Full text
Abstract:
Maurice Wilkes was head of the Mathematical Laboratory (later Computer Laboratory) at Cambridge University from 1945 until his retirement in 1980. He led the construction of the EDSAC (Electronic Delay Storage Automatic Calculator), the world’s first practical stored-program computer, completed in May 1949. In 1951 he invented microprogramming, a fundamental technique of computer design. He subsequently led the construction of the EDSAC 2 and the Titan computers; he then established the CAP computer project, the Cambridge Digital Ring, and the Cambridge Distributed Computer System. Beyond Cambridge University, he was founding president of the British Computer Society. He was knighted in 2000 for services to computing.
APA, Harvard, Vancouver, ISO, and other styles
9

Suryanti, and Acholder Tahi Perdoman. "HUBUNGAN PENGETAHUAN DAN PERSEPSI PRIA DENGAN PEMAKAIAN KONDOM DI WILAYAH KERJA PUSKESMAS RIMBO DATA" [The relationship between men's knowledge and perceptions and condom use in the working area of the Rimbo Data Community Health Center]. Zona Kedokteran: Program Studi Pendidikan Dokter Universitas Batam 9, no. 2 (2020): 62–70. http://dx.doi.org/10.37776/zked.v9i2.292.

Full text
Abstract:
A condom is a sheath made of latex that is worn on an erect penis (or inside the vagina) and acts as a barrier to prevent semen or fluid released at ejaculation from entering while the penis is in the vagina. Men's participation in the Family Planning program is quite low, and this has also reduced men's use of condom contraception. The purpose of this study was to determine the relationship between men's knowledge and perceptions and condom use in the working area of the Rimbo Data Community Health Center. This is quantitative research with a descriptive-analytic design and a cross-sectional approach, conducted in January 2019. The sampling technique was purposive sampling with a sample size of 80 people. Data were analyzed univariately and bivariately by computer using the chi-square statistical test. Based on the univariate analysis of the 80 samples, 46.2% had a low level of knowledge, 53.8% had a negative perception, and most respondents (63.8%) did not use condom contraception. The bivariate analysis shows a significant relationship between men's knowledge and condom use (p-value = 0.000 < 0.05), and likewise a significant relationship between men's perception and condom use (p-value = 0.000 < 0.05). It is expected that the results of this study can increase the knowledge and perception of the community through counseling, especially regarding condom contraception. Based on these results it can be concluded that there is a relationship between men's knowledge and perceptions and condom use in the working area of the Rimbo Data Community Health Center.
APA, Harvard, Vancouver, ISO, and other styles
10

Fishwick, Paul A. "A Decade of Digital Arts and Sciences at the University of Florida." Leonardo 45, no. 3 (2012): 211–16. http://dx.doi.org/10.1162/leon_a_00362.

Full text
Abstract:
The advent of cinematic special effects and console gaming since the late 1990s suggests an increasing and sustained emphasis on combining elements from the arts and computer science. The author presents a 10-year synopsis of a degree program created in 2000 to build an undergraduate curriculum using this emphasis as a catalyst. The degree program has resulted in steady student enrollment over the past decade as well as a significantly higher female student participation compared with the university's other three computer science degree programs. The article presents an overview of the program, qualitative and quantitative assessments, lessons learned and recommendations for continued improvement.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Latex (computer program)"

1

Renet, Nicolas P. "Automatic line segmentation in late medieval Latin manuscripts." 2012. http://liblink.bsu.edu/uhtbin/catkey/1678827.

Full text
Abstract:
This thesis describes a new line segmentation method that is optimized for medieval manuscripts. Using a thinned version of the binarized document image, the segmentation algorithm extracts two types of salient features from the handwritten patterns: nodes, whose distribution allows for the detection of line axes; segments, which are labeled according to the nodes they connect. This method obtains very good results on manuscripts that are usually considered hard to segment because of the numerous overlapping and touching lines. By contrast with many existing segmentation algorithms, this method does not rely on user-entered parameters and is not overly sensitive to the quality of the preprocessing treatments. Although more work is required to make it resistant to fluctuating lines, this line separation technique can already handle a large set of medieval documents and provides a useful input to a character segmentation program. Contents: Line segmentation techniques in off-line handwriting recognition; Line segmentation with the profile method; Feature-based line segmentation; Tests and conclusions. Department of Computer Science.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Latex (computer program)"

1

Math into LaTeX: An Introduction to LaTeX and AMS-LaTeX. Birkhäuser, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Math into LaTeX. 3rd ed. Birkhäuser, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Goossens, Michel, ed. The LaTeX Graphics Companion. 2nd ed. Addison-Wesley, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Goossens, Michel, Johannes Braams, and Chris Rowley, eds. The LaTeX Companion. 2nd ed. Addison-Wesley, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Mittelbach, Frank, and Alexander Samarin, eds. The LaTeX Companion. Addison-Wesley, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

First Steps in LaTeX. Birkhäuser, Switzerland, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Tsolomitis, Antonis, and Nick Sofroniou, eds. Digital Typography Using LaTeX. Springer, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

First Steps in LaTeX. Birkhäuser, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Lamport, Leslie. LaTeX: A Document Preparation System. Addison-Wesley, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hahn, Jane. LaTeX for Everyone: A Reference Guide and Tutorial for Typesetting Documents Using a Computer. Personal TeX, 1991.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Latex (computer program)"

1

Molenda, Michael H. "History and Development of Instructional Design and Technology." In Handbook of Open, Distance and Digital Education. Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-0351-9_4-1.

Full text
Abstract:
AbstractThe origins and evolution of instructional technology and instructional design are treated in this chapter as separate concepts, although having intertwined histories. As with other technologies, their origins can be traced to the scientific discoveries on which they are based. Early in the twentieth century, new discoveries in optics and electricity stimulated educators to the adoption of technological innovations such as projected still pictures, motion pictures, and audio recording. Individuals and, later, groups of affiliated professionals promoted enriching learning by adding visual and, later, audiovisual resources where verbal presentations previously dominated. As radio broadcasting grew in the 1930s and then television in the 1950s, these mass media were perceived as ways to reach audiences, in and out of school, with educative audiovisual programs. In the 1960s, the wave of interest in teaching machines incorporating behaviorist psychological technology engendered a shift in identity from audiovisual technologies to all technologies, including psychological ones. As computers became ubiquitous in the 1990s, they became the dominant delivery system, due to their interactive capabilities. With the global spread of the World Wide Web after 1995, networked computers took on communication functions as well as storage and processing functions, giving new momentum to distance education. Meanwhile, research during and after World War II prompted a technology of planning – systems analysis. In the 1960s, educators adapted the systems approach to instructional planning, starting the development of instructional systems design (ISD). Since the 1980s, ISD has been the reigning paradigm for instructional design, while instructional design has become the central activity of instructional technology professionals.
APA, Harvard, Vancouver, ISO, and other styles
2

Molenda, Michael H. "History and Development of Instructional Design and Technology." In Handbook of Open, Distance and Digital Education. Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-19-2080-6_4.

Full text
Abstract:
AbstractThe origins and evolution of instructional technology and instructional design are treated in this chapter as separate concepts, although having intertwined histories. As with other technologies, their origins can be traced to the scientific discoveries on which they are based. Early in the twentieth century, new discoveries in optics and electricity stimulated educators to the adoption of technological innovations such as projected still pictures, motion pictures, and audio recording. Individuals and, later, groups of affiliated professionals promoted enriching learning by adding visual and, later, audiovisual resources where verbal presentations previously dominated. As radio broadcasting grew in the 1930s and then television in the 1950s, these mass media were perceived as ways to reach audiences, in and out of school, with educative audiovisual programs. In the 1960s, the wave of interest in teaching machines incorporating behaviorist psychological technology engendered a shift in identity from audiovisual technologies to all technologies, including psychological ones. As computers became ubiquitous in the 1990s, they became the dominant delivery system, due to their interactive capabilities. With the global spread of the World Wide Web after 1995, networked computers took on communication functions as well as storage and processing functions, giving new momentum to distance education. Meanwhile, research during and after World War II prompted a technology of planning – systems analysis. In the 1960s, educators adapted the systems approach to instructional planning, starting the development of instructional systems design (ISD). Since the 1980s, ISD has been the reigning paradigm for instructional design, while instructional design has become the central activity of instructional technology professionals.
APA, Harvard, Vancouver, ISO, and other styles
3

Sizova, Nina Alekseevna, Nikita Aleksandrovich Osmakov, and Sergei Anatolevich Elkov. "Programma-trenazher protsessa kataliticheskogo krekinga" [A simulator program for the catalytic cracking process]. In Topical issues of pedagogy and psychology. Publishing house Sreda, 2023. http://dx.doi.org/10.31483/r-105035.

Full text
Abstract:
Currently, computer tools for conducting training courses are being actively developed. In practically all academic disciplines, simulator programs for real technological processes are being created. However, the creation and organization of training courses using e-learning tools, especially based on Internet technologies, is a difficult technological and methodological task. The industry of computer educational materials is expanding because of their demand and social significance. In this regard, it is relevant to develop the concept of building and using computer teaching aids, in particular training simulators, adequate to modern ideas of the development of education. This chapter discusses the developed simulator program for an automated control system for the catalytic cracking process, written in the C# programming language, and presents its capabilities and a description of how to work with it. This simulator allows students to independently learn how to arrange sensors and draw up a specification for instrumentation and control equipment, using the built-in tools for checking the correct placement of sensors. To run the program, you need a computer with Windows 7 or later, .NET Framework version 4.5 or later, and optionally MS Word. The program is portable and does not require installation, which allows it to be easily scaled to a classroom with several computers. The program is also small (currently less than 10 MB), which allows it to be sent over the Internet to students who are unable to attend classes. The results of the execution can be printed on any computer running Windows 7 or newer in PDF format or, with MS Word installed, in Doc format.
APA, Harvard, Vancouver, ISO, and other styles
4

Burks, Arthur W. "An Early Graduate Program in Computers and Communications." In Perspectives on Adaptation in Natural and Artificial Systems. Oxford University Press, 2005. http://dx.doi.org/10.1093/oso/9780195162929.003.0010.

Full text
Abstract:
This is the story of how, in 1957, John Holland, a graduate student in mathematics; Gordon Peterson, a professor of speech; the present writer, a professor of philosophy; and several other Michigan faculty started a graduate program in Computers and Communications—with John our first Ph.D. and, I believe, the world's first doctorate in this now-burgeoning field. This program was to become the Department of Computer and Communication Sciences in the College of Literature, Science, and the Arts about ten years later. It had arisen also from a research group at Michigan on logic and computers that I had established in 1949 at the request of the Burroughs Adding Machine Company. When I first met John in 1956, he was a graduate of MIT in electrical engineering, and one of the few people in the world who had worked with the relatively new electronic computers. He had used the Whirlwind I computer at MIT [33], which was a process-control variant of the Institute for Advanced Study (IAS) Computer [27]. He had also studied the 1946 Moore School Lectures on the design of electronic computers, edited by George Patterson [58]. He had then gone to IBM and helped program its first electronic computer, the IBM 701, the first commercial version of the IAS Computer. While a graduate student in mathematics at Michigan, John was also doing military work at the Willow Run Research Laboratories to support himself. And I had been invited to the Laboratories by a former student of mine, Dr. Jesse Wright, to consult with a small research group of which John was a member. It was this meeting that led to the University's graduate program and then the College's full-fledged department. The Logic of Computers Group, out of which this program arose, in part, then continued with John as co-director, though each of us did his own research. This anomaly of a teacher of philosophy meeting an accomplished electrical engineer in the new and very small field of electronic computers needs some explanation, one to be found in the story of the invention of the programmable electronic computer. For the first three programmable electronic computers (the manually programmed ENIAC and the automatically programmed EDVAC and Institute for Advanced Study Computer) and their successors constituted both the instrumentation and the subject matter of our new Graduate Program in Computers and Communications.
APA, Harvard, Vancouver, ISO, and other styles
5

Con Díaz, Gerardo. "The Long History of Software Patenting in the United States." In The Battle over Patents. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780197576151.003.0008.

Full text
Abstract:
The patent protections available to computer programs are almost as old as modern electronic computing. In the late 1940s and early 1950s, when a computer’s programming was as tangible as the machine’s circuits, there was nothing unusual about the idea that a patent could protect a program. The main problem was not whether programs were patent-eligible but how to draft patent applications for them that could bypass well-established doctrinal obstacles. As programs increased in complexity and programming languages enabled their creation through texts, inventors and their lawyers relied on the means-plus claim structure—a claim that discloses a machine as the means to perform a given collection of functions—as a shorthand to disclose the kinds of physicality that their predecessors would have spelled out. Successful patent applications combined means-plus language with very specific descriptions of interconnected electronic components to secure patent protections for the computer programs at their core.
APA, Harvard, Vancouver, ISO, and other styles
6

Haigh, Thomas, Mark Priestley, and Crispin Rope. "ENIAC and Its Contemporaries Meet the “Stored Program Concept”." In Eniac in Action. The MIT Press, 2016. http://dx.doi.org/10.7551/mitpress/9780262033985.003.0012.

Full text
Abstract:
Having explored ENIAC’s actual use and the programs it ran, the authors shift to a more abstract analytical level. Previous discussion of the invention of the modern computer has focused on the “stored program concept” as the crucial innovation setting modern computers apart from their more limited predecessors. The authors explore the origins of this phrase and its changing meaning over time. They look in detail at a 1944 document produced by J. Presper Eckert and sometimes claimed as a first statement of this concept, showing that it actually describes an electronic desk calculator. The authors summarize ENIAC’s capabilities after conversion and compare these on both practical and theoretical levels with the 1945 EDVAC design and with several other early computers. This supports a balanced appraisal of the senses in which the converted ENIAC did and did not constitute an initial implementation of the key ideas from the 1945 design. The chapter argues for an appraisal of early computers better grounded in the historical realities of documented use, and against a widespread fixation on the notion of “universality” based on a school of theoretical computer science that gained prominence years later.
APA, Harvard, Vancouver, ISO, and other styles
7

Dasgupta, Subrata. "Language Games." In It Began with Babbage. Oxford University Press, 2014. http://dx.doi.org/10.1093/oso/9780199309412.003.0017.

Full text
Abstract:
It must have been entirely coincidental that two remarkable linguistic movements both occurred during the mid-1950s—one in the realm of natural language, the other in the domain of the artificial; the one brought about largely by a young linguist named Noam Chomsky (1928–), the other initiated by a new breed of scientists whom we may call language designers; the one affecting linguistics so strongly that it would be deemed a scientific revolution, the other creating a class of abstract artifacts called programming languages and also enlarging quite dramatically the emerging paradigm that would later be called computer science. As we will see, these two linguistic movements intersected in a curious sort of way. In particular, we will see how an aspect of Chomskyan linguistics influenced computer scientists far more profoundly than it influenced linguists. But first things first: concerning the nature of the class of abstract artifacts called programming languages. There is no doubt that those who were embroiled in the design of the earliest programmable computers also meditated on a certain goal: to make the task of programming a computer as natural as possible from the human point of view. Stepping back a century, we recall that Ada, Countess of Lovelace specified the computation of Bernoulli numbers in an abstract notation far removed from the gears, levers, ratchets, and cams of the Analytical Engine (see Chapter 2, Section VIII). We have seen in the works of Herman Goldstine and John von Neumann in the United States, and David Wheeler in England that, even as the first stored-program computers were coming into being, efforts were being made to achieve the goal just mentioned. Indeed, a more precise statement of this goal was in evidence: to compose computer programs in a more abstract form than in the machine’s “native” language. The challenge here was twofold: to describe the program (or algorithm) in such a language that other humans could comprehend, without knowing much about the computer for which the program was written—in other words, a language that allowed communication between the writer of the program and other (human) readers—and also to communicate the program to the machine in such fashion that the latter could execute the program with minimal human intervention.
APA, Harvard, Vancouver, ISO, and other styles
8

Dasgupta, Subrata. "An Explosion of Subparadigms." In It Began with Babbage. Oxford University Press, 2014. http://dx.doi.org/10.1093/oso/9780199309412.003.0019.

Full text
Abstract:
In 1962, Purdue University in West Lafayette, Indiana, in the United States opened a department of computer science with the mandate to offer master’s and doctoral degrees in computer science. Two years later, the University of Manchester in England and the University of Toronto in Canada also established departments of computer science. These were the first universities in America, Britain, and Canada, respectively, to recognize a new academic reality formally—that there was a distinct discipline with a domain that was the computer and the phenomenon of automatic computation. Thereafter, by the late 1960s—much as universities had sprung up all over Europe during the 12th and 13th centuries after the founding of the University of Bologna (circa 1150) and the University of Paris (circa 1200)—independent departments of computer science sprouted across the academic maps of North America, Britain, and Europe. Not all the departments used computer science in their names; some preferred computing, some computing science, some computation. In Europe non-English terms such as informatique and informatik were used. But what was recognized was that the time had come to wean the phenomenon of computing away from mathematics and electrical engineering, the two most common academic “parents” of the field; and also from computer centers, which were in the business of offering computing services to university communities. A scientific identity of its very own was thus established. Practitioners of the field could call themselves computer scientists. This identity was shaped around a paradigm. As we have seen, the epicenter of this paradigm was the concept of the stored-program computer as theorized originally in von Neumann’s EDVAC report of 1945 and realized physically in 1949 by the EDSAC and the Manchester Mark I machines (see Chapter 8). We have also seen the directions in which this paradigm radiated out in the next decade. Most prominent among the refinements were the emergence of the historically and utterly original, Janus-faced, liminal artifacts called computer programs, and the languages—themselves abstract artifacts—invented to describe and communicate programs to both computers and other human beings.
APA, Harvard, Vancouver, ISO, and other styles
9

Rugelj, Joze. "Computer Supported Network Based Learning Environment for the Workplace." In Usability Evaluation of Online Learning Programs. IGI Global, 2003. http://dx.doi.org/10.4018/978-1-59140-105-6.ch013.

Full text
Abstract:
In this chapter we present our experiences in the field of computer-supported network-based learning over the last 10 years. We began our activities in this field with investigation of group communications and of generic models for online learning. Later we extended our interests to implementation of computer-supported network-based learning environments for different user groups and to measures that have to accompany introduction of new learning technologies to schools or workplaces.
APA, Harvard, Vancouver, ISO, and other styles
10

Xu, Xun. "Program CNCs." In Integrating Advanced Computer-Aided Design, Manufacturing, and Numerical Control. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-59904-714-0.ch009.

Full text
Abstract:
A CNC machine can be programmed in different ways to machine a workpiece. In addition to creating the cutting program, many other factors also need to be considered or programmed. These include workholding devices, cutting tools, and machining conditions, as well as the machining strategy. The first-generation CNCs were programmed manually, and punched tapes were used as a medium for transferring the machine control data (MCD), that is, G-codes, into a controller. Tapes were later replaced by RS232 cables, floppy disks, and finally standard computer network cables. Today’s CNC machines are controlled directly from files created by CAD/CAM or CAM software packages, so that a part or assembly can go directly from design to manufacturing without the need of producing a drafted paper drawing of the component. This means that, for the first time, bringing design and manufacturing under the same automation regime becomes a reachable target. Error detection features give CNC machines the ability to alert the operator in different ways, including giving a ring to the operator’s mobile phone if it detects that a tool has broken. While the machine awaits replacement of the tool, it can run other parts that are already loaded, up to the operations requiring that tool, and then wait for the operator. The focus of this chapter is on a detailed account of the basics of CNC programming, and the emphasis is on G-code and the Automatic Programming Tool (APT). G-code is still the dominant manual programming language for CNC machine tools. It is also the main form of control commands many CAD/CAM (or CAM) systems output. APT was developed soon after G-codes and CNC machine tools were developed, to alleviate the drudgery of straight G-code programming. Modern CAD/CAM systems are now becoming the mainstream tools for CNC programming.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Latex (computer program)"

1

Beg, Azam, Manzoor Ahmed Khan, and Maqsood Sandhu. "Spreadsheets and LaTeX – A Perfect Union for the Creation of Testbanks for Online Assessment." In International Conference on Education and New Developments. inScience Press, 2021. http://dx.doi.org/10.36315/2021end010.

Full text
Abstract:
The current COVID-19 pandemic forced an instant shift in teaching from traditional classrooms to an online format. While it was relatively easy to switch teaching to online mode, the assessment process presented bigger challenges. Specifically, assessment quality is compromised because, during an online test, most students are able to seek help from their fellow test-takers as well as from different online sources. One way of discouraging the students’ tendency to share answers among themselves is to inform them that they will be given different questions than their peers. In this paper, we propose using spreadsheets to create test questions in LaTeX format, thus making it easy to present each student with a ‘unique’ question set during a test. The uniqueness of the testbank questions comes from randomly generated variable values in numerical questions. The spreadsheet also produces the answers to the questions to help automate the grading process. Such testbanks are suitable not only for normally sized courses but also for larger massive open online courses. We have successfully used such testbanks for multiple courses in our university’s Computer Engineering program. Originally, we used the testbanks for in-class assessment. After classes shifted online, we ported the testbanks to our learning management system to enable online assessment.
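As an illustration of the approach (the wording, numbers, and spreadsheet formula below are assumptions, not the authors' material): each spreadsheet row holds randomly generated values and a concatenation formula, such as ="A resistor of $"&A2&"~\Omega$ carries ...", that splices the values into a LaTeX question template. Collecting the rows yields a per-student paper along these lines:

\documentclass{exam}
\begin{document}
\begin{questions}
  % one hypothetical generated instance; 47 and 0.30 stand in for
  % the spreadsheet's random values
  \question A resistor of $47~\Omega$ carries a current of $0.30$~A.
            Compute the dissipated power $P = I^{2}R$.
  % the same row also emits the answer (4.23 W) for automated grading
\end{questions}
\end{document}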
APA, Harvard, Vancouver, ISO, and other styles
2

Beckmann, Leo H. J. F. "A small-computer program for optical design and analysis, written in 'C'." In International Lens Design. Optica Publishing Group, 1990. http://dx.doi.org/10.1364/ild.1990.ltuc5.

Full text
Abstract:
A modular computer program for the design and analysis of optical systems on a small computer has been developed under the name "Opdesign". While its origins date back to the late seventies and the use of BASIC on, successively, programmable calculators and 8-bit home computers, the current program is written in 'C' and runs on different types of personal computers. Concurrently, the speed of calculations, in particular skew-ray tracing, went up from one ray-surface per second to 30–60 ray-surfaces per second, depending on the compiler and the hardware. With a program size (executable code) in excess of 250 kByte, the package covers system data (input, general properties), aberration calculation (third-order, wavefront, and exact ray aberrations), a number of tools for design manipulation, and automatic design improvement (by the damped least squares method).
APA, Harvard, Vancouver, ISO, and other styles
3

Saseta Naranjo, Albertina. "Dibujando la Casa Peyrissac" [Drawing the Peyrissac House]. In LC2015 - Le Corbusier, 50 years later. Universitat Politècnica València, 2015. http://dx.doi.org/10.4995/lc2015.2015.680.

Full text
Abstract:
Abstract: We imagine ourselves in the position of a Le Corbusier employee commissioned to produce fair copies of sketches for a house. The sketches were made by Le Corbusier in 1942, during his stay in Algiers, and refer to a house designed for an agricultural property near Mount Chenoua belonging to the Peyrissac family. The original drawings have been provided by the Le Corbusier Foundation; the Œuvre Complète also provides information for this project, although it mostly consists of a selection of the same drawings accompanied by some explanatory notes. For practical reasons current technology has been used, so all drawings have been made by computer. The work focuses on a series of drawings, believed to be the last ones chronologically, which represent the ground floor, first floor, and roof plans, a cross-section of the house, and two axonometric views. Attention has also been paid to the rest of the existing documentation, especially drawings that refer to the siting of the house in its surroundings, the overall organization, and the program of requirements, as well as any specific information about dimensions. The house has been inserted into the original plot, located beforehand, and the ground floor, the first floor, and a cross-section have been drawn, based on the information provided by the original drawings of the Master. As an exercise, a hypothetical construction detail of the roof has been drawn. Keywords: Le Corbusier; Peyrissac; Chenoua; Algiers; Sketch; House. DOI: http://dx.doi.org/10.4995/LC2015.2015.680
APA, Harvard, Vancouver, ISO, and other styles
4

Trevin, Stéphane, Matthieu Persoz, and Thomas Knook. "Making FAC Calculations With BRT-CICERO™ and Updating to Version 3.0." In 17th International Conference on Nuclear Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/icone17-75341.

Full text
Abstract:
The surveillance of Flow Accelerated Corrosion (FAC) on secondary pipes is a major concern for every nuclear power plant operator. After the Surry accident in 1986, EDF launched a computer code development program to monitor this degradation phenomenon. A chemical corrosion model has been developed, based on laboratory test results obtained by EDF R&D since the late 1970s. This model makes it possible to compute the wall thickness loss of pipes subjected to FAC, with respect to the thermo-hydraulic conditions, the fluid chemistry, the material chromium content, the pipe geometry, and the cycle durations.
APA, Harvard, Vancouver, ISO, and other styles
5

Monzón, Mario, Pedro M. Hernández, María D. Marrero, Antonio N. Benítez, Fernando Ortega, and Ayoze Socas. "New Development in Computer Aided Electroforming for Rapid Prototyping Applications." In ASME 2008 9th Biennial Conference on Engineering Systems Design and Analysis. ASMEDC, 2008. http://dx.doi.org/10.1115/esda2008-59112.

Full text
Abstract:
Electroforming enables the manufacture of metallic parts with good mechanical properties and a high level of accuracy and reproducibility. A thin metallic shell is deposited on a model and later released from it. There are several applications of electroforming combined with rapid prototyping: injection moulds, EDM electrodes, moulds for rotational moulding, complex metallic parts, etc. However, the two main disadvantages of electroforming are non-uniform thickness distribution and long shell-manufacturing times. The paper focuses on a new development intended to achieve uniform thickness and faster shell manufacturing. A new device and software, named Elecform3D™, have been developed. The device is an automatic machine controlled by computer and assembled into the electroforming equipment. The software not only controls the device but also simulates and calculates the optimal positions of the cathode based on the electrolytic parameters of the bath. The software recommends an automatic program of movements or allows the operator to choose alternative programs if necessary. Elecform3D is an important step beyond electroforming as practised so far. An RP 3D printer combined with Elecform3D is a cheaper alternative for manufacturing high-quality metallic parts than SLS/SLM technologies or high-speed machining, mainly for rapid tooling and even rapid manufacturing.
APA, Harvard, Vancouver, ISO, and other styles
6

Hans, Atharva, Ashish M. Chaudhari, Ilias Bilionis, and Jitesh H. Panchal. "A Mixed-Method Analysis of Schedule and Cost Growth in Defense Acquisition Programs." In ASME 2021 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/detc2021-71517.

Full text
Abstract:
Abstract Cost and schedule overruns are common in the procurement of large-scale defense acquisition programs. Current work focuses on identifying the root causes of cost growth and schedule delays in defense acquisition programs. There is a need for a mix of quantitative and qualitative analysis of cost and schedule overruns that takes into account program factors such as technology maturity, design maturity, initial acquisition time, and program complexity. Such analysis requires an easy-to-access database of program-specific data about how an acquisition program’s technical and financial characteristics vary over time. To fulfill this need, the objective of this paper is twofold: (i) to develop a database of major US defense weapons programs which includes details of their technical and financial characteristics and how they vary over time, and (ii) to test various hypotheses about the interdependence of such characteristics using the collected data. To achieve the objective, we use a mixed-method analysis of the schedule and cost growth data available in the U.S. Government Accountability Office’s (GAO’s) defense acquisitions annual assessments for the period 2003–2017. We extracted both analytical and textual data from the original reports into Excel files and further created an easy-to-access database usable from a Python environment. The analysis reveals that technology immaturity is the major driver of cost and schedule growth during the early stages of acquisition programs, while technical inefficiencies drive cost overruns and schedule delays during the later stages. Further, we find that acquisition programs with a longer initial length do not necessarily have greater cost growth. The dataset and the results provide a useful starting point for the research community for modeling cost and schedule overruns, and for practitioners to inform their systems acquisition processes.
APA, Harvard, Vancouver, ISO, and other styles
7

Hotchkiss, Anthony. "The Development of a Profile-Milling Program for Teaching Computer-Aided-Manufacturing and CNC Programming." In ASME 1997 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1997. http://dx.doi.org/10.1115/detc97/cie-4439.

Full text
Abstract:
Abstract At SUNY College at Buffalo, a new course, TEC302, CAD/CAM, computer-aided-design and computer-aided-manufacturing, was added to the Industrial Technology (IT) undergraduate curriculum in the fall of 1994. At that time, the technology department had been using the AutoCAD system for design/drafting, and SmartCAM for demonstrating computer-aided-manufacturing. SmartCAM is a sophisticated product that takes a great deal of training to use, does not work directly in AutoCAD, and, with only four licenses, was not available to all the students. For these reasons, the author developed a CAM program, VAL-CAM, that works inside AutoCAD and has most of the aspects of a more sophisticated CAM program, yet is simpler to use, is available to all students, and automatically generates CNC (computer-numerical-control) code suitable for driving the department’s vertical milling machining center. This paper discusses the development of VAL-CAM, which is written in the AutoLISP language for compatibility with AutoCAD. The dialogue control language (DCL) of AutoCAD was also used for part of the user interface of VAL-CAM. The algorithms, flow diagrams, pseudo code, and actual LISP code for some of the more interesting parts of the program are presented. VAL-CAM is under continuous development, and later sections of the program will be discussed in future papers.
APA, Harvard, Vancouver, ISO, and other styles
8

Treichler, David H., and Ronald Carmichael. "Observations on Raytheon 6 σ: The ASTOR Early Engagement." In ASME 2002 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2002. http://dx.doi.org/10.1115/detc2002/dfm-34197.

Full text
Abstract:
Raytheon Six Sigma (R6Sigma) is a six-step quality management approach and culture change effort that has proven to be highly effective. It incorporates lessons learned from the earlier efforts by Motorola, Texas Instruments, Allied Signal, General Electric, and many others. Within each of these major companies, the Six Sigma approach is a reflection of the company’s unique culture and specific industry needs. However, one criticism common to most of these programs is that the change analysis and leadership tools are engaged too late in the overall process. Building upon the lessons learned by other organizations, the paper recounts the early engagement of the Six Sigma tools, coupled with direct customer involvement, on a large-scale program by the Raytheon Company: The Airborne Stand-off Radar (ASTOR) system, which is under development for the UK Ministry of Defense (MOD). Because the ASTOR program is still years from completion, this paper cannot provide detail in terms of final lessons learned or quantified results derived from the front-end application of R6Sigma on this program. The purpose of this paper is to capture the thought processes behind (and initial stages observed during) early customer involvement and the application of R6Sigma process improvement approaches at the beginning of the program.
APA, Harvard, Vancouver, ISO, and other styles
9

Andreou, Thomas, Craig White, Konstantinos Kontis, Shahrokh Shahpar, and Nicholas Brown. "Part 1: A Swirl Vane Generation Code for Fuel Spray Nozzles." In ASME Turbo Expo 2020: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/gt2020-15414.

Full text
Abstract:
Abstract Achieving an optimal level of flow swirl is required for efficient mixing of air and fuel in order to realise lean combustion. A novel method is devised to achieve a necessary level of swirl, using NACA airfoil profiles as the baseline for swirl stator blades. Formulas for achieving a required level of swirl have been derived and implemented in a computer program that generates aerodynamic vanes which meet the specified swirl. The usability of the program over a broad range of Reynolds numbers is verified. A curve fitting method has been developed, taking into account the trailing edge angles and blade solidity, in order to speed up the iterative process. A significant computational speed-up is achieved from this approach, and an excellent initial preliminary vane design can be obtained, which can later be introduced inside an automated optimisation process.
APA, Harvard, Vancouver, ISO, and other styles
10

Mehl, Douglas C., Kurt A. Beiter, and Kos Ishii. "Design for Injection Molding: Using Dimensional Analysis to Assess Fillability." In ASME 1994 Design Technical Conferences collocated with the ASME 1994 International Computers in Engineering Conference and Exhibition and the ASME 1994 8th Annual Database Symposium. American Society of Mechanical Engineers, 1994. http://dx.doi.org/10.1115/detc1994-0085.

Full text
Abstract:
This paper addresses the determination of wall thicknesses and gating schemes in the preliminary design of injection-molded plastic parts. Today, most existing design guidelines come in the form of experience-based qualitative rules. If designers already have a detailed geometry of the part, numerical process simulation programs provide another form of design aid. There exists a huge gap between these two types of design aids: the experience-based guidelines are often too vague, while the process simulation programs come too late to impact preliminary part design. To fill this gap, this paper develops physics-based guidelines that utilize dimensional analysis techniques. Experiments and simulation studies can deduce non-dimensional relationships between flow length, thickness, material, and process parameters. The guidelines will aid plastic component designers in determining wall thickness, gating schemes, and material selection in the preliminary stages of part design. This paper describes the formulation of the non-dimensional charts for fillability assessment and explains the use of these charts in part design. We further outline an ongoing experimental program to validate and refine our formulation.
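As a rough illustration of how such non-dimensional guidelines can be applied, the Python sketch below checks a dimensionless flow-length-to-thickness ratio against an allowable limit. The functional form of max_flow_ratio and the constant k are invented placeholders; in practice the limits would come from the experimentally derived charts the paper describes.

# A minimal sketch of a dimensional-analysis style fillability check.
# The limit function below is a hypothetical placeholder, not the
# non-dimensional charts formulated in the paper.

def fillability_ratio(flow_length_mm, wall_thickness_mm):
    """Dimensionless flow length: how far the melt must travel
    relative to the wall thickness feeding it."""
    return flow_length_mm / wall_thickness_mm

def max_flow_ratio(melt_flow_index, injection_pressure_mpa, k=2.5):
    """Hypothetical allowable L/t limit as a function of material and
    process parameters; k is an assumed fitted constant."""
    return k * melt_flow_index * injection_pressure_mpa ** 0.5

part_ok = fillability_ratio(250.0, 2.0) <= max_flow_ratio(12.0, 80.0)
print("fillable" if part_ok else "increase thickness or add a gate")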
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Latex (computer program)"

1

Rankin, Nicole, Deborah McGregor, Candice Donnelly, et al. Lung cancer screening using low-dose computed tomography for high risk populations: Investigating effectiveness and screening program implementation considerations: An Evidence Check rapid review brokered by the Sax Institute (www.saxinstitute.org.au) for the Cancer Institute NSW. The Sax Institute, 2019. http://dx.doi.org/10.57022/clzt5093.

Full text
Abstract:
Background

Lung cancer is the number one cause of cancer death worldwide.(1) It is the fifth most commonly diagnosed cancer in Australia (12,741 cases diagnosed in 2018) and the leading cause of cancer death.(2) The number of years of potential life lost to lung cancer in Australia is estimated to be 58,450, similar to that of colorectal and breast cancer combined.(3) While tobacco control strategies are most effective for disease prevention in the general population, early detection via low-dose computed tomography (LDCT) screening in high-risk populations is a viable option for detecting asymptomatic disease in current (13%) and former (24%) Australian smokers.(4) The purpose of this Evidence Check review is to identify and analyse existing and emerging evidence for LDCT lung cancer screening in high-risk individuals to guide future program and policy planning.

Evidence Check questions

This review aimed to address the following questions:
1. What is the evidence for the effectiveness of lung cancer screening for higher-risk individuals?
2. What is the evidence of potential harms from lung cancer screening for higher-risk individuals?
3. What are the main components of recent major lung cancer screening programs or trials?
4. What is the cost-effectiveness of lung cancer screening programs (including studies of cost–utility)?

Summary of methods

The authors searched the peer-reviewed literature across three databases (MEDLINE, PsycINFO and Embase) for existing systematic reviews and original studies published between 1 January 2009 and 8 August 2019. Fifteen systematic reviews (of which eight were contemporary) and 64 original publications met the inclusion criteria set across the four questions.

Key findings

Question 1: What is the evidence for the effectiveness of lung cancer screening for higher-risk individuals?

There is sufficient evidence from systematic reviews and meta-analyses of combined (pooled) data from screening trials of high-risk individuals to indicate that LDCT examination is clinically effective in reducing lung cancer mortality. In 2011, the landmark National Lung Screening Trial (NLST, a large-scale randomised controlled trial [RCT] conducted in the US) reported a 20% (95% CI 6.8–26.7%; P=0.004) relative reduction in mortality among long-term heavy smokers over three rounds of annual screening. High-risk eligibility criteria were defined as people aged 55–74 years with a smoking history of ≥30 pack-years (years in which a smoker has consumed 20-plus cigarettes each day) who, if former smokers, had quit within the past 15 years.(5) All-cause mortality was reduced by 6.7% (95% CI 1.2–13.6%; P=0.02). Initial data from the second landmark RCT, the NEderlands-Leuvens Longkanker Screenings ONderzoek (the NELSON trial), have found an even greater reduction of 26% (95% CI 9–41%) in lung cancer mortality, with full trial results yet to be published.(6, 7) Pooled analyses, including several smaller-scale European LDCT screening trials insufficiently powered in their own right, collectively demonstrate a statistically significant reduction in lung cancer mortality (RR 0.82, 95% CI 0.73–0.91).(8) Despite the reduction in all-cause mortality found in the NLST, pooled analyses of seven trials found no statistically significant difference in all-cause mortality (RR 0.95, 95% CI 0.90–1.00).(8) However, cancer-specific mortality is currently the most relevant outcome in cancer screening trials. These seven trials demonstrated a significantly greater proportion of early-stage cancers in LDCT groups compared with controls (RR 2.08, 95% CI 1.43–3.03). Thus, when considering results across mortality outcomes and early-stage cancers diagnosed, LDCT screening is considered to be clinically effective.

Question 2: What is the evidence of potential harms from lung cancer screening for higher-risk individuals?

The harms of LDCT lung cancer screening include false-positive tests and the consequences of unnecessary invasive follow-up procedures for conditions that are eventually diagnosed as benign. While LDCT screening leads to an increased frequency of invasive procedures, it does not result in greater mortality soon after an invasive procedure (in trial settings when compared with the control arm).(8) Overdiagnosis, exposure to radiation, psychological distress and an impact on quality of life are other known harms. Systematic review evidence indicates the benefits of LDCT screening are likely to outweigh the harms. The potential harms are likely to be reduced as refinements are made to LDCT screening protocols through: i) the application of risk prediction models (e.g. the PLCOm2012), which enable a more accurate selection of the high-risk population through the use of specific criteria (beyond age and smoking history); ii) the use of nodule management algorithms (e.g. Lung-RADS, PanCan), which assist in the diagnostic evaluation of screen-detected nodules and cancers (e.g. more precise volumetric assessment of nodules); and iii) more judicious selection of patients for invasive procedures. Recent evidence suggests a positive LDCT result may transiently increase psychological distress but does not have long-term adverse effects on psychological distress or health-related quality of life (HRQoL). With regard to smoking cessation, there is no evidence to suggest screening participation invokes a false sense of assurance in smokers, nor a reduction in motivation to quit. The NELSON and Danish trials found no difference in smoking cessation rates between LDCT screening and control groups. Higher net cessation rates, compared with the general population, suggest those who participate in screening trials may already be motivated to quit.

Question 3: What are the main components of recent major lung cancer screening programs or trials?

There are no systematic reviews that capture the main components of recent major lung cancer screening trials and programs. We extracted evidence from original studies and clinical guidance documents and organised this into key groups to form a concise set of components for potential implementation of a national lung cancer screening program in Australia:
1. Identifying the high-risk population: recruitment, eligibility, selection and referral.
2. Educating the public, people at high risk and healthcare providers; this includes creating awareness of lung cancer, the benefits and harms of LDCT screening, and shared decision-making.
3. Components necessary for health services to deliver a screening program:
a. Planning phase: e.g. human resources to coordinate the program, electronic data systems that integrate medical records information and link to an established national registry.
b. Implementation phase: e.g. human and technological resources required to conduct LDCT examinations, interpretation of reports and communication of results to participants.
c. Monitoring and evaluation phase: e.g. monitoring outcomes across patients, radiological reporting, compliance with established standards and a quality assurance program.
4. Data reporting and research: e.g. audit and feedback to multidisciplinary teams, reporting outcomes to enhance international research into LDCT screening.
5. Incorporation of smoking cessation interventions: e.g. specific programs designed for LDCT screening, or referral to existing community or hospital-based services that deliver cessation interventions.

Most original studies are single-institution evaluations that contain descriptive data about the processes required to establish and implement a high-risk population-based screening program. Across all studies there is a consistent message as to the challenges and complexities of establishing LDCT screening programs to attract people at high risk who will receive the greatest benefits from participation. With regard to smoking cessation, evidence from one systematic review indicates the optimal strategy for incorporating smoking cessation interventions into an LDCT screening program is unclear. There is widespread agreement that LDCT screening attendance presents a 'teachable moment' for cessation advice, especially among those people who receive a positive scan result. Smoking cessation is an area of significant research investment; for instance, eight US-based clinical trials are now underway that aim to address how best to design and deliver cessation programs within large-scale LDCT screening programs.(9)

Question 4: What is the cost-effectiveness of lung cancer screening programs (including studies of cost–utility)?

Assessing the value or cost-effectiveness of LDCT screening involves a complex interplay of factors, including data on effectiveness and costs, and institutional context. A key input is data about the effectiveness of potential and current screening programs with respect to case detection, and the likely outcomes of treating those cases sooner (in the presence of LDCT screening) as opposed to later (in the absence of LDCT screening). Evidence about the cost-effectiveness of LDCT screening programs has been summarised in two systematic reviews. We identified a further 13 studies (five modelling studies, one discrete choice experiment and seven articles) that used a variety of methods to assess cost-effectiveness. Three modelling studies indicated LDCT screening was cost-effective in the settings of the US and Europe. Two studies, one from Australia and one from New Zealand, reported LDCT screening would not be cost-effective using NLST-like protocols. We anticipate that, following the full publication of the NELSON trial, cost-effectiveness studies will likely be updated with new data that reduce uncertainty about factors that influence modelling outcomes, including the findings of indeterminate nodules.

Gaps in the evidence

There is a large and accessible body of evidence as to the effectiveness (Q1) and harms (Q2) of LDCT screening for lung cancer. Nevertheless, there are significant gaps in the evidence about the program components that are required to implement an effective LDCT screening program (Q3). Questions about LDCT screening acceptability and feasibility were not explicitly included in the scope. However, as the evidence is based primarily on US programs and UK pilot studies, the relevance to the local setting requires careful consideration. The Queensland Lung Cancer Screening Study provides feasibility data about clinical aspects of LDCT screening but little about program design. The International Lung Screening Trial is still in the recruitment phase and findings are not yet available for inclusion in this Evidence Check. The Australian Population Based Screening Framework was developed to "inform decision-makers on the key issues to be considered when assessing potential screening programs in Australia".(10) As the Framework is specific to population-based, rather than high-risk, screening programs, there is a lack of clarity about the transferability of its criteria. However, the Framework criteria do stipulate that a screening program must be acceptable to "important subgroups such as target participants who are from culturally and linguistically diverse backgrounds, Aboriginal and Torres Strait Islander people, people from disadvantaged groups and people with a disability".(10) An extensive search of the literature highlighted that there is very little information about the acceptability of LDCT screening to these population groups in Australia, yet they are part of the high-risk population.(10) There are also considerable gaps in the evidence about the cost-effectiveness of LDCT screening in different settings, including Australia. The evidence base in this area is rapidly evolving and is likely to include new data from the NELSON trial and to incorporate data about the costs of targeted and immuno-therapies as these treatments become more widely available in Australia.
APA, Harvard, Vancouver, ISO, and other styles