Academic literature on the topic 'ALVEN (Computer system)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'ALVEN (Computer system).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "ALVEN (Computer system)"

1

Tsotsos, John K. "Computer assessment of left ventricular wall motion: The ALVEN expert system." Computers and Biomedical Research 18, no. 3 (June 1985): 254–77. http://dx.doi.org/10.1016/0010-4809(85)90050-3.

2

Wallis, Peter. "Revisiting the DARPA communicator data using conversation analysis." Interaction Studies 9, no. 3 (December 5, 2008): 434–57. http://dx.doi.org/10.1075/is.9.3.05wal.

Abstract:
The state of the art in human computer conversation leaves something to be desired and, indeed, talking to a computer can be downright annoying. This paper describes an approach to identifying “opportunities for improvement” in these systems by looking for abuse in the form of swear words. The premise is that humans swear at computers as a sanction and, as such, swear words represent a point of failure where the system did not behave as it should. Having identified where things went wrong, we can work backward through the transcripts and, using conversation analysis (CA), work out how things went wrong. Conversation analysis is a qualitative methodology and can appear quite alien — indeed unscientific — to those of us from a quantitative background. The paper starts with a description of conversation analysis in its modern form, and then goes on to apply the methodology to transcripts of frustrated and annoyed users in the DARPA Communicator project. The conclusion is that there is at least one species of failure caused by the inability of the Communicator systems to handle mixed initiative at the discourse structure level. Along the way, I hope to demonstrate that there is an alternative future for computational linguistics that does not rely on larger and larger text corpora.
3

Garzon-Lopez, Carol X., Tarek Hattab, Sandra Skowronek, Raf Aerts, Michael Ewald, Hannes Feilhauer, Olivier Honnay, et al. "The DIARS toolbox: a spatially explicit approach to monitor alien plant invasions through remote sensing." Research Ideas and Outcomes 4 (March 30, 2018): e25301. http://dx.doi.org/10.3897/rio.4.e25301.

Abstract:
The synergies between remote sensing technologies and ecological research have opened new avenues for the study of alien plant invasions worldwide. Such scientific advances have greatly improved our capacity to issue warnings, develop early-response systems and assess the impacts of alien plant invasions on biodiversity and ecosystem functioning. Hitherto, practical applications of remote sensing approaches to support nature conservation actions are lagging far behind scientific advances. Yet, for some of these technologies, knowledge transfer is difficult due to the complexity of the different data handling procedures and the huge amounts of data they involve per spatial unit. In this context, the next logical step is to develop clear guidelines for the application of remote sensing data to monitor and assess the impacts of alien plant invasions, guidelines that enable scientists, landscape managers and policy makers to fully exploit the tools which are currently available. It is desirable to have such guidelines accompanied by freely available remote sensing data and generated in a free and open source environment that increases the availability and affordability of these new technologies. Here we present a toolbox that provides an easy-to-use, flexible, transparent and open source set of tools to sample, map, model and assess the impact of alien plant invasions using two high-resolution remote sensing products (hyperspectral and LiDAR images). This online toolbox includes a real case dataset designed to facilitate testing and training in any computer system and processing capacity.
4

Gregori, Verónica. "Discrete duality for De Morgan Algebras with operators." Asian-European Journal of Mathematics 12, no. 01 (February 2019): 1950010. http://dx.doi.org/10.1142/s1793557119500104.

Abstract:
A discrete duality is a relationship between classes of algebras and classes of relational systems (frames). In this paper, discrete dualities are presented for De Morgan algebras with various kinds of unary operators. To do this, we will extend the discrete duality given in [W. Dzik, E. Orłowska and C. van Alten, Relational representation theorems for general lattices with negations, in Relations and Kleene Algebra in Computer Science, Lecture Notes in Computer Science, Vol. 4136 (Springer, Berlin, 2006), pp. 162–176] to De Morgan algebras.
5

Richardson, DM, BW van Wilgen, DC Le Maitre, KB Higgins, and GG Forsyth. "A Computer-Based System for Fire Management in the Mountains of the Cape Province, South-Africa." International Journal of Wildland Fire 4, no. 1 (1994): 17. http://dx.doi.org/10.1071/wf9940017.

Abstract:
This paper describes a Catchment Management System (CMS) that provides objective procedures for managing fire. Prescribed burning is carried out in the mountain catchments of the Cape Province, South Africa, to enhance water yield, to rejuvenate the indigenous shrubland vegetation, to reduce fire hazard and to control invasive alien plants. Fire is the only practical tool for achieving these aims in the mountainous terrain. Recent research has improved understanding of the response of these systems to fire, but managing fire to achieve goals is very difficult. The CMS comprises a central geographical information system for managing and processing spatial data, linked to personal computers with DBase IV data-bases and simple rule-based models for decision-making. Current applications are: prioritization of areas for burning, monitoring the success of fire management, mapping fire hazard for fire control planning, and the production of management summaries and statistics. This paper presents examples of these applications from three areas in the Cape Province with different management problems and priorities: the Kammanassie in the southern Cape, and the Kogelberg and Table Mountain areas in the western Cape.
6

Woods, David D. "GUTs or no GUTs (Grand Unified Theories): Does/Can/Should Cognitive Engineering have G.U.T.s?" Proceedings of the Human Factors and Ergonomics Society Annual Meeting 46, no. 3 (September 2002): 468–71. http://dx.doi.org/10.1177/154193120204600353.

Abstract:
What are the GUTs of Cognitive Systems Engineering (CSE)? G.U.T. is an abbreviation for Grand Unified Theory. As Cognitive Science matured, Allen Newell proposed a unifying model of cognition expressed as a software architecture, SOAR. Similarly, John Anderson developed ACT-R, also claiming it represented a unified theory of cognition in the form of a computer simulation. Both of these cognitive architectures are computer programs that claim to simulate, or be the basis for creating simulations of, how people perform and learn cognitive tasks. Taking the development of Cognitive Science as a possible analogy for the potential development of Cognitive Systems Engineering, this panel discussion provides a platform to stimulate a vigorous exchange of ideas about the foundation and potential futures of CSE.
7

Allanson, D. R., W. B. Rowe, X. Chen, and A. Boyle. "Automatic dwell control in computer numerical control plunge grinding." Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture 211, no. 7 (July 1, 1997): 565–75. http://dx.doi.org/10.1243/0954405971516518.

Abstract:
Accuracy in precision grinding is strongly affected by variations in the normal grinding force. The force gives rise to deflections of the machine, grinding wheel and workpiece. To allow relaxation of the deflection, a dwell period is included at the end of the grinding cycle. Estimation of the time constant during grinding allows the automatic selection of the correct dwell time for the individual workpiece and current force level. A new strategy has been developed for the estimation of the time constant based on power measurement during dwell. The strategy employs the weighted least mean squares technique together with weightings based on classification of the power level into bands. The power bands are designed so that lower weightings are applied to power samples in the regions of the power curve most prone to causing an inaccurate estimate of the time constant. The complete system was implemented on an adaptive system comprising a PC and an Allen Bradley 8200 CNC (computer numerical controller). The time constant identification and dwell control algorithms were executed within the PC and synchronized to the grinding cycle executed under the control of the CNC. The system was successfully tested under laboratory and industrial conditions. It was shown to produce a reliable and accurate estimate of the time constant with workpieces exhibiting time constants in a range from 0.7 to 55 s. The system was shown to cope with this wide range of time constants without user intervention and to be tolerant of the high signal noise levels typically encountered in an industrial environment.
8

Al Amayreh, Ahmad, Maryline Hélard, Meryem Ouzzif, and Jérôme Le Masson. "Alien crosstalk elimination in digital subscriber line systems." IET Communications 8, no. 10 (July 3, 2014): 1714–23. http://dx.doi.org/10.1049/iet-com.2013.0536.

9

Jones, Todd. "Thick description, fat syntax, and alternative conceptual systems." Pragmatics and Cognition 5, no. 1 (January 1, 1997): 131–62. http://dx.doi.org/10.1075/pc.5.1.08jon.

Abstract:
Many philosophers have claimed that intentional ascription is not possible if alien peoples are truly radically different from ourselves. At the same time, many anthropologists have claimed that the people they study think very differently from the way that we do. I claim that it is possible for both the anthropologists and the philosophers to be right. Giving intentional descriptions is problematic for people unlike ourselves, but anthropologists can, and do, give good descriptions of alien mental states using descriptions not unlike those given in certain formulations of cognitive psychology.
10

Pyle, Ian. "Alvey and ESPRIT." Computing & Control Engineering Journal 2, no. 6 (1991): 246. http://dx.doi.org/10.1049/cce:19910066.


Dissertations / Theses on the topic "ALVEN (Computer system)"

1

Alves dos Santos, Luiz M. [Verfasser]. "Asymmetric and Adaptive Conference Systems for Enabling Computer-Supported Mobile Activities / Luiz M Alves dos Santos." Aachen : Shaker, 2003. http://d-nb.info/1170541402/34.

2

Ögren, Ellinor. "Orcher förtjänar bättre : Orcher, Alver och deras stereotypiska design inom fantasy genren." Thesis, Högskolan i Skövde, Institutionen för informationsteknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-20565.

Abstract:
This study takes a deeper look at the various stereotypes of the fantasy genre, with a specific focus on orcs and elves, examining what created these stereotypes and what has changed over the years they have existed by putting questions to the participants. It shows that the perspective on the stereotypes changes depending on the individual observing these races, as does how welcome potential norm-critical changes are considered to be. This was found through the use of artefacts, a questionnaire, and a preliminary study focusing on J.R.R. Tolkien and the stereotypes that have been created through his works. On reflection, there is scope for a follow-up study, since the focus here is only on orcs and elves, even though several other types of fantasy races exist that could yield further interesting data to help future game companies.

There is additional digital material (e.g. video, image, or audio files) or models/artefacts belonging to the thesis that are to be sent to the archive.

3

Gumpert, Ben Allen. "A recursive Gauss-Newton method for model independent eye-in-hand visual servoing / by Ben Allen Gumpert." Thesis, Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/17260.

4

Santos, Luiz Manoel Alves dos [Verfasser]. "Asymmetric and adaptive conference systems for enabling computer supported mobile activities / by Luiz Manoel Alves dos Santos." 2003. http://d-nb.info/967968917/34.


Books on the topic "ALVEN (Computer system)"

1

General Accounting Office. Computer security: Identification of sensitive systems operated on behalf of ten agencies : congressional requesters. Washington, D.C.: The Office, 1989.

2

General Accounting Office. Computer security: Virus highlights need for improved Internet management : report to the chairman, Subcommittee on Telecommunications and Finance, Committee on Energy and Commerce, House of Representatives. Washington, D.C.: GAO, 1989.

3

General Accounting Office. Computer security: DEA is not adequately protecting national security information : report to the Chairman, Government Information, Justice, and Agriculture Subcommittee, Committee on Government Operations, House of Representatives. Washington, D.C.: The Office, 1992.

4

General Accounting Office. Computer security: DEA is not adequately protecting sensitive drug enforcement data : report to the Chairman, Government Information, Justice, and Agriculture Subcommittee, Committee on Government Operations, House of Representatives. Washington, D.C.: The Office, 1992.

5

General Accounting Office. Computer security: Governmentwide planning process had limited impact : report to the chairman, Committee on Science, Space, and Technology, House of Representatives. Washington, D.C.: U.S. General Accounting Office, 1990.

6

General Accounting Office. Computer security: FAA is addressing personnel weaknesses, but further action is required : report to the Chairman and Ranking Minority Member, Committee on Science, House of Representatives. Washington, D.C. (P.O. Box 37050, Washington, D.C. 20013): U.S. General Accounting Office, 2000.

7

General Accounting Office. Computer security: FAA needs to improve controls over use of foreign nationals to remediate and review software : report to the Chairman, Committee on Science, House of Representatives. Washington, D.C. (P.O. Box 37050, Washington, D.C. 20013): The Office, 1999.

8

General Accounting Office. Social Security Administration: Significant progress made in Year 2000 effort, but key risks remain : report to Congressional requesters. Washington, D.C. (P.O. Box 37050, Washington, D.C. 20013): The Office, 1997.

9

General Accounting Office. Social Security Administration: Technical and performance challenges threaten progress of modernization : report to the Chairman, Subcommittee on Social Security, Committee on Ways and Means, House of Representatives. Washington, D.C. (P.O. Box 37050, Washington, D.C. 20013): The Office, 1998.

10

General Accounting Office. Social Security Administration: Strategic workforce planning needed to address human capital challenges facing the disability determination services : report to the Chairman, Subcommittee on Social Security, Committee on Ways and Means, House of Representatives. Washington, D.C.: GAO, 2004.


Book chapters on the topic "ALVEN (Computer system)"

1

Li, Hui, Ningning Ge, Lingwang Gao, Zuorui Shen, Guoliang Zhang, Zhiyuan Zang, and Yi Li. "Development of the Information Management System for Monitoring Alien Invasive Species." In Computer and Computing Technologies in Agriculture IV, 594–99. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-18333-1_72.

2

TSOTSOS, JOHN K. "Knowledge organization and its role in representation and interpretation for time-varying data: the ALVEN system." In Readings in Computer Vision, 498–514. Elsevier, 1987. http://dx.doi.org/10.1016/b978-0-08-051581-6.50051-9.

3

Solymar, Laszlo. "The Communications–Computing Symbiosis: The Beginning." In Getting the Message, 259–86. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198863007.003.0016.

Abstract:
Chapter 16 discusses the history of the computer. Important events include IBM bringing out the personal computer, and Xerox PARC inventing the graphical user interface. Paul Allen and Bill Gates left Harvard in 1975 to set up a computer laboratory. A year later Steve Jobs and Steve Wozniak set up Apple, followed soon by Dan Bricklin inventing the electronic spreadsheet. At the start of the 1980s Gates leased the MS-DOS operating system to IBM. Prior to all this, in 1969 the Advanced Research Projects Agency set up ARPANET, the predecessor of the Internet. Other topics covered in this chapter include the birth of electronic mail, uses and abuses of the Internet, security and coding, and the Minitel in France. The last part of the chapter looks at the Soviet Union and the InterNyet.
4

Lake, Mark Winter. "MAGICAL Computer Simulation of Mesolithic Foraging." In Dynamics in Human and Primate Societies. Oxford University Press, 2000. http://dx.doi.org/10.1093/oso/9780195131673.003.0011.

Abstract:
The MAGICAL (Multi-Agent Geographically Informed Computer AnaLysis) software described in this chapter was designed to integrate two of the most important computational methods used by archaeologists during the last decade: Geographical Information Systems (GIS) (e.g., Allen et al. 1990) and multiagent simulation (e.g., Lake 1995; Mithen 1990). At the outset of model development in 1995, it was recognized that GIS provide archaeologists with a sophisticated means of manipulating spatial data, but offer limited support for modeling change through time. Conversely, multiagent simulation models have allowed archaeologists to study change through time, but have either lacked or had simplistic spatial components. Consequently, the research described here aimed to combine the strengths of GIS and multiagent simulation in one software package so as to better facilitate the quantitative study of spatiotemporal variability in the archaeological record. The MAGICAL software was developed within the broader context of the Southern Hebrides Mesolithic Project (SHMP). This project was established in 1988 by Dr. Steven Mithen (University of Reading) to acquire new data from the Scottish Islands of Islay and Colonsay and, by integrating this with existing data, to develop a regional perspective on the early postglacial settlement of Western Scotland (Mithen and Lake 1996). The construction of a computer simulation model was considered a fundamental part of the postexcavation studies of the SHMP (Lake in press). It was hoped that conceptual models which would otherwise remain largely intuitive could be more rigorously explored by formalizing them into mathematical algorithms, translating those algorithms into computer code, and then running simulation experiments. This chapter describes how the MAGICAL software integrates GIS and multiagent simulation. It does so directly in section one and then by example in sections two, three, and four. 
Section two discusses the conceptual basis of the SHMP simulation model, and section three describes how this was implemented using the MAGICAL software. Section four presents the results of the SHMP simulations. Note that the SHMP simulation model is discussed primarily as a means of demonstrating the capabilities of the MAGICAL software. Those interested in the wider background to this particular modeling endeavor are urged to consult Mithen (ed., in prep).
5

Dasgupta, Subrata. "A Symbolic Science Of Intelligence." In The Second Age of Computer Science. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190843861.003.0010.

Abstract:
Human Problem Solving (1972) by Allen Newell and Herbert Simon of Carnegie-Mellon University, a tome of over 900 pages, was the summa of some 17 years of research by Newell, Simon, and their numerous associates (most notably Cliff Shaw, a highly gifted programmer at Rand Corporation) into “how humans think.” “How humans think” of course belonged historically to the psychologists’ turf. But what Newell and Simon meant by their project of “understanding . . . how humans think” was very different from how psychologists envisioned the problem before these two men invaded their milieu in 1958 with a paper on human problem solving in the prestigious Psychological Review. Indeed, professional psychologists must have looked at them askance. Neither was formally trained in psychology. Newell was originally trained as a mathematician, Simon as a political scientist. They both disdained disciplinary boundaries. Their curricula vitae proclaimed loudly their intellectual heterodoxy. At the time Human Problem Solving was published, Newell’s research interests straddled artificial intelligence, computer architecture, and (as we will see) what came to be called cognitive science. Simon’s multidisciplinary creativity—his reputation as a “Renaissance man”—encompassing administrative theory, economics, sociology, cognitive psychology, computer science, and the philosophy of science—was of near-mythical status by the early 1970s. Yet, for one prominent historian of psychology it would seem that what Newell and Simon did had nothing to do with the discipline: the third edition of Georgetown University psychologist Daniel N. Robinson’s An Intellectual History of Psychology (1995) makes no mention of Newell or Simon. Perhaps this was because, as Newell and Simon explained, their study of thinking adopted a pointedly information processing perspective. Information processing: Thus entered the computer into this conversation. 
But, Newell and Simon hastened to clarify, they were not suggesting a metaphor of humans as computers. Rather, they would propose an information processing system (IPS) that would serve to describe and explain how humans “process task-oriented symbolic information.” In other words, human problem solving, in their view, is an instance of representing information as symbols and processing them.
6

Turing, Alan. "Chess (1953)." In The Essential Turing. Oxford University Press, 2004. http://dx.doi.org/10.1093/oso/9780198250791.003.0023.

Abstract:
Chess and some other board games are a test-bed for ideas in Artificial Intelligence. Donald Michie—Turing’s wartime colleague and subsequently founder of the Department of Machine Intelligence and Perception at the University of Edinburgh—explains the relevance of chess to AI: Computer chess has been described as the Drosophila melanogaster of machine intelligence. Just as Thomas Hunt Morgan and his colleagues were able to exploit the special limitations and conveniences of the Drosophila fruit fly to develop a methodology of genetic mapping, so the game of chess holds special interest for the study of the representation of human knowledge in machines. Its chief advantages are: (1) chess constitutes a fully defined and well-formalized domain; (2) the game challenges the highest levels of human intellectual capacity; (3) the challenge extends over the full range of cognitive functions such as logical calculation, rote learning, concept-formation, analogical thinking, imagination, deductive and inductive reasoning; (4) a massive and detailed corpus of chess knowledge has accumulated over the centuries in the form of chess instructional works and commentaries; (5) a generally accepted numerical scale of performance is available in the form of the U.S. Chess Federation and International ELO rating system. In 1945, in his paper ‘Proposed Electronic Calculator’, Turing predicted that computers would probably play ‘very good chess’, an opinion echoed in 1949 by Claude Shannon of Bell Telephone Laboratories, another leading early theoretician of computer chess. By 1958, Herbert Simon and Allen Newell were predicting that within ten years the world chess champion would be a computer, unless barred by the rules. Just under forty years later, on 11 May 1997, IBM’s Deep Blue beat the reigning world champion, Gary Kasparov, in a six-game match. Turing was theorizing about the mechanization of chess as early as 1941. 
Fellow codebreakers at GC & CS remember him experimenting with two heuristics now commonly used in computer chess, minimax and best-first. The minimax heuristic involves assuming that one’s opponent will move in such a way as to maximize their gains; one then makes one’s own move in such a way as to minimize the losses caused by the opponent’s expected move.
7

Pool, Robert. "Managing the Faustian Bargain." In Beyond Engineering. Oxford University Press, 1997. http://dx.doi.org/10.1093/oso/9780195107722.003.0013.

Abstract:
A quarter of a century ago, Alvin Weinberg offered one of the most insightful— and unsettling—observations anyone has made about modern technology. Speaking of the decision to use nuclear power, the long-time director of Oak Ridge National Laboratory warned that society had made a “Faustian bargain.” On the one hand, he said, the atom offers us a nearly limitless supply of energy which is cheaper than that from oil or coal and which is nearly nonpolluting. But on the other hand, the risk from nuclear power plants and nuclear-waste disposal sites demands “both a vigilance and a longevity of our social institutions that we are quite unaccustomed to.” We cannot afford, he said, to treat nuclear power as casually as we do some of our other technological servants—coal-fired power plants, for instance—but must instead commit ourselves to maintaining a close and steady control over it. Although Weinberg’s predictions about the cost of nuclear power may now seem naive, the larger issue he raised is even more relevant today than twenty-five years ago: Where should society draw the line in making these Faustian technological bargains? With each decade, technology becomes more powerful and more unforgiving of mistakes. Since Weinberg’s speech, we have witnessed major accidents at Three Mile Island, Chernobyl, and Bhopal, as well as the explosion of the Challenger and the wreck of the Exxon Valdez. And looking into the future, it’s easy to see new technological capabilities coming along that hold the potential for far greater disasters. In ten or twenty years, many of our computers and computer-controlled devices may be linked through a widespread network that dwarfs the current telecommunications system. A major breakdown like those that occasionally hit long-distance telephone systems could cost billions of dollars and perhaps kill some people, depending on what types of devices use the network. 
And if genetic engineering becomes a reality on a large scale, a mistake there could make the thalidomide debacle of the late 1950s and early 1960s look tame.
8

Herring, Susan C., Christine Ogan, Manju Ahuja, and Jean C. Robinson. "Gender and the Culture of Computing in Applied IT Education." In Human Computer Interaction, 1736–44. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-87828-991-9.ch111.

Abstract:
The “shrinking pipeline” of women who ascend through the ranks in computer science education programs and careers is by now a familiar problem. Women drop out at rates faster than men at all levels of educational and professional advancement, resulting in a gender gap especially pronounced at the highest levels of the computing workforce, and that has not narrowed appreciably at any level in more than 20 years (Camp, 1997; ITAA, 2005; Vegso, 2005). Efforts to move more women into the pipeline at lower levels have met with limited success (cf. the Carnegie Mellon experience as reported by Margolis & Fisher, 2002); girls and women still express less interest than boys and men in studying computer science and pursuing information technology (IT) careers (Bentson, 2000; Vegso, 2005). A reason often cited in the literature is the masculine culture of many computer science programs and IT workplaces, which is perceived by many women as alien and unwelcoming (Bentson, 2000; Spertus, 1991; Turkle, 1988). Even when institutions make efforts to treat women and men equally or accord women special consideration in admissions and hiring decisions, attitudes discouraging women from entering computing persist, both within the institutions and in society at large. Sometimes these attitudes are expressed overtly: Underground “hacker” culture is notoriously antagonistic to women (Gilboa, 1996), and even mainstream computer aficionados respond with resistance and sexist jokes to proposals to recruit more girls and women to study computer science (Slashdot.org, 2005). Moreover, there is a widespread perception that computer experts are socially-isolated “geeks” or “nerds” obsessed with technology, a mode of being that women, who tend to be more socially oriented, find unappealing (Margolis & Fisher, 2002; Turkle, 1988). Fortunately, the situation for computer science does not tell the whole story. 
In the latter part of the 20th century, the expansion of computing and the Internet fueled the rise of applied IT fields in which technical skills, rather than being developed for their own sake, are increasingly put to use in the service of human needs. Applied fields, such as information science, information systems and instructional technology, have gained strength, and a new interdisciplinary field, informatics, has emerged. At the same time, interest in computer science itself is declining, especially among women (ITAA, 2005; Vegso, 2005). In this article, we explore the possibility that applied IT fields may provide more women-friendly cultures while still focused on technology. The larger question underlying this exploration is: Does applied IT education have the potential to bridge the “gender computing gap”?
APA, Harvard, Vancouver, ISO, and other styles
9

Becker, Shirley Ann. "PDA Usability for Telemedicine Support." In Encyclopedia of Human Computer Interaction, 457–62. IGI Global, 2006. http://dx.doi.org/10.4018/978-1-59140-562-7.ch069.

Full text
Abstract:
Telemedicine is broadly defined as the use of information and communications technology (ICT) to provide medical information and services (Perednia & Allen, 1995). Telemedicine offers an unprecedented means of bringing healthcare to anyone regardless of geographic remoteness. It promotes the use of ICT for healthcare when physical distance separates the provider from the patient (Institute of Medicine, 1996). In addition, it provides for real-time feedback, thus eliminating the waiting time associated with a traditional healthcare visit. Telemedicine has been pursued for over three decades as researchers, healthcare providers, and clinicians search for a way to reach patients living in remote and isolated areas (Norris, 2001). Early implementation of telemedicine made use of the telephone in order for healthcare providers and patients to interact. Over time, fax machines were introduced along with interactive multimedia, thus supporting teleconferencing among participants. Unfortunately, many of the early telemedicine projects did not survive because of high costs and insurmountable barriers associated with the use of technology. Telemedicine has been resurrected during the last decade as a means to help rural healthcare facilities. Advances in information and communications technology have initiated partnerships between rural healthcare facilities and larger ones. The Internet in particular has changed the way in which medical consultations can be provided (Coiera, 1997). Personal computers (PCs) and supporting peripherals, acting as clients, can be linked to medical databases residing virtually in any geographic space. Multimedia data types (video, audio, text, imaging, and graphics) promote the rapid diagnosis and treatment of casualties and diseases. Innovations in ICT offer unprecedented healthcare opportunities in remote regions throughout the world. 
Mobile devices using wireless connectivity are growing in popularity as thin clients that can be linked to centralized or distributed medical-data sources. These devices provide for local data storage of medical data, which can be retrieved and sent back to a centralized source when Internet access becomes available. Those working in nomadic environments are connected to data sources that in the past were inaccessible due to a lack of telephone and cable lines. For the military, paramedics, social workers, and other healthcare providers in the field, ICT advances have removed technology barriers that made mobility difficult if not impossible. Personal digital assistants (PDAs) are mobile devices that continue to grow in popularity. PDAs are typically considered more usable for multimedia data than smaller wireless devices (e.g., cell phones) because of larger screens, fully functional keyboards, and operating systems that support many desktop features. Over the past several years, PDAs have become far less costly than personal-computing technology. They are portable, lightweight, and mobile when compared to desktop computers. Yet, they offer similar functionality scaled back to accommodate the differences in user-interface designs, data transmission speed, memory, processing power, data storage capacity, and battery life.
APA, Harvard, Vancouver, ISO, and other styles
10

Kaye, Phillip, Raymond Laflamme, and Michele Mosca. "Linear Algebra and the Dirac Notation." In An Introduction to Quantum Computing. Oxford University Press, 2006. http://dx.doi.org/10.1093/oso/9780198570004.003.0005.

Full text
Abstract:
We assume the reader has a strong background in elementary linear algebra. In this section we familiarize the reader with the algebraic notation used in quantum mechanics, remind the reader of some basic facts about complex vector spaces, and introduce some notions that might not have been covered in an elementary linear algebra course. The linear algebra notation used in quantum computing will likely be familiar to the student of physics, but may be alien to a student of mathematics or computer science. It is the Dirac notation, which was invented by Paul Dirac and which is used often in quantum mechanics. In mathematics and physics textbooks, vectors are often distinguished from scalars by writing an arrow over the identifying symbol, e.g. a⃗. Sometimes boldface is used for this purpose, e.g. a. In the Dirac notation, the symbol identifying a vector is written inside a ‘ket’, and looks like |a⟩. We denote the dual vector for a (defined later) with a ‘bra’, written as ⟨a|. Then inner products will be written as ‘bra-kets’ (e.g. ⟨a|b⟩). We now carefully review the definitions of the main algebraic objects of interest, using the Dirac notation. The vector spaces we consider will be over the complex numbers, and are finite-dimensional, which significantly simplifies the mathematics we need. Such vector spaces are members of a class of vector spaces called Hilbert spaces. Nothing substantial is gained at this point by defining rigorously what a Hilbert space is, but virtually all the quantum computing literature refers to a finite-dimensional complex vector space by the name ‘Hilbert space’, and so we will follow this convention. We will use H to denote such a space. Since H is finite-dimensional, we can choose a basis and alternatively represent vectors (kets) in this basis as finite column vectors, and represent operators with finite matrices. 
As we will see in Section 3, the Hilbert spaces of interest for quantum computing will typically have dimension 2^n, for some positive integer n. This is because, as with classical information, we will construct larger state spaces by concatenating a string of smaller systems, usually of size two.
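As a concrete worked instance of the notation summarized in this abstract (our illustration, not part of the chapter itself), a ket in a two-dimensional Hilbert space, its dual bra, and their inner product can be written in a fixed basis as:

```latex
% Kets as column vectors, bras as conjugate-transpose row vectors,
% and the inner product as a bra-ket, in a fixed two-dimensional basis.
\[
|a\rangle = \begin{pmatrix} \alpha_1 \\ \alpha_2 \end{pmatrix}, \qquad
\langle a| = |a\rangle^{\dagger}
           = \begin{pmatrix} \overline{\alpha_1} & \overline{\alpha_2} \end{pmatrix}, \qquad
\langle a|b\rangle = \overline{\alpha_1}\,\beta_1 + \overline{\alpha_2}\,\beta_2 .
\]
```

Note that the inner product conjugates the first argument, so ⟨a|a⟩ is always a non-negative real number.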
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "ALVEN (Computer system)"

1

Hossieny, Morteza Sadat, and Hamid Khan. "Modernization of the Mechanical/Manufacturing Engineering Laboratories: Upgrading Educational CIM Cells Involving Students and Faculty." In ASME 2002 International Mechanical Engineering Congress and Exposition. ASMEDC, 2002. http://dx.doi.org/10.1115/imece2002-33966.

Full text
Abstract:
This paper reports the process of upgrade and enhancement of the Educational CIM Cell at Northern Kentucky University (NKU). The upgrade is part of the laboratory experiments in the Automated Manufacturing Systems Course at NKU. The goal of this paper is to increase students’ practical experience, upgrade the equipment in house, save cost, and reduce the technical dependency on an outside company. In this project the Allen-Bradley SLC 100 PLC and Allen-Bradley SLC 150 will be upgraded with a new Allen-Bradley PLC and Panelview operator interface. Comprehensive efforts are made to incorporate what has been learned in the MET program to design and manufacture a part, and use robotics and a programmed interface for placement onto a conveyor. After the part is placed on the conveyor it will be transferred to a location where the part will be accepted or rejected. Personal computers will be interfaced for simulation, and to actual hardware for control and automation of typical manufacturing operations and industrial processes. This concept of an integrated laboratory system will allow expanded coverage of traditional controls topics and permit introduction of appropriately advanced control techniques, including adaptive control for machining operations. This method of modernizing is shown to be more effective than modernizing by a turnkey upgrade of the laboratory Computer Integrated Manufacturing (CIM) facilities.
APA, Harvard, Vancouver, ISO, and other styles
2

Viste, Michael J., and David M. Cannon. "Firmware Design Capture." In ASME 1995 Design Engineering Technical Conferences collocated with the ASME 1995 15th International Computers in Engineering Conference and the ASME 1995 9th Annual Engineering Database Symposium. American Society of Mechanical Engineers, 1995. http://dx.doi.org/10.1115/detc1995-0164.

Full text
Abstract:
One of Allen-Bradley’s goals is leveraging — taking better advantage of existing resources. We are developing a methodology and supporting tools that help engineers share and reuse (i.e., leverage) their firmware design and development work. Writing reusable firmware source code is especially difficult due to the tight constraints in most embedded systems — code must usually be written for product-specific hardware needs and resources. Reuse of engineering work at the design level is a more effective approach. With this in mind, we have been working with Allen-Bradley Power Products engineers and managers to pilot a Firmware Design Capture (FDC) system. In an FDC system, engineers work in their own paper or electronic workbooks, compiling descriptions of their domains’ technologies and algorithms in loosely structured electronic document sets called technology books. Product-specific information is placed in complementary document sets called product books. Engineers can access this growing body of ‘Strategic Design Information’ that they and others have created, freely drawing from, commenting on, or adding to it. Key characteristics of this FDC system are:
• A focus on collecting reusable and accessible design information
• Incremental, small-grained development of documents during design activity
• Electronic format of documents, for ease of refinement and access
• Unobtrusive tools and methods, determined through frequent user feedback
We expect this methodology to help engineers improve schedule predictability and reduce the firmware development life cycle, better retain vital technologies and product data, and increase product quality. Feedback from our initial work supports these expectations.
APA, Harvard, Vancouver, ISO, and other styles
3

Gupta, Ananya Sen, Ryan A. McCarthy, Kawther Rouabhi, Craig Kletzing, and Ivar Christopher. "Disentangling high energy chorus elements against structured background interference in the Van Allen radiation belts using braid manifolds." In 2020 54th Asilomar Conference on Signals, Systems, and Computers. IEEE, 2020. http://dx.doi.org/10.1109/ieeeconf51394.2020.9443336.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Lall, Pradeep, Madhura Hande, Chandan Bhat, and Jeff Suhling. "Leading Prognostic Indicators for Health Management of Electronics Under Thermo-Mechanical Stresses." In ASME 2007 InterPACK Conference collocated with the ASME/JSME 2007 Thermal Engineering Heat Transfer Summer Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/ipack2007-33876.

Full text
Abstract:
Methodologies for prognostication and health monitoring can significantly impact electronic reliability for applications in which even minimal risk of failure may be unbearable. Presently, health monitoring approaches such as the built-in self-test (BIST) are based on reactive failure diagnostics and are unable to determine residual life or estimate residual reliability [Allen 2003, Drees 2004, Gao 2002, Rosenthal 1990]. The prognostics health-monitoring (PHM) approach presented in this paper is different from state-of-the-art diagnostics and resides in the pre-failure space of the electronic system, in which no macro-indicators such as cracks or delamination exist. Applications for the presented PHM framework include consumer applications, such as automotive safety systems including front and rear impact protection systems, chassis-control systems, and x-by-wire systems; and defense applications, such as avionics systems and naval electronic warfare systems. The presented PHM methodologies enable the estimation of prior damage in deployed electronics by interrogation of the system state. The presented methodologies will trigger repair or replacement significantly prior to failure. The approach involves the use of condition monitoring devices which can be interrogated for damage proxies at finite time intervals. The system’s residual life is computed based on residual-life computation algorithms. Previously, Lall et al. [2004, 2005, 2006] have developed several leading indicators of failure. In this paper a mathematical approach is presented to calculate the prior damage in electronics subjected to cyclic and isothermal thermo-mechanical loads. Electronic components operating in a harsh environment may be subjected to temperature variations in addition to thermal aging during use life. 
Data has been collected for leading indicators of failure for 95.5Sn4Ag0.5Cu first-level interconnects under both single and sequential application of cyclic and isothermal thermo-mechanical loads. A methodology for the determination of prior damage history is presented, using interrogation techniques based on the non-linear least-squares method with the Levenberg-Marquardt algorithm. The test vehicle includes various area-array packaging architectures soldered on an Immersion Ag finish, subjected to thermal cycling in the range of −40°C to 125°C and isothermal aging at 125°C.
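The non-linear least-squares interrogation step the abstract mentions can be sketched numerically. The exponential damage-proxy model, the synthetic data, and every name below are illustrative assumptions, not the paper's actual model or data; the sketch only shows how a Levenberg-Marquardt fit recovers model parameters from noisy proxy measurements:

```python
# Hypothetical damage-proxy model phi(t) = a * exp(-b * t), fitted to noisy
# synthetic measurements with SciPy's Levenberg-Marquardt least-squares solver.
# This is a generic sketch of the technique, not the authors' implementation.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.0, 10.0, 50)        # interrogation time intervals (arbitrary units)
true_a, true_b = 2.0, 0.3             # "unknown" parameters to recover
rng = np.random.default_rng(0)
y = true_a * np.exp(-true_b * t) + 0.01 * rng.standard_normal(t.size)

def residuals(params, t, y):
    # Residual vector: model prediction minus measured proxy values.
    a, b = params
    return a * np.exp(-b * t) - y

fit = least_squares(residuals, x0=[1.0, 1.0], args=(t, y), method="lm")
a_hat, b_hat = fit.x
print(a_hat, b_hat)                   # estimates close to 2.0 and 0.3
```

In an interrogation setting, the fitted parameters would then feed a residual-life computation; here they simply demonstrate that the solver converges from a deliberately poor initial guess.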
APA, Harvard, Vancouver, ISO, and other styles
5

Lall, Pradeep, Madhura Hande, Chandan Bhat, and Jeff Suhling. "Methodologies for System-State Interrogation for Prognostication of Electronics Under Thermo-Mechanical Loads." In ASME 2007 International Mechanical Engineering Congress and Exposition. ASMEDC, 2007. http://dx.doi.org/10.1115/imece2007-42560.

Full text
Abstract:
Methodologies for prognostication and health monitoring can significantly impact electronic reliability for applications in which even minimal risk of failure may be unbearable. Presently, health monitoring approaches such as the built-in self-test (BIST) are based on reactive failure diagnostics and are unable to determine residual life or estimate residual reliability [Allen 2003, Drees 2004, Gao 2002, Rosenthal 1990]. The prognostics health-monitoring (PHM) approach presented in this paper is different from state-of-the-art diagnostics and resides in the pre-failure space of the electronic system, in which no macro-indicators such as cracks or delamination exist. Applications for the presented PHM framework include consumer applications, such as automotive safety systems including front and rear impact protection systems, chassis-control systems, and x-by-wire systems; and defense applications, such as avionics systems and naval electronic warfare systems. The presented PHM methodologies enable the estimation of prior damage in deployed electronics by interrogation of the system state. The presented methodologies will trigger repair or replacement significantly prior to failure. The approach involves the use of condition monitoring devices which can be interrogated for damage proxies at finite time intervals. The system’s residual life is computed based on residual-life computation algorithms. Previously, Lall et al. [2004, 2005, 2006] have developed several leading indicators of failure. In this paper a mathematical approach is presented to calculate the prior damage in electronics subjected to cyclic and isothermal thermo-mechanical loads. Electronic components operating in a harsh environment may be subjected to temperature variations in addition to thermal aging during use life. 
Data has been collected for leading indicators of failure for 95.5Sn4Ag0.5Cu first-level interconnects under both single and sequential application of cyclic and isothermal thermo-mechanical loads. A methodology for the determination of prior damage history is presented, using interrogation techniques based on the non-linear least-squares method with the Levenberg-Marquardt algorithm. The test vehicle includes various area-array packaging architectures soldered on an Immersion Ag finish, subjected to thermal cycling in the range of −40°C to 125°C and isothermal aging at 125°C.
APA, Harvard, Vancouver, ISO, and other styles
6

Jayaram, S., H. Joshi, U. Jayaram, Y. Kim, H. Kate, and L. Varoz. "Embedding Haptics-Enabled Virtual Tools in CAD for Training Applications." In ASME 2006 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2006. http://dx.doi.org/10.1115/detc2006-99656.

Full text
Abstract:
This paper describes recent work completed to provide haptics-enabled virtual tools in a native CAD environment, such as CATIA V5™. This was a collaborative effort between Washington State University, Sandia National Laboratories, and Immersion Technologies. The intent was to start by utilizing Immersion’s Haptic Workstation™ hardware and supporting CATIA V5™ software at Sandia and to leverage the existing work on virtual assembly done by the VRCIM laboratory at Washington State University (WSU). The key contribution of this paper is a unique capability to perform interactive assembly and disassembly simulations in a native Computer Aided Design (CAD) environment using tools such as Allen and box-end wrenches with force feedback using a CyberForce™ and CyberGrasp™. Equally important, it also contributes to the new trend in the integration of various commercial-off-the-shelf (COTS) systems with specific user-driven systems and solutions using component-based software design concepts. We discuss some of the key approaches and concepts, including: different approaches to integrating the native CAD assembly data with the virtual environment constraints data; integration of the native CAD kinematics capability with the immersive environment; algorithms to dynamically organize the assembly constraints for use in manipulation with a virtual hand for assembly and disassembly simulations; and an event-callback mechanism in which different events and callback functions were designed and implemented to simulate different situations in the virtual environment. This integrated capability of haptic tools in a native CAD environment provides functionality beyond extracting data from a CAD model and using it in a virtual environment.
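The event-callback mechanism this abstract describes is a generic pattern; the following minimal sketch (all event names and helpers are invented for illustration, not taken from the paper's system) shows the shape of such a mechanism, where callbacks are registered per event and fired when the simulation raises that event:

```python
# Generic event-callback registry of the kind used to react to simulation
# events (e.g. a part being grasped, an assembly constraint being satisfied).
# All names here are hypothetical.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._callbacks = defaultdict(list)

    def register(self, event, callback):
        # Associate a callback with a named event.
        self._callbacks[event].append(callback)

    def raise_event(self, event, **data):
        # Fire every callback registered for this event, in order.
        for cb in self._callbacks[event]:
            cb(**data)

bus = EventBus()
log = []
bus.register("part_grasped", lambda part: log.append(f"grasped {part}"))
bus.register("constraint_satisfied", lambda pair: log.append(f"mated {pair}"))

bus.raise_event("part_grasped", part="bolt_M6")
bus.raise_event("constraint_satisfied", pair=("bolt_M6", "bracket"))
```

Decoupling event producers (the physics or constraint engine) from consumers (haptic feedback, logging, UI) in this way is what lets such simulations model many different situations without hard-wiring responses into the engine.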
APA, Harvard, Vancouver, ISO, and other styles
7

Dahai, Chang, and Andrew Wike. "Preparing for New Modes of Pipeline Operation in the People’s Republic of China." In 1998 2nd International Pipeline Conference. American Society of Mechanical Engineers, 1998. http://dx.doi.org/10.1115/ipc1998-2008.

Full text
Abstract:
There are more than 7,000 km of crude oil transmission pipelines, and more than 8,000 km of natural gas transmission pipelines in the People’s Republic of China. Although there are few product pipelines in China today, the growth of this industry is anticipated, fueled by the rapid development of the economy in China. The China Petroleum Pipeline Bureau is the largest pipeline operator in China, accounting for more than 6,000 km of crude oil and natural gas pipelines. The Langfang-based Staff and Workers College (known simply as the Pipeline College) is a unit of the China Petroleum Pipeline Bureau. Students at the Pipeline College include full-time engineering students, and short-term trainees. In general, the short-term trainees are management and operator level staff who attend the Pipeline College for more advanced training. Lacking effective training tools, the Pipeline College found it almost impossible to provide a really effective training experience, particularly in actual pipeline operations. In 1996, the Pipeline College developed plans to embrace advanced training tools in order to increase the effectiveness of the training courses it offered. The focus was in two areas: pipeline operations, and Programmable Logic Controller (PLC) set-up and maintenance. To achieve an effective training environment for pipeline operations, a simulation training system was set up using commercially available pipeline simulation software from Stoner Associates; the PLC training is based on Allen-Bradley equipment. The operations training center computer systems were configured to accommodate ten trainees simultaneously running their own independent training sessions. The first training courses delivered by the Pipeline College using their new tools were presented in the summer of 1997. This paper briefly chronicles the development of the pipeline industry in China as a background to the operation of the Pipeline College. 
The training center hardware and software configurations are described in some detail. The paper describes the Pipeline College’s first experiences of using these advanced training tools, and their plans for the future development of the training center.
APA, Harvard, Vancouver, ISO, and other styles
8

Yang, Shifei, David Ehrhardt, and Matthew S. Allen. "A Review of Signal Processing Techniques for Continuous-Scan Laser Doppler Vibrometry." In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-34972.

Full text
Abstract:
A Laser Doppler Vibrometer (LDV) measures the laser Doppler frequency shift and converts it to the velocity at a point of a structure along the laser beam direction. In a commercially available scanning LDV, the laser is redirected by a pair of orthogonal mirrors from one point to another, measuring the responses at these points sequentially. Continuous-Scan Laser Doppler Vibrometry (CSLDV) builds on scanning LDV: the laser sweeps continuously over a structure while the response is recorded along the laser path. The continuous-scan approach can greatly accelerate modal testing, providing spatially detailed vibration shapes of the structure at tens or even hundreds of points in the time that is required to measure the vibration at a single point. However, extracting vibration shapes from CSLDV measurements is challenging because the laser spot is continuously moving. This technical difficulty and the equipment cost have become the major barriers that prevent the widespread use of CSLDV. Several algorithms to extract vibration shapes have been developed since CSLDV was introduced. Ewins et al. proposed a polynomial approach that treats the vibration shape along the laser scan path as a polynomial function of the laser position. The polynomial coefficients are found from the sideband harmonics in the frequency spectrum of the acquired velocity signal. Allen et al. proposed a lifting approach that collects the measured responses at the same location along the laser path. The reorganized measurements appear to be from a set of pseudo transducers attached to the structure. Hence, the well-established conventional modal identification routines can be applied to process CSLDV measurements. Algorithms based on linear time-periodic system identification theory were explored as well. 
These algorithms are based on the fact that the measured velocities along the laser path are the responses of a special linear time-periodic system when a closed, periodic laser scan pattern is employed. For the first time, this work compares these signal processing techniques, employed in different applications, using the same set of data obtained from a cantilever beam. The noise and uncertainty in the reconstructed vibration shapes are discussed in order to present the advantages and disadvantages of each method.
APA, Harvard, Vancouver, ISO, and other styles

To the bibliography