Academic literature on the topic 'Word 97. (Computer file)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Word 97. (Computer file).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Word 97. (Computer file)"

1

Kim, Hyunji, Jaehoon Park, Hyeokdong Kwon, Kyoungbae Jang, and Hwajeong Seo. "Convolutional Neural Network-Based Cryptography Ransomware Detection for Low-End Embedded Processors." Mathematics 9, no. 7 (March 24, 2021): 705. http://dx.doi.org/10.3390/math9070705.

Full text
Abstract:
Crypto-ransomware encrypts a victim’s files and then demands a ransom in exchange for the key to the encrypted files. In this paper, we present a novel approach to prevent crypto-ransomware by detecting block cipher algorithms for Internet of Things (IoT) platforms. We extract sequence and frequency characteristics from the opcodes of binary files for the 8-bit Alf and Vegard’s RISC (AVR) microcontroller. Specifically, a late fusion method is used to extract two features from one source of data, learn each through its own network, and integrate them. Through the proposed method, we classify software as crypto-ransomware or as harmless. General software from AVR packages and block cipher implementations written in C from a lightweight block cipher library (i.e., Fair Evaluation of Lightweight Cryptographic Systems (FELICS)) are trained through the deep learning network and evaluated. The general software and block cipher algorithms are successfully classified by training on functions in binary files. Furthermore, we detect binary code that encrypts a file using block ciphers. The detection rate is evaluated in terms of the F-measure, the harmonic mean of precision and recall. The proposed method not only achieved a 97% detection success rate for crypto-ransomware but also an 80% success rate in classifying each lightweight cryptographic algorithm and benign firmware. In addition, the success rate in classifying Substitution-Permutation-Network (SPN) structures, Addition-Rotation-eXclusive-or (ARX) structures, and benign firmware is 95%.
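The F-measure used in this abstract is the harmonic mean of precision and recall. A minimal sketch of the computation, with illustrative values rather than figures from the paper:

```python
def f_measure(precision: float, recall: float) -> float:
    """F-measure: the harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Illustrative values only (not results from the paper):
# a detector with 0.8 precision and 0.6 recall scores about 0.686.
print(round(f_measure(0.8, 0.6), 3))
```

Because it is a harmonic mean, the F-measure is pulled toward the lower of the two inputs, which is why it is preferred over a simple average when precision and recall diverge.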
APA, Harvard, Vancouver, ISO, and other styles
2

Amos, Adediran Adekunle. "Sources of Information for Social Studies Teachers and their Level of usage in Abeokuta, Ogun State Nigeria." Information Management and Business Review 4, no. 10 (October 15, 2012): 517–22. http://dx.doi.org/10.22610/imbr.v4i10.1007.

Full text
Abstract:
The study examined sources of information for social studies teachers and their level of usage in secondary schools. The design of this study was a descriptive survey. Data were collected from twenty-four (24) secondary schools selected randomly from the Abeokuta North and Abeokuta South local government areas of Ogun State, Nigeria. Ninety-seven (97) female and male secondary school teachers of different age grades, qualifications, and experience were used as the sample. A questionnaire was used for data collection; it employed the close-ended Likert-type technique that expresses degree of agreement or disagreement with the supplied statement. Data were analyzed using standard deviation, Student's t-test statistics, mean, and rank order. The findings of the study revealed that teachers' sources of information for social studies include reference materials, newspapers, journals, literary materials, historical monuments and artefacts, television, textbooks, resource persons, magazines, pamphlets, bulletins, and radio. Computer components, such as e-mail, file transfer protocol, and the World Wide Web (WWW), are rarely used by social studies teachers. Recommendations were made, including that computer training be provided for social studies teachers at all levels of education in Nigeria, and that modern information centers with integrated circuits and digital communication to link schools be provided at the local level for the use of teachers, particularly social studies teachers.
APA, Harvard, Vancouver, ISO, and other styles
3

Pitaud, Philippe. "The Elderly, Digital Technologies and the Breakdown of Social Ties: Risks of Exclusion or Lures of Inclusion?" Ciências e Políticas Públicas / Public Sciences & Policies 6, no. 2 (December 2020): 79–97. http://dx.doi.org/10.33167/2184-0644.cpp2020.vvin2/pp.79-97.

Full text
Abstract:
When you are old in France in 2019, you do not have to be living on the street to be excluded or even feel excluded from a society that is increasingly turning its back on some of its members, because of the digital revolution imposed on citizens. In fact, faced with the excessive digitization advocated by governing bodies and other technocrats, which is rising speedily like a Tsunami, the elderly, often single women and/or widows belonging to underprivileged categories of society, generally with little or no education, and even less awareness in terms of management of minimal IT practices, are already or will soon find themselves on the sidelines of this type of modernization, which has nothing inclusive about it. A few local actors in the social and medico-social field and rights activists are already sounding the alarm and raising the voices of anguish in defense of these elderly people who no longer know how to cope with the dehumanization of public services: “I am 78 years old, I have a very small pension, no computer and anyway, I do not know how to do anything. So, it is annoying now because I have to get help and I do not know people who can help me. I am going to have to go there. It is a long way from home, I have to wait a long time and I am tired. And then you must be sure that there will be someone there!”. Aware of this dynamic of exclusion that is currently taking place and because we have been collecting the signs of this disarray for months, aggravated by isolation and loneliness, our action-research approach aims in the long run to implement counter-actions that aim to offset the harmful effects induced by the digital transition on the social life of the elderly, while seeking to free them from the negative confinement into which their inability to manage this transition by themselves has insidiously led them. 
It is these changes in the aspects of the most fragile of human existences that are at the heart of our approach as researchers-practitioners, as well as of our actions; acting like a mild buffer against the inhumanity of the system that is inexorably set up when a robot signals to you: “You have exceeded the deadline [note that this word contains the word ‘dead’] for the submission of your file on the lambda portal and therefore the administration can no longer do anything for you.”. There is no doubt that this is an immediate field of action for public policies, particularly in the fight against the digital exclusion of older citizens. For the time being, as always, in France, associations and humanitarian actors compensate for this absence of public authority with their limited means, but such a situation cannot last without in the long term seriously affecting societal balance and the moral principles of social justice, such as access to rights for all.
APA, Harvard, Vancouver, ISO, and other styles
4

Ceccon Ribeiro, Paula, Melissa L. Biles, Charles Lang, Claudio Silva, and Jan L. Plass. "Visualizing log-file data from a game using timed word trees." Information Visualization 17, no. 3 (August 2, 2017): 183–95. http://dx.doi.org/10.1177/1473871617720810.

Full text
Abstract:
In this article, we present the application of a method for visualizing gameplay patterns observed in log-file data from a geometry game. Using VisCareTrails, a data visualization software system based on the principle of timed word trees, we were able to identify five novel behaviors that informed our understanding of how players were approaching the game. We further utilized these newly identified player behaviors by triangulating them with geometry test scores collected from players outside the game setting. We compared the predictive capacity of these behaviors against five demographic characteristics commonly observed to be associated with educational outcomes: age, gender, ethnicity, mother’s education, and attitude toward video games. Two of the novel behaviors we identified, both reflecting inflexible problem-solving strategies, outperformed all demographic variables except age in terms of predicting change in geometry test scores post-gameplay. We believe that this is sound evidence for the utility of VisCareTrails and the timed-word-tree method for identifying pedagogically relevant player behaviors from semi-structured data associated with educational games.
APA, Harvard, Vancouver, ISO, and other styles
5

Wang, Zhi Qiang, Yayin Du, and Feng Liu. "Catastrophe Model of Elastic Compression Pole Buckling." Applied Mechanics and Materials 166-169 (May 2012): 3369–73. http://dx.doi.org/10.4028/www.scientific.net/amm.166-169.3369.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Aqilah Mohd Nahar, Nur Farah, Nurul Hidayah Ab Rahman, and Kamarudin Malik Mohammad. "E-Raser: File Shredder Application With Content Replacement by Using Random Words Function." JOIV : International Journal on Informatics Visualization 2, no. 4-2 (September 10, 2018): 313. http://dx.doi.org/10.30630/joiv.2.4-2.175.

Full text
Abstract:
Data shredding is a process of irreversible file destruction, while a file shredder is a program designed to render computer-based files unreadable by overwriting the data in the content of a file. The problem addressed is that the existence of file recovery tools may lead to data leakage, exploitation, or dissemination by an unauthorized person. Thus, this study proposed a file shredding application named E-Raser, which replaces the content of a file using a random-words function algorithm. E-Raser was developed to shred Microsoft Word documents in (.doc) or (.docx) format. The implemented algorithm replaces the original content of the files with uninformative words provided by the application. After the rewriting phase is complete, the shredding process takes place to make the file unrecoverable. Object-Oriented Software Development was used as the methodology to develop this application. As a result, E-Raser achieved its objectives to add, remove, rewrite, display, and shred files. E-Raser also significantly facilitates users in securely disposing of their files, protecting the confidentiality and privacy of the files' content.
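The overwrite-then-delete idea the abstract describes can be sketched as below. This is a simplified illustration, not E-Raser's actual code; the word list is a placeholder:

```python
import os
import random

# Placeholder vocabulary; the application's real word list is not published here.
FILLER_WORDS = ["alpha", "bravo", "charlie", "delta", "echo"]

def shred(path: str, passes: int = 3) -> None:
    """Overwrite a file's content in place with random words, then delete it."""
    size = os.path.getsize(path)
    for _ in range(passes):
        with open(path, "r+b") as f:
            written = 0
            while written < size:
                # Trim the chunk so the file size never grows.
                chunk = (random.choice(FILLER_WORDS) + " ").encode()[: size - written]
                f.write(chunk)
                written += len(chunk)
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to the storage device
    os.remove(path)
```

Note that on journaling filesystems and SSDs with wear levelling, an in-place overwrite may not destroy every physical copy of the old data; that caveat is part of why dedicated shredding tools exist.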
APA, Harvard, Vancouver, ISO, and other styles
7

Khumrin, Piyapong, Ariyaphong Wongnoppavich, Khemmapop Boonploy, and Volaluck Supajatura. "A new approach to Computer-Based Examinations using word documents and spreadsheets." INTERNATIONAL JOURNAL OF COMPUTERS & TECHNOLOGY 12, no. 3 (January 10, 2014): 3319–24. http://dx.doi.org/10.24297/ijct.v12i3.3241.

Full text
Abstract:
This paper describes a new approach to computer-based testing in which lecturers submit questions via Word documents, which are processed to produce an examination, with student results analyzed and reported in a spreadsheet. The overall process starts with lecturers sending question files in Word document format via email to the service provider. The questions are passed through the approval process using the editing system and then transferred to the examination system. The examination system directly accesses information from the question files to create a test, which students complete by inserting their answers directly into the spreadsheet file. Finally, the data are analyzed using spreadsheet formulas, and the report system sends the results to students' email addresses. The document-based approach keeps the system implementation simple and well accepted by users, while remaining consistent with organizational requirements for moving towards electronic data management.
APA, Harvard, Vancouver, ISO, and other styles
8

Dervos, D., Y. Manolopoulos, and P. Linardis. "Comparison of signature file models with superimposed coding." Information Processing Letters 65, no. 2 (January 1998): 101–6. http://dx.doi.org/10.1016/s0020-0190(97)00210-x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Eladawi, A. E., E. S. Gadelmawla, I. M. Elewa, and A. A. Abdel-Shafy. "An application of computer vision for programming computer numerical control machines." Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture 217, no. 9 (September 1, 2003): 1315–24. http://dx.doi.org/10.1243/095440503322420241.

Full text
Abstract:
Generation of the part programs, or tool paths, for products to be manufactured by computer numerical control (CNC) machines is very important. Many methods have been used to produce part programs, ranging from manual calculations to computer aided design/ manufacturing (CAD/CAM) systems. This work introduces a new technique for generating the part programs of existing products using the latest technology of computer vision. The proposed vision system is applicable for two-dimensional vertical milling CNC machines and is calibrated to produce both metric and imperial dimensions. Two steps are used to generate the part program. In the first step, the vision system is used to capture an image for the product to be manufactured. In the second step, the image is processed and analysed by software specially written for this purpose. The software CNCVision is fully written (in lab) using Microsoft Visual C++ 6.0. It is ready to run on any Windows environment. The CNCVision software processes the captured images and applies computer vision techniques to extract the product dimensions, then generates a suitable part program. All required information for the part program is calculated automatically, such as G-codes, X and Y coordinates of start-points and end-points, radii of arcs and circles and direction of arcs (clockwise or counterclockwise). The generated part program can be displayed on screen, saved to a file or sent to MS Word or MS Excel. In addition, the engineering drawing of the product can be displayed on screen or sent to AutoCAD as a drawing file.
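The kind of part program this abstract describes (G-codes with X and Y coordinates for start and end points) can be illustrated with a minimal generator. This is a generic sketch of standard milling G-codes, not the output format of the CNCVision software:

```python
def part_program(points: list[tuple[float, float]]) -> str:
    """Emit a minimal two-dimensional milling program: a rapid move to the
    first point, then linear cutting moves through the remaining points."""
    lines = [
        "G21 ; metric units",
        "G90 ; absolute coordinates",
    ]
    x0, y0 = points[0]
    lines.append(f"G00 X{x0:.3f} Y{y0:.3f} ; rapid move to start point")
    for x, y in points[1:]:
        lines.append(f"G01 X{x:.3f} Y{y:.3f} ; linear cut")
    return "\n".join(lines)

# A hypothetical L-shaped contour extracted from an image.
print(part_program([(0.0, 0.0), (10.0, 0.0), (10.0, 5.0)]))
```

Arc moves (G02/G03, clockwise and counterclockwise) would be emitted analogously once the vision step has recovered radii and arc directions, as the abstract describes.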
APA, Harvard, Vancouver, ISO, and other styles
10

"Great Enterprise Contribution to Society in Information System Perspectives PlagScan." Global Journal of Enterprise Information System 7, no. 1 (March 1, 2015): 89. http://dx.doi.org/10.18311/gjeis/2015/3047.

Full text
Abstract:
PlagScan is an entirely browser-based web service that verifies the authenticity of documents. Files can be uploaded in all common file formats (MS Word, PDF and many more). Alternatively, users can paste text directly into PlagScan and check it for authenticity. Our service employs a highly advanced two-step algorithm based on the latest research in computer linguistics.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Word 97. (Computer file)"

1

Chapman, Deena Jacques. "Word processing: What features need to be learned first to be productive fast?" CSUSB ScholarWorks, 1991. https://scholarworks.lib.csusb.edu/etd-project/592.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Mobarak, Barbara Ann. "The development of a computer literacy curriculum for California charter schools." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2683.

Full text
Abstract:
To develop leaders for the 21st century, schools must be able to prepare students to meet the high academic, technical and workforce challenges. Charter schools are increasingly attempting to meet these challenges by educating students through innovative means and by creating effectual educational programs that are more conducive to the needs of the student. This document provides a computer literacy curriculum, which will facilitate student learning of computer literacy skills.
APA, Harvard, Vancouver, ISO, and other styles
3

Gopi, Srikanth. "Design of the 32-bit 32-word 10-read/write port register file." 2006. http://digital.library.okstate.edu/etd/umi-okstate-1825.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Carter, William D. (William David) 1950. "A comparison of instructor-led and interactive video training for the personal computer application WordPerfect." Thesis, 1991. http://hdl.handle.net/1957/37383.

Full text
Abstract:
This research compared the effectiveness of an interactive video training program with an instructor-led program for teaching working adults the personal computer (PC) application WordPerfect. The objectives of the study were to develop a research methodology and instrumentation in order to determine whether instructor-led training resulted in significantly different reaction, performance, and post-training use than interactive video instruction, and to utilize the findings to suggest strategies for teaching working adults PC applications. The study was based on the need to comparatively evaluate various instructional approaches for teaching PC applications to working adults. There is also an underlying need in computer training for easily administered, yet comprehensive evaluation methodologies. There were 111 individuals in the original sample. Half were randomly assigned to an interactive video group and half to an instructor-led group. After initial dropouts there were 53 individuals in the instructor-led group and 47 in the interactive video group. Instructional objectives, content, and topic sequence were the same for both groups. A pilot study was conducted to confirm the reliability and validity of the instruments and methodology. A demographic questionnaire was completed at the beginning of an initial training session. At the end of the first training session a performance test and a reaction questionnaire were completed. After two to three weeks a use survey, a knowledge test, and a performance test were completed. Descriptive and analytic statistics were prepared for the dependent variables (reaction, performance, and post-training use) and covariates (age, gender, occupation, organization, education, and prior use). Null hypotheses of no difference were rejected when the significance was less than .05. Results indicated no significant differences in performance between the groups after either the first training session or after two to three weeks.
However, results indicated significant differences (p = .0004) in reaction with the instructor-led group rating the training better overall. The instructor-led group also indicated that the clarity and usefulness of the course materials was better (p = .035). Significant differences were also found in post-training use (p = .036).
Graduation date: 1991
APA, Harvard, Vancouver, ISO, and other styles
5

Srimugunthan, *. "Efficient Usage Of Flash Memories In High Performance Scenarios." Thesis, 2012. http://etd.iisc.ernet.in/handle/2005/2562.

Full text
Abstract:
New PCI-e flash cards and SSDs supporting over 100,000 IOPS are now available, with several use cases in the design of a high-performance storage system. By using an array of flash chips, arranged in multiple banks, large capacities are achieved. Such a multi-banked architecture allows parallel read, write, and erase operations. In a raw PCI-e flash card, this parallelism is directly available to the software layer. In addition, the devices have restrictions: for example, pages within a block can only be written sequentially. The devices also have larger minimum write sizes (>4 KB). Current flash translation layers (FTLs) in Linux are not well suited for such devices due to the high device speeds, architectural restrictions, and other factors such as high lock contention. We present an FTL for Linux that takes the hardware restrictions into account and exploits the parallelism to achieve high speeds. We also consider leveraging the parallelism for garbage collection by scheduling garbage collection activities on idle banks. We propose and evaluate an adaptive method to vary the amount of garbage collection according to the current I/O load on the device. For large-scale distributed storage systems, flash memories are an excellent choice because they consume less power, take less floor space for a target throughput, and provide faster access to data. In a traditional distributed filesystem, even distribution is required to ensure load balancing, balanced space utilisation, and failure tolerance. With flash memories, we should additionally ensure that the number of writes to the different flash storage nodes is evenly distributed, so that the flash storage nodes wear evenly and unpredictable failures of storage nodes are avoided. This requires that we distribute updates and perform garbage collection across the flash storage nodes.
We motivate the distributed wear-levelling problem by considering the replica placement algorithm for HDFS. Viewing wear-levelling across flash storage nodes as a distributed co-ordination problem, we present an alternate design to reduce the message communication cost across participating nodes. We demonstrate the effectiveness of our design through simulation.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Word 97. (Computer file)"

1

Schwabe, Walter. Word 97 in no time. London: Prentice Hall, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Clifford, Sarah Hutchinson. Total advantage: Microsoft Word 97. Boston, Mass: Irwin/McGraw-Hill, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Clifford, Sarah Hutchinson. Microsoft Office 97 professional. Boston, Mass: Irwin McGraw-Hill, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

J, O'Leary Timothy. Microsoft Office 97. Boston, Mass: Irwin/McGraw-Hill, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Using Microsoft Office 97. Indianapolis, IN: Que, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Bott, Ed. Using Microsoft Office 97. 3rd ed. Indianapolis: Que, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Person, Ron. Using Word and Excel in Office 97. Indianapolis, IN: Que Corporation, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Learn Microsoft Office 97: Comprehensive tutorials for Word 97, Excel 97, PowerPoint 97, Outlook 97, Web Access, Shortcut Bar, Binder and much more ... Plano, TX: Wordware Publishing, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Microsoft Office 97 in easy steps. Southam: Computer Step, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

J, O'Leary Timothy. McGraw-Hill Microsoft Office 97. Boston, Mass: Irwin McGraw-Hill, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Word 97. (Computer file)"

1

Pournelle, Jerry. "1001 Computer Words you Need to Know." In 1001 Computer Words You Need to Know. Oxford University Press, 2004. http://dx.doi.org/10.1093/oso/9780195167757.003.0007.

Full text
Abstract:
accelerated graphics port ▶ n. see AGP. accelerator board (also accelerator card) ▶ n. an accessory circuit board that can be plugged into a desktop computer to increase the speed of its processor or input/output operations. access ▶ n. the action or process of obtaining or retrieving information stored in a computer’s memory: this prevents unauthorized access or inadvertent deletion of the file. ▪ the opportunity to use a computer, files, data, etc.: unauthorized user access. ▪ a way to connect to the Internet: broadband access. ▶ v. [trans.] (usu. be accessed) obtain, examine, or retrieve (data or a file)…. USAGE: The verb access is standard and common in computing and related terminology, but the word is primarily a noun. Outside computing contexts, its use as a verb in the sense of ‘approach or enter a place’ is often regarded as nonstandard.
APA, Harvard, Vancouver, ISO, and other styles
2

Chan, Yung-Kuan, Yu-An Ho, Hsien-Chu Wu, and Yen-Ping Chu. "A Duplicate Chinese Document Image Retrieval System." In Encyclopedia of Information Science and Technology, First Edition, 1–6. IGI Global, 2005. http://dx.doi.org/10.4018/978-1-59140-553-5.ch001.

Full text
Abstract:
An optical character recognition (OCR) system enables a user to feed an article directly into an electronic computer file by translating the optically scanned bitmaps of text characters into machine-readable codes (that is, ASCII, Chinese GB, or Big5 codes) and then edit it using a word processor. OCR is hence being employed by libraries to digitize and preserve their holdings. Billions of letters are sorted every day by OCR machines, which can considerably speed up mail delivery.
APA, Harvard, Vancouver, ISO, and other styles
3

Medlin, B. Dawn, Joseph A. Cazier, and Dinesh S. Dave. "Password Security Issues on an E-Commerce Site." In Information Security and Ethics, 3133–41. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-937-3.ch210.

Full text
Abstract:
With the exponential growth of the Internet and e-commerce, the need for secure transactions has become a necessity for both consumers and businesses. Even though there have been advances in security technology, one aspect remains constant: passwords still play a central role in system security. The difficulty with passwords is that all too often they are the easiest security mechanism to defeat. Kevin Mitnick, notably the most recognized computer hacker, made the following statement concerning humans and their passwords: …the human side of computer security is easily exploited and constantly overlooked. Companies spend millions of dollars on firewalls, encryption and secure access devices, and it’s money wasted, because none of these measures addresses the weakest link in the security chain. (Poulsen, 2000) Without secure passwords, e-commerce sites invite online criminals to attempt fraudulent schemes that mimic the goods and services that legitimate e-commerce merchants offer. With increasing numbers of users on an increasing array of e-commerce sites, often requiring the use of passwords, users often choose to reuse the same simplistic password, and do so on multiple sites (Campbell, Calvert, & Boswell, 2003). For most computerized systems, passwords are the first line of defense against hackers or intruders (Horowitz, 2001). Numerous published articles have offered guidelines on how to create better or safer passwords, with the following recommendations: 1. passwords should be memorized and not written down; 2. passwords should be an eight- or nine-character word or phrase, and end users should randomly add; 3. passwords should contain a mixture of letters (both upper- and lowercase), numbers, and punctuation characters; and 4. passwords should never be words that can be commonly found in a dictionary. But if an individual adheres to security experts’ suggestions about password authentication, it usually involves a trade-off.
If a password is easy to create and remember, it is most likely easy for others to guess or for a hacker to crack. Eventually, any password can be cracked. Password crackers use a variety of methods and tools, including guessing, dictionary lists, and brute force attacks. Dictionary lists are created by using an automated program that includes a text file of words that are common in a dictionary. The program repeatedly attempts to log on to the target system, using a different word from the text file on each attempt. A brute force attack is a variation of the dictionary attack, but it is designed to determine passwords that may not be included in the text file. In a brute force attack, the attacker uses an automated program that generates hashes or encrypted values for all possible passwords and compares them to the values in the password file (Conklin, White, Cothren, Williams, & Davis, 2004). Unfortunately, many of the deficiencies of password authentication systems arise from the limitations of human cognitive ability (Pond, Podd, Bunnell, & Henderson, 2000). The requirement to remember long and complicated passwords is contrary to a well-known property of human memory. First, human memory is limited in its capacity to remember a sequence of items, with a short-term capacity of around seven items plus or minus two (Kanaley, 2001). Second, when humans remember a sequence of items, those items cannot be drawn from an arbitrary and unfamiliar range, but must be familiar “chunks” such as words or familiar symbols. Third, human memory thrives on redundancy. In fact, studies have shown that individuals’ short-term memory will retain a password for approximately 30 seconds, thereby requiring individuals to attempt to memorize their passwords immediately (Atkinson & Shiffrin, 1968).
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Word 97. (Computer file)"

1

Da Rocha, Erik Lucas, Larissa Rodrigues, and João Fernando Mari. "Maize leaf disease classification using convolutional neural networks and hyperparameter optimization." In Workshop de Visão Computacional. Sociedade Brasileira de Computação - SBC, 2020. http://dx.doi.org/10.5753/wvc.2020.13489.

Full text
Abstract:
Maize is an important food crop worldwide, but several diseases affect the quality and quantity of agricultural production. Identifying these diseases is a very subjective and time-consuming task. The use of computer vision techniques allows this task to be automated and is essential in agricultural applications. In this study, we assess the performance of three state-of-the-art convolutional neural network architectures for classifying maize leaf diseases. We apply enhancement methods such as Bayesian hyperparameter optimization, data augmentation, and fine-tuning strategies. We evaluate these CNNs on the maize leaf images from the PlantVillage dataset, and all experiments were validated using a five-fold cross-validation procedure over the training and test sets. Our findings include the correlation between the maize leaf classes and the impact of data augmentation on pre-trained models. The results show that maize leaf disease classification reached 97% accuracy for all CNN models evaluated. Our approach also provides new perspectives for the identification of leaf diseases based on computer vision strategies.
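The five-fold cross-validation procedure the authors mention can be sketched generically as plain index splitting; the actual experiments of course train CNNs on image data, but the fold logic is the same:

```python
def k_fold_indices(n_samples: int, k: int = 5) -> list[tuple[list[int], list[int]]]:
    """Split sample indices into k folds; each fold serves once as the test
    set while the remaining k-1 folds form the training set."""
    indices = list(range(n_samples))
    fold_size = n_samples // k
    folds = []
    for i in range(k):
        start = i * fold_size
        # The last fold absorbs any remainder when n_samples % k != 0.
        end = start + fold_size if i < k - 1 else n_samples
        test_idx = indices[start:end]
        train_idx = indices[:start] + indices[end:]
        folds.append((train_idx, test_idx))
    return folds

# Every sample lands in exactly one test fold.
folds = k_fold_indices(10, k=5)
print([len(test) for _, test in folds])  # [2, 2, 2, 2, 2]
```

Averaging the per-fold scores gives a performance estimate that does not depend on one lucky train/test split, which is why the paper validates every experiment this way.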
APA, Harvard, Vancouver, ISO, and other styles
2

Sgarzi, O., and F. Leboeuf. "Analysis of Vortices in Three-Dimensional Jets Introduced in a Cross-Flow Boundary-Layer." In ASME 1997 International Gas Turbine and Aeroengine Congress and Exhibition. American Society of Mechanical Engineers, 1997. http://dx.doi.org/10.1115/97-gt-517.

Full text
Abstract:
The aim of this work is to numerically investigate the different vortical structures present in the flow generated by a jet in crossflow. The test case, which is relevant to a hydraulic experiment, consists of a single jet ejecting normally into a laminar main stream. The computation is performed using a stationary three-dimensional Navier-Stokes code. A multi-block technique is used to compute the flow in the injection pipe. In addition, high resolution is achieved in the region of jet-mainstream interaction. The flow analysis relies on the visualization of particle trajectories. These particles are introduced into vortex cores that are located by the secondary velocity field they induce. The three-dimensional behavior of these structures is elucidated, revealing the origin of the fluid of which they are composed. The mixing between the jet and the incoming viscous layer appears to begin on the windward side of the jet boundary. Up to five types of vortices are identified, including the well-known counter-rotating vortices that dominate the downstream flow. They clearly result from the stretching and warping of the annular vorticity rings issuing from the pipe. At the jet exit, they each split into two sub-structures with different growth and downstream development. Less prominent structures are also seen. The “horseshoe” vortex, typical of the near-wall effects due to blockage induced by the jet in the main stream, has been captured. Its downstream legs are sucked through the jet boundary. Another weak structure, located on the upstream jet boundary, is the “lip” vortex, which results from the specific topology of the upstream part of the flow. It is shown that viscous effects play an important role in both the generation of and interaction between vortical structures.
APA, Harvard, Vancouver, ISO, and other styles
3

Odegbile, Olufemi, Chaoyi Ma, Shigang Chen, Dimitrios Melissourgos, and Haibo Wang. "Hierarchical Virtual Bitmaps for Spread Estimation in Traffic Measurement." In 11th International Conference on Computer Science and Information Technology (CCSIT 2021). AIRCC Publishing Corporation, 2021. http://dx.doi.org/10.5121/csit.2021.110718.

Full text
Abstract:
This paper introduces a hierarchical traffic model for spread measurement of network traffic flows. The hierarchical model, which aggregates lower-level flows into higher-level flows in a hierarchical structure, allows us to measure network traffic at different granularities at once, supporting diverse traffic analysis from a grand view to fine-grained details. The spread of a flow is the number of distinct elements (under measurement) in the flow, where the flow label (which identifies packets belonging to the flow) and the elements (which are defined based on application need) can be found in packet headers or payload. Traditional flow spread estimators are designed without hierarchical traffic modeling in mind and incur high overhead when they are applied to each level of the traffic hierarchy. In this paper, we propose a new Hierarchical Virtual bitmap Estimator (HVE) that performs simultaneous multi-level traffic measurement at the same cost as a traditional estimator, without degrading measurement accuracy. We implement the proposed solution and perform experiments based on real traffic traces. The experimental results demonstrate that HVE improves measurement throughput by 43% to 155%, thanks to the reduction of per-packet processing overhead. For small to medium flows, its measurement accuracy is largely similar to traditional estimators that work at one level at a time. For large aggregate and base flows, its accuracy is better, with up to 97% smaller error in our experiments.
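The per-flow building block of such spread estimators is a bitmap with linear-counting estimation: each (flow, element) pair sets one hashed bit, and the fraction of zero bits yields a duplicate-insensitive distinct count. The sketch below shows that building block only; the hierarchical bit-sharing across flow levels that distinguishes HVE is omitted, and the MD5 hash and bitmap size are illustrative assumptions.

```python
import hashlib
import math

def _bit_index(flow, element, m):
    # Hash the (flow, element) pair to one of m bit positions.
    h = hashlib.md5(f"{flow}:{element}".encode()).hexdigest()
    return int(h, 16) % m

class BitmapEstimator:
    """Per-flow bitmap spread estimator using linear counting."""

    def __init__(self, m=1024):
        self.m = m
        self.bits = [0] * m

    def record(self, flow, element):
        # Duplicates of the same element set the same bit, so they
        # do not inflate the estimate.
        self.bits[_bit_index(flow, element, self.m)] = 1

    def estimate(self):
        zeros = self.bits.count(0)
        if zeros == 0:
            return float("inf")  # bitmap saturated; spread exceeds capacity
        return -self.m * math.log(zeros / self.m)
```

The logarithmic correction compensates for hash collisions: with 100 distinct elements in a 1024-bit map, roughly 95 bits get set, and the formula recovers an estimate near 100.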
APA, Harvard, Vancouver, ISO, and other styles
4

Ouillette, Joanne J. "Designing the Future DDG 51 Class Computer Aided Design." In ASME 1993 International Computers in Engineering Conference and Exposition. American Society of Mechanical Engineers, 1993. http://dx.doi.org/10.1115/edm1993-0105.

Full text
Abstract:
Abstract The DDG 51 Class of AEGIS guided missile destroyers is the Navy’s premier surface combatant. Named for famed World War II hero Admiral Arleigh Burke, these ships represent state-of-the-art technology. This 504-foot, 8,300-ton destroyer has been designed with improved seakeeping and survivability characteristics and carries the sophisticated AEGIS Weapon System. Derived from the Greek word meaning “shield”, AEGIS ships are the “shield of the fleet”. The Navy has commissioned the first two ships of the class. They have performed beyond expectation in rigorous at-sea trials designed to fully test combat capability. The DDG 51 Class ships are replacing retiring fleet assets. In a decreasing Department of Defense (DoD) budget environment, however, acquisition costs must be reduced to continue to build capable warships. The Navy’s Destroyer Program Office is pursuing the implementation of Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) technology to reduce costs without reducing the ship’s capability. Under Navy direction, the ship construction yards, Bath Iron Works and Ingalls Shipbuilding, are aggressively pursuing the transition to CAD-based design, construction, and life cycle support. This effort also involves General Electric, the Combat System Engineering Agent. Building a three-dimensional (3D) computer model of the ship prior to construction will facilitate the identification and resolution of interferences and interface problems that would otherwise go undetected until actual ship construction. This 3D database contains geometry and design data to support system design. Accurate construction drawings, fabrication sketches, and Numerical Control (NC) data can be extracted directly from the database to support construction at each shipyard. At completion of construction, a model representing the “as built” configuration will be provided as a lifetime support tool for each ship’s projected 40-year life.
The transition to CAD-based design and construction has applied fundamental concepts of the DoD’s Computer Aided Acquisition and Logistic Support (CALS) initiative. In addition to creating a 3D database representing the ship design, the shipyards have developed a neutral file translator to exchange this data between the Computervision and Calma CAD systems in operation at Bath Iron Works and Ingalls Shipbuilding, respectively. This object-oriented transfer capability ensures data is shared rather than duplicated. The CALS concepts of concurrent engineering and computer-aided engineering analysis are being applied to design an upgrade to the ship that features the addition of a helicopter hangar. The CAD models are used as an electronic baseline from which to assess proposed modifications. Optimizing the design before the first piece of steel is cut will reduce construction costs and improve the quality of the ship.
APA, Harvard, Vancouver, ISO, and other styles
5

Beecher, Scott F., and Bret G. Lynch. "Loading Software to Engine Controls in the Field." In ASME 1997 International Gas Turbine and Aeroengine Congress and Exhibition. American Society of Mechanical Engineers, 1997. http://dx.doi.org/10.1115/97-gt-016.

Full text
Abstract:
With the advent of electronically alterable memories in electronic gas turbine engine control systems, there is now the opportunity for updating software in the field. Field loading provides a means to economically correct problems or introduce enhancements to system operation through the electronic control. In this paper we describe the characteristics of high-integrity reprogramming systems used to update engine controls in the field. Pratt & Whitney Aircraft supports two methods for in-service reprogramming of Electronic Engine Controls (EECs): PC laptop-based loaders and ARINC loaders. This discussion will focus on the capabilities provided to support in-the-field reprogramming of engine controls. The flexibility, integrity, and benefits of field reprogramming provided by these software loading systems will be explained. These reprogramming systems provide a PC-based application and ARINC-based systems for either on-wing reprogramming or on-board reprogramming directly from a flight deck device to the EEC. The PC Loader reprogramming utilities allow field personnel to reprogram engine control application software and/or constants and configuration information using a suitably equipped IBM PC or compatible computer. These utilities are intended to be operated per Service Bulletin authorization only. They require a PC-compatible computer (presumably a laptop model) with two UART interface cards, an interfacing cable, and the new software to be loaded. The rigor and manner of the integrity checks to ensure proper loading of the control are essential to an acceptable loading system. There are two types of ARINC-based loaders: on-wing loaders and on-board loaders. Both types enable the operator to upload application, trim, and/or configuration software to the engine control. Additionally, the ARINC 615 device allows operators to download fault and configuration data from the control.
Each type of loader uses a specially formatted file to control the sequence of operations involved in a data loading session. The on-wing loader utilizes a specially designed portable data loader which connects directly to the EEC via dedicated cabling through the control’s ARINC connectors. This type of data loader contains software which communicates via an ARINC 615 protocol to a peer software entity running on the EEC. The on-board loader uses the aircraft’s central maintenance computer system to communicate with the EEC over the aircraft’s ARINC 629 data bus. It also operates using a peer-to-peer communication protocol with the EEC. The ARINC 629 loader requires no extra equipment or cabling, nor does it require the EEC to be accessible for attachment of cables.
APA, Harvard, Vancouver, ISO, and other styles
6

Ding, Houzhu, Filippos Tourlomousis, Azizbek Babakhanov, and Robert C. Chang. "Design of a Personalized Skin Grafting Methodology Using an Additive Biomanufacturing System Guided by 3D Photogrammetry." In ASME 2015 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/imece2015-51990.

Full text
Abstract:
In this paper, the authors propose a novel method whereby a prescribed simulated skin graft is 3D printed, followed by the realization of a 3D model representation using the open-source software AutoDesk 123D Catch to reconstruct the entire simulated skin area. The methodology is photogrammetry, which measures the 3D model of a real-world object. Specifically, the principal algorithm of photogrammetry is structure from motion (SfM), which provides a technique to reconstruct a 3D scene from a set of images collected using a digital camera. This is an efficient approach to reconstructing the burn depth compared to other non-intrusive 3D optical imaging modalities (laser scanning, optical coherence tomography). Initially, an artificial human hand with representative dimensions is designed using a CAD design program. Grooves with a step-like depth pattern are then incorporated into the design in order to simulate a skin burn wound depth map. Then, the *.stl format file of the virtually wounded artificial hand is extruded as a thermoplastic material, acrylonitrile butadiene styrene (ABS), using a commercial 3D printer. Next, images of the grooves representing different extents of burn injury are acquired by a digital camera from different directions with respect to the artificial hand. The images stored in a computer are then imported into AutoDesk 123D Catch to process the images, thereby yielding the 3D surface model of the simulated hand with a burn wound depth map. The output of the image processing is a 3D model file that represents the groove on the plastic object and thus the burned tissue area. One-dimensional sliced sections of the designed model and the reconstructed model are compared to evaluate the accuracy of the reconstruction methodology.
Finally, the 3D CAD model is designed with a prescribed internal tissue scaffold structure and sent to the dedicated software of the 3D printing system to print the design of the virtual skin graft with biocompatible material poly-ε-caprolactone (PCL).
APA, Harvard, Vancouver, ISO, and other styles
7

"Changing Paradigms of Technical Skills for Data Engineers." In InSITE 2018: Informing Science + IT Education Conferences: La Verne California. Informing Science Institute, 2018. http://dx.doi.org/10.28945/4001.

Full text
Abstract:
Aim/Purpose: [This Proceedings paper was revised and published in the 2018 issue of the journal Issues in Informing Science and Information Technology, Volume 15] This paper investigates the new technical skills that are needed for Data Engineering. Past research is compared to new research which creates a list of the 20 top technical skills required by a Data Engineer. The growing availability of Data Engineering jobs is discussed. The research methodology describes the gathering of sample data and then the use of Pig and MapReduce on AWS (Amazon Web Services) to count occurrences of Data Engineering technical skills from 100 Indeed.com job advertisements in July, 2017. Background: A decade ago, Data Engineering relied heavily on the technology of Relational Database Management Systems (RDBMS). For example, Grisham, P., Krasner, H., and Perry D. (2006) described an Empirical Software Engineering Lab (ESEL) that introduced Relational Database concepts to students with hands-on learning that they called “Data Engineering Education with Real-World Projects.” However, as seismic improvements occurred for the processing of large distributed datasets, big data analytics has moved into the forefront of the IT industry. As a result, the definition for Data Engineering has broadened and evolved to include newer technology that supports the distributed processing of very large amounts of data (e.g. Hadoop Ecosystem and NoSQL Databases). This paper examines the technical skills that are needed to work as a Data Engineer in today’s rapidly changing technical environment. Research is presented that reviews 100 job postings for Data Engineers from Indeed (2017) during the month of July, 2017 and then ranks the technical skills in order of importance. The results are compared to earlier research by Stitch (2016) that ranked the top technical skills for Data Engineers in 2016 using LinkedIn to survey 6,500 people that identified themselves as Data Engineers.
Methodology: A sample of 100 Data Engineering job postings were collected and analyzed from Indeed during July, 2017. The job postings were pasted into a text file and then related words were grouped together to make phrases. For example, the word “data” was put into context with other related words to form phrases such as “Big Data”, “Data Architecture” and “Data Engineering”. A text editor was used for this task and the find/replace functionality of the text editor proved to be very useful for this project. After making phrases, the large text file was uploaded to the Amazon cloud (AWS) and a Pig batch job using MapReduce was leveraged to count the occurrences of phrases and words within the text file. The resulting phrases/words with occurrence counts were downloaded to a Personal Computer (PC) and then loaded into an Excel spreadsheet. Using a spreadsheet enabled the phrases/words to be sorted by occurrence count and facilitated the filtering out of irrelevant words. Another task to prepare the data involved combining phrases or words that were synonymous. For example, the occurrence count for the acronym ELT and the occurrence count for the acronym ETL were added together to make an overall ELT/ETL occurrence count. ETL is a Data Warehousing acronym for Extracting, Transforming and Loading data. This task required knowledge of the subject area. Also, some words were counted in lower case and then the same word was also counted in mixed or upper case, thus producing two or three occurrence counts for the same word. These different counts were added together to make an overall occurrence count for the word (e.g. word occurrence counts for Python and python were added together). Finally, the Indeed occurrence counts were sorted to allow for the identification of a list of the top 20 technical skills needed by a Data Engineer. Contribution: Provides new information about the technical skills needed by Data Engineers.
Findings: Twelve of the 20 Stitch (2016) report phrases/words that are highlighted in bold above matched the technical skills mentioned in the Indeed research. I considered C, C++ and Java a match to the broader category of Programming in the Indeed data. Although the ranked order of the two lists did not match, the top five ranked technical skills for both lists are similar. The reader of this paper might consider the skills of SQL, Python, and Hadoop/HDFS to be very important technical skills for a Data Engineer. Although the programming language R is very popular with Data Scientists, it did not make the top 20 skills for Data Engineering; it was in the overall list from Indeed. The R programming language is oriented towards analytical processing (e.g. used by Data Scientists), whereas the Python language is a scripting and object-oriented language that facilitates the creation of Data Pipelines (e.g. used by Data Engineers). Because the data was collected one year apart and from very different data sources, the timing of the data collection and the different data sources could account for some of the differences in the ranked lists. It is worth noting that the Indeed research ranked list introduced the technical skills of Design Skills, Spark, AWS (Amazon Web Services), Data Modeling, Kafka, Scala, Cloud Computing, Data Pipelines, APIs and AWS Redshift Data Warehousing to the top 20 ranked technical skills list. The Stitch (2016) report skills that did not have matches in the Indeed (2017) sample data were Linux, Databases, MySQL, Business Intelligence, Oracle, Microsoft SQL Server, Data Analysis and Unix. Although many of these Stitch top 20 technical skills were on the Indeed list, they did not make the top 20 ranked technical skills. Recommendations for Practitioners: Some of the skills needed for Database Technologies are transferable to Data Engineering.
Recommendation for Researchers: None. Impact on Society: There is not much peer-reviewed literature on the subject of Data Engineering, so this paper will add new information to the subject area. Future Research: I'm developing a Specialization in Data Engineering for the MS in Data Science degree at our university.
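The counting-and-merging methodology described in this abstract (case folding, merging synonymous terms such as ELT/ETL and Python/python, then ranking by occurrence) can be sketched locally without Pig or MapReduce. The phrase grouping done beforehand with a text editor is assumed to have already produced single tokens, and the synonym map here is an illustrative example:

```python
from collections import Counter

def rank_skills(text, synonyms=None, top_n=20):
    """Count token occurrences case-insensitively, merge synonym groups
    into a single canonical label, and return the top_n frequent terms."""
    synonyms = synonyms or {}
    counts = Counter(text.lower().split())
    merged = Counter()
    for word, n in counts.items():
        # Map each token to its canonical label (e.g. "elt" -> "etl/elt")
        # so variant spellings accumulate into one overall count.
        merged[synonyms.get(word, word)] += n
    return merged.most_common(top_n)
```

Lower-casing before counting collapses the two-or-three counts per word the abstract mentions, and the synonym map replaces the manual step of adding ELT and ETL counts together in the spreadsheet.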
APA, Harvard, Vancouver, ISO, and other styles