To see the other types of publications on this topic, follow the link: Electronic data processing Computers.

Journal articles on the topic 'Electronic data processing Computers'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Electronic data processing Computers.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Moulin, P., A. T. Ogielski, G. Lilienfeld, and J. W. Woods. "Video Signal Processing and Coding on Data-Parallel Computers." Digital Signal Processing 5, no. 2 (1995): 118–29. http://dx.doi.org/10.1006/dspr.1995.1011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Jacobson, Robert V. "Electronic Data Processing Security—An Overview." International Journal of Network Management 6, no. 2 (1996): 77–93. http://dx.doi.org/10.1002/(sici)1099-1190(199603/04)6:2<77::aid-nem184>3.0.co;2-7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Cao, Yuan, Hongkang Lu, and Tao Wen. "A Safety Computer System Based on Multi-Sensor Data Processing." Sensors 19, no. 4 (2019): 818. http://dx.doi.org/10.3390/s19040818.

Full text
Abstract:
The safety computer in a train control system is designed as a double two-vote-two architecture. If safety-critical multi-sensor input data are inconsistent, non-strict multi-sensor data problems may appear in the output. These problems may directly affect the decision making of the safety computer and even pose a serious threat to the safe operation of the train. In this paper, the non-strict multi-sensor data problems that exist in traditional safety computers are analyzed. The input data are classified based on data features and safety computer features, and the input data that cause non-strict multi-sensor data problems are modeled. Fuzzy theory is used in the safety computer to process multi-sensor data and to avoid the non-strict multi-sensor problems. The fuzzy processing model is added to the onboard double two-vote-two safety computer platform and can be divided into two parts: an improved fuzzy decision tree and improved fuzzy weighted fusion. Finally, the model is verified on two kinds of data. The verification results indicate that the fuzzy processing model can effectively reduce the non-strict consistency problems and improve system efficiency while ensuring data reliability.
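For illustration, the minimal sketch below shows one generic form of fuzzy weighted fusion of redundant sensor readings: each reading receives a membership weight that decays with its distance from the median, so slightly inconsistent ("non-strict") inputs still contribute instead of forcing a hard voting failure. This is an editorial example with an assumed triangular membership function and tolerance, not the authors' onboard model.

    # Minimal sketch of fuzzy weighted fusion for redundant sensor readings
    # (illustrative only; not the authors' onboard model).
    import numpy as np

    def fuzzy_weighted_fusion(readings, tolerance=2.0):
        """Fuse redundant readings of the same quantity with fuzzy weights."""
        readings = np.asarray(readings, dtype=float)
        center = np.median(readings)
        # Triangular membership: 1 at the median, 0 at +/- tolerance.
        weights = np.clip(1.0 - np.abs(readings - center) / tolerance, 0.0, 1.0)
        if weights.sum() == 0.0:
            raise ValueError("readings too inconsistent to fuse")
        return float(np.dot(weights, readings) / weights.sum())

    # Example: three speed sensors that disagree slightly (non-strict data).
    print(fuzzy_weighted_fusion([80.1, 80.3, 81.0]))  # ~80.4 km/h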
APA, Harvard, Vancouver, ISO, and other styles
4

El-Seoud, Samir Abou, Reham Fouad Mohamed, and Samy Ghoneimy. "DNA Computing: Challenges and Application." International Journal of Interactive Mobile Technologies (iJIM) 11, no. 2 (2017): 74. http://dx.doi.org/10.3991/ijim.v11i2.6564.

Full text
Abstract:
Much of our scientific, technological, and economic future depends on the availability of an ever-increasing supply of computational power. However, the increasing demand for such power has pushed electronic technology to the limit of physical feasibility and has raised the concern that this technology may not be able to sustain our growth in the near future. It has therefore become important to consider alternative means of achieving computational power. In this regard, DNA computing was introduced, based on the use of DNA and molecular biology hardware instead of the typical silicon-based technology. Molecular computers could take advantage of DNA's physical properties to store information and perform calculations. These include extremely dense information storage, enormous parallelism and extraordinary energy efficiency. One of the main advantages that DNA computation would add is its inherently parallel processing, while most electronic computers still use linear processing. In this paper, DNA computation is reviewed and its state-of-the-art challenges and applications are presented. Some of these applications are those that require fast processing, for which DNA computers would be able to solve the hardest problems faster than traditional ones. For example, 10 trillion DNA molecules can fit in one cubic centimeter, which would result in a computer that holds 10 terabytes of data. Moreover, this work focuses on whether a large scale molecular computer can be built.
APA, Harvard, Vancouver, ISO, and other styles
5

Mansfield, John F. "An Introduction Electronic Documentation Preparation and Submission for Future Microscopy and Microanalysis Meetings." Microscopy and Microanalysis 5, S2 (1999): 520–21. http://dx.doi.org/10.1017/s1431927600015920.

Full text
Abstract:
Since the introduction of personal computers in the early 1980s, most documents and manuscripts have been prepared from within a word processing program on such a computer. Indeed, most documents on personal computers are created on a limited number of word processing packages that have come to dominate the market. Most of the data acquisition that attendees of Microscopy and Microanalysis currently perform is digital, i.e. directly into a personal computer or desktop workstation. It is frequently the case that images, spectra and diffraction patterns are only committed to paper when they are dispatched for publication. With the advent of the World Wide Web in 1994, and its subsequent explosive growth, entirely electronic publishing has become possible. It is possible to envisage the day when most publications will be available solely electronically and material will be only printed out occasionally by the reader. While there are a growing number of journals that are only available electronically, entirely paperless publication is not yet the norm and there are no definite plans, as yet, to publish the proceedings of the Microscopy and Microanalysis meetings solely in electronic format. However, there are a number of advantages to applying some of these emerging technologies in the production of the future proceedings. Electronic submission of abstracts is one advance that is being considered by the organizers of the meetings scheduled for the early 2000s. The National Science Foundation now receives almost all proposals electronically. There are several reasons for moving to electronic submission. The first is that proofing and correcting the abstracts would then be straightforward.
APA, Harvard, Vancouver, ISO, and other styles
6

Ahmad Bukhori S. "DESAIN SISTEM: VISUAL BASIC FOR APPLICATION EXCEL 2010 UNTUK PENGGAJIAN GURU MADRASAH ALIYAH MUHAMMADIYAH I MALANG." INTAJ : Jurnal Penelitian Ilmiah 3, no. 1 (2019): 155–85. http://dx.doi.org/10.35897/intaj.v3i1.209.

Full text
Abstract:
In the world of education in Indonesia, computers have been introduced and used in schools ranging from basic education to tertiary institutions. Computers make it easy to find and provide learning materials, for example through electronic libraries (e-libraries) or electronic books (e-books).
MA Muhammadiyah I Malang is a school equivalent to a senior high school (SMA) that has used computers as tools for teaching and learning activities, finance, administration and so forth. Computers are tools that help humans make reports and produce information that can serve as a basis for decision making, and they contribute greatly to data storage and data retrieval processes. Currently the financial section still uses Microsoft Excel to process teaching fees and employee honorariums. The financial data are separated across several sheets, each requiring separate data input, so financial calculations and reporting are less efficient and fast.
The bookkeeping process is sometimes very difficult, complicated or even troublesome for some people, but today's computer technology has been able to make this process easier, simpler, faster and more efficient. The payroll system is a computer application created to make it easier for employees in the financial department to create, calculate, report and analyze school financial turnover, so that the application can provide optimal services to teachers and employees as a support for timely decision making.
APA, Harvard, Vancouver, ISO, and other styles
7

Voland, Patrick, and Hartmut Asche. "Processing and Visualizing Floating Car Data for Human-Centered Traffic and Environment Applications." International Journal of Agricultural and Environmental Information Systems 8, no. 2 (2017): 32–49. http://dx.doi.org/10.4018/ijaeis.2017040103.

Full text
Abstract:
In the era of the Internet of Things and Big Data, modern cars have become mobile electronic systems or computers on wheels. Car sensors record a multitude of car- and traffic-related data as well as environmental parameters outside the vehicle. The data recorded are spatio-temporal by nature (floating car data) and can thus be classified as geodata. Their geospatial potential is, however, not fully exploited so far. In this paper, we present an approach to collect, process and visualize floating car data for traffic- and environment-related applications. It is demonstrated that cartographic visualization, in particular, is an effective means to make the enormous stocks of machine-recorded data available to human perception, exploration and analysis.
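As a rough illustration of the kind of preprocessing such floating car data typically undergo before cartographic visualization, the sketch below aggregates mean speed per regular grid cell; the record layout and the cell size are assumptions for the example, not the paper's actual schema or workflow.

    # Minimal sketch: aggregate floating car data (lat, lon, speed) onto a
    # regular grid, a typical preprocessing step before map visualization.
    # The record layout is hypothetical, not the paper's actual schema.
    from collections import defaultdict

    records = [  # (latitude, longitude, speed in km/h)
        (52.390, 13.065, 42.0),
        (52.391, 13.066, 38.5),
        (52.400, 13.070, 55.0),
    ]

    cell_size = 0.005  # grid resolution in degrees
    sums, counts = defaultdict(float), defaultdict(int)

    for lat, lon, speed in records:
        cell = (round(lat / cell_size), round(lon / cell_size))
        sums[cell] += speed
        counts[cell] += 1

    mean_speed = {cell: sums[cell] / counts[cell] for cell in sums}
    print(mean_speed)  # per-cell mean speeds, ready to be colour-coded on a map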
APA, Harvard, Vancouver, ISO, and other styles
8

ZIPPEL, RICHARD. "THE DATA STRUCTURE ACCELERATOR ARCHITECTURE." International Journal of High Speed Electronics and Systems 07, no. 04 (1996): 533–71. http://dx.doi.org/10.1142/s012915649600030x.

Full text
Abstract:
We present a heterogeneous architecture that contains a fine grained, massively parallel SIMD component called the data structure accelerator and demonstrate its use in a number of problems in computational geometry including polygon filling and convex hull. The data structure accelerator is extremely dense and highly scalable. Systems of 10⁶ processing elements can be embedded in workstations and personal computers without dramatically changing their cost. These components are intended for use in tandem with conventional single sequence machines and with small scale, shared memory multiprocessors. A language for programming these heterogeneous systems is presented that smoothly incorporates the SIMD instructions of the data structure accelerator with conventional single sequence code. We then demonstrate how to construct a number of higher level primitives such as maximum and minimum, and apply these tools to problems in logic and computational geometry. For computational geometry problems, we demonstrate that simple algorithms that take advantage of the parallelism available on a data structure accelerator perform as well as or better than the far more complex algorithms which are needed for comparable efficiency on single sequence computers.
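The abstract mentions building higher-level primitives such as maximum and minimum from SIMD operations. The sketch below emulates the classic bit-serial maximum used on associative SIMD arrays with ordinary Python lists; it is a standard textbook construction shown for orientation, not necessarily the exact primitive described in the paper.

    # Sketch of the classic bit-serial maximum used on associative SIMD arrays,
    # emulated here with Python lists of booleans; not necessarily the exact
    # primitive construction described in the paper.
    def simd_maximum(values, bits=8):
        active = [True] * len(values)          # all processing elements active
        for bit in range(bits - 1, -1, -1):    # scan from most significant bit
            has_bit = [a and (v >> bit) & 1 == 1 for a, v in zip(active, values)]
            if any(has_bit):                   # global OR (wired-OR on real hardware)
                active = has_bit               # deactivate PEs lacking this bit
        # Any remaining active element holds the maximum value.
        return next(v for a, v in zip(active, values) if a)

    print(simd_maximum([17, 203, 45, 203, 9]))  # 203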
APA, Harvard, Vancouver, ISO, and other styles
9

Noakes, J. E., J. D. Spaulding, and R. J. Valenta. "Low-Level Liquid Scintillation Counter Array with Computerized Data Acquisition and Age Calculation Capabilities for 14C Dating." Radiocarbon 37, no. 2 (1995): 773–79. http://dx.doi.org/10.1017/s0033822200031325.

Full text
Abstract:
We describe a two-phase study directed toward background reduction of a manual liquid scintillation counter and the interfacing of electronics for counting to a computer data acquisition system. Counter background reduction is achieved with afterpulse electronics, a high-performance cocktail, an auxiliary detector/guard and a special sample vial holder. The data acquisition system comprises an electronic signal processor and sorter for operating up to eight counters simultaneously and interfacing to a computer with software for data storage, acquisition and age-dating calculations. We discuss low-background counter modifications, electronic signal processing and computer software for 14C age dating.
APA, Harvard, Vancouver, ISO, and other styles
10

Seitkulov, Yerzhan N., Seilkhan N. Boranbayev, Gulden B. Ulyukova, Banu B. Yergaliyeva, and Dina Satybaldina. "Methods for secure cloud processing of big data." Indonesian Journal of Electrical Engineering and Computer Science 22, no. 3 (2021): 1650. http://dx.doi.org/10.11591/ijeecs.v22.i3.pp1650-1658.

Full text
Abstract:
We study new methods of secure cloud processing of big data when solving applied computationally-complex problems with secret parameters. This is one of the topical issues of secure client-server communication. As part of our research work, we model the client-server interactions: we give specific definitions of such concepts as “solvable by the protocol”, “secure protocol”, “correct protocol”, as well as actualize the well-known concepts of “active attacks” and “passive attacks”. First, we will outline the theory and methods of secure outsourcing for various abstract equations with secret parameters, and then present the results of using these methods in solving applied problems with secret parameters, arising from the modeling of economic processes. Many economic tasks involve processing a large set of economic indicators. Therefore, we are considering a typical economic problem that can only be solved on very powerful computers.
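As background for the idea of outsourcing a computation without revealing its parameters, the toy sketch below masks a linear system with random invertible matrices before handing it to the "cloud" and unmasks the returned solution locally. This is a generic textbook-style construction (with known limitations), not the protocol proposed by the authors.

    # Toy illustration of one classical masking idea for secure outsourcing of a
    # linear system Ax = b: the client hides A and b behind random invertible
    # matrices before sending them out.  Not the authors' protocol.
    import numpy as np

    rng = np.random.default_rng(0)

    # Secret problem held by the client.
    A = rng.normal(size=(3, 3))
    b = rng.normal(size=3)

    # Client-side masking: A' = P A Q, b' = P b with random invertible P, Q.
    P = rng.normal(size=(3, 3))
    Q = rng.normal(size=(3, 3))
    A_masked, b_masked = P @ A @ Q, P @ b

    # Cloud solves the masked system without learning A or b.
    y = np.linalg.solve(A_masked, b_masked)

    # Client recovers the true solution locally: x = Q y.
    x = Q @ y
    print(np.allclose(A @ x, b))  # True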
APA, Harvard, Vancouver, ISO, and other styles
11

Millington, D. "Electronic data processing in practice: a handbook for users." Information and Software Technology 33, no. 2 (1991): 167. http://dx.doi.org/10.1016/0950-5849(91)90070-r.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Ranade, Satish J. "An Automated Data Acquisition and Processing System Using Personal Computers for an Undergraduate Electric Machinery Laboratory." IEEE Power Engineering Review 9, no. 2 (1989): 69. http://dx.doi.org/10.1109/mper.1989.4310499.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Kim, Chulho. "An Implementation of Natural Language Processing and Text Mining in Stroke Research." Journal of the Korean Neurological Association 39, no. 3 (2021): 121–28. http://dx.doi.org/10.17340/jkna.2021.3.2.

Full text
Abstract:
Natural language processing (NLP) is a computerized approach to analyzing text that explores how computers can be used to understand and manipulate natural language text or speech to do useful things. In the healthcare field, these NLP techniques are applied in a variety of applications, ranging from evaluating the adequacy of treatment and assessing the presence of acute illness to other forms of clinical decision support. After converting text into computer-readable data through a text preprocessing process, NLP can extract valuable information using rule-based algorithms, machine learning and neural networks. We can use NLP to distinguish subtypes of stroke or to accurately extract critical clinical information such as the severity of stroke and the prognosis of patients. If these NLP methods are actively utilized in the future, they will make the most of electronic health records and enable optimal medical judgment.
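As a concrete, if simplified, illustration of rule-based extraction from clinical text, the sketch below pulls an NIHSS stroke-severity score out of a free-text note with a regular expression after basic preprocessing; the pattern and the example note are assumptions for illustration, not a validated clinical pipeline from the article.

    # Minimal rule-based sketch: pull a stroke severity (NIHSS) score out of
    # free-text clinical notes after simple preprocessing.  The phrasing the
    # regex matches is an illustrative assumption, not a validated pattern.
    import re

    NIHSS_PATTERN = re.compile(r"\bNIHSS\s*(?:score)?\s*(?:of|:|=)?\s*(\d{1,2})\b",
                               re.IGNORECASE)

    def extract_nihss(note: str):
        """Return the first NIHSS score found in a note, or None."""
        text = " ".join(note.split())   # basic preprocessing: normalize whitespace
        match = NIHSS_PATTERN.search(text)
        return int(match.group(1)) if match else None

    note = "Admitted with acute ischemic stroke.  NIHSS score of 14 on arrival."
    print(extract_nihss(note))  # 14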
APA, Harvard, Vancouver, ISO, and other styles
14

Do Nascimento, Paulo Sérgio Brandão, Stelita M. Da Silva, Jordana L. Seixas, Remy E. Sant’Anna, and Manoel E. De Lima. "Mapping of Massive Data Processing Systems to FPGA Computers Based on Temporal Partitioning and Design Space Exploration." Journal of Integrated Circuits and Systems 2, no. 1 (2007): 45–54. http://dx.doi.org/10.29292/jics.v2i1.235.

Full text
Abstract:
A high degree of parallelism is fundamental for high speed massive data processing systems. Modern FPGA devices can provide such parallelism plus flexibility. However, these devices are still limited by their logic block size, memory size, memory bandwidth and configuration time. Temporal partitioning techniques can be a solution to such problems when FPGAs are used to implement large systems. In this case, the system is split into partitions (called contexts), multiplexed in an FPGA by using reconfiguration techniques. This approach can increase the effective area for system implementation, allowing increased parallelism in each task that composes the application. However, the reconfiguration time needed between contexts can cause a performance decrease. A possible solution is intensive exploitation of the parallelism of massive data applications to compensate for this overhead and improve global performance. This holds for modern FPGAs with relatively high reconfiguration speed. In this work, a reconfigurable computing platform and design space exploration techniques are proposed for mapping such massive data applications, such as image processing, onto FPGA devices, depending on the application task scheduling. A library with different hardware implementations at different parallelism degrees is used to better adjust the space/time trade-off for each task. Experiments demonstrate the efficiency of this approach when compared to the optimal mapping reached by an exhaustive timing search over the complete design space. A design flow is shown based on library components that implement typical tasks in the application domain.
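The following toy sketch conveys the flavour of such design space exploration: each task offers several implementations with different area/latency trade-offs, tasks are packed in scheduling order into area-limited contexts, and each extra context costs one reconfiguration. All numbers are invented and the packing rule is deliberately simplistic; it is not the paper's actual flow.

    # Toy design space exploration in the spirit described above; all numbers
    # are made up and the packing rule is intentionally simple.
    from itertools import product

    FPGA_AREA = 100          # available logic per context (arbitrary units)
    RECONFIG_TIME = 5.0      # ms per reconfiguration
    tasks = [                # per task: list of (area, latency_ms) alternatives
        [(30, 8.0), (60, 4.5)],
        [(40, 6.0), (80, 3.0)],
        [(50, 7.0), (90, 3.5)],
    ]

    def evaluate(choice):
        """Pack the chosen implementations into contexts and return total time."""
        contexts, used, total = 1, 0, 0.0
        for area, latency in choice:
            if used + area > FPGA_AREA:   # start a new context (reconfigure)
                contexts += 1
                used = 0
            used += area
            total += latency
        return total + RECONFIG_TIME * (contexts - 1)

    best = min(product(*tasks), key=evaluate)
    print(best, evaluate(best))   # cheapest combination found by brute force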
APA, Harvard, Vancouver, ISO, and other styles
15

Masic, Izet. "The History and New Trends of Medical Informatics." Donald School Journal of Ultrasound in Obstetrics and Gynecology 7, no. 3 (2013): 301–12. http://dx.doi.org/10.5005/jp-journals-10009-1298.

Full text
Abstract:
The breakthrough of computer and information technologies into all segments of society has created a need for knowledge of these technologies, and such knowledge is now part of general literacy. Computer literacy does not require comprehensive and detailed knowledge of electronics or programming. Although the electronic computer is an invention of our age, attempts to construct the first machines for processing information reach far back in the history of human civilization. The single, global function of a computer – data processing – can naturally be separated into a series of elementary operations, for example the follow-up of data, their registration, reproduction, selection, sorting and comparison. Computers are classified according to purpose, type and size. According to purpose, computers can be of general or specific purpose; general-purpose computers serve commercial applications or any other application that is needed. If medical informatics is regarded as a scientific discipline dealing with the theory and practice of information processes in medicine, comprising data communication by information and communication technologies (ICT), with computers as an especially important ICT, then the history of medical informatics is connected with the beginnings of computer usage in medicine. Medical informatics is the foundation for understanding and practising up-to-date medicine. Its basic tool is the computer – both a subject of study and the means by which new knowledge is gained about man, his health and disease, and the functioning of health activities as a whole. The current network system has limited global performance in the organization of health care, which is especially evident in clinical medicine, where computer technology has not yet received the desired application. Ahead of us lies a brilliant future for medical informatics. It should be expected that the application of terminal and personal computers with simpler modes of operation will enable routine use of computer technology by all health professionals in the fields of telemedicine, distance learning (DL, web-based medical education), application of ICT, medical robotics, genomics, etc. The development of natural languages for communication with computers and the recognition of voice input will make the work simpler. Regarding the future of medical informatics education there are numerous controversies. Everybody agrees that medical informatics is very significant for the whole of health care and for personnel needs. However, there is not yet general agreement on teaching programs, because medical informatics is very broad and rapidly evolving, which makes establishing stable education programs more difficult. There is also no general agreement on the year of study in which medical informatics knowledge should be taught. The majority of experts still agree that priority should be given to later study years, since more and more students enrol in the faculties with prior informatics illiteracy, and the comprehension of some medical informatics fields is not possible without prior clinical knowledge. How to cite this article: Masic I. The History and New Trends of Medical Informatics. Donald School J Ultrasound Obstet Gynecol 2013;7(3):301–312.
APA, Harvard, Vancouver, ISO, and other styles
16

Cibin, Fabio, Massimo Lanzoni, Luca Benini, and Bruno Ricco. "Linux-Based Data Acquisition and Processing on Palmtop Computer." IEEE Transactions on Instrumentation and Measurement 55, no. 6 (2006): 2039–44. http://dx.doi.org/10.1109/tim.2006.884135.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Kuti, C., L. L. Láng, and Z. Bedő. "Use of barcodes and digital balances for the identification and measurement of field trial data." Acta Agronomica Hungarica 52, no. 4 (2005): 409–19. http://dx.doi.org/10.1556/aagr.52.2004.4.10.

Full text
Abstract:
The widespread use of digitally-controlled measuring and analytical devices and electronic data collectors, all equipped with microprocessors and linked to computers, has made it possible for on-line data collection to become a routine process. A rational combination of two up-to-date techniques, barcodes and digital balance terminals, linked to an average computer background (Kuti et al., 2003), has proved in practice to satisfy the criteria raised for the up-to-date processing of breeding data at low cost. This system is an example of how it is possible to reduce costs while processing data more rapidly and reliably and allowing human resources to be utilised more flexibly and efficiently. The modules (MvLabel, MvSticker, MvWeighing) of the program package developed in Martonvásár for the handling and analysis of the data from plant breeding and crop production experiments can also be used independently for the identification of experimental field units (spikes, rows, plots) and for the online handling of weight measurements and analytical data. They provide a simple solution for the design and printing of labels (self-adhesive or plastic) containing barcodes. They make it easier to retrieve the data recorded by digital balance terminals and store them on hard discs, while also helping to unify and synchronise the various parts of the system using barcode readers to identify the measurement data.
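A minimal sketch of the kind of record handling described above - pairing a scanned plot barcode with a weight read from a balance terminal and appending it to a file - might look as follows; the record format ("PLOT-0123;12.45") and file name are hypothetical examples, not the actual Martonvásár system.

    # Illustrative sketch of pairing a scanned plot barcode with a weight read
    # from a digital balance and appending it to a CSV file.  The record format
    # is a hypothetical example, not the system's actual one.
    import csv
    from datetime import datetime

    def log_measurement(scanned_line: str, path: str = "weights.csv") -> None:
        plot_id, grams = scanned_line.strip().split(";")
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([datetime.now().isoformat(), plot_id, float(grams)])

    log_measurement("PLOT-0123;12.45")   # barcode identifier + weight from the balance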
APA, Harvard, Vancouver, ISO, and other styles
18

Acho, Leonardo, Gisela Pujol-Vázquez, and José Gibergans-Báguena. "Electronic Device and Data Processing Method for Soil Resistivity Analysis." Electronics 10, no. 11 (2021): 1281. http://dx.doi.org/10.3390/electronics10111281.

Full text
Abstract:
This paper presents a mathematical algorithm and an electronic device to study soil resistivity. The system was based on introducing a time-varying electrical signal into the soil by using two electrodes and then collecting the electrical response of the soil. Hence, the proposed electronic system relied on a single-phase DC-to-AC converter followed by a transformer for the soil-to-circuit coupling. By using the maximum likelihood statistical method, a mathematical algorithm was realized to discern soil resistivity. The novelty of the numerical approach consisted of modeling a set of random data from the voltmeters by using a parametric uniform probability distribution function, and then, a parametric estimation was carried out for dataset analysis. Furthermore, to validate our contribution, a two-electrode laboratory experiment with soil was also designed. Finally, and according to the experimental outcomes, our electronic circuit and mathematical data analysis approach were able to detect different soil resistivities.
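The statistical step described above - fitting a parametric uniform distribution to the voltmeter readings by maximum likelihood - has a particularly simple solution: for U(a, b) the likelihood is maximized by the sample minimum and maximum. The small sketch below shows only this generic estimation step; how the estimate is mapped to a soil resistivity value is specific to the paper and not reproduced here.

    # Maximum likelihood fit of a uniform distribution U(a, b) to voltmeter-style
    # samples: the likelihood is maximized by the sample minimum and maximum.
    import numpy as np

    rng = np.random.default_rng(42)
    samples = rng.uniform(low=1.8, high=2.6, size=500)   # simulated voltage readings

    a_hat, b_hat = samples.min(), samples.max()          # MLE for U(a, b)
    mean_hat = 0.5 * (a_hat + b_hat)                     # derived quantities
    width_hat = b_hat - a_hat

    print(f"a={a_hat:.3f} V, b={b_hat:.3f} V, mean={mean_hat:.3f} V, width={width_hat:.3f} V")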
APA, Harvard, Vancouver, ISO, and other styles
19

Muhammad, Yar, and Daniil Vaino. "Controlling Electronic Devices with Brain Rhythms/Electrical Activity Using Artificial Neural Network (ANN)." Bioengineering 6, no. 2 (2019): 46. http://dx.doi.org/10.3390/bioengineering6020046.

Full text
Abstract:
The purpose of this research study was to explore the possibility of developing a brain-computer interface (BCI). The main objective was that the BCI should be able to recognize brain activity. BCI is an emerging technology which focuses on communication between software and hardware and permits the use of brain activity to control electronic devices, such as wheelchairs, computers and robots. The interface was developed and consists of EEG Bitronics, Arduino and a computer; moreover, two versions of the BCIANNET software were developed to be used with this hardware. This BCI used an artificial neural network (ANN) as its main processing method, with a Butterworth filter used as the data pre-processing algorithm for the ANN. Twelve subjects were measured to collect the datasets. Tasks were given to subjects to stimulate brain activity. The purpose of the experiments was to test and confirm the performance of the developed software. The aim of the software was to separate important rhythms such as alpha, beta, gamma and delta from other EEG signals. As a result, this study showed that the Levenberg–Marquardt algorithm is the best compared with the backpropagation, resilient backpropagation and error correction algorithms. The final developed version of the software is an effective tool for research in the field of BCI. The study showed that using the Levenberg–Marquardt learning algorithm gave a prediction accuracy of around 60% on the testing dataset.
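As an indication of what Butterworth pre-processing of EEG looks like in practice, the sketch below isolates the alpha band (8-12 Hz) from a synthetic signal with a zero-phase band-pass filter; the filter order, cut-offs and sampling rate are common textbook choices, not necessarily the authors' settings.

    # Minimal sketch of Butterworth band-pass pre-processing to isolate the EEG
    # alpha band (8-12 Hz) before computing features for an ANN.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 250.0                                   # sampling rate in Hz
    t = np.arange(0, 4, 1 / fs)
    eeg = (np.sin(2 * np.pi * 10 * t)            # alpha component
           + 0.5 * np.sin(2 * np.pi * 2 * t)     # delta component
           + 0.1 * np.random.default_rng(0).normal(size=t.size))

    b, a = butter(N=4, Wn=[8, 12], btype="bandpass", fs=fs)
    alpha = filtfilt(b, a, eeg)                  # zero-phase filtering

    print(alpha.shape, float(np.sqrt(np.mean(alpha ** 2))))  # band-limited signal power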
APA, Harvard, Vancouver, ISO, and other styles
20

Schafer, Burkhard. "D-waste: Data disposal as challenge for waste management in the Internet of Things." International Review of Information Ethics 22 (December 1, 2014): 101–7. http://dx.doi.org/10.29173/irie131.

Full text
Abstract:
Proliferation of data processing and data storage devices in the Internet of Things poses significant privacy risks. At the same time, faster and faster use-cycles and the obsolescence of devices with electronic components cause environmental problems. Some of the solutions to the environmental challenges of e-waste include mandatory recycling schemes as well as informal second hand markets. However, the data security and privacy implications of these green policies are as yet badly understood. This paper argues that, based on the experience with second hand markets in desktop computers, it is very likely that data that was legitimately collected under the household exception of the Data Protection Directive will “leak” into public spheres. Operators of large recycling schemes may find themselves inadvertently and unknowingly becoming data controllers for the purposes of Data Protection law, and private resale of electronic devices can expose the prior owner to significant privacy risks.
APA, Harvard, Vancouver, ISO, and other styles
21

Schafer, Burkhard. "D-waste: Data disposal as challenge for waste management in the Internet of Things." International Review of Information Ethics 22 (December 1, 2014): 101–7. http://dx.doi.org/10.29173/irie122.

Full text
Abstract:
Proliferation of data processing and data storage devices in the Internet of Things poses significant privacy risks. At the same time, faster and faster use-cycles and the obsolescence of devices with electronic components cause environmental problems. Some of the solutions to the environmental challenges of e-waste include mandatory recycling schemes as well as informal second hand markets. However, the data security and privacy implications of these green policies are as yet badly understood. This paper argues that, based on the experience with second hand markets in desktop computers, it is very likely that data that was legitimately collected under the household exception of the Data Protection Directive will “leak” into public spheres. Operators of large recycling schemes may find themselves inadvertently and unknowingly becoming data controllers for the purposes of Data Protection law, and private resale of electronic devices can expose the prior owner to significant privacy risks.
APA, Harvard, Vancouver, ISO, and other styles
22

Nugraha, Arya Adhi. "Electronic Stetoscope Design And Analysis Application Of Heart Sound With Digital Signal Processing." Jurnal Jartel: Jurnal Jaringan Telekomunikasi 1, no. 1 (2015): 1–8. http://dx.doi.org/10.33795/jartel.v1i1.123.

Full text
Abstract:
This paper designs and implements an electronic stethoscope with a client-server application for heart sound analysis. The electronic stethoscope captures heart sounds and transmits them to a computer so that the computer can digitize them. The application processes, analyzes, stores and displays the heart condition and the sound spectrum of the heart. Feature extraction is carried out to obtain characteristic features of the heart sounds by wavelet packet decomposition and root mean square (RMS). From the data obtained for different heart conditions, wavelet packet decomposition gives a value range from a minimum of 6 up to a maximum of 23, which is much larger, while RMS only gives a minimal range of 0.04 to 0.16 in the 0–125 Hz band for variations of the same types of heart conditions. Sample data were obtained from five people whose heart sounds were recorded and then analyzed with the same two methods. The data obtained are closer to the normal heart sound, so it can be concluded that the five data samples used are heart sounds under normal conditions.
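The RMS feature mentioned above can be illustrated with a short sketch that computes the root mean square of a synthetic heart-sound signal restricted to the 0-125 Hz band via an FFT mask; the signal and all values are invented, and the abstract's thresholds are not reused.

    # Sketch of a band-limited RMS feature: the root mean square of the
    # heart-sound energy restricted to the 0-125 Hz band, via an FFT mask.
    import numpy as np

    fs = 2000.0                                   # sampling rate in Hz
    t = np.arange(0, 2, 1 / fs)
    heart = np.sin(2 * np.pi * 40 * t) * np.exp(-((t % 0.8) / 0.05) ** 2)  # toy S1-like bursts

    spectrum = np.fft.rfft(heart)
    freqs = np.fft.rfftfreq(heart.size, d=1 / fs)
    spectrum[freqs > 125.0] = 0.0                 # keep only the 0-125 Hz band
    band_signal = np.fft.irfft(spectrum, n=heart.size)

    rms = np.sqrt(np.mean(band_signal ** 2))
    print(f"band-limited RMS: {rms:.4f}")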
APA, Harvard, Vancouver, ISO, and other styles
23

Irwan, Ridwan Rahim, Usman Latif Rusdi, and Mahzuz Umar Syahiq. "Study Of Domestic E-Waste Management in Sungguminasa City, Gowa Regency, South Sulawesi Province, Indonesia." E3S Web of Conferences 73 (2018): 07006. http://dx.doi.org/10.1051/e3sconf/20187307006.

Full text
Abstract:
E-waste is the impact resulting from the massive use of electronic goods in the information technology era. The increasing use of electronic goods results in increasing electronic waste. This study aims to calculate the potential generation and characteristics of e-waste, the processing methods and the potential economic value of e-waste recycling in Sungguminasa City, Gowa Regency, South Sulawesi Province, Indonesia. The study was conducted in 14 sub-districts in Somba Opu district, the most populous district in Sungguminasa City, with a population of 157,448 people, or about 1.67% of the population of South Sulawesi Province. The research method consists of analyzing data obtained through surveys, observation and interviews with heads of households in Somba Opu District, with households divided into 3 groups by monthly income level (high, medium and low); the results cover e-waste characteristics, waste generation potential, processing methods and the potential economic value of recycled e-waste. Of the 37 types of electronic goods analyzed, the 3 types with the largest percentages are televisions, refrigerators and personal computers, at 26%, 17% and 14%, or 150, 98 and 80 units/year respectively. The potential e-waste generation in Somba Opu District is 801.8 tons/year. The traditional method of e-waste processing is “converted function” at 55%, followed by “repaired” at 19% and “stored” at 17%, while the least applied method is “discarded”, with a percentage of 9%. The results of the economic potential analysis of e-waste recycling for the 3 largest electronic goods are: refrigerator US$ 32,439, computer US$ 45,994 and television US$ 76,254 (US$ 1.00 = IDR 14,000).
APA, Harvard, Vancouver, ISO, and other styles
24

Putra, Yeviki Maisyah. "PENERAPAN SISTEM INFORMASI PERPUSTAKAAN PADA SMA NEGERI 2 MUARA BUNGO MENGGUNAKAN BAHASA PEMROGRAMAN JAVA DAN DIDUKUNG DATABASE MARIADB." INTECOMS: Journal of Information Technology and Computer Science 1, no. 2 (2018): 198–211. http://dx.doi.org/10.31539/intecoms.v1i2.293.

Full text
Abstract:
Computers are electronic devices familiar to everyone, because they serve as tools in all kinds of work. A computer has application programs capable of processing various types of data quickly, precisely and accurately. Therefore, many agencies use computer services as tools that can help in the activities of the organization. Based on the research that has been done in the library of SMA Negeri 2 Muara Bungo, using field, library and laboratory research methods, it is known that the system used for borrowing and returning books is still manual and simple. The design of a library information system supported by the Java programming language will provide better solutions to the problems encountered. The level of error in calculations can be minimized, the information produced is more accurate and the data can be stored safely.
Keywords: Library, SMA Negeri 2 Muara Bungo, Java, MariaDB, Library Information System
APA, Harvard, Vancouver, ISO, and other styles
25

Оrdabayeva G.К., Dzhusupbekova G.T., and Rakhymbek N. "DESIGN AND SIMULATION OF VIRTUAL LOCAL AREA NETWORK USING CISCO PACKET TRACER." BULLETIN 6, no. 388 (2020): 6–14. http://dx.doi.org/10.32014/2020.2518-1467.176.

Full text
Abstract:
Modern local networks consist of several subscriber devices located inside the same building. Computers on the local network are interconnected using network equipment - switches. By default, all devices connected to the ports of the same switch can communicate by exchanging network packets. Computer networks of data transmission are the result of the information revolution and in the future will be able to form the main means of communication. The worldwide trend towards the integration of computers in the network is due to a number of important reasons, such as the acceleration of the transmission of information messages, the ability to quickly exchange information between users, receiving and transmitting messages (faxes, E-mail letters, electronic conferences, etc.) without leaving the workplace, the ability to instantly receive any information from anywhere in the world, as well as the exchange of information between computers of different manufacturers working under different software. A large number of broadcast packets sent by devices leads to a decrease in network performance, because instead of useful operations, the switches are busy processing data addressed to everyone at once. The situation forces us to divide such large networks into autonomous subnets; as a result, the logical structures of the network are different from the physical topologies. This article discusses VLAN technology (Virtual Local Area Network - VLAN), which allows you to divide one local network into separate segments.
APA, Harvard, Vancouver, ISO, and other styles
26

Durusoy, Murat. "In-Game Photography: Creating New Realities through Video Game Photography." Membrana Journal of Photography, Vol. 3, no. 1 (2018): 42–47. http://dx.doi.org/10.47659/m4.042.art.

Full text
Abstract:
Computers and photography have had a long and complicated relationship throughout the years. As image processing and manipulation capabilities advanced on the computer front, photography reinvented itself with digital cameras and digital imaging techniques. The development of interconnected social sharing networks like Instagram and Twitter feeds photographers'/users' thirst to show off their momentary "been there/seen that – capture the moment/share the moment" instincts. One other unlikely front that emerged as the image processing power of consumer electronics improved is "video game worlds", in which telematic travellers may shoot photographs in constructed fantasy worlds as if travelling in real life. While life-like graphics manufactured by computers raise questions about the authenticity and truthfulness of the image, the possible future of photography as socially efficient visual knowledge is in constant flux. This article aims to reflect on today's trends in in-game photography and tries to foresee how this emerging genre and its constructed realities will transpose the old with the new photographic data in the post-truth condition, calling for a re-evaluation of photography's truth-value. Keywords: digital image, lens-based, photography, screenshot, video games
APA, Harvard, Vancouver, ISO, and other styles
27

Hayashi, Tatsuo, Hideyuki Ino, Shu Yamaguchi, and Tetsuro Akashi. "Method of Data Maintenance for Computer Control System in Parallel with On-line Data Processing." IEEJ Transactions on Electronics, Information and Systems 109, no. 2 (1989): 75–82. http://dx.doi.org/10.1541/ieejeiss1987.109.2_75.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Leenaerts, Domine M. W. "Data processing based on wave propagation." International Journal of Circuit Theory and Applications 27, no. 6 (1999): 633–45. http://dx.doi.org/10.1002/(sici)1097-007x(199911/12)27:6<633::aid-cta88>3.0.co;2-m.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Herrmann, K. H. "State and Future of Electronic Image Read-OUT in TEM." Proceedings, annual meeting, Electron Microscopy Society of America 48, no. 1 (1990): 112–13. http://dx.doi.org/10.1017/s0424820100179312.

Full text
Abstract:
The electron microscope is becoming a link in a highly sophisticated data processing system. The acquisition of image data supplied as a spatial distribution of current density requires a position sensitive electron detector which converts the current into digital information to be processed by image storages and computers to retrieve the information in which the user is interested. The ultimate goal of this interface is a lossless conversion with respect to both the number of pixels and the detection quantum efficiency (DQE) as well as high speed, minimum distortion, and linearity. I shall try to outline the present state of image read-out using both the conventional photoplate with spatial digitizing equipment and conventional TV sets. Subsequently it will be discussed how the future CCD technology (charge coupled device) may overcome some restrictions of present solutions. Every spatial recording device combines a conversion with a storage function in order to build up a signal-to-noise ratio (SNR) sufficient for detection. The following characteristics describe their performance:
APA, Harvard, Vancouver, ISO, and other styles
30

Kralev, Velin, Radoslava Kraleva, and Petia Koprinkova-Hristova. "Data modelling and data processing generated by human eye movements." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 5 (2021): 4345. http://dx.doi.org/10.11591/ijece.v11i5.pp4345-4352.

Full text
Abstract:
Data modeling and data processing are important activities in any scientific research. This research focuses on the modeling of data and processing of data generated by a saccadometer. The approach used is based on the relational data model, but the processing and storage of the data is done with client datasets. The experiments were performed with 26 randomly selected files from a total of 264 experimental sessions. The data from each experimental session was stored in three different formats, respectively text, binary and extensible markup language (XML) based. The results showed that the text format and the binary format were the most compact. Several actions related to data processing were analyzed. Based on the results obtained, it was found that the two fastest actions are respectively loading data from a binary file and storing data into a binary file. In contrast, the two slowest actions were storing the data in XML format and loading the data from a text file, respectively. Also, one of the time-consuming operations turned out to be the conversion of data from text format to binary format. Moreover, the time required to perform this action does not depend in proportion on the number of records processed.
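The kind of comparison reported above can be reproduced in miniature with the sketch below, which times saving and loading the same numeric records as plain text versus raw binary with NumPy; the three-column record layout is invented for illustration, and absolute timings will of course differ from the paper's measurements.

    # Small timing sketch in the spirit of the comparison above: text vs binary
    # storage of the same numeric records.  The record layout is invented.
    import time
    import numpy as np

    data = np.random.default_rng(1).normal(size=(200_000, 3))  # x, y, timestamp-like columns

    def timed(label, fn):
        start = time.perf_counter()
        fn()
        print(f"{label}: {time.perf_counter() - start:.3f} s")

    timed("save text  ", lambda: np.savetxt("session.txt", data))
    timed("save binary", lambda: data.tofile("session.bin"))
    timed("load text  ", lambda: np.loadtxt("session.txt"))
    timed("load binary", lambda: np.fromfile("session.bin").reshape(-1, 3))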
APA, Harvard, Vancouver, ISO, and other styles
31

Abushagur, Mustafa A. G. "Adaptive array radar data processing using the bimodal optical computer." Microwave and Optical Technology Letters 1, no. 7 (1988): 236–40. http://dx.doi.org/10.1002/mop.4650010704.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Pečiuliauskienė, Palmira. "Application of Statistical Methods in Education Sciences: the Didactical Principles in the Educational Means Prepared by Bronislovas Bitinas." Pedagogika 124, no. 4 (2016): 47–57. http://dx.doi.org/10.15823/p.2016.50.

Full text
Abstract:
Based on the works of professor B. Bitinas, this article analyses educational materials on applying statistical methods in the education sciences. These materials can be classified into two groups: a methodological group (the application of mathematical statistics in quantitative research) and a technological group (computer software for processing quantitative research data). The article analyses the application of didactical principles in both groups.
Didactical principles provide direction for the learning process and govern the contents, methods and organisation of learning. Different didactical principles can be used when preparing educational materials on applying statistical methods in education, such as the principle of availability and the principle of the relation between theory and practice.
The application of statistical methods in the education sciences is influenced by the technological tools used for processing statistical data. These tools changed from electronic calculating machines (ECM) to modern computers with modern statistical programs. This technological change influences the didactical principles applied when preparing educational materials on applying statistical methods in the education sciences.
The comparative analysis of the learning materials prepared by Professor B. Bitinas shows that the principle of availability was important at the time when statistical data were processed using the ECM. This principle is less important when statistical data are processed using modern statistical computer programs. The didactical principle of the relation between theory and practice was important both in the early stage (the eighth and ninth decades of the 20th century) and in the later stage (the twenty-first century).
Professor B. Bitinas prepared both methodological (the usage of mathematical statistics in quantitative research) and technological (computer software for quantitative research data processing) learning means. The didactical principles in the methodological and technological learning means are different. The principle of availability and the principle of the relation between theory and practice dominate in the methodological learning means, while the principle of systematization and continuity is important in the technological learning means.
APA, Harvard, Vancouver, ISO, and other styles
33

Zörög, Zoltán, Tamás Csomós, and Csaba Szűcs. "ERP systems in higher education." Applied Studies in Agribusiness and Commerce 6, no. 3-4 (2012): 103–9. http://dx.doi.org/10.19041/apstract/2012/3-4/14.

Full text
Abstract:
In the past few decades data processing and in-company communication have changed significantly. At first only a few computers were purchased at companies, so departments developed applications that covered corporate administration, which led to so-called isolated solutions. These days, with the spread of electronic data processing, the greatest problem for companies is not gaining information – since it can be found in all sorts of databases and data warehouses as internal or external information – but rather producing the information that is necessary in a given situation. What can help to solve this situation? It is informatics, more precisely ERP systems, which have substituted the software that provided isolated solutions at companies for decades. System-based thinking is important in their application, beside the fact that only data absolutely necessary for managerial decisions must be produced. This paper points out why we consider the practice-oriented teaching of ERP systems in higher education important.
APA, Harvard, Vancouver, ISO, and other styles
34

Mahato, Shiva Prasad. "Dynamic cluster management and resource utilization using JINI technology." Journal of Science and Engineering 5 (August 31, 2018): 7–14. http://dx.doi.org/10.3126/jsce.v5i0.22367.

Full text
Abstract:
With the commencement of the Electronic Transaction Act, Nepal has taken a further step in the field of information and communication technology. With government offices nowadays starting to use computers, many challenges lie ahead in maximizing the utilization of the computing resources offered by each computer and minimizing the overall cost. With many computers, many idle resources are being wasted unnecessarily. Jobs can be distributed to idle servers or even to idle desktops. Many of these resources remain idle outside office hours, or even during office hours when many users are utilizing the computing as well as the memory resources. The proposed model not only utilizes resources optimally but also makes the architecture more modular and adaptive, and provides dynamic failover recovery and linear scalability. This approach is useful in settings that require clusters to be set up to perform resource-intensive work such as data processing or computing tasks. This model can be realized using JINI/Java Space technology, which is open source and hence can be cost effective compared to other proprietary solutions. The motivating factor of this paper is to understand and identify the architectural constraints in the existing distributed application.
APA, Harvard, Vancouver, ISO, and other styles
35

Awogbemi, K. J., R. T. Olaniyi, and W. O. Erhun. "A Survey of Computerization of Selected Community Pharmacies in Southwestern Nigeria." Nigerian Journal of Pharmaceutical Research 16, no. 2 (2021): 185–90. http://dx.doi.org/10.4314/njpr.v16i2.9.

Full text
Abstract:
Background: The use of computers has had an impact on all professions, including pharmacy. Computers have found many applications in the management of inventories, electronic prescribing and counting machines for tablets and capsules.
Objective: This research was designed to identify the types of technology devices and programs in use by community pharmacies, the capabilities of the software in use, and the challenges faced by community pharmacists in the use of computerized systems in their premises.
Method: A cross-sectional survey of 217 community pharmacies in 6 Southwestern Nigerian states was done using a set of questionnaires. Data gathered were subjected to statistical analysis using SPSS version 17.
Results: Some community pharmacies in Southwestern Nigeria used inventory management software (47.0%). The reported capabilities of the software packages in use included sales processing (99.0%), account processing (88.2%) and POS link (62.7%). The reasons why some community pharmacies had not computerized their outlets were erratic power supply (56.2%), high cost of the devices (48.4%) and low turnover (35.9%). The major challenges faced by community pharmacists in the use of computer devices in their premises included erratic power supply (90.2%) and high cost of fuel (83.3%).
Conclusion: The level of computerization of community pharmacies in Southwestern Nigeria was observed to be generally low. Erratic power supply and the cost of devices were major challenges to the computerization of community pharmacies in Southwestern Nigeria.
Keywords: Community Pharmacy, Software, Computerization
APA, Harvard, Vancouver, ISO, and other styles
36

Gomez, A., A. Boronat, J. A. Carsi, I. Ramos, C. Taubner, and S. Eckstein. "Biological Data Processing using Model Driven Engineering." IEEE Latin America Transactions 6, no. 4 (2008): 324–31. http://dx.doi.org/10.1109/tla.2008.4815285.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Seekoli, Avinash, and Abhilasha Akkala. "RAW DATA PROCESSING WITH HQL AND PIG." Far East Journal of Electronics and Communications 20, no. 2 (2019): 87–95. http://dx.doi.org/10.17654/ec020020087.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Konsa, Kurmo. "Automaatse infotöötluse algus: perfokaartidel põhinevad infosüsteemid [Abstract: The beginning of automatic information processing: information systems on punch cards]." Ajalooline Ajakiri. The Estonian Historical Journal 167, no. 1 (2019): 119–48. http://dx.doi.org/10.12697/aa.2019.1.04.

Full text
Abstract:
Abstract: The beginning of automatic information processing: information systems on punch cards
Toward the end of the 19th century, several different national bureaucratic institutions and private enterprises had evolved and expanded to such an extent that new information management tools and methods were necessary. The electromechanical data-processing system developed by Hermann Hollerith in the 1880s, which operated on the basis of punch cards, can be regarded as the predecessor of modern automatic digital data-processing systems. Information storage and data processing systems were accomplished through punch cards, i.e. carton cards in a standardised form. Information was punched onto the cards by perforating in fixed positions. Hollerith invented a number of electromechanical devices that could punch information on cards and process the cards that were carrying information. The sorter machine made it possible to sort cards by the perforated marker in a column, and the tabulator enabled counting and adding up cards.
This article treats the development of punch card-based systems by demonstrating the primary modifications and correlating them with the purposes of such systems. In order to do this, I will divide the punch card-based systems into five generations. The system was created in the United States in the 1880s to process large volumes of statistical data that had been recorded in population censuses. Several countries applied this system in everyday practice up until the beginning of the 20th century. In 1894, information systems capable of processing statistical data evolved out of this development. These systems were still in use even after the end of World War II. Solutions that facilitated bookkeeping were developed by 1906; such systems served until the 1960s, in some places even longer. Population registers based on punch cards were elaborated between 1935 and 1937, and were used by various countries until the 1960s. Upon the introduction of electronic computers after World War II, punch cards were used to enter data and programs into the computers.
Punch card-based systems were the first automatic systems that were able to process large quantities of data. They were the most complex information systems from the end of the 19th century until the end of World War II, offering the most multifarious options. After the introduction of electronic computers, the application of such systems was consistently scaled back, but they were still widely in use until the mid-20th century.
The experience and technical knowledge gained while applying punch card-based information systems laid the groundwork for further digital developments in computer systems. As both of these systems were used in parallel over a considerably long period of time, knowledge was shared between them. To represent data units on punch cards, data had to be encoded. This shared knowledge resulted in significant gains in the processing of information, specifically the division of data into discrete, distinctly specified units to mechanically process information, and the representation of data units on punch cards via encoding.
The codes of the early punch card-based systems were case based, and had been elaborated according to the data to be analysed, and to the purposes of processing. Further developments made these encoding systems more universal, so that they came to be used as standards. Punch cards were the first data carriers that could be read by machines. Corresponding devices were employed to punch, read and process information on cards. The initial versions of these systems were easily read by the naked eye, but later the systems evolved to become completely number based. The constant increase in the amount of data recorded on the cards was the result of more complex tasks, as well as the growing abundance of calculation options on the tabulators themselves. Programming with changeable setups, which characterised the first generations of programming, was followed by conditional programming with punch cards themselves. Increased processing speed was one of the milestones in the development of punch card-based systems. The systems were characterised by a vast universality. Data, once recorded on the cards, was available for repeated analysis, regardless of the objective. Punch cards were the first databases that could be processed automatically. Nonetheless, operating mechanical information systems required a firmer organisation and standardisation of the working process.
APA, Harvard, Vancouver, ISO, and other styles
39

Zhou, Yang, Ye Li, and Wei Bo Li. "Research and Design of Networked Virtual Instrument Electronic Test System." Applied Mechanics and Materials 241-244 (December 2012): 66–69. http://dx.doi.org/10.4028/www.scientific.net/amm.241-244.66.

Full text
Abstract:
A novel virtual instrument, operating on different computers, peripherals, measured nodes and databases connected to the network and based on network and database technology, is proposed to achieve resource sharing by means of the virtual instrument electronic test system (VIETS) developed by Jilin University. Users can not only remotely control instrument measurement, data collection, analysis and processing, but also conveniently monitor the test site, which lays the foundation for setting up a networked virtual electronic measurement laboratory. A mixed mode of C/S (client/server) and B/S (browser/server) is adopted, and a system structure comprising user clients, the internet, servers, client devices, and measurement and control equipment is presented. The network functions were implemented using LabVIEW, MyEclipse, MySQL and other technologies to accomplish the database design, the server management website design and the communication. Thus users can remotely monitor and control the test instruments through the network functions of VIETS.
APA, Harvard, Vancouver, ISO, and other styles
40

Loneli Costaner, Guntoro, and Yuhelmi. "Penerapan Sistem Sirkulasi Perpustakaan Berbasis Slims Pada SMA IT Al Fityah Pekanbaru." Dinamisia : Jurnal Pengabdian Kepada Masyarakat 4, no. 2 (2020): 268–74. http://dx.doi.org/10.31849/dinamisia.v4i2.3926.

Full text
Abstract:
In the world of education there are now electronic-based learning systems that can be used for distance learning. The change from manual to electronic learning, in particular via the internet, indicates a global shift that makes it easier for people to interact without having to meet. In the field of computing, many organisations in business, the market and the world of education, including schools, now use automation-based information systems, with the aim of facilitating data processing so that information can be obtained quickly and accurately. One way to provide good, fast and safe service is a trusted and proven system that supports librarians: the open source SLIMS application. SLIMS is a library application that is close to complete for organising the data and information needed.
APA, Harvard, Vancouver, ISO, and other styles
41

Perdigon-Lagunes, Pedro, Octavio Estevez, Cristina Zorrilla, Jorge A. Ascencio, and Raul Herrera-Becerra. "Structural and Electronic Characterization Through Spectroscopy Analysis of Gd-Gd2O3 Nanoparticles." Journal of Nanoscience and Nanotechnology 19, no. 11 (2019): 7345–55. http://dx.doi.org/10.1166/jnn.2019.16604.

Full text
Abstract:
Motivated by the growing need to dispose of the heavy metals (and, even more so, the rare earths) contained in many electronic components such as mobile phones and computers, this work reports the use of physical processes to recover elements from electronic waste. Recycling and processing such residues to generate nanomaterials is an attractive option for both economic and ecological reasons. In this work, the synthesis of Gd–Gd2O3 nanoparticles by a sonochemical method at room temperature, assisted by tannic acid as the reducing agent, is reported. The samples were structurally characterised with complementary techniques such as transmission electron microscopy, X-ray diffraction, and Raman and infrared spectroscopies. The results obtained are related to self-assembly mechanisms. As a complement, ultraviolet-visible absorbance data were used to determine electronic properties such as the band structure. This work therefore demonstrates the possibility of synthesising lanthanide nanoparticles without burning the samples.
APA, Harvard, Vancouver, ISO, and other styles
42

Dudeck, J., G. Junghans, K. Marquardt, P. Sebald, A. Michel, and H. U. Prokosch. "WING – Entering a New Phase of Electronic Data Processing at the Gießen University Hospital." Methods of Information in Medicine 30, no. 04 (1991): 289–98. http://dx.doi.org/10.1055/s-0038-1634851.

Full text
Abstract:
At the Gießen University Hospital electronic data processing systems have been in routine use since 1975. In the early years developments were focused on ADT functions (admission/discharge/transfer) and laboratory systems. In the next decade additional systems were introduced supporting various functional departments. In the mid-eighties the need to stop the ongoing trend towards more and more separated standalone systems was realized and it was decided to launch a strategic evaluation and planning process which sets the foundation for an integrated hospital information system (HIS). The evaluation of the HELP system for its portability into the German hospital environment was the first step in this process. Despite its recognized capabilities in integrating decision support and communication technologies, and its powerful HIS development tools, the large differences between American and German hospital organization, influencing all existing HELP applications, and the incompatibility of the HELP tools with modern software standards were two important factors forcing the investigation of alternative solutions. With the HELP experience in mind, a HIS concept for the Gießen University Hospital was developed. This new concept centers on the idea of a centralized relational patient database on a highly reliable database server, and clinical front-end applications which might be running on various other computer systems (mainframes, departmental UNIX satellites or PCs in a LAN) integrated into a comprehensive open HIS network. The first step towards this integrated approach was performed with the implementation of ADT and results reporting functions on care units.
APA, Harvard, Vancouver, ISO, and other styles
43

Tomesh, Teague, Pranav Gokhale, Eric R. Anschuetz, and Frederic T. Chong. "Coreset Clustering on Small Quantum Computers." Electronics 10, no. 14 (2021): 1690. http://dx.doi.org/10.3390/electronics10141690.

Full text
Abstract:
Many quantum algorithms for machine learning require access to classical data in superposition. However, for many natural data sets and algorithms, the overhead required to load the data set in superposition can erase any potential quantum speedup over classical algorithms. Recent work by Harrow introduces a new paradigm in hybrid quantum-classical computing to address this issue, relying on coresets to minimize the data loading overhead of quantum algorithms. We investigated using this paradigm to perform k-means clustering on near-term quantum computers, by casting it as a QAOA optimization instance over a small coreset. We used numerical simulations to compare the performance of this approach to classical k-means clustering. We were able to find data sets with which coresets work well relative to random sampling and where QAOA could potentially outperform standard k-means on a coreset. However, finding data sets where both coresets and QAOA work well—which is necessary for a quantum advantage over k-means on the entire data set—appears to be challenging.
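The approach summarised above pairs a classical coreset construction with a quantum (QAOA) optimiser. Purely as an illustration of the classical half of that pipeline, and not as code from the cited paper, a weighted k-means run on a small coreset (points plus weights, assumed to have been built elsewhere) might look like the following Python sketch:

import numpy as np

def weighted_kmeans(points, weights, k, iters=50, seed=0):
    # Lloyd's algorithm run on a weighted coreset instead of the full data set.
    # points  : (m, d) array of coreset points
    # weights : (m,)   array of coreset weights
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)].copy()
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # Assign every coreset point to its nearest centre.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centre to the weighted mean of its assigned points.
        for j in range(k):
            mask = labels == j
            if mask.any():
                w = weights[mask]
                centers[j] = (w[:, None] * points[mask]).sum(axis=0) / w.sum()
    return centers, labels

# Toy usage on a hypothetical six-point coreset with two clusters.
pts = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                [5.0, 5.0], [5.1, 4.9], [4.9, 5.2]])
wts = np.array([10.0, 3.0, 3.0, 12.0, 2.0, 2.0])
centers, labels = weighted_kmeans(pts, wts, k=2)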
APA, Harvard, Vancouver, ISO, and other styles
44

Krippner, G. "Electronic Roundwood Measurement and Connected Data Processing Computer as a Basic System for Automatic Log Yards." IFAC Proceedings Volumes 19, no. 12 (1986): 71–73. http://dx.doi.org/10.1016/s1474-6670(17)59591-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Kuba, A., A. Makay, E. Máté, and L. Csernay. "Data processing system for nuclear medicine images." International Journal of Imaging Systems and Technology 4, no. 1 (1992): 51–61. http://dx.doi.org/10.1002/ima.1850040111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

McGuigan, Lee. "Automating the audience commodity: The unacknowledged ancestry of programmatic advertising." New Media & Society 21, no. 11-12 (2019): 2366–85. http://dx.doi.org/10.1177/1461444819846449.

Full text
Abstract:
Programmatic advertising describes techniques for automating and optimizing transactions in the audience marketplace. Facilitating real-time bidding for audience impressions and personalized targeting, programmatic technologies are at the leading edge of digital, data-driven advertising. But almost no research considers programmatic advertising within a general history of information technology in commercial media industries. The computerization of advertising and media buying remains curiously unexamined. Using archival sources, this study situates programmatic advertising within a longer trajectory, focusing on the incorporation of electronic data processing into the spot television business, starting in the 1950s. The article makes three contributions: it illustrates that (1) demands for information, data processing, and rapid communications have long been central to advertising and media buying; (2) automation “ad tech” developed gradually through efforts to coordinate and accelerate transactions; and (3) the use of computers to increase efficiency and approach mathematical optimization reformatted calculative resources for media and marketing decisions.
APA, Harvard, Vancouver, ISO, and other styles
47

Mishra, Harshita, and Anuradha Misra. "Techniques for Image Segmentation: A Critical Review." International Journal of Research in Advent Technology 9, no. 3 (2021): 1–4. http://dx.doi.org/10.32622/ijrat.93202101.

Full text
Abstract:
In today's world there is a need for techniques and methods that help retrieve information from images, since such information is important for solving present-day problems. This review studies the processing involved in the digitisation of an image. An image is an array of pixels (picture elements) arranged in a matrix of rows and columns. The image undergoes digitisation, by which a digital image is formed; working with images in this form is called digital image processing (DIP), and electronic devices such as computers are used to carry it out. Various techniques are used in the image segmentation process. This review also examines the role of data mining in extracting information from images. Data mining is the process of identifying patterns in large stored data sets with the help of statistical and mathematical algorithms, and pixel-wise classification for image segmentation makes use of data mining techniques.
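Purely as an illustration of the pixel-wise classification idea mentioned in this abstract, and not as code from the review itself, a toy segmentation that clusters grey-level intensities with a one-dimensional k-means might look like the following Python sketch:

import numpy as np

def segment_by_intensity(image, k=2, iters=20):
    # Toy pixel-wise classification: 1-D k-means on grey-level intensities.
    pixels = image.astype(float).ravel()
    centres = np.linspace(pixels.min(), pixels.max(), k)  # initial cluster centres
    labels = np.zeros(pixels.shape, dtype=int)
    for _ in range(iters):
        # Label each pixel with the index of the nearest intensity centre.
        labels = np.abs(pixels[:, None] - centres[None, :]).argmin(axis=1)
        # Update each centre to the mean intensity of its assigned pixels.
        for j in range(k):
            if (labels == j).any():
                centres[j] = pixels[labels == j].mean()
    return labels.reshape(image.shape)

# Toy usage: a synthetic 4x4 grey image with a bright region.
img = np.array([[10, 12, 11, 200],
                [13, 10, 210, 205],
                [11, 198, 202, 207],
                [9, 12, 10, 11]])
mask = segment_by_intensity(img, k=2)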
APA, Harvard, Vancouver, ISO, and other styles
48

Rez, Peter, and W. J. de Ruijter. "The Electron Microscope and the Computer: Present Trends and Future Prospects." Proceedings, annual meeting, Electron Microscopy Society of America 48, no. 1 (1990): 116–17. http://dx.doi.org/10.1017/s0424820100179336.

Full text
Abstract:
In the present generation of electron microscopes the roles of computers or microprocessors can be divided into control, acquisition and analysis of both spectral and image data. Not much, however, has been done to realise the full power of computer based systems and integrate all these functions with the electron optics. The control of practically all microscope columns is performed by an 8-bit or 16-bit processor usually running a program that repeatedly scans for user inputs and changes lens currents or alignment settings if necessary. External control for special experiments can either be implemented by scanning an additional user input from a serial port or by allowing external analog signals to replace those generated by the microscope control scheme. As the program loop typically takes 0.1 sec to complete it is preferable to implement some functions such as external beam scanning by providing analog ramps (even if generated by another computer). Computer acquisition of data was introduced to electron microscopy with analytical techniques, such as EDX, in which the computer was the basis of a multichannel analyser. In the case of energy loss and Auger spectroscopy, computer scanning of the spectrometer and acquisition of single electron pulse-counted data quickly displaced chart recorders as a means of collecting data. A computer based system not only could perform acquisition more efficiently, it could also provide a convenient means for processing the results and doing quantitative analysis. Furthermore, digitally stored data could easily be transferred to other systems on disks or by direct link (such as ethernet) and analysed elsewhere. For image acquisition, however, there has been very little use of computers. Although microscopists are happy to consider a spectrum as an array of numbers, they still prefer to deal with images as pictures rather than digital data sets. The problems are not entirely psychological, since a major barrier to the widespread use of image processing in microscopy is the lack of a suitable detection system. TV cameras compare unfavourably with photographic plates in terms of both dynamic range and "resolution" as defined loosely in terms of pixel size or lines/mm. This argument does not apply to scanning microscopes but, even in scanning systems, frame buffers and powerful computer systems have only been integrated as part of the microscope electronics in the last few years. Microscopists still prefer to measure quantities from exposed photographs rather than work with digitized data in a workstation environment using high level image processing software. Until recently cost may have been a consideration, but now computing platforms of sufficient capability are less than 1/5 of the cost of an average SEM.
APA, Harvard, Vancouver, ISO, and other styles
49

Ono, Kiyonobu, Makoto Maruya, Takashi Fujimura, Tsunekazu Kimura, and Minoru Murata. "3-2. Satellite Image Data Processing Techniques." Journal of The Institute of Image Information and Television Engineers 66, no. 6 (2012): 465–69. http://dx.doi.org/10.3169/itej.66.465.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Henrich-Franke, Christian. "Innovationsmotor Medientechnik – Von der Schreibmaschine zur «Mittleren Datentechnik» bei der Siemag Feinmechanische Werke (1950 bis 1969)." Zeitschrift für Unternehmensgeschichte 66, no. 1 (2021): 93–117. http://dx.doi.org/10.1515/zug-2020-0023.

Full text
Abstract:
The second half of the 20th century is commonly considered to be a time in which German companies lost their innovative strength, while promising new technologies presented an enormous potential for innovation in the US. The fact that German companies were quite successful in the production of medium data technology and had considerable influence on the development of electronic data processing was neglected by business and media historians alike until now. The article analyses the Siemag Feinmechanische Werke (Eiserfeld) as one of the most important producers of the predecessors to said medium data technologies in the 1950s and 1960s. Two transformation processes regarding the media – from mechanical to semiconductor and from semiconductor to all-electronic technology – are highlighted in particular. It poses the question of how and why a mid-sized family enterprise such as Siemag was able to rise to being the leading provider of medium data processing office computers despite lacking expertise in the field of electrical engineering while also facing difficult location conditions. The article shows that Siemag successfully turned from its roots in heavy industry towards the production of innovative high technology devices. This development stems from the company's strategic decisions. As long as their products were not mass-produced, a medium-sized family business like Siemag could hold its own on the market through clever decision-making which relied on flexible specialization, targeted license and patent cooperation as well as innovative products, even in the face of adverse conditions. Only in the second half of the 1960s, as profit margins dropped due to increasing sales figures and office machines had finally transformed into office computers, was Siemag forced to enter into cooperation with Philips in order to broaden its spectrum and merge the production site in Eiserfeld into a larger business complex.
APA, Harvard, Vancouver, ISO, and other styles