To see the other types of publications on this topic, follow the link: PC-File[plus] (Computer program).

Journal articles on the topic 'PC-File[plus] (Computer program)'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 28 journal articles for your research on the topic 'PC-File[plus] (Computer program).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Klimasara, Anthony J. "Automated Quantitative XRF Analysis Software in Quality Control Applications." Advances in X-ray Analysis 35, A (1991): 111–16. http://dx.doi.org/10.1154/s0376030800008739.

Full text
Abstract:
A menu-driven IBM-PC/MS-DOS software system for quantitative XRF analysis was designed combining elements of pattern recognition (spectral "fingerprinting") with mathematical correction models. The following topics are covered: the pattern recognition method and its N-dimensional representation; local empirical Lachance-Traill or Lucas-Tooth & Price models; an EGA/VGA intelligent driver for the calibration display; dBASE III Plus data files accessed from dBXL/Quicksilver, dBASE III Plus or from compiled BASIC equipped with call subroutines; and examples of applications (stainless steels and ceramics). In this approach, experimental alphas are computed utilizing the closest automatically preselected standards from available data files. Alpha values generated in this fashion we term "local alphas". An intelligent software program is employed which automatically recognizes the resolution of the CRT display and delivers the best possible display for the available equipment. This new design also permits the utilization of dBXL/Quicksilver, dBASE III Plus or compiled BASIC resources for additional on-site system customization programming. A dBASE III Plus file format is used for XRF data storage. This permits fast data exchange with a local ASCII database and also with all popular spreadsheet formats. Additionally, there are utility subroutines available which allow direct and rapid manipulation of dBASE III Plus files in compiled BASIC. Applications of this software system and graphics to stainless steels and engineered ceramic materials are demonstrated. RIGAKU Smax/PDP-11/73 and TRACOR X-RAY SPECTRACE 5000/Compaq 386 spectrometers are used in these studies.
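For readers unfamiliar with the correction model named above, the Lachance-Traill equation relates each element's concentration to its measured relative intensity through empirical influence coefficients, C_i = R_i(1 + Σ_j α_ij C_j). The following Python sketch is only an illustration of that fixed-point calculation with invented intensities and alphas; it is not the dBASE/BASIC code described in the paper.

```python
# Illustrative sketch of a Lachance-Traill matrix correction (not from the paper).
# C_i = R_i * (1 + sum_j alpha_ij * C_j), solved by fixed-point iteration.
# The relative intensities and alpha coefficients below are invented placeholders.

def lachance_traill(R, alphas, iterations=50):
    """R: dict of element -> relative intensity; alphas: dict of (i, j) -> alpha_ij."""
    C = dict(R)  # initial guess: concentrations equal to relative intensities
    for _ in range(iterations):
        C = {
            i: R[i] * (1.0 + sum(alphas.get((i, j), 0.0) * C[j] for j in C if j != i))
            for i in C
        }
    return C

if __name__ == "__main__":
    R = {"Fe": 0.62, "Cr": 0.17, "Ni": 0.09}           # hypothetical measured ratios
    alphas = {("Fe", "Cr"): 0.12, ("Fe", "Ni"): 0.05,  # hypothetical "local alphas"
              ("Cr", "Fe"): -0.08, ("Cr", "Ni"): 0.03,
              ("Ni", "Fe"): -0.02, ("Ni", "Cr"): 0.01}
    print(lachance_traill(R, alphas))
```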
APA, Harvard, Vancouver, ISO, and other styles
2

Siek, Theodore J., Corey W. Stradling, Marc W. McCain, and Timothy C. Mehary. "Computer-aided identifications of thin-layer chromatographic patterns in broad-spectrum drug screening." Clinical Chemistry 43, no. 4 (April 1, 1997): 619–26. http://dx.doi.org/10.1093/clinchem/43.4.619.

Full text
Abstract:
We have developed a systematic thin-layer chromatographic (TLC) technique for detecting and identifying drugs and drug metabolites on 10-cm-long silica-gel plates with organic binder (fluorescent indicator); a computer program (SPOT CHEK) assists in matching the data from a particular chromatogram with those obtained for known drugs recovered from serum, urine, or other specimens. The plates are developed in a single mobile phase. Visualization and detection reagents used to characterize unknowns include fluorescamine, ferric chloride/perchloric acid/nitric acid, Dragendorff, Marquis, Mandelin, and iodinated Dragendorff solutions, 254 nm ultraviolet light, and vapor from chlorine or hydrochloric acid. Detection limits of 5–200 ng per sample spot were obtained for drugs in the database. The computer program database is based on nine reaction responses plus the plate zone locations for 243 drug substances but requires entry of only one TLC property to generate a matching list. We ran the program with an IBM-compatible 386/486 PC using an MS-DOS operating system (version 6.2).
APA, Harvard, Vancouver, ISO, and other styles
3

Ludlow, Barbara L., and Michael C. Duff. "Live Broadcasting Online: Interactive Training for Rural Special Educators." Rural Special Education Quarterly 21, no. 4 (December 2002): 26–30. http://dx.doi.org/10.1177/875687050202100405.

Full text
Abstract:
This paper describes how to use a desktop computer and inexpensive software plus a PC or Macintosh streaming server to deliver live interactive class sessions via video with audio streaming on the Internet. Although the use of Web-based instruction for preservice and inservice program delivery in special education and disability services is expanding rapidly, most existing programs rely primarily on text presentation and asynchronous (delayed time) technologies such as threaded discussions. Relatively little use has been made to date of the Web's multimedia capabilities or synchronous (real time) technologies such as audio- or video-conferencing. The use of webcasting technology (both simulcasts in real time and re-broadcasts on demand) represents a fairly inexpensive, simple to use mechanism for delivering personnel preparation programs for practitioners working in early intervention, special education, or adult disability services in rural areas without the need for high bandwidth connections. The distance education program in Severe/Multiple Disabilities and Early Intervention Special Education at West Virginia University has successfully utilized webcasting technology to deliver a graduate certification and degree program to practicing but uncertified special educators working in rural areas of the United States as well as in several other locations around the world.
APA, Harvard, Vancouver, ISO, and other styles
4

Karanović, Lj, and D. Poleti. "A FORTRAN Program for Conversion of PC-APD Data Files into ASCII Files." Powder Diffraction 7, no. 3 (September 1992): 179. http://dx.doi.org/10.1017/s0885715600018595.

Full text
Abstract:
Recently, Dahan and co-workers (Dahan, 1991) suggested processing XRD data by spreadsheet computer programs. Treated in this manner, the XRD data become very flexible, and comparison with other data sets, as well as graphical presentation, is made much easier. In this note a simple FORTRAN 77 program for conversion of PC-APD data files into ASCII files suitable for import into spreadsheets is reported. In our laboratory XRD data are collected on a Philips 1710 diffractometer operated by PC-APD version 2.0 (PC-APD Software, 1989). Each experiment usually generates files containing the collected raw intensity data (.RD file), background data (.BK file) and a file with peak positions and their intensities (.DI file). The XRD data can be further processed: after smoothing, data are stored in files with the extension .SM (.SM file) and, after Kα2 stripping, in files with the extension .A2 (.A2 file). All files are stored in binary format.
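The abstract does not give the PC-APD binary record layout, so any conversion code here can only be a generic illustration. Assuming, purely hypothetically, a header of two little-endian floats (starting 2θ and step size) followed by 4-byte float intensities, a Python equivalent of such a binary-to-ASCII converter might look like this (the cited program itself is FORTRAN 77).

```python
# Illustrative only: the real PC-APD record layout is NOT specified in the abstract.
# This sketch assumes a hypothetical header of two little-endian floats
# (start 2-theta, step size) followed by 4-byte float intensities.
import struct

def convert_to_ascii(binary_path, ascii_path):
    with open(binary_path, "rb") as f:
        raw = f.read()
    start, step = struct.unpack_from("<2f", raw, 0)      # hypothetical header
    n = (len(raw) - 8) // 4
    intensities = struct.unpack_from(f"<{n}f", raw, 8)   # hypothetical data block
    with open(ascii_path, "w") as out:
        for k, counts in enumerate(intensities):
            out.write(f"{start + k * step:10.4f} {counts:12.2f}\n")  # 2-theta, counts

# convert_to_ascii("SAMPLE.RD", "SAMPLE.TXT")
```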
APA, Harvard, Vancouver, ISO, and other styles
5

Colijn, H. O. "Spectraplot: A PC-based spectrum translation and display program." Proceedings, annual meeting, Electron Microscopy Society of America 46 (1988): 918–19. http://dx.doi.org/10.1017/s042482010010665x.

Full text
Abstract:
Many labs today wish to transfer data between their EDS systems and their existing PCs and minicomputers. Our lab has implemented SpectraPlot, a low-cost PC-based system to allow offline examination and plotting of spectra. We adopted this system in order to make more efficient use of our microscopes and EDS consoles, to provide hardcopy output for an older EDS system, and to allow students to access their data after leaving the university. As shown in Fig. 1, we have three EDS systems (one of which is located in another building) which can store data on 8-inch RT-11 floppy disks. We transfer data from these systems to a DEC MINC computer using "SneakerNet", which consists of putting on a pair of sneakers and running down the hall. We then use the Kermit file transfer program to download the data files with error checking from the MINC to the PC.
APA, Harvard, Vancouver, ISO, and other styles
6

Kingma, Johannes, Elisabeth Tenvergert, Hinke Anja Werkman, Henk Jan Ten Duis, and Henk J. Klasen. "A Turbo Pascal Program to Convert ICD-9CM Coded Injury Diagnoses into Injury Severity Scores: ICDTOAIS." Perceptual and Motor Skills 78, no. 3 (June 1994): 915–30. http://dx.doi.org/10.1177/003151259407800346.

Full text
Abstract:
Diagnoses of injuries as a result of trauma are commonly coded by means of the International Classification of Diseases (9th rev.) Clinical Modification (ICD-9CM). The Abbreviated Injury Scale (AIS) is frequently employed to assess the severity of injury per body region. The Injury Severity Score (ISS) is an over-all index or summary of the severity of injury. To compute one of these two types of scores the entire medical record of each patient must be examined. The program ICDTOAIS replaces the manual coding or translation between the two scores. The program converts the ICD-9CM coded diagnoses into AIS and ISS scores. The program also computes the maximum AIS (MAXAIS) per body region, enabling the researcher to assess the relative impact of the severity of trauma of different body regions in both morbidity and mortality studies. The program locates invalid ICD-9CM rubrics in the data file. ICDTOAIS may be employed as a program alone or as a procedure in database management systems (e.g., DBase III plus, DBase IV, or the different versions of FOXPRO). The program is written in Turbo Pascal, Version 6.
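The ISS arithmetic itself is simple: take the highest AIS score in each of the six ISS body regions, square the three largest, and sum them, with any AIS of 6 conventionally forcing an ISS of 75. A minimal Python sketch of that final step is shown below; it assumes the ICD-9CM-to-AIS mapping has already been performed and is not the ICDTOAIS Turbo Pascal code.

```python
# Minimal sketch of the ISS arithmetic (not the Turbo Pascal ICDTOAIS program).
# Input: maximum AIS score per body region (the six ISS body regions).

def injury_severity_score(max_ais_by_region):
    scores = list(max_ais_by_region.values())
    if any(s == 6 for s in scores):        # an unsurvivable injury fixes ISS at 75
        return 75
    top_three = sorted(scores, reverse=True)[:3]
    return sum(s * s for s in top_three)

example = {"head_neck": 3, "face": 1, "chest": 4, "abdomen": 2,
           "extremities": 3, "external": 1}    # hypothetical MAXAIS values
print(injury_severity_score(example))           # 4^2 + 3^2 + 3^2 = 34
```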
APA, Harvard, Vancouver, ISO, and other styles
7

Hollebrands, Karen Flanagan, and Hollylynne Stohl. "Technology Tips: January 2004." Mathematics Teacher 97, no. 1 (January 2004): 68–72. http://dx.doi.org/10.5951/mt.97.1.0068.

Full text
Abstract:
This month, we provide an example of a rich mathematical task that leads to many different connections. The task was posed to a class of high school seniors who were using a dynamic program for geometry called Cabri Geometry II. This tip includes directions for creating this problem with technology and suggestions for exploring it. The Cabri II software is available for Macintosh and PC computers from www.cabrilog.com/en or education.ti.com. It is also available for several different Texas Instruments calculators (TI-83 Plus, TI-83 Plus Silver, Voyage 200, TI-89, and TI-92 Plus). The program is similar to The Geometer's Sketchpad, and users who are familiar with The Geometer's Sketchpad should be able to easily adapt this task to use with it.
APA, Harvard, Vancouver, ISO, and other styles
8

TAHAT, AMANI N., WA'EL SALAH, and AWNI B. HALLAK. "PASS (PIXE ANALYSIS SHELL SOFTWARE): A COMPUTER UTILITY PROGRAM FOR THE EVALUATION OF PIXE SPECTRA." International Journal of PIXE 20, no. 03n04 (January 2010): 63–76. http://dx.doi.org/10.1142/s0129083510001987.

Full text
Abstract:
This paper describes a shell which facilitates the use of the existing PIXE analysis software package PIXAN. In this work, we designed, wrote and tested a utility program, called WPASS, with which several PIXE spectra were examined. The WPASS program merely links PIXAN modules and makes their use more convenient than before. WPASS automatically handles the PEAKFIT (BATTY) and THICK programs. It outputs the results into several files belonging to the same data file. These include converting data files from one-column format (OCF) to PIXANPC format; control, graphics, and result files from PEAKFIT; control and result files from THICK; and options for graphically plotting the results on the PC and converting the graphics files into their components for publication of the results. WPASS has new features that take secondary interelement fluorescence into account. WPASS has been used successfully for the analysis of PIXE spectra and for inner-shell ionization studies.
APA, Harvard, Vancouver, ISO, and other styles
9

HAYAMI, Ken-ichi. "Development of a File Conversion Program for Chemical Structure Data: MOLCONV. Transplantation of the MOLCONV Program from PC-9801 to the IBM Personal Computer." Journal of Chemical Software 4, no. 3 (1998): 119–26. http://dx.doi.org/10.2477/jchemsoft.4.119.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Caussin, P., J. Nusinovici, and D. W. Beard. "Using Digitised X-Ray Powder Diffraction Scans as Input for a New Pc-At Search/Match Program." Advances in X-ray Analysis 31 (1987): 423–30. http://dx.doi.org/10.1154/s0376030800022254.

Full text
Abstract:
A Search/Match program has been written for the IBM PC AT computer that is capable of using background-subtracted, digitized X-ray powder diffraction scans as input in addition to the d/I data traditionally used. This novel procedure has proved especially effective when numerous unresolved lines are present in the pattern. The method is also less demanding of data quality than the peak location programs. The program may be extended to searching a database of digitized standard patterns. The program has several parameters that can be adjusted, including chemistry. The results from the Johnson/Vand list type of output are directly accessible to the interactive graphics program. This gives the diffractionist a fast method for verifying the phase identification. Because of the speed of fixed-point computation techniques, the 52,791-pattern file can be scanned in about 90 seconds. This paper will illustrate the utility of the program.
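The abstract does not disclose the matching algorithm beyond its use of fixed-point arithmetic, so the following Python/NumPy sketch is only a generic illustration of how a digitized, background-subtracted scan could be scored against stored reference patterns with a normalized correlation; the in-memory `library` dictionary is a stand-in for the 52,791-pattern file, not the program's actual data structure.

```python
# Generic illustration of scoring digitized diffraction scans against reference
# patterns by normalized correlation; not the algorithm of the cited program.
import numpy as np

def match_score(observed, reference):
    """Both arrays sampled on the same 2-theta grid, background already subtracted."""
    o = (observed - observed.mean()) / (observed.std() + 1e-12)
    r = (reference - reference.mean()) / (reference.std() + 1e-12)
    return float(np.dot(o, r) / len(o))    # 1.0 = identical profile shape

def rank_candidates(observed, library):
    """library: dict of phase name -> reference scan (hypothetical in-memory database)."""
    return sorted(((match_score(observed, ref), name) for name, ref in library.items()),
                  reverse=True)
```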
APA, Harvard, Vancouver, ISO, and other styles
11

Zhang, Xiao Yan, and Guo Liang Fan. "Design of CAN Central Control Modular Based on USB." Advanced Materials Research 490-495 (March 2012): 990–93. http://dx.doi.org/10.4028/www.scientific.net/amr.490-495.990.

Full text
Abstract:
The central control module of a liquid-level data acquisition and transmission system based on the CAN bus is introduced. The module is controlled by a PC, which transmits and receives data to CAN units through the USB port via a USB2.0-CAN high-speed adapter card designed around the CP2102 and SJA1000; the CAN bus controller SJA1000 and the CP2102 are controlled by the single-chip computer (SCC) AT89C51. The hardware interface circuits are introduced, and the software flowchart and PC application program are also designed. The results show that the system has the advantages of strong real-time performance, high speed, high reliability, easy extension of communication devices, Plug-and-Play support, and automatic configuration. Its communication quality is better than that of traditional interfaces such as RS232 and RS485, and it can be used for real-time monitoring of multiple industrial devices.
APA, Harvard, Vancouver, ISO, and other styles
12

Fuentes-Porto, Alba, Carlos García-Ávila, and Efraín Marrero-Salas. "Casa del Samarín, una estación de grabados rupestres en deterioro. Documentación, análisis y diagnóstico en Los Llanos de Ifara, Granadilla, Tenerife." Virtual Archaeology Review 12, no. 24 (January 19, 2021): 99. http://dx.doi.org/10.4995/var.2021.13810.

Full text
Abstract:
<p class="VARAbstract"><strong>Extended Abstract:</strong></p><p class="VARAbstract">In the archaeology of the Canary Islands (Spain), there are many studies based on the usage of new technologies to contribute to the identification and description of rock art engravings through high-resolution digital models (<a href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#Mart%C3%ADn_2005">Martín, 2005</a>; <a href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#Mart%C3%ADn_2007">Martín, Velasco, González &amp; Ramírez, 2007;</a> <a href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#Senen_2016">Senén &amp; Cuenca, 2016</a>; <a href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#Navarro_2019">Navarro &amp; Cancel, 2019</a>). This paper is supported by these documentation techniques and digital analysis in order to deepen into the characterization of the damaged rock art station Casa del Samarín (House of Samarín), or Tagoro del Rey, in Los Llanos de Ifara, south of the island of Tenerife (Figs. 1). Twenty-one panels conserved in situ were documented (Fig. 6). Geometric-linear, geometric with an oval and rectangular trend and figurative ones can be distinguished. The blocks <a title="" href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#_ftn1">[1]</a> that compose the engravings station belong to a rocky basalt outcrop, to which other free-standing blocks are attached, forming a circle. The shape that describes this set of blocks is defined as a "cabin" or circular-shaped structure.</p><p class="VARAbstract">This set of engravings, made on a basalt rocky outcrop with a planar factory, show a tendency to suffer from exfoliation and are affected by internal stresses. The intrinsic characteristics of this stone support, together with their exposure to anthropic actions and strong insolation, condition its fragility, with the risk of losing part of the representations that it houses. Given the threat posed by its gradual deterioration, we seek to ensure its digital preservation through precision three-dimensional (3D) records, the engravings inventory, the record of their conservation state and the understanding of the degradation processes that are affecting the outcrop. What has been explained will be addressed quarterly, to observe the evolution of any material changes every three months.</p><p class="VARAbstract">The registration work consisted of taking four photogrammetric surveys in eight months; the surveys were georeferenced by means of a centimetric Global Navigation Satellite System (GNSS) and a total station. Structure from Motion (SfM) technology enabled the researchers to generate high-precision 3D models in an affordable way, not only in terms of cost but also ease of use. Digital copies with Geographic Information System GIS technology were extracted from them, being exportable in shapefile format (Fig. 
7).</p><p class="VARAbstract">As regards the documentation of existing pathologies, assuming standardized lexicon and classification criteria (<a href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#IPCE_2013">IPCE, 2013</a>), together with a rigorous information systematization, was key for achieving agile handling of the data collected and for facilitating monitoring tasks (Fig. 8). Damage maps were created for collecting the location and scope of the alterations. The complex volumetry of the outcrop and the varied orientation of the panels marked the need to resort to 3D editing so that all their faces could be properly registered (Fig. 10). This project was performed with a 3D design program, Blender®. </p><p class="VARAbstract">Thanks to an imaging analysis process, internal textures of 3D models also provided relevant graphic support for the pictographic content and the conservation state (Figs. 11 &amp; 14). DStretch® (<a href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#Harman_2008">Harman, 2008</a>), a plugin implemented in the scientific image processing software ImageJTM, was used for this purpose. To conclude, researchers relied on CloudCompare (<a href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#Girardeau_2015">Girardeau-Montaut, 2015</a>), an advanced 3D data processing software, to tackle a morphometric analysis that allowed us to detect the appearance of formal changes along with the recorded sequences (Figs. 12 &amp; 15). In this process, the distances between two records, taken after six months, were computed with the Cloud to Mesh (C2M) tool, based on the Chamfer distance algorithm (<a href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#Ruiz_2016">Ruiz et al., 2016: 120</a>).</p><p>Registration file cards and damage maps clearly highlighted the main conditions: material losses (shown in orange) and breaks (in green) have affected the outcrop in a generalized way. Furthermore, sedimentary deposits (blue) are concentrated in interstices; while lichen colonies (idem) do so in the least sun-exposed areas. The use of DStretch® highlighted modern excoriations of anthropogenic origin and contributed to distinguishing recent material losses from the older ones, already affected by an incipient patina. Finally, thanks to morphological analysis, a new detachment (Fig. 15b) and a generalized displacement of exempt elements (Figs. 12 &amp; 15) were detected. These displacements indicate outstanding manipulation, which could lead to decontextualizations or new fragmentations.</p><p>Regarding the archaeological interpretation, macroscopic observation of exempt blocks located in the vicinity of the station and the zenith representation of the immediate environment from photogrammetry, have shown that they are forming a set of attached structures (Fig. 13). 
The site redefinition and the diagnosis of its very weakened defensive system show the need to intensify the archaeological study of this area, so emblematic for the archaeology of the south of Tenerife, in addition to establishing preventive conservation measures that can contribute to its stabilization.</p><p class="VARAbstract"> </p><div><hr align="left" size="1" width="33%" /><div><p><a title="" href="file:///L:/PC%20Port%C3%A1til%20Poli/Mis%20documentos/VAR/En%20Curso/Revisi%C3%B3n/13810/13810-60985-2-CE_jll.docx#_ftnref1">[1]</a> Geological unit of size greater than 300 mm, term standardized by the USCS (Unified Soil Classification System).</p></div></div>
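The change detection described above uses CloudCompare's Cloud-to-Mesh (C2M) distances. As a rough, simplified stand-in (cloud-to-cloud rather than cloud-to-mesh), nearest-neighbour distances between two aligned photogrammetric point clouds can be computed as in the Python sketch below; the 5 mm threshold is an arbitrary example, not a value from the study.

```python
# Simplified cloud-to-cloud (not cloud-to-mesh) change detection between two
# aligned point clouds; an approximation of what CloudCompare's C2M tool reports.
import numpy as np
from scipy.spatial import cKDTree

def nearest_neighbour_distances(cloud_t0, cloud_t1):
    """Each cloud is an (N, 3) array of XYZ points in the same reference system."""
    tree = cKDTree(cloud_t0)
    distances, _ = tree.query(cloud_t1)   # distance from each new point to the older survey
    return distances

# Hypothetical usage: flag points that moved more than 5 mm between surveys.
# d = nearest_neighbour_distances(cloud_march, cloud_september)
# changed_points = cloud_september[d > 0.005]
```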
APA, Harvard, Vancouver, ISO, and other styles
13

Sedgewick, J. "From Digital Imaging to the Publisher: Navigating the Digital Journey." Microscopy and Microanalysis 7, S2 (August 2001): 844–45. http://dx.doi.org/10.1017/s1431927600030294.

Full text
Abstract:
With the increased frequency of digital presentation of research on laptops at meetings, on the web, and to journals, the need arises for a unified set of computer programs. The best of all worlds would be to create text, graphs, images and HTML in one program only, but the reality is that several need to be used without much duplication of effort. The ideal situation is one in which the computer applications are open-ended, so that one application saves in a format that is readable by another application across Macintosh® and PC platforms. At the same time, the applications should be straightforward to use. Taking only the above conditions into account, even the most cursory examination would reveal that Microsoft® products are, in fact, closed systems at best and dead ends at worst. PowerPoint® is a good example of a program that can do only one thing (make slides or on-screen presentations) but dead-ends because high-resolution image files cannot be saved from the program, nor can files be exported to HTML or to a format usable by a publisher.
APA, Harvard, Vancouver, ISO, and other styles
14

Tobi, Markus Dwiyanto, and VINA N. VAN HARLING. "PENGENDALI LISTRIK AKSES PARALLEL PORT DENGAN PEMROGRAMAN BORLAND DELPHI 7.0." Electro Luceat 4, no. 2 (November 1, 2018): 26–34. http://dx.doi.org/10.32531/jelekn.v4i2.141.

Full text
Abstract:
Computers, or PCs (personal computers), are now found in almost every home, building, and office. Most computers are used mainly for typing, playing films, listening to music, and for games. Some people can spend the whole day in front of their computer. This can make people reluctant to do other things while they are busy at the computer, for example turning on a lamp, the air conditioning, or other electronic devices. This research was designed to control electrical equipment through parallel-port access with Delphi 7.0 programming. The electrical equipment controlled is an incandescent lamp: lighting in a house, building, or industrial plant is connected to and controlled through a computer via the parallel-port interface. Lamps in a building or other location can be switched on and off by pressing buttons designed in the computer program module, and to access the parallel port properly a library file, "inpout32.dll", is used.
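The paper's controller is written in Borland Delphi 7.0, but the central call, writing a byte to the parallel-port data register through inpout32.dll, can be sketched in any language. The Python outline below assumes the conventional LPT1 base address 0x378 and a hypothetical one-relay-per-data-bit wiring; it is an illustration of the idea, not the authors' program.

```python
# Sketch of toggling parallel-port data lines through inpout32.dll (Windows only).
# The paper implements this in Delphi 7.0; this shows the equivalent idea in Python.
# The 0x378 (LPT1 data register) address and the bit-to-lamp wiring are assumptions.
import ctypes

LPT1_DATA = 0x378                      # conventional LPT1 base address (assumed)
io = ctypes.WinDLL("inpout32.dll")     # library exporting Inp32/Out32

def set_lamps(bitmask):
    """Each bit of the mask drives one data pin (D0-D7), e.g. one relay per lamp."""
    io.Out32(LPT1_DATA, bitmask & 0xFF)

set_lamps(0b00000001)   # switch on the lamp wired to D0
set_lamps(0b00000000)   # all lamps off
```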
APA, Harvard, Vancouver, ISO, and other styles
15

Nur Haziq Mohd Safri, Mohamad, Wan Nor Shuhadah Wan Nik, Zarina Mohamad, and Mumtazimah Mohamad. "Wireless Network Traffic Analysis and Troubleshooting using Raspberry Pi." International Journal of Engineering & Technology 7, no. 2.15 (April 6, 2018): 58. http://dx.doi.org/10.14419/ijet.v7i2.15.11213.

Full text
Abstract:
Over the past five decades, computer networks have kept growing, along with their complexity. In such a situation, the management, monitoring and maintenance of a computer network require special attention to ensure that optimal network access capability is achieved. Wireless network traffic analysis is the process of recording, studying and analyzing packets in a wireless network for network performance analysis purposes. In some cases, the quality of network access performance can be very low without the actual problem being known. Therefore, in this paper, the performance of wireless network traffic is analyzed by using a Raspberry Pi, which is further able to send an alert to the network administrator to lessen the downtime. The Raspberry Pi is a low-cost, small and portable computer board to which a monitor, keyboard, mouse, pen drive, etc. can be attached. In this project, the MyTraceroute (MTR) program is installed on the Raspberry Pi to capture the IP of the Access Point (AP) and show the packet loss percentage in the network. The results are saved in the form of a text file and sent to the network administrator by email. The solution proposed in this paper supports efficient monitoring, managing and maintaining of wireless network traffic.
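A monitoring loop of the kind described (run MTR against the access point, log the report to a text file, and e-mail the administrator when packet loss crosses a threshold) might be sketched in Python as below. The access-point address, SMTP server, e-mail addresses and 10% threshold are placeholders, not values from the paper.

```python
# Outline of a Raspberry Pi monitoring loop: run MTR in report mode, save the
# report to a text file, and e-mail an alert when packet loss exceeds a threshold.
# The IP address, SMTP details and threshold below are placeholders.
import re
import smtplib
import subprocess
from email.message import EmailMessage

AP_ADDRESS = "192.168.1.1"      # placeholder access-point IP
LOSS_THRESHOLD = 10.0           # percent

def run_mtr(host):
    result = subprocess.run(["mtr", "--report", "--report-cycles", "10", host],
                            capture_output=True, text=True, check=True)
    return result.stdout

def worst_loss(report):
    # Loss figures appear as e.g. "0.0%" in each hop line of the report.
    losses = [float(m) for m in re.findall(r"(\d+\.\d+)%", report)]
    return max(losses) if losses else 0.0

def email_admin(report):
    msg = EmailMessage()
    msg["Subject"] = "Packet loss alert"
    msg["From"] = "pi@example.org"           # placeholder addresses
    msg["To"] = "admin@example.org"
    msg.set_content(report)
    with smtplib.SMTP("smtp.example.org") as smtp:   # placeholder mail server
        smtp.send_message(msg)

report = run_mtr(AP_ADDRESS)
with open("mtr_report.txt", "w") as log:             # keep the text-file log
    log.write(report)
if worst_loss(report) > LOSS_THRESHOLD:
    email_admin(report)
```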
APA, Harvard, Vancouver, ISO, and other styles
16

YANG, CHENG-HONG, CHING-HSING LUO, and CHENG-HUEI YANG. "IMPROVED MEASUREMENT OF GRIP STRENGTH THROUGH USE OF A PRESSURE SENSITIVE HAND DYNAMOMETER." Biomedical Engineering: Applications, Basis and Communications 14, no. 04 (August 25, 2002): 157–63. http://dx.doi.org/10.4015/s1016237202000231.

Full text
Abstract:
Computers play an increasingly important role in daily lives due to rapid developments in information and computer technology. Computers as instruments for physical rehabilitation are becoming more and more prominent. This paper introduces an improved measurement of grip strength through use of a pressure sensitive hand dynamometer. This system can reliably assess a user's grip strength through recording sudden changes in load cell sensors using a circuit and A/D adapter. The analyzed data is directly displayed on an LCD (Liquid Crystal Display) or is transformed and saved in a data record file via a PC interface. The compiled data can be combined with a patient's case history and stored in a database as future reference for clinical use. The system is more efficient, cheaper to operate and more convenient to use than comparable systems currently used by doctors to monitor and analyze the progress of a patient's individually designed rehabilitation program.
APA, Harvard, Vancouver, ISO, and other styles
17

McCarthy, Gregory J. "Laboratory Note. A LOTUS 1-2-3 Spreadsheet to Aid in Data Reduction for Publication of X-Ray Powder Diffraction Data." Powder Diffraction 3, no. 1 (March 1988): 39–40. http://dx.doi.org/10.1017/s0885715600013105.

Full text
Abstract:
My students and I have developed a LOTUS 1-2-3 spreadsheet to aid in the data reduction tasks associated with preparing powder diffraction data for publication in the Powder Diffraction File (PDF) (1987) and this journal. Portions of a sample spreadsheet and the formulae in each of the computational cells are given in Table 1. The concept of this spreadsheet should apply to any of the available computer spreadsheet programs, although the specific codes for the mathematical functions may differ. The user enters data only into columns C, D and F-H. All other entries are calculated from the input data. Observed 2θ angles are entered into column D. The corresponding d-spacing is calculated in column A. The Miller indices of these peaks are entered into column C. Prior to use of the spreadsheet, the observed 2θ angles and hkl's had been used to refine unit cell parameters using the Appleman and Evans (1973) least-squares unit cell parameter refinement computer program implemented for the IBM PC by Garvey (1986).
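The d-spacing in column A follows from Bragg's law, d = λ / (2 sin θ), with θ equal to half the observed 2θ angle. A one-function Python equivalent of that spreadsheet column is given below; the Cu Kα1 wavelength is an assumption, since the note does not state which radiation was used.

```python
# Bragg's-law data reduction equivalent to the spreadsheet's d-spacing column:
# d = wavelength / (2 * sin(theta)), with theta = (2-theta)/2.
# Cu K-alpha1 (1.54056 angstroms) is assumed; the note does not specify the radiation.
import math

CU_KA1 = 1.54056  # angstroms (assumption)

def d_spacing(two_theta_deg, wavelength=CU_KA1):
    theta = math.radians(two_theta_deg / 2.0)
    return wavelength / (2.0 * math.sin(theta))

for two_theta in (20.84, 26.64, 36.52):     # example observed 2-theta angles
    print(f"{two_theta:6.2f}  {d_spacing(two_theta):.4f}")
```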
APA, Harvard, Vancouver, ISO, and other styles
18

Yohe, William P. "Software Reviews : Buffland, the Macrosim, Version 4.0 Publisher: Craig A. Roger, 928 Chicago Ave., #30, Minneapolis, MN 55404 (telephone: 612-339-8306) Year of Publication: 1987 Materials: Three 5.25-inch program disks (not copy protected), two blank data disks, vinyl disk holders for binder Price: $45 plus $3 shipping; $3.50 for additional copy of instructor disks; $2.50 for additional copy of student disk Machine Specificity: IBM PC, compatible System Requirements: 256K, one drive Effectiveness: Fair User-Friendliness: Good Documentation: Good." Social Science Computer Review 7, no. 1 (April 1989): 102–5. http://dx.doi.org/10.1177/089443938900700114.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Rodríguez Bustinza, Ricardo, and Ewar Mamani Churayra. "CONTROL DE LA VELOCIDAD EN TIEMPO REAL DE UN MOTOR DC CONTROLADO POR LÓGICA DIFUSA TIPO PD+I USANDO LABVIEW." Revista Cientifica TECNIA 21, no. 1 (April 7, 2017): 5. http://dx.doi.org/10.21754/tecnia.v21i1.92.

Full text
Abstract:
This article presents a method based on artificial intelligence for controlling a DC motor plant with a personal computer (PC): interacting hardware and software achieve real-time speed control of the DC motor using a fuzzy PD+I control algorithm. Data acquisition and identification of the DC motor parameters were required for the speed control; data are acquired through the PCI NIDAQ 6024E data acquisition card, whose interface runs in real time using the Real-Time Workshop (RTW), and the data file is processed with IDENT, the identification tool of the Matlab program. The prototype computer-controller system is designed using LabVIEW graphical programming, in this case making use of the Fuzzy Logic Control and Simulation Module tools. Real-time control of the system is carried out in the laboratory using a digital-to-analog converter (DAC) and an incremental encoder formed by two Hall-effect sensors whose signals are processed through a frequency-to-voltage converter and read at the analog inputs of the NIDAQ. The computer simulation results are verified experimentally and demonstrate that the designed control signal can make the output of the prototype system follow the imposed references efficiently, with minimum overshoot and zero steady-state error. Keywords: DC motor, data acquisition, parameter identification, controller design and implementation.
APA, Harvard, Vancouver, ISO, and other styles
20

Anjali, Anjali, and Manisha Sabharwal. "Perceived Barriers of Young Adults for Participation in Physical Activity." Current Research in Nutrition and Food Science Journal 6, no. 2 (August 25, 2018): 437–49. http://dx.doi.org/10.12944/crnfsj.6.2.18.

Full text
Abstract:
This study aimed to explore the perceived barriers to physical activity among college students. Study design: qualitative research design. Eight focus group discussions with 67 college students aged 18-24 years (48 females, 19 males) were conducted on college premises. Data were analysed using an inductive approach. Participants identified a number of obstacles to physical activity. The perceived barriers that emerged from the analysis of the data addressed the different dimensions of the socio-ecological framework. The results indicated that the young adults perceived a substantial number of personal, social and environmental factors as barriers, such as time constraints, tiredness, stress, family control, safety issues and much more. Understanding the barriers and overcoming them at this stage will be valuable. Health professionals and researchers can use this information to design and implement interventions, strategies and policies to promote participation in physical activity. This can further help students to deal with those barriers and instil the habit of regular physical activity in later adult years.
APA, Harvard, Vancouver, ISO, and other styles
21

"DETECTION ANALYSIS OF DROWSY DRIVER USING IMAGE PROCESSING TECHNIQUES." July-2020 9, no. 7 (2020): 11–19. http://dx.doi.org/10.29121/ijesrt.v9.i7.2020.2.

Full text
Abstract:
Driver error and negligence contribute to a great many road accidents occurring nowadays. These human errors are caused by drowsiness, alcohol, and dangerous behaviour on the part of the driver. This paper focuses on a driver drowsiness detection procedure for an Intelligent Transportation System, which targets unusual behaviour exhibited by the driver, using a Raspberry Pi single-board computer. The capability of driver assistance systems to detect the level of a driver's alertness is essential for ensuring road safety. By observation of blink patterns and eye movements, driver fatigue can be detected early enough to prevent crashes due to drowsiness. In the proposed procedure, a non-intrusive driver drowsiness monitoring system has been developed using computer vision techniques. From the computer simulation results, it was found that the procedure is able to detect drowsiness in drivers wearing glasses and under low illumination levels inside the vehicle. Moreover, the system is capable of detecting drowsiness within a time duration of about two seconds. The detected abnormal behavior is corrected through alarms in real time. We have also done a transformation of an image from 2D to 3D using wavelet analysis. Here we also compared the wavelet technique with other techniques, namely the stereo-photogrammetric and edge information techniques. The image conversion from 2D to 3D can be done by finding the edges of an image. This edge detection can be used in various fields of image processing, image analysis, image pattern recognition, and computer vision, as well as in human vision. In this work, we carried out our experiment using the wavelet technique, and when the results are compared with the stereo-photogrammetric and edge information techniques we find that the wavelet technique gives better results. From past work, we have observed that wavelet analysis is an easy method for 2D-to-3D analysis.
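As a hedged illustration of the blink-monitoring idea only (not the authors' implementation), a camera loop on a Raspberry Pi can flag prolonged eye closure by checking whether OpenCV's stock Haar cascades stop finding eyes inside a detected face for about two seconds, the detection window mentioned in the abstract.

```python
# Simplified eye-closure watchdog using OpenCV's bundled Haar cascades.
# Illustrative sketch only, not the system described in the paper;
# the two-second threshold mirrors the detection window mentioned above.
import time
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)          # Raspberry Pi camera module or USB webcam
eyes_last_seen = time.time()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        if len(eye_cascade.detectMultiScale(roi, 1.1, 10)) > 0:
            eyes_last_seen = time.time()       # open eyes detected in this frame
    if time.time() - eyes_last_seen > 2.0:     # no eyes seen for ~2 seconds
        print("Drowsiness suspected: trigger the alarm")
        eyes_last_seen = time.time()           # avoid repeating the alert every frame
cap.release()
```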
APA, Harvard, Vancouver, ISO, and other styles
22

Durfee, William K., Samantha A. Weinstein, Ela Bhatt, Ashima Nagpal, and James R. Carey. "Design and Usability of a Home Telerehabilitation System to Train Hand Recovery Following Stroke." Journal of Medical Devices 3, no. 4 (November 19, 2009). http://dx.doi.org/10.1115/1.4000451.

Full text
Abstract:
Current theories of stroke rehabilitation point toward paradigms of intense concentrated use of the afflicted limb as a means for motor program reorganization and partial function restoration. A home-based system for stroke rehabilitation that trains recovery of hand function by a treatment of concentrated movement was developed and tested. A wearable goniometer measured finger and wrist motions in both hands. An interface box transmitted sensor measurements in real-time to a laptop computer. Stroke patients used joint motion to control the screen cursor in a one-dimensional tracking task for several hours a day over the course of 10–14 days to complete a treatment of 1800 tracking trials. A telemonitoring component enabled a therapist to check in with the patient by video phone to monitor progress, to motivate the patient, and to upload tracking data to a central file server. The system was designed for use at home by patients with no computer skills. The system was placed in the homes of 20 subjects with chronic stroke and impaired finger motion, ranging from 2–305 mi away from the clinic, plus one that was a distance of 1057 miles. Fifteen subjects installed the system at home themselves after instruction in the clinic, while nine required a home visit to install. Three required follow-up visits to fix equipment. A post-treatment telephone survey was conducted to assess ease of use and most responded that the system was easy to use. Functional improvements were seen in the subjects enrolled in the formal treatment study, although the treatment period was too short to trigger cortical reorganization. We conclude that the system is feasible for home use and that tracking training has promise as a treatment paradigm.
APA, Harvard, Vancouver, ISO, and other styles
23

"Software Reviews : INTROSTAT 2.2 Reviewed by Ben Nefzger, Augustana College Publisher: Ideal Systems, P.O. Box 681, Fairfield, IA 52556 (telephone : 515-472-4507) Year of Publication: 1982 Materials: One unprotected program diskette and a 55-page user's manual Price: $125 Machine Specificity: Apple II Plus, IIe, and IIc; Atari 800, 1200, 65XE, 130XE; IBM Pc, XT, AT and compatibles System Requirements: Apple: 48K RAM and one disk drive, Dos 3.3. Atari: 48K RAM, one disk drive, DOS 2.0S, Atari BASIC. IBM : 64K RAM, one disk drive, DOS 1.1 or higher. A second disk drive and printer are optional additions to each computer system. Effectiveness: Good User-Friendliness: Good Documentation: Good." Social Science Microcomputer Review 4, no. 1 (April 1986): 119–21. http://dx.doi.org/10.1177/089443938600400116.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

"Software Reviews : Pc-File/R Staff review Publisher: ButtonWare, Inc., P.O. Box 5786, Bellevue, WA 98006 (telephone: 800-J-BUTTON or 206-454-0479) Year of Publication: 1985 Materials: Unlocked program disk; 150-page spiral-bound manual; plastic case Price: $149 plus $5 shipping; quantity discounts negotiated Machine Specificity: IBM PC and most MS-DOS machines, including Tandy 2000 System Requirements: 128K memory (256K if built-in word processor used); printer recommended Effectiveness: Excellent User-Friendliness: Excellent Documentation: Good." Social Science Microcomputer Review 4, no. 2 (July 1986): 265–67. http://dx.doi.org/10.1177/089443938600400227.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Wasser, Frederick. "Media Is Driving Work." M/C Journal 4, no. 5 (November 1, 2001). http://dx.doi.org/10.5204/mcj.1935.

Full text
Abstract:
My thesis is that new media, starting with analog broadcast and going through digital convergence, blur the line between work time and free time. The technology that we are adopting has transformed free time into potential and actual labour time. At the dawn of the modern age, work shifted from tasked time to measured time. Previously, tasked time intermingled work and leisure according to the vagaries of nature. All this was banished when industrial capitalism instituted the work clock (Mumford 12-8). But now, many have noticed how post-industrial capitalism features a new intermingling captured in such expressions as "24/7" and "multi-tasking." Yet, we are only beginning to understand that media are driving a return to the pre-modern where the hour and the space are both ambiguous, available for either work or leisure. This may be the unfortunate side effect of the much vaunted "interactivity." Do you remember the old American TV show Dobie Gillis (1959-63) which featured the character Maynard G. Krebs? He always shuddered at the mention of the four-letter word "work." Now, American television shows makes it a point that everyone works (even if just barely). Seinfeld was a bold exception in featuring the work-free Kramer; a deliberate homage to the 1940s team of Abbott and Costello. Today, as welfare is turned into workfare, The New York Times scolds even the idle rich to adopt the work ethic (Yazigi). The Forms of Broadcast and Digital Media Are Driving the Merger of Work and Leisure More than the Content It is not just the content of television and other media that is undermining the leisured life; it is the social structure within which we use the media. Broadcast advertisements were the first mode/media combinations that began to recolonise free time for the new consumer economy. There had been a previous buildup in the volume and the ubiquity of advertising particularly in billboards and print. However, the attention of the reader to the printed commercial message could not be controlled and measured. Radio was the first to appropriate and measure its audience's time for the purposes of advertising. Nineteenth century media had promoted a middle class lifestyle based on spending money on home to create a refuge from work. Twentieth century broadcasting was now planting commercial messages within that refuge in the sacred moments of repose. Subsequent to broadcast, home video and cable facilitated flexible work by offering entertainment on a 24 hour basis. Finally, the computer, which juxtaposes image/sound/text within a single machine, offers the user the same proto-interactive blend of entertainment and commercial messages that broadcasting pioneered. It also fulfills the earlier promise of interactive TV by allowing us to work and to shop, in all parts of the day and night. We need to theorise this movement. The theory of media as work needs an institutional perspective. Therefore, I begin with Dallas Smythe's blindspot argument, which gave scholarly gravitas to the structural relationship of work and media (263-299). Horkheimer and Adorno had already noticed that capitalism was extending work into free time (137). Dallas Smythe went on to dissect the precise means by which late capitalism was extending work. Smythe restates the Marxist definition of capitalist labour as that human activity which creates exchange value. Then he considered the advertising industry, which currently approaches200 billion in the USA and realised that a great deal of exchange value has been created. 
The audience is one element of the labour that creates this exchange value. The appropriation of people's time creates advertising value. The time we spend listening to commercials on radio or viewing them on TV can be measured and is the unit of production for the value of advertising. Our viewing time ipso facto has been changed into work time. We may not experience it subjectively as work time although pundits such as Marie Winn and Jerry Mander suggest that TV viewing contributes to the same physical stresses as actual work. Nonetheless, Smythe sees commercial broadcasting as expanding the realm of capitalism into time that was otherwise set aside for private uses. Smythe's essay created a certain degree of excitement among political economists of media. Sut Jhally used Smythe to explain aspects of US broadcast history such as the innovations of William Paley in creating the CBS network (Jhally 70-9). In 1927, as Paley contemplated winning market share from his rival NBC, he realised that selling audience time was far more profitable than selling programs. Therefore, he paid affiliated stations to air his network's programs while NBC was still charging them for the privilege. It was more lucrative to Paley to turn around and sell the stations' guaranteed time to advertisers, than to collect direct payments for supplying programs. NBC switched to his business model within a year. Smythe/Jhally's model explains the superiority of Paley's model and is a historical proof of Smythe's thesis. Nonetheless, many economists and media theorists have responded with a "so what?" to Smythe's thesis that watching TV as work. Everyone knows that the basis of network television is the sale of "eyeballs" to the advertisers. However, Smythe's thesis remains suggestive. Perhaps he arrived at it after working at the U.S. Federal Communications Commission from 1943 to 1948 (Smythe 2). He was part of a team that made one last futile attempt to force radio to embrace public interest programming. This effort failed because the tide of consumerism was too strong. Radio and television were the leading edge of recapturing the home for work, setting the stage for the Internet and a postmodern replication of the cottage industries of pre and proto-industrial worlds. The consequences have been immense. The Depression and the crisis of over-production Cultural studies recognises that social values have shifted from production to consumption (Lash and Urry). The shift has a crystallising moment in the Great Depression of 1929 through 1940. One proposal at the time was to reduce individual work hours in order to create more jobs (see Hunnicut). This proposal of "share the work" was not adopted. From the point of view of the producer, sharing the work would make little difference to productivity. However, from the retailer's perspective each individual worker would accumulate less money to buy products. Overall sales would stagnate or decline. Prominent American economists at the time argued that sharing the work would mean sharing the unemployment. They warned the US government this was a fundamental threat to an economy based on consumption. Only a fully employed laborer could have enough money to buy down the national inventory. In 1932, N. A. Weston told the American Economic Association that: " ...[the labourers'] function in society as a consumer is of equal importance as the part he plays as a producer." (Weston 11). 
If the defeat of the share the work movement is the negative manifestation of consumerism, then the invasion by broadcast of our leisure time is its positive materialisation. We can trace this understanding by looking at Herbert Hoover. When he was the Secretary of Commerce in 1924 he warned station executives that: "I have never believed that it was possible to advertise through broadcasting without ruining the [radio] industry" (Radio's Big Issue). He had not recognised that broadcast advertising would be qualitatively more powerful for the economy than print advertising. By 1929, Hoover, now President Hoover, approved an economics committee recommendation in the traumatic year of 1929 that leisure time be made "consumable " (Committee on Recent Economic Changes xvi). His administration supported the growth of commercial radio because broadcasting was a new efficient answer to the economists' question of how to motivate consumption. Not so coincidentally network radio became a profitable industry during the great Depression. The economic power that pre-war radio hinted at flourished in the proliferation of post-war television. Advertisers switched their dollars from magazines to TV, causing the demise of such general interest magazines as Life, The Saturday Evening Postet al. Western Europe quickly followed the American broadcasting model. Great Britain was the first, allowing television to advertise the consumer revolution in 1955. Japan and many others started to permit advertising on television. During the era of television, the nature of work changed from manufacturing to servicing (Preston 148-9). Two working parents also became the norm as a greater percentage of the population took salaried employment, mostly women (International Labour Office). Many of the service jobs are to monitor the new global division of labour that allows industrialised nations to consume while emerging nations produce. (Chapter seven of Preston is the most current discussion of the shift of jobs within information economies and between industrialised and emerging nations.) Flexible Time/ Flexible Media Film and television has responded by depicting these shifts. The Mary Tyler Moore Show debuted in September of 1970 (see http://www.transparencynow.com/mary.htm). In this show nurturing and emotional attachments were centered in the work place, not in an actual biological family. It started a trend that continues to this day. However, media representations of the changing nature of work are merely symptomatic of the relationship between media and work. Broadcast advertising has a more causal relationship. As people worked more to buy more, they found that they wanted time-saving media. It is in this time period that the Internet started (1968), that the video cassette recorder was introduced (1975) and that the cable industry grew. Each of these ultimately enhanced the flexibility of work time. The VCR allowed time shifting programs. This is the media answer to the work concept of flexible time. The tired worker can now see her/his favourite TV show according to his/her own flex schedule (Wasser 2001). Cable programming, with its repeats and staggered starting times, also accommodates the new 24/7 work day. These machines, offering greater choice of programming and scheduling, are the first prototypes of interactivity. The Internet goes further in expanding flexible time by adding actual shopping to the vicarious enjoyment of consumerist products on television. 
The Internet user continues to perform the labour of watching advertising and, in addition, now has the opportunity to do actual work tasks at any time of the day or night. The computer enters the home as an all-purpose machine. Its purchase is motivated by several simultaneous factors. The rhetoric often stresses the recreational and work aspects of the computer in the same breath (Reed 173, Friedrich 16-7). Games drove the early computer programmers to find more "user-friendly" interfaces in order to entice young consumers. Entertainment continues to be the main driving force behind visual and audio improvements. This has been true ever since the introduction of the Apple II, Radio Shack's TRS 80 and Atari 400 personal computers in the 1977-1978 time frame (see http://www.atari-history.com/computers/8bits/400.html). The current ubiquity of colour monitors, and the standard package of speakers with PC computers are strong indications that entertainment and leisure pursuits continue to drive the marketing of computers. However, once the computer is in place in the study or bedroom, its uses fully integrates the user with world of work in both the sense of consuming and creating value. This is a specific instance of what Philip Graham calls the analytical convergence of production, consumption and circulation in hypercapitalism. The streaming video and audio not only captures the action of the game, they lend sensual appeal to the banner advertising and the power point downloads from work. In one regard, the advent of Internet advertising is a regression to the pre-broadcast era. The passive web site ad runs the same risk of being ignored as does print advertising. The measure of a successful web ad is interactivity that most often necessitates a click through on the part of the viewer. Ads often show up on separate windows that necessitate a click from the viewer if only to close down the program. In the words of Bolter and Grusin, click-through advertising is a hypermediation of television. In other words, it makes apparent the transparent relationship television forged between work and leisure. We do not sit passively through Internet advertising, we click to either eliminate them or to go on and buy the advertised products. Just as broadcasting facilitated consumable leisure, new media combines consumable leisure with flexible portable work. The new media landscape has had consequences, although the price of consumable leisure took awhile to become visible. The average work week declined from 1945 to 1982. After that point in the US, it has been edging up, continuously (United States Bureau of Labor Statistics). There is some question whether the computer has improved productivity (Kim), there is little question that the computer is colonising leisure time for multi-tasking. In a population that goes online from home almost twice as much as those who go online from work, almost half use their online time for work based activities other than email. Undoubtedly, email activity would account for even more work time (Horrigan). On the other side of the blur between work and leisure, the Pew Institute estimates that fifty percent use work Internet time for personal pleasure ("Wired Workers"). Media theory has to reengage the problem that Horkheimer/Adorno/Smythe raised. The contemporary problem of leisure is not so much the lack of leisure, but its fractured, non-contemplative, unfulfilling nature. 
A media critique will demonstrate the contribution of the TV and the Internet to this erosion of free time. References Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: MIT Press, 2000. Committee on Recent Economic Changes. Recent Economic Changes. Vol. 1. New York: no publisher listed, 1929. Friedrich, Otto. "The Computer Moves In." Time 3 Jan. 1983: 14-24. Graham, Philip. Hypercapitalism: A Political Economy of Informational Idealism. In press for New Media and Society2.2 (2000). Horkheimer, Max, and Theodor W. Adorno. Dialectic of Enlightenment. New York: Continuum Publishing, 1944/1987. Horrigan, John B. "New Internet Users: What They Do Online, What They Don't and Implications for the 'Net's Future." Pew Internet and American Life Project. 25 Sep. 2000. 24 Oct. 2001 <http://www.pewinternet.org/reports/toc.asp?Report=22>. Hunnicutt, Benjamin Kline. Work without End: Abandoning Shorter Hours for the Right to Work. Philadelphia: Temple UP, 1988. International Labour Office. Economically Active Populations: Estimates and Projections 1950-2025. Geneva: ILO, 1995. Jhally, Sut. The Codes of Advertising. New York: St. Martin's Press, 1987. Kim, Jane. "Computers and the Digital Economy." Digital Economy 1999. 8 June 1999. October 24, 2001 <http://www.digitaleconomy.gov/powerpoint/triplett/index.htm>. Lash, Scott, and John Urry. Economies of Signs and Space. London: Sage Publications, 1994. Mander, Jerry. Four Arguments for the Elimination of Television. New York: Morrow Press, 1978. Mumford, Lewis. Technics and Civilization. New York: Harcourt Brace, 1934. Preston, Paschal. Reshaping Communication: Technology, Information and Social Change. London: Sage, 2001. "Radio's Big Issue Who Is to Pay the Artist?" The New York Times 18 May 1924: Section 8, 3. Reed, Lori. "Domesticating the Personal Computer." Critical Studies in Media Communication17 (2000): 159-85. Smythe, Dallas. Counterclockwise: Perspectives on Communication. Boulder, CO: Westview Press, 1993. United States Bureau of Labor Statistics. Unpublished Data from the Current Population Survey. 2001. Wasser, Frederick A. Veni, Vidi, Video: The Hollywood Empire and the VCR. Austin, TX: U of Texas P, 2001. Weston, N.A., T.N. Carver, J.P. Frey, E.H. Johnson, T.R. Snavely and F.D. Tyson. "Shorter Working Time and Unemployment." American Economic Review Supplement 22.1 (March 1932): 8-15. <http://links.jstor.org/sici?sici=0002-8282%28193203%2922%3C8%3ASWTAU%3E2.0.CO%3B2-3>. Winn, Marie. The Plug-in Drug. New York: Viking Press, 1977. "Wired Workers: Who They Are, What They're Doing Online." Pew Internet Life Report 3 Sep. 2000. 24 Oct. 2000 <http://www.pewinternet.org/reports/toc.asp?Report=20>. Yazigi, Monique P. "Shocking Visits to the Real World." The New York Times 21 Feb. 1990. Page unknown. Links http://www.pewinternet.org/reports/toc.asp?Report=20 http://www.pewinternet.org/reports/toc.asp?Report=22 http://www.atari-history.com/computers/8bits/400.html http://www.transparencynow.com/mary.htm http://www.digitaleconomy.gov/powerpoint/triplett/index.htm http://links.jstor.org/sici?sici=0002-8282%28193203%2922%3C8%3ASWTAU%3 E2.0.CO%3B2-3 Citation reference for this article MLA Style Wasser, Frederick. "Media Is Driving Work" M/C: A Journal of Media and Culture 4.5 (2001). [your date of access] < http://www.media-culture.org.au/0111/Wasser.xml >. Chicago Style Wasser, Frederick, "Media Is Driving Work" M/C: A Journal of Media and Culture 4, no. 
5 (2001), < http://www.media-culture.org.au/0111/Wasser.xml > ([your date of access]). APA Style Wasser, Frederick. (2001) Media Is Driving Work. M/C: A Journal of Media and Culture 4(5). < http://www.media-culture.org.au/0111/Wasser.xml > ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
26

"Software Reviews : PARADOX, Version 1.1 Reviewed by John J. Treacy, Wright State University Publisher: ANSA Software, 1301 Shoreway, Belmont, CA 94002 (telephone : 415-595-4469) Year of Publication: 1986 Materials: Two program disks, one installation disk, one sample table disk, two keyboard templates, and five manuals ( PARADOX introduction, 150 pp.; user's guide, 430 pp.; PARADOX application language user's guide: A quick guide to PARADOX for dBASE users, 22 pp.; and A quick guide to PARADOX for LOTUS users, 18 pp.). Also available upon request is an applications generator guide with three disks that requires a hard disk to run. Price: $695 list Availability: IBM PC, XT, AT, COMPAQ, COMPAQ PLUS, DESKPRO, and 100-percent-compatible personal computers. (I used NCR 41 with 640K RAM, color monitor, 808 math coprocessor, one floppy disk [360K], one 20M hard drive, and a 2M Ram disk with IBM DOS 3.1. ~ System Requirements: 512K RAM, two floppy drives or one hard disk and one floppy drive, DOS 2.0 or higher; supports IBM Pro-writer, Epson, Okida A dot-matrix, and HP Laser printers Effectiveness: Good User-Friendliness: Excellent Documentation: Good on program usage, weak on hardware." Social Science Microcomputer Review 5, no. 2 (July 1987): 268–71. http://dx.doi.org/10.1177/089443938700500226.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Goggin, Gerard. "Broadband." M/C Journal 6, no. 4 (August 1, 2003). http://dx.doi.org/10.5204/mcj.2219.

Full text
Abstract:
Connecting I’ve moved house on the weekend, closer to the centre of an Australian capital city. I had recently signed up for broadband with a major Australian Internet company (my first contact, cf. Turner). Now I am the proud owner of a larger modem than I have ever owned: a white cable modem. I gaze out into our new street: two thick black cables cosseted in silver wire. I am relieved. My new home is located in one of those streets, double-cabled by Telstra and Optus in the data-rush of the mid-1990s. Otherwise, I’d be moth-balling the cable modem, and the thrill of my data percolating down coaxial cable. And it would be off to the computer supermarket to buy an ADSL modem, then to pick a provider, to squeeze some twenty-first century connectivity out of old copper (the phone network our grandparents and great-grandparents built). If I still lived in the country, or the outskirts of the city, or anywhere else more than four kilometres from the phone exchange, and somewhere that cable pay TV will never reach, it would be a dish for me — satellite. Our digital lives are premised upon infrastructure, the networks through which we shape what we do, fashion the meanings of our customs and practices, and exchange signs with others. Infrastructure is not simply the material or the technical (Lamberton), but the dense, fibrous knotting together of social visions, cultural resources, individual desires, and connections. No more can one easily discern between ‘society’ and ‘technology’, ‘carriage’ and ‘content’, ‘base’ and ‘superstructure’, or ‘infrastructure’ and ‘applications’ (or ‘services’ or ‘content’). To understand telecommunications in action, or the vectors of fibre, we need to consider the long and heterogeneous list of links among different human and non-human actors — the long networks, to take Bruno Latour’s evocative concept, that confect our broadband networks (Latour). The co-ordinates of our infrastructure still build on a century-long history of telecommunications networks, on the nineteenth-century centrality of telegraphy preceding this, and on the histories of the public and private so inscribed. Yet we are in the midst of a long, slow dismantling of the posts-telegraph-telephone (PTT) model of the monopoly carrier for each nation that dominated the twentieth century, with its deep colonial foundations. Our New World Information and Communication Order is not the decolonising UNESCO vision of the late 1970s and early 1980s (MacBride, Maitland). Rather it is the neoliberal, free trade, market access model, its symbol the 1984 US judicial decision to require the break-up of AT&T and the UK legislation in the same year that underpinned the Thatcherite twin move to privatise British Telecom and introduce telecommunications competition. Between 1984 and 1999, 110 telecommunications companies were privatised, and the ‘acquisition of privatized PTOs [public telecommunications operators] by European and American operators does follow colonial lines’ (Winseck 396; see also Mody, Bauer & Straubhaar). The competitive market has now been uneasily installed as the paradigm for convergent communications networks, not least with the World Trade Organisation’s 1994 General Agreement on Trade in Services and Annex on Telecommunications. As the citizen is recast as consumer and customer (Goggin, ‘Citizens and Beyond’), we rethink our cultural and political axioms as well as the axes that orient our understandings in this area. 
Information might travel close to the speed of light, and we might fantasise about optical fibre to the home (or pillow), but our terrain, our band where the struggle lies today, is narrower than we wish. Begging for broadband, it seems, is a long way from warchalking for WiFi. Policy Circuits The dreary everyday business of getting connected plugs the individual netizen into a tangled mess of policy circuits, as much as tricky network negotiations. Broadband in mid-2003 in Australia is a curious chimera, welded together from a patchwork of technologies, old and newer communications industries, emerging economies and patterns of use. Broadband conjures up grander visions, however, of communication and cultural cornucopia. Broadband is high-speed, high-bandwidth, ‘always-on’, networked communications. People can send and receive video, engage in multimedia exchanges of all sorts, make the most of online education, realise the vision of home-based work and trading, have access to telemedicine, and entertainment. Broadband really entered the lexicon with the mass takeup of the Internet in the early to mid-1990s, and with the debates about something called the ‘information superhighway’. The rise of the Internet, the deregulation of telecommunications, and the involuted convergence of communications and media technologies saw broadband positioned at the centre of policy debates nearly a decade ago. In 1993-1994, Australia had its Broadband Services Expert Group (BSEG), established by the then Labor government. The BSEG was charged with inquiring into ‘issues relating to the delivery of broadband services to homes, schools and businesses’. Stung by criticisms of elite composition (a narrow membership, with only one woman among its twelve members, and no consumer or citizen group representation), the BSEG was prompted into wider public discussion and consultation (Goggin & Newell). The then Bureau of Transport and Communications Economics (BTCE), since transmogrified into the Communications Research Unit of the Department of Communications, Information Technology and the Arts (DCITA), conducted its large-scale Communications Futures Project (BTCE and Luck). The BSEG Final report posed the question starkly: As a society we have choices to make. If we ignore the opportunities we run the risk of being left behind as other countries introduce new services and make themselves more competitive: we will become consumers of other countries’ content, culture and technologies rather than our own. Or we could adopt new technologies at any cost…This report puts forward a different approach, one based on developing a new, user-oriented strategy for communications. The emphasis will be on communication among people... (BSEG v) The BSEG proposed a ‘National Strategy for New Communications Networks’ based on three aspects: education and community access, industry development, and the role of government (BSEG x). Ironically, while the nation, or at least its policy elites, pondered the weighty question of broadband, Australia’s two largest telcos were doing it. The commercial decision of Telstra/Foxtel and Optus Vision, and their various television partners, was to nail their colours (black) to the mast, or rather telegraph pole, and to lay cable in the major capital cities. In fact, they duplicated the infrastructure in cities such as Sydney and Melbourne, then deciding it would not be profitable to cable up even regional centres, let alone small country towns or settlements. 
As Terry Flew and Christina Spurgeon observe: This wasteful duplication contrasted with many other parts of the country that would never have access to this infrastructure, or to the social and economic benefits that it was perceived to deliver. (Flew & Spurgeon 72) The implications of this decision for Australia’s telecommunications and television were profound, but there was little, if any, public input into this. Then Minister Michael Lee was very proud of his anti-siphoning list of programs, such as national sporting events, that would remain on free-to-air television rather than screen on pay, but was unwilling, or unable, to develop policy on broadband and pay TV cable infrastructure (on the ironies of Australia’s television history, see Given’s masterly account). During this period also, it may be remembered, Australia’s Internet was being passed into private hands, with the tendering out of AARNET (see Spurgeon for discussion). No such national strategy on broadband really emerged in the intervening years, nor has the market provided integrated, accessible broadband services. In 1997, landmark telecommunications legislation was enacted that provided a comprehensive framework for competition in telecommunications, as well as consolidating and extending consumer protection, universal service, customer service standards, and other reforms (CLC). Carrier and reseller competition had commenced in 1991, and the 1997 legislation gave it further impetus. Effective competition is now well established in long distance telephone markets, and in mobiles. Rivalrous competition exists in the market for local-call services, though viable alternatives to Telstra’s dominance are still few (Fels). Broadband too is an area where there is symbolic rivalry rather than effective competition. This is most visible in advertised ADSL offerings in large cities, yet most of the infrastructure for these services is comprised by Telstra’s copper, fixed-line network. Facilities-based duopoly competition exists principally where Telstra/Foxtel and Optus cable networks have been laid, though there are quite a number of ventures underway by regional telcos, power companies, and, most substantial perhaps, the ACT government’s TransACT broadband network. Policymakers and industry have been greatly concerned about what they see as slow takeup of broadband, compared to other countries, and by barriers to broadband competition and access to ‘bottleneck’ facilities (such as Telstra or Optus’s networks) by potential competitors. The government has alternated between trying to talk up broadband benefits and rates of take up and recognising the real difficulties Australia faces as a large country with a relative small and dispersed population. In March 2003, Minister Alston directed the ACCC to implement new monitoring and reporting arrangements on competition in the broadband industry. A key site for discussion of these matters has been the competition policy institution, the Australian Competition and Consumer Commission, and its various inquiries, reports, and considerations (consult ACCC’s telecommunications homepage at http://www.accc.gov.au/telco/fs-telecom.htm). Another key site has been the Productivity Commission (http://www.pc.gov.au), while a third is the National Office on the Information Economy (NOIE - http://www.noie.gov.au/projects/access/access/broadband1.htm). Others have questioned whether even the most perfectly competitive market in broadband will actually provide access to citizens and consumers. 
A great deal of work on this issue has been undertaken by DCITA, NOIE, the regulators, and industry bodies, not to mention consumer and public interest groups. Since 1997, there have been a number of governmental inquiries undertaken or in progress concerning the takeup of broadband and networked new media (for example, a House of Representatives Wireless Broadband Inquiry), as well as important inquiries into the still most strategically important of Australia’s companies in this area, Telstra. Much of this effort on an ersatz broadband policy has been piecemeal and fragmented. There are fundamental difficulties with the large size of the Australian continent and its harsh terrain, the small size of the Australian market, the number of providers, and the dominant position effectively still held by Telstra, as well as Singtel Optus (Optus’s previous overseas investors included Cable & Wireless and Bell South), and the larger telecommunications and Internet companies (such as Ozemail). Many consumers living in metropolitan Australia still face real difficulties in realising the slogan ‘bandwidth for all’, but the situation in parts of rural Australia is far worse. Satellite ‘broadband’ solutions are available, through Telstra Countrywide or other providers, but these offer limited two-way interactivity. Data can be received at reasonable speeds (though at far lower data rates than how ‘broadband’ used to be defined), but can only be sent at far slower rates (Goggin, Rural Communities Online). The cultural implications of these digital constraints may well be considerable. Computer gamers, for instance, are frustrated by slow return paths. In this light, the final report of the January 2003 Broadband Advisory Group (BAG) is very timely. The BAG report opens with a broadband rhapsody: Broadband communications technologies can deliver substantial economic and social benefits to Australia…As well as producing productivity gains in traditional and new industries, advanced connectivity can enrich community life, particularly in rural and regional areas. It provides the basis for integration of remote communities into national economic, cultural and social life. (BAG 1, 7) Its prescriptions include: Australia will be a world leader in the availability and effective use of broadband...and to capture the economic and social benefits of broadband connectivity...Broadband should be available to all Australians at fair and reasonable prices…Market arrangements should be pro-competitive and encourage investment...The Government should adopt a National Broadband Strategy (BAG 1) And, like its predecessor nine years earlier, the BAG report does make reference to a national broadband strategy aiming to maximise “choice in work and recreation activities available to all Australians independent of location, background, age or interests” (17). However, the idea of a national broadband strategy is not something the BAG really comes to grips with. The final report is keen on encouraging broadband adoption, but not explicit on how barriers to broadband can be addressed. Perhaps this is not surprising given that the membership of the BAG, dominated by representatives of large corporations and senior bureaucrats was even less representative than its BSEG predecessor. Some months after the BAG report, the Federal government did declare a broadband strategy. 
It did so, intriguingly enough, under the rubric of its response to the Regional Telecommunications Inquiry report (Estens), the second inquiry responsible for reassuring citizens nervous about the full-privatisation of Telstra (the first inquiry being Besley). The government’s grand $142.8 million National Broadband Strategy focusses on the ‘broadband needs of regional Australians, in partnership with all levels of government’ (Alston, ‘National Broadband Strategy’). Among other things, the government claims that the Strategy will result in “improved outcomes in terms of services and prices for regional broadband access; [and] the development of national broadband infrastructure assets.” (Alston, ‘National Broadband Strategy’) At the same time, the government announced an overall response to the Estens Inquiry, with specific safeguards for Telstra’s role in regional communications — a preliminary to the full Telstra sale (Alston, ‘Future Proofing’). Less publicised was the government’s further initiative in indigenous telecommunications, complementing its Telecommunications Action Plan for Remote Indigenous Communities (DCITA). Indigenous people, it can be argued, were never really contemplated as citizens with the ken of the universal service policy taken to underpin the twentieth-century government monopoly PTT project. In Australia during the deregulatory and re-regulatory 1990s, there was a great reluctance on the part of Labor and Coalition Federal governments, Telstra and other industry participants, even to research issues of access to and use of telecommunications by indigenous communicators. Telstra, and to a lesser extent Optus (who had purchased AUSSAT as part of their licence arrangements), shrouded the issue of indigenous communications in mystery that policymakers were very reluctant to uncover, let alone systematically address. Then regulator, the Australian Telecommunications Authority (AUSTEL), had raised grave concerns about indigenous telecommunications access in its 1991 Rural Communications inquiry. However, there was no government consideration of, nor research upon, these issues until Alston commissioned a study in 2001 — the basis for the TAPRIC strategy (DCITA). The elision of indigenous telecommunications from mainstream industry and government policy is all the more puzzling, if one considers the extraordinarily varied and significant experiments by indigenous Australians in telecommunications and Internet (not least in the early work of the Tanami community, made famous in media and cultural studies by the writings of anthropologist Eric Michaels). While the government’s mid-2003 moves on a ‘National Broadband Strategy’ attend to some details of the broadband predicament, they fall well short of an integrated framework that grasps the shortcomings of the neoliberal communications model. The funding offered is a token amount. The view from the seat of government is a glance from the rear-view mirror: taking a snapshot of rural communications in the years 2000-2002 and projecting this tableau into a safety-net ‘future proofing’ for the inevitable turning away of a fully-privately-owned Telstra from its previously universal, ‘carrier of last resort’ responsibilities. In this aetiolated, residualist policy gaze, citizens remain constructed as consumers in a very narrow sense in this incremental, quietist version of state securing of market arrangements. 
What is missing is any more expansive notion of citizens, their varied needs, expectations, uses, and cultural imaginings of ‘always on’ broadband networks. Hybrid Networks “Most people on earth will eventually have access to networks that are all switched, interactive, and broadband”, wrote Frances Cairncross in 1998. ‘Eventually’ is a very appropriate word to describe the parlous state of broadband technology implementation. Broadband is in a slow state of evolution and invention. The story of broadband so far underscores the predicament for Australian access to bandwidth, when we lack any comprehensive, integrated, effective, and fair policy in communications and information technology. We have only begun to experiment with broadband technologies and understand their evolving uses, cultural forms, and the sense in which they rework us as subjects. Our communications networks are not superhighways, to invoke an enduring artefact from an older technology. Nor any longer are they a single ‘public’ switched telecommunications network, like those presided over by the post-telegraph-telephone monopolies of old. Like roads themselves, or the nascent postal system of the sixteenth century, broadband is a patchwork quilt. The ‘fibre’ of our communications networks is hybrid. To be sure, powerful corporations dominate, like the Tassis or Taxis who served as postmasters to the Habsburg emperors (Briggs & Burke 25). Activating broadband today provides a perspective on the path dependency of technology history, and how we can open up new threads of a communications fabric. Our options for transforming our multitudinous networked lives emerge as much from everyday tactics and strategies as they do from grander schemes and unifying policies. We may care to reflect on the waning potential for nation-building technology, in the wake of globalisation. We no longer gather our imagined community around a Community Telephone Plan as it was called in 1960 (Barr, Moyal, and PMG). Yet we do require national and international strategies to get and stay connected (Barr), ideas and funding that concretely address the wider dimensions of access and use. We do need to debate the respective roles of Telstra, the state, community initiatives, and industry competition in fair telecommunications futures. Networks have global reach and require global and national integration. Here vision, co-ordination, and resources are urgently required for our commonweal and moral fibre. To feel the width of the band we desire, we need to plug into and activate the policy circuits. Thanks to Grayson Cooke, Patrick Lichty, Ned Rossiter, John Pace, and an anonymous reviewer for helpful comments. Works Cited Alston, Richard. ‘ “Future Proofing” Regional Communications.’ Department of Communications, Information Technology and the Arts, Canberra, 2003. 17 July 2003 <http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115485,00.php> —. ‘A National Broadband Strategy.’ Department of Communications, Information Technology and the Arts, Canberra, 2003. 17 July 2003 <http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115486,00.php>. Australian Competition and Consumer Commission (ACCC). Broadband Services Report March 2003. Canberra: ACCC, 2003. 17 July 2003 <http://www.accc.gov.au/telco/fs-telecom.htm>. —. Emerging Market Structures in the Communications Sector. Canberra: ACCC, 2003. 15 July 2003 <http://www.accc.gov.au/pubs/publications/utilities/telecommu... ...nications/Emerg_mar_struc.doc>. Barr, Trevor. 
new media.com: The Changing Face of Australia’s Media and Telecommunications. Sydney: Allen & Unwin, 2000. Besley, Tim (Telecommunications Service Inquiry). Connecting Australia: Telecommunications Service Inquiry. Canberra: Department of Information, Communications and the Arts, 2000. 17 July 2003 <http://www.telinquiry.gov.au/final_report.php>. Briggs, Asa, and Burke, Peter. A Social History of the Internet: From Gutenberg to the Internet. Cambridge: Polity, 2002. Broadband Advisory Group. Australia’s Broadband Connectivity: The Broadband Advisory Group’s Report to Government. Melbourne: National Office on the Information Economy, 2003. 15 July 2003 <http://www.noie.gov.au/publications/NOIE/BAG/report/index.htm>. Broadband Services Expert Group. Networking Australia’s Future: Final Report. Canberra: Australian Government Publishing Service (AGPS), 1994. Bureau of Transport and Communications Economics (BTCE). Communications Futures Final Project. Canberra: AGPS, 1994. Cairncross, Frances. The Death of Distance: How the Communications Revolution Will Change Our Lives. London: Orion Business Books, 1997. Communications Law Centre (CLC). Australian Telecommunications Regulation: The Communications Law Centre Guide. 2nd edition. Sydney: Communications Law Centre, University of NSW, 2001. Department of Communications, Information Technology and the Arts (DCITA). Telecommunications Action Plan for Remote Indigenous Communities: Report on the Strategic Study for Improving Telecommunications in Remote Indigenous Communities. Canberra: DCITA, 2002. Estens, D. Connecting Regional Australia: The Report of the Regional Telecommunications Inquiry. Canberra: DCITA, 2002. <http://www.telinquiry.gov.au/rti-report.php>, accessed 17 July 2003. Fels, Alan. ‘Competition in Telecommunications’, speech to Australian Telecommunications Users Group 19th Annual Conference. 6 March, 2003, Sydney. <http://www.accc.gov.au/speeches/2003/Fels_ATUG_6March03.doc>, accessed 15 July 2003. Flew, Terry, and Spurgeon, Christina. ‘Television After Broadcasting’. In The Australian TV Book. Ed. Graeme Turner and Stuart Cunningham. Allen & Unwin, Sydney. 69-85. 2000. Given, Jock. Turning Off the Television. Sydney: UNSW Press, 2003. Goggin, Gerard. ‘Citizens and Beyond: Universal service in the Twilight of the Nation-State.’ In All Connected?: Universal Service in Telecommunications, ed. Bruce Langtry. Melbourne: University of Melbourne Press, 1998. 49-77 —. Rural Communities Online: Networking to link Consumers to Providers. Melbourne: Telstra Consumer Consultative Council, 2003. Goggin, Gerard, and Newell, Christopher. Digital Disability: The Social Construction of Disability in New Media. Lanham, MD: Rowman & Littlefield, 2003. House of Representatives Standing Committee on Communications, Information Technology and the Arts (HoR). Connecting Australia!: Wireless Broadband. Report of Inquiry into Wireless Broadband Technologies. Canberra: Parliament House, 2002. <http://www.aph.gov.au/house/committee/cita/Wbt/report.htm>, accessed 17 July 2003. Lamberton, Don. ‘A Telecommunications Infrastructure is Not an Information Infrastructure’. Prometheus: Journal of Issues in Technological Change, Innovation, Information Economics, Communication and Science Policy 14 (1996): 31-38. Latour, Bruno. Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge, MA: Harvard University Press, 1987. Luck, David. 
‘Revisiting the Future: Assessing the 1994 BTCE communications futures project.’ Media International Australia 96 (2000): 109-119. MacBride, Sean (Chair of International Commission for the Study of Communication Problems). Many Voices, One World: Towards a New More Just and More Efficient World Information and Communication Order. Paris: Kegan Page, London. UNESCO, 1980. Maitland Commission (Independent Commission on Worldwide Telecommunications Development). The Missing Link. Geneva: International Telecommunications Union, 1985. Michaels, Eric. Bad Aboriginal Art: Tradition, Media, and Technological Horizons. Sydney: Allen & Unwin, 1994. Mody, Bella, Bauer, Johannes M., and Straubhaar, Joseph D., eds. Telecommunications Politics: Ownership and Control of the Information Highway in Developing Countries. Mahwah, NJ: Erlbaum, 1995. Moyal, Ann. Clear Across Australia: A History of Telecommunications. Melbourne: Thomas Nelson, 1984. Post-Master General’s Department (PMG). Community Telephone Plan for Australia. Melbourne: PMG, 1960. Productivity Commission (PC). Telecommunications Competition Regulation: Inquiry Report. Report No. 16. Melbourne: Productivity Commission, 2001. <http://www.pc.gov.au/inquiry/telecommunications/finalreport/>, accessed 17 July 2003. Spurgeon, Christina. ‘National Culture, Communications and the Information Economy.’ Media International Australia 87 (1998): 23-34. Turner, Graeme. ‘First Contact: coming to terms with the cable guy.’ UTS Review 3 (1997): 109-21. Winseck, Dwayne. ‘Wired Cities and Transnational Communications: New Forms of Governance for Telecommunications and the New Media’. In The Handbook of New Media: Social Shaping and Consequences of ICTs, ed. Leah A. Lievrouw and Sonia Livingstone. London: Sage, 2002. 393-409. World Trade Organisation. General Agreement on Trade in Services: Annex on Telecommunications. Geneva: World Trade Organisation, 1994. 17 July 2003 <http://www.wto.org/english/tratop_e/serv_e/12-tel_e.htm>. —. Fourth protocol to the General Agreement on Trade in Services. Geneva: World Trade Organisation. 17 July 2003 <http://www.wto.org/english/tratop_e/serv_e/4prote_e.htm>. Links http://www.accc.gov.au/pubs/publications/utilities/telecommunications/Emerg_mar_struc.doc http://www.accc.gov.au/speeches/2003/Fels_ATUG_6March03.doc http://www.accc.gov.au/telco/fs-telecom.htm http://www.aph.gov.au/house/committee/cita/Wbt/report.htm http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115485,00.html http://www.dcita.gov.au/Article/0,,0_1-2_3-4_115486,00.html http://www.noie.gov.au/projects/access/access/broadband1.htm http://www.noie.gov.au/publications/NOIE/BAG/report/index.htm http://www.pc.gov.au http://www.pc.gov.au/inquiry/telecommunications/finalreport/ http://www.telinquiry.gov.au/final_report.html http://www.telinquiry.gov.au/rti-report.html http://www.wto.org/english/tratop_e/serv_e/12-tel_e.htm http://www.wto.org/english/tratop_e/serv_e/4prote_e.htm Citation reference for this article Substitute your date of access for Dn Month Year etc... MLA Style Goggin, Gerard. "Broadband" M/C: A Journal of Media and Culture< http://www.media-culture.org.au/0308/02-featurebroadband.php>. APA Style Goggin, G. (2003, Aug 26). Broadband. M/C: A Journal of Media and Culture, 6,< http://www.media-culture.org.au/0308/02-featurebroadband.php>
APA, Harvard, Vancouver, ISO, and other styles
28

Downes, Daniel M. "The Medium Vanishes?" M/C Journal 3, no. 1 (March 1, 2000). http://dx.doi.org/10.5204/mcj.1829.

Full text
Abstract:
Introduction The recent AOL/Time-Warner merger invites us to re-think the relationships amongst content producers, distributors, and audiences. Worth an estimated $300 billion (US), the largest Internet transaction of all time, the deal is 45 times larger than the AOL/Netscape merger of November 1998 (Ledbetter). Additionally, the Time Warner/EMI merger, which followed hard on the heels of the AOL/Time-Warner deal and is itself worth $28 billion (US), created the largest content rights organisation in the music industry. The joining of the Internet giant (AOL) with what was already the world's largest media corporation (Time-Warner-EMI) has inspired some exuberant reactions. An Infoworld column proclaimed: The AOL/Time-Warner merger signals the demise of traditional media companies and the ascendancy of 'new economy' media companies that will force any industry hesitant to adopt a complete electronic-commerce strategy to rethink and put itself on Internet time. (Saap & Schwarrtz) This comment identifies the distribution channel as the dominant component of the "new economy" media. But this might not really be much of an innovation. Indeed, the assumption of all industry observers is that Time-Warner will provide broadband distribution (through its extensive cable holdings) as well as proprietary content for AOL. It is also expected that Time-Warner will adopt AOL's strategy of seeking sponsorship for development projects as well as for content. However, both of these phenomena -- merger and sponsorship -- are at least as old as radio. It seems that the Internet is merely repeating an old industrial strategy. Nonetheless, one important difference distinguishes the Internet from earlier media: its characterisation of the audience. Internet companies such as AOL and Microsoft tend towards a simple and simplistic media- centred view of the audience as market. I will show, however, that as the Internet assumes more of the traditional mass media functions, it will be forced to adopt a more sophisticated notion of the mass audience. Indeed, the Internet is currently the site in which audience definitions borrowed from broadcasting are encountering and merging with definitions borrowed from marketing. The Internet apparently lends itself to both models. As a result, definitions of what the Internet does or is, and of how we should understand the audience, are suitably confused and opaque. And the behaviour of big Internet players, such as AOL and MSN, perfectly reflects this confusion as they seem to careen between a view of the Internet as the new television and a contrasting view of the Internet as the new shopping mall. Meanwhile, Internet users move in ways that most observers fail to capture. For example, Baran and Davis characterise mass communication as a process involving (1) an organized sender, (2) engaged in the distribution of messages, (3) directed toward a large audience. They argue that broadcasting fits this model whereas a LISTSERV does not because, even though the LISTSERV may have very many subscribers, its content is filtered through a single person or Webmaster. But why is the Webmaster suddenly more determining than a network programmer or magazine editor? The distinction seems to grow out of the Internet's technological characteristics: it is an interactive pipeline, therefore its use necessarily excludes the possibility of "broadcasting" which in turn causes us to reject "traditional" notions of the audience. 
However, if a media organisation were to establish an AOL discussion group in order to promote Warner TV shows, for example, would not the resulting communication suddenly fall under the definition as set out by Baran and Davis? It was precisely the confusion around such definitions that caused the CRTC (Canada's broadcasting and telecommunications regulator) to hold hearings in 1999 to determine what kind of medium the Internet is. Unlike traditional broadcasting, Internet communication does indeed include the possibility of interactivity and niche communities. In this sense, it is closer to narrowcasting than to broadcasting even while maintaining the possibility of broadcasting. Hence, the nature of the audience using the Internet quickly becomes muddy. While such muddiness might have led us to sharpen our definitions of the audience, it seems instead to have led many to focus on the medium itself. For example, Morris & Ogan define the Internet as a mass medium because it addresses a mass audience mediated through technology (Morris & Ogan 39). They divide producers and audiences on the Internet into four groups: One-to-one asynchronous communication (e-mail); Many-to-many asynchronous communication (Usenet and News Groups); One-to-one, one-to-few, and one-to-many synchronous communication (topic groups, construction of an object, role-playing games, IRC chats, chat rooms); Asynchronous communication (searches, many-to-one, one-to-one, one-to-many, source-receiver relations) (Morris & Ogan 42-3). Thus, some Internet communication qualifies as mass communication while some does not. However, the focus remains firmly anchored on either the sender or the medium because the receiver -- the audience -- is apparently too slippery to define. When definitions do address the content distributed over the Net, they make a distinction between passive reception and interactive participation. As the World Wide Web makes pre-packaged content the norm, the Internet increasingly resembles a traditional mass medium. Timothy Roscoe argues that the main focus of the World Wide Web is not the production of content (and, hence, the fulfilment of the Internet's democratic potential) but rather the presentation of already produced material: "the dominant activity in relation to the Web is not producing your own content but surfing for content" (Roscoe 680). He concludes that if the emphasis is on viewing material, the Internet will become a medium similar to television. Within media studies, several models of the audience compete for dominance in the "new media" economy. Denis McQuail recalls how, historically, the electronic media furthered the view of the audience as a "public". The audience was an aggregate of common interests. With broadcasting, the electronic audience was delocalised and socially decomposed (McQuail, Mass 212). According to McQuail, it was not a great step to move from understanding the audience as a dispersed "public" to thinking about the audience as itself a market, both for products and as a commodity to be sold to advertisers. McQuail defines this conception of the audience as an "aggregate of potential customers with a known social-economic profile at which a medium or message is directed" (McQuail, Mass 221). Oddly though, in light of the emancipatory claims made for the Internet, this is precisely the dominant view of the audience in the "new media economy". Media Audience as Market How does the marketing model characterise the relationship between audience and producer? 
According to McQuail, the marketing model links sender and receiver in a cash transaction between producer and consumer rather than in a communicative relationship between equal interlocutors. Such a model ignores the relationships amongst consumers. Indeed, neither the effectiveness of the communication nor the quality of the communicative experience matters. This model, explicitly calculating and implicitly manipulative, is characteristically a "view from the media" (McQuail, Audience 9). Some scholars, when discussing new media, no longer even refer to audiences. They speak of users or consumers (Pavlik & Dennis). The logic of the marketing model lies in the changing revenue base for media industries. Advertising-supported media revenues have been dropping since the early 1990s while user-supported media such as cable, satellite, online services, and pay-per-view have been steadily growing (Pavlik & Dennis 19). In the Internet-based media landscape, the audience is a revenue stream and a source of consumer information. As Bill Gates says, it is all about "eyeballs". In keeping with this view, AOL hopes to attract consumers with its "one-stop shopping and billing". And Internet providers such as MSN do not even consider their subscribers as "audiences". Instead, they work from a consumer model derived from the computer software industry: individuals make purchases without the seller providing content or thematising the likely use of the software. The analogy extends well beyond the transactional moment. The common practice of prototyping products and beta-testing software requires the participation of potential customers in the product development cycle not as a potential audience sharing meanings but as recalcitrant individuals able to uncover bugs. Hence, media companies like MTV now use the Internet as a source of sophisticated demographic research. Recently, MTV Asia established a Website as a marketing tool to collect preferences and audience profiles (Slater 50). The MTV audience is now part of the product development cycle. Another method for getting information involves the "cookie" file that automatically provides a Website with information about the user who logs on to a site (Pavlik & Dennis). Simultaneously, though, both Microsoft and AOL have consciously shifted from user-subscription revenues to advertising in an effort to make online services more like television (Gomery; Darlin). For example, AOL has long tried to produce content through its own studios to generate sufficiently heavy traffic on its Internet service in order to garner profitable advertising fees (Young). However, AOL and Microsoft have had little success in providing content (Krantz; Manes). In fact, faced with the AOL/Time-Warner merger, Microsoft declared that it was in the software rather than the content business (Trott). In short, they are caught between a broadcasting model and a consumer model and their behaviour is characteristically erratic. Similarly, media companies such as Time-Warner have failed to establish their own portals. Indeed, Time-Warner even abandoned attempts to create large Websites to compete with other Internet services when it shut down its Pathfinder site (Egan). Instead, it refocussed its Websites so as to blur the line between pitching products and covering them (Reid; Lyons). 
Since one strategy for gaining large audiences is the creation of portals -- large Websites that keep surfers within the confines of a single company's site by providing content -- this is the logic behind the AOL/Time-Warner merger, though both companies have clearly been unsuccessful at precisely such attempts. AOL seems to hope that Time-Warner will act as its content specialist, providing the type of compelling material that will make users want to use AOL, whereas Time-Warner seems to hope that AOL will become its privileged pipeline to the hearts and minds of untold millions. Neither has a coherent view of the audience, how it behaves, or how it should behave. Consequently, their efforts have a distinctly "unmanaged" and slightly inexplicable air to them, as though everyone were simultaneously hopeful and clueless. While one might argue that the stage is set to capitalise on the audience as commodity, there are indications that the success of such an approach is far from guaranteed. First, the AOL/Time-Warner/EMI transaction, merely by existing, has sparked conflicts over proprietary rights. For example, the Recording Industry Association of America, representing Sony, Universal, BMG, Warner and EMI, recently launched a $6.8 billion lawsuit against MP3.com -- an AOL subsidiary -- for alleged copyright violations. Specifically, MP3.com is being sued for selling digitized music over the Internet without paying royalties to the record companies (Anderson). A similar lawsuit has recently been launched over the issue of rebroadcasting television programs over the Internet. The major US networks have joined together against Canadian Internet company iCravetv for the unlawful distribution of content. Both the iCravetv and the MP3.com cases show how dominant media players can marshal their forces to protect proprietary rights in both content and distribution. Since software and media industries have failed to recreate the Internet in the image of traditional broadcasting, the merger of the dominant players in each industry makes sense. However, their simultaneous failure to secure proprietary rights reflects both the competitive nature of the "new media economy" and the weakness of the marketing view of the audience. Media Audience as Public It is often said that communication produces social cohesion. From such cohesion communities emerge on which political or social orders can be constructed. The power of social cohesion and attachment to group symbols can even create a sense of belonging to a "people" or nation (Deutsch). Sociologist Daniel Bell described how the mass media helped create an American culture simply by addressing a large enough audience. He suggested that on the evening of 7 March 1955, when one out of every two Americans could see Mary Martin as Peter Pan on television, a kind of social revolution occurred and a new American public was born. "It was the first time in history that a single individual was seen and heard at the same time by such a broad public" (Bell, quoted in Mattelart 72). One could easily substitute the 1953 World Series or the birth of little Ricky on I Love Lucy. The desire to document such a process recurs with the Internet. Internet communities are based on the assumption that a common experience "creates" group cohesion (Rheingold; Jones). However, as a mass medium, the Internet has yet to find its originary moment, that event to which all could credibly point as the birth of something genuine and meaningful. 
A recent contender was the appearance of Paul McCartney at the refurbished Cavern Club in Liverpool. On Tuesday, 14 December 1999, McCartney played to a packed club of 300 fans, while another 150,000 watched on an outdoor screen nearby. MSN arranged to broadcast the concert live over the Internet. It advertised an anticipated global audience of 500 million. Unfortunately, there was such heavy Internet traffic that the system was unable to accommodate more than 3 million people. Servers in the United Kingdom were so congested that many could only watch the choppy video stream via an American link. The concert raises a number of questions about "virtual" events. We can draw several conclusions about measuring Internet audiences. While 3 million is a sizeable audience for a 20-minute transmission, by advertising a potential audience of 500 million, MSN showed remarkably poor judgment of its inherent appeal. The Internet is the first medium that allows access to unprocessed material or information about events to be delivered to an audience with neither the time constraints of broadcast media nor the space limitations of the traditional press. This is often cited as one of the characteristics that set the Internet apart from other media. This feeds the idea of the Internet audience as a participatory, democratic public. For example, it is often claimed that the Internet can foster democratic participation by providing voters with uninterpreted information about candidates and issues (Selnow). However, as James Curran argues, the very process of distributing uninterrupted, unfiltered information, at least in the case of traditional mass media, represents an abdication of a central democratic function -- that of watchdog to power (Curran). In the end, publics are created and maintained through active and continuous participation on the part of communicators and audiences. The Internet holds together potentially conflicting communicative relationships within the same technological medium (Morris & Ogan). Viewing the audience as co-participant in a communicative relationship makes more sense than simply focussing on the Internet audience as either an aggregate of consumers or a passively constructed symbolic public. Audience as Relationship Many scholars have shifted attention from the producer to the audience as an active participant in the communication process (Ang; McQuail, Audience). Virginia Nightingale goes further to describe the audience as part of a communicative relationship. Nightingale identifies four factors in the relationship between audiences and producers that emphasize their co-dependency. The audience and producer are engaged in a symbiotic relationship in which consumption and use are necessary but not sufficient explanations of audience relations. The notion of the audience invokes, at least potentially, a greater range of activities than simply use or consumption. Further, the audience actively, if not always consciously, enters relationships with content producers and the institutions that govern the creation, distribution and exhibition of content (Nightingale 149-50). Others have demonstrated how this relationship between audiences and producers is no longer the one-sided affair characterised by the marketing model or the model of the audience as public. A global culture is emerging based on critical viewing skills. 
Kavoori calls this a reflexive mode born of an increasing familiarity with the narrative conventions of news and an awareness of the institutional imperatives of media industries (Kavoori). Given the sophistication of the emergent global audience, a theory that reduces new media audiences to a set of consumer preferences or behaviours will inevitably prove inadequate, just as it has for understanding audience behaviour in old media. Similarly, by ignoring those elements of audience behaviour that will be easily transported to the Web, we run the risk of idealising the Internet as a medium that will create an illusory, pre-technological public. Conclusion There is an understandable confusion between the two models of the audience that appear in the examples above. The "new economy" will have to come to terms with sophisticated audiences. Contrary to IBM's claim that they want to "get to know all about you", Internet users do not seem particularly interested in becoming a perpetual source of market information. The fragmented, autonomous audience resists attempts to lock it into proprietary relationships. Internet hypesters talk about creating publics and argue that the Internet recreates the intimacy of community as a corrective to the atomisation and alienation characteristic of mass society. This faith in the power of a medium to create social cohesion recalls the view of the television audience as a public constructed by the common experience of watching an important event. However, MSN's McCartney concert indicates that creating a public from spectacle is not a simple process. In fact, what the Internet media conglomerates seem to want more than anything is to create consumer bases. Audiences exist for pleasure and by the desire to be entertained. As Internet media institutions are established, the cynical view of the audience as a source of consumer behaviour and preferences will inevitably give way, to some extent, to a view of the audience as participant in communication. Audiences will be seen, as they have been by other media, as groups whose attention must be courted and rewarded. Who knows, maybe the AOL/Time-Warner merger might, indeed, signal the new medium's coming of age. References Anderson, Lessley. "To Beam or Not to Beam. MP3.com Is Being Sued by the Major Record Labels. Does the Digital Download Site Stand a Chance?" Industry Standard 31 Jan. 2000. <http://www.thestandard.com>. Ang, Ien. Watching Dallas: Soap Opera and the Melodramatic Imagination. London: Methuen, 1985. Baran, Stanley, and Dennis Davis. Mass Communication Theory: Foundations, Ferment, and Future. 2nd ed. Belmont, Calif.: Wadsworth, 2000. Curran, James. "Mass Media and Democracy Revisited." Mass Media and Society. Eds. James Curran and Michael Gurevitch. New York: Hodder Headline Group, 1996. Darlin, Damon. "He Wants Your Eyeballs." Forbes 159 (16 June 1997): 114-6. Egan, Jack. "Pathfinder, Rest in Peace: Time-Warner Pulls the Plug on Site." US News and World Report 126.18 (10 May 1999): 50. Gomery, Douglas. "Making the Web Look like Television (American Online and Microsoft)." American Journalism Review 19 (March 1997): 46. Jones, Steve, ed. CyberSociety: Computer-Mediated Communication and Community. Thousand Oaks: Sage, 1995. Kavoori, Amandam P. "Discursive Texts, Reflexive Audiences: Global Trends in Television News Texts and Audience Reception." Journal of Broadcasting and Electronic Media 43.3 (Summer 1999): 386-98. Krantz, Michael. "Is MSN on the Block?" Time 150 (20 Oct. 1997): 82. 
Ledbetter, James. "AOL-Time-Warner Make It Big." Industry Standard 11 Jan. 2000. <http://www.thestandard.com>. Lyons, Daniel. "Desparate.com (Media Companies Losing Millions on the Web Turn to Electronic Commerce)." Forbes 163.6 (22 March 1999): 50-1. Manes, Stephen. "The New MSN as Prehistoric TV." New York Times 4 Feb. 1997: C6. McQuail, Denis. Audience Analysis. Thousand Oaks, Calif.: Sage, 1997. ---. Mass Communication Theory. 2nd ed. London: Sage, 1987. Mattelart, Armand. Mapping World Communication: War, Progress, Culture. Trans. Susan Emanuel and James A. Cohen. Minneapolis: U of Minnesota P, 1994. Morris, Merrill, and Christine Ogan. "The Internet as Mass Medium." Journal of Communications 46 (Winter 1996): 39-50. Nightingale, Virginia. Studying Audience: The Shock of the Real. London: Routledge, 1996. Pavlik, John V., and Everette E. Dennis. New Media Technology: Cultural and Commercial Perspectives. 2nd ed. Boston: Allyn and Bacon, 1998. Reid, Calvin. "Time-Warner Seeks Electronic Synergy, Profits on the Web (Pathfinder Site)." Publisher's Weekly 242 (4 Dec. 1995): 12. Rheingold, Howard. Virtual Community: Homesteading on the Electronic Frontier. New York: Harper, 1993. Roscoe, Timothy. "The Construction of the World Wide Web Audience." Media, Culture and Society 21.5 (1999): 673-84. Saap, Geneva, and Ephraim Schwarrtz. "AOL-Time-Warner Deal to Impact Commerce, Content, and Access Markets." Infoworld 11 January 2000. <http://infoworld.com/articles/ic/xml/00/01/11/000111icimpact.xml>. Slater, Joanna. "Cool Customers: Music Channels Hope New Web Sites Tap into Teen Spirit." Far Eastern Economic Review 162.9 (4 March 1999): 50. Trott, Bob. "Microsoft Views AOL-Time-Warner as Confirmation of Its Own Strategy." Infoworld 11 Jan. 2000. <http://infoworld.com/articles/pi/xml/00/01/11/000111pimsaoltw.xml>. Yan, Catherine. "A Major Studio Called AOL?" Business Week 1 Dec. 1997: 1773-4. Citation reference for this article MLA style: Daniel M. Downes. "The Medium Vanishes? The Resurrection of the Mass Audience in the New Media Economy." M/C: A Journal of Media and Culture 3.1 (2000). [your date of access] <http://www.uq.edu.au/mc/0003/mass.php>. Chicago style: Daniel M. Downes, "The Medium Vanishes? The Resurrection of the Mass Audience in the New Media Economy," M/C: A Journal of Media and Culture 3, no. 1 (2000), <http://www.uq.edu.au/mc/0003/mass.php> ([your date of access]). APA style: Daniel M. Downes. (2000) The Medium Vanishes? The Resurrection of the Mass Audience in the New Media Economy. M/C: A Journal of Media and Culture 3(1). <http://www.uq.edu.au/mc/0003/mass.php> ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
