
Journal articles on the topic 'Utilities (Computer program)'


Consult the top 50 journal articles for your research on the topic 'Utilities (Computer program).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Goldman, Charles A., Ian Hoffman, Sean Murphy, Natalie Mims Frick, Greg Leventis, and Lisa Schwartz. "The Cost of Saving Electricity: A Multi-Program Cost Curve for Programs Funded by U.S. Utility Customers." Energies 13, no. 9 (2020): 2369. http://dx.doi.org/10.3390/en13092369.

Abstract:
This study analyzed the cost performance of electricity efficiency programs implemented by 116 investor-owned utilities between 2009 and 2015 in 41 states, representing about three-quarters of the total spending on U.S. efficiency programs. We applied our typology to characterize efficiency programs along several dimensions (market sector, technology, delivery approach, and intervention strategy) and report the costs incurred by utilities and other program administrators to achieve electricity savings as a result of the programs. Such cost performance data can be used to compare relative costs of different types of efficiency programs, evaluate efficiency options alongside other electricity resources, benchmark local efficiency programs against regional and national cost estimates, and assess the costs of meeting state efficiency policies. The savings-weighted average cost of saved electricity for the period was $0.025/kilowatt-hour (kWh). The cost of saved electricity for programs that targeted residential customers was $0.021/kWh, compared to $0.025/kWh for programs for commercial and industrial customers. Ultimately, we developed an aggregate program savings “cost curve” for the actual electricity efficiency resource during the period that provides insights into the relative costs of various types of efficiency programs and the savings contribution of each program type to the efficiency resource at a national level.
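The paper's headline metric, a savings-weighted average cost of saved electricity, is simple to reproduce. A minimal sketch with invented program data (not the study's numbers):

```python
# Each tuple: (program type, electricity savings in kWh, cost of saved energy in $/kWh).
# The figures below are made up for illustration only.
programs = [
    ("residential lighting", 4.0e9, 0.018),
    ("residential HVAC",     1.5e9, 0.030),
    ("C&I custom rebates",   5.0e9, 0.024),
]

total_savings = sum(savings for _, savings, _ in programs)
weighted_cse = sum(savings * cost for _, savings, cost in programs) / total_savings
print(f"savings-weighted average: ${weighted_cse:.3f}/kWh")
```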
2

Moukalled, F., N. Naim, and I. Lakkis. "Computer-Aided Analysis of Centrifugal Compressors." International Journal of Mechanical Engineering Education 22, no. 4 (1994): 245–58. http://dx.doi.org/10.1177/030641909402200402.

Abstract:
This paper describes computer-aided analysis of centrifugal compressors (CAACC), a micro-computer-based, interactive, and menu-driven software package for use as an educational tool by mechanical engineering students studying radial flow compressors. CAACC is written in the Pascal computer language and runs on IBM PC, or compatible, computers. In addition to solving for any unknown variables, the graphical utilities of the package allow the user to display a diagrammatic sketch of the compressor and to draw velocity diagrams at several locations. Furthermore, the program allows the investigation and plotting of the variation of any parameter versus any other parameter. Through this option, the package guides the student in learning the basics of centrifugal compressors by the various performance studies that can be undertaken and graphically displayed. The comprehensive example presented demonstrates the capabilities of the package as a teaching tool.
3

Waits, Bert K., and Franklin Demana. "Computers and the Rational-Root Theorem—Another View." Mathematics Teacher 82, no. 2 (1989): 124–25. http://dx.doi.org/10.5951/mt.82.2.0124.

Abstract:
A recent article offers a 153-line computer program that numerically determines the rational roots of polynomial equations of degree n ≤ 10 (O'Donnell 1988). The program forms and systematically checks all possible rational roots. This article outlines another approach to finding the rational roots of polynomial equations, based on computer graphing, that is more general and integrates graphing with the purely algebraic approach. The method is easy and can be used with popular computer graphing utilities or with graphing calculators such as the Casio fx-7000G or the Sharp EL-5200.
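The purely algebraic approach the authors integrate with graphing is the rational-root theorem itself: every rational root p/q of an integer-coefficient polynomial must have p dividing the constant term and q dividing the leading coefficient. A short Python sketch (an illustration, not O'Donnell's program):

```python
from fractions import Fraction
from itertools import product

def rational_roots(coeffs):
    """Exact rational roots of an integer-coefficient polynomial,
    coefficients given highest degree first, e.g. [2, -3, -2] for
    2x^2 - 3x - 2. Candidates are +/- p/q with p dividing the constant
    term and q dividing the leading coefficient."""
    a_n, a_0 = coeffs[0], coeffs[-1]
    if a_0 == 0:  # x = 0 is a root; divide by x and keep searching
        return {Fraction(0)} | rational_roots(coeffs[:-1])
    ps = [p for p in range(1, abs(a_0) + 1) if a_0 % p == 0]
    qs = [q for q in range(1, abs(a_n) + 1) if a_n % q == 0]
    roots = set()
    for p, q, sign in product(ps, qs, (1, -1)):
        r = Fraction(sign * p, q)
        if sum(c * r ** k for k, c in enumerate(reversed(coeffs))) == 0:
            roots.add(r)
    return roots

print(rational_roots([2, -3, -2]))  # {Fraction(2, 1), Fraction(-1, 2)}
```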
4

Toby, Brian H., Robert B. Von Dreele, and Allen C. Larson. "CIF applications. XIV. Reporting of Rietveld results using pdCIF: GSAS2CIF." Journal of Applied Crystallography 36, no. 5 (2003): 1290–94. http://dx.doi.org/10.1107/s0021889803016819.

Abstract:
A discussion of the process of creating powder diffraction CIF documents (pdCIF) from Rietveld results is presented, with particular focus on the computer program GSAS2CIF. The data structures used within GSAS2CIF are described, as well as how the program implements template files for descriptive information. Two graphical user interface utilities are also discussed.
5

Lee, Eunjung, Jinho Kim, and Dongsik Jang. "Load Profile Segmentation for Effective Residential Demand Response Program: Method and Evidence from Korean Pilot Study." Energies 13, no. 6 (2020): 1348. http://dx.doi.org/10.3390/en13061348.

Abstract:
Due to the heterogeneity of demand response behaviors among customers, selecting a suitable segment is one of the key factors for the efficient and stable operation of a demand response (DR) program. Most utilities recognize the importance of targeted enrollment. Customer targeting in DR programs is normally implemented based on customer segmentation. Residential customers are characterized by low electricity consumption and large variability across times of consumption. These factors are considered to be the primary challenges in household load profile segmentation. Existing customer segmentation methods have limitations in reflecting daily consumption of electricity, peak demand timings, and load patterns. In this study, we propose a new clustering method to segment customers more effectively in residential demand response programs and thereby identify suitable customer targets in DR. The approach can be described as a two-stage k-means procedure including consumption features and load patterns. We provide evidence of the outstanding performance of the proposed method compared to existing k-means, Self-Organizing Map (SOM), and Fuzzy C-Means (FCM) models. Segmentation results are also analyzed to identify appropriate groups for participation in DR, and the DR effect of targeted groups was estimated in comparison with customers without load profile segmentation. We applied the proposed method to residential customers who participated in a peak-time rebate pilot DR program in Korea. The results prove that the proposed method performs outstandingly: demand reduction increased by 33.44% compared with the opt-in case, and the utility's cost saving in DR operation was 437,256 KRW. Furthermore, our study shows that organizations applying DR programs, such as retail utilities or independent system operators, can manage incentive-based DR programs more economically by selecting targeted customers.
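The abstract does not spell out the two stages, so the sketch below is only a plausible reading: cluster first on consumption-level features, then on normalized load shapes within each consumption group (synthetic data; all feature choices are assumptions):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
loads = rng.gamma(2.0, 0.5, size=(300, 24))  # 300 households x 24 hourly kW values

# Stage 1: cluster on consumption features (daily total and peak demand).
features = np.column_stack([loads.sum(axis=1), loads.max(axis=1)])
stage1 = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Stage 2: within each stage-1 group, cluster on the normalized load *shape*.
segments = np.empty(len(loads), dtype=object)
for g in np.unique(stage1):
    idx = np.where(stage1 == g)[0]
    shapes = loads[idx] / loads[idx].sum(axis=1, keepdims=True)
    sub = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(shapes)
    for i, s in zip(idx, sub):
        segments[i] = f"{g}-{s}"

print(sorted(set(segments)))  # six candidate target segments, e.g. '0-0' ... '2-1'
```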
6

Karki, Nava Raj, Ajit Kumar Verma, Rajesh Karki, and Arbind Kumar Mishra. "Residential Customers' Outage Cost Analysis for Urban and Semi-Urban Areas in a Developing Country." International Journal of Reliability, Quality and Safety Engineering 16, no. 06 (2009): 581–93. http://dx.doi.org/10.1142/s0218539309003599.

Abstract:
This paper discusses the cost of unreliability of electricity supply to residential customers. The electricity supply outage cost is evaluated by a customer survey technique for urban and semi-urban residential areas of a developing country. The energy consumption patterns of urban and semi-urban areas are determined. The electricity supply outage cost is evaluated using both the preparatory action approach and the contingent valuation method. A detailed breakdown of customers' average outage costs for an entire day, based on eight three-hourly time durations, is obtained, giving easily understandable figures to the utilities. The power consumption patterns obtained are in close agreement with the utility's load curve. A MATLAB-based computer program using least-squares error approximation is used to evaluate the outage cost.
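As a rough illustration of the final step, a least-squares error approximation can fit a smooth outage-cost curve to survey points; the snippet below uses invented figures and NumPy in place of the authors' MATLAB program:

```python
import numpy as np

# Hypothetical survey results: outage duration (hours) vs. average
# reported cost per household (arbitrary currency units).
duration = np.array([0.5, 1, 2, 4, 8, 12, 24])
cost = np.array([5, 9, 16, 35, 80, 130, 290])

# Least-squares fit of a quadratic cost model.
model = np.poly1d(np.polyfit(duration, cost, deg=2))
print(f"estimated cost of a 6-hour outage: {model(6):.1f}")
```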
7

Moukalled, F., and A. Honein. "Computer-Aided Analysis of the Pelton Wheel." International Journal of Mechanical Engineering Education 23, no. 4 (1995): 297–314. http://dx.doi.org/10.1177/030641909502300403.

Abstract:
This paper describes PELTON, a microcomputer-based, interactive, and menu-driven software package for use as an educational tool by mechanical and civil engineering students in studying the operation of the Pelton wheel. The program is written in the Pascal computer language and runs on IBM PC, or compatible, computers. The package can handle problems related to impulse turbines by solving for unknown variables through a complete set of equations covering the turbine installation. Model-prototype problems can be tackled through similarity laws; this facility is included to help in analysing and manipulating experimental data. Furthermore, the graphical utilities of PELTON allow the user to display diagrammatic sketches of the turbine, to employ some recommended charts, and to draw velocity triangles at several locations. The most important feature of the program, however, is its ability to plot the variation of any variable versus any other one. Through this option, the package guides the student in understanding the effects of varying design parameters on the overall performance of the machine. Finally, a comprehensive example problem is provided to show how user-friendly and encouraging to use PELTON is, and to demonstrate the capabilities of the package as an instructional tool.
8

Moukalled, F., and A. Honein. "Computer-Aided Analysis of Hydraulic Reaction Turbines." International Journal of Mechanical Engineering Education 25, no. 2 (1997): 73–91. http://dx.doi.org/10.1177/030641909702500201.

Abstract:
A microcomputer-based educational software package designed to help mechanical engineering students to understand hydraulic reaction turbines is described. The software is interactive, menu-driven, and easy-to-use, is written in the Pascal computer language, and runs on IBM PC, or compatible, computers. The program can handle radial, mixed, or axial flow turbine problems by solving for unknown variables through a complete set of equations covering the turbine installation. Using similarity laws, model-prototype problems or operation under different conditions can also be tackled. Furthermore, the program is equipped with graphical utilities that include many diagrammatic sketches of reaction turbines, some recommended charts, and the possibility of drawing velocity triangles when corresponding variables are available. The most important feature of the package is an option that allows one to plot the variation of any parameter versus any other one. Through this option, the student can easily understand and discuss the effects of varying design parameters on the overall performance of the machine. Finally, some special features that are important in making the package user-friendly and encouraging-to-use are also available, and the comprehensive example problem provided demonstrates the capabilities of the package as an instructional tool.
9

Rocha, Gabriel Vianna Soares, Raphael Pablo de Souza Barradas, João Rodrigo Silva Muniz, et al. "Optimized Surge Arrester Allocation Based on Genetic Algorithm and ATP Simulation in Electric Distribution Systems." Energies 12, no. 21 (2019): 4110. http://dx.doi.org/10.3390/en12214110.

Abstract:
The efficient protection of electric power distribution networks against lightning discharges is a crucial problem for distribution electric utilities. The great challenge is to find a solution for the installation of surge arresters at specific points in the electrical grid, in a quantity sufficient to ensure an adequate level of equipment protection while staying within the utility's budget. As a solution to this problem, this paper presents a methodology, using ATP (the Alternative Transients Program), for optimized surge arrester allocation based on a genetic algorithm (GA), with a fitness function that maximizes the number of protected pieces of equipment according to the financial availability for investment in surge arresters. As ATP may demand too much processing time when running large distribution grids, an innovative procedure is implemented to obtain an overvoltage severity description of the grid and select only the most critical electric nodes for the incidence of lightning discharges in the GA allocation procedure. The results obtained for the IEEE 123-bus electric feeder indicate a great reduction in flashover occurrences, thus increasing the equipment protection level.
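The ATP-derived overvoltage data are not reproducible here, so the sketch below only shows the general pattern of a budget-constrained GA allocation: a binary chromosome over candidate nodes and a fitness that rewards protected equipment while penalizing budget overruns (all numbers and operators are assumptions):

```python
import random

random.seed(1)
N_NODES = 30          # candidate poles for arrester installation
ARRESTER_COST = 1.0   # cost per arrester (arbitrary units)
BUDGET = 10.0
# protected[i]: equipment shielded if an arrester is placed at node i (made up).
protected = [random.randint(1, 5) for _ in range(N_NODES)]

def fitness(chrom):
    """Protected-equipment count, heavily penalized when over budget."""
    cost = sum(chrom) * ARRESTER_COST
    score = sum(p for gene, p in zip(chrom, protected) if gene)
    return score - 1000 * max(0.0, cost - BUDGET)

def evolve(pop_size=40, gens=100, p_mut=0.02):
    pop = [[random.randint(0, 1) for _ in range(N_NODES)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                  # truncation selection
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_NODES)        # one-point crossover
            child = [g ^ (random.random() < p_mut)    # bit-flip mutation
                     for g in a[:cut] + b[cut:]]
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print("arresters placed:", sum(best), "fitness:", fitness(best))
```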
10

Achour, Nebil, Masakatsu Miyajima, Federica Pascale, and Andrew D.F. Price. "Hospital resilience to natural hazards: classification and performance of utilities." Disaster Prevention and Management 23, no. 1 (2014): 40–52. http://dx.doi.org/10.1108/dpm-03-2013-0057.

Abstract:
Purpose – The purpose of this paper is to: explore major and potential challenges facing healthcare facilities operation, specifically those related to utility supplies; and quantify the impact of utility supply interruption on the operation of healthcare facilities through the development of an estimation model.
Design/methodology/approach – A pluralistic qualitative and quantitative research approach benefiting from an online computer program that applies the discriminant function analysis approach. Information was collected from 66 hospitals following three major earthquakes that struck northeast Japan in 2003.
Findings – Analysis demonstrated that healthcare utilities face three major challenges: vulnerability of infrastructure to natural hazards; low performance of alternative sources; and lack of consideration of healthcare utility supplies in resilience codes and legislation. The study also proposed a method to estimate the impact of utility interruption on healthcare facilities. A model was developed for the case-study hospitals in northern Japan following the three major earthquakes of 2003.
Practical implications – The findings are expected to raise awareness of the critical role utilities play in the operation of healthcare facilities, which will potentially lead to upgrading resilience codes and legislation. The findings are also expected to enrich the literature on the resilience of healthcare utilities.
Originality/value – The topic and issues discussed in this research are original, based on the authors' investigations following three major earthquakes in northeast Japan. The study followed a statistical approach in addressing the inter-relationship between utility systems post-disaster to develop an innovative index to predict the impact of utility shortage on healthcare.
11

Kelso, John, Steven G. Satterfield, Lance E. Arsenault, Peter M. Ketchan, and Ronald D. Kriz. "DIVERSE: A Framework for Building Extensible and Reconfigurable Device-Independent Virtual Environments and Distributed Asynchronous Simulations." Presence: Teleoperators and Virtual Environments 12, no. 1 (2003): 19–36. http://dx.doi.org/10.1162/105474603763835314.

Abstract:
We present DIVERSE, a highly modular collection of complementary software packages designed to facilitate the creation of device-independent virtual environments and distributed asynchronous simulations. DIVERSE is free/open source software, containing both end-user programs and C++ application programming interfaces (APIs). DPF is the DIVERSE graphics interface to OpenGL Performer. A program using the DPF API can run without modification on platforms ranging from fully immersive systems such as CAVEs to generic desktop workstations. The DIVERSE toolkit (DTK) contains all the nongraphical components of DIVERSE, such as networking utilities, hardware device access, and navigational techniques. It introduces a software implementation of networks of replicated noncoherent shared memory. It also introduces a method that seamlessly extends hardware drivers into interprocess and Internet hardware services. We describe the design of DIVERSE and present a specific example of how it is being used to aid researchers.
12

Lisova, Ekaterina V. "Housing conditions of the population as an indicator of the region social development level." Scientific notes of the Russian academy of entrepreneurship 20, no. 1 (2021): 172–77. http://dx.doi.org/10.24182/2073-6258-2021-20-1-172-177.

Abstract:
The article is devoted to the level of social development of Russian regions. It is noted that at present the social component plays a leading role in the research of various authors. It is proposed to use the indicators published in the statistical collection "Regions of Russia. Socio-economic indicators", as well as regional housing and utilities tariffs for assessing the level of social development. A method for determining the integral indicator of social development has been developed, and a computer program has been created to determine its numerical values based on indicators of housing conditions of the population. The calculated generalized indicators for the regions of the Central Federal District are given. Conclusions are drawn about the possibility of using the obtained data to assess and predict the level of social regional development.
13

Mancini, Francesco, Gianluigi Lo Basso, and Livio de Santoli. "Energy Use in Residential Buildings: Impact of Building Automation Control Systems on Energy Performance and Flexibility." Energies 12, no. 15 (2019): 2896. http://dx.doi.org/10.3390/en12152896.

Abstract:
This work shows the results of a research activity aimed at characterizing the energy habits of Italian residential users. In detail, through energy simulation of a sample of buildings, the opportunity to implement a demand response (DR) program has been investigated. Italian residential utilities are poorly electrified and flexible loads are low. The presence of an automation system is an essential requirement for participating in a DR program and, in addition, it can allow important reductions in energy consumption. In this work the characteristics of three control systems have been defined, based on the incidence of services on energy consumption, along with a sensitivity analysis on some energy drivers. Using the procedure established by the European Standard EN 15232, the achievable energy and economic savings have been evaluated. Finally, a financial analysis of the investments has been carried out, considering also the incentives provided by Italian regulations. The payback time is generally not very long: depending on the control system features, it varies from 7 to 10 years; moreover, installing the automation system within dwellings is a relatively simple activity, characterized by limited execution times and by an initial expenditure ranging from €1000 to €4000 for the three sample systems.
14

Singh, Asheesh, Pradeep Singour, and Parul Singh. "Computer system validation in the perspective of the pharmaceutical industry." Journal of Drug Delivery and Therapeutics 8, no. 6-s (2018): 359–65. http://dx.doi.org/10.22270/jddt.v8i6-s.2102.

Abstract:
Computer Systems Validation (CSV) is a process used to ensure (and document) that a computer-based system will produce information or data that meet a set of defined requirements. If a system meets these requirements, it can be assumed that it is consistently performing in the way it was intended. Quality is an imperative for customers whenever they consider a product or service. It is also important as it relates to life-saving products such as pharmaceuticals. In this regard, the Food and Drug Administration introduced good manufacturing practice (GMP) to maintain and improve the quality of pharmaceutical products. GMP ensures that products are consistently produced and controlled according to the quality standards appropriate to the intended use and as required by the marketing authorization. One of the major GMP requirements is that all of the critical manufacturing equipment, utilities, and facilities in the pharmaceutical industry must be properly qualified and validated prior to production. Currently, this practice forms the core of the regulations that are strictly followed by pharmaceutical companies worldwide. A validation assessment program is a necessity in the pharma industry to ensure adherence to pharmaceutical cGMP guidelines and to help companies maintain consistent quality. The same principles are applied in computer system validation to a computer system or an information technology system. It is essential to maintain quality standards in pharma, since non-conformance can have far-reaching consequences. Computer system validation checks the effectiveness and the efficiency with which the system meets the purpose for which it was designed. This study aims to identify the needs of computer system validation of instruments/equipment as practiced in the pharmaceutical industry.
Keywords: Computer system validation, Validation, Qualification, GAMP
15

Petruzzi, Alessandro, Francesco D'Auria, Tomislav Bajs, Francesc Reventos, and Yassin Hassan. "International Course to Support Nuclear Licensing by User Training in the Areas of Scaling, Uncertainty, and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes: 3D S.UN.COP Seminars." Science and Technology of Nuclear Installations 2008 (2008): 1–16. http://dx.doi.org/10.1155/2008/874023.

Abstract:
Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the “user effect” and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars during which particular emphasis is given to the areas of the scaling, uncertainty, and 3D coupled code analysis.
16

Wattanasaeng, Niroot, and Kasin Ransikarbum. "Model and Analysis of Economic- and Risk-Based Objective Optimization Problem for Plant Location within Industrial Estates Using Epsilon-Constraint Algorithms." Computation 9, no. 4 (2021): 46. http://dx.doi.org/10.3390/computation9040046.

Abstract:
In many countries, a number of industrial estates have been established to support the growth of the industrial sector, which is an essential strategy to drive economic growth. Planning the locations of industrial factories within an industrial estate, however, becomes complex, given the various types of industrial plants and the requirements for utilities to support operations within an industrial park. In this research, we model and analyze a bi-objective optimization problem for locating plants within an industrial estate by considering economic- and risk-based cost objectives. Whereas the economic objective is associated with utility distances between plant locations, the risk-based cost is a surrogate criterion derived from safety considerations. Next, risk-based data are generated from the Areal Locations of Hazardous Atmospheres (ALOHA) hazard modeling program, and solutions to the bi-objective model are obtained with the epsilon-constraint algorithm. Finally, the model is applied to a regional case study in a Thai industrial estate, and the Pareto frontier is evaluated to demonstrate the trade-off between objectives.
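The epsilon-constraint algorithm itself is compact: optimize one objective while the other is demoted to a constraint whose bound is swept, tracing the Pareto frontier point by point. A toy sketch with SciPy on an invented two-variable problem (not the paper's model):

```python
import numpy as np
from scipy.optimize import linprog

# Toy bi-objective problem: mixing fractions x of two plant layouts.
cost = np.array([4.0, 7.0])   # economic objective (minimized)
risk = np.array([9.0, 2.0])   # risk-based objective (epsilon-constrained)

for eps in np.linspace(risk.min(), risk.max(), 5):
    res = linprog(
        c=cost,
        A_ub=[risk], b_ub=[eps],      # epsilon constraint on the risk objective
        A_eq=[[1.0, 1.0]], b_eq=[1.0],
        bounds=[(0, 1), (0, 1)],
    )
    if res.success:
        print(f"eps={eps:4.1f}  cost={res.fun:.2f}  x={np.round(res.x, 2)}")
```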
17

Adekunle, Adefemi Adeyemi, Samuel B. Adejuyigbe, and Lateef O. Mudashiru. "Improvement on Development of CAD Software for Shaft under Various Loading Conditions." Advanced Materials Research 628 (December 2012): 343–49. http://dx.doi.org/10.4028/www.scientific.net/amr.628.343.

Abstract:
In times past, shaft design was conservative in approach, as relatively low working stresses were employed to cover for unknown "factors of ignorance" [1], and where high stresses were involved, larger shaft dimensions were employed. This approach necessarily made production costs high, as quite a lot of material is involved in such production. The primary reason for this approach was to safeguard against shaft failure in operation. Unfortunately, the mechanism of the failure mode was not well understood, even though most of the applications in which shafts are employed are of great importance, in some cases affecting life negatively (i.e., involving loss of lives) and disrupting operations. Despite these developments, designers are still faced with the problem of working with the large number of formulas, computations, and iteration procedures involved in the design of shafts. These have made shaft design procedures cumbersome and rigorous, and hence time consuming. The problem becomes more pronounced if the designer is interested in seeing the effects of varying one or more design parameters, which means starting again from scratch. Fortunately, these problems can be overcome if the various advantages, utilities, and flexibility that modern high-speed microcomputers offer are put to good use in the design of shafts. This study was carried out to design shafts under various loading conditions using computer-aided design, and the results obtained proved that it reduces material wastage and saves time. A software package (or program) was developed using the formulas initially derived and a numerical procedure for computing deflection using the double-integration method. The programming language used was Visual Studio C#. This objective was achieved in part, as the program so developed satisfactorily handles design based on the strength and safety of the shaft.
18

Herz, R. K., and A. T. Lipkow. "Strategic water network rehabilitation planning." Water Supply 3, no. 1-2 (2003): 35–42. http://dx.doi.org/10.2166/ws.2003.0083.

Abstract:
This paper presents the approach taken and the tools developed and advanced within the European research project CARE-W (Computer Aided REhabilitation of Water networks) for strategic rehabilitation investment planning as a complement to short-term performance monitoring and annual rehabilitation (rehab) budget allocation planning. In a first step, future rehab needs are quantified with a cohort-survival model from the present stock of assets taking into account the specific service lives of its components. Utility managers may choose in the short and medium range from many rehab options: doing more or less, sooner or later, on particular network components and with specific rehab technologies at lower or higher cost. So, in a second step, alternative medium-term rehab programs are specified and tested for their effects. The annual costs and benefits of these alternative rehab programs are forecast with the cohort-survival model beyond the rehab program period to capture the long-term effects of rehabilitating these long-lived assets. Advantages and disadvantages of alternative rehab programs are systematically compared to find out which one is most appropriate under local constraints. However, whereas the survival of network components can be forecast over very long periods with sufficient accuracy, many other characteristics of the water supply system that must be considered for finding the best network rehab strategy may take unforeseeable paths into the far future. Therefore, a scenario writing tool was developed allowing consistent scenarios for particular water utilities to be created and to test whether the alternative rehab programs are robust enough to meet all eventualities of the future. This approach is illustrated by a case study from East Germany.
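A cohort-survival forecast of the remaining stock (and hence the annual rehab need) can be sketched in a few lines; the cohorts and the simplified exponential survival curve below are stand-ins for the project's calibrated survival functions:

```python
import math

# Pipe cohorts: (installation year, km laid, characteristic life in years) -- invented.
cohorts = [(1950, 120, 70.0), (1975, 200, 80.0), (2000, 150, 90.0)]

def surviving_km(year):
    """Total km still in service, using an exponential survival tail
    as a simplified stand-in for a calibrated survival curve."""
    return sum(km * math.exp(-max(0, year - built) / life)
               for built, km, life in cohorts)

for year in (2025, 2035, 2045):
    stock = surviving_km(year)
    print(f"{year}: {stock:6.1f} km in service, "
          f"annual rehab need ~ {surviving_km(year - 1) - stock:.2f} km")
```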
19

Sohn, Seung-Hyun, Gyu-Jung Cho, and Chul-Hwan Kim. "A Study on Application of Recloser Operation Algorithm for Mixed Transmission System Based on Travelling Wave Method." Energies 13, no. 10 (2020): 2610. http://dx.doi.org/10.3390/en13102610.

Abstract:
Recently, the use of mixed transmission systems has increased due to rapid urbanization and aesthetic considerations. Therefore, a proper protection scheme for mixed transmission systems is required. Generally, when a fault occurs on a transmission line, auto reclosing is performed to improve the continuity of service by clearing the fault and restoring the power system. However, the auto reclosing scheme should be applied to a mixed transmission system carefully, because the mixed transmission system involves underground cable sections. When a fault occurs in the underground cable section, it is mostly a permanent fault. If auto reclosing is performed on a permanent fault condition, it may cause excessive overcurrent and switching surges, which can have a serious impact on the whole transmission system and even cause an explosion. Because of this, many utilities worldwide do not allow auto reclosing, or apply it only very restrictively, on mixed transmission systems, based on their practice. However, there is no clear guidance or standard related to auto reclosing on mixed transmission systems. Therefore, in this paper, an application of a recloser operation algorithm is proposed. Based on the proposed algorithm, reclosers can work properly and protect the transmission system. To verify the proposed algorithm, simulations were conducted using the ElectroMagnetic Transients Program (EMTP).
20

Song, Jiyoung, Kyeon Hur, Jeehoon Lee, et al. "Hardware-in-the-Loop Simulation Using Real-Time Hybrid-Simulator for Dynamic Performance Test of Power Electronics Equipment in Large Power System." Energies 13, no. 15 (2020): 3955. http://dx.doi.org/10.3390/en13153955.

Abstract:
This paper presents hardware-in-the-loop simulation for dynamic performance testing (HILS-DPT) of power electronics equipment replicas using a real-time hybrid simulator (RTHS). The authors developed the procedure of HILS-DPT and, as an actual case example, present the results of the HILS-DPT of a Static VAR Compensator (SVC) replica using RTHS. RTHS is a co-simulation tool that synthesizes a real-time simulator (RTS) with a transient stability program to perform real-time dynamic simulation of a large power system. As power electronics applications have been increasing, electric utilities have performed HILS-DPT of power electronics equipment to validate its performance and investigate interactions. Because inspection tests are limited in their ability to validate the equipment's impact on the power system during various contingencies, all power electronics equipment newly installed in the Korean power system has had to undergo HILS-DPT using a large-scale RTS with replicas since 2018. Although a large-scale RTS offers an accuracy improvement, it requires a great deal of hardware resources, time, and effort to model and simulate the equipment and power systems. Therefore, the authors performed an SVC HILS-DPT using RTHS, and the results of this first practical application of RTHS demonstrate its feasibility in comparison with the results of HILS-DPT using a large-scale RTS. The authors discuss the test results and share lessons learned from the industrial experience of HILS-DPT using RTHS.
21

Monteiro, Flávia P., Suzane A. Monteiro, Maria E. Tostes, and Ubiratan H. Bezerra. "Using True RMS Current Measurements to Estimate Harmonic Impacts of Multiple Nonlinear Loads in Electric Distribution Grids." Energies 12, no. 21 (2019): 4132. http://dx.doi.org/10.3390/en12214132.

Abstract:
Currently, for analyzing the harmonic impact on voltage at a point of interest due to multiple nonlinear loads, the literature recommends carrying out simultaneous and synchronized measurement campaigns at all suspect points with the use of high-cost power quality analyzers that are usually not available at the customers' facilities and very often also not at the electric utilities. To overcome this drawback, this paper proposes a method of assessing the harmonic impact of multiple nonlinear loads on the total voltage harmonic distortion using only the load-current true RMS values, which are already available in all customers' installations. The proposed methodology is based on the regression tree technique using the permutation importance indicator and is validated in two case studies using two different electrical systems. The first case study ratifies the use of permutation importance to measure the impact factor of each nonlinear load in a controlled scenario, the IEEE 13-bus test system, using ATP (Alternative Transients Program) simulation. The second applies the methodology to a real system, an Advanced Metering Infrastructure (AMI) system deployed on a campus of a Brazilian university, using low-cost meters with only true RMS current measurements. The results achieved demonstrate the feasibility of applying the proposed methodology in real electric systems without the need for additional investments in high-cost power quality analyzers.
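The core technique, a regression tree scored with permutation importance, is easy to sketch on synthetic data standing in for the paper's AMI measurements:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 1000
# True RMS currents of three candidate nonlinear loads (the predictors).
i_rms = rng.uniform(0, 100, size=(n, 3))
# Synthetic voltage THD: driven mainly by loads 0 and 2, plus noise.
thd = 0.04 * i_rms[:, 0] + 0.01 * i_rms[:, 2] + rng.normal(0, 0.5, n)

tree = DecisionTreeRegressor(max_depth=5, random_state=0).fit(i_rms, thd)
result = permutation_importance(tree, i_rms, thd, n_repeats=20, random_state=0)
for k, mean in enumerate(result.importances_mean):
    print(f"load {k}: permutation importance {mean:.3f}")
```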
22

Zinovjev, Kirill, and Marc W. van der Kamp. "Enlighten2: molecular dynamics simulations of protein–ligand systems made accessible." Bioinformatics 36, no. 20 (2020): 5104–6. http://dx.doi.org/10.1093/bioinformatics/btaa643.

Abstract:
Motivation: Experimental structural data can allow detailed insight into protein structure and protein–ligand interactions, which is crucial for many areas of bioscience, including drug design and enzyme engineering. Typically, however, little more than a static picture of protein–ligand interactions is obtained, whereas dynamical information is often required for deeper understanding and to assess the effect of mutations. Molecular dynamics (MD) simulations can provide such information, but setting up and running these simulations is not straightforward and requires expert knowledge. There is thus a need for a tool that makes protein–ligand simulation easily accessible to non-expert users.
Results: We present Enlighten2: efficient simulation protocols for protein–ligand systems alongside a user-friendly plugin to the popular visualization program PyMOL. With Enlighten2, non-expert users can straightforwardly run and visualize MD simulations on protein–ligand models of interest. There is no need to learn new programs and all underlying tools are free and open source.
Availability and implementation: The Enlighten2 Python package and PyMOL plugin are free to use under the GPL3.0 licence and can be found at https://enlighten2.github.io. We also provide a lightweight Docker image via DockerHub that includes Enlighten2 with all the required utilities.
23

Esselman, W. H., and G. Z. Ben-Yaacov. "EPRI-developed computer programs for electric utilities." IEEE Computer Applications in Power 1, no. 2 (1988): 18–24. http://dx.doi.org/10.1109/67.908.

24

Sirisuraphong, Saran, Sunee Duangnet, and Chalida U-Tapao. "A Study of the Technique for Assessment of Locations and Conditions of the Underground Pipeline: Case Study for Nakhon Pathom Province, Thailand." Applied Mechanics and Materials 897 (April 2020): 231–38. http://dx.doi.org/10.4028/www.scientific.net/amm.897.231.

Abstract:
Underground pipeline systems in Thailand have been in use for a long time. Lack of maintenance and long use have caused a lot of damage. The water distribution systems are affected in water quality by contamination and in water quantity by leaking, and the sewer systems may leak and contaminate the groundwater. Because the underground pipeline systems have been in use for such a long period, verification of the underground locations is needed before the condition of the systems is tested with methods that leave the area undisturbed. This study focuses on a survey of underground utility locating methods and condition assessment methods. We also apply one suitable method to our case study area, Nakhon Pathom province, Thailand. The results present the collected methods for verifying underground locations and conditions, together with the advantages and disadvantages of each method. The results also describe conventional CCTV with a push-rod camera for assessing the underground pipeline systems of Nakhon Pathom province, with location verification fitted by ground penetrating radar, infrared thermography, and an acoustic location system. The design of a mobile robot that is flexible with respect to pipe size, and the creation of a computer program that can select the most suitable methods for examining buried pipes, are good options for further development in the future.
25

Yi, Nengjun, and Shizhong Xu. "Bayesian Mapping of Quantitative Trait Loci for Complex Binary Traits." Genetics 155, no. 3 (2000): 1391–403. http://dx.doi.org/10.1093/genetics/155.3.1391.

Abstract:
A complex binary trait is a character that has a dichotomous expression but a polygenic genetic background. Mapping quantitative trait loci (QTL) for such traits is difficult because of the discrete nature and the reduced variation in the phenotypic distribution. Bayesian statistics have proved to be a powerful tool for solving complicated genetic problems, such as multiple QTL with nonadditive effects, and have been successfully applied to QTL mapping for continuous traits. In this study, we show that Bayesian statistics are particularly useful for mapping QTL for complex binary traits. We model the binary trait under the classical threshold model of quantitative genetics. The Bayesian mapping statistics are developed on the basis of the idea of data augmentation. This treatment allows an easy way to generate the value of a hypothetical underlying variable (called the liability) and a threshold, which in turn allows the use of existing Bayesian statistics. The reversible jump Markov chain Monte Carlo algorithm is used to simulate the posterior samples of all unknowns, including the number of QTL, the locations and effects of identified QTL, the genotypes of each individual at both the QTL and markers, and eventually the liability of each individual. The Bayesian mapping ends with an estimation of the joint posterior distribution of the number of QTL and the locations and effects of the identified QTL. Utilities of the method are demonstrated using a simulated outbred full-sib family. A computer program written in the FORTRAN language is freely available on request.
26

Callahan, T., W. B. Gleason, and T. P. Lybrand. "PAP: a protein analysis package." Journal of Applied Crystallography 23, no. 5 (1990): 434–36. http://dx.doi.org/10.1107/s0021889890004228.

Abstract:
A program package has been assembled for the analysis of protein coordinates which are in the Brookhaven Protein Data Bank (PDB) format. These programs can be used to make two types of φ–ψ plots: a Ramachandran-style scatter plot, and a plot of φ and ψ values as a function of the linear sequence. Programs are also available for the display of distance diagonal plots for proteins. Two protein structures can be compared and the resulting r.m.s. differences in the structures plotted as a function of sequence. Temperature factors can be analyzed and plotted as a function of the linear sequence. In addition, various utilities are supplied for splitting PDB files which contain multiple subunits into individual files and also for renumbering PDB files. A utility is also provided for converting Amber-style PDB files into standard PDB files. Priestle's program RIBBON [J. Appl. Cryst. (1988), 21, 572–576] has been converted to run in a stand-alone mode with interactive rotation of the three-dimensional ribbon picture. The programs are written for Silicon Graphics 4D-level workstations and have been tested on 4D70/GT and Personal Iris workstations, although the programs which give PostScript output have been converted to run on Digital Equipment Corporation VAX computers and Sun workstations.
27

Hassan, Roshidi, and Megat Zuhairy Megat Tajuddin. "Bridging the digital divide : effort towards establishing the real impact of information technology to the rural community." Social and Management Research Journal 7, no. 1 (2010): 23. http://dx.doi.org/10.24191/smrj.v7i1.5184.

Abstract:
The Bridging the Digital Divide (BDD) program initiated by the Malaysian government in 1999 has successfully introduced information technology (IT) to rural communities in Malaysia. Computers and internet connections are available in almost every part of Malaysia, including remote areas. However, for technology to have a real impact on people's lives, the effort has to go beyond infrastructure, technology literacy programs, and awareness. Having information available that suits the needs of the local community will bring a greater impact of the technology to them. Nevertheless, new approaches need to be introduced to make the effort less costly. Some digital divide programs require proper coordination with other projects: coordination with the agencies responsible for providing power utilities and road infrastructure is essential to make such expensive digital divide programs more cost-effective and give a greater impact to the community. Thus, this paper provides numerous suggestions on improvements that would give greater benefits to both users and project implementers. Besides, this paper also discusses the need for local content in the local language with a self-sustaining nature. It further highlights the importance of having basic infrastructure, such as electricity and road accessibility, that may influence the success of the BDD program.
28

Aronesty, Erik. "Comparison of Sequencing Utility Programs." Open Bioinformatics Journal 7, no. 1 (2013): 1–8. http://dx.doi.org/10.2174/1875036201307010001.

Abstract:
High-throughput sequencing (HTS) has resulted in extreme growth rates of sequencing data. At our lab, we generate terabytes of data every day. Data output usually needs to be "cleaned" and processed in various ways prior to use for common tasks such as variant calling, expression quantification, and assembly. Two common tasks associated with HTS are adapter trimming and paired-end joining. I have developed two tools at Expression Analysis, Inc. to address these common tasks. The names of these programs are fastq-mcf and fastq-join. I compared the performance of these tools to similar open-source utilities, both in terms of resource efficiency and effectiveness.
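fastq-mcf's actual matching is more sophisticated (quality-aware, scored partial overlaps), but the essence of 3'-adapter trimming can be illustrated naively:

```python
def trim_adapter(read, adapter, min_overlap=4):
    """Naive 3'-adapter trimming: find the adapter, or the longest
    read suffix matching an adapter prefix, and cut it off."""
    pos = read.find(adapter)
    if pos != -1:                       # adapter occurs entirely in the read
        return read[:pos]
    for k in range(len(adapter) - 1, min_overlap - 1, -1):
        if read.endswith(adapter[:k]):  # partial adapter at the read's 3' end
            return read[:-k]
    return read

read = "ACGTACGTACGTAGATCGGA"               # ends with a partial adapter
print(trim_adapter(read, "AGATCGGAAGAGC"))  # -> ACGTACGTACGT
```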
29

Lopez-Lezcano, Fernando. "Searching for the GRAIL." Computer Music Journal 40, no. 4 (2016): 91–103. http://dx.doi.org/10.1162/comj_a_00393.

Abstract:
This article describes a quest for the GRAIL (Giant Radial Array for Immersive Listening), a large-scale loudspeaker system with related hardware and software control equipment. The GRAIL was developed at the Center for Computer Research in Music and Acoustics (CCRMA) at Stanford University, evolving from the need for optimal sound quality in our multichannel concerts. It is also used for teaching and research. The current GRAIL is one step in an ongoing evolutionary process, characterized by the use of off-the-shelf hardware components and custom software based on free-software languages and libraries. While developing our software, we have, as much as possible, aimed to take advantage of existing programs and utilities.
30

Hu, Bo, and Alex Z. Fu. "Predicting Utility for Joint Health States: A General Framework and a New Nonparametric Estimator." Medical Decision Making 30, no. 5 (2010): E29–E39. http://dx.doi.org/10.1177/0272989x10374508.

Abstract:
Measuring utility is important in clinical decision making and cost-effectiveness analysis because utilities are often used to compute quality-adjusted life expectancy, a metric used in measuring the effectiveness of health care programs and medical interventions. Predicting utility for joint health states has become an increasingly valuable research topic because of the aging of the population and the increasing prevalence of comorbidities. Although multiplicative, minimum, and additive estimators are commonly used in practice, research has shown that they are all biased. In this study, the authors propose a general framework for predicting utility for joint health states. This framework includes these 3 nonparametric estimators as special cases. A new simple nonparametric estimator, the adjusted decrement estimator, U_ij = U_min − U_min(1 − U_i)(1 − U_j), is introduced under the proposed framework. When applied to 2 independent data sources, the new nonparametric estimator not only generated unbiased prediction of utilities for joint health states but also had the least root mean squared error and highest concordance when compared with other nonparametric and parametric estimators. Further research and validation of this new estimator are needed.
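Because the abstract gives the estimator in closed form, it translates directly to code:

```python
def adjusted_decrement(u_i, u_j):
    """Adjusted decrement estimator for the utility of a joint health
    state: U_ij = U_min - U_min * (1 - U_i) * (1 - U_j)."""
    u_min = min(u_i, u_j)
    return u_min - u_min * (1 - u_i) * (1 - u_j)

# Example: two conditions with single-state utilities 0.8 and 0.6.
print(adjusted_decrement(0.8, 0.6))  # 0.552
```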
31

Yazgan, K. "RE-NUM-OR: Python-based renumbering and reordering software for pedigree files." Czech Journal of Animal Science 63, No. 2 (2018): 70–77. http://dx.doi.org/10.17221/64/2017-cjas.

Abstract:
RE-NUM-OR is a new, flexible, and user-friendly renumbering and reordering data-arrangement software for unprocessed pedigree files used in genetic evaluation systems on personal computers. RE-NUM-OR was written in the Python (ver. 2.7.13) 64-bit programming language and compiled with PyInstaller-3.2.1, a set of utilities for freezing Python scripts into executables. RE-NUM-OR runs not only on 32- and 64-bit MS Windows but also on 64-bit GNU/Linux. The program has a new, practical, and simple interface; the user does not need to create a parameter file for running the program, and .txt, .xlsx, or .et extension files can be used directly as input files. Output files (.txt, .xlsx, .et, .dat) can be obtained. For animal, sire, and dam IDs, it can read all standard characters (ASCII codes from 32 to 126) in input files. The program supports both dot (.) and comma (,) for numerical data. Pedigree lines for parents do not need to be arranged prior to their progeny: if pedigree lines for parents follow their progeny, the program detects this and reorders the animals, placing parents prior to their progeny. Another feature of this software is a pop-up window with error notifications. It also supports repeated observations. The RE-NUM-OR executable, its user manual, and sample input files are available from www.kemalyazgan.com.tr.
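RE-NUM-OR's own implementation is not shown in the abstract; the sketch below only illustrates the two core operations, reordering records so parents precede progeny and then assigning sequential codes (the record format is hypothetical):

```python
# (animal, sire, dam) records; parents may be listed *after* their
# progeny, and "0" means an unknown parent.
pedigree = [
    ("calf_9", "bull_2", "cow_5"),
    ("cow_5", "0", "0"),
    ("bull_2", "0", "0"),
]

def renumber(records):
    """Order records so parents precede progeny, then assign 1..n codes."""
    placed, ordered = set(), []
    while len(ordered) < len(records):
        for animal, sire, dam in records:
            parents_ok = all(p == "0" or p in placed for p in (sire, dam))
            if animal not in placed and parents_ok:
                ordered.append((animal, sire, dam))
                placed.add(animal)
    code = {"0": 0}
    code.update({a: i for i, (a, _, _) in enumerate(ordered, start=1)})
    return [(code[a], code[s], code[d]) for a, s, d in ordered]

print(renumber(pedigree))  # [(1, 0, 0), (2, 0, 0), (3, 2, 1)]: parents first
```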
32

Piera-Jiménez, Jordi, Anne Etzelmueller, Spyros Kolovos, Frans Folkvord, and Francisco Lupiáñez-Villanueva. "Guided Internet-Based Cognitive Behavioral Therapy for Depression: Implementation Cost-Effectiveness Study." Journal of Medical Internet Research 23, no. 5 (2021): e27410. http://dx.doi.org/10.2196/27410.

Abstract:
Background: Major depressive disorder is a chronic condition; its prevalence is expected to grow with the aging trend of high-income countries. Internet-based cognitive-behavioral therapy has proven efficacy in treating major depressive disorder.
Objective: The objective of this study was to assess the cost-effectiveness of implementing a community internet-based cognitive behavioral therapy intervention (Super@, the Spanish program for the MasterMind project) for treating major depressive disorder.
Methods: The cost-effectiveness of the Super@ program was assessed with the Monitoring and Assessment Framework for the European Innovation Partnership on Active and Healthy Ageing tool, using a 3-state Markov model. Data on the cost and effectiveness of the intervention were prospectively collected from the implementation of the program by a health care provider in Badalona, Spain; the corresponding data for usual care were gathered from the literature. The health states, transition probabilities, and utilities were computed using Patient Health Questionnaire–9 scores.
Results: The analysis was performed using data from 229 participants using the Super@ program. Results showed that the intervention was more costly than usual care; the discounted (3%) and nondiscounted incremental cost-effectiveness ratios were €29,367 and €26,484 per quality-adjusted life-year, respectively (approximately US $35,299 and $31,833, respectively). The intervention was cost-effective based on the €30,000 willingness-to-pay threshold typically applied in Spain (equivalent to approximately $36,060). According to the deterministic sensitivity analyses, the potential reduction of costs associated with intervention scale-up would reduce the incremental cost-effectiveness ratio of the intervention, although it remained more costly than usual care. A discount in the incremental effects up to 5% exceeded the willingness-to-pay threshold of €30,000.
Conclusions: The Super@ program, an internet-based cognitive behavioral therapy intervention for treating major depressive disorder, cost more than treatment as usual. Nevertheless, its implementation in Spain would be cost-effective from health care and societal perspectives, given the willingness-to-pay threshold of €30,000 compared with treatment as usual.
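The headline numbers are incremental cost-effectiveness ratios (ICERs), which are simple to compute once per-patient costs and QALYs are known. A sketch with invented figures, not the study's data:

```python
def icer(cost_new, qaly_new, cost_usual, qaly_usual):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_usual) / (qaly_new - qaly_usual)

# Hypothetical per-patient figures for illustration only.
ratio = icer(cost_new=1900.0, qaly_new=0.78, cost_usual=1300.0, qaly_usual=0.76)
print(f"ICER = {ratio:,.0f} EUR per QALY")

WILLINGNESS_TO_PAY = 30_000  # threshold typically applied in Spain
print("cost-effective" if ratio <= WILLINGNESS_TO_PAY else "not cost-effective")
```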
33

Velázquez-Libera, José Luis, Fabio Durán-Verdugo, Alejandro Valdés-Jiménez, Gabriel Núñez-Vivanco, and Julio Caballero. "LigRMSD: a web server for automatic structure matching and RMSD calculations among identical and similar compounds in protein-ligand docking." Bioinformatics 36, no. 9 (2020): 2912–14. http://dx.doi.org/10.1093/bioinformatics/btaa018.

Abstract:
Motivation: Root mean square deviation (RMSD) is one of the most useful and straightforward features for structural comparison between different conformations of the same molecule. Commonly, protein-ligand docking programs include utilities that allow the calculation of this value; however, they only work efficiently when a complete atom-label equivalence exists between the evaluated conformations.
Results: We present LigRMSD, a free web server for automatic matching and RMSD calculations among identical or similar chemical compounds. This server allows the user to submit a pair of identical or similar molecules, or a dataset of similar compounds, to compare their three-dimensional conformations.
Availability and implementation: LigRMSD can be freely accessed at https://ligrmsd.appsbio.utalca.cl.
Supplementary information: Supplementary data are available at Bioinformatics online.
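LigRMSD's contribution is the automatic atom matching; once atoms are matched, the RMSD itself is elementary. A sketch assuming the matching has already been done (coordinates in identical atom order):

```python
import math

def rmsd(coords_a, coords_b):
    """RMSD between two conformations with matched atom order; no
    superposition, as when comparing docking poses in a fixed
    receptor frame."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

pose1 = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (2.2, 1.1, 0.0)]
pose2 = [(0.1, 0.0, 0.0), (1.4, 0.2, 0.0), (2.0, 1.3, 0.1)]
print(f"RMSD = {rmsd(pose1, pose2):.3f} angstroms")
```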
34

Garrido-Zafra, Joaquín, Antonio Moreno-Munoz, Aurora Gil-de-Castro, Emilio J. Palacios-Garcia, Carlos D. Moreno-Moreno, and Tomás Morales-Leal. "A Novel Direct Load Control Testbed for Smart Appliances." Energies 12, no. 17 (2019): 3336. http://dx.doi.org/10.3390/en12173336.

Abstract:
The effort to continuously improve and innovate smart appliance (SA) energy management requires an experimental research and development environment that integrates widely differing tools and resources seamlessly. To this end, this paper proposes a novel Direct Load Control (DLC) testbed, aiming to conveniently support the research community as well as to analyze and compare designs in a laboratory environment. Based on the LabVIEW computing platform, this original testbed enables access to major components such as online weather forecasting information, distributed energy resources (e.g., energy storage, solar photovoltaics), dynamic electricity tariffs from utilities and demand response (DR) providers, together with different mathematical optimization features given by the General Algebraic Modelling System (GAMS). This intercommunication is possible thanks to the different application programming interfaces (APIs) incorporated into the system and to intermediate agents specially developed for this case. Several basic case studies are presented to envision the possibilities of this system in future, more complex scenarios and to actively support DLC strategies. These measures will offer enough flexibility to minimize the impact on user comfort combined with support for multiple DR programs. Thus, given the successful results, this platform can lead to a solution for more efficient use of energy in the residential environment.
35

Broadie, Mark, and Weiwei Shen. "High-Dimensional Portfolio Optimization with Transaction Costs." International Journal of Theoretical and Applied Finance 19, no. 04 (2016): 1650025. http://dx.doi.org/10.1142/s0219024916500254.

Abstract:
This paper studies Merton's portfolio optimization problem with proportional transaction costs in a discrete-time finite horizon. Facing short-sale and borrowing constraints, investors have access to a risk-free asset and multiple risky assets whose returns follow a multivariate geometric Brownian motion. Lower and upper bounds for optimal solutions up to the problem with 20 risky assets and 40 investment periods are computed. Three lower bounds are proposed: the value function optimization (VF), and the hyper-sphere and hyper-cube policy parameterizations (HS and HC). VF attacks the conundrums in traditional value function iteration for high-dimensional dynamic programs with continuous decision and state spaces. HS and HC respectively approximate the geometry of the trading policy in the high-dimensional state space by two surfaces. To evaluate the lower bounds, two new upper bounds are provided via a duality method based on a new auxiliary problem (OMG and OMG2). Compared with existing methods across various suites of parameters, the new methods clearly show their superiority: the three lower bound methods always achieve higher utilities, HS and HC cut run times by a factor of 100, and OMG and OMG2 mostly provide tighter upper bounds. In addition, how the no-trading region characterizing the optimal policy deforms when short-sale and borrowing constraints bind is investigated.
APA, Harvard, Vancouver, ISO, and other styles
36

Cruickshank, Robert, Gregor Henze, Rajagopalan Balaji, Bri-Mathias Hodge, and Anthony Florita. "Quantifying the Opportunity Limits of Automatic Residential Electric Load Shaping." Energies 12, no. 17 (2019): 3204. http://dx.doi.org/10.3390/en12173204.

Full text
Abstract:
Electric utility residential demand response programs typically reduce load a few times a year during periods of peak energy use. In the future, utilities and consumers may monetarily and environmentally benefit from continuously shaping load by alternatively encouraging or discouraging the use of electricity. One way to shape load and introduce elasticity is to broadcast forecasts of dynamic electricity prices that orchestrate electricity supply and demand in order to maximize the efficiency of conventional generation and the use of renewable resources including wind and solar energy. A binary control algorithm that influences the on and off states of end uses was developed and applied to empirical time series data to estimate price-based instantaneous opportunities for shedding and adding electric load. To overcome the limitations of traditional stochastic methods in quantifying diverse, non-Gaussian, non-stationary distributions of observed appliance behaviour, recent developments in wavelet-based analysis were applied to capture and simulate time-frequency domain behaviour. The performance of autoregressive and spectral reconstruction methods was compared, with phase reconstruction providing the best simulation ensembles. Results show spatiotemporal differences in the amount of load that can be shed and added, which suggest further investigation is warranted in estimating the benefits anticipated from the wide-scale deployment of continuous automatic residential load shaping. Empirical data and documented software code are included to assist in reproducing and extending this work.
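The paper's binary control algorithm is not given in the abstract; the following Python sketch only illustrates the general idea it describes, toggling a flexible end use on or off against a broadcast price forecast, with all names and thresholds invented for illustration:

def shape_load(prices, flexible_kw, low, high):
    """Toy price-responsive on/off schedule for one flexible end use.

    prices      -- forecast price per interval ($/kWh)
    flexible_kw -- load the device draws when on
    low, high   -- price thresholds that encourage / discourage use
    Returns the load served in each interval (flexible_kw or 0).
    """
    schedule = []
    for p in prices:
        if p <= low:      # cheap energy: add load (run now)
            schedule.append(flexible_kw)
        elif p >= high:   # expensive energy: shed load (defer)
            schedule.append(0.0)
        else:             # neutral band: hold the previous state
            schedule.append(schedule[-1] if schedule else 0.0)
    return schedule

print(shape_load([0.05, 0.12, 0.30, 0.08], flexible_kw=1.2, low=0.10, high=0.25))
# -> [1.2, 1.2, 0.0, 1.2]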
APA, Harvard, Vancouver, ISO, and other styles
37

Yanine, Fernando, Antonio Sánchez-Squella, Aldo Barrueto, Antonio Parejo, Felisa Cordova, and Hans Rother. "Grid-Tied Distributed Generation Systems to Sustain the Smart Grid Transformation: Tariff Analysis and Generation Sharing." Energies 13, no. 5 (2020): 1187. http://dx.doi.org/10.3390/en13051187.

Full text
Abstract:
In this paper, a novel model, currently under consideration by ENEL (the largest electric utility in Chile), is proposed and analyzed thoroughly, whereby electric power control and energy management for a 60-apartment residential building is presented as an example of the utility's green energy program, part of its Smart Grid Transformation plan to install grid-tied distributed generation (DG) systems, namely microgrids with solar generation and energy storage, in Santiago, Chile. The tariff scheme analysis shown is part of the projected benefits of adopting the new scheme, which will require the utility's customers to adapt their consumption behavior to the limited supply of renewable energy by changing energy consumption habits and schedules in a way that maximizes the capacity and efficiency of the grid-tied microgrid with energy storage. The change in behavior entails rescheduling power consumption to hours where the energy supply capacity of the DG system is higher and the price is lower, as well as curtailing power needs in certain hourly blocks so as to maximize the DG system's efficiency and supply capacity. Nevertheless, the latter presents a problem from the perspective of ENEL's renewable energy sources (RES) integration plan with the electric utility's grid supply, which, up until now and due to the current electric tariffs law, has not had a clear solution. Under said scenario, a set of strategies based on energy homeostasis principles for coordinating and controlling electricity supply versus customer demand has been devised and tested. These strategies, which consider various scenarios to conform to ENEL's grid flexibility requirements, have been adapted to the specific needs of these types of customers while considering the particular infrastructure of the network. Thus, the microgrid adjusts itself to the grid in order to complement the grid supply while seeking to maximize green supply capacity and operational efficiency, wherein the different energy users and their energy consumption profiles play a crucial role as "active loads", being able to respond and adapt to the needs of the grid-connected microgrid while enjoying economic benefits. Simulation results are presented under different tariff options, system capacities and energy storage alternatives, in order to compare the proposed strategies with the traditional grid electricity distribution service, where no green energy is present. The results show the advantage of the proposed tariff scheme, along with the power control and energy management strategies, for the integration of distributed power generation within ENEL's Smart Grid Transformation in Chile.
APA, Harvard, Vancouver, ISO, and other styles
38

Zhang, Xiao Juan, Zhenzhen Li, and Hepu Deng. "Information security behaviors of smartphone users in China: an empirical analysis." Electronic Library 35, no. 6 (2017): 1177–90. http://dx.doi.org/10.1108/el-09-2016-0183.

Full text
Abstract:
Purpose Understanding user behavior is increasingly critical for information security in the use of smartphones. There is, however, a lack of empirical studies about the behavior of smartphone users for information security in China. The purpose of this paper is to present an empirical analysis of the behavior of smartphone users in China in relation to information security. Design/methodology/approach A review of the related literature is conducted, leading to the development of a questionnaire for investigating the behavior of smartphone users. An online survey of smartphone users in China is conducted. The collected data are analyzed with the use of descriptive analysis and Pearson's chi-square test to better understand the behavior of smartphone users on information security. Findings The paper shows that there are serious concerns about information security in the use of smartphones in China, including the ignorance of security information when downloading and using applications, inadequate phone settings, inappropriate enabling of add-on utilities and a lack of proper disaster recovery plans. The study also reveals that there is a significant difference between different groups of users on information security in smartphone use. Research limitations/implications This paper is based on a purposeful sample of smartphone users in China. It is exploratory in nature. Practical implications The paper can lead to a better understanding of the behavior of smartphone users and information security in China and provide relevant government departments and institutions with useful information for developing appropriate strategies and policies and designing specific training programs to improve information security in smartphone use. Originality/value This paper is the first of its kind to collect quantitative data from users in China for better understanding the behavior of smartphone users on information security. It provides insight into the adoption of various measures for information security from the perspective of smartphone users in China.
APA, Harvard, Vancouver, ISO, and other styles
39

Chau, Bonny, Marita Zimmermann, Michael Wagner, and Lee D. Cranmer. "Cost-utility analysis of surveillance strategies for soft tissue sarcoma recurrence." Journal of Clinical Oncology 38, no. 15_suppl (2020): 11565. http://dx.doi.org/10.1200/jco.2020.38.15_suppl.11565.

Full text
Abstract:
11565 Background: Soft tissue sarcomas (STS) are uncommon malignancies with significant biological and clinical variation. After primary curative therapy, patients enter a program of surveillance for recurrence with periodic chest x-rays (CXR) or computed tomography (CT). We compared costs and health outcomes of CXR and CT at a range of frequencies and durations of follow-up. Methods: We used a Markov model to simulate a cohort of 10,000 STS patients through their surveillance experience for lung metastasis after completion of definitive treatment for stage II or III primary disease. Health states in the model were no evidence of disease, recurrence, and death. We assessed the method of chest imaging, duration of follow-up, and interval (every 3 months or 6 months) of surveillance in the first 3 years. Cost effectiveness was assessed for each screening modality and screening frequency. Recurrence probabilities, utilities, treatment costs, and other parameters were taken from previously published data. Outcomes were costs, quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratios (ICERs). Results: The initial evaluation comparing screening every 3 months for 3 years, every 6 months for years 4 and 5, and annually from years 6 to 10 resulted in 632,264 QALYs and $1,038,351,481 in costs for CT, compared to 631,834 QALYs and $746,019,937 for CXR, yielding an ICER of $679,322/QALY. In comparing screening intervals, a less frequent interval of every 6 months compared to every 3 months for the first three years resulted in an ICER of $690,527/QALY for CT and $271,423/QALY for CXR. In comparing screening durations of 6 years and 10 years of follow-up, strategies with longer follow-up resulted in slightly higher QALYs in each of the comparison scenarios, at a much higher cost. Conclusions: In our evaluation, more frequent screening in the first 3 years and longer duration of surveillance resulted in higher QALYs for both screening modalities. However, the ICERs exceed common willingness-to-pay thresholds of $150,000/QALY gained; CXR is the more cost-effective imaging modality. Limitations of this model include the simplification of disease progression and heterogeneity in STS.
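The decision statistic throughout is the incremental cost-effectiveness ratio, ICER = (cost_A - cost_B) / (QALY_A - QALY_B). A Python sketch using the cohort totals reported above (small rounding differences from the published $679,322/QALY are expected, since the abstract reports rounded QALY totals):

def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of strategy A over strategy B."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Cohort totals from the abstract: CT vs. CXR surveillance.
ct_cost, ct_qaly = 1_038_351_481, 632_264
cxr_cost, cxr_qaly = 746_019_937, 631_834
print(f"${icer(ct_cost, ct_qaly, cxr_cost, cxr_qaly):,.0f}/QALY")  # ~ $679,841/QALY

Either way, the ratio lies far above a $150,000/QALY willingness-to-pay threshold, which is the basis for the authors' conclusion that CXR is the more cost-effective modality.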
APA, Harvard, Vancouver, ISO, and other styles
40

Roberson, Janie, Allison Wrenn, John Poole, Andrew Jaeger, and Isam A. Eltoum. "Constructing a modern cytology laboratory: A toolkit for planning and design." CytoJournal 10 (February 28, 2013): 3. http://dx.doi.org/10.4103/1742-6413.107983.

Full text
Abstract:
Introduction: Constructing or renovating a laboratory can be both challenging and rewarding. UAB Cytology (UAB CY) recently undertook a project to relocate from a building constructed in 1928 to new space. UAB CY is part of an academic center that provides service to a large patient population, supports the training of one cytotechnology program and one cytopathology fellowship program, and is actively involved in research and scholarly activity. Our objectives were to provide a safe, aesthetically pleasing space and to gain efficiencies through lean processes. Methods: The phases of any laboratory design project are Planning, Schematic Design (SD), Design Development (DD), Construction Documents (CD) and Construction. Lab personnel are most critical in the Planning phase. During this time, stakeholders, relationships, budget, square footage and equipment were identified. Equipment lists, including what would be relocated, what would be purchased new, and what was projected for future growth, ensured that utilities were matched to expected need. A chemical inventory was prepared and adequate storage space was planned. Regulatory and safety requirements were discussed. Tours and high-level process flow diagrams helped architects and engineers understand the laboratory's daily work. Future needs were addressed through a questionnaire which identified potential areas of growth and technological change. Throughout the project, decisions were driven by data from the planning phase. During the SD phase, objective information from the first phase was used by architects and planners to create a general floor plan. This was the basis of a series of meetings to brainstorm and suggest modifications. DD brings more detail to the plans: engineering, casework, equipment specifics, and finishes. Design changes should be completed at this phase. The next phase, CD, took the project from the lab's purview into a purely technical mode. Construction documents were used by the contractor for the bidding process and ultimately the Construction phase. Results: The project fitted out a total of 9,000 square feet: 4,000 laboratory and 5,000 office/support. Lab space includes areas for Prep, CT screening, sign-out and Imaging. Adjacent space houses faculty offices and conferencing facilities. Transportation time (waste) was reduced by a pneumatic tube system, a specimen drop window to the Prep Lab, and a pass-through window to the screening area. Open screening and prep areas allow visual management control. Efficiencies were gained by ergonomically placing CT manual and imaging microscopes and computers in close proximity, also facilitating a paperless workflow for additional savings. Logistically, closer proximity to Surgical Pathology maximized the natural synergies between the areas. Conclusions: Lab construction should be a systematic process based on sound principles of safety, high-quality testing, and finance. Our detailed planning and design process can be a model for others undertaking similar projects.
APA, Harvard, Vancouver, ISO, and other styles
41

Vītola, Alise, Iveta Baltiņa, Liena Ādamsone, Ilze Judrupa, and Maija Šenfelde. "DEVELOPMENT OF RURAL TERRITORIES IN LATVIA IMPLEMENTING TELEWORK." Via Latgalica, no. 5 (December 31, 2013): 142. http://dx.doi.org/10.17770/latg2013.5.1649.

Full text
Abstract:
Population decline is taking place in rural areas in Latvia as well as in rural areas in Europe. There is a question of utmost importance: will people choose to live in rural areas doing remote work, or will they choose jobs in the towns? An increased pace of population decline is forecast in the event of steadily decreasing workplaces and services. Growing service costs per inhabitant may lower the accessibility of some services in the territory. Until now, measurements of telework potential have been made at the national and regional level; there is a shortage of such measurements for individual municipalities. The purpose of this article is to study the attitude of Latvian people with regard to telework adoption in two municipalities in Latvia: Limbazi and Balvi. Scientists indicate a positive effect of information and communication technology (ICT) on the local economy if it is integrated into the rural economy in relation to the needs of entrepreneurs and inhabitants (Grimes, 2000). Scientific methods of qualitative analysis of documents and quantitative methods such as statistical data and analysis of questionnaires have been used. Observation and questionnaires were used in conjunction with the literature to develop an understanding of the influencing issues. Questionnaires provide information about the frequency of telework, willingness to do remote work, and the benefits and barriers of teleworking in rural areas and towns. The research reveals development possibilities of rural territories relying on greater involvement of ICT and the knowledge economy. The main findings reveal significant challenges faced by rural territories in a globalized world, as the number of jobs in agriculture and public services is decreasing. It is important to develop not only agricultural activities but to provide other kinds of entrepreneurship with jobs physically (providing transport possibilities) or virtually (providing ICT). Results. 81% of the responding persons positively evaluate the opportunity of telework. This reveals the willingness of people to do remote work as an alternative to the existing form of work. The largest interest in telework in Limbazi municipality (30%) was shown in the age group of 31-40, while the largest interest in Balvi municipality (26%) was shown in the age group of 18-30. A significant percentage of the responding persons, 82%, have the necessary computer abilities and knowledge on this topic. The majority of respondents are willing and able to do telework, taking into account its specifics. The distribution of responses reveals that 52% of respondents are willing to use the premises of telecentres. As a result, the clients of the telecentres are not obliged to invest money in personal computers, multifunctional equipment, or different computer programs at home. When teleworking is offered, 77% of the respondents are interested in working from home, while 14% prefer to work in a telecentre. When analysing the respondents' answers about available services in the telecentre, they indicate the most important services for them: copying, printing, scanning, and access to a computer and an Internet-equipped working place. They are also interested in socializing and networking activities, as well as in receiving advice about entrepreneurship, job vacancies, etc. Integration of telecentres in the territory helps to save commuting time. This is important for 52% of the responding persons. 34% of the responding persons would be able to save from 30 minutes to 1 hour of commuting time, 34% would be able to save more than 1 hour, and 21% would be able to save less than half an hour. Respondents indicate economic and personal benefits as the main benefits of telework. 56% of the respondents indicate personal benefits to be the most important: more time for family and flexible working time. However, 39% of the respondents emphasised economic benefits as the most important, for example, lower transport costs. Conclusions. 1. Rural territories face significant challenges in a globalized world as the number of jobs in agriculture and public services is decreasing. At the same time, information and communication technologies, as well as changes in professional duties, allow the community from these regions to participate in the knowledge economy. The importance of virtual accessibility will grow as the costs of energy resources and transport rise. Telework provides the possibility to involve disabled people in the labour market. 2. Remote work could improve the accessibility of jobs in towns for people living in rural areas using ICT, in such a way partly or completely resolving internal and external migration problems. It brings benefits for municipalities, inhabitants, and entrepreneurs as well. Implementation of telework and telecentres in a territory can improve working-age people's assessment of a given region: it becomes a more attractive place for living and staying. 3. There are direct and indirect benefits from telework and telecentres, as follows: reduction of expenditures such as fuel, car parking in the city, or transport expenditures, and the reduction of commuting time. This creates an opportunity for cost reduction and growth of productivity if individuals use the saved time alternatively and productively. There are social benefits too, e.g., elastic working time, de-routinization of work, reduction of external effects (e.g., the reduction of noise and stress in the office), increase in mentoring opportunities, more time for family, friends and hobbies, improved work/life balance, and the possibility of living in rural areas while retaining challenging jobs in the knowledge economy traditionally linked to the metropolis. 4. Participation by community members would increase through the use of telecentres. The main factors are the development of ICT and its infrastructure, wider use of ICT, and changes in professional duties that allow the community from these regions to participate in the knowledge economy. A promoting factor for entrepreneurs is cost saving, i.e., the lowering of costs as follows: furniture purchase, ICT, programmes, public utility payments, staff training, and rent. The telework approach helps a company to attract good, highly motivated staff, even with better qualifications.
APA, Harvard, Vancouver, ISO, and other styles
42

Feng, Xiaoying, and Linda Behar-Horenstein. "Maximizing NVivo Utilities to Analyze Open-Ended Responses." Qualitative Report, March 17, 2019. http://dx.doi.org/10.46743/2160-3715/2019.3692.

Full text
Abstract:
Open-ended responses are widely used to explore and understand participants' experiences and perspectives in a variety of fields. As one of the most powerful computer-assisted qualitative data analysis software packages, NVivo allows researchers to analyze open-ended responses to survey and/or interview questions, as well as other data like reflective writing, images, and videos. The purpose of this paper is to describe and demonstrate how the NVivo word frequency, text search, and matrix coding features can be used to analyze qualitative data from a longitudinal evaluation project. The authors show how the matrix coding feature maximizes NVivo utilities in an analysis of open-ended responses and highlights differences across and within participant groups. The authors explain this approach with a step-by-step overview: data cleaning and case coding; data import; word frequency analysis; text coding and reference extracting; and matrix coding and inductive analysis. Using this approach, the Clinical Translational Science Institute (CTSI) evaluation team acquired deeper insight into the participants' experiences and perspectives about CTSI programs and received insights that may lead to improvement. From a methodological perspective, this approach capitalizes on NVivo's features to mine qualitative data. The methodology described in this paper is applicable to other educational or program evaluations. It is also appropriate for analyzing large samples or longitudinal qualitative data in marketing and management.
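NVivo is operated through its GUI, so the paper contains no code; purely to make the two core operations concrete, here is a hedged Python analogue of a word frequency query and a matrix coding query (code counts cross-tabulated by participant group), with all data and code terms invented for illustration:

from collections import Counter

responses = [
    {"group": "faculty", "text": "the program improved my grant writing"},
    {"group": "student", "text": "mentoring in the program was excellent"},
    {"group": "student", "text": "grant workshops helped my writing"},
]

# Word frequency, analogous to NVivo's word frequency query.
words = Counter(w for r in responses for w in r["text"].lower().split())
print(words.most_common(3))

# Matrix coding analogue: occurrences of each code term per group.
codes = ["grant", "mentoring", "writing"]
groups = {r["group"] for r in responses}
matrix = {g: {c: 0 for c in codes} for g in groups}
for r in responses:
    for c in codes:
        matrix[r["group"]][c] += r["text"].lower().count(c)
print(matrix)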
APA, Harvard, Vancouver, ISO, and other styles
43

Yang, Zhifei, Yali Chen, and Hu Luo. "Research and Development of Validation and Drill System for Full Scope of Severe Accident Management Guideline." Journal of Nuclear Engineering and Radiation Science 3, no. 4 (2017). http://dx.doi.org/10.1115/1.4036892.

Full text
Abstract:
To respond to the urgent needs of verification, training, and drill for full scope severe accident management guidelines (FSSAMG) among nuclear regulators, utilities, and research institutes, an FSSAMG verification and drill system was developed. The FSSAMG includes comprehensive scenarios under power, shutdown, spent fuel pool (SFP), and refueling conditions. This article summarizes the research and development of the validation and drill system for FSSAMG using the severe accident analysis computer code Modular Accident Analysis Program 5 (MAAP5). Realistic accident scenarios can be verified and exercised in the developed system to support FSSAMG training, drill, examination, and verification.
APA, Harvard, Vancouver, ISO, and other styles
44

"Impact from road transport of radioactive wastes in Spain." Issue 2 14, no. 2 (2013): 202–9. http://dx.doi.org/10.30955/gnj.000858.

Full text
Abstract:
This work focuses on the transport of high-level radioactive wastes, mainly consisting of spent fuels from nuclear power utilities in Spain. To date, these wastes have been stored at the generating facilities themselves or sent to France; but these options are no longer technically or economically sustainable, so a central repository has been planned in the country (its final location was decided in November 2011), and we have chosen this site to advance the studies. The objectives are to describe the logistics of these transports and to estimate their associated radiological impacts: eight routes are considered, corresponding to the nuclear power installations in Spain, plus the return of spent fuels which are also temporarily stored in France; the calculation of impacts is based on the transport radiation index, speeds and distances, the populations exposed, the corresponding doses and average values of risk. We have developed a computer application which needs only the radiation level at 1 m from the transport and the selected route; the radiological impacts are then obtained automatically. We conclude that they are irrelevant under normal transport conditions, while the program (available on the web) can be useful for further analysis of such operations.
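The abstract does not give the dose model, but a first-order calculation consistent with "radiation level at 1 m, speed and distances" treats the vehicle as a point source obeying the inverse-square law: integrating the dose rate over a pass at constant speed v past a receptor at perpendicular distance x yields a dose of pi * R1 / (x * v), where R1 is the dose rate at 1 m. A hedged Python sketch (illustrative only, not the authors' application):

import math

def passby_dose_usv(rate_at_1m_usv_h, distance_m, speed_km_h):
    """Integrated dose (microSv) to a roadside receptor for one vehicle pass.

    Point-source, inverse-square model: the integral of R1 / (x^2 + (v t)^2)
    over the whole pass evaluates to pi * R1 / (x * v).
    """
    v_m_per_h = speed_km_h * 1000.0
    return math.pi * rate_at_1m_usv_h / (distance_m * v_m_per_h)

# Example: 10 microSv/h at 1 m, receptor 20 m from the road, 80 km/h.
print(f"{passby_dose_usv(10.0, 20.0, 80.0):.2e} microSv per pass")

The vanishingly small per-pass figure is consistent with the authors' conclusion that impacts are irrelevant under normal transport conditions.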
APA, Harvard, Vancouver, ISO, and other styles
45

"Software Reviews : 1ST-CLASS FUSION, Release 1.0 Reviewed by Carl Grafton, Auburn University, Montgomery Publisher: Programs in Motion, 10 Sycamore Rd., Wayland, MA 01778 (telephone: 617-653-5093) Year of Publication: 1987 Materials: Program disk, runtime disk, and utilities disk (none is copy-protected); 427-page manual in three-ring binder; slipcase Price: $1,295, including runtime program and distribution license allowing unlimited distribution of an expert system developed by the user (but not of 1ST-CLASS FUSION itself) if proper copyright information is included in the distributed expert system Machine Specificity: IBM PC and compatibles System Requirements: 512K memory (640K is desirable) Effectiveness: Excellent User-Friendliness: Excellent Documentation: Excellent." Social Science Computer Review 6, no. 2 (1988): 310–12. http://dx.doi.org/10.1177/089443938800600222.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Manalastas-Cantos, Karen, Petr V. Konarev, Nelly R. Hajizadeh, et al. "ATSAS 3.0: expanded functionality and new tools for small-angle scattering data analysis." Journal of Applied Crystallography 54, no. 1 (2021). http://dx.doi.org/10.1107/s1600576720013412.

Full text
Abstract:
The ATSAS software suite encompasses a number of programs for the processing, visualization, analysis and modelling of small-angle scattering data, with a focus on the data measured from biological macromolecules. Here, new developments in the ATSAS 3.0 package are described. They include IMSIM, for simulating isotropic 2D scattering patterns; IMOP, to perform operations on 2D images and masks; DATRESAMPLE, a method for variance estimation of structural invariants through parametric resampling; DATFT, which computes the pair distance distribution function by a direct Fourier transform of the scattering data; PDDFFIT, to compute the scattering data from a pair distance distribution function, allowing comparison with the experimental data; a new module in DATMW for Bayesian consensus-based concentration-independent molecular weight estimation; DATMIF, an ab initio shape analysis method that optimizes the search model directly against the scattering data; DAMEMB, an application to set up the initial search volume for multiphase modelling of membrane proteins; ELLLIP, to perform quasi-atomistic modelling of liposomes with elliptical shapes; NMATOR, which models conformational changes in nucleic acid structures through normal mode analysis in torsion angle space; DAMMIX, which reconstructs the shape of an unknown intermediate in an evolving system; and LIPMIX and BILMIX, for modelling multilamellar and asymmetric lipid vesicles, respectively. In addition, technical updates were deployed to facilitate maintainability of the package, which include porting the PRIMUS graphical interface to Qt5, updating SASpy – a PyMOL plugin to run a subset of ATSAS tools – to be both Python 2 and 3 compatible, and adding utilities to facilitate mmCIF compatibility in future ATSAS releases. All these features are implemented in ATSAS 3.0, freely available for academic users at https://www.embl-hamburg.de/biosaxs/software.html.
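As an illustration of the direct transform that DATFT performs, the standard small-angle scattering relation p(r) = (r / 2*pi^2) * integral of q * I(q) * sin(q*r) dq can be evaluated numerically; the following Python sketch (not ATSAS code, with an invented Gaussian test curve standing in for measured data) shows the idea:

import numpy as np

def pddf_direct(q, intensity, r):
    """Pair distance distribution function by direct Fourier transform:
    p(r) = (r / 2 pi^2) * trapezoidal integral of q * I(q) * sin(q r) dq."""
    p = np.empty_like(r)
    for i, ri in enumerate(r):
        p[i] = ri / (2.0 * np.pi ** 2) * np.trapz(q * intensity * np.sin(q * ri), q)
    return p

# Toy data: a smooth Guinier-like decay standing in for measured I(q).
q = np.linspace(1e-3, 0.5, 500)         # scattering vector, 1/angstrom
i_q = np.exp(-(q * 25.0) ** 2 / 3.0)    # radius of gyration ~ 25 angstrom
r = np.linspace(0.0, 100.0, 200)        # real-space distance, angstrom
p_r = pddf_direct(q, i_q, r)
print(r[p_r.argmax()])                  # most probable pair distance

In practice a tool like DATFT must also handle truncation of the measured q range and noise, which this sketch ignores.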
APA, Harvard, Vancouver, ISO, and other styles
47

"Software Reviews : SHOWCASE: STANDARD CROSS-CULTURAL SAMPLE, Version 1.0 Reviewed by Richard A. Wagner, Human Relations Area Files, Inc. (HRAF) Publisher: Cognitive Development, Inc., Suite 141, 12345 Lake City Way NE, Seattle, wA 98125 (telephone: 206-363-6905) Year of Publication: 1988 Materials: Program and utilities disks (not copy protected) Instructor's Book of Demonstrations (by Rodney Stark, University of Washington), User's Guide, and codebook (totalling 90 pages) Price: $195 Machine Specificity: IBM PC, XT, AT, compatibles System Requirements: One 360K drive, 256K RAM, color graphics Effectiveness: Fair User-Friendliness: Good Documentation: Fair to Poor." Social Science Computer Review 7, no. 3 (1989): 367–69. http://dx.doi.org/10.1177/089443938900700312.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

VENCES, MIGUEL, AURÉLIEN MIRALLES, SOPHIE BROUILLET, et al. "iTaxoTools 0.1: Kickstarting a specimen-based software toolkit for taxonomists." Megataxa 6, no. 2 (2021). http://dx.doi.org/10.11646/megataxa.6.2.1.

Full text
Abstract:
While powerful and user-friendly software suites exist for phylogenetics, and an impressive cybertaxonomic infrastructure of online species databases has been set up in the past two decades, software targeted explicitly at facilitating alpha-taxonomic work, i.e., delimiting and diagnosing species, is still in its infancy. Here we present a project to develop a bioinformatic toolkit for taxonomy, based on open-source Python code, including tools focusing on species delimitation and diagnosis and centered around specimen identifiers. At the core of iTaxoTools is user-friendliness, with numerous autocorrect options for data files and with intuitive graphical user interfaces. Assembled standalone executables for all tools, or a suite of tools with a launcher window, will be distributed for Windows, Linux, and Mac OS systems, and in the future also implemented on a web server. The initial version (iTaxoTools 0.1) distributed with this paper (https://github.com/iTaxoTools/iTaxoTools-Executables) contains graphical user interface (GUI) versions of six species delimitation programs (ABGD, ASAP, DELINEATE, GMYC, PTP, tr2) and a simple threshold-clustering delimitation tool. There are also new Python implementations of existing algorithms, including tools to compute pairwise DNA distances, ultrametric time trees based on non-parametric rate smoothing, species-diagnostic nucleotide positions, and standard morphometric analyses. Other utilities convert among different formats of molecular sequences, geographical coordinates, and units; merge, split and prune sequence files, tables and species partition files; and perform simple statistical tests. As a future perspective, we envisage iTaxoTools becoming part of a bioinformatic pipeline for next-generation taxonomy that accelerates the inventory of life while maintaining high-quality species hypotheses. The open source code and binaries of all tools are available from GitHub (https://github.com/iTaxoTools) and further information from the website (http://itaxotools.org).
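Among the utilities listed is a tool for pairwise DNA distances; the simplest such measure is the uncorrected p-distance, the fraction of compared sites that differ between two aligned sequences. A minimal Python sketch (not iTaxoTools' implementation; skipping gap and ambiguity sites is one common convention, assumed here):

def p_distance(seq_a: str, seq_b: str) -> float:
    """Uncorrected p-distance between two aligned DNA sequences.

    Sites with a gap ('-') or ambiguity ('N') in either sequence
    are skipped, a simplifying assumption for this sketch.
    """
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    compared = differing = 0
    for a, b in zip(seq_a.upper(), seq_b.upper()):
        if a in "-N" or b in "-N":
            continue
        compared += 1
        if a != b:
            differing += 1
    return differing / compared if compared else float("nan")

print(p_distance("ACGTACGT", "ACGTACGA"))  # 0.125: one difference in eight sites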
APA, Harvard, Vancouver, ISO, and other styles
49

Reagle Jr., Joseph. "Open Content Communities." M/C Journal 7, no. 3 (2004). http://dx.doi.org/10.5204/mcj.2364.

Full text
Abstract:
In this brief essay I sketch the characteristics of an open content community by considering a number of prominent examples, reviewing sociological literature, teasing apart the concepts of open and voluntary implicit in most usages of the term, and offering a definition in which the much maligned possibility of 'forking' is actually an integral aspect of openness. Introduction What is often meant by the term 'open' is a generalization from the Free Software, Open Source and open standards movements. Communities marshaling themselves under these banners cooperatively produce, in public view, software, technical standards, or other content that is intended to be widely shared. Free Software and Open Source The Free Software movement was begun by Richard Stallman at MIT in the 1980s. Previously, computer science operated within the scientific norm of collaboration and information sharing. When Stallman found it difficult to obtain the source code of a troublesome Xerox printer, he feared that the norms of freedom and openness were being challenged by a different, proprietary, conceptualization of information. To challenge this shift he created the GNU Project in 1984 (Stallman 1998), the Free Software Foundation (FSF) in 1985 (Stallman 1996), and authored the GNU General Public License in 1989. The goal of the GNU Project was to create a free version of the UNIX computing environment, one with which many computer practitioners were familiar, and to which many had even contributed, but which was increasingly being encumbered with proprietary claims. GNU is a playful form of a recursive acronym: GNU is Not Unix. The computing environment was supposed to be similar to but independent of UNIX and include everything a user needed, including an operating system kernel (e.g., Hurd) and common applications such as small utilities, text editors (e.g., EMACS) and software compilers (e.g., GCC). The FSF is now the principal sponsor of the GNU Project and focuses on administrative issues such as copyright licenses, policy, and funding; software development and maintenance is still an activity of GNU. The GPL is the FSF's famous copyright license for 'free software'; it ensures that the 'freedom' associated with being able to access and modify software is maintained with the original software and its derivations. It has important safeguards, including its famous 'viral' provision: if you modify and distribute software obtained under the GPL license, your derivation must also be publicly accessible and licensed under the GPL. In 1991, Linus Torvalds started development of Linux: a UNIX-like operating system kernel, the core computer program that mediates between applications and the underlying hardware. While it was not part of the GNU Project, and differed in design philosophy and aspiration from GNU's kernel (Hurd), it was released under the GPL. While Stallman's stance on 'freedom' is more ideological, Torvalds's approach is more pragmatic. Furthermore, other projects, such as the Apache web server, and eventually Netscape's Mozilla web browser, were being developed in open communities and under similar licenses, except that, unlike the GPL, they often permit proprietary derivations. With such a license, a company may take open source software, change it, and include it in its product without releasing its changes back to the community. The tension between the ideology of free software and its other, additional, benefits led to the concept of Open Source in 1998.
The Open Source Initiative (OSI) was founded when, "We realized it was time to dump the confrontational attitude that has been associated with 'free software' in the past and sell the idea strictly on the same pragmatic, business-case grounds that motivated Netscape" (OSI 2003). Since the open source label is intended to cover open communities and licenses beyond the GPL, the OSI has developed a meta (more abstract) Open Source Definition (OSI 1997) which defines openness as: free redistribution; accessible source code; permitting derived works; ensuring the integrity of the author's source code; prohibiting discrimination against persons or groups; prohibiting discrimination against fields of endeavor; prohibiting NDA (Non-Disclosure Agreement) entanglements; ensuring the license is not specific to a product; ensuring the license does not restrict other software; and ensuring the license is technology-neutral. A copyright license which is found by OSI to satisfy these requirements will be listed as an OSI certified/approved license, including the GPL of course. Substantively, Free Software and Open Source are not that different: the differences are of motivation, personality, and strategy. The FLOSS (Free/Libre and Open Source Software) survey of 2,784 Free/Open Source (F/OS) developers found that 18% of those that identified with the Free Software community and 9% of those that identified with the Open Source community considered the distinction to be 'fundamental' (Ghosh et al. 2002:55). Given the freedom of these communities, forking (a split of the community where work is taken in a different direction) is common to the development of the software and its communities. One can conceive of the Open Source movement as having forked from the Free Software movement. The benefits of openness are not limited to the development of software. The Internet Engineering Task Force (IETF) and World Wide Web Consortium (W3C) host the authoring of technical specifications that are publicly available and implemented by applications that must interoperably communicate over the Internet. For example, different Web servers and browsers should be able to work together using the technical specifications of HTML, which structures a Web page, and HTTP, which is used to request and send Web pages. The approach of these organizations is markedly different from the 'big S' (e.g., ISO) standards organizations, which typically predicate membership on nationality and often only provide specifications for a fee. This model of openness has extended even to forms of cultural production beyond technical content. For example, Wikipedia is a collaborative encyclopedia and the Creative Commons provides licenses and community for supporting the sharing of texts, photos, and music. Openness and Voluntariness Organizations can be characterized along numerous criteria including size; public versus private ownership; criterion for membership; beneficiaries (cui bono); Hughes's voluntary, military, philanthropic, corporate, and family types; Parsons's social pattern variables; and Thompson and Tuden's decision-making strategies, among others (Blau and Scott 1962:40). I posit that within the contemporary usage of the term 'open,' one can identify a number of defining characteristics as well as an implicit connotation of voluntariness. Openness The definition of an 'open' community in the previous section is extensional: describing the characteristics of Free/Open Software (F/OS), and open standards and content.
While useful, this approach is incomplete because such a description is of products, not of the social organization of producers. For example, private firms do release F/OS software but this tells us little about how work is done 'in the open.' The approach of Tzouris was to borrow from the literature of 'epistemic' communities so as to provide four characteristics of 'free/open' communities: Shared normative and principled beliefs: refers to the shared understanding of the value-based rationale for contributing to the software. Shared causal beliefs: refers to the shared causal understanding of the reward structures. Therefore, shared causal beliefs have a coordinating effect on the development process. Shared notions of validity: refers to contributors' consensus that the adopted solution is a valid solution for the problem at hand. Common policy enterprise: refers to a common goal that can be achieved through contributing code to the software. In simple words, there is a mutual understanding, a common frame of reference of what to develop and how to do it. (Tzouris 2002:21) However, these criteria seem over-determined: it is difficult to imagine a coherent community ('open' or otherwise) that does not satisfy these requirements. Consequently, I provide an alternative set of criteria that also resists myopic notions of perfect 'openness' or 'democracy.' Very few organizations have completely homogeneous social structures. As argued in Why the Internet is Good: Community Governance That Works Well (Reagle 1999), even an organization like the IETF, with the credo of "We reject kings, presidents and voting. We believe in rough consensus and running code," has explicit authority roles and informal elders. Consequently, in the following definition of open communities there is some room for contention. An open community delivers or demonstrates: Open products: provides products which are available under licenses like those that satisfy the Open Source Definition. Transparency: makes its processes, rules, determinations, and their rationales available. Integrity: ensures the integrity of the processes and the participants' contributions. Non-discrimination: prohibits arbitrary discrimination against persons, groups, or characteristics not relevant to the community's scope of activity. Persons and proposals should be judged on their merits. Leadership should be based on meritocratic or representative processes. Non-interference: the linchpin of openness; if a constituency disagrees with the implementation of the previous three criteria, the first criterion permits them to take the products and commence work on them under their own conceptualization without interference. While 'forking' is often complained about in open communities -- it can create some redundancy/inefficiency -- it is an essential characteristic and major benefit of open communities as well. Voluntariness In addition to the models of organization referenced by Blau and Scott (1962), Amitai Etzioni describes three types of organizations: 'coercive' organizations that use physical means (or threats thereof), 'utilitarian' organizations that use material incentives, and 'normative' organizations that use symbolic awards and status.
He also describes three types of membership: 'alienative members' feel negatively towards the organization and wish to leave, 'calculative members' weigh benefits and limitations of belonging, and 'moral members' feel positively towards the organization and may even sublimate their own needs in order to participate (Etzioni 1961). As noted by Jennifer Lois (1999:118), normative organizations are the most underrepresented type of organization discussed in the sociological literature. Even so, Etzioni's model is sufficient such that I define a -- voluntary -- community as a 'normative' organization of 'moral' members. I adopt this synonymous definition not only because it allows me to integrate the character of the members into the character of the organization, but to echo the importance of the sense of the collaborative 'gift' in discussions among members of the community. Yet, obviously, not all voluntary organizations are necessarily open according to the definition above. A voluntary community can produce proprietary products and have opaque processes -- collegiate secret societies are a silly but demonstrative example. However, like with openness, it is difficult to draw a clear line: one cannot exclusively locate open communities and their members strictly within the 'normative' and 'moral' categories, though they are dominant in the open communities I introduced. Many members of those open communities are volunteers, either because of a 'moral' inclination and/or an informal 'calculative' concern with a sense of satisfaction and reputation. While the FLOSS survey concluded "that this activity still resembles rather a hobby than salaried work" (Ghosh et al. 2002:67), 15.7% of their sample declared they do receive some remuneration for developing F/OS. Even at the IETF and W3C, where many engineers are paid to participate, it is not uncommon for some to endeavor to maintain their membership even when not employed or when their employers change. The openness of these communities is perhaps dominant in describing the character of the organization, though the voluntariness is critical to understanding the moral/ideological light in which many of the members view their participation. Conclusion I've attempted to provide a definition for openness that reflects an understanding of contemporary usage. The popular connotation, and consequently the definition put forth in this essay, arises from well known examples that include -- at least in part -- a notion of voluntary effort. On further consideration, I believe we can identify a loose conceptualization of shared products, and a process of transparency, integrity, and non-discrimination. Brevity prevents me from considering variations of these characteristics and consequent claims of 'openness' in different communities. And such an exercise isn't necessary for my argument. A common behavior of an open community is the self-reflexive discourse of what it means to be open on difficult boundary cases; the test of an open community is if a constituency that is dissatisfied with the results of such a discussion can fork (relocate) the work elsewhere. Works Cited Blau, Peter and W. Richard Scott. Formal organizations: a comparative approach. New York, NY: John Wiley, 1962. Etzioni, Amitai. Modern organizations. New York, NY: Free Press of Glencoe, 1961. Ghosh, Rishab, Ruediger Glott, Bernhard Krieger, and Gregorio Robles. Free/Libre and open source software: survey and study. 2002. http://www.infonomics.nl/FLOSS/report/ Lois, Jennifer.
"Socialization to heroism: individualism and collectivism in a voluntary search and rescue group." Social Psychology Quarterly 62 (1999): 117-135. Nardi, Bonnie and Steve Whittaker. "The place of face-to-face communication in distributed work." Distributed Work. Ed. Pamela Hinds and Sara Kiesler. Boston, Ma: MIT Press., 2002. chapter 4. Reagle, Joseph. Why the Internet is good community governance that works well. 1999.http://cyber.law.harvard.edu/people/reagle/regulation-19990326.html Stallman, Richard. Free Software Foundation. 1996. http://www.gnu.org/fsf/fsf.html Stallman, Richard. Linux and the GNU project. 1997. http://www.gnu.org/gnu/linux-and-gnu.html Stallman, Richard. The GNU project. 1998. http://www.gnu.org/gnu/thegnuproject.html Tzouris, Menelaos. Software freedom, open software and the participant's motivation -- a multidisciplinary study. London, UK: London School of Economics and Political Science, 2002. Citation reference for this article MLA Style Reagle Jr., Joseph. "Open Content Communities." M/C Journal 7.3 (2004). <http://www.media-culture.org.au/0406/06_Reagle.rft.php>. APA Style Reagle Jr., J. (2004, Jul.) Open Content Communities, M/C Journal, 7(3), <http://www.media-culture.org.au/0406/06_Reagle.rft.php>.
APA, Harvard, Vancouver, ISO, and other styles
50

Kibby, Marjorie. "Shared Files." M/C Journal 6, no. 2 (2003). http://dx.doi.org/10.5204/mcj.2160.

Full text
Abstract:
The carefully constructed record collection with detailed liner notes and displayable album cover art is little more than a quaint anachronism for the twenty-year-olds of 2003. For them, a music collection is more likely to be a fat, glovebox-sized folder of anonymous CD-ROMs. Affective investments in particular bands, releases, and tracks have been replaced by a desire for a sort of musical 'affluence' where the size and currency of the collection is valued, rather than the constituent components of the collection. The explanation for this transition from the collection of fetishised albums to the folder of disposable files lies in the increasing dissatisfaction with the CD album as a product, and the development of technology that enabled file sharing to become an effective music distribution method. A decade before music file-sharing became widespread, Frith (73) commented on "the changing place of music in leisure generally ... music is being used differently and in different, more flexible forms." Frith was discussing home-taping. He recognised, as the record companies did not, that the explosion of home-taping was not simply a legal matter, a breach-of-copyright issue, but a production and marketing issue. The youth market, in particular, was using music in new and different ways, ways that demanded more flexibility in format and increased personalisation of music compilations. During the 90s music sales continued to decline, particularly in the youth segment. Fox & Wrenn report that "between 1991 and 2000, the overall market share of young consumers has declined substantially", dropping from eighteen per cent to thirteen per cent for fifteen to nineteen year olds, with similar figures for twenty to twenty-four year olds (112). Global music sales fell from $41.5 billion in 1995 to $38.5 billion in 1999. While there is a good deal of market research that attempts to prove that file sharing was responsible for the declining CD sales, much of the analysis is suspect and there are indications that assuming a cause-and-effect relationship is inaccurate. The growth period for music file sharing was 2000 to 2001, and during this period CD sales actually rose five per cent. In 2002 one of the most downloaded artists was Eminem, with millions downloading The Eminem Show before its release, yet the CD went on to set sales records. Very little market research addresses consumer satisfaction with CDs as a music product. However, focus groups conducted with groups of first-year university students suggest that young consumers in particular think that CDs are too expensive and that record companies are ripping them off; they don't like being forced to buy tracks that they do not want in order to own the tracks that they like; many prefer a mix of artists rather than a whole CD of the same performer; some do not want the case and cover and resent having to pay for them; some want their music in chunks longer than the fifty minutes or so of the average CD; and many feel that they get insufficient information about the artist and the track before they have to make the decision to buy. Overall the feeling was: "this is not a product I want, and it costs more than I think I should have to pay." Without a high level of consumer dissatisfaction with the music products on offer, peer-to-peer file sharing programs would not have been able to create the waves that they have in the music industry. Music file sharing is a social phenomenon, as well as a technological revolution.
Both the social function of music and the cooperative history of the Internet set the stage for music file sharing. Music consumption is grounded in a communal philosophy, and one of the pleasures of listening to music is that it connects the individual to a social group or subculture. The Internet's first civilian uses were based in collective efforts, and content was made freely accessible to all users. Commercialisation came later. According to Segal, this development history has "bred an entitlement philosophy in Internet users" (97). From its beginning the Internet facilitated the sharing of files, text, graphics, software and music. However, locating desired music files was only for the determined, and downloading them solely for the patient, until the development of compressed music formats like MP3 and specialised file sharing utilities such as Napster. Music consumers quickly discovered the benefits of music file sharing, and today's most popular services – KaZaA, Grokster, and Morpheus – have an estimated seventy million active users. Peer-to-peer music offers a different music consumption experience. First, it is free, and it is free in huge quantities. Endless numbers of tracks and albums are available, from golden oldies to yet-to-be-released, obscure examples of minor genres to the latest pop hit, major artists to you-haven't-heard-of-us-yet. This is inevitably changing consumers' relationship to music, with many "downloading music in an obsessive manner, without identifying with it or experiencing a passionate attachment" (Kasaras). Contributing to this is the uneven quality of files available for downloading – many have been uploaded with more enthusiasm than care, and may be misnamed, wrongly attributed, or of poor sound quality. Peer-to-peer music also provides an experience of community, as users chat about live performances and music-related products, and exchange information on lyrics and concert listings. Another aspect of p-2-p music that has had an impact on the way consumers experience music is the play-list editor, which allows music files to be categorised and ordered into lists for playback. Play-lists can be organised by artist, genre, date or theme into several hours of back-to-back music – providing, in essence, a personal radio station. The music collection becomes an evanescent experience, rather than a valued commodity. The music industry's immediate reaction to changing consumer behaviour was to attempt to litigate the competition out of existence, or to buy them out. A belated response was to establish rival services. However, MusicNet, PressPlay and the new commercial Napster have met with a lukewarm response from their target market. All have fairly limited lists of files available – no full albums, few recent releases – all are expensive for what they provide, and all have severe restrictions on how much can be downloaded, and how the downloaded files can be used. As a Time article commented, "All three are so restrictive you would think you were downloading homeland-security documents, not 'N Sync" (Taylor 74). The dissatisfaction with the commercial music file subscription services, and the decentralised nature of the new p-2-p networks, has led the popular press to hail a democratic revolution in music distribution. However, its optimism may be a little premature, as the current file-sharing networks are not without problems for the consumer.
Industry providers will always have advantages over amateur file uploaders in the areas of standards, convenience and quality. Finding other than top-forty tracks is still a time-consuming activity, and downloading over a modem still takes time. When offered several choices of a track, only trial and error can determine which is the best choice. Making the wrong choice often means downloading an unplayable file. The current generation of file-sharing services may be more amorphous than those they replaced, but though they are distributed networks, the file traffic is concentrated in a single direction. Only a small number of users actually contribute files, and of this group perhaps only 1% respond to requests for files. Because the music files are treated as a public good, most users feel entitled to download files without ever contributing files to the pool. When the majority ride free on the efforts of others, the performance of the system is seriously degraded. Free riding also makes the system open to legal action. Though in theory the seventy million users are beyond lawsuits because of the anonymity of numbers, if only a few are uploading files then these few are vulnerable to service bans and litigation. As the number of users continues to grow, the problem of free riding compounds, and if users don't contribute to the public good on which all depend, the system is in danger of collapse (Adar and Huberman). Peer-to-peer file sharing is therefore unlikely to replace industry-mediated music consumption in the long term. However, that does not mean that the CD will be restored to top place in consumer affections. The album is pretty much a seventies concept, largely dictated by the demands of producing, stocking and selling vinyl records. Increasingly, young consumers are rejecting the concept. Digital technology and Internet distribution have made possible new ways of experiencing music, and consumers are becoming accustomed to new norms of music consumption: cheap or free, flexibility of formats, immediacy, breadth of choice, connections with artists and other fans, and access to related commodities. Increasingly they are looking to music as a service, rather than a product. The sheer amount and diversity of music available through p-2-p networks has created a music consumer with immense, but reconfigured, appetites. The industry's current business model is dependent on controlling the distribution of a physical product to a mass market. To meet the needs of the 'new' music consumer it will have to abandon this model and adopt the one-on-one interactive model of the Internet – music: any kind, anytime, anywhere. Works Cited Adar, Eytan, and Bernardo A. Huberman. "Free Riding on Gnutella." First Monday 5.10 (2000). 13 Jan. 2003 <http://www.firstmonday.org/>. Fox, Mark, and Bruce Wrenn. "A Broadcasting Model for the Music Industry." JMM 3.2 (2001): 112-9. Frith, Simon. "The Industrialisation of Popular Music." Popular Music and Communication. Ed. James Lull. London: Sage, 1987. 53-77. Kasaras, Kostas. "Music in the Age of Free Distribution." First Monday 7.1 (2002). 11 July 2002 <http://firstmonday.org/issue7_1/kasaras/index.php>. Segal, Adam. "Dissemination of Digitised Music on the Internet: A Challenge to the Copyright Act." Computer and High Technology Law Journal 12 (1996): 97-138. Taylor, Chris. "Hitting All the Wrong Notes." Time 159.8 (25 Feb. 2002): 74.
Citation reference for this article MLA Style Kibby, Marjorie. "Shared Files." M/C: A Journal of Media and Culture 6.2 (2003). <http://www.media-culture.org.au/0304/05-sharedfiles.php>. APA Style Kibby, M. (2003, Apr. 23). Shared Files. M/C: A Journal of Media and Culture, 6, <http://www.media-culture.org.au/0304/05-sharedfiles.php>.
APA, Harvard, Vancouver, ISO, and other styles