Academic literature on the topic 'Emergency management – United States – Data processing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Emergency management – United States – Data processing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Emergency management – United States – Data processing"

1

Grams, Ralph R., Georgina C. Peck, James K. Massey, and James J. Austin. "Review of Hospital Data Processing in the United States (1982–1984)." Journal of Medical Systems 9, no. 4 (August 1985): 175–269. http://dx.doi.org/10.1007/bf00992884.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bennett, PhD, DeeDee M. "Diversity in emergency management scholarship." Journal of Emergency Management 17, no. 2 (March 1, 2019): 148. http://dx.doi.org/10.5055/jem.2019.0407.

Full text
Abstract:
Women and racial/ethnic minorities have long been underrepresented in the field of emergency management. This is true for both practice and research. The lack of women and racial/ethnic minorities in the profession and their perceived absence in research or scholarly study may have impacts on the effectiveness of response and recovery efforts as well as the broader scientific knowledge within the field. Historically, women and racial/ethnic minority communities have disproportionately experienced negative impacts following disasters. Earlier related studies have pointed to the underrepresentation as a contributing factor in community vulnerability. The scarcity of women in practice and as students in this field has been particularly evident in the United States. Using data from a recent survey of emergency management programs nationwide, this article reviews the concerns in research with regards to women and ethnic minority communities during disasters, efforts to increase representation of these groups in the field, and discusses the implications for practice, policy, and future research. The findings show that women have a strong presence in emergency management programs nationwide, and while specific data on racial and ethnic minorities are lacking, the observed increases reported in this article encourages further study.
APA, Harvard, Vancouver, ISO, and other styles
3

Pinet-Peralta, PhD, Luis M., Rick Bissell, PhD, Katrina Hein, BSc, MSc, and David Prakash, MSc. "Emergency management policies and natural hazards in the United States: A state-level analysis." Journal of Emergency Management 9, no. 2 (March 1, 2011): 27. http://dx.doi.org/10.5055/jem.2011.0051.

Full text
Abstract:
Every year, natural hazards kill and injure hundreds of people and also have significant social, economic, and political effects on society. However, not all disasters or crises are the focus of state, regional, or national efforts to mitigate their effects. In this article, the authors use Wilson’s policy typology to describe the unintended consequences that disaster legislation has had on the distribution of costs and benefits of disaster relief programs in the United States. The data provide evidence that the concentration of disaster relief programs for natural disasters is not based on need and that interest groups commonly drive disaster policies to benefit those with the greatest risk for losses rather than those in greatest need. Policymakers can use this information to examine both intended and unintended consequences of disaster response and recovery policies and can orient the limited resources available toward those who are least capable of recovering from natural disasters.
APA, Harvard, Vancouver, ISO, and other styles
4

Saud, Pradip, Jingxin Wang, Benktesh D. Sharma, and Weiguo Liu. "Carbon impacts of hardwood lumber processing in the northeastern United States." Canadian Journal of Forest Research 45, no. 12 (December 2015): 1699–710. http://dx.doi.org/10.1139/cjfr-2015-0082.

Full text
Abstract:
Carbon emission from hardwood lumber processing in different-sized sawmills under varying energy sources, management strategies, and potential carbon offsetting capacity through useful life (service life) of lumber in the northeastern United States was analyzed using analytical statistics such as analysis of variance (ANOVA), mixed-effect model, principal component analysis, and Monte Carlo simulation. Data obtained from a regional sawmill survey (Pennsylvania, New York, Ohio, and West Virginia), energy audit of sawmills, public databases, and relevant literature were analyzed for the gate-to-gate life cycle inventory framework. Results showed that mean carbon emission (megagrams (Mg) per thousand cubic metres (TCM)) for lumber processing significantly differs among sawmill sizes. The total carbon emission from electricity consumption and wood residue of lumber processing was approximately 62.5%, 80.3%, and 66.2% of carbon stored in lumber processed for small, medium, and large sawmills, respectively. Efficient management and potential opportunities of improvement in sawmills can significantly reduce carbon emission (10.96% ± 1.57%) from hardwood lumber processing. Carbon stock from lumber production could be enhanced by either reducing carbon emission from energy consumption or decreasing lumber export quantity. The carbon emission–loss ratio (CELR) suggested that after 100 years, nearly 50% of carbon stored in lumber would be still available for carbon accountability. Electricity generation from either a single resource (natural gas) or mixed resources as is the case in RFC EAST (eGrid subregion) would be beneficial in lowering carbon emission from sawmill processing.
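The study above uses, among other tools, Monte Carlo simulation to bound uncertainty in mill-level carbon emissions. As a rough illustration only, and not the authors' model, the Python sketch below propagates assumed uncertainty in electricity use and a grid emission factor to a distribution of emissions per thousand cubic metres (TCM) of lumber; every distribution and parameter value is a placeholder.

```python
# Illustrative Monte Carlo propagation of uncertainty in sawmill carbon
# emissions per thousand cubic metres (TCM) of lumber. All distributions
# and parameter values are hypothetical placeholders, not survey results.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo draws

# Assumed inputs (placeholders): electricity use per TCM and the grid
# emission factor, each with a plausible spread.
electricity_mwh_per_tcm = rng.normal(loc=160.0, scale=20.0, size=n)   # MWh/TCM
grid_factor_mg_per_mwh = rng.normal(loc=0.45, scale=0.05, size=n)     # Mg CO2e/MWh
wood_residue_mg_per_tcm = rng.triangular(5.0, 8.0, 12.0, size=n)      # Mg CO2e/TCM

emissions_mg_per_tcm = (
    electricity_mwh_per_tcm * grid_factor_mg_per_mwh + wood_residue_mg_per_tcm
)

mean = emissions_mg_per_tcm.mean()
lo, hi = np.percentile(emissions_mg_per_tcm, [2.5, 97.5])
print(f"Mean emissions: {mean:.1f} Mg CO2e/TCM (95% interval {lo:.1f}-{hi:.1f})")
```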
APA, Harvard, Vancouver, ISO, and other styles
5

Barnes, Jessica, Larry Segars, Jason Wasserman, Patrick Karabon, and Tracey A. Taylor. "611. Infectious Disease Management of Homeless and Non-Homeless Populations in United States Emergency Departments." Open Forum Infectious Diseases 7, Supplement_1 (October 1, 2020): S366. http://dx.doi.org/10.1093/ofid/ofaa439.805.

Full text
Abstract:
Background: Studies have long documented the increased emergency department usage in the United States by homeless persons compared to their housed counterparts, as well as an increased overall prevalence of infectious diseases. However, there is a gap in knowledge on the treatment that homeless persons receive for these infectious diseases within United States emergency departments compared to their housed counterparts. This study seeks to understand this potential difference in treatment, including diagnostic services tested, procedures performed, and medications prescribed. Methods: This study utilized a retrospective, cohort study design to analyze data from the 2007-2010 National Hospital Ambulatory Medical Care Survey (NHAMCS) database. Complex sample logistic regression analysis was used to compare variables, including diagnostic services, procedures, and medication classes prescribed between homeless and private residence individuals seeking emergency department treatment for infectious diseases. This provided an odds ratio to compare the two populations, which was then adjusted for confounding variables. Results: Compared to private residence individuals, homeless persons were more likely (OR: 10.99, p< 0.05, CI: 1.08-111.40) to receive sutures or staples when presenting with an infectious disease in United States emergency departments. Compared to private residence persons, homeless individuals were less likely (OR: 0.29, p< 0.05, CI: 0.10-0.87) to be provided medications or immunizations when presenting with an infectious disease in United States emergency departments, and significant differences were detected in prescribing habits of multiple medication classes. Conclusion: This study detected a significant difference in suturing/stapling and medication prescribing patterns for homeless persons with an infectious disease in United States emergency departments, compared to their housed counterparts. These results provide a platform for continual research. Disclosures: All Authors: No reported disclosures
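The reported odds ratios come from complex-sample logistic regression on NHAMCS records. A bare-bones sketch of the underlying idea, standard logistic regression producing adjusted odds ratios, is given below; it deliberately ignores the NHAMCS survey weights and strata that the study itself accounts for, and the column names and data are invented.

```python
# Minimal logistic-regression sketch for comparing an outcome (e.g., received
# sutures/staples) between homeless and private-residence ED visits.
# Column names and the survey-design handling are simplified assumptions;
# a real NHAMCS analysis must use the complex-sample weights and strata.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "homeless": rng.binomial(1, 0.1, 500),
    "age": rng.integers(18, 90, 500),
    "received_sutures": rng.binomial(1, 0.05, 500),
})

X = sm.add_constant(df[["homeless", "age"]].astype(float))
model = sm.Logit(df["received_sutures"], X).fit(disp=False)

odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(odds_ratios)   # adjusted odds ratio for "homeless" vs. housed visits
print(conf_int)      # 95% confidence intervals on the odds-ratio scale
```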
APA, Harvard, Vancouver, ISO, and other styles
6

Laney, Christine, Katherine LeVan, Claire Lunch, and Katherine Thibault. "Sample Management Across the National Ecological Observatory Network." Biodiversity Information Science and Standards 2 (May 18, 2018): e25351. http://dx.doi.org/10.3897/biss.2.25351.

Full text
Abstract:
From 81 study sites across the United States, the US National Ecological Observatory Network (NEON), generates >75,000 samples per year. Samples range from soil and dust deposition material, tissue samples (e.g., small mammals and fish), DNA extracts, and whole organisms (e.g., ground beetles and ticks). Samples are collected, processed, and documented according to protocols that are standardized across study sites and according to the needs of the ecological research community for future studies. NEON has faced numerous challenges with managing data related to these many diverse physical samples, particularly when data are gathered at numerous steps throughout processing. Here, we share these challenges as well as solutions, including innovative semantically driven software tools and processing pipelines that manage data from each sample's point of collection to its ultimate fate (consumption, archive facility, or partnering data repository) while maintaining links across sample hierarchies.
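The core difficulty described here is keeping every derived sample (tissue, DNA extract, archived vial) linked back to its parent through each processing step. The toy Python sketch below shows one way such a chain of custody could be recorded; it is not NEON's actual data model, and the identifiers, fields, and fates are invented for illustration.

```python
# Toy parent-child sample registry illustrating how a chain of custody can be
# preserved from field collection to final fate. Not NEON's schema; all
# identifiers, fields, and fate values are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sample:
    sample_id: str
    sample_type: str             # e.g., "whole organism", "tissue", "DNA extract"
    parent_id: Optional[str] = None
    fate: str = "in processing"  # e.g., "archived", "consumed", "repository"

registry: dict[str, Sample] = {}

def register(sample: Sample) -> None:
    registry[sample.sample_id] = sample

def lineage(sample_id: str) -> list[str]:
    """Walk parent links back to the original field collection."""
    chain = []
    current: Optional[str] = sample_id
    while current is not None:
        chain.append(current)
        current = registry[current].parent_id
    return chain

register(Sample("MAM-0001", "whole organism", fate="archived"))
register(Sample("MAM-0001-T1", "tissue", parent_id="MAM-0001", fate="consumed"))
register(Sample("MAM-0001-D1", "DNA extract", parent_id="MAM-0001-T1",
                fate="repository"))

print(lineage("MAM-0001-D1"))  # ['MAM-0001-D1', 'MAM-0001-T1', 'MAM-0001']
```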
APA, Harvard, Vancouver, ISO, and other styles
7

Mahmood, Rezaul, Ryan Boyles, Kevin Brinson, Christopher Fiebrich, Stuart Foster, Ken Hubbard, David Robinson, Jeff Andresen, and Dan Leathers. "Mesonets: Mesoscale Weather and Climate Observations for the United States." Bulletin of the American Meteorological Society 98, no. 7 (July 1, 2017): 1349–61. http://dx.doi.org/10.1175/bams-d-15-00258.1.

Full text
Abstract:
Abstract Mesoscale in situ meteorological observations are essential for better understanding and forecasting the weather and climate and to aid in decision-making by a myriad of stakeholder communities. They include, for example, state environmental and emergency management agencies, the commercial sector, media, agriculture, and the general public. Over the last three decades, a number of mesoscale weather and climate observation networks have become operational. These networks are known as mesonets. Most are operated by universities and receive different levels of funding. It is important to communicate the current status and critical roles the mesonets play. Most mesonets collect standard meteorological data and in many cases ancillary near-surface data within both soil and water bodies. Observations are made by a relatively spatially dense array of stations, mostly at subhourly time scales. Data are relayed via various means of communication to mesonet offices, with derived products typically distributed in tabular, graph, and map formats in near–real time via the World Wide Web. Observed data and detailed metadata are also carefully archived. To ensure the highest-quality data, mesonets conduct regular testing and calibration of instruments and field technicians make site visits based on “maintenance tickets” and prescheduled frequencies. Most mesonets have developed close partnerships with a variety of local, state, and federal-level entities. The overall goal is to continue to maintain these networks for high-quality meteorological and climatological data collection, distribution, and decision-support tool development for the public good, education, and research.
APA, Harvard, Vancouver, ISO, and other styles
8

Kim, Jung Wook, and Kyujin Jung. "Does Voluntary Organizations’ Preparedness Matter in Enhancing Emergency Management of County Governments?" Lex localis - Journal of Local Self-Government 14, no. 1 (January 2, 2016): 1–17. http://dx.doi.org/10.4335/14.1.1-17(2016).

Full text
Abstract:
While voluntary organizations have played a critical role in preparing for and responding to disasters, few have intentionally examined the preparedness of voluntary organizations, which are fundamentally required to enhance local emergency management. The purpose of this research is to examine the relationship between the preparedness of voluntary organizations and their effectiveness on local emergency management. By using a survey data collected among county governments in the United States, this research tests the effect of voluntary organizations’ preparedness on local emergency management. The results show that the voluntary organizations' preparedness behaviors such as their participation in local emergency planning as well as training, education, and resources for local emergency management positively affect their effectiveness on local emergency management. The findings imply that systemic volunteer management can build more effective emergency management systems through cohesive and comprehensive collaboration between public and voluntary organizations.
APA, Harvard, Vancouver, ISO, and other styles
9

Lai, Anita, Elliott Tenpenny, David Nestler, Erik Hess, and Ian G. Stiell. "Comparison of management and outcomes of ED patients with acute decompensated heart failure between the Canadian and United States’ settings." CJEM 18, no. 2 (June 22, 2015): 81–89. http://dx.doi.org/10.1017/cem.2015.43.

Full text
Abstract:
Introduction: The objective of this study was to compare the emergency department (ED) management and rate of admission of acute decompensated heart failure (ADHF) between two hospitals in Canada and the United States and to compare the outcomes of these patients. Methods: This was a health records review of adults presenting with ADHF to two EDs in Canada and the United States between January 1 and April 30, 2010. Outcome measures were admission to the hospital, myocardial infarction (MI), and death or relapse rates to the ED. Data were analysed using descriptive, univariate and multivariate analyses. Results: In total, 394 cases were reviewed and 73 were excluded. Comparing 156 Canadian to 165 U.S. patients, respectively, mean age was 76.0 and 75.8 years; male sex was 54.5% and 52.1%. Canadian and U.S. ED treatments were noninvasive ventilation 7.7% v. 12.8% (p=0.13); IV diuretics 77.6% v. 36.0% (p<0.001); IV nitrates 4.5% v. 6.7% (p=0.39). There were significant differences in rate of admission (50.6% v. 95.2%, p<0.001) and length of stay in ED (6.7 v. 3.0 hours, p<0.001). Proportion of Canadian and U.S. patients who died within 30 days of the ED visit was 5.1% v. 9.7% (p=0.12); relapsed to the ED within 30 days was 20.8% v. 17.5% (p=0.5); and had MI within 30 days was 2.0% v. 1.9% (p=1.0). Conclusions: The U.S. and Canadian centres saw ADHF patients with similar characteristics. Although the U.S. site had almost double the admission rate, the outcomes were similar between the sites, which questions the necessity of routine admission for patients with ADHF.
APA, Harvard, Vancouver, ISO, and other styles
10

Lentzos, Filippa. "The American biodefense industry: From emergency to nonemergence." Politics and the Life Sciences 26, no. 1 (March 2007): 15–23. http://dx.doi.org/10.2990/26_1_15.

Full text
Abstract:
Since 1998, and especially since the “Amerithrax” emergency of 2001, the United States has ambitiously funded biodefense projects, intending not only to enhance detection and management of any biological-weapons attack but also to establish a robust domestic biodefense industry. I asked if the United States had fulfilled this latter intention. Using the RAND Corporation's RaDiUS database, I examined federal biodefense grants and contracts awarded from 1995 through most of 2005, noting recipient type, awarding unit, funding level, and the disease focus of research-and-development support. Patterns in these data as well as other sources suggest that the biodefense industry as late as 2005 remained in a nascent stage, with most firms small, precariously financed, and more responsive to funders' announcements and solicitations than to opportunities for self-directed innovation. A biodefense industry with investor-capital funding and retained earnings, with its own leading companies, with its own stock analysts, and with its own legitimacy in commercial and financial markets did not emerge over the period studied, nor does its emergence appear imminent.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Emergency management – United States – Data processing"

1

Indrakanti, Saratchandra. "Computational Methods for Vulnerability Analysis and Resource Allocation in Public Health Emergencies." Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc804902/.

Full text
Abstract:
POD (Point of Dispensing)-based emergency response plans involving mass prophylaxis may seem feasible when considering the choice of dispensing points within a region, overall population density, and estimated traffic demands. However, the plan may fail to serve particular vulnerable sub-populations, resulting in access disparities during emergency response. Federal authorities emphasize on the need to identify sub-populations that cannot avail regular services during an emergency due to their special needs to ensure effective response. Vulnerable individuals require the targeted allocation of appropriate resources to serve their special needs. Devising schemes to address the needs of vulnerable sub-populations is essential for the effectiveness of response plans. This research focuses on data-driven computational methods to quantify and address vulnerabilities in response plans that require the allocation of targeted resources. Data-driven methods to identify and quantify vulnerabilities in response plans are developed as part of this research. Addressing vulnerabilities requires the targeted allocation of appropriate resources to PODs. The problem of resource allocation to PODs during public health emergencies is introduced and the variants of the resource allocation problem such as the spatial allocation, spatio-temporal allocation and optimal resource subset variants are formulated. Generating optimal resource allocation and scheduling solutions can be computationally hard problems. The application of metaheuristic techniques to find near-optimal solutions to the resource allocation problem in response plans is investigated. A vulnerability analysis and resource allocation framework that facilitates the demographic analysis of population data in the context of response plans, and the optimal allocation of resources with respect to the analysis are described.
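The dissertation formulates the allocation of limited resources across Points of Dispensing (PODs) and investigates metaheuristics for near-optimal solutions. As a loose illustration of the flavor of such heuristics, and not the author's formulation, the sketch below greedily assigns resource units to whichever POD has the largest remaining vulnerability-weighted unmet demand; POD names, demands, and the unit size are hypothetical.

```python
# Greedy allocation of a fixed number of resource units across PODs,
# always serving the POD with the most unmet (vulnerability-weighted) demand.
# POD names, demand values, and the unit size are hypothetical.
import heapq

def greedy_allocate(demand: dict[str, float], units: int,
                    unit_size: float) -> dict[str, int]:
    """Assign `units` resource units; each unit covers `unit_size` demand."""
    allocation = {pod: 0 for pod in demand}
    unmet = [(-d, pod) for pod, d in demand.items()]  # max-heap via negation
    heapq.heapify(unmet)
    for _ in range(units):
        neg_d, pod = heapq.heappop(unmet)
        allocation[pod] += 1
        heapq.heappush(unmet, (neg_d + unit_size, pod))  # reduce unmet demand
    return allocation

weighted_demand = {"POD-A": 1200.0, "POD-B": 800.0, "POD-C": 450.0}
print(greedy_allocate(weighted_demand, units=10, unit_size=200.0))
```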
APA, Harvard, Vancouver, ISO, and other styles
2

Bullock, Kenneth F. "Navy Marine Corps Intranet : an analysis of its approach to the challenges associated with seat management contracting." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03Jun%5FBullock.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Ling, Meng-Chun. "Senior health care system." CSUSB ScholarWorks, 2005. https://scholarworks.lib.csusb.edu/etd-project/2785.

Full text
Abstract:
Senior Health Care System (SHCS) is created for users to enter participants' conditions and store information in a central database. When users are ready for quarterly assessments the system generates a simple summary that can be reviewed, modified, and saved as part of the summary assessments, which are required by Federal and California law.
APA, Harvard, Vancouver, ISO, and other styles
4

"Selling chain reengineering enabled by information technology: a case of data general corporation." Chinese University of Hong Kong, 1997. http://library.cuhk.edu.hk/record=b5889034.

Full text
Abstract:
by Leung Man-Wai, Dannie.
Thesis (M.B.A.)--Chinese University of Hong Kong, 1997.
Includes bibliographical references (leaves 68-70).
The record supplies a table of contents in place of an abstract. Its chapters cover: Introduction; Methodology; Literature Review (what reengineering is, approaches to reengineering, the role of management, why reengineering projects fail or succeed, the relationship between reengineering and information technology, human dimensions, and the concept of selling chain management); Reengineering the Selling Chain at Data General Corporation (company background, call to action, the reengineering competency group, problems before reengineering, the three phases of reengineering, redesign concepts applied, lessons learnt, the reengineered Data General, and critical success factors); Technology-Enabled Selling (aligning it with selling chain management, its impact on the sales organization and customer relationships, and its building blocks: opportunity management, marketing information, sales configuration, sales order management, and interactive selling systems, plus leveraging its benefits and selling-model considerations); Redesigning Data General's Selling Chain in Asia; and Conclusion, followed by the bibliography.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Emergency management – United States – Data processing"

1

Emergency planning for the year 2000: Preparation or panic? : hearing before the Special Committee on the Year 2000 Technology Problem, United States Senate, One Hundred Fifth Congress, second session, on the preparedness of emergency service agencies at the state, county, and local government levels, October 2, 1998. Washington: U.S. G.P.O., 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

United States. Congress. House. Committee on Transportation and Infrastructure. Subcommittee on Oversight, Investigations, and Emergency Management. Program data quality: Hearing before the Subcommittee on Oversight, Investigations, and Emergency Management of the Committee on Transportation and Infrastructure, House of Representatives, One Hundred Sixth Congress, second session, March 22, 2000. Washington: U.S. G.P.O., 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

United States. Congress. House. Select Committee on Homeland Security. Subcommittee on Intelligence and Counterterrorism. Department of Homeland Security's Information Analysis and Infrastructure Protection budget proposal for fiscal year 2005: Joint hearing before the Subcommittee on Intelligence and Counterterrorism and Subcommittee on Infrastructure and Border Security of the Select Committee on Homeland Security, House of Representatives, One Hundred Eighth Congress, second session, March 4, 2004. Washington: U.S. G.P.O., 2005.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

United States. Congress. House. Committee on Transportation and Infrastructure. Subcommittee on Economic Development, Public Buildings, and Emergency Management, ed. Recovery Act project to replace the Social Security Administration's national computer center: Hearing before the Committee on Ways and Means, Subcommittee on Social Security, joint with the Committee on Transportation and Infrastructure, Subcommittee on Economic Development, Public Buildings, and Emergency Management, U.S. House of Representatives, One Hundred Eleventh Congress, first session, December 15, 2009. Washington: U.S. G.P.O., 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

United States. Congress. Senate. Committee on Environment and Public Works. Subcommittee on Superfund and Waste Management. Consider S. 3871, a bill directing the EPA to establish a hazardous waste manifest: Hearing before the Subcommittee on Superfund and Waste Management of the Committee on Environment and Public Works, United States Senate, One Hundred Ninth Congress, second session, September 28, 2006. Washington: U.S. G.P.O., 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

United States. Congress. House. Committee on Transportation and Infrastructure. Subcommittee on Economic Development, Public Buildings, and Emergency Management, ed. Joint oversight hearing on managing costs and mitigating delays in the new building of Social Security's new National Computer Center: Joint hearing before the Subcommittee on Social Security of the Committee on Ways and Means and the Subcommittee on Economic Development, Public Buildings, and Emergency Management of the Committee on Transportation and Infrastructure of the U.S. House of Representatives, One Hundred Twelfth Congress, first session, February 11, 2011. Washington: U.S. G.P.O., 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

United States Army Soldier Support Institute. Automatic data processing management handbook. 4th ed. [Ft. Benjamin Harrison, IN]: The Institute, 1985.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Sui, Daniel Z., 1965-, ed. Geospatial technologies and homeland security: Research frontiers and future challenges. [Dordrecht, Netherlands?]: Springer, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Emergency management – United States – Data processing"

1

Segall, Richard S., and Gao Niu. "Overview of Big Data and Its Visualization." In Advances in Data Mining and Database Management, 1–32. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-3142-5.ch001.

Full text
Abstract:
Big Data is data sets that are so voluminous and complex that traditional data processing application software are inadequate to deal with them. This chapter discusses what Big Data is and its characteristics, and how this information revolution of Big Data is transforming our lives and the new technology and methodologies that have been developed to process data of these huge dimensionalities. This chapter discusses the components of the Big Data stack interface, categories of Big Data analytics software and platforms, descriptions of the top 20 Big Data analytics software. Big Data visualization techniques are discussed with real data from fatality analysis reporting system (FARS) managed by National Highway Traffic Safety Administration (NHTSA) of the United States Department of Transportation. Big Data web-based visualization software are discussed that are both JavaScript-based and user-interface-based. This chapter also discusses the challenges and opportunities of using Big Data and presents a flow diagram of the 30 chapters within this handbook.
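The chapter pairs Big Data concepts with visualization of NHTSA's Fatality Analysis Reporting System (FARS) data. A minimal, generic sketch of that kind of plot is shown below; it uses an invented aggregate table rather than the actual FARS files, and the yearly counts are placeholders.

```python
# Minimal bar-chart sketch in the spirit of FARS visualizations.
# The yearly counts below are invented placeholders, not NHTSA figures.
import matplotlib.pyplot as plt
import pandas as pd

fatalities = pd.DataFrame({
    "year": [2011, 2012, 2013, 2014, 2015],
    "count": [32000, 33500, 32800, 32600, 35000],  # hypothetical values
})

ax = fatalities.plot.bar(x="year", y="count", legend=False, color="steelblue")
ax.set_xlabel("Year")
ax.set_ylabel("Traffic fatalities (hypothetical)")
ax.set_title("FARS-style yearly fatality counts (illustrative data)")
plt.tight_layout()
plt.show()
```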
APA, Harvard, Vancouver, ISO, and other styles
2

"From Catastrophe to Recovery: Stories of Fishery Management Success." In From Catastrophe to Recovery: Stories of Fishery Management Success, edited by Ronald J. Essig, R. Wilson Laney, Max H. Appelman, Fred A. Harris, Roger A. Rulifson, and Kent L. Nelson. American Fisheries Society, 2019. http://dx.doi.org/10.47886/9781934874554.ch22.

Full text
Abstract:
The Striped Bass Morone saxatilis is an extremely important commercial and recreational species with a coastal migratory stock in the United States referred to as “Atlantic Striped Bass” managed by the Atlantic States Marine Fisheries Commission (ASMFC). Atlantic Striped Bass has four major contributing stocks, including the Chesapeake Bay, which comprises 70–90%, and the Hudson River, the Delaware River, and the Albemarle Sound/Roanoke River (A/R). The collapse of Atlantic Striped Bass in the late 1970s precipitated federal funding and legislation like the Emergency Striped Bass Study for research on causative factors of the decline and potential management recommendations. The 1981 ASMFC Interstate Fishery Management Plan (ISFMP) for Atlantic Striped Bass was nonmandatory and mostly ineffective until the 1984 Atlantic Striped Bass Conservation Act provided regulatory authorities to the ASMFC and the federal government to close fisheries in states out of compliance with ISFMPs. Restrictions and moratoria on harvest imposed in several states reduced mortality, and under favorable environmental conditions and given Striped Bass life history, multiple years of good recruitment occurred. This allowed target thresholds for female spawning stock biomass to be achieved and the ASMFC to declare recoveries of Atlantic Striped Bass stocks from 1995 to 1998. Regulation of river flows was particularly important for the A/R stock recovery, and this stock is presented as a case study. During the 20+ years following recovery, long-term monitoring by states in support of adaptive management was primarily supported by the stable, nonappropriated funding of the Sport Fish Restoration Act. Monitoring includes spawning stock characterization and biomass estimation, juvenile abundance surveys, cooperative coastwide tagging, and harvest data collection. Future issues facing the recovered Atlantic Striped Bass include interspecies effects of relatively high abundance, management of stocks separately instead of as a single coastal stock, and ecosystem-based fisheries management. Key lessons learned in the Atlantic Striped Bass recovery are that high societal value of the species provided the political impetus to create and fund the recovery program, coordination of management and enforcement efforts among all jurisdictions was essential for this migratory species, and fully funded long-term monitoring programs are critical to adaptive population management.
APA, Harvard, Vancouver, ISO, and other styles
3

Kuruvilla, Abey, and Suraj M. Alexander. "Predicting Ambulance Diversion." In Advancing the Service Sector with Evolving Technologies, 1–10. IGI Global, 2012. http://dx.doi.org/10.4018/978-1-4666-0044-7.ch001.

Full text
Abstract:
The high utilization level of emergency departments in hospitals across the United States has resulted in the serious and persistent problem of ambulance diversion. This problem is magnified by the cascading effect it has on neighboring hospitals, delays in emergency care, and the potential for patients’ clinical deterioration. We provide a predictive tool that would give advance warning to hospitals of the impending likelihood of diversion. We hope that with a predictive instrument, such as the one described in this paper, hospitals can take preventive or mitigating actions. The proposed model, which uses logistic and multinomial regression, is evaluated using real data from the Emergency Management System (EM Systems) and 911 call data from Firstwatch® for the Metropolitan Ambulance Services Trust (MAST) of Kansas City, Missouri. The information in these systems that was significant in predicting diversion includes recent 911 calls, season, day of the week, and time of day. The model illustrates the feasibility of predicting the probability of impending diversion using available information. We strongly recommend that other locations, nationwide and abroad, develop and use similar models for predicting diversion.
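The model described combines recent 911 call volume, season, day of week, and time of day in logistic and multinomial regression to estimate the probability of impending diversion. A compact sketch of the binary-logistic version follows; the data and feature names are synthetic stand-ins for the EM Systems and FirstWatch inputs named in the chapter, not the authors' fitted model.

```python
# Logistic-regression sketch for predicting ambulance diversion in the next
# hour from call volume and calendar features. Data and feature names are
# synthetic stand-ins for the EM Systems / FirstWatch inputs.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "calls_last_hour": rng.poisson(12, n),
    "hour_of_day": rng.integers(0, 24, n),
    "is_weekend": rng.integers(0, 2, n),
    "winter_season": rng.integers(0, 2, n),
})
# Synthetic ground truth: diversion becomes likelier as recent calls rise.
logit = -4.0 + 0.25 * df["calls_last_hour"] + 0.5 * df["winter_season"]
df["diversion_next_hour"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="diversion_next_hour"), df["diversion_next_hour"],
    test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Hold-out accuracy:", clf.score(X_test, y_test))
print("Predicted diversion probability for first test hour:",
      clf.predict_proba(X_test)[0, 1])
```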
APA, Harvard, Vancouver, ISO, and other styles
4

Lim, Soo Kyoung. "A Framework to Evaluate the Informatization Level." In Information Technology Evaluation Methods and Management, 144–52. IGI Global, 2001. http://dx.doi.org/10.4018/978-1-878289-90-2.ch009.

Full text
Abstract:
As information and communication technologies have rapidly developed in the 1990s, enormous changes have taken place everywhere. At work environment, these have been newer tools for increasing organizational productivity, and these are transforming organizations to the degree that Taylorism once did (Davenport, 1998). These trends have spread over various fields of society, and have over countries caused economical and cultural innovation and reformation. These phenomena can be summarized as informatization. Informatization is defined as “converting the main goods and energy of a social economy to information through the revolution of high data communication technology and utilizing information produced by gathering, processing and distributing data within the vast fields of the society” (National Computerization Agency [NCA], 1997). Since The United States’ NII project has been evaluated as one of the important success factors for economical growth, most countries have considered informatization as one of the most effective means for improving a nation’s competitiveness. Similarly, many organizations have considered informatization as a strategy to improve quality of public service and productivity. They have tried to implement informatization and extensive investments are often budgeted and expanded to acquire information technology (IT). An Information Strategy Plan (ISP) is needed at first to implement informatization of an organization. ISP usually includes business strategy, information technology strategy, project priorities, and an organization’s structure strategy. Thus, when an ISP is set up, it describes whether the business or organization’s strategic goals and objectives can be achieved through IT, in which field further IT investment will be needed, and whether efficient investment in IT will be made. In order to discuss these topics, the current organization’s informatization level first must be known. Moreover, since the middle of 1990, many countries have put emphasis on performance based management, in which the government has to set up investment plans according to its performance. For example, to budget IT, it is required to first evaluate its performance and results. In this respect, evaluation of an organization’s informatization level in order to review how much organization informatization it achieves is an important managerial concern. However, this is not a simple problem because informatization includes many intangible factors such as the quality of information and an organization’s culture. In this chapter, framework and metrics are introduced to evaluate the organization’s informatization level. This framework is designed to provide reasonable information by gathering and analyzing various IT metrics for determining whether organizations have made efficient and effective use of IT and have achieved the organizational strategic goals and objectives through IT. Therefore, the evaluation results can be used to improve the organization’s informatization level. The remainder of this paper is organized as follows: in the following section, some case studies and background information are presented. The next section introduces a framework, and then future trends are discussed in the next section. Finally, the summary and conclusion are presented.
APA, Harvard, Vancouver, ISO, and other styles
5

"Island in the Stream: Oceanography and Fisheries of the Charleston Bump." In Island in the Stream: Oceanography and Fisheries of the Charleston Bump, edited by John V. Miglarese and Robert H. Boyles. American Fisheries Society, 2001. http://dx.doi.org/10.47886/9781888569230.ch16.

Full text
Abstract:
Competition between commercial and recreational fishers for fishery resources is common throughout the United States. This competition for resources occurs throughout the south Atlantic region. However, competition around an area known as the Charleston Bump led to controversy and public calls for closure of that area to commercial fishing. In 1997, controversy erupted over the proposed lease of a fish processing facility at the newly completed Charleston Maritime Center. A group of commercial fishermen proposed to open the Maritime Center’s facilities to all types of commercial fishing craft, but with emphasis on longline vessels. The high level of public awareness and knowledge of South Carolina’s offshore fisheries helped to catapult the Charleston Bump to the forefront of state and federal marine fisheries policy, research, and management. Parties to this dispute looked to state fisheries managers for interpretation of technical information upon which to base their decisions. However, fisheries managers soon learned that the data on the significance of the Charleston Bump as a nursery area were inconsistent and spotty. This lack of reliable data left the managers in a policy dilemma: how to make technical recommendations on the management of the fisheries of the Charleston Bump given the lack of data. The fisheries managers responded by acknowledging the lack of data and suggesting that a comprehensive ecological analysis of the Charleston Bump be performed. In addition, the fisheries managers responded by interpreting the data based on the precautionary principle (i.e., do no harm to the resource) and advised the parties to the Maritime Center dispute against any move that might consolidate fishing effort on the Charleston Bump. The purpose of this paper is to: (1) document the approach taken by the State of South Carolina to analyze this public controversy and; (2) describe how public involvement in the development of a local public policy issue can create the need for further scientific inquiry and research. The authors present an overview of this controversy and highlight how public perceptions and demand for action resulted in a policy stance. The authors describe how the public’s direct involvement led not only to the colloquium but also to a renewed scientific interest in the Charleston Bump.
APA, Harvard, Vancouver, ISO, and other styles
6

Partow-Navid, Parviz, and Ludwig Slusky. "IT Security Policy in Public Organizations." In Information Security and Ethics, 2745–54. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-937-3.ch183.

Full text
Abstract:
Today, information security is one of the highest priorities on the IT agenda. In 2003, Luftman and McLean (2004) conducted a survey of Society for Information Management members to identify the top 20 information technology (IT) issues for executives. Security and privacy issues were ranked third, after IT/ business alignment and IT strategic planning. Concept of information security applies to all the data stored in information systems or being communicated in information networks and encompasses measures applied on all layers of open system interconnect (OSI) model of international standards such as application, networking, and physical. Sophisticated technologies and methods have been developed to: • Control access to computer networks • Secure information systems with advanced cryptography and security models • Establish standards for operating systems with focus on confidentiality • Communication integrity and availability for securing different types of networks • Manage trustworthy networks and support business continuity planning, disaster recovery, and auditing The most widely recognized standards are: • In the United States: Trusted Computer System Evaluation Criteria (TCSEC). • In Canada: Canadian Trusted Computer Product Evaluation Criteria (CTCPEC). • In Europe: Information Technology Security Evaluation Criteria (ITSEC). All of theses standards have recently been aggregated into Common Criteria standards. And yet, the information systems continue to be penetrated internally and externally at a high rate by malicious code, attacks leading to loss of processing capability (like distributed denial-of-service attack), impersonation and session hijacking (like man-in-the-middle attack), sniffing, illegal data mining, spying, and others. The problem points to three areas: technology, law, and IT administration. Even prior to the drama of 9/11, several computer laws were enacted in the USA and yet more may come in the future. Still the fundamental threats to information security, whether they originated outside the network or by the company’s insiders, are based on fundamental vulnerabilities inherent to the most common communication protocols, operating systems, hardware, application systems, and operational procedures. Among all technologies, the Internet, which originally was created for communication where trust was not a characteristic, presents the greatest source of vulnerabilities for public information systems infrastructures. Here, a threat is a probable activity, which, if realized, can cause damage to a system or create a loss of confidentiality, integrity, or availability of data. Consequently, vulnerability is a weakness in a system that can be exploited by a threat. Although, some of these attacks may ultimately lead to an organization’s financial disaster, an all-out defense against these threats may not be economically feasible. The defense actions must be focused and measured to correspond to risk assessment analysis provided by the business and IT management. That puts IT management at the helm of the information security strategy in public organizations.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Emergency management – United States – Data processing"

1

Drake, J. Andrew, Mark L. Hereth, Daniel B. Martin, Terry D. Boss, and Jeryl Mohn. "Integrity Management Continuous Improvement." In 2012 9th International Pipeline Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/ipc2012-90406.

Full text
Abstract:
At the end of 2010, recognizing that the baseline period for the integrity management of high consequence areas (HCAs) along natural gas transmission pipelines in the United States was nearly complete, INGAA members decided to reflect on the accomplishments of the first eight years and define where the overall integrity of systems could be improved. High profile incidents such as the one on the PG&E system in California heightened the need for such an analysis. There was a conscious decision to define a future path as the industry had done on many other occasions, and not simply wait for legislation and regulation. A Board level task force was formed to provide guidance and oversight and a technical steering team was constituted under the direction of Andy Drake of Spectra Energy. The technical steering team met for two months and defined a set of guiding principles and nine initiatives and assembled working groups to address each area. This paper will report at a high level on the completion of work and the integration of efforts. The first initiative is directed at improving the transparency by periodically and formally sharing measures of performance, and actively promoting the guidance developed by the Pipelines and Informed Planning Alliance (PIPA). A second initiative is directed at defining a path to extend integrity management principles beyond HCAs. A third initiative has been undertaken to examine how we can improve the tools applied in managing threats to integrity and analysis of data derived from the tools to address uncertainty. The PG&E incident showed us the need to define a process for evaluating records for pre-regulation pipe and managing pre-regulation pipe. While the role of hydrostatic testing is clear, the investment that has been made in making systems piggable has created the opportunity for a fourth initiative to define requirements for historical records and how in-line inspection can play a role in managing pre-regulation pipe. The focus of improving tools and evaluation techniques surfaced a need to intensify our efforts in research, development and commercialization. A fifth initiative has been undertaken to develop a road map for research, development and commercialization. In developing the guiding principles we studied other industries that have worked to define ways of improving safety performance, especially those where the cost of failure is unacceptable in the public eye. These included commercial aviation, medical, chemical and petroleum refining and nuclear. It has become clear that a focus on safety culture and ultimately application of a management system is a means of improving safety performance, and a sixth initiative has been undertaken to address the role of safety culture and more broadly management systems. A seventh initiative has been undertaken to examine ways to improve emergency response effectiveness including the use of automated valves, integrated mitigation plans and enhanced public awareness. There were a series of projects undertaken in 2009 and 2010 as an eighth initiative conducted under the auspices of the INGAA Foundation directed at improving material procurement and construction. Recognizing challenges in storage field operations and the criticality of storage in maintaining gas supply, a ninth initiative has been undertaken to clarify regulatory oversight of storage facilities.
APA, Harvard, Vancouver, ISO, and other styles
2

Lonia, B., N. K. Nayar, S. B. Singh, and P. L. Bali. "Techno Economic Aspects of Power Generation From Agriwaste in India." In 17th International Conference on Fluidized Bed Combustion. ASMEDC, 2003. http://dx.doi.org/10.1115/fbc2003-170.

Full text
Abstract:
The agricultural operations in India are suffering from a serious problem of shortage of electrical power on one side and economic and effective disposal of agriwaste stuff on the other. India being agriculture based country, 70% of its main income (share in GDP) comes from agriculture sector. Any enhancement of income from this sector is based upon adequate supply of basic inputs in this sector. Regular and adequate power supply is one such input. But, the position of power supply in our country defies both these characteristics. With a major portion of power produced being sent to the industrial and urban consumers, there is a perennial shortage of power in the agriculture sector. Consequently, there is an emergent need to produce more power in order to fulfil the needs of this sector effectively. One way of accomplishing this is setting up captive, preferably rural based, small power generation plants. In these power plants, instead of water-head, diesel oil or coal, we can use agri-residue to produce electricity. One such power plant (1–2 MW capacity) can satisfy the power need of 25 to 40 nearby villages. The agriwaste like rice straw, sugarcane-trash, coir-pith, peanut shells, wheat stalks & straw, cottonseed, stalks and husk, soyabean stalks, maize stalks & cobs, sorghum. Bagasse, wallnut shells, sunflower seeds, shells, hulls and kernels and coconut husk, wastewood and saw dust can be fruitfully utilized in power generation. This stuff is otherwise a waste and liability and consumes a lot of effort on its disposal; in addition to being a fire and health hazard. Agriwaste stuff which at present is available in abundance and prospects of its utilization in producing energy are enormous. This material can be procured at reasonably low rates from the farmers who will thus be benefited economically, apart from being relieved of the responsibility of its disposal. Agri-residue has traditionally been a major source of heat energy in rural areas in India. It is a valuable fuel even in the sub-urban areas. Inspite of rapid increase in the supply of, access lo and use of fossil fuels, agri-residue is likely to continue to play an important role, in the foreseeable future. Therefore, developing and promoting techno-economically-viable technologies to utilize agri-residue efficiently should be a persuit of high priority. Though there is no authentic data available with regard to the exact quantity of agricultural and agro-industrial residues, its rough estimate has been put at about 350 mt per annum. It is also estimated that the total cattle refuse generated is nearly 250 mt per year. Further, nearly 20% of the total land is under forest cover, which produces approximately 50 mt of fuel wood and with associated forest waste of about 5 mt.(1). Taking into account the utilization of even a portion (say 30%) of this agri-residue & agro-industrial waste as well as energy plantation on one million hectare (mha) of wastelands for power generation through bioenergy technologies, a potential of some 18000 MW of power has been estimated. From the foregoing, it is clear that there is an enormous untapped potential for energy generation from agri-residue. What is required is an immediate and urgent intensification of dedicated efforts in this field, with a view to bringing down the unit energy cost and improving efficiency and reliability of agri-waste production, conversion and utilisation, leading to subsequent saving of fossil fuels for other pressing applications. 
The new initiatives in national energy policy are most urgently needed to accelerate the social and economic development of the rural areas. It demands a substantial increase in production and consumption of energy for productive purposes. Such initiatives are vital for promoting the goals of sustainability. cleaner production and reduction of long-term risks of environmental pollution and consequent adverse climatic changes in future. A much needed significant social, economic and industrial development has yet to take place in large parts of rural India; be it North, West, East or South. It can be well appreciated that a conscious management of agri-residue, which is otherwise a serious liability of the farmer, through its economic conversion into electric power can offer a reasonably viable solution to our developmental needs. This vision will have to be converted into a reality within a decade or so through dedicated and planned R&D work in this area. There is a shimmering promise that the whole process of harvesting, collection, transport and economic processing and utilisation of agri-waste can be made technically and economically more viable in future. Thus, the foregoing paras amply highlight the value of agri-residue as a prospective source of electric power, particularly for supplementing the main grid during the lean supply periods or peak load hours and also for serving the remote areas in the form of stand-alone units giving a boost to decentralised power supply. This approach and option seems to be positive in view of its potential contribution to our economic and social development. No doubt, this initiative needs to be backed and perused rigorously for removing regional imbalances as well as strengthening National economy. This paper reviews the current situation with regards to generation of agriwaste and its prospects of economic conversion into electrical power, technologies presently available for this purpose, and the problems faced in such efforts. It emphasizes the need for an integrated approach to devise ways and means for generating electrical power from agriwaste; keeping in mind the requirements of cleaner production and environmental protection so that the initiative leads to a total solution.
APA, Harvard, Vancouver, ISO, and other styles
3

Alamdari, Nasim, Nicholas MacKinnon, Fartash Vasefi, Reza Fazel-Rezai, Minhal Alhashim, Alireza Akhbardeh, Daniel L. Farkas, and Kouhyar Tavakolian. "Effect of Lesion Segmentation in Melanoma Diagnosis for a Mobile Health Application." In 2017 Design of Medical Devices Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/dmd2017-3522.

Full text
Abstract:
In 2016, more than 76,380 new melanoma cases were diagnosed and 10,130 people were expected to die from skin cancer in the United States (one death per hour) [1]. A recent study demonstrates that the economic burden of skin cancer treatment is substantial and, in the United States, the cost was increased from $3.6 billion in 2002–2006 to $8.1 billion in 2007–2011 [2]. Monitoring moderate and high-risk patients and identifying melanoma in the earliest stage of disease should save lives and greatly diminish the cost of treatment. In this project, we are focused on detection and monitoring of new potential melanoma sites with medium/high risk patients. We believe those patients have a serious need and they need to be motivated to be engaged in their treatment plan. High-risk patients are more likely to be engaged with their skin health and their health care providers (physicians). Considering the high morbidity and mortality of melanoma, these patients are motivated to spend money on low-cost mobile device technology, either from their own pocket or through their health care provider if it helps reduce their risk with early detection and treatment. We believe that there is a role for mobile device imaging tools in the management of melanoma risk, if they are based on clinically validated technology that supports the existing needs of patients and the health care system. In a study issued in the British Journal of Dermatology [2] of 39 melanoma apps [2], five requested to do risk assessment, while nine mentioned images for expert review. The rest fell into the documentation and education categories. This seems like to be reliable with other dermatology apps available on the market. In a study at University of Pittsburgh [3], Ferris et al. established 4 apps with 188 clinically validated skin lesions images. From images, 60 of them were melanomas. Three of four apps tested misclassified +30% of melanomas as benign. The fourth app was more accurate and it depended on dermatologist interpretation. These results raise questions about proper use of smartphones in diagnosis and treatment of the patients and how dermatologists can effectively involve with these tools. In this study, we used a MATLAB (The MathWorks Inc., Natick, MA) based image processing algorithm that uses an RGB color dermoscopy image as an input and classifies malignant melanoma versus benign lesions based on prior training data using the AdaBoost classifier [5]. We compared the classifier accuracy when lesion boundaries are detected using supervised and unsupervised segmentation. We have found that improving the lesion boundary detection accuracy provides significant improvement on melanoma classification outcome in the patient data.
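The pipeline described extracts features from an RGB dermoscopy image within a detected lesion boundary and feeds them to an AdaBoost classifier. The sketch below shows only the classification stage on placeholder feature vectors; it is not the authors' MATLAB implementation, and the features and labels are invented.

```python
# AdaBoost classification sketch for melanoma vs. benign lesions using
# placeholder lesion features. Feature values and labels are synthetic; the
# actual study computed features inside segmented lesion boundaries in MATLAB.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 300
# Hypothetical features: e.g., mean R/G/B inside the lesion, border
# irregularity, and asymmetry score. Labels: 1 = melanoma, 0 = benign.
X = rng.normal(size=(n, 5))
y = (X[:, 3] + 0.8 * X[:, 4] + rng.normal(scale=0.5, size=n) > 0).astype(int)

clf = AdaBoostClassifier(n_estimators=200, random_state=1)
scores = cross_val_score(clf, X, y, cv=5)
print("5-fold accuracy:", scores.mean().round(3))
```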
APA, Harvard, Vancouver, ISO, and other styles
4

Doll, Kristopher, and Conrad S. Tucker. "Mining End-of-Life Materials Suitable for Material Resynthesis and Discovering New Application Domains." In ASME 2014 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/detc2014-34779.

Full text
Abstract:
The United States generates more than 250 million tons of municipal solid waste (trash/garbage), with only 34% being recycled. In the broader global environment, the problem of waste management is becoming increasingly relevant, demanding innovative solutions. Traditional End-of-Life (EOL) approaches to managing waste include recycle, reuse, remanufacture and disposal. Recently, resynthesis was proposed as an alternative to traditional EOL options that combines multiple products to create a new product distinct from its parent assemblies. Resynthesis employs data mining and natural language processing algorithms to quantify assembly/subassembly combinations suitable for new product combinations. However, existing resynthesis methodologies proposed in the design community have been limited to exploring subassembly combinations, failing to explore potential combinations on a materials level. The authors of this paper propose a material resynthesis methodology that combines the materials of multiple EOL products using conventional manufacturing processes that generate candidate resynthesized materials that satisfy the needs of existing domains/applications. Appropriate applications for a resynthesized material are discovered by comparing the properties of the new material to the functional requirements of application classes which are found using clustering and latent semantic analysis. In the course of this paper, the authors present a case study that demonstrates the feasibility of the proposed material resynthesis methodology in the construction materials domain.
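The methodology compares the properties of a candidate resynthesized material against functional requirements of application classes discovered via clustering and latent semantic analysis of text descriptions. A small sketch of that text-side step (TF-IDF, truncated SVD, and k-means on made-up application descriptions) follows; the corpus and parameters are assumptions, not the paper's data.

```python
# Latent semantic analysis + clustering sketch for grouping application
# descriptions into candidate domains for a resynthesized material.
# The descriptions are invented; the paper's corpus and features differ.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import KMeans
from sklearn.pipeline import make_pipeline

applications = [
    "lightweight insulation panel for residential walls",
    "thermal insulation board with low moisture uptake",
    "load-bearing composite beam for light construction",
    "structural decking material resistant to compression",
    "acoustic dampening tile for office ceilings",
    "sound absorbing panel for studio walls",
]

lsa = make_pipeline(TfidfVectorizer(stop_words="english"),
                    TruncatedSVD(n_components=3, random_state=0))
X = lsa.fit_transform(applications)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for text, label in zip(applications, labels):
    print(label, text)
```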
APA, Harvard, Vancouver, ISO, and other styles