Academic literature on the topic 'United States. Federal Emergency Management Agency. Resource Management and Administration'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'United States. Federal Emergency Management Agency. Resource Management and Administration.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "United States. Federal Emergency Management Agency. Resource Management and Administration"

1

Seelman, Sharon, Stelios Viazis, Sheila Pack Merriweather, Tami Craig Cloyd, Megan Aldridge, and Kari Irvin. "Integrating the Food and Drug Administration Office of the Coordinated Outbreak Response and Evaluation Network’s foodborne illness outbreak surveillance and response activities with principles of the National Incident Management System." Journal of Emergency Management 19, no. 2 (March 1, 2021): 131–41. http://dx.doi.org/10.5055/jem.0567.

Abstract:
The Food Safety Modernization Act mandates building a national Integrated Food Safety System, which represents a seamless partnership among federal, state, local, territorial, and tribal agencies. During multistate foodborne illness outbreak investigations, local and state partners, the Centers for Disease Control and Prevention, the United States Food and Drug Administration (FDA), or the United States Department of Agriculture Food Safety Inspection Service, depending on the regulated food product, become engaged and assist in coordinating the efforts between partners involved and determine the allocation of resources. The FDA Center for Food Safety and Applied Nutrition (CFSAN) Office of the Coordinated Outbreak Response and Evaluation (CORE) Network coordinates foodborne illness outbreak surveillance, response, and post-response activities related to incidents involving multiple illnesses linked to FDA-regulated human food, dietary supplements, and cosmetic products. FDA has implemented the National Incident Management System (NIMS) Incident Command System (ICS) principles across the agency to coordinate federal response efforts, and CORE has adapted NIMS ICS principles for the emergency management of multistate foodborne illness outbreaks. CORE’s implementation of ICS principles has provided several benefits to the operational cycle of foodborne illness outbreak investigations, including establishing a consistent, standardized, and transparent step-by-step approach to outbreak investigations. ICS principles have been instrumental in the development of a national platform for rapid and systematic laboratory, traceback, and epidemiologic information sharing, data analysis, and decision-making. This allows for partners across jurisdictions to reach a consensus regarding outbreak goals and objectives, deploy resources, and take regulatory and public health actions.
2

Mohammad, Sohail. "Neuropsychiatric Manifestations of Wildfire Exposure." Prehospital and Disaster Medicine 34, s1 (May 2019): s152. http://dx.doi.org/10.1017/s1049023x19003418.

Abstract:
Introduction: Wildfires are life-threatening, incessant fires in thickly vegetated areas that spread extremely rapidly to human habitat and are difficult to control by human force. The impact of wildfires on population health is enormous, and they cause a tremendous financial burden to individuals and communities.
Aim: The aim is to understand the potential disease burden secondary to wildfires, at both an individual and a population level, and to reflect upon the immediate and delayed neuropsychiatric manifestations of smoke exposure.
Methods: Data on the direct and indirect costs of wildfires to individual health and health care delivery appear to be scant. This presentation reports federal data from 2012 to 2016 on nationwide wildfires, estimated acreage consumed, the population exposed, and deaths. Information was extracted from the National Interagency Fire Center, the United States Fire Administration, and the Federal Emergency Management Agency. Through a literature review of the neuropsychological sequelae of wildfire smoke inhalation and associated trauma, the goal is to reflect upon the potential healthcare burden secondary to neuropsychiatric manifestations.
Results: Per the National Center for Health Statistics, national fire death rates from 2012 to 2016 ranged from 10 to 11 per million population each year, and property loss, both residential and non-residential, was estimated at 9 to 10 billion dollars each year. Healthcare is expensive in the United States, and with the stated estimates one can only envision the burden on the health care and public health systems.
Discussion: The characteristic neuropathology of carbon monoxide toxicity is bilateral globus pallidus necrosis, and the common neuropsychological symptoms include fatigue, affective conditions, emotional distress, memory deficits, sleep disturbance, vertigo, dementia, and psychosis. The health effects and associated disability demand that policymakers allocate resources for wildfire prevention and containment, primary health care provider education, research, and building effective healthcare delivery systems.
3

Gilbert, Jacqueline A. "An Empirical Examination of Resources in a Diverse Environment." Public Personnel Management 29, no. 2 (June 2000): 175–84. http://dx.doi.org/10.1177/009102600002900202.

Abstract:
The objective of this research is to empirically assess antecedents of resources within a diverse work environment. Specifically, 83 managerial employees were surveyed in a branch of a federal government agency located in a large metropolitan city in the southwestern United States. Multiple regression analysis showed that perceived resource availability was positively associated with outcomes of empowerment and work group integration. Additionally, racial minorities perceived that fewer resources were available to them at work. Implications for human resource managers and changing workforce demography are discussed.
4

Gilbert, Jacqueline A., and Thomas Li-Ping Tang. "An Examination of Organizational Trust Antecedents." Public Personnel Management 27, no. 3 (September 1998): 321–38. http://dx.doi.org/10.1177/009102609802700303.

Abstract:
The objective of this research is to empirically assess antecedents of organizational trust. To accomplish this objective, 83 managerial employees were surveyed in a branch of a federal governmental agency located in a large metropolitan city in the Southwestern United States. Multiple regression analysis showed that age, marital status, and work group cohesion were positively associated with organizational trust. Organizational trust did not differ by either race or gender. Results are discussed in light of competitive challenges facing human resource managers.
5

Rauhaus, Beth, and Andrew Johnson. "Social Inequities Highlighted by the Prolonged Pandemic: Expanding Sick Leave." Journal of Public and Nonprofit Affairs 7, no. 1 (April 1, 2021): 154–63. http://dx.doi.org/10.20899/jpna.7.1.154-163.

Abstract:
Since the beginning of the COVID-19 pandemic, a number of federal responses have been enacted in the United States to address the public health crisis, as well as the economic fallout and inequalities caused by the pandemic. A key feature globally in fighting the pandemic has been paid sick leave, as other nations have been successful in flattening the curve of infections by enacting emergency paid sick leave. This work explores best practices globally of paid sick leave used during the COVID-19 pandemic. Using the theoretical framework of punctuated equilibrium, this work spotlights the increased need to address paid sick leave in the United States. This work contributes further to understanding how policymaking in a federal system of government occurs during times of crisis.
6

Pollak, Cheryl L. ""Hurricane" Sandy." Texas A&M Journal of Property Law 5, no. 2 (December 2018): 157–92. http://dx.doi.org/10.37419/jpl.v5.i2.3.

Abstract:
On the evening of October 29, 2012, “Hurricane” Sandy made landfall on the New York coastline, battering the land with strong winds, torrential rain, and record-breaking storm surges. Homes and commercial structures were destroyed; roads and tunnels were flooded; and more than 23,000 people sought refuge in temporary shelters, with many others facing weeks without power and electricity. At the time, Sandy was heralded as one of the costliest hurricanes in the history of the United States, second only to Katrina, which hit New Orleans in 2005. Unfortunately, recent experience with Hurricanes Florence, Maria, Harvey, and Irma suggests that this pattern of devastating superstorms may become the new norm as climate change produces more extreme and unpredictable weather events. In Sandy’s aftermath, as individuals returned to their homes, or what remained of them, and communities began to rebuild, the true cost of the storm became apparent. A year after the storm, the Federal Emergency Management Agency (“FEMA”) estimated that over $1.4 billion in assistance had been provided to 182,000 survivors of the disaster; another $3.2 billion was provided to state and local governments for debris removal, infrastructure repair, and emergency protective measures. More than $2.4 billion was provided to individuals and businesses in the form of low-interest loans through the Small Business Administration (“SBA”), and millions more were spent on grants designed to implement mitigation measures in the future and to provide unemployment assistance to survivors. Before the storm, homeowners paid premiums for flood insurance provided through the National Flood Insurance Program (“NFIP”), and for homeowner’s insurance provided by dozens of private insurers. In the months following the storm, they began to file claims for assistance in rebuilding their homes. While many such claims were resolved successfully, many homeowners were unhappy with the settlement amounts offered by their insurance carriers and felt compelled to file lawsuits in the surrounding state and federal courts. Many of those lawsuits were filed in the United States District Court for the Eastern District of New York (“EDNY”). This case study describes the EDNY’s specifically crafted, unique approach to handling the mass litigation that ensued from Sandy’s devastation, documents some of the problems that the court faced during that mass litigation, and describes some of the lessons learned from the court’s experience.
7

Narin van Court, Wade A., Michael S. Hildebrand, and Gregory G. Noll. "What Recent HHFT Derailment Fires Tell Us." International Oil Spill Conference Proceedings 2017, no. 1 (May 1, 2017): 2078–95. http://dx.doi.org/10.7901/2169-3358-2017.1.2078.

Abstract:
ABSTRACT ID: 2017-145. In July 2016, TRC Environmental Corporation (TRC) and Hildebrand and Noll Associates, Inc. (HNA) were asked to develop planning guidance on train derailments involving large volumes/high concentrations of denatured ethanol for the Massachusetts Emergency Management Agency (MEMA). As part of this project, as well as similar projects conducted by HNA for other clients, TRC and HNA assessed current firefighting strategies for releases of ethanol and/or crude oil from High Hazard Flammable Trains (HHFT) and developed the planning assumptions necessary to prepare for these types of incidents. For these projects, studies and in-depth analyses were performed of 27 HHFT derailments resulting in tank car breaches, involving denatured ethanol and/or crude oil, that occurred in the United States and Canada from 2006 through 2015. The analyses were based primarily on information from the National Transportation Safety Board (NTSB), Federal Railroad Administration (FRA), and/or Transport Canada (TC) databases, with supplemental information from news reports in some cases. The objective of these analyses was to identify key planning assumptions to be used in developing appropriate firefighting strategies, by focusing on the number and types of cars derailed, approximate train speeds at the time of the derailment, the number of cars breached, the amount of product released, and whether or not the released product caught fire. Additionally, the studies included obtaining and reviewing information on the properties and characteristics of ethanol, crude oils, and other Class 3 flammable materials, as well as information on railroad tank cars. Insights and understandings gained from these studies were used to further develop the firefighting strategies for HHFT derailment fires.
8

Delazari, Luciene Stamato, Leonardo Ercolin Filho, and Ana Luiza Stamato Delazari Skroch. "UFPR CampusMap: a laboratory for a Smart City developments." Abstracts of the ICA 1 (July 15, 2019): 1–2. http://dx.doi.org/10.5194/ica-abs-1-57-2019.

Abstract:
A Smart City is based on intelligent exchanges of information that flow between its many different subsystems. This flow of information is analyzed and translated into citizen and commercial services. The city acts on this information flow to make its wider ecosystem more resource-efficient and sustainable. The information exchange is based on a smart-governance operating framework designed to make cities sustainable.

The public administration needs updated and reliable geospatial data depicting the urban environment. These data can be obtained through smart devices (e.g., smartphones), human agents (collaborative mapping), and remote sensing technologies such as UAVs (Unmanned Aerial Vehicles). According to some authors, a Smart City has four dimensions: the first concerns the application of a wide range of electronic and digital technologies to create a cyber, digital, wired, informational, or knowledge-based city; the second is the use of information technology to transform life and work; the third is to embed ICT (Information and Communication Technology) in the city infrastructure; and the fourth is to bring ICT and people together to enhance innovation, learning, and knowledge. Geospatial information is crucial in all of these dimensions; without it, none of them is possible. Considering these aspects, this research uses the Smart City concept as a methodological approach, with UFPR (Federal University of Parana) as the target of a case study.

UFPR has 26 campuses in different cities of Paraná State, in the south of Brazil. Its structure comprises 14 institutes, 11 million square meters of area, 500,000 square meters of constructed area, and 316 buildings. There are more than 6,300 employees (staff and administration), 50,000 undergraduate students, and 10,000 graduate students. Beyond these figures, external people also need access to UFPR facilities, such as deliveries, service providers, and the community in general.

The lack of knowledge about the space and its characteristics has a direct impact on issues such as resource management (human and material), campus infrastructure (outside and inside the buildings), security, and other activities that can be supported by an updated geospatial database. In 2014, the UFPR CampusMap project was started with indoor mapping as its main goal. However, a base map of the campus was needed to support the indoor mapping, and the available one had been produced in 2000. The Centro Politécnico campus (located in the city of Curitiba) is therefore being used as a case study to develop methodologies for creating a geospatial database that allows different users to know and manage the space.

According to Gruen (2013), a Smart City must have spatial intelligence, and it is necessary to establish a database, in particular a geospatial database. Knowledge of the space where events happen is a key element in this context. The same author states that achieving this objective requires the following items:

- Automatic or semi-automated Digital Surface Model (DSM) generation from satellite, aerial, and terrestrial images and/or LiDAR data;
- Further development of semi-automated techniques to a higher level of automation;
- Integrated automated and semi-automated processing of LiDAR point clouds and images, from both aerial and terrestrial platforms;
- Streamlining the processing pipeline for UAV image data projects;
- Set-up of GIS with 3D/4D capabilities;
- Change detection and database updating;
- Handling of dynamic and semantic aspects of city modeling and simulation, leading to 4D city models;
- LBS (Location Based Services) system investigations (PDAs, mobiles); and
- Establishment of a powerful visualization and interaction platform.

Some of these aspects are addressed in this research. The first is the integration of indoor/outdoor data to help manage the space and provide a tool for navigation between spaces. The base map was updated through stereo mapping compilation from images collected with a DJI Phantom 4 UAV (https://www.dji.com/phantom-4). Using this technology for data acquisition is not only faster but also cheaper than the traditional photogrammetric method. Besides the quality of the images (in this case, a GSD, or Ground Sample Distance, of 2.5 cm), it can be used in urban areas for rapid response in emergency situations.

To georeference the image block, 50 control points were collected by GNSS (Global Navigation Satellite System), and Agisoft Photoscan (http://www.agisoft.com/) was used to perform the bundle block adjustment with self-calibration. After processing, the exterior orientation parameters of the image block and the three-dimensional coordinates of each tie point were calculated simultaneously with the interior orientation parameters: focal length (f), principal point coordinates (x0, y0), radial symmetric distortion (k1, k2, k3), and decentering distortion coefficients (p1, p2).

In the map production step, features were extracted through stereo mapping compilation according to the standards defined by the Brazilian Mapping Agency. The several layers were edited in GIS software (QGIS) and the topology was built. Afterward, a spatial database was created using PostgreSQL/PostGIS. The dense point cloud was also generated using SfM (Structure from Motion) algorithms to produce the digital surface model and orthomosaics.

Meanwhile, a website using HTML5+CSS3® and JavaScript® technologies was developed to publish the results and the first applications (www.campusmap.ufpr.br). The architecture of this application uses JavaScript®, Leaflet, the pgRouting library (to calculate routes between points of interest), files in GeoJSON format, and custom applications. The indoor database comprises data about the interiors of the buildings and provides the user with functionalities such as searching for rooms, laboratories, and buildings; routing between points (inside and outside the buildings); and floor changes. Some web applications were also developed to demonstrate the capabilities of geospatial information in an environment very similar to a city and its problems, e.g., parking management, security, logistics, and resource inventory. A mobile application was developed to provide indoor user positioning through Wi-Fi (Wireless Fidelity) networks; combined with the indoor mapping, this will allow users to navigate in real time inside the buildings. Using data from the point cloud and the CityGML standard, a 3D model of some buildings was developed. An application to report crime occurrences (such as robberies and assaults) was also developed, so that these occurrences can be mapped and the administration can increase the security of the campus.

The next steps of the project are to:

a. Design an interface with functionalities to integrate all applications that are currently presented on individual web pages;
b. Develop a visualization tool for 3D models using CityGML;
c. Evaluate the potential of UAV images for different applications in urban scenarios;
d. Develop an interface for collaborative database updates;
e. Expand the database to other UFPR campuses and develop new functionalities for different users.

The "smart city" concept allows the development of an optimized system that uses geospatial data to understand the complexity of urban environments. Geospatial data can improve efficiency and security in managing urban aspects such as infrastructure, buildings and public spaces, the natural environment, urban services, health, and education. The concept can also support city management agents during the design, realization, and evaluation of urban projects.

In the present project, we believe these are the first steps toward building a connected environment and applying the "smart city" concept to university administration, making sustainable use of resources; it could serve as an example for some existing problems in public administration.
9

Strosnider, Heather, Patrick Wall, Holly Wilson, Joseph Ralph, and Fuyuen Yip. "Tracking environmental hazards and health outcomes to inform decision-making in the United States." Online Journal of Public Health Informatics 11, no. 1 (May 30, 2019). http://dx.doi.org/10.5210/ojphi.v11i1.9772.

Abstract:
Objective: To increase the availability and accessibility of standardized environmental health data for public health surveillance and decision-making.
Introduction: In 2002, the United States (US) Centers for Disease Control and Prevention (CDC) launched the National Environmental Public Health Tracking Program (Tracking Program) to address the challenges in environmental health surveillance described by the Pew Environmental Commission (1). The report cited gaps in our understanding of how the environment affects our health and attributed these gaps to a dearth of surveillance data for environmental hazards, human exposures, and health effects. The Tracking Program’s mission is to provide information from a nationwide network of integrated health and environmental data that drives actions to improve the health of communities. Accomplishing this mission requires a range of expertise, from environmental health scientists to programmers to communicators, employing the best practices and latest technical advances of their disciplines. Critical to this mission, the Tracking Program must identify and prioritize what data are needed, address any gaps found, and integrate the data into the network for ongoing surveillance.
Methods: The Tracking Program identifies important environmental health topics with data challenges based on the recommendations in the Pew Commission report as well as input from federal, state, territorial, tribal, and local partners. For each topic, the first step is to formulate the key surveillance question, which includes identifying the decision-maker or end user. Next, available data are evaluated to determine whether they can answer the question and, if not, what enhancements or new data are needed. Standards are developed to establish data requirements and to ensure consistency and comparability. Standardized data are then integrated into the network at national, state, and local levels. Standardized measures are calculated to translate the data into the information needed. These measures are then publicly disseminated via national, state, and local web-based portals. Data are updated annually or as they become available, and new data are added regularly. All data undergo a multi-step validation process that is semi-automated, routinized, and reproducible.
Results: The first set of nationally consistent data and measures (NCDM) was released in 2008 and covered 8 environmental health topics. Since then, the NCDM have grown to cover 14 topics. Additional standardized data and measures have been integrated into the national network, resulting in 23 topics with 450 standardized measures. On the national network, measures can be queried via the Data Explorer, viewed in the info-by-location application, or accessed via the network’s Application Program Interface (API). On average, 15,000 and 3,300 queries are run every month on the Data Explorer and the API, respectively. Additional locally relevant data are available on state and local tracking networks. Gaps in data have been addressed through standards for new data collections, models to extend available data, new methodologies for using existing data, and expansion of the utility of non-traditional public health data. For example, the program has collaborated with the Environmental Protection Agency to develop daily estimates of fine particulate matter and ozone for every county in the conterminous US and to develop the first national database of standardized radon testing data. The program has also collaborated with the National Aeronautics and Space Administration and its academic partners to transform satellite data into data products for public health. The Tracking Program has analyzed the data to address important gaps in our understanding of the relationship between negative health outcomes and environmental hazards. Data have been used in epidemiologic studies to better quantify the associations of fine particulate matter, ozone, wildfire smoke, and extreme heat with emergency department visits and hospitalizations. Results are translated into measures of health burden for public dissemination and can be used to inform regulatory standards and public health interventions.
Conclusions: The scope of the Tracking Program’s mission and the volume of data within the network require the program to merge traditional public health expertise and practices with current technical and scientific advances. Data integrated into the network can be used to (1) describe temporal and spatial trends in health outcomes and potential environmental exposures, (2) identify the populations most affected, (3) generate hypotheses about associations between health and environmental exposures, and (4) develop, guide, and assess environmental public health policies and interventions aimed at reducing or eliminating health outcomes associated with environmental factors. The program continues to expand the data within the network and the applications deployed for others to access the data. Current data challenges include the need for more temporally and spatially resolved data to better understand the complex relationships between environmental hazards, health outcomes, and risk factors at a local level. National standards are in development for systematically generating, analyzing, and disseminating small-area data and real-time data that will allow comparisons between different datasets over geography and time.
References: 1. Pew Environmental Health Tracking Project Team. America’s Environmental Health Gap: Why the Country Needs a Nationwide Health Tracking Network. Johns Hopkins School of Hygiene and Public Health, Department of Health Policy and Management; 2000.
10

Paull, John. "Beyond Equal: From Same But Different to the Doctrine of Substantial Equivalence." M/C Journal 11, no. 2 (June 1, 2008). http://dx.doi.org/10.5204/mcj.36.

Abstract:
A same-but-different dichotomy has recently been encapsulated within the US Food and Drug Administration’s ill-defined concept of “substantial equivalence” (USFDA, FDA). By invoking this concept the genetically modified organism (GMO) industry has escaped the rigors of safety testing that might otherwise apply. The curious concept of “substantial equivalence” grants a presumption of safety to GMO food. This presumption has yet to be earned, and has been used to constrain labelling of both GMO and non-GMO food. It is an idea that well serves corporatism. It enables the claim of difference to secure patent protection, while upholding the contrary claim of sameness to avoid labelling and safety scrutiny. It offers the best of both worlds for corporate food entrepreneurs, and delivers the worst of both worlds to consumers. The term “substantial equivalence” has established its currency within the GMO discourse. As the opportunities for patenting food technologies expand, the GMO recruitment of this concept will likely be a dress rehearsal for the developing debates on the labelling and testing of other techno-foods – including nano-foods and clone-foods. “Substantial Equivalence” “Are the Seven Commandments the same as they used to be, Benjamin?” asks Clover in George Orwell’s “Animal Farm”. By way of response, Benjamin “read out to her what was written on the wall. There was nothing there now except a single Commandment. It ran: ALL ANIMALS ARE EQUAL BUT SOME ANIMALS ARE MORE EQUAL THAN OTHERS”. After this reductionist revelation, further novel and curious events at Manor Farm, “did not seem strange” (Orwell, ch. X). Equality is a concept at the very core of mathematics, but beyond the domain of logic, equality becomes a hotly contested notion – and the domain of food is no exception. 
A novel food has a regulatory advantage if it can claim to be the same as an established food – a food that has proven its worth over centuries, perhaps even millennia – and thus does not trigger new, perhaps costly and onerous, testing, compliance, and even new and burdensome regulations. On the other hand, such a novel food has an intellectual property (IP) advantage only in terms of its difference. And thus there is an entrenched dissonance for newly technologised foods, between claiming sameness, and claiming difference. The same/different dilemma is erased, so some would have it, by appeal to the curious new dualist doctrine of “substantial equivalence” whereby sameness and difference are claimed simultaneously, thereby creating a win/win for corporatism, and a loss/loss for consumerism. This ground has been pioneered, and to some extent conquered, by the GMO industry. The conquest has ramifications for other cryptic food technologies, that is technologies that are invisible to the consumer and that are not evident to the consumer other than via labelling. Cryptic technologies pertaining to food include GMOs, pesticides, hormone treatments, irradiation and, most recently, manufactured nano-particles introduced into the food production and delivery stream. Genetic modification of plants was reported as early as 1984 by Horsch et al. The case of Diamond v. Chakrabarty resulted in a US Supreme Court decision that upheld the prior decision of the US Court of Customs and Patent Appeal that “the fact that micro-organisms are alive is without legal significance for purposes of the patent law”, and ruled that the “respondent’s micro-organism plainly qualifies as patentable subject matter”. This was a majority decision of nine judges, with four judges dissenting (Burger). 
It was this Chakrabarty judgement that seriously opened the Pandora’s box of GMOs, because patent rights make GMOs an attractive corporate proposition by offering potentially unique monopoly rights over food. The rearguard action against GMOs has most often focussed on health repercussions (Smith, Genetic), food security issues, and also the potential for corporate malfeasance to hide behind a cloak of secrecy citing commercial confidentiality (Smith, Seeds). Others have tilted at the foundational plank on which the economics of the GMO industry sits: “I suggest that the main concern is that we do not want a single molecule of anything we eat to contribute to, or be patented and owned by, a reckless, ruthless chemical organisation” (Grist 22). The GMO industry exhibits bipolar behaviour, invoking the concept of “substantial difference” to claim patent rights by way of “novelty”, and then claiming “substantial equivalence” when dealing with other regulatory authorities including food, drug and pesticide agencies; a case of “having their cake and eating it too” (Engdahl 8). This is a clever sleight-of-rhetoric, laying claim to the best of both worlds for corporations, and the worst of both worlds for consumers. Corporations achieve patent protection with no concomitant specific regulatory oversight, while consumers pay the cost of patent monopolisation and are not necessarily apprised, by way of labelling or otherwise, that they are purchasing and eating GMOs, and thereby financing the GMO industry. The lemma of “substantial equivalence” does not bear close scrutiny. It is a fuzzy concept that lacks a tight testable definition. It is exactly this fuzziness that allows lots of wriggle room to keep GMOs out of rigorous testing regimes. Millstone et al. argue that “substantial equivalence is a pseudo-scientific concept because it is a commercial and political judgement masquerading as if it is scientific.
It is, moreover, inherently anti-scientific because it was created primarily to provide an excuse for not requiring biochemical or toxicological tests. It therefore serves to discourage and inhibit informative scientific research” (526). “Substantial equivalence” grants GMOs the benefit of the doubt regarding safety, and thereby leaves unexamined the ramifications for human consumer health, for farm labourer and food-processor health, for the welfare of farm animals fed a diet of GMO grain, and for the well-being of the ecosystem, both in general and in its particularities. “Substantial equivalence” was introduced into the food discourse by an Organisation for Economic Co-operation and Development (OECD) report: “Safety Evaluation of Foods Derived by Modern Biotechnology: Concepts and Principles”. It is from this document that the ongoing mantra of the assumed safety of GMOs derives: “modern biotechnology … does not inherently lead to foods that are less safe … . Therefore evaluation of foods and food components obtained from organisms developed by the application of the newer techniques does not necessitate a fundamental change in established principles, nor does it require a different standard of safety” (OECD, “Safety” 10). This was at the time, and remains, an act of faith, a pro-corporatist and post-cautionary approach. The OECD motto reveals where their priorities lean: “for a better world economy” (OECD, “Better”). The term “substantial equivalence” was preceded by the 1992 USFDA concept of “substantial similarity” (Levidow, Murphy and Carr) and was adopted from a prior usage by the US Food and Drug Administration (USFDA), where it was used pertaining to medical devices (Miller). Even GMO proponents accept that “Substantial equivalence is not intended to be a scientific formulation; it is a conceptual tool for food producers and government regulators” (Miller 1043).
And there’s the rub – there is no scientific definition of “substantial equivalence”, no scientific test of proof of concept, and nor is there likely to be, since this is a ‘spinmeister’ term. And yet this is the cornerstone on which rests the presumption of safety of GMOs. Absence of evidence is taken to be evidence of absence. History suggests that this is a fraught presumption. By way of contrast, the patenting of GMOs depends on the antithesis of assumed ‘sameness’. Patenting rests on proven, scrutinised, challengeable and robust tests of difference and novelty. Lightfoot et al. report that transgenic plants exhibit “unexpected changes [that] challenge the usual assumptions of GMO equivalence and suggest genomic, proteomic and metanomic characterization of transgenics is advisable” (1).

GMO Milk and Contested Labelling

Pesticide company Monsanto markets the genetically engineered hormone rBST (recombinant Bovine Somatotropin; also known as: rbST; rBGH, recombinant Bovine Growth Hormone; and the brand name Posilac) to dairy farmers who inject it into their cows to increase milk production. This product is not approved for use in many jurisdictions, including Europe, Australia, New Zealand, Canada and Japan. Even Monsanto accepts that rBST leads to mastitis (inflammation and pus in the udder) and other “cow health problems”; however, it maintains that “these problems did not occur at rates that would prohibit the use of Posilac” (Monsanto). A European Union study identified an extensive list of health concerns of rBST use (European Commission). The US Dairy Export Council, however, entertains no doubt. In its background document it asks “is milk from cows treated with rBST safe?” and answers “Absolutely” (USDEC). Meanwhile, Monsanto’s website raises and answers the question: “Is the milk from cows treated with rbST any different from milk from untreated cows? No” (Monsanto).
Injecting cows with genetically modified hormones to boost their milk production remains a contested practice, banned in many countries. It is the claimed equivalence that has kept consumers of US dairy products in the dark, shielded rBST dairy farmers from having to declare that their milk production is GMO-enhanced, and has inhibited non-GMO producers from declaring their milk as non-GMO, non-rBST, or not hormone enhanced. This is a battle that has simmered, and sometimes raged, for a decade in the US. Finally, there is a modest victory for consumers: the Pennsylvania Department of Agriculture (PDA) requires all labels used on milk products to be approved in advance by the department. The standard issued in October 2007 (PDA, “Standards”) signalled to producers that any milk labels claiming rBST-free status would be rejected. This advice was rescinded in January 2008 with new, specific, department-approved textual constructions allowed, ensuring that any “no rBST” style claim was paired with a PDA-prescribed disclaimer (PDA, “Revised Standards”). However, parsimonious labelling is prohibited: No labeling may contain references such as ‘No Hormones’, ‘Hormone Free’, ‘Free of Hormones’, ‘No BST’, ‘Free of BST’, ‘BST Free’, ‘No added BST’, or any statement which indicates, implies or could be construed to mean that no natural bovine somatotropin (BST) or synthetic bovine somatotropin (rBST) are contained in or added to the product. (PDA, “Revised Standards” 3) Difference claims are prohibited: In no instance shall any label state or imply that milk from cows not treated with recombinant bovine somatotropin (rBST, rbST, RBST or rbst) differs in composition from milk or products made with milk from treated cows, or that rBST is not contained in or added to the product.
If a product is represented as, or intended to be represented to consumers as, containing or produced from milk from cows not treated with rBST any labeling information must convey only a difference in farming practices or dairy herd management methods. (PDA, “Revised Standards” 3) The PDA-approved labelling text for non-GMO dairy farmers is specified as follows: ‘From cows not treated with rBST. No significant difference has been shown between milk derived from rBST-treated and non-rBST-treated cows’ or a substantial equivalent. Hereinafter, the first sentence shall be referred to as the ‘Claim’, and the second sentence shall be referred to as the ‘Disclaimer’. (PDA, “Revised Standards” 4) It is onto the non-GMO dairy farmer alone that the costs of compliance fall. These costs include label preparation and approval, proving non-usage of GMOs, and creating and maintaining an audit trail. In nearby Ohio a similar consumer-versus-corporatist pantomime is playing out, this time with the Ohio Department of Agriculture (ODA) calling the shots, and again serving the GMO industry. The ODA-prescribed text allowed to non-GMO dairy farmers is “from cows not supplemented with rbST”, and this is to be conjoined with the mandatory disclaimer “no significant difference has been shown between milk derived from rbST-supplemented and non-rbST supplemented cows” (Curet). These are “emergency rules”: they apply for 90 days, and are proposed as permanent. Once again, the onus is on the non-GMO dairy farmers to document and prove their claims. GMO dairy farmers face no such governmental requirements, including no disclosure requirement, and thus an asymmetric regulatory impost is placed on the non-GMO farmer, which opens up new opportunities for administrative demands and technocratic harassment. Levidow et al.
argue, somewhat Eurocentrically, that from its 1990s adoption “as the basis for a harmonized science-based approach to risk assessment” (26) the concept of “substantial equivalence” has “been recast in at least three ways” (58). It is true that the GMO debate has evolved differently in the US and Europe, with other jurisdictions usually adopting intermediate positions, yet the concept persists. Levidow et al. nominate their three recastings as: firstly, an “implicit redefinition” by the appending of “extra phrases in official documents”; secondly, “it has been reinterpreted, as risk assessment processes have … required more evidence of safety than before, especially in Europe”; and thirdly, “it has been demoted in the European Union regulatory procedures so that it can no longer be used to justify the claim that a risk assessment is unnecessary” (58). Romeis et al. have proposed a decision tree approach to GMO risks based on cascading tiers of risk assessment. What remains, however, is that the defects of the concept of “substantial equivalence” persist. Schauzu identified that: such decisions are a matter of “opinion”; that there is “no clear definition of the term ‘substantial’”; that because genetic modification “is aimed at introducing new traits into organisms, the result will always be a different combination of genes and proteins”; and that “there is no general checklist that could be followed by those who are responsible for allowing a product to be placed on the market” (2).

Benchmark for Further Food Novelties?

The discourse, contestation, and debate about “substantial equivalence” have largely focussed on the introduction of GMOs into food production processes. GM can best be regarded as the test case, and proof of concept, for establishing “substantial equivalence” as a benchmark for evaluating new and forthcoming food technologies.
This is of concern, because the concept of “substantial equivalence” is scientific hokum, and yet its persistence, even entrenchment, within regulatory agencies may be a harbinger of forthcoming same-but-different debates for nanotechnology and other future bioengineering. The appeal of “substantial equivalence” has been a brake on the creation of GMO-specific regulations and on rigorous GMO testing. The food nanotechnology industry can be expected to look to the precedent of the GMO debate to head off specific nano-regulations and nano-testing. As cloning becomes economically viable, then this may be another wave of food innovation that muddies the regulatory waters with the confused – and ultimately self-contradictory – concept of “substantial equivalence”. Nanotechnology engineers particles in the size range 1 to 100 nanometres – a nanometre is one billionth of a metre. This is interesting for manufacturers because at this size chemicals behave differently, or as the Australian Office of Nanotechnology expresses it, “new functionalities are obtained” (AON). Globally, government expenditure on nanotechnology research reached US$4.6 billion in 2006 (Roco 3.12). While there are now many patents (ETC Group; Roco), regulation specific to nanoparticles is lacking (Bowman and Hodge; Miller and Senjen). The USFDA advises that nano-manufacturers “must show a reasonable assurance of safety … or substantial equivalence” (FDA). A recent inventory of nano-products already on the market identified 580 products. Of these 11.4% were categorised as “Food and Beverage” (WWICS). This is at a time when public confidence in regulatory bodies is declining (HRA). In an Australian consumer survey on nanotechnology, 65% of respondents indicated they were concerned about “unknown and long term side effects”, and 71% agreed that it is important “to know if products are made with nanotechnology” (MARS 22). 
Cloned animals are currently more expensive to produce than traditional animal progeny. In the course of 678 pages, the USFDA Animal Cloning: A Draft Risk Assessment has not a single mention of “substantial equivalence”. However, the Federation of Animal Science Societies (FASS) in its single page “Statement in Support of USFDA’s Risk Assessment Conclusion That Food from Cloned Animals Is Safe for Human Consumption” states that “FASS endorses the use of this comparative evaluation process as the foundation of establishing substantial equivalence of any food being evaluated. It must be emphasized that it is the food product itself that should be the focus of the evaluation rather than the technology used to generate cloned animals” (FASS 1). Contrary to the FASS derogation of the importance of process in food production, for consumers both the process and provenance of production are an important and integral aspect of a food product’s value and identity. Some consumers will legitimately insist that their Kalamata olives are from Greece, or their balsamic vinegar is from Modena. It was the British public’s growing awareness that their sugar was being produced by slave labour that enabled the boycotting of the product, and ultimately the outlawing of slavery (Hochschild). When consumers boycott Nestlé because of past or present marketing practices, or boycott produce of the USA because of, for example, US foreign policy or animal welfare concerns, they are distinguishing the food based on the narrative of the food, the production process and/or production context, which are a part of the identity of the food. Consumers attribute value to food based on production process and provenance information (Paull). Products produced by slave labour, by child labour, by political prisoners, by means of torture, theft, or immoral, unethical or unsustainable practices are different from their alternatives.
The process of production is a part of the identity of a product, and consumers are increasingly interested in food narrative. It requires vigilance to ensure that these narratives are delivered with the product to the consumer, and are neither lost nor suppressed. Throughout the GM debate, the organic sector has successfully skirted the “substantial equivalence” debate by excluding GMOs from the certified organic food production process. This GMO-exclusion from the organic food stream is the one reprieve available to consumers worldwide who are keen to avoid GMOs in their diet. The organic industry carries the expectation of providing food produced without artificial pesticides and fertilizers, and by extension, without GMOs. Most recently, the Soil Association, the leading organic certifier in the UK, claims to be the first organisation in the world to exclude manufactured nanoparticles from its products (Soil Association). There have been calls for engineered nanoparticles to be excluded from organic standards worldwide, given that there is no mandatory safety testing and no compulsory labelling in place (Paull and Lyons). The twisted rhetoric of oxymorons does not make the ideal foundation for policy. Setting food policy on the shifting sands of “substantial equivalence” seems foolhardy when we consider the potentially profound ramifications of globally mass marketing a dysfunctional food. If there is a 2×2 matrix of terms – “substantial equivalence”, substantial difference, insubstantial equivalence, insubstantial difference – while only one corner of this matrix is engaged for food policy, and while the elements remain matters of opinion rather than being testable by science, or by some other regime, then the public is the dupe, and potentially the victim.
“Substantial equivalence” has served the GMO corporates well and the public poorly, and this asymmetry is slated to escalate if nano-food and clone-food are also folded into the “substantial equivalence” paradigm. Only in Orwellian Newspeak is war peace, or is same different. It is time to jettison the pseudo-scientific doctrine of “substantial equivalence”, as a convenient oxymoron, and embrace full disclosure of provenance, process and difference, so that consumers are not collateral in a continuing asymmetric knowledge war.

References

Australian Office of Nanotechnology (AON). Department of Industry, Tourism and Resources (DITR), 6 Aug. 2007. 24 Apr. 2008 <http://www.innovation.gov.au/Section/Innovation/Pages/AustralianOfficeofNanotechnology.aspx>.
Bowman, Diana, and Graeme Hodge. “A Small Matter of Regulation: An International Review of Nanotechnology Regulation.” Columbia Science and Technology Law Review 8 (2007): 1-32.
Burger, Warren. “Sidney A. Diamond, Commissioner of Patents and Trademarks v. Ananda M. Chakrabarty, et al.” Supreme Court of the United States, decided 16 June 1980. 24 Apr. 2008 <http://caselaw.lp.findlaw.com/cgi-bin/getcase.pl?court=US&vol=447&invol=303>.
Curet, Monique. “New Rules Allow Dairy-Product Labels to Include Hormone Info.” The Columbus Dispatch 7 Feb. 2008. 24 Apr. 2008 <http://www.dispatch.com/live/content/business/stories/2008/02/07/dairy.html>.
Engdahl, F. William. Seeds of Destruction. Montréal: Global Research, 2007.
ETC Group. Down on the Farm: The Impact of Nano-Scale Technologies on Food and Agriculture. Ottawa: Action Group on Erosion, Technology and Conservation, November 2004.
European Commission. Report on Public Health Aspects of the Use of Bovine Somatotropin. Brussels: European Commission, 15-16 March 1999.
Federation of Animal Science Societies (FASS). Statement in Support of FDA’s Risk Assessment Conclusion That Cloned Animals Are Safe for Human Consumption. 2007. 24 Apr. 2008 <http://www.fass.org/page.asp?pageID=191>.
Grist, Stuart. “True Threats to Reason.” New Scientist 197.2643 (16 Feb. 2008): 22-23.
Hochschild, Adam. Bury the Chains: The British Struggle to Abolish Slavery. London: Pan Books, 2006.
Horsch, Robert, Robert Fraley, Stephen Rogers, Patricia Sanders, Alan Lloyd, and Nancy Hoffman. “Inheritance of Functional Foreign Genes in Plants.” Science 223 (1984): 496-498.
HRA. Awareness of and Attitudes toward Nanotechnology and Federal Regulatory Agencies: A Report of Findings. Washington: Peter D. Hart Research Associates, 25 Sep. 2007.
Levidow, Les, Joseph Murphy, and Susan Carr. “Recasting ‘Substantial Equivalence’: Transatlantic Governance of GM Food.” Science, Technology, and Human Values 32.1 (Jan. 2007): 26-64.
Lightfoot, David, Rajsree Mungur, Rafiqa Ameziane, Anthony Glass, and Karen Berhard. “Transgenic Manipulation of C and N Metabolism: Stretching the GMO Equivalence.” American Society of Plant Biologists Conference: Plant Biology, 2000.
MARS. “Final Report: Australian Community Attitudes Held about Nanotechnology – Trends 2005-2007.” Report prepared for Department of Industry, Tourism and Resources (DITR). Miranda, NSW: Market Attitude Research Services, 12 June 2007.
Miller, Georgia, and Rye Senjen. “Out of the Laboratory and on to Our Plates: Nanotechnology in Food and Agriculture.” Friends of the Earth, 2008. 24 Apr. 2008 <http://nano.foe.org.au/node/220>.
Miller, Henry. “Substantial Equivalence: Its Uses and Abuses.” Nature Biotechnology 17 (7 Nov. 1999): 1042-1043.
Millstone, Erik, Eric Brunner, and Sue Mayer. “Beyond ‘Substantial Equivalence’.” Nature 401 (7 Oct. 1999): 525-526.
Monsanto. “Posilac, Bovine Somatotropin by Monsanto: Questions and Answers about bST from the United States Food and Drug Administration.” 2007. 24 Apr. 2008 <http://www.monsantodairy.com/faqs/fda_safety.html>.
Organisation for Economic Co-operation and Development (OECD). “For a Better World Economy.” Paris: OECD, 2008. 24 Apr. 2008 <http://www.oecd.org/>.
———. “Safety Evaluation of Foods Derived by Modern Biotechnology: Concepts and Principles.” Paris: OECD, 1993.
Orwell, George. Animal Farm. Adelaide: ebooks@Adelaide, 2004 (1945). 30 Apr. 2008 <http://ebooks.adelaide.edu.au/o/orwell/george>.
Paull, John. “Provenance, Purity and Price Premiums: Consumer Valuations of Organic and Place-of-Origin Food Labelling.” Research Masters thesis, University of Tasmania, Hobart, 2006. 24 Apr. 2008 <http://eprints.utas.edu.au/690/>.
Paull, John, and Kristen Lyons. “Nanotechnology: The Next Challenge for Organics.” Journal of Organic Systems (in press).
Pennsylvania Department of Agriculture (PDA). “Revised Standards and Procedure for Approval of Proposed Labeling of Fluid Milk.” Milk Labeling Standards (2.0.1.17.08). Bureau of Food Safety and Laboratory Services, Pennsylvania Department of Agriculture, 17 Jan. 2008.
———. “Standards and Procedure for Approval of Proposed Labeling of Fluid Milk, Milk Products and Manufactured Dairy Products.” Milk Labeling Standards (2.0.1.17.08). Bureau of Food Safety and Laboratory Services, Pennsylvania Department of Agriculture, 22 Oct. 2007.
Roco, Mihail. “National Nanotechnology Initiative – Past, Present, Future.” In William Goddard, Donald Brenner, Sergy Lyshevski and Gerald Iafrate, eds. Handbook of Nanoscience, Engineering and Technology. 2nd ed. Boca Raton, FL: CRC Press, 2007.
Romeis, Jorg, Detlef Bartsch, Franz Bigler, Marco Candolfi, Marco Gielkins, et al. “Assessment of Risk of Insect-Resistant Transgenic Crops to Nontarget Arthropods.” Nature Biotechnology 26.2 (Feb. 2008): 203-208.
Schauzu, Marianna. “The Concept of Substantial Equivalence in Safety Assessment of Food Derived from Genetically Modified Organisms.” AgBiotechNet 2 (Apr. 2000): 1-4.
Soil Association. “Soil Association First Organisation in the World to Ban Nanoparticles – Potentially Toxic Beauty Products That Get Right under Your Skin.” London: Soil Association, 17 Jan. 2008. 24 Apr. 2008 <http://www.soilassociation.org/web/sa/saweb.nsf/848d689047cb466780256a6b00298980/42308d944a3088a6802573d100351790!OpenDocument>.
Smith, Jeffrey. Genetic Roulette: The Documented Health Risks of Genetically Engineered Foods. Fairfield, Iowa: Yes! Books, 2007.
———. Seeds of Deception. Melbourne: Scribe, 2004.
U.S. Dairy Export Council (USDEC). Bovine Somatotropin (BST) Backgrounder. Arlington, VA: U.S. Dairy Export Council, 2006.
U.S. Food and Drug Administration (USFDA). Animal Cloning: A Draft Risk Assessment. Rockville, MD: Center for Veterinary Medicine, U.S. Food and Drug Administration, 28 Dec. 2006.
———. FDA and Nanotechnology Products. U.S. Department of Health and Human Services, U.S. Food and Drug Administration, 2008. 24 Apr. 2008 <http://www.fda.gov/nanotechnology/faqs.html>.
Woodrow Wilson International Center for Scholars (WWICS). “A Nanotechnology Consumer Products Inventory.” Data set as at Sep. 2007. Woodrow Wilson International Center for Scholars, Project on Emerging Technologies, Sep. 2007. 24 Apr. 2008 <http://www.nanotechproject.org/inventories/consumer>.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "United States. Federal Emergency Management Agency. Resource Management and Administration"

1

United States. General Accounting Office. RCED. Results Act: Observations on the Federal Emergency Management Agency's draft strategic plan. Washington, D.C: The Office, 1997.

2

RCED, United States General Accounting Office. Results Act: Observations on the Federal Emergency Management Agency's draft strategic plan. Washington, D.C: The Office, 1997.

3

Ensuring strong FEMA regional offices: An examination of resources and responsibilities: hearing before the Subcommittee on Emergency Communications, Preparedness, and Response of the Committee on Homeland Security, House of Representatives, One Hundred Eleventh Congress, second session, March 16, 2010. Washington: U.S. G.P.O., 2011.

4

Calbom, Linda M. Financial management: Financial audit results at GSA, EPA, and DOT: statement for the record by Linda M. Calbom, Director, Resources, Community, and Economic Development Accounting and Financial Management Issues, Accounting and Information Management Division, before the Subcommittee on Oversight and Investigations and Emergency Management, Committee on Transportation and Infrastructure, House of Representatives. Washington, D.C: The Office, 1999.

5

United States. General Accounting Office. RCED. Results Act: Observations on USDA's draft strategic plan. Washington, D.C: The Office, 1997.

6

United States. General Accounting Office. RCED. Results Act: Observations on the Small Business Administration's draft strategic plan. Washington, D.C: The Office, 1997.

7

United States. Federal Emergency Management Agency. Permanent relocation projects under Superfund. [Washington, D.C.?]: Federal Emergency Management Agency, 1985.

8

United States. General Accounting Office. RCED. Results Act: Observations on the Nuclear Regulatory Commission's draft strategic plan. Washington, D.C: The Office, 1997.

9

United States. General Accounting Office. RCED. Results Act: Observations on EPA's draft strategic plan. Washington, D.C: The Office, 1997.

10

United States. General Accounting Office. RCED. Results Act: Observations on the Department of Transportation's annual performance plan for fiscal year 1999. Washington, D.C. (P.O. Box 37050, Washington, D.C. 20013): The Office, 1998.


Book chapters on the topic "United States. Federal Emergency Management Agency. Resource Management and Administration"

1

Changnon, Stanley A. "Impacts of El Niño’s Weather." In El Niño, 1997-1998. Oxford University Press, 2000. http://dx.doi.org/10.1093/oso/9780195135510.003.0012.

Abstract:
The societal, economic, and environmental consequences of weather events and climate conditions in the United States vary across the nation as a result of hot and dry conditions in one region and cold and wet conditions in others, or storms in one area and none in others. Thus, for any given period, such as a season or year, the weather-caused impacts in the United States reveal a mix of winners and losers. This was certainly true of the impacts resulting from El Niño 97-98. The official National Oceanic and Atmospheric Administration (NOAA) predictions issued in June 1997, calling for more storms in parts of the nation and heavy precipitation for the South and Far West (Climate Prediction Center, August 13, 1997), created major fears about large economic and social losses. The warnings of the Federal Emergency Management Agency (FEMA) and the ensuing media hype created a nationwide perception that all “El Niño weather” was going to be damaging. This fear is illustrated in the cartoon in Figure 6-1. For example, the Financial Times (July 28, 1997) tied the strong El Niño 97-98 conditions to the huge U.S. losses due to El Niño 1982-1983, with 161 killed and losses of $2.2 billion. Such connections and citations resulted from the fact that the official El Niño predictions and FEMA warnings were comparing the large El Niño 97-98 to the large 1982-1983 event (CPC, July 1997; FEMA, August 12, 1997). California newspapers also focused on the 1982-1983 losses in that state, which included fourteen killed and $265 million in damages (San Francisco Chronicle, August 14, 1997; Sacramento Bee, October 15, 1997). This helped create considerable concern and launched major mitigation endeavors in California, where storm and rain predictions were ominous. The resulting 1997-1998 mitigative activities in California reduced losses and were a major beneficial impact of the use of the long-range predictions of the Climate Prediction Center (CPC) and the warnings issued by FEMA that promoted mitigation actions.