WorldWideScience

Sample records for human hazard analysis

  1. Integrating human factors into process hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kariuki, S.G. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany); Loewe, K. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany)]. E-mail: katharina.loewe@tu-berlin.de

    2007-12-15

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. The method is qualitative and is used in combination with other PHA methods. It has the advantage that it does not treat operator error as the sole contributor to human failure within a system, but rather considers the combination of all underlying factors.
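
    The deductive, top-event structure described in this abstract can be illustrated with a small fault-tree-style calculation. The sketch below is not the authors' method; the factor names, gate structure, and probabilities are assumptions chosen only to show how underlying human factors, rather than operator error alone, combine into a top event.

```python
# Minimal sketch of a deductive, top-event style combination of human-factor
# contributors. The factor names and probabilities are illustrative assumptions,
# not values from the paper.

def or_gate(probs):
    """Probability that at least one independent basic event occurs."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def and_gate(probs):
    """Probability that all independent basic events occur together."""
    p_all = 1.0
    for p in probs:
        p_all *= p
    return p_all

# Hypothetical underlying factors that could contribute to the top event
# "human error during a process operation".
design_factors = or_gate([0.01, 0.02])    # e.g. ambiguous labelling, poor layout
organisational = or_gate([0.005, 0.01])   # e.g. time pressure, inadequate training
operator_slip = 0.02                      # the operator action itself

# Top event: an operator slip occurring while any latent factor is present.
p_top = and_gate([operator_slip, or_gate([design_factors, organisational])])
print(f"Illustrative top-event probability: {p_top:.4f}")
```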

  2. Job Hazard Analysis

    National Research Council Canada - National Science Library

    1998-01-01

    ... Establishing proper job procedures is one of the benefits of conducting a job hazard analysis: carefully studying and recording each step of a job, identifying existing or potential job hazards...

  3. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GRAMS, W.H.

    2000-12-28

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from the results of the hazard evaluations, and (2) Hazard Topography Database: Data from the system familiarization and hazard identification.

  4. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  5. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  6. Counterfactual Volcano Hazard Analysis

    Science.gov (United States)

    Woo, Gordon

    2013-04-01

    The historical database of past disasters is a cornerstone of catastrophe risk assessment. While disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally means 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large luck element involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive of a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and see regularity in event patterns which are illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using the simple dice paradigm. Suppose a die is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios. However ...
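
    The die-rolling illustration in the abstract can be checked directly: the probability of a year with no recorded event is (5/6)^12 ≈ 0.112, i.e. about 11%. The short sketch below reproduces this analytically and by the kind of stochastic simulation the abstract proposes; it is a toy illustration, not the author's model.

```python
# The abstract's dice illustration: a die is rolled once a month for a year and
# an "event" is recorded whenever a six is thrown. The chance of seeing no event
# at all in the year is (5/6)**12, i.e. roughly 11%.
from random import randint

p_no_event = (5 / 6) ** 12
print(f"Analytic probability of a year with no events: {p_no_event:.3f}")  # ~0.112

# The same result by stochastic simulation, mirroring the idea of re-running
# history many times to see how often near-misses fail to become disasters.
trials = 100_000
quiet_years = sum(
    all(randint(1, 6) != 6 for _ in range(12)) for _ in range(trials)
)
print(f"Simulated probability: {quiet_years / trials:.3f}")
```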

  7. MGR External Events Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  8. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  9. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    Energy Technology Data Exchange (ETDEWEB)

    R. Longwell; J. Keifer; S. Goodin

    2001-01-22

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) the occurrence of a fire-related event; (2) a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees; (3) vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; (4) property losses from a fire and related events exceeding limits established by DOE; and (5) critical process controls and safety class systems being damaged as a result of a fire and related events.

  10. 14 CFR 437.29 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  11. 21 CFR 120.7 - Hazard analysis.

    Science.gov (United States)

    2010-04-01

    § 120.7 Hazard analysis. ... to occur and thus constitutes a food hazard that must be addressed in the HACCP plan. A food hazard... intended consumer. (e) HACCP plans for juice need not address the food hazards associated with...

  12. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  13. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-03-01

    When training for a job in which human error has the potential of producing catastrophic results, an understanding of the hazards that may be encountered is of paramount importance. In high-consequence activities, it is important that the training program be conducted in a safe environment and yet emphasize the potential hazards. Because of the high consequence of a human error, the use of a high-fidelity simulation is of great importance to provide the safe environment the worker needs to learn and hone required skills. A hazards analysis identifies the operation hazards, potential human error, and associated positive measures that aid in the mitigation or prevention of the hazard. The information gained from the hazards analysis should be used in the development of training. This paper will discuss the integration of information from the hazards analysis into the development of simulation components of a training program.

  14. The Integrated Hazard Analysis Integrator

    Science.gov (United States)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. In both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and ...

  15. Reducing hazards for animals from humans

    Directory of Open Access Journals (Sweden)

    Paul-Pierre Pastoret

    2012-06-01

    If animals may be a source of hazards for humans, the reverse is equally true. The main sources of hazards from humans to animals are the human introduction of transboundary animal diseases, climate change, globalisation, the introduction of invasive species and the reduction of biodiversity. There is also a trend toward reducing genetic diversity in domestic animals such as cattle; there are presently around 700 different breeds of cattle, many of which are on the verge of extinction (fewer than 100 reproductive females). The impact of humans is also indirect, through detrimental effects on the environment. It is therefore urgent to implement the new concept of "one health".

  16. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of the hazard modeling distribution as it approaches different distributions.
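
    As a rough illustration of comparing hazard behaviour across candidate distributions by simulation, the sketch below draws failure times from Weibull distributions with different shape parameters (shape 1 being the exponential, constant-hazard case). The choice of distributions and parameters is an assumption for illustration; the abstract does not specify which distributions were compared.

```python
# A minimal, illustrative sketch of comparing candidate hazard (failure-time)
# distributions by simulation; the Weibull shapes below are assumptions, since
# the abstract does not list the specific distributions compared.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
candidates = {
    "exponential (Weibull shape 1.0)": 1.0,
    "Weibull shape 0.5 (infant-mortality type hazard)": 0.5,
    "Weibull shape 2.0 (wear-out type hazard)": 2.0,
}

for name, shape in candidates.items():
    t = rng.weibull(shape, size=n)   # failure times, scale fixed at 1.0
    early = np.mean(t < 0.2)         # share of very early failures
    print(f"{name}: mean life {t.mean():.2f}, P(fail before 0.2) = {early:.3f}")
```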

  17. 14 CFR 437.55 - Hazard analysis.

    Science.gov (United States)

    2010-01-01

    § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  18. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  19. Human-Environment Interaction: Natural Hazards as a Classic Example.

    Science.gov (United States)

    Montz, Burrell E.

    1989-01-01

    Urges that natural hazards be studied in order to analyze the geographic theme of human-environment interaction. Suggests ways in which this information can be introduced in the classroom. Identifies field studies and cross-cultural analysis with follow-up discussions as possible activities. Points out information sources. (KO)

  20. The use of hazards analysis in the development of training

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.

    1998-12-01

    A hazards analysis identifies the operation hazards and the positive measures that aid in the mitigation or prevention of the hazard. If the tasks are human intensive, the hazard analysis often credits the personnel training as contributing to the mitigation of the accident's consequence or prevention of an accident sequence. To be able to credit worker training, it is important to understand the role of the training in the hazard analysis. Systematic training, known as systematic training design (STD), performance-based training (PBT), or instructional system design (ISD), uses a five-phase (analysis, design, development, implementation, and evaluation) model for the development and implementation of the training. Both a hazards analysis and a training program begin with a task analysis that documents the roles and actions of the workers. Though the task analyses are different in nature, there is common ground, and both the hazard analysis and the training program can benefit from a cooperative effort. However, the cooperation should not end with the task analysis phase of either program. The information gained from the hazards analysis should be used in all five phases of the training development. The training evaluation, both of the individual worker and of the institutional training program, can provide valuable information to the hazards analysis effort. This paper will discuss the integration of the information from the hazards analysis into a training program. The paper will use, as an example, the installation and removal of a piece of tooling used in a high-explosive operation. This example will be used to follow the systematic development of a training program and demonstrate the interaction and cooperation between the hazards analysis and training program.

  1. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-12

    Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention: for scenarios involving a release of hazardous material or energy, controls that prevent the occurrence or mitigate the effects of the release were identified in the What-If analysis table; for scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of "critical controls" that prevent the occurrence or mitigate the effects of the release for events with significant consequences was identified for these scenarios (see Section 4).

  2. Canister storage building hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.; Garvin, L.J.

    1997-07-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  3. Cold Vacuum Drying Facility hazard analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  4. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    § 123.6 Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (a) Hazard... fish or fishery product being processed in the absence of those controls. (b) The HACCP plan. Every...

  5. Chemical incidents resulted in hazardous substances releases in the context of human health hazards.

    Science.gov (United States)

    Pałaszewska-Tkacz, Anna; Czerczak, Sławomir; Konieczko, Katarzyna

    2017-02-21

    The research purpose was to analyze data concerning chemical incidents in Poland collected in 1999-2009 in terms of health hazards. The data was obtained, using a multimodal information technology (IT) system, from chemical incident reports prepared by rescuers at the scene. The final analysis covered sudden events associated with the uncontrolled release of hazardous chemical substances or mixtures, which may potentially lead to human exposure. Releases of unidentified substances where emergency services took action to protect human health or the environment were also included. The number of analyzed chemical incidents in 1999-2009 was 2930, with more than 200 different substances released. The substances were classified into 13 groups of substances and mixtures posing analogous risks. The most common releases involved non-flammable corrosive liquids, including: hydrochloric acid (199 cases), sulfuric(VI) acid (131 cases), sodium and potassium hydroxides (69 cases), ammonia solution (52 cases) and butyric acid (32 cases). The next group were gases hazardous only due to their physico-chemical properties, including: extremely flammable propane-butane (249 cases) and methane (79 cases). There was no statistically significant trend in the total number of incidents. Only for the number of incidents with flammable corrosive, toxic and/or harmful liquids did the regression analysis reveal a statistically significant downward trend. The number of victims reported was 1997, including 1092 children and 18 fatalities. The number of people injured, the number of incidents, the high 9th place of Poland in terms of the number of Seveso establishments, and the 4-times higher number of hazardous industrial establishments not covered by the Seveso Directive justify the need for systematic analysis of hazards and their proper identification. It is advisable to enhance health risk assessment, both qualitative and quantitative, by slight modification of the data collection system so as ...
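
    The downward-trend finding mentioned in the abstract is the kind of result a simple linear regression of annual incident counts on year would produce. The sketch below shows such a trend test; the annual counts are invented for illustration and are not the registry data.

```python
# A minimal sketch of a linear-regression trend test on annual incident counts.
# The counts below are hypothetical and do not come from the Polish registry.
from scipy.stats import linregress

years = list(range(1999, 2010))
annual_counts = [41, 38, 35, 33, 30, 29, 27, 25, 24, 22, 21]  # hypothetical

result = linregress(years, annual_counts)
print(f"slope = {result.slope:.2f} incidents/year, p-value = {result.pvalue:.4f}")
if result.pvalue < 0.05 and result.slope < 0:
    print("Statistically significant downward trend at the 5% level.")
```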

  6. Chemical incidents resulted in hazardous substances releases in the context of human health hazards

    Directory of Open Access Journals (Sweden)

    Anna Pałaszewska-Tkacz

    2017-02-01

    Objectives: The research purpose was to analyze data concerning chemical incidents in Poland collected in 1999–2009 in terms of health hazards. Material and Methods: The data was obtained, using a multimodal information technology (IT) system, from chemical incident reports prepared by rescuers at the scene. The final analysis covered sudden events associated with the uncontrolled release of hazardous chemical substances or mixtures, which may potentially lead to human exposure. Releases of unidentified substances where emergency services took action to protect human health or the environment were also included. Results: The number of analyzed chemical incidents in 1999–2009 was 2930, with more than 200 different substances released. The substances were classified into 13 groups of substances and mixtures posing analogous risks. The most common releases involved non-flammable corrosive liquids, including: hydrochloric acid (199 cases), sulfuric(VI) acid (131 cases), sodium and potassium hydroxides (69 cases), ammonia solution (52 cases) and butyric acid (32 cases). The next group were gases hazardous only due to their physico-chemical properties, including: extremely flammable propane-butane (249 cases) and methane (79 cases). There was no statistically significant trend in the total number of incidents. Only for the number of incidents with flammable corrosive, toxic and/or harmful liquids did the regression analysis reveal a statistically significant downward trend. The number of victims reported was 1997, including 1092 children and 18 fatalities. Conclusions: The number of people injured, the number of incidents, the high 9th place of Poland in terms of the number of Seveso establishments, and the 4-times higher number of hazardous industrial establishments not covered by the Seveso Directive justify the need for systematic analysis of hazards and their proper identification. It is advisable to enhance health risk assessment, both qualitative and ...

  7. Risk analysis based on hazards interactions

    Science.gov (United States)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessment in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, would serve as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  8. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each...

  9. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  10. Multicriteria analysis in hazards assessment in Libya

    Science.gov (United States)

    Zeleňáková, Martina; Gargar, Ibrahim; Purcz, Pavol

    2012-11-01

    Environmental hazards (natural and man-made) have always constituted a problem in many developing and developed countries. Many applications have proved that these problems could be solved through planning studies and detailed information about these prone areas. Determining the time, location and size of the problem is important for decision makers for planning and management activities. It is important to know the risk represented by those hazards and to take actions to protect against them. Multicriteria analysis methods - the analytic hierarchy process, pairwise comparison and the ranking method - are used to analyse which hazard facing Libya is the most dangerous. The multicriteria analysis ends with a more or less stable ranking of the given alternatives and hence a recommendation as to which alternative(s) or problems should be preferred. Regarding our problem of environmental risk assessment, the result will be a ranking or categorisation of hazards with regard to their risk level.
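
    One of the methods named, the analytic hierarchy process, derives hazard weights from the principal eigenvector of a pairwise comparison matrix. The sketch below shows that step; the hazards listed and the Saaty-scale judgements are assumptions for illustration, not values from the study.

```python
# A minimal sketch of the analytic hierarchy process (pairwise comparison) step:
# hazard weights are taken from the principal eigenvector of a reciprocal
# comparison matrix. The hazards and judgements are illustrative assumptions.
import numpy as np

hazards = ["flood", "drought", "oil spill"]
# A[i, j] = how much more important hazard i is than hazard j (Saaty scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

for name, w in sorted(zip(hazards, weights), key=lambda x: -x[1]):
    print(f"{name}: weight {w:.2f}")
```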

  11. Hazard screening application guide. Safety Analysis Report Update Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates identified hazards in terms of the effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.

  12. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  13. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  14. Analysis of urinary human chorionic gonadotrophin concentrations in normal and failing pregnancies using longitudinal, Cox proportional hazards and two-stage modelling.

    Science.gov (United States)

    Marriott, Lorrae; Zinaman, Michael; Abrams, Keith R; Crowther, Michael J; Johnson, Sarah

    2017-09-01

    Background: Human chorionic gonadotrophin is a marker of early pregnancy. This study sought to determine the possibility of being able to distinguish between healthy and failing pregnancies by utilizing patient-associated risk factors and daily urinary human chorionic gonadotrophin concentrations. Methods: Data were from a study that collected daily early morning urine samples from women trying to conceive (n = 1505), 250 of whom became pregnant. Data from 129 women who became pregnant (including 44 miscarriages) were included in these analyses. A longitudinal model was used to profile human chorionic gonadotrophin, a Cox proportional hazards model to assess demographic/menstrual history data on the time to failed pregnancy, and a two-stage model to combine these two models. Results: The profile for log human chorionic gonadotrophin concentrations in women suffering miscarriage differs from that of viable pregnancies; the rate of human chorionic gonadotrophin rise is slower in those suffering a biochemical loss (loss before six weeks, recognized by a rise and fall of human chorionic gonadotrophin) and tends to plateau at a lower log human chorionic gonadotrophin in women suffering an early miscarriage (loss at six weeks or later), compared with viable pregnancies. Maternal age, longest cycle length and time from luteinizing hormone surge to human chorionic gonadotrophin reaching 25 mIU/mL were found to be significantly associated with miscarriage risk. The two-stage model found that for an increase of one day in the time from luteinizing hormone surge to human chorionic gonadotrophin reaching 25 mIU/mL, there is a 30% increase in miscarriage risk (hazard ratio: 1.30; 95% confidence interval: 1.04, 1.62). Conclusion: Rise of human chorionic gonadotrophin in early pregnancy could be useful to predict pregnancy viability. Daily tracking of urinary human chorionic gonadotrophin may enable early identification of some pregnancies at risk of miscarriage.
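
    A Cox proportional hazards fit of the kind described, relating miscarriage risk to covariates such as maternal age and the time from the luteinizing hormone surge to hCG reaching 25 mIU/mL, can be sketched as below. The lifelines package, the column names, and the data are assumptions for illustration; they are not the study's software or dataset.

```python
# A minimal sketch of a Cox proportional hazards fit of the kind described,
# using the lifelines package; the column names and data are hypothetical,
# not the study's dataset.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "weeks_to_loss_or_censor": [12, 5, 40, 8, 40, 40, 6, 40, 40, 7],
    "miscarriage":             [0,  1,  0, 1,  0,  0, 1,  0,  0, 1],  # 1 = loss observed
    "maternal_age":            [28, 36, 31, 27, 25, 38, 41, 29, 34, 30],
    "days_lh_to_hcg25":        [9, 12, 8, 11, 9, 10, 14, 8, 13, 9],   # LH surge to hCG >= 25 mIU/mL
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_loss_or_censor", event_col="miscarriage")
cph.print_summary()  # hazard ratios with confidence intervals for each covariate
```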

  15. Repository Subsurface Preliminary Fire Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Richard C. Logan

    2001-07-30

    This fire hazard analysis identifies preliminary design and operations features, fire, and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c), which was prepared in accordance with AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities"; Attachment 4 of AP-ESH-008, "Hazards Analysis System"; and AP-3.11Q, "Technical Reports". The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  16. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters that have most significantly impacted risks and damage. Countries such as China, Japan, and Indonesia are located on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example by analyzing seismic zones and earthquake hazard micro-zonation, by using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and by using remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering the efficiency of time and the accuracy of data, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, since only limited time is available for the right decision-making shortly after a disaster. Exposed areas and possibly vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in the use of remote sensing as one of the methods in the assessment of earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  17. Integrating population dynamics into mapping human exposure to seismic hazard

    Directory of Open Access Journals (Sweden)

    S. Freire

    2012-11-01

    Disaster risk is not fully characterized without taking into account vulnerability and population exposure. Assessment of earthquake risk in urban areas would benefit from considering the variation of population distribution at more detailed spatial and temporal scales, and from a more explicit integration of this improved demographic data with existing seismic hazard maps. In the present work, "intelligent" dasymetric mapping is used to model population dynamics at high spatial resolution in order to benefit the analysis of spatio-temporal exposure to earthquake hazard in a metropolitan area. These night- and daytime-specific population densities are then classified and combined with seismic intensity levels to derive new spatially explicit four-class composite maps of human exposure. The presented approach enables a more thorough assessment of population exposure to earthquake hazard. Results show that there are significantly more people potentially at risk in the daytime period, demonstrating the shifting nature of population exposure in the daily cycle and the need to move beyond conventional residence-based demographic data sources to improve risk analyses. The proposed fine-scale maps of human exposure to seismic intensity are mainly aimed at benefiting visualization and communication of earthquake risk, but can be valuable in all phases of the disaster management process where knowledge of population densities is relevant for decision-making.
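
    The classify-and-combine step described (population density classes overlaid with seismic intensity levels to give composite exposure classes) can be sketched on two small grids. The thresholds and the particular four-class coding below are assumptions for illustration, not the paper's scheme.

```python
# A minimal sketch of combining a (day- or night-time) population density grid
# with a seismic intensity grid into composite exposure classes; the thresholds
# and the 2x2 class scheme are illustrative assumptions, not the paper's scheme.
import numpy as np

population = np.array([[120,  15], [2300, 480]])   # people per cell (hypothetical)
intensity  = np.array([[6.0, 7.5], [ 8.0, 5.5]])   # macroseismic intensity (hypothetical)

dense  = population >= 500          # "high" population density
strong = intensity  >= 7.0          # "high" seismic intensity

# 1 = low/low, 2 = high population only, 3 = high intensity only, 4 = high/high
exposure_class = 1 + dense.astype(int) + 2 * strong.astype(int)
print(exposure_class)
```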

  18. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Science.gov (United States)

    2010-01-01

    § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard... Physical hazards. (b) The HACCP plan. (1) Every establishment shall develop and implement a written HACCP...

  19. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to performance of the Final Safety Analysis Report performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  20. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    Science.gov (United States)

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazards and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda, as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) Epidemics of infectious diseases, 2) Drought/famine, 3) Conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) Conflicts, 2) Epidemics, 3) Drought/famine and 4) Environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.
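
    A simple ranking of the kind described can be sketched by combining expert scores for likelihood, impact, and vulnerability into a composite score per hazard. The hazard names echo the abstract, but the 1-5 scores and the multiplicative scoring rule below are assumptions for illustration, not the study's elicitation.

```python
# A minimal sketch of a simple hazard ranking from expert scores; the scores
# and the multiplicative composite rule are invented for illustration.
scores = {
    # hazard: (likelihood, impact, vulnerability), each scored 1 (low) to 5 (high)
    "epidemics of infectious diseases": (5, 5, 4),
    "drought/famine":                   (4, 5, 4),
    "conflict":                         (3, 5, 5),
    "environmental degradation":        (4, 3, 3),
}

ranked = sorted(scores.items(), key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2], reverse=True)
for rank, (hazard, (l, i, v)) in enumerate(ranked, start=1):
    print(f"{rank}. {hazard}: composite score {l * i * v}")
```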

  1. Landslide hazards and systems analysis: A Central European perspective

    Science.gov (United States)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is to understand the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments by the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that had been transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany enables us to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures or rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits starts to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts at risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to ...

  2. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. The distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen's seismic hazard are the events from the West Arabian Shield seismic zone.
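
    The hazard levels quoted in the abstract use the standard Poisson relation between exceedance probability and return period, p = 1 - exp(-t/T), under which a 10% probability of exceedance in 50 years corresponds to a return period of about 475 years. The short sketch below reproduces that arithmetic.

```python
# Standard Poisson relation between exceedance probability over t years and
# return period T: p = 1 - exp(-t / T), so T = -t / ln(1 - p).
import math

def return_period(p_exceedance, t_years):
    return -t_years / math.log(1.0 - p_exceedance)

print(f"10% in 50 yr -> return period of about {return_period(0.10, 50):.0f} years")  # ~475
print(f"50% in 50 yr -> return period of about {return_period(0.50, 50):.0f} years")  # ~72
```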

  3. Fire hazard analysis for fusion energy experiments

    Energy Technology Data Exchange (ETDEWEB)

    Alvares, N.J.; Hasegawa, H.K.

    1979-01-01

    The 2XIIB mirror fusion facility at Lawrence Livermore Laboratory (LLL) was used to evaluate the fire safety of state-of-the-art fusion energy experiments. The primary objective of this evaluation was to ensure the parallel development of fire safety and fusion energy technology. Through fault-tree analysis, we obtained a detailed engineering description of the 2XIIB fire protection system. This information helped us establish an optimum level of fire protection for experimental fusion energy facilities as well as evaluate the level of protection provided by various systems. Concurrently, we analyzed the fire hazard inherent to the facility using techniques that relate the probability of ignition to the flame spread and heat-release potential of construction materials, electrical and thermal insulations, and dielectric fluids. A comparison of the results of both analyses revealed that the existing fire protection system should be modified to accommodate the range of fire hazards inherent to the 2XIIB facility.

  4. Decision analysis for INEL hazardous waste storage

    Energy Technology Data Exchange (ETDEWEB)

    Page, L.A.; Roach, J.A.

    1994-01-01

    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft² of isolated HW storage area. Negotiations with the State to discuss aisle space requirements along with modifications to WWSB operating procedures are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  5. Human health hazards of veterinary medications: information for emergency departments.

    Science.gov (United States)

    Lust, Elaine Blythe; Barthold, Claudia; Malesker, Mark A; Wichman, Tammy O

    2011-02-01

    There are over 5000 approved prescription and over-the-counter medications, as well as vaccines, with labeled indications for veterinary patients. Of these, there are several products that have significant human health hazards upon accidental or intentional exposure or ingestion in humans: carfentanil, clenbuterol (Ventipulmin), ketamine, tilmicosin (Micotil), testosterone/estradiol (Component E-H and Synovex H), dinoprost (Lutalyse/Prostamate), and cloprostenol (Estromate/EstroPlan). The hazards range from mild to life-threatening in terms of severity, and include bronchospasm, central nervous system stimulation, induction of miscarriage, and sudden death. To report medication descriptions, human toxicity information, and medical management for the emergent care of patients who may have had exposure to veterinary medications when they present to an emergency department (ED). The intended use of this article is to inform and support ED personnel, drug information centers, and poison control centers on veterinary medication hazards. There is a need for increased awareness of the potential hazards of veterinary medications within human medicine circles. Timely reporting of veterinary medication hazards and their medical management may help to prepare the human medical community to deal with such exposures or abuses when time is of the essence. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Scientific hazards of human reproductive 'cloning'.

    Science.gov (United States)

    Young, Lorraine E

    2003-05-01

    The scientific and clinical professional societies and associations covering the remit of Human Fertility are unanimously opposed to human reproductive 'cloning'. This article describes the main scientific objections to human reproductive 'cloning'. Data collected from numerous studies in a range of animal species indicate a high incidence of fetal defects, a stillbirth rate typically of more than 90%, and a lack of adequate information on postnatal development. These concerns are exacerbated by misconceptions about the current ability to screen preimplantation embryos for 'cloning-induced' defects. Scientists and clinicians are sometimes viewed with mistrust by the public and media over such issues, perhaps because scientific information is not as well communicated as it might be. The duty of reproductive specialists is to convey the limits of their knowledge on this issue to the public and policymakers.

  7. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-02-19

    ... Preventive Controls for Human Food; Extension of Comment Period for Information Collection Provisions AGENCY... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in the... ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food...

  8. 78 FR 48636 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-08-09

    ... Preventive Controls for Human Food; Extension of Comment Periods AGENCY: Food and Drug Administration, HHS... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food,'' that appeared in... Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food...

  9. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates that have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data-driven PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.
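
    To make the object of this debate concrete, the sketch below evaluates the basic hazard integral underlying the traditional (Cornell-McGuire) approach: the annual rate of exceeding a ground-motion level is the source activity rate multiplied by the probability, integrated over magnitudes, that the ground-motion model exceeds that level. The single source, the toy ground-motion relation, and all constants are illustrative assumptions and are not taken from the PEGASOS or Yucca Mountain studies.

```python
"""Minimal Cornell-McGuire style PSHA sketch (illustrative only)."""
import numpy as np
from scipy.stats import norm

# Toy areal source: annual rate of M >= m_min events, G-R b-value, magnitude bounds.
nu, b, m_min, m_max = 0.05, 1.0, 5.0, 7.5
beta = b * np.log(10.0)

# Toy ground-motion model (PGA in g): ln PGA = c0 + c1*M - c2*ln(R), with sigma in ln units.
c0, c1, c2, sigma = -3.5, 1.0, 1.0, 0.6
distance_km = 30.0

def magnitude_pmf(m_grid):
    """Discretized truncated-exponential (Gutenberg-Richter) magnitude distribution."""
    cdf = (1 - np.exp(-beta * (m_grid - m_min))) / (1 - np.exp(-beta * (m_max - m_min)))
    pmf = np.diff(cdf, prepend=0.0)
    return pmf / pmf.sum()

def annual_exceedance_rate(pga_level, m_grid):
    """lambda(PGA > a) = nu * sum_m P(M = m) * P(PGA > a | m, R)."""
    ln_median = c0 + c1 * m_grid - c2 * np.log(distance_km)
    p_exceed = 1.0 - norm.cdf((np.log(pga_level) - ln_median) / sigma)
    return nu * np.sum(magnitude_pmf(m_grid) * p_exceed)

m_grid = np.linspace(m_min, m_max, 100)
for a in (0.05, 0.1, 0.2, 0.4):
    lam = annual_exceedance_rate(a, m_grid)
    print(f"PGA > {a:.2f} g: annual rate = {lam:.2e}, "
          f"P(exceedance in 50 yr) = {1 - np.exp(-lam * 50):.3f}")
```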

  10. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    Science.gov (United States)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  11. A System of Systems Interface Hazard Analysis Technique

    Science.gov (United States)

    2007-03-01

    Front matter and excerpt: the report's tables cover the HAZOP process, HAZOP guide words for software or system interface analysis, and an example system-of-systems architecture. Hazards and Operability (HAZOP) Analysis applies a systematic exploration of system

  12. Guidance Index for Shallow Landslide Hazard Analysis

    Directory of Open Access Journals (Sweden)

    Cheila Avalon Cullen

    2016-10-01

    Full Text Available Rainfall-induced shallow landslides are one of the most frequent hazards on sloping terrain. Intense storms with high-intensity and long-duration rainfall have high potential to trigger rapidly moving soil masses due to changes in pore water pressure and seepage forces. Nevertheless, regardless of the intensity and/or duration of the rainfall, shallow landslides are influenced by antecedent soil moisture conditions. To date, no system exists that dynamically interrelates these two factors on large scales. This work introduces a Shallow Landslide Index (SLI as the first implementation of antecedent soil moisture conditions for the hazard analysis of shallow rainfall-induced landslides. The proposed mathematical algorithm is built using a logistic regression method that systematically learns from a comprehensive landslide inventory. Initially, root-soil moisture and rainfall measurements modeled from AMSR-E and TRMM, respectively, are used as proxies to develop the index. The input dataset is randomly divided into training and verification sets using the Hold-Out method. Validation results indicate that the best-fit model predicts the highest number of cases correctly at 93.2% accuracy. Subsequently, as AMSR-E and TRMM stopped working in October 2011 and April 2015, respectively, root-soil moisture and rainfall measurements modeled by SMAP and GPM are used to develop models that calculate the SLI for 10, 7, and 3 days. The resulting models indicate a strong relationship (78.7%, 79.6%, and 76.8%, respectively between the predictors and the predicted value. The results also highlight important remaining challenges such as adequate information for algorithm functionality and satellite-based data reliability. Nevertheless, the experimental system can potentially be used as a dynamic indicator of the total amount of antecedent moisture and rainfall (for a given duration of time needed to trigger a shallow landslide in a susceptible area. It is
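
    The following is a minimal sketch of the kind of logistic-regression index the abstract describes. The synthetic soil-moisture and rainfall predictors, the synthetic landslide inventory, and the hold-out split are assumptions; this is not the authors' SLI algorithm or data, only an illustration of training a logistic model and using the fitted probability as an index.

```python
"""Illustrative Shallow Landslide Index (SLI) style logistic regression."""
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n = 2000

# Hypothetical predictors: antecedent root-zone soil moisture (m3/m3) and
# accumulated rainfall over the preceding days (mm).
soil_moisture = rng.uniform(0.05, 0.45, n)
rainfall = rng.gamma(shape=2.0, scale=40.0, size=n)

# Synthetic "inventory": landslides become likely when both predictors are high.
logit = -8.0 + 12.0 * soil_moisture + 0.04 * rainfall
landslide = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([soil_moisture, rainfall])
X_train, X_test, y_train, y_test = train_test_split(
    X, landslide, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# The fitted probability plays the role of the index (SLI) for a grid cell.
sli = model.predict_proba(X_test)[:, 1]
print("hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
print("example SLI values:", np.round(sli[:5], 3))
```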

  13. Fire hazards analysis of transuranic waste storage and assay facility

    Energy Technology Data Exchange (ETDEWEB)

    Busching, K.R., Westinghouse Hanford

    1996-07-31

    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  14. Human and nature-caused hazards: the affect heuristic causes biased decisions.

    Science.gov (United States)

    Siegrist, Michael; Sütterlin, Bernadette

    2014-08-01

    People are more concerned about the negative consequences of human hazards compared with natural hazards. Results of four experiments show that the same negative outcome (e.g., number of birds killed by an oil spill) was more negatively evaluated when caused by humans than when caused by nature. Results further show that when identical risk information was provided, participants evaluated nuclear power more negatively compared with solar power. The affect associated with the hazard per se influenced the interpretation of the new information. Furthermore, the affect experienced in the situation fully mediated the evaluation of the negative outcomes of a hazard. People's reliance on the affect heuristic is a challenge for acceptance of cost-benefit analyses because equally negative outcomes are differently evaluated depending on the cause. Symbolically significant information and the affect evoked by this information may result in biased and riskier decisions. © 2014 Society for Risk Analysis.

  15. Preliminary Tsunami Hazard Analysis for Uljin NPP Site using Tsunami Propagation Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Hyunme; Kim, Minkyu; Choi, Inkil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Sheen, Donghoon [Chonnam National Univ., Gwangju (Korea, Republic of)]

    2014-05-15

    The tsunami hazard analysis is based on the seismic hazard analysis method. Seismic hazard analysis has been performed using either deterministic or probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the probabilistic approach can better account for the uncertainties in the hazard analysis. Therefore, a study on probabilistic tsunami hazard analysis (PTHA) was performed here. This study focused on the wave propagation analysis, which is the main difference between seismic hazard analysis and tsunami hazard analysis.

  16. Safety analysis of contained low-hazard biotechnology applications.

    Science.gov (United States)

    Pettauer, D; Käppeli, O; van den Eede, G

    1998-06-01

    A technical safety analysis has been performed on a containment-level-2 pilot plant in order to assess an upgrading of the existing facility, which should comply with good manufacturing practices. The results were obtained by employing the hazard and operability (HAZOP) assessment method and are discussed in the light of the appropriateness of this procedural tool for low-hazard biotechnology applications. The potential release of micro-organisms accounts only for a minor part of the hazardous consequences. However, in certain cases the release of a large or moderate amount of micro-organisms would not be immediately identified. Most of the actions required to avoid these consequences fall into the realm of operational procedures. As a major part of potential failures result from human errors, standard operating procedures play a prominent role when establishing the concept of safety management. The HAZOP assessment method was found to be adequate for the type of process under investigation. The results also may be used for the generation of checklists which, in most cases, are sufficient for routine safety assurance.

  17. Fire Hazard Analysis for Turbine Building of NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Seung Jun [KMENT, Seoul (Korea, Republic of); Park, Jun Hyun [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    In order to demonstrate the fire safety of operating nuclear power plants, a plant-specific fire hazard analysis should be performed, and the effect of design changes on fire safety should be reviewed periodically. At the fire vulnerability estimation stage, the factors that influence fire vulnerability are investigated, including ignition sources, combustibles, fire barriers, and fire protection features such as detection, alarm, suppression, and evacuation. At the fire hazard assessment stage, ignition and propagation hazards, passive and active fire protection features, and the fire protection program, including the pre-fire plan and related procedures, are investigated. Based on the results of the fire hazard analysis, a reasonable improvement plan for fire protection can be established. This paper describes the results of a fire hazard analysis, classified by fire area, for the turbine building, where fire hazards and fire frequencies are relatively high in operating nuclear power plants.

  18. Antibiotic, Pesticide, and Microbial Contaminants of Honey: Human Health Hazards

    Directory of Open Access Journals (Sweden)

    Noori Al-Waili

    2012-01-01

    Full Text Available Agricultural contamination with pesticides and antibiotics is a challenging problem that needs to be fully addressed. Bee products, such as honey, are widely consumed as food and medicine, and their contamination may carry serious health hazards. Honey and other bee products are polluted by pesticides, heavy metals, bacteria and radioactive materials. Pesticide residues cause genetic mutations and cellular degradation, and the presence of antibiotics might promote resistance in human or animal pathogens. Many cases of infant botulism have been attributed to contaminated honey. Honey may be very toxic when produced from certain plants. Ingestion of honey without knowing its source and safety might be problematic. Honey should be labeled to indicate its origin and composition and to state clearly that it is free from contaminants. Honey that has not been subjected to analysis and sterilization should not be used in infants, and should not be applied to wounds or used for medicinal purposes. This article reviews the extent and health impact of honey contamination and stresses the need for a strict monitoring system and for validation of acceptable minimal concentrations of pollutants, or identification of maximum residue limits, for bee products, in particular honey.

  19. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    Science.gov (United States)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
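
    A minimal sketch of the empirical (non-Bayesian) side of such an analysis is shown below: given a set of simulated scenarios, each with an annual occurrence rate and a simulated inundation depth at the site, the hazard curve is the summed rate of the scenarios exceeding each intensity level. The scenario catalogue and rates are invented for illustration, and the paper's Bayesian robust fitting is not reproduced.

```python
"""Empirical tsunami hazard curve sketch from stochastic scenario simulations."""
import numpy as np

rng = np.random.default_rng(1)
n_scenarios = 5000

# Synthetic catalogue: scenario annual rates and simulated inundation depths (m).
rates = np.full(n_scenarios, 0.05 / n_scenarios)            # total rate ~0.05 per year (toy)
depths = rng.lognormal(mean=0.0, sigma=0.8, size=n_scenarios)

def hazard_curve(intensity_levels, depths, rates):
    """Mean annual rate of exceedance: sum the rates of scenarios exceeding each level."""
    return np.array([rates[depths > x].sum() for x in intensity_levels])

levels = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
lam = hazard_curve(levels, depths, rates)
for x, l in zip(levels, lam):
    p50 = 1.0 - np.exp(-l * 50.0)      # probability of exceedance in a 50-year window
    print(f"depth > {x:4.1f} m : annual rate {l:.2e}, P(50 yr) = {p50:.4f}")
```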

  20. 14 CFR 417.227 - Toxic release hazard analysis.

    Science.gov (United States)

    2010-01-01

    ... members of the public on land and on any waterborne vessels, populated offshore structures, and aircraft... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  1. Hazard and operability (HAZOP) analysis. A literature review.

    Science.gov (United States)

    Dunjó, Jordi; Fthenakis, Vasilis; Vílchez, Juan A; Arnaldos, Josep

    2010-01-15

    Hazard and operability (HAZOP) methodology is a Process Hazard Analysis (PHA) technique used worldwide for studying not only the hazards of a system, but also its operability problems, by exploring the effects of any deviations from design conditions. Our paper is the first HAZOP review intended to gather HAZOP-related literature from books, guidelines, standards, major journals, and conference proceedings, with the purpose of classifying the research conducted over the years and defining the HAZOP state of the art.

  2. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost
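
    As a simple illustration of how such selection factors can be combined, the sketch below scores a few techniques against weighted factors. The factor names, weights, and scores are hypothetical and do not reproduce the methodology developed in the paper; they only show the general idea of ranking techniques against selection criteria.

```python
"""Illustrative weighted-scoring sketch for choosing hazard analysis techniques."""

factors = {  # factor -> weight (assumed, sums to 1.0)
    "motivation_fit": 0.25,
    "data_availability": 0.20,
    "process_complexity": 0.25,
    "team_expertise": 0.15,
    "initial_risk_perception": 0.15,
}

# Scores 1-5 per technique and factor (purely illustrative values).
scores = {
    "HAZOP": {"motivation_fit": 5, "data_availability": 4, "process_complexity": 5,
              "team_expertise": 3, "initial_risk_perception": 4},
    "FMEA": {"motivation_fit": 4, "data_availability": 3, "process_complexity": 3,
             "team_expertise": 4, "initial_risk_perception": 3},
    "What-If/Checklist": {"motivation_fit": 3, "data_availability": 5, "process_complexity": 2,
                          "team_expertise": 5, "initial_risk_perception": 3},
}

def weighted_score(technique):
    """Combine factor scores with the factor weights."""
    return sum(factors[f] * s for f, s in scores[technique].items())

for technique in sorted(scores, key=weighted_score, reverse=True):
    print(f"{technique:20s} {weighted_score(technique):.2f}")
```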

  3. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-04-15

    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  4. Catastrophic debris-flows: geological hazard and human influence

    Science.gov (United States)

    Del Ventisette, Chiara; Garfagnoli, Francesca; Ciampalini, Andrea; Battistini, Alessandro; Gigli, Giovanni; Moretti, Sandro; Casagli, Nicola

    2013-04-01

    Rainfall-induced landslides are widespread phenomena that often affect urbanized areas, causing severe damage and casualties. The management of the post-event phase requires a fast evaluation of the involved areas and triggering factors. The latter are fundamental to evaluating the stability of the area affected by landslides, in order to facilitate quick and safe activities by the Civil Protection Authorities during the emergency. On October 1st 2009, a prolonged and intense rainstorm triggered hundreds of landslides (predominantly debris flows) in an area of about 50 km² in the north-eastern sector of Sicily (Italy). Debris flows swept the highest parts of many villages and passed over the SS114 state highway and the Messina-Catania railway, causing more than 30 fatalities. This work deals with the geological and hydro-geomorphological studies performed as part of the post-disaster activities carried out in collaboration with the Civil Protection Authority, with the aim of examining landslide effects and mechanisms. The data were elaborated in a GIS platform to evaluate the influence of urbanization on the drainage pattern and were correlated with the lithological and structural framework of the area. The case study of Giampilieri focuses attention on the necessity of sustainable land use and reasonable urban management in areas characterized by a high hydrogeological hazard, and on the tremendous destructive power of these phenomena, which are capable of causing a large number of victims in such small villages. Field surveys and stereo-photo geomorphological analysis revealed a significant human influence on the landslide triggering causes, as well as on the final amount of damage. In particular, destruction and injuries in the built-up area of Giampilieri were made even more severe by the narrowing of the main water flow lines due to building activity and enlargement of the urban area. The area maintains a high degree of hazard: deposits of poorly

  5. Hazardous Waste Site Analysis (Small Site Technology)

    Science.gov (United States)

    1990-08-01

    RCRA required all treaters, storers, and/or disposers to either have permits by November 1980, or qualify for interim status by notifying... Carbon dioxide or compressed liquid-state propane is used as a solvent to extract organic hazardous constituents from waste. Additional processing

  6. User’s Guide - Seismic Hazard Analysis

    Science.gov (United States)

    1993-02-01

    Earthquake Magnitude Cutoff 8.5 (example 8.8); Enter Site Longitude (Degrees) 117 (example 115.0); Enter Site Latitude (Degrees) 38 (example 38.5); Any Changes? Y / N ... the art for assessing earthquake hazards in the United States; catalogue of strong-motion earthquake records, Waterways Experiment Station, Vicksburg

  7. A Situational Analysis of Priority Disaster Hazards in Uganda ...

    African Journals Online (AJOL)

    Background: Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazards and vulnerability analyses provide vital information that can be used for development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for ...

  8. Seismic Hazard Analysis of the Bandung Triga 2000 Reactor Site

    OpenAIRE

    Parithusta, Rizkita; P, Sindur; Mangkoesoebroto

    2004-01-01

    SEISMIC HAZARD ANALYSIS OF THE BANDUNG TRIGA 2000 REACTOR SITE. A seismic hazard analysis of the West Java region is carried out to estimate the peak ground acceleration at the Bandung TRIGA 2000 nuclear reactor site. Both the probabilistic and deterministic approaches are employed to better capture the uncertainties considering the enclosing fault systems. Comprehensive analysis is performed based on the newly revised catalog of seismic data, the most recent results of the construction of se...

  9. Arc flash hazard analysis and mitigation

    CERN Document Server

    Das, J C

    2012-01-01

    "All the aspects of arc flash hazard calculations and their mitigation have been covered. Knowledge of electrical power systems up to undergraduate level is assumed. The calculations of short-circuits, protective relaying and varied electrical system configurations in industrial power systems are addressed. Protection systems address differential relays, arc flash sensing relays, protective relaying coordination, current transformer operation and saturation and applications to major electrical equipments from the arc flash considerations. Current technologies and strategies for arc flash mitigation have been covered. A new algorithm for the calculation of arc flash hazard accounting for the decaying nature of the short-circuit currents is included. There are many practical examples and study cases. Review questions and references follow each chapter"--

  10. Seismic hazard analysis for Jayapura city, Papua

    Energy Technology Data Exchange (ETDEWEB)

    Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A. [Geological Agency, Diponegoro Road No.57, Bandung, 40122 (Indonesia)

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model from the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors estimated using geomorphological approaches are corrected with measurement data, which relate to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  11. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-11-20

    ... Preventive Controls for Human Food; Extension of Comment Periods AGENCY: Food and Drug Administration, HHS... 3646), entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk- Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive...

  12. 78 FR 24691 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-04-26

    ... Preventive Controls for Human Food; Extension of Comment Periods AGENCY: Food and Drug Administration, HHS... the proposed rule, ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive... rule entitled ``Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive...

  13. The use of animals as a surveillance tool for monitoring environmental health hazards, human health hazards and bioterrorism.

    Science.gov (United States)

    Neo, Jacqueline Pei Shan; Tan, Boon Huan

    2017-05-01

    This review discusses the utilization of wild or domestic animals as surveillance tools for monitoring naturally occurring environmental and human health hazards. Besides providing early warning of natural hazards, animals can also provide early warning of societal hazards such as bioterrorism. Animals are ideal surveillance tools for humans because they share the same environment as humans and spend more time outdoors than humans, increasing their exposure risk. Furthermore, the biologically compressed lifespans of some animals may allow them to develop clinical signs more rapidly after exposure to specific pathogens. Animals are an excellent channel for monitoring novel and known pathogens with outbreak potential given that more than 60% of emerging infectious diseases in humans originate as zoonoses. This review attempts to highlight animal illnesses, deaths, biomarkers or sentinel events, to remind human and veterinary public health programs that animal health can be used to discover, monitor or predict environmental health hazards, human health hazards, or bioterrorism. Lastly, we hope that this review will encourage the implementation of animals as a surveillance tool by clinicians, veterinarians, ecosystem health professionals, researchers and governments. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  14. Controlling organic chemical hazards in food manufacturing: a hazard analysis critical control points (HACCP) approach.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-08-01

    Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to provide many of the advantages previously identified for microbiological HACCP procedures: they are more effective, efficient and economical than conventional end-point-testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP becoming as effective as microbiological HACCP.

  15. Analysis of hazardous material releases due to natural hazards in the United States.

    Science.gov (United States)

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Cruz, Ana Maria

    2012-10-01

    Natural hazards were the cause of approximately 16,600 hazardous material (hazmat) releases reported to the National Response Center (NRC) between 1990 and 2008, some three per cent of all reported hazmat releases. Rain-induced releases were most numerous (26 per cent of the total), followed by those associated with hurricanes (20 per cent), many of which resulted from major episodes in 2005 and 2008. Winds, storms or other weather-related phenomena were responsible for another 25 per cent of hazmat releases. Large releases were most frequently due to major natural disasters. For instance, hurricane-induced releases of petroleum from storage tanks account for a large fraction of the total volume of petroleum released during 'natechs' (understood here as a natural hazard and the hazardous materials release that results). Among the most commonly released chemicals were nitrogen oxides, benzene, and polychlorinated biphenyls. Three deaths, 52 injuries, and the evacuation of at least 5,000 persons were recorded as a consequence of natech events. Overall, results suggest that the number of natechs increased over the study period (1990-2008) with potential for serious human and environmental impacts. © 2012 The Author(s). Journal compilation © Overseas Development Institute, 2012.

  16. IARC monographs: 40 years of evaluating carcinogenic hazards to humans.

    Science.gov (United States)

    Pearce, Neil; Blair, Aaron; Vineis, Paolo; Ahrens, Wolfgang; Andersen, Aage; Anto, Josep M; Armstrong, Bruce K; Baccarelli, Andrea A; Beland, Frederick A; Berrington, Amy; Bertazzi, Pier Alberto; Birnbaum, Linda S; Brownson, Ross C; Bucher, John R; Cantor, Kenneth P; Cardis, Elisabeth; Cherrie, John W; Christiani, David C; Cocco, Pierluigi; Coggon, David; Comba, Pietro; Demers, Paul A; Dement, John M; Douwes, Jeroen; Eisen, Ellen A; Engel, Lawrence S; Fenske, Richard A; Fleming, Lora E; Fletcher, Tony; Fontham, Elizabeth; Forastiere, Francesco; Frentzel-Beyme, Rainer; Fritschi, Lin; Gerin, Michel; Goldberg, Marcel; Grandjean, Philippe; Grimsrud, Tom K; Gustavsson, Per; Haines, Andy; Hartge, Patricia; Hansen, Johnni; Hauptmann, Michael; Heederik, Dick; Hemminki, Kari; Hemon, Denis; Hertz-Picciotto, Irva; Hoppin, Jane A; Huff, James; Jarvholm, Bengt; Kang, Daehee; Karagas, Margaret R; Kjaerheim, Kristina; Kjuus, Helge; Kogevinas, Manolis; Kriebel, David; Kristensen, Petter; Kromhout, Hans; Laden, Francine; Lebailly, Pierre; LeMasters, Grace; Lubin, Jay H; Lynch, Charles F; Lynge, Elsebeth; 't Mannetje, Andrea; McMichael, Anthony J; McLaughlin, John R; Marrett, Loraine; Martuzzi, Marco; Merchant, James A; Merler, Enzo; Merletti, Franco; Miller, Anthony; Mirer, Franklin E; Monson, Richard; Nordby, Karl-Cristian; Olshan, Andrew F; Parent, Marie-Elise; Perera, Frederica P; Perry, Melissa J; Pesatori, Angela Cecilia; Pirastu, Roberta; Porta, Miquel; Pukkala, Eero; Rice, Carol; Richardson, David B; Ritter, Leonard; Ritz, Beate; Ronckers, Cecile M; Rushton, Lesley; Rusiecki, Jennifer A; Rusyn, Ivan; Samet, Jonathan M; Sandler, Dale P; de Sanjose, Silvia; Schernhammer, Eva; Costantini, Adele Seniori; Seixas, Noah; Shy, Carl; Siemiatycki, Jack; Silverman, Debra T; Simonato, Lorenzo; Smith, Allan H; Smith, Martyn T; Spinelli, John J; Spitz, Margaret R; Stallones, Lorann; Stayner, Leslie T; Steenland, Kyle; Stenzel, Mark; Stewart, Bernard W; Stewart, Patricia A; Symanski, Elaine; Terracini, Benedetto; Tolbert, Paige E; Vainio, Harri; Vena, John; Vermeulen, Roel; Victora, Cesar G; Ward, Elizabeth M; Weinberg, Clarice R; Weisenburger, Dennis; Wesseling, Catharina; Weiderpass, Elisabete; Zahm, Shelia Hoar

    2015-06-01

    Recently, the International Agency for Research on Cancer (IARC) Programme for the Evaluation of Carcinogenic Risks to Humans has been criticized for several of its evaluations, and also for the approach used to perform these evaluations. Some critics have claimed that failures of IARC Working Groups to recognize study weaknesses and biases of Working Group members have led to inappropriate classification of a number of agents as carcinogenic to humans. The authors of this Commentary are scientists from various disciplines relevant to the identification and hazard evaluation of human carcinogens. We examined criticisms of the IARC classification process to determine the validity of these concerns. Here, we present the results of that examination, review the history of IARC evaluations, and describe how the IARC evaluations are performed. We concluded that these recent criticisms are unconvincing. The procedures employed by IARC to assemble Working Groups of scientists from the various disciplines and the techniques followed to review the literature and perform hazard assessment of various agents provide a balanced evaluation and an appropriate indication of the weight of the evidence. Some disagreement by individual scientists to some evaluations is not evidence of process failure. The review process has been modified over time and will undoubtedly be altered in the future to improve the process. Any process can in theory be improved, and we would support continued review and improvement of the IARC processes. This does not mean, however, that the current procedures are flawed. The IARC Monographs have made, and continue to make, major contributions to the scientific underpinning for societal actions to improve the public's health.

  17. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    Science.gov (United States)

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
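
    A rough sketch of the baseline-hazard smoothing idea is given below. It fits a Cox model with the lifelines package on its bundled Rossi dataset (a stand-in, since the Jackson Heart Study data are not public) and approximates the log baseline cumulative hazard with a low-order polynomial in log time, a simplified substitute for the fractional-polynomial or restricted-cubic-spline models proposed in the paper.

```python
"""Sketch: smooth parametric approximation of a Cox baseline cumulative hazard."""
import numpy as np
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

# Fit a Cox proportional hazards model on the example dataset.
rossi = load_rossi()
cph = CoxPHFitter()
cph.fit(rossi, duration_col="week", event_col="arrest")

# Nonparametric baseline cumulative hazard estimate H0(t) at the observed event times.
bch = cph.baseline_cumulative_hazard_
t = bch.index.values.astype(float)
H0 = bch.values.ravel()

# Keep strictly positive points so both logarithms are defined.
mask = (t > 0) & (H0 > 0)
log_t, log_H0 = np.log(t[mask]), np.log(H0[mask])

# Degree-2 polynomial in log(t): log H0(t) ~ a0 + a1*log(t) + a2*log(t)^2.
coefs = np.polyfit(log_t, log_H0, deg=2)
smooth_log_H0 = np.polyval(coefs, log_t)

resid = np.max(np.abs(smooth_log_H0 - log_H0))
print("fitted coefficients:", np.round(coefs, 3))
print("max abs residual on the log scale:", round(float(resid), 3))
```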

  18. AN ENHANCED HAZARD ANALYSIS PROCESS FOR THE HANFORD TANK FARMS

    Energy Technology Data Exchange (ETDEWEB)

    SHULTZ MV

    2008-05-15

    CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply being a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment, or require emergency response, in addition to those with significant impact on the facility worker, the offsite receptor, and the 100-meter receptor. Also, there is now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source for not only nuclear safety but also worker-safety management programs, emergency management, and environmental programs. This paper also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process.

  19. Pedestrian Evacuation Analysis for Tsunami Hazards

    Science.gov (United States)

    Jones, J. M.; Ng, P.; Wood, N. J.

    2014-12-01

    Recent catastrophic tsunamis in the last decade, as well as the 50th anniversary of the 1964 Alaskan event, have heightened awareness of the threats these natural hazards present to large and increasing coastal populations. For communities located close to the earthquake epicenter that generated the tsunami, strong shaking may also cause significant infrastructure damage, impacting the road network and hampering evacuation. There may also be insufficient time between the earthquake and first wave arrival to rely on a coordinated evacuation, leaving at-risk populations to self-evacuate on foot and across the landscape. Emergency managers evaluating these coastal risks need tools to assess the evacuation potential of low-lying areas in order to discuss mitigation options, which may include vertical evacuation structures to provide local safe havens in vulnerable communities. The U.S. Geological Survey has developed the Pedestrian Evacuation Analyst software tool for use by researchers and emergency managers to assist in the assessment of a community's evacuation potential by modeling travel times across the landscape and producing both maps of travel times and charts of population counts with corresponding times. The tool uses an anisotropic (directionally dependent) least cost distance model to estimate evacuation potential and allows for the variation of travel speed to measure its effect on travel time. The effectiveness of vertical evacuation structures on evacuation time can also be evaluated and compared with metrics such as travel time maps showing each structure in place and graphs displaying the percentage change in population exposure for each structure against the baseline. Using the tool, travel time maps and at-risk population counts have been generated for some coastal communities of the U.S. Pacific Northwest and Alaska. The tool can also be used to provide valuable decision support for tsunami vertical evacuation siting.
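
    The core of such a travel-time map can be sketched with a Dijkstra-style search over a cost grid, as below. This simplified version is isotropic (it ignores the directional slope and land-cover effects of the USGS anisotropic model) and uses an invented landscape, cell size, and walking speed purely for illustration.

```python
"""Sketch of a grid-based evacuation travel-time map (simplified, isotropic)."""
import heapq
import numpy as np

CELL_SIZE_M = 10.0          # grid resolution (assumed)
WALK_SPEED_MPS = 1.22       # assumed slow-walk travel speed

# 0 = walkable, 1 = impassable (water, buildings); toy 6x8 landscape.
blocked = np.array([
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
])
safe_cells = [(0, 7), (5, 7)]   # e.g., high ground or a vertical-evacuation structure

def travel_time_map(blocked, safe_cells, speed=WALK_SPEED_MPS):
    """Minimum time to safety for every cell, via Dijkstra from the safe zones."""
    rows, cols = blocked.shape
    time = np.full((rows, cols), np.inf)
    heap = [(0.0, r, c) for r, c in safe_cells]
    heapq.heapify(heap)
    for _, r, c in heap:
        time[r, c] = 0.0
    step_s = CELL_SIZE_M / speed
    while heap:
        t, r, c = heapq.heappop(heap)
        if t > time[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not blocked[nr, nc]:
                nt = t + step_s
                if nt < time[nr, nc]:
                    time[nr, nc] = nt
                    heapq.heappush(heap, (nt, nr, nc))
    return time

times = travel_time_map(blocked, safe_cells)
print(np.round(times / 60.0, 1))   # minutes to safety per cell (inf = unreachable)
```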

  20. Chemical hazards analysis of resilient flooring for healthcare.

    Science.gov (United States)

    Lent, Tom; Silas, Julie; Vallette, Jim

    2010-01-01

    This article addresses resilient flooring, evaluating the potential health effects of vinyl flooring and the leading alternatives (synthetic rubber, polyolefin, and linoleum) currently used in the healthcare marketplace. The study inventories chemicals incorporated as components of each of the four material types or involved in their life cycle as feedstocks, intermediary chemicals, or emissions. It then characterizes those chemicals using a chemical hazard-based framework that addresses persistence and bioaccumulation, human toxicity, and human exposures.

  1. Seismic Hazard Analysis and Uniform Hazard Spectra for Different Regions of Kerman

    Directory of Open Access Journals (Sweden)

    Gholamreza Ghodrati Amiri

    2015-09-01

    Full Text Available This paper presents a seismic hazard analysis and uniform hazard spectra for different regions of Kerman city. A catalogue containing both historical and instrumental events, covering the period from the 8th century AD until now within an area of 200 km radius, was used, and seismic sources were modeled. The Kijko method was applied for estimating the seismic parameters, considering the lack of suitable seismic data, the inaccuracy of the available information, and the uncertainty of magnitude in different periods. To determine the peak ground acceleration, the calculations were performed using the logic tree method. Two weighted attenuation relations were used: Ghodrati et al. (weight 0.6) and Zare et al. (weight 0.4). The analysis was conducted for 13×8 grid points over the Kerman region and adjacent areas with the SEISRISK III software, and the Ghodrati et al. spectral attenuation relationships were used to determine the seismic spectra.
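
    The logic-tree weighting of two attenuation relations can be illustrated as below. The functional forms and coefficients are invented placeholders (only the 0.6/0.4 weights echo the abstract), and in a full PSHA the weights would be applied to the branch hazard curves rather than to median predictions as done here for brevity.

```python
"""Sketch of logic-tree weighting of two attenuation (ground-motion) relations."""
import numpy as np

def gmpe_branch_a(m, r_km):
    """Placeholder relation standing in for the Ghodrati et al. branch (PGA in g)."""
    return np.exp(-3.2 + 0.95 * m - 1.0 * np.log(r_km + 10.0))

def gmpe_branch_b(m, r_km):
    """Placeholder relation standing in for the Zare et al. branch (PGA in g)."""
    return np.exp(-2.8 + 0.85 * m - 0.9 * np.log(r_km + 8.0))

# Logic-tree branches: (weight, ground-motion model).
branches = [(0.6, gmpe_branch_a), (0.4, gmpe_branch_b)]

def weighted_median_pga(m, r_km):
    """Weight-averaged median PGA across the logic-tree branches."""
    return sum(w * gmpe(m, r_km) for w, gmpe in branches)

for m, r in [(5.5, 20.0), (6.5, 20.0), (7.0, 50.0)]:
    print(f"M={m}, R={r} km -> weighted median PGA = {weighted_median_pga(m, r):.3f} g")
```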

  2. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, W.S.

    1994-08-23

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Upon further analysis, the thermal stabilization system has been installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision incorporates the recommendations provided from the original hazards analysis as well. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for Safety Classification of thermal stabilization equipment.

  3. IARC Monographs: 40 Years of Evaluating Carcinogenic Hazards to Humans

    Science.gov (United States)

    Blair, Aaron; Vineis, Paolo; Ahrens, Wolfgang; Andersen, Aage; Anto, Josep M.; Armstrong, Bruce K.; Baccarelli, Andrea A.; Beland, Frederick A.; Berrington, Amy; Bertazzi, Pier Alberto; Birnbaum, Linda S.; Brownson, Ross C.; Bucher, John R.; Cantor, Kenneth P.; Cardis, Elisabeth; Cherrie, John W.; Christiani, David C.; Cocco, Pierluigi; Coggon, David; Comba, Pietro; Demers, Paul A.; Dement, John M.; Douwes, Jeroen; Eisen, Ellen A.; Engel, Lawrence S.; Fenske, Richard A.; Fleming, Lora E.; Fletcher, Tony; Fontham, Elizabeth; Forastiere, Francesco; Frentzel-Beyme, Rainer; Fritschi, Lin; Gerin, Michel; Goldberg, Marcel; Grandjean, Philippe; Grimsrud, Tom K.; Gustavsson, Per; Haines, Andy; Hartge, Patricia; Hansen, Johnni; Hauptmann, Michael; Heederik, Dick; Hemminki, Kari; Hemon, Denis; Hertz-Picciotto, Irva; Hoppin, Jane A.; Huff, James; Jarvholm, Bengt; Kang, Daehee; Karagas, Margaret R.; Kjaerheim, Kristina; Kjuus, Helge; Kogevinas, Manolis; Kriebel, David; Kristensen, Petter; Kromhout, Hans; Laden, Francine; Lebailly, Pierre; LeMasters, Grace; Lubin, Jay H.; Lynch, Charles F.; Lynge, Elsebeth; ‘t Mannetje, Andrea; McMichael, Anthony J.; McLaughlin, John R.; Marrett, Loraine; Martuzzi, Marco; Merchant, James A.; Merler, Enzo; Merletti, Franco; Miller, Anthony; Mirer, Franklin E.; Monson, Richard; Nordby, Karl-Cristian; Olshan, Andrew F.; Parent, Marie-Elise; Perera, Frederica P.; Perry, Melissa J.; Pesatori, Angela Cecilia; Pirastu, Roberta; Porta, Miquel; Pukkala, Eero; Rice, Carol; Richardson, David B.; Ritter, Leonard; Ritz, Beate; Ronckers, Cecile M.; Rushton, Lesley; Rusiecki, Jennifer A.; Rusyn, Ivan; Samet, Jonathan M.; Sandler, Dale P.; de Sanjose, Silvia; Schernhammer, Eva; Costantini, Adele Seniori; Seixas, Noah; Shy, Carl; Siemiatycki, Jack; Silverman, Debra T.; Simonato, Lorenzo; Smith, Allan H.; Smith, Martyn T.; Spinelli, John J.; Spitz, Margaret R.; Stallones, Lorann; Stayner, Leslie T.; Steenland, Kyle; Stenzel, Mark; Stewart, Bernard W.; Stewart, Patricia A.; Symanski, Elaine; Terracini, Benedetto; Tolbert, Paige E.; Vainio, Harri; Vena, John; Vermeulen, Roel; Victora, Cesar G.; Ward, Elizabeth M.; Weinberg, Clarice R.; Weisenburger, Dennis; Wesseling, Catharina; Weiderpass, Elisabete; Zahm, Shelia Hoar

    2015-01-01

    Background: Recently, the International Agency for Research on Cancer (IARC) Programme for the Evaluation of Carcinogenic Risks to Humans has been criticized for several of its evaluations, and also for the approach used to perform these evaluations. Some critics have claimed that failures of IARC Working Groups to recognize study weaknesses and biases of Working Group members have led to inappropriate classification of a number of agents as carcinogenic to humans. Objectives: The authors of this Commentary are scientists from various disciplines relevant to the identification and hazard evaluation of human carcinogens. We examined criticisms of the IARC classification process to determine the validity of these concerns. Here, we present the results of that examination, review the history of IARC evaluations, and describe how the IARC evaluations are performed. Discussion: We concluded that these recent criticisms are unconvincing. The procedures employed by IARC to assemble Working Groups of scientists from the various disciplines and the techniques followed to review the literature and perform hazard assessment of various agents provide a balanced evaluation and an appropriate indication of the weight of the evidence. Some disagreement by individual scientists to some evaluations is not evidence of process failure. The review process has been modified over time and will undoubtedly be altered in the future to improve the process. Any process can in theory be improved, and we would support continued review and improvement of the IARC processes. This does not mean, however, that the current procedures are flawed. Conclusions: The IARC Monographs have made, and continue to make, major contributions to the scientific underpinning for societal actions to improve the public’s health. Citation: Pearce N, Blair A, Vineis P, Ahrens W, Andersen A, Anto JM, Armstrong BK, Baccarelli AA, Beland FA, Berrington A, Bertazzi PA, Birnbaum LS, Brownson RC, Bucher JR, Cantor KP

  4. Hazardous-waste analysis plan for LLNL operations

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan will address methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  5. Computer software for process hazards analysis.

    Science.gov (United States)

    Hyatt, N

    2000-10-01

    Computerized software tools are assuming major significance in conducting HAZOPs. This is because they have the potential to offer better online presentations and performance to HAZOP teams, as well as better documentation and downstream tracking. The chances of something being "missed" are greatly reduced. We know, only too well, that HAZOP sessions can be like the industrial equivalent of a trip to the dentist. Sessions can (and usually do) become arduous and painstaking. To make the process easier for all those involved, we need all the help computerized software can provide. In this paper I have outlined the challenges addressed in the production of Windows software for performing HAZOP and other forms of PHA. The object is to produce more "intelligent", more user-friendly software for performing HAZOP where technical interaction between team members is of key significance. HAZOP techniques, having already proven themselves, are extending into the field of computer control and human error. This makes further demands on HAZOP software and emphasizes its importance.

  6. Historical analysis of US pipeline accidents triggered by natural hazards

    Science.gov (United States)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA record. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  7. Application of systems and control theory-based hazard analysis to radiation oncology.

    Science.gov (United States)

    Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G

    2016-03-01

    Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called systems theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) are identified that can lead to the hazards as well as causal scenarios that can lead to the identified unsafe control. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning, and a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, which covered the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations on system safety and generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The method of STPA produces results that can be used to improve
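
    One of the early STPA steps, crossing each control action with the standard unsafe-control-action types to generate candidates for later screening, can be sketched as below. The controllers and actions are illustrative placeholders, not the control structure or the 83 unsafe control actions identified in the paper.

```python
"""Sketch: enumerating candidate unsafe control actions (UCAs) in STPA style."""
from dataclasses import dataclass
from itertools import product

# The four standard ways a control action can be unsafe in STPA.
UCA_TYPES = (
    "not provided when required",
    "provided when unsafe",
    "provided too early or too late",
    "stopped too soon or applied too long",
)

@dataclass
class ControlAction:
    controller: str
    action: str

# Illustrative placeholders for a radiotherapy workflow.
control_actions = [
    ControlAction("therapist", "start beam delivery"),
    ControlAction("planning system", "transfer treatment plan"),
    ControlAction("surface imaging system", "interrupt beam on motion"),
]

def candidate_ucas(actions):
    """Cross each control action with the four UCA types for later screening."""
    for ca, uca_type in product(actions, UCA_TYPES):
        yield f"{ca.controller}: '{ca.action}' {uca_type}"

for uca in candidate_ucas(control_actions):
    print(uca)
```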

  8. Seismic hazard analysis of Sinop province, Turkey using ...

    Indian Academy of Sciences (India)

    Using earthquakes of magnitude 4.0 and greater which occurred between 1 January 1900 and 31 December 2008 in the Sinop province of Turkey, this study presents a seismic hazard analysis based on probabilistic and statistical methods. According to the earthquake zonation map, Sinop is divided into first, second, third ...

  9. Environmental Impact and Hazards Analysis Critical Control Point ...

    African Journals Online (AJOL)

    Tsire is a local meat delicacy (kebab) in northern Nigeria, which has become popular and widely acceptable throughout the country and even beyond. Three production sites of tsire were evaluated for the environmental impact and hazard analysis critical control point (HACCP) on the microbiological and chemical qualities ...

  10. Development of Hazard Analysis Critical Control Points (HACCP ...

    African Journals Online (AJOL)

    Development of Hazard Analysis Critical Control Points (HACCP) and Enhancement of Microbial Safety Quality during Production of Fermented Legume Based ... Nigerian Food Journal ... Critical control points during production of iru and okpehe, two fermented condiments, were identified in four processors in Nigeria.

  11. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    Energy Technology Data Exchange (ETDEWEB)

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met.

  12. 14 CFR 417.223 - Flight hazard area analysis.

    Science.gov (United States)

    2010-01-01

    ... § 417.205(a) apply. The analysis must account for, at a minimum: (1) All trajectory times from liftoff to the planned safe flight state of § 417.219(c), including each planned impact, for an orbital... trajectory dispersion effects in the surface impact domain. (b) Public notices. A flight hazard areas...

  13. HADES: Microprocessor Hazard Analysis via Formal Verification of Parameterized Systems

    Directory of Open Access Journals (Sweden)

    Lukáš Charvát

    2016-12-01

    Full Text Available HADES is a fully automated verification tool for pipeline-based microprocessors that aims at flaws caused by improperly handled data hazards. It focuses on single-pipeline microprocessors designed at the register transfer level (RTL and deals with read-after-write, write-after-write, and write-after-read hazards. HADES combines several techniques, including data-flow analysis, error pattern matching, SMT solving, and abstract regular model checking. It has been successfully tested on several microprocessors for embedded applications.
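
    As background to the hazard classes HADES targets, the toy sketch below (not part of HADES itself, which works on RTL designs) classifies the read-after-write, write-after-write, and write-after-read conflicts between two in-flight instructions from their register read and write sets:

        def data_hazards(older, newer):
            """Classify data hazards between an older and a newer in-flight instruction.
            Each instruction is a dict with 'reads' and 'writes' register sets."""
            hazards = []
            if older["writes"] & newer["reads"]:
                hazards.append("RAW")   # newer reads a value the older one writes
            if older["writes"] & newer["writes"]:
                hazards.append("WAW")   # both write the same register
            if older["reads"] & newer["writes"]:
                hazards.append("WAR")   # newer overwrites a register the older still reads
            return hazards

        i1 = {"reads": {"r1", "r2"}, "writes": {"r3"}}   # r3 = r1 + r2
        i2 = {"reads": {"r3", "r4"}, "writes": {"r1"}}   # r1 = r3 * r4
        print(data_hazards(i1, i2))   # -> ['RAW', 'WAR']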

  14. Hazard analysis of Clostridium perfringens in the Skylab Food System

    Science.gov (United States)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
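
    The hazard analysis hinges on how quickly C. perfringens can multiply at a given holding temperature. As a hedged illustration of the underlying arithmetic only (the numbers are placeholders, not the Skylab measurements), exponential growth from an initial count follows N = N0 * 2^(t/g), where g is the measured generation time:

        def final_count(n0_cfu_per_g, hold_time_min, generation_time_min):
            """Cell count after exponential growth for hold_time_min,
            given a measured generation (doubling) time."""
            return n0_cfu_per_g * 2 ** (hold_time_min / generation_time_min)

        # Illustrative numbers only: 10 CFU/g held for 120 min with a 20-min generation time.
        print(final_count(10, 120, 20))   # -> 640.0 CFU/g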

  15. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    Science.gov (United States)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  16. Frequency Analysis of Aircraft hazards for License Application

    Energy Technology Data Exchange (ETDEWEB)

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.
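
    The excerpt does not state the crash-frequency model used in this analysis. One widely used generic screening form, shown here only as a hedged sketch with placeholder numbers (not necessarily the model adopted for the repository), multiplies the annual number of operations, the crash rate per operation per unit area, and the effective facility target area:

        def crash_frequency(flights_per_year, crash_rate_per_flight_per_sq_mile,
                            effective_area_sq_miles):
            """Generic screening estimate of annual crash frequency onto a facility:
            F = N * P * A (one common form; actual methodologies add more factors)."""
            return (flights_per_year
                    * crash_rate_per_flight_per_sq_mile
                    * effective_area_sq_miles)

        # Placeholder inputs only.
        f = crash_frequency(5_000, 4e-8, 0.02)
        print(f"{f:.1e} crashes per year")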

  17. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    Science.gov (United States)

    Klügel, J.

    2005-12-01

    Based on the background of the review of a large scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of Swiss nuclear power plants - the PEGASOS project (2000-2004) - challenges to seismic hazard analysis of critical infrastructures from the perspective of a professional safety analyst are discussed. The PEGASOS study was performed to provide a meaningful input for the update of the plant specific PRAs (Probabilistic Risk Assessment) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies to a large extent are driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the U.S.A. (SSHAC, 1997) for the treatment of uncertainties by the use of a structured expert elicitation process. The preliminary results derived from the project did show an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are: * The ambiguous solution of PSHA logic trees * The inadequate mathematical treatment of the results of expert elicitations based on the assumption of bias-free expert estimates * The problems associated with the "think model" of the separation of epistemic and aleatory uncertainties * The consequences of the ergodic assumption used to justify the transfer of attenuation equations of other regions to the region of interest. Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants as required by the U.S. NRC RG 1.165 will be evaluated. As a principal alternative for the development of a

  18. Choosing Appropriate Hazards Analysis Techniques For Your Process

    Science.gov (United States)

    1996-08-21

    ... Study (HAZOP); (v) Failure Mode and Effects Analysis (FMEA); (vi) Fault Tree Analysis; or (vii) an appropriate equivalent methodology.” The safety ... CFR 1910.119: Checklist; What-If; What-If/Checklist; Hazards and Operability Study (HAZOP); Fault Tree / Logic Diagram; Failure Modes and ... than the other methods and are more appropriate for a simple process. The HAZOP has found much use in the petroleum and chemical industries and the

  19. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    OpenAIRE

    Kristian Beckers; Jürgen Dürrwang; Dominik Holling

    2016-01-01

    The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied to...

  20. ANALYSIS OF INTERNAL SOURCES OF HAZARDS IN CIVIL AIR OPERATIONS

    Directory of Open Access Journals (Sweden)

    Katarzyna CHRUZIK

    2017-03-01

    Full Text Available International air law imposes an obligation on the part of transport operators to operationalize risk management, and hence develop records of hazards and estimate the level of risk in the respective organization. Air transport is a complex system combining advanced technical systems, operators and procedures. Sources of hazards occur in all of these closely related and mutually interacting areas, which operate in highly dispersed spaces with a short time horizon. A highly important element of risk management is therefore to identify sources of danger, not only in terms of their own internal risks (the source of threats and activation of threats within the same transport organization, but also in the area of common risk (sources of threats beyond the transport system to which the activation of the hazard is related and external risks (sources of threats outside the transport system. The overall risk management of a transport organization should consider all three risk areas. The paper presents an analysis of internal sources of threats to civil air operations and the resulting main risk areas. The article complements a previous paper by the same authors entitled “Analysis of external sources of hazards in civil air operations”.

  1. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    Science.gov (United States)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the subsequent damaging earthquakes in Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task, it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic-trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
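
    A minimal sketch of the kind of spatial kernel smoothing described, assuming an isotropic Gaussian kernel with a fixed bandwidth and an invented catalogue (the paper's actual kernel forms and bandwidth choices may differ), estimates the activity-rate density at a site by summing kernel contributions from past epicentres:

        import math

        def smoothed_rate(site, epicentres, bandwidth_km, years):
            """Annual activity-rate density (events / km^2 / yr) at `site`, from a
            catalogue of epicentres, using a 2-D Gaussian kernel of fixed bandwidth.
            Coordinates are in km on a local projected grid."""
            norm = 1.0 / (2.0 * math.pi * bandwidth_km ** 2)
            total = 0.0
            for ex, ey in epicentres:
                d2 = (site[0] - ex) ** 2 + (site[1] - ey) ** 2
                total += norm * math.exp(-d2 / (2.0 * bandwidth_km ** 2))
            return total / years

        # Invented catalogue: three epicentres observed over 50 years.
        catalogue = [(10.0, 12.0), (11.5, 9.0), (40.0, 35.0)]
        print(smoothed_rate((12.0, 11.0), catalogue, bandwidth_km=15.0, years=50.0))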

  2. Uncertainty analysis for seismic hazard in Northern and Central Italy

    Science.gov (United States)

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three branch points representing alternative values for the b-value, maximum magnitude (Mmax) and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli Region (~0.10 g) for PGA and in the Friuli and Central Apennines regions (~0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (~0.15 g) and PGA (~0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for 10% exceedance in 50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennine regions, around 20-30%, than the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
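
    The uncertainty propagation described, randomly sampling the logic-tree branches for b-value, Mmax and attenuation relationship and then summarising the spread of the resulting hazard, can be sketched as follows; the branch values, weights and the placeholder hazard function are invented for illustration and stand in for a full PSHA run:

        import random
        import statistics

        # Hypothetical logic-tree branches as (value, weight) pairs.
        b_values = [(0.9, 0.3), (1.0, 0.4), (1.1, 0.3)]
        m_max    = [(6.5, 0.5), (7.0, 0.5)]
        gmpes    = [("GMPE-A", 0.5), ("GMPE-B", 0.5)]

        def pick(branches):
            values, weights = zip(*branches)
            return random.choices(values, weights=weights, k=1)[0]

        def hazard_pga(b, mmax, gmpe):
            """Placeholder standing in for a full PSHA run; returns 10%-in-50-yr PGA (g)."""
            base = {"GMPE-A": 0.20, "GMPE-B": 0.26}[gmpe]
            return base * (mmax / 7.0) / b

        samples = [hazard_pga(pick(b_values), pick(m_max), pick(gmpes)) for _ in range(5000)]
        mean = statistics.mean(samples)
        cov = statistics.stdev(samples) / mean          # coefficient of variation
        print(f"mean PGA {mean:.3f} g, COV {cov:.1%}")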

  3. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    Science.gov (United States)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region’s seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely, bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ - deviation of ground motion to the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels of three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard of a given ground motion level, were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in the emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a few number of geographical locations for all return periods. Moreover, seismic hazard of most Algarve’s parishes is dominated by the seismicity located

  4. Hazardous Materials Routing Study Phase II: Analysis of Hazardous Materials Truck Routes in Proximity to the Dallas Central Business District

    Science.gov (United States)

    1985-10-01

    This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...

  5. Environmental risk analysis of hazardous material rail transportation.

    Science.gov (United States)

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials. Published by Elsevier B.V.
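
    The risk metric combines route- and product-specific probabilities with clean-up cost consequences. The sketch below is a heavily simplified, hypothetical version of a per-car-mile risk-cost calculation; the factor names and numbers are placeholders, not HMTECM parameters:

        def annual_risk_cost(car_miles, accidents_per_car_mile,
                             p_release_given_accident, expected_cleanup_cost):
            """Expected annual environmental risk cost (USD) for one product on one route."""
            expected_releases = (car_miles * accidents_per_car_mile
                                 * p_release_given_accident)
            return expected_releases * expected_cleanup_cost

        # Placeholder inputs: 2 million car-miles/yr, 1e-7 accidents per car-mile,
        # 5% release probability given an accident, $1.5M average clean-up cost.
        risk = annual_risk_cost(2e6, 1e-7, 0.05, 1.5e6)
        print(f"annual risk: ${risk:,.0f}  (${risk / 2e6:.4f} per car-mile)")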

  6. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    Science.gov (United States)

    This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  7. Mercury hazards from gold mining to humans, plants, and animals

    Science.gov (United States)

    Eisler, R.

    2004-01-01

    Mercury contamination of the environment from historical and ongoing mining practices that rely on mercury amalgamation for gold extraction is widespread. Contamination was particularly severe in the immediate vicinity of gold extraction and refining operations; however, mercury--especially in the form of water-soluble methylmercury--may be transported to pristine areas by rainwater, water currents, deforestation, volatilization, and other vectors. Examples of gold mining-associated mercury pollution are shown for Canada, the United States, Africa, China, the Philippines, Siberia, and South America. In parts of Brazil, for example, mercury concentrations in all abiotic materials, plants, and animals--including endangered species of mammals and reptiles--collected near ongoing mercury-amalgamation gold mining sites were far in excess of allowable mercury levels promulgated by regulatory agencies for the protection of human health and natural resources. Although health authorities in Brazil are unable to detect conclusive evidence of human mercury intoxication, the potential exists in the absence of mitigation for epidemic mercury poisoning of the mining population and environs. In the United States, environmental mercury contamination is mostly from historical gold mining practices, and portions of Nevada remain sufficiently mercury-contaminated to pose a hazard to reproduction of carnivorous fishes and fish-eating birds. Concentrations of total mercury lethal to sensitive representative natural resources range from 0.1 to 2.0 ug/L of medium for aquatic organisms; from 2200 to 31,000 ug/kg body weight (acute oral) and 4000 to 40,000 ug/kg (dietary) for birds; and from 100 to 500 ug/kg body weight (daily dose) and 1000 to 5000 ug/kg diet for mammals. Significant adverse sublethal effects were observed among selected aquatic species at water concentrations of 0.03 to 0.1 ug Hg/L. For some birds, adverse effects--mainly on reproduction--have been associated with total

  8. Contaminants in human nail dust: an occupational hazard in podiatry?

    Science.gov (United States)

    Tinley, Paul D; Eddy, Karen; Collier, Peter

    2014-02-20

    There has been limited literature indicating that podiatrists' health may be at risk from exposure to human nail dust. Previous studies carried out in the UK have shown that large amounts of dust become airborne during the human nail drilling procedure and are present in the air up to 10 hours after a clinical session. This increases the risk of Respiratory Tract (RT) infection for the practitioner. This study used a nasal swabbing technique and fungal culture to determine whether podiatrists (n = 50) had the same microbes present in their nasal cavities as a non-podiatry health professional control group (n = 45). All swabs were cultured, counted and identified for each subject. Survey data were also collected on the use and type of nail drill, the type of mask used and the frequency of mask change over a two-week period. The results showed podiatrists had a greater range of microbes in their nasal cavities, although the controls had greater overall numbers of organisms. The known pathogen and common mould Aspergillus fumigatus was the most commonly found fungus within the podiatric group, with 44% of the group having the fungus present. All nail drills used by the podiatrists had some form of dust extraction (except one). Of concern, 17% (n = 8) of the podiatrists did not use a mask at all whilst drilling and seemed unaware of any infection control issues. Simple disposable masks were the most frequently worn, with only half being changed after each patient, further increasing the cross-infection risk. The high level of Aspergillus contamination is a significant finding in the podiatry group, as this fungus is small enough to enter the tissue of the nasal cavity and, as a small particle, will stay airborne in the room for up to 16 hours. Aspergillus has been shown to cause brain and soft tissue tumours in extreme cases. The high levels of upper respiratory tract problems reported in the literature may well be caused by this fungal agent. The non-use and use of inappropriate masks by podiatrists is clearly an

  9. Geophysics in Mejillones Basin, Chile: Dynamic analysis and associatedseismic hazard

    Science.gov (United States)

    Maringue, J. I.; Yanez, G. A.; Lira, E.; Podestá, L., Sr.; Figueroa, R.; Estay, N. P.; Saez, E.

    2016-12-01

    The active margin of South America has a high seismogenic potential. In particular, the Mejillones Peninsula, located in northern Chile, represents a site of interest for seismic hazard due to a 100-year seismic gap, the potentially large site effects, and the presence of the most important port in the region. We perform a dynamic analysis of the zone from a spatial and petrophysical model of the Mejillones Basin, to understand its behavior under realistic seismic scenarios. The geometry and petrophysics of the basin were obtained from an integrated modeling of geophysical observations (gravity, seismic and electromagnetic data) distributed mainly in Pampa Mejillones, whose western edge is limited by the Mejillones Fault, oriented north-south. This regional-scale normal fault shows a half-graben geometry which controls the development of the Mejillones Basin eastwards. The gravimetric and magnetotelluric methods allow the geometry of the basin to be defined, through a cover/basement density contrast and the transition from very low to moderate electrical resistivities, respectively. The seismic method complements the petrophysics in terms of the shear-wave depth profile. The results show soil thicknesses of up to 700 meters in the deepest zone, with steeper slopes to the west and lower slopes to the east, in agreement with the normal-fault half-graben basin geometry. Along the N-S direction there are no great differences in basin depth, so the problem is almost two-dimensional. In terms of petrophysics, the sedimentary stratum is characterized by shear velocities between 300-700 m/s, extremely low electrical resistivities, below 1 ohm-m, and densities from 1.4 to 1.8 g/cc. The numerical simulation of seismic wave amplification gives values on the order of 0.8 g, which implies large surface damage. The results demonstrate a potential risk to Mejillones Bay from future events; therefore it is very important to generate mitigation policies for infrastructure and human settlements.

  10. Flood Hazard and Risk Analysis in Urban Area

    Science.gov (United States)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien

    2017-04-01

    Typhoons always induce heavy rainfall during the summer and autumn seasons in Taiwan. Extreme weather in recent years has often caused severe flooding, resulting in serious losses of life and property. With rapid industrial and commercial development, people care not only about the quality of life, but also about the safety of life and property. The impact of disasters on life and property is therefore the problem of greatest concern to residents. For the mitigation of disaster impacts, flood hazard and risk analysis plays an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung City was evaluated using statistics on social development factors. The hazard factor of Kaohsiung City was calculated from the simulated flood depths of six different return periods and four typhoon events which resulted in serious flooding in Kaohsiung City. The flood risk can then be obtained by combining the flood hazard and the social vulnerability. The analysis results provide the authorities with a basis for strengthening disaster preparedness and allocating more resources to high-risk areas.
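
    The study combines a flood-depth-based hazard factor with a social vulnerability factor into a risk estimate. A generic sketch of such a combination is given below; the normalisation and the district values are invented and are not the scheme actually used for Kaohsiung:

        def risk_index(flood_depth_m, vulnerability, max_depth_m=3.0):
            """Simple flood risk index in [0, 1]: normalised hazard times vulnerability.
            `vulnerability` is assumed to be pre-scaled to [0, 1]."""
            hazard = min(flood_depth_m / max_depth_m, 1.0)
            return hazard * vulnerability

        # Invented district values: (simulated flood depth in m, social vulnerability).
        districts = {"A": (0.6, 0.8), "B": (2.4, 0.3), "C": (1.5, 0.7)}
        for name, (depth, vul) in sorted(districts.items(),
                                         key=lambda kv: -risk_index(*kv[1])):
            print(name, round(risk_index(depth, vul), 2))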

  11. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Sevougian, S. David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  12. Mercury hazards from gold mining to humans, plants, and animals.

    Science.gov (United States)

    Eisler, Ronald

    2004-01-01

    Mercury contamination of the environment from historical and ongoing mining practices that rely on mercury amalgamation for gold extraction is widespread. Contamination was particularly severe in the immediate vicinity of gold extraction and refining operations; however, mercury, especially in the form of water-soluble methylmercury, may be transported to pristine areas by rainwater, water currents, deforestation, volatilization, and other vectors. Examples of gold mining-associated mercury pollution have been shown for Canada, the U.S., Africa, China, the Philippines, Siberia, and South America. In parts of Brazil, for example, mercury concentrations in all abiotic materials, plants, and animals, including endangered species of mammals and reptiles, collected near ongoing mercury amalgamation gold mining sites were far in excess of allowable mercury levels promulgated by regulatory agencies for the protection of human health and natural resources. Although health authorities in Brazil are unable to detect conclusive evidence of human mercury intoxication, the potential exists in the absence of mitigation for epidemic mercury poisoning of the mining population and environs. In the U.S., environmental mercury contamination is mostly from historical gold mining practices, and portions of Nevada remain sufficiently mercury contaminated to pose a hazard to reproduction of carnivorous fishes and fish-eating birds. Concentrations of total mercury lethal to sensitive representative natural resources range from 0.1 to 2.0 microg/L of medium for aquatic organisms; from 2,200 to 31,000 microg/kg BW (acute oral) and from 4,000 to 40,000 microg/kg (dietary) for birds; and from 100 to 500 microg/kg BW (daily dose) and from 1,000 to 5,000 microg/kg diet for mammals. Significant adverse sublethal effects were observed among selected aquatic species at water concentrations of 0.03-0.1 microg Hg/L. For some birds, adverse effects, mainly on reproduction, have been associated with

  13. Seismic Hazard analysis of Adjaria Region in Georgia

    Science.gov (United States)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of the probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., fixing a site-source distance that excludes from the calculation sources at great distance) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
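
    The first recurrence model mentioned, a truncated exponential Gutenberg-Richter magnitude distribution, gives the annual rate of events at or above a magnitude m between Mmin and Mmax. A sketch of that rate function with invented parameter values (not those of the Adjaria study) is:

        import math

        def gr_rate_above(m, a, b, m_min, m_max):
            """Annual rate of earthquakes with magnitude >= m under a truncated
            exponential Gutenberg-Richter model with parameters a, b on [m_min, m_max]."""
            if m >= m_max:
                return 0.0
            m = max(m, m_min)
            beta = b * math.log(10.0)
            rate_min = 10.0 ** (a - b * m_min)      # total rate of events above m_min
            num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
            den = 1.0 - math.exp(-beta * (m_max - m_min))
            return rate_min * num / den

        # Invented parameters: a = 4.0, b = 1.0, magnitude range 4.0-7.5.
        for mag in (5.0, 6.0, 7.0):
            print(mag, round(gr_rate_above(mag, a=4.0, b=1.0, m_min=4.0, m_max=7.5), 4))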

  14. Tree-ring analysis in natural hazards research - an overview

    Science.gov (United States)

    Stoffel, M.; Bollschweiler, M.

    2008-03-01

    The understanding of geomorphic processes and knowledge of past events are important tasks for the assessment of natural hazards. Tree rings have on varied occasions proved to be a reliable tool for the acquisition of data on past events. In this review paper, we provide an overview on the use of tree rings in natural hazards research, starting with a description of the different types of disturbances by geomorphic processes and the resulting growth reactions. Thereafter, a summary is presented on the different methods commonly used for the analysis and interpretation of reactions in affected trees. We illustrate selected results from dendrogeomorphological investigations of geomorphic processes with an emphasis on fluvial (e.g., flooding, debris flows) and mass-movement processes (e.g., landslides, snow avalanche), where lots of data have been generated over the past few decades. We also present results from rockfall and permafrost studies, where data are much scarcer, albeit data from tree-ring studies have proved to be of great value in these fields as well. Most studies using tree rings have focused on alpine environments in Europe and North America, whereas other parts of the world have been widely neglected by dendrogeomorphologists so far. We therefore challenge researchers to focus on other regions with distinct climates as well, to look on less frequently studied processes as well and to broaden and improve approaches and methods commonly used in tree-ring research so as to allow a better understanding of geomorphic processes, natural hazards and risk.

  15. Hazardous materials transportation: a risk-analysis-based routing methodology.

    Science.gov (United States)

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-07

    This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion about risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; then arc costs are defined in order to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results about shipments of ammonia are discussed and finally further research developments are proposed.
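
    The formulation described, a graph whose arcs carry costs (expenses plus risk-related costs) and capacities derived from risk criteria, solved as a minimum cost flow from origin to destination, can be sketched with a general-purpose graph library; the toy network, costs and capacities below are invented, and OPTIPATH itself is not reproduced here:

        import networkx as nx

        # Toy directed road network: each arc has an integer cost per vehicle
        # (out-of-pocket expenses plus risk-related cost) and a capacity
        # (maximum vehicles per year allowed by the risk criteria).
        G = nx.DiGraph()
        G.add_edge("origin", "A", weight=4, capacity=60)
        G.add_edge("origin", "B", weight=6, capacity=100)
        G.add_edge("A", "B", weight=1, capacity=40)
        G.add_edge("A", "destination", weight=7, capacity=60)
        G.add_edge("B", "destination", weight=3, capacity=100)

        # Ship 100 vehicles per year from origin to destination at minimum total cost.
        G.nodes["origin"]["demand"] = -100
        G.nodes["destination"]["demand"] = 100

        flow = nx.min_cost_flow(G)                  # honours capacities, minimises cost
        print(flow)
        print("total cost:", nx.cost_of_flow(G, flow))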

  16. Evaluation of an active learning module to teach hazard and risk in Hazard Analysis and Critical Control Points (HACCP) classes.

    Science.gov (United States)

    Oyarzabal, Omar A; Rowe, Ellen

    2017-04-01

    The terms hazard and risk are significant building blocks for the organization of risk-based food safety plans. Unfortunately, these terms are not clear for some personnel working in food manufacturing facilities. In addition, there are few examples of active learning modules for teaching adult participants the principles of hazard analysis and critical control points (HACCP). In this study, we evaluated the effectiveness of an active learning module to teach hazard and risk to participants of HACCP classes provided by the University of Vermont Extension in 2015 and 2016. This interactive module is comprised of a questionnaire; group playing of a dice game that we have previously introduced in the teaching of HACCP; the discussion of the terms hazard and risk; and a self-assessment questionnaire to evaluate the teaching of hazard and risk. From 71 adult participants that completed this module, 40 participants (56%) provided the most appropriate definition of hazard, 19 participants (27%) provided the most appropriate definition of risk, 14 participants (20%) provided the most appropriate definitions of both hazard and risk, and 23 participants (32%) did not provide an appropriate definition for hazard or risk. Self-assessment data showed an improvement in the understanding of these terms (P < 0.05). Thirty participants (42%) stated that the most valuable thing they learned with this interactive module was the difference between hazard and risk, and 40 participants (65%) responded that they did not attend similar presentations in the past. The fact that less than one third of the participants answered properly to the definitions of hazard and risk at baseline is not surprising. However, these results highlight the need for the incorporation of modules to discuss these important food safety terms and include more active learning modules to teach food safety classes. This study suggests that active learning helps food personnel better understand important food safety

  17. Evaluation of an active learning module to teach hazard and risk in Hazard Analysis and Critical Control Points (HACCP classes

    Directory of Open Access Journals (Sweden)

    Omar A. Oyarzabal

    2017-04-01

    Full Text Available The terms hazard and risk are significant building blocks for the organization of risk-based food safety plans. Unfortunately, these terms are not clear for some personnel working in food manufacturing facilities. In addition, there are few examples of active learning modules for teaching adult participants the principles of hazard analysis and critical control points (HACCP. In this study, we evaluated the effectiveness of an active learning module to teach hazard and risk to participants of HACCP classes provided by the University of Vermont Extension in 2015 and 2016. This interactive module is comprised of a questionnaire; group playing of a dice game that we have previously introduced in the teaching of HACCP; the discussion of the terms hazard and risk; and a self-assessment questionnaire to evaluate the teaching of hazard and risk. From 71 adult participants that completed this module, 40 participants (56% provided the most appropriate definition of hazard, 19 participants (27% provided the most appropriate definition of risk, 14 participants (20% provided the most appropriate definitions of both hazard and risk, and 23 participants (32% did not provide an appropriate definition for hazard or risk. Self-assessment data showed an improvement in the understanding of these terms (P < 0.05. Thirty participants (42% stated that the most valuable thing they learned with this interactive module was the difference between hazard and risk, and 40 participants (65% responded that they did not attend similar presentations in the past. The fact that less than one third of the participants answered properly to the definitions of hazard and risk at baseline is not surprising. However, these results highlight the need for the incorporation of modules to discuss these important food safety terms and include more active learning modules to teach food safety classes. This study suggests that active learning helps food personnel better understand important

  18. Hazard analysis of a computer based medical diagnostic system.

    Science.gov (United States)

    Chudleigh, M F

    1994-07-01

    Medical screening of sectors of the population is now a routine and vital part of health care: an example is cervical smear testing. There is currently significant interest in the possible introduction of semi-automated microscopy systems for cervical cytology and one such experimental system is now undergoing laboratory trials. A collaborative project has been set up to demonstrate the benefits and constraints that arise from applying safety-critical methods developed in other domains to such a diagnostic system. We have carried out a system hazard analysis, successfully using the HAZOP technique adapted from the petrochemical industry.

  19. Identifying nursing hazards in the emergency department: a new approach to nursing job hazard analysis.

    Science.gov (United States)

    Ramsay, Jim; Denny, Frank; Szirotnyak, Kara; Thomas, Jonathan; Corneliuson, Elizabeth; Paxton, Kim L

    2006-01-01

    It is widely acknowledged that nurses are crucial components of the healthcare system. In their roles, nurses are regularly confronted with a variety of biological, physical, and chemical hazards during the course of performing their duties. The safety of nurses themselves, and subsequently that of their patients, depends directly upon the degree to which nurses have knowledge of occupational hazards specific to their jobs and managerial mechanisms for mitigating those hazards. The level of occupational safety and health training resources available to nurses, as well as management support, are critical factors in preventing adverse outcomes from routine job-related hazards. This study will identify gaps in self-protective safety education for registered nurses working in emergency departments as well as for nursing students. Furthermore, this study reviews the nature and scope of occupational nursing hazards, and the degree to which current nursing education and position descriptions (or functional statements) equip nurses to recognize and address the hazards inherent in their jobs. This study has three parts. First, a literature review was performed to summarize the nature and scope of occupational nursing hazards. Second, the safety components of position descriptions from 29 Veterans Affairs (VA) hospitals across the United States were obtained and evaluated by an expert panel of occupational health nurses. Finally, an expert panel of occupational health nurses evaluated the degree to which nursing accreditation standards are integrated with OSHA's list of known emergency department hazards; and a separate expert panel of occupational health nurses evaluated the degree to which current VA emergency department nursing position descriptions incorporated hazard recognition and control strategies. Ultimately, prevention of job-related injuries for nurses, and subsequently their patients, will depend directly on the degree to which nurses can identify and control the

  20. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers

    2016-06-01

    Full Text Available The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.

  1. Fire hazard analysis for Plutonium Finishing Plant complex

    Energy Technology Data Exchange (ETDEWEB)

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7, [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, ''Fire Protection'' [DOE Order 5480.7A, 2/17/93] and addresses each of the sixteen principle elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA also complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  2. Natural hazards, disasters and human kind: Whither ecosystem management?

    Digital Repository Service at National Institute of Oceanography (India)

    Mascarenhas, A.; Mudholkar, A.V.

    disincentive to the construction programs. A major cause for the Himalayan flood was water charged with moraine deposits (boulders, sand and clays) which came hurtling down due to the breaking of the moraine lake barrier, located ~4 km uphill of Kedarnath... or erosion (Pilkey et al. 2000). Vegetated landforms have an inherent hazard-prevention value and hence reinforce the need to classify them as critical areas to be preserved. The most ideal low-risk development is the one that recognizes natural geological...

  3. Job Hazards Analysis Among A Group Of Surgeons At Zagazig ...

    African Journals Online (AJOL)

    Methods: A cross-sectional study was done on a random sample of surgeons working at Zagazig University teaching hospitals, who were evaluated for their job hazards using a quantitative hazard assessment questionnaire and calculation of a total hazard score for the job steps by a standardized risk assessment score, followed by expert panel ...

  4. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis.

    Science.gov (United States)

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

    Biomass gasification stations face many hazard factors; it is therefore necessary to carry out hazard assessments for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA-based hazard assessment, however, calculation of the connection degree (CD) requires the classification of hazard grades and their corresponding thresholds. For hazard assessment using ESPA, a novel calculation algorithm for the CD is worked out for the case in which hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked based on the value of the ED. In this paper, six biomass gasification stations are used to make hazard assessments using ESPA and general set pair analysis (GSPA), respectively. By comparison of the hazard assessment results obtained from ESPA and GSPA, the availability and validity of ESPA are demonstrated for the hazard assessment of biomass gasification stations. Meanwhile, the reasonability of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA.

  5. The urban environment, its hazards and human behaviour

    Directory of Open Access Journals (Sweden)

    Marko Polič

    1999-01-01

    Full Text Available The physical environment is only a tool, a medium or a place enabling human interrelations to develop. This is perhaps most evident in the case of the dangers people confront within an environment. Everything from disasters and minor incidents to vandalism and crime is reflected in human behaviour, from satisfying our basic needs all the way to discerning the sense of reality. The article presents an array of reflections, ranging from accidents and dangers in an urban environment that can hurt the largest number of people to less dangerous but unpleasant acts against an individual.

  6. IARC Monographs: 40 Years of Evaluating Carcinogenic Hazards to Humans

    NARCIS (Netherlands)

    Pearce, Neil; Blair, Aaron; Vineis, Paolo; Ahrens, Wolfgang; Andersen, Aage; Anto, Josep M.; Armstrong, Bruce K.; Baccarelli, Andrea A.; Beland, Frederick A.; Berrington, Amy; Bertazzi, Pier Alberto; Birnbaum, Linda S.; Brownson, Ross C.; Bucher, John R.; Cantor, Kenneth P.; Cardis, Elisabeth; Cherrie, John W.; Christiani, David C.; Cocco, Pierluigi; Coggon, David; Comba, Pietro; Demers, Paul A.; Dement, John M.; Douwes, Jeroen; Eisen, Ellen A.; Engel, Lawrence S.; Fenske, Richard A.; Fleming, Lora E.; Fletcher, Tony; Fontham, Elizabeth; Forastiere, Francesco; Frentzel-Beyme, Rainer; Fritschi, Lin; Gerin, Michel; Goldberg, Marcel; Grandjean, Philippe; Grimsrud, Tom K.; Gustavsson, Per; Haines, Andy; Hartge, Patricia; Hansen, Johnni; Hauptmann, Michael; Heederik, Dick; Hemminki, Kari; Hemon, Denis; Hertz-Picciotto, Irva; Hoppin, Jane A.; Huff, James; Jarvholm, Bengt; Kang, Daehee; Karagas, Margaret R.; Kjaerheim, Kristina; Kjuus, Helge; Kogevinas, Manolis; Kriebel, David; Kristensen, Petter; Kromhout, Hans; Laden, Francine; Lebailly, Pierre; LeMasters, Grace; Lubin, Jay H.; Lynch, Charles F.; Lynge, Elsebeth; 't Mannetje, Andrea; McMichael, Anthony J.; McLaughlin, John R.; Marrett, Loraine; Martuzzi, Marco; Merchant, James A.; Merler, Enzo; Merletti, Franco; Miller, Anthony; Mirer, Franklin E.; Monson, Richard; Nordby, Karl-Cristian; Olshan, Andrew F.; Parent, Marie-Elise; Perera, Frederica P.; Perry, Melissa J.; Pesatori, Angela Cecilia; Pirastu, Roberta; Porta, Miquel; Pukkala, Eero; Rice, Carol; Richardson, David B.; Ritter, Leonard; Ritz, Beate; Ronckers, Cecile M.; Rushton, Lesley; Rusiecki, Jennifer A.; Rusyn, Ivan; Samet, Jonathan M.; Sandler, Dale P.; de Sanjose, Silvia; Schernhammer, Eva; Costantini, Adele Seniori; Seixas, Noah; Shy, Carl; Siemiatycki, Jack; Silverman, Debra T.; Simonato, Lorenzo; Smith, Allan H.; Smith, Martyn T.; Spinelli, John J.; Spitz, Margaret R.; Stallones, Lorann; Stayner, Leslie T.; Steenland, Kyle; Stenzel, Mark; Stewart, Bernard W.; Stewart, Patricia A.; Symanski, Elaine; Terracini, Benedetto; Tolbert, Paige E.; Vainio, Harri; Vena, John; Vermeulen, Roel; Victora, Cesar G.; Ward, Elizabeth M.; Weinberg, Clarice R.; Weisenburger, Dennis; Wesseling, Catharina; Weiderpass, Elisabete; Zahm, Shelia Hoar

    2015-01-01

    Recently, the International Agency for Research on Cancer (IARC) Programme for the Evaluation of Carcinogenic Risks to Humans has been criticized for several of its evaluations, and also for the approach used to perform these evaluations. Some critics have claimed that failures of IARC Working

  7. IARC Monographs: 40 Years of Evaluating Carcinogenic Hazards to Humans

    NARCIS (Netherlands)

    Pearce, Neil E; Blair, Aaron; Vineis, Paolo; Ahrens, Wolfgang; Andersen, Aage; Anto, Josep M; Armstrong, Bruce K; Baccarelli, Andrea A; Beland, Frederick A; Berrington, Amy; Bertazzi, Pier A; Birnbaum, Linda S; Brownson, Ross C; Bucher, John R; Cantor, Kenneth P; Cardis, Elisabeth; Cherrie, John W; Christiani, David C; Cocco, Pierluigi; Coggon, David; Comba, Pietro; Demers, Paul A; Dement, John M; Douwes, Jeroen; Eisen, Ellen A; Engel, Lawrence S; Fenske, Richard A; Fleming, Lora E; Fletcher, Tony; Fontham, Elizabeth; Forastiere, Francesco; Frentzel-Beyme, Rainer; Fritschi, Lin; Gerin, Michel; Goldberg, Marcel; Grandjean, Philippe; Grimsrud, Tom K; Gustavsson, Per; Haines, Andy; Hartge, Patricia; Hansen, Johnni; Hauptmann, Michael; Heederik, Dick; Hemminki, Kari; Hemon, Denis; Hertz-Picciotto, Irva; Hoppin, Jane A; Huff, James; Jarvholm, Bengt; Kang, Daehee; Karagas, Margaret R; Kjaerheim, Kristina; Kjuus, Helge; Kogevinas, Manolis; Kriebel, David; Kristensen, Petter; Kromhout, Hans; Laden, Francine; Lebailly, Pierre; LeMasters, Grace; Lubin, Jay H; Lynch, Charles F; Lynge, Elsebeth; 't Mannetje, Andrea; McMichael, Anthony J; McLaughlin, John R; Marrett, Loraine; Martuzzi, Marco; Merchant, James A; Merler, Enzo; Merletti, Franco; Miller, Anthony; Mirer, Franklin E; Monson, Richard; Nordby, Karl-Kristian; Olshan, Andrew F; Parent, Marie-Elise; Perera, Frederica P; Perry, Melissa J; Pesatori, Angela C; Pirastu, Roberta; Porta, Miquel; Pukkala, Eero; Rice, Carol; Richardson, David B; Ritter, Leonard; Ritz, Beate; Ronckers, Cecile M; Rushton, Lesley; Rusiecki, Jennifer A; Rusyn, Ivan; Samet, Jonathan M; Sandler, Dale P; de Sanjose, Silvia; Schernhammer, Eva; Seniori Constantini, Adele; Seixas, Noah; Shy, Carl; Siemiatycki, Jack; Silvermann, Debra T; Simonato, Lorenzo; Smith, Allan H; Smith, Martyn T; Spinelli, John J; Spitz, Margaret R; Stallones, Lorann; Stayner, Leslie T; Steenland, Kyle; Stenzel, Mark; Stewart, Bernard W; Stewart, Patricia A; Symanski, Elaine; Terracini, Benedetto; Tolbert, Paige E; Vainio, Harri; Vena, John; Vermeulen, Roel; Victora, Cesar G; Ward, Elizabeth M; Weinberg, Clarice R; Weisenburger, Dennis; Wesseling, Catharina; Weiderpass, Elisabete; Zahm, Shelia H

    2015-01-01

    BACKGROUND: Recently the International Agency for Research on Cancer (IARC) Programme for the Evaluation of Carcinogenic Risks to Humans has been criticized for several of its evaluations, and also the approach used to perform these evaluations. Some critics have claimed that IARC Working Groups'

  8. An Approach to Human Error Hazard Detection of Unexpected Situations in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sangjun; Oh, Yeonju; Shin, Youmin; Lee, Yong-Hee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The Fukushima accident is a typical complex event that included extreme situations induced by the succeeding earthquake, tsunami, explosion, and human errors. From a human engineering point of view, its causes are judged to include incomplete system build-up as well as deficiencies in the response manuals, education and training, team capability, and the discharge of operator duties. In particular, the guidelines of currently operating NPPs do not sufficiently include countermeasures to human errors in extreme situations. Therefore, this paper describes a trial to detect the hazards of human errors in extreme situations, and to define countermeasures that allow an individual, team, organization, or other working entity encountering an extreme situation in NPPs to respond properly to those human error hazards. We propose an approach to analyzing and extracting human error hazards in order to suggest additional countermeasures to human errors in unexpected situations. These might be utilized to develop contingency guidelines, especially for reducing human error accidents in NPPs. However, the trial application in this study is currently limited, since it is not easy to find accident cases described in enough detail to enumerate the proposed steps. Therefore, we will try to analyze as many cases as possible, and consider other environmental factors and human error conditions.

  9. Radiation -- A Cosmic Hazard to Human Habitation in Space

    Science.gov (United States)

    Lewis, Ruthan; Pellish, Jonathan

    2017-01-01

    Radiation exposure is one of the greatest environmental threats to the performance and success of human and robotic space missions. Radiation permeates all space and aeronautical systems, challenges optimal and reliable performance, and tests survival and survivability. We will discuss the broad scope of research, technological, and operational considerations to forecast and mitigate the effects of the radiation environment for deep space and planetary exploration.

  10. An Update on the Hazards and Risks of Forensic Anthropology, Part I: Human Remains.

    Science.gov (United States)

    Roberts, Lindsey G; Dabbs, Gretchen R; Spencer, Jessica R

    2016-01-01

    This work reviews the hazards and risks of practicing forensic anthropology in North America, with a focus on pathogens encountered through contact with unpreserved human remains. Since the publication of Galloway and Snodgrass' seminal paper concerning the hazards of forensic anthropology, research has provided new information about known pathogen hazards, and regulating authorities have updated recommendations for the recognition and treatment of several infections. Additionally, forensic anthropology has gained popularity, exposing an increased number of students and practitioners to these hazards. Current data suggest many occupational exposures to blood or body fluids go unreported, especially among students, highlighting the need for this discussion. For each pathogen and associated disease, this work addresses important history, reviews routes of exposure, provides an overview of symptoms and treatments, lists decontamination procedures, and presents data on postmortem viability. Personal protection and laboratory guidelines should be established and enforced in conjunction with the consideration of these data. © 2015 American Academy of Forensic Sciences.

  11. Natural hazard modeling and uncertainty analysis [Chapter 2]

    Science.gov (United States)

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  12. Spatial prediction of landslide hazard using discriminant analysis and GIS

    Science.gov (United States)

    Peter V. Gorsevski; Paul Gessler; Randy B. Foltz

    2000-01-01

    Environmental attributes relevant for the spatial prediction of landslides triggered by rain and snowmelt events were derived from a digital elevation model (DEM). Those data, in conjunction with statistical methods and a geographic information system (GIS), provided a detailed basis for spatial prediction of landslide hazard. The spatial prediction of landslide hazard in this paper is...
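
    The workflow sketched in this record can be illustrated with a minimal example: DEM-derived terrain attributes (slope, curvature, contributing area) for mapped cells are used to fit a discriminant function separating landslide from non-landslide cells, and the fitted probabilities are then mapped back in the GIS as a hazard layer. The attribute values below are hypothetical, and scikit-learn's LinearDiscriminantAnalysis merely stands in for whatever statistical package the authors actually used.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Hypothetical DEM-derived attributes per grid cell:
      # [slope (deg), profile curvature, ln(contributing area)]
      X = np.array([
          [32.0, -0.15, 6.1],   # observed landslide cells
          [28.5, -0.08, 5.7],
          [35.2, -0.22, 6.8],
          [12.0,  0.05, 3.2],   # stable (non-landslide) cells
          [ 8.4,  0.11, 2.9],
          [15.6,  0.02, 3.8],
      ])
      y = np.array([1, 1, 1, 0, 0, 0])  # 1 = landslide, 0 = stable

      lda = LinearDiscriminantAnalysis()
      lda.fit(X, y)

      # Probability of landslide occurrence for two unmapped cells;
      # these scores would be written back to the GIS grid as a hazard layer.
      new_cells = np.array([[30.0, -0.10, 6.0], [10.0, 0.07, 3.0]])
      print(lda.predict_proba(new_cells)[:, 1])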

  13. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Science.gov (United States)

    2010-01-01

    ... either the methodology provided in the Risk Management Plan (RMP) Offsite Consequence Analysis Guidance..., App. I. Appendix I to Part 417—Methodologies for Toxic Release Hazard Analysis and Operational Procedures. I417.1 General: This appendix provides methodologies for performing toxic release hazard analysis...

  14. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in the PLCs and FPGAs used to develop I and C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and states that software hazard analysis should be performed at each phase of the software life cycle, such as requirements analysis, design, detailed design, and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430 and is a useful technique for working with guide phrases; it is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC software development in Korean nuclear power plants, where appropriate guide phrases and analysis processes were selected for efficient application and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches, NUREG/CR-6430 and HAZOP with general guide words, and we also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with HAZOP using general guide words, and it is sufficiently applicable to analyzing the software requirements specification of an FPGA.
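
    As a rough illustration of the guide-phrase idea (not the actual NUREG/CR-6430 phrase set, which is not reproduced here), the sketch below crosses a few generic HAZOP guide words with attributes of a software requirement to generate deviation questions for reviewers. The guide words, attributes, and the requirement identifier are placeholders.

      # Minimal sketch: generate HAZOP-style deviation prompts for a software
      # requirement by combining generic guide words with requirement attributes.
      GUIDE_WORDS = ["NO", "MORE", "LESS", "AS WELL AS", "OTHER THAN", "EARLY", "LATE"]
      ATTRIBUTES = ["input value", "output value", "timing", "triggering condition"]

      def deviation_prompts(requirement_id):
          """Return one review question per guide word / attribute combination."""
          return [
              f"{requirement_id}: what if the {attr} is '{gw}' with respect to the specification?"
              for gw in GUIDE_WORDS
              for attr in ATTRIBUTES
          ]

      # Hypothetical requirement identifier; print the first few prompts.
      for prompt in deviation_prompts("SRS-FPGA-017")[:5]:
          print(prompt)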

  15. A landslide on a mudslide? Natural hazards and the right to life under the European Convention of Human Rights

    DEFF Research Database (Denmark)

    Lauta, Kristian Cedervall; Rytter, Jens Elo

    2016-01-01

    This paper investigates the protection of individuals' lives against natural hazards under the European Convention on Human Rights. In 2008, the European Court of Human Rights decided to include natural hazards in a well-established doctrine developed to protect individuals from life-threatening industrial hazards, while allowing States an especially broad margin of appreciation with regard to natural hazards. Drawing on contemporary disaster theory, the article examines whether and to what extent the Court's distinction between natural and industrial hazards can be maintained. The article proposes...

  16. Hazard analysis and critical control point (HACCP) for an ultrasound food processing operation.

    Science.gov (United States)

    Chemat, Farid; Hoarau, Nicolas

    2004-05-01

    Emerging technologies, such as ultrasound (US), used for food and drink production can introduce hazards to product safety. Classical quality control methods are inadequate to control these hazards. Hazard analysis and critical control points (HACCP) is the most secure and cost-effective method for controlling possible product contamination or cross-contamination due to physical or chemical hazards during production. The following case study on the application of HACCP to an ultrasound food-processing operation demonstrates how the hazards at the critical control points of the process are effectively controlled through the implementation of HACCP.

  17. Hazard Analysis and Disaster Preparedness in the Fairbanks North Star Borough, Alaska using Hazard Simulations, GIS, and Network Analysis

    Science.gov (United States)

    Schaefer, K.; Prakash, A.; Witte, W.

    2011-12-01

    The Fairbanks North Star Borough (FNSB) lies in interior Alaska, an area dominated by a semiarid, boreal forest climate. FNSB frequently witnesses flooding events, wildland fires, earthquakes, extreme winter storms, and other natural and man-made hazards. With a large area of 19,065 km2 and a population of approximately 97,000 residents, providing emergency services in a timely manner is a challenge. With only four highways going in and out of the borough, and only two of those leading to another city, most residents do not have quick access to a main road. Should a major disaster occur and block one of the two highways, options for evacuating or getting supplies to the area quickly dwindle. We present the design of a Geographic Information System (GIS) and network analysis based decision support tool that we have created for planning and emergency response. This tool will be used by Emergency Services (Fire/EMS), Emergency Management, the Hazardous Materials Team, and Law Enforcement Agencies within FNSB to prepare for and respond to a variety of potential disasters. The GIS combines available road and address networks from different FNSB agencies with the 2010 census data. We used ESRI's ArcGIS and FEMA's HAZUS-MH software to run multiple disaster scenarios and create several evacuation and response plans. Network analysis was used to determine response times and to classify the borough by response time to facilitate allocation of emergency resources. The resulting GIS database can be used by any responding agency in FNSB to determine possible evacuation routes, where to open evacuation centers, placement of resources, and emergency response times. We developed a specific emergency response plan for three common scenarios: (i) a major wildfire threatening Fairbanks, (ii) a major earthquake, and (iii) loss of power during flooding in a flood-prone area. We also combined the network analysis results with high resolution imagery and elevation data to determine
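
    The response-time classification described above can be approximated with a small graph model: road segments become weighted edges (travel time in minutes), and shortest-path times from a station to each address node stand in for response times. The station and node names below are invented, and the networkx library is used here only as a stand-in for the ArcGIS Network Analyst workflow the authors describe.

      import networkx as nx

      # Hypothetical road network: edge weights are travel times in minutes.
      G = nx.Graph()
      G.add_weighted_edges_from([
          ("station_A", "junction_1", 4.0),
          ("junction_1", "neighborhood_X", 6.5),
          ("junction_1", "junction_2", 3.0),
          ("junction_2", "neighborhood_Y", 9.0),
          ("station_A", "junction_2", 8.5),
      ])

      # Shortest travel time from the station to every reachable node.
      times = nx.single_source_dijkstra_path_length(G, "station_A", weight="weight")

      # Classify service areas by response-time band, as in the decision support tool.
      for node, t in times.items():
          band = "<5 min" if t < 5 else "5-10 min" if t < 10 else ">10 min"
          print(f"{node}: {t:.1f} min ({band})")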

  18. Analysis of hazardous biological material by MALDI mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    KL Wahl; KH Jarman; NB Valentine; MT Kingsley; CE Petersen; ST Cebula; AJ Saenz

    2000-03-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-MS) has become a valuable tool for analyzing microorganisms. The speed with which data can be obtained from MALDI-MS makes this a potentially important tool for biological health hazard monitoring and forensic applications. The excitement in the mass spectrometry community in this potential field of application is evident by the expanding list of research laboratories pursuing development of MALDI-MS for bacterial identification. Numerous research groups have demonstrated the ability to obtain unique MALDI-MS spectra from intact bacterial cells and bacterial cell extracts. The ability to differentiate strains of the same species has been investigated. Reproducibility of MALDI-MS spectra from bacterial species under carefully controlled experimental conditions has also been demonstrated. Wang et al. have reported on interlaboratory reproducibility of the MALDI-MS analysis of several bacterial species. However, there are still issues that need to be addressed, including the careful control of experimental parameters for reproducible spectra and selection of optimal experimental parameters such as solvent and matrix.

  19. Standard hazard analysis, critical control point and hotel management

    Directory of Open Access Journals (Sweden)

    Vujačić Vesna

    2017-01-01

    Full Text Available Tourism is a dynamic category which is continuously evolving in the world. The specificities that have to be respected in execution, in relation to the rest of the food industry, stem from the food serving procedures in catering, the numerous complex recipes and production technologies, staff fluctuation, and old equipment. For an effective and permanent implementation, the HACCP concept requires a solid base; in this case, the base is represented by the people handling the food. This paper presents the international ISO standards, the concept of HACCP and the importance of its application in the tourism and hospitality industry. The HACCP concept is a food safety management system based on the analysis and control of biological, chemical and physical hazards in the entire process, from raw material production, procurement and handling, to manufacturing, distribution and consumption of the finished product. The aim of this paper is to present the importance of the application of the HACCP concept in tourism and hotel management as a recognizable international standard.

  20. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, H; Johnson, G

    2001-07-20

    The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in Table 4-2 of the HEAF SAR (ref. 1). Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards at HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree; bold downward branches indicate paths leading to the accident. The basic causes include conditions, failures of administrative controls (procedural or human error events), or failures of engineered controls (hardware, software or equipment failures) that singly or in combination can cause an accident to occur. Event tree analysis is useful since it can display the minimum number of events needed to cause an accident. Event trees can also address statistical dependency of events, such as a sequence of human error events performed by the same operator; in this case, dependent probabilities are used. Probabilities or frequencies are assigned to each branch. Another example of dependency is when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event considered in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception may be when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual frequency of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
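
    The quantification rule described above (initiating-event frequency multiplied by the probabilities on each subsequent branch) reduces to a simple product; the numbers below are purely illustrative and are not taken from the HEAF SAR.

      # Illustrative event tree branch for one accident sequence:
      # initiating-event frequency (per year) followed by conditional
      # failure probabilities of the administrative and engineered controls.
      operation_frequency = 50.0           # operations per year (hypothetical)
      p_admin_control_fails = 1.0e-2       # procedural / human error
      p_engineered_control_fails = 1.0e-3  # hardware or software failure

      # Annual frequency of the end state = product along the bold branch.
      end_state_frequency = operation_frequency * p_admin_control_fails * p_engineered_control_fails
      print(f"End-state frequency: {end_state_frequency:.1e} per year")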

  1. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While STPA has proven to be very effective on real systems, no formal structure has been defined for it, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety requirements and other functional model-based requirements during early development of the system.

  2. Application of Bayesian networks for hazard ranking of nanomaterials to support human health risk assessment

    NARCIS (Netherlands)

    Marvin, Hans J.P.; Bouzembrak, Yamine; Janssen, Esmée M.; Zande, van der Meike; Murphy, Finbarr; Sheehan, Barry; Mullins, Martin; Bouwmeester, Hans

    2017-01-01

    In this study, a Bayesian Network (BN) was developed for the prediction of the hazard potential and biological effects, with a focus on metal- and metal-oxide nanomaterials, to support human health risk assessment. The developed BN captures the (inter)relationships between the exposure route, the
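
    Although the record is truncated, the basic mechanics of such a BN can be sketched with a toy two-node example in which the hazard potential is conditioned on the exposure route and inference is done by enumeration. The variables, states, and probabilities below are invented for illustration and are not taken from the published model.

      # Toy Bayesian network: ExposureRoute -> HazardPotential.
      # Prior over exposure routes (hypothetical values).
      p_route = {"inhalation": 0.5, "oral": 0.3, "dermal": 0.2}

      # Conditional probability of a "high" hazard potential given the route.
      p_high_given_route = {"inhalation": 0.6, "oral": 0.3, "dermal": 0.1}

      # Marginal probability of a high hazard potential (sum over routes).
      p_high = sum(p_route[r] * p_high_given_route[r] for r in p_route)

      # Posterior over the exposure route given that a high hazard was observed
      # (Bayes' rule), e.g. to see which route drives the ranking.
      posterior = {r: p_route[r] * p_high_given_route[r] / p_high for r in p_route}

      print(f"P(hazard = high) = {p_high:.2f}")
      print({r: round(p, 2) for r, p in posterior.items()})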

  3. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    Science.gov (United States)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses especially inform decision makers in the insurance industry, the administration, and politicians about potential consequences, and they form the basis for appropriate risk management strategies. Thereby, results based on (i) an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The first type of analysis is based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied to extreme, non-linear, stochastic events. Focusing especially on the needs of insurance companies, the first approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio were the basis for estimating spatially, socio-economically and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  4. Open Source Seismic Hazard Analysis Software Framework (OpenSHA)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — OpenSHA is an effort to develop object-oriented, web- & GUI-enabled, open-source, and freely available code for conducting Seismic Hazard Analyses (SHA). Our...

  5. ANALYSIS OF INTERNAL SOURCES OF HAZARDS IN CIVIL AIR OPERATIONS

    OpenAIRE

    Katarzyna CHRUZIK; Karolina WIŚNIEWSKA; Radosław FELLNER

    2017-01-01

    International air law imposes an obligation on the part of transport operators to operationalize risk management, and hence develop records of hazards and estimate the level of risk in the respective organization. Air transport is a complex system combining advanced technical systems, operators and procedures. Sources of hazards occur in all of these closely related and mutually interacting areas, which operate in highly dispersed spaces with a short time horizon. A highly important element o...

  6. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    Science.gov (United States)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
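
    One of the key quantities mentioned above, the probability of exceeding a given intensity level within an exposure time, is commonly obtained from an annual exceedance rate under a Poisson assumption. The sketch below shows only that conversion, for hypothetical rates; it glosses over the source aggregation and uncertainty treatment discussed in the paper.

      import math

      def poisson_exceedance_probability(annual_rate, exposure_time_years):
          """P(at least one exceedance in the exposure time) for a Poisson process."""
          return 1.0 - math.exp(-annual_rate * exposure_time_years)

      # Hypothetical annual rates of exceeding two run-up thresholds at a target site.
      rates = {"run-up > 1 m": 1.0e-2, "run-up > 5 m": 5.0e-4}

      for label, rate in rates.items():
          p50 = poisson_exceedance_probability(rate, 50.0)
          print(f"{label}: {p50:.3f} probability of exceedance in 50 years")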

  7. The importance of source area mapping for rockfall hazard analysis

    Science.gov (United States)

    Valagussa, Andrea; Frattini, Paolo; Crosta, Giovanni B.

    2013-04-01

    A problem in the characterization of the area affected by rockfall is the correct definition of the source areas. Different positions or different sizes of the source areas along a cliff result in different possibilities of propagation and different interactions with the passive countermeasures present in the area. Through the use of Hy-Stone (Crosta et al., 2004), a code able to perform 3D numerical modeling of rockfall processes, different types of source areas were tested on a case study slope along the western flank of Mt. de La Saxe (Courmayeur, AO), developing between 1200 and 2055 m a.s.l. The first set of source areas consists of unstable rock masses identified on the basis of field surveys and Terrestrial Laser Scanning (IMAGEO, 2011). A second set of source areas has been identified by using different thresholds of slope gradient; we tested slope thresholds between 50° and 75° at 5° intervals. The third source area dataset has been generated by performing a kinematic stability analysis. For this analysis, we mapped the joint sets along the rocky cliff by means of the software COLTOP 3D (Jaboyedoff, 2004), and then we identified the portions of the rocky cliff where planar/wedge and toppling failures are possible, assuming an average friction angle of 35°. Through the outputs of the Hy-Stone models we extracted and analyzed the kinetic energy, flight height and velocity of the blocks falling along the rocky cliff in order to compare the controls of the different source areas. We observed strong variations of kinetic energy and flight height among the different models, especially when using unstable masses identified through Terrestrial Laser Scanning. This is mainly related to the size of the blocks identified as susceptible to failure. On the contrary, the slope gradient thresholds do not have a strong impact on rockfall propagation. This contribution highlights the importance of a careful and appropriate mapping of rockfall source areas for rockfall hazard analysis and the

  8. Analysis of hazardous substances released during CFRP laser processing

    Science.gov (United States)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential regarding resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as the aircraft, automotive and wind energy industries. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are associated with notable tool wear. On the other hand, thermal processing methods are critical, as the two components, matrix and reinforcement, have widely differing thermophysical properties, possibly leading to damage of the composite structure in terms of pores or delamination. An emerging innovative method for the processing of CFRP materials is laser technology. As a principally thermal method, laser processing is associated with the release of potentially hazardous gaseous and particulate substances. Detailed knowledge of these process emissions is the basis for ensuring the protection of people and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) within the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  9. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    Energy Technology Data Exchange (ETDEWEB)

    FRANZ GR; MEICHLE RH

    2011-07-18

    This 'What/If' Hazards Analysis addresses natural phenomena hazards (NPH) and external events affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS) at the preliminary design stage. In addition, it addresses the hazards of the operational sequence steps for the mechanical handling operations to prepare the Sludge Transport and Storage Container (STSC), disconnect the STSC, and prepare the STSC and Sludge Transport System (STS) for shipping.

  10. Identification of potentially hazardous human gene products in GMO risk assessment.

    Science.gov (United States)

    Bergmans, Hans; Logie, Colin; Van Maanen, Kees; Hermsen, Harm; Meredyth, Michelle; Van Der Vlugt, Cécile

    2008-01-01

    Genetically modified organisms (GMOs), e.g. viral vectors, could threaten the environment if by their release they spread hazardous gene products. Even in contained use, to prevent adverse consequences, viral vectors carrying genes from mammals or humans should be especially scrutinized as to whether gene products that they synthesize could be hazardous in their new context. Examples of such potentially hazardous gene products (PHGPs) are: protein toxins, products of dominant alleles that have a role in hereditary diseases, gene products and sequences involved in genome rearrangements, gene products involved in immunomodulation or with an endocrine function, gene products involved in apoptosis, activated proto-oncogenes. For contained use of a GMO that carries a construct encoding a PHGP, the precautionary principle dictates that safety measures should be applied on a "worst case" basis, until the risks of the specific case have been assessed. The potential hazard of cloned genes can be estimated before empirical data on the actual GMO become available. Preliminary data may be used to focus hazard identification and risk assessment. Both predictive and empirical data may also help to identify what further information is needed to assess the risk of the GMO. A two-step approach, whereby a PHGP is evaluated for its conceptual dangers, then checked by data bank searches, is delineated here.

  11. Occupational hazards control of hazardous substances in clean room of semiconductor manufacturing plant using CFD analysis.

    Science.gov (United States)

    Li, Jianfeng; Zhou, Ya-Fei

    2015-02-01

    The manufacturing processes in chip industries are complex, and many kinds of raw materials and solvents of different natures are used, most of which are highly toxic and dangerous. During machine preventive maintenance periods, these toxic and harmful substances can escape from the sealed reaction chamber into the clean workshop environment and endanger the health of on-site workers, resulting in occupational diseases. From the perspective of prevention, the spread and prediction of hydrochloric acid (HCl) escaping from the metal-etching chamber during maintenance were studied in this article. Computational fluid dynamics technology was used for a three-dimensional numerical simulation of the indoor air velocity field and the HCl concentration field, and the simulation results were then compared with on-site monitoring data to verify their correctness and feasibility. The occupational hazards and control measures were analyzed based on the numerical simulation, and the optimal control measure was obtained. Using ambient air analysis to assess occupational exposure can provide a new approach for occupational health research in the integrated circuit industry and has both theoretical and practical significance. © The Author(s) 2012.

  12. Morphometric and landuse analysis: implications on flood hazards ...

    African Journals Online (AJOL)

    This study assessed the morphometric, landuse and lithological attributes of five basins (Iwaraja, Ilesa, Olupona, Osogbo I and Osogbo II) with particular reference to flood hazards in Ilesa and Osogbo metropolis, Osun State Nigeria. Ilesa town is situated within Iwaraja and Ilesa basins while Osogbo metropolis spread ...

  13. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Savy, J.

    1989-10-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at the DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site, and in the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs.

  14. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, Robert D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
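
    A minimal version of the probabilistic treatment described here is to sample uncertain influent and performance parameters, compute the achieved effluent quality for each sample, and report the fraction of samples meeting the treatment target as the reliability. The distributions and discharge limit below are generic placeholders, not values from the study.

      import numpy as np

      rng = np.random.default_rng(42)
      n_trials = 100_000

      # Hypothetical lognormal influent concentration (mg/L) and normally
      # distributed removal efficiency for a single treatment process.
      influent = rng.lognormal(mean=np.log(10.0), sigma=0.4, size=n_trials)
      removal = np.clip(rng.normal(loc=0.95, scale=0.02, size=n_trials), 0.0, 1.0)

      effluent = influent * (1.0 - removal)
      limit = 1.0  # hypothetical discharge limit, mg/L

      # Reliability = probability that the effluent meets the limit.
      reliability = np.mean(effluent <= limit)
      print(f"Estimated reliability: {reliability:.3f}")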

  15. Human and organizational factors in Chinese hazardous chemical accidents: a case study of the '8.12' Tianjin Port fire and explosion using the HFACS-HC.

    Science.gov (United States)

    Zhou, Lin; Fu, Gui; Xue, Yujingyang

    2017-11-07

    Human and organizational factors have been proven to be the prime causes of Chinese hazardous chemical accidents (HCAs). A modified version of the Human Factors Analysis and Classification System (HFACS), namely the HFACS-Hazardous Chemicals (HC), was developed to identify the human factors involved in Chinese HCAs. The '8.12' Tianjin Port fire and explosion, the costliest HCA in recent years, was reanalyzed using this framework, and the results were compared with the official accident inquiry report to determine their differences related to the identification of human and organizational factors. The study revealed that interacting human factors from different levels in Ruihai Company led to this catastrophe, and the inquiry report had limitations in the identification of human factors and the guidance for similar accident prevention. This study showed the applicability of the HFACS-HC in HCA analyses as well as the necessity to recommend this approach for future HCA investigations.

  16. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2010-07-14

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... 0910-0466)--Extension FDA's regulations in part 120 (21 CFR part 120) mandate the application of HACCP procedures to fruit and vegetable juice processing. HACCP is a preventative system of hazard control that can...

  17. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2013-11-20

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... 0910-0466)--Extension FDA regulations in part 120 (21 CFR part 120) mandate the application of HACCP principles to the processing of fruit and vegetable juices. HACCP is a preventive system of hazard control...

  18. Hazard Analysis of Arid and Semi-Arid (ASAL) Regions of Kenya ...

    African Journals Online (AJOL)

    There is a need to undertake a comprehensive hazard and vulnerability analysis at the regional and country level to inform interventions and other developmental activities. Women should be targeted at the community and leadership level, and efforts to empower them should be stepped up. Keywords: hazard, natural disaster, ...

  19. Preliminary fire hazard analysis for the PUTDR and TRU trenches in the Solid Waste Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    Gaschott, L.J.

    1995-06-16

    This document represents the Preliminary Fire Hazards Analysis for the Pilot Unvented TRU Drum Retrieval effort and for the Transuranic drum trenches in the low level burial grounds. The FHA was developed in accordance with DOE Order 5480.7A to address major hazards inherent in the facility.

  20. Spatial temporal analysis of urban heat hazard in Tangerang City

    Science.gov (United States)

    Wibowo, Adi; Kuswantoro; Ardiansyah; Rustanto, Andry; Putut Ash Shidiq, Iqbal

    2016-11-01

    Urban heat is a phenomenon which may be caused by human activities. The human activities are represented by various types of land use, such as urban and non-urban areas. The aim of this study is to identify the urban heat behavior in Tangerang City, as it may threaten the urban environment. This study used three types of remote sensing data, namely Landsat TM, Landsat ETM+ and Landsat OLI-TIRS, to capture the urban heat behavior and to analyze the urban heat signature of Tangerang City in 2001, 2012, 2013, 2014, 2015 and 2016. The results showed that the urban heat signature changes dynamically each month depending on solar radiation. The urban heat island covered only a small part of Tangerang City in 2001, but it increased significantly and reached 50% of the area in 2012. Based on the results on urban heat signature, the threshold for a threatening condition is 30 °C, as recognized from land surface temperature (LST). The effective temperature (ET) index describes that condition as warm and uncomfortable, increasing stress due to sweating and blood flow, and possibly causing cardiovascular disorders.
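
    The land surface temperature values referred to above are typically derived from the Landsat thermal band by converting digital numbers to radiance, then to at-sensor brightness temperature, and finally applying an emissivity correction. The sketch below uses typical Landsat 8 TIRS band 10 calibration constants as stated assumptions; the actual scene metadata and the authors' exact procedure may differ.

      import numpy as np

      # Assumed Landsat 8 TIRS band 10 calibration constants (typical MTL metadata values).
      ML, AL = 3.342e-4, 0.1          # radiance rescaling gain / offset
      K1, K2 = 774.8853, 1321.0789    # thermal conversion constants

      def land_surface_temperature(dn, emissivity=0.97):
          """Approximate LST in deg C from band-10 digital numbers (single-channel method)."""
          radiance = ML * dn + AL
          bt_kelvin = K2 / np.log(K1 / radiance + 1.0)   # at-sensor brightness temperature, K
          wavelength, rho = 10.895e-6, 1.438e-2          # band-10 wavelength (m), h*c/k_B (m*K)
          lst_kelvin = bt_kelvin / (1.0 + (wavelength * bt_kelvin / rho) * np.log(emissivity))
          return lst_kelvin - 273.15

      dn_sample = np.array([24000, 27000, 30000])  # hypothetical pixel values
      print(land_surface_temperature(dn_sample))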

  1. METHODOLOGY OF SITE-SPECIFIC SEISMIC HAZARD ANALYSIS FOR IMPORTANT CIVIL STRUCTURE

    Directory of Open Access Journals (Sweden)

    Donny T. Dangkua

    2007-01-01

    Full Text Available Note from the Editor: The Indonesian archipelago is one of the most active tectonic zones in the world. Therefore, to design an important (and dangerous) structure such as a nuclear power plant, knowledge of the seismicity of the site is very important. This can be achieved by performing a site-specific seismic hazard analysis. A site-specific seismic hazard analysis is required at the design stage in order to determine the recommended seismic design criteria of the structure. A complete and thorough explanation of the methodology for performing a site-specific seismic hazard analysis is presented in this Technical Note.

  2. Decontamination and Management of Human Remains Following Incidents of Hazardous Chemical Release

    Energy Technology Data Exchange (ETDEWEB)

    Hauschild, Veronique [U.S. Army Public Health Command; Watson, Annetta Paule [ORNL; Bock, Robert Eldon [ORNL

    2012-01-01

    Abstract Objective: To provide specific procedural guidance and resources for identification, assessment, control, and mitigation of compounds that may contaminate human remains resulting from chemical attack or release. Design: A detailed technical, policy, and regulatory review is summarized. Setting: Guidance is suitable for civilian or military settings where human remains potentially contaminated with hazardous chemicals may be present. Settings would include sites of transportation accidents, natural disasters, terrorist or military operations, mortuary affairs or medical examiner processing and decontamination points, and similar. Patients, Participants: While recommended procedures have not been validated with actual human remains, guidance has been developed from data characterizing controlled experiments with fabrics, materiel, and laboratory animals. Main Outcome Measure(s): Presentation of logic and specific procedures for remains management, protection and decontamination of mortuary affairs personnel, as well as decision criteria for determining when remains are sufficiently decontaminated so as to pose no chemical health hazard. Results: Established procedures and existing equipment/materiel available for decontamination and verification provide appropriate and reasonable means to mitigate chemical hazards from remains. Extensive characterization of issues related to remains decontamination indicates that supra-lethal concentrations of liquid chemical warfare agent VX may prove difficult to decontaminate and verify in a timely fashion. Specialized personnel can and should be called upon to assist with monitoring necessary to clear decontaminated remains for transport and processing. Conclusions: Once appropriate decontamination and verification have been accomplished, normal procedures for remains processing and transport to the decedent's family and the continental United States can be followed.

  3. Hazard Analysis and Critical Control Point Program for Foodservice Establishments.

    Science.gov (United States)

    Control Point (HACCP) inspections in foodservice operations throughout the state. The HACCP system, which first emerged in the late 1960s, is a rational...has been adopted for use in the foodservice industry. The HACCP system consists of three main components, which are the: (1) assessment of the hazards...operations. This manual was developed to assist local sanitarians in conducting HACCP inspections and in educating foodservice operators and employees

  4. [Investigation and analysis on occupational hazards in a carbon enterprise].

    Science.gov (United States)

    Lu, C D; Ding, Q F; Wang, Z X; Shao, H; Sun, X C; Zhang, F

    2017-04-20

    Objective: To investigate the occupational disease hazards in a carbon enterprise workplace and the occupational health examinations of its personnel, providing a basis for occupational disease prevention and control in the industry. Methods: A field occupational health survey and workplace inspections were used to study the nature and degree of occupational disease hazards in the carbon enterprise from 2013 to 2015. Occupational health surveillance was carried out for the workers, and the physical examination results and measured occupational hazard factors were analyzed comprehensively. Results: Dust, coal tar pitch volatiles, and noise were the most serious hazards in the carbon enterprise. Among them, the rate of coal tar pitch volatile measurements exceeding the standard was 76.67%, the maximum area measurement was 1.06 mg/m(3), and the maximum personal measurement was 0.67 mg/m(3). There was no statistical difference among the 3 years (P>0.05). There were also no significant differences in the abnormality rates of chest X-ray, skin, audiometry, blood routine, blood pressure, and electrocardiogram examinations across the 3 years (P>0.05), although the skin and audiometry abnormality rates were higher than 10% each year. Conclusion: Dust, coal tar, and noise are the main occupational hazard factors in the carbon enterprise, and the corresponding protection should be strengthened.

  5. Visual Analysis of Humans

    CERN Document Server

    Moeslund, Thomas B

    2011-01-01

    This unique text/reference provides a coherent and comprehensive overview of all aspects of video analysis of humans. Broad in coverage and accessible in style, the text presents original perspectives collected from preeminent researchers gathered from across the world. In addition to presenting state-of-the-art research, the book reviews the historical origins of the different existing methods, and predicts future trends and challenges. This title: features a Foreword by Professor Larry Davis; contains contributions from an international selection of leading authorities in the field; includes

  6. [Re-analysis of occupational hazards in foundry].

    Science.gov (United States)

    Zhang, Min; Qi, Cheng; Chen, Wei-Hong; Lu, Yang; Du, Xie-Yi; Li, Wen-Jie; Meng, Chuan-San

    2010-04-01

    To systematically analyze the characteristics of occupational hazards in the foundry and provide precise data for epidemiological studies and the control of occupational hazards in the foundry. Data on airborne dust, chemical occupational hazards and physical occupational agents in the foundry environment from 1978 to 2008 were dynamically collected. The mean concentration and intensity (geometric mean) of occupational hazards were calculated by job for different years. The main occupational hazards in the foundry were silica, metal fume, noise and heat stress. Silica existed in all of the main jobs. The mean concentration of silica before 1986 was at an extremely high level of 8.6 mg/m(3), and then dropped remarkably after 1986, with levels of 2.4 mg/m(3) from 1986 to 1989, 2.7 mg/m(3) from 1990 to 2002 and 2.7 mg/m(3) from 2003 to 2008. The trend of silica concentrations by job was consistent with the general trend. Silica concentrations differed significantly among jobs, with the highest level in melting (4.4 mg/m(3)), followed by cast shakeout and finishing (3.4 mg/m(3)), pouring (3.4 mg/m(3)), sand preparation (2.4 mg/m(3)), moulding (2.1 mg/m(3)) and core-making (1.7 mg/m(3)). The concentration of respirable dust in pouring was the highest (2.76 mg/m(3)), followed by cast shakeout and finishing (1.14 mg/m(3)). The mean concentration of asbestos dust in melting was at a relatively high level of 2.0 mg/m(3). In core-making and sand preparation, emission products of adhesives were present, with mean concentrations as follows: ammonia (5.84 mg/m(3)), formaldehyde (0.60 mg/m(3)), phenol (1.73 mg/m(3)) and phenol formaldehyde resin (1.3 mg/m(3)). Benzene and its homologues existed in cast shakeout and finishing, with levels of benzene, toluene and xylene of 0.2 mg/m(3), 0.1 mg/m(3) and 1.3 mg/m(3), respectively. In pouring and melting, there were chemical occupational hazards including benzo(a)pyrene, metal fume (lead, cadmium, manganese, nickel, chromium) and gas

  7. LIFE CYCLE ASSESSMENT AND HAZARD ANALYSIS AND CRITICAL CONTROL POINTS TO THE PASTA PRODUCT

    Directory of Open Access Journals (Sweden)

    Yulexis Meneses Linares

    2016-10-01

    Full Text Available The objective of this work is to combine the Life Cycle Assessment (LCA) and Hazard Analysis and Critical Control Points (HACCP) methodologies to determine the risks that food production poses to human health and the ecosystem. The environmental performance of pasta production in the "Marta Abreu" Pasta Factory of Cienfuegos is assessed; the critical control points, determined by the biological hazards (fungi and pests) and the physical hazards (wood, paper, thread and ferromagnetic particles), were the raw materials (flour, semolina and their mixtures) and their disposition and extraction. Resources are the most affected damage category due to the consumption of fossil fuels.

  8. Seafood safety: economics of hazard analysis and Critical Control Point (HACCP) programmes

    National Research Council Canada - National Science Library

    Cato, James C

    1998-01-01

    .... This document on economic issues associated with seafood safety was prepared to complement the work of the Service in seafood technology, plant sanitation and Hazard Analysis Critical Control Point (HACCP) implementation...

  9. Development of hazard analysis by critical control points (HACCP) procedures to control organic chemical hazards in the agricultural production of raw food commodities.

    Science.gov (United States)

    Ropkins, Karl; Ferguson, Andrew; Beck, Angus J

    2003-01-01

    Hazard Analysis by Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards in the food chain. Effective HACCP requires the consideration of all hazards: chemical, microbiological, and physical. However, current procedures focus primarily on microbiological and physical hazards, while chemical aspects of HACCP have received relatively little attention. In this article we discuss the application of HACCP to organic chemical contaminants and the problems that are likely to be encountered in agriculture. We also present generic templates for the development of organic chemical contaminant HACCP procedures for selected raw food commodities, that is, cereal crops, raw meats, and milk.

  10. Analysis of temporal and spatial overlapping of hazards interactions at different scales

    Science.gov (United States)

    De Angeli, Silvia; Trasforini, Eva; Taylor, Faith; Rudari, Roberto; Rossi, Lauro

    2017-04-01

    The aim of this work is to develop a methodological framework to analyse the impact of multiple hazards on complex territorial systems, not only focusing on multi-hazard interactions but also evaluating the multi-risk, i.e. considering the impact of multiple hazards in terms of exposure and vulnerability as well. Impacts generated by natural hazards have grown in recent years, partly because many regions of the world have become subject to multiple hazards and cascading effects. Modelling the multi-hazard dimension is a new challenge that allows stakeholders to deal with the chain effects between hazards and to model the risk in a truly holistic way. Despite the recognition of the importance of a multi-hazard approach in risk assessment, only a few multi-risk approaches have been developed up to now. The examination of multiple hazards, in contrast to single-hazard cases, poses a series of challenges in each step of the risk analysis, starting from the assessment of the hazard level, passing through the vulnerability evaluation, and arriving finally at the resultant risk level. Hazard interactions and hazard contemporaneity arising from their spatial and temporal overlap may not only influence the overall hazard level, but also the vulnerability of the elements at risk. In the proposed approach a series of possible interactions between hazards are identified and classified. These interactions are then analysed by looking at the temporal and spatial evolution of the hazards and the consequent impacts, and they are represented through an explicative graphical framework. Different temporal dimensions are identified. The time of the impact differs from the time of the damage because, even after the end of the impact, damages remain until recovery and restoration processes are completed. The discrepancy between the time of the impact and the time of the damage is very important for the modelling of multi-hazard damage. Whenever a certain interval of time occurs between two impacts

  11. Hazard and risk assessment of human exposure to toxic metals using in vitro digestion assay

    Directory of Open Access Journals (Sweden)

    Hani A. Alhadrami

    2016-10-01

    Full Text Available Clean-up targets for toxic metals require that the site be “fit for purpose”. This means that targets are set with respect to defined receptors that reflect the intended land use. In this study, the likely threat of human exposure to toxic metals has been evaluated by simulating the human digestion process in vitro. The effects of key attributes (i.e. sample fraction size, pH, Kd and total metal concentrations) on the bioavailability of Cu and Ni were also investigated. Total metal concentration was the key explanatory factor for Cu and Ni bioavailability. A comparative ranking of metal concentrations in the context of tolerable daily intakes for Cu and Ni confirmed that pH has the greatest impact on metal bioavailability. Rapid screening of key attributes and total toxic metal doses can reveal the relative hazard imposed on humans, and this approach should be considered when defining threshold values for human protection.
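
    One way to make the comparison against tolerable daily intakes concrete is a simple hazard quotient: the bioaccessible fraction measured in the in vitro assay scales the total soil concentration to an estimated daily dose, which is then divided by the TDI. All values below (soil concentrations, ingestion rate, body weight, TDIs) are generic placeholders rather than results from this study.

      def hazard_quotient(soil_conc_mg_per_kg, bioaccessible_fraction,
                          ingestion_rate_mg_per_day, body_weight_kg,
                          tdi_mg_per_kg_per_day):
          """Estimated daily intake from incidental soil ingestion divided by the TDI."""
          edi = (soil_conc_mg_per_kg * bioaccessible_fraction
                 * ingestion_rate_mg_per_day * 1e-6) / body_weight_kg  # mg/kg bw/day
          return edi / tdi_mg_per_kg_per_day

      # Hypothetical screening values for Cu and Ni in a contaminated soil.
      print("Cu HQ:", round(hazard_quotient(800.0, 0.45, 100.0, 70.0, 0.14), 4))
      print("Ni HQ:", round(hazard_quotient(150.0, 0.30, 100.0, 70.0, 0.012), 4))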

  12. A historical analysis of hazardous events in the Alps – the case of Hindelang (Bavaria, Germany

    Directory of Open Access Journals (Sweden)

    F. Barnikel

    2003-01-01

    Full Text Available A historical analysis of natural hazards for the Hindelang area in the Bavarian Alps is carried out by researching and assessing data from different archives. The focus is on an evaluation of historical hazards on a local scale, working with written documents only. Data are compiled from the archives of governmental departments, local authorities, private collections and state archives. The bandwidth of the assessed hazards includes floods, mass movements and snow avalanches. So far we have collected more than 400 references to events in the Hindelang area, some of which occurred at times or in places where natural hazards used to be thought of as unlikely or unknown. Our aim was to collect all written data for this area and to deduce as much information on the hazardous effects on the environment as possible, thereby enhancing our knowledge about past climatic and geomorphic dynamics in the Alps.

  13. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations.

    Science.gov (United States)

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment is necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on cloud model-set pair analysis (CM-SPA). In this method, the cloud weight is proposed as the weight of each index. In contrast to the index weights of other methods, cloud weights are expressed by cloud descriptors; hence, the randomness and fuzziness of the cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of the CCD is also worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds, and the normal clouds are reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of the assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and that CM-SPA makes the assessment results more reasonable and scientific.
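
    The normal clouds used for the cloud weights and cloud connection degrees can be sketched with the standard forward normal cloud generator: each cloud is described by its expectation Ex, entropy En and hyper-entropy He, and cloud drops are produced by first sampling a perturbed entropy. The descriptor values below are arbitrary examples, not the weights from the paper.

      import numpy as np

      def forward_normal_cloud(ex, en, he, n_drops=1000, rng=None):
          """Generate cloud drops for a normal cloud described by (Ex, En, He)."""
          rng = rng or np.random.default_rng(0)
          en_prime = rng.normal(en, he, size=n_drops)   # perturbed entropy for each drop
          return rng.normal(ex, np.abs(en_prime))       # cloud drops scattered around Ex

      # Example cloud weight for one assessment index (hypothetical descriptors).
      drops = forward_normal_cloud(ex=0.35, en=0.05, he=0.01)
      print(f"mean = {drops.mean():.3f}, std = {drops.std():.3f}")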

  14. Practicality for Software Hazard Analysis for Nuclear Safety I and C System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong-Ho; Moon, Kwon-Ki; Chang, Young-Woo; Jeong, Soo-Hyun [KEPCO Engineering and Construction Co., Deajeon (Korea, Republic of)

    2016-10-15

    We use the concept of system safety in engineering. It is difficult to make any system perfectly safe, and a completely safe system may not easily be achieved. The standard definition of a system from MIL-STD-882E is: "The organization of hardware, software, material, facilities, personnel, data, and services needed to perform a designated function within a stated environment with specified results." From the perspective of the system safety engineer and the hazard analysis process, software is considered a subsystem. Regarding hazard analysis, methods for identifying software failures and determining their effects are, to date, still a research problem. Since the success of software development is based on rigorous testing of hardware and software, it is necessary to check the balance between software testing and hardware testing, also in terms of efficiency. Lessons learned and experience from similar systems are important for the work of hazard analysis. No major hazard has been raised for the software developed and verified in Korean NPPs. In addition to hazard analysis, software development and verification and validation were thoroughly performed. It is reasonable that the test implementation, including the development of test cases, stress and abnormal conditions, error recovery situations, and high-risk hazardous situations, plays a key role in detecting and preventing software faults.

  15. Perchlorate on Mars: a chemical hazard and a resource for humans

    Science.gov (United States)

    Davila, Alfonso F.; Willson, David; Coates, John D.; McKay, Christopher P.

    2013-10-01

    Perchlorate (ClO4 -) is widespread in Martian soils at concentrations between 0.5 and 1%. At such concentrations, perchlorate could be an important source of oxygen, but it could also become a critical chemical hazard to astronauts. In this paper, we review the dual implications of ClO4 - on Mars, and propose a biochemical approach for removal of perchlorate from Martian soil that would be energetically cheap, environmentally friendly and could be used to obtain oxygen both for human consumption and to fuel surface operations.

  16. Network meta-analysis on the log-hazard scale, combining count and hazard ratio statistics accounting for multi-arm trials: A tutorial

    Directory of Open Access Journals (Sweden)

    Hawkins Neil

    2010-06-01

    Full Text Available Abstract Background Data on survival endpoints are usually summarised using either hazard ratio, cumulative number of events, or median survival statistics. Network meta-analysis, an extension of traditional pairwise meta-analysis, is typically based on a single statistic. In this case, studies which do not report the chosen statistic are excluded from the analysis which may introduce bias. Methods In this paper we present a tutorial illustrating how network meta-analyses of survival endpoints can combine count and hazard ratio statistics in a single analysis on the hazard ratio scale. We also describe methods for accounting for the correlations in relative treatment effects (such as hazard ratios that arise in trials with more than two arms. Combination of count and hazard ratio data in a single analysis is achieved by estimating the cumulative hazard for each trial arm reporting count data. Correlation in relative treatment effects in multi-arm trials is preserved by converting the relative treatment effect estimates (the hazard ratios to arm-specific outcomes (hazards. Results A worked example of an analysis of mortality data in chronic obstructive pulmonary disease (COPD is used to illustrate the methods. The data set and WinBUGS code for fixed and random effects models are provided. Conclusions By incorporating all data presentations in a single analysis, we avoid the potential selection bias associated with conducting an analysis for a single statistic and the potential difficulties of interpretation, misleading results and loss of available treatment comparisons associated with conducting separate analyses for different summary statistics.
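
    The key conversion in the tutorial, putting count data on the same log-hazard scale as reported hazard ratios, can be sketched as follows: for an arm reporting r events out of n patients over follow-up time T, the cumulative hazard is estimated as H = -ln(1 - r/n) and the log hazard as ln(H/T); the difference between arms is then a log hazard ratio comparable to reported ln(HR) values. The counts below are invented, and the WinBUGS models in the paper handle this within a full Bayesian framework rather than with the point estimates shown here.

      import math

      def log_hazard(events, n, follow_up_years):
          """Log hazard from count data, assuming a constant hazard over follow-up."""
          cumulative_hazard = -math.log(1.0 - events / n)   # H = -ln(1 - p)
          return math.log(cumulative_hazard / follow_up_years)

      # Hypothetical trial arm counts: (deaths, randomized, follow-up in years).
      control = (60, 400, 3.0)
      treatment = (45, 410, 3.0)

      log_hr = log_hazard(*treatment) - log_hazard(*control)
      print(f"log HR = {log_hr:.3f}, HR = {math.exp(log_hr):.3f}")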

  17. In silico analysis of nanomaterials hazard and risk.

    Science.gov (United States)

    Cohen, Yoram; Rallo, Robert; Liu, Rong; Liu, Haoyang Haven

    2013-03-19

    Because of a variety of human-related activities, engineered nanoparticles (ENMs) may be released to various environmental media and may cross environmental boundaries, and thus will be found in most media. Therefore, the potential environmental impacts of ENMs must be assessed from a multimedia perspective and with an integrated risk management approach that considers rapid developments and the increasing use of new nanomaterials. Accordingly, this Account presents a rational process for the integration of in silico ENM toxicity and fate-and-transport analyses for environmental impact assessment. This approach requires knowledge of ENM toxicity and environmental exposure concentrations. Considering the large number of different types of ENMs currently in use, and that those numbers are likely to increase, there is an urgent need to accelerate the evaluation of their toxicity and the assessment of their potential distribution in the environment. Developments in high-throughput screening (HTS) are now enabling the rapid generation of large data sets for ENM toxicity assessment. However, these analyses require the establishment of reliable toxicity metrics, especially when HTS includes data from multiple assays, cell lines, or organisms. Establishing toxicity metrics with HTS data requires advanced data processing techniques in order to clearly identify significant biological effects associated with exposure to ENMs. HTS data can form the basis for developing and validating in silico toxicity models (e.g., quantitative structure-activity relationships) and for generating data-driven hypotheses to aid in establishing and/or validating possible toxicity mechanisms. To correlate the toxicity of ENMs with their physicochemical properties, researchers will need to develop quantitative structure-activity relationships for nanomaterials (i.e., nano-SARs). However, as nano-SARs are applied in regulatory applications, researchers must consider their applicability and the acceptance level of
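
    To make the nano-SAR idea above concrete, here is a minimal scikit-learn sketch that relates a few physicochemical descriptors to a binary toxicity call; the descriptors, the synthetic data and the random-forest choice are assumptions for illustration and do not reproduce any model from the Account.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)

      # Hypothetical descriptors for 40 ENMs: size (nm), zeta potential (mV),
      # conduction band gap (eV), dissolution fraction -- synthetic values only.
      X = np.column_stack([
          rng.uniform(10, 100, 40),
          rng.uniform(-50, 10, 40),
          rng.uniform(1.0, 6.0, 40),
          rng.uniform(0.0, 1.0, 40),
      ])
      # Hypothetical binary toxicity label derived from an HTS assay threshold.
      y = (X[:, 3] > 0.5).astype(int)

      # A simple classifier standing in for a nano-SAR; cross-validation gives
      # a first look at whether the descriptors carry a predictive signal.
      nano_sar = RandomForestClassifier(n_estimators=200, random_state=0)
      scores = cross_val_score(nano_sar, X, y, cv=5)
      print("mean cross-validated accuracy:", scores.mean())

    In practice the applicability domain of such a model must also be characterized before it could serve the regulatory purposes mentioned above.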

  18. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity with sediment dynamics being inherently nonlinear and threshold-mediated. Localized, high-intensity rainstorms can drive torrential systems past a tipping point resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited). Changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region. Thus, there is a rising interest in potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria. Precipitation is hypothesized to be the main forcing factor of torrential events. (ii) How do thresholds vary in space and time? (iii) The effect of external triggers is strongly mediated by the internal disposition of catchments to respond. Which internal conditions are critical for susceptibility? (iv) Is there a change in magnitude or frequency in the recent past and what can be expected for the future? The 71 km2 catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) has been monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-) conditions can be inferred from a dense station network. Changing bedload transport rates and

  19. Genetic k-Means Clustering Approach for Mapping Human Vulnerability to Chemical Hazards in the Industrialized City: A Case Study of Shanghai, China

    Directory of Open Access Journals (Sweden)

    Weihua Zeng

    2013-06-01

    Full Text Available Reducing human vulnerability to chemical hazards in the industrialized city is a matter of great urgency. Vulnerability mapping is an alternative approach for providing vulnerability-reducing interventions in a region. This study presents a method for mapping human vulnerability to chemical hazards by using clustering analysis for effective vulnerability reduction. Taking the city of Shanghai as the study area, we measure human exposure to chemical hazards by using the proximity model while additionally considering the toxicity of hazardous substances, and capture the sensitivity and coping capacity with corresponding indicators. We perform an improved k-means clustering approach based on a genetic algorithm, using a 500 m × 500 m geographical grid as the basic spatial unit. The sum of squared errors and the silhouette coefficient are combined to measure the quality of clustering and to determine the optimal clustering number. The clustering result reveals a set of six typical human vulnerability patterns that show distinct vulnerability dimension combinations. The vulnerability mapping of the study area reflects cluster-specific vulnerability characteristics and their spatial distribution. Finally, we suggest specific points that can provide new insights into rationally allocating the limited funds for the vulnerability reduction of each cluster.
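
    The cluster-number selection described above can be illustrated with a plain scikit-learn sketch (standard k-means rather than the genetic variant used in the study; the grid-cell indicators and candidate values of k are invented for illustration).

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics import silhouette_score

      rng = np.random.default_rng(42)
      # Hypothetical vulnerability indicators for 1,000 grid cells:
      # exposure, sensitivity and coping capacity, each scaled to [0, 1].
      cells = rng.random((1000, 3))

      for k in range(2, 9):
          km = KMeans(n_clusters=k, n_init=10, random_state=42).fit(cells)
          sse = km.inertia_                          # sum of squared errors
          sil = silhouette_score(cells, km.labels_)  # silhouette coefficient
          print(f"k={k}: SSE={sse:.1f}, silhouette={sil:.3f}")

    As in the study, the two measures are read together: a common heuristic is to favour a k with a high silhouette coefficient reached before the gains in SSE flatten out.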

  20. Dispersal hazards of Pseudogymnoascus destructans by bats and human activity at hibernacula in summer

    Science.gov (United States)

    Ballmann, Anne; Torkelson, Miranda R.; Bohuski, Elizabeth A.; Russell, Robin E.; Blehert, David

    2017-01-01

    Bats occupying hibernacula during summer are exposed to Pseudogymnoascus destructans (Pd), the causative agent of white-nose syndrome (WNS), and may contribute to its dispersal. Furthermore, equipment and clothing exposed to cave environments are a potential source for human-assisted spread of Pd. To explore dispersal hazards for Pd during the nonhibernal season, we tested samples that were collected from bats, the environment, and equipment at hibernacula in the eastern US between 18 July–22 August 2012. Study sites included six hibernacula known to harbor bats with Pd with varying winter-count impacts from WNS and two hibernacula (control sites) without prior history of WNS. Nucleic acid from Pd was detected from wing-skin swabs or guano from 40 of 617 bats (7% prevalence), including males and females of five species at five sites where WNS had previously been confirmed as well as from one control site. Analysis of guano collected during summer demonstrated a higher apparent prevalence of Pd among bats (17%, 37/223) than did analysis of wing-skin swabs (1%, 4/617). Viable Pd cultured from wing skin (2%, 1/56) and low recapture rates at all sites suggested bats harboring Pd during summer could contribute to pathogen dispersal. Additionally, Pd DNA was detected on clothing and trapping equipment used inside and near hibernacula, and Pd was detected in sediment more readily than in swabs of hibernaculum walls. Statistically significant differences in environmental abundance of Pd were not detected among sites, but prevalence of Pd differed between sites and among bat species. Overall, bats using hibernacula in summer can harbor Pd on their skin and in their guano, and demonstration of Pd on clothing, traps, and other equipment used at hibernacula during summertime within the WNS-affected region indicates risk for pathogen dispersal during the nonhibernal season.

  1. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    Science.gov (United States)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970's and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and resulted in the introduction of similar and/or duplicated activities in the safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  2. Health Risk Analysis of Cryptosporidiosis and other Hazards in ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Cryptosporidium is an important enteric pathogen that causes diarrhea in humans and animals. Young children, pregnant women and those with weakened immune systems are most susceptible. This project will endeavor to understand the transmission of Cryptosporidium by analyzing infection rates in animals over time, ...

  3. Integration of human reliability analysis into the high consequence process

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.; Morzinski, J.

    1998-12-01

    When performing a hazards analysis (HA) for a high consequence process, human error often plays a significant role in the hazards analysis. In order to integrate human error into the hazards analysis, a human reliability analysis (HRA) is performed. Human reliability is the probability that a person will correctly perform a system-required activity in a required time period and will perform no extraneous activity that will affect the correct performance. Even though human error is a very complex subject that can only approximately be addressed in risk assessment, an attempt must be made to estimate the effect of human errors. The HRA provides data that can be incorporated in the hazard analysis event. This paper will discuss the integration of HRA into a HA for the disassembly of a high explosive component. The process was designed to use a retaining fixture to hold the high explosive in place during a rotation of the component. This tool was designed as a redundant safety feature to help prevent a drop of the explosive. This paper will use the retaining fixture to demonstrate the following HRA methodology's phases. The first phase is to perform a task analysis. The second phase is the identification of the potential human functions, both cognitive and psychomotor, performed by the worker. During the last phase the human errors are quantified. In reality, the HRA process is an iterative process in which the stages overlap and information gathered in one stage may be used to refine a previous stage. The rationale for the decision to use or not use the retaining fixture and the role the HRA played in the decision will be discussed.
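
    The quantification phase mentioned above can be sketched, very roughly, as a nominal human error probability adjusted by performance shaping factors (a SPAR-H-style multiplication); the function, the nominal value and the multipliers below are illustrative assumptions, not the values used in the paper.

      def adjusted_hep(nominal_hep, shaping_factors):
          """Multiply a nominal human error probability by performance
          shaping factor multipliers and cap the result at 1.0."""
          hep = nominal_hep
          for multiplier in shaping_factors.values():
              hep *= multiplier
          return min(hep, 1.0)

      # Hypothetical values for the step "install the retaining fixture before rotation".
      nominal = 1e-3                       # nominal HEP for a routine action
      psf = {"time_pressure": 2.0,         # elevated workload
             "ergonomics": 1.0,            # adequate tooling
             "experience": 0.5}            # well-trained crew
      print("adjusted HEP:", adjusted_hep(nominal, psf))   # 1e-3 * 2 * 1 * 0.5

    A probability of this kind is what would feed back into the hazard analysis event for the drop scenario.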

  4. Hazardous properties and toxicological update of mercury: From fish food to human health safety perspective.

    Science.gov (United States)

    Okpala, Charles Odilichukwu R; Sardo, Giacomo; Vitale, Sergio; Bono, Gioacchino; Arukwe, Augustine

    2017-04-10

    The mercury (Hg) poisoning of Minamata Bay in Japan drew widespread global attention to Hg toxicity and its potential consequences for the aquatic ecosystem and human health. This has resulted in an increased need for a dynamic assembly, contextualization, and quantification of both the current state-of-the-art and approaches for understanding the cause-and-effect relationships of Hg exposure. Thus, the objective of the present review is to provide both the hazardous toxic properties and a toxicological update of Hg, focusing on how it ultimately affects the aquatic biota to potentially produce human health effects. Primarily, we discussed processes that relate to Hg exposure, including immunological aspects and risk assessment, vulnerability, toxicokinetics, and toxicodynamics, using an edible fish, the swordfish (Xiphias gladius), as a model. In addition, we summarized available information about Hg concentration limits set by different governmental agencies, as recognized by national and international standardization authorities.

  5. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    Science.gov (United States)

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively less attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights into measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods.
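
    The regression calibration idea referred to above can be sketched in a few lines of numpy; the simulated replicate measurements and the shrinkage formula below are a generic illustration of the calibration step, not the estimators developed in the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 500
      true_x = rng.normal(0.0, 1.0, n)            # unobserved true covariate
      # Two replicate surrogates with classical additive measurement error.
      w1 = true_x + rng.normal(0.0, 0.5, n)
      w2 = true_x + rng.normal(0.0, 0.5, n)
      w_bar = (w1 + w2) / 2.0

      # Regression calibration: estimate E[X | W] by shrinking the surrogate
      # towards its mean with lambda = var(X) / (var(X) + error variance).
      sigma2_err = np.var(w1 - w2, ddof=1) / 2.0  # per-replicate error variance
      sigma2_err_mean = sigma2_err / 2.0          # error variance of the mean surrogate
      sigma2_x = np.var(w_bar, ddof=1) - sigma2_err_mean
      lam = sigma2_x / (sigma2_x + sigma2_err_mean)
      x_calibrated = w_bar.mean() + lam * (w_bar - w_bar.mean())

      # x_calibrated would then replace w_bar as the covariate when fitting
      # the additive hazards (or any other survival) model.
      print("estimated reliability ratio:", round(lam, 3))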

  6. Automated design synthesis of robotic/human workcells for improved manufacturing system design in hazardous environments

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Joshua M. [Los Alamos National Laboratory

    2012-06-12

    Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. There are also other factors, such as cost, ergonomics, maintenance, and efficiency, that affect task allocation and other design choices. Handling the tradeoffs of these factors can be complex, and lack of experience can be an issue when trying to determine if, and what, feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. In regard to concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language or functional basis for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and are stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated for task-relevant criteria which are used to address task compatibility. Through the gathering of process requirements/constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use. Thus, once the higher level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure
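
    The component database and function matching described above might be organized along the following lines; this is a hypothetical Python data-structure sketch (the component names, criteria and scores are invented), not the authors' implementation.

      from dataclasses import dataclass

      @dataclass
      class Component:
          name: str
          functions: tuple      # "verb-noun" functions the component can embody
          rad_tolerant: bool    # suitability for radiological environments
          cost_score: float     # relative cost criterion (lower is better)

      # Hypothetical database entries.
      DATABASE = [
          Component("six-axis robot arm", ("move part", "orient part"), True, 0.8),
          Component("master-slave manipulator", ("move part", "hold part"), True, 0.5),
          Component("operator with glovebox", ("move part", "inspect part"), False, 0.3),
      ]

      def candidates(function, radiological=True):
          """Return components that embody a verb-noun function and satisfy the
          environment constraint, sorted by the cost criterion."""
          hits = [c for c in DATABASE
                  if function in c.functions and (c.rad_tolerant or not radiological)]
          return sorted(hits, key=lambda c: c.cost_score)

      for c in candidates("move part", radiological=True):
          print(c.name)

    Concept generation would then alternate between decomposing unfulfilled functions and embodying them with the best-scoring candidates returned by queries like this one.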

  7. Role of human- and animal-sperm studies in the evaluation of male reproductive hazards

    Energy Technology Data Exchange (ETDEWEB)

    Wyrobek, A.J.; Gordon, L.; Watchmaker, G.

    1982-04-07

    Human sperm tests provide a direct means of assessing chemically induced spermatogenic dysfunction in man. Available tests include sperm count, motility, morphology (seminal cytology), and Y-body analyses. Over 70 different human exposures have been monitored in various groups of exposed men. The majority of exposures studied showed a significant change from control in one or more sperm tests. When carefully controlled, the sperm morphology test is statistically the most sensitive of these human sperm tests. Several sperm tests have been developed in nonhuman mammals for the study of chemical spermatotoxins. The sperm morphology test in mice has been the most widely used. Results with this test seem to be related to germ-cell mutagenicity. In general, animal sperm tests should play an important role in the identification and assessment of potential human reproductive hazards. Exposure to spermatotoxins may lead to infertility, and more importantly, to heritable genetic damage. While there are considerable animal and human data suggesting that sperm tests may be used to detect agents causing infertility, the extent to which these tests detect heritable genetic damage remains unclear. (ERB)

  8. Analysis on the Industrial Design of Food Package and the Component of Hazardous Substance in the Packaging Material

    OpenAIRE

    Wei-Wen Huang

    2015-01-01

    Transferring the hazardous chemicals contained in food packaging materials into food would threaten the health of consumers; therefore, related laws and regulations and detection methods for hazardous substances have been established at home and abroad to ensure that food packaging materials are safe to use. Based on the analysis of the hazardous components in food packaging, a set of detection methods for hazardous substances in food packaging was established in the paper and ...

  9. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task of stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (lab or in situ surveys), improving the measurement methods, evaluating the calculation procedure with model tests, or confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely possibility distributions (e
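
    For the purely probabilistic treatment discussed above, the failure probability P can be sketched with a small Monte Carlo simulation; the lognormal resistance and load terms and the threshold are illustrative assumptions, not a real slope or pillar model.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 200_000

      # Hypothetical aleatoric variability in the resisting and driving terms.
      resistance = rng.lognormal(mean=np.log(1.8), sigma=0.25, size=n)
      load = rng.lognormal(mean=np.log(1.0), sigma=0.30, size=n)

      safety_factor = resistance / load
      threshold = 1.0                      # SF below this value means failure is possible
      p_failure = np.mean(safety_factor < threshold)
      print(f"estimated P(SF < {threshold}) = {p_failure:.4f}")

    A possibilistic treatment of the kind considered in the communication would instead propagate intervals or possibility distributions for the poorly known (epistemic) inputs rather than a single probability law.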

  10. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    Science.gov (United States)

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-08

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed Absolute Power (AP), Relative Power (RP), and Mean Frequency (MF) for beta and theta frequency bands under both eyes closed and eyes open conditions. We found that participants in the HAC group presented higher beta AP at the centroparietal region, as well as lower beta MF at the frontal and centroparietal regions in the eyes closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the underlying neuronal mechanism behind alcohol dependence and hazardous alcohol consumption. Similarities could be explained considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition.
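
    The band measures reported above (AP, RP and MF) can be computed from a single EEG channel with a Welch periodogram; the following scipy sketch uses an assumed sampling rate, assumed band limits and synthetic data, and is not the study's exact pipeline.

      import numpy as np
      from scipy.signal import welch

      fs = 256                                   # assumed sampling rate (Hz)
      rng = np.random.default_rng(3)
      eeg = rng.normal(size=fs * 60)             # hypothetical 60 s of one channel

      freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

      def band_metrics(low, high):
          """Absolute power, relative power and mean frequency in [low, high) Hz."""
          band = (freqs >= low) & (freqs < high)
          broad = (freqs >= 1) & (freqs < 40)              # assumed broadband reference
          ap = np.trapz(psd[band], freqs[band])            # absolute power
          rp = ap / np.trapz(psd[broad], freqs[broad])     # relative power
          mf = np.sum(freqs[band] * psd[band]) / np.sum(psd[band])  # mean frequency
          return ap, rp, mf

      print("theta (4-8 Hz):", band_metrics(4, 8))
      print("beta (13-30 Hz):", band_metrics(13, 30))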

  11. Integration of Formal Job Hazard Analysis and ALARA Work Practice

    CERN Document Server

    Nelsen, D P

    2002-01-01

    ALARA work practices have traditionally centered on reducing radiological exposure and controlling contamination. As such, ALARA policies and procedures are not well suited to a wide range of chemical and human health issues. Assessing relative risk, identifying appropriate engineering/administrative controls and selecting proper Personal Protective Equipment (PPE) for non nuclear work activities extends beyond the limitations of traditional ALARA programs. Forging a comprehensive safety management program in today's (2002) work environment requires a disciplined dialog between health and safety professionals (e.g. safety, engineering, environmental, quality assurance, industrial hygiene, ALARA, etc.) and personnel working in the field. Integrating organizational priorities, maintaining effective pre-planning of work and supporting a team-based approach to safety management represents today's hallmark of safety excellence. Relying on the mandates of any single safety program does not provide industrial hygien...

  12. Defining Human Failure Events for Petroleum Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  13. The Total Risk Analysis of Large Dams under Flood Hazards

    Directory of Open Access Journals (Sweden)

    Yu Chen

    2018-02-01

    Full Text Available Dams and reservoirs are useful systems in water conservancy projects; however, they also pose a high-risk potential for large downstream areas. Flood, as the driving force of dam overtopping, is the main cause of dam failure. Dam floods and their risks are of interest to researchers and managers. In hydraulic engineering, there is a growing tendency to evaluate dam flood risk based on statistical and probabilistic methods that are unsuitable for the situations with rare historical data or low flood probability, so a more reasonable dam flood risk analysis method with fewer application restrictions is needed. Therefore, different from previous studies, this study develops a flood risk analysis method for large dams based on the concept of total risk factor (TRF used initially in dam seismic risk analysis. The proposed method is not affected by the adequacy of historical data or the low probability of flood and is capable of analyzing the dam structure influence, the flood vulnerability of the dam site, and downstream risk as well as estimating the TRF of each dam and assigning corresponding risk classes to each dam. Application to large dams in the Dadu River Basin, Southwestern China, demonstrates that the proposed method provides quick risk estimation and comparison, which can help local management officials perform more detailed dam safety evaluations for useful risk management information.

  14. Application of Bayesian networks for hazard ranking of nanomaterials to support human health risk assessment.

    Science.gov (United States)

    Marvin, Hans J P; Bouzembrak, Yamine; Janssen, Esmée M; van der Zande, Meike; Murphy, Finbarr; Sheehan, Barry; Mullins, Martin; Bouwmeester, Hans

    2017-02-01

    In this study, a Bayesian Network (BN) was developed for the prediction of the hazard potential and biological effects, with a focus on metal and metal-oxide nanomaterials, to support human health risk assessment. The developed BN captures the (inter)relationships between the exposure route, the nanomaterials' physicochemical properties and the ultimate biological effects in a holistic manner and was based on international expert consultation and the scientific literature (e.g., in vitro/in vivo data). The BN was validated with independent data extracted from published studies; the accuracy of the prediction was 72% for the nanomaterials' hazard potential and 71% for the biological effect. The application of the BN is shown with scenario studies for TiO2, SiO2, Ag, CeO2 and ZnO nanomaterials. It is demonstrated that the BN may be used by different stakeholders at several stages in the risk assessment to predict certain properties of a nanomaterial for which little information is available or to prioritize nanomaterials for further screening.
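
    The kind of probabilistic reasoning a BN performs can be illustrated with a toy, hand-rolled network; the structure (solubility -> reactive oxygen species -> hazard) and every probability below are invented for illustration and are not the expert-elicited tables of the published model.

      import itertools

      # Made-up conditional probability tables for a three-node toy network.
      p_solubility = {"high": 0.3, "low": 0.7}
      p_ros_given_sol = {"high": {"yes": 0.8, "no": 0.2},
                         "low":  {"yes": 0.3, "no": 0.7}}
      p_hazard_given_ros = {"yes": {"high": 0.7, "low": 0.3},
                            "no":  {"high": 0.2, "low": 0.8}}

      def p_hazard(level, evidence=None):
          """Enumerate the joint distribution and return P(hazard = level | evidence)."""
          evidence = evidence or {}
          num = den = 0.0
          for sol, ros in itertools.product(p_solubility, ("yes", "no")):
              if "solubility" in evidence and evidence["solubility"] != sol:
                  continue
              parents = p_solubility[sol] * p_ros_given_sol[sol][ros]
              den += parents
              num += parents * p_hazard_given_ros[ros][level]
          return num / den

      print("P(hazard = high):", round(p_hazard("high"), 3))
      print("P(hazard = high | solubility = high):",
            round(p_hazard("high", {"solubility": "high"}), 3))

    Conditioning on a measured physicochemical property and reading off the updated hazard probability is, in miniature, how such a network can support screening of data-poor nanomaterials.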

  15. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    Science.gov (United States)

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    The concept of Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through the thorough analysis of production processes, identification of all hazards that are likely to occur in the production establishment, the identification of critical points in the process at which these hazards may be introduced into product and therefore should be controlled, the establishment of critical limits for control at those points, the verification of these prescribed steps, and the methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.

  16. Implementation of the Hazard Analysis Critical Control Point (HACCP) System in the Tempe Chips Production Process

    Directory of Open Access Journals (Sweden)

    Rahmi Yuniarti

    2015-06-01

    Full Text Available Malang is one of the industrial centers of tempe chips. To maintain quality and food safety, an analysis is required to identify the hazards arising during the production process. This study was conducted to identify the hazards during the production process of tempe chips and to provide recommendations for developing a HACCP system. The production process of tempe chips proceeds from slicing the tempe, moving it to the kitchen, coating it with flour dough, frying it in the pan, draining it, and packaging it, to storing it. There are three types of potential hazards, biological, physical, and chemical, during the production process. The CCP identification shows that three processes have a Critical Control Point: slicing the tempe, immersing the tempe into the flour mixture, and draining. Recommendations for the development of the HACCP system include recommendations related to employee hygiene, supporting equipment, 5-S analysis, and the production layout.

  17. Analysis of the camouflage effect in time of segregation in texturized regions using the Cox proportional hazard model

    Directory of Open Access Journals (Sweden)

    Eduardo Yoshio Nakano

    2012-11-01

    Full Text Available Humans have trichromatic vision. However, genetic variations can cause deficiencies in color vision, resulting in dichromatism. The aim of this work was to verify the real efficiency of dichromats in breaking colour camouflage. A total of nine colour-blind individuals participated in this study, and the variable considered was the time to segregation of camouflaged targets. The interest was to compare the response time under several conditions of camouflage, and the analysis was performed using the Cox proportional hazards model.
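
    A minimal sketch of the Cox analysis referred to above, using the lifelines package on an invented data frame; the times, censoring flags and covariates are hypothetical, not the study's data.

      import pandas as pd
      from lifelines import CoxPHFitter

      # Hypothetical trials: time (s) to segregate the camouflaged target, an event
      # flag (0 = not segregated before the trial ended, i.e. censored) and covariates.
      df = pd.DataFrame({
          "time_s":     [4.2, 7.9, 6.1, 12.0, 5.5, 3.4, 2.8, 9.3, 8.8, 5.0],
          "segregated": [1,   1,   1,   0,    1,   1,   1,   1,   1,   1],
          "dichromat":  [0,   1,   0,   1,    0,   1,   0,   1,   0,   1],
          "camo_level": [2,   2,   3,   3,    1,   1,   2,   3,   3,   2],
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="time_s", event_col="segregated")
      cph.print_summary()   # hazard ratios for colour-vision group and camouflage level

    A hazard ratio below one for the dichromat indicator would mean a longer time to break the camouflage, which is the comparison of interest in the study.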

  18. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    Full Text Available In the last decades, critical systems have increasingly been developed using computers and software, even in the space area, where the project approach is usually very conservative. In projects involving rockets, satellites and their facilities, such as ground support systems and simulators, among other critical operations for the space mission, a hazard analysis must be applied. The ELICERE process was created to perform a hazard analysis mainly of critical computer systems, in order to define or evaluate their safety and dependability requirements, and is strongly based on the Hazards and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or to understand the potential hazards of existing systems, improving their functions related to functional or non-functional requirements. Thus, the main goal of the ELICERE process is to ensure the safety and dependability goals of a space mission. The process was initially created to operate manually, in a gradual way. More recently, a software tool called PRO-ELICERE was developed to facilitate the analysis process and store the results for reuse in other system analyses. To show how ELICERE and its tool work, a small space case study was carried out, based on a hypothetical rocket of the Cruzeiro do Sul family, developed by the Instituto de Aeronáutica e Espaço in Brazil.

  19. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    Science.gov (United States)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access due to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) LandScanTM global population distribution; and (2) frequency of cyclone, drought and flood from ~1980-2000 from Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. This analysis has yielded country-level scores and maps displaying the ranking of exposure score (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).

  20. Natural and human-induced sinkhole hazards in Saudi Arabia: distribution, investigation, causes and impacts

    Science.gov (United States)

    Youssef, Ahmed M.; Al-Harbi, Hasan M.; Gutiérrez, Francisco; Zabramwi, Yasser A.; Bulkhi, Ali B.; Zahrani, Saeed A.; Bahamil, Alaa M.; Zahrani, Ahmed J.; Otaibi, Zaam A.; El-Haddad, Bosy A.

    2016-05-01

    Approximately 60 % of the 2,150,000 km2 area of Saudi Arabia is underlain by soluble sediments (carbonate and evaporite rock formations, salt diapirs, sabkha deposits). Despite its hyper-arid climate, a wide variety of recent sinkholes have been reported in numerous areas, involving significant property losses. Human activities, most notably groundwater extraction, have induced unstable conditions on pre-existing cavities. This work provides an overview of the sinkhole hazard in Saudi Arabia, a scarcely explored topic. It identifies the main karst formations and the distribution of the most problematic sinkhole areas, illustrated through several case studies covering the wide spectrum of subsidence mechanisms. Some of the main investigation methods are presented through selected examples, including remote sensing, trenching and geophysics. Based on the available data, the main causal factors are identified and further actions that should be undertaken to better assess and manage the risk are discussed.

  1. Hazards and Risks of Engineered Nanoparticles for the Environment and Human Health

    Directory of Open Access Journals (Sweden)

    Danail Hristozov

    2009-11-01

    Full Text Available The objectives of this article are to: (1) investigate the current state of knowledge of the risks of engineered nanoparticles for the environment and human health, (2) estimate whether this knowledge is sufficient to facilitate their comprehensive and effective risk assessment and (3) provide recommendations on future research in the field of risk assessment of nanomaterials. In order to meet the objectives, the relevance of each of the four steps of the risk assessment methodology (i.e., hazard identification, dose-response assessment, exposure assessment and risk characterization) was evaluated in the context of the current state of knowledge of the risks of nanomaterials, limitations were identified and recommendations were given on how to overcome them.

  2. Reduction of uncertainties in probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choun, Young Sun; Choi, In Kil [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-02-01

    An integrated research effort for the reduction of conservatism and uncertainties in PSHA in Korea was performed. The research consisted of five technical task areas as follows: Task 1: Earthquake Catalog Development for PSHA. Task 2: Evaluation of Seismicity and Tectonics of the Korea Region. Task 3: Development of Ground Motion Relationships. Task 4: Improvement of PSHA Modelling Methodology. Task 5: Development of Seismic Source Interpretations for the Region of Korea for Inputs to PSHA. A series of tests on an ancient wooden house and an analysis of medium-size earthquakes in Korea were performed intensively. Significant improvements, especially in the estimation of historical earthquakes, ground motion attenuation, and seismic source interpretations, were made through this study. 314 refs., 180 figs., 54 tabs. (Author)

  3. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    Science.gov (United States)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  4. Example process hazard analysis of a Department of Energy water chlorination process

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Title 29 of the Code of Federal Regulations (CFR), Part 1910, that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18-21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  5. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    Science.gov (United States)

    Ayele, Atalay

    2017-10-01

    Seismic hazard calculation is carried out for the Horn of Africa region (0°-20° N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area available from recent studies is used for area source zonation. For assessing the seismic hazard, the study area was divided into small grids of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each of these grid cells by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) values corresponding to 10% and 2% probability of exceedance in 50 years were calculated for all grid points using a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd generation building code of Ethiopia.
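
    The 475- and 2475-year return periods quoted above follow from the Poisson relation between return period and probability of exceedance over an exposure time; the short sketch below (function name chosen here for illustration) reproduces the arithmetic.

      import math

      def return_period(p_exceedance, exposure_years):
          """Return period implied by a probability of exceedance over an
          exposure time, assuming Poissonian occurrence of exceedances."""
          return -exposure_years / math.log(1.0 - p_exceedance)

      print(round(return_period(0.10, 50)))   # ~475 years (10% in 50 years)
      print(round(return_period(0.02, 50)))   # ~2475 years (2% in 50 years)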

  6. A compound methodology to assess the impact of human and organizational factors impact on the risk level of hazardous industrial plants

    DEFF Research Database (Denmark)

    Monferini, A.; Konstandinidou, M.; Nivolianitou, Z.

    2013-01-01

    This paper presents a compound methodology devised to relate Human and Organizational Factors (HOFs) to operators’ response time in critical operations within hazardous industrial plants. The methodology has been based on a sensitivity analysis of the nine “families” of the Common Performance Conditions (CPCs), as defined in the CREAM technique, in order to verify and rank their influence on the operators’ response time. To prove the methodology, a series of pilot experiments have been designed and performed so that human response is evaluated in a Virtual Environment (VE) reproducing the control...

  7. Extended-spectrum beta-lactamase-producing Pseudomonas aeruginosa in camel in Egypt: potential human hazard.

    Science.gov (United States)

    Elhariri, Mahmoud; Hamza, Dalia; Elhelw, Rehab; Dorgham, Sohad M

    2017-03-31

    The rapid increase of extended-spectrum beta-lactamase (ESBL) producing bacteria is a potential health hazard. The development of antimicrobial resistance in animal pathogens has serious implications for human health, especially when such strains can be transmitted to humans. In this study, the antimicrobial resistance due to ESBL-producing Pseudomonas aeruginosa in camel meat was investigated. Meat samples from 200 healthy camels at two major abattoirs in Egypt (Cairo and Giza) were collected. Following culture on cetrimide agar, suspected P. aeruginosa colonies were confirmed with a Vitek 2 system (bioMérieux). P. aeruginosa isolates were phenotypically identified as ESBL producers by the double disk synergy test. Additionally, antimicrobial susceptibility testing of the ESBL-producing P. aeruginosa isolates against 11 antimicrobial drugs was carried out by the disk diffusion method. The ESBL genotypes were determined by polymerase chain reaction according to the presence of bla PER-1, bla CTX-M, bla SHV, and bla TEM. Pseudomonas aeruginosa was isolated from 45 camel meat samples (22.5%). The total percentage of ESBL-producing P. aeruginosa was 45% (21/45) of the camel meat isolates. Antibiogram results revealed that the highest resistance was to ceftriaxone and rifampicin, followed by cefepime and aztreonam. The prevalence rates of β-lactamase genes were recorded (bla PER-1 28.5%, bla CTX-M 38%, bla SHV 33.3% and bla TEM 23.8%). This study illustrates the presence of high rates of ESBL-producing P. aeruginosa in camels, which represents an increasing alarm regarding the risk of transmission to humans and opens the door to current and future antibiotic therapy failure. Livestock-associated ESBL-producing P. aeruginosa is a growing problem; therefore, full attention has to be given to livestock-associated ESBL bacteria, which may find their way to human beings.

  8. "Dust Devils": Gardening Agents on the Surface of Mars, and Hidden Hazards to Human Exploration?

    Science.gov (United States)

    Marshall, J.; Smith, P.; White, B.; Farrell, W.

    1999-01-01

    Dust devils are familiar sights in the arid regions of the world: they can produce quite spectacular displays of dust lofting when the vortices scavenge very loose dust from a dry lake bed or from recently disturbed agricultural fields. If one were to arrive at the center of an arid region, take one photograph, or even a series of photographs over a period of several days, then return the images for laboratory analysis, it would most likely be concluded that the region was inactive from an aeolian perspective. No images of general dust movement were obtained, nor were any dust devils "caught on camera" owing to their ephemeral and unpredictable appearance, and the fact that there was deceptively little residue of their actions. If, however, a camera were to take a 360 degree continuous recording over a period of a year, and the film were then to be shown at high speed over a period of several minutes, the impression might be that of a region ravaged by air vorticity and dust movement. Extrapolate this over geological time, and it is possible to visualize dust devils as prime aeolian agents, rather than insignificant vagaries of nature. On Mars, the thin atmosphere permits the surface of the planet to be heated but it does not itself retain heat with the capacity of the Earth's atmosphere. This gives rise to greater thermal instability near the surface of Mars as "warm" air pockets diapirically inject themselves into higher atmospheric layers. Resulting boundary-layer vorticity on Mars might therefore be expected to produce dust devils in abundance, if only seasonally. The spectacular images of dust devils obtained by Pathfinder within its brief functional period on the planet testify to the probability of highly frequent surface vorticity in light of the above reasoning about observational probability. Notably, the Pathfinder devils appeared to be at least a kilometer in height. There are several consequences for the geology of Mars, and for human exploration, if

  9. The importance of censoring in competing risks analysis of the subdistribution hazard

    OpenAIRE

    Mark W. Donoghoe; Val Gebski

    2017-01-01

    Background The analysis of time-to-event data can be complicated by competing risks, which are events that alter the probability of, or completely preclude the occurrence of an event of interest. This is distinct from censoring, which merely prevents us from observing the time at which the event of interest occurs. However, the censoring distribution plays a vital role in the proportional subdistribution hazards model, a commonly used method for regression analysis of time-to-event data in th...

  10. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis.

    Science.gov (United States)

    Karlsson, Caroline S J; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W

    2017-08-18

    Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards which are difficult to control. Thus, determining natural hazard susceptible areas and incorporating them in the initial planning process may reduce infrastructural damages in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted where expert judgments were evaluated individually and in an aggregated manner. The estimates of susceptible areas were then compared with those of the weighted linear combination method using equal weights and the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment was shown to perform almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning despite its limitation to the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigations need to be undertaken from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.

  11. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis

    Science.gov (United States)

    Karlsson, Caroline S. J.; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W.

    2017-11-01

    Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards which are difficult to control. Thus, determining natural hazard susceptible areas and incorporating them in the initial planning process may reduce infrastructural damages in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted where expert judgments were evaluated individually and in an aggregated manner. The estimates of susceptible areas were then compared with those of the weighted linear combination method using equal weights and the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment was shown to perform almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning despite its limitation to the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigations need to be undertaken from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.
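
    The analytic hierarchy process used in both records above turns expert pairwise judgments into criterion weights via the principal eigenvector of a comparison matrix; the numpy sketch below uses an invented three-factor matrix and is only an illustration of that step, not the experts' judgments.

      import numpy as np

      # Hypothetical Saaty pairwise comparison matrix for three factors
      # (values are illustrative only).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                 # normalised criterion weights

      # Consistency ratio; CR < 0.1 is the usual acceptance rule (RI = 0.58 for n = 3).
      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)
      cr = ci / 0.58
      print("weights:", np.round(weights, 3), " CR:", round(cr, 3))

    Aggregating several experts' matrices (for example by element-wise geometric means) before extracting the eigenvector is one common way to obtain the aggregated judgments mentioned above.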

  12. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    Science.gov (United States)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    Landslides are one of the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property, as well as causing severe damage to natural resources. The local geology with a high degree of slope, coupled with high-intensity rainfall and unplanned human activities in the study area, causes many landslides in this region. The study area attracts tourists throughout the year, so it must be considered for preventive measures. Geospatial multicriteria decision analysis (MCDA) techniques are increasingly used for landslide vulnerability and hazard zonation mapping. They enable the integration of different data layers with different levels of uncertainty. In the present study, the analytic hierarchy process (AHP) method was used to prepare landslide hazard zones of Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road and NDVI. These factor layers were extracted from the various related spatial data sets. The factors were evaluated, and then the individual factor weight and class weight were assigned to each of the related factors. The Landslide Hazard Zone Index (LHZI) was calculated using the multicriteria decision analysis (MCDA) technique, based on the weights assigned and the ratings given by the analytical hierarchy process (AHP) method. The final cumulative map of the study area was categorized into four hazard zones, classified as zone I to IV: 3.56% of the area falls in hazard zone IV, followed by 48.19% in zone III, 43.63% in zone II and 4.61% in zone I.
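
    Once the AHP weights and class ratings are in hand, the LHZI overlay and the slicing into zones I-IV reduce to a weighted sum of raster layers; the toy numpy sketch below uses three invented factor rasters, invented weights and illustrative class breaks rather than the study's ten layers.

      import numpy as np

      # Hypothetical 4x4 rasters of class ratings (1-5) for three of the ten factors.
      slope    = np.array([[5, 4, 3, 2], [4, 4, 3, 2], [3, 3, 2, 1], [2, 2, 1, 1]])
      rainfall = np.array([[4, 4, 3, 3], [4, 3, 3, 2], [3, 3, 2, 2], [3, 2, 2, 1]])
      lulc     = np.array([[3, 3, 2, 2], [3, 2, 2, 1], [2, 2, 1, 1], [2, 1, 1, 1]])

      # Invented factor weights (in the study these come from the AHP comparisons).
      w = {"slope": 0.5, "rainfall": 0.3, "lulc": 0.2}

      lhzi = w["slope"] * slope + w["rainfall"] * rainfall + w["lulc"] * lulc

      # Slice the index into four hazard zones; the breaks are illustrative only.
      zones = np.digitize(lhzi, bins=[2.0, 3.0, 4.0]) + 1   # 1 = zone I ... 4 = zone IV
      print(zones)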

  13. Probability analysis of multiple-tank-car release incidents in railway hazardous materials transportation.

    Science.gov (United States)

    Liu, Xiang; Saat, Mohd Rapik; Barkan, Christopher P L

    2014-07-15

    Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. By contrast to truck accidents, it is possible that a train accident may involve multiple hazardous materials cars derailing and releasing contents with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in a train and tank car safety design. The effect of train speed, tank car safety design and tank car positions in a train were evaluated regarding the number of cars that release their contents in a derailment. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrences. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail. Copyright © 2014 Elsevier B.V. All rights reserved.
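
    A much-simplified Monte Carlo version of the probabilistic model described above can convey how train length, derailment position and tank car placement interact; every parameter below (train length, number and placement of tank cars, conditional release probability, derailment-length distribution) is an invented assumption, not the paper's calibrated model.

      import numpy as np

      rng = np.random.default_rng(11)

      def simulate_releases(n_sims=100_000, train_length=100, n_tank_cars=20,
                            p_release_given_derailed=0.1):
          """Toy model: a random block of consecutive cars derails starting at a
          random position; each derailed tank car releases independently."""
          tank_positions = rng.choice(train_length, size=n_tank_cars, replace=False)
          releases = np.zeros(n_sims, dtype=int)
          for i in range(n_sims):
              first = rng.integers(0, train_length)    # position of first car derailed
              n_derailed = rng.integers(1, 21)         # cars derailing from that point
              derailed = np.arange(first, min(first + n_derailed, train_length))
              derailed_tanks = np.intersect1d(derailed, tank_positions).size
              releases[i] = rng.binomial(derailed_tanks, p_release_given_derailed)
          return releases

      r = simulate_releases()
      for k in range(3):
          print(f"P({k} tank cars release) ~= {np.mean(r == k):.3f}")
      print(f"P(3 or more release)  ~= {np.mean(r >= 3):.3f}")

    The paper's model additionally conditions these quantities on derailment speed, accident cause and tank car safety design, which is what makes it useful for comparing risk-reduction strategies.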

  14. 78 FR 3646 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-01-16

    ... Critical Control Points (HACCP) Systems D. Food Safety Problems Associated With Manufacturing, Processing..., explains the principles and history of the use of Hazard Analysis and Critical Control Point (HACCP... as the Hazard Analysis and Critical Control Points (HACCP) approach to food safety. HACCP was...

  15. Potential hazards to embryo implantation: A human endometrial in vitro model to identify unwanted antigestagenic actions of chemicals

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, L.; Deppert, W.R. [Department of Obstetrics and Gynecology, University Hospital Freiburg (Germany); Pfeifer, D. [Department of Hematology and Oncology, University Hospital Freiburg (Germany); Stanzel, S.; Weimer, M. [Department of Biostatistics, German Cancer Research Center, Heidelberg (Germany); Hanjalic-Beck, A.; Stein, A.; Straßer, M.; Zahradnik, H.P. [Department of Obstetrics and Gynecology, University Hospital Freiburg (Germany); Schaefer, W.R., E-mail: wolfgang.schaefer@uniklinik-freiburg.de [Department of Obstetrics and Gynecology, University Hospital Freiburg (Germany)

    2012-05-01

    Embryo implantation is a crucial step in human reproduction and depends on the timely development of a receptive endometrium. The human endometrium is unique among adult tissues due to its dynamic alterations during each menstrual cycle. It hosts the implantation process which is governed by progesterone, whereas 17β-estradiol regulates the preceding proliferation of the endometrium. The receptors for both steroids are targets for drugs and endocrine disrupting chemicals. Chemicals with unwanted antigestagenic actions are potentially hazardous to embryo implantation since many pharmaceutical antiprogestins adversely affect endometrial receptivity. This risk can be addressed by human tissue-specific in vitro assays. As a working basis we compiled data on chemicals interacting with the PR. In our experimental work, we developed a flexible in vitro model based on human endometrial Ishikawa cells. Effects of antiprogestin compounds on pre-selected target genes were characterized by sigmoidal concentration–response curves obtained by RT-qPCR. The estrogen sulfotransferase (SULT1E1) was identified as the most responsive target gene by microarray analysis. The agonistic effect of progesterone on SULT1E1 mRNA was concentration-dependently antagonized by RU486 (mifepristone) and ZK137316 and, with lower potency, by 4-nonylphenol, bisphenol A and apigenin. The negative control methyl acetoacetate showed no effect. The effects of progesterone and RU486 were confirmed on the protein level by Western blotting. We demonstrated proof of principle that our Ishikawa model is suitable for quantitatively studying the effects of antiprogestin-like chemicals on endometrial target genes in comparison to pharmaceutical reference compounds. This test is useful for hazard identification and may contribute to reducing animal studies. -- Highlights: ► We compare progesterone receptor-mediated endometrial effects of chemicals and drugs. ► 4-Nonylphenol, bisphenol A and apigenin exert weak
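A minimal sketch of how such sigmoidal concentration-response curves are commonly summarized, assuming a four-parameter logistic model fitted to fold-change data; the concentrations and responses below are purely illustrative, not the study's measurements.

```python
# Hedged sketch: fit a four-parameter logistic (4PL) curve to hypothetical
# concentration-response data and report the fitted EC50 and Hill slope.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    """Four-parameter logistic concentration-response function."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

conc = np.array([1e-9, 1e-8, 1e-7, 1e-6, 1e-5])   # mol/L (hypothetical)
resp = np.array([1.05, 1.40, 2.60, 3.80, 4.10])   # fold change (hypothetical)

popt, _ = curve_fit(four_pl, conc, resp, p0=[1.0, 4.0, 1e-7, 1.0], maxfev=10000)
bottom, top, ec50, hill = popt
print(f"EC50 ≈ {ec50:.2e} mol/L, Hill slope ≈ {hill:.2f}")
```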

  16. Incorporating Site Amplification into Seismic Hazard Analysis: A Fully Probabilistic Approach

    Science.gov (United States)

    Cramer, C. H.

    2001-12-01

    Developing site-specific amplification factors from geological, geophysical, and geotechnical information has been the state-of-practice for the last couple of decades. Now the state-of-the-art is to develop a distribution of possible site-specific amplification factors for a given input rock ground-motion. These state-of-the-art site-amplification distributions account for the uncertainty in soil properties and Vs structure at the site. Applying these site amplification distributions to a site-specific seismic hazard analysis requires a fully probabilistic approach. One such approach is to modify the generic ground-motion attenuation relations used in a probabilistic seismic hazard analysis to site-specific relations using a site amplification distribution developed for that site. The modification of the ground-motion attenuation relation is done prior to calculating probabilistic seismic hazard at the site. This approach has been implemented using the USGS National Seismic Hazard Mapping codes. Standard hazard models and hard-rock ground-motion attenuation relations are input into the modified codes along with a description of the site-specific amplification in the form of a lognormal probability-density-function (pdf). For each hard-rock ground-motion level, the pdf is specified by the median site-amplification factor and its natural-logarithmic standard deviation. The fully probabilistic ground-motion hazard curves are always above the hazard curve derived from multiplying the hard-rock hazard curve by the site's median site-amplification factors. At Paducah, Kentucky the difference is significant for 2%-in-50-year ground-motion estimates (0.7g vs. 0.5g for PGA and 1.3g vs. 0.9g for 1.0 s Sa). At Memphis, Tennessee the differences are less significant and may only be important at long periods (1.0 s and longer) on Mississippi flood-plain (lowlands) deposits (on the uplands deposits: 0.35g vs. 0.30g for PGA and 0.8g vs. 0.7g for 1.0 s Sa; on the lowlands
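The convolution described above can be sketched as follows; this is an illustrative discretized implementation under assumed inputs (a synthetic hard-rock hazard curve and a hypothetical nonlinear median amplification with lognormal scatter), not the modified USGS mapping code itself.

```python
# Hedged sketch: combine a hard-rock hazard curve with a lognormal
# site-amplification distribution to obtain a "fully probabilistic" soil
# hazard curve, and compare with simple scaling by the median factor.
import numpy as np
from scipy.stats import norm

x_rock = np.logspace(-2, 0.3, 60)              # rock PGA levels (g)
lam_rock = 1e-2 * (x_rock / 0.01) ** -2.0      # synthetic annual exceedance rates
median_amp = 2.0 * (x_rock / 0.1) ** -0.2      # hypothetical nonlinear median amplification
sigma_ln = 0.3                                 # natural-log std of the amplification factor

# occurrence rate of rock motion in each bin and the bin-center amplification
rate_bin = -np.diff(lam_rock)
x_mid = np.sqrt(x_rock[:-1] * x_rock[1:])
amp_mid = np.interp(x_mid, x_rock, median_amp)

z_soil = np.logspace(-2, 0.5, 50)              # soil PGA levels of interest
lam_soil = np.array([
    np.sum(rate_bin * norm.sf((np.log(z / x_mid) - np.log(amp_mid)) / sigma_ln))
    for z in z_soil
])

# comparison curve obtained by scaling the rock curve with the median factor only
lam_scaled = np.interp(z_soil, x_rock * median_amp, lam_rock)
```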

  17. The influence of Alpine soil properties on shallow movement hazards, investigated through factor analysis

    Directory of Open Access Journals (Sweden)

    S. Stanchi

    2012-06-01

    Full Text Available Mountain watersheds are particularly vulnerable to extreme meteorological events, such as high intensity rainfall, and mountain soils often show pronounced fragility and low resilience due to severe environmental conditions. Alpine soil vulnerability is partly intrinsic but in part related to climate change (mainly precipitation regimes), and is enhanced by the abandonment of rural mountain areas that reduced the land maintenance actions traditionally carried out by farmers and local populations in the past. Soil hazards are related to different processes such as water erosion, loss of consistency, surface runoff and sediment transport, often occurring simultaneously and interacting with each other. Therefore, the overall effects on soil are not easy to quantify as they can be evaluated from different soil chemical and physical properties, referring to specific soil loss phenomena such as soil erosion, soil liquefaction, loss of consistency etc. In this study, we focus our attention on a mountain region in the NW Italian Alps (Valle d'Aosta), which suffered from diffuse soil instability phenomena in recent years, as a consequence of extreme rainfall events and general abandonment of the agricultural activities in marginal areas. The main effects were a large number of shallow landslides involving limited soil depths (less than 1 m), affecting considerable surfaces in the lower and middle part of the slopes. These events caused loss of human lives in the year 2000 and therefore raised attention to land maintenance issues. Surface (topsoil: 0–20 cm) and subsurface (subsoil: 20–70 cm) samples were characterised chemically and physically (pH, carbon and nitrogen contents, cation exchange capacity, texture, aggregate stability, Atterberg limits etc.) and they showed very different soil properties. Topsoils were characterised by better stability, structure, and consistency. The differences between the two depths were potential trigger factors for

  18. Application of hazard analysis critical control points (HACCP) to organic chemical contaminants in food.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-03-01

    Hazard Analysis Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards that was developed as an effective alternative to conventional end-point analysis to control food safety. It has been described as the most effective means of controlling foodborne diseases, and its application to the control of microbiological hazards has been accepted internationally. By contrast, relatively little has been reported relating to the potential use of HACCP, or HACCP-like procedures, to control chemical contaminants of food. This article presents an overview of the implementation of HACCP and discusses its application to the control of organic chemical contaminants in the food chain. Although this is likely to result in many of the advantages previously identified for microbiological HACCP, that is, more effective, efficient, and economical hazard management, a number of areas are identified that require further research and development. These include: (1) a need to refine the methods of chemical contaminant identification and risk assessment employed, (2) develop more cost-effective monitoring and control methods for routine chemical contaminant surveillance of food, and (3) improve the effectiveness of process optimization for the control of chemical contaminants in food.

  19. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  20. Current issues and related activities in seismic hazard analysis in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong-Moon [Korea Atomic Energy Research Inst., Taejon (Korea, Republic of); Lee, Jong-Rim; Chang, Chun-Joong

    1997-03-01

    This paper discusses some technical issues identified from the seismic hazard analyses for probabilistic safety assessment of the operating Korean nuclear power plants and the related activities to resolve the issues. Since there are no strong instrumental earthquake records in Korea, the seismic hazard analysis is mainly dependent on historical earthquake records. Results of the past seismic hazard analyses show that there are many uncertainties in the attenuation function and intensity level and that there is a need to improve the statistical method. The identification of the activity of the Yangsan Fault, which is close to nuclear power plant sites, has been an important issue, but the issue has not yet been resolved in spite of much research work done. Recently, some capable faults were found in the offshore area of Gulupdo Island in the Yellow Sea. It is anticipated that the results of research on both the Yangsan Fault and the reduction of uncertainty in seismic hazard analysis will have a significant influence on the seismic design and safety assessment of nuclear power plants in the future. (author)

  1. Research on the spatial analysis method of seismic hazard for island

    Science.gov (United States)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: its results provide parameters for seismic design at the site scale and are also prerequisite work, at the planning scale, for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS's Model Builder platform.
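As an illustration of the fuzzy comprehensive evaluation step (the index weights, hazard grades and membership values below are hypothetical placeholders, not the SAMSHI values):

```python
# Hedged sketch: a weight vector over 11 hazard indices is combined with a
# membership matrix (index x hazard grade) to grade one spatial unit.
import numpy as np

weights = np.array([0.15, 0.12, 0.10, 0.08, 0.10,
                    0.09, 0.11, 0.08, 0.09, 0.05, 0.03])   # 11 indices, sums to 1

# membership of each index in 4 hazard grades (low, moderate, high, very high)
membership = np.random.default_rng(0).dirichlet(np.ones(4), size=11)

grade_membership = weights @ membership       # weighted-average composition operator
hazard_grade = int(np.argmax(grade_membership))
print("grade memberships:", np.round(grade_membership, 3))
print("assigned hazard grade:", hazard_grade)
```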

  2. Geological Hazards analysis in Urban Tunneling by EPB Machine (Case study: Tehran subway line 7 tunnel)

    Directory of Open Access Journals (Sweden)

    Hassan Bakhshandeh Amnieh

    2016-06-01

    Full Text Available Technological progress in tunneling has led to modern and efficient tunneling methods in vast underground spaces even under inappropriate geological conditions. Identification and access to appropriate and sufficient geological hazard data are key elements to successful construction of underground structures. Choice of the method, excavation machine, and prediction of suitable solutions to overcome undesirable conditions depend on geological studies and hazard analysis. Identifying and investigating the ground hazards in excavating urban tunnels by an EPB machine could augment the strategy for improving soil conditions during excavation operations. In this paper, challenges such as geological hazards, abrasion of the machine cutting tools, clogging around these tools and inside the chamber, diverse work front, severe water level fluctuations, existence of water, and fine-grained particles in the route were recognized in a study of Tehran subway line 7, for which solutions such as low speed boring, regular cutter head checks, application of soil improving agents, and appropriate grouting were presented and discussed. Due to the presence of fine particles in the route, foam employment was suggested as the optimum strategy where no filler is needed.

  3. A sensitivity analysis of hazardous waste disposal site climatic and soil design parameters using HELP3

    Energy Technology Data Exchange (ETDEWEB)

    Adelman, D.D. [Water Resources Engineer, Lincoln, NE (United States); Stansbury, J. [Univ. of Nebraska-Lincoln, Omaha, NE (United States)

    1997-12-31

    The Resource Conservation and Recovery Act (RCRA) Subtitle C, the Comprehensive Environmental Response, Compensation, And Liability Act (CERCLA), and subsequent amendments have formed a comprehensive framework to deal with hazardous wastes on the national level. Key to this waste management is guidance on design (e.g., cover and bottom leachate control systems) of hazardous waste landfills. The objective of this research was to investigate the sensitivity of leachate volume at hazardous waste disposal sites to climatic, soil cover, and vegetative cover (Leaf Area Index) conditions. The computer model HELP3, which has the capability to simulate double bottom liner systems as called for in hazardous waste disposal sites, was used in the analysis. HELP3 was used to model 54 combinations of climatic conditions, disposal site soil surface curve numbers, and leaf area index values to investigate how sensitive disposal site leachate volume was to these three variables. Results showed that leachate volume from the bottom double liner system was not sensitive to these parameters. However, the cover liner system leachate volume was quite sensitive to climatic conditions and less sensitive to Leaf Area Index and curve number values. Since humid locations had considerably more cover liner system leachate volume than arid locations, different design standards may be appropriate for humid conditions than for arid conditions.

  4. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, the highways and the railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other hand: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed to manage risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a

  5. Impact of stinging jellyfish proliferations along south Italian coasts: human health hazards, treatment and social costs.

    Science.gov (United States)

    De Donno, Antonella; Idolo, Adele; Bagordo, Francesco; Grassi, Tiziana; Leomanni, Alessandro; Serio, Francesca; Guido, Marcello; Canitano, Mariarita; Zampardi, Serena; Boero, Ferdinando; Piraino, Stefano

    2014-02-27

    Stinging jellyfish outbreaks represent a health hazard, causing contact dermatitis and systemic reactions. This study investigated the epidemiology, severity, and treatment protocols of jellyfish stings in a coastal area with high tourist development and frequent stinging jellyfish outbreaks of the central Mediterranean (Salento, Southern Italy), and the associated costs for the Italian National Health Service. In 2007-2011, 1,733 bathers (mostly children and females) sought medical assistance following jellyfish stings, the main cause of human pathologies due to contact with marine organisms. The majority of events were reported in the years 2007-2009, whereas the occurrence of cnidarian jellyfish outbreaks has been increasingly reported in the same area since summer 2010. Most symptoms were limited to local and cutaneous reactions; conversely, 8.7% of cases evoked complications, mainly due to allergic reactions. The main drugs used were corticosteroids, locally applied and systemic (46% and 43%, respectively), and with ammonia (74%) as the main non-pharmacological treatment. The estimated cost of jellyfish-related first-aid services along the Salento coastline over the 5-year period was approximately 400,000 Euros. Therefore the management of jellyfish outbreak phenomena needs coordinated research efforts towards a better understanding of underlying ecological mechanisms, together with the adoption of effective prevention policy, mitigation strategies, and appropriate planning of health services at tourist hot spots.

  6. Impact of Stinging Jellyfish Proliferations along South Italian Coasts: Human Health Hazards, Treatment and Social Costs

    Directory of Open Access Journals (Sweden)

    Antonella De Donno

    2014-02-01

    Full Text Available Stinging jellyfish outbreaks represent a health hazard, causing contact dermatitis and systemic reactions. This study investigated the epidemiology, severity, and treatment protocols of jellyfish stings in a coastal area with high tourist development and frequent stinging jellyfish outbreaks of the central Mediterranean (Salento, Southern Italy), and the associated costs for the Italian National Health Service. In 2007–2011, 1,733 bathers (mostly children and females) sought medical assistance following jellyfish stings, the main cause of human pathologies due to contact with marine organisms. The majority of events were reported in the years 2007–2009, whereas the occurrence of cnidarian jellyfish outbreaks has been increasingly reported in the same area since summer 2010. Most symptoms were limited to local and cutaneous reactions; conversely, 8.7% of cases evoked complications, mainly due to allergic reactions. The main drugs used were corticosteroids, locally applied and systemic (46% and 43%, respectively), and with ammonia (74%) as the main non-pharmacological treatment. The estimated cost of jellyfish-related first-aid services along the Salento coastline over the 5-year period was approximately 400,000 Euros. Therefore the management of jellyfish outbreak phenomena needs coordinated research efforts towards a better understanding of underlying ecological mechanisms, together with the adoption of effective prevention policy, mitigation strategies, and appropriate planning of health services at tourist hot spots.

  7. Laser safety and hazard analysis for the temperature stabilized BSLT ARES laser system.

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2003-08-01

    A laser safety and hazard analysis was performed for the temperature stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for Safe Use of Lasers and the 2000 version of the ANSI Standard Z136.6, for Safe Use of Lasers Outdoors. As a result of temperature stabilization of the BSLT laser the operating parameters of the laser had changed requiring a hazard analysis based on the new operating conditions. The ARES laser system is a Van/Truck based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.

  8. Human motion analysis and modeling

    Science.gov (United States)

    Prussing, Keith; Cathcart, J. Michael; Kocher, Brian

    2011-06-01

    Georgia Tech has investigated methods for the detection and tracking of personnel in a variety of acquisition environments. This research effort focused on a detailed phenomenological analysis of human physiology and signatures with the subsequent identification and characterization of potential observables. As a fundamental part of this research effort, Georgia Tech collected motion capture data on an individual for a variety of walking speeds, carrying loads, and load distributions. These data formed the basis for deriving fundamental properties of the individual's motion and supported the development of a physiologically-based human motion model. Subsequently this model aided the derivation and analysis of motion-based observables, particularly changes in the motion of various body components resulting from load variations. This paper will describe the data acquisition process, development of the human motion model, and use of the model in the observable analysis. Video sequences illustrating the motion data and modeling results will also be presented.

  9. Development of Technosols in abandoned mine lands to reduce hazards to ecosystems and human health

    Science.gov (United States)

    Zornoza, Raúl; Martínez-Martínez, Silvia; Acosta, Jose A.; Ángeles Muñoz, M.; Gómez-Garrido, Melisa; Gabarrón, Maria; Gómez-López, Maria Dolores; Faz, Ángel

    2017-04-01

    structure increased, associated with increased microbial biomass and activity and the development of vegetation. Vegetation cover at the end of the study was 65% of the total surface, with the appearance of second-generation individuals, suggesting the self-sustainability of the new ecosystem. Owing to the creation of a soil, metals are immobilized and soil particles are retained by increased soil aggregate stability and vegetation cover; thus, the dispersion of metals to the surroundings by erosion and leaching is minimized, decreasing the hazards for human health and the environment.

  10. Workflow Management of the SCEC Computational Platforms for Physics-Based Seismic Hazard Analysis

    Science.gov (United States)

    Jordan, T. H.; Callaghan, S.; Maechling, P. J.; Juve, G.; Deelman, E.; Rynge, M.; Vahi, K.; Silva, F.

    2012-12-01

    Earthquake simulation has the potential to substantially improve seismic hazard and risk forecasting, but the practicality of using simulation results is limited by the scale and complexity of the computations. Here we will focus on the experience of the Southern California Earthquake Center (SCEC) in applying workflow management tools to facilitate physics-based seismic hazard analysis. This system-level problem can be partitioned into a series of computational pathways according to causal sequences described in terms of conditional probabilities. For example, the exceedance probabilities of shaking intensities at geographically distributed sites conditional on a particular fault rupture (a ground motion prediction model or GMPM) can be combined with the probabilities of different ruptures (an earthquake rupture forecast or ERF) to create a seismic hazard map. Deterministic simulations of ground motions from very large suites (millions) of ruptures, now feasible through high-performance computational facilities such as SCEC's CyberShake Platform, are allowing seismologists to replace empirical GMPMs with physics-based models that more accurately represent wave propagation through heterogeneous geologic structures, such as the sedimentary basins that amplify seismic shaking. One iteration of the current broadband CyberShake hazard model for the Los Angeles region, which calculates ground motions deterministically up to 0.5 Hz and stochastically up to 10 Hz, requires the execution of about 3.3 billion jobs, taking 12.8 million computer hours and producing 10 TB of simulation data. We will show how the scalability and reliability of CyberShake calculations on some of the nation's largest computers have been improved using the Pegasus Workflow Management System. We will also describe the current challenges of scaling these calculations up by an order of magnitude to create a California-wide hazard model, which will be based on the new Uniform California Earthquake
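The conditional-probability combination described above can be sketched compactly; the rupture rates and conditional exceedance probabilities below are hypothetical placeholders, not CyberShake outputs.

```python
# Hedged sketch: an earthquake rupture forecast (annual rupture rates) is
# combined with a ground motion model (exceedance probabilities conditional
# on each rupture) into a hazard curve at one site.
import numpy as np

annual_rate = np.array([1e-2, 4e-3, 5e-4])          # ERF: mean annual rate per rupture
im_levels = np.array([0.1, 0.2, 0.4, 0.8])          # shaking intensity levels (g)

# GMPM: P(IM > level | rupture), one row per rupture (hypothetical values)
p_exceed_given_rup = np.array([[0.60, 0.25, 0.05, 0.005],
                               [0.80, 0.45, 0.15, 0.02 ],
                               [0.95, 0.70, 0.35, 0.08 ]])

rate_exceed = annual_rate @ p_exceed_given_rup      # annual exceedance rate per level
prob_50yr = 1.0 - np.exp(-rate_exceed * 50.0)       # Poisson occurrence assumption
for im, p in zip(im_levels, prob_50yr):
    print(f"P(IM > {im:.1f} g in 50 yr) = {p:.4f}")
```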

  11. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply, respectively, a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard, DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Category (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such update is not necessary.

  12. The importance of censoring in competing risks analysis of the subdistribution hazard.

    Science.gov (United States)

    Donoghoe, Mark W; Gebski, Val

    2017-04-04

    The analysis of time-to-event data can be complicated by competing risks, which are events that alter the probability of, or completely preclude the occurrence of an event of interest. This is distinct from censoring, which merely prevents us from observing the time at which the event of interest occurs. However, the censoring distribution plays a vital role in the proportional subdistribution hazards model, a commonly used method for regression analysis of time-to-event data in the presence of competing risks. We present the equations that underlie the proportional subdistribution hazards model to highlight the way in which the censoring distribution is included in its estimation via risk set weights. By simulating competing risk data under a proportional subdistribution hazards model with different patterns of censoring, we examine the properties of the estimates from such a model when the censoring distribution is misspecified. We use an example from stem cell transplantation in multiple myeloma to illustrate the issue in real data. Models that correctly specified the censoring distribution performed better than those that did not, giving lower bias and variance in the estimate of the subdistribution hazard ratio. In particular, when the covariate of interest does not affect the censoring distribution but is used in calculating risk set weights, estimates from the model based on these weights may not reflect the correct likelihood structure and therefore may have suboptimal performance. The estimation of the censoring distribution can affect the accuracy and conclusions of a competing risks analysis, so it is important that this issue is considered carefully when analysing time-to-event data in the presence of competing risks.
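A minimal sketch of where the censoring distribution enters: in the proportional subdistribution hazards model, a subject who has already experienced the competing event remains in the risk set with a weight built from the Kaplan-Meier estimate of the censoring survival function. The times and event codes below are hypothetical toy data, and the weight shown is the commonly quoted G(t)/G(T_i) form, not the full fitting machinery.

```python
# Hedged sketch: Kaplan-Meier estimate of the censoring survival G(t) and a
# Fine-Gray-style risk-set weight for a subject with an earlier competing event.
import numpy as np

time  = np.array([2.0, 3.5, 4.0, 5.5, 6.0, 7.5, 8.0, 9.0])
event = np.array([1,   2,   0,   1,   2,   0,   1,   0])   # 0=censored, 1=event of interest, 2=competing

order = np.argsort(time)
t_sorted = time[order]
cens_sorted = (event[order] == 0).astype(float)     # censoring treated as the "event" for G
at_risk = np.arange(len(time), 0, -1)
G = np.cumprod(1.0 - cens_sorted / at_risk)         # KM estimate of censoring survival

def G_at(t):
    """Value of G just before time t on the sorted grid."""
    idx = np.searchsorted(t_sorted, t, side="left") - 1
    return 1.0 if idx < 0 else G[idx]

# weight for a subject whose competing event occurred at T_i, evaluated at time t
T_i, t_eval = 3.5, 7.0
w = G_at(t_eval) / G_at(T_i)
print("risk-set weight:", round(w, 3))
```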

  13. The importance of censoring in competing risks analysis of the subdistribution hazard

    Directory of Open Access Journals (Sweden)

    Mark W. Donoghoe

    2017-04-01

    Full Text Available Abstract Background The analysis of time-to-event data can be complicated by competing risks, which are events that alter the probability of, or completely preclude the occurrence of an event of interest. This is distinct from censoring, which merely prevents us from observing the time at which the event of interest occurs. However, the censoring distribution plays a vital role in the proportional subdistribution hazards model, a commonly used method for regression analysis of time-to-event data in the presence of competing risks. Methods We present the equations that underlie the proportional subdistribution hazards model to highlight the way in which the censoring distribution is included in its estimation via risk set weights. By simulating competing risk data under a proportional subdistribution hazards model with different patterns of censoring, we examine the properties of the estimates from such a model when the censoring distribution is misspecified. We use an example from stem cell transplantation in multiple myeloma to illustrate the issue in real data. Results Models that correctly specified the censoring distribution performed better than those that did not, giving lower bias and variance in the estimate of the subdistribution hazard ratio. In particular, when the covariate of interest does not affect the censoring distribution but is used in calculating risk set weights, estimates from the model based on these weights may not reflect the correct likelihood structure and therefore may have suboptimal performance. Conclusions The estimation of the censoring distribution can affect the accuracy and conclusions of a competing risks analysis, so it is important that this issue is considered carefully when analysing time-to-event data in the presence of competing risks.

  14. Mapping the hazard of extreme rainfall by peaks-over-threshold extreme value analysis and spatial regression techniques

    NARCIS (Netherlands)

    Beguería, S.; Vicente-Serrano, S.M.

    2006-01-01

    The occurrence of rainfalls of high magnitude constitutes a primary natural hazard in many parts of the world, and the elaboration of maps showing the hazard of extreme rainfalls has great theoretical and practical interest. In this work a procedure based on extreme value analysis and spatial
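A minimal sketch of the peaks-over-threshold step, assuming synthetic daily rainfall in place of station records: exceedances over a high threshold are fitted with a Generalized Pareto distribution and converted into a return level.

```python
# Hedged sketch: GPD fit to threshold exceedances and a 50-year return level.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
daily_rain = rng.gamma(shape=0.4, scale=8.0, size=40 * 365)   # ~40 years of mm/day (synthetic)

threshold = np.quantile(daily_rain, 0.98)
excess = daily_rain[daily_rain > threshold] - threshold

shape, loc, scale = genpareto.fit(excess, floc=0.0)           # location fixed at 0
lam = excess.size / 40.0                                      # mean exceedances per year

T = 50.0                                                      # return period (years)
return_level = threshold + genpareto.ppf(1.0 - 1.0 / (lam * T),
                                         shape, loc=0.0, scale=scale)
print(f"{T:.0f}-year rainfall return level ≈ {return_level:.1f} mm/day")
```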

  15. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    Science.gov (United States)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (eg. Emmer and Vilimek 2014; Wang, et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain if any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. These parameters are then combined to come up with a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics

  16. Natural Hazards, Second Edition

    Science.gov (United States)

    Rouhban, Badaoui

    Natural disaster loss is on the rise, and the vulnerability of the human and physical environment to the violent forces of nature is increasing. In many parts of the world, disasters caused by natural hazards such as earthquakes, floods, landslides, drought, wildfires, intense windstorms, tsunami, and volcanic eruptions have caused the loss of human lives, injury, homelessness, and the destruction of economic and social infrastructure. Over the last few years, there has been an increase in the occurrence, severity, and intensity of disasters, culminating with the devastating tsunami of 26 December 2004 in South East Asia.Natural hazards are often unexpected or uncontrollable natural events of varying magnitude. Understanding their mechanisms and assessing their distribution in time and space are necessary for refining risk mitigation measures. This second edition of Natural Hazards, (following a first edition published in 1991 by Cambridge University Press), written by Edward Bryant, associate dean of science at Wollongong University, Australia, grapples with this crucial issue, aspects of hazard prediction, and other issues. The book presents a comprehensive analysis of different categories of hazards of climatic and geological origin.

  17. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette Jackson [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide the confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  18. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    Science.gov (United States)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

    The necessity of tsunami hazard assessment for Nuclear Power Plant (NPP) sites has been recognized since the Fukushima event of 2011. It is particularly emphasized because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability of wave heights. The methodology for tsunami hazard analysis is based on seismic hazard analysis. Seismic hazard analysis has been performed using both deterministic and probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the uncertainties of the hazard analysis can be considered through the logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the information on fault sources published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the parameter that differs most from seismic hazard analysis; it can be estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, which was developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulation. Tsunami simulations were performed for 80 cases and the wave parameters were then estimated. To reduce the sensitivity to the location of individual sampling points, the wave parameters were estimated from groups of sampling points. The probability density function of the tsunami height was computed using the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from the probability density function. The tsunami hazards for the sampling groups were calculated. The fractile curves, which show the uncertainties of the input parameters, were estimated from the hazards using a round-robin algorithm. In general, tsunami hazard analysis focuses on the maximum wave heights, but the minimum wave height should be considered
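A simplified sketch of turning scenario recurrence intervals and simulated wave heights into an exceedance curve, assuming lognormal scatter of the wave parameter; all numbers are hypothetical and the logic-tree/fractile machinery is omitted.

```python
# Hedged sketch: annual exceedance rates of tsunami height thresholds from a
# few scenarios with given recurrence intervals and simulated wave heights.
import numpy as np
from scipy.stats import norm

recurrence_yr = np.array([500.0, 1500.0, 4000.0])     # per-scenario recurrence intervals
sim_height_m  = np.array([1.2, 2.8, 5.5])             # simulated max wave heights at the site
sigma_ln = 0.4                                        # assumed lognormal scatter of the wave parameter

heights = np.linspace(0.5, 8.0, 40)                   # height thresholds (m)
rates = 1.0 / recurrence_yr

annual_exceed = np.zeros_like(heights)
for rate, h_med in zip(rates, sim_height_m):
    annual_exceed += rate * norm.sf(np.log(heights / h_med) / sigma_ln)

prob_50yr = 1.0 - np.exp(-annual_exceed * 50.0)       # Poisson occurrence over 50 years
```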

  19. Evaluating the influence of gully erosion on landslide hazard analysis triggered by heavy rainfall

    Science.gov (United States)

    Ruljigaljig, Tjuku; Tsai, Ching-Jun; Peng, Wen-Fei; Yu, Teng-To

    2017-04-01

    During rainstorm periods such as typhoons or heavy rain, the development of gullies can induce large-scale landslides. The purpose of this study is to assess and quantify the existence and development of gullies and their role in triggering landslides within landslide hazard analysis. Firstly, based on multi-scale DEM data, this study uses the wavelet transform to construct an automatic algorithm. The 1-meter DEM is used to evaluate the location and type of gully, and to establish an evaluation model for predicting erosion development. In this study, routes in the Chai-Yi area were studied to clarify the damage potential of roadways from local gullies. The location of gullies is treated as a parameter that reduces the strength parameters. The distribution of the factor of safety (F.S.) is compared with the landslide inventory map. The results of this research could be used to increase the prediction accuracy of landslide hazard analysis due to heavy rainfall.

  20. Mathematical Decision Models Applied for Qualifying and Planning Areas Considering Natural Hazards and Human Dealing

    Science.gov (United States)

    Anton, Jose M.; Grau, Juan B.; Tarquis, Ana M.; Sanchez, Elena; Andina, Diego

    2014-05-01

    The authors have been involved in the use of Mathematical Decision Models (MDM) to improve knowledge and planning of large natural or administrative areas in which natural soils, climate, and agricultural and forest uses were the main factors, human resources and outcomes were also important, and natural hazards were relevant. In one line of work they contributed to the qualification of the lands of the Community of Madrid (CM), an administrative area in central Spain containing a band of mountains to the north, part of the Iberian plateau and river terraces in its centre, and the Madrid metropolis, building on an official UPM study for the CM that qualified lands using a FAO model requiring minimum values for a whole set of Soil Science criteria. From these criteria the authors first set out a complementary additive qualification, and later tried an intermediate qualification combining both using fuzzy logic. The authors were also involved, together with colleagues from Argentina who work with local planners, in the consideration of regions and the selection of management entities for them. At these general levels they adopted multi-criteria MDM, using a weighted PROMETHEE and an ELECTRE-I with the same elicited weights for the criteria and data, and, alongside these, AHP with Expert Choice based on parallel comparisons among similar criteria structured in two levels. The alternatives depend on the case study, and these areas with monsoon climates have natural hazards that are decisive for their selection and qualification through an initial matrix used for ELECTRE and PROMETHEE. For the natural area of Arroyos Menores, south of the town of Rio Cuarto and with the subarea of La Colacha to its north, the loess lands are rich but now suffer from water erosion forming regressive gullies that are spoiling them, and land use alternatives must consider Soil Conservation and Hydraulic Management actions. The soils may be used in diverse, mutually incompatible ways, such as autochthonous forest, high-value forest, traditional
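As an illustration of the AHP weighting step mentioned above (the pairwise comparison values are hypothetical, and the eigenvector method shown is the standard Saaty procedure rather than the authors' Expert Choice session):

```python
# Hedged sketch: derive criterion weights from a Saaty-scale pairwise
# comparison matrix via its principal eigenvector and check consistency.
import numpy as np

A = np.array([[1,   3,   5,   7 ],
              [1/3, 1,   3,   5 ],
              [1/5, 1/3, 1,   3 ],
              [1/7, 1/5, 1/3, 1 ]], dtype=float)   # hypothetical comparisons, 4 criteria

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                        # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)               # consistency index
cr = ci / 0.90                                     # random index RI = 0.90 for n = 4
print("criterion weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))
```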

  1. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    Science.gov (United States)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities to exceed different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions drop out, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA which can be summarized in the following steps: i) to perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) to apply a filtering procedure which uses a cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) to perform high resolution numerical simulations only for these representative scenarios and for a subset of near field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and
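A rough sketch of the filtering idea in step (ii), assuming synthetic scenario "footprints" and a plain k-means clustering; the actual procedure and features are those described by the authors, not this toy example.

```python
# Hedged sketch: cluster many scenario hazard footprints and keep one
# representative scenario per cluster for high-resolution simulation.
import numpy as np
from scipy.cluster.vq import kmeans2, whiten

rng = np.random.default_rng(2)
# each row: offshore wave-height footprint of one scenario at 20 coastal points (synthetic)
footprints = rng.lognormal(mean=0.0, sigma=0.5, size=(500, 20))

features = whiten(footprints)                          # scale columns to unit variance
centroids, labels = kmeans2(features, 12, minit="++")  # 12 clusters of scenarios

representatives = []
for c in range(12):
    members = np.where(labels == c)[0]
    if members.size == 0:
        continue                                       # skip empty clusters, if any
    d = np.linalg.norm(features[members] - centroids[c], axis=1)
    representatives.append(int(members[np.argmin(d)])) # scenario closest to cluster center
print("representative scenario indices:", representatives)
```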

  2. Seismic fragility analysis of a nuclear building based on probabilistic seismic hazard assessment and soil-structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.; Ni, S.; Chen, R.; Han, X.M. [CANDU Energy Inc, Mississauga, Ontario (Canada); Mullin, D. [New Brunswick Power, Point Lepreau, New Brunswick (Canada)

    2016-09-15

    Seismic fragility analyses are conducted as part of seismic probabilistic safety assessment (SPSA) for nuclear facilities. Probabilistic seismic hazard assessment (PSHA) has been undertaken for a nuclear power plant in eastern Canada. The Uniform Hazard Spectra (UHS) obtained from the PSHA are characterized by high-frequency content which differs from the original plant design basis earthquake spectral shape. Seismic fragility calculations for the service building of a CANDU 6 nuclear power plant suggest that the high frequency effects of the UHS can be mitigated through site response analysis with site specific geological conditions and state-of-the-art soil-structure interaction analysis. In this paper, it is shown that by performing a detailed seismic analysis using the latest technology, the conservatism embedded in the original seismic design can be quantified and the seismic capacity of the building in terms of High Confidence of Low Probability of Failure (HCLPF) can be improved. (author)
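For context, a commonly used fragility relation behind HCLPF values (the capacities and beta values below are hypothetical, not the plant-specific results of this study):

```python
# Hedged sketch: HCLPF from a lognormal fragility model with median capacity
# Am and log-standard deviations for randomness (beta_R) and uncertainty (beta_U),
# using the conventional 95%-confidence / 5%-failure-probability definition.
import math

Am = 0.9                   # median seismic capacity (g), hypothetical
beta_R, beta_U = 0.24, 0.32

hclpf = Am * math.exp(-1.645 * (beta_R + beta_U))
print(f"HCLPF ≈ {hclpf:.2f} g")
```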

  3. Pasteurised milk and implementation of HACCP (Hazard Analysis Critical Control Point)

    Directory of Open Access Journals (Sweden)

    T.B Murdiati

    2004-10-01

    Full Text Available The purpose of pasteurisation is to destroy pathogenic bacteria without affecting the taste, flavour, and nutritional value. A study on the implementation of HACCP (Hazard Analysis Critical Control Point) in producing pasteurised milk was carried out in four processing units of pasteurised milk, one in Jakarta, two in Bandung and one in Bogor. The critical control points in the production line were identified. Milk samples were collected from the critical points and were analysed for the total number of microbes. Antibiotic residues were detected in raw milk. The study indicated that one unit in Bandung and one unit in Jakarta produced pasteurised milk with a lower number of microbes than the other units, due to better management and control applied along the production chain. Penicillin residues were detected in the raw milk used by the unit in Bogor. Six critical points, the hazards that might arise at those points, and how to prevent those hazards were identified. A quality assurance system such as HACCP would be able to produce high-quality and safe pasteurised milk, and should be implemented gradually.

  4. Laser hazard analysis for airborne AURA (Big Sky variant) Proteus platform.

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2004-02-01

    A laser safety and hazard analysis was performed for the airborne AURA (Big Sky Laser Technology) lidar system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for the Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for the Safe Use of Lasers Outdoors. The AURA lidar system is installed in the instrument pod of a Proteus airframe and is used to perform laser interaction experiments and tests at various national test sites. The targets are located at various distances or ranges from the airborne platform. In order to protect personnel who may be in the target area and may be subjected to exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength, calculate the Nominal Ocular Hazard Distance (NOHD), and determine the maximum 'eye-safe' dwell times for various operational altitudes and conditions. It was also necessary to calculate the appropriate minimum Optical Density (ODmin) of the laser safety eyewear used by authorized personnel who may receive hazardous exposures during ground-based operations of the airborne AURA laser system (system alignment and calibration).
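A hedged sketch of the textbook small-source relations behind the quantities named above (NOHD and minimum optical density); the pulse energy, MPE, beam diameter and divergence below are hypothetical, not the AURA system parameters.

```python
# Hedged sketch: NOHD from the standard beam-spread relation and the minimum
# eyewear optical density from a worst-case aperture fluence. Numbers are
# illustrative only; a real analysis follows ANSI Z136.1/Z136.6 in detail.
import math

E_pulse = 0.35        # emergent energy per pulse (J), hypothetical
mpe = 5.0e-6          # MPE for the wavelength/pulse regime (J/cm^2), hypothetical
a = 0.7               # emergent beam diameter (cm), hypothetical
phi = 0.5e-3          # full-angle beam divergence (rad), hypothetical

# NOHD: range at which the expanding beam's fluence falls to the MPE
nohd_cm = (math.sqrt(4.0 * E_pulse / (math.pi * mpe)) - a) / phi
print(f"NOHD ≈ {nohd_cm / 100.0:.0f} m")

# minimum optical density for a worst-case exposure at the exit aperture
H0 = 4.0 * E_pulse / (math.pi * a ** 2)     # fluence at the aperture (J/cm^2)
od_min = math.log10(H0 / mpe)
print(f"OD_min ≈ {od_min:.1f}")
```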

  5. A first hazard analysis of the Harrat Ash Shamah volcanic field, Syria-Jordan Borderline

    Science.gov (United States)

    Cagnan, Zehra; Akkar, Sinan; Moghimi, Saed

    2017-04-01

    The northernmost part of the Saudi Cenozoic Volcanic Fields, the 100,000 km2 Harrat Ash Shamah has hosted some of the most recent volcanic eruptions along the Syria-Jordan borderline. With rapid growth of the cities in this region, exposure to any potential renewed volcanism increased considerably. We present here a first-order probabilistic hazard analysis related to new vent formation and subsequent lava flow from Harrat Ash Shamah. The 733 visible eruption vent sites were utilized to develop a probability density function for new eruption sites using Gaussian kernel smoothing. This revealed a NNW striking zone of high spatial hazard surrounding the cities Amman and Irbid in Jordan. The temporal eruption recurrence rate is estimated to be approximately one vent per 3500 years, but the temporal record of the field is so poorly constrained that the lower and upper bounds for the recurrence interval are 17,700 yrs and 70 yrs, respectively. A Poisson temporal model is employed within the scope of this study. In order to treat the uncertainties associated with the spatio-temporal models as well as size of the area affected by the lava flow, the logic tree approach is adopted. For the Syria-Jordan borderline, the spatial variation of volcanic hazard is computed as well as uncertainty associated with these estimates.
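A minimal sketch of the two ingredients described above, with synthetic vent coordinates standing in for the 733 mapped vents: a Gaussian kernel density for the spatial hazard and a Poisson model for the chance of at least one new vent in a time window.

```python
# Hedged sketch: spatial vent-density estimate plus a Poisson temporal model.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)
# synthetic vent coordinates (lon, lat) as a (2, 733) array
vents_xy = rng.normal(loc=[[36.5], [32.0]], scale=[[0.6], [0.9]], size=(2, 733))

kde = gaussian_kde(vents_xy)                      # Gaussian kernel density of past vents
site = np.array([[35.9], [31.95]])                # a hypothetical site of interest
print("relative spatial intensity at site:", float(kde(site)))

recurrence_yr = 3500.0                            # ~1 vent per 3500 yr (field-average estimate)
t = 10000.0                                       # exposure window (years)
p_any_eruption = 1.0 - np.exp(-t / recurrence_yr)
print(f"P(>=1 new vent in {t:.0f} yr) = {p_any_eruption:.2f}")
```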

  6. The hazard analysis and critical control point system in food safety.

    Science.gov (United States)

    Herrera, Anavella Gaitan

    2004-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive method of ensuring food safety. Its objectives are the identification of consumer safety hazards that can occur in the production line and the establishment of a control process to guarantee a safer product for the consumer; it is based on the identification of potential hazards to food safety and on measures aimed at preventing these hazards. HACCP is the system of choice in the management of food safety. The principles of HACCP are applicable to all phases of food production, including basic husbandry practices, food preparation and handling, food processing, food service, distribution systems, and consumer handling and use. The HACCP system is involved in every aspect of food safety production (according to the UN Food and Agriculture Organization [FAO] and the International Commission on Microbiological Specifications for Foods [ICMSF]). The most basic concept underlying the HACCP system is that of prevention rather than inspection. The control of processes and conditions comprises the critical control point (CCP) element. HACCP is simply a methodical, flexible, and systematic application of the appropriate science and technology for planning, controlling, and documenting the safe production of foods. The successful application of HACCP requires the full commitment and involvement of management and the workforce, using a multidisciplinary approach that should include, as appropriate, expertise in agronomy, veterinary health, microbiology, public health, food technology, environmental health, chemistry, engineering, and so on according to the particular situation. Application of the HACCP system is compatible with the implementation of total quality management (TQM) systems such as the ISO 9000 series.

  7. Evaluation of hazardous chemicals in edible insects and insect-based food intended for human consumption.

    Science.gov (United States)

    Poma, Giulia; Cuykx, Matthias; Amato, Elvio; Calaprice, Chiara; Focant, Jean Francois; Covaci, Adrian

    2017-02-01

    Due to the rapid increase in world population, the waste of food and resources, and non-sustainable food production practices, the use of alternative food sources is currently strongly promoted. In this perspective, insects may represent a valuable alternative to main animal food sources due to their nutritional value and sustainable production. However, edible insects may be perceived as an unappealing food source and are indeed rarely consumed in developed countries. The food safety of edible insects can thus contribute to the process of acceptance of insects as an alternative food source, changing the perception of developed countries regarding entomophagy. In the present study, the levels of organic contaminants (i.e. flame retardants, PCBs, DDT, dioxin compounds, pesticides) and metals (As, Cd, Co, Cr, Cu, Ni, Pb, Sn, Zn) were investigated in composite samples of several species of edible insects (greater wax moth, migratory locust, mealworm beetle, buffalo worm) and four insect-based food items currently commercialized in Belgium. The organic chemical mass fractions were relatively low (PCBs: 27-2065 pg/g ww; OCPs: 46-368 pg/g ww; BFRs: up to 36 pg/g ww; PFRs 783-23800 pg/g ww; dioxin compounds: up to 0.25 pg WHO-TEQ/g ww) and were generally lower than those measured in common animal products. The untargeted screening analysis revealed the presence of vinyltoluene, tributylphosphate (present in 75% of the samples), and pirimiphos-methyl (identified in 50% of the samples). The levels of Cu and Zn in insects were similar to those measured in meat and fish in other studies, whereas As, Co, Cr, Pb, and Sn levels were relatively low in all samples, suggesting that consumption of these insect species poses no additional hazards in comparison to the more commonly consumed animal products. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Criticality analysis for hazardous materials transportation; Classificacao da criticidade das rotas do transporte rodoviario de produtos perigosos da BRASKEM

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Katia; Brady, Mariana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Diniz, Americo [BRASKEM S.A., Sao Paulo, SP (Brazil)

    2008-07-01

    The poor condition of Brazilian roads forces companies to be more demanding about the transportation of hazardous materials, both to avoid accidents and releases and to be prepared to contain any releases that could reach communities and water sources. To address this situation, DNV and BRASKEM developed a risk analysis methodology called Criticality Analysis for Hazardous Materials Transportation. The objective of this methodology is to identify the most critical points along the routes so that actions can be taken to prevent accidents. (author)

  9. Balzac and human gait analysis.

    Science.gov (United States)

    Collado-Vázquez, S; Carrillo, J M

    2015-05-01

    People have been interested in movement analysis in general, and gait analysis in particular, since ancient times. Aristotle, Hippocrates, Galen, Leonardo da Vinci and Honoré de Balzac all used observation to analyse the gait of human beings. The purpose of this study is to compare Honoré de Balzac's writings with a scientific analysis of human gait. The sources analysed were Honoré de Balzac's Theory of walking and other works by that author referring to gait. Honoré de Balzac had an interest in gait analysis, as demonstrated by his descriptions of characters, which often include references to their way of walking. He also wrote a treatise entitled Theory of walking (Théorie de la démarche) in which he employed his keen observation skills to define gait using a literary style. He stated that the walking process is divided into phases and listed the factors that influence gait, such as personality, mood, height, weight, profession and social class, and also provided a description of the correct way of walking. Balzac considered gait analysis to be very important and this is reflected in both his character descriptions and Theory of walking, his analytical observation of gait. In our own technology-dominated times, this serves as a reminder of the importance of observation. Copyright © 2011 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  10. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steve [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ma, Zhegang [Idaho National Lab. (INL), Idaho Falls, ID (United States); Spears, Bob [Idaho National Lab. (INL), Idaho Falls, ID (United States); Szilard, Ronaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kosbab, Ben [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-07-26

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC) program, Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface searches; and execution of sample calculations demonstrating the capabilities of the toolkit in performing a risk-informed external hazards safety analysis.

  11. QMRA (quantitative microbial risk assessment) and HACCP (hazard analysis and critical control points) for management of pathogens in wastewater and sewage sludge treatment and reuse.

    Science.gov (United States)

    Westrell, T; Schönning, C; Stenström, T A; Ashbolt, N J

    2004-01-01

    Hazard Analysis and Critical Control Points (HACCP) was applied for identifying and controlling exposure to pathogenic microorganisms encountered during normal sludge and wastewater handling at a 12,500 m3/d treatment plant utilising tertiary wastewater treatment and mesophilic sludge digestion. The hazardous scenarios considered were human exposure during treatment, handling, soil application and crop consumption, and exposure via water at the wetland-area and recreational swimming. A quantitative microbial risk assessment (QMRA), including rotavirus, adenovirus, haemorrhagic E. coli, Salmonella, Giardia and Cryptosporidium, was performed in order to prioritise pathogen hazards for control purposes. Human exposures were treated as individual risks but also related to the endemic situation in the general population. The highest individual health risk from a single exposure was via aerosols for workers at the belt press for sludge dewatering (virus infection risk = 1). The largest impact on the community would arise if children ingested sludge at the unprotected storage site, although in the worst-case situation the largest number of infections would arise through vegetables fertilised with sludge and eaten raw (not allowed in Sweden). Acceptable risk for various hazardous scenarios, treatment and/or reuse strategies could be tested in the model.
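
    The dose-response step at the core of a QMRA like the one described above can be illustrated with a minimal sketch. The exponential model below is one of the standard single-hit dose-response models; the dose, dose-response parameter, and exposure frequency are placeholder values, not figures from this study.

```python
import math

def exponential_dose_response(dose, r):
    """Probability of infection from a single exposure, exponential single-hit model."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(per_event_risk, events_per_year):
    """Combine independent exposure events into an annual infection risk."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

# Illustrative values only (not taken from the study):
dose = 10.0   # ingested organisms per exposure event
r = 0.2       # pathogen-specific dose-response parameter
p_event = exponential_dose_response(dose, r)
print(f"Per-event infection risk: {p_event:.3f}")
print(f"Annual risk for 12 exposures: {annual_risk(p_event, 12):.3f}")
```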

  12. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability to anticipate ground shaking from future strong earthquakes before they can be appropriately used for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in identifying the limits and advantages of the different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  13. Using the Auditory Hazard Assessment Algorithm for Humans (AHAAH) Software, Beta Release W93e

    Science.gov (United States)

    2009-09-01

    Price, G. R. (1997). "Noise hazard issues in the design of airbags." Invited seminar presented to GM-NAO R&D Center, Warren, MI. Price, G. R. (1994). "Hazard from impulse noise: Problems and prospects," J. Acoust. Soc. Price, G. R. (1987). "… from intense impulses from a mathematical model of the ear." Paper in proceedings of Inter-Noise 87, Beijing, China, Sept. 1987.

  14. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    Science.gov (United States)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs to the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore, we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located
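
    The record breaks off at the description of the 1956 Dunaharaszti event, but the performance-based calculation it describes can be sketched in a few lines. Following the Kramer and Mayfield idea of combining PSHA disaggregation with a conditional liquefaction model, the mean annual rate of liquefaction is the sum, over all ground-motion and magnitude bins, of the bin's annual rate times the conditional probability of liquefaction in that bin. The bin rates and probabilities below are illustrative numbers only, not values from the Hungarian study.

```python
import numpy as np

def liquefaction_return_period(delta_lambda, p_liq_given_bin):
    """
    Performance-based estimate of the return period of liquefaction
    (Kramer & Mayfield style): sum the conditional probability of
    liquefaction over all (PGA, magnitude) bins, weighted by the annual
    rate of each bin taken from the PSHA disaggregation.
    """
    rate = np.sum(delta_lambda * p_liq_given_bin)
    return 1.0 / rate if rate > 0 else np.inf

# Illustrative 3 PGA bins x 2 magnitude bins: annual rates and conditional probabilities
delta_lambda = np.array([[1e-2, 5e-3],
                         [2e-3, 1e-3],
                         [4e-4, 2e-4]])
p_liq = np.array([[0.05, 0.10],
                  [0.30, 0.45],
                  [0.70, 0.85]])
print(f"Return period of liquefaction ~ {liquefaction_return_period(delta_lambda, p_liq):.0f} years")
```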

  15. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

    Science.gov (United States)

    Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.

    2004-01-01

    ...ground-motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected, and we modified the relations to be compatible with global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.
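
    The two hazard levels quoted above (2% and 10% probability of exceedance in 50 years) correspond, under the usual Poisson assumption, to mean return periods of roughly 2,475 and 475 years. A minimal sketch of that conversion:

```python
import math

def return_period(p_exceedance, t_years):
    """Mean return period implied by a probability of exceedance p over t years (Poisson assumption)."""
    return -t_years / math.log(1.0 - p_exceedance)

for p in (0.02, 0.10):
    print(f"{p:.0%} in 50 years -> return period ~ {return_period(p, 50):.0f} years")
```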

  16. Analysis of Flood Hazards for the Materials and Fuels Complex at the Idaho National Laboratory Site

    Energy Technology Data Exchange (ETDEWEB)

    Skaggs, Richard; Breithaupt, Stephen A.; Waichler, Scott R.; Kim, Taeyun; Ward, Duane L.

    2010-11-01

    Researchers at Pacific Northwest National Laboratory conducted a flood hazard analysis for the Materials and Fuels Complex (MFC) site located at the Idaho National Laboratory (INL) site in southeastern Idaho. The general approach for the analysis was to determine the maximum water elevation levels associated with the design-basis flood (DBFL) and compare them to the floor elevations at critical building locations. Two DBFLs for the MFC site were developed using different precipitation inputs: probable maximum precipitation (PMP) and 10,000-year recurrence interval precipitation. Both precipitation inputs were used to drive a watershed runoff model for the surrounding upland basins and the MFC site. Outflows modeled with the Hydrologic Engineering Center's Hydrologic Modeling System (HEC-HMS) were input to the Hydrologic Engineering Center's River Analysis System (HEC-RAS) hydrodynamic flood routing model.

  17. Uncertainty Analysis of the Potential Hazard of MCCI during Severe Accidents for the CANDU6 Plant

    Directory of Open Access Journals (Sweden)

    Sooyong Park

    2015-01-01

    This paper illustrates the application of a severe accident analysis computer program to the uncertainty analysis of molten corium-concrete interaction (MCCI) phenomena in severe accidents at a CANDU6-type plant. The potential hazard of MCCI is a failure of the reactor building owing to the possibility of a calandria vault floor melt-through, even if the containment filtered vent system is operated. Meanwhile, the MCCI still has large uncertainties in several phenomena, such as the melt spreading area and the extent of water ingression into a continuous debris layer. The purpose of this study is to evaluate the MCCI in the calandria vault floor via an uncertainty analysis using the ISAAC program for the CANDU6.

  18. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    Science.gov (United States)

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  19. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    Science.gov (United States)

    Grasso, S.; Maugeri, M.

    After the Summit held in Washington on August 20-22 2001 to plan the first World Conference on the mitigation of Natural Hazards, a Group for the analysis of Natural Hazards within the Mediterranean area has been formed. The Group has so far identified the following hazards: (1) Seismic hazard (including the hazard for historical buildings); (2) Hazard linked to the quantity and quality of water; (3) Landslide hazard; (4) Volcanic hazard. The analysis of such hazards implies the creation and management of data banks, which can only be used if the data are properly georeferenced to allow them to be cross-referenced. The results obtained must therefore be represented on georeferenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for the Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). Nowadays the south-eastern area of Sicily, called the "Iblea" seismic area, is considered one of the most intense seismic zones in Italy, based on its past and current seismic history and on the typology of its civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. Thus the correct evaluation of seismic hazard is highly affected by risk factors related to the geological nature and geotechnical properties of the soils. The effect of local geotechnical conditions on damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either

  20. Chapter 12: Human microbiome analysis.

    Directory of Open Access Journals (Sweden)

    Xochitl C Morgan

    Humans are essentially sterile during gestation, but during and after birth, every body surface, including the skin, mouth, and gut, becomes host to an enormous variety of microbes, bacterial, archaeal, fungal, and viral. Under normal circumstances, these microbes help us to digest our food and to maintain our immune systems, but dysfunction of the human microbiota has been linked to conditions ranging from inflammatory bowel disease to antibiotic-resistant infections. Modern high-throughput sequencing and bioinformatic tools provide a powerful means of understanding the contribution of the human microbiome to health and its potential as a target for therapeutic interventions. This chapter will first discuss the historical origins of microbiome studies and methods for determining the ecological diversity of a microbial community. Next, it will introduce shotgun sequencing technologies such as metagenomics and metatranscriptomics, the computational challenges and methods associated with these data, and how they enable microbiome analysis. Finally, it will conclude with examples of the functional genomics of the human microbiome and its influences upon health and disease.

  1. Assessment of human health hazard due to metal uptake via fish ...

    African Journals Online (AJOL)

    The estimated daily intake (EDI) of heavy metals from the respective types of fish can be ranked as Fe > Cu > As > Cd > Pb, with values higher than the Provisional Tolerable Weekly Intake (PTWI) for these metals. Therefore, the consumption of the fish samples is questionable. The Target Hazard Quotient (THQ) was used in the health ...
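
    The EDI and THQ quantities named in this record follow a standard screening-level formulation: a chronic intake normalized by body weight, compared against an oral reference dose. A minimal sketch with placeholder concentrations and intake rates, not values from the study:

```python
def estimated_daily_intake(conc_mg_per_kg, intake_kg_per_day, body_weight_kg):
    """EDI in mg per kg body weight per day."""
    return conc_mg_per_kg * intake_kg_per_day / body_weight_kg

def target_hazard_quotient(edi, rfd):
    """THQ = chronic daily intake divided by the oral reference dose; THQ > 1 flags potential concern."""
    return edi / rfd

# Placeholder values (not from the study): e.g. Cd in fish muscle
edi = estimated_daily_intake(conc_mg_per_kg=0.05, intake_kg_per_day=0.06, body_weight_kg=60.0)
print(f"EDI = {edi:.2e} mg/kg-bw/day, THQ = {target_hazard_quotient(edi, rfd=1e-3):.2f}")
```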

  2. Fire Hazard Analysis for the Cold Vacuum Drying facility (CVD) Facility

    CERN Document Server

    Singh, G

    2000-01-01

    The CVDF is a nonreactor nuclear facility that will process the Spent Nuclear Fuels (SNF) presently stored in the 105-KE and 105-KW SNF storage basins. Multi-canister overpacks (MCOs) will be loaded (filled) with K Basin fuel transported to the CVDF. The MCOs will be processed at the CVDF to remove free water from the fuel cells (packages). Following processing at the CVDF, the MCOs will be transported to the CSB for interim storage until a long-term storage solution can be implemented. This operation is expected to start in November 2000. A Fire Hazard Analysis (FHA) is required for all new facilities and all nonreactor nuclear facilities, in accordance with U.S. Department of Energy (DOE) Order 5480.7A, Fire Protection. This FHA has been prepared in accordance with DOE 5480.7A and HNF-PRO-350, Fire Hazard Analysis Requirements. Additionally, requirements or criteria contained in DOE, Richland Operations Office (RL) RL Implementing Directive (RLID) 5480.7, Fire Protection, or other DOE documentation are cite...

  3. Towards a probabilistic tsunami hazard analysis for the Gulf of Cadiz

    Science.gov (United States)

    Løvholt, Finn; Urgeles, Roger

    2017-04-01

    Landslides and volcanic flank collapses constitute a significant portion of all known tsunami sources, and they are less constrained geographically than earthquakes as they are not tied to large fault zones. While landslides have mostly produced local tsunamis historically, prehistoric evidence shows that landslides can also produce ocean-wide tsunamis. Because the landslide-induced tsunami probability is more difficult to quantify than that induced by earthquakes, the landslide tsunami hazard is less well understood. To improve our understanding and methodologies to deal with this hazard, we here present results and methods for a preliminary landslide probabilistic tsunami hazard assessment (LPTHA) for submerged landslides in the Gulf of Cadiz. The present literature on LPTHA is sparse, and studies have so far been separated into two groups, the first based on observed magnitude-frequency distributions (MFDs), the second based on simplified geotechnical slope stability analysis. We argue that the MFD-based approach is best suited when a sufficient amount of data covering a wide range of volumes is available, although uncertainties in the dating of the landslides often represent a potentially large source of bias. To this end, the relatively rich availability of landslide data in the Gulf of Cadiz makes this area suitable for developing and testing LPTHA models. In the presentation, we will first explore the landslide data and statistics, including different spatial factors such as slope versus volume relationships, faults etc. Examples of how random realizations can be used to distribute tsunami sources over the study area will be demonstrated. Furthermore, computational strategies for simulating both the landslide and the tsunami generation in a simplified way will be described. To this end, we use a depth-averaged viscoplastic landslide model coupled to a numerical tsunami model to represent a set of idealized tsunami sources, which are in turn
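
    The record is truncated, but the MFD-based approach it favours can be illustrated with a small sketch: drawing random landslide volumes from a truncated power-law magnitude-frequency distribution, as one ingredient of the random realizations mentioned above. The exponent and volume bounds are illustrative assumptions, not values from the Gulf of Cadiz dataset.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_landslide_volumes(n, v_min, v_max, beta):
    """
    Draw n landslide volumes (km^3) from a truncated power-law
    magnitude-frequency distribution with exponent beta, using inverse
    transform sampling between v_min and v_max.
    """
    u = rng.random(n)
    a = v_min ** (1.0 - beta)
    b = v_max ** (1.0 - beta)
    return (a + u * (b - a)) ** (1.0 / (1.0 - beta))

volumes = sample_landslide_volumes(1000, v_min=0.001, v_max=10.0, beta=1.5)
print(f"median volume: {np.median(volumes):.4f} km^3, max: {volumes.max():.2f} km^3")
```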

  4. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    Science.gov (United States)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short

  5. A prototype web-GIS application for risk analysis of natural hazards in Switzerland

    Science.gov (United States)

    Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien

    2016-04-01

    Following changes in the Swiss subsidy system in January 2008, the Swiss cantons and the Federal Office for the Environment (FOEN) were forced to prioritize different natural hazard protection projects based on their cost-effectiveness, as a response to limited financial resources (Bründl et al., 2009). For this purpose, applications such as EconoMe (OFEV, 2016) and Valdorisk (DGE, 2016) were developed for risk evaluation and prioritization of mitigation projects. These tools serve as a useful decision-making instrument for the community of practitioners and responsible authorities for natural hazard risk management in Switzerland. However, there are several aspects which could be improved, in particular the integration and visualization of spatial information interactively through a web-GIS interface for better risk planning and evaluation. Therefore, in this study, we aim to develop an interactive web-GIS application based on the risk concepts applied in Switzerland. The purpose of this tool is to provide a rapid evaluation of risk before and after protection measures, and to test the efficiency of measures by using a simplified cost-benefit analysis within the context of different protection projects. This application allows the integration of the different layers that are necessary to calculate risk, in particular hazard intensity (vector) maps for different scenarios (such as 30-, 100- and 300-year return periods based on Swiss guidelines), exposed objects (such as buildings) and vulnerability information for these objects. Based on the provided information and additional parameters, risk is calculated automatically and the results are visualized within the web-GIS interface of the application. The users can modify this input information and these parameters to create different risk scenarios. Based on the resultant risk scenarios, the users can propose and visualize (preliminary) risk reduction measures before realizing the actual design and dimensions of such protective
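
    The record is truncated, but the simplified cost-benefit logic it describes can be sketched as follows: annual risk is summed over hazard scenarios as frequency times exposed value times vulnerability, and the benefit of a measure is the resulting risk reduction compared with its annualized cost. All numbers below are placeholders, not values from EconoMe, Valdorisk, or the prototype itself.

```python
def annual_risk(scenarios):
    """
    Simplified annual risk: sum over hazard scenarios of
    (annual frequency = 1/return period) x (exposed value) x (vulnerability 0..1).
    """
    return sum(1.0 / t * value * vuln for t, value, vuln in scenarios)

# Illustrative scenarios: (return period in years, exposed value, vulnerability)
before = [(30, 2.0e6, 0.10), (100, 5.0e6, 0.30), (300, 8.0e6, 0.50)]
after  = [(30, 2.0e6, 0.02), (100, 5.0e6, 0.10), (300, 8.0e6, 0.35)]

risk_reduction = annual_risk(before) - annual_risk(after)
annual_cost_of_measure = 15000.0  # annualized construction plus maintenance cost
print(f"Benefit-cost ratio ~ {risk_reduction / annual_cost_of_measure:.2f}")
```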

  6. Hazardous waste transportation risk assessment for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement -- human health endpoints

    Energy Technology Data Exchange (ETDEWEB)

    Hartmann, H.M.; Policastro, A.J.; Lazaro, M.A.

    1994-03-01

    In this presentation, a quantitative methodology for assessing the risk associated with the transportation of hazardous waste (HW) is proposed. The focus is on identifying air concentrations of HW that correspond to specific human health endpoints.

  7. Analytical Problems Associated with the Analysis of Metals in a Simulated Hazardous Waste

    Science.gov (United States)

    Dunnivant, F. M.

    2002-06-01

    Analysis of samples subject to physical and chemical interferences can greatly enhance the learning experience in instrumental analysis and environmental chemistry laboratories. This article describes a project-based experience in which students analyze simulated hazardous waste samples (carbonated beverages) for calcium by six techniques: (i) flame atomic absorption spectroscopy (FAAS) using external standard calibration, (ii) FAAS using external standard calibration with a releasing agent (Sr), (iii) FAAS using standard addition, (iv) FAAS using standard addition with a releasing agent (Sr), (v) ethylenediaminetetraacetic acid (EDTA) titration, and (vi) Ca-ion-specific electrode. Not surprisingly, students find that these different techniques yield conflicting results and their assignment is to explain their data in the format of a peer-reviewed journal article. Students report that this series of lab experiments is challenging and highly rewarding. Laboratory experiences such as this one should significantly improve the student's ability to analyze problematic samples and interpret experimental data.
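
    The standard-addition technique used in several of the exercises above extrapolates a linear fit of signal versus added concentration back to the concentration axis; the magnitude of the x-intercept is the analyte concentration in the measured solution. A minimal sketch with made-up absorbance readings:

```python
import numpy as np

def standard_addition_concentration(added_conc, signal):
    """
    Standard-addition calibration: fit signal = m*C_added + b; the analyte
    concentration in the measured solution equals b/m, the magnitude of
    the x-intercept of the fitted line.
    """
    m, b = np.polyfit(added_conc, signal, 1)
    return b / m

# Illustrative data: Ca standard additions (mg/L) and absorbance readings
added = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
absorbance = np.array([0.120, 0.181, 0.243, 0.302, 0.365])
print(f"Ca in measured solution ~ {standard_addition_concentration(added, absorbance):.2f} mg/L")
```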

  8. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    Science.gov (United States)

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne disease for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. In addition to the microbial hazards posed by the pathogens previously found in Nham, sodium nitrite and metal were identified as chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points. These steps are the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause chemical hazards to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.
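
    The critical limits reported in this record (nitrite between 100 and 200 ppm, pH below 4.6 after fermentation, no metal fragments at stuffing) lend themselves to a simple monitoring check. A minimal sketch, not an implementation of any plant's actual HACCP records:

```python
def check_nham_ccps(nitrite_ppm, ph_after_fermentation, metal_detected):
    """
    Screen a batch against the critical limits reported for Nham:
    nitrite 100-200 ppm in the mixture, pH below 4.6 after fermentation,
    and no metal clip fragments found during visual inspection at stuffing.
    Returns a list of deviations (an empty list means all CCPs are within limits).
    """
    deviations = []
    if not (100 <= nitrite_ppm <= 200):
        deviations.append(f"nitrite {nitrite_ppm} ppm outside 100-200 ppm")
    if ph_after_fermentation >= 4.6:
        deviations.append(f"pH {ph_after_fermentation} not below 4.6")
    if metal_detected:
        deviations.append("metal fragment detected during stuffing inspection")
    return deviations

print(check_nham_ccps(nitrite_ppm=150, ph_after_fermentation=4.4, metal_detected=False))
print(check_nham_ccps(nitrite_ppm=250, ph_after_fermentation=4.8, metal_detected=False))
```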

  9. A Quantitative Risk Analysis Method for the High Hazard Mechanical System in Petroleum and Petrochemical Industry

    Directory of Open Access Journals (Sweden)

    Yang Tang

    2017-12-01

    The high hazard mechanical system (HHMS) has three characteristics in the petroleum and petrochemical industry (PPI): high risk, high cost, and high technology requirements. For an HHMS, part, component, and subsystem failures will result in varying degrees and various types of risk consequences, including unexpected downtime, production losses, economic costs, safety accidents, and environmental pollution. Thus, obtaining the quantitative risk level and distribution in an HHMS to control major risk accidents and ensure safe production is of vital importance. However, the structure of the HHMS is more complex than some other systems, making the quantitative risk analysis process more difficult. Additionally, a variety of uncertain risk data hinder the realization of quantitative risk analysis. Only a few quantitative risk analysis techniques and studies for the HHMS exist, especially in the PPI. Therefore, a study on the quantitative risk analysis method for the HHMS was completed to obtain the risk level and distribution of high-risk objects. Firstly, Fuzzy Set Theory (FST) was applied to address the uncertain risk data for the occurrence probability (OP) and consequence severity (CS) in the risk analysis process. Secondly, a fuzzy fault tree analysis (FFTA) and a fuzzy event tree analysis (FETA) were used to achieve quantitative risk analysis and calculation. Thirdly, a fuzzy bow-tie model (FBTM) was established to obtain a quantitative risk assessment result according to the analysis results of the FFTA and FETA. Finally, the feasibility and practicability of the method were verified with a case study on the quantitative risk analysis of one reciprocating pump system (RPS). The quantitative risk analysis method for the HHMS can provide more accurate and scientific data support for the development of Asset Integrity Management (AIM) systems in the PPI.
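
    One common way to carry out the fuzzy gate arithmetic described above is to represent each basic-event probability as a triangular fuzzy number and propagate it componentwise through AND and OR gates, an approximation frequently used in FFTA. The basic-event values below are illustrative, not taken from the reciprocating pump case study:

```python
import numpy as np

# A triangular fuzzy probability is written as (low, modal, high).
def fuzzy_and(events):
    """AND gate: the output probability is the product of the inputs, applied componentwise."""
    return tuple(np.prod([e[i] for e in events]) for i in range(3))

def fuzzy_or(events):
    """OR gate: 1 - product(1 - p_i), applied componentwise."""
    return tuple(1.0 - np.prod([1.0 - e[i] for e in events]) for i in range(3))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number, used as a crisp point estimate."""
    return sum(tfn) / 3.0

# Illustrative basic-event probabilities for a pump subsystem (not from the study):
seal_leak   = (0.010, 0.020, 0.040)
valve_stuck = (0.005, 0.010, 0.020)
sensor_fail = (0.020, 0.030, 0.050)

top_event = fuzzy_or([fuzzy_and([seal_leak, valve_stuck]), sensor_fail])
print("Top-event fuzzy probability:", tuple(round(float(x), 4) for x in top_event))
print("Crisp estimate:", round(defuzzify(top_event), 4))
```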

  10. A Sensitivity Study for an Evaluation of Input Parameters Effect on a Preliminary Probabilistic Tsunami Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Hyun-Me; Kim, Min Kyu; Choi, In-Kil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Sheen, Dong-Hoon [Chonnam National University, Gwangju (Korea, Republic of)

    2014-10-15

    Tsunami hazard analysis has been based on seismic hazard analysis. Seismic hazard analysis has been performed using both the deterministic method and the probabilistic method. To consider the uncertainties in the hazard analysis, the probabilistic method has been regarded as the more attractive approach. In the probabilistic method, the various parameters and their weights are considered by using a logic tree approach. The uncertainties of the parameters should be quantified through sensitivity analysis because many different parameters are used in the hazard analysis. To apply probabilistic tsunami hazard analysis, a preliminary study for the Ulchin NPP site had been performed. The information on the fault sources published by the Atomic Energy Society of Japan (AESJ) had been used in the preliminary study. The tsunami propagation was simulated using TSUNAMI 1.0, which was developed by the Japan Nuclear Energy Safety Organization (JNES). The wave parameters were estimated from the results of the tsunami simulation. In this study, a sensitivity analysis for the fault sources selected in the previous studies has been performed. To analyze the effect of the parameters, a sensitivity analysis for the E3 fault source published by the AESJ was performed. The effects of the recurrence interval, the potential maximum magnitude, and the beta value were indicated by the sensitivity analysis results. The level of annual exceedance probability is affected by the recurrence interval. Wave heights are influenced by the potential maximum magnitude and the beta value. In the future, a sensitivity analysis for all fault sources in the western part of Japan published by the AESJ will be performed.
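
    The logic tree approach mentioned above combines the hazard curves of the individual branches using the branch weights. A minimal sketch of that weighted combination, with illustrative wave heights, rates, and weights rather than values from the Ulchin study:

```python
import numpy as np

def combined_hazard_curve(branch_curves, weights):
    """
    Mean hazard curve from a logic tree: the weighted average of the annual
    exceedance rates of the individual branches (weights sum to 1).
    """
    return np.average(np.asarray(branch_curves, dtype=float), axis=0,
                      weights=np.asarray(weights, dtype=float))

# Illustrative wave heights (m) and per-branch annual exceedance rates for three
# alternative source parameterizations (recurrence interval / maximum magnitude choices):
heights = np.array([1.0, 2.0, 3.0, 5.0])
branches = [[1e-2, 3e-3, 8e-4, 1e-4],
            [2e-2, 6e-3, 2e-3, 3e-4],
            [5e-3, 1e-3, 2e-4, 2e-5]]
weights = [0.5, 0.3, 0.2]
print("Combined annual exceedance rates:", combined_hazard_curve(branches, weights))
```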

  11. Microbiological quality of food in relation to hazard analysis systems and food hygiene training in UK catering and retail premises.

    Science.gov (United States)

    Little, C L; Lock, D; Barnes, J; Mitchell, R T

    2003-09-01

    A meta-analysis of eight UK food studies was carried out to determine the microbiological quality of food and its relationship with the presence in food businesses of hazard analysis systems and food hygiene training. Of the 19,022 premises visited to collect food samples in these studies between 1997 and 2002, two thirds (66%) were catering premises and one third (34%) were retail premises. Comparison with PHLS Microbiological Guidelines revealed that significantly more ready-to-eat food samples from catering premises (20%; 2,511/12,703) were of unsatisfactory or unacceptable microbiological quality compared to samples from retail premises (12%; 1,039/8,462) (p catering premises (p catering premises (p catering) compared with premises where the manager had received food hygiene training (11% retail, 19% catering) (p catering) were from premises where there was no hazard analysis system in place compared to premises that had a documented hazard analysis system in place (10% retail, 18% catering) (p catering premises compared with those collected from retail premises may reflect differences in management food hygiene training and the presence of a hazard analysis system. The importance of adequate training for food handlers and their managers as a pre-requisite for effective hazard analysis and critical control point (HACCP) based controls is therefore emphasised.

  12. Evaluation of the Potential of NASA Multi-satellite Precipitation Analysis in Global Landslide Hazard Assessment

    Science.gov (United States)

    Hong, Yang; Adler, Robert F.; Huffman, George J.

    2007-01-01

    Landslides are one of the most widespread natural hazards on Earth, responsible for thousands of deaths and billions of dollars in property damage every year. In the U.S. alone landslides occur in every state, causing an estimated $2 billion in damage and 25-50 deaths each year. The average annual loss of life from landslide hazards in Japan is 170. The situation is much worse in developing countries and remote mountainous regions due to lack of financial resources and inadequate disaster management ability. Recently, a landslide buried an entire village on the Philippine island of Leyte on Feb 17, 2006, with at least 1800 reported deaths and only 3 houses left standing of the original 300. Intense storms with high-intensity, long-duration rainfall have great potential to trigger rapidly moving landslides, resulting in casualties and property damage across the world. In recent years, through the availability of remotely sensed datasets, it has become possible to conduct global-scale landslide hazard assessment. This paper evaluates the potential of the real-time NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA) system to advance our understanding of and predictive ability for rainfall-triggered landslides. Early results show that landslide occurrences are closely associated with the spatial patterns and temporal distribution of rainfall characteristics. In particular, the number of landslide occurrences and the relative importance of rainfall in triggering landslides rely on the influence of rainfall attributes (e.g. rainfall climatology, antecedent rainfall accumulation, and intensity-duration of rainstorms). TMPA precipitation data are available in both real-time and post-real-time versions, which are useful to assess the location and timing of rainfall-triggered landslide hazards by monitoring landslide-prone areas while they are receiving heavy rainfall. For the purpose of identifying rainfall-triggered landslides, an empirical global rainfall intensity
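
    The record breaks off while introducing an empirical global rainfall intensity-duration threshold. Such thresholds typically take the form I = alpha * D^(-beta), with storms plotting above the curve flagged as potential landslide triggers; the coefficients below are illustrative defaults of the kind used in related global studies, not necessarily those adopted by the authors.

```python
def exceeds_id_threshold(intensity_mm_per_hr, duration_hr, alpha=12.45, beta=0.42):
    """
    Check a rainfall event against an intensity-duration threshold of the
    form I = alpha * D^(-beta); True means the event plots above the curve
    and is flagged as a potential landslide trigger. Coefficients are
    illustrative assumptions, not values from this record.
    """
    threshold = alpha * duration_hr ** (-beta)
    return intensity_mm_per_hr >= threshold

print(exceeds_id_threshold(intensity_mm_per_hr=10.0, duration_hr=24))  # long, moderate storm
print(exceeds_id_threshold(intensity_mm_per_hr=2.0, duration_hr=6))    # weak, short storm
```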

  13. Implementation of hazard analysis and critical control points in the drinking water supply system

    Directory of Open Access Journals (Sweden)

    Asghar Tavasolifar

    2012-01-01

    Aims: This study aimed to design comprehensive risk management based on hazard analysis and critical control points (HACCP) in the Isfahan drinking water system. Materials and Methods: Data were obtained from field inspections and from the related organizations of Isfahan, Iran. The most important risks and risky events for water quality in all sources of raw water in the study area, including the Zayanderoud river, the water treatment plant, and the distribution system, were identified and analyzed. Practical measures for the protection, control, and limitation of the risks in the different phases, from water supply to the point of consumption, were presented in the form of the seven principles of the HACCP system. Results: It was found that there was a potential for hazards during the water treatment process because of seasonal changes and the discharge of various pollutants. Water contamination could occur at eight identified critical control points (CCPs). River water could be contaminated by rural communities on the banks of the river, by sudden natural accidents, by deliberate (subversive) acts, by incomplete operation, by the inadequacy of the current treatment process, and by the advanced age of the Isfahan water distribution system. Conclusions: In order to provide safe drinking water, it is necessary to implement a modern risk management system such as the HACCP approach. The increasing trend of Zayandehroud river pollution needs urgent attention. Therefore, the role of the government in developing and mandating the HACCP system in water industries is essential.

  14. Seismic hazard analysis of Tianjin area based on strong ground motion prediction

    Science.gov (United States)

    Zhao, Boming

    2010-08-01

    Taking Tianjin as an example, this paper proposes a methodology and process for evaluating near-fault strong ground motions from future earthquakes in order to mitigate earthquake damage to the metropolitan area and important engineering structures. Strong ground motion was predicted for the main faults of Tianjin by a hybrid method that mainly consists of a 3D finite difference method and stochastic Green's functions. Simulations were performed for the 3D structure of the Tianjin region using characterized asperity models. The characterized asperity model describing source heterogeneity was introduced following the fault information from the Tianjin Active Faults and Seismic Hazard Assessment project. We simulated the worst case, in which two earthquakes occur separately. The results indicate that the fault position, rupture process, and sedimentary deposits of the basin significantly affect the amplification of the simulated ground motion. Our results also demonstrate the possibility of practically simulating wave propagation, including basin-induced surface waves, over a broad frequency band for seismic hazard analysis near faults from future earthquakes in urbanized areas.

  15. Analysis of aerosol emission and hazard evaluation of electrical discharge machining (EDM) process.

    Science.gov (United States)

    Jose, Mathew; Sivapirakasam, S P; Surianarayanan, M

    2010-01-01

    The safety and environmental aspects of a manufacturing process are important due to increased environmental regulations and quality-of-life concerns. In this paper, the concentration of aerosols in the breathing zone of the operator of Electrical Discharge Machining (EDM), a commonly used non-traditional manufacturing process, is presented. The pattern of aerosol emissions from this process with varying process parameters, such as peak current, pulse duration, dielectric flushing pressure and dielectric level, was evaluated. Further, the HAZOP technique was employed to identify the inherent safety aspects and fire risk of the EDM process under different working conditions. The analysis of aerosol exposure showed that the concentration of aerosol increased with increasing peak current, pulse duration and dielectric level, and decreased with increasing flushing pressure. It was also found that at higher values of peak current (7 A) and pulse duration (520 µs), the concentration of aerosols in the breathing zone of the operator was above the permissible exposure limit for respirable particulates (5 mg/m³). The HAZOP study of the EDM process showed that this process is vulnerable to fire and explosion hazards. A detailed discussion on preventing the fire and explosion hazard is presented in this paper. The emissions and fire risk of the EDM process can be minimized by selecting proper process parameters and employing an appropriate control strategy.

  16. Human Modeling for Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for the design of spacecraft. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs.

  17. Human Modeling For Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Tran, Donald; Stambolian, Damon; Henderson, Gena; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for the design of spacecraft and launch vehicles. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the different types of human modeling used currently and in the past at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs.

  18. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    Science.gov (United States)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who have already mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly-learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multichannel Analysis of Surface Waves (MASW) analysis at
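
    The record is truncated, but the HVSR measurement it mentions is simple to sketch: the ratio of the smoothed horizontal amplitude spectrum to the vertical one, whose peak frequency is commonly read as the site's fundamental resonance. The synthetic three-component noise below is purely illustrative.

```python
import numpy as np

def hvsr(north, east, vertical, dt, smooth_bins=33):
    """
    Horizontal-to-Vertical Spectral Ratio from three-component records:
    ratio of the combined horizontal amplitude spectrum to the vertical
    amplitude spectrum, after simple moving-average smoothing.
    """
    def amp(x):
        spec = np.abs(np.fft.rfft(x))
        kernel = np.ones(smooth_bins) / smooth_bins
        return np.convolve(spec, kernel, mode="same")

    freqs = np.fft.rfftfreq(len(vertical), dt)
    h = np.sqrt((amp(north) ** 2 + amp(east) ** 2) / 2.0)
    v = amp(vertical)
    return freqs[1:], h[1:] / v[1:]   # drop the zero-frequency bin

# Synthetic demonstration: ambient noise with horizontal amplification near 2 Hz
rng = np.random.default_rng(0)
dt, n = 0.01, 4096
t = np.arange(n) * dt
vertical = rng.normal(size=n)
north = rng.normal(size=n) + 5.0 * np.sin(2 * np.pi * 2.0 * t)
east = rng.normal(size=n) + 5.0 * np.cos(2 * np.pi * 2.0 * t)

freqs, ratio = hvsr(north, east, vertical, dt)
print(f"H/V peak at ~{freqs[np.argmax(ratio)]:.2f} Hz")
```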

  19. The occurrence of hazardous volatile elements and nanoparticles in Bulgarian coal fly ashes and the effect on human health exposure

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Luis F.O., E-mail: lfsoliveira@univates.br [Centro Universitario Univates, Pro Reitoria de Pesquisa Estensao e Pos Graduacao, Programa de Pos Graduacao Ambiente e Desenvolvimento (Brazil); Environmental Science and Nanotechnology Department, Catarinense Institute of Environmental Research and Human Development - IPADHC, Capivari de Baixo, Santa Catarina (Brazil); DaBoit, Katia [Department of Environmental Medicine, Catarinense Institute of Environmental Research and Human Development - IPADHC, Capivari de Baixo, Santa Catarina (Brazil); Sampaio, Carlos H. [Universidade Federal do Rio Grande do Sul, Escola de Engenharia, Departamento de Metalurgia, Centro de Tecnologia, Av. Bento Goncalves, 9500, Bairro Agronomia, CEP: 91501-970, Porto Alegre - RS (Brazil); Jasper, Andre [Centro Universitario Univates, Pro Reitoria de Pesquisa Estensao e Pos Graduacao, Programa de Pos Graduacao Ambiente e Desenvolvimento (Brazil); Andrade, Maria L. [Department of Plant Biology and Soil Science, University of Vigo, 36310 Vigo (Spain); Kostova, Irena J. [Sofia University ' St. Kliment Ohridski' , Department of Geology, Paleontology and Fossil Fuels, 15, Tzar Osvoboditel Blvd., 1000 Sofia (Bulgaria); and others

    2012-02-01

    Low-rank, high-mineral matter Bulgarian coals were studied using a variety of chemical, optical, and electron beam methods. The larger fly ash carbon phases include charred carbons, in contrast to the coked carbons present in bituminous-coal-derived fly ashes. Nanoscale carbons include multi-walled carbon nanotubes (MWCNTs) encapsulating Hg, Se, and As, among other elements. In addition to the glass which dominates the fly ash, relatively coarse 'rock fragments', consisting of an unmelted to partially melted core surrounded by a glassy rim, are present in the fly ash. Nano-scale minerals can contain hazardous elements and, along with metal-bearing multiwalled nanotubes, can be a path for the entry of hazardous particles into the lungs and other organs. Highlights: • We model Bulgarian power plants, whose regulated mineral nanoparticles can contain hazardous elements. • We study changes in the level of information about the importance of nanominerals and the effect on human health exposure. • Increasing information will increase quality if power plant procedures are similar.

  20. A regional analysis of elements at risk exposed to mountain hazards in the Eastern European Alps

    Science.gov (United States)

    Fuchs, Sven; Zischg, Andreas

    2014-05-01

    We present a method to quantify the number and value of buildings exposed to torrents and snow avalanches in the Austrian Alps, as well as the number of exposed people. Based on a unique population and building register dataset, a relational SQL database was developed that, in combination with GIS data, allows a rule-based nationwide automated analysis. Furthermore, possibilities and challenges are discussed with respect to the use of such data in vulnerability assessment and with respect to resilience measures. We comprehensively address the challenges of data accuracy, scale and uncertainties. Of the total of approximately 2.4 million buildings with a clearly attributable geographical location, around 120,000 are exposed to torrent processes (5 %) and snow avalanches (0.4 %); exposure was defined here as being located within the digitally available hazard maps of the Austrian Torrent and Avalanche Control Service. Around 5 % of the population (360,000 out of 8.5 million inhabitants), based on those people being compulsorily listed in the population register, are located in these areas. The analysis according to building category resulted in 2.05 million residential buildings in Austria (85 %), 93,000 of which (4.5 %) are exposed to these hazards. In contrast, 37,300 buildings (1.6 %) throughout the country belong to the category of accommodation facilities, 5,600 of which are exposed (15 %). Out of the 140,500 commercial buildings, 8,000 (5 %) are exposed. A considerable spatial variation was detectable within the communities and Federal States. In general, an above-average exposure of buildings to torrent processes and snow avalanches was detectable in communities located in the Federal States of Salzburg, Styria and Vorarlberg (torrents), and Tyrol and Vorarlberg (snow avalanches). In the alpine part of Austria, the share of exposed accommodation buildings was two times (Salzburg) and three times (Vorarlberg) higher than the regional average of exposed buildings

  1. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    Science.gov (United States)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for PSHA required at the design and permitting stage before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity in general is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which generally occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high
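
    The record is truncated, but the first step it describes, deriving a frequency-magnitude distribution from a simulated catalog when no observed catalog exists, can be sketched simply: count the annual rate of events at or above each magnitude threshold in the synthetic catalog. The catalog below is generated from an assumed Gutenberg-Richter b-value of 1, purely for illustration.

```python
import numpy as np

def gutenberg_richter_rates(magnitudes, duration_years, m_bins):
    """
    Annual rate of events at or above each magnitude threshold, estimated
    directly from a (synthetic or observed) earthquake catalog. For a
    simulated induced-seismicity catalog this provides the frequency-magnitude
    input to the hazard calculation before any injection data exist.
    """
    return np.array([(magnitudes >= m).sum() / duration_years for m in m_bins])

# Illustrative synthetic catalog: 10 years of simulated events with b-value ~ 1
rng = np.random.default_rng(1)
b, m_min, n_events = 1.0, 2.0, 500
mags = m_min + rng.exponential(scale=1.0 / (b * np.log(10.0)), size=n_events)

thresholds = np.arange(2.0, 4.6, 0.5)
rates = gutenberg_richter_rates(mags, duration_years=10.0, m_bins=thresholds)
for m, r in zip(thresholds, rates):
    print(f"M >= {m:.1f}: {r:6.1f} events/year")
```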

  2. Evaluating soil metallic pollution and consequent human health hazards in the vicinity of an industrialized zone, case study of Mubarakeh steel complex, Iran.

    Science.gov (United States)

    Ghaemi, Zohreh; Karbassi, Abdolreza; Moattar, Faramarz; Hassani, Amirhesam; Khorasani, Nematollah

    2015-01-01

    Established in 1988 in the vicinity of Isfahan city, the Mubarakeh Steel complex has imposed adverse environmental and health effects on the surrounding area. The study area contains many farms that supply major crops such as wheat and rice. Considering the pollution load imposed by the complex, the current study monitored the concentrations of the metals Fe, Al, Cd, Cr, Ni, Pb, Cu, Zn, Mn, Co, Mo, and As in 14 soil samples within the study area. Furthermore, the human health hazards of the mentioned metals due to consumption of domestic rice and wheat were also evaluated through different scenarios. In order to evaluate the mobility of metals in the soil samples, sequential chemical analysis was performed. Based on the accumulation of metals in loose phases, the order of metal bioavailability risk is estimated as follows: Co > Cd > Mo > Ba > As > Pb > Mn > Cu > V > Zn > Cr > Ni. An index approach was also used to evaluate the severity of metal contamination. According to the geo-accumulation index, only cadmium is in a moderately contaminated status, while the other metals indicate an unpolluted condition. The index of pollution gives more weight to the mobility potential of metals and accordingly places Co, Mn, As, Pb, Cd, Ba and Mo at a moderately contaminated level. On the other hand, the enrichment factor indicates that all toxic metals except Co, Ba and V are enriched. In the human health hazard assessment, intakes of all metals except Fe, Ba, Cu and Zn are considered hazardous in the different scenarios, as their chronic daily intake (CDI) values are much higher than the respective oral reference doses.
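
    The two indices used in this record have simple closed forms: the geo-accumulation index Igeo = log2(Cn / (1.5 * Bn)) and the enrichment factor, which normalizes the sample and background concentrations by a conservative reference element. A minimal sketch with placeholder concentrations, not values from the Mubarakeh dataset:

```python
import math

def geoaccumulation_index(conc, background):
    """Mueller geo-accumulation index: Igeo = log2(Cn / (1.5 * Bn))."""
    return math.log2(conc / (1.5 * background))

def enrichment_factor(conc, ref_conc, background, ref_background):
    """Enrichment factor relative to a conservative reference element (e.g. Fe or Al)."""
    return (conc / ref_conc) / (background / ref_background)

# Placeholder values (mg/kg), not taken from the study:
cd_sample, cd_background = 1.2, 0.3
fe_sample, fe_background = 28000.0, 35000.0
print(f"Igeo(Cd) = {geoaccumulation_index(cd_sample, cd_background):.2f}")
print(f"EF(Cd)   = {enrichment_factor(cd_sample, fe_sample, cd_background, fe_background):.2f}")
```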

  3. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    Science.gov (United States)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams, etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives lead to risk reduction under different future scenarios. The SDSS is developed on open source software and follows open standards for code, data formats and service interfaces. The architecture of the system is modular: its parts are loosely coupled, extensible, flexible, web-based, and use standards for interoperability. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  4. Human motion analysis and characterization

    Science.gov (United States)

    Cathcart, J. Michael; Prussing, Keith; Kocher, Brian

    2011-06-01

    Georgia Tech has investigated methods for the detection and tracking of personnel in a variety of acquisition environments. This research effort focused on a detailed phenomenological analysis of human physiology and signatures with the subsequent identification and characterization of potential observables. Both aspects are needed to support the development of personnel detection and tracking algorithms. As a fundamental part of this research effort, Georgia Tech collected motion capture data on an individual for a variety of walking speeds, carrying loads, and load distributions. These data formed the basis for deriving fundamental properties of the individual's motion, for deriving motion-based observables, and for characterizing the changes in these fundamental properties arising from load variations. Analyses were conducted to characterize the motion properties of various body components such as leg swing, arm swing, head motion, and full body motion. This paper will describe the data acquisition process, the extraction of motion characteristics, and the analysis of these data. Video sequences illustrating the motion data and analysis results will also be presented.

  5. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more strongly than uniform rock, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  6. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more strongly than uniform rock, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  7. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Paces, James B. [U.S. Geological Survey

    2014-08-31

    This product is a USGS Administrative Report that describes the samples and methods used for uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different middle to late Pleistocene surfaces in the Hanford area. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  8. Multi-hazard response analysis of a 5MW offshore wind turbine

    DEFF Research Database (Denmark)

    Katsanos, Evangelos; Sanz, A. Arrospide; Georgakis, Christos T.

    2017-01-01

    Wind energy already has a dominant role on the clean energy production scene. Promising markets, such as China, India, Korea and Latin America, are the fields of expansion for new wind turbines mainly installed in offshore environments, where wind, wave and earthquake loads threaten the structural integrity and reliability of these energy infrastructures. Along these lines, a multi-hazard environment was considered herein and the structural performance of a 5 MW offshore wind turbine was assessed through time domain analysis. A fully integrated model of the offshore structure consisting...

  9. Analysis and evaluation of "noise" of occupational hazards in pumped storage power station

    Science.gov (United States)

    Zhao, Xin; Yang, Hongjian; Zhang, Huafei; Chen, Tao

    2017-05-01

    Aiming at the influence of occupational noise hazards on the physical health of workers, the noise intensity in the working areas of a pumped storage hydropower station in China was evaluated comprehensively. Under power generation conditions, noise measurements were taken along the operators' main patrol routes, and the noise samples from different areas were analyzed by single-factor (one-way) analysis of variance. The results show that noise intensity differs significantly between working areas: the overall noise level of the turbine layer is the highest and exceeds the national standard, so protection measures there need to be strengthened, while the noise intensity in the other areas is normal.
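
    The single-factor analysis of variance mentioned above can be reproduced with a few lines of SciPy. The decibel readings below are invented placeholders, not the station's measured samples.

```python
import numpy as np
from scipy import stats

# Hypothetical sound-level samples in dB(A) for three patrol areas; the real
# measurements from the study are not reproduced here.
turbine_layer = np.array([96.1, 97.4, 95.8, 98.0, 96.9])
generator_floor = np.array([88.2, 87.5, 89.1, 88.8, 87.9])
control_room = np.array([61.3, 62.0, 60.8, 61.7, 61.1])

# One-way (single-factor) analysis of variance across the three areas.
f_stat, p_value = stats.f_oneway(turbine_layer, generator_floor, control_room)
print(f"F = {f_stat:.1f}, p = {p_value:.3g}")
# A small p-value indicates the mean noise levels differ significantly between areas.
```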

  10. Pathogen Reduction and Hazard Analysis and Critical Control Point (HACCP) systems for meat and poultry. USDA.

    Science.gov (United States)

    Hogue, A T; White, P L; Heminover, J A

    1998-03-01

    The United States Department of Agriculture (USDA) Food Safety Inspection Service (FSIS) adopted Hazard Analysis and Critical Control Point (HACCP) systems and established finished product standards for Salmonella in slaughter plants to improve food safety for meat and poultry. In order to make significant improvements in food safety, measures must be taken at all points in the farm-to-table chain including production, transportation, slaughter, processing, storage, retail, and food preparation. Since pathogens can be introduced or multiplied anywhere along the continuum, success depends on consideration and comparison of intervention measures throughout the continuum. Food animal and public health veterinarians can create the necessary preventive environment that mitigates the risk of foodborne pathogen contamination.

  11. Balkan Endemic Nephropathy - Still continuing enigma, risk assessment and underestimated hazard of joint mycotoxin exposure of animals or humans.

    Science.gov (United States)

    Stoev, Stoycho D

    2017-01-05

    The spread of mycotoxic nephropathy in animals and humans was studied. The possible etiological causes provoking this nephropathy were carefully reviewed and analyzed. The natural content of the most frequent nephrotoxic mycotoxins in target feedstuffs/foods was investigated, in addition to their significance for the development of renal damage in endemic areas. An estimation of the level of human exposure to the nephrotoxic mycotoxin ochratoxin A (OTA) is made. The possible synergistic or additive effects between some target mycotoxins in the development of nephropathy are also covered. The significance of joint mycotoxin interactions and masked mycotoxins, in addition to some newly isolated fungal toxic agents, in the complicated etiology of the mycotoxic nephropathy encountered in Balkan countries is discussed. The importance of some target fungal species that can induce kidney damage was evaluated. The morphological/ultrastructural, functional and toxicological similarities between human and animal nephropathy are studied. The possible hazard of low levels of combinations of some target mycotoxins in food or feedstuff ingested by pigs, chickens or humans under natural conditions is evaluated, and a risk assessment is made. More effective approaches to prophylaxis and/or prevention of OTA contamination of feedstuffs/foods are suggested. A survey was made of the best possible ways of veterinary hygiene control of OTA-exposed animals at slaughter time to prevent the entry of OTA into commercial feedstuff/food channels, with a view to reducing the possible health hazard for humans. The economic efficacy and applicability of such preventive measures are additionally discussed and some practical suggestions are made. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Marine natural hazards in coastal zone: observations, analysis and modelling (Plinius Medal Lecture)

    Science.gov (United States)

    Didenkulova, Ira

    2010-05-01

    Giant surface waves approaching the coast frequently cause extensive coastal flooding, destruction of coastal constructions and loss of lives. Such waves can be generated by various phenomena: strong storms and cyclones, underwater earthquakes, high-speed ferries, and aerial and submarine landslides. The most famous examples of such events are the catastrophic tsunami in the Indian Ocean, which occurred on 26 December 2004, and hurricane Katrina (28 August 2005) in the Atlantic Ocean. The huge storm in the Baltic Sea on 9 January 2005, which produced unexpectedly long waves in many areas of the Baltic Sea, and the unusually high surge created by long waves from high-speed ferries should also be mentioned as examples of regional marine natural hazards connected with extensive runup of certain types of waves. The processes of wave shoaling and runup for all these different marine natural hazards (tsunami, coastal freak waves, ship waves) are studied based on rigorous solutions of nonlinear shallow-water theory. The key and novel results presented here are: i) parameterization of basic formulas for extreme runup characteristics of bell-shaped waves, showing that they depend only weakly on the initial wave shape, which is usually unknown in real sea conditions; ii) runup analysis of periodic asymmetric waves with a steep front, as such waves penetrate inland over larger distances and with larger velocities than symmetric waves; iii) statistical analysis of irregular wave runup, demonstrating that nearshore wave nonlinearity does not influence the probability distribution of the velocity of the moving shoreline or its moments, but does influence the vertical displacement of the moving shoreline (runup). Wave runup on convex beaches and in narrow bays, which allow abnormal wave amplification, is also discussed. The analytical results described are used to explain observed extreme runup of tsunamis, freak (sneaker) waves and ship waves on different coasts.

  13. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    Science.gov (United States)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are key factors in a country's economic growth. Inadequate infrastructural networks can be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can also trigger other natural hazards such as landslides and debris flows. Disruptions caused by landslides are similar to those caused by floods and increase maintenance costs considerably. The effects of natural disasters on society are likely to increase under a changing climate with increasing precipitation. Therefore, there is a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce the infrastructural harm. Spatial multi-criteria analysis (SMCA) is a part of decision analysis that provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through an SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least-cost-path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements as well as historical incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.
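
    A spatial multi-criteria analysis of this kind is often implemented as a weighted linear combination of normalised criterion layers, with the weights supplied by expert judgement. The sketch below illustrates that idea on tiny toy rasters; the layers and weights are assumptions, not the factors or weights elicited in the study.

```python
import numpy as np

# Minimal weighted-linear-combination sketch of a spatial multi-criteria analysis.
# The three criterion "rasters" (slope, soil wetness, land-use score) and the expert
# weights below are illustrative assumptions, not values from the study.
slope = np.array([[5.0, 12.0], [30.0, 18.0]])        # degrees
wetness = np.array([[0.2, 0.6], [0.9, 0.4]])         # topographic wetness, 0-1
landuse_score = np.array([[0.1, 0.5], [0.8, 0.3]])   # expert-assigned susceptibility score

def minmax(x):
    """Rescale a criterion layer to 0-1 so the layers are commensurable."""
    return (x - x.min()) / (x.max() - x.min())

weights = {"slope": 0.5, "wetness": 0.3, "landuse": 0.2}   # expert judgements, sum to 1
susceptibility = (weights["slope"] * minmax(slope)
                  + weights["wetness"] * minmax(wetness)
                  + weights["landuse"] * minmax(landuse_score))
print(np.round(susceptibility, 2))  # higher values = more susceptible cells
```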

  14. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state-of-the-art and improve the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and the EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state-of-the-art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation on state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.
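
    The hazard calculation that such guidance applies to is the classical hazard integral, in which the annual rate of exceeding a ground-motion level a combines the source activity rate, the magnitude distribution and the ground-motion model, for a single source at a fixed distance: lambda(IM > a) = nu * integral of P(IM > a | m, r) f_M(m) dm. The sketch below evaluates that integral numerically with a toy ground-motion model; every coefficient is an illustrative assumption, not a value from the SSHAC report.

```python
import numpy as np
from scipy import stats

# Minimal classical PSHA sketch for a single areal source at a fixed distance.
# All numbers (activity rate, b-value, GMPE coefficients, sigma) are illustrative
# assumptions, not values from any of the studies summarised above.
nu = 0.2                      # annual rate of earthquakes with M >= m_min
b, m_min, m_max = 1.0, 5.0, 7.5
beta = b * np.log(10.0)
r_km = 20.0                   # source-to-site distance

mags = np.linspace(m_min, m_max, 200)
# Doubly truncated Gutenberg-Richter magnitude density f_M(m).
f_m = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

def gmpe_ln_pga(m, r):
    """Toy ground-motion model: ln PGA(g) = -3.5 + 1.0*m - 1.2*ln(r + 10), sigma = 0.6."""
    return -3.5 + 1.0 * m - 1.2 * np.log(r + 10.0), 0.6

pga_levels = np.logspace(-2, 0, 30)          # 0.01 g to 1 g
mu, sigma = gmpe_ln_pga(mags, r_km)
# Annual exceedance rate: lambda(a) = nu * integral of P(PGA > a | m, r) * f_M(m) dm
hazard = [nu * np.trapz(stats.norm.sf(np.log(a), loc=mu, scale=sigma) * f_m, mags)
          for a in pga_levels]
for a, lam in list(zip(pga_levels, hazard))[::10]:
    print(f"PGA > {a:.3f} g : {lam:.2e} /yr")
```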

  15. Analysis of human collagen sequences.

    Science.gov (United States)

    Nassa, Manisha; Anand, Pracheta; Jain, Aditi; Chhabra, Aastha; Jaiswal, Astha; Malhotra, Umang; Rani, Vibha

    2012-01-01

    The extracellular matrix is fast emerging as an important component mediating cell-cell interactions, along with its established role as a scaffold for cell support. Collagen, the principal component of the extracellular matrix, has been implicated in a number of pathological conditions. However, collagens are complex protein structures belonging to a large family consisting of 28 members in humans; hence, there is a lack of in-depth information about their structural features. Annotating and appreciating the functions of these proteins is possible with the help of the numerous biocomputational tools that are currently available. This study reports a comparative analysis and characterization of the alpha-1 chain of human collagen sequences. Physico-chemical, secondary structural, functional and phylogenetic classification was carried out, based on which collagens 12, 14 and 20, which belong to the FACIT collagen family, were identified as potential players in diseased conditions, owing to certain atypical properties such as a very high aliphatic index, a low percentage of glycine and proline residues, and their proximity in evolutionary history. These collagen molecules might be important candidates to be investigated further for their role in skeletal disorders.
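
    Two of the properties singled out above, the aliphatic index and the glycine/proline content, are simple to compute directly from a sequence. The sketch below uses the standard aliphatic index formula attributed to Ikai (1980) on a made-up peptide fragment; it is not one of the actual collagen alpha-1 sequences analysed in the study.

```python
from collections import Counter

def aliphatic_index(seq):
    """Aliphatic index after Ikai (1980): AI = X(Ala) + 2.9*X(Val) + 3.9*(X(Ile) + X(Leu)),
    where X are mole percentages of the residues."""
    seq = seq.upper()
    counts = Counter(seq)
    n = len(seq)
    x = {aa: 100.0 * counts.get(aa, 0) / n for aa in "AVIL"}
    return x["A"] + 2.9 * x["V"] + 3.9 * (x["I"] + x["L"])

def gly_pro_percentage(seq):
    """Combined percentage of glycine and proline; low values are atypical for collagens."""
    seq = seq.upper()
    return 100.0 * (seq.count("G") + seq.count("P")) / len(seq)

# A short made-up fragment, not a real collagen alpha-1 chain sequence.
fragment = "GPPGAPGPVGLAGIKGPAGERGAPGPAGLV"
print(f"Aliphatic index: {aliphatic_index(fragment):.1f}")
print(f"Gly+Pro content: {gly_pro_percentage(fragment):.1f}%")
```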

  16. Analysis of human vergence dynamics.

    Science.gov (United States)

    Tyler, Christopher W; Elsaid, Anas M; Likova, Lora T; Gill, Navdeep; Nicholas, Spero C

    2012-10-25

    Disparity vergence is commonly viewed as being controlled by at least two mechanisms, an open-loop vergence-specific burst mechanism analogous to the ballistic drive of saccades, and a closed-loop feedback mechanism controlled by the disparity error. We show that human vergence dynamics for disparity jumps of a large textured field have a typical time course consistent with predominant control by the open-loop vergence-specific burst mechanism, although various subgroups of the population show radically different vergence behaviors. Some individuals show markedly slow divergence responses, others slow convergence responses, others slow responses in both vergence directions, implying that the two vergence directions have separate control mechanisms. The faster time courses usually had time-symmetric velocity waveforms implying open-loop burst control, while the slow response waveforms were usually time-asymmetric implying closed-loop feedback control. A further type of behavior seen in a distinct subpopulation was a compound anomalous divergence response consisting of an initial convergence movement followed by a large corrective divergence movement with time courses implying closed-loop feedback control. This analysis of the variety of human vergence responses thus contributes substantially to the understanding of the oculomotor control mechanisms underlying the generation of vergence movements.

  17. Multifaceted processes controlling the distribution of hazardous compounds in the spontaneous combustion of coal and the effect of these compounds on human health.

    Science.gov (United States)

    Oliveira, Marcos L S; da Boit, Kátia; Pacheco, Fernanda; Teixeira, Elba C; Schneider, Ismael L; Crissien, Tito J; Pinto, Diana C; Oyaga, Rafael M; Silva, Luis F O

    2018-01-01

    Pollution by hazardous elements and persistent organic compounds associated with coal fires is a major environmental concern because of its toxic nature, persistence, and potential risk to human health. Coal mining activities are growing in the state of Santa Catarina, Brazil, yet their collateral impacts on health and the economy have not been analyzed. In addition, the environment also endures collateral damage, as the waste materials directly affect the coal by-products used in civil construction. This study aimed to establish the relationships between the composition, morphology, and structural characteristics of ultrafine particles emitted by coal mine fires. In Brazil, the Al-Ca-Fe-Mg-Si coal spheres produced by self-combustion are rich in chalcophile elements (As, Cd, Cu, Hg, Pb, Sb, Se, Sn, and Zn), lithophile elements (Ce, Hf, In, La, Th, and U), and siderophile elements (Co, Cr, Mo, Fe, Ni, and V). The relationship between nanomineralogy and the production of hazardous elements, as analyzed by advanced geochemical methods applied to different materials, was also delineated. The information obtained from the mineral-matter analysis may improve the understanding of coal-fire development and help assess the response of particular coals in different combustion processes. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Use of the hazard analysis and critical control points (HACCP) risk assessment on a medical device for parenteral application.

    Science.gov (United States)

    Jahnke, Michael; Kühn, Klaus-Dieter

    2003-01-01

    In order to guarantee the consistently high quality of medical products for human use, it is absolutely necessary that flawless hygiene conditions are maintained by the strict observance of hygiene rules. With the growing understanding of the impact of process conditions on the quality of the resulting product, process controls (surveillance) have gained increasing importance in completing the quality profile traditionally defined by post-process product testing. Today, process controls have become an important GMP requirement for the pharmaceutical industry. However, before quality process controls can be introduced, the manufacturing process has to be analyzed, with the focus on its critical quality-influencing steps. The HACCP (Hazard Analysis and Critical Control Points) method is well recognized as a useful tool in the pharmaceutical industry. This risk analysis, following the guidelines of the HACCP method, together with the monitoring of critical steps during the manufacturing process, was applied to the manufacture of the methyl methacrylate solution used for bone cement. It led to the establishment of a preventive monitoring system and constitutes an effective concept for quality assurance of hygiene and of all other parameters influencing product quality.

  19. Study Of The Risks Arising From Natural Disasters And Hazards On Urban And Intercity Motorways By Using Failure Mode Effect Analysis (FMEA) Methods

    Science.gov (United States)

    DELİCE, Yavuz

    2015-04-01

    Highways, whether urban or intercity, are generally prone to many kinds of natural disaster risk. Natural hazards and disasters that may occur, first during highway design, construction and operation, and later during highway maintenance and repair, have to be taken into consideration. Assessing the risks posed by such adverse situations is very important in terms of project design, construction, operation, maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highways. However, the assets at risk and the impacts of the events must also be examined and rated in their own right. Based on these activities, improvements against natural hazards and disasters will be made using the Failure Mode and Effects Analysis (FMEA) method, and their effects will be analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them according to failure rates and effect severity, and finding the most economical and effective solution. Besides guiding the measures taken for the risks identified by this analysis method, it may also provide public institutions with information about the nature of these risks when required. Thus, the necessary measures can be taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in risk assessments. The most important of these dangers can be listed as follows: • Natural disasters: 1. Meteorologically based natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.); 2. Geologically based natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.). • Human-originated disasters: 1. Transport accidents (traffic accidents) originating from road surface defects (icing
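
    In FMEA, each identified hazard is typically rated for severity, occurrence and detectability, and the product of the three ratings gives a Risk Priority Number (RPN) used to rank mitigation measures. The sketch below applies that standard scheme to a few invented highway failure modes; the ratings are illustrative, not the study's assessments.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    hazard: str
    severity: int      # 1-10
    occurrence: int    # 1-10
    detection: int     # 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        """Risk Priority Number = severity x occurrence x detection."""
        return self.severity * self.occurrence * self.detection

# Illustrative highway hazard entries and ratings; not taken from the study.
modes = [
    FailureMode("Flooding of road section", severity=8, occurrence=5, detection=4),
    FailureMode("Landslide onto carriageway", severity=9, occurrence=3, detection=6),
    FailureMode("Icing of road surface", severity=6, occurrence=7, detection=3),
]

for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.hazard:<30} RPN = {fm.rpn}")
# The highest-RPN failure modes are addressed first when planning mitigation.
```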

  20. Geomorphological hazard analysis along the Egyptian Red Sea coast between Safaga and Quseir

    Directory of Open Access Journals (Sweden)

    A. M. Youssef

    2009-05-01

    Geomorphological hazard assessment is an important component of natural hazard risk assessment. This paper presents GIS-based geomorphological hazard mapping in the Red Sea area between Safaga and Quseir, Egypt. This includes the integration of published geological, geomorphological, and other data into GIS, and the generation of new map products, combining governmental concerns and legal restrictions. Detailed geomorphological hazard maps for flooding zones and earth movement potential, especially along the roads and railways, have been prepared. Furthermore, the paper illustrates the application of vulnerability maps dealing with the effect of hazards on urban areas, tourist villages, industrial facilities, quarries, and road networks. These maps can help to initiate appropriate measures to mitigate the probable hazards in the area.

  1. Site Specific Probabilistic Seismic Hazard and Risk Analysis for Surrounding Communities of The Geysers Geothermal Development Area

    Science.gov (United States)

    Miah, M.; Hutchings, L. J.; Savy, J. B.

    2014-12-01

    We conduct a probabilistic seismic hazard and risk analysis from induced and tectonic earthquakes for a 50 km radius area centered on The Geysers, California, and for the next ten years. We calculate hazard with both a conventional and a physics-based approach. We estimate site-specific hazard. We convert hazard to risk of nuisance and of damage to structures per year and map the risk. For the conventional PSHA we assume that the past ten years of seismicity are indicative of the hazard for the next ten years. The hazard estimates obtained with the two approaches are in close agreement, which is surprising since they were calculated by completely independent means. The conventional approach used the actual catalog of earthquakes from the past ten years to estimate the hazard for the next ten years, while the physics-based approach used geotechnical modeling to calculate the catalog for the next ten years. Similarly, for the conventional PSHA we utilized attenuation relations from past earthquakes recorded at The Geysers to translate the ground motion from the source to the site, while for the physics-based approach we calculated ground motion from simulation of actual earthquake rupture. Finally, the conventional PSHA used the actual earthquake sources, while the physics-based approach assumed random fractures. From all this, we consider the conventional calculation, based on actual data, to validate the physics-based approach.

  2. Vulnerability analysis of Landslide hazard area: Case study of South Korea

    Science.gov (United States)

    Oh, Chaeyeon; Jun, Kyewon; Kim, Younghwan

    2017-04-01

    Recently, sedimentation disasters such as landslides and debris flows have been occurring frequently in mountainous areas due to climate change. A scientific analysis of landslide risk areas, along with the collection and analysis of a variety of spatial information, is critical for minimizing damage in the event of mountainous disasters such as landslides and debris flows. We carried out a case study of selected areas in Inje, Gangwon Province, which suffered serious landslides due to flash floods caused by Typhoon Ewiniar in 2006. Landslide and debris flow locations were identified in the study area from interpretation of airborne imagery and from field surveys. We used GIS to construct a spatial information database integrating the data required for a comprehensive analysis of landslide risk areas, including geography, hydrology, pedology, and forestry. Furthermore, this study evaluates the slope stability of the affected areas using SINMAP (Stability Index Mapping), analyzes the spatial data that correlate strongly with the selected landslide areas using likelihood ratios, and applies the weight-of-evidence technique, in which weight values (W+ and W-) are calculated for each factor. We then analyzed the spatial data significantly correlated with landslide occurrence, predicted the mountainous areas with elevated landslide risk that are vulnerable to disasters, and generated the hazard map using GIS. Acknowledgments: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (No. NRF-2014R1A1A3050495).
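
    The weight-of-evidence weights mentioned above compare how often a binary evidence factor (e.g. steep slope) coincides with mapped landslides versus landslide-free terrain. The sketch below computes W+ and W- from contingency counts; the counts are hypothetical, not derived from the Inje inventory.

```python
import math

def weights_of_evidence(n_factor_slide, n_factor_noslide, n_nofactor_slide, n_nofactor_noslide):
    """Weight-of-evidence weights for one binary evidence factor.
    W+ = ln[ P(factor | slide) / P(factor | no slide) ]
    W- = ln[ P(no factor | slide) / P(no factor | no slide) ]
    Inputs are cell counts from overlaying the factor map with the landslide inventory."""
    n_slide = n_factor_slide + n_nofactor_slide
    n_noslide = n_factor_noslide + n_nofactor_noslide
    w_plus = math.log((n_factor_slide / n_slide) / (n_factor_noslide / n_noslide))
    w_minus = math.log((n_nofactor_slide / n_slide) / (n_nofactor_noslide / n_noslide))
    return w_plus, w_minus

# Hypothetical counts for a "steep slope" factor (not from the study area data):
# 80 landslide cells on steep slopes, 20 on gentle slopes;
# 3000 non-landslide cells on steep slopes, 7000 on gentle slopes.
w_plus, w_minus = weights_of_evidence(80, 3000, 20, 7000)
print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}")
# Positive W+ (and negative W-) indicates the factor is positively associated with landslides;
# summing the weights of all factors per cell gives a relative susceptibility score.
```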

  3. Human semen assays for workplace monitoring. [Monitoring of hazardous materials by determining effects on semen of personnel

    Energy Technology Data Exchange (ETDEWEB)

    Wyrobek, A.J.; Gledhill, B.L.

    1978-11-07

    Decades of human semen studies have yielded compelling evidence that sperm can be used to assess reproductive potential and diagnose pathology. With these studies as background, the small number of detailed semen studies of men exposed to physical and chemical agents point with optimism to the application of human semen assays as efficient, effective means to monitor for reproductive hazards in the workplace. Sperm are the most accessible human gonadal tissue and provide a means of monitoring exposure-induced changes in the human testes, changes which may result in infertility and increased frequencies of genetically abnormal gametes. The focus on semen has precipitated the development of new sperm bioassays which use older conventional andrological methods (i.e., sperm counts, motility, and morphology) as well as recently developed high-speed flow and scanning methods for automated cytological analyses. The status of these sperm assays for workplace surveillance is reviewed, procedures are suggested with examples of use, and their effectiveness is evaluated. The available mouse models of induced semen changes are briefly described, and the importance of these models for evaluating the genetic implications of findings in human semen is discussed.

  4. Human papillomavirus sperm infection and assisted reproduction: a dangerous hazard with a possible safe solution.

    Science.gov (United States)

    Garolla, Andrea; Lenzi, Andrea; Palù, Giorgio; Pizzol, Damiano; Bertoldo, Alessandro; De Toni, Luca; Foresta, Carlo

    2012-04-01

    Human papillomavirus (HPV) infection has been demonstrated in the sperm of a large percentage of sexually active males and is associated with an impairment of sperm parameters, with a particular negative impact on sperm motility, suggesting a possible role in male infertility. Conventional sperm selection techniques have a low efficiency in removing HPV. Evaluation of sperm parameters, the terminal deoxynucleotidyltransferase-mediated dUTP nick-end labeling test to evaluate DNA fragmentation, and fluorescence in situ hybridization or immunohistochemistry for HPV were performed on semen samples from infected patients (n = 22), control subjects (n = 13) and on pooled control sperm samples incubated with HPV16-L1 (HPV capsid), before and after direct swim-up and modified swim-up (with added Heparinase-III). Moreover, cytofluorimetry for HPV detection was performed in pooled sperm pre- and post-incubation with HPV16-L1 before and after direct and modified swim-up. Statistical analysis was performed with a two-tailed Student's t-test. Direct swim-up reduced the number of HPV-infected sperm by ~24%, whereas modified swim-up removed HPV DNA from both naturally and artificially infected sperm. Enzymatic treatment with Heparinase-III tended to decrease sperm motility, viability and DNA integrity, but the effects were not significant. This study shows that Heparinase-III treatment seems not to affect spermatozoa in vitro and suggests that this treatment should be investigated further as a means of preparing sperm from patients who are infected with HPV, in order to reduce the risk of HPV infection when using assisted reproduction techniques.

  5. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill siting in Kurdistan Province, western Iran.

    Science.gov (United States)

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-01

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing a significant amount of spatial information, which can be handled by GIS as an important tool for land use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. The purpose of GIS was to perform an initial screening process to eliminate unsuitable land, followed by the use of multi-criteria decision analysis (MCDA) to identify the most suitable sites using information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. In order to evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed, representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria derived from field observation. Using this suitability index, 15 different sites were visited, and the most suitable sites were determined based on the numerical evaluation provided by the MCDA.

  6. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    Science.gov (United States)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating subduction and background (crustal) earthquakes separately allows for optimal use of available information and avoids significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  7. Flood Hazard Assessment of the coastal lowland in the Kujukuri Plain of Chiba Prefecture, Japan, using GIS and multicriteria decision analysis

    Science.gov (United States)

    CHEN, Huali; Tokunaga, Tomochika; Ito, Yuka; Sawamukai, Marie

    2014-05-01

    Floods, the most common natural disaster in the world, cause serious loss of life and economic damage. Flooding is one of the main disasters affecting the coastal lowland of the Kujukuri Plain, Chiba Prefecture, Japan. Many natural and human activities have changed the surface environment of the Plain. These include agricultural development, urban and industrial development, changes in the drainage patterns of the land surface, deposition and/or erosion of the river valleys, and so on. In addition, widespread land subsidence has been caused by the abstraction of natural gas dissolved in groundwater. Groundwater extraction occurs near the coast, among other locations, and may increase the flood risk. Hence, it is very important to evaluate flood hazard by taking into account the temporal change of land elevation caused by land subsidence, and to develop hazard maps for protecting the surface environment and supporting land-use planning. Multicriteria decision analysis (MCDA) provides methodology and techniques for analyzing complex decision problems, which often involve incommensurable data or criteria. Geographic Information Systems (GIS) are also a powerful tool, since they manage the large amounts of spatial data involved in MCDA. The purpose of this study is to present a flood hazard model using MCDA techniques with GIS support in a region where primary data are scarce. The model incorporates six parameters: river system, topography, land use, flood control projects, passing floods from the coast, and precipitation. The main data sources used are 10 m resolution topography data, airborne laser scanning data, leveling data, Landsat-TM data, two 1:30,000 scale river watershed maps, and precipitation data from observation stations around the study area. The river system map was created by merging the river order, line density, and river sink-point density layers. Land-use data were derived from Landsat-TM images. A final hazard map for 2004, as an example, was

  8. Living with Familiar Hazards: Flood Experiences and Human Vulnerability in Accra, Ghana

    Directory of Open Access Journals (Sweden)

    Dacosta Aboagye

    2012-10-01

    The paper explores demographic characteristics, migration history, and the impact of flooding on households and communities. The main objective is to explore the different ways in which floods impact households and communities in Accra. Specifically, the paper analyzes how floods alter the set of resources available to households and communities. The results indicate that urbanization and governmental policies have rendered more people, especially the poor and recent migrants, homeless. These homeless people have become more vulnerable to flooding than the average Accra resident. The results also show that the homeless community contrasts with the fixed community in terms of socio-economic characteristics, degree of social cohesion, and physical location. The paper concludes that the unchanging pattern of vulnerability shows the inability of a society to cope with and adjust to familiar hazards.

  9. Risk analysis procedure for post-wildfire natural hazards in British Columbia

    Science.gov (United States)

    Jordan, Peter

    2010-05-01

    Following a severe wildfire season in 2003, and several subsequent damaging debris flow and flood events, the British Columbia Forest Service developed a procedure for analysing risks to public safety and infrastructure from such events. At the same time, the Forest Service undertook a research program to determine the extent of post-wildfire hazards, and examine the hydrologic and geomorphic processes contributing to the hazards. The risk analysis procedure follows the Canadian Standards Association decision-making framework for risk management (which in turn is based on international standards). This has several steps: identification of risk, risk analysis and estimation, evaluation of risk tolerability, developing control or mitigation strategies, and acting on these strategies. The Forest Service procedure deals only with the first two steps. The results are passed on to authorities such as the Provincial Emergency Program and local government, who are responsible for evaluating risks, warning residents, and applying mitigation strategies if appropriate. The objective of the procedure is to identify and analyse risks to public safety and infrastructure. The procedure is loosely based on the BAER (burned area emergency response) program in the USA, with some important differences. Our procedure focuses on identifying risks and warning affected parties, not on mitigation activities such as broadcast erosion control measures. Partly this is due to limited staff and financial resources. Also, our procedure is not multi-agency, but is limited to wildfires on provincial forest land; in British Columbia about 95% of forest land is in the publicly-owned provincial forest. Each fire season, wildfires are screened by size and proximity to values at risk such as populated areas. For selected fires, when the fire is largely contained, the procedure begins with an aerial reconnaissance of the fire, and photography with a hand-held camera, which can be used to make a

  10. Characterising Seismic Hazard Input for Analysis Risk to Multi-System Infrastructures: Application to Scenario Event-Based Models and extension to Probabilistic Risk

    Science.gov (United States)

    Weatherill, G. A.; Silva, V.

    2011-12-01

    The potential human and economic cost of earthquakes to complex urban infrastructures has been demonstrated in the most emphatic manner by recent large earthquakes such as those of Haiti (February 2010), Christchurch (September 2010 and February 2011) and Tohoku (March 2011). Consideration of seismic risk for a homogeneous portfolio, such as a single building typology or infrastructure, or independent analyses of separate typologies or infrastructures, is insufficient to fully characterise the potential impacts that arise from inter-connected system failure. Individual elements of each infrastructure may be adversely affected by different facets of the ground motion (e.g. short-period acceleration, long-period displacement, cumulative energy input, etc.). The accuracy and efficiency of the risk analysis depend on the ability to characterise these multiple features of the ground motion over a spatially distributed portfolio of elements. The modelling challenges raised by this extension to multi-system analysis of risk have been a key focus of the European Project "Systemic Seismic Vulnerability and Risk Analysis for Buildings, Lifeline Networks and Infrastructures Safety Gain (SYNER-G)", and are expected to be developed further within the Global Earthquake Model (GEM). Seismic performance of a spatially distributed infrastructure during an earthquake may be assessed by means of Monte Carlo simulation, in order to incorporate the aleatory variability of the ground motion into the network analysis. Methodologies for co-simulating large numbers of spatially cross-correlated ground motion fields are appraised, and their potential impacts on a spatially distributed portfolio of mixed building typologies are assessed using idealised case study scenarios from California and Europe. Potential developments to incorporate correlation and uncertainty in site amplification and geotechnical hazard are also explored. Whilst the initial application of the seismic risk analysis is
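
    Co-simulation of spatially cross-correlated ground-motion fields is commonly done by building a spatial correlation matrix for the within-event residuals and drawing correlated Gaussian samples from it. The sketch below does this with an exponential correlation model and a Cholesky factorisation; the site layout, correlation range and median motions are illustrative assumptions, not SYNER-G model values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of co-simulating spatially cross-correlated ground-motion residuals for a
# small portfolio of sites. All values here are illustrative assumptions.
sites = np.array([[0.0, 0.0], [2.0, 1.0], [5.0, 4.0], [12.0, 8.0]])  # km coordinates
corr_range_km = 10.0
sigma_ln = 0.6          # within-event standard deviation of ln(ground motion)

# Exponential spatial correlation model: rho(h) = exp(-3h / range).
dists = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
corr = np.exp(-3.0 * dists / corr_range_km)
cov = (sigma_ln ** 2) * corr

# Monte Carlo realisations of correlated residuals via Cholesky factorisation.
L = np.linalg.cholesky(cov)
n_sims = 1000
residuals = rng.standard_normal((n_sims, len(sites))) @ L.T

median_pga = np.array([0.20, 0.18, 0.15, 0.10])      # g, from a GMPE (placeholder values)
pga_fields = median_pga * np.exp(residuals)           # one correlated field per simulation
print("Mean simulated PGA per site:", np.round(pga_fields.mean(axis=0), 3))
```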

  11. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    Science.gov (United States)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Particia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis for natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks, the magnitudes, are governed by the exponential distribution derived from the Gutenberg-Richter relation. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely applied to IIS hazard problems. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy for such seismicity, IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution derived from the Gutenberg-Richter relation is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready
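
    Two of the checks posed above, whether the magnitudes follow the Gutenberg-Richter (exponential) law and what b-value it implies, can be sketched in a few lines: the b-value is estimated with Aki's maximum-likelihood formula and the exponential model is screened with a Kolmogorov-Smirnov test. The catalogue below is synthetic and the completeness magnitude is assumed; they are not the SHEER project data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical induced-seismicity magnitudes above a completeness magnitude Mc;
# in practice these would come from the Oklahoma or The Geysers catalogues.
mc = 1.0
mags = mc + rng.exponential(scale=1.0 / np.log(10.0), size=500)   # b = 1 if truly G-R

# Aki (1965) maximum-likelihood b-value for magnitudes above Mc.
b_value = np.log10(np.e) / (mags.mean() - mc)
print(f"Maximum-likelihood b-value: {b_value:.2f}")

# Kolmogorov-Smirnov screening of the Gutenberg-Richter (exponential) magnitude model:
# under G-R, (M - Mc) should be exponential with rate beta = b * ln(10).
beta = b_value * np.log(10.0)
ks_stat, p_value = stats.kstest(mags - mc, "expon", args=(0.0, 1.0 / beta))
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
# A very small p-value would indicate the exponential (G-R) model is inappropriate.
```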

  12. Thermodynamical analysis of human thermal comfort

    OpenAIRE

    Prek, Matjaž

    2015-01-01

    Traditional methods of human thermal comfort analysis are based on the first law of thermodynamics. These methods use an energy balance of the human body to determine heat transfer between the body and its environment. By contrast, the second law of thermodynamics introduces the useful concept of exergy. It enables the determination of the exergy consumption within the human body dependent on human and environmental factors. Human body exergy consumption varies with the combination of environ...

  13. Antimicrobial-Resistant Enterococci in Animals and Meat: A Human Health Hazard?

    DEFF Research Database (Denmark)

    Hammerum, A.M.; Lester, C.H.; Heuer, Ole Eske

    2010-01-01

    The use of avoparcin, gentamicin, and virginiamycin for growth promotion and therapy in food animals has led to the emergence of vancomycin- and gentamicin-resistant enterococci and quinupristin/dalfopristin-resistant E. faecium in animals and meat. This implies a potential risk of transfer of resistance genes or resistant bacteria from food animals to humans. The genes encoding resistance to vancomycin, gentamicin, and quinupristin/dalfopristin have been found in E. faecium of human and animal origin; meanwhile, certain clones of E. faecium are found more frequently in samples from human patients, while other

  14. Safety and Hazard Analysis for the Coherent/Acculite Laser Based Sandia Remote Sensing System (Trailer B70).

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2005-09-01

    A laser safety and hazard analysis is presented for the Coherent(r)-driven Acculite(r) laser central to the Sandia Remote Sensing System (SRSS). The analysis is based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, Safe Use of Lasers Outdoors. The trailer (B70) based SRSS laser system is a mobile platform used to perform laser interaction experiments and tests at various national test sites. The trailer-based SRSS laser system is generally operated on the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band) and calculate the appropriate minimum Optical Density (ODmin) for the laser safety eyewear used by authorized personnel. Also, the Nominal Ocular Hazard Distance (NOHD) and the Extended Ocular Hazard Distance (EOHD) are calculated in order to protect unauthorized personnel who may have violated the boundaries of the control area and might enter the laser's NHZ during testing outside the trailer.
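
    The eyewear and standoff quantities named above are usually computed from the anticipated exposure and the MPE. The sketch below uses the standard minimum-optical-density relation and a commonly quoted small-source CW approximation for the NOHD; all laser parameters and the MPE value are placeholders, not the SRSS system values or ANSI table entries.

```python
import math

def od_min(worst_case_exposure, mpe):
    """Minimum optical density for protective eyewear: OD = log10(H0 / MPE),
    with the anticipated worst-case exposure H0 and the MPE in the same units."""
    return math.log10(worst_case_exposure / mpe)

def nohd_cm(power_w, mpe_w_cm2, beam_div_rad, beam_diam_cm):
    """Commonly quoted small-source CW approximation of the Nominal Ocular Hazard
    Distance: NOHD = (sqrt(4*P / (pi*MPE)) - a) / phi   (result in cm)."""
    return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2)) - beam_diam_cm) / beam_div_rad

# Illustrative numbers only; they are not the SRSS laser parameters or ANSI MPE values
# from the report.
mpe = 1.0e-3            # W/cm^2, assumed MPE for the wavelength and exposure duration
power = 5.0             # W average power
divergence = 1.0e-3     # rad
beam_diameter = 1.0     # cm at the aperture

print(f"OD_min = {od_min(10.0, mpe):.1f}")                 # assumed worst-case 10 W/cm^2
print(f"NOHD   = {nohd_cm(power, mpe, divergence, beam_diameter) / 100.0:.0f} m")
```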

  15. Hazard, Vulnerability and Capacity Mapping for Landslides Risk Analysis using Geographic Information System (GIS)

    Science.gov (United States)

    Sari, D. A. P.; Innaqa, S.; Safrilah

    2017-06-01

    This research analyzed the levels of disaster risk in the Citeureup sub-District, Bogor Regency, West Java, based on its potential hazard, vulnerability and capacity, using maps to represent the results; Miles and Huberman analytical techniques were then used to analyze the qualitative interviews. The analysis conducted in this study is based on the concept of disaster risk by Wisner. The results show that the Citeureup sub-District has a medium-low risk of landslides. Of the 14 villages, three villages have a moderate risk level, namely Hambalang, Tajur, and Tangkil, or 49.58% of the total land area. Eleven villages have a low level of risk, namely Pasir Mukti, Sanja, Tarikolot, Gunung Sari, Puspasari, East Karang Asem, Citeureup, Leuwinutug, Sukahati, West Karang Asem and Puspanegara, or 48.68% of the total land area, while high-risk areas cover only around 1.74%, which is part of Hambalang village. The analysis using a Geographic Information System (GIS) shows that areas with high potential hazard do not necessarily have a high level of risk. The capacity of the community plays an important role in minimizing the risk of a region. The disaster risk reduction strategy is implemented by creating safe conditions, which intensifies disaster risk reduction efforts.
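
    Following the hazard-vulnerability-capacity concept cited above, a relative risk index is often written as R = H × V / C, so that higher capacity reduces risk for the same hazard. The sketch below applies that composite to a few villages with made-up scores; it is not the scoring used in the study.

```python
# A common composite formulation following the concept used in the study:
# risk increases with hazard and vulnerability and decreases with capacity,
# often written as R = H * V / C. Village names are reused for illustration; the
# scores (0-1) are made up, not the study's actual ratings.
villages = {
    "Hambalang": {"hazard": 0.8, "vulnerability": 0.6, "capacity": 0.5},
    "Tajur":     {"hazard": 0.6, "vulnerability": 0.5, "capacity": 0.6},
    "Citeureup": {"hazard": 0.3, "vulnerability": 0.4, "capacity": 0.7},
}

def risk_index(h, v, c):
    """Relative landslide risk index R = H * V / C (dimensionless)."""
    return h * v / c

for name, s in villages.items():
    r = risk_index(s["hazard"], s["vulnerability"], s["capacity"])
    level = "high" if r > 0.8 else "moderate" if r > 0.4 else "low"
    print(f"{name:<10} R = {r:.2f} ({level})")
```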

  16. Site specific seismic hazard analysis and determination of response spectra of Kolkata for maximum considered earthquake

    Science.gov (United States)

    Shiuly, Amit; Sahu, R. B.; Mandal, Saroj

    2017-06-01

    This paper presents a site-specific seismic hazard analysis of Kolkata city, the former capital of India and present capital of the state of West Bengal, situated on the world's largest delta, in the Bengal basin. For this purpose, the peak ground acceleration (PGA) for a maximum considered earthquake (MCE) at bedrock level has been estimated using an artificial neural network (ANN) based attenuation relationship developed on the basis of synthetic ground motion data for the region. Using the PGA corresponding to the MCE, a spectrum-compatible acceleration time history at bedrock level has been generated with a wavelet-based computer program, WAVEGEN. This spectrum-compatible time history at bedrock level has been converted to surface-level time histories using SHAKE2000 for 144 borehole locations in the study region. Using the predicted values of PGA and PGV at the surface, corresponding contours for the region have been drawn. For the MCE, the PGA at bedrock level for Kolkata city has been obtained as 0.184 g, while that at the surface level varies from 0.22 g to 0.37 g. Finally, Kolkata has been subdivided into eight seismic subzones, and for each subzone a response spectrum equation has been derived using polynomial regression analysis. This will be very helpful for structural and geotechnical engineers in designing safe and economical earthquake-resistant structures.
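
    The final step, fitting a response spectrum equation by polynomial regression, can be sketched with NumPy's least-squares polynomial fit. The period and spectral-acceleration pairs and the polynomial order below are invented for illustration; they are not the subzone spectra derived in the paper.

```python
import numpy as np

# Sketch of deriving a response-spectrum equation by polynomial regression, as
# described for the Kolkata subzones. The period/spectral-acceleration pairs are
# invented sample points, not the study's computed spectra.
periods = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.75, 1.0, 1.5, 2.0])    # s
sa = np.array([0.55, 0.80, 0.95, 0.85, 0.60, 0.42, 0.30, 0.18, 0.12])  # g

# Fit Sa(T) with a 4th-order polynomial (order chosen for illustration only).
coeffs = np.polyfit(periods, sa, deg=4)
spectrum = np.poly1d(coeffs)

print("Polynomial coefficients (highest order first):", np.round(coeffs, 3))
for t in (0.1, 0.4, 1.0):
    print(f"Sa({t:.1f} s) ~ {spectrum(t):.2f} g")
```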

  17. A Human Body Analysis System

    Directory of Open Access Journals (Sweden)

    Girondel Vincent

    2006-01-01

    This paper describes a system for human body analysis (segmentation, tracking, face/hands localisation, posture recognition) from a single view that is fast and completely automatic. The system first extracts low-level data and uses part of the data for high-level interpretation. It can detect and track several persons even if they merge or are completely occluded by another person from the camera's point of view. For the high-level interpretation step, static posture recognition is performed using a belief theory-based classifier. Belief theory is considered here as a new approach for performing posture recognition and classification using imprecise and/or conflicting data. Four different static postures are considered: standing, sitting, squatting, and lying. The aim of this paper is to give a global view and an evaluation of the performance of the entire system and to describe each of its processing steps in detail, whereas our previous publications focused on single parts of the system. The efficiency and the limits of the system have been highlighted on a database of more than fifty video sequences in which a dozen different individuals appear. The system allows real-time processing and aims at monitoring elderly people in video surveillance applications or at mixing real and virtual worlds in ambient intelligence systems.
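
    Belief-theory classification of this kind typically represents each cue as a mass function over the set of postures and fuses the cues with Dempster's rule of combination. The sketch below combines two invented mass functions over the four postures named above; the cue names and mass values are assumptions, not the paper's learned models.

```python
from itertools import product

# Minimal Dempster-Shafer (belief theory) sketch for combining two sources of
# evidence about a static posture. The frame and the mass values are illustrative.
FRAME = frozenset({"standing", "sitting", "squatting", "lying"})

# Mass functions: keys are hypothesis sets (subsets of the frame), values sum to 1.
m_height = {frozenset({"standing"}): 0.6,
            frozenset({"sitting", "squatting"}): 0.3,
            FRAME: 0.1}                      # mass on the full frame = ignorance
m_ratio = {frozenset({"standing"}): 0.5,
           frozenset({"lying"}): 0.2,
           FRAME: 0.3}

def dempster_combine(m1, m2):
    """Dempster's rule: m(A) = sum over B∩C=A of m1(B)*m2(C), normalised by 1-K,
    where K is the total mass assigned to empty intersections (conflict)."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

m = dempster_combine(m_height, m_ratio)
for hypothesis, mass in sorted(m.items(), key=lambda kv: -kv[1]):
    print(f"{'/'.join(sorted(hypothesis)):<35} {mass:.3f}")
```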

  18. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xueqin [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); School of Social Development and Public Policy, Beijing Normal University, Beijing 100875 (China); Li, Ning [State Key Laboratory of Earth Surface Processes and Resource Ecology, Beijing Normal University, Beijing 100875 (China); Yuan, Shuai, E-mail: syuan@nmemc.org.cn [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China); Xu, Ning; Shi, Wenqin; Chen, Weibin [National Marine Environmental Monitoring Center, State Oceanic Administration, Dalian 116023 (China)

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism, and the comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Considering the importance of, and deficiencies in, multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and the underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors, whereas the three-dimensional Frank copula function displayed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest that the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays the foundation for the prediction and warning of other natural disasters. - Highlights: • A method to estimate the multidimensional joint return periods is presented. • The 2D function gives better fitting results at the lower tail of hazard factors. • Three-dimensional simulation has obvious advantages in extreme value fitting. • Joint return periods are closer to the reality
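
    A minimal sketch of the joint-return-period calculation the abstract refers to, using a bivariate Frank copula with the standard "OR" and "AND" definitions; the dependence parameter theta, the marginal probabilities and the mean interarrival time are assumed illustrative values, not the fitted values from the study.

      # Minimal sketch: bivariate Frank copula and joint return periods
      # (all parameter values are assumptions for illustration).
      import numpy as np

      def frank_copula(u, v, theta):
          """Bivariate Frank copula C(u, v; theta), theta != 0."""
          num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
          den = np.exp(-theta) - 1.0
          return -np.log(1.0 + num / den) / theta

      mu = 0.6            # assumed mean interarrival time between events (years)
      theta = 5.0         # assumed Frank dependence parameter
      u, v = 0.9, 0.9     # marginal non-exceedance probabilities of the two factors

      C = frank_copula(u, v, theta)
      T_or = mu / (1.0 - C)                 # either factor exceeds its threshold
      T_and = mu / (1.0 - u - v + C)        # both factors exceed their thresholds
      print(f"OR return period: {T_or:.1f} yr, AND return period: {T_and:.1f} yr")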

  19. An Introduction to the Analysis of Paired Hazard Rates in Studies of the Family.

    Science.gov (United States)

    Smith, Ken R.; McClean, Sally I.

    1998-01-01

    Hazard rate models are described, and selected techniques are used to analyze paired hazard rates when event times are right censored. The techniques are illustrated by looking at mortality patterns in husbands and wives. Recently developed measures and models are introduced. The advantages and disadvantages of the measures are discussed.…
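
    To make the basic quantity concrete, the sketch below computes constant (exponential) hazard-rate estimates for right-censored survival times of husbands and wives separately, using synthetic data; the mortality rates, sample size and censoring time are assumptions, and the paired-dependence modelling discussed in the article (e.g. shared-frailty or robust-variance methods) is not shown.

      # Minimal sketch: exponential hazard-rate MLE with right censoring,
      # applied to synthetic husband/wife survival times.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      true_rate_h, true_rate_w = 0.04, 0.03    # assumed annual mortality hazards
      censor_time = 20.0                       # administrative right-censoring (years)

      t_h = rng.exponential(1.0 / true_rate_h, n)
      t_w = rng.exponential(1.0 / true_rate_w, n)

      def exp_hazard_mle(times, censor):
          """lambda_hat = events / total time at risk, for exponential survival."""
          observed = times <= censor
          time_at_risk = np.where(observed, times, censor).sum()
          return observed.sum() / time_at_risk

      print("husbands:", exp_hazard_mle(t_h, censor_time))
      print("wives   :", exp_hazard_mle(t_w, censor_time))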

  20. Volcanic ash layers illuminate the resilience of Neanderthals and early modern humans to natural hazards

    Science.gov (United States)

    Lowe, John; Barton, Nick; Blockley, Simon; Ramsey, Christopher Bronk; Cullen, Victoria L.; Davies, William; Gamble, Clive; Grant, Katharine; Hardiman, Mark; Housley, Rupert; Lane, Christine S.; Lee, Sharen; Lewis, Mark; MacLeod, Alison; Menzies, Martin; Müller, Wolfgang; Pollard, Mark; Price, Catherine; Roberts, Andrew P.; Rohling, Eelco J.; Satow, Chris; Smith, Victoria C.; Stringer, Chris B.; Tomlinson, Emma L.; White, Dustin; Albert, Paul; Arienzo, Ilenia; Barker, Graeme; Borić, Dušan; Carandente, Antonio; Civetta, Lucia; Ferrier, Catherine; Guadelli, Jean-Luc; Karkanas, Panagiotis; Koumouzelis, Margarita; Müller, Ulrich C.; Orsi, Giovanni; Pross, Jörg; Rosi, Mauro; Shalamanov-Korobar, Ljiljiana; Sirakov, Nikolay; Tzedakis, Polychronis C.

    2012-01-01

    Marked changes in human dispersal and development during the Middle to Upper Paleolithic transition have been attributed to massive volcanic eruption and/or severe climatic deterioration. We test this concept using records of volcanic ash layers of the Campanian Ignimbrite eruption dated to ca. 40,000 y ago (40 ka B.P.). The distribution of the Campanian Ignimbrite has been enhanced by the discovery of cryptotephra deposits (volcanic ash layers that are not visible to the naked eye) in archaeological cave sequences. They enable us to synchronize archaeological and paleoclimatic records through the period of transition from Neanderthal to the earliest anatomically modern human populations in Europe. Our results confirm that the combined effects of a major volcanic eruption and severe climatic cooling failed to have lasting impacts on Neanderthals or early modern humans in Europe. We infer that modern humans proved a greater competitive threat to indigenous populations than natural disasters. PMID:22826222

  1. Volcanic ash layers illuminate the resilience of Neanderthals and early modern humans to natural hazards.

    Science.gov (United States)

    Lowe, John; Barton, Nick; Blockley, Simon; Ramsey, Christopher Bronk; Cullen, Victoria L; Davies, William; Gamble, Clive; Grant, Katharine; Hardiman, Mark; Housley, Rupert; Lane, Christine S; Lee, Sharen; Lewis, Mark; MacLeod, Alison; Menzies, Martin; Müller, Wolfgang; Pollard, Mark; Price, Catherine; Roberts, Andrew P; Rohling, Eelco J; Satow, Chris; Smith, Victoria C; Stringer, Chris B; Tomlinson, Emma L; White, Dustin; Albert, Paul; Arienzo, Ilenia; Barker, Graeme; Boric, Dusan; Carandente, Antonio; Civetta, Lucia; Ferrier, Catherine; Guadelli, Jean-Luc; Karkanas, Panagiotis; Koumouzelis, Margarita; Müller, Ulrich C; Orsi, Giovanni; Pross, Jörg; Rosi, Mauro; Shalamanov-Korobar, Ljiljiana; Sirakov, Nikolay; Tzedakis, Polychronis C

    2012-08-21

    Marked changes in human dispersal and development during the Middle to Upper Paleolithic transition have been attributed to massive volcanic eruption and/or severe climatic deterioration. We test this concept using records of volcanic ash layers of the Campanian Ignimbrite eruption dated to ca. 40,000 y ago (40 ka B.P.). The distribution of the Campanian Ignimbrite has been enhanced by the discovery of cryptotephra deposits (volcanic ash layers that are not visible to the naked eye) in archaeological cave sequences. They enable us to synchronize archaeological and paleoclimatic records through the period of transition from Neanderthal to the earliest anatomically modern human populations in Europe. Our results confirm that the combined effects of a major volcanic eruption and severe climatic cooling failed to have lasting impacts on Neanderthals or early modern humans in Europe. We infer that modern humans proved a greater competitive threat to indigenous populations than natural disasters.

  2. [Ecotoxicology, human ecology, laser biotechnology in primary prevention of environmental health hazards].

    Science.gov (United States)

    Dobrowolski, J W

    2001-01-01

    Interdisciplinary studies in ecotoxicology (including the influence of complex physical, chemical and biological factors on ecosystems and the human food chain), human ecology (related to the estimation of individuals' exposure to different pollutants in the natural environment, the indoor environment, and through food) and environmental biotechnology (based on the application of highly sensitive biological, especially embryological, criteria in biotests for water quality, including computerised image analysis and biosensors) are the scientific basis for the primary prevention of disturbances of the ecological balance as well as of environmental risk factors for human health. Large-scale application of laser biostimulation in environmental engineering is a new opportunity for a systems approach to primary prevention through more efficient nutritional prevention, protection of proper water quality (including biotests) and protection of the human environment in working and living places.

  3. Hazardous Waste

    Science.gov (United States)

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  4. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
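
    A minimal sketch of the idea, assuming illustrative Poisson-GP parameters rather than the paper's derivation: a linear trend in the Generalized Pareto scale parameter makes the annual exceedance probability nonstationary, and the average return period follows from the resulting failure-time distribution.

      # Minimal sketch: nonstationary annual exceedance probability under a
      # Poisson-GP model and the corresponding average return period
      # (all parameter values are assumptions for illustration).
      import numpy as np
      from scipy.stats import genpareto

      threshold_excess = 50.0     # design exceedance level above the PDS threshold
      shape = 0.1                 # assumed GP shape parameter
      scale0, trend = 20.0, 0.1   # assumed initial scale and linear trend per year
      rate = 2.0                  # assumed mean number of PDS events per year (Poisson)

      years = np.arange(1, 501)
      scale_t = scale0 + trend * years        # nonstationary scale parameter
      # P(at least one event exceeds the level in year t) under the Poisson-GP model
      p_exceed = 1.0 - np.exp(-rate * genpareto.sf(threshold_excess, shape, scale=scale_t))

      # Failure-time distribution: probability that the first exceedance occurs in year t
      survival = np.cumprod(1.0 - p_exceed)
      f_t = p_exceed * np.concatenate(([1.0], survival[:-1]))
      avg_return_period = np.sum(years * f_t)
      print(f"P(exceedance) in year 1: {p_exceed[0]:.4f}, in year 50: {p_exceed[49]:.4f}")
      print(f"Average return period: {avg_return_period:.1f} years")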

  5. Direct analysis in real time mass spectrometry for the rapid identification of four highly hazardous pesticides in agrochemicals.

    Science.gov (United States)

    Wang, Lei; Zhao, Pengyue; Zhang, Fengzu; Li, Yanjie; Pan, Canping

    2012-08-30

    Direct analysis in real time (DART) is a new ion source technique, conducted in the open air under ambient conditions, that allows the rapid and direct analysis of any material (gases, liquids, and solids) with minimal or no sample preparation. In order to take advantage of the capacity of DART mass spectrometry for the real-time analysis of hazardous ingredients in commercial agrochemicals, a pilot study of rapid qualitative determination of hazardous pesticides was performed. Highly hazardous pesticides were identified by DART ionization coupled to a single-quadrupole mass spectrometer (DART-MS). Acetonitrile was chosen for dissolving samples prior to the analysis. Samples were analyzed by this technique in as little as 5 s. Phorate, carbofuran, ethoprophos and fipronil were detected directly from commercial agrochemicals. The ionization-related parameters (DART temperature, grid voltage and MS fragment) of these compounds were optimized to obtain a high response. Isotope patterns were taken into consideration for qualitative identification. Relative standard deviations (RSDs, n = 5) of 2.3-15.0% were obtained by measuring the relative abundance of selected isotopes. This study showed that DART-MS technology was able to qualitatively determine the existence of highly hazardous pesticides in commercial pesticide formulations. It is suggested that this technology should be applied for routine monitoring in the market. Copyright © 2012 John Wiley & Sons, Ltd.

  6. 78 FR 64425 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Science.gov (United States)

    2013-10-29

    ... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Food for Animals; Public... risk-based preventive controls for animal food. This proposed rule is one of several proposed rules... system. Among other things, FSMA requires FDA to issue regulations requiring preventive controls for...

  7. In vivo laser scanning microscopic investigation of the decontamination of hazardous substances from the human skin

    Science.gov (United States)

    Lademann, J.; Patzelt, A.; Schanzer, S.; Richter, H.; Gross, I.; Menting, K. H.; Frazier, L.; Sterry, W.; Antoniou, C.

    2010-12-01

    The stimulation of the penetration of topically applied substances into the skin is a topic of intensive dermatological and pharmacological research. In this context, it was found that in addition to the intercellular penetration, the follicular penetration also represents an efficient penetration pathway. The hair follicles act as a long-term reservoir for topically applied substances. They are surrounded by all important target structures, such as blood capillaries, stem and dendritic cells. Therefore, the hair follicles, as well as the skin, need to be protected from hazardous substances. The traditional method of decontamination after respective accidental contacts consists of an intensive washing of the skin. However, during this mechanical procedure, the substances can be pushed even deeper into the hair follicles. In the present study, absorbent materials were applied to remove a fluorescent model substance from the skin without inducing mechanical stress. The results were compared to the decontamination effects obtained by intensive washing. Investigations were performed by means of in vivo laser scanning microscopy (LSM). The comparison revealed that decontamination with absorbent materials is more effective than decontamination with washing processes.

  8. ANALYSIS OF HUMAN RESOURCES MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Anis Cecilia - Nicoleta

    2010-07-01

    Full Text Available Along with material and financial resources, human resources are an indispensable element of every work process. The concept of a human resource derives precisely from the fact that it is limited in nature and is consumed by use in the workplace. No work process can be carried out without the labour factor. Work is essentially a conscious activity specific to humans, through which they act upon objects of labour and transform them according to their needs.

  9. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    Directory of Open Access Journals (Sweden)

    Yu-Ting Hung

    2015-09-01

    Full Text Available To ensure the safety of peanut butter ice cream manufacturing, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. The critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management.

  10. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the cumulative probability distribution functions (CPDF) and probability density functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared with the results from different input parameter spaces.
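
    As a schematic of what such a PSHA code computes (not the code developed in this study), the sketch below evaluates a discretized single-source Cornell-McGuire hazard integral and converts the annual exceedance rate into an annual exceedance probability; the source rate, Gutenberg-Richter parameters and toy attenuation relation are all assumed.

      # Minimal sketch: single-source probabilistic seismic hazard curve
      # (annual probability of exceedance) with illustrative parameters.
      import numpy as np
      from scipy.stats import norm

      nu = 0.05                        # assumed annual rate of M >= 5 events
      b, m_min, m_max = 1.0, 5.0, 8.0  # assumed Gutenberg-Richter parameters
      r_km = 30.0                      # assumed source-to-site distance

      m = np.linspace(m_min, m_max, 200)
      dm = m[1] - m[0]
      beta = b * np.log(10.0)
      # truncated exponential (Gutenberg-Richter) magnitude density
      f_m = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

      # toy attenuation relation: ln PGA(g) = -3.5 + 0.9*M - 1.2*ln(R), sigma = 0.6
      mu_ln = -3.5 + 0.9 * m - 1.2 * np.log(r_km)
      sigma_ln = 0.6

      pga_levels = np.array([0.05, 0.1, 0.2, 0.3, 0.5])
      annual_rate = np.array([
          nu * np.sum(norm.sf((np.log(a) - mu_ln) / sigma_ln) * f_m) * dm
          for a in pga_levels
      ])
      annual_prob = 1.0 - np.exp(-annual_rate)    # Poisson assumption
      for a, p in zip(pga_levels, annual_prob):
          print(f"PGA > {a:.2f} g: annual exceedance probability {p:.2e}")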

  11. Occurrence of human pathogenic Clostridium botulinum among healthy dairy animals: an emerging public health hazard.

    Science.gov (United States)

    Abdel-Moein, Khaled A; Hamza, Dalia A

    2016-01-01

    The current study was conducted to investigate the occurrence of human pathogenic Clostridium botulinum in the feces of dairy animals. Fecal samples were collected from 203 apparently healthy dairy animals (50 cattle, 50 buffaloes, 52 sheep, 51 goats). Samples were cultured to recover C. botulinum, and human pathogenic C. botulinum strains were identified by screening all C. botulinum isolates for the presence of the genes that encode toxin types A, B, E, and F. The overall prevalence of C. botulinum was 18.7%, whereas human pathogenic C. botulinum strains (only type A) were isolated from six animals, at rates of 2, 2, 5.8, and 2% for cattle, buffaloes, sheep, and goats, respectively. The high fecal carriage rates of C. botulinum, especially type A, among apparently healthy dairy animals should alert both the veterinary and public health communities to a potential role played by dairy animals in the epidemiology of this pathogen.

  12. Multifractal Analysis in Mining Microseismicity and its Application to Seismic Hazard Analysis in Mines

    Science.gov (United States)

    Pasten, D.; Comte, D.; Vallejos, J.

    2013-05-01

    During the last decades several authors have shown that the spatial distribution of earthquakes follows multifractal laws, and the most interesting behavior is the decrease of the fractal dimensions before the occurrence of a large earthquake, and also before its main aftershocks. A multifractal analysis was applied to over 55,920 microseismic events recorded from January 2006 to January 2009 at Creighton mine, Canada. In order to work with a catalogue that is complete in magnitude, only the data associated with the linear part of the Gutenberg-Richter law, with magnitudes greater than -1.5, were used. The multifractal analysis was performed using the microseismic data, considering significant events to be those with magnitude MW ≥ 1.0. A moving window containing a constant number of events was used in order to guarantee precise estimation of the fractal dimensions. After different trials, we chose 200 events as the number of data points in each window, with two consecutive windows shifted by 20 events. The complete data set was separated into six sections, and the multifractal analysis was applied to each section of 9,320 events. The multifractal analysis of each section shows a systematic decrease of the fractal dimension (Dq) with time before the occurrence of a rockburst or natural event with magnitude MW ≥ 1.0, as is observed in the seismic sequences of large earthquakes. The methodology was repeated for minimum magnitudes MW ≥ 1.5 and MW ≥ 2.0, obtaining the same results. The best result was obtained using MW ≥ 2.0, with a correct-detection rate varying between fifty and eighty percent. The results show the possibility of systematically using the Dq parameter to anticipate the next rockburst or natural event in the studied mine. This project has been financially supported by FONDECyT Grant No. 3120237 (D.P.).
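
    A minimal sketch of the box-counting estimate of the generalized fractal dimension Dq used in this kind of analysis, applied here to a synthetic 200-event window rather than the mine catalogue; the box sizes and the uniform random point cloud are assumptions for illustration.

      # Minimal sketch: generalized dimension D_q of a 2-D event cloud by box
      # counting (synthetic data; q = 1 excluded to avoid division by zero).
      import numpy as np

      rng = np.random.default_rng(1)
      pts = rng.random((200, 2))                  # one 200-event moving window

      def generalized_dimension(points, q, eps_list):
          logs_eps, logs_sum = [], []
          for eps in eps_list:
              # assign each point to a box of side eps and compute occupation probabilities
              idx = np.floor(points / eps).astype(int)
              _, counts = np.unique(idx, axis=0, return_counts=True)
              p = counts / len(points)
              logs_eps.append(np.log(eps))
              logs_sum.append(np.log(np.sum(p ** q)))
          # sum_i p_i^q ~ eps^{(q-1) D_q}, so the slope equals (q-1) D_q
          slope = np.polyfit(logs_eps, logs_sum, 1)[0]
          return slope / (q - 1.0)

      eps_list = [0.5, 0.25, 0.125, 0.0625]
      for q in (0.0, 2.0, 5.0):
          print(f"D_{q:g} = {generalized_dimension(pts, q, eps_list):.2f}")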

  13. Human health hazard from antimicrobial-resistant enterococci in animals and food

    DEFF Research Database (Denmark)

    Heuer, Ole Eske; Hammerum, Anette Marie; Collignon, P.

    2006-01-01

    The use of antimicrobial agents in the modern farm industry has created a reservoir of resistant bacteria in food animals. Foods of animal origin are often contaminated with enterococci that are likely to contribute resistance genes, virulence factors, or other properties to enterococci in humans...... to change the current view that antimicrobial-resistant enterococci from animals pose a threat to human health. On the contrary, antimicrobial resistance genes appear to spread freely between enterococci from different reservoirs, irrespective of their apparent host association....

  14. Human error analysis of commercial aviation accidents using the human factors analysis and classification system (HFACS)

    Science.gov (United States)

    2001-02-01

    The Human Factors Analysis and Classification System (HFACS) is a general human error framework : originally developed and tested within the U.S. military as a tool for investigating and analyzing the human : causes of aviation accidents. Based upon ...

  15. Accidental hazardous material releases with human impacts in the United States: exploration of geographical distribution and temporal trends.

    Science.gov (United States)

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Chermak, Christina

    2010-09-01

    To investigate the circumstances and geographic and temporal distributions of hazardous material releases and resulting human impacts in the United States. Releases with fatalities, injuries, and evacuations were identified from reports to the National Response Center between 1990 and 2008, correcting for data quality issues identified in previous studies. From more than 550,000 reports, 861 deaths, 16,348 injuries and 741,427 evacuations were identified. Injuries from releases of chemicals at fixed facilities and natural gas from pipelines have decreased whereas evacuations from petroleum releases at fixed facilities have increased. Results confirm recent advances in chemical and pipeline safety and suggest directions for further improvement including targeted training and inspections and adoption of inherently safer design principles.

  16. Job safety analysis and hazard identification for work accident prevention in para rubber wood sawmills in southern Thailand.

    Science.gov (United States)

    Thepaksorn, Phayong; Thongjerm, Supawan; Incharoen, Salee; Siriwong, Wattasit; Harada, Kouji; Koizumi, Akio

    2017-11-25

    We utilized job safety analysis (JSA) and hazard identification for work accident prevention in Para rubber wood sawmills, with the aim of investigating occupational health risk exposures and assessing the health hazards at sawmills in Trang Province, southern Thailand. We conducted a cross-sectional study which included a walk-through survey, JSA, occupational risk assessment, and environmental samplings from March through September 2015 at four Para rubber wood sawmills. We identified potential occupational safety and health hazards associated with six main processes, including: 1) logging and cutting, 2) sawing the lumber into sheets, 3) planing and re-arranging, 4) vacuuming and wood preservation, 5) drying and plank re-arranging, and 6) grading, packing, and storing. Working in sawmills was associated with a high risk of wood dust and noise exposure, occupational accidents injuring hands and feet, chemical and fungicide exposure, and injury due to poor ergonomics or repetitive work. Several high-risk areas were identified from the JSA and hazard identification of the working processes, especially high wood dust and noise exposure when sawing lumber into sheets and the risk of occupational accidents to the hands and feet when struck by lumber. All workers were strongly recommended to use personal protective equipment in all working processes. Exposures should be controlled using local ventilation systems and by reducing noise transmission. We recommend that the results from the risk assessment performed in this study be used to create an action plan for reducing occupational health hazards in Para rubber sawmills.

  17. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity.

    Science.gov (United States)

    Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.

  18. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Harper, Bryan [Oregon State University (United States); Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan [Pacific Northwest National Laboratory (United States); Tang, Kaizhi [Intelligent Automation, Inc. (United States); Heredia-Langner, Alejandro [Pacific Northwest National Laboratory (United States); Lins, Roberto [CPqAM, Oswaldo Cruz Foundation, FIOCRUZ-PE (Brazil); Harper, Stacey, E-mail: stacey.harper@oregonstate.edu [Oregon State University (United States)

    2015-06-15

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure–toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure–activity relationships.

  19. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    Energy Technology Data Exchange (ETDEWEB)

    (NOEMAIL), R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order, DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map Hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are the EPRI (2004), USGS (2002) and a regional specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004) respectively. This weighting gives cluster weights of .39, .29, .15, .17 for the 1-corner, 2-corner, hybrid, and Greens-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997) that were based on the LLNL (1993) and EPRI (1988) PSHAs. The

  20. Correlation analysis of heat flux and fire behaviour and hazards of polycrystalline silicon photovoltaic panels

    Science.gov (United States)

    Ju, Xiaoyu; Zhou, Xiaodong; Peng, Fei; Wu, Zhibo; Lai, Dimeng; Hu, Yue; Yang, Lizhong

    2017-05-01

    This work aims to gain a better understanding of the fire behaviour and hazards of PV panels under different radiant heat fluxes. Cone calorimeter tests were used to simulate the situations in which the front or the back surface is exposed to heat flux in a fire. Through comparison of ignition time, mass loss rate and heat release rate, it is found that the back-up condition is more hazardous than the face-up condition. Meanwhile, three key parameters, flashover propensity, total heat release and FED (fractional effective dose), were introduced to quantitatively illustrate the fire hazards of a PV panel.

  1. Testing to fulfill HACCP (Hazard Analysis Critical Control Points) requirements: principles and examples.

    Science.gov (United States)

    Gardner, I A

    1997-12-01

    On-farm HACCP (hazard analysis critical control points) monitoring requires cost-effective, yet accurate and reproducible tests that can determine the status of cows, milk, and the dairy environment. Tests need to be field-validated, and their limitations need to be established so that appropriate screening strategies can be initiated and test results can be rationally interpreted. For infections and residues of low prevalence, tests or testing strategies that are highly specific help to minimize false-positive results and excessive costs to the dairy industry. The determination of the numbers of samples to be tested in HACCP monitoring programs depends on the specific purpose of the test and the likely prevalence of the agent or residue at the critical control point. The absence of positive samples from a herd test should not be interpreted as freedom from a particular agent or residue unless the entire herd has been tested with a test that is 100% sensitive. The current lack of field-validated tests for most of the chemical and infectious agents of concern makes it difficult to ensure that the stated goals of HACCP programs are consistently achieved.
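
    Two standard calculations lie behind the sampling statements in this abstract: the number of samples needed to detect at least one positive with a given confidence, and the probability that a herd test misses a low-prevalence agent. The sketch below implements both under the usual large-herd assumption; the prevalence and test sensitivity values are illustrative, not taken from the article.

      # Minimal sketch: detection-oriented sample size and herd-level detection
      # probability, assuming a large herd and an imperfect test.
      import math

      def n_samples(prevalence, confidence=0.95, sensitivity=1.0):
          """Samples needed to detect >= 1 positive with the given confidence."""
          return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence * sensitivity))

      def herd_detection_prob(prevalence, n, sensitivity=1.0):
          """Probability that testing n animals yields at least one positive."""
          return 1.0 - (1.0 - prevalence * sensitivity) ** n

      # e.g. an agent or residue present in 2% of cows, test sensitivity 90%
      print(n_samples(0.02, 0.95, 0.90))           # samples needed for 95% confidence
      print(herd_detection_prob(0.02, 30, 0.90))   # detection probability with only 30 samples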

  2. HACCP (Hazard Analysis Critical Control Points): is it coming to the dairy?

    Science.gov (United States)

    Cullor, J S

    1997-12-01

    The risks and consequences of foodborne and waterborne pathogens are coming to the forefront of public health concerns, and strong pressure is being applied on agriculture for immediate implementation of on-farm controls. The FDA is considering HACCP (Hazard Analysis Critical Control Points) as the new foundation for revision of the US Food Safety Assurance Program because HACCP is considered to be a science-based, systematic approach to the prevention of food safety problems. In addition, the implementation of HACCP principles permits more government oversight through requirements for standard operating procedures and additional systems for keeping records, places primary responsibility for ensuring food safety on the food manufacturer or distributor, and may assist US food companies in competing more effectively in the world market. With the HACCP-based program in place, a government investigator should be able to determine and evaluate both current and past conditions that are critical to ensuring the safety of the food produced by the facility. When this policy is brought to the production unit, the impact for producers and veterinarians will be substantial.

  3. Landslide hazard analysis for pipelines: The case of the Simonette river crossing

    Energy Technology Data Exchange (ETDEWEB)

    Grivas, D.A.; Schultz, B.C. [Arista International, Inc., Niskayuna, NY (United States); O`Neil, G.; Rizkalla, M. [NOVA Gas Transmission Ltd., Calgary, Alberta (Canada); McGuffey, V.C.

    1995-12-31

    The overall objective of this study is to develop a probabilistic methodology to analyze landslide hazards and their effects on the safety of buried pipelines. The methodology incorporates a range of models that can accommodate differences in the ground movement modes and in the amount and type of information available at various site locations. Two movement modes are considered, namely (a) instantaneous (catastrophic) slides, and (b) gradual ground movement which may result in cumulative displacements over the pipeline design life (30-40 years) that are in excess of allowable values. Probabilistic analysis is applied in each case to address the uncertainties associated with important factors that control slope stability. Availability of information ranges from relatively well studied, instrumented installations to cases where data are limited to what can be derived from topographic and geologic maps. The methodology distinguishes between procedures applied where there is little information and those that can be used when relatively extensive data are available. Important aspects of the methodology are illustrated in a case study involving a pipeline located in Northern Alberta, Canada, in the Simonette river valley.

  4. Enclosure fire hazard analysis using relative energy release criteria. [burning rate and combustion control

    Science.gov (United States)

    Coulbert, C. D.

    1978-01-01

    A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.
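
    As a rough illustration of two of the five constraints (not the paper's RERC formulation), the sketch below evaluates the classical ventilation-limited heat release rate approximation, Q ≈ 1500·A·√H kW, and the total-fuel-load energy bound; the opening geometry, fuel mass and heat of combustion are assumed values.

      # Minimal sketch: two textbook-style bounds on compartment heat release
      # (illustrative numbers only).
      import math

      def ventilation_limited_hrr_kw(opening_area_m2, opening_height_m):
          """Classical ventilation limit, Q ~ 1500 * A * sqrt(H) kW (Kawagoe-type)."""
          return 1500.0 * opening_area_m2 * math.sqrt(opening_height_m)

      def fuel_load_energy_mj(fuel_mass_kg, heat_of_combustion_mj_per_kg):
          """Total energy available from the fuel load (an integral constraint on HRR)."""
          return fuel_mass_kg * heat_of_combustion_mj_per_kg

      A, H = 2.0, 2.0      # assumed door opening: 2 m^2 area, 2 m height
      print(ventilation_limited_hrr_kw(A, H), "kW ventilation-limited HRR")
      print(fuel_load_energy_mj(200.0, 17.0), "MJ total energy from a 200 kg wood load")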

  5. Hazard analysis and possibilities for preventing botulism originating from meat products

    Directory of Open Access Journals (Sweden)

    Vasilev Dragan

    2008-01-01

    Full Text Available The paper presents the most important data on the bacterium Clostridium botulinum, the occurrence of botulism, hazard analysis and the possibilities for preventing botulism. Proteolytic strains of C. botulinum Group I, whose spores are resistant to heat, create toxins predominantly in cans containing slightly sour food items in the event that the spores are not inactivated in the course of sterilization. Non-proteolytic strains of Group II are more sensitive to high temperatures, but they have the ability to grow and create toxins at low temperatures. Type E most often creates a toxin in vacuum-packed smoked fish, and the non-proteolytic strain of type B in dried hams and certain pasteurized meat products. The following play an important role in the prevention of botulism: reducing meat contamination with spores of clostridia to a minimum, implementing good hygiene measures and production practice during the slaughter of animals, the inactivation of C. botulinum spores during sterilization (F > 3), and, in dried hams and pasteurized products, the prevention of bacterial growth and toxin formation by maintaining low temperatures in the course of production and storage, as well as the correct use of substances that inhibit the multiplication of bacteria and the production of toxins (nitrites, table salt, etc.).

  6. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    Science.gov (United States)

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  7. Hazard analysis and critical control point systems in the United States Department of Agriculture regulatory policy.

    Science.gov (United States)

    Billy, T J; Wachsmuth, I K

    1997-08-01

    Recent outbreaks of foodborne illness and studies by expert groups have established the need for fundamental change in the United States meat and poultry inspection programme to reduce the risk of foodborne illness. The Food Safety and Inspection Service (FSIS) of the United States Department of Agriculture (USDA) has embarked on a broad effort to bring about such change, with particular emphasis on the reduction of pathogenic micro-organisms in raw meat and poultry products. The publication on 25 July 1996 of the Final Rule on pathogen reduction and hazard analysis and critical control point (HACCP) systems was a major milestone in the FSIS strategy for change. The Final Rule provides a framework for change and clarifies the respective roles of industry and government in ensuring the safety of meat and poultry products. With the implementation of this Final Rule underway, the FSIS has been exploring ways in which slaughter inspection carried out under an HACCP-based system can be changed so that food safety risks are addressed more adequately and the allocation of inspection resources is improved further. In addition, the FSIS is broadening the focus of food safety activities to extend beyond slaughter and processing plants by working with industry, academia and other government agencies. Such co-operation should lead to the development of measures to improve food safety before animals reach the slaughter plant and after products leave the inspected establishment for distribution to the retail level. For the future, the FSIS believes that quantitative risk assessments will be at the core of food safety activities. Risk assessments provide the most effective means of identifying how specific pathogens and other hazards may be encountered throughout the farm-to-table chain and of measuring the potential impact of various interventions. In addition, these assessments will be used in the development and evaluation of HACCP systems. The FSIS is currently conducting a

  8. Analysis of XXI Century Disasters in the National Geophysical Data Center Historical Natural Hazard Event Databases

    Science.gov (United States)

    Dunbar, P. K.; McCullough, H. L.

    2011-12-01

    The National Geophysical Data Center (NGDC) maintains a global historical event database of tsunamis, significant earthquakes, and significant volcanic eruptions. The database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Event date, time, location, magnitude of the phenomenon, and socio-economic information are included in the database. Analysis of the NGDC event database reveals that the 21st century began with earthquakes in Gujarat, India (magnitude 7.7, 2001) and Bam, Iran (magnitude 6.6, 2003) that killed over 20,000 and 31,000 people, respectively. These numbers were dwarfed by the numbers of earthquake deaths in Pakistan (magnitude 7.6, 2005: 86,000 deaths), Wenchuan, China (magnitude 7.9, 2008: 87,652 deaths), and Haiti (magnitude 7.0, 2010: 222,000 deaths). The Haiti event also ranks among the top ten most fatal earthquakes. The 21st century has observed the most fatal tsunami in recorded history: the 2004 magnitude 9.1 Sumatra earthquake and tsunami caused over 227,000 deaths and $10 billion in damage in 14 countries. Six years later, the 2011 Tohoku, Japan earthquake and tsunami, although not the most fatal (15,000 deaths and 5,000 missing), could cost Japan's government in excess of $300 billion, making it the most expensive tsunami in history. Volcanic eruptions can cause disruptions and economic impact to the airline industry, but due to their remote locations, fatalities and direct economic effects are uncommon. Despite this fact, the second most expensive eruption in recorded history occurred in the 21st century: the 2010 Merapi, Indonesia volcanic eruption resulted in 324 deaths, 427 injuries, and $600 million in damage. NGDC integrates all natural hazard event datasets into one search interface. Users can find fatal tsunamis generated by earthquakes or volcanic eruptions. The user can then link to information about the related runup

  9. Human Capital Development: Comparative Analysis of BRICs

    Science.gov (United States)

    Ardichvili, Alexandre; Zavyalova, Elena; Minina, Vera

    2012-01-01

    Purpose: The goal of this article is to conduct macro-level analysis of human capital (HC) development strategies, pursued by four countries commonly referred to as BRICs (Brazil, Russia, India, and China). Design/methodology/approach: This analysis is based on comparisons of macro indices of human capital and innovativeness of the economy and a…

  10. Hazard analysis in active landslide areas in the State of Veracruz, Mexico

    Science.gov (United States)

    Wilde, Martina; Morales Barrera, Wendy V.; Rodriguez Elizarrarás, Sergio R.; Solleiro Rebolledo, Elizabeth; Sedov, Sergey; Terhorst, Birgit

    2016-04-01

    mass movements are analyzed in order to reconstruct the complex interrelations of the causes and effects of landslide events. One of the major objectives of this research is to evaluate the potential hazard of active landslide areas. Detailed field analyses were performed to investigate the situation and dynamics of the slope movements; geomorphological mapping, sediment characterization and geophysical methods were applied. On the one hand, a detailed sediment characterization aims to identify the type of material (e.g. geotechnical attributes); on the other hand, sediments can provide information on different activity phases and movement processes within slide masses. Furthermore, the focus is placed on the determination of landslide-relevant parameters and thresholds. Digital elevation models, which were generated before the onset of slope movements, are integrated into the geomorphological analysis. The poster presents the specific study sites in Veracruz and the situation of endangered slopes before and after the landslide events. It is planned to use this knowledge to model susceptibility maps for the region in the future. Moreover, field data will be used as basic information for further monitoring plans. The resulting susceptibility maps will be provided to the responsible authorities in order to support sustainable planning of settlements and infrastructure in hazardous regions.

  11. Parameter estimation in Probabilistic Seismic Hazard Analysis: current problems and some solutions

    Science.gov (United States)

    Vermeulen, Petrus

    2017-04-01

    A typical Probabilistic Seismic Hazard Analysis (PSHA) comprises identification of seismic source zones, determination of hazard parameters for these zones, selection of an appropriate ground motion prediction equation (GMPE), and integration over probabilities according to the Cornell-McGuire procedure. Determination of hazard parameters often does not receive the attention it deserves, and, therefore, problems therein are often overlooked. Here, many of these problems are identified, and some of them addressed. The parameters that need to be identified are those associated with the frequency-magnitude law, those associated with the earthquake recurrence law in time, and the parameters controlling the GMPE. This study is concerned with the frequency-magnitude law and the temporal distribution of earthquakes, not with GMPEs. The Gutenberg-Richter frequency-magnitude law is usually adopted for the frequency-magnitude law, and a Poisson process for earthquake recurrence in time. Accordingly, the parameters that need to be determined are the slope parameter of the Gutenberg-Richter frequency-magnitude law, i.e. the b-value, the maximum magnitude at which the Gutenberg-Richter law applies, mmax, and the mean recurrence frequency, λ, of earthquakes. If, instead of the Cornell-McGuire procedure, the "Parametric-Historic procedure" is used, these parameters do not have to be known before the PSHA computations; they are estimated directly during the PSHA computation. The resulting relation for the frequency of ground motion vibration parameters has a functional form analogous to the frequency-magnitude law, described by the parameters γ (analogous to the b-value of the Gutenberg-Richter law) and the maximum possible ground motion amax (analogous to mmax). Originally, the approach could be applied only to simple GMPEs; recently, however, the method was extended to incorporate more complex forms of GMPEs. With regard to the parameter mmax, there are numerous methods of estimation
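
    A minimal sketch of the most common estimators for two of the parameters discussed above, the Aki-Utsu maximum-likelihood b-value (with the Utsu binning correction) and the mean annual rate λ, applied to a synthetic catalogue; the completeness magnitude, bin width and catalogue span are assumptions for illustration.

      # Minimal sketch: Aki-Utsu b-value and mean annual rate from a synthetic,
      # binned earthquake catalogue (illustrative parameters only).
      import numpy as np

      rng = np.random.default_rng(2)
      m_c, delta_m, years = 3.0, 0.1, 40.0   # completeness magnitude, bin width, span
      true_b = 1.0

      # true magnitudes follow a Gutenberg-Richter (exponential) law above the
      # completeness threshold and are then reported to the nearest 0.1 unit
      mags = (m_c - delta_m / 2.0) + rng.exponential(scale=np.log10(np.e) / true_b, size=800)
      mags = np.round(mags / delta_m) * delta_m

      # Aki-Utsu maximum-likelihood estimator with the binning correction
      b_hat = np.log10(np.e) / (mags.mean() - (m_c - delta_m / 2.0))
      lam_hat = len(mags) / years            # mean annual rate of M >= m_c events
      print(f"b-value estimate: {b_hat:.2f}, annual rate: {lam_hat:.1f} events/yr")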

  12. Metagenomic Analysis of the Human Gut Microbiome

    DEFF Research Database (Denmark)

    dos Santos, Marcelo Bertalan Quintanilha

    of our results changes the way we link the gut microbiome with diseases. Our results indicate that inflammatory diseases will affect the ecological system of the human gut microbiome, reducing its diversity. Classification analysis of healthy and unhealthy individuals demonstrates that unhealthy......Understanding the link between the human gut microbiome and human health is one of the biggest scientific challenges in our decade. Because 90% of our cells are bacteria, and the microbial genome contains 200 times more genes than the human genome, the study of the human microbiome has...... the potential to impact many areas of our health. This PhD thesis is the first study to generate a large amount of experimental data on the DNA and RNA of the human gut microbiome. This was made possible by our development of a human gut microbiome array capable of profiling any human gut microbiome. Analysis...

  13. SCEC Community Modeling Environment (SCEC/CME) - Seismic Hazard Analysis Applications and Infrastructure

    Science.gov (United States)

    Maechling, P. J.; Jordan, T. H.; Kesselman, C.; Moore, R.; Minster, B.; SCEC ITR Collaboration

    2003-12-01

    The Southern California Earthquake Center (SCEC) has formed a Geoscience/IT partnership to develop an advanced information infrastructure for system-level earthquake science in Southern California. This SCEC/ITR partnership comprises SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Institutions for Research in Seismology (IRIS), and the U.S. Geological Survey. This collaboration recently completed the second year in a five-year National Science Foundation (NSF) funded ITR project called the SCEC Community Modeling Environment (SCEC/CME). The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed by project collaborators include a Probabilistic Seismic Hazard Analysis system called OpenSHA [Field et al., this meeting]. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERF's). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. A Rupture Dynamic Model (RDM) has also been developed that couples a rupture dynamics simulation into an anelastic wave model. The collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of SHA programs. To support computationally expensive simulations, we have constructed a grid-based system utilizing Globus software [Kesselman et al., this meeting]. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC, NPACI and Teragrid High Performance Computing Centers. We have

  14. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

    Science.gov (United States)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

    2005-12-01

    The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Institutions for Research in Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure means of maintaining the association between the data sets and their metadata. To provide an easy

  15. Advanced Rapid Imaging & Analysis for Monitoring Hazards (ARIA-MH) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a service-oriented hazard/disaster monitoring data system enabling both science and decision-support communities to monitor ground motion in areas of...

  16. Readiness to implement Hazard Analysis and Critical Control Point (HACCP) systems in Iowa schools.

    Science.gov (United States)

    Henroid, Daniel; Sneed, Jeannie

    2004-02-01

    To evaluate current food-handling practices, food safety prerequisite programs, and employee knowledge and food safety attitudes and provide baseline data for implementing Hazard Analysis and Critical Control Point (HACCP) systems in school foodservice. One member of the research team visited each school to observe food-handling practices and assess prerequisite programs using a structured observation form. A questionnaire was used to determine employees' attitudes, knowledge, and demographic information. A convenience sample of 40 Iowa schools was recruited with input from the Iowa Department of Education. Descriptive statistics were used to summarize data. One-way analysis of variance was used to assess differences in attitudes and food safety knowledge among managers, cooks, and other foodservice employees. Multiple linear regression assessed the relationship between manager and school district demographics and the food safety practice score. Proper food-handling practices were not being followed in many schools and prerequisite food safety programs for HACCP were found to be inadequate for many school foodservice operations. School foodservice employees were found to have a significant amount of food safety knowledge (15.9+/-2.4 out of 20 possible points). School districts with managers (P=.019) and employees (P=.030) who had a food handler certificate were found to have higher food safety practice scores. Emphasis on implementing prerequisite programs in preparation for HACCP is needed in school foodservice. Training programs, both basic food safety such as ServSafe and HACCP, will support improvement of food-handling practices and implementation of prerequisite programs and HACCP.

  17. Flood hazard zoning in Yasooj region, Iran, using GIS and multi-criteria decision analysis

    OpenAIRE

    Omid Rahmati; Hossein Zeinivand; Mosa Besharat

    2016-01-01

    Flood is considered to be the most common natural disaster worldwide during the last decades. Flood hazard potential mapping is required for the management and mitigation of floods. The present research aimed to assess the efficiency of the analytical hierarchy process (AHP) in identifying potential flood hazard zones by comparing the results with those of a hydraulic model. Initially, four parameters, namely distance to river, land use, elevation and land slope, were used in some part of the Yasooj River, I...
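
    A minimal sketch of the AHP weighting step for the four criteria named in the abstract, using an assumed pairwise comparison matrix rather than the authors' judgements; the weights come from the principal eigenvector and the consistency check uses Saaty's random index for n = 4.

      # Minimal sketch: AHP criterion weights and consistency ratio from an
      # assumed pairwise comparison matrix (Saaty 1-9 scale).
      import numpy as np

      criteria = ["distance to river", "land use", "elevation", "slope"]
      # A[i, j] = assumed relative importance of criterion i over criterion j
      A = np.array([
          [1.0,   3.0, 2.0,   4.0],
          [1/3.0, 1.0, 1/2.0, 2.0],
          [1/2.0, 2.0, 1.0,   3.0],
          [1/4.0, 1/2.0, 1/3.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)   # consistency index
      cr = ci / 0.90                         # random index RI ~ 0.90 for n = 4
      for c, w in zip(criteria, weights):
          print(f"{c:17s} weight = {w:.3f}")
      print(f"consistency ratio = {cr:.3f} (acceptable if < 0.10)")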

  18. Workplace health hazards: analysis of hotline calls over a six-year period.

    Science.gov (United States)

    Quint, J; Handley, M; Cummings, K

    1990-01-01

    Between 1981 and 1986, a state-based occupational health telephone hotline received more than 8,000 inquiries on over 3,000 hazardous agents. Major caller groups were employees (37%), employers (20%), health care providers, primarily physicians (19%), government agencies (12%), and labor unions (6%). Employees were the fastest growing caller group. Callers inquired about general health hazards of chemicals (65%), the relation of symptoms to work (22%), and risks to pregnancy (13%). PMID:2297067

  19. New insight into bacterial zoonotic pathogens posing health hazards to humans

    Directory of Open Access Journals (Sweden)

    Marcin Ciszewski

    2014-12-01

    Full Text Available This article presents the problem of evolutionary changes of zoonotic pathogens responsible for human diseases. Everyone is exposed to the risk of zoonotic infection, particularly employees having direct contact with animals, i.e. veterinarians, breeders, butchers and workers in the animal products processing industry. The article focuses on pathogens monitored by the European Centre for Disease Prevention and Control (ECDC), which has been collecting statistical data on zoonoses from all European Union countries for 19 years and publishing the collected data in annual epidemiological reports. Currently, the 11 most important pathogens responsible for causing human zoonotic diseases are being monitored, of which seven are bacteria: Salmonella spp., Campylobacter spp., Listeria monocytogenes, Mycobacterium bovis, Brucella spp., Coxiella burnetii and Verotoxin-producing E. coli (VTEC) / Shiga-like toxin-producing E. coli (STEC). Foodborne pathogens are considered particularly important. The article also covers new emerging zoonotic bacteria, which are not currently monitored by ECDC but might pose a serious epidemiological problem in the foreseeable future: Streptococcus iniae, S. suis, S. dysgalactiae and the staphylococci Staphylococcus intermedius and S. pseudintermedius. These species have just crossed the animal-human interspecies barrier. The exact mechanism of this phenomenon remains unknown; it is, however, connected with genetic variability and the capability to survive in a changing environment. These abilities derive from DNA rearrangement and horizontal gene transfer between bacterial cells. The substantial increase in the number of scientific publications on this subject, observed over the last few years, illustrates the importance of the problem. Med Pr 2014;65(6):819–829

  20. [An analysis of occupational hazard in manufacturing industry in Guangzhou, China, in 2013].

    Science.gov (United States)

    Zhang, Haihong; Li, Yongqin; Zhou, Hailin; Rong, Xing; Zhu, Shaofang; He, Yinan; Zhai, Ran; Liu, Yiming

    2015-08-01

    To provide data for occupational health supervision by analyzing the occupational health status of the manufacturing industry in Guangzhou, China. The occupational health investigation was performed in 280 enterprises randomly selected from 8 industries based on industry stratification. According to the occupational health standards, 198 out of the 280 enterprises were supervised and monitored. Sample testing was performed in 3-5 workplaces where workers were exposed to the highest concentration/intensity of occupational hazard for the longest time. Comparative analyses of the rates at which hazard levels exceeded occupational limits were performed among enterprises, workplaces, and testing items from different industries. The concentrations of occupational hazard in 42.93% (85/198) of enterprises and 22.96% (200/871) of workplaces were above the limit concentration. The most severe hazards were noise in the shipbuilding and wooden furniture industries and welding fumes in the shipbuilding industry. Less than 30% of enterprises were able to provide occupational health examination reports and periodic test reports of occupational hazard in workplaces. The rate of workers with abnormal occupational health examination results who needed reexamination reached 6.63% (832/12 549), and these workers were mostly from the shipbuilding, wooden furniture, and chemical industries. Occupational health supervision of enterprises should be strengthened, and hazards from noise and dust should be selectively controlled or reduced. The publication of relevant occupational health data and information by enterprises should be promoted to enhance social supervision.

  1. Flood hazard zoning in Yasooj region, Iran, using GIS and multi-criteria decision analysis

    Directory of Open Access Journals (Sweden)

    Omid Rahmati

    2016-05-01

    Full Text Available Flood is considered to be the most common natural disaster worldwide during the last decades. Flood hazard potential mapping is required for the management and mitigation of flood. The present research aimed to assess the efficiency of the analytical hierarchical process (AHP) to identify potential flood hazard zones by comparison with the results of a hydraulic model. Initially, four parameters, namely distance to river, land use, elevation and land slope, were used in some part of the Yasooj River, Iran. In order to determine the weight of each effective factor, questionnaires of comparison ratings on Saaty's scale were prepared and distributed to eight experts. The normalized weights of the criteria/parameters were determined based on Saaty's nine-point scale and their importance in specifying flood hazard potential zones, using the AHP and eigenvector methods. The set of criteria was integrated by the weighted linear combination method using ArcGIS 10.2 software to generate the flood hazard prediction map. The inundation simulation (extent and depth of flood) was conducted using the hydrodynamic program HEC-RAS for 50- and 100-year return period floods. The validation of the flood hazard prediction map was conducted based on the flood extent and depth maps. The results showed that the AHP technique is promising for making accurate and reliable predictions of flood extent. Therefore, the AHP and geographic information system (GIS) techniques are suggested for assessment of the flood hazard potential, specifically in no-data regions.
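
    The abstract derives criterion weights from Saaty pairwise comparisons with the eigenvector method and combines the criteria by weighted linear combination. A minimal sketch of that computation follows; the comparison matrix and the toy factor rasters are assumptions, not the study's actual expert judgments.

        import numpy as np

        # Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for
        # [distance to river, land use, elevation, slope]; values are illustrative.
        A = np.array([
            [1.0, 3.0, 5.0, 4.0],
            [1/3, 1.0, 3.0, 2.0],
            [1/5, 1/3, 1.0, 1/2],
            [1/4, 1/2, 2.0, 1.0],
        ])

        # Principal eigenvector gives the criterion weights
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()

        # Consistency ratio check (random index RI = 0.90 for n = 4)
        lambda_max = eigvals.real[k]
        ci = (lambda_max - 4) / (4 - 1)
        cr = ci / 0.90
        print("weights:", np.round(weights, 3), "CR:", round(cr, 3))

        # Weighted linear combination of normalized criterion rasters (toy 2x2 grids)
        criteria = np.random.rand(4, 2, 2)   # stand-ins for the normalized factor maps
        flood_hazard_index = np.tensordot(weights, criteria, axes=1)
        print(flood_hazard_index)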

  2. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, Northwest Italy

    Science.gov (United States)

    Turconi, L.; Tropeano, D.; Savio, G.; De, S. K.; Mason, P. J.

    2015-09-01

    The study area (600 km²), consisting of the Orco and Soana valleys in the Western Italian Alps, has experienced different types of natural hazards, typical of the whole Alpine environment. Some of the authors were requested to draw up a civil protection plan for this mountainous region. This offered the special opportunity (1) to draw on a large amount of unpublished historical data, dating back several centuries, mostly concerning natural hazard processes and related damage, (2) to develop original detailed geomorphological studies in a region still poorly known, (3) to prepare detailed thematic maps illustrating landscape components related to natural conditions and hazards, (4) to thoroughly check present-day situations in the area compared to the effects of past events and (5) to find adequate natural hazard scenarios for all sites exposed to risk. The method of work has been essentially to compare archival findings with field evidence in order to assess natural hazard processes, their occurrence and magnitude, and to arrange all such elements in a database for GIS-supported thematic maps. Several types of natural hazards, such as landslides, rockfalls, debris flows, stream floods and snow avalanches, cause huge damage to lives and property (housing, roads, tourist sites). We aim to provide newly acquired knowledge of this large, still poorly understood area as well as to develop easy-to-interpret products such as natural risk maps.

  3. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear or non-nuclear facilities shall be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach which depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazard assessments to ensure that a consistent use of probabilistic seismic hazard analysis is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to the criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, a summary of important application issues is presented, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  4. Human Performance Modeling for Dynamic Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory

    2015-08-01

    Part of the U.S. Department of Energy’s (DOE’s) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  5. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    Science.gov (United States)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, due to technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the hazard process side, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards and their comparison with reinstatement costs are determined by the use of empirical functions. Hence, relations between process intensities and the extent of losses, gathered by the analysis of historic hazard events and the information of object-specific restoration values, are used. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are gathered quantitatively and spatially distributed by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, including also openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  6. Analysis of human emotion in human-robot interaction

    Science.gov (United States)

    Blar, Noraidah; Jafar, Fairul Azni; Abdullah, Nurhidayu; Muhammad, Mohd Nazrin; Kassim, Anuar Muhamed

    2015-05-01

    Robots are widely applied in human work, such as in industry and hospitals. It is therefore believed that humans and robots can collaborate well to achieve optimum work results. The objectives of this project are to analyze human-robot collaboration and to understand human feelings (kansei factors) when dealing with robots, so that robots can adapt to those feelings. Researchers are currently exploring the area of human-robot interaction with the intention of reducing problems that exist in today's society. Studies have found that good interaction between humans and robots first requires an understanding of the abilities of each. Kansei Engineering in robotics was used to carry out the project. The project experiments were conducted by distributing questionnaires to students and technicians. The questionnaire results were then analyzed using SPSS. The results of the analysis show that five feelings are significant to humans in human-robot interaction: anxious, fatigued, relaxed, peaceful, and impressed.

  7. Critical evaluation of key evidence on the human health hazards of exposure to bisphenol A

    Science.gov (United States)

    Hengstler, JG; Foth, H; Gebel, T; Kramer, P-J; Lilienblum, W; Schweinfurth, H; Völkel, W; Wollin, K-M; Gundert-Remy, U

    2011-01-01

    Despite the fact that more than 5000 safety-related studies have been published on bisphenol A (BPA), there seems to be no resolution of the apparently deadlocked controversy as to whether exposure of the general population to BPA causes adverse effects due to its estrogenicity. Therefore, the Advisory Committee of the German Society of Toxicology reviewed the background and cutting-edge topics of this BPA controversy. The current tolerable daily intake value (TDI) of 0.05 mg/kg body weight [bw]/day, derived by the European Food Safety Authority (EFSA), is mainly based on body weight changes in two- and three-generation studies in mice and rats. Recently, these studies and the derivation of the TDI have been criticized. After having carefully considered all arguments, the Committee had to conclude that the criticism was scientifically not justified; moreover, recently published additional data further support the reliability of the two- and three-generation studies demonstrating a lack of estrogen-dependent effects at and below doses on which the current TDI is based. A frequently discussed topic is whether doses below 5 mg/kg bw/day may cause adverse health effects in laboratory animals. Meanwhile, it has become clear that positive results from some explorative studies have not been confirmed in subsequent studies with higher numbers of animals or a priori defined hypotheses. Particularly relevant are some recent studies with negative outcomes that addressed effects of BPA on the brain, behavior, and the prostate in rodents for extrapolation to the human situation. The Committee came to the conclusion that rodent data can well be used as a basis for human risk evaluation. Currently published conjectures that rats are insensitive to estrogens compared to humans can be refuted. Data from toxicokinetics studies show that the half-life of BPA in adult human subjects is less than 2 hours and BPA is completely recovered in urine as BPA-conjugates. Tissue deconjugation

  8. Human Health Hazards from Antimicrobial-Resistant Escherichia coli of Animal Origin

    DEFF Research Database (Denmark)

    Hammerum, A. M.; Heuer, Ole Eske

    2009-01-01

    Because of the intensive use of antimicrobial agents in food animal production, meat is frequently contaminated with antimicrobial-resistant Escherichia coli. Humans can be colonized with E. coli of animal origin, and because of resistance to commonly used antimicrobial agents, these bacteria may...... cause infections for which limited therapeutic options are available. This may lead to treatment failure and can have serious consequences for the patient. Furthermore, E. coli of animal origin may act as a donor of antimicrobial resistance genes for other pathogenic E. coli. Thus, the intensive use...

  9. Radon Gas-Hazardous Element for Human Life Really Found in the Environment

    OpenAIRE

    Rizo Maestre, Carlos; Chinchón Yepes, Servando

    2015-01-01

    Radon is a gas that is considered an extremely harmful element to people’s health by the World Health Organization (WHO). Radon is a radioactive gaseous element that is present in almost all materials with which buildings are constructed, as well as in the areas in which they are erected. For this reason, one should take into consideration in what proportion radon is harmful and in what proportion it surrounds the human environment. In Spain, the CTE (Technical Building Code) does n...

  10. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    Science.gov (United States)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the
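
    The record describes event trees that chain conditional probabilities from unrest to a hazardous outcome. Below is a minimal sketch of how branch probabilities multiply along one path and how node uncertainty can be propagated by sampling; the node values and Beta parameters are illustrative, not elicited probabilities from any VDAP response.

        import random

        # Hypothetical conditional probabilities at successive event-tree nodes,
        # elicited for a fixed time window (e.g., one month). Values are illustrative.
        p_unrest = 1.0                      # unrest is already observed
        p_magmatic_given_unrest = 0.6
        p_eruption_given_magmatic = 0.3
        p_explosive_given_eruption = 0.5

        # Probability of an explosive eruption within the window: product along the path
        p_explosive = (p_unrest * p_magmatic_given_unrest
                       * p_eruption_given_magmatic * p_explosive_given_eruption)
        print(f"P(explosive eruption this window) = {p_explosive:.3f}")

        # Simple uncertainty propagation by sampling Beta distributions around each node
        def sample_path(n=10000):
            total = 0.0
            for _ in range(n):
                p1 = random.betavariate(6, 4)    # centred near 0.6, spread reflects sparse data
                p2 = random.betavariate(3, 7)    # centred near 0.3
                p3 = random.betavariate(5, 5)    # centred near 0.5
                total += p1 * p2 * p3
            return total / n

        print(f"Mean over sampled node uncertainties = {sample_path():.3f}")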

  11. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    MULKEY, C.H.

    1999-07-06

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclide and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute for Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  12. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D. L

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.

  13. The RiskScape System - a tool for quantitative multi-risk analysis for natural hazards.

    Science.gov (United States)

    Schmidt, J.; Reese, S.; Matcham, I.; King, A.; Bell, R.

    2009-04-01

    This paper introduces a generic framework for multi-risk modelling developed in the project 'Regional RiskScape' at the research organization GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand. Our goal was to develop a generic technology for modelling risks from multiple natural hazards and for multiple risk elements. The framework is independent of the specific nature of the individual hazard and individual risk element. A software prototype has been developed which is capable of 'plugging in' various natural hazards and risk elements without reconfiguring/adapting the generic software framework. To achieve that goal we developed a set of standards for treating the fundamental components of a risk model: hazards, assets (risk elements), and vulnerability models (or fragility functions). Thus, the developed prototype system is able to understand any hazard, asset, or fragility model which is provided to the system according to that standard. We tested the software prototype for modelling earthquake, volcanic, flood, wind, and tsunami risks for urban centres in New Zealand.
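
    The record stresses a standard interface that lets any hazard, asset or fragility model be plugged into the engine. The sketch below illustrates such a contract in the abstract, not the actual RiskScape API: a hazard returns an intensity at a location, a fragility function maps intensity to a damage ratio, and the engine loops over assets. All names and curves are invented.

        from dataclasses import dataclass
        from typing import Callable, Dict, Tuple

        # --- standardised pieces the engine understands ---------------------------
        Location = Tuple[float, float]                 # (lon, lat)
        HazardModel = Callable[[Location], float]      # returns an intensity measure
        Fragility = Callable[[float], float]           # intensity -> mean damage ratio

        @dataclass
        class Asset:
            location: Location
            replacement_value: float
            fragility: Fragility

        def run_risk_model(hazard: HazardModel, assets: Dict[str, Asset]) -> Dict[str, float]:
            """Loss per asset = value * damage_ratio(intensity at its location)."""
            return {name: a.replacement_value * a.fragility(hazard(a.location))
                    for name, a in assets.items()}

        # --- illustrative plug-ins -------------------------------------------------
        def flood_depth(loc: Location) -> float:       # toy hazard: depth in metres
            return max(0.0, 2.0 - abs(loc[0] - 174.8) * 10)

        def timber_house_fragility(depth_m: float) -> float:
            return min(1.0, 0.3 * depth_m)             # toy depth-damage curve

        portfolio = {"house_1": Asset((174.78, -41.29), 350_000, timber_house_fragility),
                     "house_2": Asset((174.95, -41.30), 420_000, timber_house_fragility)}
        print(run_risk_model(flood_depth, portfolio))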

  14. Quantitative analysis of human behavior.

    Science.gov (United States)

    Iacovitti, G

    2010-01-01

    Many aspects of individual as well as social behaviours of human beings can be analyzed in a quantitative way using typical scientific methods, based on empirical measurements and mathematical inference. Measurements are made possible today by the large variety of sensing devices, while formal models are synthesized using modern system and information theories.

  15. Industrial hazard and safety handbook

    CERN Document Server

    King, Ralph W

    1979-01-01

    Industrial Hazard and Safety Handbook (Revised Impression) describes and exposes the main hazards found in industry, with emphasis on how these hazards arise, are ignored, are identified, are eliminated, or are controlled. These hazard conditions can be due to human stresses (for example, insomnia), unsatisfactory working environments, as well as secret industrial processes. The book reviews the cost of accidents, human factors, inspections, insurance, legal aspects, planning for major emergencies, organization, and safety measures. The text discusses regulations, codes of practice, site layou

  16. Spatial Temporal Analysis of Urban Heat Hazard on Education Area (University of Indonesia

    Directory of Open Access Journals (Sweden)

    Adi Wibowo

    2017-07-01

    Full Text Available As an education area, a campus or university is full of various activities which have an impact on the existing land-use or land-cover. The variation of activities dynamically changes the shape of land-use or land-cover within the campus area and thus also creates variations in Land Surface Temperature (LST). LST affects the comfort of human activity, especially when it reaches more than 30 °C. This study used the term Urban Heat Signature (UHS) to describe LST in different land-use or land-cover types. The objective of this study is to examine UHS as an Urban Heat Hazard (UHH) based on the Universal Thermal Climate Index (UTCI) and the Effective Temperature Index (ETI) at the University of Indonesia. Thermal bands of Landsat 8 images (acquisition years 2013-2015) were used to create the LST model. Ground data known as Air Surface Temperature (AST) were used to validate the model. The results showed an increased level of maximum temperature during September-October from 2013 to 2014. The maximum temperature decreased in October 2014; however, it increased again in August 2015. The UTCI showed “moderate” and “strong heat stress”, while the ETI showed “uncomfortable” and “very uncomfortable” categories during that period. This research concluded that built-up areas show the highest temperatures on the UI campus based on UHS. The UHS range on the UI campus was 21.8-31.1 °C in 2013, 25.0-36.2 °C in 2014 and 24.9-38.2 °C in 2015. The maximum UHS in September 2014 and 2015 falls into the UTCI range of 32-35 °C, corresponding to a warm temperature sensation, an “uncomfortable” comfort sensation, psychological stress increasing through sweating and blood flow, and a health category of cardiovascular embarrassment. This UHS, occurring in September, has an impact on psychology and health and thus constitutes the UHH of living in the education area.
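
    The record computes LST from Landsat 8 thermal bands. A minimal sketch of the usual single-band conversion chain (digital number to TOA radiance, to brightness temperature, to emissivity-corrected LST) is given below; the Band 10 calibration constants are typical metadata values, while the digital number, the emissivity and the exact form of the emissivity correction are illustrative assumptions rather than the paper's processing.

        import math

        # Landsat 8 Band 10 calibration constants (typical values from the MTL metadata)
        ML, AL = 3.342e-4, 0.1          # radiance rescaling gain / offset
        K1, K2 = 774.8853, 1321.0789    # thermal conversion constants

        def land_surface_temperature(dn: float, emissivity: float = 0.96) -> float:
            """Return an approximate LST in deg C from a Band 10 digital number."""
            radiance = ML * dn + AL                        # TOA spectral radiance
            bt_kelvin = K2 / math.log(K1 / radiance + 1)   # at-sensor brightness temperature
            bt_c = bt_kelvin - 273.15
            wavelength_um = 10.895                         # Band 10 effective wavelength
            rho = 1.438e-2                                 # h*c/sigma_B in m*K
            # Single-channel emissivity correction (one common formulation)
            lst = bt_c / (1 + (wavelength_um * 1e-6 * bt_kelvin / rho) * math.log(emissivity))
            return lst

        print(round(land_surface_temperature(dn=30000), 2), "deg C")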

  17. LIFE CYCLE ANALYSIS OF HAZARDOUS WASTE AND RECYCLABLE ORIGIN OF HOUSEHOLD

    Directory of Open Access Journals (Sweden)

    Patrícia Raquel da Silva Sottoriva

    2011-09-01

    Full Text Available As the sustainable development that society aims for is based on economic, social and environmental factors, it can be said that the environmental crisis has as its component factors natural resources, population and pollution. To reduce the pressure that human activities place on the environment, it is necessary to know the production process, its inputs and outputs, in order to reduce potential problems such as waste and to facilitate opportunities for system optimization. In this context, the life cycle of household hazardous and recyclable waste was investigated to identify possibilities for reducing impact along supply chains. As a result, it was found that the raw material most used by the paper industry is pine and eucalyptus plantations, and some industries also use sugar cane. From the growing process until the paper is industrialized, a large amount of time is required. Eucalyptus should be cut between 5 and 7 years, while pine requires 10 to 12 years. After use, paper can and should be recycled. Recycling 1 ton of paper can save 29.2 m³ of water, 3.51 MWh of electricity and 22 trees when compared to traditional production processes. The cultivation of trees also contributes to carbon capture and sequestration. Eucalyptus at ages of 2, 4, 6 and 8 years fixes concentrations of 11.12, 18.55, 80.91 and 97.86 t/ha, respectively. Paper can also be composted due to its biodegradability. Metal, glass and plastics are not biodegradable and, being inorganic in nature, need to be recycled or reused. Recycling 1 ton of plastic saves 5.3 MWh and 500 kg of oil. Even with the environmental, social and economic gains of recycling compared to traditional processes, in Brazil the recycling percentages for paper, glass and PET bottles are less than 60%. The recycling of aluminum cans and steel exceeds 90%. Lamps and batteries are materials that, when inadequately disposed of, provide for contamination to the

  18. ANALYSIS AND MITIGATION OF X-RAY HAZARD GENERATED FROM HIGH INTENSITY LASER-TARGET INTERACTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, R.; Liu, J.C.; Prinz, A.A.; Rokni, S.H.; Woods, M.; Xia, Z.; /SLAC

    2011-03-21

    Interaction of a high intensity laser with matter may generate an ionizing radiation hazard. Very limited studies have been made, however, on the laser-induced radiation protection issue. This work reviews available literature on the physics and characteristics of laser-induced X-ray hazards. Important aspects include the laser-to-electron energy conversion efficiency, electron angular distribution, electron energy spectrum and effective temperature, and bremsstrahlung production of X-rays in the target. The possible X-ray dose rates for several femtosecond Ti:sapphire laser systems used at SLAC, including the short pulse laser system for the Matter in Extreme Conditions Instrument (peak power 4 TW and peak intensity 2.4 × 10¹⁸ W/cm²) were analysed. A graded approach to mitigate the laser-induced X-ray hazard with a combination of engineered and administrative controls is also proposed.

  19. From observational to analytical morphology of the stratum corneum: progress avoiding hazardous animal and human testings.

    Science.gov (United States)

    Piérard, Gérald E; Courtois, Justine; Ritacco, Caroline; Humbert, Philippe; Fanian, Ferial; Piérard-Franchimont, Claudine

    2015-01-01

    In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSSs). Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allow for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping.

  20. Development and Analysis of a Hurricane Hazard Model for Disaster Risk Assessment in Central America

    Science.gov (United States)

    Pita, G. L.; Gunasekera, R.; Ishizawa, O. A.

    2014-12-01

    Hurricane and tropical storm activity in Central America has, over the past decades, consistently caused thousands of casualties, significant population displacement, and substantial property and infrastructure losses. As a component to estimate future potential losses, we present a new regional probabilistic hurricane hazard model for Central America. Currently, there are very few openly available hurricane hazard models for Central America. The resulting hazard model would be used in conjunction with exposure and vulnerability components as part of a World Bank project to create country disaster risk profiles that will help improve risk estimation and provide decision makers with better tools to quantify disaster risk. This paper describes the hazard model methodology, which involves the development of a wind field model that simulates the gust speeds at terrain height at a fine resolution. The HURDAT dataset has been used in this study to create synthetic events that assess average hurricane landfall angles and their variability at each location. The hazard model then also estimates the average track angle at multiple geographical locations in order to provide a realistic range of possible hurricane paths that will be used for risk analyses in all the Central American countries. This probabilistic hurricane hazard model is also useful for relating synthetic wind estimates to loss and damage data to develop and calibrate existing empirical building vulnerability curves. To assess accuracy and applicability, modeled results are evaluated against historical events, their tracks and wind fields. Deeper analyses of results are also presented, with special reference to Guatemala. The findings, interpretations, and conclusions expressed in this paper are entirely those of the authors. They do not necessarily represent the views of the International Bank for Reconstruction and Development/World Bank and its affiliated organizations, or those of the

  1. A hazard rate analysis of fertility using duration data from Malaysia.

    Science.gov (United States)

    Chang, C

    1988-01-01

    Data from the Malaysia Fertility and Family Planning Survey (MFLS) of 1974 were used to investigate the effects of biological and socioeconomic variables on fertility based on the hazard rate model. Another study objective was to investigate the robustness of the findings of Trussell et al. (1985) by comparing the findings of this study with theirs. The hazard rate of conception for the jth fecundable spell of the ith woman, h_ij, is determined by duration dependence, t_ij, measured by the waiting time to conception; unmeasured heterogeneity, HET_i; the time-invariant variables, Y_i (race, cohort, education, age at marriage); and time-varying variables, X_ij (age, parity, opportunity cost, income, child mortality, child sex composition). In this study, all the time-varying variables were constant over a spell. An asymptotic χ² test for the equality of constant hazard rates across birth orders, allowing for time-invariant variables and heterogeneity, showed the importance of time-varying variables and duration dependence. Under the assumption of fixed-effects heterogeneity and a Weibull distribution for the duration of waiting time to conception, the empirical results revealed a negative parity effect, a negative impact from male children, and a positive effect from child mortality on the hazard rate of conception. The estimates of step functions for the hazard rate of conception showed parity-dependent fertility control, evidence of heterogeneity, and the possibility of nonmonotonic duration dependence. In a hazard rate model with piecewise-linear-segment duration dependence, the socioeconomic variables such as cohort, child mortality, income, and race had significant effects, after controlling for the length of the preceding birth interval. The duration dependence was consistent with the common finding, i.e., first increasing and then decreasing at a slow rate. The effects of education and opportunity cost on fertility were insignificant.
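
    The record's duration model is a Weibull hazard rate with covariates for the waiting time to conception. A minimal sketch of that model family follows, with synthetic right-censored durations and a maximum-likelihood fit; the data, the single covariate and all parameter values are invented, and the estimator is not the paper's.

        import numpy as np
        from scipy.optimize import minimize

        # Weibull proportional-hazards form: h(t | x) = p * lam * t**(p-1) * exp(x @ beta)
        # Toy data: durations (months to conception), censoring flags, one covariate.
        rng = np.random.default_rng(0)
        x = rng.normal(size=(200, 1))
        true_scale = np.exp(-2.0 + 0.5 * x[:, 0])
        t = rng.weibull(1.3, size=200) / true_scale
        observed = (t < 36).astype(float)           # censor spells longer than 36 months
        t = np.minimum(t, 36.0)

        def neg_log_lik(params):
            log_lam, log_p, beta = params[0], params[1], params[2:]
            lam, p = np.exp(log_lam), np.exp(log_p)
            lin = x @ beta
            log_h = np.log(p) + np.log(lam) + (p - 1) * np.log(t) + lin   # log hazard
            cum_h = lam * t**p * np.exp(lin)                              # cumulative hazard
            return -(observed * log_h - cum_h).sum()

        fit = minimize(neg_log_lik, x0=np.zeros(3), method="Nelder-Mead")
        print("lambda, p, beta =", np.exp(fit.x[0]), np.exp(fit.x[1]), fit.x[2])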

  2. Earthquake-induced crustal deformation and consequences for fault displacement hazard analysis of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Gürpinar, Aybars, E-mail: aybarsgurpinar2007@yahoo.com [Nuclear & Risk Consultancy, Anisgasse 4, 1221 Vienna (Austria); Serva, Leonello, E-mail: lserva@alice.it [Independent Consultant, Via dei Dauni 1, 00185 Rome (Italy); Livio, Franz, E-mail: franz.livio@uninsubria.it [Dipartimento di Scienza ed Alta Tecnologia, Università degli Studi dell’Insubria, Via Velleggio, 11, 22100 Como (Italy); Rizzo, Paul C., E-mail: paul.rizzo@rizzoasoc.com [RIZZO Associates, 500 Penn Center Blvd., Suite 100, Pittsburgh, PA 15235 (United States)

    2017-01-15

    Highlights: • A three-step procedure to incorporate coseismic deformation into PFDHA. • Increased scrutiny for faults in the area permanently deformed by future strong earthquakes. • These faults share with the primary structure the same time window for fault capability. • VGM variation may occur due to tectonism that has caused co-seismic deformation. - Abstract: Readily available interferometric data (InSAR) of the coseismic deformation field caused by recent seismic events clearly show that major earthquakes produce crustal deformation over wide areas, possibly resulting in significant stress loading/unloading of the crust. Such stress must be considered in the evaluation of seismic hazards of nuclear power plants (NPP) and, in particular, for the potential of surface slip (i.e., probabilistic fault displacement hazard analysis - PFDHA) on both primary and distributed faults. In this study, based on the assumption that slip on pre-existing structures can represent the elastic response of compliant fault zones to the permanent co-seismic stress changes induced by other major seismogenic structures, we propose a three-step procedure to address fault displacement issues and consider possible influence of surface faulting/deformation on vibratory ground motion (VGM). This approach includes: (a) data on the presence and characteristics of capable faults, (b) data on recognized and/or modeled co-seismic deformation fields and, where possible, (c) static stress transfer between source and receiving faults of unknown capability. The initial step involves the recognition of the major seismogenic structures nearest to the site and their characterization in terms of maximum expected earthquake and the time frame to be considered for determining their “capability” (as defined in the International Atomic Energy Agency - IAEA Specific Safety Guide SSG-9). Then a GIS-based buffer approach is applied to identify all the faults near the NPP, possibly influenced by

  3. Use of fragile geologic structures as indicators of unexceeded ground motions and direct constraints on probabilistic seismic hazard analysis

    Science.gov (United States)

    Baker, J.W.; Whitney, John W.; Hanks, Thomas C.; Abramson, Norman A.; Board, Mark P.

    2013-01-01

    We present a quantitative procedure for constraining probabilistic seismic hazard analysis results at a given site, based on the existence of fragile geologic structures at that site. We illustrate this procedure by analyzing precarious rocks and undamaged lithophysae at Yucca Mountain, Nevada. The key metric is the probability that the feature would have survived to the present day, assuming that the hazard results are correct. If the fragile geologic structure has an extremely low probability of having survived (which would be inconsistent with the observed survival of the structure), then the calculations illustrate how much the hazard would have to be reduced to result in a nonnegligible survival probability. The calculations are able to consider structures the predicted failure probabilities of which are a function of one or more ground‐motion parameters, as well as structures that either rapidly or slowly evolved to their current state over time. These calculations are the only way to validate seismic hazard curves over long periods of time.
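
    The key metric in the record is the probability that a fragile feature survived the ground motions implied by a hazard curve over its age. A minimal sketch under a Poisson occurrence assumption follows: an annual exceedance curve is convolved with a lognormal fragility to get an annual failure rate, which is then exponentiated over the feature's age. The hazard curve, fragility parameters and age are illustrative, not the Yucca Mountain values.

        import numpy as np
        from scipy.stats import norm

        # Illustrative annual rate of exceeding each PGA level (a toy hazard curve)
        pga = np.logspace(-2, 0.3, 200)                      # 0.01 g to ~2 g
        annual_rate_exceed = 1e-2 * (pga / 0.01) ** -2.0

        # Fragility of the precarious rock: lognormal in PGA (illustrative parameters)
        median_capacity_g, beta = 0.30, 0.4
        p_fail_given_pga = norm.cdf(np.log(pga / median_capacity_g) / beta)

        # Annual failure rate: integrate fragility against the occurrence density in PGA
        d_rate = -np.gradient(annual_rate_exceed, pga)       # occurrence density
        annual_failure_rate = np.trapz(p_fail_given_pga * d_rate, pga)

        age_years = 10_000                                   # assumed age of the feature
        p_survival = np.exp(-annual_failure_rate * age_years)
        print(f"annual failure rate = {annual_failure_rate:.2e}, "
              f"P(survive {age_years} yr) = {p_survival:.3f}")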

  4. Use of remote sensing and seismotectonic parameters for seismic hazard analysis of Bangalore

    Directory of Open Access Journals (Sweden)

    T. G. Sitharam

    2006-01-01

    Full Text Available Deterministic Seismic Hazard Analysis (DSHA) for Bangalore, India, has been carried out by considering past earthquakes, assumed subsurface fault rupture lengths and a point-source synthetic ground motion model. The sources have been identified using satellite remote sensing images and the seismotectonic atlas map of India, together with relevant field studies. The Maximum Credible Earthquake (MCE) has been determined by considering the regional seismotectonic activity within about a 350 km radius around Bangalore. The seismotectonic map has been prepared by considering the faults, lineaments and shear zones in the area and more than 470 past moderate earthquake events with moment magnitude 3.5 and above. In addition, 1300 earthquake tremors with moment magnitude less than 3.5 have been considered for the study. The shortest distance from Bangalore to the different sources is measured, and the Peak Horizontal Acceleration (PHA) is then calculated for the different sources and event moment magnitudes using a regional attenuation relation for peninsular India. Based on the Wells and Coppersmith (1994) relationship, a subsurface fault rupture length of about 3.8% of the total fault length was shown to match past earthquake events in the area. To simulate synthetic ground motions, the Boore (1983, 2003) SMSIM programs have been used and the PHA for the different locations evaluated. From the above approaches, a PHA of 0.15 g was established. This value was obtained for a maximum credible earthquake having a moment magnitude of 5.1 on the Mandya-Channapatna-Bangalore lineament source. This particular source has been identified as a vulnerable source for Bangalore. From this study, it is very clear that the Bangalore area can be described as a seismically moderately active region. It is also recommended that the southern part of Karnataka, in particular Bangalore, Mandya and Kolar, be upgraded from the current Indian Seismic Zone II to Seismic Zone III
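
    The record applies the Wells and Coppersmith (1994) rupture-length relation and a regional attenuation relation to obtain PHA. The sketch below uses the commonly quoted Wells and Coppersmith all-slip-type coefficients for surface rupture length, but the attenuation relation is only a placeholder of the usual functional form, not the peninsular India model used in the study; the MCE and distance are example inputs.

        import math

        def surface_rupture_length_km(mw: float) -> float:
            """Wells & Coppersmith (1994), all slip types: log10(SRL) = -3.22 + 0.69*Mw."""
            return 10 ** (-3.22 + 0.69 * mw)

        def pha_g(mw: float, dist_km: float) -> float:
            """Placeholder attenuation relation of the usual ln(PHA) = c1 + c2*M - c3*ln(R + c4)
            form; coefficients are illustrative, not the regional model."""
            ln_pha = -4.0 + 1.0 * mw - 1.1 * math.log(dist_km + 15.0)
            return math.exp(ln_pha)

        mce_mw, distance_km = 5.1, 10.0       # example MCE and source-to-site distance
        print(f"SRL ~ {surface_rupture_length_km(mce_mw):.1f} km, "
              f"PHA ~ {pha_g(mce_mw, distance_km):.3f} g")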

  5. Flood hazards analysis based on changes of hydrodynamic processes in fluvial systems of Sao Paulo, Brazil.

    Science.gov (United States)

    Simas, Iury; Rodrigues, Cleide

    2016-04-01

    The metropolis of Sao Paulo, with its 7940 km² and over 20 million inhabitants, is increasingly being consolidated with disregard for the dynamics of its fluvial systems and the natural limitations imposed by fluvial terraces, floodplains and slopes. Events such as floods and flash floods have become particularly persistent, mainly in socially and environmentally vulnerable areas. The Aricanduva River basin was selected as the ideal area for the development of the flood hazard analysis since it presents the main geological and geomorphological features found in the urban site. According to studies carried out under the Anthropic Geomorphology approach in São Paulo, studying this phenomenon requires taking into account the original hydromorphological systems and their functional conditions, as well as the dimensions in which the anthropic factor changes the balance between the main variables of surface processes. Considering those principles, an alternative geographical data model was proposed, which made it possible to identify the role of different driving forces in the spatial conditioning of certain flood events. Spatial relationships between different variables, such as anthropogenic and original morphology, were analyzed for that purpose, in addition to climate data. The surface hydrodynamic tendency spatial model conceived for this study takes as key variables: 1- the land use present at the observed date combined with the predominant lithological group, represented by a value ranging from 0 to 100, based on indexes of the National Soil Conservation Service (NSCS-USA) and the Hydraulic Technology Center Foundation (FCTH-Brazil), to determine the resulting balance of runoff/infiltration; 2- the original slope (in percent), applying thresholds above which a greater tendency for runoff can be determined; 3- the minimal features of relief, combining the curvature of the surface in plan and profile. Those three key variables were combined in a Geographic Information System in a series of
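
    The model combines a 0-100 runoff/infiltration index, slope thresholds and plan/profile curvature as raster layers in a GIS. A minimal raster-algebra sketch of how such a hydrodynamic tendency score could be combined cell by cell is shown below; the weights, thresholds and toy grids are assumptions, not the study's calibration.

        import numpy as np

        # Toy 4x4 rasters standing in for the GIS layers
        runoff_index = np.array([[80, 60, 40, 20],
                                 [90, 70, 50, 30],
                                 [85, 65, 45, 25],
                                 [95, 75, 55, 35]], dtype=float)   # 0-100 (land use + lithology)
        slope_pct = np.array([[2, 5, 12, 30],
                              [1, 4, 10, 25],
                              [3, 6, 15, 35],
                              [2, 5, 11, 28]], dtype=float)
        plan_curvature = np.random.default_rng(1).normal(0, 0.5, (4, 4))  # concave < 0

        # Slope contribution: flat cells accumulate water, steep cells shed it downslope
        slope_score = np.where(slope_pct < 8, 100 - slope_pct * 10, 20.0)
        # Curvature contribution: concave (convergent) cells favour ponding
        curvature_score = np.where(plan_curvature < 0, 100.0, 40.0)

        weights = {"runoff": 0.5, "slope": 0.3, "curvature": 0.2}   # illustrative weights
        tendency = (weights["runoff"] * runoff_index
                    + weights["slope"] * slope_score
                    + weights["curvature"] * curvature_score)
        print(np.round(tendency, 1))        # higher score = greater flood tendency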

  6. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

    Science.gov (United States)

    Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.

  7. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    Science.gov (United States)

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    The article considers the quality control and safety system implemented in one of the largest flight catering food production plants, serving airline passengers and flight crews. The control system was based on Hazard Analysis and Critical Control Point (HACCP) principles and on the hygienic and anti-epidemic measures developed. The identification of hazard factors at the stages of the technological process is considered. Results of the analysis of monitoring data for 6 critical control points over a five-year period are presented. The quality control and safety system reduces the risk of food contamination during the acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was proved. Further ways of harmonizing and implementing HACCP principles in the plant are determined.

  8. International collaboration towards a global analysis of volcanic hazards and risk

    Science.gov (United States)

    Loughlin, Susan; Duncan, Melanie; Volcano Model Network, Global

    2017-04-01

    Approximately 800 million people live within 100 km of an active volcano, and such environments are often subject to multiple natural hazards. Volcanic eruptions and related volcanic hazards are less frequent than many other natural hazards, but when they occur they can have immediate and long-lived impacts, so it is important that they are not overlooked in a multi-risk assessment. Based on experiences to date, it is clear that natural hazards communities need to address a series of challenges in order to move to a multi-hazard approach to risk assessment. Firstly, we need to further develop synergies and coordination within our own communities at local to global scales. Secondly, we must collaborate and identify opportunities for harmonisation across natural hazards communities: for instance, by ensuring our databases are accessible and meet certain standards, a variety of users will then be able to contribute to and access data. Thirdly, the scale and breadth of multi-risk assessments need to be co-defined with decision-makers, which will constrain the relevant potential cascading/compounding hazards to consider. Fourthly, and related to all previous points, multi-risk assessments require multi-risk knowledge, calling for interdisciplinary perspectives as well as discipline-specific expertise. The Global Volcano Model network (GVM) is a growing international network of (public and private) institutions and organisations which have the collective aim of identifying and reducing volcanic risks. GVM's values embody collaboration, scientific excellence, open access (wherever possible) and, above all, public good. GVM highlights and builds on the best research available within the volcanological community, drawing on the work of IAVCEI Commissions and other research initiatives. It also builds on the local knowledge of volcano observatories and collaborating scientists, ensuring that global efforts are underpinned by local evidence. Some of GVM's most

  9. Fire hazard analysis of alcohol aqueous solution and Chinese liquor based on flash point

    Science.gov (United States)

    Chen, Qinpei; Kang, Guoting; Zhou, Tiannian; Wang, Jian

    2017-10-01

    In this paper, a series of experiments was conducted to study the flash point of aqueous alcohol solutions and Chinese liquor. The fire hazard indicated by the experimental results was analysed based on the Chinese standard GB50160-2008. The results show that the open-cup method is not suitable for aqueous alcohol solutions. The closed-cup method, on the other hand, shows good applicability. There is a non-linear relationship between the closed-cup flash point and the alcohol volume concentration, and the prediction equation established in this paper fits the flash point and fire hazard classification of Chinese liquor well.
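
    The abstract reports a non-linear prediction equation linking closed-cup flash point to alcohol volume concentration but does not reproduce it. A minimal curve-fitting sketch follows; both the data points and the logarithmic functional form are assumptions for illustration, not the paper's measurements or equation.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical measurements: alcohol volume fraction vs closed-cup flash point (deg C)
        conc = np.array([0.10, 0.20, 0.30, 0.40, 0.53, 0.60, 0.70, 0.96])
        flash_c = np.array([49.0, 36.0, 29.5, 26.0, 23.0, 21.5, 19.5, 17.0])

        def model(x, a, b, c):
            """One plausible non-linear form: flash point falling with log-concentration."""
            return a + b * np.log(x) + c * x

        params, _ = curve_fit(model, conc, flash_c)
        print("fitted a, b, c:", np.round(params, 2))
        print("predicted flash point at 45 %vol:", round(model(0.45, *params), 1), "deg C")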

  10. Water-molten uranium hazard analysis. Final report. LATA report No. 92

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, P.S.; Rigdon, L.D.; Donham, B.J.

    1979-08-21

    The hazard potential of cooling water leakage into the crucible of molten uranium in the MARS laser isotope separation experiment was investigated. A vapor-phase explosion is highly unlikely in any of the scenarios defined for MARS. For the operating basis accident, the gas pressure transient experienced by the vessel wall is 544 psia peak with a duration of 200 µs, and the peak hoop stress is about 20,000 psi in a 0.5-in. wall. Design and procedural recommendations are given for reducing the hazard. (DLC)

  11. From observational to analytical morphology of the stratum corneum: progress avoiding hazardous animal and human testings

    Directory of Open Access Journals (Sweden)

    Piérard GE

    2015-03-01

    Full Text Available Gérald E Piérard,1,2 Justine Courtois,1 Caroline Ritacco,1 Philippe Humbert,2,3 Ferial Fanian,3 Claudine Piérard-Franchimont1,4,5 1Laboratory of Skin Bioengineering and Imaging (LABIC, Department of Clinical Sciences, Liège University, Liège, Belgium; 2University of Franche-Comté, Besançon, France; 3Department of Dermatology, University Hospital Saint-Jacques, Besançon, France; 4Department of Dermatopathology, Unilab Lg, University Hospital of Liège, Liège, Belgium; 5Department of Dermatology, Regional Hospital of Huy, Huy, Belgium Background: In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD and cyanoacrylate skin surface strippings (CSSSs. Methods: Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results: With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion: A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allow for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping. Keywords: irritation, morphometry, quantitative morphology, stripping

  12. Human Reliability Analysis for Digital Human-Machine Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2014-06-01

    This paper addresses the fact that existing human reliability analysis (HRA) methods do not provide guidance on digital human-machine interfaces (HMIs). Digital HMIs are becoming ubiquitous in nuclear power operations, whether through control room modernization or new-build control rooms. Legacy analog technologies like instrumentation and control (I&C) systems are costly to support, and vendors no longer develop or support analog technology, which is considered technologically obsolete. Yet, despite the inevitability of digital HMI, no current HRA method provides guidance on how to treat human reliability considerations for digital technologies.

  13. Ground landslide hazard potency using geoelectrical resistivity analysis and VS30, case study at geophysical station, Lembang, Bandung

    Science.gov (United States)

    Rohadi, Supriyanto; Sakya, Andi Eka; Masturyono, Murjaya, Jaya; Sunardi, Bambang; Rasmid, Ngadmanto, Drajat; Susilanto, Pupung; Nugraha, Jimmi; Pakpahan, Suliyanti

    2017-07-01

    We have conducted a geoelectric resistivity and shear wave velocity (Vs30) study to identify the landslide hazard potential around the Lembang Geophysics Station, Bandung (107.617° E and 6.825° S). The geoelectric analysis used a dipole-dipole resistivity configuration, while the shear wave velocity analysis was performed using the Multichannel Analysis of Surface Waves (MASW) method. The study results indicate that the soil or clay depth assumed from the electrical resistivity observations was in accordance with the soil or clay depth confirmed by the MASW investigation. These conditions indicate a high landslide potential in this area, further supported by the high slope angles.
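
    Vs30 is conventionally the travel-time-averaged shear-wave velocity over the uppermost 30 m, Vs30 = 30 / sum(h_i / v_i). A minimal sketch with an invented layer model follows; the MASW inversion that produces the layer velocities is not shown.

        def vs30(layers):
            """layers: list of (thickness_m, vs_m_per_s), ordered from the surface down.
            Uses only the uppermost 30 m (travel-time average)."""
            remaining, travel_time = 30.0, 0.0
            for thickness, vs in layers:
                used = min(thickness, remaining)
                travel_time += used / vs
                remaining -= used
                if remaining <= 0:
                    break
            if remaining > 0:                              # profile shallower than 30 m:
                travel_time += remaining / layers[-1][1]   # extend the deepest layer
            return 30.0 / travel_time

        # Hypothetical profile: soft clay over weathered tuff over firmer material
        profile = [(6.0, 180.0), (14.0, 320.0), (20.0, 600.0)]
        print(f"Vs30 = {vs30(profile):.0f} m/s")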

  14. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including
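
    Characterizing an active fault source ultimately means turning geometry and slip rate into earthquake rates. The sketch below shows the common moment-balancing step (accumulated moment rate divided by the moment of a characteristic event); it is a generic illustration using the Hanks-Kanamori moment-magnitude relation, not OpenQuake code, and the fault dimensions, slip rate and shear modulus are assumed values.

        def moment_rate_nm_per_yr(area_km2: float, slip_rate_mm_yr: float,
                                  shear_modulus_pa: float = 3.0e10) -> float:
            """Annual seismic moment accumulation: M0_rate = mu * A * slip_rate."""
            area_m2 = area_km2 * 1e6
            slip_m_yr = slip_rate_mm_yr * 1e-3
            return shear_modulus_pa * area_m2 * slip_m_yr

        def moment_of_magnitude_nm(mw: float) -> float:
            """Hanks & Kanamori: M0 [N*m] = 10**(1.5*Mw + 9.05)."""
            return 10 ** (1.5 * mw + 9.05)

        # Hypothetical fault: 60 km x 15 km, 2 mm/yr slip, characteristic Mw 7.0 events
        rate = moment_rate_nm_per_yr(60 * 15, 2.0) / moment_of_magnitude_nm(7.0)
        print(f"characteristic event rate ~ {rate:.2e} / yr "
              f"(recurrence ~ {1/rate:.0f} yr)")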

  15. Space Mission Human Reliability Analysis (HRA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this project is to extend current ground-based Human Reliability Analysis (HRA) techniques to a long-duration, space-based tool to more effectively...

  16. ARSENIC SPECIATION ANALYSIS IN HUMAN SALIVA

    Science.gov (United States)

    Background: Determination of arsenic species in human saliva is potentially useful for biomonitoring of human exposure to arsenic and for studying arsenic metabolism. However, there is no report on the speciation analysis of arsenic in saliva. Methods: Arsenic species in saliva ...

  17. Incorporating the effects of topographic amplification in the analysis of earthquake-induced landslide hazards using logistic regression

    Science.gov (United States)

    Lee, S. T.; Yu, T. T.; Peng, W. F.; Wang, C. L.

    2010-12-01

    Seismic-induced landslide hazards are studied using seismic shaking intensity based on the topographic amplification effect. The estimation of the topographic effect includes the theoretical topographic amplification factors and the corresponding amplified ground motion. Digital elevation models (DEM) with a 5-m grid spacing are used. The logistic regression model and the geographic information system (GIS) are used to perform the seismic landslide hazard analysis. The 99 Peaks area, located 3 km away from the ruptured fault of the Chi-Chi earthquake, is used to test the proposed hypothesis. An inventory map of earthquake-triggered landslides is used to produce a dependent variable that takes a value of 0 (no landslides) or 1 (landslides). A set of independent parameters is used, including lithology, elevation, slope gradient, slope aspect, terrain roughness, land use, and Arias intensity (Ia) with the topographic effect. Subsequently, logistic regression is used to find the best fitting function to describe the relationship between the occurrence and absence of landslides within an individual grid cell. The results of seismic landslide hazard analysis that includes the topographic effect (AUROC = 0.890) are better than those of the analysis without it (AUROC = 0.874).
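
    A minimal Python sketch of this kind of grid-cell logistic regression follows, using synthetic stand-ins for the predictors and scikit-learn; the variable names, coefficients and data are invented for illustration and are not the study's dataset.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    # Hypothetical per-cell predictors (stand-ins for slope, amplified Arias intensity, roughness)
    slope = rng.uniform(0, 60, n)
    arias = rng.lognormal(0.0, 0.5, n)
    roughness = rng.uniform(0, 1, n)
    # Hypothetical occurrence probability increasing with slope and shaking intensity
    p = 1.0 / (1.0 + np.exp(-(-6.0 + 0.08 * slope + 1.5 * arias + 1.0 * roughness)))
    landslide = rng.binomial(1, p)            # dependent variable: 0 = absent, 1 = present

    X = np.column_stack([slope, arias, roughness])
    X_tr, X_te, y_tr, y_te = train_test_split(X, landslide, test_size=0.3, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    auroc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"AUROC = {auroc:.3f}")   # compare runs with and without the amplified intensity term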

  18. Human reliability analysis of control room operators

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2005-07-01

    Human reliability is the probability that a person correctly performs a system-required action in the required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field in recent years, mainly in the nuclear area. Several first-generation HRA methods were developed, such as THERP (Technique for Human Error Rate Prediction). Now an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for Human Event Analysis). The ergonomics approach uses ergonomic work analysis as its tool. It focuses on the study of operators' activities in both their physical and mental forms, considering at the same time the observed characteristics of the operators and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology to analyze the human reliability of industrial plant control room operators, using a framework that combines the approaches of ATHEANA, THERP and ergonomic work analysis. (author)

  19. Integrative analysis of 111 reference human epigenomes

    Science.gov (United States)

    Kundaje, Anshul; Meuleman, Wouter; Ernst, Jason; Bilenky, Misha; Yen, Angela; Kheradpour, Pouya; Zhang, Zhizhuo; Heravi-Moussavi, Alireza; Liu, Yaping; Amin, Viren; Ziller, Michael J; Whitaker, John W; Schultz, Matthew D; Sandstrom, Richard S; Eaton, Matthew L; Wu, Yi-Chieh; Wang, Jianrong; Ward, Lucas D; Sarkar, Abhishek; Quon, Gerald; Pfenning, Andreas; Wang, Xinchen; Claussnitzer, Melina; Coarfa, Cristian; Harris, R Alan; Shoresh, Noam; Epstein, Charles B; Gjoneska, Elizabeta; Leung, Danny; Xie, Wei; Hawkins, R David; Lister, Ryan; Hong, Chibo; Gascard, Philippe; Mungall, Andrew J; Moore, Richard; Chuah, Eric; Tam, Angela; Canfield, Theresa K; Hansen, R Scott; Kaul, Rajinder; Sabo, Peter J; Bansal, Mukul S; Carles, Annaick; Dixon, Jesse R; Farh, Kai-How; Feizi, Soheil; Karlic, Rosa; Kim, Ah-Ram; Kulkarni, Ashwinikumar; Li, Daofeng; Lowdon, Rebecca; Mercer, Tim R; Neph, Shane J; Onuchic, Vitor; Polak, Paz; Rajagopal, Nisha; Ray, Pradipta; Sallari, Richard C; Siebenthall, Kyle T; Sinnott-Armstrong, Nicholas; Stevens, Michael; Thurman, Robert E; Wu, Jie; Zhang, Bo; Zhou, Xin; Beaudet, Arthur E; Boyer, Laurie A; De Jager, Philip; Farnham, Peggy J; Fisher, Susan J; Haussler, David; Jones, Steven; Li, Wei; Marra, Marco; McManus, Michael T; Sunyaev, Shamil; Thomson, James A; Tlsty, Thea D; Tsai, Li-Huei; Wang, Wei; Waterland, Robert A; Zhang, Michael; Chadwick, Lisa H; Bernstein, Bradley E; Costello, Joseph F; Ecker, Joseph R; Hirst, Martin; Meissner, Alexander; Milosavljevic, Aleksandar; Ren, Bing; Stamatoyannopoulos, John A; Wang, Ting; Kellis, Manolis

    2015-01-01

    The reference human genome sequence set the stage for studies of genetic variation and its association with human disease, but a similar reference has been lacking for epigenomic studies. To address this need, the NIH Roadmap Epigenomics Consortium generated the largest collection to-date of human epigenomes for primary cells and tissues. Here, we describe the integrative analysis of 111 reference human epigenomes generated as part of the program, profiled for histone modification patterns, DNA accessibility, DNA methylation, and RNA expression. We establish global maps of regulatory elements, define regulatory modules of coordinated activity, and identify their likely activators and repressors. We show that disease and trait-associated genetic variants are enriched in tissue-specific epigenomic marks, revealing biologically-relevant cell types for diverse human traits, and providing a resource for interpreting the molecular basis of human disease. Our results demonstrate the central role of epigenomic information for understanding gene regulation, cellular differentiation, and human disease. PMID:25693563

  20. A cross-hazard analysis of terse message retransmission on Twitter

    Science.gov (United States)

    Sutton, Jeannette; Gibson, C. Ben; Phillips, Nolan Edward; Spiro, Emma S.; League, Cedar; Johnson, Britta; Fitzhugh, Sean M.; Butts, Carter T.

    2015-01-01

    For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes—local network properties, message content, and message style—that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates. PMID:26627233

  1. ON-SITE MERCURY ANALYSIS OF SOIL AT HAZARDOUS WASTE SITES BY IMMUNOASSAY AND ASV

    Science.gov (United States)

    Two field methods for Hg, immunoassay and anodic stripping voltammetry (ASV), that can provide onsite results for quick decisions at hazardous waste sites were evaluated. Each method was applied to samples from two Superfund sites that contain high levels of Hg; Sulphur Bank Me...

  2. Flow-type failures in fine-grained soils : An important aspect in landslide hazard analysis

    NARCIS (Netherlands)

    Van Asch, T.W.J.; Malet, J.P.

    2009-01-01

    Forecasting the possibility of flow-type failures within a slow-moving landslide mass is rarely taken into account in quantitative hazard assessments. Therefore, this paper focuses on the potential transition of sliding blocks (slumps) into flow-like processes due to the generation of excess pore

  3. A cross-hazard analysis of terse message retransmission on Twitter.

    Science.gov (United States)

    Sutton, Jeannette; Gibson, C Ben; Phillips, Nolan Edward; Spiro, Emma S; League, Cedar; Johnson, Britta; Fitzhugh, Sean M; Butts, Carter T

    2015-12-01

    For decades, public warning messages have been relayed via broadcast information channels, including radio and television; more recently, risk communication channels have expanded to include social media sites, where messages can be easily amplified by user retransmission. This research examines the factors that predict the extent of retransmission for official hazard communications disseminated via Twitter. Using data from events involving five different hazards, we identify three types of attributes--local network properties, message content, and message style--that jointly amplify and/or attenuate the retransmission of official communications under imminent threat. We find that the use of an agreed-upon hashtag and the number of users following an official account positively influence message retransmission, as does message content describing hazard impacts or emphasizing cohesion among users. By contrast, messages directed at individuals, expressing gratitude, or including a URL were less widely disseminated than similar messages without these features. Our findings suggest that some measures commonly taken to convey additional information to the public (e.g., URL inclusion) may come at a cost in terms of message amplification; on the other hand, some types of content not traditionally emphasized in guidance on hazard communication may enhance retransmission rates.

  4. Hazard analysis and critical control point evaluation of school food programs in Bahrain.

    Science.gov (United States)

    Ali, A A; Spencer, N J

    1996-03-01

    Hazard analyses were conducted in six food preparation sites and 16 school canteens in the State of Bahrain. Sandwiches made with cheese, meat, eggs, liver, and beef burgers were prepared in small shops or a bakery outside schools. Foods were cooked between 4 and 5 A.M. Time-temperature exposure during cooking was adequate to kill vegetative microbes and their spores, but potential for recontamination existed from the hands of food workers, utensils, and cloths and sponges used for wiping. All foods were left at room temperature before they were transported in vans to schools where they were also kept at room temperature between 17 degrees C and 41 degrees C. Air temperature inside the canteens during this investigation was between 18.5 and 28 degrees C with a relative humidity of 65 to 70%. Hazard analyses, which included observation of operations inside school canteens and sites of food preparation, measuring temperatures, and interviewing workers and consumers (teachers, students) were carried out. Hazards were primarily associated with preparation of foods long before they were consumed, physical touching of products, and holding foods at room temperature after preparation. Holding foods at room temperature would have allowed germination of bacterial spores and multiplication of microbes. Reheating of foods was not practiced. Health promoters must be aware of these hazards and need to educate food workers, administrators, and the public on the methods of prevention.

  5. Southwestern Oregon's Biscuit Fire: An Analysis of Forest Resources, Fire Severity, and Fire Hazard

    Science.gov (United States)

    David L. Azuma; Glenn A. Christensen

    2005-01-01

    This study compares pre-fire field inventory data (collected from 1993 to 1997) in relation to post-fire mapped fire severity classes and the Fire and Fuels Extension of the Forest Vegetation Simulator growth and yield model measures of fire hazard for the portion of the Siskiyou National Forest in the 2002 Biscuit fire perimeter of southwestern Oregon. Post-fire...

  6. Limitations of Cox Proportional Hazards Analysis in Mortality Prediction of Patients with Acute Coronary Syndrome

    Directory of Open Access Journals (Sweden)

    Babińska Magdalena

    2015-12-01

    The aim of this study was to evaluate the possibility of incorrect assessment of mortality risk factors in a group of patients affected by acute coronary syndrome, due to the lack of hazard proportionality in the Cox regression model. One hundred and fifty consecutive patients with acute coronary syndrome (ACS) and no age limit were enrolled. Univariable and multivariable Cox proportional hazard analyses were performed. The proportional hazard assumptions were verified using Schoenfeld residuals, the χ² test and the rank correlation coefficient t between residuals and time. In the total group of 150 patients, 33 (22.0%) deaths from any cause were registered in the follow-up period of 64 months. The non-survivors were significantly older and had an increased prevalence of diabetes and erythrocyturia, a longer history of coronary artery disease, higher concentrations of serum creatinine, cystatin C, uric acid, glucose, C-reactive protein (CRP), homocysteine and B-type natriuretic peptide (NT-proBNP), and lower concentrations of serum sodium. No significant differences in echocardiography parameters were observed between groups. The following factors were risk-of-death factors and fulfilled the proportional hazard assumption in the univariable model: smoking, occurrence of diabetes and anaemia, duration of coronary artery disease, and abnormal serum concentrations of uric acid, sodium, homocysteine, cystatin C and NT-proBNP, while in the multivariable model the risk-of-death factors were smoking and elevated concentrations of homocysteine and NT-proBNP. The study has demonstrated that violation of the proportional hazard assumption in the Cox regression model may lead to the creation of a false model that does not include only time-independent predictive factors.
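
    A minimal sketch, with synthetic data and the Python lifelines package, of fitting a Cox model and then testing the proportional-hazards assumption with scaled Schoenfeld residuals; the covariate names echo the abstract, but the data and effect sizes are invented.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.statistics import proportional_hazard_test

    rng = np.random.default_rng(1)
    n = 150
    df = pd.DataFrame({
        "time": rng.exponential(40, n).round(1) + 0.1,   # follow-up time (months)
        "death": rng.binomial(1, 0.22, n),               # all-cause death indicator
        "smoking": rng.binomial(1, 0.4, n),
        "homocysteine": rng.normal(12, 3, n),
        "nt_probnp": rng.lognormal(7, 1, n),
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="death")
    cph.print_summary()

    # Small p-values here indicate a time-varying effect, i.e. a violation of
    # the proportional-hazards assumption for that covariate.
    results = proportional_hazard_test(cph, df, time_transform="rank")
    results.print_summary()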

  7. Sensitivity analysis of seismic hazard for the northwestern portion of the state of Gujarat, India

    Science.gov (United States)

    Petersen, M.D.; Rastogi, B.K.; Schweig, E.S.; Harmsen, S.C.; Gomberg, J.S.

    2004-01-01

    We test the sensitivity of seismic hazard to three fault source models for the northwestern portion of Gujarat, India. The models incorporate different characteristic earthquake magnitudes on three faults with individual recurrence intervals of either 800 or 1600 years. These recurrence intervals imply that large earthquakes occur on one of these faults every 266-533 years, similar to the rate of historic large earthquakes in this region during the past two centuries and for earthquakes in intraplate environments like the New Madrid region in the central United States. If one assumes a recurrence interval of 800 years for large earthquakes on each of three local faults, the peak ground accelerations (PGA; horizontal) and 1-Hz spectral acceleration ground motions (5% damping) are greater than 1 g over a broad region for a 2% probability of exceedance in 50 years' hazard level. These probabilistic PGAs at this hazard level are similar to median deterministic ground motions. The PGAs for 10% in 50 years' hazard level are considerably lower, generally ranging between 0.2 g and 0.7 g across northwestern Gujarat. Ground motions calculated from our models that consider fault interevent times of 800 years are considerably higher than other published models even though they imply similar recurrence intervals. These higher ground motions are mainly caused by the application of intraplate attenuation relations, which account for less severe attenuation of seismic waves when compared to the crustal interplate relations used in these previous studies. For sites in Bhuj and Ahmedabad, magnitude (M) 7 3/4 earthquakes contribute most to the PGA and the 0.2- and 1-s spectral acceleration ground motion maps at the two considered hazard levels. ?? 2004 Elsevier B.V. All rights reserved.
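
    The recurrence intervals and hazard levels quoted above are linked by the standard Poisson conversion between return period T and probability of exceedance p over an exposure time t, p = 1 - exp(-t/T); a short check in Python:

    import numpy as np

    def return_period(p, t=50.0):
        """Return period (years) implied by exceedance probability p over t years."""
        return -t / np.log(1.0 - p)

    for p in (0.02, 0.10):
        print(f"{p:.0%} in 50 yr  ->  T ~ {return_period(p):.0f} yr")
    # 2% in 50 yr corresponds to ~2475 yr; 10% in 50 yr to ~475 yr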

  8. Risk prediction of Critical Infrastructures against extreme natural hazards: local and regional scale analysis

    Science.gov (United States)

    Rosato, Vittorio; Hounjet, Micheline; Burzel, Andreas; Di Pietro, Antonio; Tofani, Alberto; Pollino, Maurizio; Giovinazzi, Sonia

    2016-04-01

    Natural hazard events can induce severe impacts on the built environment; they can hit wide and densely populated areas containing large numbers of (inter)dependent technological systems, whose damage can cause the failure or malfunctioning of further services and spread the impacts over wider geographical areas. The EU project CIPRNet (Critical Infrastructures Preparedness and Resilience Research Network) is realizing an unprecedented Decision Support System (DSS) which enables operational risk prediction on Critical Infrastructures (CI) by predicting the occurrence of natural events (from long-term weather forecasts to short-term nowcasts), correlating the intrinsic vulnerabilities of CI elements with the strengths of the different events' manifestations, and analysing the resulting Damage Scenario. The Damage Scenario is then transformed into an Impact Scenario, where damages to individual CI elements are transformed into micro (local area) or meso (regional) scale service outages. At the smaller scale, the DSS simulates detailed city models (where CI dependencies are explicitly accounted for) that provide important input for crisis management organizations, whereas at the regional scale, using an approximate System-of-Systems model describing systemic interactions, the focus is on raising awareness. The DSS has allowed the development of a novel simulation framework for predicting shake maps originating from a given seismic event, considering shock-wave propagation in inhomogeneous media and estimating the resulting damage from building vulnerabilities on the basis of a phenomenological model [1, 2]. Moreover, for areas containing river basins, when abundant precipitation is expected, the DSS solves 1D/2D hydrodynamic models of the river basins to predict the runoff and the corresponding flood dynamics. This calculation allows the estimation of the Damage Scenario and triggers the evaluation of the Impact Scenario.

  9. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Torrential processes like flooding, heavy bedload transport or debris flows in steep mountain channels emerge during intense, highly localized rainfall events. They pose a serious risk to the densely populated Alpine region. Hydrogeomorphic hazards are profoundly nonlinear, threshold-mediated phenomena that frequently cause costly damage to infrastructure and people. Thus, in the context of climate change, there is an ever-rising interest in whether the sediment cascades of small alpine catchments react to changing precipitation patterns and how the climate signal is propagated through the fluvial system. We intend to answer the following research questions: (i) What are the critical meteorological characteristics triggering torrential events in the Eastern Alps of Austria? (ii) The effect of external triggers is strongly mediated by the internal disposition of catchments to respond. Which factors control this internal susceptibility? (iii) Do torrential processes show an increase in magnitude and frequency or a shift in seasonality in the recent past? (iv) Which future changes can be expected under different climate scenarios? Quantifications of bedload transport in small alpine catchments are rare and often associated with high uncertainties. Detailed knowledge, however, exists for the Schöttlbach catchment, a 71 km² study area in Styria in the Eastern Alps. The torrent has been monitored since a heavy precipitation event resulted in a disastrous flood in July 2011. Sediment mobilisation from slopes as well as within-channel storage and fluxes are regularly measured by photogrammetric methods and sediment impact sensors (SIS). The associated hydro-meteorological conditions are known from a dense station network. Changing states of connectivity can thus be related to precipitation and internal dynamics (sediment availability, cut-and-fill cycles). The site-specific insights are then conceptualized for application at a broader scale. Therefore, a Styria-wide database of torrential

  10. The impact of overlapping processes on rockfall hazard analysis - the Bolonia Bay study (southern Spain)

    Science.gov (United States)

    Fernandez-Steeger, T.; Grützner, C.; Reicherter, K.; Braun, A.; Höbig, N.

    2009-04-01

    from the described investigation show that on a screening and planning level the results of the empirical methods are quite good. Especially for numerical simulation, where back analysis is common to parameterize the models, the identification of "ideal" rockfalls is essential for a good simulation performance and subsequently for an appropriate planning of protection measures. References Corominas, J. 1996. The angle of reach as a mobility index for small and large landslides. Canadian Geotechnical Journal, 33, 260 - 271. Dorren, L.K. 2003. A review of rockfall mechanics and modeling approaches. Progress in Physical Geography, 27 (1), 69 - 87. Evans, S. & Hungr, O. 1993. The assessment of rockfall hazard at the base of talus slopes. Canadian Geotechnical Journal, 30, 620 - 636. Heim, A. 1932. Bergsturz und Menschenleben. Vjschr. d. Naturforsch Ges. Zürich, 216 pp. Silva P.G., Reicherter K., Grützner C., Bardají T., Lario J., Goy J.L., Zazo C., & Becker-Heidmann P. 2009. Surface and subsurface paleoseismic records at the ancient Roman city of Baelo Claudia and the Bolonia Bay area, Cádiz (South Spain). Geol Soc of London Spec. Vol.: Paleoseismology: Historical and prehistorical records of earthquake ground effects for seismic hazard assessment. In press. Spang, R. M. & Sonser, Th. 1995. Optimized rockfall protection by "ROCKFALL". Proc 8th Int Congress Rock Mechanics, 3, 1233-1242.

  11. A dynamic approach for the impact of a toxic gas dispersion hazard considering human behaviour and dispersion modelling.

    Science.gov (United States)

    Lovreglio, Ruggiero; Ronchi, Enrico; Maragkos, Georgios; Beji, Tarek; Merci, Bart

    2016-11-15

    The release of toxic gases due to natural/industrial accidents or terrorist attacks in populated areas can have tragic consequences. To prevent and evaluate the effects of these disasters different approaches and modelling tools have been introduced in the literature. These instruments are valuable tools for risk managers doing risk assessment of threatened areas. Despite the significant improvements in hazard assessment in case of toxic gas dispersion, these analyses do not generally include the impact of human behaviour and people movement during emergencies. This work aims at providing an approach which considers both modelling of gas dispersion and evacuation movement in order to improve the accuracy of risk assessment for disasters involving toxic gases. The approach is applied to a hypothetical scenario including a ship releasing Nitrogen dioxide (NO2) on a crowd attending a music festival. The difference between the results obtained with existing static methods (people do not move) and a dynamic approach (people move away from the danger) which considers people movement with different degrees of sophistication (either a simple linear path or more complex behavioural modelling) is discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Tsunami hazard for the city of Catania, eastern Sicily, Italy, assessed by means of Worst-case Credible Tsunami Scenario Analysis (WCTSA)

    Science.gov (United States)

    Tonini, R.; Armigliato, A.; Pagnoni, G.; Zaniboni, F.; Tinti, S.

    2011-05-01

    Eastern Sicily is one of the Italian coastal areas most exposed to earthquakes and tsunamis. The city of Catania, which developed between the eastern base of the Etna volcano and the Ionian Sea, is, together with the neighbouring coastal belt, under strong threat from tsunamis. This paper addresses the estimation of the tsunami hazard for the city of Catania by using the technique of the Worst-case Credible Tsunami Scenario Analysis (WCTSA) and is focused on a target area including the Catania harbour and the beach called La Plaia, where many human activities take place and many important structures are present. The aim of the work is to provide a detailed tsunami hazard analysis, firstly by building scenarios that are proposed on the basis of tectonic considerations and of the largest historical events that hit the city in the past, and then by combining all the information deriving from the single scenarios into a unique aggregated scenario that can be viewed as the worst virtual scenario. The scenarios have been calculated by means of numerical simulations on computational grids of different resolutions, passing from 3 km on a regional scale to 40 m in the target area. La Plaia beach turns out to be the area most exposed to tsunami inundation, with inland penetration of up to hundreds of metres. The harbour turns out to be more exposed to tsunami waves with low frequencies: in particular, it is found that the major contribution to the hazard in the harbour is due to a tsunami from a remote source, which propagates with much longer periods than tsunamis from local sources. This work has been performed in the framework of the EU-funded project SCHEMA.

  13. Culture Representation in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    David Gertman; Julie Marble; Steven Novack

    2006-12-01

    Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood state of the art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede’s (1991) cultural factors and Davis’ (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.
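
    As a purely hypothetical illustration of how cultural parameters might enter such a calculation as additional performance shaping factors (PSFs), the Python sketch below multiplies a nominal human error probability by PSF multipliers and bounds the result with a SPAR-H-style adjustment. It is not the CAM procedure itself; all names and values are invented.

    def adjusted_hep(nhep, psf_multipliers):
        """Multiply a nominal HEP by PSF multipliers, bounded below 1 using a
        SPAR-H-style adjustment (illustrative only)."""
        composite = 1.0
        for m in psf_multipliers:
            composite *= m
        # Equals nhep when the composite is 1 and approaches 1 as it grows
        return (nhep * composite) / (nhep * (composite - 1.0) + 1.0)

    nominal_hep = 1e-3                 # invented nominal error probability
    baseline_psfs = [2.0, 5.0]         # e.g. complexity and stress multipliers (invented)
    cultural_psf = 10.0                # invented adverse culture/technology-acceptance factor

    print(adjusted_hep(nominal_hep, baseline_psfs))                    # ~0.0099
    print(adjusted_hep(nominal_hep, baseline_psfs + [cultural_psf]))   # ~0.091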

  14. Potential Hazard to Human Health from Exposure to Fragments of Lead Bullets and Shot in the Tissues of Game Animals

    Science.gov (United States)

    Pain, Deborah J.; Cromie, Ruth L.; Newth, Julia; Brown, Martin J.; Crutcher, Eric; Hardman, Pippa; Hurst, Louise; Mateo, Rafael; Meharg, Andrew A.; Moran, Annette C.; Raab, Andrea; Taggart, Mark A.; Green, Rhys E.

    2010-01-01

    Background Lead is highly toxic to animals. Humans eating game killed using lead ammunition generally avoid swallowing shot or bullets and dietary lead exposure from this source has been considered low. Recent evidence illustrates that lead bullets fragment on impact, leaving small lead particles widely distributed in game tissues. Our paper asks whether lead gunshot pellets also fragment upon impact, and whether lead derived from spent gunshot and bullets in the tissues of game animals could pose a threat to human health. Methodology/Principal Findings Wild-shot gamebirds (6 species) obtained in the UK were X-rayed to determine the number of shot and shot fragments present, and cooked using typical methods. Shot were then removed to simulate realistic practice before consumption, and lead concentrations determined. Data from the Veterinary Medicines Directorate Statutory Surveillance Programme documenting lead levels in raw tissues of wild gamebirds and deer, without shot being removed, are also presented. Gamebirds containing ≥5 shot had high tissue lead concentrations, but some with fewer or no shot also had high lead concentrations, confirming X-ray results indicating that small lead fragments remain in the flesh of birds even when the shot exits the body. A high proportion of samples from both surveys had lead concentrations exceeding the European Union Maximum Level of 100 ppb w.w. (0.1 mg kg−1 w.w.) for meat from bovine animals, sheep, pigs and poultry (no level is set for game meat), some by several orders of magnitude. High, but feasible, levels of consumption of some species could result in the current FAO/WHO Provisional Weekly Tolerable Intake of lead being exceeded. Conclusions/Significance The potential health hazard from lead ingested in the meat of game animals may be larger than previous risk assessments indicated, especially for vulnerable groups, such as children, and those consuming large amounts of game. PMID:20436670

  15. Hazard function theory for nonstationary natural hazards

    Science.gov (United States)

    Read, Laura K.; Vogel, Richard M.

    2016-04-01

    Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
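
    A minimal Python sketch of the hazard function h(x) = f(x)/S(x) for peaks-over-threshold exceedances that follow a generalized Pareto distribution, the case treated above; the shape and scale values are illustrative.

    import numpy as np
    from scipy.stats import genpareto

    xi, sigma = 0.1, 2.0               # illustrative GPD shape and scale
    x = np.linspace(0.0, 20.0, 201)    # exceedance magnitude above the threshold

    hazard = genpareto.pdf(x, c=xi, scale=sigma) / genpareto.sf(x, c=xi, scale=sigma)
    # For the GPD this has the closed form h(x) = 1 / (sigma + xi * x)
    assert np.allclose(hazard, 1.0 / (sigma + xi * x))
    print(hazard[:5])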

  16. Hazard Identification and Risk Assessment of Health and Safety Approach JSA (Job Safety Analysis) in Plantation Company

    Science.gov (United States)

    Sugarindra, Muchamad; Ragil Suryoputro, Muhammad; Tiya Novitasari, Adi

    2017-06-01

    A plantation company needed to identify hazards and perform a risk assessment for occupational health and safety, which was approached using JSA (Job Safety Analysis). The identification was aimed at the potential hazards that might pose a risk of workplace accidents, so that preventive action could be taken to minimize accidents. The data were collected by direct observation of the workers concerned and the results were recorded on a Job Safety Analysis form. The jobs assessed were forklift operator, macerator worker, creeper worker, shredder worker, workshop worker, mechanical line worker, trolley cleaning worker and crepe-decline worker. The results showed that the shredder job had a risk value of 30, placing it at the extreme risk level (risk values above 20). To minimize accidents, the company should provide appropriate Personal Protective Equipment (PPE) and information about health and safety, supervise workers' activities, and reward workers who obey the rules that apply in the plantation.
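
    A minimal Python sketch of the likelihood-times-consequence scoring that typically underlies this kind of JSA risk assessment; the "extreme above 20" threshold follows the abstract, while the task ratings and the other cut-offs are invented for illustration.

    def risk_level(likelihood, consequence):
        """Return the risk score and an illustrative category label."""
        score = likelihood * consequence
        if score > 20:
            category = "extreme"
        elif score > 10:
            category = "high"
        elif score > 5:
            category = "moderate"
        else:
            category = "low"
        return score, category

    # Invented ratings (likelihood, consequence) for two of the assessed jobs
    tasks = {"shredder worker": (6, 5), "trolley cleaning worker": (3, 2)}
    for task, (lik, con) in tasks.items():
        print(task, risk_level(lik, con))    # shredder: (30, 'extreme')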

  17. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant.

    Science.gov (United States)

    Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen

    2015-09-01

    To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving the production management. Copyright © 2015. Published by Elsevier B.V.

  18. Discrimination and numerical analysis of human pathogenic ...

    African Journals Online (AJOL)

    Discrimination and numerical analysis of human pathogenic Candida albicans strains based on SDS-PAGE protein profiles. ... obtaining a correct identification, both the commercial yeast kit system and the numerical analysis of whole-cell protein patterns can be useful for the more reliable identification of C. albicans strains.

  19. Analysis of root causes of major hazard precursors (hydrocarbon leaks) in the Norwegian offshore petroleum industry

    Energy Technology Data Exchange (ETDEWEB)

    Vinnem, Jan Erik, E-mail: jev@preventor.n [Preventor AS/University of Stavanger, Rennebergstien 30, 4021 Stavanger (Norway); Hestad, Jon Andreas [Safetec Nordic AS, Bergen (Norway); Kvaloy, Jan Terje [Department of Mathematics and Natural Sciences, University of Stavanger (Norway); Skogdalen, Jon Espen [Department of Industrial Economics, Risk Management and Planning, University of Stavanger (Norway)

    2010-11-15

    The offshore petroleum industry in Norway reports major hazard precursors to the authorities, and data are available for the period 1996 through 2009. Barrier data have been reported since 2002, as have data from an extensive questionnaire survey covering working environment, organizational culture and perceived risk among all employees on offshore installations. Several attempts have been made to analyse different data sources in order to discover relations that may cast some light on possible root causes of major hazard precursors. These previous attempts were inconclusive. The study presented in this paper is the most extensive study performed so far. The data were analysed using linear regression. The conclusion is that there are significant correlations between number of leaks and safety climate indicators. The discussion points to possible root causes of major accidents.
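
    A minimal sketch, with synthetic installation-level data and the Python statsmodels package, of the kind of linear regression used to relate leak counts to a safety climate indicator; the variable names, sample size and effect sizes are invented.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n_installations = 40
    safety_climate = rng.normal(0.0, 1.0, n_installations)   # standardized survey score
    leaks = 3.0 - 1.2 * safety_climate + rng.normal(0.0, 1.0, n_installations)

    X = sm.add_constant(safety_climate)       # intercept + safety climate indicator
    model = sm.OLS(leaks, X).fit()
    print(model.summary())                    # a negative, significant slope mirrors the finding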

  20. Seismic hazard analysis. Volume 5. Review panel, Ground Motion Panel, and feedback results

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D. L.

    1981-08-01

    The Site Specific Spectra Project (SSSP) was a multi-year study funded by the US Nuclear Regulatory Commission to provide estimates of the seismic hazards at a number of nuclear power plant sites in the Eastern US. A key element of our approach was the Peer Review Panel, which we formed in order to ensure that our use of expert opinion was reasonable. We discuss the Peer Review Panel results and provide the complete text of each member's report. In order to improve the ground motion model, an Eastern US Ground Motion Model Panel was formed. In Section 4 we tabulate the responses from the panel members to our feedback questionnaire and discuss the implications of changes introduced by them. We conclude that the net difference in seismic hazard values from those presented in Volume 4 is small and does not warrant a reanalysis. 22 figs.

  1. Seismic hazard analysis of nuclear installations in France. Current practice and research

    Energy Technology Data Exchange (ETDEWEB)

    Mohammadioun, B. [CEA Centre d`Etudes de Fontenay-aux-Roses, 92 (France). Inst. de Protection et de Surete Nucleaire

    1997-03-01

    The methodology put into practice in France for the evaluation of seismic hazard on the sites of nuclear facilities is founded on data assembled country-wide over the past 15 years, in geology, geophysics and seismology. It is appropriate to the regional seismotectonic context (interplate), characterized notably by diffuse seismicity. Extensive use is made of information drawn from historical seismicity. The regulatory practice described in the RFS I.2.c is reexamined periodically and is subject to up-dating so as to take advantage of new earthquake data and of the results gained from research work. Acquisition of the basic data, such as the identification of active faults and the quantification of site effect, which will be needed to achieve improved preparedness versus severe earthquake hazard in the 21st century, will necessarily be the fruit of close international cooperation and collaboration, which should accordingly be actively promoted. (J.P.N.)

  2. Probabilistic aftershock hazard analysis, two case studies in West and Northwest Iran

    Science.gov (United States)

    Ommi, S.; Zafarani, H.

    2017-09-01

    Aftershock hazard maps contain essential information for the search and rescue process and for re-occupation after a mainshock. Accordingly, the main purposes of this article are to study the aftershock decay parameters and to estimate the expected high-frequency ground motions (i.e., Peak Ground Acceleration, PGA) for recent large earthquakes in the Iranian plateau. For this aim, the Ahar-Varzaghan doublet earthquake (August 11, 2012; MN = 6.5, MN = 6.3) and the Ilam (Murmuri) earthquake (August 18, 2014; MN = 6.2) have been selected. The earthquake catalogue has been compiled based on the Gardner and Knopoff (Bull Seismol Soc Am 64(5), 1363-1367, 1974) temporal and spatial windowing technique. The magnitude of completeness, the seismicity parameters (a, b) and the modified Omori law parameters (p, K, c) have been determined for these two earthquakes for the 14, 30, and 60 days after the mainshocks. Also, the temporal changes of the parameters (a, b, p, K, c) have been studied. The aftershock hazard maps for a probability of exceedance of 33% have been computed for the time periods of 14, 30, and 60 days after the Ahar-Varzaghan and Ilam (Murmuri) earthquakes. For calculating the expected PGA of aftershocks, regional and global ground motion prediction equations have been utilized. An amplification factor based on the site classes has also been applied in the calculation of PGA. These aftershock hazard maps show agreement between the PGAs of large aftershocks and the forecasted PGAs. Also, the significant role of the b parameter in the Ilam (Murmuri) probabilistic aftershock hazard maps has been investigated.
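
    A minimal Python sketch of the modified Omori law n(t) = K/(t + c)^p and of the expected aftershock counts in the 14-, 30- and 60-day windows used above; the parameter values are illustrative, not those estimated for the Ahar-Varzaghan or Ilam (Murmuri) sequences.

    from scipy.integrate import quad

    K, c, p = 120.0, 0.05, 1.1        # illustrative Omori parameters

    def omori_rate(t):
        """Aftershock rate (events per day) t days after the mainshock."""
        return K / (t + c) ** p

    for t_end in (14, 30, 60):
        n_expected, _ = quad(omori_rate, 0.0, t_end)
        print(f"0-{t_end} days: ~{n_expected:.0f} aftershocks expected")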

  3. Hazardous waste characterization among various thermal processes in South Korea: a comparative analysis.

    Science.gov (United States)

    Shin, Sun Kyoung; Kim, Woo-Il; Jeon, Tae-Wan; Kang, Young-Yeul; Jeong, Seong-Kyeong; Yeon, Jin-Mo; Somasundaram, Swarnalatha

    2013-09-15

    The Ministry of Environment, Republic of Korea (South Korea) is in the process of converting its current hazardous waste classification system to harmonize it with international standards and to set up regulatory standards for toxic substances present in hazardous waste. In the present work, the concentrations and trends of 13 heavy metals, F(-), CN(-) and 19 PAHs present in the hazardous waste generated by various thermal processes (11 processes) in South Korea were analyzed, along with their leaching characteristics. In all thermal processes, the median concentrations of Cu (3.58-209,000 mg/kg), Ni (BDL-1560 mg/kg), Pb (7.22-5132.25 mg/kg) and Zn (83.02-31419 mg/kg) were comparatively higher than those of the other heavy metals. The iron and steel thermal process showed the highest median values of the heavy metals Cd (14.76 mg/kg), Cr (166.15 mg/kg) and Hg (2.38 mg/kg). Low-molecular-weight PAHs (BDL-37.59 mg/kg) were predominant in sludge and filter cake samples from most of the thermal processes. Flue gas dust from most of the thermal processing units showed comparatively higher leaching of heavy metals. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. [The analysis of adverse health effects of occupational hazards factors in one solid waste landfill].

    Science.gov (United States)

    Shi, Ting-Ming; Weng, Shao-Fan; Liu, Yue-Wei; Tao, Hua; Wang, Xin; Guo, Yan-Fei; Wang, He-Ping; Wang, Hai-Jiao; Wang, Ke-Hong; Yu, Dan; Chen, Wei-Hong

    2011-07-01

    To determine the occupational hazards at the work sites of a large solid waste landfill and analyze their adverse health effects. National standardized detection methods were used to determine dust concentrations, harmful gases and physical factors at the worksites. Routine physical examinations, pulmonary function tests, hearing tests and nervous system tests were performed on workers for 2 consecutive years. Urine lead, cadmium and mercury contents were detected. The comet assay was used to measure DNA damage in peripheral blood lymphocytes among workers. The main occupational hazard factors in this solid waste landfill are dust, harmful gases, high temperature and noise. The oxides, carbon monoxide, noise and high temperatures in summer at some work sites exceeded the national occupational exposure limits. The prevalence of respiratory inflammation and the rate of decreased pulmonary function among front-line workers and on-site technical managers are 21.2% and 11.5%, significantly higher than those among administrative staff (7.1% and 0) (P < 0.05). Adverse health effects associated with these occupational hazards were observed among workers in this solid waste landfill.

  5. Analysis of factors related to man-induced hazard for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Soon; Jung, Jea Hee; Lee, Keun O; Son, Ki Sang; Wang, Sang Chul; Lee, Chang Jin; Ku, Min Ho; Park, Nam Young [Seoul National Univ. of Technology, Seoul (Korea, Republic of)

    2003-03-15

    This study provides guidance for installing hazardous facilities adjacent to a nuclear power plant, based on an assessment of how much such facilities could affect the plant. A nuclear power plant is an important facility that is closely connected with public life, industrial activity and the conduct of public business, so it must not be damaged. Therefore, if hazardous and harmful facilities exist near the plant, they must be evaluated according to their size, type and shape. First of all, any factors that could cause a man-induced accident must be investigated, and the extent to which they could damage the plant facilities must be evaluated precisely. The purpose of this study is to set a technical standard for the installation of such facilities by evaluating man-induced accidents, and to establish evaluation methods by investigating the hazardous facilities located near the plant. Korea currently relies on the CFR standard, regulatory guides and the IAEA safety series; however, neither a technical standard related to man-induced accidents nor evaluation methods for such facilities have yet been laid down. As mentioned above, these facilities should be evaluated adequately, and such evaluation methods must be established.

  6. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down— defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  7. Food-Safety Hazards in the Pork Chain in Nagaland, North East India: Implications for Human Health

    Directory of Open Access Journals (Sweden)

    Anna Sophie Fahrion

    2013-12-01

    Pork occupies an important place in the diet of the population of Nagaland, one of the North East Indian states. We carried out a pilot study along the pork meat production chain, from live animal to end consumer. The goal was to obtain information about the presence of selected food borne hazards in pork in order to assess the risk deriving from these hazards to the health of the local consumers and make recommendations for improving food safety. A secondary objective was to evaluate the utility of risk-based approaches to food safety in an informal food system. We investigated samples from pigs and pork sourced at slaughter in urban and rural environments, and at retail, to assess a selection of food-borne hazards. In addition, consumer exposure was characterized using information about hygiene and practices related to handling and preparing pork. A qualitative hazard characterization, exposure assessment and hazard characterization for three representative hazards or hazard proxies, namely Enterobacteriaceae, T. solium cysticercosis and antibiotic residues, is presented. Several important potential food-borne pathogens are reported for the first time including Listeria spp. and Brucella suis. This descriptive pilot study is the first risk-based assessment of food safety in Nagaland. We also characterise possible interventions to be addressed by policy makers, and supply data to inform future risk assessments.

  8. Food-safety hazards in the pork chain in Nagaland, North East India: implications for human health.

    Science.gov (United States)

    Fahrion, Anna Sophie; Jamir, Lanu; Richa, Kenivole; Begum, Sonuwara; Rutsa, Vilatuo; Ao, Simon; Padmakumar, Varijaksha P; Deka, Ram Pratim; Grace, Delia

    2013-12-24

    Pork occupies an important place in the diet of the population of Nagaland, one of the North East Indian states. We carried out a pilot study along the pork meat production chain, from live animal to end consumer. The goal was to obtain information about the presence of selected food borne hazards in pork in order to assess the risk deriving from these hazards to the health of the local consumers and make recommendations for improving food safety. A secondary objective was to evaluate the utility of risk-based approaches to food safety in an informal food system. We investigated samples from pigs and pork sourced at slaughter in urban and rural environments, and at retail, to assess a selection of food-borne hazards. In addition, consumer exposure was characterized using information about hygiene and practices related to handling and preparing pork. A qualitative hazard characterization, exposure assessment and hazard characterization for three representative hazards or hazard proxies, namely Enterobacteriaceae, T. solium cysticercosis and antibiotic residues, is presented. Several important potential food-borne pathogens are reported for the first time including Listeria spp. and Brucella suis. This descriptive pilot study is the first risk-based assessment of food safety in Nagaland. We also characterise possible interventions to be addressed by policy makers, and supply data to inform future risk assessments.

  9. A framework for the assessment and analysis of multi-hazards induced risk resulting from space vehicles operations

    Science.gov (United States)

    Sala-Diakanda, Serge N.

    2007-12-01

    With the foreseeable increase in traffic frequency to and from orbit, the safe operation of current and future space vehicles at designated spaceports has become a serious concern. Due to their high explosive energy potential, operating those launch vehicles presents a real risk to: (1) the spaceport infrastructure and personnel, (2) the communities surrounding the spaceport and (3) the flying aircrafts whose routes could be relatively close to spaceport launch and reentry routes. Several computer models aimed at modeling the effects of the different hazards generated by the breakup of such vehicles (e.g., fragmentation of debris, release of toxic gases, propagation of blast waves, etc.) have been developed, and are used to assist in Go-No Go launch decisions. They can simulate a total failure scenario of the vehicle and, estimate a number of casualties to be expected as a result of such failure. However, as all of these models---which can be very elaborate and complex---consider only one specific explosion hazard in their simulations, the decision of whether or not a launch should occur is currently based on the evaluation of several estimates of an expected number of casualties. As such, current practices ignore the complex, nonlinear interactions between the different hazards as well as the interdependencies between the estimates. In this study, we developed a new framework which makes use of information fusion theory, hazards' dispersion modeling and, geographical statistical analysis and visualization capabilities of geographical information systems to assess the risk generated by the operation of space launch vehicles. A new risk metric, which effectively addresses the lack of a common risk metric with current methods, is also proposed. A case study, based on a proposed spaceport in the state of Oklahoma showed that the estimates we generate through our framework consistently outperform estimates provided by any individual hazard, or by the independent

  10. A decision analysis framework for estimating the potential hazards for drinking water resources of chemicals used in hydraulic fracturing fluids.

    Science.gov (United States)

    Yost, Erin E; Stanek, John; Burgoon, Lyle D

    2017-01-01

    Despite growing concerns over the potential for hydraulic fracturing to impact drinking water resources, there are limited data available to identify chemicals used in hydraulic fracturing fluids that may pose public health concerns. In an effort to explore these potential hazards, a multi-criteria decision analysis (MCDA) framework was employed to analyze and rank selected subsets of these chemicals by integrating data on toxicity, frequency of use, and physicochemical properties that describe transport in water. Data used in this analysis were obtained from publicly available databases compiled by the United States Environmental Protection Agency (EPA) as part of a larger study on the potential impacts of hydraulic fracturing on drinking water. Starting with nationwide hydraulic fracturing chemical usage data from EPA's analysis of the FracFocus Chemical Disclosure Registry 1.0, MCDAs were performed on chemicals that had either noncancer toxicity values (n=37) or cancer-specific toxicity values (n=10). The noncancer MCDA was then repeated for subsets of chemicals reported in three representative states (Texas, n=31; Pennsylvania, n=18; and North Dakota, n=20). Within each MCDA, chemicals received scores based on relative toxicity, relative frequency of use, and physicochemical properties (mobility in water, volatility, persistence). Results show a relative ranking of these chemicals based on hazard potential, and provide preliminary insight into chemicals that may be more likely than others to impact drinking water resources. Comparison of nationwide versus state-specific analyses indicates regional differences in the chemicals that may be of more concern to drinking water resources, although many chemicals were commonly used and received similar overall hazard rankings. Several chemicals highlighted by these MCDAs have been reported in groundwater near areas of hydraulic fracturing activity. This approach is intended as a preliminary analysis, and represents one
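
    A minimal Python sketch of a weighted-sum style of MCDA ranking like the one described above, with invented chemicals, criterion scores and weights; the EPA analysis used its own criteria, data and scoring scheme.

    import pandas as pd

    data = pd.DataFrame({
        "chemical":  ["A", "B", "C", "D"],          # hypothetical chemicals
        "toxicity":  [3, 9, 5, 7],                  # higher = more toxic (invented 1-10 scores)
        "frequency": [8, 2, 6, 4],                  # higher = more frequently reported
        "mobility":  [5, 7, 9, 3],                  # higher = more mobile in water
    }).set_index("chemical")

    weights = {"toxicity": 0.5, "frequency": 0.3, "mobility": 0.2}   # invented weights

    normalized = (data - data.min()) / (data.max() - data.min())     # min-max normalization
    data["hazard_score"] = sum(w * normalized[c] for c, w in weights.items())
    print(data.sort_values("hazard_score", ascending=False))         # rank by composite hazard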

  11. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 18. Errors in Probabilistic Seismic Hazard Analysis.

    Science.gov (United States)

    1982-01-01

    The hazard is conditional on a given t.a. process representation of seismicity, symbolized here by the random process X(t). However, X(t) is not always known...

  12. Human Motion Analysis for Creating Immersive Experiences

    OpenAIRE

    Abedan Kondori, Farid

    2012-01-01

    From an early age, people display the ability to quickly and effortlessly interpret the orientation and movement of human body parts, thereby allowing one to infer the intentions of others who are nearby and to comprehend an important nonverbal form of communication. The ease with which one accomplishes this task belies the difficulty of a problem that has challenged computational systems for decades: human motion analysis. Technological developments over the years have resulted in many systems...

  13. Patents and human rights: a heterodox analysis.

    Science.gov (United States)

    Gold, E Richard

    2013-01-01

    Much international debate over access to medicines focuses on whether patent law accords with international human rights law. This article argues that this is the wrong question to ask. Following an analysis of both patent and human rights law, this article suggests that the better approach is to focus on national debates over the best calibration of patent law to achieve national objectives. © 2013 American Society of Law, Medicine & Ethics, Inc.

  14. Application of hazard analysis and critical control points (HACCP) to the processing of compost used in the cultivation of button mushroom

    National Research Council Canada - National Science Library

    José Emilio Pardo; Diego Cunha Zied; Manuel Alvarez-Ortí; Jesús Ángel Peñaranda; Carmen Gómez-Cantó; Arturo Pardo-Giménez

    2017-01-01

    .... Methods In this paper, the Hazard Analysis and Critical Control Points system is applied to the processing line of compost used in the cultivation of mushrooms and other edible cultivated fungi...

  15. DOWNFLOW code and LIDAR technology for lava flow analysis and hazard assessment at Mount Etna

    Directory of Open Access Journals (Sweden)

    Alessandro Fornaciai

    2011-12-01

    Full Text Available The use of a lava-flow simulation (DOWNFLOW) probabilistic code and airborne light detection and ranging (LIDAR) technology are combined to analyze the emplacement of compound lava flow fields at Mount Etna (Sicily, Italy). The goal was to assess the hazard posed by lava flows. The LIDAR-derived time series acquired during the 2006 Mount Etna eruption records the changing topography of an active lava-flow field. These short-time-interval, high-resolution topographic surveys provide a detailed quantitative picture of the topographic changes. The results highlight how the flow field evolves as a number of narrow (5-15 m wide) disjointed flow units that are fed simultaneously by uneven lava pulses that advance within formed channels. These flow units have widely ranging advance velocities (3-90 m/h). Overflows, bifurcations and braiding are also clearly displayed. In such a complex scenario, the suitability of deterministic codes for lava-flow simulation can be hampered by the fundamental difficulty of measuring the flow parameters (e.g. the lava discharge rate, or the lava viscosity) of a single flow unit. However, the DOWNFLOW probabilistic code approaches this point statistically and needs no direct knowledge of flow parameters. DOWNFLOW intrinsically accounts for complexities and perturbations of lava flows by randomly varying the pre-eruption topography. This DOWNFLOW code is systematically applied here over Mount Etna, to derive a lava-flow hazard map based on: (i) the topography of the volcano; (ii) the probability density function for vent opening; and (iii) a law for the expected lava-flow length for all of the computational vents considered. Changes in the hazard due to the recent morphological evolution of Mount Etna have also been addressed.

  16. Numerical and probabilistic analysis of asteroid and comet impact hazard mitigation

    Energy Technology Data Exchange (ETDEWEB)

    Plesko, Catherine S [Los Alamos National Laboratory; Weaver, Robert P [Los Alamos National Laboratory; Huebner, Walter F [Los Alamos National Laboratory

    2010-09-09

    The possibility of asteroid and comet impacts on Earth has received significant recent media and scientific attention. Still, there are many outstanding questions about the correct response once a potentially hazardous object (PHO) is found. Nuclear munitions are often suggested as a deflection mechanism because they have a high internal energy per unit launch mass. However, major uncertainties remain about the use of nuclear munitions for hazard mitigation. There are large uncertainties in a PHO's physical response to a strong deflection or dispersion impulse like that delivered by nuclear munitions. Objects smaller than 100 m may be solid, and objects at all sizes may be 'rubble piles' with large porosities and little strength. Objects with these different properties would respond very differently, so the effects of object properties must be accounted for. Recent ground-based observations and missions to asteroids and comets have improved the planetary science community's understanding of these objects. Computational power and simulation capabilities have improved such that it is possible to numerically model the hazard mitigation problem from first principles. Before we know that explosive yield Y at height h or depth -h from the target surface will produce a momentum change in or dispersion of a PHO, we must quantify energy deposition into the system of particles that make up the PHO. Here we present the initial results of a parameter study in which we model the efficiency of energy deposition from a stand-off nuclear burst onto targets made of PHO constituent materials.

  17. Hazards/Failure Modes and Effects Analysis MK 1 MOD 0 LSO-HUD Console System.

    Science.gov (United States)

    1980-03-24

    Report NAEC-91-7958, Naval Air Engineering Center, Lakehurst, N.J. 08733. Technical Report, 24 March 1980. Contract No. N68335-78-C-2002. Approved for public release; distribution unlimited.

  18. Combustion diagnosis for analysis of solid propellant rocket abort hazards: Role of spectroscopy

    Science.gov (United States)

    Gill, W.; Cruz-Cabrera, A. A.; Donaldson, A. B.; Lim, J.; Sivathanu, Y.; Bystrom, E.; Haug, A.; Sharp, L.; Surmick, D. M.

    2014-11-01

    Solid rocket propellant plume temperatures have been measured using spectroscopic methods as part of an ongoing effort to specify the thermal-chemical-physical environment in and around a burning fragment of an exploded solid rocket at atmospheric pressures. Such specification is needed for launch safety studies where hazardous payloads become involved with large fragments of burning propellant. The propellant burns in an off-design condition producing a hot gas flame loaded with burning metal droplets. Each component of the flame (soot, droplets and gas) has a characteristic temperature, and it is only through the use of spectroscopy that their temperature can be independently identified.

  19. Analysis of potential hazards associated with 241Am loaded resins from nitrate media

    Energy Technology Data Exchange (ETDEWEB)

    Schulte, Louis D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rubin, Jim [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fife, Keith William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ricketts, Thomas Edgar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tappan, Bryce C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chavez, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-19

    LANL has been contacted to provide possible assistance in safe disposition of a number of 241Am-bearing materials associated with local industrial operations. Among the materials are ion exchange resins which have been in contact with 241Am and nitric acid, and which might have potential for exothermic reaction. The purpose of this paper is to analyze and define the resin forms and quantities to the extent possible from available data to allow better bounding of the potential reactivity hazard of the resin materials. An additional purpose is to recommend handling procedures to minimize the probability of an uncontrolled exothermic reaction.

  20. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. Actually, the author attacks historic and current HRA as having failed in informing policy makers who make decisions based on risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given subjective and empirical probabilities.

  1. Vibrational microspectroscopy analysis of human lenses

    Science.gov (United States)

    Paluszkiewicz, C.; Piergies, N.; Sozańska, A.; Chaniecki, P.; Rękas, M.; Miszczyk, J.; Gajda, M.; Kwiatek, W. M.

    2018-01-01

    In this study we present vibrational analysis of healthy (not affected by cataract) and cataractous human lenses by means of Raman and FTIR spectroscopy methods. The analysis provides comprehensive information about the secondary structure of the proteins and conformational changes of the amino acid residues due to the formation of opacification of the human lens. Briefly, changes in the conformation of the Tyr and Trp residues and in the protein secondary structure between the healthy and cataractous samples were identified. Moreover, the observed spectral pattern suggests that the process of cataract development does not occur uniformly over the entire volume of the lens.

  2. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. INSTEC, a database system of our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  3. Human milk metagenome: a functional capacity analysis

    Science.gov (United States)

    2013-01-01

    Background Human milk contains a diverse population of bacteria that likely influences colonization of the infant gastrointestinal tract. Recent studies, however, have been limited to characterization of this microbial community by 16S rRNA analysis. In the present study, a metagenomic approach using Illumina sequencing of a pooled milk sample (ten donors) was employed to determine the genera of bacteria and the types of bacterial open reading frames in human milk that may influence bacterial establishment and stability in this primal food matrix. The human milk metagenome was also compared to that of breast-fed and formula-fed infants’ feces (n = 5, each) and mothers’ feces (n = 3) at the phylum level and at a functional level using open reading frame abundance. Additionally, immune-modulatory bacterial-DNA motifs were also searched for within human milk. Results The bacterial community in human milk contained over 360 prokaryotic genera, with sequences aligning predominantly to the phyla of Proteobacteria (65%) and Firmicutes (34%), and the genera of Pseudomonas (61.1%), Staphylococcus (33.4%) and Streptococcus (0.5%). From assembled human milk-derived contigs, 30,128 open reading frames were annotated and assigned to functional categories. When compared to the metagenome of infants’ and mothers’ feces, the human milk metagenome was less diverse at the phylum level, and contained more open reading frames associated with nitrogen metabolism, membrane transport and stress response. The human milk metagenome also contained a similar occurrence of immune-modulatory DNA motifs to that of infants’ and mothers’ fecal metagenomes. Conclusions Our results further expand the complexity of the human milk metagenome and reinforce the benefits of human milk ingestion on the microbial colonization of the infant gut and immunity. Discovery of immune-modulatory motifs in the metagenome of human milk indicates more exhaustive analyses of the functionality of the human

  4. Washing and chilling as critical control points in pork slaughter hazard analysis and critical control point (HACCP) systems.

    Science.gov (United States)

    Bolton, D J; Pearce, R A; Sheridan, J J; Blair, I S; McDowell, D A; Harrington, D

    2002-01-01

    The aim of this research was to examine the effects of preslaughter washing, pre-evisceration washing, final carcass washing and chilling on final carcass quality and to evaluate these operations as possible critical control points (CCPs) within a pork slaughter hazard analysis and critical control point (HACCP) system. This study estimated bacterial numbers (total viable counts) and the incidence of Salmonella at three surface locations (ham, belly and neck) on 60 animals/carcasses processed through a small commercial pork abattoir (80 pigs d(-1)). Significant reductions (P HACCP in pork slaughter plants. This research will provide a sound scientific basis on which to develop and implement effective HACCP in pork abattoirs.

  5. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    Science.gov (United States)

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  6. Updated laser safety & hazard analysis for the ARES laser system based on the 2007 ANSI Z136.1 standard.

    Energy Technology Data Exchange (ETDEWEB)

    Augustoni, Arnold L.

    2007-08-01

    A laser safety and hazard analysis was performed for the temperature-stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system based on the 2007 version of the American National Standards Institute (ANSI) Standard Z136.1 for Safe Use of Lasers, and the 2005 version of the ANSI Standard Z136.6 for Safe Use of Lasers Outdoors. The ARES laser system is a van/truck-based mobile platform, which is used to perform laser interaction experiments and tests at various national test sites.

  7. Scale orientated analysis of river width changes due to extreme flood hazards

    Directory of Open Access Journals (Sweden)

    G. Krapesch

    2011-08-01

    Full Text Available This paper analyses the morphological effects of extreme floods (recurrence interval >100 years) and examines which parameters best describe the width changes due to erosion based on 5 affected alpine gravel bed rivers in Austria. The research was based on vertical aerial photos of the rivers before and after extreme floods, hydrodynamic numerical models and cross sectional measurements supported by LiDAR data of the rivers. Average width ratios (width after/before the flood) were calculated and correlated with different hydraulic parameters (specific stream power, shear stress, flow area, specific discharge). Depending on the geomorphological boundary conditions of the different rivers, a mean width ratio between 1.12 (Lech River) and 3.45 (Trisanna River) was determined on the reach scale. The specific stream power (SSP) best predicted the mean width ratios of the rivers especially on the reach scale and sub reach scale. On the local scale more parameters have to be considered to define the "minimum morphological spatial demand of rivers", which is a crucial parameter for addressing and managing flood hazards and should be used in hazard zone plans and spatial planning.
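
    Since specific stream power is the predictor the study singles out, a minimal sketch of the two quantities correlated at the reach scale is given below. The formula omega = rho * g * Q * S / w is the standard cross-section-averaged definition; the reach values are invented for illustration.

        # Specific stream power vs. post-flood width ratio (illustrative reach data).
        RHO_W = 1000.0  # water density, kg/m^3
        G = 9.81        # gravitational acceleration, m/s^2

        def specific_stream_power(discharge_m3s, energy_slope, width_m):
            """omega = rho * g * Q * S / w, in W/m^2."""
            return RHO_W * G * discharge_m3s * energy_slope / width_m

        # (peak discharge [m^3/s], energy slope [-], width before [m], width after [m]) - hypothetical
        reaches = [
            (350.0, 0.012, 25.0, 40.0),
            (120.0, 0.020, 12.0, 38.0),
        ]
        for q, s, w_before, w_after in reaches:
            ssp = specific_stream_power(q, s, w_before)
            print(f"SSP = {ssp:7.1f} W/m^2, width ratio = {w_after / w_before:.2f}")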

  8. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-05-01

    The need for remote handled low level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest ranked alternative for providing continued, uninterrupted remote-handled LLW disposal capability for remote-handled LLW that is generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  9. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Project

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-10-01

    The need for remote handled low level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest ranked alternative for providing continued, uninterrupted remote-handled LLW disposal capability for remote-handled LLW that is generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  10. Preliminary Hazard Analysis for the Remote-Handled Low-Level Waste Disposal Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lisa Harvego; Mike Lehto

    2010-02-01

    The need for remote handled low level waste (LLW) disposal capability has been identified. A new onsite, remote-handled LLW disposal facility has been identified as the highest ranked alternative for providing continued, uninterrupted remote-handled LLW disposal capability for remote-handled LLW that is generated as part of the nuclear mission of the Idaho National Laboratory and from spent nuclear fuel processing activities at the Naval Reactors Facility. Historically, this type of waste has been disposed of at the Radioactive Waste Management Complex. Disposal of remote-handled LLW in concrete disposal vaults at the Radioactive Waste Management Complex will continue until the facility is full or until it must be closed in preparation for final remediation of the Subsurface Disposal Area (approximately at the end of Fiscal Year 2017). This document supports the conceptual design for the proposed remote-handled LLW disposal facility by providing an initial nuclear facility hazard categorization and by identifying potential hazards for processes associated with onsite handling and disposal of remote-handled LLW.

  11. Elemental analysis and radiation hazards parameters of bauxite located in Saudi Arabia

    Science.gov (United States)

    Alashrah, S.; E Taher, A.

    2017-04-01

    Since bauxite is widely used in industry and in scientific investigations for producing aluminum, it is important to measure its radionuclide concentrations to determine potential health effects. The bauxite mine studied is located in Az Zabirah city in Saudi Arabia. The concentrations of the radionuclides in the bauxite samples were measured using a NaI(Tl) γ-ray spectrometer. The average and range values of the concentrations of 226Ra, 232Th and 40K were 102.2 (141.1-62.7), 156.3 (202.8-102.8) and 116.8 (191.7-48.9) Bq/kg, respectively. These results were compared with the ranges reported in the literature for other locations around the world. The radiation hazard parameters (radium equivalent activity, annual dose, external hazard index) were also calculated and compared with the levels recommended by the International Commission on Radiological Protection (ICRP-60) and the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) reports. There are no previous studies of natural radioactivity in the bauxite mine in Az Zabirah city, so these results are a first step toward establishing a database for this location.
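
    The hazard parameters named above have standard UNSCEAR-based definitions; the short sketch below evaluates two of them at the average activity concentrations quoted in this record. The coefficients are the commonly used ones and should be checked against the paper itself.

        # Radium equivalent activity and external hazard index at the reported averages.
        A_RA, A_TH, A_K = 102.2, 156.3, 116.8  # 226Ra, 232Th, 40K activity concentrations, Bq/kg

        ra_eq = A_RA + 1.43 * A_TH + 0.077 * A_K             # radium equivalent activity, Bq/kg
        h_ext = A_RA / 370.0 + A_TH / 259.0 + A_K / 4810.0   # external hazard index (dimensionless)

        print(f"Ra-eq = {ra_eq:.1f} Bq/kg (commonly compared against a 370 Bq/kg limit)")
        print(f"H-ext = {h_ext:.2f} (values below 1 are considered acceptable)")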

  12. Assessment of the 1988 Saguenay earthquake: Implications on attenuation functions for seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Toro, G.R.; McGuire, R.K. (Risk Engineering, Inc., Golden, CO (United States))

    1991-09-01

    This study investigates the earthquake records from the 1988 Saguenay earthquake and examines the implications of these records with respect to ground-motion models used in seismic-hazard studies in eastern North America (ENA), specifically, to what extent the ground motions from this earthquake support or reject the various attenuation functions used in the EPRI and LLNL seismic-hazard calculations. Section 2 provides a brief description of the EPRI and LLNL attenuation functions for peak acceleration and for spectral velocities. Section 3 compares these attenuation functions with the ground motions from the Saguenay earthquake and from other relevant earthquakes. Section 4 reviews available seismological studies about the Saguenay earthquake, in order to understand its seismological characteristics and why some observations may differ from predictions. Section 5 examines the assumptions and methodology used in the development of the attenuation functions selected by LLNL ground-motion expert 5. Finally, Section 6 draws conclusions about the validity of the various sets of attenuation functions, in light of the Saguenay data and of other evidence presented here. 50 refs., 37 figs., 7 tabs.

  13. Analysis and GIS Mapping of Flooding Hazards on 10 May 2016, Guangzhou, China

    Directory of Open Access Journals (Sweden)

    Hai-Min Lyu

    2016-10-01

    Full Text Available On 10 May 2016, Guangdong Province, China, suffered a heavy rainstorm. This rainstorm flooded the whole city of Guangzhou. More than 100,000 people were affected by the flooding, in which eight people lost their lives. Subway stations, cars, and buses were submerged. In order to analyse the influential factors of this flooding, topographical characteristics were mapped using a Digital Elevation Model (DEM) in a Geographical Information System (GIS), and meteorological conditions were statistically summarised at both the whole city level and the district level. To analyse the relationship between flood risk and urbanization, GIS was also adopted to map the effect of the subway system using the Multiple Buffer operator over the flooding distribution area. Based on the analyses, one of the significant influential factors of flooding was identified as the urbanization degree, e.g., construction of a subway system, which forms along flood-prone areas. The total economic loss due to flooding in city centers with high urbanization has become very serious. Based on the analyses, the traditional standard of severity of flooding hazards (rainfall intensity grade) was modified. The rainfall intensity threshold for severe flooding was decreased from 50 mm to 30 mm in urbanized city centers. In order to protect cities from flooding, a "Sponge City" planning approach is recommended to increase the temporary water storage capacity during heavy rainstorms. In addition, for future city management, the combined use of GIS and Building Information Modelling (BIM) is recommended to evaluate flooding hazards.

  14. Directed proteomic analysis of the human nucleolus

    DEFF Research Database (Denmark)

    Andersen, Jens S; Lyon, Carol E; Fox, Archa H

    2002-01-01

    of their structure and function remain uncharacterized. RESULTS: We report a proteomic analysis of human nucleoli. Using a combination of mass spectrometry (MS) and sequence database searches, including online analysis of the draft human genome sequence, 271 proteins were identified. Over 30% of the nucleolar...... proteins were encoded by novel or uncharacterized genes, while the known proteins included several unexpected factors with no previously known nucleolar functions. MS analysis of nucleoli isolated from HeLa cells in which transcription had been inhibited showed that a subset of proteins was enriched....... These data highlight the dynamic nature of the nucleolar proteome and show that proteins can either associate with nucleoli transiently or accumulate only under specific metabolic conditions. CONCLUSIONS: This extensive proteomic analysis shows that nucleoli have a surprisingly large protein complexity...

  15. Causal Analysis of the Inadvertent Contact with an Uncontrolled Electrical Hazardous Energy Source (120 Volts AC)

    Energy Technology Data Exchange (ETDEWEB)

    David E. James; Dennis E. Raunig; Sean S. Cunningham

    2014-10-01

    On September 25, 2013, a Health Physics Technician (HPT) was performing preparations to support a pneumatic transfer from the HFEF Decon Cell to the Room 130 Glovebox in HFEF, per HFEF OI 3165 section 3.5, Field Preparations. This activity involves an HPT setting up and climbing a portable ladder to remove the 14-C meter probe from above ball valve HBV-7. The HPT source checks the meter and probe and then replaces the probe above HBV-7, which is located above Hood ID# 130 HP. At approximately 13:20, while reaching past the HBV-7 valve position indicator switches in an attempt to place the 14-C meter probe in the desired location, the HPT's left forearm came in contact with one of the three sets of exposed terminals on the valve position indication switches for HBV 7. This resulted in the HPT receiving an electrical shock from a 120 Volt AC source. Upon moving the arm, following the electrical shock, the HPT noticed two exposed electrical connections on a switch. The HPT then notified the HFEF HPT Supervisor, who in turn notified the MFC Radiological Controls Manager and HFEF Operations Manager of the situation. Work was stopped in the area and the hazard was roped off and posted to prevent access to the hazard. The HPT was escorted by the HPT Supervisor to the MFC Dispensary and then proceeded to CFA medical for further evaluation. The individual was evaluated and released without any medical restrictions. Causal Factor (Root Cause) A3B3C01/A5B2C08: - Knowledge-based error/Attention was given to wrong issues - Written Communication content LTA, Incomplete/situation not covered. The Causal Factor (root cause) was attention being given to the wrong issues during the creation, reviews, verifications, and actual performance of HFEF OI-3165, which covers the need to perform the weekly source check and ensure placement of the probe prior to performing a “rabbit” transfer. This resulted in the hazard not being identified and mitigated in the procedure. Work activities

  16. Volcanic hazard at Vesuvius: An analysis for the revision of the current emergency plan

    Science.gov (United States)

    Rolandi, G.

    2010-01-01

    Mt Somma-Vesuvius is a composite volcano on the southern margin of the Campanian Plain which has been active since 39 ka BP and which poses a hazard and risk for the people living around its base. The volcano last erupted in 1944, and since this date has been in repose. As the level of volcanic risk perception is very high in the scientific community, in 1995 a hazard and risk evaluation, and evacuation plan, was published by the Italian Department of Civil Protection (Dipartimento della Protezione Civile). The plan considered the response to a worst-case scenario, taken to be a subplinian eruption on the scale of the 1631 AD eruption, and based on a volcanological reconstruction of this eruption, assumes that a future eruption will be preceded by about two weeks of ground uplift at the volcano's summit, and about one week of locally perceptible seismic activity. Moreover, by analogy with the 1631 events, the plan assumes that ash fall and pyroclastic flow should be recognized as the primary volcanic hazard. To design the response to this subplinian eruption, the emergency plan divided the Somma-Vesuvius region into three hazard zones affected by pyroclastic flows (Red Zone), tephra fall (Yellow and Green Zone), and floods (Blue Zone). The plan at present is the subject of much controversy, and, in our opinion, several assumptions need to be modified according to the following arguments: a) For the precursory unrest problem, recent scientific studies show that at present neither forecast capability is realistic, so that the assumption that a future eruption will be preceded by about two weeks of precursors needs to be modified; b) Regarding the exposure of the Vesuvius region to flow phenomena, the Red Zone presents much inconsistency near the outer border as it has been defined by the administrative limits of the eighteen-municipality area lying on the volcano. As this outer limit shows no uniformity, a pressing need exists to define appropriately the flow hazard

  17. Risk Assessment Analysis Using Process Hazard Analysis (PHA) and Safety Objective Analysis (SOA) at the Central Gathering Station (CGS) in Onshore Facilities

    Directory of Open Access Journals (Sweden)

    Dimas Jouhari

    2014-03-01

    Full Text Available Process safety is a major factor that has been widely discussed by the chemical industries in recent years. One semi-quantitative method that can be used to identify, analyze, and establish the level of hazard risk is Process Hazard Analysis (PHA) with Safety Objective Analysis (SOA). Hazard and Operability Studies (HAZOP) and What-If Analysis are qualitative hazard identification methods that are often applied simultaneously for PHA-SOA. Process Hazard Analysis (PHA) is a sequence of activities that identifies hazards, estimates consequences, estimates the likelihood of a process scenario together with its safeguards, and obtains a risk ranking that can be read from a 6x6 PHA matrix. Safety Objective Analysis (SOA), in turn, is a sequence of activities that builds on the scenario causes and consequences from the PHA and produces the required IPL (Independent Protective Layer) credits using a 6x6 SOA matrix. A risk ranking of 6 in the PHA assessment is categorized as safe if the existing safeguards are always available to reduce the risk arising from that scenario. However, not all safeguards can always be relied upon to reduce that risk. Therefore, additional analysis is needed to ensure that the risk of a scenario can be reduced. The safety analysis of a scenario with SOA yields IPL requirements that can be closed by confirming suitable safeguards as IPLs. The PHA-SOA assessment results for CGS 1, CGS 3, CGS 4, and CGS 5 show that severity and likelihood ratings differ between these CGSs even though their processes are identical, so a consistency analysis is needed. The results of this consistency analysis can serve as a guideline for conducting safety reviews in future risk assessment workshops, which industry typically holds every three to five years.
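
    The 6x6 PHA matrix described above maps a scenario's severity and likelihood ratings to a risk ranking. The lookup below illustrates the mechanics only; the matrix values and scale orientation are hypothetical, since each operator calibrates its own matrix.

        # Illustrative 6x6 risk-ranking lookup (hypothetical calibration).
        # RISK_MATRIX[severity - 1][likelihood - 1] -> ranking, 1 = highest risk ... 6 = lowest risk
        RISK_MATRIX = [
            [1, 1, 2, 2, 3, 4],  # severity 1 (most severe)
            [1, 2, 2, 3, 4, 4],
            [2, 2, 3, 4, 4, 5],
            [2, 3, 4, 4, 5, 5],
            [3, 4, 4, 5, 5, 6],
            [4, 4, 5, 5, 6, 6],  # severity 6 (least severe)
        ]

        def risk_ranking(severity, likelihood):
            """Return the risk ranking for a scenario; both inputs run from 1 to 6."""
            return RISK_MATRIX[severity - 1][likelihood - 1]

        # A scenario rated severity 2 / likelihood 5 ranks 4 here, prompting an SOA
        # check of whether additional independent protection layers are needed.
        print(risk_ranking(2, 5))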

  18. Flood Hazard Area

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  19. Flood Hazard Boundaries

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  20. Advancing Usability Evaluation through Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2005-07-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
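
    A minimal sketch of the calculation implied above, treating violated usability heuristics as performance shaping factors (PSFs) that multiply a nominal error probability; the nominal value, the multipliers, and the SPAR-H-style bounding adjustment are illustrative assumptions rather than the authors' calibration.

        # Usability error probability (UEP) from heuristic-based PSF multipliers (illustrative).
        NOMINAL_HEP = 0.001  # assumed nominal human error probability for the task

        heuristic_psfs = {
            "visibility_of_system_status": 5.0,   # multiplier > 1: heuristic violated
            "match_with_real_world": 1.0,         # nominal, no effect
            "error_prevention": 10.0,
        }

        composite = 1.0
        for multiplier in heuristic_psfs.values():
            composite *= multiplier

        # SPAR-H-style adjustment keeps the result a valid probability when the
        # composite multiplier is large.
        uep = (NOMINAL_HEP * composite) / (NOMINAL_HEP * (composite - 1.0) + 1.0)
        print(f"composite PSF = {composite:.1f}, UEP = {uep:.4f}")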

  1. Techniques for the Analysis of Human Movement.

    Science.gov (United States)

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photopgraphic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  2. Big Data Analysis of Human Genome Variations

    KAUST Repository

    Gojobori, Takashi

    2016-01-25

    Since the human genome draft sequence was first made public in 2000, genomic analyses have been intensively extended to the population level. The following three international projects are good examples of large-scale studies of human genome variations: 1) HapMap Data (1,417 individuals) (http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/2010-08_phaseII+III/forward/), 2) HGDP (Human Genome Diversity Project) Data (940 individuals) (http://www.hagsc.org/hgdp/files.html), 3) 1000 genomes Data (2,504 individuals) http://ftp.1000genomes.ebi.ac.uk/vol1/ftp/release/20130502/ If we can integrate all three datasets into a single volume of data, we should be able to conduct a more detailed analysis of human genome variations for a total number of 4,861 individuals (= 1,417+940+2,504 individuals). In fact, we successfully integrated these three data sets by use of information on the reference human genome sequence, and we conducted the big data analysis. In particular, we constructed a phylogenetic tree of about 5,000 human individuals at the genome level. As a result, we were able to identify clusters of ethnic groups, with detectable admixture, that could not be resolved by analysis of any one of the three data sets alone. Here, we report the outcome of this kind of big data analysis and discuss the evolutionary significance of human genomic variations. Note that the present study was conducted in collaboration with Katsuhiko Mineta and Kosuke Goto at KAUST.

  3. Space Mission Human Reliability Analysis (HRA) Project

    Science.gov (United States)

    Boyer, Roger

    2014-01-01

    The purpose of the Space Mission Human Reliability Analysis (HRA) Project is to extend current ground-based HRA risk prediction techniques to a long-duration, space-based tool. Ground-based HRA methodology has been shown to be a reasonable tool for short-duration space missions, such as Space Shuttle and lunar fly-bys. However, longer-duration deep-space missions, such as asteroid and Mars missions, will require the crew to be in space for as long as 400 to 900 days, with periods of extended autonomy and self-sufficiency. Current indications show that higher risk due to fatigue, physiological effects of extended low-gravity environments, and other factors may impact HRA predictions. For this project, Safety & Mission Assurance (S&MA) will work with Human Health & Performance (HH&P) to establish what is currently used to assess human reliability for human space programs, identify human performance factors that may be sensitive to long duration space flight, collect available historical data, and update current tools to account for performance shaping factors believed to be important to such missions. This effort will also contribute data to the Human Performance Data Repository and influence the Space Human Factors Engineering research risks and gaps (part of the HRP Program). An accurate risk predictor mitigates Loss of Crew (LOC) and Loss of Mission (LOM). The end result will be an updated HRA model that can effectively predict risk on long-duration missions.

  4. An Analysis of U.S. Army Health Hazard Assessments During the Acquisition of Military Materiel

    Science.gov (United States)

    2010-06-03

    Asthma - Allergic rhinitis - Pneumonia - Influenza, acute respiratory infections, tuberculosis - Acute or chronic toxicity - Physiological...OSHA, the Environmental Protection Agency (EPA), the Department of Health and Human Services (HHS), and the Food and Drug Administration (FDA) (Code...appropriate in many circumstances such as for food service equipment used at fixed dining facilities on established military installations. However, human

  5. Analysis of Operational Hazards and Safety Requirements for Traffic Aware Strategic Aircrew Requests (TASAR)

    Science.gov (United States)

    Koczo, Stefan, Jr.

    2013-01-01

    Safety analyses of the Traffic Aware Strategic Aircrew Requests (TASAR) Electronic Flight Bag (EFB) application are provided to establish its Failure Effects Classification which affects certification and operational approval requirements. TASAR was developed by NASA Langley Research Center to offer flight path improvement opportunities to the pilot during flight for operational benefits (e.g., reduced fuel, flight time). TASAR, using own-ship and network-enabled information concerning the flight and its environment, including weather and Air Traffic Control (ATC) system constraints, provides recommended improvements to the flight trajectory that the pilot can choose to request via Change Requests to ATC for revised clearance. This study reviews the Change Request process of requesting updates to the current clearance, examines the intended function of TASAR, and utilizes two safety assessment methods to establish the Failure Effects Classification of TASAR. Considerable attention has been given in this report to the identification of operational hazards potentially associated with TASAR.

  6. VEGETATION COVER ANALYSIS OF HAZARDOUS WASTE SITES IN UTAH AND ARIZONA USING HYPERSPECTRAL REMOTE SENSING

    Energy Technology Data Exchange (ETDEWEB)

    Serrato, M.; Jungho, I.; Jensen, J.; Jensen, R.; Gladden, J.; Waugh, J.

    2012-01-17

    Remote sensing technology can provide a cost-effective tool for monitoring hazardous waste sites. This study investigated the usability of HyMap airborne hyperspectral remote sensing data (126 bands at 2.3 x 2.3 m spatial resolution) to characterize the vegetation at U.S. Department of Energy uranium processing sites near Monticello, Utah and Monument Valley, Arizona. Grass and shrub species were mixed on an engineered disposal cell cover at the Monticello site while shrub species were dominant in the phytoremediation plantings at the Monument Valley site. The specific objectives of this study were to: (1) estimate leaf-area-index (LAI) of the vegetation using three different methods (i.e., vegetation indices, red-edge positioning (REP), and machine learning regression trees), and (2) map the vegetation cover using machine learning decision trees based on either the scaled reflectance data or mixture tuned matched filtering (MTMF)-derived metrics and vegetation indices. Regression trees resulted in the best calibration performance of LAI estimation (R² > 0.80). The use of REPs failed to accurately predict LAI (R² < 0.2). The use of the MTMF-derived metrics (matched filter scores and infeasibility) and a range of vegetation indices in decision trees improved the vegetation mapping when compared to the decision tree classification using just the scaled reflectance. Results suggest that hyperspectral imagery is useful for characterizing biophysical characteristics (LAI) and vegetation cover on capped hazardous waste sites. However, it is believed that the vegetation mapping would benefit from the use of higher spatial resolution hyperspectral data due to the small size of many of the vegetation patches (<1 m) found on the sites.
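
    As a small example of the vegetation-index route to LAI estimation mentioned above, the sketch below computes a normalized difference vegetation index (NDVI) from two hyperspectral bands; the band choices and reflectance values are placeholders, not the study's actual HyMap bands or regression models.

        # NDVI from near-infrared and red reflectance (placeholder values).
        import numpy as np

        def ndvi(nir, red):
            """NDVI = (NIR - red) / (NIR + red), evaluated per pixel."""
            nir = np.asarray(nir, dtype=float)
            red = np.asarray(red, dtype=float)
            return (nir - red) / (nir + red)

        red_band = np.array([[0.08, 0.10], [0.12, 0.07]])  # reflectance near 670 nm (assumed)
        nir_band = np.array([[0.45, 0.40], [0.30, 0.50]])  # reflectance near 800 nm (assumed)
        print(ndvi(nir_band, red_band))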

  7. Local models for rainstorm-induced hazard analysis on Mediterranean river-torrential geomorphological systems

    Directory of Open Access Journals (Sweden)

    N. Diodato

    2004-01-01

    Full Text Available Damaging hydrogeomorphological events are defined as one or more simultaneous phenomena (e.g. accelerated erosion, landslides, flash floods and river floods), occurring in a spatially and temporally random way and triggered by rainfall of varying intensity and extent. The storm rainfall values are highly dependent on weather conditions and relief. However, the impact of rainstorms in Mediterranean mountain environments depends mainly on climatic fluctuations in the short and long term, especially in rainfall quantity. An algorithm for the characterisation of this impact, called the Rainfall Hazard Index (RHI), is developed with a less expensive methodology. In RHI modelling, we assume that the river-torrential system has adapted to the natural hydrological regime, and a sudden fluctuation in this regime, especially one exceeding thresholds for an acceptable range of flexibility, may have disastrous consequences for the mountain environment. RHI integrates two rainfall variables based upon current and historical storm depth data, both of a fixed duration, and a dimensionless parameter representative of the degree of ecosystem flexibility. The approach was applied to a test site in the Benevento river-torrential landscape, Campania (Southern Italy). So, a database including data from 27 events which occurred during a 77-year period (1926-2002) was compared with the Benevento-station RHI(24h) for a qualitative validation. Trends in RHIx for annual maximum storms of duration 1, 3 and 24h were also examined. Little change is observed at the 3- and 24-h storm durations, but a significant increase results in the hazard of short and intense storms (RHIx(1h)), in agreement with a reduction in return period for extreme rainfall events.

  8. On the use of faults and background seismicity in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    Science.gov (United States)

    Selva, Jacopo; Lorito, Stefano; Basili, Roberto; Tonini, Roberto; Tiberti, Mara Monica; Romano, Fabrizio; Perfetti, Paolo; Volpe, Manuela

    2017-04-01

    Most of the SPTHA studies and applications rely on several working assumptions: i) the - mostly offshore - tsunamigenic faults are sufficiently well known; ii) the subduction zone earthquakes dominate the hazard; iii) and their location and geometry is sufficiently well constrained. Hence, a probabilistic model is constructed as regards the magnitude-frequency distribution and sometimes the slip distribution of earthquakes occurring on assumed known faults. Then, tsunami scenarios are usually constructed for all earthquakes location, sizes, and slip distributions included in the probabilistic model, through deterministic numerical modelling of tsunami generation, propagation and impact on realistic bathymetries. Here, we adopt a different approach (Selva et al., GJI, 2016) that releases some of the above assumptions, considering that i) also non-subduction earthquakes may contribute significantly to SPTHA, depending on the local tectonic context; ii) that not all the offshore faults are known or sufficiently well constrained; iii) and that the faulting mechanism of future earthquakes cannot be considered strictly predictable. This approach uses as much as possible information from known faults which, depending on the amount of available information and on the local tectonic complexity, among other things, are either modelled as Predominant Seismicity (PS) or as Background Seismicity (BS). PS is used when it is possible to assume sufficiently known geometry and mechanism (e.g. for the main subduction zones). Conversely, within the BS approach information on faults is merged with that on past seismicity, dominant stress regime, and tectonic characterisation, to determine a probability density function for the faulting mechanism. To illustrate the methodology and its impact on the hazard estimates, we present an application in the NEAM region (Northeast Atlantic, Mediterranean and connected seas), initially designed during the ASTARTE project and now applied for the

  9. Analysis of two-phase sampling data with semiparametric additive hazards models.

    Science.gov (United States)

    Sun, Yanqing; Qian, Xiyuan; Shou, Qiong; Gilbert, Peter B

    2017-07-01

    Under the case-cohort design introduced by Prentice (Biometrica 73:1-11, 1986), the covariate histories are ascertained only for the subjects who experience the event of interest (i.e., the cases) during the follow-up period and for a relatively small random sample from the original cohort (i.e., the subcohort). The case-cohort design has been widely used in clinical and epidemiological studies to assess the effects of covariates on failure times. Most statistical methods developed for the case-cohort design use the proportional hazards model, and few methods allow for time-varying regression coefficients. In addition, most methods disregard data from subjects outside of the subcohort, which can result in inefficient inference. Addressing these issues, this paper proposes an estimation procedure for the semiparametric additive hazards model with case-cohort/two-phase sampling data, allowing the covariates of interest to be missing for cases as well as for non-cases. A more flexible form of the additive model is considered that allows the effects of some covariates to be time varying while specifying the effects of others to be constant. An augmented inverse probability weighted estimation procedure is proposed. The proposed method allows utilizing the auxiliary information that correlates with the phase-two covariates to improve efficiency. The asymptotic properties of the proposed estimators are established. An extensive simulation study shows that the augmented inverse probability weighted estimation is more efficient than the widely adopted inverse probability weighted complete-case estimation method. The method is applied to analyze data from a preventive HIV vaccine efficacy trial.
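
    The inverse probability weighting that the augmented estimator builds on can be stated very compactly: cases keep weight 1, non-cases sampled into the subcohort are up-weighted by the inverse of the sampling fraction, and unsampled non-cases contribute weight 0. The sketch below shows only this weighting step (the augmentation term and the additive hazards fit are not reproduced), and the sampling fraction is hypothetical.

        # Inverse probability weights for case-cohort / two-phase sampling (sketch only).
        def ipw_weight(is_case, in_subcohort, sampling_fraction):
            """Phase-two analysis weight for one subject."""
            if is_case:
                return 1.0                       # covariates ascertained for all cases
            if in_subcohort:
                return 1.0 / sampling_fraction   # non-case sampled into the subcohort
            return 0.0                           # non-case outside the subcohort

        # Example with a 10% subcohort sampling fraction (hypothetical).
        subjects = [
            {"id": 1, "is_case": True, "in_subcohort": False},
            {"id": 2, "is_case": False, "in_subcohort": True},
            {"id": 3, "is_case": False, "in_subcohort": False},
        ]
        for s in subjects:
            print(s["id"], ipw_weight(s["is_case"], s["in_subcohort"], 0.10))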

  10. Some Open Issues on Rockfall Hazard Analysis in Fractured Rock Mass: Problems and Prospects

    Science.gov (United States)

    Ferrero, Anna Maria; Migliazza, Maria Rita; Pirulli, Marina; Umili, Gessica

    2016-09-01

    Risk is part of every sector of engineering design. It is a consequence of the uncertainties connected with the cognitive boundaries and with the natural variability of the relevant variables. In soil and rock engineering, in particular, uncertainties are linked to geometrical and mechanical aspects and the model used for the problem schematization. While the uncertainties due to the cognitive gaps could be filled by improving the quality of numerical codes and measuring instruments, nothing can be done to remove the randomness of natural variables, except defining their variability with stochastic approaches. Probabilistic analyses represent a useful tool to run parametric analyses and to identify the more significant aspects of a given phenomenon: They can be used for a rational quantification and mitigation of risk. The connection between the cognitive level and the probability of failure is at the base of the determination of hazard, which is often quantified through the assignment of safety factors. But these factors suffer from conceptual limits, which can be only overcome by adopting mathematical techniques with sound bases, not so used up to now (Einstein et al. in rock mechanics in civil and environmental engineering, CRC Press, London, 3-13, 2010; Brown in J Rock Mech Geotech Eng 4(3):193-204, 2012). The present paper describes the problems and the more reliable techniques used to quantify the uncertainties that characterize the large number of parameters that are involved in rock slope hazard assessment through a real case specifically related to rockfall. Limits of the existing approaches and future developments of the research are also provided.

  11. In silico analysis sheds light on the structural basis underlying the ribotoxicity of trichothecenes-A tool for supporting the hazard identification process.

    Science.gov (United States)

    Dellafiora, Luca; Galaverna, Gianni; Dall'Asta, Chiara

    2017-03-15

    Deoxynivalenol is a foodborne mycotoxin belonging to the trichothecene family that may cause severe injury in humans and animals. The inhibition of protein synthesis via interaction with the ribosome has been identified as a crucial mechanism underlying its toxic action. However, it is still not fully understood how and to what extent compounds belonging to the trichothecene family affect human and animal health. In turn, this scenario causes delays in managing the related health risk. Aimed at supporting the hazard identification process, in silico analysis may be a straightforward tool to investigate the structure-activity relationship of trichothecenes, identifying molecules of possible concern to carry forward in the risk assessment process. In this framework, this work investigated, through a molecular modeling approach, the structural basis underlying the interaction with the ribosome from a structure-activity relationship perspective. To identify further forms possibly involved in the total trichothecenes-dependent ribotoxic load, the model was challenged with a set of 16 trichothecene modified forms found in plants, fungi and animals, including compounds never tested before for the capability to bind and inhibit the ribosome. Among them, only regiospecific glycosylation at position 3 of the sesquiterpenoid scaffold (i.e. T-2 toxin-3-glucuronide, α and β isomers of T-2 toxin-3-glucoside and deoxynivalenol-3-glucuronide) was found to impair the interaction with the ribosome, while the other compounds tested (i.e. neosolaniol, nivalenol, fusarenon-X, diacetoxyscirpenol, NT-1 toxin, HT-2 toxin, 19- and 20-hydroxy-T-2 toxin, T-2 toxin triol and tetraol, and 15-deacetyl-T-2 toxin) were found potentially able to inhibit the ribosome. Accordingly, they should be included with high priority in further risk assessment studies in order to better characterize the trichothecene-related hazard. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Resampling methods for evaluating the uncertainty of the nonparametric magnitude distribution estimation in the Probabilistic Seismic Hazard Analysis

    Science.gov (United States)

    Orlecka-Sikora, Beata

    2008-08-01

    The cumulative distribution function (CDF) of magnitude of seismic events is one of the most important probabilistic characteristics in Probabilistic Seismic Hazard Analysis (PSHA). The magnitude distribution of mining-induced seismicity is complex. Therefore, it is estimated using kernel nonparametric estimators. Because of its model-free character the nonparametric approach cannot, however, provide confidence interval estimates for the CDF using the classical methods of mathematical statistics. To assess errors in the estimation of seismic event magnitudes, and thereby in the evaluation of seismic hazard parameters in the nonparametric approach, we propose the use of resampling methods. Resampling techniques applied to a single dataset provide many replicas of this sample, which preserve its probabilistic properties. In order to estimate the confidence intervals for the CDF of magnitude, we have developed an algorithm based on the bias-corrected and accelerated method (BCa method). This procedure uses the smoothed bootstrap and second-order bootstrap samples. We refer to this algorithm as the iterated BCa method. The algorithm performance is illustrated through the analysis of Monte Carlo simulated seismic event catalogues and actual data from an underground copper mine in the Legnica-Głogów Copper District in Poland. The studies show that the iterated BCa technique provides satisfactory results regardless of the sample size and actual shape of the magnitude distribution.
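
    For readers who want to experiment with the general idea, the sketch below computes a plain BCa bootstrap confidence interval for one point of the magnitude CDF using SciPy; it does not reproduce the smoothed, iterated BCa algorithm of the paper, and the synthetic catalogue is invented.

        # Plain BCa bootstrap CI for the empirical CDF of magnitude at M = 2.0 (sketch).
        import numpy as np
        from scipy.stats import bootstrap

        rng = np.random.default_rng(0)
        magnitudes = rng.exponential(scale=0.4, size=500) + 1.0  # synthetic catalogue

        def cdf_at_2(sample, axis=-1):
            """Empirical probability that magnitude does not exceed 2.0."""
            return np.mean(np.asarray(sample) <= 2.0, axis=axis)

        res = bootstrap((magnitudes,), cdf_at_2, confidence_level=0.95,
                        method="BCa", n_resamples=2000, random_state=rng)
        print(res.confidence_interval)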

  13. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  14. An assessment of hazards caused by electromagnetic interaction on humans present near short-wave physiotherapeutic devices of various types including hazards for users of electronic active implantable medical devices (AIMD).

    Science.gov (United States)

    Karpowicz, Jolanta; Gryz, Krzysztof

    2013-01-01

    Leakage of electromagnetic fields (EMF) from short-wave radiofrequency physiotherapeutic diathermies (SWDs) may cause health and safety hazards affecting unintentionally exposed workers (W) or general public (GP) members (assisting patient exposed during treatment or presenting there for other reasons). Increasing use of electronic active implantable medical devices (AIMDs), by patients, attendants, and workers, needs attention because dysfunctions of these devices may be caused by electromagnetic interactions. EMF emitted by 12 SWDs (with capacitive or inductive applicators) were assessed following international guidelines on protection against EMF exposure (International Commission on Nonionizing Radiation Protection for GP and W, new European directive 2013/35/EU for W, European Recommendation for GP, and European Standard EN 50527-1 for AIMD users). Direct EMF hazards for humans near inductive applicators were identified at a distance not exceeding 45 cm for W or 62 cm for GP, but for AIMD users up to 90 cm (twice longer than that for W and 50% longer than that for GP because EMF is pulsed modulated). Near capacitive applicators emitting continuous wave, the corresponding distances were: 120 cm for W or 150 cm for both-GP or AIMD users. This assessment does not cover patients who undergo SWD treatment (but it is usually recommended for AIMD users to be careful with EMF treatment).

  15. An Assessment of Hazards Caused by Electromagnetic Interaction on Humans Present near Short-Wave Physiotherapeutic Devices of Various Types Including Hazards for Users of Electronic Active Implantable Medical Devices (AIMD)

    Directory of Open Access Journals (Sweden)

    Jolanta Karpowicz

    2013-01-01

    Leakage of electromagnetic fields (EMF) from short-wave radiofrequency physiotherapeutic diathermies (SWDs) may cause health and safety hazards affecting unintentionally exposed workers (W) or general public (GP) members (assisting patient exposed during treatment or presenting there for other reasons). Increasing use of electronic active implantable medical devices (AIMDs), by patients, attendants, and workers, needs attention because dysfunctions of these devices may be caused by electromagnetic interactions. EMF emitted by 12 SWDs (with capacitive or inductive applicators) were assessed following international guidelines on protection against EMF exposure (International Commission on Nonionizing Radiation Protection for GP and W, new European directive 2013/35/EU for W, European Recommendation for GP, and European Standard EN 50527-1 for AIMD users). Direct EMF hazards for humans near inductive applicators were identified at a distance not exceeding 45 cm for W or 62 cm for GP, but for AIMD users up to 90 cm (twice as long as that for W and 50% longer than that for GP, because the EMF is pulse modulated). Near capacitive applicators emitting continuous wave, the corresponding distances were 120 cm for W or 150 cm for both GP and AIMD users. This assessment does not cover patients who undergo SWD treatment (but it is usually recommended for AIMD users to be careful with EMF treatment).

  16. Food-Safety Hazards in the Pork Chain in Nagaland, North East India: Implications for Human Health

    OpenAIRE

    Anna Sophie Fahrion; Lanu Jamir; Kenivole Richa; Sonuwara Begum; Vilatuo Rutsa; Simon Ao; Padmakumar, Varijaksha P.; Ram Pratim Deka; Delia Grace

    2013-01-01

    Pork occupies an important place in the diet of the population of Nagaland, one of the North East Indian states. We carried out a pilot study along the pork meat production chain, from live animal to end consumer. The goal was to obtain information about the presence of selected food-borne hazards in pork in order to assess the risk deriving from these hazards to the health of the local consumers and make recommendations for improving food safety. A secondary objective was to evaluate the ut...

  17. An Evaluation of the Effectiveness of Efforts over the last Two Decades to Reduce Human Losses due to Natural Hazards, and A Proposal for Future Efforts (Invited)

    Science.gov (United States)

    Tucker, B. E.; Chakos, A.

    2009-12-01

    While there is evidence that efforts over the last 20 years to reduce human and fiscal losses due to natural hazards have been effective, there is also evidence that, despite these efforts, we can expect large and, perhaps, even increasing losses in the future. If this conclusion is correct—and unacceptable—then what should be done differently to reduce these losses? One piece of the answer can be found through analyzing why the efforts to date have not been more effective. Another piece can be found through examining the characteristics of successful social movements. For a social movement is what we are talking about when we advocate changing human behavior in order to reduce risk from natural hazards. We cannot attribute the disappointingly modest success of past risk reduction efforts to inadequate science or engineering: the reduction of natural disaster losses in both the U.S. and Japan over the last century indicates that humans possess the required scientific and engineering expertise to reduce the risk of natural hazards, and reduce it significantly. If the problem is that this expertise is not being applied outside of Japan and the U.S., where the risk is concentrated, then we need to understand why. There are numerous examples, after all, of widespread, rapid adoption of modern technologies (such as the internet), once these technologies were perceived to be beneficial. Yet not only have earthquake engineering advances failed to be adopted where they are needed, even existing building codes are often not followed. To understand this behavioral paradox better, we turn to human psychology. In the last several years, Nicholas Kristof of the New York Times has invoked the work of psychologists, in order to explore how our brains may not have yet evolved to respond properly to certain types of modern risks. Kristof refers, for example, to Professor Daniel Gilbert, who argues that threats that will catch our attention will either be personalized, imminent

  18. A proxy analysis of urban air quality hazards in Bergen, Norway under a changing climate.

    Science.gov (United States)

    Wolf, Tobias; Esau, Igor; Reuder, Joachim

    2014-05-01

    The urban air quality in Bergen, Norway is characterized by clean air throughout most of the year, interrupted by short episodes of hazardous pollution levels, especially in close proximity to major road-emission sources. These pollution episodes are linked to winter-time anti-cyclonic weather conditions with persistent stable temperature stratification (inversions) in the Atmospheric Boundary Layer. Although the pollution episodes are local events, they are linked to large-scale persistent blockings in the atmospheric circulation. Here we present an atmospheric circulation proxy for the pollution episodes based on the ECMWF ERA-Interim reanalysis. The proxy is based on local 3-hourly instantaneous wind speeds and directions at the 1000 hPa pressure level, and 1-day running mean temperature deviations at 2 m above ground from the 1-day running mean temperatures averaged over the full ERA-Interim record length. We tuned the thresholds for each quantity to the occurrence of events with an hourly mean NO2 concentration > 150 μg/m3 at a high-pollution reference station. A condition on cloud cover had little effect, and sea-level pressure was not applicable. High-pollution episodes predicted during typical low-traffic days (Sundays, Christmas, New Year) were removed. The final proxy had a detection rate of 82 %, a false alarm rate of 77 % and a correct null prediction rate of 96 %. The high false alarm rate was expected because of the relaxed thresholds chosen in order to include a large fraction of possible states of atmospheric circulation that lead to hazardous air quality. Additionally, the false alarm rate was high because no constraint on the persistence of adverse meteorological conditions was set, and because of the high variability of traffic, which does not always lead to hazardous pollution levels even when the atmospheric circulation would allow for it. The Scandinavian index, an often used proxy for the occurrence of atmospheric circulation
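
    A minimal sketch of this kind of threshold proxy is given below; the wind-speed and temperature-anomaly thresholds are assumed placeholders rather than the paper's tuned values, the wind-direction condition is omitted for brevity, and the data are synthetic.

      import numpy as np

      def pollution_proxy(wind_speed, temp_anomaly, max_wind=2.0, min_anomaly=3.0):
          """Flag hours whose 1000 hPa wind speed [m/s] and 2 m temperature anomaly [K]
          suggest stagnant, inversion-prone conditions (thresholds are illustrative)."""
          return (wind_speed < max_wind) & (temp_anomaly > min_anomaly)

      def proxy_scores(predicted, no2_hourly, limit=150.0):
          """Detection rate, false alarm rate and correct-null rate against NO2 > limit."""
          observed = no2_hourly > limit                  # observed exceedance events
          detection = predicted[observed].mean()         # hits among observed events
          false_alarm = (~observed[predicted]).mean()    # non-events among raised alarms
          correct_null = (~observed[~predicted]).mean()  # quiet hours correctly passed
          return detection, false_alarm, correct_null

      # Hypothetical usage with synthetic data.
      rng = np.random.default_rng(1)
      wind = rng.gamma(2.0, 2.0, size=1000)
      anomaly = rng.normal(0.0, 3.0, size=1000)
      no2 = 40 + 60 * (wind < 2.0) * np.maximum(anomaly, 0) + rng.normal(0, 20, size=1000)
      print(proxy_scores(pollution_proxy(wind, anomaly), no2))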

  19. Evaluation of the ToxRTool's ability to rate the reliability of toxicological data for human health hazard assessments

    Science.gov (United States)

    Regulatory agencies often utilize results from peer-reviewed publications for hazard assessments. A problem in doing so is the lack of well-accepted tools to objectively, efficiently and systematically assess the quality of published toxicological studies. Herein, we evaluated the...

  20. Heavy metal ions in wines: meta-analysis of target hazard quotients reveal health risks

    Directory of Open Access Journals (Sweden)

    Petróczi Andrea

    2008-10-01

    Background: Metal ions such as iron and copper are among the key nutrients that must be provided by dietary sources. Numerous foodstuffs have been evaluated for their contributions to the recommended daily allowance, both to guide satisfactory intake and also to prevent overexposure. In the case of heavy metal ions, the focus is often on exposure to potentially toxic levels of ions such as lead and mercury. The aim of this study is to determine target hazard quotients (THQ) from literature reports giving empirical levels of metal ions in table wines, using the reference upper safe limit value. Contributions to the THQ value were calculated for seven metal ions along with total values for each wine. Results: The THQ values were determined as ranges from previously reported ranges of metal ion concentrations and were frequently high enough to cause concern. Apart from the wines selected from Italy, Brazil and Argentina, all other wines exhibited THQ values significantly greater than one, indicating levels of risk. The levels of vanadium, copper and manganese had the highest impact on THQ measures. Typical potential maximum THQ values ranged from 50 to 200, with Hungarian and Slovakian wines reaching 300. THQ values for a sample of red and white wines were high for both, with values ranging from 30 to 80 for females based on a 250 mL glass per day. Conclusion: The THQ values calculated are concerning in that they are mainly above the safe level of THQ
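
    Studies of this kind typically compute the target hazard quotient with the standard USEPA-style formula THQ = (EF x ED x IR x C) / (RfD x BW x AT). The short sketch below illustrates the arithmetic; the concentrations, reference doses, and exposure parameters are placeholders, not values from the paper, and THQ > 1 is the usual flag for potential risk.

      def target_hazard_quotient(conc_mg_per_l, intake_l_per_day, rfd_mg_per_kg_day,
                                 body_weight_kg=60.0, exposure_freq_days=365,
                                 exposure_dur_years=30, averaging_time_days=365 * 30):
          """THQ = (EF * ED * IR * C) / (RfD * BW * AT); THQ > 1 flags a potential risk."""
          return (exposure_freq_days * exposure_dur_years * intake_l_per_day * conc_mg_per_l) / \
                 (rfd_mg_per_kg_day * body_weight_kg * averaging_time_days)

      # Hypothetical example: a 250 mL glass per day with assumed metal levels (mg/L)
      # and assumed oral reference doses (mg/kg/day); the total THQ sums over the metals.
      wine = {"Cu": 1.5, "Mn": 2.0, "V": 0.15}
      rfd = {"Cu": 0.04, "Mn": 0.14, "V": 0.007}
      total_thq = sum(target_hazard_quotient(wine[m], 0.25, rfd[m]) for m in wine)
      print(round(total_thq, 2))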

  1. Physical Volcanology and Hazard Analysis of a Young Volcanic Field: Black Rock Desert, Utah, USA

    Science.gov (United States)

    Hintz, A. R.

    2009-05-01

    The Black Rock Desert volcanic field, located in west-central Utah, consists of ~30 small-volume monogenetic volcanoes with compositions ranging from small rhyolite domes to large basaltic lava flow fields. The field has exhibited bimodal volcanism for > 9 Ma, with the most recent eruption, at Ice Springs volcano, ~600 yrs ago. This eruptive history, together with ongoing geothermal activity, attests to the usefulness of a hazard assessment. The likelihood of a future eruption in this area has been calculated to be ~8% over