WorldWideScience

Sample records for models lahar assessment

  1. A Framework for Probabilistic Multi-Hazard Assessment of Rain-Triggered Lahars Using Bayesian Belief Networks

    Directory of Open Access Journals (Sweden)

    Pablo Tierz

    2017-09-01

    Full Text Available Volcanic water-sediment flows, commonly known as lahars, can often pose a higher threat to population and infrastructure than primary volcanic hazardous processes such as tephra fallout and Pyroclastic Density Currents (PDCs). Lahars are volcaniclastic flows of water, volcanic debris and entrained sediments that can travel long distances from their source, causing severe damage by impact and burial. Lahars are frequently triggered by intense or prolonged rainfall occurring after explosive eruptions, and their occurrence depends on numerous factors including the spatio-temporal rainfall characteristics, the spatial distribution and hydraulic properties of the tephra deposit, and the pre- and post-eruption topography. Modeling (and forecasting) such a complex system requires the quantification of aleatory variability in the lahar triggering and propagation. To fulfill this goal, we develop a novel framework for probabilistic hazard assessment of lahars within a multi-hazard environment, based on coupling a versatile probabilistic model for lahar triggering (a Bayesian Belief Network: Multihaz) with a dynamic physical model for lahar propagation (LaharFlow). Multihaz allows us to estimate the probability of lahars of different volumes occurring by merging varied information about regional rainfall, scientific knowledge on lahar triggering mechanisms and, crucially, probabilistic assessment of available pyroclastic material from tephra fallout and PDCs. LaharFlow propagates the aleatory variability modeled by Multihaz into hazard footprints of lahars. We apply our framework to Somma-Vesuvius (Italy) because: (1) the volcano is strongly lahar-prone based on its previous activity, (2) there are many possible source areas for lahars, and (3) there is high density of population nearby. Our results indicate that the size of the eruption preceding the lahar occurrence and the spatial distribution of tephra accumulation have a paramount role in the lahar

  2. Modeling the October 2005 lahars at Panabaj (Guatemala)

    Science.gov (United States)

    Charbonnier, S. J.; Connor, C. B.; Connor, L. J.; Sheridan, M. F.; Oliva Hernández, J. P.; Richardson, J. A.

    2018-01-01

    An extreme rainfall event in October of 2005 triggered two deadly lahars on the flanks of Tolimán volcano (Guatemala) that caused many fatalities in the village of Panabaj. We mapped the deposits of these lahars, then developed computer simulations of the lahars using the geologic data and compared simulated area inundated by the flows to mapped area inundated. Computer simulation of the two lahars was dramatically improved after calibration with geological data. Specifically, detailed field measurements of flow inundation area, flow thickness, flow direction, and velocity estimates, collected after lahar emplacement, were used to calibrate the rheological input parameters for the models, including deposit volume, yield strength, sediment and water concentrations, and Manning roughness coefficients. Simulations of the two lahars, with volumes of 240,200 ± 55,400 and 126,000 ± 29,000 m³, using the FLO-2D computer program produced models of lahar runout within 3% of measured runouts and produced reasonable estimates of flow thickness and velocity along the lengths of the simulated flows. We compare areas inundated using the Jaccard fit, model sensitivity, and model precision metrics, all related to Bayes' theorem. These metrics show that false negatives (areas inundated by the observed lahar but not simulated) and false positives (areas not inundated by the observed lahar but where inundation was simulated) are reduced using a model calibrated by rheology. The metrics offer a procedure for tuning model performance that will enhance model accuracy and make numerical models a more robust tool for natural hazard reduction.
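
    A brief sketch of how such footprint-comparison metrics can be computed, assuming the conventional definitions (Jaccard fit = true positives / union of observed and simulated areas; sensitivity = true positives / observed area; precision = true positives / simulated area); the cell sets below are hypothetical, not data from the Panabaj study:

      def inundation_metrics(observed, simulated):
          """Compare observed and simulated inundation footprints given as sets of raster-cell IDs."""
          tp = len(observed & simulated)   # cells inundated in both
          fn = len(observed - simulated)   # observed but not simulated (false negatives)
          fp = len(simulated - observed)   # simulated but not observed (false positives)
          return {
              "jaccard_fit": tp / (tp + fp + fn),
              "sensitivity": tp / (tp + fn),
              "precision": tp / (tp + fp),
          }

      # Hypothetical footprints expressed as cell indices:
      observed = {1, 2, 3, 4, 5}
      simulated = {3, 4, 5, 6}
      print(inundation_metrics(observed, simulated))
      # {'jaccard_fit': 0.5, 'sensitivity': 0.6, 'precision': 0.75}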

  3. Probabilistic Volcanic Multi-Hazard Assessment at Somma-Vesuvius (Italy): coupling Bayesian Belief Networks with a physical model for lahar propagation

    Science.gov (United States)

    Tierz, Pablo; Woodhouse, Mark; Phillips, Jeremy; Sandri, Laura; Selva, Jacopo; Marzocchi, Warner; Odbert, Henry

    2017-04-01

    Volcanoes are extremely complex physico-chemical systems where magma formed at depth breaks through to the planet's surface resulting in major hazards from local to global scales. Volcano physics is dominated by non-linearities and complicated spatio-temporal interrelationships, which make volcanic hazards stochastic (i.e. not deterministic) by nature. In this context, probabilistic assessments are required to quantify the large uncertainties related to volcanic hazards. Moreover, volcanoes are typically multi-hazard environments where different hazardous processes can occur either simultaneously or in succession. In particular, explosive volcanoes are able to accumulate, through tephra fallout and Pyroclastic Density Currents (PDCs), large amounts of pyroclastic material into the drainage basins surrounding the volcano. This addition of fresh particulate material alters the local/regional hydrogeological equilibrium and increases the frequency and magnitude of sediment-rich aqueous flows, commonly known as lahars. The initiation and volume of rain-triggered lahars may depend on: rainfall intensity and duration; antecedent rainfall; terrain slope; thickness, permeability and hydraulic diffusivity of the tephra deposit; etc. Quantifying these complex interrelationships (and their uncertainties), in a tractable manner, requires a structured but flexible probabilistic approach. A Bayesian Belief Network (BBN) is a directed acyclic graph that allows the representation of the joint probability distribution for a set of uncertain variables in a compact and efficient way, by exploiting unconditional and conditional independences between these variables. Once constructed and parametrized, the BBN uses Bayesian inference to perform causal (e.g. forecast) and/or evidential reasoning (e.g. explanation) about query variables, given some evidence. In this work, we illustrate how BBNs can be used to model the influence of several variables on the generation of rain-triggered lahars
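
    As a minimal illustration of the BBN idea described above (not the actual Multihaz structure or numbers), the joint distribution factorises over the graph, e.g. P(R, T, L) = P(R)·P(T)·P(L | R, T) for rainfall R, tephra thickness T and lahar occurrence L, and queries are answered by summing that product over the unobserved variables:

      P_R = {"high": 0.3, "low": 0.7}    # hypothetical rainfall-intensity prior
      P_T = {"thick": 0.4, "thin": 0.6}  # hypothetical tephra-thickness prior
      P_L_given_RT = {                   # hypothetical P(lahar | rainfall, tephra)
          ("high", "thick"): 0.8, ("high", "thin"): 0.4,
          ("low", "thick"): 0.2, ("low", "thin"): 0.05,
      }

      def p_lahar(rain=None):
          """P(lahar), optionally conditioned on an observed rainfall-intensity state."""
          rains = [rain] if rain is not None else list(P_R)
          num = sum(P_R[r] * P_T[t] * P_L_given_RT[(r, t)] for r in rains for t in P_T)
          den = sum(P_R[r] for r in rains)
          return num / den

      print(p_lahar())        # prior probability of a lahar (~0.25)
      print(p_lahar("high"))  # causal reasoning given evidence of intense rainfall (~0.56)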

  4. Water, ice and mud: Lahars and lahar hazards at ice- and snow-clad volcanoes

    Science.gov (United States)

    Waythomas, Christopher F.

    2014-01-01

    Large-volume lahars are significant hazards at ice- and snow-covered volcanoes. Hot eruptive products produced during explosive eruptions can generate a substantial volume of melt water that quickly evolves into highly mobile flows of ice, sediment and water. At present it is difficult to predict the size of lahars that can form at ice- and snow-covered volcanoes due to their complex flow character and behaviour. However, advances in experiments and numerical approaches are producing new conceptual models and new methods for hazard assessment. Eruption-triggered lahars that are ice-dominated leave behind thin, almost unrecognizable sedimentary deposits, making them likely to be under-represented in the geological record.

  5. Modeling lahar behavior and hazards

    Science.gov (United States)

    Manville, Vernon; Major, Jon J.; Fagents, Sarah A.

    2013-01-01

    Lahars are highly mobile mixtures of water and sediment of volcanic origin that are capable of traveling tens to > 100 km at speeds exceeding tens of km hr⁻¹. Such flows are among the most serious ground-based hazards at many volcanoes because of their sudden onset, rapid advance rates, long runout distances, high energy, ability to transport large volumes of material, and tendency to flow along existing river channels where populations and infrastructure are commonly concentrated. They can grow in volume and peak discharge through erosion and incorporation of external sediment and/or water, inundate broad areas, and leave deposits many meters thick. Furthermore, lahars can recur for many years to decades after an initial volcanic eruption, as fresh pyroclastic material is eroded and redeposited during rainfall events, resulting in a spatially and temporally evolving hazard. Improving understanding of the behavior of these complex, gravitationally driven, multi-phase flows is key to mitigating the threat to communities at lahar-prone volcanoes. However, their complexity and evolving nature pose significant challenges to developing the models of flow behavior required for delineating their hazards and hazard zones.

  6. Assessing lahars from ice-capped volcanoes using ASTER satellite data, the SRTM DTM and two different flow models: case study on Iztaccíhuatl (Central Mexico)

    Directory of Open Access Journals (Sweden)

    D. Schneider

    2008-06-01

    Full Text Available Lahars frequently affect the slopes of ice-capped volcanoes. They can be triggered by volcano-ice interactions during eruptions but also by processes such as intense precipitation or by outbursts of glacial water bodies not directly related to eruptive activity. We use remote sensing, GIS and lahar models in combination with ground observations for an initial lahar hazard assessment on Iztaccíhuatl volcano (5230 m a.s.l.), considering also possible future developments of the glaciers on the volcano. Observations of the glacial extent are important for estimations of future hazard scenarios, especially in a rapidly changing tropical glacial environment. In this study, analysis of the glaciers on Iztaccíhuatl shows a dramatic retreat during the last 150 years: the glaciated area in 2007 corresponds to only 4% of the one in 1850 AD, and the glaciers are not expected to survive beyond the year 2020. Most of the glacial retreat is considered to be related to climate change, but in-situ observations suggest that a contribution from geo- and hydrothermal heat flow at the summit-crater area cannot be ruled out, as emphasized by fumarolic activity documented in a former study. However, the development of crater lakes and englacial water reservoirs is supposed to be a more realistic scenario for lahar generation than sudden ice melting by vigorous volcano-ice interaction. Model calculations show that possible outburst floods have to be larger than ~5×10⁵ m³ or achieve an H/L ratio (height/runout length) of 0.2 or lower in order to reach the populated lower flanks. This threshold volume corresponds to melting 2.4% of Iztaccíhuatl's total ice volume in 2007, assuming 40% water and 60% volumetric debris content of a potential lahar. The model sensitivity analysis reveals important effects of the generic type of the Digital Terrain Model (DTM) used on the results. As a consequence, the predicted affected areas can vary significantly. For such
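
    As a rough illustration of the H/L (mobility) criterion quoted above, the maximum runout implied by a given H/L ratio is simply L = H / (H/L); the drop height used below is a hypothetical value, not one taken from the study:

      H_over_L = 0.2        # mobility threshold needed to reach the populated lower flanks
      drop_height_m = 3000  # hypothetical vertical drop from the glacierised summit area
      runout_m = drop_height_m / H_over_L
      print(runout_m / 1000)  # 15.0 km potential runout at H/L = 0.2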

  7. Real-time prediction of rain-triggered lahars: incorporating seasonality and catchment recovery

    Science.gov (United States)

    Jones, Robbie; Manville, Vern; Peakall, Jeff; Froude, Melanie J.; Odbert, Henry M.

    2017-12-01

    Rain-triggered lahars are a significant secondary hydrological and geomorphic hazard at volcanoes where unconsolidated pyroclastic material produced by explosive eruptions is exposed to intense rainfall, often occurring for years to decades after the initial eruptive activity. Previous studies have shown that secondary lahar initiation is a function of rainfall parameters, source material characteristics and time since eruptive activity. In this study, probabilistic rain-triggered lahar forecasting models are developed using the lahar occurrence and rainfall record of the Belham River valley at the Soufrière Hills volcano (SHV), Montserrat, collected between April 2010 and April 2012. In addition to the use of peak rainfall intensity (PRI) as a base forecasting parameter, considerations for the effects of rainfall seasonality and catchment evolution upon the initiation of rain-triggered lahars and the predictability of lahar generation are also incorporated into these models. Lahar probability increases with peak 1 h rainfall intensity throughout the 2-year dataset and is higher under given rainfall conditions in year 1 than year 2. The probability of lahars is also enhanced during the wet season, when large-scale synoptic weather systems (including tropical cyclones) are more common and antecedent rainfall and thus levels of deposit saturation are typically increased. The incorporation of antecedent conditions and catchment evolution into logistic-regression-based rain-triggered lahar probability estimation models is shown to enhance model performance and displays the potential for successful real-time prediction of lahars, even in areas featuring strongly seasonal climates and temporal catchment recovery.
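
    A minimal sketch of a logistic-regression lahar-probability model of the kind described above; the predictor set (peak 1 h rainfall intensity, antecedent rainfall, and time since eruption as a proxy for catchment recovery) and the coefficient values are illustrative assumptions, not the fitted Belham valley model:

      import math

      def lahar_probability(pri_mm_h, antecedent_mm, years_since_eruption,
                            b0=-4.0, b_pri=0.15, b_ant=0.01, b_time=-0.8):
          """Logistic model: p = 1 / (1 + exp(-(b0 + b1*PRI + b2*antecedent + b3*t)))."""
          z = b0 + b_pri * pri_mm_h + b_ant * antecedent_mm + b_time * years_since_eruption
          return 1.0 / (1.0 + math.exp(-z))

      # Same storm in year 1 vs year 2 after the eruption: the probability drops as the
      # catchment recovers, mirroring the behaviour reported in the abstract.
      print(lahar_probability(pri_mm_h=30, antecedent_mm=100, years_since_eruption=1))  # ~0.67
      print(lahar_probability(pri_mm_h=30, antecedent_mm=100, years_since_eruption=2))  # ~0.48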

  8. Real-time prediction of rain-triggered lahars: incorporating seasonality and catchment recovery

    Directory of Open Access Journals (Sweden)

    R. Jones

    2017-12-01

    Full Text Available Rain-triggered lahars are a significant secondary hydrological and geomorphic hazard at volcanoes where unconsolidated pyroclastic material produced by explosive eruptions is exposed to intense rainfall, often occurring for years to decades after the initial eruptive activity. Previous studies have shown that secondary lahar initiation is a function of rainfall parameters, source material characteristics and time since eruptive activity. In this study, probabilistic rain-triggered lahar forecasting models are developed using the lahar occurrence and rainfall record of the Belham River valley at the Soufrière Hills volcano (SHV), Montserrat, collected between April 2010 and April 2012. In addition to the use of peak rainfall intensity (PRI) as a base forecasting parameter, considerations for the effects of rainfall seasonality and catchment evolution upon the initiation of rain-triggered lahars and the predictability of lahar generation are also incorporated into these models. Lahar probability increases with peak 1 h rainfall intensity throughout the 2-year dataset and is higher under given rainfall conditions in year 1 than year 2. The probability of lahars is also enhanced during the wet season, when large-scale synoptic weather systems (including tropical cyclones) are more common and antecedent rainfall and thus levels of deposit saturation are typically increased. The incorporation of antecedent conditions and catchment evolution into logistic-regression-based rain-triggered lahar probability estimation models is shown to enhance model performance and displays the potential for successful real-time prediction of lahars, even in areas featuring strongly seasonal climates and temporal catchment recovery.

  9. Community Exposure to Lahar Hazards from Mount Rainier, Washington

    Science.gov (United States)

    Wood, Nathan J.; Soulard, Christopher E.

    2009-01-01

    Geologic evidence of past events and inundation modeling of potential events suggest that lahars associated with Mount Rainier, Washington, are significant threats to downstream development. To mitigate potential impacts of future lahars and educate at-risk populations, officials need to understand how communities are vulnerable to these fast-moving debris flows and which individuals and communities may need assistance in preparing for and responding to an event. To support local risk-reduction planning for future Mount Rainier lahars, this study documents the variations among communities in King, Lewis, Pierce, and Thurston Counties in the amount and types of developed land, human populations, economic assets, and critical facilities in a lahar-hazard zone. The lahar-hazard zone in this study is based on the behavior of the Electron Mudflow, a lahar that traveled along the Puyallup River approximately 500 years ago and was due to a slope failure on the west flank of Mount Rainier. This lahar-hazard zone contains 78,049 residents, of whom 11 percent are more than 65 years of age, 21 percent do not live in cities or unincorporated towns, and 39 percent of the households are renter occupied. The lahar-hazard zone contains 59,678 employees (4 percent of the four-county labor force) at 3,890 businesses that generate $16 billion in annual sales (4 and 7 percent, respectively, of totals in the four-county area) and tax parcels with a combined total value of $8.8 billion (2 percent of the study-area total). Employees in the lahar-hazard zone are primarily in businesses related to manufacturing, retail trade, transportation and warehousing, wholesale trade, and construction. Key road and rail corridors for the region are in the lahar-hazard zone, which could result in significant indirect economic losses for businesses that rely on these networks, such as the Port of Tacoma. Although occupancy values are not known for each site, the lahar-hazard zone contains numerous

  10. Reducing risk from lahar hazards: concepts, case studies, and roles for scientists

    Science.gov (United States)

    Pierson, Thomas C.; Wood, Nathan J.; Driedger, Carolyn L.

    2014-01-01

    Lahars are rapid flows of mud-rock slurries that can occur without warning and catastrophically impact areas more than 100 km downstream of source volcanoes. Strategies to mitigate the potential for damage or loss from lahars fall into four basic categories: (1) avoidance of lahar hazards through land-use planning; (2) modification of lahar hazards through engineered protection structures; (3) lahar warning systems to enable evacuations; and (4) effective response to and recovery from lahars when they do occur. Successful application of any of these strategies requires an accurate understanding and assessment of the hazard, an understanding of the applicability and limitations of the strategy, and thorough planning. The human and institutional components leading to successful application can be even more important: engagement of all stakeholders in hazard education and risk-reduction planning; good communication of hazard and risk information among scientists, emergency managers, elected officials, and the at-risk public during crisis and non-crisis periods; sustained response training; and adequate funding for risk-reduction efforts. This paper reviews a number of methods for lahar-hazard risk reduction, examines the limitations and tradeoffs, and provides real-world examples of their application in the U.S. Pacific Northwest and in other volcanic regions of the world. An overriding theme is that lahar-hazard risk reduction cannot be effectively accomplished without the active, impartial involvement of volcano scientists, who are willing to assume educational, interpretive, and advisory roles to work in partnership with elected officials, emergency managers, and vulnerable communities.

  11. Hydrological control of large hurricane-induced lahars: evidence from rainfall-runoff modeling, seismic and video monitoring

    Science.gov (United States)

    Capra, Lucia; Coviello, Velio; Borselli, Lorenzo; Márquez-Ramírez, Víctor-Hugo; Arámbula-Mendoza, Raul

    2018-03-01

    The Volcán de Colima, one of the most active volcanoes in Mexico, is commonly affected by tropical rains related to hurricanes that form over the Pacific Ocean. In 2011, 2013 and 2015 hurricanes Jova, Manuel and Patricia, respectively, triggered tropical storms that deposited up to 400 mm of rain in 36 h, with maximum intensities of 50 mm h⁻¹. The effects were devastating, with the formation of multiple lahars along La Lumbre and Montegrande ravines, which are the most active channels in sediment delivery on the south-southwest flank of the volcano. Deep erosion along the river channels and several marginal landslides were observed, and the arrival of block-rich flow fronts resulted in damage to bridges and paved roads in the distal reaches of the ravines. The temporal sequence of these flow events is reconstructed and analyzed using monitoring data (including video images, seismic records and rainfall data) with respect to the rainfall characteristics and the hydrologic response of the watersheds based on rainfall-runoff numerical simulation. For the studied events, lahars occurred 5-6 h after the onset of rainfall, lasted several hours and were characterized by several pulses with block-rich fronts and a maximum flow discharge of 900 m³ s⁻¹. Rainfall-runoff simulations were performed using the SCS curve number and the Green-Ampt infiltration models, providing similar results for the simulated maximum watershed peak discharge. Results show different behavior for the arrival times of the first lahar pulses, which correlate with the simulated catchment's peak discharge for La Lumbre ravine and with the peaks in rainfall intensity for Montegrande ravine. This different behavior is related to the area and shape of the two watersheds. Nevertheless, in all analyzed cases, the largest lahar pulse always corresponds with the last one and correlates with the simulated maximum peak discharge of these catchments. Data presented here show that flow pulses
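
    For reference, the SCS curve-number relation mentioned above converts a storm rainfall depth into a direct-runoff depth; the curve number used here is an assumed value, not a calibrated one for the Colima ravines:

      def scs_runoff_mm(rain_mm, curve_number, ia_ratio=0.2):
          """Direct runoff depth Q = (P - Ia)^2 / (P - Ia + S), with S and Ia in mm."""
          s = 25400.0 / curve_number - 254.0   # potential maximum retention (mm)
          ia = ia_ratio * s                    # initial abstraction
          if rain_mm <= ia:
              return 0.0
          return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

      # 400 mm storm total (as in the 2011-2015 events) on a hypothetical CN of 75:
      print(scs_runoff_mm(400.0, 75))  # ~314 mm of direct runoff for this assumed CN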

  12. Modeled inundation limits of potential lahars from Mount Adams in the White Salmon River Valley, Washington

    Science.gov (United States)

    Griswold, Julia P.; Pierson, Thomas C.; Bard, Joseph A.

    2018-05-09

    ,000 years ago, primarily through the episodic effusion of lava flows; it has not had a history of major explosive eruptions like Mount St. Helens, its neighbor to the west. Timing of the most recent eruptive activity (recorded by four thin tephra layers) is on the order of 1,000 years ago; the tephras are bracketed by 2,500-year-old and 500-year-old ash layers from Mount St. Helens (Hildreth and Fierstein, 1995, 1997). Mount Adams currently shows no signs of renewed unrest. Eruptive history does not tell us everything we need to know about hazards at Mount Adams, however, which are fully addressed in the volcano hazard assessment for Mount Adams (W.E. Scott and others, 1995). This volcano has had a long-active hydrothermal system that circulated acidic hydrothermal fluids, formed by the solution of volcanic gases in heated groundwater, through fractures and permeable zones into upper parts of the volcanic cone. Acid sulfate leaching of rocks in the summit area may still be occurring, but chemical and thermal evidence suggests that the main hydrothermal system is no longer active at Mount Adams (Nathenson and Mariner, 2013). However, these rock-weakening chemical reactions have operated long enough to change about 0.4 cubic miles (mi³) (1.7 cubic kilometers [km³]) of the hard lava rock in the volcano’s upper cone to a much weaker clay-rich rock, thus significantly reducing rock strength and thereby slope stability in parts of the cone (Finn and others, 2007). The two largest previous lahars from Mount Adams were triggered by landslides of hydrothermally altered rock from the upper southwestern flank of the cone, and any future large lahars are likely to be triggered by the same mechanism. Mount Rainier also has had extensive hydrothermal alteration of rock in its upper edifice, and it also has a history of large landslides that transform into lahars (K.M. Scott and others, 1995; Vallance and Scott, 1997; Reid and others, 2001). The spatial depiction of modeled lahar

  13. Variations in community exposure to lahar hazards from multiple volcanoes in Washington State (USA)

    Science.gov (United States)

    Diefenbach, Angela K.; Wood, Nathan J.; Ewert, John W.

    2015-01-01

    Understanding how communities are vulnerable to lahar hazards provides critical input for effective design and implementation of volcano hazard preparedness and mitigation strategies. Past vulnerability assessments have focused largely on hazards posed by a single volcano, even though communities and officials in many parts of the world must plan for and contend with hazards associated with multiple volcanoes. To better understand community vulnerability in regions with multiple volcanic threats, we characterize and compare variations in community exposure to lahar hazards associated with five active volcanoes in Washington State, USA—Mount Baker, Glacier Peak, Mount Rainier, Mount Adams and Mount St. Helens—each having the potential to generate catastrophic lahars that could strike communities tens of kilometers downstream. We use geospatial datasets that represent various population indicators (e.g., land cover, residents, employees, tourists) along with mapped lahar-hazard boundaries at each volcano to determine the distributions of populations within communities that occupy lahar-prone areas. We estimate that Washington lahar-hazard zones collectively contain 191,555 residents, 108,719 employees, 433 public venues that attract visitors, and 354 dependent-care facilities that house individuals that will need assistance to evacuate. We find that population exposure varies considerably across the State both in type (e.g., residential, tourist, employee) and distribution of people (e.g., urban to rural). We develop composite lahar-exposure indices to identify communities most at-risk and communities throughout the State who share common issues of vulnerability to lahar hazards. We find that although lahars are a regional hazard that will impact communities in different ways, there are commonalities in community exposure across multiple volcanoes. Results will aid emergency managers, local officials, and the public in educating at-risk populations and developing

  14. The drag forces exerted by lahar flows on a cylindrical pier: case study of post Mount Merapi eruptions

    Science.gov (United States)

    Faizien Haza, Zainul

    2018-03-01

    Lahar debris flows that occur after a volcanic eruption are a phenomenon in which large quantities of water, mud, and gravel flow down a stream at high velocity. They constitute a second stage of danger after the primary hazards of lava flows, pyroclastic flows, and toxic gases. Because lahar debris flows combine high density with high velocity, they can destroy homes, bridges, and infrastructure and cause loss of life along their pathways. This study examines the collision between a lahar flow and a bridge pier. The event is simulated numerically using commercial computational fluid dynamics (CFD) software, with the aim of quantifying the drag force generated during the collision. Rheological properties of the lahar, namely density and viscosity, were measured in laboratory tests of a lahar model and used as input data for the CFD simulation. Because the numerical model involves two types of fluid, mud and water, a multiphase model is adopted in the CFD simulation. The problem formulation is based on the constitutive equations of mass and momentum conservation for an incompressible, viscous fluid in two dimensions (2D). The simulations reproduce the collision between the lahar flow and the bridge pier, providing sequential images of the lahar impact and the evolution of the drag force coefficient. The flow analysis uses the non-dimensional Reynolds number. According to the results of the numerical simulations, the drag force coefficients range from 1.23 to 1.48 for flow velocities between 11.11 m/s and 16.67 m/s.
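
    A back-of-the-envelope use of the drag coefficients reported above: the drag force on a pier follows F = 0.5 · Cd · ρ · v² · A, with A the projected area facing the flow; the pier dimensions and lahar density below are illustrative assumptions, not values from the paper:

      def drag_force_kN(cd, velocity_m_s, density_kg_m3, diameter_m, depth_m):
          """Drag force on a cylindrical pier, F = 0.5*Cd*rho*v^2*A, returned in kN."""
          area = diameter_m * depth_m  # projected (wetted) area facing the flow
          return 0.5 * cd * density_kg_m3 * velocity_m_s ** 2 * area / 1000.0

      # Assumed pier diameter 1.5 m, flow depth 3 m, lahar density 1800 kg/m^3:
      print(drag_force_kN(cd=1.23, velocity_m_s=11.11, density_kg_m3=1800,
                          diameter_m=1.5, depth_m=3.0))  # ~615 kN
      print(drag_force_kN(cd=1.48, velocity_m_s=16.67, density_kg_m3=1800,
                          diameter_m=1.5, depth_m=3.0))  # ~1666 kN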

  15. Catastrophic precipitation-triggered lahar at Casita volcano, Nicaragua: Occurrence, bulking and transformation

    Science.gov (United States)

    Scott, K.M.; Vallance, J.W.; Kerle, N.; Macias, J.L.; Strauch, W.; Devoli, G.

    2005-01-01

    A catastrophic lahar began on 30 October 1998, as hurricane precipitation triggered a small flank collapse of Casita volcano, a complex and probably dormant stratovolcano. The initial rockslide-debris avalanche evolved on the flank to yield a watery debris flood with a sediment concentration less than 60 per cent by volume at the base of the volcano. Within 2-5 km, however, the watery flow entrained (bulked) enough sediment to transform entirely to a debris flow. The debris flow, 6 km downstream and 1.2 km wide and 3 to 6 m deep, killed 2500 people, nearly the entire populations of the communities of El Porvenir and Rolando Rodriguez. These 'new towns' were developed in a prehistoric lahar pathway: at least three flows of similar size since 8330 ¹⁴C years BP are documented by stratigraphy in the same 30-degree sector. Travel time between perception of the flow and destruction of the towns was only 2.5-3.0 minutes. The evolution of the flow wave occurred with hydraulic continuity and without pause or any extraordinary addition of water. The precipitation trigger of the Casita lahar emphasizes the need, in volcano hazard assessments, for including the potential for non-eruption-related collapse lahars with the more predictable potential of their syneruption analogues. The flow behaviour emphasizes that volcano collapses can yield not only volcanic debris avalanches with restricted runouts, but also mobile lahars that enlarge by bulking as they flow. Volumes and hence inundation areas of collapse-runout lahars can increase greatly beyond their sources: the volume of the Casita lahar bulked to at least 2.6 times the contributing volume of the flank collapse and 4.2 times that of the debris flood. At least 78 per cent of the debris flow matrix (sediment finer than 2 mm; −1.0 φ) was entrained during flow.

  16. Rain-triggered lahars following the 2010 eruption of Merapi volcano, Indonesia: A major risk

    Science.gov (United States)

    de Bélizal, Edouard; Lavigne, Franck; Hadmoko, Danang Sri; Degeai, Jean-Philippe; Dipayana, Gilang Aria; Mutaqin, Bachtiar Wahyu; Marfai, Muh Aris; Coquet, Marie; Mauff, Baptiste Le; Robin, Anne-Kyria; Vidal, Céline; Cholik, Noer; Aisyah, Nurnaning

    2013-07-01

    The 2010 VEI 4 eruption of Merapi volcano deposited roughly ten times the volume of pyroclastic materials of the 1994 and 2006 eruptions, and is recognized as one of the most intense eruptions since 1872. However, as the eruptive phase is now over, another threat endangers local communities: rain-triggered lahars. Previous papers on lahars at Merapi presented lahar-related risk following small-scale dome-collapse PDCs. Thus the aim of this study is to provide new insights on lahar-related risk following a large scale VEI 4 eruption. The paper highlights the high number of events (240) during the 2010-2011 rainy season (October 2010-May 2011). The frequency of the 2010-2011 lahars is also the highest ever recorded at Merapi. Lahars occurred in almost all drainages located under the active cone, with runout distances exceeding 15 km. The geomorphic impacts of lahars on the distal slope of the volcano are then explained as they directly threaten houses and infrastructures: creation of large corridors, avulsions, riverbank erosion and riverbed downcutting are detailed through local scale examples. Related damage is also studied: 860 houses damaged, 14 sabo-dams and 21 bridges destroyed. Sedimentological characteristics of volcaniclastic sediments in lahar corridors are presented, with emphasis on the resource in building material that they represent for local communities. Risk studies should not forget that thousands of people are exposing themselves to lahar hazard when they quarry volcaniclastic sediment in lahar corridors. Finally, the efficient community-based crisis management is explained, and shows how local people organize themselves to manage the risk: 3 fatalities were reported, although lahars reached densely populated areas. To summarize, this study provides an update of lahar risk issues at Merapi, with emphasis on the distal slope of the volcano where lahars had not occurred for 40 years, and where lahar corridors were rapidly formed.

  17. Groundwater drainage from fissures as a source for lahars

    Science.gov (United States)

    Johnson, P. J.; Valentine, G. A.; Stauffer, P. H.; Lowry, C. S.; Sonder, I.; Pulgarín, B. A.; Santacoloma, C. C.; Agudelo, A.

    2018-04-01

    One mechanism for generating lahars at volcanoes experiencing unrest is the disruption of internal aquifers. These disruptions can trigger releases of large quantities of groundwater. An example of such aquifer disruption occurred at Nevado del Huila Volcano, Colombia, during February and April 2007 when large fractures formed across the summit area of the volcano and lahars were emitted from them. Previous work interpreted that lahar volumes could not be accounted for by melted glacial snow or precipitation, and by elimination suggested that the primary water source was groundwater. Conceptual models have been developed for perched, confined aquifers that have been heated and pressurized by magma intrusions, followed by sudden pressure release and water emission during fracture formation. We consider an alternative end member wherein water release from large fissures at volcanoes is driven by simple gravity drainage. We apply numerical modeling to quantify water discharge from the porous medium surrounding a fissure with a low-elevation free exit. If a long fracture with high vertical extent (on the order of hundreds of meters) intersects a highly connected saturated porous medium, large volumes (on the order of 10³ m³ per meter of crack length) of water may be released within tens of minutes. The drainage rates from the model may be adequate to account for the Nevado del Huila events if the medium surrounding the crack contains a large volume of water and has high horizontal permeability. This simple but poorly understood mechanism can present a hazard on its own or compound other processes releasing water from volcanoes.
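
    An order-of-magnitude sketch in the spirit of the modelling described above, using a simple Darcy estimate of drainage per meter of crack length; the hydraulic conductivity, gradient, saturated height and duration are illustrative assumptions, not the Nevado del Huila model parameters:

      def drainage_m3_per_m(hydraulic_conductivity_m_s, gradient, saturated_height_m,
                            duration_s, two_sided=True):
          """Darcy flux q = K*i; discharge per meter of crack = q * saturated height * time."""
          sides = 2 if two_sided else 1
          q = hydraulic_conductivity_m_s * gradient            # specific discharge (m/s)
          return sides * q * saturated_height_m * duration_s   # m^3 per meter of crack

      # K = 1e-3 m/s (coarse, well-connected deposits), unit gradient near the free
      # face, 300 m of saturated thickness drained over 30 minutes:
      print(drainage_m3_per_m(1e-3, 1.0, 300.0, 30 * 60))      # ~1.1e3 m^3 per meter of crack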

  18. Rapid mapping using low-cost structure-from-motion photogrammetry expedites the lahar modeling process

    Science.gov (United States)

    Ratner, Jacqueline; Pyle, David; Mather, Tamsin

    2014-05-01

    Structure-from-motion (SfM) is a branch of photogrammetry that triangulates points in digital photos to produce a 3D model. When applied to topographical modeling, SfM presents a powerful tool for rapid terrain mapping. At little to no cost and on a timescale of hours, a metric-resolution digital terrain model (DTM) can be produced; the resultant DTM can be used for many types of hazard scenario modeling and is here applied to lahars and floods. This study demonstrates the robustness of the SfM method through two case studies. First, an SfM DTM of Boscastle, UK, is compared against LiDAR and SRTM DTMs in a flood simulation model. Resolution is found to be more robust than for satellite-based DTMs, and though less precise than the most detailed LiDAR survey, still perfectly adequate for the purposes of modeling flows. Next, the same method is applied to a region of Ecuador lacking the regionally comprehensive LiDAR survey available in the UK. Compared against the only other topographical data available (SRTM, ASTER, 1956 topographical map), the SfM DTM is shown to have a higher resolution and is a preferable alternative for modeling lahars. The advantage of this study for emergency management is that it provides a cheap and rapid metric-resolution alternative to low-resolution or costly topography data sets. In regions such as Ecuador where scientific resources are scarce, SfM assists in providing a thorough, but otherwise unattainable, understanding of potential disaster scenarios that is accessible to local authorities to be used in the disaster prevention and mitigation processes.

  19. Volcanic-glacial interactions: GIS applications to the assessment of lahar hazards (case study of Kamchatka)

    Directory of Open Access Journals (Sweden)

    Ya. D. Muraviev

    2014-01-01

    Full Text Available On the Kamchatka peninsula, lahars or volcanogenic mudflows arise as a result of intensive snow melting caused by incandescent material ejected by volcanoes onto the surface. Such flows, carrying volcanic ash and cinders together with lava fragments and blocks, move at speeds of up to 70 km/h and can cause significant destruction and even human casualties. Formation of such water flows is possible during the whole year. A large-scale GIS «Hazards of lahars (volcanogenic mudflows)» has been developed for some volcano groups as well as for individual volcanoes on the peninsula within the framework of the GIS «Volcanic hazard of the Kuril-Kamchatka island arc». The main components of this database are the following: physico-geographical information on the region of active volcanism and adjacent areas and on human settlements; data on mudflow activity; and data on the distribution of snow and ice reserves. The database is aimed at mapping the surrounding territories and estimating the lahar hazard. For illustration, the paper presents a map of the lahar hazards and results of calculations of the travel distances of ejected material and the maximal area of its spreading, depending on the character and power of an eruption. In the future we plan to perform operational calculations of the maximal possible volumes of such flows and the areas of their spreading. The calculations will be made on the basis of the GIS «Volcanic hazard of the Kuril-Kamchatka island arc». The volume of solid material carried by lahars onto the slopes and down to the foot of the Kluchevskaya volcanic massif is estimated on the basis of data on the snow and ice reserves on the volcano slopes. Averaged over many years, the volume of snow accumulated in the zones of mudflow formation often reaches 15–17 million cubic meters. Depending on the snowfall activity in different years, this value may vary by up to 50% relative to the norm. Further on, calculations of maximal possible volume of such flows will be performed in a

  20. Insights into lahar deposition processes in the Curah Lengkong (Semeru Volcano, Indonesia) using photogrammetry-based geospatial analysis, near-surface geophysics and CFD modelling

    Science.gov (United States)

    Gomez, C.; Lavigne, F.; Sri Hadmoko, D.; Wassmer, P.

    2018-03-01

    Semeru Volcano is an active stratovolcano located in East Java (Indonesia), where historic lava flows, occasional pyroclastic flows and vulcanian explosions (on average every 5 min to 15 min) generate a stock of material that is remobilized by lahars, mostly occurring during the rainy season between October and March. Every year, several lahars flow down the Curah Lengkong Valley on the south-east flank of the volcano, where numerous lahar studies have been conducted. In the present contribution, the objective was to study the spatial distribution of boulder-size clasts and to understand how this distribution relates to the valley morphology and to the flow and deposition dynamics of lahars. To achieve this objective, the method relies on a combination of (1) aerial photogrammetry-derived geospatial data on boulder distribution, (2) ground penetrating radar data collected along a 2 km series of transects and (3) a CFD model of flow to analyse the results from the deposits. Results show that <1 m diameter boulders are evenly distributed along the channel, but that lava flow deposits visible at the surface of the river bed and sabo dams increase the concentration of clasts upstream of their position. Lateral input of boulders from collapsing lava-flow deposits can bring outsized clasts into the system that tend to become trapped at one location. Finally, the comparison between the CFD simulation and previous research using video imagery of lahars emphasizes that there is no direct link between the sedimentary units observed in the field and the flow that deposited them. Grain size, flow orientation and matrix characteristics can all be very different within the deposit of a single flow, even in confined channels like the Curah Lengkong.

  1. Lahars at Cotopaxi and Tungurahua Volcanoes, Ecuador: Highlights from stratigraphy and observational records and related downstream hazards: Chapter 6

    Science.gov (United States)

    Mothes, Patricia A; Vallance, James W.

    2015-01-01

    Lahars are volcanic debris flows that are dubbed primary when triggered by eruptive activity or secondary when triggered by other factors such as heavy rainfall after eruptive activity has waned. Variation in time and space of the proportion of sediment to water within a lahar dictates lahar flow phase and the resultant sedimentary character of deposits. Characteristics of source material and of debris eroded and incorporated during flow downstream may strongly affect the grain-size composition of flowing lahars and their deposits. Lahars borne on the flanks of two steep-sided stratocones in Ecuador exemplify two important lahar types. Glacier-clad Cotopaxi volcano has been a producer of primary lahars that flow great distances downstream. Such primary lahars include those of both clast-rich and matrix-rich composition—some of which have flowed as far as 325 km to the Pacific Ocean. Cotopaxi's last important eruption in 1877 generated formidable syneruptive lahars comparable in size to those that buried Armero, Colombia, following the 1985 eruption of Nevado del Ruiz volcano. In contrast, ash-producing eruptive activity during the past 15 years at Tungurahua volcano has generated a continual supply of fresh volcaniclastic debris that is regularly remobilized by precipitation. Between 2000 and 2011, 886 rain-generated lahars were registered at Tungurahua. These two volcanoes pose dramatically different hazards to nearby populations. At Tungurahua, the frequency and small sizes of lahars have resulted in effective mitigation measures. At Cotopaxi 137 years have passed since the last important lahar-producing eruption, and there is now a high-risk situation for more than 100,000 people living in downstream valleys.

  2. Characteristics of the summit lakes of Ambae volcano and their potential for generating lahars

    Directory of Open Access Journals (Sweden)

    P. Bani

    2009-08-01

    Full Text Available Volcanic eruptions through crater lakes often generate lahars, causing loss of life and property. On Ambae volcano, recent eruptive activities have tended rather to reduce the water volume in the crater lake (Lake Voui), in turn reducing the chances for outburst floods. Lake Voui occupies a central position in the summit caldera and is well enclosed by the caldera relief. Eruptions with significantly higher magnitude than those of 1995 and 2005 are required for an outburst. A more probable scenario for lahar events is an overflow from Lake Manaro Lakua, which is bounded on the eastern side by the caldera wall. Morphology and bathymetry analysis have been used to identify the weakest point of the caldera rim from which water from Lake Manaro Lakua may overflow to initiate lahars. The 1916 disaster described on south-east Ambae was possibly triggered by such an outburst from Lake Manaro Lakua. Taking into account the current level of Lake Manaro Lakua, well below the critical overflow point, and the apparently low potential of Lake Voui eruptions to trigger lahars, the Ambae summit lakes may not be directly responsible for the numerous lahar deposits identified around the island.

  3. Geomorphological evolution of a fluvial channel after primary lahar deposition: Huiloac Gorge, Popocatépetl volcano (Mexico)

    Science.gov (United States)

    Tanarro, L. M.; Andrés, N.; Zamorano, J. J.; Palacios, D.; Renschler, C. S.

    2010-10-01

    Popocatépetl volcano (19.02° N, 98.62° W, 5424 m) began its most recent period of volcanic activity in December 1994. The interaction of volcanic and glacier activity triggered the formation of lahars through the Huiloac Gorge, located on the northern flank of the volcano, causing significant morphological changes in the channel. The most powerful lahars occurred in April 1995, July 1997 and January 2001, and were followed by secondary lahars that formed during the post-eruptive period. This study interprets the geomorphological evolution of the Huiloac Gorge after the January 2001 lahar. Variations in channel morphology at a 520 m-long research site located mid-way down the gorge were recorded over a 4 year period from February 2002 to March 2006, and depicted in five geomorphological maps (scale 1:200) for 14 February and 15 October 2002, 27 September 2003, 9 February 2004, and 16 March 2006. A GIS was used to calculate the surface area of the landforms identified on each map and to detect changes and erosion-deposition processes of the landforms using the overlay function for different dates. Findings reveal that secondary lahars and other types of flows, like sediment-laden or muddy streamflows caused by precipitation, rapidly modified the gorge channel following the January 2001 non-eruptive lahar, a period associated with volcanic inactivity and the disappearance of the glacier once located at the headwall of the gorge. Field observations also confirmed that secondary flows altered the dynamics and geomorphological development of the channel. These flows incised and destroyed the formations generated by the primary lahars (1997 and 2001), causing a widening of the channel that continues today. After February 2004, a rain-triggered lahar and other flows infilled the channel with materials transported by these flows. The deposits on the lateral edges of the channel form terraces. A recent lull in lahar activity contrasts with the increasing instability of

  4. Seismic signals of snow-slurry lahars in motion: 25 September 2007, Mt Ruapehu, New Zealand

    Science.gov (United States)

    Cole, S. E.; Cronin, S. J.; Sherburn, S.; Manville, V.

    2009-05-01

    Detection of ground shaking forms the basis of many lahar-warning systems. Seismic records of two lahar types at Ruapehu, New Zealand, in 2007 are used to examine their nature and internal dynamics. Upstream detection of a flow depends upon flow type and coupling with the ground. 3-D characteristics of seismic signals can be used to distinguish the dominant rheology and gross physical composition. Water-rich hyperconcentrated flows are turbulent; common inter-particle and particle-substrate collisions engender higher energy in cross-channel vibrations relative to channel-parallel. Plug-like snow-slurry lahars show greater energy in channel-parallel signals, due to lateral deposition insulating channel margins, and low turbulence. Direct comparison of flow size must account for flow rheology; a water-rich lahar will generate signals of greater amplitude than a similar-sized snow-slurry flow.

  5. Frozen Martian lahars? Evaluation of morphology, degradation and geologic development in the Utopia-Elysium transition zone

    Science.gov (United States)

    Pedersen, G. B. M.

    2013-09-01

    Regional coverage of high-resolution data from the CTX camera has permitted new, detailed morphologic analysis of the enigmatic Utopia-Elysium flows which dominate the transition zone between the Elysium volcanic province and Utopia Planitia. Based on topographic and morphologic analysis of the Galaxias region, this study supports the lahar hypothesis put forth by previous works and suggests that the center and the margins of the outflow deposits have very diverse morphologies that can be explained by varying degrees of water drainage and freezing. Regular channel and flood plain deposits are found in the central part of the outflow deposits, whereas the marginal deposits are interpreted to contain a significant amount of ice because of their distinct morphological properties (smooth, lobate flow-fronts with upward convex snouts, unusual crater morphologies, raised rim fractures and localized flow fronts indicating rheomorphism). Thus, this study suggests that, unlike terrestrial lahars, lahars emplaced under Martian conditions drain only in their central parts, whereas the water in the margins of the outflow deposit (∼75% of the total outflow deposit in the Galaxias region) freezes, resulting in a double-layered deposit consisting of an ice-rich core with an ice-poor surface layer. It is furthermore suggested that continued intrusive volcanic activity was highly affected by the presence of the ice-rich lahar deposits, generating ground-ice-volcano interactions resulting in a secondary suite of morphologies. These morphologies include seventeen ridges that are interpreted to be móberg ridges (due to their NW-SE orientation, distinct ridge-crests and association with fractures and linear ridges) and depressions with nested faults interpreted to be similar to terrestrial ice-cauldrons, which form by enhanced subglacial geothermal activity including subglacial volcanic eruptions. These sub-lahar intrusions caused significant volatile loss in the ice-rich core of the

  6. Alteration, slope-classified alteration, and potential lahar inundation maps of volcanoes for the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Volcano Archive

    Science.gov (United States)

    Mars, John C.; Hubbard, Bernard E.; Pieri, David; Linick, Justin

    2015-01-01

    This study identifies areas prone to lahars from hydrothermally altered volcanic edifices on a global scale, using visible and near infrared (VNIR) and short wavelength infrared (SWIR) reflectance data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and digital elevation data from the ASTER Global Digital Elevation Model (GDEM) dataset. This is the first study to create a global database of hydrothermally altered volcanoes showing quantitatively compiled alteration maps and potentially affected drainages, as well as drainage-specific maps illustrating modeled lahars and their potential inundation zones. We (1) identified and prioritized 720 volcanoes based on population density surrounding the volcanoes using the Smithsonian Institution Global Volcanism Program database (GVP) and LandScan™ digital population dataset; (2) validated ASTER hydrothermal alteration mapping techniques using Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) and ASTER data for Mount Shasta, California, and Pico de Orizaba (Citlaltépetl), Mexico; (3) mapped and slope-classified hydrothermal alteration using ASTER VNIR-SWIR reflectance data on 100 of the most densely populated volcanoes; (4) delineated drainages using ASTER GDEM data that show potential flow paths of possible lahars for the 100 mapped volcanoes; (5) produced potential alteration-related lahar inundation maps using the LAHARZ GIS code for Iztaccíhuatl, Mexico, and Mount Hood and Mount Shasta in the United States that illustrate areas likely to be affected based on DEM-derived volume estimates of hydrothermally altered rocks and the ~2x uncertainty factor inherent within a statistically-based lahar model; and (6) saved all image and vector data for 3D and 2D display in Google Earth™, ArcGIS® and other graphics display programs. In addition, these data are available from the ASTER Volcano Archive (AVA) for distribution (available at http://ava.jpl.nasa.gov/recent_alteration_zones.php).

  7. Lahar flow simulation using Laharz_py program: Application for the Mt. Halla volcano, Jeju, Korea

    Science.gov (United States)

    Chang, C.; Yun, S. H.; Yi, W.

    2017-12-01

    Lahars, among the most catastrophic volcanic events, have the potential to cause loss of life and damage to infrastructure in inhabited areas. This study used Laharz_py to make a schematic prediction of the area affected by lahar hazards at the Mt. Halla volcano, Jeju island. To comprehensively address the impact of lahars at Mt. Halla, two distinct parameters, the H/L ratio and the lahar volume, were selected as the controlling variables for the Laharz_py simulations. The simulations considered possible lahar volumes of 30,000, 50,000, 70,000, 100,000, 300,000 and 500,000 m³ for each of three H/L ratios (0.20, 0.22 and 0.25). Based on the numerical simulations, the area enclosed by the proximal hazard zone boundary gradually decreases with increasing H/L ratio, and the number of streams affected by lahars also tends to decrease with increasing H/L ratio. In the case of H/L ratio 0.20, three streams (Gwangryeong stream, Dogeun stream, Han stream) in the Jeju-si area and six streams (Gungsan stream, Hogeun stream, Seohong stream, Donghong stream, Bomok stream, Yeong stream-Hyodon stream) in the Seogwipo-si area are affected. In the case of H/L ratio 0.22, two streams (Gwangryeong stream and Han stream) in the Jeju-si area and five streams (Gungsan stream, Seohong stream, Donghong stream, Bomok stream, Yeong stream-Hyodon stream) in the Seogwipo-si area are affected. And in the case of H/L ratio 0.25, two streams (Gwangryeong stream and Han stream) in the Jeju-si area and one stream (Yeong stream-Hyodon stream) in the Seogwipo-si area are affected. The results of this study will be used as basic data to create a risk map for the direct damage that can be caused by volcanic hazards arising from Mt. Halla. This research was supported by a grant [MPSS-NH-2015-81] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.
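
    Laharz_py delineates a proximal hazard zone with an H/L energy cone and downstream inundation limits with the semi-empirical relations of Iverson, Schilling and Vallance (1998), cross-sectional area A = 0.05·V^(2/3) and planimetric area B = 200·V^(2/3); a quick sketch for the volumes listed above (these are only the statistical expectations behind the method, not the site-specific Mt. Halla results):

      def laharz_areas(volume_m3):
          """Expected inundated cross-sectional (A) and planimetric (B) areas, in m^2."""
          v23 = volume_m3 ** (2.0 / 3.0)
          return 0.05 * v23, 200.0 * v23

      for v in (30_000, 50_000, 70_000, 100_000, 300_000, 500_000):
          a, b = laharz_areas(v)
          print(f"V = {v:>7,} m3   A = {a:7.0f} m2   B = {b / 1e6:5.2f} km2")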

  8. Snow and ice perturbation during historical volcanic eruptions and the formation of lahars and floods

    Science.gov (United States)

    Major, Jon J.; Newhall, Christopher G.

    1989-10-01

    Historical eruptions have produced lahars and floods by perturbing snow and ice at more than 40 volcanoes worldwide. Most of these volcanoes are located at latitudes higher than 35°; those at lower latitudes reach altitudes generally above 4000 m. Volcanic events can perturb mantles of snow and ice in at least five ways: (1) scouring and melting by flowing pyroclastic debris or blasts of hot gases and pyroclastic debris, (2) surficial melting by lava flows, (3) basal melting of glacial ice or snow by subglacial eruptions or geothermal activity, (4) ejection of water by eruptions through a crater lake, and (5) deposition of tephra fall. Historical records of volcanic eruptions at snow-clad volcanoes show the following: (1) Flowing pyroclastic debris (pyroclastic flows and surges) and blasts of hot gases and pyroclastic debris are the most common volcanic events that generate lahars and floods; (2) Surficial lava flows generally cannot melt snow and ice rapidly enough to form large lahars or floods; (3) Heating the base of a glacier or snowpack by subglacial eruptions or by geothermal activity can induce basal melting that may result in ponding of water and lead to sudden outpourings of water or sediment-rich debris flows; (4) Tephra falls usually alter ablation rates of snow and ice but generally produce little meltwater that results in the formation of lahars and floods; (5) Lahars and floods generated by flowing pyroclastic debris, blasts of hot gases and pyroclastic debris, or basal melting of snow and ice commonly have volumes that exceed 10⁵ m³. The glowing lava (pyroclastic flow) which flowed with force over ravines and ridges...gathered in the basin quickly and then forced downwards. As a result, tremendously wide and deep pathways in the ice and snow were made and produced great streams of water (Wolf 1878).

  9. Rheological behavior of water-ash mixtures from Sakurajima and Ontake volcanoes: implications for lahar flow dynamics

    Science.gov (United States)

    Kurokawa, Aika K.; Ishibashi, Hidemi; Miwa, Takahiro; Nanayama, Futoshi

    2018-06-01

    Lahars represent one of the most serious volcanic hazards, potentially causing severe damage to the surrounding environment, not only immediately after eruption but also later due to rainfall or snowfall. The flow of a lahar is governed by volcanic topography and its rheological behavior, which is controlled by its volume, microscale properties, and the concentration of particles. However, the effects of particle properties on the rheology of lahars are poorly understood. In this study, viscosity measurements were performed on water-ash mixtures from Sakurajima and Ontake volcanoes. Samples from Sakurajima show strong and simple shear thinning, whereas those from Ontake show viscosity fluctuations and a transition between shear thinning and shear thickening. Particle analysis of the volcanic ash together with a theoretical analysis suggests that the rheological difference between the two types of suspension can be explained by variations in particle size distribution and shape. In particular, to induce the complex rheology of the Ontake samples, coexistence of two particle size groups may be required since two independent behaviors, one of which follows the streamline (Stokes number St << 1, inertial number I < 0.001) and the other shows a complicated motion (St >> 1, I > 0.001), compete against each other. The variations in the spatial distribution of polydisperse particles, and the time dependence of this feature which generates apparent rheological changes, indicate that processes related to microscale particle heterogeneities are important in understanding the flow dynamics of lahars and natural polydisperse granular-fluid mixtures in general.
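
    The competing regimes described above can be made concrete with one common set of definitions from granular-suspension rheology (the paper may use different prefactors): Stokes number St = ρ_p · d² · (shear rate) / η_f and inertial number I = (shear rate) · d / sqrt(P / ρ_p); the grain sizes, shear rate and confining pressure below are illustrative assumptions:

      import math

      def stokes_number(rho_p, d, shear_rate, eta_f):
          """St = grain inertia relative to viscous damping, for grain diameter d (SI units)."""
          return rho_p * d ** 2 * shear_rate / eta_f

      def inertial_number(d, shear_rate, pressure, rho_p):
          """I = (shear rate) * d / sqrt(P / rho_p), the granular inertial number."""
          return shear_rate * d / math.sqrt(pressure / rho_p)

      rho_p, eta_f, shear_rate, pressure = 2500.0, 1.0e-3, 10.0, 100.0
      for d in (1e-5, 1e-3):  # ~10 micron ash vs ~1 mm ash
          print(d, stokes_number(rho_p, d, shear_rate, eta_f),
                inertial_number(d, shear_rate, pressure, rho_p))
      # fine grains:   St << 1 and I < 0.001 (grains follow the streamlines)
      # coarse grains: St >> 1 and I > 0.001 (grain inertia produces more complex motion)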

  10. Perturbation and melting of snow and ice by the 13 November 1985 eruption of Nevado del Ruiz, Colombia, and consequent mobilization, flow and deposition of lahars

    Science.gov (United States)

    Pierson, T.C.; Janda, R.J.; Thouret, J.-C.; Borrero, C.A.

    1990-01-01

    A complex sequence of pyroclastic flows and surges erupted by Nevado del Ruiz volcano on 13 November 1985 interacted with snow and ice on the summit ice cap to trigger catastrophic lahars (volcanic debris flows), which killed more than 23,000 people living at or beyond the base of the volcano. The rapid transfer of heat from the hot eruptive products to about 10 km² of the snowpack, combined with seismic shaking, produced large volumes of meltwater that flowed downslope, liquefied some of the new volcanic deposits, and generated avalanches of saturated snow, ice and rock debris within minutes of the 21:08 (local time) eruption. About 2 × 10⁷ m³ of water was discharged into the upper reaches of the Molinos, Nereidas, Guali, Azufrado and Lagunillas valleys, where rapid entrainment of valley-fill sediment transformed the dilute flows and avalanches to debris flows. Computed mean velocities of the lahars at peak flow ranged up to 17 m s⁻¹. Flows were rapid in the steep, narrow upper canyons and slowed with distance away from the volcano as flow depth and channel slope diminished. Computed peak discharges ranged up to 48,000 m³ s⁻¹ and were greatest in reaches 10 to 20 km downstream from the summit. A total of about 9 × 10⁷ m³ of lahar slurry was transported to depositional areas up to 104 km from the source area. Initial volumes of individual lahars increased up to 4 times with distance away from the summit. The sedimentology and stratigraphy of the lahar deposits provide compelling evidence that: (1) multiple initial meltwater pulses tended to coalesce into single flood waves; (2) lahars remained fully developed debris flows until they reached confluences with major rivers; and (3) debris-flow slurry composition and rheology varied to produce gradationally density-stratified flows. Key lessons and reminders from the 1985 Nevado del Ruiz volcanic eruption are: (1) catastrophic lahars can be generated on ice- and snow-capped volcanoes by relatively small eruptions; (2
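
    As a quick illustration of how the figures quoted above relate to one another, the sketch below recomputes the overall bulking (slurry volume over released meltwater) and the flow cross-section implied by the quoted peak velocity and peak discharge. It is plain arithmetic on the abstract's round numbers; the implied cross-section is only indicative because the quoted peak velocity and peak discharge need not refer to the same channel reach.

```python
# Plain arithmetic on the round numbers quoted in the abstract; no new data.
meltwater_volume = 2.0e7     # m^3 of water released from the ice cap
slurry_volume = 9.0e7        # m^3 of lahar slurry delivered to depositional areas
print(f"overall bulking (slurry/meltwater) ~ {slurry_volume / meltwater_volume:.1f}x")

peak_velocity = 17.0         # m/s, maximum computed mean velocity at peak flow
peak_discharge = 48000.0     # m^3/s, maximum computed peak discharge
# The two maxima need not occur in the same reach, so this area is only indicative.
print(f"implied peak-flow cross-section ~ {peak_discharge / peak_velocity:.0f} m^2")
```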

  11. Combining criteria for delineating lahar- and flash-flood-prone hazard and risk zones for the city of Arequipa, Peru

    OpenAIRE

    Thouret , Jean-Claude; Enjolras , G.; Martelli , K.; Santoni , O.; Luque , A.; Nagata , M.; Arguedas , A.; Macedo , L.

    2013-01-01

    Arequipa, the second largest city in Peru, is exposed to many natural hazards, most notably earthquakes, volcanic eruptions, landslides, lahars (volcanic debris flows), and flash floods. Of these, lahars and flash floods, triggered by occasional torrential rainfall, pose the most frequently occurring hazards that can affect the city and its environs, in particular the areas containing low-income neighbourhoods. This paper presents and discusses criteria for delineating areas prone to flash fl...

  12. Lahar hazards at Agua volcano, Guatemala

    Science.gov (United States)

    Schilling, S.P.; Vallance, J.W.; Matías, O.; Howell, M.M.

    2001-01-01

    At 3760 m, Agua volcano towers more than 3500 m above the Pacific coastal plain to the south and 2000 m above the Guatemalan highlands to the north. The volcano is within 5 to 10 kilometers (km) of Antigua, Guatemala and several other large towns situated on its northern apron. These towns have a combined population of nearly 100,000. It is within about 20 km of Escuintla (population, ca. 100,000) to the south. Though the volcano has not been active in historical time, or about the last 500 years, it has the potential to produce debris flows (watery flows of mud, rock, and debris—also known as lahars when they occur on a volcano) that could inundate these nearby populated areas.

  13. Rheological Variations in Lahars Expected to Flow Along the Sides of Sakurajima and Ontake Volcanoes, Japan

    Science.gov (United States)

    Kurokawa, A. K.; Ishibashi, H.

    2016-12-01

    Volcanic ash is known to accumulate on the ground surface around a volcano after eruptions. Once the ash gains enough weight and mixes with water beyond a critical point, the mixture of volcanic ash and water runs down the flanks of the volcano, causing severe damage to the surrounding environment. Such a flow is referred to as a lahar; lahars are widely observed all over the world and occasionally generate seismic signals [Walsh et al., 2016; Ogiso and Yomogida, 2015]. Sometimes a lahar occurs just after an eruption [Nakayama and Kuroda, 2003], whereas a large debris flow that occurred about 30 years after the latest eruption, triggered by heavy rainfall, has also been reported [Ogiso and Yomogida, 2015]. Thus, when a lahar starts flowing is a key question. In order to understand the flow characteristics of lahars, it is important to focus on their rheology. However, little is known about their rheological properties, even though the experimental conditions can be controlled at atmospheric pressure and ambient temperature. This is an advantage when compared with magma and rock, which need to reach high-pressure and/or high-temperature conditions to be measured. Against this background, we have performed basic rheological measurements using mixtures of water and volcanic ash collected at Sakurajima and Ontake volcanoes in Japan. The first important point of our findings is that the two types of mixtures show different non-linear characteristics. For instance, the viscosity variation strongly depends on the water content in the case of the Sakurajima sample, while the viscosity fluctuates within a certain range of shear rates for the Ontake sample. Since these non-linear characteristics are related to structural changes in the flow, our results indicate that the flow of a lahar is time-variable and complicated. In this presentation, we report the non-linear rheology in detail and discuss its relation to temporal changes in the flow.

  14. Does exposure to lahars risk affect people's risk-preferences and other attitudes? Field data from incentivized experiments and surveys in Arequipa - Peru

    Science.gov (United States)

    Heitz, C.; Bchir, M. A.; Willinger, M.

    2012-04-01

    Many individuals are exposed to risks which are either difficult to insure or hard to mitigate, such as tsunamis, floods, volcanic eruptions,... Little is known about how exposure to such risks shapes individuals' risk preferences. Are they more (or less) risk-averse than people who are unexposed to such hazards? We provide empirical evidence about this question for the case of individuals exposed to lahar risk. Lahars are sediment-laden flows of volcanic origin. We compare the risk attitudes of people exposed to lahar risk with those of non-exposed people. The originality of our approach is that we combine standard survey data with behavioural data collected by means of incentivized experiments. We collected data in various locations of the city of Arequipa (Peru), a densely populated area below the volcano El Misti. Participants in our experiment were identified as exposed or non-exposed to lahar risk based on risk zoning. Our survey questionnaire allows us to compare assessed exposure and perceived exposure. We elicit risk preference, time preference, and trusting behaviour (a measure of social capital) for each respondent in addition to standard survey data. Our field experiment involved a total of 209 respondents from exposed and non-exposed areas. While respondents grant legitimacy in risk reduction (more than 74%) to a national authority (Defensa Civil) in charge of the management of risk in the city, more than 64% of them consider that they are not sufficiently informed about the behaviours to adopt in case of a disaster. Respondents are therefore poorly motivated to adopt self-protection initiatives (23%) and instead express high expectations with respect to authorities' actions for decreasing their vulnerability (73%). The experimental data show that participants who live in exposed areas are not significantly more risk-averse than those living in non-exposed ones. Furthermore, there is no significant difference in time-preference between exposed and non

  15. Lahar hazards at Mombacho Volcano, Nicaragua

    Science.gov (United States)

    Vallance, J.W.; Schilling, S.P.; Devoli, G.

    2001-01-01

    Mombacho volcano, at 1,350 meters, is situated on the shores of Lake Nicaragua and about 12 kilometers south of Granada, a city of about 90,000 inhabitants. Many more people live a few kilometers southeast of Granada in Las Isletas de Granada and the nearby Peninsula de Aseses. These areas are formed of deposits of a large debris avalanche (a fast moving avalanche of rock and debris) from Mombacho. Several smaller towns, with populations in the range of 5,000 to 12,000 inhabitants, are to the northwest and the southwest of Mombacho volcano. Though the volcano has apparently not been active in historical time, or about the last 500 years, it has the potential to produce landslides and debris flows (watery flows of mud, rock, and debris -- also known as lahars when they occur on a volcano) that could inundate these nearby populated areas. -- Vallance et al., 2001

  16. Preparing for Volcanic Hazards: An Examination of Lahar Knowledge, Risk Perception, and Preparedness around Mount Baker and Glacier Peak, WA

    Science.gov (United States)

    Corwin, K.; Brand, B. D.

    2015-12-01

    As the number of people living at risk from volcanic hazards in the U.S. Pacific Northwest continues to rise, so does the need for improved hazard science, mitigation, and response planning. The effectiveness of these efforts relies not only on scientists and policymakers, but on individuals and their risk perception and preparedness levels. This study examines the individual knowledge, perception, and preparedness of over 500 survey respondents living or working within the lahar zones of Mount Baker and Glacier Peak volcanoes. We (1) explore the common disconnect between accurate risk perception and adequate preparedness; (2) determine how participation in hazard response planning influences knowledge, risk perception, and preparedness; and (3) assess the effectiveness of current lahar hazard maps for public risk communication. Results indicate that a disconnect exists between perception and preparedness for the majority of respondents. While 82% of respondents accurately anticipate that future volcanic hazards will impact the Skagit Valley, this knowledge fails to motivate increased preparedness. A majority of respondents also feel "very responsible" for their own protection and provision of resources during a hazardous event (83%) and believe they have the knowledge and skills necessary to respond effectively to such an event (56%); however, many of these individuals still do not adequately prepare. When asked what barriers prevent them from preparing, respondents primarily cite a lack of knowledge about relevant local hazards. Results show that participation in response-related activities—a commonly recommended solution to this disconnect—minimally influences preparedness. Additionally, although local hazard maps successfully communicate the primary hazard—97% of respondents recognize the lahar hazard—many individuals incorrectly interpret other important facets of the maps. Those who participate in response-related activities fail to understand these

  17. Mount Baker lahars and debris flows, ancient, modern, and future

    Science.gov (United States)

    Tucker, David S; Scott, Kevin M.; Grossman, Eric E.; Linneman, Scott

    2014-01-01

    The Middle Fork Nooksack River drains the southwestern slopes of the active Mount Baker stratovolcano in northwest Washington State. The river enters Bellingham Bay at a growing delta 98 km to the west. Various types of debris flows have descended the river, generated by volcano collapse or eruption (lahars), glacial outburst floods, and moraine landslides. Initial deposition of sediment during debris flows occurs on the order of minutes to a few hours. Long-lasting, down-valley transport of sediment, all the way to the delta, occurs over a period of decades, and affects fish habitat, flood risk, gravel mining, and drinking water.

  18. ANALYSIS OF EFFECTIVE RAINFALL INTENSITY AND WORKING RAINFALL FOR BASIC WARNING CRITERIA DEVELOPMENT ON LAHAR FLOW EVENT

    Directory of Open Access Journals (Sweden)

    Fitriyadi Fitriyadi

    2015-05-01

    The research results showed that the number of reviewed rain series with a total rainfall of ≥ 80 mm is 9.28% of all rain series, and 12.5% of these caused lahar flow in the Gendol River. The probability of debris-flow occurrence for a total rainfall of ≥ 80 mm on the Gendol River amounted to 1.89%. This low value indicates a small likelihood of debris flow in the Gendol River, which is due to rain conditions in the Gendol watershed that differ from those in Japan, as well as the limitations of the available data. Further research is recommended on a total-rainfall threshold appropriate to the conditions in the Gendol watershed, taking into account other parameters that control lahar flow. In addition, it is necessary to repeat the analysis using a catchment-averaged rainfall method for each rain series.
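
    Warning criteria of this kind are often phrased in terms of a "working rainfall" that adds the current storm total to antecedent rainfall discounted with a half-life. The snippet below is a minimal sketch of that idea, assuming a one-day half-life and using the 80 mm figure cited above purely for illustration; it is not the calibrated criterion from this study.

```python
# Minimal sketch of a "working rainfall" criterion (assumed formulation):
# current storm total plus antecedent daily rainfall discounted with a half-life.
# The half-life and the 80 mm threshold are illustrative, not calibrated values.
def working_rainfall(storm_total_mm, antecedent_daily_mm, half_life_days=1.0):
    decayed = sum(r * 0.5 ** ((i + 1) / half_life_days)
                  for i, r in enumerate(antecedent_daily_mm))   # i = 0 is yesterday
    return storm_total_mm + decayed

storm = 55.0                      # mm falling in the current rain series
antecedent = [30.0, 10.0, 5.0]    # mm on the 1st, 2nd, 3rd day before the storm
rw = working_rainfall(storm, antecedent)
print(f"working rainfall = {rw:.1f} mm -> issue warning: {rw >= 80.0}")
```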

  19. EVALUATION OF DISASTER MITIGATION SYSTEM AGAINST LAHAR FLOW OF PUTIH RIVER, MT. MERAPI AREA

    Directory of Open Access Journals (Sweden)

    T. Maksal Saputra

    2013-05-01

    Results of the evaluation show that the existing early warning system does not provide sufficient time for the sand miners to save themselves. The proposed solution is to divide the sand-mining area in the Putih River into 3 zones, each with its own early-warning and evacuation procedure. This arrangement is intended to avoid casualties among the sand miners. Keywords: Lahar flood, sand miners, early warning.

  20. Interrelations among pyroclastic surge, pyroclastic flow, and lahars in Smith Creek valley during first minutes of 18 May 1980 eruption of Mount St. Helens, USA

    Science.gov (United States)

    Brantley, S.R.; Waitt, R.B.

    1988-01-01

    A devastating pyroclastic surge and resultant lahars at Mount St. Helens on 18 May 1980 produced several catastrophic flowages into tributaries on the northeast volcano flank. The tributaries channeled the flows to Smith Creek valley, which lies within the area devastated by the surge but was unaffected by the great debris avalanche on the north flank. Stratigraphy shows that the pyroclastic surge preceded the lahars; there is no notable "wet" character to the surge deposits. Therefore the lahars must have originated as snowmelt, not as ejected water-saturated debris that segregated from the pyroclastic surge as has been inferred for other flanks of the volcano. In stratigraphic order the Smith Creek valley-floor materials comprise (1) a complex valley-bottom facies of the pyroclastic surge and a related pyroclastic flow, (2) an unusual hummocky diamict caused by complex mixing of lahars with the dry pyroclastic debris, and (3) deposits of secondary pyroclastic flows. These units are capped by silt containing accretionary lapilli, which began falling from a rapidly expanding mushroom-shaped cloud 20 minutes after the eruption's onset. The Smith Creek valley-bottom pyroclastic facies consists of (a) a weakly graded basal bed of fines-poor granular sand, the deposit of a low-concentration lithic pyroclastic surge, and (b) a bed of very poorly sorted pebble to cobble gravel inversely graded near its base, the deposit of a high-concentration lithic pyroclastic flow. The surge apparently segregated while crossing the steep headwater tributaries of Smith Creek; large fragments that settled from the turbulent surge formed a dense pyroclastic flow along the valley floor that lagged behind the front of the overland surge. The unusual hummocky diamict as thick as 15 m contains large lithic clasts supported by a tough, brown muddy sand matrix like that of lahar deposits upvalley. This unit contains irregular friable lenses and pods meters in diameter, blocks incorporated from

  1. Comparative lahar hazard mapping at Volcan Citlaltépetl, Mexico using SRTM, ASTER and DTED-1 digital topographic data

    Science.gov (United States)

    Hubbard, Bernard E.; Sheridan, Michael F.; Carrasco-Nunez, Gerardo; Diaz-Castellon, Rodolfo; Rodriguez, Sergio R.

    2007-01-01

    In this study, we evaluated and compared the utility of spaceborne SRTM and ASTER DEMs with baseline DTED-1 “bald-earth” topography for mapping lahar inundation hazards from volcan Citlaltépetl, Mexico, a volcano which has had a history of producing debris flows of various extents. In particular, we tested the utility of these topographic datasets for resolving ancient valley-filling deposits exposed around the flanks of the volcano, for determining their magnitude using paleohydrologic methods and for forecasting their inundation limits in the future. We also use the three datasets as inputs to a GIS stream inundation flow model, LAHARZ, and compare the results.
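
    LAHARZ, the inundation model compared here, is built on semi-empirical scaling relations between lahar volume and inundated areas (Iverson, Schilling and Vallance, 1998): cross-sectional area A = 0.05 V^(2/3) and planimetric area B = 200 V^(2/3). The sketch below evaluates only this scaling for a few illustrative volumes; the full GIS model additionally needs a DEM, a drainage network and a proximal hazard-zone boundary, which are outside the scope of this snippet.

```python
# Sketch of the volume-area scaling LAHARZ rests on (Iverson et al., 1998):
# inundated valley cross-section A = 0.05 V^(2/3), planimetric area B = 200 V^(2/3).
# Only the scaling step is shown here; routing over a DEM is not reproduced.
def laharz_areas(volume_m3):
    a_cross = 0.05 * volume_m3 ** (2.0 / 3.0)    # m^2, filled valley cross-section
    b_plan = 200.0 * volume_m3 ** (2.0 / 3.0)    # m^2, total planimetric inundation
    return a_cross, b_plan

for v in (1e5, 1e6, 1e7):                        # example lahar volumes, m^3
    a, b = laharz_areas(v)
    print(f"V = {v:8.1e} m^3   A = {a:10.0f} m^2   B = {b:12.0f} m^2")
```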

  2. Geographic information techniques applied to the study of the origin of lahars and their experimental application on tropical stratovolcanoes

    OpenAIRE

    Andrés de Pablo, Nuria

    2011-01-01

    The objective of this research centers on the monitoring of the factors that control the initiation of lahars, through the use of affordable geographic information techniques, and on the practical application of the chosen methods to specific cases. The specific objectives are proposed according to the characteristics of each of the processes involved in the generation of lahars, which, in turn, are conditioned by the experimentation areas chosen. Among the var...

  3. Combining criteria for delineating lahar- and flash-flood-prone hazard and risk zones for the city of Arequipa, Peru

    Science.gov (United States)

    Thouret, J.-C.; Enjolras, G.; Martelli, K.; Santoni, O.; Luque, J. A.; Nagata, M.; Arguedas, A.; Macedo, L.

    2013-02-01

    Arequipa, the second largest city in Peru, is exposed to many natural hazards, most notably earthquakes, volcanic eruptions, landslides, lahars (volcanic debris flows), and flash floods. Of these, lahars and flash floods, triggered by occasional torrential rainfall, pose the most frequently occurring hazards that can affect the city and its environs, in particular the areas containing low-income neighbourhoods. This paper presents and discusses criteria for delineating areas prone to flash flood and lahar hazards, which are localized along the usually dry (except for the rainy season) ravines and channels of the Río Chili and its tributaries that dissect the city. Our risk-evaluation study is based mostly on field surveys and mapping, but we also took into account quality and structural integrity of buildings, available socio-economic data, and information gained from interviews with risk-management officials. In our evaluation of the vulnerability of various parts of the city, in addition to geological and physical parameters, we also took into account selected socio-economic parameters, such as the educational and poverty level of the population, unemployment figures, and population density. In addition, we utilized a criterion of the "isolation factor", based on distances to access emergency resources (hospitals, shelters or safety areas, and water) in each city block. By combining the hazard, vulnerability and exposure criteria, we produced detailed risk-zone maps at the city-block scale, covering the whole city of Arequipa and adjacent suburbs. Not surprisingly, these maps show that the areas at high risk coincide with blocks or districts with populations at low socio-economic levels. Inhabitants at greatest risk are the poor recent immigrants from rural areas who live in unauthorized settlements in the outskirts of the city in the upper parts of the valleys. Such settlements are highly exposed to natural hazards and have little access to vital resources. Our
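
    The block-scale risk zoning described above combines hazard, vulnerability, exposure and an "isolation factor". The sketch below shows one simple, purely illustrative way such normalized indices could be combined into a single block score; the multiplicative form, the isolation weighting and the example blocks are assumptions for illustration, not the authors' actual scheme.

```python
# Purely illustrative combination of normalized block-level indices (all in [0, 1],
# higher is worse). The multiplicative form and the isolation weighting are
# assumptions for illustration, not the scheme used by the authors.
def block_risk(hazard, vulnerability, exposure, isolation, w_isolation=0.5):
    v_eff = min(1.0, vulnerability * (1.0 + w_isolation * isolation))
    return hazard * v_eff * exposure

blocks = {                           # hypothetical city blocks
    "riverside informal settlement": (0.9, 0.8, 0.7, 0.9),
    "consolidated downtown block":   (0.4, 0.3, 0.9, 0.1),
}
for name, (h, v, e, iso) in blocks.items():
    print(f"{name:32s} risk index = {block_risk(h, v, e, iso):.2f}")
```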

  4. Simulating Lahars Using A Rotating Drum

    Science.gov (United States)

    Neather, Adam; Lube, Gert; Jones, Jim; Cronin, Shane

    2014-05-01

    A large (0.5 m in diameter, 0.15 m wide) rotating drum is used to investigate the erosion and deposition mechanics of lahars. To systematically simulate the conditions occurring in natural mass flows, our experimental setup differs from the common rotating drum employed in industrial/engineering studies. Natural materials with their typical friction properties are used, as opposed to the frequently employed spherical glass beads; the drum is completely water-proof, so solid/air and solid/liquid mixtures can be investigated; the drum velocity and acceleration can be precisely controlled using a software interface to a micro-controller, allowing for the study of steady, unsteady and intermediate flow regimes. The drum has a toughened glass door, allowing high-resolution, high-speed video recording of the material inside. Vector maps of the velocities involved in the flows are obtained using particle image velocimetry (PIV). The changes in velocity direction and/or magnitude are used to locate the primary internal boundaries between layers of opposite flow direction, as well as secondary interfaces between shear layers. A range of variables can be measured: thickness and number of layers; the curvature of the free surface; frequency of avalanching; position of the centre of mass of the material; and the velocity profiles of the flowing material. Experiments to date have focussed on dry materials, and have had a fill factor of approximately 0.3. Combining these measured variables allows us to derive additional data of interest, such as mass and momentum flux. It is these fluxes that we propose will allow insight into the erosion/deposition mechanics of a lahar. A number of conclusions can be drawn to date. A primary interface separates the flowing and passive regions (this interface has been identified in previous studies). As well as the primary interface, the flowing layer separates into individual shear layers, with individual erosion/deposition and flow histories. This
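
    Deriving mass and momentum flux from the PIV velocity profiles, as proposed above, amounts to depth-integrating the measured downslope velocity across the flowing layer. The snippet below is a minimal sketch of that bookkeeping with a made-up velocity profile, bulk density and layer thickness; it is not the group's processing code.

```python
# Sketch of the depth-integration step only (assumed profile and properties):
# mass and momentum flux per unit drum width from a PIV-style velocity profile.
rho_bulk = 1500.0                        # bulk density of the flowing layer, kg/m^3 (assumed)
depth = 0.03                             # flowing-layer thickness, m (assumed)
n = 16
y = [depth * i / (n - 1) for i in range(n)]            # depth coordinates, m
u = [0.5 * (yi / depth) ** 1.5 for yi in y]            # hypothetical downslope velocity, m/s

def trapezoid(values, coords):
    return sum(0.5 * (values[i] + values[i + 1]) * (coords[i + 1] - coords[i])
               for i in range(len(values) - 1))

mass_flux = rho_bulk * trapezoid(u, y)                             # kg m^-1 s^-1
momentum_flux = rho_bulk * trapezoid([ui * ui for ui in u], y)     # N m^-1
print(f"mass flux ~ {mass_flux:.2f} kg/(m s), momentum flux ~ {momentum_flux:.3f} N/m")
```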

  5. Lahar inundated, modified, and preserved 1.88 Ma early hominin (OH24 and OH56) Olduvai DK site.

    Science.gov (United States)

    Stanistreet, I G; Stollhofen, H; Njau, J K; Farrugia, P; Pante, M C; Masao, F T; Albert, R M; Bamford, M K

    2018-03-01

    Archaeological excavations at the DK site in the eastern Olduvai Basin, Tanzania, age-bracketed between ∼1.88 Ma (Bed I Basalt) and ∼1.85 Ma (Tuff IB), record the oldest lahar inundation, modification, and preservation of a hominin "occupation" site yet identified. Our landscape approach reconstructs environments and processes at high resolution to explain the distribution and final preservation of archaeological materials at the DK site, where an early hominin (likely Homo habilis) assemblage of stone tools and bones, found close to hominin specimens OH24 and OH56, developed on an uneven heterogeneous surface that was rapidly inundated by a lahar and buried to a depth of 0.4-1.2 m (originally ∼1.0-2.4 m pre-compaction). The incoming intermediate to high viscosity mudflow selectively modified the original accumulation of "occupation debris," so that it is no longer confined to the original surface. A dispersive debris "halo" was identified within the lahar deposit: debris is densest immediately above the site, but tails off until not present >150 m laterally. Voorhies indices and metrics derived from limb bones are used to define this dispersive halo spatially and might indicate a possible second assemblage to the east that is now eroded away. Based upon our new data and prior descriptions, two possibilities for the OH24 skull are suggested: it was either entrained by the mudflow from the DK surface and floated due to lower density toward its top, or it was deposited upon the solid top surface after its consolidation. Matrix adhering to material found in association with the parietals indicates that OH56 at least was relocated by the mudflow. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Combining criteria for delineating lahar- and flash-flood-prone hazard and risk zones for the city of Arequipa, Peru

    Directory of Open Access Journals (Sweden)

    J.-C. Thouret

    2013-02-01

    Full Text Available Arequipa, the second largest city in Peru, is exposed to many natural hazards, most notably earthquakes, volcanic eruptions, landslides, lahars (volcanic debris flows), and flash floods. Of these, lahars and flash floods, triggered by occasional torrential rainfall, pose the most frequently occurring hazards that can affect the city and its environs, in particular the areas containing low-income neighbourhoods. This paper presents and discusses criteria for delineating areas prone to flash flood and lahar hazards, which are localized along the usually dry (except for the rainy season) ravines and channels of the Río Chili and its tributaries that dissect the city. Our risk-evaluation study is based mostly on field surveys and mapping, but we also took into account quality and structural integrity of buildings, available socio-economic data, and information gained from interviews with risk-management officials.

    In our evaluation of the vulnerability of various parts of the city, in addition to geological and physical parameters, we also took into account selected socio-economic parameters, such as the educational and poverty level of the population, unemployment figures, and population density. In addition, we utilized a criterion of the "isolation factor", based on distances to access emergency resources (hospitals, shelters or safety areas, and water) in each city block. By combining the hazard, vulnerability and exposure criteria, we produced detailed risk-zone maps at the city-block scale, covering the whole city of Arequipa and adjacent suburbs. Not surprisingly, these maps show that the areas at high risk coincide with blocks or districts with populations at low socio-economic levels. Inhabitants at greatest risk are the poor recent immigrants from rural areas who live in unauthorized settlements in the outskirts of the city in the upper parts of the valleys. Such settlements are highly exposed to natural hazards and have little access

  7. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    Science.gov (United States)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. Volcanic hazards assessment is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. In addition to being rare events, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena generally represent a range of model complexities, from simplified physics-based conceptual models to highly coupled thermo-fluid dynamical approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well-characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented with the aim to provide a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  8. Volcanic Risk Perception and Preparedness in Communities within the Mount Baker and Glacier Peak Lahar Hazard Zones

    Science.gov (United States)

    Corwin, K.; Brand, B. D.

    2014-12-01

    A community's ability to effectively respond to and recover from natural hazards depends on both the physical characteristics of the hazard and the community's inherent resilience. Resilience is shaped by a number of factors including the residents' perception of and preparedness for a natural hazard as well as the level of institutional preparedness. This study examines perception of and preparedness for lahar hazards from Mount Baker and Glacier Peak in Washington's Skagit Valley. Through an online survey, this study isolates the influence of specific variables (e.g., knowledge, past experience, scientific background, trust in various information sources, occupation, self-efficacy, sense of community) on risk perception and explores reasons behind the frequent disconnect between perception and preparedness. We anticipate that individuals with more extensive education in the sciences, especially geology or earth science, foster greater trust in scientists and a more accurate knowledge, understanding, and perception of the volcanic hazards in their community. Additionally, little research exists examining the extent to which first responders and leaders in response-related institutions prepare on a personal level. Since these individuals work toward community preparedness professionally, we hypothesize that they will be more prepared at home than members of the general public. Finally, the Skagit Valley has a significant history of flooding. We expect that the need to respond to and recover from frequent flooding creates a community with an inherently higher level of preparedness for other hazards such as lahars. The results of this study will contribute to the understanding of what controls risk perception and the interplay between perception and preparedness. At a broader level, this study provides local and state-level emergency managers information to evaluate and improve response capabilities and communication with the public and key institutions in order to

  9. Stratigraphic And Lithofacies Study Of Distal Rain-Triggered Lahars: The Case Of West Coast Of Ecuador

    Science.gov (United States)

    Mulas, M.; Chunga, K.; Peña Carpio, E.; Falquez Torres, D. A.; Alcivar, R., Sr.; Lopez Coronel, M. C.

    2015-12-01

    The central zone of the coast of Ecuador, at the north of Manabí Province, in the area comprised between the Salango and Jama communities, is characterized by the presence of whitish to grey, centimeters- to meters-thick, consolidated to loose distal ash deposits. Recent archeological studies on the remains of the Valdivia (3500 BC) and Manteña (800-1500 AD; Harris et al. 2004) civilizations link these deposits with the intense eruptive phases that afflicted Ecuador 700-900 years ago (Usselman, 2006). Stratigraphic evidence and published datings of paleosols (Estrada, 1962; Mothes and Hall, 2008) allowed us to estimate that these deposits are linked with the 800 BP eruption of Quilotoa and the following eruptions of Cotopaxi. According to the Smith and Lowe classification (1991), the deposits outcropping on the coast (located at a distance greater than 160 km from the volcanic vents) vary from whitish to grey, loose to weakly consolidated, massive to weakly stratified, centimeters- to meters-thick, coarse to fine ash matrix layers (dilute streamflow facies) to massive, coarse to medium ash matrix deposits rich in large angular to sub-rounded siltite blocks (debris flow facies). These types of lithofacies are associated with rain-triggered lahars (De Belizal et al., 2013). The presence in some stratigraphic sections of sharp contacts, laminated layers of very fine ash, and cm-thick sand and silt layers between the ash beds of the same deposits indicates that the different pulses were generated both over short intervals and after longer pauses. Structures like water pipes imply that the lahar entered the sea (Schneider, 2004) and allow the reconstruction of the paleotopographic conditions during the emplacement of these deposits. This study focuses on the characterization of these types of deposits and helps to understand the kind of risk that may affect the towns located on the coast of Ecuador after VEI 4 to 6 eruptions, both in the short term and within years.

  10. The enormous Chillos Valley Lahar: An ash-flow-generated debris flow from Cotopaxi Volcano, Ecuador

    Science.gov (United States)

    Mothes, P.A.; Hall, M.L.; Janda, R.J.

    1998-01-01

    The Chillos Valley Lahar (CVL), the largest Holocene debris flow in area and volume as yet recognized in the northern Andes, formed on Cotopaxi volcano's north and northeast slopes and descended river systems that took it 326 km north-northwest to the Pacific Ocean and 130+ km east into the Amazon basin. In the Chillos Valley, 40 km downstream from the volcano, depths of 80-160 m and valley cross sections up to 337,000 m² are observed, implying peak flow discharges of 2.6-6.0 million m³/s. The overall volume of the CVL is estimated to be ∼3.8 km³. The CVL was generated approximately 4500 years BP by a rhyolitic ash flow that followed a small sector collapse on the north and northeast sides of Cotopaxi, which melted part of the volcano's icecap and transformed rapidly into the debris flow. The ash flow and resulting CVL have identical components, except for foreign fragments picked up along the flow path. Juvenile materials, including vitric ash, crystals, and pumice, comprise 80-90% of the lahar's deposit, whereas rhyolitic, dacitic, and andesitic lithics make up the remainder. The sand-size fraction and the 2- to 10-mm fraction together dominate the deposit, constituting ∼63 and ∼15 wt.% of the matrix, respectively, whereas the silt-size fraction averages less than ∼10 wt.% and the clay-size fraction less than 0.5 wt.%. Along the 326-km runout, these particle-size fractions vary little, as does the sorting coefficient (average = 2.6). There is no tendency toward grading or improved sorting. Limited bulking is recognized. The CVL was an enormous non-cohesive debris flow, notable for its ash-flow origin and immense volume and peak discharge which gave it characteristics and a behavior akin to large cohesive mudflows. Significantly, then, ash-flow-generated debris flows can also achieve large volumes and cover great areas; thus, they can conceivably affect large populated regions far from their source. Especially dangerous, therefore, are snowclad volcanoes
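
    The peak-discharge figures above follow directly from the continuity relation Q = A × v. As a purely arithmetic cross-check on the quoted numbers (no new data), the snippet below converts the largest reported cross-section and the quoted discharge range into the implied mean peak velocities.

```python
# Arithmetic cross-check only; both inputs are quoted in the abstract.
area = 337_000.0                       # m^2, largest observed valley cross-section
for q in (2.6e6, 6.0e6):               # m^3/s, quoted peak-discharge range
    print(f"Q = {q:.1e} m^3/s  ->  mean peak velocity ~ {q / area:.1f} m/s")
```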

  11. Lahar—River of volcanic mud and debris

    Science.gov (United States)

    Major, Jon J.; Pierson, Thomas C.; Vallance, James W.

    2018-05-09

    Lahar, an Indonesian word for volcanic mudflow, is a mixture of water, mud, and volcanic rock flowing swiftly along a channel draining a volcano. Lahars can form during or after eruptions, or even during periods of inactivity. They are among the greatest threats volcanoes pose to people and property. Lahars can occur with little to no warning, and may travel great distances at high speeds, destroying or burying everything in their paths. Lahars form in many ways. They commonly occur when eruptions melt snow and ice on snow-clad volcanoes; when rains fall on steep slopes covered with fresh volcanic ash; when crater lakes, volcano glaciers or lakes dammed by volcanic debris suddenly release water; and when volcanic landslides evolve into flowing debris. Lahars are especially likely to occur at erupting or recently active volcanoes. Because lahars are so hazardous, U.S. Geological Survey scientists pay them close attention. They study lahar deposits and limits of inundation, model flow behavior, develop lahar-hazard maps, and work with community leaders and governmental authorities to help them understand and minimize the risks of devastating lahars.

  12. Local to global: a collaborative approach to volcanic risk assessment

    Science.gov (United States)

    Calder, Eliza; Loughlin, Sue; Barsotti, Sara; Bonadonna, Costanza; Jenkins, Susanna

    2017-04-01

    Volcanic risk assessments at all scales present challenges related to the multitude of volcanic hazards, data gaps (hazards and vulnerability in particular), model representation and resources. Volcanic hazards include lahars, pyroclastic density currents, lava flows, tephra fall, ballistics, gas dispersal and also earthquakes, debris avalanches, tsunamis and more ... they can occur in different combinations and interact in different ways throughout the unrest, eruption and post-eruption period. Volcanoes and volcanic hazards also interact with other natural hazards (e.g. intense rainfall). Currently many hazards assessments consider the hazards from a single volcano but at national to regional scales the potential impacts of multiple volcanoes over time become important. The hazards that have the greatest tendency to affect large areas up to global scale are those transported in the atmosphere: volcanic particles and gases. Volcanic ash dispersal has the greatest potential to directly or indirectly affect the largest number of people worldwide, it is currently the only volcanic hazard for which a global assessment exists. The quantitative framework used (primarily at a regional scale) considers the hazard at a given location from any volcano. Flow hazards such as lahars and floods can have devastating impacts tens of kilometres from a source volcano and lahars can be devastating decades after an eruption has ended. Quantitative assessment of impacts is increasingly undertaken after eruptions to identify thresholds for damage and reduced functionality. Some hazards such as lava flows could be considered binary (totally destructive) but others (e.g. ash fall) have varying degrees of impact. Such assessments are needed to enhance available impact and vulnerability data. Currently, most studies focus on physical vulnerability but there is a growing emphasis on social vulnerability showing that it is highly variable and dynamic with pre-eruption socio

  13. Utilizing NASA Earth Observations to Model Volcanic Hazard Risk Levels in Areas Surrounding the Copahue Volcano in the Andes Mountains

    Science.gov (United States)

    Keith, A. M.; Weigel, A. M.; Rivas, J.

    2014-12-01

    Copahue is a stratovolcano located along the rim of the Caviahue Caldera near the Chile-Argentina border in the Andes Mountain Range. There are several small towns located in proximity of the volcano with the two largest being Banos Copahue and Caviahue. During its eruptive history, it has produced numerous lava flows, pyroclastic flows, ash deposits, and lahars. This isolated region has steep topography and little vegetation, rendering it poorly monitored. The need to model volcanic hazard risk has been reinforced by recent volcanic activity that intermittently released several ash plumes from December 2012 through May 2013. Exposure to volcanic ash is currently the main threat for the surrounding populations as the volcano becomes more active. The goal of this project was to study Copahue and determine areas that have the highest potential of being affected in the event of an eruption. Remote sensing techniques were used to examine and identify volcanic activity and areas vulnerable to experiencing volcanic hazards including volcanic ash, SO2 gas, lava flow, pyroclastic density currents and lahars. Landsat 7 Enhanced Thematic Mapper Plus (ETM+), Landsat 8 Operational Land Imager (OLI), EO-1 Advanced Land Imager (ALI), Terra Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Shuttle Radar Topography Mission (SRTM), ISS ISERV Pathfinder, and Aura Ozone Monitoring Instrument (OMI) products were used to analyze volcanic hazards. These datasets were used to create a historic lava flow map of the Copahue volcano by identifying historic lava flows, tephra, and lahars both visually and spectrally. Additionally, a volcanic risk and hazard map for the surrounding area was created by modeling the possible extent of ash fallout, lahars, lava flow, and pyroclastic density currents (PDC) for future eruptions. These model results were then used to identify areas that should be prioritized for disaster relief and evacuation orders.

  14. Towards a Proactive Risk Mitigation Strategy at La Fossa Volcano, Vulcano Island

    Science.gov (United States)

    Biass, S.; Gregg, C. E.; Frischknecht, C.; Falcone, J. L.; Lestuzzi, P.; di Traglia, F.; Rosi, M.; Bonadonna, C.

    2014-12-01

    A comprehensive risk assessment framework was built to develop proactive risk reduction measures for Vulcano Island, Italy. This framework includes identification of eruption scenarios, probabilistic hazard assessment, quantification of hazard impacts on the built environment, accessibility assessment on the island and a risk perception study. Vulcano, a 21 km² island with two primary communities host to 900 permanent residents and up to 10,000 visitors during summer, shows a strong dependency on the mainland for basic needs (water, energy) and relies on a ~2 month tourism season for its economy. The recent stratigraphy reveals a dominance of vulcanian and subplinian eruptions, producing a range of hazards acting at different time scales. We developed new methods to probabilistically quantify the hazard related to ballistics, lahars and tephra for all eruption styles. We also elaborated field- and GIS-based methods to assess the physical vulnerability of the built environment and created dynamic models of accessibility. Results outline the difference in hazard between short and long-lasting eruptions. A subplinian eruption has a 50% probability of impacting ~30% of the buildings within days after the eruption, but the year-long damage resulting from a long-lasting vulcanian eruption is similar if tephra is not removed from rooftops. Similarly, a subplinian eruption results in a volume of 7×10⁵ m³ of material potentially remobilized into lahars soon after the eruption. Similar volumes are expected for vulcanian activity over years, increasing the hazard of small lahars. Preferential lahar paths affect critical infrastructures lacking redundancy, such as the road network, communications systems, the island's only gas station, and access to the island's two evacuation ports. Such results from hazard, physical and systemic vulnerability help establish proactive volcanic risk mitigation strategies and may be applicable in other island settings.

  15. The Osceola Mudflow from Mount Rainier: Sedimentology and hazard implications of a huge clay-rich debris flow

    Science.gov (United States)

    Vallance, J.W.; Scott, K.M.

    1997-01-01

    altered rock in the preavalanche mass determines whether a debris avalanche will transform into a cohesive debris flow or remain a largely unsaturated debris avalanche. The distinction among cohesive lahar, noncohesive lahar, and debris avalanche is important in hazard assessment because cohesive lahars spread much more widely than noncohesive lahars that travel similar distances, and travel farther and spread more widely than debris avalanches of similar volume. The Osceola Mudflow is documented here as an example of a cohesive debris flow of huge size that can be used as a model for hazard analysis of similar flows.

  16. Mapping Pyroclastic Flow Inundation Using Radar and Optical Satellite Images and Lahar Modeling

    Directory of Open Access Journals (Sweden)

    Chang-Wook Lee

    2018-01-01

    Full Text Available Sinabung volcano, located above the Sumatra subduction of the Indo-Australian plate under the Eurasian plate, became active in 2010 after about 400 years of quiescence. We use ALOS/PALSAR interferometric synthetic aperture radar (InSAR) images to measure surface deformation from February 2007 to January 2011. We model the observed preeruption inflation and coeruption deflation using Mogi and prolate spheroid sources to infer volume changes of the magma chamber. We interpret that the inflation was due to magma accumulation in a shallow reservoir beneath Mount Sinabung and attribute the deflation to magma withdrawal from the shallow reservoir during the eruption as well as thermoelastic compaction of erupted material. The pyroclastic flow extent during the eruption is then derived from the LAHARZ model based on the coeruption volume from InSAR modeling and compared to that derived from the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) image. The pyroclastic flow inundation extents between the two different methods agree at about 86%, suggesting the capability of mapping pyroclastic flow inundation by combining radar and optical imagery as well as flow modeling.
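
    The inflation/deflation modeling referred to above treats the reservoir as a point (Mogi) pressure source. As a hedged illustration of that forward problem, the snippet below evaluates the vertical surface displacement of a Mogi source in its common volume-change form, u_z = (1 - nu) dV d / (pi (r^2 + d^2)^(3/2)); the depth and volume change are made-up values, not results from this study, and the prolate-spheroid source the authors also used is not reproduced here.

```python
# Hedged sketch of the Mogi point-source forward model in its common
# volume-change form: u_z = (1 - nu) * dV * d / (pi * (r^2 + d^2)^(3/2)).
# Depth and volume change are invented illustration values.
import math

def mogi_uz(r, depth, dvolume, nu=0.25):
    return (1.0 - nu) * dvolume * depth / (math.pi * (r**2 + depth**2) ** 1.5)

depth = 3000.0       # m, assumed reservoir depth
dvolume = 5.0e6      # m^3, assumed volume change (positive = inflation)
for r in (0.0, 2000.0, 5000.0):          # radial distance from source axis, m
    print(f"r = {r:6.0f} m   u_z = {100.0 * mogi_uz(r, depth, dvolume):5.2f} cm")
```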

  17. Considerations on comprehensive risk assessment and mitigation planning of volcanic ash-fall

    International Nuclear Information System (INIS)

    Toshida, Kiyoshi

    2010-01-01

    Volcanic ash-fall is an inevitable hazard throughout Japan, and causes a wide range of effects due to its physical and chemical properties. Nuclear power plants in Japan face the necessity to assess the risk from volcanic ash-fall. Risk assessment of volcanic ash-fall should include engineering solutions and mitigation planning as well as the ash-fall hazard. This report points out the characteristics relevant to reducing the various effects of volcanic ash-fall as follows. Large-scale eruptions produce prominent volcanic ash-falls that can reach power plants at a great distance. Aftermath hazards of ash-fall events, such as remobilization of fine ash particles and generation of lahars, require further assessment. The kind and extent of damage become greater whenever ash is wet; wet ash requires separate assessment in contrast to dry ash. The mitigation and recovery measures at power plants involve quick cleanup operations of volcanic ash. Those operations should be prepared through comprehensive risk assessment, and in cooperation with authorities, during the pre-eruption repose period. A comprehensive assessment of volcanic ash-fall hazards, however, has yet to be conducted. Development of risk communication methods may lead to wider implementation of mitigation planning. Numerical analysis of ash-fall hazards provides quantitative data on particle motions that can be used in the risk assessment. In order to implement the quantitative assessment method, verification of the effect of ambient air conditions on the altitude of the volcanic ash cloud is necessary. We need to develop a three-dimensional model of the volcanic ash cloud and calculate the motions of ash clouds under multiple conditions of ambient air. (author)

  18. Sediment transport in headwaters of a volcanic catchment—Kamchatka Peninsula case study

    Science.gov (United States)

    Chalov, Sergey R.; Tsyplenkov, Anatolii S.; Pietron, Jan; Chalova, Aleksandra S.; Shkolnyi, Danila I.; Jarsjö, Jerker; Maerker, Michael

    2017-09-01

    Due to specific environmental conditions, headwater catchments located on volcanic slopes and valleys are characterized by distinctive hydrology and sediment transport patterns. However, a lack of sufficient monitoring means that the governing processes and patterns in these areas are rarely well understood. In this study, spatiotemporal water discharge and sediment transport from upstream sources were investigated in one of the numerous headwater catchments located in the lahar valleys of the Kamchatka Peninsula: the Sukhaya Elizovskaya River near Avachinskii and Koryakskii volcanoes. Three different subcatchments and corresponding channel types (wandering rivers within lahar valleys, mountain rivers within volcanic slopes, and rivers within submountain terrains) were identified in the studied area. Our measurements from different observation periods between 2012 and 2014 showed that the studied catchment was characterized by extreme diurnal fluctuations of water discharges and sediment loads, influenced by snowmelt patterns and the high infiltration rates of the easily erodible lahar deposits. The highest recorded sediment loads were up to 9×10⁴ mg/L, which represents an increase of two orders of magnitude within one day of observations. Additionally, to get a quantitative estimate of the spatial distribution of the eroded material in the volcanic substrates, we applied an empirical soil erosion and sediment yield model, the modified universal soil loss equation (MUSLE). The modeling results showed that, even though applying a universal erosion model to non-agricultural areas (e.g., volcanic catchments) can lead to irrelevant results, the estimates the MUSLE model delivered might be acceptable for non-lahar areas of the studied volcanic catchment. Overall, the results of our study increase our understanding of the hydrology and associated sediment transport, supporting prediction and risk management within headwater volcanic catchments.
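
    For reference, the MUSLE relation applied in the study (Williams, 1975) predicts event sediment yield from runoff volume and peak discharge, modulated by the usual USLE-type factors: Y = 11.8 (Q·q_p)^0.56 · K · LS · C · P. The snippet below evaluates this formula with placeholder factor values; they are not the calibrated parameters for the Sukhaya Elizovskaya catchment.

```python
# MUSLE (Williams, 1975): sediment yield Y [t] = 11.8 * (Q * q_p)**0.56 * K * LS * C * P,
# with runoff volume Q [m^3] and peak discharge q_p [m^3/s]. Factor values
# below are placeholders, not calibrated values for this catchment.
def musle_sediment_yield(runoff_m3, peak_q_m3s, K, LS, C, P):
    return 11.8 * (runoff_m3 * peak_q_m3s) ** 0.56 * K * LS * C * P

yield_t = musle_sediment_yield(runoff_m3=5.0e4, peak_q_m3s=3.0,
                               K=0.3, LS=4.0, C=0.5, P=1.0)
print(f"event sediment yield ~ {yield_t:.0f} t")
```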

  19. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  20. Advances in volcano monitoring and risk reduction in Latin America

    Science.gov (United States)

    McCausland, W. A.; White, R. A.; Lockhart, A. B.; Marso, J. N.; Assistance Program, V. D.; Volcano Observatories, L. A.

    2014-12-01

    We describe results of cooperative work that advanced volcanic monitoring and risk reduction. The USGS-USAID Volcano Disaster Assistance Program (VDAP) was initiated in 1986 after disastrous lahars during the 1985 eruption of Nevado del Ruiz dramatized the need to advance international capabilities in volcanic monitoring, eruption forecasting and hazard communication. For the past 28 years, VDAP has worked with our partners to improve observatories, strengthen monitoring networks, and train observatory personnel. We highlight a few of the many accomplishments by Latin American volcano observatories. Advances in monitoring, assessment and communication, and lessons learned from the lahars of the 1985 Nevado del Ruiz eruption and the 1994 Paez earthquake enabled the Servicio Geológico Colombiano to issue timely, life-saving warnings for 3 large syn-eruptive lahars at Nevado del Huila in 2007 and 2008. In Chile, the 2008 eruption of Chaitén prompted SERNAGEOMIN to complete a national volcanic vulnerability assessment that led to a major increase in volcano monitoring. Throughout Latin America improved seismic networks now telemeter data to observatories where the decades-long background rates and types of seismicity have been characterized at over 50 volcanoes. Standardization of the Earthworm data acquisition system has enabled data sharing across international boundaries, of paramount importance during both regional tectonic earthquakes and during volcanic crises when vulnerabilities cross international borders. Sharing of seismic forecasting methods led to the formation of the international organization of Latin American Volcano Seismologists (LAVAS). LAVAS courses and other VDAP training sessions have led to international sharing of methods to forecast eruptions through recognition of precursors and to reduce vulnerabilities from all volcano hazards (flows, falls, surges, gas) through hazard assessment, mapping and modeling. Satellite remote sensing data

  1. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  2. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and finally errors of commission in nuclear power plants. Quantitative or prescriptive models that predict an operator's situation assessment in a given situation, i.e., the results of situation assessment, provide many benefits such as HSI design solutions, human performance data, and human reliability estimates. Unfortunately, only a few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. Thus we proposed a new computational situation assessment model of nuclear power plant operators. The proposed model, incorporating significant cognitive factors, uses a Bayesian belief network (BBN) as its model architecture. It is believed that communication between nuclear power plant operators affects operators' situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. As a result, the proposed model succeeded in representing the operators' behavior, and this paper shows the details
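
    As a toy illustration of the kind of inference a BBN-based situation assessment model performs, the snippet below applies Bayes' rule twice to update an operator's belief about a plant state, treating an HSI indication and a cue communicated by a crew member as conditionally independent evidence. The structure and all probabilities are invented for illustration and are not taken from the proposed model.

```python
# Toy Bayesian updating, not the authors' BBN: an operator's belief in a plant
# state is revised first by an HSI indication, then by a communicated cue,
# treated as conditionally independent evidence. All numbers are invented.
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    num = p_evidence_if_true * prior
    return num / (num + p_evidence_if_false * (1.0 - prior))

belief = 0.05                                   # prior P(abnormal plant state)
belief = bayes_update(belief, 0.9, 0.1)         # HSI alarm observed
print(f"after HSI indication:   P = {belief:.3f}")
belief = bayes_update(belief, 0.8, 0.3)         # colleague reports confirming cue
print(f"after communicated cue: P = {belief:.3f}")
```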

  3. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
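
    As a concrete illustration of the stochastic procedures recommended above, the sketch below propagates lognormal parameter uncertainty through a simple multiplicative transfer model and ranks the parameters by their rank correlation with the predicted dose. The parameter names, distributions, and values are illustrative assumptions, not taken from the paper.

      # Sketch only: Monte Carlo propagation of parameter uncertainty and
      # ranking of parameters by their contribution to output uncertainty.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(0)
      n = 10_000
      deposition = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)    # Bq/m^2
      transfer   = rng.lognormal(mean=np.log(0.1), sigma=0.8, size=n)    # m^2/kg
      dose_coef  = rng.lognormal(mean=np.log(1e-8), sigma=0.3, size=n)   # Sv/Bq
      intake     = 200.0                                                 # kg/yr

      dose = deposition * transfer * dose_coef * intake   # Sv/yr, multiplicative chain

      print("median:", np.median(dose), "95th percentile:", np.percentile(dose, 95))
      for name, x in [("deposition", deposition), ("transfer", transfer),
                      ("dose_coef", dose_coef)]:
          rho, _ = spearmanr(x, dose)
          print(f"{name:10s} rank correlation with dose: {rho:.2f}")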

  4. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories: initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose

  5. Hazard assessment of long-range tephra dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico). Implications on civil aviation

    Science.gov (United States)

    Bonasia, R.; Scaini, C.; Capra, L.; Nathenson, M.; Siebe, C.; Arana-Salinas, L.; Folch, A.

    2013-12-01

    Popocatépetl is one of the most active volcanoes in Mexico, threatening a densely populated area that includes Mexico City, with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene-Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude. The current volcanic hazards map, reconstructed after the crisis that occurred in 1994, considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars. However, no quantitative assessment of the tephra dispersal hazard, especially related to atmospheric dispersal, has been performed. Given the high number of important airports in the surroundings of Popocatépetl volcano, and considering the potential threat posed to civil aviation in Mexico and adjacent regions in the case of a Plinian eruption, a hazard assessment for tephra dispersal is strongly required. In this work we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentration at different flight levels. Tephra dispersal modelling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the 'Ochre Pumice' Plinian eruption (4965 14C yr BP). FALL3D input eruptive parameters are constrained through an inversion method carried out with the semi-analytical HAZMAP model and are varied by sampling them from a probability density function. We analyze the influence of seasonal variations on ash dispersal and estimate the average persistence of critical ash concentrations at relevant locations and airports. This study assesses the impact that a Plinian eruption similar to the Ochre Pumice eruption would have on the main airports of Mexico and adjacent areas. The hazard maps presented here
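
    The probabilistic hazard maps described here boil down to exceedance probabilities computed over an ensemble of dispersal simulations. The sketch below illustrates that final counting step only; it is not the FALL3D/HAZMAP workflow, and the grid, concentrations, and threshold are stand-in values.

      # Illustration: turn an ensemble of ash-dispersal runs into a map of
      # P(airborne ash concentration > threshold) per grid cell.
      import numpy as np

      rng = np.random.default_rng(7)
      n_runs, ny, nx = 200, 50, 60
      ash = rng.lognormal(mean=-1.0, sigma=1.2, size=(n_runs, ny, nx))  # mg/m^3, stand-in output

      threshold = 2.0                                   # mg/m^3, example aviation threshold
      exceedance_prob = (ash > threshold).mean(axis=0)  # fraction of runs exceeding, per cell
      print("maximum exceedance probability:", exceedance_prob.max())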

  6. Conceptual Development of a National Volcanic Hazard Model for New Zealand

    Science.gov (United States)

    Stirling, Mark; Bebbington, Mark; Brenna, Marco; Cronin, Shane; Christophersen, Annemarie; Deligne, Natalia; Hurst, Tony; Jolly, Art; Jolly, Gill; Kennedy, Ben; Kereszturi, Gabor; Lindsay, Jan; Neall, Vince; Procter, Jonathan; Rhoades, David; Scott, Brad; Shane, Phil; Smith, Ian; Smith, Richard; Wang, Ting; White, James D. L.; Wilson, Colin J. N.; Wilson, Tom

    2017-06-01

    We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g. short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.

  7. Conceptual Development of a National Volcanic Hazard Model for New Zealand

    Directory of Open Access Journals (Sweden)

    Mark Stirling

    2017-06-01

    Full Text Available We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration, and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g., short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.

  8. Ecosystem Model Skill Assessment. Yes We Can!

    Science.gov (United States)

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S

    2016-01-01

    Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for only a limited set of biophysical models. A range of skill assessment methods have been reviewed, but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) and a suite of time series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both the operational use of other ecosystem models and future model development. We show that it is not only possible to assess the skill of a complicated marine ecosystem model, but that it is necessary to do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable
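
    For readers unfamiliar with the metrics listed in this record, the sketch below shows one straightforward way to compute them for a single observed/forecast time series; it is an illustration, not the NEUS Atlantis evaluation code, and the biomass numbers are invented.

      # Skill metrics: mean absolute error, RMSE, modeling efficiency
      # (Nash-Sutcliffe form, 1 = perfect), and Spearman rank correlation.
      import numpy as np
      from scipy.stats import spearmanr

      def skill(obs, pred):
          obs, pred = np.asarray(obs, float), np.asarray(pred, float)
          err = pred - obs
          mae = np.mean(np.abs(err))
          rmse = np.sqrt(np.mean(err**2))
          mef = 1.0 - np.sum(err**2) / np.sum((obs - obs.mean())**2)
          rho, _ = spearmanr(obs, pred)
          return {"MAE": mae, "RMSE": rmse, "MEF": mef, "rho": rho}

      # e.g. ten years of observed vs forecast biomass for one species (made-up numbers)
      print(skill([3.1, 2.8, 3.0, 3.4, 3.2, 2.9, 3.3, 3.5, 3.1, 3.0],
                  [3.0, 2.9, 3.2, 3.3, 3.0, 3.1, 3.2, 3.6, 3.2, 2.9]))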

  9. Bioavailability in the boris assessment model

    International Nuclear Information System (INIS)

    Norden, M.; Avila, R.; Gonze, M.A.; Tamponnet, C.

    2004-01-01

    The fifth framework EU project BORIS (Bioavailability Of Radionuclides In Soils: role of biological components and resulting improvement of prediction models) has three scientific objectives. The first is to improve understanding of the mechanisms governing the transfer of radionuclides to plants. The second is to improve existing predictive models of radionuclide interaction with soils by incorporating the knowledge acquired from the experimental results. The third and last objective is to extract from the experimental results a scientific basis for the development of bioremediation methods for radionuclide-contaminated soils and to understand the role of additional non-radioactive pollutants in radionuclide bioavailability. This paper is focused on the second objective. The purpose of the BORIS assessment model is to describe the behaviour of radionuclides in the soil-plant system with the aim of predicting the time dynamics of the bioavailability of radionuclides in soil and the radionuclide concentrations in plants. To be useful, the assessment model should be rather simple and use only a few parameters, which are commonly available or possible to measure for different sites. The model shall take into account, as much as possible, the results of the experimental studies and the mechanistic models developed in the BORIS project. One possible approach is to introduce into the assessment model a quantitative relationship between the bioavailability of radionuclides in soil and the soil properties. To do this, an operational definition of bioavailability is needed. Here, operational means experimentally measurable, directly or indirectly, and that the bioavailability can be translated into a mathematical expression. This paper describes the reasoning behind the chosen definition of bioavailability for the assessment model, how to derive operational expressions for the bioavailability, and how to use them in the assessment model. (author)

  10. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
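
    Two of the techniques named in this record, Latin hypercube sampling and the analysis of multiplicative chain models with lognormal inputs, are easy to illustrate together. The sketch below is a generic example under assumed shape parameters, not the methodology of the report itself.

      # Hand-rolled Latin hypercube sample of lognormal inputs to a
      # multiplicative chain model y = x1 * x2 * x3.
      import numpy as np
      from scipy.stats import lognorm

      rng = np.random.default_rng(1)

      def lhs(n_samples, n_dims):
          """Each dimension gets exactly one point in each of n_samples strata of [0, 1]."""
          u = np.empty((n_samples, n_dims))
          for j in range(n_dims):
              strata = rng.permutation(n_samples)
              u[:, j] = (strata + rng.random(n_samples)) / n_samples
          return u

      sigmas = [0.4, 0.6, 0.3]                 # assumed lognormal shape of each input
      u = lhs(1000, 3)
      x = np.column_stack([lognorm(s).ppf(u[:, j]) for j, s in enumerate(sigmas)])
      y = x.prod(axis=1)
      # For independent lognormal inputs the product is itself lognormal, with
      # log-variance equal to the sum of the input log-variances:
      print("sampled var(log y):", np.log(y).var(), "analytic:", sum(s**2 for s in sigmas))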

  11. Models for Pesticide Risk Assessment

    Science.gov (United States)

    EPA considers the toxicity of the pesticide as well as the amount of pesticide to which a person or the environment may be exposed in risk assessment. Scientists use mathematical models to predict pesticide concentrations in exposure assessment.

  12. Assessment of the Rescorla-Wagner model.

    Science.gov (United States)

    Miller, R R; Barnet, R C; Grahame, N J

    1995-05-01

    The Rescorla-Wagner model has been the most influential theory of associative learning to emerge from the study of animal behavior over the last 25 years. Recently, equivalence to this model has become a benchmark in assessing connectionist models, with such equivalence often achieved by incorporating the Widrow-Hoff delta rule. This article presents the Rescorla-Wagner model's basic assumptions, reviews some of the model's predictive successes and failures, relates the failures to the model's assumptions, and discusses the model's heuristic value. It is concluded that the model has had a positive influence on the study of simple associative learning by stimulating research and contributing to new model development. However, this benefit should neither lead to the model being regarded as inherently "correct" nor imply that its predictions can be profitably used to assess other models.
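
    The learning rule at the heart of the model is compact enough to state directly: on each trial, every present cue is adjusted by a shared prediction error. The sketch below uses a single learning-rate product alpha*beta for all cues, which is a simplification; the parameter values and the blocking demonstration are illustrative.

      # Rescorla-Wagner update: delta_V = alpha * beta * (lambda - sum of V for present cues)
      def rescorla_wagner(trials, alpha=0.3, beta=1.0, lam_present=1.0, lam_absent=0.0):
          """trials: list of (cues_present, us_present) pairs; returns associative strength per cue."""
          V = {}
          for cues, us in trials:
              total = sum(V.get(c, 0.0) for c in cues)
              lam = lam_present if us else lam_absent
              delta = alpha * beta * (lam - total)       # shared prediction error
              for c in cues:
                  V[c] = V.get(c, 0.0) + delta
          return V

      # Blocking demo: pretraining on A leaves little prediction error for B on AB trials
      trials = [({"A"}, True)] * 20 + [({"A", "B"}, True)] * 20
      print(rescorla_wagner(trials))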

  13. Understanding National Models for Climate Assessments

    Science.gov (United States)

    Dave, A.; Weingartner, K.

    2017-12-01

    National-level climate assessments have been produced or are underway in a number of countries. These efforts showcase a variety of approaches to mapping climate impacts onto human and natural systems, and involve a variety of development processes, organizational structures, and intended purposes. This presentation will provide a comparative overview of national 'models' for climate assessments worldwide, drawing from a geographically diverse group of nations with varying capacities to conduct such assessments. Using an illustrative sampling of assessment models, the presentation will highlight the range of assessment mandates and requirements that drive this work, methodologies employed, focal areas, and the degree to which international dimensions are included for each nation's assessment. This not only allows the U.S. National Climate Assessment to be better understood within an international context, but provides the user with an entry point into other national climate assessments around the world, enabling a better understanding of the risks and vulnerabilities societies face.

  14. Predictions of models for environmental radiological assessment

    International Nuclear Information System (INIS)

    Peres, Sueli da Silva; Lauria, Dejanira da Costa; Mahler, Claudio Fernando

    2011-01-01

    In the field of environmental impact assessment, models are used for estimating the source term, environmental dispersion and transfer of radionuclides, exposure pathways, radiation dose, and the risk to human beings. Although it is recognized that specific local data are important for improving the quality of dose assessment results, obtaining such data can in practice be very difficult and expensive. Sources of uncertainty are numerous, among which we can cite: the subjectivity of modelers, exposure scenarios and pathways, the codes used, and general parameters. The various models available utilize different mathematical approaches with different complexities that can result in different predictions. Thus, for the same inputs, different models can produce very different outputs. This paper briefly presents the main advances in the field of environmental radiological assessment that aim to improve the reliability of the models used in the assessment of environmental radiological impact. A model intercomparison exercise supplied incompatible results for 137Cs and 60Co, reinforcing the need to develop reference methodologies for environmental radiological assessment that allow dose estimations to be confronted on a common basis for comparison. The results of the intercomparison exercise are presented briefly. (author)

  15. STAMINA - Model description. Standard Model Instrumentation for Noise Assessments

    NARCIS (Netherlands)

    Schreurs EM; Jabben J; Verheijen ENG; CMM; mev

    2010-01-01

    This report describes the STAMINA model, which stands for Standard Model Instrumentation for Noise Assessments and was developed by RIVM. The institute uses this standard model to map environmental noise in the Netherlands. The model is based on the Standaard Karteringsmethode (standard noise-mapping method)

  16. PARALLEL MODELS OF ASSESSMENT: INFANT MENTAL HEALTH AND THERAPEUTIC ASSESSMENT MODELS INTERSECT THROUGH EARLY CHILDHOOD CASE STUDIES.

    Science.gov (United States)

    Gart, Natalie; Zamora, Irina; Williams, Marian E

    2016-07-01

    Therapeutic Assessment (TA; S.E. Finn & M.E. Tonsager, 1997; J.D. Smith, 2010) is a collaborative, semistructured model that encourages self-discovery and meaning-making through the use of assessment as an intervention approach. This model shares core strategies with infant mental health assessment, including close collaboration with parents and caregivers, active participation of the family, a focus on developing new family stories and increasing parents' understanding of their child, and reducing isolation and increasing hope through the assessment process. The intersection of these two theoretical approaches is explored, using case studies of three infants/young children and their families to illustrate the application of TA to infant mental health. The case of an 18-month-old girl whose parents fear that she has bipolar disorder illustrates the core principles of the TA model, highlighting the use of assessment intervention sessions and the clinical approach to preparing assessment feedback. The second case follows an infant with a rare genetic syndrome from ages 2 to 24 months, focusing on the assessor-parent relationship and the importance of a developmental perspective. Finally, assessment of a 3-year-old boy illustrates the development and use of a fable as a tool to provide feedback to a young child about assessment findings and recommendations. © 2016 Michigan Association for Infant Mental Health.

  17. A multi-model assessment of terrestrial biosphere model data needs

    Science.gov (United States)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED) model outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure to overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial
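
    The final step described above, combining parameter uncertainties with model sensitivities into fractional contributions, can be illustrated with a first-order variance decomposition. This is a generic formulation under assumed sensitivities and variances, not necessarily PEcAn's exact implementation, and the trait names are placeholders.

      # Fractional contribution of each parameter to predictive variance,
      # c_i proportional to (d output / d parameter_i)^2 * Var(parameter_i).
      import numpy as np

      sens = np.array([0.8, -0.3, 1.5])     # local sensitivities (illustrative)
      var  = np.array([0.04, 0.25, 0.01])   # parameter variances from a meta-analysis (illustrative)
      contrib = sens**2 * var
      print(dict(zip(["Vcmax", "leaf_lifespan", "SLA"], np.round(contrib / contrib.sum(), 3))))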

  18. Road Assessment Model and Pilot Application in China

    Directory of Open Access Journals (Sweden)

    Tiejun Zhang

    2014-01-01

    Full Text Available Risk assessment of roads is an effective approach for road agencies to determine safety improvement investments. It can increase the cost-effective returns from crash and injury reductions. To obtain a powerful risk assessment model for China, the Research Institute of Highway (RIOH) is developing the China Road Assessment Programme (ChinaRAP) model to characterize traffic crashes in China, in partnership with the International Road Assessment Programme (iRAP). The ChinaRAP model is based upon RIOH's achievements and iRAP models. This paper documents part of ChinaRAP's research work, mainly the RIOH model and its pilot application in a province in China.

  19. Utility of Social Modeling for Proliferation Assessment - Enhancing a Facility-Level Model for Proliferation Resistance Assessment of a Nuclear Enegry System

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Brothers, Alan J.; Gastelum, Zoe N.; Olson, Jarrod; Thompson, Sandra E.

    2009-10-26

    The Utility of Social Modeling for Proliferation Assessment project (PL09-UtilSocial) investigates the use of social and cultural information to improve nuclear proliferation assessments, including nonproliferation assessments, Proliferation Resistance (PR) assessments, safeguards assessments, and other related studies. These assessments often use and create technical information about a host State and its posture towards proliferation, the vulnerability of a nuclear energy system (NES) to an undesired event, and the effectiveness of safeguards. The objective of this project is to find and integrate social and technical information by explicitly considering the role of cultural, social, and behavioral factors relevant to proliferation, and to describe and demonstrate if and how social science modeling has utility in proliferation assessment. This report describes a modeling approach and how it might be used to support a location-specific PR assessment of a particular NES. The report demonstrates the use of social modeling to enhance an existing assessment process that relies primarily on technical factors. This effort builds on a literature review and preliminary assessment performed as the first stage of the project and compiled in PNNL-18438. This report describes an effort to answer questions about whether it is possible to incorporate social modeling into a PR assessment in such a way that we can determine the effects of social factors on a primarily technical assessment. This report provides: 1. background information about the relevant social factors literature; 2. background information about a particular PR assessment approach relevant to this demonstration; 3. a discussion of the social modeling undertaken to find and characterize social factors that are relevant to the PR assessment of a nuclear facility in a specific location; 4. a description of an enhancement concept that integrates social factors into an existing, technically

  20. The Model for Assessment of Telemedicine (MAST)

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Clemensen, Jane; Caffery, Liam J

    2017-01-01

    The evaluation of telemedicine can be achieved using different evaluation models or theoretical frameworks. This paper presents a scoping review of published studies which have applied the Model for Assessment of Telemedicine (MAST). MAST includes pre-implementation assessment (e.g. by use...

  1. Modelling the pre-assessment learning effects of assessment: evidence in the validity chain.

    Science.gov (United States)

    Cilliers, Francois J; Schuwirth, Lambert W T; van der Vleuten, Cees P M

    2012-11-01

    We previously developed a model of the pre-assessment learning effects of consequential assessment and started to validate it. The model comprises assessment factors, mechanism factors and learning effects. The purpose of this study was to continue the validation process. For stringency, we focused on a subset of assessment factor-learning effect associations that featured least commonly in a baseline qualitative study. Our aims were to determine whether these uncommon associations were operational in a broader but similar population to that in which the model was initially derived. A cross-sectional survey of 361 senior medical students at one medical school was undertaken using a purpose-made questionnaire based on a grounded theory and comprising pairs of written situational tests. In each pair, the manifestation of an assessment factor was varied. The frequencies at which learning effects were selected were compared for each item pair, using an adjusted alpha to assign significance. The frequencies at which mechanism factors were selected were calculated. There were significant differences in the learning effect selected between the two scenarios of an item pair for 13 of this subset of 21 uncommon associations, even with the adjusted alpha applied. For a subset of uncommon associations in the model, the role of most assessment factor-learning effect associations and the mechanism factors involved were supported in a broader but similar population to that in which the model was derived. Although model validation is an ongoing process, these results move the model one step closer to the stage of usefully informing interventions. Results illustrate how factors not typically included in studies of the learning effects of assessment could confound the results of interventions aimed at using assessment to influence learning. © Blackwell Publishing Ltd 2012.

  2. Integrated Assessment Model Evaluation

    Science.gov (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside

  3. Assessment of the assessment: Evaluation of the model quality estimates in CASP10

    KAUST Repository

    Kryshtafovych, Andriy

    2013-08-31

    The article presents an assessment of the ability of the thirty-seven model quality assessment (MQA) methods participating in CASP10 to provide an a priori estimation of the quality of structural models, and of the 67 tertiary structure prediction groups to provide confidence estimates for their predicted coordinates. The assessment of MQA predictors is based on the methods used in previous CASPs, such as correlation between the predicted and observed quality of the models (both at the global and local levels), accuracy of methods in distinguishing between good and bad models as well as good and bad regions within them, and ability to identify the best models in the decoy sets. Several numerical evaluations were used in our analysis for the first time, such as comparison of global and local quality predictors with reference (baseline) predictors and a ROC analysis of the predictors' ability to differentiate between the well and poorly modeled regions. For the evaluation of the reliability of self-assessment of the coordinate errors, we used the correlation between the predicted and observed deviations of the coordinates and a ROC analysis of correctly identified errors in the models. A modified two-stage procedure for testing MQA methods in CASP10, whereby a small number of models spanning the whole range of model accuracy was released first, followed by the release of a larger number of models of more uniform quality, allowed a more thorough analysis of the abilities and inabilities of different types of methods. Clustering methods were shown to have an advantage over the single- and quasi-single-model methods on the larger datasets. At the same time, the evaluation revealed that the size of the dataset has a smaller influence on the global quality assessment scores (for both clustering and nonclustering methods) than its diversity. Narrowing the quality range of the assessed models caused a significant decrease in accuracy of ranking for global quality predictors but
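
    Two of the evaluation measures named in this record are easy to make concrete: the correlation between predicted and observed model quality, and a ROC-style separation of good from bad models. The sketch below uses invented scores and a hand-rolled AUC; it is not the CASP assessment code.

      import numpy as np
      from scipy.stats import pearsonr

      predicted = np.array([0.90, 0.70, 0.65, 0.50, 0.40, 0.30])  # predicted model quality
      observed  = np.array([0.85, 0.75, 0.60, 0.55, 0.35, 0.20])  # observed model quality
      print("Pearson r:", pearsonr(predicted, observed)[0])

      def roc_auc(scores, labels):
          """AUC via the rank-sum identity; labels: 1 for 'good' models, 0 for 'bad'."""
          order = np.argsort(scores)
          ranks = np.empty(len(scores))
          ranks[order] = np.arange(1, len(scores) + 1)
          pos = labels == 1
          n_pos, n_neg = pos.sum(), (~pos).sum()
          return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

      labels = (observed > 0.5).astype(int)   # an arbitrary threshold defining "good" models
      print("ROC AUC:", roc_auc(predicted, labels))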

  4. Attention modeling for video quality assessment

    DEFF Research Database (Denmark)

    You, Junyong; Korhonen, Jari; Perkis, Andrew

    2010-01-01

    The local quality is derived from visual attention modeling and quality variations over frames. Saliency, motion, and contrast information are taken into account in modeling visual attention, which is then integrated into IQMs to calculate the local quality of a video frame. The overall quality is then obtained as an average between the global quality and the local quality. Experimental results demonstrate that the combination of the global quality and local quality outperforms both sole global quality and local quality, as well as other quality models, in video quality assessment. In addition, the proposed video quality modeling algorithm can improve the performance of image quality metrics on video quality assessment compared to the normal averaged spatiotemporal pooling scheme.

  5. Personalized pseudophakic model for refractive assessment.

    Science.gov (United States)

    Ribeiro, Filomena J; Castanheira-Dinis, António; Dias, João M

    2012-01-01

    To test a pseudophakic eye model that allows for intraocular lens (IOL) power calculation, both in normal eyes and in extreme conditions, such as post-LASIK. The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing us to test the same method in the same eye after only changing the corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define the corneal surfaces in optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF) weighted according to the contrast sensitivity function (CSF) were applied. Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959, p < 0.05). Comparison of post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778, p < 0.05). Results suggest that personalized pseudophakic eye models and ray-tracing allow for the use of the same methodology, regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.
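
    The optimization criterion described here, an MTF weighted by a contrast sensitivity function, reduces to a weighted average over spatial frequencies. The sketch below is a rough stand-in: the CSF shape and the toy MTF generator are invented, whereas in the paper the MTF comes from ray tracing in Zemax.

      import numpy as np

      freqs = np.linspace(1.0, 30.0, 30)            # spatial frequency, cycles/degree
      csf = freqs * np.exp(-0.15 * freqs)           # toy band-pass CSF shape (assumption)

      def weighted_mtf_score(mtf_values):
          """CSF-weighted average of an MTF curve sampled at `freqs`."""
          return float(np.sum(mtf_values * csf) / np.sum(csf))

      def choose_iol_power(candidate_powers, mtf_for_power):
          """mtf_for_power(p) should return the ray-traced MTF curve for IOL power p."""
          scores = {p: weighted_mtf_score(mtf_for_power(p)) for p in candidate_powers}
          return max(scores, key=scores.get), scores

      # toy stand-in for the ray tracer: best focus at 21.0 D, defocus lowers contrast
      toy_mtf = lambda p: np.exp(-freqs / 60.0) * np.exp(-((p - 21.0) ** 2) / 2.0)
      best, _ = choose_iol_power(np.arange(18.0, 24.5, 0.5), toy_mtf)
      print("selected IOL power:", best)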

  6. Fluid flow in disrupted porous media: Volcanological and radiological waste applications

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Peter Jacob [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-04

    This presentation provides Technical defense supporting documents, with the following slides: (1) Introduction (2) Long boundary drainage as a source for lahars (submitted for review) (3) Development and implementation of a porosity-dependent capillary suction function in FEHM (4) Modeling in support of field testing high-level radioactive waste in salt (5) Conclusions.

  7. The use of geographical information systems for disaster risk reduction strategies: a case study of Volcan de Colima, Mexico

    Science.gov (United States)

    Landeg, O.

    Contemporary disaster risk management requires the analysis of vulnerability and hazard exposure, which is imperative at Volcan de Colima (VdC), Mexico, due to the large-magnitude eruption forecast to occur before 2025. The methods used to gauge social vulnerability included the development and application of proxies to census records, the undertaking of a building vulnerability survey and the spatial mapping of civil and emergency infrastructure. Hazard exposure was assessed using primary modelling of laharic events and the digitalisation of secondary data sources detailing the modelled extent of pyroclastic flows and tephra deposition associated with a large-magnitude (VEI 5) eruption at VdC. The undertaking and analysis of a risk perception survey of the population enabled an understanding of the cognitive behaviour of residents towards the volcanic risk. In comparison to the published hazard map, the GIS analysis highlighted an underestimation of the lahar hazard on the western flank of VdC and of the regional tephra hazard. Vulnerability analysis identified three communities where social deprivation is relatively high, and those with significant elderly and transient populations near the volcano. Furthermore, recognition of the possibility of an eruption in the near future was found to be low across the study region. These results also contributed to the analysis of emergency management procedures and the preparedness of the regional authorities. This multidisciplinary research programme demonstrates the success of applying a GIS platform to varied, integrative spatial and temporal analyses. Furthermore, ascertaining the impact of future activity at VdC upon its surrounding populations permits the evaluation of emergency preparedness and disaster risk reduction strategies.

  8. Review of early assessment models of innovative medical technologies.

    Science.gov (United States)

    Fasterholdt, Iben; Krahn, Murray; Kidholm, Kristian; Yderstræde, Knud Bonnet; Pedersen, Kjeld Møller

    2017-08-01

    Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models for early assessment in different health organisations and discusses which models hold most promise for hospital decision makers. A scoping review of published studies between 1996 and 2015 was performed using nine databases. The following information was collected: decision context, decision problem, and a description of the early assessment model. A total of 2362 articles were identified and 12 studies fulfilled the inclusion criteria. An additional 12 studies were identified and included in the review by searching reference lists. The majority of the 24 early assessment studies were variants of traditional cost-effectiveness analysis. Around one fourth of the studies presented an evaluation model with a broader focus than cost-effectiveness. Uncertainty was mostly handled by simple sensitivity or scenario analysis. This review shows that evaluation models using known methods for assessing cost-effectiveness are most prevalent in early assessment, but these seem ill-suited for early assessment in hospitals. Four models provided some usable elements for the development of a hospital-based model. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
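
    Most of the models reviewed here build on standard cost-effectiveness arithmetic, which can be shown in a few lines. The numbers below are invented, and the simple scenario loop stands in for the sensitivity and scenario analyses mentioned in the record.

      # Incremental cost-effectiveness ratio: extra cost per extra unit of effect.
      def icer(cost_new, cost_old, effect_new, effect_old):
          return (cost_new - cost_old) / (effect_new - effect_old)

      print("base case:", icer(12_000.0, 9_000.0, 6.4, 6.1))   # e.g. cost per QALY gained
      for effect_new in (6.2, 6.4, 6.6):                       # crude scenario analysis
          print(effect_new, icer(12_000.0, 9_000.0, effect_new, 6.1))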

  9. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol.2

    International Nuclear Information System (INIS)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM), is a micro-computer-based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM, including the methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. The radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input), in the text of this document. Two companion volumes to this one provide additional information on IRDAM. The User's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios
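
    IRDAM's own models and default values are not reproduced here, but the generic arithmetic that rapid dose assessment tools of this kind typically build on, a Gaussian plume concentration multiplied by exposure factors, is easy to sketch. Every number below (release rate, wind speed, dispersion parameters, breathing rate, dose coefficient) is illustrative only.

      # Ground-level, centreline Gaussian plume concentration (with ground
      # reflection) and a rough inhalation dose rate at fixed downwind distances.
      import numpy as np

      def centreline_conc(q_bq_s, wind_m_s, sigma_y, sigma_z, stack_h):
          """Air concentration in Bq/m^3 for release rate q at effective height stack_h."""
          return (q_bq_s / (np.pi * wind_m_s * sigma_y * sigma_z)
                  * np.exp(-stack_h**2 / (2.0 * sigma_z**2)))

      # hypothetical release and rough neutral-stability dispersion parameters
      for x_m, sy, sz in [(500, 36, 18), (5000, 300, 110), (20000, 1100, 320)]:
          c = centreline_conc(1.0e10, 3.0, sy, sz, 30.0)
          dose_sv_h = c * 3.3e-4 * 3600 * 7.4e-9   # breathing rate x s/h x dose coefficient (all illustrative)
          print(f"{x_m:6d} m  {c:10.3e} Bq/m^3  {dose_sv_h:10.3e} Sv/h")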

  10. Personalized pseudophakic model for refractive assessment.

    Directory of Open Access Journals (Sweden)

    Filomena J Ribeiro

    Full Text Available PURPOSE: To test a pseudophakic eye model that allows for intraocular lens power (IOL) calculation, both in normal eyes and in extreme conditions, such as post-LASIK. METHODS: PARTICIPANTS: The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing us to test the same method in the same eye after only changing the corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define the corneal surfaces in optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF) weighted according to the contrast sensitivity function (CSF) were applied. RESULTS: Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959, p < 0.05). Comparison of post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778, p < 0.05). CONCLUSIONS: Results suggest that personalized pseudophakic eye models and ray-tracing allow for the use of the same methodology, regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.

  11. Model-Based Approaches for Teaching and Practicing Personality Assessment.

    Science.gov (United States)

    Blais, Mark A; Hopwood, Christopher J

    2017-01-01

    Psychological assessment is a complex professional skill. Competence in assessment requires an extensive knowledge of personality, neuropsychology, social behavior, and psychopathology, a background in psychometrics, familiarity with a range of multimethod tools, cognitive flexibility, skepticism, and interpersonal sensitivity. This complexity makes assessment a challenge to teach and learn, particularly as the investment of resources and time in assessment has waned in psychological training programs over the last few decades. In this article, we describe 3 conceptual models that can assist teaching and learning psychological assessments. The transtheoretical model of personality provides a personality systems-based framework for understanding how multimethod assessment data relate to major personality systems and can be combined to describe and explain complex human behavior. The quantitative psychopathology-personality trait model is an empirical model based on the hierarchical organization of individual differences. Application of this model can help students understand diagnostic comorbidity and symptom heterogeneity, focus on more meaningful high-order domains, and identify the most effective assessment tools for addressing a given question. The interpersonal situation model is rooted in interpersonal theory and can help students connect test data to here-and-now interactions with patients. We conclude by demonstrating the utility of these models using a case example.

  12. The Assessment of Patient Clinical Outcome: Advantages, Models, Features of an Ideal Model

    Directory of Open Access Journals (Sweden)

    Mou’ath Hourani

    2016-06-01

    Full Text Available Background: The assessment of patient clinical outcome focuses on measuring various aspects of the health status of a patient who is under healthcare intervention. Patient clinical outcome assessment is a very significant process in the clinical field, as it allows health care professionals to better understand the effectiveness of their health care programs and thus to enhance health care quality in general. It is thus vital that a high-quality, informative review of current issues regarding the assessment of patient clinical outcome should be conducted. Aims & Objectives: 1) summarizes the advantages of the assessment of patient clinical outcome; 2) reviews some of the existing patient clinical outcome assessment models, namely: simulation, Markov, Bayesian belief networks, Bayesian statistics and conventional statistics, and Kaplan-Meier analysis models; and 3) demonstrates the desired features that should be fulfilled by a well-established ideal patient clinical outcome assessment model. Material & Methods: An integrative review of the literature has been performed using Google Scholar to explore the field of patient clinical outcome assessment. Conclusion: This paper will directly support researchers, clinicians and health care professionals in their understanding of developments in the domain of the assessment of patient clinical outcome, thus enabling them to propose ideal assessment models.
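
    Of the models listed in this record, the Kaplan-Meier estimator is the simplest to show directly: survival is the running product of (1 - events/at-risk) over the observed event times. The sketch below is a generic illustration, not the paper's implementation, and the follow-up times and censoring flags are invented.

      def kaplan_meier(times, events):
          """times: follow-up time per patient; events: 1 = outcome occurred, 0 = censored."""
          data = sorted(zip(times, events))
          surv, curve, i = 1.0, [], 0
          while i < len(data):
              t = data[i][0]
              d = sum(1 for tt, e in data if tt == t and e == 1)   # events at time t
              n = sum(1 for tt, _ in data if tt >= t)              # patients still at risk
              if d:
                  surv *= 1.0 - d / n
                  curve.append((t, round(surv, 3)))
              i += sum(1 for tt, _ in data if tt == t)             # move past time t
          return curve

      print(kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 0, 1, 0]))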

  14. Interactive Rapid Dose Assessment Model (IRDAM): user's guide

    International Nuclear Information System (INIS)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This User's Guide provides instruction in the setup and operation of the equipment necessary to run IRDAM. Instructions are also given on how to load the magnetic disks and access the interactive part of the program. Two other companion volumes to this one provide additional information on IRDAM. Reactor Accident Assessment Methods (NUREG/CR-3012, Volume 2) describes the technical bases for IRDAM including methods, models and assumptions used in calculations. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios

  15. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent, a complex, multi-step, multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could support micro-level assessments (e.g., improving individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An
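
    The decision framework described above amounts to scoring candidate methodologies against weighted criteria. The sketch below keeps only that arithmetic; the criteria names follow the abstract, while the weights, candidate methods, and ratings are invented.

      # Weighted multi-criteria scoring of candidate modeling methodologies.
      criteria_weights = {"uniqueness": 0.25, "data_availability": 0.20,
                          "lab_capability": 0.15, "relevance": 0.25, "impact": 0.15}

      def total_score(ratings):
          """ratings: criterion -> 0..5 rating; returns the weighted total."""
          return sum(criteria_weights[c] * r for c, r in ratings.items())

      candidates = {
          "agent_based_model": {"uniqueness": 4, "data_availability": 2,
                                "lab_capability": 3, "relevance": 4, "impact": 4},
          "bayesian_network":  {"uniqueness": 3, "data_availability": 4,
                                "lab_capability": 4, "relevance": 4, "impact": 3},
      }
      print({name: round(total_score(r), 2) for name, r in candidates.items()})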

  16. Hydrological sensitivity of volcanically disturbed watersheds—a lesson reinforced at Pinatubo

    Science.gov (United States)

    Major, J. J.; Janda, R. J.

    2016-12-01

    The climactic June 1991 eruption of Mount Pinatubo devastated many surrounding catchments with thick pyroclastic fall and flow deposits, and subsequent hydrogeomorphic responses were dramatic and persisted for years. But in the 24 hours preceding the climactic eruption there was less devastating eruptive activity that had more subtle, yet significant, impact on catchment hydrology. Stratigraphic relations show damaging lahars swept all major channels east of the volcano, starting late on June 14 and continuing through (and in some instances after) midday on June 15, before the climactic phase of the eruption began and before Typhoon Yunya struck the region. These early lahars were preceded by relatively small explosions and pyroclastic surges that emplaced fine-grained ash in the upper catchments, locally damaged or destroyed vegetation, reduced hillside infiltration capacity, and smoothed surface roughness. Thus the lahars, likely triggered by typical afternoon monsoon storms perhaps enhanced by local thermal influences of fresh volcanic deposits, did not result from extraordinary tropical rainfall or exceptional volcaniclastic deposition. Instead, direct rainfall-runoff volume increased substantially as a consequence of vegetation damage and moderate deposition of fine ash. Rapid runoff from hillsides to channels initiated hillside and bank erosion as well as channel scour, producing debris flows and hyperconcentrated flows. Timing of some lahars varied across catchments as well as downstream within catchments with respect to climactic pumice fall, demonstrating complex interplay among volcanic processes, variations in catchment disturbance, and rainfall timing and intensity. Occurrence of these early lahars supports the hypothesis that eruptions that deposit fine ash in volcanic catchments can instigate major hydrogeomorphic responses even when volcanic disturbances are modest—an effect that can be masked by later eruption impacts.

  17. The volcaniclastic sequence of Aranzazu: Record of the impact of volcanism on Neogene fluvial system in the middle part of the Central Cordillera, Colombia

    International Nuclear Information System (INIS)

    Borrero Pena, Carlos Alberto; Rosero Cespedes, Juan Sebastian; Valencia M, Julian David; Pardo Trujillo, Andres

    2008-01-01

    The volcaniclastic sequence of Aranzazu (VSA, late Pliocene - early Pleistocene?) was sourced from the northernmost sector of the Machin - Cerro Bravo volcanic complex. The volcaniclastic accumulations filled pre-existing fault-bend depressions in the surroundings of the town of Aranzazu (Caldas department, Colombia). A new classification of volcaniclastic deposits is proposed, in which lahars are defined as volcaniclastic resedimented deposits and differentiated from primary volcaniclastic and epiclastic deposits. An update of the sedimentology and rheology of the deposits related to the laharic events is also presented. The VSA stratigraphy is based on the identification of lithofacies and the definition of architectural elements for syn- and inter-eruptive periods. The VSA lower member corresponds to the successive aggradation of syn-eruptive lahars (SV and SB elements) resulting from the re-sedimentation of pumice-rich pyroclastic deposits and transported as debris flows and hyperconcentrated stream/flood flows. The VSA middle and upper members, defined by their coal content, were formed when inter-eruptive periods (FF element) dominated over syn-eruptive periods (SV and SB elements). They were formed during the re-establishment of fluvial conditions after the syn-eruptive laharic activity. Once fluvial deposition was re-established, conditions became favourable for peat formation and the coal-bearing bed sets developed.

  18. Integrated assessment models of global climate change

    International Nuclear Information System (INIS)

    Parson, E.A.; Fisher-Vanden, K.

    1997-01-01

    The authors review recent work in the integrated assessment modeling of global climate change. This field has grown rapidly since 1990. Integrated assessment models seek to combine knowledge from multiple disciplines in formal integrated representations; inform policy-making, structure knowledge, and prioritize key uncertainties; and advance knowledge of broad system linkages and feedbacks, particularly between socio-economic and bio-physical processes. They may combine simplified representations of the socio-economic determinants of greenhouse gas emissions, the atmosphere and oceans, impacts on human activities and ecosystems, and potential policies and responses. The authors summarize current projects, grouping them according to whether they emphasize the dynamics of emissions control and optimal policy-making, uncertainty, or spatial detail. They review the few significant insights that have been claimed from work to date and identify important challenges for integrated assessment modeling in its relationships to disciplinary knowledge and to broader assessment seeking to inform policy- and decision-making. 192 refs., 2 figs

  19. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcomes through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and to revealing several potentials for further exploitation of various aspects within this important research field. In that connection, the possibilities for further improving the CORAS methodology towards use in more complex architectures and in other application domains, such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  20. Underwater noise modelling for environmental impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Farcas, Adrian [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom); Thompson, Paul M. [Lighthouse Field Station, Institute of Biological and Environmental Sciences, University of Aberdeen, Cromarty IV11 8YL (United Kingdom); Merchant, Nathan D., E-mail: nathan.merchant@cefas.co.uk [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom)

    2016-02-15

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.
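
    As an illustration of the kind of propagation calculation such assessments rest on (a minimal sketch, not the authors' model), a received level can be estimated from a source level minus spherical spreading and frequency-dependent absorption; the source level, absorption coefficient and ranges below are hypothetical placeholders.

```python
import numpy as np

def received_level(source_level_db, ranges_m, alpha_db_per_km):
    """Crude received-level estimate: spherical spreading plus linear absorption.

    RL = SL - (20*log10(r) + alpha*r/1000), r in metres, alpha in dB/km.
    """
    spreading = 20.0 * np.log10(ranges_m)
    absorption = alpha_db_per_km * ranges_m / 1000.0
    return source_level_db - spreading - absorption

# Hypothetical inputs: a 210 dB re 1 uPa @ 1 m source, 0.05 dB/km absorption.
r = np.logspace(1, 5, 5)                      # 10 m to 100 km
rl = received_level(210.0, r, 0.05)
for ri, li in zip(r, rl):
    print(f"range {ri:9.0f} m -> received level {li:6.1f} dB re 1 uPa")
```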

  1. Underwater noise modelling for environmental impact assessment

    International Nuclear Information System (INIS)

    Farcas, Adrian; Thompson, Paul M.; Merchant, Nathan D.

    2016-01-01

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.

  2. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics are reviewed in this paper that have a direct bearing on the model input process and reasons are given for using probabilities-based modeling with the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present
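
    A minimal sketch of one common way to realise probability-based modeling of inputs with dependence (a Gaussian copula over invented marginals; an illustration only, not the approach of the paper itself):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Assumed rank-level dependence between two hypothetical model inputs.
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])

# Step 1: correlated standard-normal draws (Gaussian copula).
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=10_000)

# Step 2: map to uniforms, then through each input's marginal distribution.
u = stats.norm.cdf(z)
flow_rate = stats.lognorm(s=0.5, scale=1.0).ppf(u[:, 0])   # hypothetical input 1
partition_coeff = stats.uniform(0.1, 0.9).ppf(u[:, 1])     # hypothetical input 2

rho, _ = stats.spearmanr(flow_rate, partition_coeff)
print(f"induced rank correlation between the inputs: {rho:.2f}")
```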

  3. Assessing NARCCAP climate model effects using spatial confidence regions

    Directory of Open Access Journals (Sweden)

    J. P. French

    2017-07-01

    We assess similarities and differences between model effects for the North American Regional Climate Change Assessment Program (NARCCAP) climate models using varying classes of linear regression models. Specifically, we consider how the average temperature effect differs for the various global and regional climate model combinations, including assessment of possible interaction between the effects of global and regional climate models. We use both pointwise and simultaneous inference procedures to identify regions where global and regional climate model effects differ. We also show conclusively that results from pointwise inference are misleading, and that accounting for multiple comparisons is important for making proper inference.
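
    A toy numerical illustration (simulated data, not the NARCCAP analysis) of why pointwise and simultaneous inference disagree: with many grid cells and no true effect anywhere, pointwise 95% tests still flag roughly 5% of cells, while a Bonferroni-style simultaneous threshold flags almost none.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_cells, n_runs = 500, 20            # hypothetical grid cells and model runs

# Simulated "effect" estimates at each cell: the true effect is zero everywhere.
effects = rng.normal(0.0, 1.0, size=(n_runs, n_cells))
mean = effects.mean(axis=0)
se = effects.std(axis=0, ddof=1) / np.sqrt(n_runs)

z_point = stats.norm.ppf(0.975)                      # pointwise 95% threshold
z_simul = stats.norm.ppf(1 - 0.05 / (2 * n_cells))   # Bonferroni-adjusted threshold

flag_point = np.abs(mean) > z_point * se
flag_simul = np.abs(mean) > z_simul * se
print(f"cells flagged pointwise:    {flag_point.sum()} of {n_cells}")
print(f"cells flagged simultaneous: {flag_simul.sum()} of {n_cells}")
```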

  4. A Model for Situation and Threat Assessment

    Science.gov (United States)

    2006-12-01

    A model is presented for situation and threat assessment (Alan Steinberg, CUBRC, Inc., steinberg@cubrc.org; November 2005). Stated objectives include advancing the state of the art in situation and threat assessment.

  5. A Simple Model of Self-Assessments

    NARCIS (Netherlands)

    S. Dominguez Martinez (Silvia); O.H. Swank (Otto)

    2006-01-01

    textabstractWe develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too

  6. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  7. Conceptual Models and Guidelines for Clinical Assessment of Financial Capacity.

    Science.gov (United States)

    Marson, Daniel

    2016-09-01

    The ability to manage financial affairs is a life skill of critical importance, and neuropsychologists are increasingly asked to assess financial capacity across a variety of settings. Sound clinical assessment of financial capacity requires knowledge and appreciation of applicable clinical conceptual models and principles. However, the literature has presented relatively little conceptual guidance for clinicians concerning financial capacity and its assessment. This article seeks to address this gap. The article presents six clinical models of financial capacity : (1) the early gerontological IADL model of Lawton, (2) the clinical skills model and (3) related cognitive psychological model developed by Marson and colleagues, (4) a financial decision-making model adapting earlier decisional capacity work of Appelbaum and Grisso, (5) a person-centered model of financial decision-making developed by Lichtenberg and colleagues, and (6) a recent model of financial capacity in the real world developed through the Institute of Medicine. Accompanying presentation of the models is discussion of conceptual and practical perspectives they represent for clinician assessment. Based on the models, the article concludes by presenting a series of conceptually oriented guidelines for clinical assessment of financial capacity. In summary, sound assessment of financial capacity requires knowledge and appreciation of clinical conceptual models and principles. Awareness of such models, principles and guidelines will strengthen and advance clinical assessment of financial capacity. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. A simple model of self-assessment

    NARCIS (Netherlands)

    Dominguez-Martinez, S.; Swank, O.H.

    2009-01-01

    We develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too positive and

  9. AUTHENTIC SELF-ASSESSMENT MODEL FOR DEVELOPING THE EMPLOYABILITY SKILLS OF HIGHER VOCATIONAL EDUCATION STUDENTS

    Directory of Open Access Journals (Sweden)

    I Made Suarta

    2015-06-01

    The purpose of this research is to develop assessment tools to evaluate the achievement of employability skills that are integrated into the learning of database applications. The assessment model developed is a combination of self-assessment and authentic assessment, proposed as an authentic self-assessment model. The steps in developing the authentic self-assessment model include: identifying the standards, selecting an authentic task, identifying the criteria for the task, and creating the rubric. The assessment tools developed include: (1) a problem-solving skills assessment model, (2) a self-management skills assessment model, and (3) a database application competence assessment model. The model can be used to assess cognitive, affective, and psychomotor achievement. The results indicate that achievement of problem-solving and self-management ability was in the good category, and competence in designing conceptual and logical databases was in the high category. The model also meets the basic principles of assessment, i.e. validity, reliability, focus on competencies, comprehensiveness, objectivity, and the principle of educating. Keywords: authentic assessment, self-assessment, problem solving skills, self-management skills, vocational education

  10. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  11. Irrigation in dose assessments models

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, Ulla; Barkefors, Catarina [Studsvik RadWaste AB, Nykoeping (Sweden)

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. The exposed people are in many safety assessments assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. Some numerical results are presented showing that surface contamination in general gives the

  12. Irrigation in dose assessments models

    International Nuclear Information System (INIS)

    Bergstroem, Ulla; Barkefors, Catarina

    2004-05-01

    SKB has carried out several safety analyses for repositories for radioactive waste, one of which was SR 97, a multi-site study concerned with a future deep bedrock repository for high-level waste. In case of future releases due to unforeseen failure of the protective multiple barrier system, radionuclides may be transported with groundwater and may reach the biosphere. Assessments of doses have to be carried out with a long-term perspective. Specific models are therefore employed to estimate consequences to man. It has been determined that the main pathway for nuclides from groundwater or surface water to soil is via irrigation. Irrigation may cause contamination of crops directly by e.g. interception or rain-splash, and indirectly via root-uptake from contaminated soil. The exposed people are in many safety assessments assumed to be self-sufficient, i.e. their food is produced locally where the concentration of radionuclides may be the highest. Irrigation therefore plays an important role when estimating consequences. The present study is therefore concerned with a more extensive analysis of the role of irrigation for possible future doses to people living in the area surrounding a repository. Current irrigation practices in Sweden are summarised, showing that vegetables and potatoes are the most common crops for irrigation. In general, however, irrigation is not so common in Sweden. The irrigation model used in the latest assessments is described. A sensitivity analysis is performed showing that, as expected, interception of irrigation water and retention on vegetation surfaces are important parameters. The parameters used to describe this are discussed. A summary is also given how irrigation is proposed to be handled in the international BIOMASS (BIOsphere Modelling and ASSessment) project and in models like TAME and BIOTRAC. Similarities and differences are pointed out. Some numerical results are presented showing that surface contamination in general gives the

  13. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, also qualitative analysis is discussed shortly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)
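
    As a generic sketch of the mixture-model idea referred to here (notation assumed, not the report's), the predictive distribution is a weighted combination of candidate models M_1,...,M_n, with weights that may be updated from data D:

```latex
% Generic model-uncertainty mixture; weights may be fixed judgementally or
% updated from data D in a Bayesian-model-averaging fashion (assumed notation).
f(x) = \sum_{i=1}^{n} w_i \, f(x \mid M_i), \qquad
w_i \ge 0, \quad \sum_{i=1}^{n} w_i = 1,
\qquad
w_i = \frac{p(D \mid M_i)\, p(M_i)}{\sum_{j=1}^{n} p(D \mid M_j)\, p(M_j)} .
```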

  14. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, also qualitative analysis is discussed shortly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  15. THE MODEL FOR RISK ASSESSMENT ERP-SYSTEMS INFORMATION SECURITY

    Directory of Open Access Journals (Sweden)

    V. S. Oladko

    2016-12-01

    Full Text Available The article deals with the problem assessment of information security risks in the ERP-system. ERP-system functions and architecture are studied. The model malicious impacts on levels of ERP-system architecture are composed. Model-based risk assessment, which is the quantitative and qualitative approach to risk assessment, built on the partial unification 3 methods for studying the risks of information security - security models with full overlapping technique CRAMM and FRAP techniques developed.

  16. Construction and Application Research of Isomap-RVM Credit Assessment Model

    Directory of Open Access Journals (Sweden)

    Guangrong Tong

    2015-01-01

    Credit assessment is the basis and premise of credit risk management systems. Accurate and scientific credit assessment is of great significance to the operational decisions of shareholders, corporate creditors, and management. Building a good and reliable credit assessment model is key to credit assessment. Traditional credit assessment models are constructed using the support vector machine (SVM) combined with certain traditional dimensionality reduction algorithms. When constructing such a model, the dimensionality reduction algorithms are first applied to reduce the dimensions of the samples, so as to prevent the correlation of the samples’ characteristic index from being too high. Then, machine learning of the samples will be conducted using the SVM, in order to carry out classification assessment. To further improve the accuracy of credit assessment methods, this paper has introduced more cutting-edge algorithms, applied isometric feature mapping (Isomap) for dimensionality reduction, and used the relevance vector machine (RVM) for credit classification. It has constructed an Isomap-RVM model and used it to conduct financial analysis of China's listed companies. The empirical analysis shows that the credit assessment accuracy of the Isomap-RVM model is significantly higher than that of the Isomap-SVM model and slightly higher than that of the PCA-RVM model. It can correctly identify the credit risks of listed companies.
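
    A hedged sketch of the modelling pipeline on synthetic data: scikit-learn provides Isomap but no relevance vector machine, so a support vector classifier is substituted here (i.e. this reproduces an Isomap-SVM baseline rather than the paper's Isomap-RVM model); all data and parameters are invented.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.manifold import Isomap
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic data standing in for listed-company financial indicators.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Isomap reduces the correlated indicators to a few coordinates,
# then a kernel classifier performs the credit-class assignment.
model = make_pipeline(StandardScaler(),
                      Isomap(n_neighbors=10, n_components=5),
                      SVC(kernel="rbf", C=1.0))
model.fit(X_tr, y_tr)
print("hold-out accuracy:", round(model.score(X_te, y_te), 3))
```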

  17. Addressing challenges in single species assessments via a simple state-space assessment model

    DEFF Research Database (Denmark)

    Nielsen, Anders

    Single-species and age-structured fish stock assessments still remain the main tool for managing fish stocks. A simple state-space assessment model is presented as an alternative to (semi-)deterministic procedures and fully parametric statistical catch-at-age models. It offers a solution to some of the key challenges of these models. Compared to the deterministic procedures, it solves a list of problems originating from falsely assuming that age-classified catches are known without errors, and it allows quantification of the uncertainties of estimated quantities of interest. Compared to full...
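
    The state-space idea can be caricatured with a one-dimensional local-level model and a Kalman filter (a toy stand-in only; the actual assessment model is age-structured with catch and survey observation equations, and all numbers below are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a log-abundance random walk observed with noise.
T, q, r = 40, 0.05, 0.2          # years, process variance, observation variance
truth = np.cumsum(rng.normal(0, np.sqrt(q), T)) + 5.0
obs = truth + rng.normal(0, np.sqrt(r), T)

# Kalman filter for the local-level (random walk plus noise) model.
x, p = obs[0], 1.0
estimates = []
for y in obs:
    p += q                        # predict: state uncertainty grows
    k = p / (p + r)               # Kalman gain
    x += k * (y - x)              # update with the observation
    p *= (1 - k)
    estimates.append(x)

rmse_obs = np.sqrt(np.mean((obs - truth) ** 2))
rmse_filt = np.sqrt(np.mean((np.array(estimates) - truth) ** 2))
print(f"RMSE raw observations: {rmse_obs:.3f}, filtered states: {rmse_filt:.3f}")
```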

  18. Confidence assessment. Site-descriptive modelling SDM-Site Laxemar

    International Nuclear Information System (INIS)

    2009-06-01

    The objective of this report is to assess the confidence that can be placed in the Laxemar site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Laxemar). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations or more usefully by explorations underground made during construction of the repository. Procedures for this assessment have been progressively refined during the course of the site descriptive modelling, and applied to all previous versions of the Forsmark and Laxemar site descriptive models. They include assessment of whether all relevant data have been considered and understood, identification of the main uncertainties and their causes, possible alternative models and their handling, and consistency between disciplines. The assessment then forms the basis for an overall confidence statement. The confidence in the Laxemar site descriptive model, based on the data available at the conclusion of the surface based site investigations, has been assessed by exploring: - Confidence in the site characterization data base, - remaining issues and their handling, - handling of alternatives, - consistency between disciplines and - main reasons for confidence and lack of confidence in the model. Generally, the site investigation database is of high quality, as assured by the quality procedures applied. It is judged that the Laxemar site descriptive model has an overall high level of confidence. Because of the relatively robust geological model that describes the site, the overall confidence in the Laxemar Site Descriptive model is judged to be high, even though details of the spatial variability remain unknown. The overall reason for this confidence is the wide spatial distribution of the data and the consistency between

  19. Confidence assessment. Site-descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    2008-12-15

    The objective of this report is to assess the confidence that can be placed in the Laxemar site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Laxemar). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations or more usefully by explorations underground made during construction of the repository. Procedures for this assessment have been progressively refined during the course of the site descriptive modelling, and applied to all previous versions of the Forsmark and Laxemar site descriptive models. They include assessment of whether all relevant data have been considered and understood, identification of the main uncertainties and their causes, possible alternative models and their handling, and consistency between disciplines. The assessment then forms the basis for an overall confidence statement. The confidence in the Laxemar site descriptive model, based on the data available at the conclusion of the surface based site investigations, has been assessed by exploring: - Confidence in the site characterization data base, - remaining issues and their handling, - handling of alternatives, - consistency between disciplines and - main reasons for confidence and lack of confidence in the model. Generally, the site investigation database is of high quality, as assured by the quality procedures applied. It is judged that the Laxemar site descriptive model has an overall high level of confidence. Because of the relatively robust geological model that describes the site, the overall confidence in the Laxemar Site Descriptive model is judged to be high, even though details of the spatial variability remain unknown. The overall reason for this confidence is the wide spatial distribution of the data and the consistency between

  20. NEW MODEL OF QUALITY ASSESSMENT IN PUBLIC ADMINISTRATION - UPGRADING THE COMMON ASSESSMENT FRAMEWORK (CAF)

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2017-01-01

    In our study, we developed a new model of quality assessment in public administration. The Common Assessment Framework (CAF) is frequently used in continental Europe for this purpose. Its use has many benefits; however, we believe its assessment logic is not adequate for public administration. The upgraded version of the CAF is conceptually different: instead of the analytical and linear CAF, we obtain an instrument that measures the organisation as a network of complex processes. The original and upgraded assessment approaches are presented in the paper and compared in a case of self-assessment at a selected public administration organisation. The two approaches produced different, sometimes contradictory, results. The upgraded model proved to be logically more consistent and offered higher interpretation capacity.

  1. The Development of a Secondary School Health Assessment Model

    Science.gov (United States)

    Sriring, Srinual; Erawan, Prawit; Sriwarom, Monoon

    2015-01-01

    The objectives of this research were to: 1) survey information relating to secondary school health, 2) construct a model of health assessment and a handbook for using the model in secondary schools, and 3) develop an assessment model for secondary schools. The research included 3 phases. Phase (1) involved a survey of…

  2. Model summary report for the safety assessment SR-Site

    International Nuclear Information System (INIS)

    Vahlund, Fredrik; Zetterstroem Evins, Lena; Lindgren, Maria

    2010-12-01

    This document is the model summary report for the safety assessment SR-Site. In the report, the quality assurance (QA) measures conducted for assessment codes are presented together with the chosen QA methodology. In the safety assessment project SR-Site, a large number of numerical models are used to analyse the system and to show compliance. In order to better understand how the different models interact and how information is transferred between them, Assessment Model Flowcharts (AMFs) are used. From these, the different modelling tasks can be identified, together with the computer codes used. As a large number of computer codes are used in the assessment, their complexity differs to a large extent: some of the codes are commercial, while others were developed especially for the assessment at hand. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report, the following requirements are defined for all codes: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. - It must be described how data are transferred between the different computational tasks. Although the requirements are identical for all codes in the assessment, the measures used to show that the requirements are fulfilled will be different for different types of codes (for instance because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented together with a discussion of how the requirements are met.

  3. Model summary report for the safety assessment SR-Site

    Energy Technology Data Exchange (ETDEWEB)

    Vahlund, Fredrik; Zetterstroem Evins, Lena (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Lindgren, Maria (Kemakta Konsult AB, Stockholm (Sweden))

    2010-12-15

    This document is the model summary report for the safety assessment SR-Site. In the report, the quality assurance (QA) measures conducted for assessment codes are presented together with the chosen QA methodology. In the safety assessment project SR-Site, a large number of numerical models are used to analyse the system and to show compliance. In order to better understand how the different models interact and how information is transferred between them, Assessment Model Flowcharts (AMFs) are used. From these, the different modelling tasks can be identified, together with the computer codes used. As a large number of computer codes are used in the assessment, their complexity differs to a large extent: some of the codes are commercial, while others were developed especially for the assessment at hand. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report, the following requirements are defined for all codes: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. - It must be described how data are transferred between the different computational tasks. Although the requirements are identical for all codes in the assessment, the measures used to show that the requirements are fulfilled will be different for different types of codes (for instance because for some software the source code is not available for review). Subsequent to the methodology section, each assessment code is presented together with a discussion of how the requirements are met.

  4. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    Science.gov (United States)

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the
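
    A minimal sketch in the spirit of the TK-TD models discussed (one-compartment toxicokinetics feeding a threshold-hazard survival layer); the rate constants, threshold and exposure pulse are hypothetical, not the case-study parameters:

```python
import numpy as np

# One-compartment TK: dCi/dt = ku*Cw(t) - ke*Ci   (internal concentration)
# TD (hazard): h = kk * max(Ci - threshold, 0),   dS/dt = -h*S (survival)
ku, ke = 2.0, 0.5          # uptake and elimination rate constants (1/d), assumed
kk, thr = 0.3, 1.5         # killing rate and internal threshold, assumed

def external_conc(t):
    """Hypothetical 2-day exposure pulse followed by clean water."""
    return 4.0 if t < 2.0 else 0.0

dt, t_end = 0.01, 10.0
Ci, S = 0.0, 1.0
for step in range(int(t_end / dt)):
    t = step * dt
    Ci += dt * (ku * external_conc(t) - ke * Ci)     # Euler step for TK
    hazard = kk * max(Ci - thr, 0.0)                 # hazard above the threshold
    S *= np.exp(-hazard * dt)                        # survival decays with hazard

print(f"predicted survival after {t_end:.0f} d: {S:.2f}")
```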

  5. A systematic literature review of open source software quality assessment models.

    Science.gov (United States)

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models are proposed and available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models so that they are acceptable to practitioners, there is a need for clear discrimination of the existing models based on their specific properties. Based on this, the aim of this study is to perform a systematic literature review to investigate the properties of the existing OSS quality assessment models by classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed so as to retrieve all relevant primary studies in this regard. Journal and conference papers published between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected. To select these models, we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study shows that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study is a valuable contribution to the community: it helps quality assessment model developers in formulating newer models and practitioners (software evaluators) in selecting suitable OSS from among alternatives.

  6. Model of MSD Risk Assessment at Workplace

    OpenAIRE

    K. Sekulová; M. Šimon

    2015-01-01

    This article focuses on a model for assessing the risk of upper-extremity musculoskeletal disorders (MSDs) at the workplace. The model uses risk factors that are responsible for damage to the musculoskeletal system. Based on statistical calculations, the model can determine the MSD risk faced by workers who are exposed to these risk factors, and how that risk would decrease if the risk factors were eliminated.

  7. Conceptual models for cumulative risk assessment.

    Science.gov (United States)

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  8. Computational model for the assessment of oil spill damages

    Energy Technology Data Exchange (ETDEWEB)

    Seip, K L; Heiberg, A B; Brekke, K A

    1985-06-01

    A description is given of the method and the required data of a model for calculating oil spill damages. Eleven damage attributes are defined: shore length contaminated, shore restitution time, birds dead, restitution time for three groups of birds, open-sea damages (two types), and damages to recreation, economy and fisheries. The model has been applied in several cases of oil pollution assessments: in an examination of alternative models for the organization of oil spill combat in Norway, in the assessment of the damages caused by a blowout at Tromsoeflaket and in assessing a possible increase in oil spill preparedness for Svalbard. 56 references.

  9. Model of environmental life cycle assessment for coal mining operations.

    Science.gov (United States)

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-08-15

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. Copyright © 2016 Elsevier B.V. All rights reserved.
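
    The GHG-accounting step of such an LCA reduces to multiplying unit-process flows by characterisation factors; the sketch below uses invented activity data and indicative IPCC-style 100-year GWP values, not the inventory or factors from the paper:

```python
# GWP values (kg CO2-eq per kg of gas); indicative 100-year values, an assumption.
GWP100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

# Hypothetical unit-process inventory per tonne of hard coal: (gas, kg emitted).
inventory = {
    "electricity_use":  [("CO2", 55.0), ("CH4", 0.02)],
    "methane_drainage": [("CH4", 4.5)],
    "steel_supports":   [("CO2", 12.0), ("N2O", 0.001)],
}

def ghg_per_process(inv, gwp):
    """Return kg CO2-eq per unit process and the total."""
    per_process = {p: sum(mass * gwp[gas] for gas, mass in flows)
                   for p, flows in inv.items()}
    return per_process, sum(per_process.values())

per_process, total = ghg_per_process(inventory, GWP100)
for process, co2eq in sorted(per_process.items(), key=lambda kv: -kv[1]):
    print(f"{process:18s} {co2eq:8.2f} kg CO2-eq / t coal")
print(f"{'total':18s} {total:8.2f} kg CO2-eq / t coal")
```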

  10. Assessment of Teacher Perceived Skill in Classroom Assessment Practices Using IRT Models

    Science.gov (United States)

    Koloi-Keaikitse, Setlhomo

    2017-01-01

    The purpose of this study was to assess teacher perceived skill in classroom assessment practices. Data were collected from a sample of (N = 691) teachers selected from government primary, junior secondary, and senior secondary schools in Botswana. Item response theory models were used to identify teacher response on items that measured their…
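
    The abstract does not state which IRT model was fitted; as a generic reference point, a two-parameter logistic (2PL) formulation for a dichotomously scored item j and teacher i would be:

```latex
% Generic 2PL IRT model (shown for illustration; the study may have used a
% different IRT formulation, e.g. a graded response model for rating items).
P(Y_{ij} = 1 \mid \theta_i) =
  \frac{1}{1 + \exp\!\left[ -a_j \left( \theta_i - b_j \right) \right]},
```

    where theta_i is the teacher's latent perceived-skill level, a_j the item discrimination and b_j the item location (difficulty).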

  11. Models and parameters for environmental radiological assessments

    International Nuclear Information System (INIS)

    Miller, C.W.

    1983-01-01

    This article reviews the forthcoming book Models and Parameters for Environmental Radiological Assessments, which presents a unified compilation of models and parameters for assessing the impact on man of radioactive discharges, both routine and accidental, into the environment. Models presented in this book include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Summaries are presented for each of the transport and dosimetry areas previously mentioned, and details are available in the literature cited. A chapter of example problems illustrates many of the methodologies presented throughout the text. Models and parameters presented are based on the results of extensive literature reviews and evaluations performed primarily by the staff of the Health and Safety Research Division of Oak Ridge National Laboratory.

  12. Assessment of Venous Thrombosis in Animal Models.

    Science.gov (United States)

    Grover, Steven P; Evans, Colin E; Patel, Ashish S; Modarai, Bijan; Saha, Prakash; Smith, Alberto

    2016-02-01

    Deep vein thrombosis and common complications, including pulmonary embolism and post-thrombotic syndrome, represent a major source of morbidity and mortality worldwide. Experimental models of venous thrombosis have provided considerable insight into the cellular and molecular mechanisms that regulate thrombus formation and subsequent resolution. Here, we critically appraise the ex vivo and in vivo techniques used to assess venous thrombosis in these models. Particular attention is paid to imaging modalities, including magnetic resonance imaging, micro-computed tomography, and high-frequency ultrasound that facilitate longitudinal assessment of thrombus size and composition. © 2015 American Heart Association, Inc.

  13. Model of environmental life cycle assessment for coal mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Burchart-Korol, Dorota, E-mail: dburchart@gig.eu; Fugiel, Agata, E-mail: afugiel@gig.eu; Czaplicka-Kolarz, Krystyna, E-mail: kczaplicka@gig.eu; Turek, Marian, E-mail: mturek@gig.eu

    2016-08-15

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of

  14. Model of environmental life cycle assessment for coal mining operations

    International Nuclear Information System (INIS)

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-01-01

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. - Highlights: • A computational LCA model for assessment of coal mining operations • Identification of

  15. Using ecosystem modelling techniques in exposure assessments of radionuclides - an overview

    International Nuclear Information System (INIS)

    Kumblad, L.

    2005-01-01

    The risk to humans from potential releases from nuclear facilities is evaluated in safety assessments. Essential components of these assessments are exposure models, which estimate the transport of radionuclides in the environment, the uptake in biota, and transfer to humans. Recently, there has been a growing concern for radiological protection of the whole environment, not only humans, and a first attempt has been to employ model approaches based on stylized environments and transfer functions to biota based exclusively on bioconcentration factors (BCF). They are generally of a non-mechanistic nature and involve no knowledge of the actual processes involved, which is a severe limitation when assessing real ecosystems. In this paper, the possibility of using an ecological modelling approach as a complement or an alternative to the use of BCF-based models is discussed. The paper gives an overview of ecological and ecosystem modelling and examples of studies where ecosystem models have been used in association with ecological risk assessment studies for pollutants other than radionuclides. It also discusses the potential to use this technique in exposure assessments of radionuclides, with a few examples from the safety assessment work performed by the Swedish Nuclear Fuel and Waste Management Company (SKB). Finally, there is a comparison of the characteristics of ecosystem models and traditional exposure models for radionuclides used to estimate the radionuclide exposure of biota. The evaluation of ecosystem models already applied in safety assessments has shown that the ecosystem approach can be used to assess the exposure of biota, and that it can handle many of the modelling problems identified in relation to BCF-based models. The findings in this paper suggest that both national and international assessment frameworks for protection of the environment from ionising radiation would benefit from striving to adopt methodologies based on ecologically sound principles and
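
    The BCF-based transfer that the paper contrasts with ecosystem modelling reduces, in its simplest equilibrium form, to a single ratio followed by a dose-rate conversion (generic notation, not SKB's):

```latex
% Equilibrium bioconcentration-factor transfer (generic, illustrative form).
C_{\mathrm{biota}} = \mathrm{BCF} \times C_{\mathrm{water}},
\qquad
\dot{D}_{\mathrm{biota}} = \mathrm{DCC} \times C_{\mathrm{biota}},
```

    with C_water the activity concentration in water, BCF the bioconcentration factor and DCC a dose conversion coefficient; process-based ecosystem models replace the fixed BCF with explicit uptake, feeding and loss processes.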

  16. Route Assessment for Unmanned Aerial Vehicle Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Xixia Sun

    2014-01-01

    An integrated route assessment approach based on cloud model is proposed in this paper, where various sources of uncertainties are well kept and modeled by cloud theory. Firstly, a systemic criteria framework incorporating models for scoring subcriteria is developed. Then, the cloud model is introduced to represent linguistic variables, and survivability probability histogram of each route is converted into normal clouds by cloud transformation, enabling both randomness and fuzziness in the assessment environment to be managed simultaneously. Finally, a new way to measure the similarity between two normal clouds satisfying reflexivity, symmetry, transitivity, and overlapping is proposed. Experimental results demonstrate that the proposed route assessment approach outperforms fuzzy logic based assessment approach with regard to feasibility, reliability, and consistency with human thinking.
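
    The paper's similarity measure is not reproduced here, but the forward normal cloud generator underlying cloud-model representations is short enough to sketch (the numerical characteristics Ex, En, He below are arbitrary):

```python
import numpy as np

def normal_cloud_drops(Ex, En, He, n, rng=None):
    """Forward normal cloud generator: returns drop positions and memberships.

    Ex: expectation, En: entropy (spread), He: hyper-entropy (uncertainty of En).
    """
    rng = rng or np.random.default_rng()
    En_prime = rng.normal(En, He, n)          # randomised entropy per drop
    En_prime = np.abs(En_prime) + 1e-12       # keep the scale positive
    x = rng.normal(Ex, En_prime)              # drop position
    mu = np.exp(-(x - Ex) ** 2 / (2.0 * En_prime ** 2))  # membership degree
    return x, mu

# Arbitrary linguistic grade "good survivability" ~ (Ex=0.8, En=0.05, He=0.01).
x, mu = normal_cloud_drops(0.8, 0.05, 0.01, 5, rng=np.random.default_rng(7))
for xi, mi in zip(x, mu):
    print(f"drop at {xi:.3f} with membership {mi:.3f}")
```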

  17. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists to develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experiences made in three projects with the use of computer models from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models serve best the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes

  18. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
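
    The Taylor-series propagation mentioned for the deterministic approach is, to first order and for independent inputs, the familiar combination of sensitivities and input variances (generic notation, not the report's):

```latex
% First-order (delta-method) uncertainty propagation for y = g(x_1, ..., x_n),
% assuming independent inputs; sensitivities evaluated at the mean point.
\mathbb{E}[y] \approx g(\bar{x}_1, \dots, \bar{x}_n),
\qquad
\operatorname{Var}(y) \approx \sum_{i=1}^{n}
  \left( \frac{\partial g}{\partial x_i} \right)^{2}_{\bar{x}}
  \operatorname{Var}(x_i).
```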

  19. Agricultural climate impacts assessment for economic modeling and decision support

    Science.gov (United States)

    Thomson, A. M.; Izaurralde, R. C.; Beach, R.; Zhang, X.; Zhao, K.; Monier, E.

    2013-12-01

    A range of approaches can be used in the application of climate change projections to agricultural impacts assessment. Climate projections can be used directly to drive crop models, which in turn can be used to provide inputs for agricultural economic or integrated assessment models. These model applications, and the transfer of information between models, must be guided by the state of the science. But the methodology must also account for the specific needs of stakeholders and the intended use of model results beyond pure scientific inquiry, including meeting the requirements of agencies responsible for designing and assessing policies, programs, and regulations. Here we present methodology and results of two climate impacts studies that applied climate model projections from CMIP3 and from the EPA Climate Impacts and Risk Analysis (CIRA) project in a crop model (EPIC - Environmental Policy Integrated Climate) in order to generate estimates of changes in crop productivity for use in an agricultural economic model for the United States (FASOM - Forest and Agricultural Sector Optimization Model). The FASOM model is a forward-looking dynamic model of the US forest and agricultural sector used to assess market responses to changing productivity of alternative land uses. The first study, focused on climate change impacts on the USDA crop insurance program, was designed to use available daily climate projections from the CMIP3 archive. The decision to focus on daily data for this application limited the climate model and time period selection significantly; however, for the intended purpose of assessing impacts on crop insurance payments, consideration of extreme event frequency was critical for assessing periodic crop failures. In a second, coordinated impacts study designed to assess the relative difference in climate impacts under a no-mitigation policy and different future climate mitigation scenarios, the stakeholder specifically requested an assessment of a

  20. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    The prime goal of model validation is to build confidence in the model concept and that the model is fit for its intended purpose. In other words: does the model predict transport in fractured rock adequately for use in repository performance assessments? Are the results reasonable for the type of modelling tasks the model is designed for? Commonly, in performance assessments a large number of realisations of flow and transport are made to cover the associated uncertainties. Thus, the flow and transport, including radioactive chain decay, are preferably calculated in the same model framework. A rather sophisticated concept is necessary to be able to model flow and radionuclide transport in the near field and far field of a deep repository, also including radioactive chain decay. In order to avoid excessively long computational times there is a need for well-based simplifications. For this reason, the far field code FARF31 is made relatively simple, and calculates transport by using averaged entities to represent the most important processes. FARF31 has been shown to be suitable for the performance assessments within the SKB studies, e.g. SR 97. Among the advantages are that it is a fast, simple and robust code, which enables handling of many realisations with wide spread in parameters in combination with chain decay of radionuclides. Being a component in the model chain PROPER, it is easy to assign statistical distributions to the input parameters. Due to the formulation of the advection-dispersion equation in FARF31 it is possible to perform the groundwater flow calculations separately. The basis for the modelling is a stream tube, i.e. a volume of rock including fractures with flowing water, with the walls of the imaginary stream tube defined by streamlines. The transport within the stream tube is described using a dual porosity continuum approach, where it is assumed that rock can be divided into two distinct domains with different types of porosity

  1. Assessing a Theoretical Model on EFL College Students

    Science.gov (United States)

    Chang, Yu-Ping

    2011-01-01

    This study aimed to (1) integrate relevant language learning models and theories, (2) construct a theoretical model of college students' English learning performance, and (3) assess the model fit between empirically observed data and the theoretical model proposed by the researchers of this study. Subjects of this study were 1,129 Taiwanese EFL…

  2. Assessing Local Model Adequacy in Bayesian Hierarchical Models Using the Partitioned Deviance Information Criterion

    Science.gov (United States)

    Wheeler, David C.; Hickson, DeMarc A.; Waller, Lance A.

    2010-01-01

    Many diagnostic tools and goodness-of-fit measures, such as the Akaike information criterion (AIC) and the Bayesian deviance information criterion (DIC), are available to evaluate the overall adequacy of linear regression models. In addition, visually assessing adequacy in models has become an essential part of any regression analysis. In this paper, we focus on a spatial consideration of the local DIC measure for model selection and goodness-of-fit evaluation. We use a partitioning of the DIC into the local DIC, leverage, and deviance residuals to assess local model fit and influence for both individual observations and groups of observations in a Bayesian framework. We use visualization of the local DIC and differences in local DIC between models to assist in model selection and to visualize the global and local impacts of adding covariates or model parameters. We demonstrate the utility of the local DIC in assessing model adequacy using HIV prevalence data from pregnant women in the Butare province of Rwanda during 1989-1993 using a range of linear model specifications, from global effects only to spatially varying coefficient models, and a set of covariates related to sexual behavior. Results of applying the diagnostic visualization approach include more refined model selection and greater understanding of the models as applied to the data. PMID:21243121

  3. A test-bed modeling study for wave resource assessment

    Science.gov (United States)

    Yang, Z.; Neary, V. S.; Wang, T.; Gunawan, B.; Dallman, A.

    2016-02-01

    Hindcasts from phase-averaged wave models are commonly used to estimate the standard statistics used in wave energy resource assessments. However, the research community and the wave energy converter industry are lacking a well-documented and consistent modeling approach for conducting these resource assessments at different phases of WEC project development and at different spatial scales, e.g., from small-scale pilot studies to large-scale commercial deployment. Therefore, it is necessary to evaluate current wave model codes, as well as limitations and knowledge gaps for predicting sea states, in order to establish best wave modeling practices and to identify future research needs to improve wave prediction for resource assessment. This paper presents the first phase of an on-going modeling study to address these concerns. The modeling study is being conducted at a test-bed site off the Central Oregon Coast using two of the most widely used third-generation wave models - WaveWatchIII and SWAN. A nested-grid modeling approach, with domain dimensions ranging from global to regional scales, was used to provide the wave spectral boundary condition to a local-scale model domain, which has a spatial dimension of around 60 km by 60 km and a grid resolution of 250 m - 300 m. Model results simulated by WaveWatchIII and SWAN in a structured-grid framework are compared to NOAA wave buoy data for six wave parameters: omnidirectional wave power, significant wave height, energy period, spectral width, direction of maximum directionally resolved wave power, and directionality coefficient. Model performance and computational efficiency are evaluated, and best practices for wave resource assessments are discussed, based on a set of standard error statistics and model run times.
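
    As a rough illustration of the kind of error statistics used in such an evaluation, the sketch below computes bias, RMSE, scatter index and correlation for a model-versus-buoy comparison; the numbers are invented and only significant wave height is shown.

```python
import numpy as np

# Hypothetical hourly significant wave height (m): buoy observations vs. a
# model hindcast at the same location. Real assessments repeat this for all
# six IEC parameters, but the error statistics are computed the same way.
obs = np.array([1.8, 2.1, 2.5, 3.0, 2.7, 2.2])
mod = np.array([1.9, 2.0, 2.7, 3.3, 2.5, 2.1])

bias = np.mean(mod - obs)
rmse = np.sqrt(np.mean((mod - obs) ** 2))
scatter_index = rmse / np.mean(obs)          # normalised error
corr = np.corrcoef(mod, obs)[0, 1]           # linear correlation

print(f"bias={bias:.3f} m, RMSE={rmse:.3f} m, "
      f"SI={scatter_index:.2%}, r={corr:.3f}")
```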

  4. Uncertainty Assessment in Urban Storm Water Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    The object of this paper is to make an overall description of the author's PhD study, concerning uncertainties in numerical urban storm water drainage models. Initially an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...

  5. A comparison of radiological risk assessment models: Risk assessment models used by the BEIR V Committee, UNSCEAR, ICRP, and EPA (for NESHAP)

    International Nuclear Information System (INIS)

    Wahl, L.E.

    1994-03-01

    Radiological risk assessments and resulting risk estimates have been developed by numerous national and international organizations, including the National Research Council's fifth Committee on the Biological Effects of Ionizing Radiations (BEIR V), the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), and the International Commission on Radiological Protection (ICRP). A fourth organization, the Environmental Protection Agency (EPA), has also performed a risk assessment as a basis for the National Emission Standards for Hazardous Air Pollutants (NESHAP). This paper compares the EPA's model of risk assessment with the models used by the BEIR V Committee, UNSCEAR, and ICRP. Comparison is made of the values chosen by each organization for several model parameters: populations used in studies and population transfer coefficients, dose-response curves and dose-rate effects, risk projection methods, and risk estimates. This comparison suggests that the EPA has based its risk assessment on outdated information and that the organization should consider adopting the method used by the BEIR V Committee, UNSCEAR, or ICRP

  6. A model for assessment of telemedicine applications

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Ekeland, Anne Granstrøm; Jensen, Lise Kvistgaard

    2012-01-01

    Telemedicine applications could potentially solve many of the challenges faced by the healthcare sectors in Europe. However, a framework for assessment of these technologies is needed by decision makers to assist them in choosing the most efficient and cost-effective technologies. Therefore, in 2009 the European Commission initiated the development of a framework for assessing telemedicine applications, based on the users' need for information for decision making. This article presents the Model for ASsessment of Telemedicine applications (MAST) developed in this study.

  7. Forecasting consequences of accidental release: how reliable are current assessment models

    International Nuclear Information System (INIS)

    Rohwer, P.S.; Hoffman, F.O.; Miller, C.W.

    1983-01-01

    This paper focuses on uncertainties in model output used to assess accidents. We begin by reviewing the historical development of assessment models and the associated interest in uncertainties as these evolutionary processes occurred in the United States. This is followed by a description of the sources of uncertainties in assessment calculations. Types of models appropriate for assessment of accidents are identified. A summary is then provided of results from our analysis of uncertainty in predictions obtained with current methodology for assessing routine and accidental radionuclide releases to the environment. We conclude with a discussion of preferred procedures and suggested future directions to improve the state of the art of radiological assessments

  8. Users guide to REGIONAL-1: a regional assessment model

    International Nuclear Information System (INIS)

    Davis, W.E.; Eadie, W.J.; Powell, D.C.

    1979-09-01

    A guide was prepared to allow a user to run the PNL long-range transport model, REGIONAL 1. REGIONAL 1 is a computer model set up to run atmospheric assessments on a regional basis. The model has the capability of being run in three modes for a single time period. The three modes are: (1) no deposition, (2) dry deposition, (3) wet and dry deposition. The guide provides the physical and mathematical basis used in the model for calculating transport, diffusion, and deposition for all three modes. Also the guide includes a program listing with an explanation of the listings and an example in the form of a short-term assessment for 48 hours. The purpose of the example is to allow a person who has past experience with programming and meteorology to operate the assessment model and compare his results with the guide results. This comparison will assure the user that the program is operating in a proper fashion

  9. A Corrosion Risk Assessment Model for Underground Piping

    Science.gov (United States)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model for pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground based pressure systems and applying the risk models to risk-based inspection strategies of the underground pipes at ARC.
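
    The structure of such a calculation can be sketched as a Monte Carlo simulation that combines simple stand-ins for the three submodels; every distribution, rate and threshold below is an illustrative assumption, not ARC data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000  # Monte Carlo trials

# Hypothetical submodels for one buried pipe segment (all numbers illustrative):
wall0 = 7.0                                   # nominal wall thickness, mm
corrosion_rate = rng.lognormal(np.log(0.08), 0.5, size=n)   # mm/yr
wrap_intact = rng.random(n) > 0.15            # pipe-wrap protection model
rate = np.where(wrap_intact, 0.2 * corrosion_rate, corrosion_rate)
years = 30.0
remaining_wall = wall0 - rate * years         # corrosion model

# Pipe stress model: failure if remaining wall cannot carry the hoop stress.
required_wall = rng.normal(3.0, 0.4, size=n)  # mm, demand with uncertainty
failure = remaining_wall < required_wall

print(f"Estimated failure probability: {failure.mean():.4f}")
```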

  10. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment
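
    The compartment-and-flux idea behind such a modular ecosystem model can be sketched as a small linear transfer system; the three compartments, rate constants and source term below are assumed for illustration and are not GEMA parameter values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical three-compartment landscape element (e.g., soil, water, sediment)
# with first-order transfers and radioactive decay; rates are illustrative only.
lam = 1e-3                      # decay constant, 1/yr
K = np.array([                  # transfer-rate matrix K[i, j]: from j to i, 1/yr
    [0.00, 0.02, 0.00],
    [0.05, 0.00, 0.01],
    [0.00, 0.10, 0.00],
])
source = np.array([1.0, 0.0, 0.0])   # constant release into compartment 0, Bq/yr

def dinv_dt(t, inv):
    outflow = K.sum(axis=0) * inv        # what leaves each compartment
    inflow = K @ inv                     # what arrives from the others
    return source + inflow - outflow - lam * inv

sol = solve_ivp(dinv_dt, (0.0, 1000.0), y0=[0.0, 0.0, 0.0], rtol=1e-8)
print("Inventories at t = 1000 yr (Bq):", sol.y[:, -1])
```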

  11. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    International Nuclear Information System (INIS)

    Klos, Richard

    2008-03-01

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  12. Review and assessment of pool scrubbing models

    International Nuclear Information System (INIS)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez, J.

    1996-01-01

    Decontamination of fission-product-bearing bubbles as they pass through aqueous pools becomes a crucial phenomenon for source term evaluation of hypothetical risk-dominant sequences of Light Water Reactors. In the present report a peer review and assessment of models encapsulated in the SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both a modelling and an experimental point of view. In addition, RHR and SGTR sequences were simulated with the SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas in need of further research have been identified. (Author) 73 refs

  13. Review and assessment of pool scrubbing models

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez, J.

    1996-07-01

    Decontamination of fission-product-bearing bubbles as they pass through aqueous pools becomes a crucial phenomenon for source term evaluation of hypothetical risk-dominant sequences of Light Water Reactors. In the present report a peer review and assessment of models encapsulated in the SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both a modelling and an experimental point of view. In addition, RHR and SGTR sequences were simulated with the SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas in need of further research have been identified. (Author) 73 refs.

  14. Uncertainties in environmental radiological assessment models and their implications

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible
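
    A minimal sketch of the recommended stochastic procedure, under assumed distributions: sample the uncertain parameters, propagate them through a placeholder dose model, and rank the parameters by their rank correlation with the output.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 5000

# Hypothetical assessment model with three uncertain parameters (illustrative).
params = {
    "deposition_velocity": rng.lognormal(np.log(1e-3), 0.6, n),
    "transfer_factor":     rng.lognormal(np.log(0.1),  0.4, n),
    "intake_rate":         rng.normal(500.0, 50.0, n),
}
dose = (params["deposition_velocity"] * params["transfer_factor"]
        * params["intake_rate"])               # placeholder dose model

# Rank parameters by their contribution to the predicted uncertainty
# (Spearman rank correlation between each input sample and the output).
ranking = sorted(
    ((name, abs(spearmanr(x, dose)[0])) for name, x in params.items()),
    key=lambda item: item[1], reverse=True)
for name, rho in ranking:
    print(f"{name:>20s}  |rho| = {rho:.2f}")
```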

  15. Using models in Integrated Ecosystem Assessment of coastal areas

    Science.gov (United States)

    Solidoro, Cosimo; Bandelj, Vinko; Cossarini, Gianpiero; Melaku Canu, Donata; Libralato, Simone

    2014-05-01

    Numerical models can greatly contribute to integrated ecological assessment of coastal and marine systems. Indeed, models can: i) assist in the identification of efficient sampling strategies; ii) provide spatial interpolation and temporal extrapolation of experimental data, based on the knowledge of process dynamics and causal relationships coded within the model; iii) provide estimates of hardly measurable indicators. Furthermore, models can provide indications of the potential effects of implementing alternative management policies. Finally, by providing a synthetic representation of an ideal system based on its essential dynamics, models return a picture of the ideal behaviour of a system in the absence of external perturbation, alteration and noise, which might help in the identification of a reference behaviour. As an important example, model-based reanalyses of biogeochemical and ecological properties are urgently needed for estimating environmental status and assessing the efficacy of conservation and environmental policies, also with reference to the enforcement of the European MSFD. However, the use of numerical models, and particularly of ecological models, in environmental modelling and management is still far from being the rule, possibly because the benefits that a full integration of modelling and monitoring systems might provide are not appreciated, possibly because of a lack of trust in modelling results, or because many problems still exist in the development, validation and implementation of models. For instance, assessing the validity of model results is a complex process that requires the definition of appropriate indicators, metrics and methodologies, and faces the scarcity of real-time in-situ biogeochemical data. Furthermore, biogeochemical models typically consider dozens of variables which are heavily undersampled. Here we show how the integration of mathematical models and monitoring data can support integrated ecosystem

  16. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    Directory of Open Access Journals (Sweden)

    Moiz Mumtaz

    2012-01-01

    Full Text Available Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting “in silico” tools such as physiologically based pharmacokinetic (PBPK models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application—health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The “human PBPK model toolkit” is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures.
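
    The kind of internal-dose calculation that PBPK models refine can be sketched, in a deliberately reduced form, as a one-compartment kinetic model; the dose, volume of distribution and elimination rate below are assumed values, not toolkit parameters.

```python
import numpy as np

# Minimal one-compartment kinetic sketch of an internal-dose calculation.
# Real PBPK models resolve multiple organs/tissues with flow-limited transfer;
# all parameter values below are illustrative assumptions.
dose_mg = 5.0          # absorbed dose per exposure event
V_d = 42.0             # volume of distribution, L
k_el = 0.10            # first-order elimination rate, 1/h

t = np.linspace(0.0, 48.0, 200)                 # hours after exposure
conc = (dose_mg / V_d) * np.exp(-k_el * t)      # plasma concentration, mg/L

# Area under the curve as an internal-dose metric (trapezoidal rule).
auc = np.sum(0.5 * (conc[1:] + conc[:-1]) * np.diff(t))
print(f"Cmax = {conc[0]:.3f} mg/L, AUC(0-48 h) = {auc:.2f} mg*h/L")
```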

  17. The assessment of two-fluid models using critical flow data

    International Nuclear Information System (INIS)

    Shome, B.; Lahey, R.T. Jr.

    1992-01-01

    The behavior of two-phase flow is governed by the thermal-hydraulic transfers occurring across phasic interfaces. If correctly formulated, two-fluid models should yield all conceivable evolutions. Moreover, some experiments may be uniquely qualified for model assessment if they can isolate important closure models. This paper is primarily concerned with the possible assessment of the virtual mass force using air-water critical flow data, in which phase-change effects do not take place. The following conclusions can be drawn from this study: (1) The closure parameters, other than those for virtual mass, were found to have an insignificant effect on critical flow. In contrast, the void fraction profile and the slip ratio were observed to be sensitive to the virtual mass model. (2) It appears that air-water critical flow experiments may be effectively used for the assessment of the virtual mass force used in two-fluid models. In fact, such experiments are unique in their ability to isolate the spatial gradients in virtual mass models. It is hoped that this study will help stimulate the conduct of further critical flow experiments for the assessment of two-fluid models

  18. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to an increasing number of user conflicts, with a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments include different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. This paper concentrates on fish habitat simulation models, with methods and examples from Norway. Some ideas on integrated modelling tools for impact assessment studies are included. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored in a multi-disciplinary study. The choice of a suitable model should be based on available data and possible data acquisition, available manpower, computer and software resources, and the needed output and accuracy of the output. 58 refs

  19. Assessment of volcanic hazards, vulnerability, risk and uncertainty (Invited)

    Science.gov (United States)

    Sparks, R. S.

    2009-12-01

    A volcanic hazard is any phenomenon that threatens communities. These hazards include volcanic events like pyroclastic flows, explosions, ash fall and lavas, and secondary effects such as lahars and landslides. Volcanic hazards are described by the physical characteristics of the phenomena, by the assessment of the areas that they are likely to affect and by the magnitude-dependent return period of events. Volcanic hazard maps are generated by mapping past volcanic events and by modelling the hazardous processes. Both these methods have their strengths and limitations and a robust map should use both approaches in combination. Past records, studied through stratigraphy, the distribution of deposits and age dating, are typically incomplete and may be biased. Very significant volcanic hazards, such as surge clouds and volcanic blasts, are not well preserved in the geological record, for example. Models of volcanic processes are very useful to help identify hazardous areas that do not have any geological evidence. They are, however, limited by simplifications and incomplete understanding of the physics. Many practical volcanic hazard mapping tools are also very empirical. Hazard maps are typically abstracted into hazard zone maps, which are sometimes called threat or risk maps. Their aim is to identify areas at high levels of threat, and the boundaries between zones may take account of other factors such as roads, escape routes during evacuation, and infrastructure. These boundaries may change with time due to new knowledge of the hazards or changes in volcanic activity levels. Alternatively they may remain static but the implications of the zones may change as volcanic activity changes. Zone maps are used for planning purposes and for management of volcanic crises. Volcanic hazard maps are depictions of the likelihood of future volcanic phenomena affecting places and people. Volcanic phenomena are naturally variable, often complex and not fully understood. There are

  20. Proposing an Environmental Excellence Self-Assessment Model

    DEFF Research Database (Denmark)

    Meulengracht Jensen, Peter; Johansen, John; Wæhrens, Brian Vejrum

    2013-01-01

    This paper presents an Environmental Excellence Self-Assessment (EEA) model based on the structure of the European Foundation of Quality Management Business Excellence Framework. Four theoretical scenarios for deploying the model are presented, as well as managerial implications, suggesting that the EEA model can be used in global organizations to differentiate environmental efforts depending on the maturity stage of the individual sites. Furthermore, the model can be used to support the decision-making process regarding when organizations should embark on more complex environmental efforts.

  1. GEMA3D - landscape modelling for dose assessments

    International Nuclear Information System (INIS)

    Klos, Richard

    2010-08-01

    Concerns have been raised about SKB's interpretation of landscape objects in their radiological assessment models, specifically in relation to the size of the objects represented - leading to excessive volumetric dilution - and to the interpretation of local hydrology - leading to non-conservative hydrologic dilution. Developed from the Generic Ecosystem Modelling Approach, GEMA3D is an attempt to address these issues in a simple radiological assessment landscape model. In GEMA3D landscape features are modelled as landscape elements (lels) based on a three-compartment structure which is able to represent both terrestrial and aquatic lels. The area of the lels can be chosen to coincide with the bedrock fracture from which radionuclides are assumed to be released, and the dispersion of radionuclides throughout the landscape can be traced. Results indicate that released contaminants remain localised close to the release location and follow the main flow axis of the surface drainage system. This is true even for relatively weakly sorbing species. An interpretation of the size of landscape elements suitable to represent dilution in the biosphere for radiological assessment purposes is suggested, though the concept remains flexible. For reference purposes an agricultural area of one hectare is the baseline. The Quaternary deposits (QD) at the Forsmark site are only a few metres thick above the crystalline bedrock in which the planned repository for spent fuel will be constructed. The biosphere model is assumed to be the upper one metre of the QD. A further model has been implemented for advective-dispersive transport in the deeper QD. The effects of chemical zonation have been briefly investigated. The results confirm the importance of retention close to the release point from the bedrock and clearly indicate that there is a need for a better description of the hydrology of the QD on the spatial scales relevant to the lels required for radiological assessments

  2. Scale changes in air quality modelling and assessment of associated uncertainties

    International Nuclear Information System (INIS)

    Korsakissok, Irene

    2009-01-01

    After an introduction of issues related to a scale change in the field of air quality (existing scales for emissions, transport, turbulence and loss processes, hierarchy of data and models, methods of scale change), the author first presents Gaussian models which have been implemented within the Polyphemus modelling platform. These models are assessed by comparison with experimental observations and with other commonly used Gaussian models. The second part reports the coupling of the puff-based Gaussian model with the Eulerian Polair3D model for the sub-mesh processing of point sources. This coupling is assessed at the continental scale for a passive tracer, and at the regional scale for photochemistry. Different statistical methods are assessed
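
    For readers unfamiliar with the Gaussian models referred to here, a minimal sketch of the standard Gaussian plume formula (with ground reflection) is given below; it is not Polyphemus code, and the dispersion parameters are assumed values.

```python
import numpy as np

def gaussian_plume(y, z, Q, u, H, sigma_y, sigma_z):
    """Gaussian plume concentration (g/m^3) with ground reflection.

    y crosswind, z vertical (m); Q source strength (g/s); u wind speed (m/s);
    H effective release height (m); sigma_y/sigma_z are the dispersion
    parameters evaluated at the downwind distance of interest (m).
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative values: plume centreline, ground level, sigmas at ~1 km downwind.
c = gaussian_plume(y=0.0, z=0.0, Q=10.0, u=5.0, H=50.0,
                   sigma_y=80.0, sigma_z=40.0)
print(f"Concentration: {c:.2e} g/m^3")
```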

  3. The importance of trajectory modelling in accident consequence assessments

    International Nuclear Information System (INIS)

    Jones, J.A.; Williams, J.A.; Hill, M.D.

    1988-01-01

    Most atmospheric dispersion models used at present for probabilistic risk assessment (PRA) are linear: they take account of changes in wind speed but not in wind direction after the first hour. A trajectory model, which follows the changing wind direction, therefore gives a more realistic description of the cloud's behaviour. However, the extra complexity means that the computing costs increase. This is an important factor for the MARIA code, which is intended to be run on computers of varying power. The numbers of early effects predicted by a linear model and a trajectory model in a probabilistic risk assessment were compared to see which model should be preferred. The trajectory model predicted about 25% fewer expected early deaths and 30% more people evacuated than the linear model. However, the trajectory model took about ten times longer to calculate its results. The choice between the two models may depend on the speed of the computer available

  4. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.; Katzfuss, M.; Hu, J.; Johnson, V. E.

    2014-01-01

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.

  5. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.

    2014-09-16

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.

  6. The PP&L Nuclear Department model for conducting self-assessments

    International Nuclear Information System (INIS)

    Murthy, M.L.R.; Vernick, H.R.; Male, A.M.; Burchill, W.E.

    1995-01-01

    The nuclear department of Pennsylvania Power & Light Company (PP&L) has initiated an aggressive, methodical, self-assessment program. Self-assessments are conducted to prevent problems, improve performance, and monitor results. The assessment activities are conducted by, or for, an individual having responsibility for performing the work being assessed. This individual, or customer, accepts ownership of the assessment effort and commits to implementing the recommendations agreed on during the assessment. This paper discusses the main elements of the assessment model developed by PP&L and the results the model has achieved to date

  7. The Effect of Computer Models as Formative Assessment on Student Understanding of the Nature of Models

    Science.gov (United States)

    Park, Mihwa; Liu, Xiufeng; Smith, Erica; Waight, Noemi

    2017-01-01

    This study reports the effect of computer models as formative assessment on high school students' understanding of the nature of models. Nine high school teachers integrated computer models and associated formative assessments into their yearlong high school chemistry course. A pre-test and post-test of students' understanding of the nature of…

  8. A Comprehensive Assessment Model for Critical Infrastructure Protection

    Directory of Open Access Journals (Sweden)

    Häyhtiö Markus

    2017-12-01

    Full Text Available International business demands seamless service and IT infrastructure throughout the entire supply chain. However, dependencies between different parts of this vulnerable ecosystem form a fragile web. Assessment of the financial effects of any abnormalities in any part of the network is required in order to protect this network in a financially viable way. The contractual environment between the actors in a supply chain, spanning different business domains and functions, requires a management model that enables network-wide protection of critical infrastructure. In this paper the authors introduce such a model. It can be used to assess the financial differences between centralized and decentralized protection of critical infrastructure. As an end result of this assessment, business resilience to unknown threats can be improved across the entire supply chain.

  9. Models and parameters for environmental radiological assessments

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C W [ed.]

    1984-01-01

    This book presents a unified compilation of models and parameters appropriate for assessing the impact of radioactive discharges to the environment. Models examined include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Chapters have been entered separately into the data base. (ACR)

  10. Models and parameters for environmental radiological assessments

    International Nuclear Information System (INIS)

    Miller, C.W.

    1984-01-01

    This book presents a unified compilation of models and parameters appropriate for assessing the impact of radioactive discharges to the environment. Models examined include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Chapters have been entered separately into the data base

  11. Improving Flood Damage Assessment Models in Italy

    Science.gov (United States)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage for each land use class. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the first is calculated based on land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show an overestimation of asset damage from non-calibrated SDC values of up to a factor of 4.5 for the tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than half of the amount predicted by the standard SDC methods.
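
    The core SDC calculation can be sketched as a piecewise-linear depth-damage interpolation; the curve and exposure value below are illustrative assumptions, not the calibrated Northern Italy values.

```python
import numpy as np

# Hypothetical stage-damage curve for one land-use class: water depth (m)
# versus damage fraction of the exposed asset value (illustrative only).
depth_pts   = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])
damage_frac = np.array([0.0, 0.15, 0.35, 0.60, 0.80, 0.90])

def flood_damage(depth_m, exposed_value_eur):
    frac = np.interp(depth_m, depth_pts, damage_frac)  # piecewise-linear SDC
    return frac * exposed_value_eur

# One residential cell: 1.3 m of water over 250 000 EUR of exposed assets.
print(f"Estimated direct damage: {flood_damage(1.3, 250_000):,.0f} EUR")
```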

  12. Skill and independence weighting for multi-model assessments

    International Nuclear Information System (INIS)

    Sanderson, Benjamin M.; Wehner, Michael; Knutti, Reto

    2017-01-01

    We present a weighting strategy for use with the CMIP5 multi-model archive in the fourth National Climate Assessment, which considers both skill in the climatological performance of models over North America and the inter-dependency of models arising from common parameterizations or tuning practices. The method exploits information relating to the climatological mean state of a number of projection-relevant variables as well as metrics representing long-term statistics of weather extremes. The weights, once computed, can be used to compute weighted means and significance information from an ensemble containing multiple initial-condition members from potentially co-dependent models of varying skill. Two parameters in the algorithm determine the degree to which model climatological skill and model uniqueness are rewarded; these parameters are explored and final values are defended for the assessment. The influence of model weighting on projected temperature and precipitation changes is found to be moderate, partly due to a compensating effect between model skill and uniqueness. However, more aggressive skill weighting and weighting by targeted metrics is found to have a more significant effect on inferred ensemble confidence in future patterns of change for a given projection.
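
    A toy sketch of this kind of weighting, broadly following the published form but with assumed distances and shape parameters: skill decays with model-observation distance, and models that closely resemble others are down-weighted.

```python
import numpy as np

# Illustrative skill/independence weighting of a four-member ensemble.
# Real applications use distances in a space of climatological metrics; here
# both the distances and the shape parameters D_q, D_u are assumed values.
proj  = np.array([2.1, 2.4, 2.3, 3.0])   # projected warming per model (K)
d_obs = np.array([0.4, 0.9, 0.5, 1.2])   # model-observation distance (skill)
d_mod = np.array([[0.0, 0.8, 0.2, 1.1],  # model-model distances (uniqueness)
                  [0.8, 0.0, 0.7, 1.0],
                  [0.2, 0.7, 0.0, 1.2],
                  [1.1, 1.0, 1.2, 0.0]])
D_q, D_u = 0.8, 0.5                      # skill and independence radii

skill = np.exp(-(d_obs / D_q) ** 2)
similarity = np.exp(-(d_mod / D_u) ** 2)
independence = 1.0 / similarity.sum(axis=1)   # penalise near-duplicate models

w = skill * independence
w /= w.sum()
print("weights:", np.round(w, 3))
print(f"weighted mean projection: {np.dot(w, proj):.2f} K")
```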

  13. Modelling of the radiological impact of radioactive waste dumping in the Arctic Seas. Report of the Modelling and Assessment Working Group of the International Arctic Seas Assessment Project (IASAP)

    International Nuclear Information System (INIS)

    2003-01-01

    The work carried out by the Modelling and Assessment Working Group in 1994-1996 is summarized. The Modelling and Assessment Working Group was established within the framework of the International Arctic Seas Assessment Project (IASAP), launched by the IAEA in 1993, with the objectives of modelling the environmental dispersal and transport of nuclides potentially released from the dumped objects and of assessing the associated radiological impact on man and biota. Models were developed to simulate the dispersal of the pollutants and to assess the radiological consequences of the releases from the dumped wastes in the Arctic. The results of the model intercomparison exercise were used as a basis on which to evaluate the estimated concentration fields when detailed source term scenarios were used, and also to assess the uncertainties in the ensuing dose calculations. The description and modelling work was divided into three main phases: description of the area and collection of relevant and necessary information; extension and development of predictive models, including an extensive model intercomparison; and finally prediction of the radiological impact, used in the evaluation of the need and options for remediation

  14. Landslides in Nicaragua - Mapping, Inventory, Hazard Assessment, Vulnerability Reduction, and Forecasting Attempts

    Science.gov (United States)

    Dévoli, G.; Strauch, W.; Álvarez, A.; Muñoz, A.; Kjekstad, O.

    2009-04-01

    access, manage, update and distribute in a short time to all sectors and users; and finally, the need for a comprehensive understanding of landslide processes. Many efforts have been made in the last 10 years to gain a more comprehensive and predictive understanding of landslide processes in Nicaragua. Since 1998, GIS-based landslide inventory maps have been produced in different areas of the country as part of international and multidisciplinary development projects. Landslide susceptibility and hazard maps are now available on INETER's website for all municipalities of the country. The insights on landslide hazard have been transmitted to governmental agencies, local authorities, NGOs and international agencies to be used in measures for risk reduction. A massive application example was the integration of hazard assessment studies in a large house-building program in Nicaragua. Hazards from landslides and other dangerous phenomena were evaluated in more than 90 house-building projects, each with 50 - 200 houses to be built, sited mainly in rural areas of the country. For more than 7000 families, this program could finally assure that their new houses were built in safe areas. Attempts have been made to develop a strategy for early warning of landslides in Nicaragua. First approaches relied on precipitation gauges with satellite-based telemetry, which were installed on some Nicaraguan volcanoes where lahars occur frequently. The occurrence of lahars in certain gullies could be detected by seismic stations. A software system gave an acoustic alarm at INETER's Monitoring Centre when certain trigger levels of the accumulated precipitation were reached. The monitoring and early warning for all areas at risk would have required many rain gauges. A new concept is being tested which uses near-real-time precipitation estimates from NOAA meteorological satellite data. A software system sends out alarm messages if strong or long-lasting rains are observed over certain landslide "hot spots

  15. Predicting the ungauged basin: Model validation and realism assessment

    Directory of Open Access Journals (Sweden)

    Tim van Emmerik

    2015-10-01

    Full Text Available The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to the limited number of published studies on genuinely ungauged basins, model validation and realism assessment of model outcomes have not been discussed to a great extent. With this paper we aim to contribute to the discussion on how one can determine the value and validity of a hydrological model developed for an ungauged basin. As in many cases no local, or even regional, data are available, alternative methods should be applied. Using a PUB case study in a genuinely ungauged basin in southern Cambodia, we give several examples of how one can use different types of soft data to improve model design, calibrate and validate the model, and assess the realism of the model output. A rainfall-runoff model was coupled to an irrigation reservoir, allowing the use of additional and unconventional data. The model was mainly forced with remote sensing data, and local knowledge was used to constrain the parameters. Model realism assessment was done using data from surveys. This resulted in a successful reconstruction of the reservoir dynamics, and revealed the different hydrological characteristics of the two topographical classes. This paper does not present a generic approach that can be transferred to other ungauged catchments, but it aims to show how clever model design and alternative data acquisition can result in a valuable hydrological model for an ungauged catchment.

  16. A Methodology to Assess Ionospheric Models for GNSS

    Science.gov (United States)

    Rovira-Garcia, Adria; Juan, José Miguel; Sanz, Jaume; González-Casado, Guillermo; Ibánez, Deimos

    2015-04-01

    Testing the accuracy of the ionospheric models used in the Global Navigation Satellite System (GNSS) is a long-standing issue. It is still a challenging problem due to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology to assess any ionospheric model used in satellite-based applications and, in particular, GNSS ionospheric models. The methodology complements other analyses comparing the navigation performance obtained when different models are used to correct the code and carrier-phase observations. Specifically, the following ionospheric models are assessed: the operational models broadcast in the Global Positioning System (GPS), Galileo and the European Geostationary Navigation Overlay System (EGNOS), the post-process Global Ionospheric Maps (GIMs) from different analysis centers belonging to the International GNSS Service (IGS) and, finally, a new GIM computed by the gAGE/UPC research group. The methodology is based on the comparison between the predictions of the ionospheric model and actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated into the hardware delays (a receiver constant plus a satellite constant) per data interval, e.g., a day. The condition that these Differential Code Biases (DCBs) are commonly shared throughout the world-wide network of receivers and satellites provides a global character to the assessment. This approach generalizes simple tests based on double-differenced Slant Total Electron Contents (STECs) between pairs of satellites and receivers on a much more local scale. The present study has been conducted over the entire year 2014, i.e., the last Solar Maximum. The seasonal and latitudinal structures of the results clearly reflect the different strategies used by the different models. On one hand, ionospheric model corrections based on a grid (IGS-GIMs or EGNOS) are shown to be several times better than the models
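
    The bias-separation step at the heart of this methodology can be sketched as a small least-squares problem; the receiver/satellite pairs and the difference values below are toy numbers used only to show the mechanics.

```python
import numpy as np

# Toy data: each entry is a (receiver, satellite) pair with the difference
# "model STEC minus unambiguous carrier-phase STEC" (TEC units) for one day.
# A per-receiver and a per-satellite constant (the hardware delays / DCBs) are
# separated from the differences; the remaining residual reflects model error.
rec  = np.array([0, 0, 0, 1, 1, 2, 2, 2])
sat  = np.array([0, 1, 2, 0, 2, 0, 1, 2])
diff = np.array([2.3, 1.6, 2.0, 2.8, 2.5, 1.9, 1.2, 1.7])

n_rec, n_sat = rec.max() + 1, sat.max() + 1
A = np.zeros((diff.size, n_rec + n_sat))
A[np.arange(diff.size), rec] = 1.0              # receiver-constant columns
A[np.arange(diff.size), n_rec + sat] = 1.0      # satellite-constant columns

# The receiver/satellite split is rank-deficient by one common offset;
# lstsq returns the minimum-norm solution, which is enough to detrend.
biases, *_ = np.linalg.lstsq(A, diff, rcond=None)
residual = diff - A @ biases

print(f"post-fit RMS (assessment metric): {np.sqrt(np.mean(residual**2)):.2f} TECu")
```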

  17. Model Test Bed for Evaluating Wave Models and Best Practices for Resource Assessment and Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Neary, Vincent Sinclair [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Yang, Zhaoqing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Wang, Taiping [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Gunawan, Budi [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Dallman, Ann Renee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies

    2016-03-01

    A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101Ed. 1.0 ©2015. Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically what source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.

  18. Protein single-model quality assessment by feature-based probability density functions.

    Science.gov (United States)

    Cao, Renzhi; Cheng, Jianlin

    2016-04-04

    Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method, Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob has been blindly tested on the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob makes contributions to our protein tertiary structure predictor MULTICOM, which is officially ranked 3rd out of 143 predictors. The good performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The webserver of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is now freely available in the web server of Qprob.

  19. Model error assessment of burst capacity models for energy pipelines containing surface cracks

    International Nuclear Information System (INIS)

    Yan, Zijian; Zhang, Shenwei; Zhou, Wenxing

    2014-01-01

    This paper develops the probabilistic characteristics of the model errors associated with five well-known burst capacity models/methodologies for pipelines containing longitudinally oriented external surface cracks, namely the Battelle and CorLAS™ models as well as the failure assessment diagram (FAD) methodologies recommended in BS 7910 (2005), API RP579 (2007) and R6 (Rev 4, Amendment 10). A total of 112 full-scale burst test data for cracked pipes subjected to internal pressure only were collected from the literature. The model error for a given burst capacity model is evaluated based on the ratios of the test to predicted burst pressures for the collected data. Analysis results suggest that the CorLAS™ model is the most accurate model among the five models considered and that the Battelle, BS 7910, API RP579 and R6 models are in general conservative; furthermore, the API RP579 and R6 models are markedly more accurate than the Battelle and BS 7910 models. The results will facilitate the development of reliability-based structural integrity management of pipelines. - Highlights: • Model errors for five burst capacity models for pipelines containing surface cracks are characterized. • Basic statistics of the model errors are obtained based on test-to-predicted ratios. • Results will facilitate reliability-based design and assessment of energy pipelines
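
    The model-error statistics themselves are straightforward to compute once the test-to-predicted ratios are assembled, as in this sketch with invented ratios (the real study uses 112 of them per model).

```python
import numpy as np

# Hypothetical test-to-predicted burst-pressure ratios for one capacity model
# (one ratio per full-scale burst test); values are illustrative only.
ratio = np.array([1.05, 0.98, 1.21, 1.10, 0.92, 1.30, 1.08, 1.15])

mean_err = ratio.mean()                       # mean model error
cov_err  = ratio.std(ddof=1) / mean_err       # coefficient of variation
print(f"mean = {mean_err:.3f}, COV = {cov_err:.3f}")
print("model is conservative on average" if mean_err > 1.0
      else "model is non-conservative on average")
```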

  20. Modeling human intention formation for human reliability assessment

    International Nuclear Information System (INIS)

    Woods, D.D.; Roth, E.M.; Pople, H. Jr.

    1988-01-01

    This paper describes a dynamic simulation capability for modeling how people form intentions to act in nuclear power plant emergency situations. This modeling tool, the Cognitive Environment Simulation or CES, was developed based on techniques from artificial intelligence. It simulates the cognitive processes that determine situation assessment and intention formation. It can be used to investigate analytically which situations and factors lead to intention failures, what actions follow from intention failures (e.g. errors of omission, errors of commission, common mode errors), the ability to recover from errors or additional machine failures, and the effects of changes in the NPP person-machine system. One application of the CES modeling environment is to enhance the measurement of the human contribution to risk in probabilistic risk assessment studies. (author)

  1. Connecting single-stock assessment models through correlated survival

    DEFF Research Database (Denmark)

    Albertsen, Christoffer Moesgaard; Nielsen, Anders; Thygesen, Uffe Høgsbro

    2017-01-01

    times. We propose a simple alternative. In three case studies each with two stocks, we improve the single-stock models, as measured by Akaike information criterion, by adding correlation in the cohort survival. To limit the number of parameters, the correlations are parameterized through...... the corresponding partial correlations. We consider six models where the partial correlation matrix between stocks follows a band structure ranging from independent assessments to complex correlation structures. Further, a simulation study illustrates the importance of handling correlated data sufficiently...... by investigating the coverage of confidence intervals for estimated fishing mortality. The results presented will allow managers to evaluate stock statuses based on a more accurate evaluation of model output uncertainty. The methods are directly implementable for stocks with an analytical assessment and do...
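
    A band-structured partial correlation matrix of the kind described can be mapped to an ordinary correlation matrix by treating it as the negative off-diagonal part of a unit-diagonal precision matrix; the sketch below is an illustration of that parameterization under assumed values, not the authors' implementation.

```python
import numpy as np

def corr_from_partial(partial):
    """Build a correlation matrix whose partial correlations equal `partial`.

    `partial` has ones on the diagonal; off-diagonal entries outside the
    chosen band are zero. Requires the implied precision matrix to be
    positive definite.
    """
    omega = -np.asarray(partial, dtype=float)
    np.fill_diagonal(omega, 1.0)          # precision matrix with unit diagonal
    sigma = np.linalg.inv(omega)          # covariance implied by the precision
    d = np.sqrt(np.diag(sigma))
    return sigma / np.outer(d, d)         # rescale to a correlation matrix

# Band structure (bandwidth 1) for hypothetical cohort-survival terms:
# adjacent entries are partially correlated, everything further apart is zero.
n = 5
partial = np.eye(n)
for i in range(n - 1):
    partial[i, i + 1] = partial[i + 1, i] = 0.4

print(np.round(corr_from_partial(partial), 2))
```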

  2. Development of assessment model for demand-side management investment programs in Korea

    International Nuclear Information System (INIS)

    Lee, Deok Ki; Park, Sang Yong; Park, Soo Uk

    2007-01-01

    The goal of this study is the development of an assessment model for demand-side management investment programs (DSMIPs) in the areas of natural gas and district heating. Demand-side management (DSM) is the process of managing the consumption of energy to optimize available and planned generation resources, and DSMIPs are the actions conducted by energy suppliers to promote investment in DSM. In this research, the analytic hierarchy process (AHP) method was used to develop a scientific and rational assessment model for DSMIPs. To apply the AHP method, assessment indicators were identified by using the concept of 'plan, do, see' and the decision-making hierarchy was established. Then an AHP model was developed to set up the priorities of the assessment indicators, and a survey of experts from government and energy suppliers was carried out. Finally, the priorities of the assessment indicators were calculated from the survey results using the AHP method. The assessment model developed in this research will actually be used to assess the results of DSMIPs being carried out by Korea Gas Corporation (KOGAS) and Korea District Heating Corporation (KDHC). The use of this assessment model is expected to enhance efficiency in the planning, execution, and assessment of DSMIPs.
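
    The core AHP computation referred to here derives priority weights from a pairwise comparison matrix via its principal eigenvector; a minimal sketch with a hypothetical 3x3 comparison matrix of assessment indicators:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three assessment indicators
# (Saaty's 1-9 scale); A[i, j] says how much more important i is than j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # priority weights, summing to 1

# Consistency ratio CI / RI, with RI = 0.58 for a 3x3 matrix (Saaty).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print(np.round(w, 3), round(ci / 0.58, 3))
```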

  3. Assessing work disability for social security benefits: international models for the direct assessment of work capacity.

    Science.gov (United States)

    Geiger, Ben Baumberg; Garthwaite, Kayleigh; Warren, Jon; Bambra, Clare

    2017-08-25

    It has been argued that social security disability assessments should directly assess claimants' work capacity, rather than relying on proxies such as functioning. However, there is little academic discussion of how such assessments could be conducted. The article presents an account of different models of direct disability assessments based on case studies of the Netherlands, Germany, Denmark, Norway, the United States of America, Canada, Australia, and New Zealand, utilising over 150 documents and 40 expert interviews. Three models of direct work disability assessments can be observed: (i) structured assessment, which measures the functional demands of jobs across the national economy and compares these to claimants' functional capacities; (ii) demonstrated assessment, which looks at claimants' actual experiences in the labour market and infers a lack of work capacity from the failure of a concerted rehabilitation attempt; and (iii) expert assessment, based on the judgement of skilled professionals. Direct disability assessment within social security is not just theoretically desirable, but can be implemented in practice. We have shown that there are three distinct ways that this can be done, each with different strengths and weaknesses. Further research is needed to clarify the costs, validity/legitimacy, and consequences of these different models. Implications for rehabilitation: It has recently been argued that social security disability assessments should directly assess work capacity rather than simply assessing functioning - but we have no understanding about how this can be done in practice. Based on case studies of nine countries, we show that direct disability assessment can be implemented, and argue that there are three different ways of doing it. These are "demonstrated assessment" (using claimants' experiences in the labour market), "structured assessment" (matching functional requirements to workplace demands), and "expert assessment" (the

  4. Stress testing hydrologic models using bottom-up climate change assessment

    Science.gov (United States)

    Stephens, C.; Johnson, F.; Marshall, L. A.

    2017-12-01

    Bottom-up climate change assessment is a promising approach for understanding the vulnerability of a system to potential future changes. The technique has been utilised successfully in risk-based assessments of future flood severity and infrastructure vulnerability. We find that it is also an ideal tool for assessing hydrologic model performance in a changing climate. In this study, we applied bottom-up climate change to compare the performance of two different hydrologic models (an event-based and a continuous model) under increasingly severe climate change scenarios. This allowed us to diagnose likely sources of future prediction error in the two models. The climate change scenarios were based on projections for southern Australia, which indicate drier average conditions with increased extreme rainfall intensities. We found that the key weakness in using the event-based model to simulate drier future scenarios was the model's inability to dynamically account for changing antecedent conditions. This led to increased variability in model performance relative to the continuous model, which automatically accounts for the wetness of a catchment through dynamic simulation of water storages. When considering more intense future rainfall events, representation of antecedent conditions became less important than assumptions around (non)linearity in catchment response. The linear continuous model we applied may underestimate flood risk in a future climate with greater extreme rainfall intensity. In contrast with the recommendations of previous studies, this indicates that continuous simulation is not necessarily the key to robust flood modelling under climate change. By applying bottom-up climate change assessment, we were able to understand systematic changes in relative model performance under changing conditions and deduce likely sources of prediction error in the two models.

  5. Erosion Assessment Modeling Using the Sateec Gis Model on the Prislop Catchment

    Directory of Open Access Journals (Sweden)

    Damian Gheorghe

    2014-05-01

    Full Text Available The Sediment Assessment Tool for Effective Erosion Control (SATEEC) acts as an extension for ArcView GIS 3, with easy-to-use commands. The erosion assessment is divided into two modules that consist of the Universal Soil Loss Equation (USLE) for sheet/rill erosion and the nLS/USPED modeling for gully head erosion. The SATEEC erosion modules can be successfully implemented for areas where sheet, rill and gully erosion occurs, such as the Prislop Catchment. The enhanced SATEEC system does not require experienced GIS users to operate it; it is therefore suitable for local authorities and/or students not so familiar with erosion modeling.
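
    The USLE module referred to above estimates average annual soil loss as the product A = R K LS C P; a minimal sketch with illustrative factor values follows (the values and units are assumptions, depending on the factor system used, not the Prislop inputs).

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: average annual soil loss A = R*K*LS*C*P.

    R rainfall-runoff erosivity, K soil erodibility, LS slope length-steepness,
    C cover-management factor, P support-practice factor.
    """
    return R * K * LS * C * P

# Hypothetical factor values for one raster cell.
print(usle_soil_loss(R=120.0, K=0.3, LS=1.8, C=0.25, P=1.0), "t/ha/yr")
```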

  6. Guide for developing conceptual models for ecological risk assessments

    International Nuclear Information System (INIS)

    Suter, G.W., II.

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, media, routes, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent so that connections between sources of exposure and receptors can be extended across operable units (OU). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs

  7. Consensus-based training and assessment model for general surgery.

    Science.gov (United States)

    Szasz, P; Louridas, M; de Montbrun, S; Harris, K A; Grantcharov, T P

    2016-05-01

    Surgical education is becoming competency-based with the implementation of in-training milestones. Training guidelines should reflect these changes and determine the specific procedures for such milestone assessments. This study aimed to develop a consensus view regarding operative procedures and tasks considered appropriate for junior and senior trainees, and the procedures that can be used as technical milestone assessments for trainee progression in general surgery. A Delphi process was followed where questionnaires were distributed to all 17 Canadian general surgery programme directors. Items were ranked on a 5-point Likert scale, with consensus defined as Cronbach's α of at least 0·70. Items rated 4 or above on the 5-point Likert scale by 80 per cent of the programme directors were included in the models. Two Delphi rounds were completed, with 14 programme directors taking part in round one and 11 in round two. The overall consensus was high (Cronbach's α = 0·98). The training model included 101 unique procedures and tasks, 24 specific to junior trainees, 68 specific to senior trainees, and nine appropriate to all. The assessment model included four procedures. A system of operative procedures and tasks for junior- and senior-level trainees has been developed along with an assessment model for trainee progression. These can be used as milestones in competency-based assessments. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
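
    Consensus here is judged with Cronbach's alpha; a minimal sketch of the computation on hypothetical Likert ratings is shown below (the ratings are invented for illustration, not the programme directors' responses).

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (n_raters x n_items) array of Likert scores."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                              # number of items
    item_var = ratings.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = ratings.sum(axis=1).var(ddof=1)       # variance of rater totals
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical 5-point Likert ratings from four raters on three items.
scores = [[4, 5, 4], [5, 5, 4], [4, 4, 3], [5, 4, 4]]
print(round(cronbach_alpha(scores), 2))
```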

  8. Performance and Cognitive Assessment in 3-D Modeling

    Science.gov (United States)

    Fahrer, Nolan E.; Ernst, Jeremy V.; Branoff, Theodore J.; Clark, Aaron C.

    2011-01-01

    The purpose of this study was to investigate identifiable differences between performance and cognitive assessment scores in a 3-D modeling unit of an engineering drafting course curriculum. The study aimed to provide further investigation of the need of skill-based assessments in engineering/technical graphics courses to potentially increase…

  9. Modelling requirements for future assessments based on FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those, which may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  10. evaluation of models for assessing groundwater vulnerability

    African Journals Online (AJOL)

    DR. AMINU

    applied models for groundwater vulnerability assessment mapping. The approaches .... The overall 'pollution potential' or DRASTIC index is established by applying the formula: DRASTIC Index: ... affected by the structure of the soil surface.
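
    The DRASTIC formula elided in this record is a weighted sum of seven rated hydrogeologic factors; the sketch below uses the standard DRASTIC weights (commonly attributed to Aller et al., 1987) and hypothetical ratings for one mapping cell.

```python
# Standard DRASTIC weights; factor ratings run from 1 to 10.
WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings):
    """DRASTIC index = sum of (rating x weight) over the seven factors:
    Depth to water, net Recharge, Aquifer media, Soil media, Topography,
    Impact of the vadose zone, hydraulic Conductivity."""
    return sum(WEIGHTS[f] * ratings[f] for f in WEIGHTS)

# Hypothetical ratings for one cell of the vulnerability map.
print(drastic_index({"D": 7, "R": 6, "A": 5, "S": 4, "T": 9, "I": 6, "C": 4}))
```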

  11. Modeling risk assessment for nuclear processing plants with LAVA

    International Nuclear Information System (INIS)

    Smith, S.T.; Tisinger, R.M.

    1988-01-01

    Using the Los Alamos Vulnerability and Risk Assessment (LAVA) methodology, the authors developed a model for assessing risks associated with nuclear processing plants. LAVA is a three-part systematic approach to risk assessment. The first part is the mathematical methodology; the second is the general personal computer-based software engine; and the third is the application itself. The methodology provides a framework for creating applications for the software engine to operate upon; all application-specific information is data. Using LAVA, the authors build knowledge-based expert systems to assess risks in applications systems comprising a subject system and a safeguards system. The subject system model consists of sets of threats, assets, and undesirable outcomes. The safeguards system model consists of sets of safeguards functions for protecting the assets from the threats by preventing or ameliorating the undesirable outcomes; sets of safeguards subfunctions whose performance determines whether the function is adequate and complete; and sets of issues, appearing as interactive questionnaires, whose measures (in both monetary and linguistic terms) define both the weaknesses in the safeguards system and the potential costs of an undesirable outcome occurring

  12. AgMIP: Next Generation Models and Assessments

    Science.gov (United States)

    Rosenzweig, C.

    2014-12-01

    Next steps in developing next-generation crop models fall into several categories: significant improvements in simulation of important crop processes and responses to stress; extension from simplified crop models to complex cropping systems models; and scaling up from site-based models to landscape, national, continental, and global scales. Crop processes that require major leaps in understanding and simulation in order to narrow uncertainties around how crops will respond to changing atmospheric conditions include genetics; carbon, temperature, water, and nitrogen; ozone; and nutrition. The field of crop modeling has been built on a single crop-by-crop approach. It is now time to create a new paradigm, moving from 'crop' to 'cropping system.' A first step is to set up the simulation technology so that modelers can rapidly incorporate multiple crops within fields, and multiple crops over time. Then the response of these more complex cropping systems can be tested under different sustainable intensification management strategies utilizing the updated simulation environments. Model improvements for diseases, pests, and weeds include developing process-based models for important diseases, frameworks for coupling air-borne diseases to crop models, gathering significantly more data on crop impacts, and enabling the evaluation of pest management strategies. Most smallholder farming in the world involves integrated crop-livestock systems that cannot be represented by crop modeling alone. Thus, next-generation cropping system models need to include key linkages to livestock. Livestock linkages to be incorporated include growth and productivity models for grasslands and rangelands as well as the usual annual crops. There are several approaches for scaling up, including use of gridded models and development of simpler quasi-empirical models for landscape-scale analysis. On the assessment side, AgMIP is leading a community process for coordinated contributions to IPCC AR6

  13. Formal safety assessment based on relative risks model in ship navigation

    Energy Technology Data Exchange (ETDEWEB)

    Hu Shenping [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: sphu@mmc.shmtu.edu.cn; Fang Quangen [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: qgfang@mmc.shmtu.edu.cn; Xia Haibo [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: hbxia@mmc.shmtu.edu.cn; Xi Yongtao [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: xiyt@mmc.shmtu.edu.cn

    2007-03-15

    Formal safety assessment (FSA) is a structured and systematic methodology aiming at enhancing maritime safety. It has been gradually and broadly used in the shipping industry nowadays around the world. On the basis of analysis and conclusion of FSA approach, this paper discusses quantitative risk assessment and generic risk model in FSA, especially frequency and severity criteria in ship navigation. Then it puts forward a new model based on relative risk assessment (MRRA). The model presents a risk-assessment approach based on fuzzy functions and takes five factors into account, including detailed information about accident characteristics. It has already been used for the assessment of pilotage safety in Shanghai harbor, China. Consequently, it can be proved that MRRA is a useful method to solve the problems in the risk assessment of ship navigation safety in practice.

  14. Formal safety assessment based on relative risks model in ship navigation

    International Nuclear Information System (INIS)

    Hu Shenping; Fang Quangen; Xia Haibo; Xi Yongtao

    2007-01-01

    Formal safety assessment (FSA) is a structured and systematic methodology aiming at enhancing maritime safety. It has been gradually and broadly used in the shipping industry nowadays around the world. On the basis of analysis and conclusion of FSA approach, this paper discusses quantitative risk assessment and generic risk model in FSA, especially frequency and severity criteria in ship navigation. Then it puts forward a new model based on relative risk assessment (MRRA). The model presents a risk-assessment approach based on fuzzy functions and takes five factors into account, including detailed information about accident characteristics. It has already been used for the assessment of pilotage safety in Shanghai harbor, China. Consequently, it can be proved that MRRA is a useful method to solve the problems in the risk assessment of ship navigation safety in practice

  15. Testing of an accident consequence assessment model using field data

    International Nuclear Information System (INIS)

    Homma, Toshimitsu; Matsubara, Takeshi; Tomita, Kenichi

    2007-01-01

    This paper presents the results obtained from the application of an accident consequence assessment model, OSCAAR, to the Iput dose reconstruction scenario of BIOMASS and also to the Chernobyl 131I fallout scenario of EMRAS, both organized by the International Atomic Energy Agency. The Iput Scenario deals with 137Cs contamination of the catchment basin and agricultural area in the Bryansk Region of Russia, which was heavily contaminated after the Chernobyl accident. This exercise was used to test the chronic exposure pathway models in OSCAAR against actual measurements and to identify the most important sources of uncertainty with respect to each part of the assessment. The OSCAAR chronic exposure pathway models had some limitations, but the refined model, COLINA, almost successfully reconstructed the whole 10-year time course of 137Cs activity concentrations in most requested types of agricultural products and natural foodstuffs. The Plavsk scenario provides a good opportunity to test not only the food chain transfer model of 131I but also the method of assessing 131I thyroid burden. OSCAAR showed in general good capabilities for assessing the important 131I exposure pathways. (author)

  16. GEMA3D - landscape modelling for dose assessments

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard (Aleksandria Sciences (United Kingdom))

    2010-08-15

    Concerns have been raised about SKB's interpretation of landscape objects in their radiological assessment models, specifically in relation to the size of the objects represented - leading to excessive volumetric dilution - and to the interpretation of local hydrology - leading to non-conservative hydrologic dilution. Developed from the Generic Ecosystem Modelling Approach, GEMA3D is an attempt to address these issues in a simple radiological assessment landscape model. In GEMA3D, landscape features are modelled as landscape elements (lels) based on a three-compartment structure which is able to represent both terrestrial and aquatic lels. The area of the lels can be chosen to coincide with the bedrock fracture from which radionuclides are assumed to be released, and the dispersion of radionuclides throughout the landscape can be traced. Results indicate that released contaminants remain localised close to the release location and follow the main flow axis of the surface drainage system. This is true even for relatively weakly sorbing species. An interpretation of the size of landscape elements suitable to represent dilution in the biosphere for radiological assessment purposes is suggested, though the concept remains flexible. For reference purposes an agricultural area of one hectare is the baseline. The Quaternary deposits (QD) at the Forsmark site are only a few metres thick above the crystalline bedrock in which the planned repository for spent fuel will be constructed. The biosphere model is assumed to be the upper one metre of the QD. A further model has been implemented for advective-dispersive transport in the deeper QD. The effects of chemical zonation have been briefly investigated. The results confirm the importance of retention close to the release point from the bedrock and clearly indicate that there is a need for a better description of the hydrology of the QD on the spatial scales relevant to the lels required for radiological assessments.
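
    A three-compartment landscape element of this kind can be sketched as a linear transfer (compartment) model; the compartments, rates and source term below are generic assumptions for illustration, not GEMA3D's actual equations or the Forsmark data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic three-compartment landscape element: upper soil, deep QD, water body.
# k[i, j] is the hypothetical transfer rate (1/yr) from compartment i to j.
k = np.array([
    [0.0, 0.05, 0.10],   # upper soil -> deep QD, water
    [0.02, 0.0, 0.01],   # deep QD    -> upper soil, water
    [0.0, 0.0, 0.0],     # water body -> (treated as a sink here)
])
decay = 0.001            # radioactive decay constant (1/yr)
source = np.array([0.0, 1.0, 0.0])   # unit release rate into the deep QD (Bq/yr)

def dinventory_dt(t, n):
    outflow = k.sum(axis=1) * n      # losses from each compartment
    inflow = k.T @ n                 # gains from the other compartments
    return source + inflow - outflow - decay * n

sol = solve_ivp(dinventory_dt, (0.0, 1000.0), y0=[0.0, 0.0, 0.0], t_eval=[1000.0])
print(np.round(sol.y[:, -1], 1))     # compartment inventories (Bq) after 1000 years
```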

  17. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    Science.gov (United States)

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  18. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    Science.gov (United States)

    Bonasia, Rosanna; Scaini, Chiara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2014-01-01

    Popocatépetl is one of Mexico's most active volcanoes threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene-Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl's reawakening in 1994 produced a crisis that culminated with the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars. However, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the situation defined in Europe during 2010, and still under discussion. Tephra dispersal modeling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the "Ochre Pumice" Plinian eruption (4965 14C yr BP
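
    A probabilistic hazard map of this type reports, cell by cell, the fraction of ensemble members whose ash concentration exceeds a critical threshold; the sketch below uses a synthetic ensemble rather than FALL3D output, and the threshold value is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ensemble: ash concentration (mg/m3) at one flight level for
# 200 simulated eruption/wind scenarios on a 50 x 50 grid.
ensemble = rng.lognormal(mean=0.0, sigma=1.5, size=(200, 50, 50))

threshold = 2.0  # mg/m3, an assumed critical concentration for aviation
exceedance_probability = (ensemble > threshold).mean(axis=0)

print(exceedance_probability.shape, round(float(exceedance_probability.max()), 2))
```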

  19. Risk assessment and remedial policy evaluation using predictive modeling

    International Nuclear Information System (INIS)

    Linkov, L.; Schell, W.R.

    1996-01-01

    As a result of nuclear industry operation and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the radionuclide fate in forest compartments after deposition as well as for evaluating the efficiency of remedial policies. The time of intervention and the radionuclide deposition profile were predicted to be crucial for remediation efficiency. Risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about 0.004% risk of a fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus and a better methodology is required. In conclusion, the FORESTPATH modeling framework could have wide applications in environmental remediation of radionuclides and toxic metals as well as in dose reconstruction and risk assessment

  20. Modeling Of Construction Noise For Environmental Impact Assessment

    Directory of Open Access Journals (Sweden)

    Mohamed F. Hamoda

    2008-06-01

    Full Text Available This study measured the noise levels generated at different construction sites in reference to the stage of construction and the equipment used, and examined the methods to predict such noise in order to assess the environmental impact of noise. It included 33 construction sites in Kuwait and used artificial neural networks (ANNs) for the prediction of noise. A back-propagation neural network (BPNN) model was compared with a general regression neural network (GRNN) model. The results obtained indicated that the mean equivalent noise level was 78.7 dBA, which exceeds the threshold limit. The GRNN model was superior to the BPNN model in its accuracy of predicting construction noise due to its ability to train quickly on sparse data sets. Over 93% of the predictions were within 5% of the observed values. The mean absolute error between the predicted and observed data was only 2 dBA. The ANN modeling proved to be a useful technique for noise predictions required in the assessment of environmental impact of construction activities.
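
    A GRNN is essentially Nadaraya-Watson kernel regression, which "trains" by simply storing samples and therefore copes well with sparse data; the sketch below uses hypothetical site descriptors and noise levels, not the study's data or its chosen inputs.

```python
import numpy as np

def grnn_predict(X_train, y_train, x_query, sigma=0.5):
    """General regression neural network prediction (Nadaraya-Watson form):
    a Gaussian-kernel-weighted average of the training targets."""
    d2 = np.sum((X_train - x_query) ** 2, axis=1)       # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))                # kernel weights
    return np.dot(w, y_train) / w.sum()

# Hypothetical data: [construction stage, number of equipment items] -> noise (dBA).
X = np.array([[1, 2], [1, 4], [2, 3], [2, 6], [3, 5]], dtype=float)
y = np.array([72.0, 78.0, 75.0, 83.0, 80.0])

print(round(grnn_predict(X, y, np.array([2.0, 4.0])), 1))
```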

  1. Nuclear security assessment with Markov model approach

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Terao, Norichika

    2013-01-01

    Nuclear security risk assessment with the Markov model based on random events is performed to explore an evaluation methodology for physical protection in nuclear facilities. Because security incidences are initiated by malicious and intentional acts, expert judgment and Bayes updating are used to estimate scenario and initiation likelihood, and it is assumed that the Markov model derived from a stochastic process can be applied to the incidence sequence. Both an unauthorized intrusion as a Design Based Threat (DBT) and a stand-off attack as beyond-DBT are assumed for hypothetical facilities, and the performance of physical protection and the mitigation and minimization of consequences are investigated to develop the assessment methodology in a semi-quantitative manner. It is shown that cooperation between the facility operator and the security authority is important to respond to beyond-DBT incidences. (author)
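
    Treating the incidence sequence as an absorbing Markov chain, the probability of each outcome follows from the standard absorption formula B = (I - Q)^-1 R; the states and transition probabilities below are hypothetical expert-judged values for illustration, not the paper's state definitions.

```python
import numpy as np

# Hypothetical incident states: 0 intrusion detected, 1 adversary advancing,
# 2 neutralised (absorbing), 3 sabotage completed (absorbing).
P = np.array([
    [0.0, 0.6, 0.4, 0.0],
    [0.0, 0.0, 0.7, 0.3],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

# Absorption probabilities B = (I - Q)^-1 R for the transient states {0, 1}.
Q, R = P[:2, :2], P[:2, 2:]
B = np.linalg.inv(np.eye(2) - Q) @ R

print(np.round(B[0], 2))   # [P(neutralised), P(sabotage)] starting from detection
```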

  2. Model summary report for the safety assessment SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Vahlund, Fredrik

    2006-10-15

    This document is the model summary report for the safety assessment SR-Can. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SR-Can, a number of different computer codes are used. In order to better understand how these codes are related Assessment Model Flowcharts, AMFs, have been produced within the project. From these, it is possible to identify the different modelling tasks and consequently also the different computer codes used. A large number of different computer codes are used in the assessment of which some are commercial while others are developed especially for the current assessment project. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined: It must be demonstrated that the code is suitable for its purpose; It must be demonstrated that the code has been properly used; and, It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled will be different for different codes (for instance due to the fact that for some software the source-code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met.

  3. Model summary report for the safety assessment SR-Can

    International Nuclear Information System (INIS)

    Vahlund, Fredrik

    2006-10-01

    This document is the model summary report for the safety assessment SR-Can. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SR-Can, a number of different computer codes are used. In order to better understand how these codes are related Assessment Model Flowcharts, AMFs, have been produced within the project. From these, it is possible to identify the different modelling tasks and consequently also the different computer codes used. A large number of different computer codes are used in the assessment of which some are commercial while others are developed especially for the current assessment project. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined: It must be demonstrated that the code is suitable for its purpose; It must be demonstrated that the code has been properly used; and, It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled will be different for different codes (for instance due to the fact that for some software the source-code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met

  4. A Conceptual Model of Future Volcanism at Medicine Lake Volcano, California - With an Emphasis on Understanding Local Volcanic Hazards

    Science.gov (United States)

    Molisee, D. D.; Germa, A.; Charbonnier, S. J.; Connor, C.

    2017-12-01

    Medicine Lake Volcano (MLV) is the most voluminous of all the Cascade Volcanoes (~600 km3), and has the highest eruption frequency after Mount St. Helens. Detailed mapping by USGS colleagues has shown that during the last 500,000 years MLV erupted >200 lava flows ranging from basalt to rhyolite, produced at least one ash-flow tuff, one caldera-forming event, and at least 17 scoria cones. Underlying these units are 23 additional volcanic units that are considered to be pre-MLV in age. Despite the very high likelihood of future eruptions, fewer than 60 of 250 mapped volcanic units (MLV and pre-MLV) have been dated reliably. A robust set of eruptive ages is key to understanding the history of the MLV system and to forecasting the future behavior of the volcano. The goals of this study are to 1) obtain additional radiometric ages from stratigraphically strategic units; 2) recalculate the recurrence rate of eruptions based on an augmented set of radiometric dates; and 3) use lava flow, PDC, ash fall-out, and lahar computational simulation models to assess the potential effects of discrete volcanic hazards locally and regionally. We identify undated target units (units in key stratigraphic positions to provide maximum chronological insight) and obtain field samples for radiometric dating (40Ar/39Ar and K/Ar) and petrology. Stratigraphic and radiometric data are then used together in the Volcano Event Age Model (VEAM) to identify changes in the rate and type of volcanic eruptions through time, with statistical uncertainty. These newly obtained datasets will be added to published data to build a conceptual model of volcanic hazards at MLV. Alternative conceptual models, for example, may be that the rate of MLV lava flow eruptions is nonstationary in time and/or space and/or volume. We explore the consequences of these alternative models on forecasting future eruptions. As different styles of activity have different impacts, we estimate these potential effects using simulation

  5. The SAVI vulnerability assessment model

    International Nuclear Information System (INIS)

    Winblad, A.E.

    1987-01-01

    The assessment model ''Systematic Analysis of Vulnerability to Intrusion'' (SAVI) presented in this report is a PC-based path analysis model. It can provide estimates of protection system effectiveness (or vulnerability) against a spectrum of outsider threats including collusion with an insider adversary. It calculates one measure of system effectiveness, the probability of interruption P(I), for all potential adversary paths. SAVI can perform both theft and sabotage vulnerability analyses. For theft, the analysis is based on the assumption that adversaries should be interrupted either before they can accomplish removal of the target material from its normal location or removal from the site boundary. For sabotage, the analysis is based on the assumption that adversaries should be interrupted before completion of their sabotage task

  6. Adaptation in integrated assessment modeling: where do we stand?

    OpenAIRE

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We analyze how modelers have chosen to describe adaptation within an integrated framework, and suggest many ways they could improve the treatment of adaptation by considering more of its bottom-up cha...

  7. Tailored model abstraction in performance assessments

    International Nuclear Information System (INIS)

    Kessler, J.H.

    1995-01-01

    Total System Performance Assessments (TSPAs) are likely to be one of the most significant parts of making safety cases for the continued development and licensing of geologic repositories for the disposal of spent fuel and HLW. Thus, it is critical that the TSPA model capture the 'essence' of the physical processes relevant to demonstrating the appropriate regulation is met. But how much detail about the physical processes must be modeled and understood before there is enough confidence that the appropriate essence has been captured? In this summary the level of model abstraction that is required is discussed. Approaches for subsystem and total system performance analyses are outlined, and the role of best estimate models is examined. It is concluded that a conservative approach for repository performance, based on limited amount of field and laboratory data, can provide sufficient confidence for a regulatory decision

  8. A Multi-Actor Dynamic Integrated Assessment Model (MADIAM)

    OpenAIRE

    Weber, Michael

    2004-01-01

    The interactions between climate and the socio-economic system are investigated with a Multi-Actor Dynamic Integrated Assessment Model (MADIAM) obtained by coupling a nonlinear impulse response model of the climate sub-system (NICCS) to a multi-actor dynamic economic model (MADEM). The main goal is to initiate a model development that is able to treat the dynamics of the coupled climate socio-economic system, including endogenous technological change, in a non-equilibrium situation, thereby o...

  9. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    Science.gov (United States)

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…
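
    Within the G-DINA framework, the basic DINA item response function assigns probability 1 - slip when an examinee masters all attributes an item requires (per the Q-matrix) and guess otherwise; a minimal sketch with hypothetical Q-matrix entries and parameters:

```python
import numpy as np

def dina_prob(alpha, q, guess, slip):
    """P(correct) under the DINA model: (1 - slip) if the examinee masters all
    attributes the item requires (per the Q-matrix row q), else guess."""
    eta = np.all(alpha >= q, axis=-1)          # 1 if all required attributes mastered
    return np.where(eta, 1.0 - slip, guess)

q_row = np.array([1, 0, 1])                    # item requires attributes 1 and 3
alphas = np.array([[1, 1, 1], [1, 0, 0]])      # two examinee attribute profiles
print(dina_prob(alphas, q_row, guess=0.2, slip=0.1))   # -> [0.9, 0.2]
```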

  10. Operations Assessment of Launch Vehicle Architectures using Activity Based Cost Models

    Science.gov (United States)

    Ruiz-Torres, Alex J.; McCleskey, Carey

    2000-01-01

    The growing emphasis on affordability for space transportation systems requires the assessment of new space vehicles for all life cycle activities, from design and development, through manufacturing and operations. This paper addresses the operational assessment of launch vehicles, focusing on modeling the ground support requirements of a vehicle architecture, and estimating the resulting costs and flight rate. This paper proposes the use of Activity Based Costing (ABC) modeling for this assessment. The model uses expert knowledge to determine the activities, the activity times and the activity costs based on vehicle design characteristics. The approach provides several advantages over current approaches to vehicle architecture assessment, including easier validation and allowing vehicle designers to understand the cost and cycle time drivers.

  11. Social vulnerability as a contributing factor to disasters in Central America: A case study at San Vicente volcano, El Salvador

    Science.gov (United States)

    Bowman, L. J.; Henquinet, K. B.; Gierke, J. S.; Rose, W. I.

    2012-12-01

    El Salvador's geographic location on the Pacific Ring of Fire at the juncture of the Caribbean and Cocos plates exposes its population to various natural hazards, including volcanic eruptions (e.g., Santa Ana in 2005), earthquakes (e.g., January 13 and February 13, 2001), and landslides and flooding due to tropical rainfall events (e.g., Hurricane Mitch in 1998, Hurricane Stan in 2005). Such hazards can be devastating anywhere, but the condition of social vulnerability in which many Salvadorans currently live exacerbates the impacts of these hazards. Aspects contributing to most rural Salvadorans being marginalized include a colonial history marked by ethnic discrimination and laws prohibiting land ownership, lack of access to desirable land in an agrarian society, a poor education system, global economic policies that foster inequality, political marginalization, a bloody civil conflict, and rampant criminality and violence. In November 2009, an extreme rainfall event triggered landslides and lahars killing over 200 people at San Vicente volcano. This disaster brought to light weaknesses in disaster preparedness and response plans. Despite the existence of recent hazard maps and lahar inundation models (2001), and the occurrence of a similar, deadly event in 1934, the population appeared to be unaware of the risk, and lacked the organization and decision-making protocols to adequately deal with the emergency. Therefore, in the aftermath of the 2009 lahars, much of the focus on disaster risk reduction (DRR) initiatives has been aimed at the communities affected by this most recent event. Our study examines root causes of social vulnerability and assesses the apparent impacts of these interventions on the population, including individual's perceptions regarding these risk-reducing interventions. Two years after the event, though aid abounds, many people remain vulnerable to hazards in this area. Semi-structured interviews were completed with survivors of the 2009

  12. Economic assessment model architecture for AGC/AVLIS selection

    International Nuclear Information System (INIS)

    Hoglund, R.L.

    1984-01-01

    The economic assessment model architecture described provides the flexibility and completeness in economic analysis that the selection between AGC and AVLIS demands. Process models which are technology-specific will provide the first-order responses of process performance and cost to variations in process parameters. The economics models can be used to test the impacts of alternative deployment scenarios for a technology. Enterprise models provide global figures of merit for evaluating the DOE perspective on the uranium enrichment enterprise, and business analysis models compute the financial parameters from the private investor's viewpoint

  13. Assessing alternative conceptual models of fracture flow

    International Nuclear Information System (INIS)

    Ho, C.K.

    1995-01-01

    The numerical code TOUGH2 was used to assess alternative conceptual models of fracture flow. The models that were considered included the equivalent continuum model (ECM) and the dual permeability (DK) model. A one-dimensional, layered, unsaturated domain was studied with a saturated bottom boundary and a constant infiltration at the top boundary. Two different infiltration rates were used in the studies. In addition, the connection areas between the fracture and matrix elements in the dual permeability model were varied. Results showed that the two conceptual models of fracture flow produced different saturation and velocity profiles, even under steady-state conditions. The magnitudes of the discrepancies were sensitive to two parameters that affected the flux between the fractures and matrix in the dual permeability model: (1) the fracture-matrix connection areas and (2) the capillary pressure gradients between the fracture and matrix elements

  14. A Hierarchal Risk Assessment Model Using the Evidential Reasoning Rule

    Directory of Open Access Journals (Sweden)

    Xiaoxiao Ji

    2017-02-01

    Full Text Available This paper aims to develop a hierarchical risk assessment model using the newly-developed evidential reasoning (ER) rule, which constitutes a generic conjunctive probabilistic reasoning process. In this paper, we first provide a brief introduction to the basics of the ER rule and emphasize the strengths for representing and aggregating uncertain information from multiple experts and sources. Further, we discuss the key steps of developing the hierarchical risk assessment framework systematically, including (1) formulation of risk assessment hierarchy; (2) representation of both qualitative and quantitative information; (3) elicitation of attribute weights and information reliabilities; (4) aggregation of assessment information using the ER rule; and (5) quantification and ranking of risks using utility-based transformation. The proposed hierarchical risk assessment framework can potentially be implemented to various complex and uncertain systems. A case study on the fire/explosion risk assessment of marine vessels demonstrates the applicability of the proposed risk assessment model.
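
    When every piece of evidence is fully reliable and all mass sits on single grades, the conjunctive combination at the core of the ER rule reduces to Dempster's rule; the sketch below shows only that reduced case with hypothetical expert assessments, not the full ER rule with weights and reliabilities.

```python
# Frame of discernment: three risk grades for a hypothetical fire/explosion hazard.
GRADES = ("low", "medium", "high")

def dempster_combine(m1, m2):
    """Dempster's rule for masses on singleton grades only (no mass on subsets):
    the conjunctive combination that the ER rule generalises with weights
    and reliabilities."""
    joint = {g: m1[g] * m2[g] for g in GRADES}
    norm = sum(joint.values())          # equals 1 minus the conflict
    return {g: v / norm for g, v in joint.items()}

expert1 = {"low": 0.1, "medium": 0.6, "high": 0.3}
expert2 = {"low": 0.2, "medium": 0.5, "high": 0.3}
print({g: round(p, 3) for g, p in dempster_combine(expert1, expert2).items()})
```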

  15. A simple, semi-prescriptive self-assessment model for TQM.

    Science.gov (United States)

    Warwood, Stephen; Antony, Jiju

    2003-01-01

    This article presents a simple, semi-prescriptive self-assessment model for use in industry as part of a continuous improvement program such as Total Quality Management (TQM). The process by which the model was constructed started with a review of the available literature in order to research TQM success factors. Next, postal surveys were conducted by sending questionnaires to the winning organisations of the Baldrige and European Quality Awards and to a preselected group of enterprising UK organisations. From the analysis of this data, the self-assessment model was constructed to help organisations in their quest for excellence. This work confirmed the findings from the literature, that there are key factors that contribute to the successful implementation of TQM and these have different levels of importance. These key factors, in order of importance, are: effective leadership, the impact of other quality-related programs, measurement systems, organisational culture, education and training, the use of teams, efficient communications, active empowerment of the workforce, and a systems infrastructure to support the business and customer-focused processes. This analysis, in turn, enabled the design of a self-assessment model that can be applied within any business setting. Further work should include the testing and review of this model to ascertain its suitability and effectiveness within industry today.

  16. A review of air exchange rate models for air pollution exposure assessments.

    Science.gov (United States)

    Breen, Michael S; Schultz, Bradley D; Sohn, Michael D; Long, Thomas; Langstaff, John; Williams, Ronald; Isaacs, Kristin; Meng, Qing Yu; Stallings, Casson; Smith, Luther

    2014-11-01

    A critical aspect of air pollution exposure assessments is estimation of the air exchange rate (AER) for various buildings where people spend their time. The AER, which is the rate of exchange of indoor air with outdoor air, is an important determinant for entry of outdoor air pollutants and for removal of indoor-emitted air pollutants. This paper presents an overview and critical analysis of the scientific literature on empirical and physically based AER models for residential and commercial buildings; the models highlighted here are feasible for exposure assessments as extensive inputs are not required. Models are included for the three types of airflows that can occur across building envelopes: leakage, natural ventilation, and mechanical ventilation. Guidance is provided to select the preferable AER model based on available data, desired temporal resolution, types of airflows, and types of buildings included in the exposure assessment. For exposure assessments with some limited building leakage or AER measurements, strategies are described to reduce AER model uncertainty. This review will facilitate the selection of AER models in support of air pollution exposure assessments.
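
    The role of the AER in exposure assessment can be illustrated with a single-zone mass balance for indoor concentration; the sketch below is a generic illustration with assumed penetration and deposition values, not one of the reviewed models.

```python
from scipy.integrate import solve_ivp

def indoor_concentration(aer, c_out=20.0, penetration=0.8, deposition=0.2,
                         c0=0.0, hours=24.0):
    """Single-zone mass balance: dC_in/dt = P*AER*C_out - (AER + k)*C_in.

    aer: air exchange rate (1/h); c_out: outdoor concentration (ug/m3);
    penetration: fraction of outdoor pollutant entering with the air;
    deposition: indoor loss rate k (1/h)."""
    rhs = lambda t, c: penetration * aer * c_out - (aer + deposition) * c
    sol = solve_ivp(rhs, (0.0, hours), [c0], t_eval=[hours])
    return sol.y[0, -1]

# A leakier building (higher AER) admits more of this outdoor pollutant.
for aer in (0.2, 0.5, 1.5):
    print(aer, round(indoor_concentration(aer), 1))
```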

  17. Risk assessment of storm surge disaster based on numerical models and remote sensing

    Science.gov (United States)

    Liu, Qingrong; Ruan, Chengqing; Zhong, Shan; Li, Jian; Yin, Zhonghui; Lian, Xihu

    2018-06-01

    Storm surge is one of the most serious ocean disasters in the world. Risk assessment of storm surge disasters in coastal areas has important implications for planning economic development and reducing disaster losses. Based on risk assessment theory, this paper uses coastal hydrological observations, a numerical storm surge model and multi-source remote sensing data to propose methods for valuing the hazard and vulnerability of storm surge and to build a storm surge risk assessment model. Storm surges for different recurrence periods are simulated with the numerical model, and the calculated flooding areas and depths are used to assess the storm surge hazard; remote sensing data and GIS technology are used to extract key coastal objects and classify coastal land use, which are used for the vulnerability assessment of storm surge disaster. The risk assessment model is applied to a typical coastal city, and the result shows the reliability and validity of the model. The building and application of the storm surge risk assessment model provides a basic reference for city development planning and strengthens disaster prevention and mitigation.

  18. Peer Assessment with Online Tools to Improve Student Modeling

    Science.gov (United States)

    Atkins, Leslie J.

    2012-11-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.

  19. Risk assessment model for development of advanced age-related macular degeneration.

    Science.gov (United States)

    Klein, Michael L; Francis, Peter J; Ferris, Frederick L; Hamon, Sara C; Clemons, Traci E

    2011-12-01

    To design a risk assessment model for development of advanced age-related macular degeneration (AMD) incorporating phenotypic, demographic, environmental, and genetic risk factors. We evaluated longitudinal data from 2846 participants in the Age-Related Eye Disease Study. At baseline, these individuals had all levels of AMD, ranging from none to unilateral advanced AMD (neovascular or geographic atrophy). Follow-up averaged 9.3 years. We performed a Cox proportional hazards analysis with demographic, environmental, phenotypic, and genetic covariates and constructed a risk assessment model for development of advanced AMD. Performance of the model was evaluated using the C statistic and the Brier score and externally validated in participants in the Complications of Age-Related Macular Degeneration Prevention Trial. The final model included the following independent variables: age, smoking history, family history of AMD (first-degree member), phenotype based on a modified Age-Related Eye Disease Study simple scale score, and genetic variants CFH Y402H and ARMS2 A69S. The model did well on performance measures, with very good discrimination (C statistic = 0.872) and excellent calibration and overall performance (Brier score at 5 years = 0.08). Successful external validation was performed, and a risk assessment tool was designed for use with or without the genetic component. We constructed a risk assessment model for development of advanced AMD. The model performed well on measures of discrimination, calibration, and overall performance and was successfully externally validated. This risk assessment tool is available for online use.
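
    A risk model of this kind can be fitted as a Cox proportional hazards model and judged by its concordance (C statistic); the sketch below uses the lifelines library on synthetic data whose covariates only mimic, and do not reproduce, the AREDS risk factors or coefficients.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.uniform(55, 80, n),
    "smoker": rng.integers(0, 2, n),
    "simple_scale": rng.integers(0, 5, n),       # AREDS-style phenotype score
    "cfh_risk_alleles": rng.integers(0, 3, n),   # CFH Y402H risk alleles (0-2)
})
# Synthetic time to advanced AMD (years) that depends on the covariates.
hazard = np.exp(0.04 * (df.age - 65) + 0.5 * df.smoker
                + 0.6 * df.simple_scale + 0.4 * df.cfh_risk_alleles)
df["time"] = rng.exponential((20.0 / hazard).to_numpy())
df["event"] = (df["time"] < 10.0).astype(int)    # administrative censoring at 10 y
df["time"] = df["time"].clip(upper=10.0)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(round(cph.concordance_index_, 3))          # discrimination (C statistic)
```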

  20. Assessment for Complex Learning Resources: Development and Validation of an Integrated Model

    Directory of Open Access Journals (Sweden)

    Gudrun Wesiak

    2013-01-01

    Full Text Available Today’s e-learning systems meet the challenge to provide interactive, personalized environments that support self-regulated learning as well as social collaboration and simulation. At the same time assessment procedures have to be adapted to the new learning environments by moving from isolated summative assessments to integrated assessment forms. Therefore, learning experiences enriched with complex didactic resources - such as virtualized collaborations and serious games - have emerged. In this extension of [1] an integrated model for e-assessment (IMA is outlined, which incorporates complex learning resources and assessment forms as main components for the development of an enriched learning experience. For a validation the IMA was presented to a group of experts from the fields of cognitive science, pedagogy, and e-learning. The findings from the validation lead to several refinements of the model, which mainly concern the component forms of assessment and the integration of social aspects. Both aspects are accounted for in the revised model, the former by providing a detailed sub-model for assessment forms.

  1. Radionuclide transport and dose assessment modelling in biosphere assessment 2009

    International Nuclear Information System (INIS)

    Hjerpe, T.; Broed, R.

    2010-11-01

    Following the guidelines set forth by the Ministry of Trade and Industry (now Ministry of Employment and Economy), Posiva is preparing to submit a construction license application for the final disposal of spent nuclear fuel at the Olkiluoto site, Finland, by the end of 2012. Disposal will take place in a geological repository implemented according to the KBS-3 method. The long-term safety section supporting the license application will be based on a safety case that, according to the internationally adopted definition, will be a compilation of the evidence, analyses and arguments that quantify and substantiate the safety and the level of expert confidence in the safety of the planned repository. This report documents in detail the conceptual and mathematical models and key data used in the landscape model set-up, radionuclide transport modelling, and radiological consequence analysis applied in the 2009 biosphere assessment. The resulting environmental activity concentrations in the landscape model due to constant unit geosphere release rates, and the corresponding annual doses, are also calculated and presented in this report. This provides the basis for understanding the behaviour of the applied landscape model and the subsequent dose calculations. (orig.)

  2. Assessing uncertainty in SRTM elevations for global flood modelling

    Science.gov (United States)

    Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.

    2017-12-01

    The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial in constraining uncertainty about elevations and assessing the impact of these upon flood prediction. Assessment of SRTM error was carried out by Rodriguez et al (2006), but this did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical errors of the SRTM in the landscape that matters most to flood models - the floodplain. Therefore, this study attempts this task by comparing SRTM, an error corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LIDAR elevations for 3 deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LIDAR, perturbations of the 90m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built using LISFLOOD-FP, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product of this is an inundation map with the probability of each pixel being flooded based on the catalogue of DEMs. In a world of increasing computer power, but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
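
    A hedged illustration of the perturbation step described above: spatially correlated vertical error, drawn from an exponential covariance function, is added to a stand-in SRTM tile to build a catalogue of plausible DEMs. The sill, range and grid size below are assumptions, not values from the study.

```python
# Sketch only: spatially correlated DEM error realisations via an
# exponential covariance and its Cholesky factor (all numbers invented).
import numpy as np

rng = np.random.default_rng(1)
n = 30                              # small n x n grid for illustration
cell = 90.0                         # m, SRTM cell size
sill, corr_range = 2.5**2, 500.0    # error variance (m^2) and range (m), assumed

# Pairwise distances between cell centres
xx, yy = np.meshgrid(np.arange(n) * cell, np.arange(n) * cell)
coords = np.column_stack([xx.ravel(), yy.ravel()])
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

# Exponential covariance and its Cholesky factor (small jitter for stability)
cov = sill * np.exp(-d / corr_range) + 1e-8 * np.eye(n * n)
L = np.linalg.cholesky(cov)

srtm = rng.normal(10.0, 1.0, (n, n))          # stand-in for a floodplain DEM
dems = [srtm + (L @ rng.standard_normal(n * n)).reshape(n, n)
        for _ in range(20)]                    # catalogue of plausible DEMs
print("std of one perturbation field:", np.std(dems[0] - srtm).round(2), "m")
```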

  3. A model for assessing human cognitive reliability in PRA studies

    International Nuclear Information System (INIS)

    Hannaman, G.W.; Spurgin, A.J.; Lukic, Y.

    1985-01-01

    This paper summarizes the status of a research project sponsored by EPRI as part of the Probabilistic Risk Assessment (PRA) technology improvement program and conducted by NUS Corporation to develop a model of Human Cognitive Reliability (HCR). The model was synthesized from features identified in a review of existing models. The model development was based on the hypothesis that the key factors affecting crew response times are separable. The inputs to the model consist of key parameters whose values can be determined by PRA analysts for each accident situation being assessed. The output is a set of curves which represent the probability of control room crew non-response as a function of time for different conditions affecting their performance. The non-response probability is then a contributor to the overall non-success of operating crews to achieve a functional objective identified in the PRA study. Because the available data were sparse, simulator data and some small-scale tests were utilized to illustrate the calibration of interim HCR model coefficients for different types of cognitive processing. The model can potentially help PRA analysts make human reliability assessments more explicit. The model incorporates concepts from psychological models of human cognitive behavior, information from current collections of human reliability data sources and crew response time data from simulator training exercises.
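
    The kind of output the HCR model produces can be sketched as a crew non-response probability versus time, parameterised per cognitive-processing type; the functional form and coefficients below are illustrative placeholders, not the calibrated EPRI/NUS values.

```python
# Illustrative non-response curves only; not the actual HCR correlation.
import numpy as np

def p_non_response(t, t_median, shape):
    """Probability that the crew has not yet responded by time t (minutes);
    equals 0.5 at t = t_median. Placeholder Weibull-style form."""
    t = np.asarray(t, dtype=float)
    return np.exp(-np.log(2.0) * (t / t_median) ** shape)

times = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
for label, shape in [("skill-based", 3.0), ("rule-based", 2.0), ("knowledge-based", 1.2)]:
    print(label, np.round(p_non_response(times, t_median=10.0, shape=shape), 3))
```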

  4. Development on Dose Assessment Model of Northeast Asia Nuclear Accident Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ju Yub; Kim, Ju Youl; Kim, Suk Hoon; Lee, Seung Hee; Yoon, Tae Bin [FNC Techology, Yongin (Korea, Republic of)

    2016-05-15

    In order to support the emergency response system, a simulator for overseas nuclear accidents is under development, including source-term estimation, atmospheric dispersion modeling and dose assessment. The simulator is named NANAS (Northeast Asia Nuclear Accident Simulator). For the source-term estimation, design characteristics of each reactor type should be reflected in the model. Since there are many reactor types in neighboring countries, the representative reactors of China, Japan and Taiwan have been selected and the source-term estimation models for each reactor have been developed, respectively. For the atmospheric dispersion modeling, a Lagrangian particle model will be integrated into the simulator for long-range dispersion modeling in the Northeast Asia region. In this study, the dose assessment model has been developed considering external and internal exposure. The dose assessment model has been developed as a part of the overseas nuclear accident simulator, which is named NANAS. It addresses external and internal pathways including cloudshine, groundshine and inhalation. Also, it uses the output of the atmospheric dispersion model (i.e. the average concentrations of radionuclides in air and on the ground) and various coefficients (e.g. dose conversion factor and breathing rate) as inputs. Effective dose and thyroid dose for each grid in the Korean Peninsula region are output as map projections and charts. Verification and validation of the dose assessment model will be conducted in a further study by benchmarking against the measured data of the Fukushima Daiichi Nuclear Accident.

  5. Simulation Modeling of Resilience Assessment in Indonesian Fertiliser Industry Supply Networks

    Science.gov (United States)

    Utami, I. D.; Holt, R. J.; McKay, A.

    2018-01-01

    Supply network resilience is a significant aspect of the performance of the Indonesian fertiliser industry. Decision makers use risk assessment and port management reports to evaluate the availability of infrastructure. An opportunity was identified to incorporate both types of data into an approach for the measurement of resilience. A framework, based on a synthesis of literature and interviews with industry practitioners, covering both social and technical factors is introduced. A simulation model was then built to allow managers to explore implications for resilience and predict levels of risk in different scenarios. Results of interviews with respondents from the Indonesian fertiliser industry indicated that the simulation model could be valuable in the assessment. This paper provides details of the simulation model for decision makers to explore levels of risk in supply networks. For practitioners, the model could be used by government to assess the current condition of supply networks in Indonesian industries. On the other hand, for academia, the approach provides a new application of agent-based models in research on supply network resilience and presents a real example of how agent-based modeling could be used to support the assessment approach.

  6. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A
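
    The Monte Carlo pattern described above can be sketched generically (this is not GoldSim and not the actual PA model): sample uncertain inputs, push each realization through a simple release-to-dose chain, and summarize the output distribution against an example performance objective. All distributions and coefficients are invented.

```python
# Generic probabilistic PA sketch: propagate input uncertainty to dose.
import numpy as np

rng = np.random.default_rng(42)
n_real = 10_000

# Hypothetical uncertain inputs (all distributions are illustrative only)
release = rng.lognormal(mean=np.log(1e6), sigma=0.7, size=n_real)    # Bq/yr to aquifer
dilution = rng.triangular(1e5, 1e6, 1e7, size=n_real)                # m3/yr well mixing
ingestion = 0.73                                                     # m3/yr drinking water
dcf = rng.normal(1.3e-8, 0.2e-8, size=n_real).clip(min=1e-9)         # Sv/Bq ingested

dose_mSv = release / dilution * ingestion * dcf * 1e3                # mSv/yr (toy chain)
limit = 0.1                                                          # mSv/yr, example objective
print("mean dose   %.2e mSv/yr" % dose_mSv.mean())
print("95th pctile %.2e mSv/yr" % np.percentile(dose_mSv, 95))
print("P(dose > %.1f mSv/yr) = %.4f" % (limit, (dose_mSv > limit).mean()))
```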

  7. Are revised models better models? A skill score assessment of regional interannual variability

    Science.gov (United States)

    Sperber, Kenneth R.; Participating AMIP Modelling Groups

    1999-05-01

    Various skill scores are used to assess the performance of revised models relative to their original configurations. The interannual variability of all-India, Sahel and Nordeste rainfall and summer monsoon windshear is examined in integrations performed under the experimental design of the Atmospheric Model Intercomparison Project. For the indices considered, the revised models exhibit greater fidelity at simulating the observed interannual variability. Interannual variability of all-India rainfall is better simulated by models that have a more realistic rainfall climatology in the vicinity of India, indicating the beneficial effect of reducing systematic model error.

  8. A formalism to generate probability distributions for performance-assessment modeling

    International Nuclear Information System (INIS)

    Kaplan, P.G.

    1990-01-01

    A formalism is presented for generating probability distributions of parameters used in performance-assessment modeling. The formalism is used when data are either sparse or nonexistent. The appropriate distribution is a function of the known or estimated constraints and is chosen to maximize a quantity known as Shannon's informational entropy. The formalism is applied to a parameter used in performance-assessment modeling. The functional form of the model that defines the parameter, data from the actual field site, and natural analog data are analyzed to estimate the constraints. A beta probability distribution of the example parameter is generated after finding four constraints. As an example of how the formalism is applied to the site characterization studies of Yucca Mountain, the distribution is generated for an input parameter in a performance-assessment model currently used to estimate compliance with disposal of high-level radioactive waste in geologic repositories, 10 CFR 60.113(a)(2), commonly known as the ground water travel time criterion. 8 refs., 2 figs
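
    As a hedged illustration of the end product (not the entropy-maximisation algebra itself), the snippet below constructs a beta distribution on a bounded parameter from an assumed mean and variance by moment matching; the bounds and moments are made-up values, not the constraints estimated in the paper.

```python
# Illustration only: a bounded beta distribution from assumed moments.
from scipy import stats

lo, hi = 10.0, 10_000.0        # assumed bounds on ground-water travel time (yr)
mean, var = 2_000.0, 1.2e6     # assumed constraint values, not site data

# Rescale the constraints to the unit interval and match beta moments
m = (mean - lo) / (hi - lo)
v = var / (hi - lo) ** 2
common = m * (1 - m) / v - 1.0
alpha, beta = m * common, (1 - m) * common

dist = stats.beta(alpha, beta, loc=lo, scale=hi - lo)
print("alpha=%.2f beta=%.2f" % (alpha, beta))
print("5th/50th/95th percentiles (yr):",
      [round(x) for x in dist.ppf([0.05, 0.5, 0.95])])
```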

  9. The MCRA model for probabilistic single-compound and cumulative risk assessment of pesticides.

    Science.gov (United States)

    van der Voet, Hilko; de Boer, Waldo J; Kruisselbrink, Johannes W; Goedhart, Paul W; van der Heijden, Gerie W A M; Kennedy, Marc C; Boon, Polly E; van Klaveren, Jacob D

    2015-05-01

    Pesticide risk assessment is hampered by worst-case assumptions leading to overly pessimistic assessments. On the other hand, cumulative health effects of similar pesticides are often not taken into account. This paper describes models and a web-based software system developed in the European research project ACROPOLIS. The models are appropriate for both acute and chronic exposure assessments of single compounds and of multiple compounds in cumulative assessment groups. The software system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in pesticide risk assessment at mcra.rivm.nl. We describe the MCRA implementation of the methods as advised in the 2012 EFSA Guidance on probabilistic modelling, as well as more refined methods developed in the ACROPOLIS project. The emphasis is on cumulative assessments. Two approaches, sample-based and compound-based, are contrasted. It is shown that additional data on agricultural use of pesticides may give more realistic risk assessments. Examples are given of model and software validation of acute and chronic assessments, using both simulated data and comparisons against the previous release of MCRA and against the standard software DEEM-FCID used by the Environmental Protection Agency in the USA. It is shown that the EFSA Guidance pessimistic model may not always give an appropriate modelling of exposure. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  10. Toward a consistent modeling framework to assess multi-sectoral climate impacts.

    Science.gov (United States)

    Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin

    2018-02-13

    Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis-which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios-we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets and the implementation of standard scenarios across models, institutions and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.

  11. Adaptation in integrated assessment modeling: where do we stand?

    NARCIS (Netherlands)

    Patt, A.; van Vuuren, D.P.; Berkhout, F.G.H.; Aaheim, A.; Hof, A.F.; Isaac, M.; Mechler, R.

    2010-01-01

    Adaptation is an important element on the climate change policy agenda. Integrated assessment models, which are key tools to assess climate change policies, have begun to address adaptation, either by including it implicitly in damage cost estimates, or by making it an explicit control variable. We

  12. Assessment of the Stakeholders’ Importance Using AHP Method – Modeling and Application

    Directory of Open Access Journals (Sweden)

    Danka Knezević

    2015-05-01

    Full Text Available Attention to stakeholders, which means that companies bear responsibility for the implications of their actions, is emerging as a critical strategic issue. Hence, meeting legitimate stakeholders’ requests would enhance the reputation of a company and increase its competitiveness on product markets. That is why an accurate identification of stakeholders and assessment of their importance is so significant for the companies. Through an integration of the earlier models of excellence, models for identification and classification of stakeholders, models for assessing the quality of a company and the AHP method, widely applicable in various fields, a new model for assessment of stakeholders’ significance is proposed in this paper. The model also provides an assessment of a company based on the degree of the importance and satisfaction of stakeholders. The results of this model could be useful for companies and their management when it comes to defining a proper business strategy, monitoring the system changes over time, creating a basis for comparison with other similar systems or with itself. A practical example is given to demonstrate the effectiveness of the model.
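
    The AHP step at the core of the proposed model can be sketched as follows: derive stakeholder weights from a Saaty-scale pairwise comparison matrix via its principal eigenvector and check the consistency ratio. The 4x4 matrix is a made-up example, not data from the article.

```python
# AHP priority weights from a pairwise comparison matrix (example data).
import numpy as np

# Saaty-scale comparisons for four hypothetical stakeholder groups
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # priority vector (weights)

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)          # consistency index
ri = 0.90                                     # Saaty random index for n = 4
print("weights:", np.round(w, 3))
print("consistency ratio:", round(ci / ri, 3))   # < 0.10 is conventionally acceptable
```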

  13. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    Science.gov (United States)

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  14. Review of early assessment models of innovative medical technologies

    DEFF Research Database (Denmark)

    Fasterholdt, Iben; Krahn, Murray D; Kidholm, Kristian

    2017-01-01

    INTRODUCTION: Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models...

  15. Exploring the Assessment of the DSM-5 Alternative Model for Personality Disorders With the Personality Assessment Inventory.

    Science.gov (United States)

    Busch, Alexander J; Morey, Leslie C; Hopwood, Christopher J

    2017-01-01

    Section III of the Diagnostic and Statistical Manual of Mental Disorders (5th ed. [DSM-5]; American Psychiatric Association, 2013) contains an alternative model for the diagnosis of personality disorder involving the assessment of 25 traits and a global level of overall personality functioning. There is hope that this model will be increasingly used in clinical and research settings, and the ability to apply established instruments to assess these concepts could facilitate this process. This study sought to develop scoring algorithms for these alternative model concepts using scales from the Personality Assessment Inventory (PAI). A multiple regression strategy was used to predict scores in 2 undergraduate samples on DSM-5 alternative model instruments: the Personality Inventory for the DSM-5 (PID-5) and the General Personality Pathology scale (GPP; Morey et al., 2011). These regression functions resulted in scores that demonstrated promising convergent and discriminant validity across the alternative model concepts, as well as a factor structure in a cross-validation sample that was congruent with the putative structure of the alternative model traits. Results were linked to the PAI community normative data to provide normative information regarding these alternative model concepts that can be used to identify elevated traits and personality functioning level scores.

  16. Persistent hemifacial spasm after microvascular decompression: a risk assessment model.

    Science.gov (United States)

    Shah, Aalap; Horowitz, Michael

    2017-06-01

    Microvascular decompression (MVD) for hemifacial spasm (HFS) provides resolution of disabling symptoms such as eyelid twitching and muscle contractions of the entire hemiface. The primary aim of this study was to evaluate the predictive value of patient demographics and spasm characteristics on long-term outcomes, with or without intraoperative lateral spread response (LSR) as an additional variable in a risk assessment model. A retrospective study was undertaken to evaluate the associations of pre-operative patient characteristics, as well as intraoperative LSR and the need for a staged procedure, with the presence of persistent or recurrent HFS at the time of hospital discharge and at follow-up. A risk assessment model was constructed with the inclusion of six clinically or statistically significant variables from the univariate analyses. A receiver operating characteristic curve was generated, and area under the curve was calculated to determine the strength of the predictive model. A risk assessment model was first created consisting of significant pre-operative variables (Model 1) (age >50, female gender, history of botulinum toxin use, platysma muscle involvement). This model demonstrated borderline predictive value for persistent spasm at discharge (AUC .60; p=.045) and fair predictive value at follow-up (AUC .75; p=.001). Intraoperative variables (e.g. LSR persistence) demonstrated little additive value (Model 2) (AUC .67). Patients with a higher risk score (three or greater) demonstrated greater odds of persistent HFS at the time of discharge (OR 1.5 [95%CI 1.16-1.97]; p=.035), as well as greater odds of persistent or recurrent spasm at the time of follow-up (OR 3.0 [95%CI 1.52-5.95]; p=.002). Conclusions: A risk assessment model consisting of pre-operative clinical characteristics is useful in prognosticating HFS persistence at follow-up.
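
    A minimal sketch of the evaluation pattern reported above (with simulated data, not the study's patients): sum binary pre-operative risk factors into a score, then measure its discrimination for persistent spasm with a receiver operating characteristic AUC and an odds ratio for scores of three or greater.

```python
# Risk-score discrimination sketch with simulated data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 300
# Hypothetical binary risk factors: age>50, female, prior botulinum toxin,
# platysma involvement (values simulated for illustration)
X = rng.integers(0, 2, size=(n, 4))
risk_score = X.sum(axis=1)

# Simulated outcome loosely tied to the score so the example is non-trivial
p = 1 / (1 + np.exp(-(risk_score - 2)))
persistent = rng.random(n) < p

print("AUC:", round(roc_auc_score(persistent, risk_score), 2))
high_risk = risk_score >= 3
odds = lambda m: m.mean() / (1 - m.mean())
print("odds ratio (score >= 3):",
      round(odds(persistent[high_risk]) / odds(persistent[~high_risk]), 2))
```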

  17. A new multi-disciplinary model for the assessment and reduction of volcanic risk: the example of the island of Vulcano, Italy

    Science.gov (United States)

    Simicevic, Aleksandra; Bonadonna, Costanza; di Traglia, Federico; Rosi, Mauro

    2010-05-01

    Volcanic eruptions are accompanied by numerous hazards which pose short- and long-term threats to people and property. Recent experiences have shown that successful responses to hazard events correlate strongly with the degree to which proactive policies of risk reduction are already in place before an eruption occurs. Effective proactive risk-reduction strategies require contributions from numerous disciplines. A volcanic eruption is not a hazard, per se, but rather an event capable of producing a variety of hazards (e.g. earthquakes, pyroclastic density currents, lava flows, tephra fall, lahars, landslides, gas release, and tsunamis) that can affect the built environment in a variety of ways, over different time scales and with different degrees of intensity. Our proposed model for the assessment and mitigation of exposure-based volcanic risk is mainly based on the compilation of three types of maps: hazard maps, hazard-specific vulnerability maps and exposure-based risk maps. Hazard maps identify the spatial distribution of individual volcanic hazards and include both event analysis and impact analysis. Hazard-specific vulnerability maps represent the systematic evaluation of the physical vulnerability of the built environment to a range of volcanic phenomena, i.e. the spatial distribution of buildings vulnerable to a given hazard based on the analysis of selected building elements. Buildings are classified on the basis of their major components that are relevant for different volcanic hazards, their strength and their construction materials, and are defined taking into account the potential damage that each group of building elements (e.g. walls, roof, load-bearing structure) will suffer under a volcanic hazard. All those factors are enumerated in a checklist and are used for the building survey. Hazard-specific vulnerability maps are then overlapped with hazard maps in order to compile exposure-based risk maps and so quantify the potential damage. Such quantification

  18. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
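
    Using scikit-learn, the two procedures discussed above can be sketched as repeated grid-search V-fold cross-validation for parameter tuning wrapped inside an outer repeated cross-validation loop for model assessment; the dataset, estimator and parameter grid are placeholders, not the authors' QSAR setup or implementation.

```python
# Repeated grid-search CV (inner) nested inside repeated CV (outer) - sketch.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, RepeatedKFold, cross_val_score

X, y = make_regression(n_samples=200, n_features=30, noise=10.0, random_state=0)

inner_cv = RepeatedKFold(n_splits=5, n_repeats=5, random_state=1)   # tuning
outer_cv = RepeatedKFold(n_splits=5, n_repeats=5, random_state=2)   # assessment

grid = GridSearchCV(
    Ridge(),
    param_grid={"alpha": [0.01, 0.1, 1.0, 10.0, 100.0]},
    cv=inner_cv,
    scoring="neg_root_mean_squared_error",
)

# Repeated nested CV: the spread of these scores is the split-to-split
# variation the authors warn about when a single split is used.
scores = cross_val_score(grid, X, y, cv=outer_cv,
                         scoring="neg_root_mean_squared_error")
print("nested RMSE: %.1f +/- %.1f" % (-scores.mean(), scores.std()))
```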

  19. Motivation Monitoring and Assessment Extension for Input-Process-Outcome Game Model

    Science.gov (United States)

    Ghergulescu, Ioana; Muntean, Cristina Hava

    2014-01-01

    This article proposes a Motivation Assessment-oriented Input-Process-Outcome Game Model (MotIPO), which extends the Input-Process-Outcome game model with game-centred and player-centred motivation assessments performed right from the beginning of the game-play. A feasibility case-study involving 67 participants playing an educational game and…

  20. Assessing groundwater policy with coupled economic-groundwater hydrologic modeling

    Science.gov (United States)

    Mulligan, Kevin B.; Brown, Casey; Yang, Yi-Chen E.; Ahlfeld, David P.

    2014-03-01

    This study explores groundwater management policies and the effect of modeling assumptions on the projected performance of those policies. The study compares an optimal economic allocation for groundwater use subject to streamflow constraints, achieved by a central planner with perfect foresight, with a uniform tax on groundwater use and a uniform quota on groundwater use. The policies are compared with two modeling approaches, the Optimal Control Model (OCM) and the Multi-Agent System Simulation (MASS). The economic decision models are coupled with a physically based representation of the aquifer using a calibrated MODFLOW groundwater model. The results indicate that uniformly applied policies perform poorly when simulated with more realistic, heterogeneous, myopic, and self-interested agents. In particular, the effects of the physical heterogeneity of the basin and the agents undercut the perceived benefits of policy instruments assessed with simple, single-cell groundwater modeling. This study demonstrates the results of coupling realistic hydrogeology and human behavior models to assess groundwater management policies. The Republican River Basin, which overlies a portion of the Ogallala aquifer in the High Plains of the United States, is used as a case study for this analysis.

  1. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  2. PRACTICAL APPLICATION OF A MODEL FOR ASSESSING

    Directory of Open Access Journals (Sweden)

    Petr NOVOTNÝ

    2015-12-01

    Full Text Available Rail transport is an important sub-sector of transport infrastructure. Disruption of its operation due to emergencies can result in a reduction in functional parameters of provided services with consequent impacts on society. Identification of critical elements of this system enables its timely and effective protection. On that ground, the article presents a draft model for assessing the criticality of railway infrastructure elements. This model uses a systems approach and multicriteria semi-quantitative analysis with weighted criteria for calculating the criticality of individual elements of the railway infrastructure. In the conclusion, it presents a practical application of the proposed model including the discussion of results.

  3. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    Science.gov (United States)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used methods for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications treats the uncertainty in the extreme flows of hydrological model simulations well. This study proposes a Bayesian modularization approach to uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1), the AR(1) plus Normal, time-period-dependent model (Model 2) and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian methods. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
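
    A toy version of the Metropolis-Hastings machinery referred to above, applied to a one-parameter stand-in "hydrological" model with a Gaussian likelihood; everything here (model, data, prior, tuning) is synthetic and only illustrates the sampling step, not WASMOD or the modularization itself.

```python
# Minimal Metropolis-Hastings sampler for a one-parameter toy model.
import numpy as np

rng = np.random.default_rng(7)
obs = 2.0 + rng.normal(0, 0.5, size=100)        # synthetic "discharge" data

def log_posterior(theta):
    if not (0.0 < theta < 10.0):                # uniform prior on (0, 10)
        return -np.inf
    resid = obs - theta                         # trivial model: constant flow
    return -0.5 * np.sum((resid / 0.5) ** 2)

n_iter, step = 5000, 0.2
chain = np.empty(n_iter)
theta = 1.0
lp = log_posterior(theta)
for i in range(n_iter):
    prop = theta + rng.normal(0, step)
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:     # accept/reject step
        theta, lp = prop, lp_prop
    chain[i] = theta

burn = chain[1000:]
print("posterior mean %.3f, 95%% interval (%.3f, %.3f)"
      % (burn.mean(), *np.percentile(burn, [2.5, 97.5])))
```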

  4. A GoldSim Based Biosphere Assessment Model for a HLW Repository

    International Nuclear Information System (INIS)

    Lee, Youn-Myoung; Hwang, Yong-Soo; Kang, Chul-Hyung

    2007-01-01

    To demonstrate the performance of a repository, the dose exposure to a human being due to nuclide releases from the repository should be evaluated and the results compared to the dose limit set by the regulatory bodies. To evaluate the dose rate to an individual due to a long-term release of nuclides from a HLW repository, biosphere assessment models and their implemented codes such as ACBIO1 and ACBIO2 have been developed with the aid of AMBER during the last few years. The BIOMASS methodology has been adopted for a HLW repository currently being considered in Korea, which has a concept similar to the Swedish KBS-3 HLW repository. Recently, not only to verify the biosphere assessment models but also to explore possible alternatives for assessing the consequences in the biosphere due to a HLW repository, another version of the assessment models has been newly developed, within the framework of a development program for a total system performance assessment modeling tool, by utilizing GoldSim. In the current study, the GoldSim approach to biosphere modeling is introduced. Unlike AMBER, with which a compartment scheme can be constructed rather simply with appropriate transition rates between compartments, GoldSim was designed around object-oriented modules with which specific models can be added incrementally, like solving a jigsaw puzzle.
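
    The compartment scheme that both AMBER and the GoldSim implementation represent can be sketched as a linear system of first-order transfers, dC/dt = KC + S; the compartments, transfer rates and source term below are invented for illustration.

```python
# Toy biosphere compartment model: first-order transfers plus decay.
import numpy as np
from scipy.integrate import solve_ivp

labels = ["soil", "river", "sediment"]
k_soil_to_river, k_river_to_sed, k_sed_to_river = 0.05, 0.5, 0.01   # 1/yr, assumed
decay = 0.023                                   # 1/yr, roughly a 30-yr half-life
source = np.array([1.0e6, 0.0, 0.0])            # Bq/yr release into soil (assumed)

K = np.array([
    [-(k_soil_to_river + decay), 0.0,                        0.0],
    [k_soil_to_river,            -(k_river_to_sed + decay),  k_sed_to_river],
    [0.0,                        k_river_to_sed,             -(k_sed_to_river + decay)],
])

sol = solve_ivp(lambda t, c: K @ c + source, (0.0, 200.0), np.zeros(3))
for name, inv in zip(labels, sol.y[:, -1]):
    print(f"{name:9s} inventory after 200 yr: {inv:.3e} Bq")
```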

  5. Predicting the ungauged basin : Model validation and realism assessment

    NARCIS (Netherlands)

    Van Emmerik, T.H.M.; Mulder, G.; Eilander, D.; Piet, M.; Savenije, H.H.G.

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  6. Predicting the ungauged basin: model validation and realism assessment

    NARCIS (Netherlands)

    van Emmerik, Tim; Mulder, Gert; Eilander, Dirk; Piet, Marijn; Savenije, Hubert

    2015-01-01

    The hydrological decade on Predictions in Ungauged Basins (PUB) led to many new insights in model development, calibration strategies, data acquisition and uncertainty analysis. Due to a limited amount of published studies on genuinely ungauged basins, model validation and realism assessment of

  7. A mathematical model for environmental risk assessment in manufacturing industry

    Institute of Scientific and Technical Information of China (English)

    何莉萍; 徐盛明; 陈大川; 党创寅

    2002-01-01

    Environmentally conscious manufacturing has become an important issue in industry because of market pressure and environmental regulations. An environmental risk assessment model was developed based on the network analytic method and fuzzy set theory. The "interval analysis method" was applied to deal with the on-site monitoring data as basic information for assessment. In addition, fuzzy set theory was employed to allow uncertain, interactive and dynamic information to be effectively incorporated into the environmental risk assessment. This model is a simple, practical and effective tool for evaluating the environmental risk of the manufacturing industry and for analyzing the relative impacts of emission wastes, which are hazardous to both human and ecosystem health. Furthermore, the model is considered useful for design engineers and decision-makers in designing and selecting processes when the costs, environmental impacts and performance of a product are taken into consideration.

  8. Mesorad dose assessment model. Volume 1. Technical basis

    International Nuclear Information System (INIS)

    Scherpelz, R.I.; Bander, T.J.; Athey, G.F.; Ramsdell, J.V.

    1986-03-01

    MESORAD is a dose assessment model for emergency response applications. Using release data for as many as 50 radionuclides, the model calculates: (1) external doses resulting from exposure to radiation emitted by radionuclides contained in elevated or deposited material; (2) internal dose commitment resulting from inhalation; and (3) total whole-body doses. External doses from airborne material are calculated using semi-infinite and finite cloud approximations. At each stage in model execution, the appropriate approximation is selected after considering the cloud dimensions. Atmospheric processes are represented in MESORAD by a combination of Lagrangian puff and Gaussian plume dispersion models, a source depletion (deposition velocity) dry deposition model, and a wet deposition model using washout coefficients based on precipitation rates
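
    A sketch of the Gaussian plume building block mentioned above (MESORAD additionally uses Lagrangian puffs, deposition and dose conversion, all omitted here); the dispersion-parameter power laws and the release figures are crude placeholders, not the model's actual parameterisation.

```python
# Gaussian plume concentration with ground reflection (illustrative only).
import numpy as np

def plume_concentration(Q, u, x, y, z, H):
    """Air concentration (Bq/m3) at downwind distance x (m), crosswind offset
    y (m) and height z (m), for release rate Q (Bq/s), wind speed u (m/s) and
    effective release height H (m)."""
    sigma_y = 0.08 * x ** 0.9        # placeholder dispersion parameters
    sigma_z = 0.06 * x ** 0.85
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H) ** 2 / (2 * sigma_z**2))
                + np.exp(-(z + H) ** 2 / (2 * sigma_z**2)))   # ground reflection
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

for x in (500.0, 1000.0, 5000.0):
    c = plume_concentration(Q=1.0e9, u=3.0, x=x, y=0.0, z=1.5, H=50.0)
    print(f"{x/1000:4.1f} km downwind: {c:.2e} Bq/m3")
```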

  9. ASSESSING INDIVIDUAL PERFORMANCE ON INFORMATION TECHNOLOGY ADOPTION: A NEW MODEL

    OpenAIRE

    Diah Hari Suryaningrum

    2012-01-01

    This paper aims to propose a new model for assessing individual performance in information technology adoption. The new model to assess individual performance was derived from two different theories: the decomposed theory of planned behavior and task-technology fit theory. Although many researchers have tried to expand these theories, some of their efforts might lack theoretical assumptions. To overcome this problem and enhance the coherence of the integration, I used a theory from social scien...

  10. Tackling Biocomplexity with Meta-models for Species Risk Assessment

    Directory of Open Access Journals (Sweden)

    Philip J. Nyhus

    2007-06-01

    Full Text Available We describe results of a multi-year effort to strengthen consideration of the human dimension into endangered species risk assessments and to strengthen research capacity to understand biodiversity risk assessment in the context of coupled human-natural systems. A core group of social and biological scientists have worked with a network of more than 50 individuals from four countries to develop a conceptual framework illustrating how human-mediated processes influence biological systems and to develop tools to gather, translate, and incorporate these data into existing simulation models. A central theme of our research focused on (1) the difficulties often encountered in identifying and securing diverse bodies of expertise and information that is necessary to adequately address complex species conservation issues; and (2) the development of quantitative simulation modeling tools that could explicitly link these datasets as a way to gain deeper insight into these issues. To address these important challenges, we promote a "meta-modeling" approach where computational links are constructed between discipline-specific models already in existence. In this approach, each model can function as a powerful stand-alone program, but interaction between applications is achieved by passing data structures describing the state of the system between programs. As one example of this concept, an integrated meta-model of wildlife disease and population biology is described. A goal of this effort is to improve science-based capabilities for decision making by scientists, natural resource managers, and policy makers addressing environmental problems in general, and focusing on biodiversity risk assessment in particular.
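
    The coupling idea can be sketched schematically: two stand-alone models (here an invented disease model and an invented population model) advance in lockstep and exchange a small state data structure each timestep instead of being merged into one program.

```python
# Schematic meta-model coupling: pass a state dict between two toy models.
import numpy as np

rng = np.random.default_rng(11)

def disease_step(state):
    """Update disease prevalence (toy logistic susceptible-infected dynamics)."""
    prev = state["prevalence"]
    new_prev = prev + 0.3 * prev * (1 - prev) - 0.1 * prev
    return {**state, "prevalence": min(max(new_prev, 0.0), 1.0)}

def population_step(state):
    """Update population size; higher prevalence depresses growth."""
    growth = 0.02 - 0.05 * state["prevalence"]
    n = state["population"] * np.exp(growth + rng.normal(0, 0.01))
    return {**state, "population": n}

state = {"population": 500.0, "prevalence": 0.05}
for year in range(20):
    state = population_step(disease_step(state))   # exchange state each step
print("after 20 yr:", {k: round(v, 3) for k, v in state.items()})
```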

  11. Assessing the Validity of the Simplified Potential Energy Clock Model for Modeling Glass-Ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Jamison, Ryan Dale [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grillet, Anne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stavig, Mark E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Strong, Kevin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dai, Steve Xunhu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Glass-ceramic seals may be the future of hermetic connectors at Sandia National Laboratories. They have been shown capable of surviving higher temperatures and pressures than amorphous glass seals. More advanced finite-element material models are required to enable model-based design and provide evidence that the hermetic connectors can meet design requirements. Glass-ceramics are composite materials with both crystalline and amorphous phases. The latter gives rise to (non-linearly) viscoelastic behavior. Given their complex microstructures, glass-ceramics may be thermorheologically complex, a behavior outside the scope of currently implemented constitutive models at Sandia. However, it was desired to assess if the Simplified Potential Energy Clock (SPEC) model is capable of capturing the material response. Available data for SL 16.8 glass-ceramic was used to calibrate the SPEC model. Model accuracy was assessed by comparing model predictions with shear moduli temperature dependence and high temperature 3-point bend creep data. It is shown that the model can predict the temperature dependence of the shear moduli and 3-point bend creep data. Analysis of the results is presented. Suggestions for future experiments and model development are presented. Though further calibration is likely necessary, SPEC has been shown capable of modeling glass-ceramic behavior in the glass transition region but requires further analysis below the transition region.

  12. Assessment and development of implementation models of health ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Assessment and development of implementation models of health-related ... The Contribution of Civil Society Organizations in Achieving Health for All ... Health Information for Maternal and Child Health Planning in Urban Bangladesh.

  13. High-resolution assessment of land use impacts on biodiversity in life cycle assessment using species habitat suitability models.

    Science.gov (United States)

    de Baan, Laura; Curran, Michael; Rondinini, Carlo; Visconti, Piero; Hellweg, Stefanie; Koellner, Thomas

    2015-02-17

    Agricultural land use is a main driver of global biodiversity loss. The assessment of land use impacts in decision-support tools such as life cycle assessment (LCA) requires spatially explicit models, but existing approaches are either not spatially differentiated or modeled at very coarse scales (e.g., biomes or ecoregions). In this paper, we develop a high-resolution (900 m) assessment method for land use impacts on biodiversity based on habitat suitability models (HSM) of mammal species. This method considers potential land use effects on individual species, and impacts are weighted by the species' conservation status and global rarity. We illustrate the method using a case study of crop production in East Africa, but the underlying HSMs developed by the Global Mammals Assessment are available globally. We calculate impacts of three major export crops and compare the results to two previously developed methods (focusing on local and regional impacts, respectively) to assess the relevance of the methodological innovations proposed in this paper. The results highlight hotspots of product-related biodiversity impacts that help characterize the links among agricultural production, consumption, and biodiversity loss.

  14. Introduction and Assessment of a Blended-Learning Model to Teach Patient Assessment in a Doctor of Pharmacy Program.

    Science.gov (United States)

    Prescott, William Allan; Woodruff, Ashley; Prescott, Gina M; Albanese, Nicole; Bernhardi, Christian; Doloresco, Fred

    2016-12-25

    Objective. To integrate a blended-learning model into a two-course patient assessment sequence in a doctor of pharmacy (PharmD) program and to assess the academic performance and perceptions of enrolled students. Design. A blended-learning model consisting of a flipped classroom format was integrated into a patient assessment (PA) course sequence. Course grades of students in the blended-learning (intervention) and traditional-classroom (control) groups were compared. A survey was administered to assess student perceptions. Assessment. The mean numeric grades of students in the intervention group were higher than those of students in the traditional group (PA1 course: 92.2±3.1 vs 90.0±4.3; and PA2 course: 90.3±4.9 vs 85.8±4.2). Eighty-six percent of the students in the intervention group agreed that the instructional methodologies used in this course facilitated understanding of the material. Conclusion. The blended-learning model was associated with improved academic performance and was well-received by students.

  15. Evolution in performance assessment modeling as a result of regulatory review

    Energy Technology Data Exchange (ETDEWEB)

    Rowat, J.H.; Dolinar, G.M.; Stephens, M.E. [AECL Chalk River Labs., Ontario (Canada)] [and others]

    1995-12-31

    AECL is planning to build the IRUS (Intrusion Resistant Underground Structure) facility for near-surface disposal of LLRW. The PSAR (preliminary safety assessment report) was subject to an initial regulatory review during mid-1992. The regulatory authority provided comments on many aspects of the safety assessment documentation, including a number of questions on specific PA (Performance Assessment) modelling assumptions. As a result of these comments as well as a separate detailed review of the IRUS disposal concept, changes were made to the conceptual and mathematical models. The original disposal concept included a non-sorbing vault backfill, with a strong reliance on the wasteform as a barrier. This concept was altered to decrease reliance on the wasteform by replacing the original backfill with a sand/clinoptilolite mix, which is a better sorber of metal cations. This change led to changes in the PA models, which in turn altered the safety case for the facility. This, and other changes that impacted performance assessment modelling, are the subject of this paper.

  16. Geologic modeling in risk assessment methodology for radioactive waste management

    International Nuclear Information System (INIS)

    Logan, S.E.; Berbano, M.C.

    1977-01-01

    Under contract to the U.S. Environmental Protection Agency (EPA), the University of New Mexico is developing a computer-based assessment methodology for evaluating public health and environmental impacts from the disposal of radioactive waste in geologic formations. The methodology incorporates a release or fault tree model, an environmental model, and an economic model. The release model and its application to a model repository in bedded salt are described. Fault trees are constructed to provide the relationships between various geologic and man-caused events which are potential mechanisms for release of radioactive material beyond the immediate environs of the repository. The environmental model includes: 1) the transport to and accumulations at various receptors in the biosphere, 2) pathways from these environmental concentrations, and 3) radiation dose to man. Finally, economic results are used to compare and assess various disposal configurations as a basis for formulatin

  17. Assessment and improvement of biotransfer models to cow's milk and beef used in exposure assessment tools for organic pollutants.

    Science.gov (United States)

    Takaki, Koki; Wade, Andrew J; Collins, Chris D

    2015-11-01

    The aim of this study was to assess and improve the accuracy of biotransfer models for the organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow's milk and beef used in human exposure assessment. Metabolic rate in cattle is known as a key parameter for this biotransfer, however few experimental data and no simulation methods are currently available. In this research, metabolic rate was estimated using existing QSAR biodegradation models of microorganisms (BioWIN) and fish (EPI-HL and IFS-HL). This simulated metabolic rate was then incorporated into the mechanistic cattle biotransfer models (RAIDAR, ACC-HUMAN, OMEGA, and CKow). The goodness of fit tests showed that RAIDAR, ACC-HUMAN, OMEGA model performances were significantly improved using either of the QSARs when comparing the new model outputs to observed data. The CKow model is the only one that separates the processes in the gut and liver. This model showed the lowest residual error of all the models tested when the BioWIN model was used to represent the ruminant metabolic process in the gut and the two fish QSARs were used to represent the metabolic process in the liver. Our testing included EUSES and CalTOX which are KOW-regression models that are widely used in regulatory assessment. New regressions based on the simulated rate of the two metabolic processes are also proposed as an alternative to KOW-regression models for a screening risk assessment. The modified CKow model is more physiologically realistic, but has equivalent usability to existing KOW-regression models for estimating cattle biotransfer of organic pollutants. Copyright © 2015. Published by Elsevier Ltd.

  18. Confidence assessment. Site descriptive modelling SDM-Site Forsmark

    International Nuclear Information System (INIS)

    2008-09-01

    The objective of this report is to assess the confidence that can be placed in the Forsmark site descriptive model, based on the information available at the conclusion of the surface-based investigations (SDM-Site Forsmark). In this exploration, an overriding question is whether remaining uncertainties are significant for repository engineering design or long-term safety assessment and could successfully be further reduced by more surface-based investigations or more usefully by explorations underground made during construction of the repository. The confidence in the Forsmark site descriptive model, based on the data available at the conclusion of the surface-based site investigations, has been assessed by exploring: Confidence in the site characterisation data base; Key remaining issues and their handling; Handling of alternative models; Consistency between disciplines; and, Main reasons for confidence and lack of confidence in the model. It is generally found that the key aspects of importance for safety assessment and repository engineering of the Forsmark site descriptive model are associated with a high degree of confidence. Because of the robust geological model that describes the site, the overall confidence in the Forsmark site descriptive model is judged to be high. While some aspects have lower confidence, this lack of confidence is handled by providing wider uncertainty ranges, bounding estimates and/or alternative models. Most, but not all, of the low confidence aspects have little impact on repository engineering design or on long-term safety. Poor precision in the measured data is judged to have limited impact on uncertainties in the site descriptive model, with the exceptions of inaccuracy in determining the position of some boreholes at depth in 3-D space, as well as the poor precision of the orientation of BIPS images in some boreholes, and the poor precision of stress data determined by overcoring at the locations where the pre

  19. The Methodical Instrumentarium for Assessing the Competitiveness of Business Model of Trade Enterprise

    Directory of Open Access Journals (Sweden)

    Grosul Victoria A.

    2017-10-01

    Full Text Available The article substantiates the need to assess the competitiveness of the business model of an enterprise. By analyzing, systematizing and generalizing the scientific work of foreign and domestic scientists, the basic assessment methods were identified and the feasibility of an integrated approach to assessing the competitiveness of the business model of an enterprise was argued. A scorecard system for evaluating the competitiveness of the business model of a trade enterprise has been developed and a structural and logical framework for the integrated assessment of the competitiveness of the indicated business model has been substantiated. Based on the Ishikawa diagram, potential competitive advantages have been defined and the «problem areas» that impact the competitiveness of the enterprise's business model have been identified. The authors provide recommendations for transforming the business model of a trade enterprise in the context of the enterprise's orientation towards development.

  20. Assessing testamentary and decision-making capacity: Approaches and models.

    Science.gov (United States)

    Purser, Kelly; Rosenfeld, Tuly

    2015-09-01

    The need for better and more accurate assessments of testamentary and decision-making capacity grows as Australian society ages and incidences of mentally disabling conditions increase. Capacity is a legal determination, but one on which medical opinion is increasingly being sought. The difficulties inherent within capacity assessments are exacerbated by the ad hoc approaches adopted by legal and medical professionals based on individual knowledge and skill, as well as the numerous assessment paradigms that exist. This can negatively affect the quality of assessments, and results in confusion as to the best way to assess capacity. This article begins by assessing the nature of capacity. The most common general assessment models used in Australia are then discussed, as are the practical challenges associated with capacity assessment. The article concludes by suggesting a way forward to satisfactorily assess legal capacity given the significant ramifications of getting it wrong.

  1. A Process Model for Assessing Adolescent Risk for Suicide.

    Science.gov (United States)

    Stoelb, Matt; Chiriboga, Jennifer

    1998-01-01

    This comprehensive assessment process model includes primary, secondary, and situational risk factors and their combined implications and significance in determining an adolescent's level of risk for suicide. Empirical data and clinical intuition are integrated to form a working client model that guides the professional in continuously reassessing…

  2. Model testing for the remediation assessment of a radium contaminated site in Olen, Belgium

    International Nuclear Information System (INIS)

    Sweeck, Lieve; Kanyar, Bela; Krajewski, Pawel; Kryshev, Alexander; Lietava, Peter; Nenyei, Arpad; Sazykina, Tatiana; Yu, Charley; Zeevaert, Theo

    2005-01-01

    Environmental assessment models are used as decision-aiding tools in the selection of remediation options for radioactively contaminated sites. In most cases, the effectiveness of the remedial actions in terms of dose savings cannot be demonstrated directly, but can be established with the help of environmental assessment models, through the assessment of future radiological impacts. It should be emphasized that, given the complexity of the processes involved and our current understanding of how they operate, these models are simplified descriptions of the behaviour of radionuclides in the environment and therefore imperfect. One way of testing and improving the reliability of the models is to compare their predictions with real data and/or the predictions of other models. Within the framework of the Remediation Assessment Working Group (RAWG) of the BIOMASS (BIOsphere Modelling and ASSessment) programme coordinated by IAEA, two scenarios were constructed and applied to test the reliability of environmental assessment models when remedial actions are involved. As a test site, an area of approximately 100 ha contaminated by the discharges of an old radium extraction plant in Olen (Belgium) has been considered. In the first scenario, a real situation was evaluated and model predictions were compared with measured data. In the second scenario the model predictions for specific hypothetical but realistic situations were compared. Most of the biosphere models were not developed to assess the performance of remedial actions and had to be modified for this purpose. It was demonstrated clearly that the modeller's experience and familiarity with the mathematical model, the site and with the scenario play a very important role in the outcome of the model calculations. More model testing studies, preferably for real situations, are needed in order to improve the models and modelling methods and to expand the areas in which the models are applicable

  3. Models for assessing and managing credit risk

    Directory of Open Access Journals (Sweden)

    Neogradi Slađana

    2014-01-01

    Full Text Available This essay deals with the definition of a model for assessing and managing credit risk. Risk is an inseparable component of any ordinary credit transaction. The essay examines the different aspects of the identification and classification of risk in the banking industry, as well as the key components of modern risk management. The first part analyzes the impact of credit risk on a bank and presents empirical models for detecting the financial difficulties in which a company may find itself; on the basis of these models, a bank can reduce the number of risky assets it approves. The second part considers models for improving credit risk management, with emphasis on Basel I, II and III, and the third part identifies the model that is most appropriate and gives the best results for measuring credit risk in domestic banks.

  4. Assessing Ecosystem Model Performance in Semiarid Systems

    Science.gov (United States)

    Thomas, A.; Dietze, M.; Scott, R. L.; Biederman, J. A.

    2017-12-01

    In ecosystem process modelling, comparing outputs to benchmark datasets observed in the field is an important way to validate models, allowing the modelling community to track model performance over time and compare models at specific sites. Multi-model comparison projects as well as models themselves have largely been focused on temperate forests and similar biomes. Semiarid regions, on the other hand, are underrepresented in land surface and ecosystem modelling efforts, and yet will be disproportionately impacted by disturbances such as climate change due to their sensitivity to changes in the water balance. Benchmarking models at semiarid sites is an important step in assessing and improving models' suitability for predicting the impact of disturbance on semiarid ecosystems. In this study, several ecosystem models were compared at a semiarid grassland in southwestern Arizona using PEcAn, or the Predictive Ecosystem Analyzer, an open-source eco-informatics toolbox ideal for creating the repeatable model workflows necessary for benchmarking. Models included SIPNET, DALEC, JULES, ED2, GDAY, LPJ-GUESS, MAESPA, CLM, CABLE, and FATES. Comparison between model output and benchmarks such as net ecosystem exchange (NEE) tended to produce high root mean square error and low correlation coefficients, reflecting poor simulation of seasonality and the tendency for models to create much higher carbon sources than observed. These results indicate that ecosystem models do not currently adequately represent semiarid ecosystem processes.
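
    The benchmarking step described above reduces, at its core, to comparing a simulated flux series against the observed one with summary statistics such as root mean square error and the Pearson correlation coefficient. The short sketch below illustrates this comparison; the function name and the synthetic NEE series are hypothetical and are not part of the PEcAn workflow.

```python
import numpy as np

def benchmark_nee(simulated, observed):
    """Compare simulated and observed NEE using RMSE and Pearson r.

    Both inputs are 1-D arrays of matched time steps; pairs where either
    series is missing (NaN) are dropped before computing the statistics.
    """
    sim = np.asarray(simulated, dtype=float)
    obs = np.asarray(observed, dtype=float)
    mask = ~np.isnan(sim) & ~np.isnan(obs)
    sim, obs = sim[mask], obs[mask]
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    r = np.corrcoef(sim, obs)[0, 1]
    return rmse, r

# Hypothetical daily NEE: a model that misses the seasonal cycle and
# simulates a constant carbon source scores a high RMSE and a low r.
t = np.arange(365)
observed = -2.0 * np.sin(2 * np.pi * (t - 120) / 365) + np.random.normal(0, 0.3, t.size)
simulated = np.full(t.size, 1.0)
print(benchmark_nee(simulated, observed))
```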

  5. Accident consequence assessments with different atmospheric dispersion models

    International Nuclear Information System (INIS)

    Panitz, H.J.

    1989-11-01

    An essential aim of the improvements of the new program system UFOMOD for Accident Consequence Assessments (ACAs) was to substitute the straight-line Gaussian plume model conventionally used in ACA models by more realistic atmospheric dispersion models. To identify improved models which can be applied in ACA codes and to quantify the implications of different dispersion models on the results of an ACA, probabilistic comparative calculations with different atmospheric dispersion models have been performed. The study showed that there are trajectory models available which can be applied in ACAs and that they provide more realistic results of ACAs than straight-line Gaussian models. This led to a completely novel concept of atmospheric dispersion modelling in which two different distance ranges of validity are distinguished: the near range out to some tens of kilometres and the adjacent far range, which are assigned to respective trajectory models. (orig.) [de]

  6. Fully integrated modelling for sustainability assessment of resource recovery from waste.

    Science.gov (United States)

    Millward-Hopkins, Joel; Busch, Jonathan; Purnell, Phil; Zwirner, Oliver; Velis, Costas A; Brown, Andrew; Hahladakis, John; Iacovidou, Eleni

    2018-01-15

    This paper presents an integrated modelling approach for value assessments, focusing on resource recovery from waste. The method tracks and forecasts a range of values across environmental, social, economic and technical domains by attaching these to material-flows, thus building upon and integrating unidimensional models such as material flow analysis (MFA) and lifecycle assessment (LCA). We argue that the usual classification of metrics into these separate domains is useful for interpreting the outputs of multidimensional assessments, but unnecessary for modelling. We thus suggest that multidimensional assessments can be better performed by integrating the calculation methods of unidimensional models rather than their outputs. To achieve this, we propose a new metric typology that forms the foundation of a multidimensional model. This enables dynamic simulations to be performed with material-flows (or values in any domain) driven by changes in value in other domains. We then apply the model in an illustrative case highlighting links between the UK coal-based electricity-production and concrete/cement industries, investigating potential impacts that may follow the increased use of low-carbon fuels (biomass and solid recovered fuels; SRF) in the former. We explore synergies and trade-offs in value across domains and regions, e.g. how changes in carbon emissions in one part of the system may affect mortality elsewhere. This highlights the advantages of recognising complex system dynamics and making high-level inferences of their effects, even when rigorous analysis is not possible. We also indicate how changes in social, environmental and economic 'values' can be understood as being driven by changes in the technical value of resources. Our work thus emphasises the advantages of building fully integrated models to inform conventional sustainability assessments, rather than applying hybrid approaches that integrate outputs from parallel models. The approach we

  7. Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  8. Fire models for assessment of nuclear power plant fires

    International Nuclear Information System (INIS)

    Nicolette, V.F.; Nowlen, S.P.

    1989-01-01

    This paper reviews the state-of-the-art in available fire models for the assessment of nuclear power plant fires. The advantages and disadvantages of three basic types of fire models (zone, field, and control volume) and Sandia's experience with these models will be discussed. It is shown that the type of fire model selected to solve a particular problem should be based on the information that is required. Areas of concern which relate to all nuclear power plant fire models are identified. 17 refs., 6 figs

  9. Assessment of modelling needs for safety analysis of current HTGR concepts

    International Nuclear Information System (INIS)

    Kroeger, P.G.; Van Tuyle, G.J.

    1985-12-01

    In view of the recent shift in emphasis of the DOE/Industry HTGR development efforts to smaller modular designs it became necessary to review the modelling needs and the codes available to assess the safety performance of these new designs. This report provides a final assessment of the most urgent modelling needs, comparing these to the tools available, and outlining the most significant areas where further modelling is required. Plans to implement the required work are presented. 47 refs., 20 figs

  10. Probabilistic Modeling and Risk Assessment of Cable Icing

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee

    This dissertation addresses the issues related to icing of structures with special emphasis on bridge cables. Cable-supported bridges in cold climates suffer from ice accreting on the cables; this poses three different undesirable situations. Firstly, the changed shape of the cable due to ice...... preliminary framework is modified for assessing the probability of occurrence of in-cloud and precipitation icing and its duration. Different probabilistic models are utilized for the representation of the meteorological variables and their appropriateness is evaluated both through goodness-of-fit tests...... are influencing the two icing mechanisms and their duration. The model is found to be more sensitive to changes in the discretization levels of the input variables. Thirdly, the developed operational probabilistic framework for the assessment of the expected number of occurrences of ice/snow accretion on bridge

  11. Combining catchment and instream modelling to assess physical habitat quality

    DEFF Research Database (Denmark)

    Olsen, Martin

    Study objectives After the implementation of the EU's Water Framework Directive (WFD) in Denmark, ecological impacts from groundwater exploitation on surface waters have to receive additional consideration. Small streams in particular are susceptible to changes in run-off but have only received little...... attention in past studies of run-off impact on the quality of stream physical habitats. This study combined catchment and instream models with instream habitat observations to assess the ecological impacts from groundwater exploitation on a small stream. The main objectives of this study were: • to assess...... which factors are controlling the run-off conditions in stream Ledreborg and to what degree • to assess the run-off reference condition of stream Ledreborg, where intensive groundwater abstraction has taken place for 67 years, using a simple rainfall-run-off model • to assess how stream run-off affect...

  12. A parsimonious dynamic model for river water quality assessment.

    Science.gov (United States)

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent over recent years. Different model types ranging from detailed physical models to simplified conceptual models are available. A possible middle ground between detailed and simplified models may be parsimonious models, which represent the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on the data available for correct model application. When there is inadequate data, it is necessary to focus on a simple river water quality model rather than a detailed one. The study presents a parsimonious river water quality model to evaluate the propagation of pollutants in natural rivers. The model is made up of two sub-models: a quantity sub-model and a quality sub-model. The model employs a river schematisation that considers different stretches according to the geometric characteristics and to the gradient of the river bed. Each stretch is represented with a conceptual model of a series of linear channels and reservoirs. The channels determine the delay in the pollution wave and the reservoirs cause its dispersion. To assess the river water quality, the model employs four state variables: DO, BOD, NH4, and NO. The model was applied to the Savena River (Italy), which is the focus of a European-financed project in which quantity and quality data were gathered. A sensitivity analysis of the model output with respect to the model inputs and parameters was done based on the Generalised Likelihood Uncertainty Estimation methodology. The results demonstrate the suitability of such a model as a tool for river water quality management.
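
    The stretch representation described above (a linear channel in series with linear reservoirs) can be illustrated with a short routing sketch. This is a generic sketch under assumed parameters, not the calibrated Savena model: the time step, delay, storage coefficients and the inflow wave are all hypothetical.

```python
import numpy as np

def route_stretch(inflow, delay_steps, k_values, dt=1.0):
    """Route an inflow wave through one river stretch modelled as a pure
    delay (the linear channel) followed by a cascade of linear reservoirs
    (storage S = k * Q), integrated with a simple explicit scheme."""
    # Channel: shift the wave by the travel time of the stretch.
    q = np.concatenate([np.zeros(delay_steps), inflow])[: len(inflow)]
    # Reservoir cascade: each reservoir disperses (attenuates) the wave.
    for k in k_values:
        out = np.zeros_like(q)
        storage = 0.0
        for i, q_in in enumerate(q):
            storage += dt * (q_in - storage / k)  # dS/dt = q_in - S/k
            out[i] = storage / k
        q = out
    return q

# Hypothetical pollution/flow wave entering the stretch.
inflow = np.zeros(100)
inflow[5:15] = 10.0
outflow = route_stretch(inflow, delay_steps=4, k_values=[3.0, 3.0])
print(round(outflow.max(), 2), int(outflow.argmax()))  # attenuated, delayed peak
```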

  13. Assessment of TRACE Condensation Model Against Reflux Condensation Tests with Noncondensable Gases

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Won; Cheong, Ae Ju; Shin, Andong; Suh, Nam Duk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    TRACE is the latest in a series of advanced, best-estimate reactor systems codes developed by the U.S. Nuclear Regulatory Commission for analyzing transient and steady-state neutronic-thermal-hydraulic behavior in light water reactors. Its special condensation model is expected to replace the default model in a future code release after sufficient testing has been completed. This study assesses the special condensation model of TRACE 5.0-patch4 against the counter-current flow configuration. For this purpose, the predicted results of the special model are compared to the experimental data and to those of the default model. The KAST reflux condensation tests with NC gases are used in this assessment. We assessed the special model for film condensation of TRACE 5.0-patch4 against the data of the reflux condensation tests in the presence of NC gases. The special condensation model of TRACE provides a reasonable estimate of the HTC, with good agreement at the low inlet steam flow rate.

  14. Assessment of TRACE Condensation Model Against Reflux Condensation Tests with Noncondensable Gases

    International Nuclear Information System (INIS)

    Lee, Kyung Won; Cheong, Ae Ju; Shin, Andong; Suh, Nam Duk

    2015-01-01

    TRACE is the latest in a series of advanced, best-estimate reactor systems codes developed by the U.S. Nuclear Regulatory Commission for analyzing transient and steady-state neutronic-thermal-hydraulic behavior in light water reactors. Its special condensation model is expected to replace the default model in a future code release after sufficient testing has been completed. This study assesses the special condensation model of TRACE 5.0-patch4 against the counter-current flow configuration. For this purpose, the predicted results of the special model are compared to the experimental data and to those of the default model. The KAST reflux condensation tests with NC gases are used in this assessment. We assessed the special model for film condensation of TRACE 5.0-patch4 against the data of the reflux condensation tests in the presence of NC gases. The special condensation model of TRACE provides a reasonable estimate of the HTC, with good agreement at the low inlet steam flow rate

  15. Training courses on integrated safety assessment modelling for waste repositories

    International Nuclear Information System (INIS)

    Mallants, D.

    2007-01-01

    Near-surface or deep repositories of radioactive waste are being developed and evaluated all over the world. Also, existing repositories for low- and intermediate-level waste often need to be re-evaluated to extend their license or to obtain permission for final closure. The evaluation encompasses both technical feasibility and a safety analysis. The long-term safety is usually demonstrated by means of a performance or safety assessment. For this purpose computer models are used that calculate the migration of radionuclides from the conditioned radioactive waste, through engineered barriers, to the environment (groundwater, surface water, and biosphere). Integrated safety assessment modelling addresses all relevant radionuclide pathways from source to receptor (man), using in combination various computer codes in which the most relevant physical, chemical, mechanical, or even microbiological processes are mathematically described. SCK-CEN organizes training courses in integrated safety assessment modelling that are intended for individuals who have either a controlling or supervising role within the national radwaste agencies or regulating authorities, or for technical experts who carry out the actual post-closure safety assessment for an existing or new repository. Courses are organised by the Department of Waste and Disposal

  16. Petroleum system modeling capabilities for use in oil and gas resource assessments

    Science.gov (United States)

    Higley, Debra K.; Lewan, Michael; Roberts, Laura N.R.; Henry, Mitchell E.

    2006-01-01

    Summary: Petroleum resource assessments are among the most highly visible and frequently cited scientific products of the U.S. Geological Survey. The assessments integrate diverse and extensive information on the geologic, geochemical, and petroleum production histories of provinces and regions of the United States and the World. Petroleum systems modeling incorporates these geoscience data in ways that strengthen the assessment process and results are presented visually and numerically. The purpose of this report is to outline the requirements, advantages, and limitations of one-dimensional (1-D), two-dimensional (2-D), and three-dimensional (3-D) petroleum systems modeling that can be applied to the assessment of oil and gas resources. Primary focus is on the application of the Integrated Exploration Systems (IES) PetroMod software because of familiarity with that program as well as the emphasis by the USGS Energy Program on standardizing to one modeling application. The Western Canada Sedimentary Basin (WCSB) is used to demonstrate the use of the PetroMod software. Petroleum systems modeling quantitatively extends the 'total petroleum systems' (TPS) concept (Magoon and Dow, 1994; Magoon and Schmoker, 2000) that is employed in USGS resource assessments. Modeling allows integration of state-of-the-art analysis techniques, and provides the means to test and refine understanding of oil and gas generation, migration, and accumulation. Results of modeling are presented visually, numerically, and statistically, which enhances interpretation of the processes that affect TPSs through time. Modeling also provides a framework for the input and processing of many kinds of data essential in resource assessment, including (1) petroleum system elements such as reservoir, seal, and source rock intervals; (2) timing of depositional, hiatus, and erosional events and their influences on petroleum systems; (3) incorporation of vertical and lateral distribution and lithologies of

  17. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT: A GIS-BASED HYDROLOGIC MODELING TOOL

    Science.gov (United States)

    Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extens...

  18. Zebrafish as a correlative and predictive model for assessing biomaterial nanotoxicity.

    Science.gov (United States)

    Fako, Valerie E; Furgeson, Darin Y

    2009-06-21

    The lack of correlative and predictive models to assess acute and chronic toxicities limits the rapid pre-clinical development of new therapeutics. This barrier is due in part to the exponential growth of nanotechnology and nanotherapeutics, coupled with the lack of rigorous and robust screening assays and putative standards. It is a fairly simple and cost-effective process to initially screen the toxicity of a nanomaterial by using in vitro cell cultures; unfortunately it is nearly impossible to imitate a complementary in vivo system. Small mammalian models are the most common method used to assess possible toxicities and biodistribution of nanomaterials in humans. Alternatively, Danio rerio, commonly known as zebrafish, are proving to be a quick, cheap, and facile model to conservatively assess toxicity of nanomaterials.

  19. Risk assessment and food allergy: the probabilistic model applied to allergens

    NARCIS (Netherlands)

    Spanjersberg, M.Q.I.; Kruizinga, A.G.; Rennen, M.A.J.; Houben, G.F.

    2007-01-01

    In order to assess the risk of unintended exposure to food allergens, traditional deterministic risk assessment is usually applied, leading to inconsequential conclusions such as 'an allergic reaction cannot be excluded'. TNO therefore developed a quantitative risk assessment model for allergens based on

  20. Crop modelling for integrated assessment of risk to food production from climate change

    NARCIS (Netherlands)

    Ewert, F.; Rötter, R.P.; Bindi, M.; Webber, Heidi; Trnka, M.; Kersebaum, K.C.; Olesen, J.E.; Ittersum, van M.K.; Janssen, S.J.C.; Rivington, M.; Semenov, M.A.; Wallach, D.; Porter, J.R.; Stewart, D.; Verhagen, J.; Gaiser, T.; Palosuo, T.; Tao, F.; Nendel, C.; Roggero, P.P.; Bartosová, L.; Asseng, S.

    2015-01-01

    The complexity of risks posed by climate change and possible adaptations for crop production has called for integrated assessment and modelling (IAM) approaches linking biophysical and economic models. This paper attempts to provide an overview of the present state of crop modelling to assess

  1. A Review of Quantitative Situation Assessment Models for Nuclear Power Plant Operators

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Situation assessment is the process of developing situation awareness, and situation awareness is defined as 'the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future.' Situation awareness is an important element influencing human actions because human decision making is based on the result of situation assessment or situation awareness. There are many models of situation awareness, and those models can be categorized as qualitative or quantitative. As the effects of input factors on situation awareness can be investigated through quantitative models, the quantitative models are more useful than the qualitative ones for the design of operator interfaces, automation strategies, training programs, and so on. This study presents the review of two quantitative models of situation assessment (SA) for nuclear power plant operators

  2. Modelling Global Land Use and Social Implications in the Sustainability Assessment of Biofuels

    DEFF Research Database (Denmark)

    Kløverpris, Jesper; Wenzel, Henrik

    2007-01-01

    Cross-fertilising environmental, economic and geographical modelling to improve the environmental assessment of biofuel...

  3. Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea

    International Nuclear Information System (INIS)

    Jang, Minho; Hong, Taehoon; Ji, Changyoon

    2015-01-01

    The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. In particular, the human toxicity potential calculated by the previous model (13 kg C6H6 eq.) was much lower than that calculated by the developed model (1965 kg C6H6 eq.). The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. - Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings
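
    The hybrid idea, combining process-based coefficients where physical material quantities are known with input-output coefficients for the remaining (monetary) inputs, can be sketched in a few lines. The material quantities, spends and emission factors below are invented placeholders, not data from the South Korean model, and only the global-warming indicator is shown.

```python
# Tiered-hybrid sketch: embodied global-warming potential of a building,
# combining process-based LCA (physical quantities x emission factors)
# with input-output LCA (remaining spend x sectoral intensities).
# All numbers are hypothetical placeholders.

process_inventory = {          # material: (quantity [t], kg CO2-eq per t)
    "ready-mix concrete": (5200, 110.0),
    "reinforcing steel": (310, 1850.0),
}
io_spend = {                   # sector: (spend [USD], kg CO2-eq per USD)
    "construction services": (1.2e6, 0.35),
    "transport": (2.0e5, 0.55),
}

process_part = sum(q * ef for q, ef in process_inventory.values())
io_part = sum(s * intensity for s, intensity in io_spend.values())
embodied_gwp = process_part + io_part
print(f"Embodied GWP ~ {embodied_gwp / 1000:.0f} t CO2-eq")
```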

  4. A review of laboratory and numerical modelling in volcanology

    Directory of Open Access Journals (Sweden)

    J. L. Kavanagh

    2018-04-01

    Full Text Available Modelling has been used in the study of volcanic systems for more than 100 years, building upon the approach first applied by Sir James Hall in 1815. Informed by observations of volcanological phenomena in nature, including eye-witness accounts of eruptions, geophysical or geodetic monitoring of active volcanoes, and geological analysis of ancient deposits, laboratory and numerical models have been used to describe and quantify volcanic and magmatic processes that span orders of magnitudes of time and space. We review the use of laboratory and numerical modelling in volcanological research, focussing on sub-surface and eruptive processes including the accretion and evolution of magma chambers, the propagation of sheet intrusions, the development of volcanic flows (lava flows, pyroclastic density currents, and lahars), volcanic plume formation, and ash dispersal. When first introduced into volcanology, laboratory experiments and numerical simulations marked a transition in approach from broadly qualitative to increasingly quantitative research. These methods are now widely used in volcanology to describe the physical and chemical behaviours that govern volcanic and magmatic systems. Creating simplified models of highly dynamical systems enables volcanologists to simulate and potentially predict the nature and impact of future eruptions. These tools have provided significant insights into many aspects of the volcanic plumbing system and eruptive processes. The largest scientific advances in volcanology have come from a multidisciplinary approach, applying developments in diverse fields such as engineering and computer science to study magmatic and volcanic phenomena. A global effort in the integration of laboratory and numerical volcano modelling is now required to tackle key problems in volcanology and points towards the importance of benchmarking exercises and the need for protocols to be developed so that models are routinely tested against real world data.

  5. A review of laboratory and numerical modelling in volcanology

    Science.gov (United States)

    Kavanagh, Janine L.; Engwell, Samantha L.; Martin, Simon A.

    2018-04-01

    Modelling has been used in the study of volcanic systems for more than 100 years, building upon the approach first applied by Sir James Hall in 1815. Informed by observations of volcanological phenomena in nature, including eye-witness accounts of eruptions, geophysical or geodetic monitoring of active volcanoes, and geological analysis of ancient deposits, laboratory and numerical models have been used to describe and quantify volcanic and magmatic processes that span orders of magnitudes of time and space. We review the use of laboratory and numerical modelling in volcanological research, focussing on sub-surface and eruptive processes including the accretion and evolution of magma chambers, the propagation of sheet intrusions, the development of volcanic flows (lava flows, pyroclastic density currents, and lahars), volcanic plume formation, and ash dispersal. When first introduced into volcanology, laboratory experiments and numerical simulations marked a transition in approach from broadly qualitative to increasingly quantitative research. These methods are now widely used in volcanology to describe the physical and chemical behaviours that govern volcanic and magmatic systems. Creating simplified models of highly dynamical systems enables volcanologists to simulate and potentially predict the nature and impact of future eruptions. These tools have provided significant insights into many aspects of the volcanic plumbing system and eruptive processes. The largest scientific advances in volcanology have come from a multidisciplinary approach, applying developments in diverse fields such as engineering and computer science to study magmatic and volcanic phenomena. A global effort in the integration of laboratory and numerical volcano modelling is now required to tackle key problems in volcanology and points towards the importance of benchmarking exercises and the need for protocols to be developed so that models are routinely tested against real world data.

  6. Cost Model for Risk Assessment of Company Operation in Audit

    Directory of Open Access Journals (Sweden)

    S. V.

    2017-12-01

    Full Text Available This article explores an approach to assessing the risk of termination of company operation by building a cost model. The model gives auditors information on managers' understanding of the factors influencing changes in the value of assets and liabilities, and on methods to identify such changes more effectively and reliably. Based on this information, the auditor can assess the adequacy of management's use of the assumption of continuity of company operation when preparing financial statements. Financial uncertainty entails factors that create risks of costs and revenue losses which, in the long run, can be a reason for termination of company operation and therefore need to be foreseen in the auditor's assessment of the adequacy of the continuity assumption. The purpose of the study is to explore and develop a methodology for using cost models to assess the risk of termination of company operation in audit. The methodology for assessing audit risk through the analysis of company valuation methods has not previously been dealt with. The review of methodologies for assessing the risks of termination of company operation in the course of an audit supports the conclusion that cost models can be an effective methodology for identifying and assessing such risks. The analysis of these methods gives an understanding of the existing system of company valuation, integrated into the management system, and of the consequences of its use, i.e. the comparison of asset price data with accounting data and market value data. Overvalued or undervalued company assets may be a sign of future sale or liquidation of a company, which may signal a high probability of termination of company operation. A wrong choice or application of valuation methods can be indicative of the risk of non

  7. Predictive assessment of models for dynamic functional connectivity

    DEFF Research Database (Denmark)

    Nielsen, Søren Føns Vind; Schmidt, Mikkel Nørgaard; Madsen, Kristoffer Hougaard

    2018-01-01

    represent functional brain networks as a meta-stable process with a discrete number of states; however, there is a lack of consensus on how to perform model selection and learn the number of states, as well as a lack of understanding of how different modeling assumptions influence the estimated state......In neuroimaging, it has become evident that models of dynamic functional connectivity (dFC), which characterize how intrinsic brain organization changes over time, can provide a more detailed representation of brain function than traditional static analyses. Many dFC models in the literature...... dynamics. To address these issues, we consider a predictive likelihood approach to model assessment, where models are evaluated based on their predictive performance on held-out test data. Examining several prominent models of dFC (in their probabilistic formulations) we demonstrate our framework...

  8. Quality assessment of protein model-structures based on structural and functional similarities.

    Science.gov (United States)

    Konopka, Bogumil M; Nebel, Jean-Christophe; Kotulska, Malgorzata

    2012-09-21

    Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the World Wide Protein Data Bank and the number of sequenced proteins constantly broadens. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model a structure from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate automatically a putative 3D structure model of any protein. However, the main issue becomes the evaluation of model quality, which is one of the most important challenges of structural biology. GOBA (Gene Ontology-Based Assessment) is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using the standardized and hierarchical description of proteins provided by Gene Ontology, combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single model quality assessment method; the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved mean Pearson correlations of 0.74 and 0.8 with the observed quality of models in our CASP8- and CASP9-based validation sets. GOBA also obtained the best result for two targets of CASP8, and
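
    The underlying scoring idea, structural similarity of the model to reference proteins weighted by how functionally similar those references are to the prediction target, can be written compactly. The sketch below is a simplified illustration with placeholder similarity values; it does not reproduce DALI Z-scores or Wang's semantic-similarity algorithm, and the weighted-average form is an assumption made for the example.

```python
def goba_like_score(model_vs_refs, target_vs_refs):
    """Simplified quality score: structural similarity of the model to each
    reference protein, weighted by the functional (GO-based) similarity of
    that reference to the prediction target.

    model_vs_refs  : {ref_id: structural similarity of the model to ref}
    target_vs_refs : {ref_id: functional similarity of the target to ref}
    """
    num = sum(target_vs_refs[r] * model_vs_refs[r] for r in model_vs_refs)
    den = sum(target_vs_refs[r] for r in model_vs_refs)
    return num / den if den else 0.0

# Hypothetical similarities in [0, 1]; a real run would use DALI for the
# structural term and GO semantic similarity for the functional term.
structural = {"P1": 0.82, "P2": 0.47, "P3": 0.65}
functional = {"P1": 0.90, "P2": 0.30, "P3": 0.75}
print(round(goba_like_score(structural, functional), 3))
```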

  9. A Risk Assessment Example for Soil Invertebrates Using Spatially Explicit Agent-Based Models

    DEFF Research Database (Denmark)

    Reed, Melissa; Alvarez, Tania; Chelinho, Sonia

    2016-01-01

    Current risk assessment methods for measuring the toxicity of plant protection products (PPPs) on soil invertebrates use standardized laboratory conditions to determine acute effects on mortality and sublethal effects on reproduction. If an unacceptable risk is identified at the lower tier...... population models for ubiquitous soil invertebrates (collembolans and earthworms) as refinement options in current risk assessment. Both are spatially explicit agent-based models (ABMs), incorporating individual and landscape variability. The models were used to provide refined risk assessments for different...... application scenarios of a hypothetical pesticide applied to potato crops (full-field spray onto the soil surface [termed “overall”], in-furrow, and soil-incorporated pesticide applications). In the refined risk assessment, the population models suggest that soil invertebrate populations would likely recover...

  10. Integrating Machine Learning into a Crowdsourced Model for Earthquake-Induced Damage Assessment

    Science.gov (United States)

    Rebbapragada, Umaa; Oommen, Thomas

    2011-01-01

    On January 12th, 2010, a catastrophic magnitude 7.0 earthquake devastated the country of Haiti. In the aftermath of an earthquake, it is important to rapidly assess damaged areas in order to mobilize the appropriate resources. The Haiti damage assessment effort introduced a promising model that uses crowdsourcing to map damaged areas in freely available remotely-sensed data. This paper proposes the application of machine learning methods to improve this model. Specifically, we apply work on learning from multiple, imperfect experts to the assessment of volunteer reliability, and propose the use of image segmentation to automate the detection of damaged areas. We wrap both tasks in an active learning framework in order to shift volunteer effort from mapping a full catalog of images to the generation of high-quality training data. We hypothesize that the integration of machine learning into this model improves its reliability, maintains the speed of damage assessment, and allows the model to scale to higher data volumes.
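
    The active-learning loop mentioned above can be sketched generically: train a classifier on the currently labelled tiles, then ask volunteers to label only the tiles the classifier is least certain about. The classifier choice, feature vectors and query budget below are hypothetical and are not the pipeline used in the Haiti effort.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def uncertainty_sampling(model, X_pool, n_queries):
    """Return indices of the unlabelled image tiles whose predicted class
    probabilities are closest to 0.5, i.e. the tiles to route to
    volunteers for labelling in the next round."""
    proba = model.predict_proba(X_pool)[:, 1]
    uncertainty = np.abs(proba - 0.5)
    return np.argsort(uncertainty)[:n_queries]

# Hypothetical feature vectors for labelled and unlabelled tiles.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(50, 8)), rng.integers(0, 2, 50)
X_pool = rng.normal(size=(500, 8))

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
query_idx = uncertainty_sampling(clf, X_pool, n_queries=10)
print(query_idx)  # tiles to present to volunteers next
```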

  11. Hydrological Modelling using HEC-HMS for Flood Risk Assessment of Segamat Town, Malaysia

    Science.gov (United States)

    Romali, N. S.; Yusop, Z.; Ismail, A. Z.

    2018-03-01

    This paper presents an assessment of the applicability of the Hydrologic Modelling System developed by the Hydrologic Engineering Center (HEC-HMS) for hydrological modelling of the Segamat River. The objective of the model application is to assist in the assessment of flood risk by providing the peak flows of the 2011 Segamat flood for the generation of flood mapping of Segamat town. The capability of the model was evaluated by comparing historical observed data with the simulation results for the selected flood events. The model calibration and validation efficiency was verified using the Nash-Sutcliffe model efficiency coefficient. The results demonstrate the value of implementing the hydrological model for assessing flood risk, as the simulated peak flows are in agreement with historical observed data. The model efficiencies for the calibration and validation exercises are 0.90 and 0.76 respectively, which is acceptable.
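
    The Nash-Sutcliffe efficiency values quoted above (0.90 for calibration, 0.76 for validation) come from a standard goodness-of-fit measure; a minimal implementation is sketched below with hypothetical discharge series.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    NSE = 1 is a perfect fit; NSE <= 0 means the model is no better than
    simply using the mean of the observations."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical peak-flow series (m3/s) for a single flood event.
observed = np.array([120.0, 310.0, 540.0, 460.0, 280.0, 150.0])
simulated = np.array([110.0, 290.0, 565.0, 430.0, 300.0, 170.0])
print(round(nash_sutcliffe(observed, simulated), 2))
```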

  12. An assessment Model for Customer Relationship Management Process in Iranian Private-Commercial Banks

    Directory of Open Access Journals (Sweden)

    Tahmoures Hasangholi Pour

    2012-03-01

    Full Text Available According to several reports, in spite of huge investments in customer relationship management (CRM), the risk of implementing such projects is high. One of the failure factors is the lack of a method to assess CRM success comprehensively. Nowadays, classic financial methods are the common way of assessing marketing and CRM initiatives. However, these models are unsuitable for assessing investments like CRM, which are expected to have intangible, indirect and strategic benefits. A process-oriented approach is therefore needed to assess all tangible and intangible factors of the CRM process and to complete existing models. In this paper, using a qualitative approach and grounded theory, a comprehensive and process-oriented CRM assessment model is provided that considers all factors of customer relationship management in private-commercial banks of Iran. Finally, based on an analysis of the developed model, some suggestions are offered for banking managers and future researchers.

  13. Assessment of the Clinical Trainer as a Role Model: A Role Model Apperception Tool (RoMAT)

    NARCIS (Netherlands)

    Jochemsen-van der Leeuw, H. G. A. Ria; van Dijk, Nynke; Wieringa-de Waard, Margreet

    2014-01-01

    Purpose: Positive role modeling by clinical trainers is important for helping trainees learn professional and competent behavior. The authors developed and validated an instrument to assess clinical trainers as role models: the Role Model Apperception Tool (RoMAT). Method: On the basis of a 2011

  14. An Introduction to the Partial Credit Model for Developing Nursing Assessments.

    Science.gov (United States)

    Fox, Christine

    1999-01-01

    Demonstrates how the partial credit model, a variation of the Rasch Measurement Model, can be used to develop performance-based assessments for nursing education. Applies the model using the Practical Knowledge Inventory for Nurses. (SK)
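
    For readers unfamiliar with the partial credit model, its standard form (the usual Masters formulation, not quoted from the article) gives the probability that a respondent with ability θ reaches score category x of the m_i ordered categories of item i, given the step difficulties δ_ik:

```latex
P(X_i = x \mid \theta) =
  \frac{\exp\!\left[ \sum_{k=0}^{x} (\theta - \delta_{ik}) \right]}
       {\sum_{h=0}^{m_i} \exp\!\left[ \sum_{k=0}^{h} (\theta - \delta_{ik}) \right]},
  \qquad x = 0, 1, \ldots, m_i,
```

    with the usual convention that the k = 0 term in each sum is identically zero, so that reaching higher score categories requires clearing successive item steps.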

  15. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate

  16. Modeling LCD Displays with Local Backlight Dimming for Image Quality Assessment

    DEFF Research Database (Denmark)

    Korhonen, Jari; Burini, Nino; Forchhammer, Søren

    2011-01-01

    for evaluating the signal quality distortion related directly to digital signal processing, such as compression. However, the physical characteristics of the display device also pose a significant impact on the overall perception. In order to facilitate image quality assessment on modern liquid crystal displays...... (LCD) using light emitting diode (LED) backlight with local dimming, we present the essential considerations and guidelines for modeling the characteristics of displays with high dynamic range (HDR) and locally adjustable backlight segments. The representation of the image generated by the model can...... be assessed using the traditional objective metrics, and therefore the proposed approach is useful for assessing the performance of different backlight dimming algorithms in terms of resulting quality and power consumption in a simulated environment. We have implemented the proposed model in C++ and compared

  17. Hybrid LCA model for assessing the embodied environmental impacts of buildings in South Korea

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Minho, E-mail: minmin40@hanmail.net [Asset Management Division, Mate Plus Co., Ltd., 9th Fl., Financial News Bldg. 24-5 Yeouido-dong, Yeongdeungpo-gu, Seoul, 150-877 (Korea, Republic of); Hong, Taehoon, E-mail: hong7@yonsei.ac.kr [Department of Architectural Engineering, Yonsei University, Seoul, 120-749 (Korea, Republic of); Ji, Changyoon, E-mail: chnagyoon@yonsei.ac.kr [Department of Architectural Engineering, Yonsei University, Seoul, 120-749 (Korea, Republic of)

    2015-01-15

    The assessment of the embodied environmental impacts of buildings can help decision-makers plan environment-friendly buildings and reduce environmental impacts. For a more comprehensive assessment of the embodied environmental impacts of buildings, a hybrid life cycle assessment model was developed in this study. The developed model can assess the embodied environmental impacts (global warming, ozone layer depletion, acidification, eutrophication, photochemical ozone creation, abiotic depletion, and human toxicity) generated directly and indirectly in the material manufacturing, transportation, and construction phases. To demonstrate the application and validity of the developed model, the environmental impacts of an elementary school building were assessed using the developed model and compared with the results of a previous model used in a case study. The embodied environmental impacts from the previous model were lower than those from the developed model by 4.6–25.2%. In particular, the human toxicity potential calculated by the previous model (13 kg C6H6 eq.) was much lower than that calculated by the developed model (1965 kg C6H6 eq.). The results indicated that the developed model can quantify the embodied environmental impacts of buildings more comprehensively, and can be used by decision-makers as a tool for selecting environment-friendly buildings. - Highlights: • The model was developed to assess the embodied environmental impacts of buildings. • The model evaluates GWP, ODP, AP, EP, POCP, ADP, and HTP as environmental impacts. • The model presents more comprehensive results than the previous model by 4.6–100%. • The model can present the HTP of buildings, which the previous models cannot do. • Decision-makers can use the model for selecting environment-friendly buildings.

  18. Biosphere modeling in waste disposal safety assessments -- An example using the terrestrial-aquatic model of the environment

    International Nuclear Information System (INIS)

    Klos, R.A.

    1998-01-01

    Geological disposal of radioactive wastes is intended to provide long-term isolation of potentially harmful radionuclides from the human environment and the biosphere. The long timescales involved pose unique problems for biosphere modeling because there are considerable uncertainties regarding the state of the biosphere into which releases might ultimately occur. The key to representing the biosphere in long-timescale assessments is the flexibility with which those aspects of the biosphere that are of relevance to dose calculations are represented, and this comes from the way in which key biosphere features, events, and processes are represented in model codes. How this is done in contemporary assessments is illustrated by the Terrestrial-Aquatic Model of the Environment (TAME), an advanced biosphere model for waste disposal assessments recently developed in Switzerland. A numerical example of the release of radionuclides from a subterranean source to an inland valley biosphere is used to illustrate how biosphere modeling is carried out and the practical ways in which meaningful quantitative results can be achieved. The results emphasize the potential for accumulation of radionuclides in the biosphere over long timescales and also illustrate the role of parameter values in such modeling

  19. Momasi Model in Need Assessment of Faculty Members of Alborz University

    Directory of Open Access Journals (Sweden)

    S. Esmaelzadeh

    2013-02-01

    Full Text Available Background: The first step in developing human resources to improve the performance of universities is to identify accurate educational needs. Models may draw on a number of theories to help understand a particular problem in a certain setting or context. The Momasi model is an integration of the existing models in the field of educational needs assessment and offers sufficiently comprehensive data collection. The aim of this study was the application of the Momasi model to the needs assessment of faculty members in seven duty areas. Methods: This study is a cross-sectional study based on the Momasi model, conducted among 34 faculty members of Alborz University. Results: The different areas of educational needs were prioritized as follows: personal development, research, administrative and executive activities, education, health services and health promotion, and specialized activities outside the university. The highest mean and standard deviation belonged to the area of research. The first priority in the area of research was publications in English; in the personal development area, familiarity with SPSS software; and in the area of education, nurturing creativity. Conclusion: Based on the assessment results, the research area has the highest priority and frequency in this needs assessment study. It is therefore recommended that data gathered in the research area be given first priority in empowering the faculty members of Alborz University.

  20. Determinants of Dermal Exposure Relevant for Exposure Modelling in Regulatory Risk Assessment

    NARCIS (Netherlands)

    Marquart, J.; Brouwer, D.H.; Gijsbers, J.H.J.; Links, I.H.M.; Warren, N.; Hemmen, J.J. van

    2003-01-01

    Risk assessment of chemicals requires assessment of the exposure levels of workers. In the absence of adequate specific measured data, models are often used to estimate exposure levels. For dermal exposure only a few models exist, which are not validated externally. In the scope of a large European

  1. Towards a Model and Methodology for Assessing Student Learning Outcomes and Satisfaction

    Science.gov (United States)

    Duque, Lola C.; Weeks, John R.

    2010-01-01

    Purpose: The purpose of this paper is threefold: first, to introduce a conceptual model for assessing undergraduate student learning outcomes and satisfaction that involves concepts drawn from the services marketing and assessment literatures; second, to illustrate the utility of the model as implemented in an academic department (geography)…

  2. Goodness-of-Fit Assessment of Item Response Theory Models

    Science.gov (United States)

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  3. Measurement Rounding Errors in an Assessment Model of Project Led Engineering Education

    Directory of Open Access Journals (Sweden)

    Francisco Moreira

    2009-11-01

    Full Text Available This paper analyzes the rounding errors that occur in the assessment of an interdisciplinary Project-Led Education (PLE) process implemented in the Integrated Master degree on Industrial Management and Engineering (IME) at the University of Minho. PLE is an innovative educational methodology which makes use of active learning, promoting higher levels of motivation and students' autonomy. The assessment model is based on multiple evaluation components with different weights. Each component can be evaluated by several teachers involved in different Project Supporting Courses (PSC). This model can be affected by different types of errors, namely: (1) rounding errors, and (2) non-uniform criteria for rounding the grades. A rigorous analysis of the assessment model was made and the rounding errors involved in each project component were characterized and measured. This resulted in a global maximum error of 0.308 on the individual student project grade, on a 0 to 100 scale. This analysis was intended to improve not only the reliability of the assessment results, but also teachers' awareness of this problem. Recommendations are also made in order to improve the assessment model and reduce the rounding errors as much as possible.
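
    How component-level rounding propagates into the final project grade can be illustrated with a small numerical sketch; the weights, component grades and rounding rule below are hypothetical and are not the actual IME assessment scheme.

```python
def final_grade(component_grades, weights, ndigits=None):
    """Weighted final grade on a 0-100 scale; if ndigits is given, each
    component grade is rounded before the weights are applied."""
    if ndigits is not None:
        component_grades = [round(g, ndigits) for g in component_grades]
    return sum(w * g for w, g in zip(weights, component_grades))

# Hypothetical assessment: four components with different weights.
weights = [0.4, 0.3, 0.2, 0.1]
grades = [73.46, 81.24, 66.75, 90.15]

exact = final_grade(grades, weights)               # no intermediate rounding
rounded = final_grade(grades, weights, ndigits=0)  # components rounded to integers
print(round(exact, 3), round(rounded, 3), round(rounded - exact, 3))

# Worst case: rounding each component to the nearest integer shifts the
# final grade by at most 0.5 * sum(weights) = 0.5 points on a 0-100 scale.
```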

  4. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. These data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real-case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.

  5. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. These data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real-case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed
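
    As a simple illustration of the log-normal representation mentioned in the two records above, the sketch below pools several experts' estimates of an initiator frequency, each expressed as a median and an error factor, into a single log-normal summary. The equal-weight pooling rule and the numbers are assumptions made for this example; they are not the classical or Bayesian procedures of the report.

```python
import numpy as np

def pooled_lognormal(medians, error_factors):
    """Equal-weight pooling of expert judgements, each expressed as a
    log-normal with a median and an error factor EF = p95 / median.
    Returns the pooled median and a pooled 90% interval."""
    mu = np.mean(np.log(medians))                  # pooled log-median
    sigmas = np.log(error_factors) / 1.645         # EF -> log-sd per expert
    sigma = np.sqrt(np.mean(sigmas ** 2))          # simple pooled spread
    return np.exp(mu), (np.exp(mu - 1.645 * sigma), np.exp(mu + 1.645 * sigma))

# Hypothetical judgements of an initiator event frequency (per year).
medians = [1e-4, 3e-4, 5e-5]
error_factors = [3.0, 10.0, 5.0]
median, (p5, p95) = pooled_lognormal(medians, error_factors)
print(f"{median:.1e} ({p5:.1e}, {p95:.1e})")
```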

  6. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    Science.gov (United States)

    Bonasia, Rosanna; Scaini, Chirara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2013-01-01

    Popocatépetl is one of Mexico’s most active volcanoes threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene–Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl’s reawakening in 1994 produced a crisis that culminated with the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars. However, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the situation defined in Europe during 2010, and still under discussion. Tephra dispersal modelling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the “Ochre Pumice” Plinian eruption (4965 14C

  7. Efficiency assessment models of higher education institution staff activity

    Directory of Open Access Journals (Sweden)

    K. A. Dyusekeyev

    2016-01-01

    Full Text Available The paper substantiates the necessity of improving the university staff incentive system under the conditions of competition in the field of higher education, and the necessity of developing a separate model for evaluating the effectiveness of department heads. The authors analysed methods for assessing the production function of units and show the advantage of applying these methods to assess the effectiveness of economic structures in the field of higher education. The choice of the data envelopment analysis (DEA) method to solve the problem is substantiated. A model for evaluating the activity of university departments on the basis of the DEA methodology has been developed. On the basis of the staff pay systems of universities operating in Russia, Kazakhstan and other countries, the structure of the criteria system for evaluating university staff activity has been designed. For clarification and specification of the department activity efficiency criteria, a strategic map has been developed that allowed the input and output parameters of the model to be determined. The DEA methodology takes into account a large number of input and output parameters and increases the objectivity of the assessment by excluding experts, while providing interim data to identify the strengths and weaknesses of the evaluated object.
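
    A minimal form of the input-oriented CCR multiplier model that underlies DEA can be written as a linear program, as sketched below. The department inputs and outputs are invented for illustration; the model in the paper uses the criteria derived from the strategic map.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency (multiplier form) of unit j0.
    X: (n_units, n_inputs), Y: (n_units, n_outputs).
    max u.y0  s.t.  v.x0 = 1,  u.Yj - v.Xj <= 0 for all j,  u, v >= 0."""
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[j0], np.zeros(m)])       # minimize -u.y0
    A_ub = np.hstack([Y, -X])                        # u.Yj - v.Xj <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

# Hypothetical departments: inputs = [staff FTE, budget], outputs = [papers, graduates].
X = np.array([[20, 1.2], [35, 2.0], [15, 0.9]])
Y = np.array([[40, 120], [55, 180], [38, 110]])
for j in range(len(X)):
    print(f"department {j}: efficiency = {ccr_efficiency(X, Y, j):.2f}")
```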

  8. Modeling Composite Assessment Data Using Item Response Theory

    Science.gov (United States)

    Ueckert, Sebastian

    2018-01-01

    Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119

  9. Introduction to the special section on mixture modeling in personality assessment.

    Science.gov (United States)

    Wright, Aidan G C; Hallquist, Michael N

    2014-01-01

    Latent variable models offer a conceptual and statistical framework for evaluating the underlying structure of psychological constructs, including personality and psychopathology. Complex structures that combine or compare categorical and dimensional latent variables can be accommodated using mixture modeling approaches, which provide a powerful framework for testing nuanced theories about psychological structure. This special series includes introductory primers on cross-sectional and longitudinal mixture modeling, in addition to empirical examples applying these techniques to real-world data collected in clinical settings. This group of articles is designed to introduce personality assessment scientists and practitioners to a general latent variable framework that we hope will stimulate new research and application of mixture models to the assessment of personality and its pathology.

  10. A Third-Order Item Response Theory Model for Modeling the Effects of Domains and Subdomains in Large-Scale Educational Assessment Surveys

    Science.gov (United States)

    Rijmen, Frank; Jeon, Minjeong; von Davier, Matthias; Rabe-Hesketh, Sophia

    2014-01-01

    Second-order item response theory models have been used for assessments consisting of several domains, such as content areas. We extend the second-order model to a third-order model for assessments that include subdomains nested in domains. Using a graphical model framework, it is shown how the model does not suffer from the curse of…

  11. Multi-model approach to assess the impact of climate change on runoff

    Science.gov (United States)

    Dams, J.; Nossent, J.; Senbeta, T. B.; Willems, P.; Batelaan, O.

    2015-10-01

    The assessment of climate change impacts on hydrology is subject to uncertainties related to the climate change scenarios, stochastic uncertainties of the hydrological model and structural uncertainties of the hydrological model. This paper focuses on the contribution of structural uncertainty of hydrological models to the overall uncertainty of the climate change impact assessment. To quantify the structural uncertainty of hydrological models, four physically based hydrological models (SWAT, PRMS and a semi- and fully distributed version of the WetSpa model) are set up for a catchment in Belgium. Each model is calibrated using four different objective functions. Three climate change scenarios with a high, mean and low hydrological impact are statistically perturbed from a large ensemble of climate change scenarios and are used to force the hydrological models. This methodology allows assessing and comparing the uncertainty introduced by the climate change scenarios with the uncertainty introduced by the hydrological model structure. Results show that the hydrological model structure introduces a large uncertainty on both the average monthly discharge and the extreme peak and low flow predictions under the climate change scenarios. For the low impact climate change scenario, the uncertainty range of the mean monthly runoff is comparable to the range of these runoff values in the reference period. However, for the mean and high impact scenarios, this range is significantly larger. The uncertainty introduced by the climate change scenarios is larger than the uncertainty due to the hydrological model structure for the low and mean hydrological impact scenarios, but the reverse is true for the high impact climate change scenario. The mean and high impact scenarios project increasing peak discharges, while the low impact scenario projects increasing peak discharges only for peak events with return periods larger than 1.6 years. All models suggest for all scenarios a

  12. Addressing dependability by applying an approach for model-based risk assessment

    International Nuclear Information System (INIS)

    Gran, Bjorn Axel; Fredriksen, Rune; Thunem, Atoosa P.-J.

    2007-01-01

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how methodology can be applied as a part of a trust case development

  13. Addressing dependability by applying an approach for model-based risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

    2007-11-15

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how methodology can be applied as a part of a trust case development.

  14. Industrial process system assessment: bridging process engineering and life cycle assessment through multiscale modeling.

    Science.gov (United States)

    The Industrial Process System Assessment (IPSA) methodology is a multiple step allocation approach for connecting information from the production line level up to the facility level and vice versa using a multiscale model of process systems. The allocation procedure assigns inpu...

  15. Model assessment using a multi-metric ranking technique

    Science.gov (United States)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation will require advanced validation metrics and identifying adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's Tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as root mean square difference and rank correlation, were also explored but removed when the information was found to be generally duplicative of other metrics. While equal weights are applied, weights could be altered depending on preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts instead of distance, along-track, and cross-track is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but found useful in an independent context, and will be briefly reported.
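
    A minimal sketch of the weighted tally-and-consolidation idea is given below: each candidate model is scored on several validation metrics, ranked per metric, and the ranks are combined with (here equal) weights. The metric subset, weights and synthetic data are illustrative assumptions, not the study's configuration.

```python
# Minimal sketch of a weighted multi-metric ranking of competing models against
# observations. The paper's fuller metric list (outlier metrics, reliability
# index, model efficiency, ...) could be added in the same way.
import numpy as np
from scipy.stats import pearsonr, kendalltau

rng = np.random.default_rng(1)
obs = rng.normal(20.0, 5.0, size=200)
models = {
    "model_A": obs + rng.normal(0.0, 1.5, size=200),
    "model_B": obs + rng.normal(1.0, 1.0, size=200),   # biased but precise
    "model_C": obs + rng.normal(0.0, 3.0, size=200),
}

def metric_scores(sim, obs):
    err = sim - obs
    return {                                            # all: lower is better
        "abs_bias": abs(err.mean()),
        "mae": np.abs(err).mean(),
        "rmse": np.sqrt((err ** 2).mean()),
        "one_minus_r": 1.0 - pearsonr(sim, obs)[0],
        "one_minus_tau": 1.0 - kendalltau(sim, obs)[0],
    }

weights = {"abs_bias": 1.0, "mae": 1.0, "rmse": 1.0,
           "one_minus_r": 1.0, "one_minus_tau": 1.0}    # equal weights

scores = {name: metric_scores(sim, obs) for name, sim in models.items()}

# Rank models on each metric (1 = best), then consolidate with a weighted tally.
tally = {name: 0.0 for name in models}
for metric, w in weights.items():
    ordered = sorted(models, key=lambda n: scores[n][metric])
    for rank, name in enumerate(ordered, start=1):
        tally[name] += w * rank

for name in sorted(tally, key=tally.get):
    print(f"{name}: weighted rank tally = {tally[name]:.1f}")
```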

  16. Utility of Social Modeling for Proliferation Assessment - Preliminary Findings

    International Nuclear Information System (INIS)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-01-01

    Often the methodologies for assessing proliferation risk are focused around the inherent vulnerability of nuclear energy systems and associated safeguards. For example, an accepted approach involves ways to measure the intrinsic and extrinsic barriers to potential proliferation. This paper describes a preliminary investigation into the non-traditional use of social and cultural information to improve proliferation assessment and advance the approach to assessing nuclear material diversion. Proliferation resistance assessment, safeguard assessments and related studies typically create technical information about the vulnerability of a nuclear energy system to diversion of nuclear material. The purpose of this research project is to find ways to integrate social information with technical information by explicitly considering the role of culture, groups and/or individuals in factors that impact the possibility of proliferation. When final, this work is expected to describe and demonstrate the utility of social science modeling in proliferation and proliferation risk assessments.

  17. Integrated assessment models of climate change. An incomplete overview

    International Nuclear Information System (INIS)

    Dowlatabadi, H.

    1995-01-01

    Integrated assessment is a trendy phrase that has recently entered the vocabulary of folks in Washington, DC and elsewhere. The novelty of the term in policy analysis and policy-making circles belies the longevity of this approach in the sciences and past attempts at its application to policy issues. This paper is an attempt at providing an overview of integrated assessment with a special focus on policy-motivated integrated assessments of climate change. The first section provides an introduction to integrated assessments in general, followed by a discussion of the bounds to the climate change issue. The next section is devoted to a taxonomy of the policy-motivated models. Then the integrated assessment effort at Carnegie Mellon is described briefly. A perspective on the challenges ahead in successful representation of natural and social dynamics in integrated assessments of global climate change is presented in the final section. (Author)

  18. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    Science.gov (United States)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches treats the impact of high flows in hydrological modeling well. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, measured by the Nash-Sutcliffe efficiency, and provides the best uncertainty estimates for low, medium and entire flows compared to the standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian methods.
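
    The sketch below shows the Metropolis-Hastings machinery referred to above in its simplest form, applied to a toy recession model with a plain Normal error model standing in for the paper's AR(1)-based likelihoods. The model, prior and tuning values are illustrative assumptions, not the WASMOD setup.

```python
# Minimal sketch of a Metropolis-Hastings sampler for one hydrological parameter.
# The "model" is a toy linear-reservoir recession; everything numerical is illustrative.
import numpy as np

rng = np.random.default_rng(3)

# Synthetic recession: Q_t = Q0 * exp(-k * t) + noise
t = np.arange(30.0)
k_true, q0, sigma = 0.15, 10.0, 0.3
q_obs = q0 * np.exp(-k_true * t) + rng.normal(0.0, sigma, size=t.size)

def log_posterior(k):
    if not 0.0 < k < 1.0:                 # uniform prior on (0, 1)
        return -np.inf
    resid = q_obs - q0 * np.exp(-k * t)
    return -0.5 * np.sum((resid / sigma) ** 2)   # Normal log-likelihood (up to a constant)

n_iter, step = 5000, 0.02
chain = np.empty(n_iter)
k_cur, lp_cur = 0.5, log_posterior(0.5)
for i in range(n_iter):
    k_prop = k_cur + rng.normal(0.0, step)         # symmetric random-walk proposal
    lp_prop = log_posterior(k_prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:   # accept/reject
        k_cur, lp_cur = k_prop, lp_prop
    chain[i] = k_cur

posterior = chain[1000:]                            # discard burn-in
print(f"posterior mean k = {posterior.mean():.3f}, "
      f"95% interval = ({np.quantile(posterior, 0.025):.3f}, "
      f"{np.quantile(posterior, 0.975):.3f})")
```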

  19. Development of model for assessment of radiation discharge to the environment

    International Nuclear Information System (INIS)

    Shang Zhaorong; Wu Hao; Liu Hua

    2003-01-01

    The International Atomic Energy Agency (IAEA) establishes basic and detailed requirements for protection against the risks associated with exposure to radiation and for the safety of radiation sources that may deliver such exposure, and in particular emphasizes the need to 'make an assessment of the nature, magnitude and likelihood of the exposure attributed to the source'. It is clear that the assessment of the consequential radiation exposure arising from any releases of radioactive materials to the environment will have to rely on some form of model. This paper summarizes recent progress in radiation protection policy and radioecology research and outlines the basic requirements for assessment model development

  20. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  1. Modelling Tradescantia fluminensis to assess long term survival

    Directory of Open Access Journals (Sweden)

    Alex James

    2015-06-01

    Full Text Available We present a simple Poisson process model for the growth of Tradescantia fluminensis, an invasive plant species that inhibits the regeneration of native forest remnants in New Zealand. The model was parameterised with data derived from field experiments in New Zealand and then verified with independent data. The model gave good predictions, which showed that its underlying assumptions are sound. However, this simple model had less predictive power for outputs based on variance, suggesting that some assumptions were lacking. Therefore, we extended the model to include higher variability between plants, thereby improving its predictions. This high variance model suggests that control measures that promote node death at the base of the plant or restrict the main stem growth rate will be more effective than those that reduce the number of branching events. The extended model forms a good basis for assessing the efficacy of various forms of control of this weed, including the recently-released leaf-feeding tradescantia leaf beetle (Neolema ogloblini).
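
    A minimal sketch of the modelling idea is shown below: growth events per plant simulated as a Poisson process, plus a higher-variance variant in which each plant's rate is drawn from a gamma distribution, mirroring the extension to greater between-plant variability. Rates, durations and sample sizes are illustrative assumptions, not the fitted parameters from the field data.

```python
# Minimal sketch of a Poisson-process growth model with an optional
# higher-variance (gamma-mixed, negative-binomial-like) variant.
import numpy as np

rng = np.random.default_rng(7)

def simulate_plants(n_plants, days, base_rate, rate_cv=0.0):
    """Number of growth events per plant over `days`.

    rate_cv = 0 gives the simple Poisson model; rate_cv > 0 adds
    between-plant variability in the event rate.
    """
    if rate_cv == 0.0:
        rates = np.full(n_plants, base_rate)
    else:
        shape = 1.0 / rate_cv ** 2                       # gamma with mean = base_rate
        rates = rng.gamma(shape, base_rate / shape, size=n_plants)
    return rng.poisson(rates * days)

simple = simulate_plants(1000, days=60, base_rate=0.2)
high_var = simulate_plants(1000, days=60, base_rate=0.2, rate_cv=0.5)

for name, events in [("simple Poisson", simple), ("high-variance", high_var)]:
    print(f"{name}: mean events = {events.mean():.1f}, variance = {events.var():.1f}")
```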

  2. Avian collision risk models for wind energy impact assessments

    International Nuclear Information System (INIS)

    Masden, E.A.; Cook, A.S.C.P.

    2016-01-01

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, inclusion of stationary components e.g. the tower, angle of approach and uncertainty. 10 models were cited in the literature and of these, all included a probability of collision of a single bird colliding with a wind turbine during passage through the rotor swept area, and the majority included a measure of the number of birds at risk. 7 out of the 10 models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculate the probability of collision and these were used by others. 6 of the 10 models were deterministic and included the most frequently used models in the UK, with only 4 including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.

  3. Avian collision risk models for wind energy impact assessments

    Energy Technology Data Exchange (ETDEWEB)

    Masden, E.A., E-mail: elizabeth.masden@uhi.ac.uk [Environmental Research Institute, North Highland College-UHI, University of the Highlands and Islands, Ormlie Road, Thurso, Caithness KW14 7EE (United Kingdom); Cook, A.S.C.P. [British Trust for Ornithology, The Nunnery, Thetford IP24 2PU (United Kingdom)

    2016-01-15

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, inclusion of stationary components e.g. the tower, angle of approach and uncertainty. 10 models were cited in the literature and of these, all included a probability of collision of a single bird colliding with a wind turbine during passage through the rotor swept area, and the majority included a measure of the number of birds at risk. 7 out of the 10 models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculate the probability of collision and these were used by others. 6 of the 10 models were deterministic and included the most frequently used models in the UK, with only 4 including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.
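
    To make the generic structure of such models concrete, the sketch below computes a deliberately simplified flux-based collision estimate and then propagates uncertainty in the avoidance rate by Monte Carlo. It is not the Band model or any other specific published CRM, and all parameter values are illustrative assumptions.

```python
# Minimal, simplified sketch of a flux-based avian collision estimate with a
# stochastic treatment of avoidance, one of the parameters the review notes is
# rarely given an uncertainty. All values are illustrative.
import numpy as np

def expected_collisions(birds_through_rotor_per_year,
                        p_collision_per_transit,
                        avoidance_rate):
    """Annual expected collisions for one turbine."""
    return (birds_through_rotor_per_year
            * p_collision_per_transit
            * (1.0 - avoidance_rate))

# Deterministic point estimate
print("point estimate:", expected_collisions(2000, 0.08, 0.98))

# Stochastic variant: propagate uncertainty in the avoidance rate.
rng = np.random.default_rng(11)
avoidance = rng.beta(98, 2, size=10_000)          # mean ~0.98
draws = expected_collisions(2000, 0.08, avoidance)
print("median and 90% interval:",
      np.quantile(draws, [0.5, 0.05, 0.95]).round(2))
```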

  4. The development and implementation of a decision-making capacity assessment model.

    Science.gov (United States)

    Parmar, Jasneet; Brémault-Phillips, Suzette; Charles, Lesley

    2015-03-01

    Decision-making capacity assessment (DMCA) is an issue of increasing importance for older adults. Current challenges need to be explored, and potential processes and strategies considered in order to address issues of DMCA in a more coordinated manner. An iterative process was used to address issues related to DMCA. This began with recognition of challenges associated with capacity assessments (CAs) by staff at Covenant Health (CH). Review of the literature, as well as discussions with and a survey of staff at three CH sites, resulted in determination of issues related to DMCA. Development of a DMCA Model and demonstration of its feasibility followed. A process was proposed with front-end screening/problem- solving, a well-defined standard assessment, and definition of team member roles. A Capacity Assessment Care Map was formulated based on the process. Documentation was developed consisting of a Capacity Assessment Process Worksheet, Capacity Interview Worksheet, and a brochure. Interactive workshops were delivered to familiarize staff with the DMCA Model. A successful demonstration project led to implementation across all sites in the Capital Health region, and eventual provincial endorsement. Concerns identified in the survey and in the literature regarding CA were addressed through the holistic interdisciplinary approach offered by the DMCA Model.

  5. Development of good modelling practice for phsiologically based pharmacokinetic models for use in risk assessment: The first steps

    Science.gov (United States)

    The increasing use of tissue dosimetry estimated using pharmacokinetic models in chemical risk assessments in multiple countries necessitates the need to develop internationally recognized good modelling practices. These practices would facilitate sharing of models and model eva...

  6. Spatial variability and parametric uncertainty in performance assessment models

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo

    2011-01-01

    The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks' reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)

  7. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subjected to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Four different wave models are considered, and validation data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, this paper presents how the quantified uncertainties can be implemented in probabilistic reliability assessments.

  8. Determination of Wave Model Uncertainties used for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Considered are four different wave models, and validation data is collected from published scientific research. The bias, the root-mean-square error as well as the scatter index are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example it is shown how the estimated uncertainties can be implemented in probabilistic reliability assessments.
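
    The three statistics named in these two records can be computed directly; the sketch below does so for a synthetic pair of observed and modelled significant wave heights. The scatter-index definition used here (centred RMSE divided by the observed mean) is one common convention and may differ from the papers'; the data are placeholders.

```python
# Minimal sketch of bias, root-mean-square error and scatter index for
# significant wave height, using synthetic observed/modelled series.
import numpy as np

rng = np.random.default_rng(5)
hs_obs = rng.weibull(2.0, size=500) * 2.0             # observed Hs [m]
hs_mod = hs_obs * 1.05 + rng.normal(0.0, 0.25, 500)   # modelled Hs [m]

def wave_model_stats(mod, obs):
    err = mod - obs
    bias = err.mean()
    rmse = np.sqrt((err ** 2).mean())
    # Scatter index: centred RMSE normalised by the observed mean (one common definition).
    scatter_index = np.sqrt(((err - bias) ** 2).mean()) / obs.mean()
    return bias, rmse, scatter_index

bias, rmse, si = wave_model_stats(hs_mod, hs_obs)
print(f"bias = {bias:.3f} m, RMSE = {rmse:.3f} m, scatter index = {si:.3f}")
```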

  9. Bayesian Comparison of Alternative Graded Response Models for Performance Assessment Applications

    Science.gov (United States)

    Zhu, Xiaowen; Stone, Clement A.

    2012-01-01

    This study examined the relative effectiveness of Bayesian model comparison methods in selecting an appropriate graded response (GR) model for performance assessment applications. Three popular methods were considered: deviance information criterion (DIC), conditional predictive ordinate (CPO), and posterior predictive model checking (PPMC). Using…

  10. Development of a Transgenic Model to Assess Bioavailable Genotoxicity in Sediments

    National Research Council Canada - National Science Library

    1999-01-01

    This technical note describes the rationale for using transgenic animal models to assess the potential genotoxicity of sediments, the benefits that can be obtained using such models versus currently...

  11. Uncertainty analysis in agent-based modelling and consequential life cycle assessment coupled models : a critical review

    NARCIS (Netherlands)

    Baustert, P.M.; Benetto, E.

    2017-01-01

    The evolution of life cycle assessment (LCA) from a merely comparative tool for the assessment of products to a policy analysis tool proceeds by incorporating increasingly complex modelling approaches. In more recent studies of complex systems, such as the agriculture sector or mobility, agent-based

  12. An analytical model for the assessment of airline expansion strategies

    Directory of Open Access Journals (Sweden)

    Mauricio Emboaba Moreira

    2014-01-01

    Full Text Available Purpose: The purpose of this article is to develop an analytical model to assess airline expansion strategies by combining generic business strategy models with airline business models. Methodology and approach: A number of airline business models are examined, as are Porter’s (1983) five industry forces that drive competition, complemented by Nalebuff/Brandenburger’s (1996) sixth force, and the basic elements of the general environment in which the expansion process takes place. A system of points and weights is developed to create a score among the 904,736 possible combinations considered. The model’s outputs are generic expansion strategies with quantitative assessments for each specific combination of elements inputted. Originality and value: The analytical model developed is original because it combines, for the first time and explicitly, elements of the general environment, the industry environment, airline business models and the generic expansion strategy types. Besides, it creates a system of scores that may be used to drive the decision process toward the choice of a specific strategic expansion path. Research implications: The analytical model may be adapted to other industries apart from the airline industry by substituting the element “airline business model” with the corresponding elements of the other industries’ specific business models.

  13. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  14. Literature Review and Assessment of Plant and Animal Transfer Factors Used in Performance Assessment Modeling

    International Nuclear Information System (INIS)

    Robertson, David E.; Cataldo, Dominic A.; Napier, Bruce A.; Krupka, Kenneth M.; Sasser, Lyle B.

    2003-01-01

    A literature review and assessment was conducted by Pacific Northwest National Laboratory (PNNL) to update information on plant and animal radionuclide transfer factors used in performance-assessment modeling. A group of 15 radionuclides was included in this review and assessment. The review is composed of four main sections, not including the Introduction. Section 2.0 provides a review of the critically important issue of physicochemical speciation and geochemistry of the radionuclides in natural soil-water systems as it relates to the bioavailability of the radionuclides. Section 3.0 provides an updated review of the parameters of importance in the uptake of radionuclides by plants, including root uptake via the soil-groundwater system and foliar uptake due to overhead irrigation. Section 3.0 also provides a compilation of concentration ratios (CRs) for soil-to-plant uptake for the 15 selected radionuclides. Section 4.0 provides an updated review on radionuclide uptake data for animal products related to absorption, homeostatic control, approach to equilibration, chemical and physical form, diet, and age. Compiled transfer coefficients are provided for cow's milk, sheep's milk, goat's milk, beef, goat meat, pork, poultry, and eggs. Section 5.0 discusses the use of transfer coefficients in soil, plant, and animal modeling using regulatory models for evaluating radioactive waste disposal or decommissioned sites. Each section makes specific suggestions for future research in its area.

  15. Literature Review and Assessment of Plant and Animal Transfer Factors Used in Performance Assessment Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, David E.; Cataldo, Dominic A.; Napier, Bruce A.; Krupka, Kenneth M.; Sasser, Lyle B.

    2003-07-20

    A literature review and assessment was conducted by Pacific Northwest National Laboratory (PNNL) to update information on plant and animal radionuclide transfer factors used in performance-assessment modeling. A group of 15 radionuclides was included in this review and assessment. The review is composed of four main sections, not including the Introduction. Section 2.0 provides a review of the critically important issue of physicochemical speciation and geochemistry of the radionuclides in natural soil-water systems as it relates to the bioavailability of the radionuclides. Section 3.0 provides an updated review of the parameters of importance in the uptake of radionuclides by plants, including root uptake via the soil-groundwater system and foliar uptake due to overhead irrigation. Section 3.0 also provides a compilation of concentration ratios (CRs) for soil-to-plant uptake for the 15 selected radionuclides. Section 4.0 provides an updated review on radionuclide uptake data for animal products related to absorption, homeostatic control, approach to equilibration, chemical and physical form, diet, and age. Compiled transfer coefficients are provided for cow’s milk, sheep’s milk, goat’s milk, beef, goat meat, pork, poultry, and eggs. Section 5.0 discusses the use of transfer coefficients in soil, plant, and animal modeling using regulatory models for evaluating radioactive waste disposal or decommissioned sites. Each section makes specific suggestions for future research in its area.

  16. Development and Exemplification of a Model for Teacher Assessment in Primary Science

    Science.gov (United States)

    Davies, D. J.; Earle, S.; McMahon, K.; Howe, A.; Collier, C.

    2017-01-01

    The Teacher Assessment in Primary Science project is funded by the Primary Science Teaching Trust and based at Bath Spa University. The study aims to develop a whole-school model of valid, reliable and manageable teacher assessment to inform practice and make a positive impact on primary-aged children's learning in science. The model is based on a…

  17. EASETECH – A LCA model for assessment of environmental technologies

    DEFF Research Database (Denmark)

    Damgaard, Anders; Baumeister, Hubert; Astrup, Thomas Fruergaard

    2014-01-01

    EASETECH is a new model for the environmental assessment of environmental technologies developed in collaboration between DTU Environment and DTU Compute. EASETECH is based on experience gained in the field of waste management modelling over the last decade and applies the same concepts to systems...

  18. Development of Web-Based Formative Assessment Model to Enhance Physics Concepts of Students

    Directory of Open Access Journals (Sweden)

    Ediyanto Ediyanto

    2015-03-01

    Full Text Available Development of a Web-Based Formative Assessment Model to Enhance Students' Understanding of Physics Concepts. Abstract: There are two approaches to learning assessment, formative and summative. Formative assessment is well suited here because it involves students directly during the learning process and may improve their conceptual understanding. The limited time in class makes this process difficult, so the development of both online and offline formative assessment that provides responsive feedback for teachers and students is definitely needed. The goal of this research is to produce a model of web-based formative assessment for physics. The study used a research-and-development design for the formative assessment model. A questionnaire was used for product validation, consisting of validation of the textbook, the pre- and post-learning quiz instruments and the web product. The result of the quantitative analysis shows that the developed product is valid without any revision. Based on qualitative data, the product revision follows comments and suggestions from expert validation, teachers and students. The product testing shows that the formative assessment model may improve students' conceptual comprehension. Key Words: formative assessment model, students' conceptual comprehension of physics, web-based. Abstract (translated from Indonesian): Assessment falls into two categories, formative and summative. Formative assessment is appropriate because the process involves students directly in learning and can improve students' conceptual understanding. Limited class time makes this process difficult to carry out, so online and offline formative assessment models that can give rapid feedback to students and teachers need to be developed. The goal of the research is to produce a web-based formative assessment model for physics learning. The research used a research-and-development design for the formative assessment model. The instrument used

  19. Indoor Air Nuclear, Biological, and Chemical Health Modeling and Assessment System

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, Robert D.; Hadley, Donald L.; Armstrong, Peter R.; Buck, John W.; Hoopes, Bonnie L.; Janus, Michael C.

    2001-03-01

    Indoor air quality effects on human health are of increasing concern to public health agencies and building owners. The prevention and treatment of 'sick building' syndrome and the spread of air-borne diseases in hospitals, for example, are well known priorities. However, increasing attention is being directed to the vulnerability of our public buildings/places, public security and national defense facilities to terrorist attack or the accidental release of air-borne biological pathogens, harmful chemicals, or radioactive contaminants. The Indoor Air Nuclear, Biological, and Chemical Health Modeling and Assessment System (IA-NBC-HMAS) was developed to serve as a health impact analysis tool for use in addressing these concerns. The overall goal was to develop a user-friendly fully functional prototype Health Modeling and Assessment system, which will operate under the PNNL FRAMES system for ease of use and to maximize its integration with other modeling and assessment capabilities accessible within the FRAMES system (e.g., ambient air fate and transport models, water borne fate and transport models, Physiologically Based Pharmacokinetic models, etc.). The prototype IA-NBC-HMAS is designed to serve as a functional Health Modeling and Assessment system that can be easily tailored to meet specific building analysis needs of a customer. The prototype system was developed and tested using an actual building (i.e., the Churchville Building located at the Aberdeen Proving Ground) and release scenario (i.e., the release and measurement of tracer materials within the building) to ensure realism and practicality in the design and development of the prototype system. A user-friendly "demo" accompanies this report to allow the reader the opportunity for a "hands on" review of the prototype system's capability.

  20. Model summary report for the safety assessment SFR 1 SAR-08

    Energy Technology Data Exchange (ETDEWEB)

    2008-03-15

    This document is the model summary report for the safety assessment SFR 1 SAR-08. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SFR1 SAR-08, a number of different computer codes are used. In order to better understand how these codes are related an Assessment Model Flowchart, AMF, has been produced within the project. From the AMF, it is possible to identify the different modelling tasks and consequently also the different computer codes used. A number of different computer codes are used in the assessment of which some are commercial while others are developed for assessment projects. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled will be different for different codes (for instance due to the fact that for some software the source-code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met

  1. Model summary report for the safety assessment SFR 1 SAR-08

    International Nuclear Information System (INIS)

    2008-03-01

    This document is the model summary report for the safety assessment SFR 1 SAR-08. In the report, the quality assurance measures conducted for the assessment codes are presented together with the chosen methodology. In the safety assessment SFR1 SAR-08, a number of different computer codes are used. In order to better understand how these codes are related an Assessment Model Flowchart, AMF, has been produced within the project. From the AMF, it is possible to identify the different modelling tasks and consequently also the different computer codes used. A number of different computer codes are used in the assessment of which some are commercial while others are developed for assessment projects. QA requirements must on the one hand take this diversity into account and on the other hand be well defined. In the methodology section of the report the following requirements are defined: - It must be demonstrated that the code is suitable for its purpose. - It must be demonstrated that the code has been properly used. - It must be demonstrated that the code development process has followed appropriate procedures and that the code produces accurate results. Although the requirements are identical for all codes, the measures used to show that the requirements are fulfilled will be different for different codes (for instance due to the fact that for some software the source-code is not available for review). Subsequent to the methodology section, each assessment code is presented and it is shown how the requirements are met

  2. Indoor Air Quality Building Education and Assessment Model Forms

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  3. An integrated model for the Environmental Impact Assessment of Highways in China

    DEFF Research Database (Denmark)

    Cai, Hao

    2011-01-01

    In China, environmental issues caused by the construction and operation of highways attract more and more attention, and thus environmental impact assessment of highways has become an important part of feasibility studies. According to the Specifications for Environmental Impact Assessment of Highways in the People's Republic of China, this paper proposes an integrated model for Environmental Impact Assessment (EIA) of highways. The model has two main characteristics. Firstly, the whole highway is divided into several sections, and a weight is then assigned to each section according to its importance. Secondly...

  4. Ecological models for regulatory risk assessments of pesticides: Developing a strategy for the future.

    NARCIS (Netherlands)

    Thorbek, P.; Forbes, V.; Heimbach, F.; Hommen, U.; Thulke, H.H.; Brink, van den P.J.

    2010-01-01

    Ecological Models for Regulatory Risk Assessments of Pesticides: Developing a Strategy for the Future provides a coherent, science-based view on ecological modeling for regulatory risk assessments. It discusses the benefits of modeling in the context of registrations, identifies the obstacles that

  5. Multi-scale, multi-model assessment of projected land allocation

    Science.gov (United States)

    Vernon, C. R.; Huang, M.; Chen, M.; Calvin, K. V.; Le Page, Y.; Kraucunas, I.

    2017-12-01

    Effects of land use and land cover change (LULCC) on climate are generally classified into two scale-dependent processes: biophysical and biogeochemical. An extensive amount of research has been conducted related to the impact of each process under alternative climate change futures. However, these studies are generally focused on the impacts of a single process and fail to bridge the gap between sector-driven scale dependencies and any associated dynamics. Studies have been conducted to better understand the relationship of these processes but their respective scale has not adequately captured overall interdependencies between land surface changes and changes in other human-earth systems (e.g., energy, water, economic, etc.). There has also been considerable uncertainty surrounding land use land cover downscaling approaches due to scale dependencies. Demeter, a land use land cover downscaling and change detection model, was created to address this science gap. Demeter is an open-source model written in Python that downscales zonal land allocation projections to the gridded resolution of a user-selected spatial base layer (e.g., MODIS, NLCD, EIA CCI, etc.). Demeter was designed to be fully extensible to allow for module inheritance and replacement for custom research needs, such as flexible IO design to facilitate the coupling of Earth system models (e.g., the Accelerated Climate Modeling for Energy (ACME) and the Community Earth System Model (CESM)) to integrated assessment models (e.g., the Global Change Assessment Model (GCAM)). In this study, we first assessed the sensitivity of downscaled LULCC scenarios at multiple resolutions from Demeter to its parameters by comparing them to historical LULC change data. "Optimal" values of key parameters for each region were identified and used to downscale GCAM-based future scenarios consistent with those in the Land Use Model Intercomparison Project (LUMIP). Demeter-downscaled land use scenarios were then compared to the

  6. Risk assessment of flood disaster and forewarning model at different spatial-temporal scales

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Xu, Jinchao; Guo, Qizhong; Hang, Qingfeng; Chen, Yaqian

    2018-05-01

    Aiming at reducing losses from flood disasters, a risk assessment and forewarning model for flood disaster is studied. The model is built upon risk indices of the flood disaster system, proceeding from the whole structure and its parts at different spatial-temporal scales. On the one hand, the study establishes a long-term forewarning model for the surface area with three levels: prediction, evaluation, and forewarning. A structure-adaptive back-propagation neural network with peak identification is used to simulate the indices in the prediction sub-model. Set pair analysis is employed to calculate the connection degrees of a single index, a comprehensive index, and the systematic risk through the multivariate connection number, and the comprehensive assessment is made with assessment matrices in the evaluation sub-model. In the forewarning sub-model, a comparison-judging method is adopted to assign the warning degree of flood disaster by comparing the comprehensive risk assessment index with forewarning standards, and the long-term local conditions are then used to propose planning schemes. On the other hand, the study sets up a real-time forewarning model for the spot, which introduces a real-time Kalman-filter correction technique based on a hydrological model with a forewarning index, and the real-time local conditions are then used to present an emergency plan. The study takes the Tunxi area, Huangshan City, China, as an example. After establishing the risk assessment and forewarning model and applying it to flood disaster at different spatial-temporal scales with actual and simulated data from 1989 to 2008, the forewarning results show that the flood disaster risk trend declines on the whole from 2009 to 2013, despite a rise in 2011. At the macroscopic level, project and non-project measures are proposed, while at the microscopic level, the time, place, and method are listed. This suggests that the proposed model is feasible in both theory and application, thus
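
    The sketch below illustrates only the set pair analysis bookkeeping mentioned above: each index contributes identity/discrepancy/contrary shares (a, b, c) against a grade band, and a weighted connection degree mu = a + b*i + c*j is formed with j = -1. The piecewise assignment, index values, grade band and weights are simplified illustrative assumptions, not the paper's formulation.

```python
# Minimal sketch of a set-pair-analysis-style connection degree for a small set
# of normalised flood risk indices. Everything numerical is illustrative.
import numpy as np

def connection_degree(value, low, high):
    """Return (a, b, c): identity / discrepancy / contrary shares for one index
    against a grade band [low, high]. A crude piecewise-linear assignment,
    intended only to illustrate the bookkeeping."""
    if low <= value <= high:
        return 1.0, 0.0, 0.0
    width = high - low
    d = min(abs(value - low), abs(value - high))
    if d <= width:                       # near the band: partly discrepant
        b = 1.0 - d / width
        return 0.0, b, 1.0 - b
    return 0.0, 0.0, 1.0                 # far from the band: contrary

indices = {"rainfall_anomaly": 1.3, "river_stage": 0.7, "drainage_capacity": 0.4}
grade_band = (0.0, 1.0)                  # "low risk" band for normalised indices
weights = {"rainfall_anomaly": 0.5, "river_stage": 0.3, "drainage_capacity": 0.2}

a = b = c = 0.0
for name, x in indices.items():
    ai, bi, ci = connection_degree(x, *grade_band)
    a += weights[name] * ai
    b += weights[name] * bi
    c += weights[name] * ci

i_coeff, j_coeff = 0.0, -1.0             # neutral discrepancy coefficient, j fixed at -1
mu = a + b * i_coeff + c * j_coeff
print(f"a={a:.2f}, b={b:.2f}, c={c:.2f}, comprehensive connection degree mu={mu:.2f}")
```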

  7. An integrated model for the assessment of global water resources – Part 2: Applications and assessments

    Directory of Open Access Journals (Sweden)

    N. Hanasaki

    2008-07-01

    Full Text Available To assess global water resources from the perspective of subannual variation in water availability and water use, an integrated water resources model was developed. In a companion report, we presented the global meteorological forcing input used to drive the model and six modules, namely, the land surface hydrology module, the river routing module, the crop growth module, the reservoir operation module, the environmental flow requirement module, and the anthropogenic withdrawal module. Here, we present the results of the model application and global water resources assessments. First, the timing and volume of simulated agriculture water use were examined because agricultural use composes approximately 85% of total consumptive water withdrawal in the world. The estimated crop calendar showed good agreement with earlier reports for wheat, maize, and rice in major countries of production. In major countries, the error in the planting date was ±1 mo, but there were some exceptional cases. The estimated irrigation water withdrawal also showed fair agreement with country statistics, but tended to be underestimated in countries in the Asian monsoon region. The results indicate the validity of the model and the input meteorological forcing because site-specific parameter tuning was not used in the series of simulations. Finally, global water resources were assessed on a subannual basis using a newly devised index. This index located water-stressed regions that were undetected in earlier studies. These regions, which are indicated by a gap in the subannual distribution of water availability and water use, include the Sahel, the Asian monsoon region, and southern Africa. The simulation results show that the reservoir operations of major reservoirs (>1 km3 and the allocation of environmental flow requirements can alter the population under high water stress by approximately −11% to +5% globally. The integrated model is applicable to

  8. Sustainable BECCS pathways evaluated by an integrated assessment model

    Science.gov (United States)

    Kato, E.

    2017-12-01

    Negative emissions technologies, particularly Bioenergy with Carbon Capture and Storage (BECCS), are key components of mitigation strategies in ambitious future socioeconomic scenarios analysed by integrated assessment models. Generally, scenarios aiming to keep mean global temperature rise below 2°C above pre-industrial levels would require net negative carbon emissions by the end of the 21st century. Also, in the context of the Paris agreement, which acknowledges "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of this century", RD&D for negative emissions technologies in this decade plays a crucial role in enabling early deployment of the technology. Because potentially extensive use of land and water is required to produce the bioenergy feedstock needed to reach the anticipated level of gross negative emissions, research on how to develop sustainable BECCS scenarios is needed. Here, we present BECCS deployment scenarios that consider an economically viable bioenergy flow, including power generation and conversion processes to liquid and gaseous fuels for transportation and heat, with consideration of sustainable global biomass use. In the modelling process, detailed bioenergy representations, i.e. various feedstock and conversion technologies with and without CCS, are implemented in the integrated assessment (IA) model GRAPE (Global Relationship Assessment to Protect the Environment). Also, to overcome a general discrepancy about assumed future agricultural yield between 'top-down' IA models and 'bottom-up' estimates, which would crucially affect the land-use pattern, we applied yield changes of food and energy crops consistent with process-based biophysical crop models in consideration of changing climate conditions. Using this framework, economically viable strategies for implementing sustainable bioenergy and BECCS flows are evaluated in scenarios targeting to keep global average

  9. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.

  10. Assessment of Large Transport Infrastructure Projects: the CBA-DK model

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Banister, David

    2008-01-01

    The scope of this paper is to present a newly developed decision support model to assess transport infrastructure projects: CBA-DK. The model makes use of conventional cost-benefit analysis resulting in aggregated single point estimates and quantitative risk analysis using Monte Carlo simulation resulting in interval results. The embedded uncertainties within traditional CBA such as ex-ante based investment costs and travel time savings are of particular concern. The methodological approach has been to apply suitable probability distribution functions on the uncertain parameters, thus resulting in feasibility risk assessment moving from point to interval results. Decision support as illustrated in this paper aims to provide assistance in the development and ultimately the choice of action while accounting for the uncertainties surrounding transport appraisal schemes. The modelling framework…
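
    A minimal sketch of the point-to-interval step described above is given below: probability distributions are placed on the uncertain appraisal inputs and the benefit-cost ratio is simulated by Monte Carlo. The distributions and figures are illustrative assumptions, not CBA-DK's actual parameterisation.

```python
# Minimal sketch of Monte Carlo feasibility risk assessment on top of a CBA:
# uncertain costs and benefits become distributions, the BCR becomes an interval.
import numpy as np

rng = np.random.default_rng(13)
n = 100_000

# Uncertain inputs (million EUR, present values); right-skewed costs mimic optimism bias.
investment_cost = rng.triangular(left=90, mode=100, right=140, size=n)
time_savings    = rng.normal(loc=85, scale=15, size=n)
other_benefits  = rng.normal(loc=40, scale=5, size=n)

bcr = (time_savings + other_benefits) / investment_cost

print(f"point estimate BCR (modal values): {(85 + 40) / 100:.2f}")
print(f"mean BCR: {bcr.mean():.2f}")
print(f"90% interval: ({np.quantile(bcr, 0.05):.2f}, {np.quantile(bcr, 0.95):.2f})")
print(f"P(BCR < 1): {(bcr < 1).mean():.1%}")
```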

  11. Radionuclide release rates from spent fuel for performance assessment modeling

    International Nuclear Information System (INIS)

    Curtis, D.B.

    1994-01-01

    In a scenario of aqueous transport from a high-level radioactive waste repository, the concentration of radionuclides in water in contact with the waste constitutes the source term for transport models, and as such represents a fundamental component of all performance assessment models. Many laboratory experiments have been done to characterize release rates and understand processes influencing radionuclide release rates from irradiated nuclear fuel. Natural analogues of these waste forms have been studied to obtain information regarding the long-term stability of potential waste forms in complex natural systems. This information from diverse sources must be brought together to develop and defend methods used to define source terms for performance assessment models. In this manuscript examples of measures of radionuclide release rates from spent nuclear fuel or analogues of nuclear fuel are presented. Each example represents a very different approach to obtaining a numerical measure and each has its limitations. There is no way to obtain an unambiguous measure of this or any parameter used in performance assessment codes for evaluating the effects of processes operative over many millennia. The examples are intended to suggest by example that in the absence of the ability to evaluate accuracy and precision, consistency of a broadly based set of data can be used as circumstantial evidence to defend the choice of parameters used in performance assessments

  12. Fixing the cracks in the crystal ball: A maturity model for quantitative risk assessment

    International Nuclear Information System (INIS)

    Rae, Andrew; Alexander, Rob; McDermid, John

    2014-01-01

    Quantitative risk assessment (QRA) is widely practiced in system safety, but there is insufficient evidence that QRA in general is fit for purpose. Defenders of QRA draw a distinction between poor or misused QRA and correct, appropriately used QRA, but this distinction is only useful if we have robust ways to identify the flaws in an individual QRA. In this paper we present a comprehensive maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature and in a collection of risk assessment peer reviews. We provide initial validation of the completeness and realism of the model. Our risk assessment maturity model provides a way to prioritise both process development within an organisation and empirical research within the QRA community. - Highlights: • Quantitative risk assessment (QRA) is widely practiced, but there is insufficient evidence that it is fit for purpose. • A given QRA may be good, or it may not – we need systematic ways to distinguish this. • We have created a maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature. • We have provided initial validation of the completeness and realism of the model. • The maturity model can also be used to prioritise QRA research discipline-wide

  13. Can Bayesian Belief Networks help tackling conceptual model uncertainties in contaminated site risk assessment?

    DEFF Research Database (Denmark)

    Troldborg, Mads; Thomsen, Nanna Isbak; McKnight, Ursula S.

    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model. The conceptual model is a simplified representation of reality and forms the basis for the mathematical modelling of contaminant fate and transport at the site. Different conceptual models may describe the same contaminated site equally well. In many cases, conceptual model uncertainty has been shown to be one of the dominant sources of uncertainty and is therefore essential to account for when quantifying uncertainties in risk assessments. We present here a Bayesian Belief Network (BBN) approach for evaluating the uncertainty in risk assessment of groundwater contamination from contaminated sites. The approach accounts for conceptual model uncertainty by considering multiple conceptual models, each of which represents an alternative interpretation of the site...
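
    A minimal numerical sketch of the idea of weighting alternative conceptual models is given below: it simply averages a risk estimate over hypothetical conceptual site models and their prior probabilities. It is not the authors' BBN, and all model names, priors and conditional probabilities are invented.

        # Minimal sketch: averaging risk over alternative conceptual site models.
        # Model names, priors and conditional probabilities are hypothetical.
        conceptual_models = {
            "sand_aquifer_no_fractures": {"prior": 0.5, "p_exceed_criterion": 0.02},
            "fractured_clay_till":       {"prior": 0.3, "p_exceed_criterion": 0.25},
            "preferential_flow_paths":   {"prior": 0.2, "p_exceed_criterion": 0.60},
        }

        total_prior = sum(m["prior"] for m in conceptual_models.values())
        p_risk = sum(m["prior"] / total_prior * m["p_exceed_criterion"]
                     for m in conceptual_models.values())

        print(f"Probability of exceeding the groundwater quality criterion: {p_risk:.2f}")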

  14. Assessing the effect of adding interactive modeling to the geoscience curriculum

    Science.gov (United States)

    Castillo, A.; Marshall, J.; Cardenas, M.

    2013-12-01

    Technology and computer models enhance the learning experience when appropriately utilized. Moreover, learning is significantly improved when effective visualization is combined with models of processes allowing for inquiry-based problem solving. Still, hands-on experiences in real scenarios result in better contextualization of related problems compared to virtual laboratories. Therefore, the role of scientific visualization, technology, and computer modeling is to enhance, not displace, the learning experience by supplementing real-world problem solving and experiences, although in some circumstances, they can adequately serve to take the place of reality. The key to improving scientific education is to embrace an inquiry-based approach that favorably uses technology. This study will attempt to evaluate the effect of adding interactive modeling to the geological sciences curriculum. An assessment tool, designed to assess student understanding of physical hydrology, was used to evaluate a curriculum intervention based on student learning with a data- and modeling-driven approach using COMSOL Multiphysics software. This intervention was implemented in an upper division and graduate physical hydrology course in fall 2012. Students enrolled in the course in fall 2011 served as the control group. Interactive modeling was added to the curriculum in fall 2012 to replace the analogous mathematical modeling done by hand in fall 2011. Pre- and post-test results were used to assess and report its effectiveness. Student interviews were also used to probe student reactions to both the experimental and control curricula. The pre- and post-tests asked students to describe the significant processes in the hydrological cycle and describe the laws governing these processes. Their ability to apply their knowledge in a real-world problem was also assessed. Since the pre- and post-test data failed to meet the assumption of normality, a non-parametric Kruskal-Wallis test was run to
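
    The statistical comparison mentioned above can be illustrated with a short, hedged example: scipy's Kruskal-Wallis test applied to invented normalized gain scores for the control (fall 2011) and intervention (fall 2012) cohorts. The data are fabricated and only show the mechanics of the test, not the study's results.

        # Hedged illustration of a non-parametric Kruskal-Wallis comparison;
        # the gain scores below are made up.
        from scipy.stats import kruskal

        control_gains = [0.10, 0.05, 0.20, 0.15, 0.00, 0.12, 0.08, 0.18]        # fall 2011 (hypothetical)
        intervention_gains = [0.25, 0.30, 0.15, 0.40, 0.22, 0.35, 0.28, 0.19]   # fall 2012 (hypothetical)

        h_stat, p_value = kruskal(control_gains, intervention_gains)
        print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
        # A small p-value would indicate that the gain distributions differ between cohorts.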

  15. Mechanistic effect modeling for ecological risk assessment: where to go from here?

    Science.gov (United States)

    Grimm, Volker; Martin, Benjamin T

    2013-07-01

    Mechanistic effect models (MEMs) consider the mechanisms of how chemicals affect individuals and ecological systems such as populations and communities. There is an increasing awareness that MEMs have high potential to make risk assessment of chemicals more ecologically relevant than current standard practice. Here we discuss what kinds of MEMs are needed to improve scientific and regulatory aspects of risk assessment. To make valid predictions for a wide range of environmental conditions, MEMs need to include a sufficient amount of emergence, for example, population dynamics emerging from what individual organisms do. We present 1 example where the life cycle of individuals is described using Dynamic Energy Budget theory. The resulting individual-based population model is thus parameterized at the individual level but correctly predicts multiple patterns at the population level. This is the case for both control and treated populations. We conclude that the state-of-the-art in mechanistic effect modeling has reached a level where MEMs are robust and predictive enough to be used in regulatory risk assessment. Mechanistic effect models will thus be used to advance the scientific basis of current standard practice and will, if their development follows Good Modeling Practice, be included in a standardized way in future regulatory risk assessments. Copyright © 2013 SETAC.
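
    To make the idea of population dynamics emerging from individual-level energy budgets concrete, here is a toy individual-based sketch. It is not the authors' Dynamic Energy Budget model; the reserve dynamics, food supply and thresholds are invented.

        # Toy individual-based model: density-dependent population dynamics emerge
        # from a crude per-individual energy budget. All rates are hypothetical.
        class Individual:
            def __init__(self):
                self.reserve = 1.0   # scaled energy reserve

            def step(self, food_intake):
                self.reserve += food_intake - 0.3        # assimilation minus maintenance
                if self.reserve > 2.0:                   # enough reserve to reproduce
                    self.reserve -= 1.0
                    return Individual()
                return None

        population = [Individual() for _ in range(10)]
        for t in range(20):
            food_per_capita = min(50.0 / max(len(population), 1), 1.0)   # shared food supply
            offspring = [child for ind in population if (child := ind.step(food_per_capita))]
            population = [ind for ind in population if ind.reserve > 0.0] + offspring
            print(t, len(population))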

  16. Assessing model-based reasoning using evidence-centered design a suite of research-based design patterns

    CERN Document Server

    Mislevy, Robert J; Riconscente, Michelle; Wise Rutstein, Daisy; Ziker, Cindy

    2017-01-01

    This Springer Brief provides theory, practical guidance, and support tools to help designers create complex, valid assessment tasks for hard-to-measure, yet crucial, science education standards. Understanding, exploring, and interacting with the world through models characterizes science in all its branches and at all levels of education. Model-based reasoning is central to science education and thus science assessment. Current interest in developing and using models has increased with the release of the Next Generation Science Standards, which identified this as one of the eight practices of science and engineering. However, the interactive, complex, and often technology-based tasks that are needed to assess model-based reasoning in its fullest forms are difficult to develop. Building on research in assessment, science education, and learning science, this Brief describes a suite of design patterns that can help assessment designers, researchers, and teachers create tasks for assessing aspects of model-based...

  17. Review and assessment of pool scrubbing models

    International Nuclear Information System (INIS)

    Herranz, L.E.; Escudero, M.J.; Peyres, V.; Polo, J.; Lopez-Jimenez, J.

    1996-01-01

    Decontamination of fission-product-bearing bubbles as they pass through aqueous pools becomes a crucial phenomenon for source term evaluation of hypothetical risk-dominant sequences of Light Water Reactors. In the present report, a peer review and assessment of the models encapsulated in the SPARC and BUSCA codes is presented. Several aspects of pool scrubbing have been addressed: particle removal, fission product vapour retention and bubble hydrodynamics. Particular emphasis has been given to the close link between retention and hydrodynamics, from both the modelling and experimental points of view. In addition, RHR and SGTR sequences were simulated with the SPARC90 and BUSCA-AUG92 codes, and their results were compared with those obtained with MAAP 3.0B. As a result of this work, model capabilities and shortcomings have been assessed and some areas susceptible to further research have been identified. (Author) 73 refs

  18. DNB Mechanistic model assessment based on experimental data in narrow rectangular channel

    International Nuclear Information System (INIS)

    Zhou Lei; Yan Xiao; Huang Yanping; Xiao Zejun; Huang Shanfang

    2011-01-01

    The departure from nucleate boiling (DNB) is an important concern for the safety of a PWR. Without assessment against experimental data points, it is doubtful whether the existing models can be used in narrow rectangular channels. Based on experimental data points in narrow rectangular channels, two kinds of classical DNB models, the liquid sublayer dryout model (LSDM) and the bubble crowding model (BCM), were assessed. The results show that the BCM has a much wider application range than the LSDM. Several thermal parameters show systematic influences on the results calculated by the models. The performance of all the models deteriorates as the void fraction increases. The reason may be attributed to the geometrical differences between a circular tube and a narrow rectangular channel. (authors)

  19. Evaluation of HVS models in the application of medical image quality assessment

    Science.gov (United States)

    Zhang, L.; Cavaro-Menard, C.; Le Callet, P.

    2012-03-01

    In this study, four of the most widely used Human Visual System (HVS) models are applied to Magnetic Resonance (MR) images for a signal detection task. Their performances are evaluated against a gold standard derived from radiologists' majority decision. Task-based image quality assessment requires taking into account the specificities of human perception, for which various HVS models have been proposed. However, to our knowledge, no work has been conducted to evaluate and compare the suitability of these models with respect to the assessment of medical image quality. This pioneering study investigates the performances of different HVS models on medical images in terms of approximation to radiologist performance. We propose to score the performance of each HVS model using the AUC (Area Under the receiver operating characteristic Curve) and its variance estimate as the figure of merit. The radiologists' majority decision is used as the gold standard, so that the estimated AUC measures the distance between the HVS model and radiologist perception. To calculate the variance estimate of the AUC, we adopted the one-shot method, which is independent of the HVS model's output range. The results of this study will help to provide arguments for the use of HVS models in our future medical image quality assessment metric.
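
    As a hedged illustration of the figure of merit described above, the snippet below computes the AUC of an HVS model's detection scores against a gold standard derived from a radiologists' majority decision. The labels and scores are invented, and the one-shot variance estimate is not reproduced here.

        # AUC of hypothetical HVS-model detection scores against a hypothetical
        # radiologists' majority-decision gold standard.
        from sklearn.metrics import roc_auc_score

        gold_standard = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]                        # signal present?
        hvs_model_score = [0.9, 0.7, 0.4, 0.8, 0.3, 0.5, 0.6, 0.2, 0.85, 0.35]

        auc = roc_auc_score(gold_standard, hvs_model_score)
        print(f"AUC = {auc:.3f}")   # 1.0 = perfect agreement with the radiologists, 0.5 = chance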

  20. A Model for Predicting Student Performance on High-Stakes Assessment

    Science.gov (United States)

    Dammann, Matthew Walter

    2010-01-01

    This research study examined the use of student achievement on reading and math state assessments to predict success on the science state assessment. Multiple regression analysis was utilized to test the prediction for all students in grades 5 and 8 in a mid-Atlantic state. The prediction model developed from the analysis explored the combined…
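
    A minimal sketch of the kind of prediction model described above is shown below: an ordinary least-squares multiple regression predicting a science scale score from reading and math scale scores. The scores are fabricated, so the fitted coefficients carry no empirical meaning.

        # Multiple regression sketch with invented scale scores.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        reading = np.array([380, 400, 420, 450, 470, 500, 520, 540])
        math = np.array([390, 410, 400, 460, 480, 510, 505, 545])
        science = np.array([385, 405, 415, 455, 470, 505, 515, 540])

        X = np.column_stack([reading, math])
        model = LinearRegression().fit(X, science)

        print("coefficients:", model.coef_, "intercept:", model.intercept_)
        print("R^2:", model.score(X, science))
        print("predicted science score:", model.predict([[430, 440]])[0])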

  1. Assessment and improvement of condensation model in RELAP5/MOD3

    Energy Technology Data Exchange (ETDEWEB)

    Rho, Hui Cheon; Choi, Kee Yong; Park, Hyeon Sik; Kim, Sang Jae [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of); Lee, Sang Il [Korea Power Engineering Co., Inc., Seoul (Korea, Republic of)

    1997-07-15

    The objective of this research is to remove the uncertainty of the condensation model through the assessment and improvement of the various heat transfer correlations used in the RELAP5/MOD3 code. The condensation model of the standard RELAP5/MOD3 code is systematically arranged and analyzed. A condensation heat transfer database is constructed from the previous experimental data on various condensation phenomena. Based on the constructed database, the condensation models in the code are assessed and improved. An experiment on the reflux condensation in a tube of steam generator in the presence of noncondensable gases is planned to acquire the experimental data.

  2. Impact of Model Detail of Synchronous Machines on Real-time Transient Stability Assessment

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel; Jóhannsson, Hjörtur; Østergaard, Jacob

    2013-01-01

    In this paper, it is investigated how detailed the model of a synchronous machine needs to be in order to assess transient stability using a Single Machine Equivalent (SIME). The results show how the stability mechanism and the stability assessment are affected by the model detail. In order to investigate this, the level of detail of the machine models is varied. Analyses of the results suggest that a 4th-order model may be sufficient to represent synchronous machines in transient stability studies.

  3. Assessment of PWR Steam Generator modelling in RELAP5/MOD2

    International Nuclear Information System (INIS)

    Putney, J.M.; Preece, R.J.

    1993-06-01

    An assessment of Steam Generator (SG) modelling in the PWR thermal-hydraulic code RELAP5/MOD2 is presented. The assessment is based on a review of code assessment calculations performed in the UK and elsewhere, detailed calculations against a series of commissioning tests carried out on the Wolf Creek PWR and analytical investigations of the phenomena involved in normal and abnormal SG operation. A number of modelling deficiencies are identified and their implications for PWR safety analysis are discussed -- including methods for compensating for the deficiencies through changes to the input deck. Consideration is also given as to whether the deficiencies will still be present in the successor code RELAP5/MOD3

  4. Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments

    Science.gov (United States)

    Manning, Ted A.; Lawrence, Scott L.

    2014-01-01

    As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.

  5. A new assessment model and tool for pediatric nurse practitioners.

    Science.gov (United States)

    Burns, C

    1992-01-01

    This article presents a comprehensive assessment model for pediatric nurse practitioner (PNP) practice that integrates familiar elements of the classical medical history, Gordon's Functional Health Patterns, and developmental fields into one system. This model drives the diagnostic reasoning process toward consideration of a broad range of disease, daily living (nursing diagnosis), and developmental diagnoses, which represents PNP practice better than the medical model does.

  6. Application of the cognitive therapy model to initial crisis assessment.

    Science.gov (United States)

    Calvert, Patricia; Palmer, Christine

    2003-03-01

    This article provides a background to the development of cognitive therapy and cognitive therapeutic skills with a specific focus on the treatment of a depressive episode. It discusses the utility of cognitive therapeutic strategies to the model of crisis theory and initial crisis assessment currently used by the Community Assessment & Treatment Team of Waitemata District Health Board on the North Shore of Auckland, New Zealand. A brief background to cognitive therapy is provided, followed by a comprehensive example of the use of the Socratic questioning method in guiding collaborative assessment and treatment of suicidality by nurses during the initial crisis assessment.

  7. Assessment of health surveys: fitting a multidimensional graded response model.

    Science.gov (United States)

    Depaoli, Sarah; Tiemensma, Jitske; Felt, John M

    The multidimensional graded response model, an item response theory (IRT) model, can be used to improve the assessment of surveys, even when sample sizes are restricted. Typically, health-based survey development utilizes classical statistical techniques (e.g. reliability and factor analysis). In a review of four prominent journals within the field of Health Psychology, we found that IRT-based models were used in less than 10% of the studies examining scale development or assessment. However, implementing IRT-based methods can provide more details about individual survey items, which is useful when determining the final item content of surveys. An example using a quality of life survey for Cushing's syndrome (CushingQoL) highlights the main components for implementing the multidimensional graded response model. Patients with Cushing's syndrome (n = 397) completed the CushingQoL. Results from the multidimensional graded response model supported a 2-subscale scoring process for the survey. All items were deemed as worthy contributors to the survey. The graded response model can accommodate unidimensional or multidimensional scales, be used with relatively lower sample sizes, and is implemented in free software (example code provided in online Appendix). Use of this model can help to improve the quality of health-based scales being developed within the Health Sciences.
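
    The record above notes that example code is provided in the article's online appendix; that code is not reproduced here. Instead, the following minimal sketch shows the core of Samejima's graded response model, in which the probability of responding in category k is the difference between adjacent cumulative logistic curves. The item parameters (discrimination a and thresholds b) are hypothetical.

        # Category probabilities for one polytomous item under the graded response model.
        import numpy as np

        def grm_category_probs(theta, a, b):
            """Return P(X = k | theta) for k = 0..len(b); b must be increasing."""
            b = np.asarray(b, dtype=float)
            p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))      # P(X >= k+1 | theta)
            cumulative = np.concatenate(([1.0], p_star, [0.0]))
            return cumulative[:-1] - cumulative[1:]

        # A four-category item with hypothetical parameters.
        probs = grm_category_probs(theta=0.5, a=1.8, b=[-1.0, 0.0, 1.2])
        print(probs, probs.sum())   # the category probabilities sum to 1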

  8. The MARINA model (Model to Assess River Inputs of Nutrients to seAs)

    NARCIS (Netherlands)

    Strokal, Maryna; Kroeze, Carolien; Wang, Mengru; Bai, Zhaohai; Ma, Lin

    2016-01-01

    Chinese agriculture has been developing fast towards industrial food production systems that discharge nutrient-rich wastewater into rivers. As a result, nutrient export by rivers has been increasing, resulting in coastal water pollution. We developed a Model to Assess River Inputs of Nutrients to seAs (MARINA) ...

  9. Modelling future impacts of air pollution using the multi-scale UK Integrated Assessment Model (UKIAM).

    Science.gov (United States)

    Oxley, Tim; Dore, Anthony J; ApSimon, Helen; Hall, Jane; Kryza, Maciej

    2013-11-01

    Integrated assessment modelling has evolved to support policy development in relation to air pollutants and greenhouse gases by providing integrated simulation tools able to produce quick and realistic representations of emission scenarios and their environmental impacts without the need to re-run complex atmospheric dispersion models. The UK Integrated Assessment Model (UKIAM) has been developed to investigate strategies for reducing UK emissions by bringing together information on projected UK emissions of SO2, NOx, NH3, PM10 and PM2.5, atmospheric dispersion, criteria for protection of ecosystems, urban air quality and human health, and data on potential abatement measures to reduce emissions, which may subsequently be linked to associated analyses of costs and benefits. We describe the multi-scale model structure ranging from continental to roadside, UK emission sources, atmospheric dispersion of emissions, implementation of abatement measures, integration with European-scale modelling, and environmental impacts. The model generates outputs from a national perspective which are used to evaluate alternative strategies in relation to emissions, deposition patterns, air quality metrics and ecosystem critical load exceedance. We present a selection of scenarios in relation to the 2020 Business-As-Usual projections and identify potential further reductions beyond those currently being planned. © 2013.

  10. Modeling issues associated with production reactor safety assessment

    International Nuclear Information System (INIS)

    Stack, D.W.; Thomas, W.R.

    1990-01-01

    This paper describes several Probabilistic Safety Assessment (PSA) modeling issues that are related to the unique design and operation of the production reactors. The identification of initiating events and determination of a set of success criteria for the production reactors is of concern because of their unique design. The modeling of accident recovery must take into account the unique operation of these reactors. Finally, a more thorough search and evaluation of common-cause events is required to account for combinations of unique design features and operation that might otherwise not be included in the PSA. It is expected that most of these modeling issues also would be encountered when modeling some of the other more unique reactor and nonreactor facilities that are part of the DOE nuclear materials production complex. 9 refs., 2 figs

  11. A computational model for evaluating the effects of attention, memory, and mental models on situation assessment of nuclear power plant operators

    International Nuclear Information System (INIS)

    Lee, Hyun-Chul; Seong, Poong-Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, as failures of situation assessment may cause wrong decisions for process control and finally errors of commission in nuclear power plants. A few computational models that can be used to predict and quantify the situation awareness of operators have been suggested. However, these models do not sufficiently consider human characteristics for nuclear power plant operators. In this paper, we propose a computational model for situation assessment of nuclear power plant operators using a Bayesian network. This model incorporates human factors significantly affecting operators' situation assessment, such as attention, working memory decay, and mental model. As this proposed model provides quantitative results of situation assessment and diagnostic performance, we expect that this model can be used in the design and evaluation of human system interfaces as well as the prediction of situation awareness errors in the human reliability analysis.

  12. A computational model for evaluating the effects of attention, memory, and mental models on situation assessment of nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyun-Chul [Instrumentation and Control/Human Factors Division, Korea Atomic Energy Research Institute, 1045 Daedeok-daero, Yuseong-gu, Daejeon 305-353 (Korea, Republic of)], E-mail: leehc@kaeri.re.kr; Seong, Poong-Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)

    2009-11-15

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, as failures of situation assessment may cause wrong decisions for process control and finally errors of commission in nuclear power plants. A few computational models that can be used to predict and quantify the situation awareness of operators have been suggested. However, these models do not sufficiently consider human characteristics for nuclear power plant operators. In this paper, we propose a computational model for situation assessment of nuclear power plant operators using a Bayesian network. This model incorporates human factors significantly affecting operators' situation assessment, such as attention, working memory decay, and mental model. As this proposed model provides quantitative results of situation assessment and diagnostic performance, we expect that this model can be used in the design and evaluation of human system interfaces as well as the prediction of situation awareness errors in the human reliability analysis.
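
    A drastically simplified sketch of Bayesian situation assessment is given below: the operator's belief over plant states is updated only from the indicators that attention allows them to sample. It is not the authors' model, and every state, indicator and probability is hypothetical.

        # Belief update over hypothetical plant states from attended HSI indicators.
        prior = {"normal": 0.90, "sg_tube_rupture": 0.05, "loca": 0.05}

        # P(indicator reads 'high' | plant state), hypothetical values.
        likelihood_high = {
            "radiation_alarm":   {"normal": 0.01, "sg_tube_rupture": 0.90, "loca": 0.70},
            "pressurizer_level": {"normal": 0.05, "sg_tube_rupture": 0.30, "loca": 0.85},
        }

        def posterior(attended_readings):
            """attended_readings: indicator -> True if the operator read it as 'high'."""
            belief = dict(prior)
            for indicator, is_high in attended_readings.items():
                for state in belief:
                    p = likelihood_high[indicator][state]
                    belief[state] *= p if is_high else (1.0 - p)
            z = sum(belief.values())
            return {state: value / z for state, value in belief.items()}

        # With limited attention, only the radiation alarm is sampled.
        print(posterior({"radiation_alarm": True}))
        print(posterior({"radiation_alarm": True, "pressurizer_level": False}))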

  13. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)

  14. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  15. Multilayered Word Structure Model for Assessing Spelling of Finnish Children in Shallow Orthography

    Science.gov (United States)

    Kulju, Pirjo; Mäkinen, Marita

    2017-01-01

    This study explores Finnish children's word-level spelling by applying a linguistically based multilayered word structure model for assessing spelling performance. The model contributes to the analytical qualitative assessment approach in order to identify children's spelling performance for enhancing writing skills. The children (N = 105)…

  16. Arc-related porphyry molybdenum deposit model: Chapter D in Mineral deposit models for resource assessment

    Science.gov (United States)

    Taylor, Ryan D.; Hammarstrom, Jane M.; Piatak, Nadine M.; Seal, Robert R.

    2012-01-01

    This report provides a descriptive model for arc-related porphyry molybdenum deposits. Presented within are geological, geochemical, and mineralogical characteristics that differentiate this deposit type from porphyry copper and alkali-feldspar rhyolite-granite porphyry molybdenum deposits. The U.S. Geological Survey's effort to update existing mineral deposit models spurred this research, which is intended to supplement previously published models for this deposit type that help guide mineral-resource and mineral-environmental assessments.

  17. Computable general equilibrium models for sustainability impact assessment: Status quo and prospects

    International Nuclear Information System (INIS)

    Boehringer, Christoph; Loeschel, Andreas

    2006-01-01

    Sustainability Impact Assessment (SIA) of economic, environmental, and social effects triggered by governmental policies has become a central requirement for policy design. The three dimensions of SIA are inherently intertwined and subject to trade-offs. Quantification of trade-offs for policy decision support requires numerical models in order to assess systematically the interference of complex interacting forces that affect economic performance, environmental quality, and social conditions. This paper investigates the use of computable general equilibrium (CGE) models for measuring the impacts of policy interference on policy-relevant economic, environmental, and social (institutional) indicators. We find that operational CGE models used for energy-economy-environment (E3) analyses have a good coverage of central economic indicators. Environmental indicators such as energy-related emissions with direct links to economic activities are widely covered, whereas indicators with a complex natural science background, such as water stress or biodiversity loss, are hardly represented. Social indicators stand out for very weak coverage, mainly because they are vaguely defined or incommensurable. Our analysis identifies prospects for future modeling in the field of integrated assessment that link standard E3 CGE models to theme-specific complementary models with an environmental and social focus. (author)

  18. Natural resource damage assessment models for Great Lakes, coastal, and marine environments

    International Nuclear Information System (INIS)

    French, D.P.; Reed, M.

    1993-01-01

    A computer model of the physical fates, biological effects, and economic damages resulting from releases of oil and other hazardous materials has been developed by Applied Science Associates to be used in Type A natural resource damage assessments under the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA). Natural resource damage assessment models for great lakes environments and for coastal and marine environments will become available. A coupled geographical information system allows gridded representation of complex coastal boundaries, variable bathymetry, shoreline types, and multiple biological habitats. The physical and biological models are three dimensional. Direct mortality from toxic concentrations and oiling, impacts of habitat loss, and food web losses are included in the model. Estimation of natural resource damages is based both on the lost value of injured resources and on the costs of restoring or replacing those resources. The models are implemented on a personal computer, with a VGA graphical user interface. Following public review, the models will become a formal part of the US regulatory framework. The models are programmed in a modular and generic fashion, to facilitate transportability and application to new areas. The model has several major components. Physical fates and biological effects submodels estimate impacts or injury resulting from a spill. The hydrodynamic submodel calculates currents that transport contaminant(s) or organisms. The compensable value submodel values injuries to help assess damages. The restoration submodel determines what restoration actions will most cost-effectively reduce injuries as measured by compensable values. Injury and restoration costs are assessed for each of a series of habitats (environments) affected by the spill. Environmental, chemical, and biological databases supply required information to the model for computing fates and effects (injury)

  19. Modelling human interactions in the assessment of man-made hazards

    International Nuclear Information System (INIS)

    Nitoi, M.; Farcasiu, M.; Apostol, M.

    2016-01-01

    The human reliability assessment tools currently available are not capable of adequately modelling the human ability to adapt, to innovate and to manage under extreme situations. The paper presents the results obtained by the ICN PSA team in the frame of the FP7 Advanced Safety Assessment Methodologies: extended PSA (ASAMPSA_E) project regarding the investigation of conducting HRA for man-made hazards. The paper proposes a 4-step methodology for the assessment of human interactions in external events (definition and modelling of human interactions; quantification of human failure events; recovery analysis; review). The most relevant factors with respect to HRA for man-made hazards (response execution complexity; existence of procedures with respect to the scenario in question; time available for action; timing of cues; accessibility of equipment; harsh environmental conditions) are presented and discussed thoroughly. The challenges identified in relation to HRA for man-made hazards are highlighted. (authors)

  20. Common and Critical Components Among Community Health Assessment and Community Health Improvement Planning Models.

    Science.gov (United States)

    Pennel, Cara L; Burdine, James N; Prochaska, John D; McLeroy, Kenneth R

    Community health assessment and community health improvement planning are continuous, systematic processes for assessing and addressing health needs in a community. Since there are different models to guide assessment and planning, as well as a variety of organizations and agencies that carry out these activities, there may be confusion in choosing among approaches. By examining the various components of the different assessment and planning models, we are able to identify areas for coordination, ways to maximize collaboration, and strategies to further improve community health. We identified 11 common assessment and planning components across 18 models and requirements, with a particular focus on health department, health system, and hospital models and requirements. These common components included preplanning; developing partnerships; developing vision and scope; collecting, analyzing, and interpreting data; identifying community assets; identifying priorities; developing and implementing an intervention plan; developing and implementing an evaluation plan; communicating and receiving feedback on the assessment findings and/or the plan; planning for sustainability; and celebrating success. Within several of these components, we discuss characteristics that are critical to improving community health. Practice implications include better understanding of different models and requirements by health departments, hospitals, and others involved in assessment and planning to improve cross-sector collaboration, collective impact, and community health. In addition, federal and state policy and accreditation requirements may be revised or implemented to better facilitate assessment and planning collaboration between health departments, hospitals, and others for the purpose of improving community health.

  1. Semantic modeling of portfolio assessment in e-learning environment

    Directory of Open Access Journals (Sweden)

    Lucila Romero

    2017-01-01

    Full Text Available In a learning environment, a portfolio is used as a tool to keep track of a learner's progress. Particularly in e-learning, continuous assessment allows greater customization and efficiency in the learning process and prevents students from losing interest in their studies. Each student also has his or her own characteristics and learning skills that must be taken into account in order to keep the learner's interest, so personalized monitoring is key to guaranteeing the success of technology-based education. In this context, portfolio assessment emerges as a solution because it is an easy way to allow the teacher to organize and personalize assessment according to student characteristics and needs. A portfolio assessment can contain various types of assessment, such as formative assessment, summative assessment, and hetero- or self-assessment, and use different instruments such as multiple choice questions, conceptual maps, and essays, among others. A portfolio assessment therefore represents a compilation of all the assessments a student must solve in a course; it documents progress and sets targets. In previous work, a conceptual framework was proposed consisting of an ontology network named AOnet, which is a semantic tool conceptualizing different types of assessments. Continuing that work, this paper presents a proposal to implement portfolio assessment in e-learning environments. The proposal consists of a semantic model that describes the key components and relations of this domain to set the basis for developing a tool to generate, manage and perform portfolio assessment.

  2. Assessment of groundwater vulnerability by applying the modified DRASTIC model in Beihai City, China.

    Science.gov (United States)

    Wu, Xiaoyu; Li, Bin; Ma, Chuanming

    2018-05-01

    This study assesses the vulnerability of groundwater to pollution in Beihai City, China, in support of groundwater resource protection. The assessment result not only objectively reflects the potential for groundwater contamination but also provides a scientific basis for the planning and utilization of groundwater resources. This study optimizes the parameters of the DRASTIC model, consisting of natural and human factors, and modifies the ratings of these parameters based on the local environmental conditions of the study area. A weight for each parameter is assigned by the analytic hierarchy process (AHP) to reduce the subjectivity of the vulnerability assessment. The resulting ratings and weights of the modified DRASTIC model (AHP-DRASTLE model) contribute to a more realistic assessment of the vulnerability of groundwater to contamination. A comparison analysis validates the accuracy and rationality of the AHP-DRASTLE model and shows that it suits the particular conditions of the study area. The new assessment method (AHP-DRASTLE model) can provide a guide for other scholars assessing the vulnerability of groundwater to contamination. The final vulnerability map for the AHP-DRASTLE model shows four classes: highest (2%), high (29%), low (55%), and lowest (14%). The vulnerability map serves as a guide for decision makers on groundwater resource protection and land use planning at the regional scale and is adapted to a specific area.
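
    The weighted-index calculation at the heart of DRASTIC-type vulnerability mapping can be sketched as below. The factor names, ratings and AHP-derived weights are placeholders, not the values calibrated for Beihai City.

        # Vulnerability index for a single grid cell: sum of weight * rating.
        ahp_weights = {"depth_to_water": 0.22, "recharge": 0.18, "aquifer_media": 0.12,
                       "soil_media": 0.10, "topography": 0.06, "vadose_zone": 0.14,
                       "conductivity": 0.08, "land_use": 0.10}   # hypothetical AHP weights (sum to 1)

        cell_ratings = {"depth_to_water": 7, "recharge": 8, "aquifer_media": 6,
                        "soil_media": 5, "topography": 9, "vadose_zone": 7,
                        "conductivity": 4, "land_use": 8}        # hypothetical ratings (1-10)

        index = sum(ahp_weights[f] * cell_ratings[f] for f in ahp_weights)
        print(f"Vulnerability index for this grid cell: {index:.2f}")
        # Classifying the index of every cell (e.g. lowest/low/high/highest) yields the map.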

  3. Physicologically Based Toxicokinetic Models of Tebuconazole and Application in Human Risk Assessment

    DEFF Research Database (Denmark)

    Jonsdottir, Svava Osk; Reffstrup, Trine Klein; Petersen, Annette

    2016-01-01

    A series of physiologically based toxicokinetic (PBTK) models for tebuconazole were developed in four species: rat, rabbit, rhesus monkey, and human. The developed models were analyzed with respect to their application in higher tier human risk assessment, and the prospect of using such models in risk assessment of cumulative and aggregate exposure is discussed. Relatively simple and biologically sound models were developed using available experimental data as parameters for describing the physiology of the species, as well as the absorption, distribution, metabolism, and elimination (ADME) of tebuconazole. The developed models were validated on in vivo half-life data for rabbit with good results, and on plasma and tissue concentration-time course data of tebuconazole after i.v. administration in rabbit. In most cases, the predicted concentration levels were seen to be within...
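
    For orientation only, the snippet below shows a one-compartment toxicokinetic calculation of a concentration-time course after an i.v. dose. This is far simpler than the PBTK models described above, and the dose, volume of distribution and half-life are hypothetical.

        # One-compartment i.v. kinetics: C(t) = (dose / V) * exp(-k * t).
        import numpy as np

        dose_mg, v_distribution_l, half_life_h = 5.0, 8.0, 6.0
        k_elim = np.log(2) / half_life_h                 # first-order elimination rate

        t = np.linspace(0, 48, 13)                       # hours after i.v. administration
        concentration = (dose_mg / v_distribution_l) * np.exp(-k_elim * t)   # mg/L

        for ti, ci in zip(t, concentration):
            print(f"t = {ti:5.1f} h   C = {ci:.3f} mg/L")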

  4. Analysis of third-party certification approaches using an occupational health and safety conformity-assessment model.

    Science.gov (United States)

    Redinger, C F; Levine, S P

    1998-11-01

    The occupational health and safety conformity-assessment model presented in this article was developed (1) to analyze 22 public and private programs to determine the extent to which these programs use third parties in conformity-assessment determinations, and (2) to establish a framework to guide future policy developments related to the use of third parties in occupational health and safety conformity-assessment activities. The units of analysis for this study included select Occupational Safety and Health Administration programs and standards, International Organization for Standardization-based standards and guidelines, and standards and guidelines developed by nongovernmental bodies. The model is based on a 15-cell matrix that categorizes first-, second-, and third-party activities in terms of assessment, accreditation, and accreditation-recognition activities. The third-party component of the model has three categories: industrial hygiene/safety testing and sampling; product, equipment, and laboratory certification; and, occupational health and safety management system registration/certification. Using the model, 16 of the 22 programs were found to have a third-party component in their conformity-assessment structure. The analysis revealed that (1) the model provides a useful means to describe and analyze various third-party approaches, (2) the model needs modification to capture aspects of traditional governmental conformity-assessment/enforcement activities, and (3) several existing third-party conformity-assessment systems offer robust models that can guide future third-party policy formulation and implementation activities.

  5. Assessment and prediction of air quality using fuzzy logic and autoregressive models

    Science.gov (United States)

    Carbajal-Hernández, José Juan; Sánchez-Fernández, Luis P.; Carrasco-Ochoa, Jesús A.; Martínez-Trinidad, José Fco.

    2012-12-01

    In recent years, artificial intelligence methods have been used for the treatment of environmental problems. This work presents two models for the assessment and prediction of air quality. First, we develop a new computational model for air quality assessment in order to evaluate toxic compounds that can harm sensitive people in urban areas, affecting their normal activities. In this model we propose to use a Sigma operator to statistically assess air quality parameters using their historical data and to determine their negative impact on air quality based on toxicity limits, frequency averages and deviations of toxicological tests. We also introduce a fuzzy inference system to perform parameter classification using a reasoning process and to integrate the parameters into an air quality index describing the pollution levels in five stages: excellent, good, regular, bad and danger, respectively. The second model proposed in this work predicts air quality concentrations using an autoregressive model, providing a predicted air quality index based on the fuzzy inference system previously developed. Using data from the Mexico City Atmospheric Monitoring System, we perform a comparison among air quality indices developed by environmental agencies and similar models. Our results show that our models are an appropriate tool for assessing site pollution and for providing guidance to improve contingency actions in urban areas.
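
    The autoregressive prediction step can be illustrated with the short sketch below: an AR(2) model fitted by least squares to an invented ozone series, with the forecast mapped to a coarse quality label (the paper's model uses a fuzzy inference system for this last step). The series, coefficients and thresholds are hypothetical.

        # AR(2) forecast of the next pollutant concentration from an invented series.
        import numpy as np

        ozone = np.array([48, 55, 61, 70, 66, 59, 62, 71, 78, 74, 69, 65], dtype=float)  # ppb

        # Design matrix for x_t = c + a1 * x_{t-1} + a2 * x_{t-2}
        X = np.column_stack([np.ones(len(ozone) - 2), ozone[1:-1], ozone[:-2]])
        y = ozone[2:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        next_value = coef @ np.array([1.0, ozone[-1], ozone[-2]])
        label = "good" if next_value < 55 else "regular" if next_value < 70 else "bad"
        print(f"forecast: {next_value:.1f} ppb -> {label}")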

  6. Situating Power Potentials and Dynamics of Learners and Tutors within Self-Assessment Models

    Science.gov (United States)

    Taras, Maddalena

    2016-01-01

    Many twenty-first century educational discourses focus on including and empowering independent learners. Within the context of five self-assessment models, this article evaluates how these practices relate to the realities of student involvement, empowerment and voice. A proposed new classification of these self-assessment models is presented and…

  7. Individual-based model for radiation risk assessment

    Science.gov (United States)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions with statistical and dynamic characteristics of an organism's critical system. To calculate the dynamics of the latter, a corresponding mathematical model is used as well. This approach is applied to describe the effects of low-level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on the mortality dynamics of those in the absence of radiation, are utilized. The life span probability and life span shortening predicted by the model agree with the corresponding experimental data. Modeling results show the significance of accounting for the variability of the individual radiosensitivity of critical system cells when estimating the radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the Chernobyl catastrophe aftereffects. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined by making use of the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.

  8. Tropospheric Ozone Assessment Report: Assessment of global-scale model performance for global and regional ozone distributions, variability, and trends

    Directory of Open Access Journals (Sweden)

    P. J. Young

    2018-01-01

    Full Text Available The goal of the Tropospheric Ozone Assessment Report (TOAR) is to provide the research community with an up-to-date scientific assessment of tropospheric ozone, from the surface to the tropopause. While a suite of observations provides significant information on the spatial and temporal distribution of tropospheric ozone, observational gaps make it necessary to use global atmospheric chemistry models to synthesize our understanding of the processes and variables that control tropospheric ozone abundance and its variability. Models facilitate the interpretation of the observations and allow us to make projections of future tropospheric ozone and trace gas distributions for different anthropogenic or natural perturbations. This paper assesses the skill of current-generation global atmospheric chemistry models in simulating the observed present-day tropospheric ozone distribution, variability, and trends. Drawing upon the results of recent international multi-model intercomparisons and using a range of model evaluation techniques, we demonstrate that global chemistry models are broadly skillful in capturing the spatio-temporal variations of tropospheric ozone over the seasonal cycle, for extreme pollution episodes, and changes over interannual to decadal periods. However, models are consistently biased high in the northern hemisphere and biased low in the southern hemisphere, throughout the depth of the troposphere, and are unable to replicate particular metrics that define the longer term trends in tropospheric ozone as derived from some background sites. When the models compare unfavorably against observations, we discuss the potential causes of model biases and propose directions for future developments, including improved evaluations that may be able to better diagnose the root cause of the model-observation disparity. Overall, model results should be approached critically, including determining whether the model performance is acceptable for...

  9. Conceptual design of an integrated technology model for carbon policy assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Dimotakes, Paul E. (NASA Jet Propulsion Laboratory, Pasadena, CA)

    2011-01-01

    This report describes the conceptual design of a technology choice model for understanding strategies to reduce carbon intensity in the electricity sector. The report considers the major modeling issues affecting technology policy assessment and defines an implementable model construct. Further, the report delineates the basic causal structure of such a model and attempts to establish the technical/algorithmic viability of pursuing model development along with the associated analyses.

  10. Development and exemplification of a model for Teacher Assessment in Primary Science

    Science.gov (United States)

    Davies, D. J.; Earle, S.; McMahon, K.; Howe, A.; Collier, C.

    2017-09-01

    The Teacher Assessment in Primary Science project is funded by the Primary Science Teaching Trust and based at Bath Spa University. The study aims to develop a whole-school model of valid, reliable and manageable teacher assessment to inform practice and make a positive impact on primary-aged children's learning in science. The model is based on a data-flow 'pyramid' (analogous to the flow of energy through an ecosystem), whereby the rich formative assessment evidence gathered in the classroom is summarised for monitoring, reporting and evaluation purposes [Nuffield Foundation. (2012). Developing policy, principles and practice in primary school science assessment. London: Nuffield Foundation]. Using a design-based research (DBR) methodology, the authors worked in collaboration with teachers from project schools and other expert groups to refine, elaborate, validate and operationalise the data-flow 'pyramid' model, resulting in the development of a whole-school self-evaluation tool. In this paper, we argue that a DBR approach to theory-building and school improvement drawing upon teacher expertise has led to the identification, adaptation and successful scaling up of a promising approach to school self-evaluation in relation to assessment in science.

  11. Supplier segmentation model and multicriteria assessment for micro and small enterprise

    Directory of Open Access Journals (Sweden)

    Luiz Felipe de Oliveira Moura Santos

    2016-06-01

    Full Text Available The literature has presented many supplier segmentation models and multicriteria assessments; however, these models do not address the characteristics of micro and small enterprises (MSEs), which have scarce resources and seek management tools with a large marginal contribution. The objective of this study is to propose a supplier segmentation model and assessment to address the requirements of MSEs and to illustrate its practicability through a case study in a small furniture manufacturer. The results showed that the model's direct benefits do not represent large marginal contributions, but the indirect benefits and the structuration process developed contribute to important consensual decisions and actions for the growth of these organizations.

  12. Radioactive waste disposal assessment - overview of biosphere processes and models

    International Nuclear Information System (INIS)

    Coughtrey, P.J.

    1992-09-01

    This report provides an overview of biosphere processes and models in the general context of the radiological assessment of radioactive waste disposal as a basis for HMIP's response to biosphere aspects of Nirex's submissions for disposal of radioactive wastes in a purpose-built repository at Sellafield, Cumbria. The overview takes into account published information from the UK as available from Nirex's safety and assessment research programme and HMIP's disposal assessment programme, as well as that available from studies in the UK and elsewhere. (Author)

  13. The MESORAD dose assessment model: Computer code

    International Nuclear Information System (INIS)

    Ramsdell, J.V.; Athey, G.F.; Bander, T.J.; Scherpelz, R.I.

    1988-10-01

    MESORAD is a dose equivalent model for emergency response applications that is designed to be run on minicomputers. It has been developed by the Pacific Northwest Laboratory for use as part of the Intermediate Dose Assessment System in the US Nuclear Regulatory Commission Operations Center in Washington, DC, and the Emergency Management System in the US Department of Energy Unified Dose Assessment Center in Richland, Washington. This volume describes the MESORAD computer code and contains a listing of the code. The technical basis for MESORAD is described in the first volume of this report (Scherpelz et al. 1986). A third volume of the documentation is planned. That volume will contain utility programs and input and output files that can be used to check the implementation of MESORAD. 18 figs., 4 tabs

  14. SUMO, System performance assessment for a high-level nuclear waste repository: Mathematical models

    International Nuclear Information System (INIS)

    Eslinger, P.W.; Miley, T.B.; Engel, D.W.; Chamberlain, P.J. II.

    1992-09-01

    Following completion of the preliminary risk assessment of the potential Yucca Mountain Site by Pacific Northwest Laboratory (PNL) in 1988, the Office of Civilian Radioactive Waste Management (OCRWM) of the US Department of Energy (DOE) requested the Performance Assessment Scientific Support (PASS) Program at PNL to develop an integrated system model and computer code that provides performance and risk assessment analysis capabilities for a potential high-level nuclear waste repository. The system model that has been developed addresses the cumulative radionuclide release criteria established by the US Environmental Protection Agency (EPA) and estimates population risks in terms of dose to humans. The system model embodied in the SUMO (System Unsaturated Model) code will also allow benchmarking of other models being developed for the Yucca Mountain Project. The system model has three natural divisions: (1) source term, (2) far-field transport, and (3) dose to humans. This document gives a detailed description of the mathematics of each of these three divisions. Each of the governing equations employed is based on modeling assumptions that are widely accepted within the scientific community

  15. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches.

    Science.gov (United States)

    Walke, Russell C; Kirchner, Gerald; Xu, Shulan; Dverstorp, Björn

    2015-10-01

    Geological disposal facilities are the preferred option for high-level radioactive waste, due to their potential to provide isolation from the surface environment (biosphere) on very long timescales. Assessments need to strike a balance between stylised models and more complex approaches that draw more extensively on site-specific information. This paper explores the relative merits of complex versus more stylised biosphere models in the context of a site-specific assessment. The more complex biosphere modelling approach was developed by the Swedish Nuclear Fuel and Waste Management Co (SKB) for the Forsmark candidate site for a spent nuclear fuel repository in Sweden. SKB's approach is built on a landscape development model, whereby radionuclide releases to distinct hydrological basins/sub-catchments (termed 'objects') are represented as they evolve through land rise and climate change. Each of seventeen of these objects is represented with more than 80 site-specific parameters, of which about 22 are time-dependent, resulting in over 5000 input values per object. The more stylised biosphere models developed for this study represent releases to individual ecosystems without environmental change and include the most plausible transport processes. In the context of the regulatory review of the landscape modelling approach adopted in the SR-Site assessment in Sweden, the more stylised representation has helped to build understanding of the more complex modelling approach by providing bounding results, checking the reasonableness of the more complex modelling, highlighting uncertainties introduced through conceptual assumptions and helping to quantify the conservatisms involved. The more stylised biosphere models are also shown to be capable of reproducing the results of more complex approaches. A major recommendation is that biosphere assessments need to justify the degree of complexity in modelling approaches as well as simplifying and conservative assumptions. In light of...

  16. A comparative study of the use of different risk-assessment models in Danish municipalities

    DEFF Research Database (Denmark)

    Sørensen, Kresta Munkholt

    2018-01-01

    Risk-assessment models are widely used in casework involving vulnerable children and families. Internationally, there are a number of different kinds of models with great variation in regard to the characteristics of factors that harm children. Lists of factors have been made, but most of them give very little advice on how the factors should be weighted. This paper will address the use of risk-assessment models in six different Danish municipalities. The paper presents a comparative analysis and discussion of differences and similarities between three models: the Integrated Children’s System (ICS), the Signs of Safety (SoS) model and models developed by the municipalities themselves (MM). The analysis will answer the following two key questions: (i) to which risk and protective factors do the caseworkers give most weight in the risk assessment? and (ii) does each of the different models...

  17. Improving the use of crop models for risk assessment and climate change adaptation.

    Science.gov (United States)

    Challinor, Andrew J; Müller, Christoph; Asseng, Senthold; Deva, Chetan; Nicklin, Kathryn Jane; Wallach, Daniel; Vanuytrecht, Eline; Whitfield, Stephen; Ramirez-Villegas, Julian; Koehler, Ann-Kristin

    2018-01-01

    Crop models are used for an increasingly broad range of applications, with a commensurate proliferation of methods. Careful framing of research questions and development of targeted and appropriate methods are therefore increasingly important. In conjunction with the other authors in this special issue, we have developed a set of criteria for the use of crop models in assessments of impacts, adaptation and risk. Our analysis drew on the other papers in this special issue, and on our experience in the UK Climate Change Risk Assessment 2017 and the MACSUR, AgMIP and ISIMIP projects. The criteria were used to assess how improvements could be made to the framing of climate change risks, and to outline the good practice and new developments that are needed to improve risk assessment. Key areas of good practice include: i. the development, running and documentation of crop models, with attention given to issues of spatial scale and complexity; ii. the methods used to form crop-climate ensembles, which can be based on model skill and/or spread; iii. the methods used to assess adaptation, which need broadening to account for technological development and to reflect the full range of options available. The analysis highlights the limitations of focussing only on projections of future impacts and adaptation options using pre-determined time slices. Whilst this long-standing approach may remain an essential component of risk assessments, we identify three further key components: 1. Working with stakeholders to identify the timing of risks. What are the key vulnerabilities of food systems and what does crop-climate modelling tell us about when those systems are at risk? 2. Use of multiple methods that critically assess the use of climate model output and avoid any presumption that analyses should begin and end with gridded output. 3. Increasing transparency and inter-comparability in risk assessments. Whilst studies frequently produce ranges that quantify uncertainty, the assumptions...

  18. Behavior model for performance assessment

    International Nuclear Information System (INIS)

    Brown-VanHoozer, S. A.

    1999-01-01

    Every individual channels information differently based on the sensory modality or representational system (visual, auditory or kinesthetic) they tend to favor most (their primary representational system (PRS)). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch); this in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task--the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, and these may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  19. Behavior model for performance assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-07-23

    Every individual channels information differently based on the sensory modality or representational system (visual, auditory or kinesthetic) they tend to favor most (their primary representational system (PRS)). Therefore, some of us access and store our information primarily visually first, some auditorily, and others kinesthetically (through feel and touch); this in turn establishes our information processing patterns and strategies and our external-to-internal (and subsequently vice versa) experiential language representation. Because of the different ways we channel our information, each of us will respond differently to a task--the way we gather and process the external information (input), our response time (process), and the outcome (behavior). Traditional human models of decision making and response time focus on perception, cognitive and motor systems stimulated and influenced by the three sensory modalities: visual, auditory and kinesthetic. For us, these are the building blocks to knowing how someone is thinking. Being aware of what is taking place and how to ask questions is essential in assessing performance toward reducing human errors. Existing models give predictions based on time values or response times for a particular event, and these may be summed and averaged for a generalization of behavior(s). However, without establishing a basic understanding of how the behavior was predicated through a decision-making strategy process, predictive models are overall inefficient in their analysis of the means by which behavior was generated. What is seen is the end result.

  20. Multi-model ensembles for assessment of flood losses and associated uncertainty

    Science.gov (United States)

    Figueiredo, Rui; Schröter, Kai; Weiss-Motz, Alexander; Martina, Mario L. V.; Kreibich, Heidi

    2018-05-01

    Flood loss modelling is a crucial part of risk assessments. However, it is subject to large uncertainty that is often neglected. Most models available in the literature are deterministic, providing only single point estimates of flood loss, and large disparities tend to exist among them. Adopting any one such model in a risk assessment context is likely to lead to inaccurate loss estimates and sub-optimal decision-making. In this paper, we propose the use of multi-model ensembles to address these issues. This approach, which has been applied successfully in other scientific fields, is based on the combination of different model outputs with the aim of improving the skill and usefulness of predictions. We first propose a model rating framework to support ensemble construction, based on a probability tree of model properties, which establishes relative degrees of belief between candidate models. Using 20 flood loss models in two test cases, we then construct numerous multi-model ensembles, based both on the rating framework and on a stochastic method, differing in terms of participating members, ensemble size and model weights. We evaluate the performance of ensemble means, as well as their probabilistic skill and reliability. Our results demonstrate that well-designed multi-model ensembles represent a pragmatic approach to consistently obtain more accurate flood loss estimates and reliable probability distributions of model uncertainty.
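
    As a minimal, hedged illustration of the ensemble idea summarised above, the sketch below combines point estimates from several flood loss models into a weighted ensemble mean and a simple spread measure. The losses, weights and model count are invented placeholders, not values or methods taken from this study.

```python
# Weighted multi-model ensemble of flood loss estimates (illustrative only).
import numpy as np

losses = np.array([12.5, 18.0, 9.7, 15.2])   # one loss estimate per model (M EUR)
weights = np.array([0.4, 0.3, 0.2, 0.1])     # relative degrees of belief per model

weights = weights / weights.sum()            # normalise weights to sum to 1
ensemble_mean = np.sum(weights * losses)     # weighted ensemble mean

# Weighted spread as a crude proxy for structural (model-to-model) uncertainty.
ensemble_std = np.sqrt(np.sum(weights * (losses - ensemble_mean) ** 2))

print(f"Ensemble mean loss: {ensemble_mean:.1f} M EUR (spread {ensemble_std:.1f})")
```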

  1. Comparative evaluation of life cycle assessment models for solid waste management

    International Nuclear Information System (INIS)

    Winkler, Joerg; Bilitewski, Bernd

    2007-01-01

    This publication compares a selection of six different models developed in Europe and America by research organisations, industry associations and governmental institutions. The comparison of the models reveals the variations in the results and the differences in the conclusions of an LCA study done with these models. The models are compared by modelling a specific case - the waste management system of Dresden, Germany - with each model and by an in-detail comparison of the life cycle inventory results. Moreover, a life cycle impact assessment shows whether the LCA results of each model allow for comparable and consistent conclusions, which do not contradict the conclusions derived from the other models' results. Furthermore, the influence of different levels of detail in the life cycle inventory on the life cycle assessment is demonstrated. The model comparison revealed that the variations in the LCA results calculated by the models for this case are high and not negligible. In some cases the high variations in results lead to contradictory conclusions concerning the environmental performance of the waste management processes. The static, linear modelling approach chosen by all models analysed is inappropriate for reflecting actual conditions. Moreover, it was found that although the models' approach to LCA is comparable on a general level, the level of detail implemented in the software tools is very different

  2. WORKSHOP ON APPLICATION OF STATISTICAL METHODS TO BIOLOGICALLY-BASED PHARMACOKINETIC MODELING FOR RISK ASSESSMENT

    Science.gov (United States)

    Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...

  3. Improving treatment outcome assessment in a mouse tuberculosis model.

    Science.gov (United States)

    Mourik, Bas C; Svensson, Robin J; de Knegt, Gerjo J; Bax, Hannelore I; Verbon, Annelies; Simonsson, Ulrika S H; de Steenwinkel, Jurriaan E M

    2018-04-09

    Preclinical treatment outcome evaluation of tuberculosis (TB) occurs primarily in mice. Current designs compare relapse rates of different regimens at selected time points, but lack information about the correlation between treatment length and treatment outcome, which is required to efficiently estimate a regimen's treatment-shortening potential. Therefore we developed a new approach. BALB/c mice were infected with a Mycobacterium tuberculosis Beijing genotype strain and were treated with rifapentine-pyrazinamide-isoniazid-ethambutol (RpZHE), rifampicin-pyrazinamide-moxifloxacin-ethambutol (RZME) or rifampicin-pyrazinamide-moxifloxacin-isoniazid (RZMH). Treatment outcome was assessed in n = 3 mice after 9 different treatment lengths between 2-6 months. Next, we created a mathematical model that best fitted the observational data and used this for inter-regimen comparison. The observed data were best described by a sigmoidal Emax model, in preference to linear or conventional Emax models. Estimating regimen-specific parameters showed significantly higher curative potentials for RZME and RpZHE compared to RZMH. In conclusion, we provide a new design for treatment outcome evaluation in a mouse TB model, which (i) provides accurate tools for assessment of the relationship between treatment length and predicted cure, (ii) allows for efficient comparison between regimens and (iii) adheres to the reduction and refinement principles of laboratory animal use.
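
    The sigmoidal Emax relationship between treatment length and cure referred to above can be sketched as a simple curve fit. The observations, starting values and bounds below are hypothetical, and SciPy's curve_fit merely stands in for whatever estimation procedure the authors actually used.

```python
# Sigmoidal Emax model relating treatment length to fraction cured (illustrative).
import numpy as np
from scipy.optimize import curve_fit

def sigmoid_emax(t, emax, t50, gamma):
    """Predicted fraction cured after t months of treatment."""
    return emax * t**gamma / (t50**gamma + t**gamma)

# Hypothetical observations: treatment length (months) vs fraction of mice cured.
t_obs = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0])
cure_obs = np.array([0.0, 0.1, 0.3, 0.55, 0.75, 0.85, 0.95, 1.0, 1.0])

params, _ = curve_fit(sigmoid_emax, t_obs, cure_obs, p0=[0.95, 3.5, 4.0],
                      bounds=([0.0, 0.1, 0.1], [1.0, 10.0, 20.0]))
emax, t50, gamma = params
print(f"Emax = {emax:.2f}, ET50 = {t50:.2f} months, gamma = {gamma:.1f}")
```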

  4. Environmental impact assessments and geological repositories: A model process

    International Nuclear Information System (INIS)

    Webster, S.

    2000-01-01

    In a recent study carried out for the European Commission, the scope and application of environmental impact assessment (EIA) legislation and current EIA practice in European Union Member States and applicant countries of Central and Eastern Europe was investigated, specifically in relation to the geological disposal of radioactive waste. This paper reports the study's investigations into a model approach to EIA in the context of geological repositories, including the role of the assessment in the overall decision processes and public involvement. (author)

  5. Multi-model global assessment of subseasonal prediction skill of atmospheric rivers

    Science.gov (United States)

    Deflorio, M. J.

    2017-12-01

    Atmospheric rivers (ARs) are global phenomena that are characterized by long, narrow plumes of water vapor transport. They are most often observed in the midlatitudes near climatologically active storm track regions. Because of their frequent association with floods, landslides, and other hydrological impacts on society, there is significant incentive at the intersection of academic research, water management, and policymaking to understand the skill with which state-of-the-art operational weather models can predict ARs weeks-to-months in advance. We use the newly assembled Subseasonal-to-Seasonal (S2S) database, which includes extensive hindcast records of eleven operational weather models, to assess global prediction skill of atmospheric rivers on S2S timescales. We develop a metric to assess AR skill that is suitable for S2S timescales by counting the total number of AR days which occur over each model and observational grid cell during a 2-week time window. This "2-week AR occurrence" metric is suitable for S2S prediction skill assessment because it does not consider discrete hourly or daily AR objects, but rather a smoothed representation of AR occurrence over a longer period of time. Our results indicate that several of the S2S models, especially the ECMWF model, show useful prediction skill in the 2-week forecast window, with significant interannual variation in some regions. We also present results from an experimental forecast of S2S AR prediction skill using the ECMWF and NCEP models.
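
    The 2-week AR occurrence metric described above reduces, per grid cell, to a count of AR days inside a 14-day window. The sketch below shows that counting step only; the boolean detection field is random noise standing in for the output of a real AR detection algorithm applied to model or reanalysis fields.

```python
# Count AR days per grid cell over a 14-day window (illustrative data).
import numpy as np

rng = np.random.default_rng(3)
ndays, nlat, nlon = 14, 10, 20
ar_day = rng.random((ndays, nlat, nlon)) < 0.15   # True where an AR is "detected"

ar_occurrence = ar_day.sum(axis=0)                # AR days per cell in the window
print("Maximum AR days in the window:", ar_occurrence.max())
```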

  6. Compartmental models for assessing the fishery production in the Indian Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Dalal, S.G.; Parulekar, A.H.

    Compartmental models for assessing the fishery production in the Indian Ocean are discussed. The article examines the theoretical basis on which modern fishery science is built. The model shows that large changes in energy flux from one pathway...

  7. Comparison of models used for ecological risk assessment and human health risk assessment

    International Nuclear Information System (INIS)

    Ryti, R.T.; Gallegos, A.F.

    1994-01-01

    Models are used to derive action levels for site screening, or to estimate potential ecological or human health risks posed by potentially hazardous sites. At the Los Alamos National Laboratory (LANL), which is RCRA-regulated, the human-health screening action levels are based on hazardous constituents described in RCRA Subpart S and on RESRAD-derived soil guidelines (based on 10 mrem/year) for radiological constituents. In addition, an ecological risk screening model was developed for a former firing site, where the primary constituents include depleted uranium, beryllium and lead. Sites that fail the screening models are evaluated with site-specific human risk assessment (using RESRAD and other approaches) and a detailed ecological effects model (ECOTRAN). ECOTRAN is based on pharmacokinetic transport modeling within a multitrophic-level biological-growth dynamics model. ECOTRAN provides detailed temporal records of contaminant concentrations in biota, and annual averages of these body burdens are compared to equivalent site-specific runs of the RESRAD model. The results show that thoughtful interpretation of the results of these models is needed before they can be used to evaluate the current risk posed by sites and the benefits of various remedial options. This presentation compares the concentrations in biological media in the RESRAD screening runs to the concentrations in ecological endpoints predicted by the ecological screening model. The assumptions and limitations of these screening models, and the decision process in which these screening models are applied, are discussed

  8. Condensing a detailed groundwater flow and contaminant transport model into a geosphere model for environmental and safety assessment

    International Nuclear Information System (INIS)

    Chan Tin; Melnyk, Ted

    2004-01-01

    AECL (Atomic Energy of Canada Limited) is preparing an Environmental Impact Statement (EIS) to present its case to a federal environmental assessment panel for a concept for disposal of Canada's nuclear fuel waste. The concept is that of a sealed vault constructed at a depth of 500 to 1,000 m in plutonic rock of the Canadian Shield. An analysis of disposal system performance using a probabilistic system variability analysis code (SYVAC3-CC3) has been an important component of the assessment of the long-term safety and environmental impacts of the disposal system. In the assessment, the disposal system is divided into vault, geosphere and biosphere, each of which is represented by a computationally simplified model. This paper summarizes the procedure for condensing a detailed 3-D finite-element hydrogeological model into the SYVAC3-CC3 geosphere model, GEONET. (author)

  9. Using the New Two-Phase-Titan to Evaluate Potential Lahar Hazard at Villa la Angostura, Argentina

    Science.gov (United States)

    Sheridan, M. F.; Cordoba, G. A.; Viramonte, J. G.; Folch, A.; Villarosa, G.; Delgado, H.

    2013-05-01

    The 2011 eruption of Puyehue Volcano, located in the Cordon del Caulle volcanic complex, Chile, produced an ash plume that mainly affected downwind areas in Argentina. This plume forced air transport in the region to be closed for several weeks. Tephra fall deposits from this eruption affected many locations, and pumice deposits on lakes killed most of the fish. As the ash emission occurred during the southern hemisphere winter (June), ash horizons were interlayered with layers of snow. This situation posed a potential threat for human settlements located downslope of the mountains. This was the case at Villa la Angostura, Neuquen province, Argentina, which sits on a series of fluvial deposits that originate in three major basins: Piedritas, Colorado, and Florencia. The Institute of Geological Survey of Argentina (SEGEMAR) estimated that the total accumulated deposit in each basin contains a ratio of approximately 30% ash and 70% snow. The CyTED-Ceniza Iberoamerican network worked together with Argentinean, Colombian and USA institutions in this hazard assessment. We used the program Two-Phase-Titan to model two scenarios in each of the basins. This computer code was developed at the University at Buffalo (SUNY), supported by NSF Grant EAR 711497. Two-Phase-Titan is a new depth-averaged model for two-phase flows that uses balance equations for multiphase mixtures. We evaluate the stresses using a Coulomb law for the solid phase and the typical hydraulic shallow-water approach for the fluid phase. The linkage for compositions in the range between the pure end-member phases is accommodated by the inclusion of a phenomenologically based drag coefficient. The model is capable of simulating the whole range of particle volumetric fractions, from pure fluid flows to pure solid avalanches. The initial conditions, volume and solid concentration, required by Two-Phase-Titan were imposed using the SEGEMAR estimation of total deposited volume, assuming that the maximum volume that can

  10. Overview of MELCOR 1.8.4: Modeling advances and assessment

    International Nuclear Information System (INIS)

    Gauntt, R.O.; Cole, R.K.; Rodriguez, S.B.; Young, M.F.; Gasser, R.D.

    1998-01-01

    The newly released MELCOR 1.8.4 reactor accident analysis code contains many new modeling features as well as improvements to existing models. New model additions to the MELCOR code include a model for predicting enhanced depletion rates for hygroscopic aerosols and a model for predicting the chemisorption of cesium to the surfaces of piping. Improvements to existing models include: upgrading the core module (COR) to handle flow redistribution resulting from the formation of core blockages, improving the thermal-hydraulics (CVH) coupling with COR to handle flow reversal situations, and upgrading the fission product scrubbing model to incorporate the SPARC90 code. Significant upgrading of the COR package core degradation modeling was also included in the new code release. New and improved models are described in this paper. In addition, a number of assessment analyses were recently performed, focusing on demonstrating the new and improved capabilities of the code. Results of assessment calculations demonstrating code performance for aerosol (pool) scrubbing, hygroscopic aerosol behavior, and core degradation and hydrogen production are presented. Finally, ongoing code development activities beyond MELCOR 1.8.4 are described. These include models for treating iodine behavior in containment sumps, pools, and the atmosphere, and plans for implementing reflood models and the attendant effects on accident progression. Further improvements and additions to the core degradation modeling in MELCOR are described, including the implementation of enhanced clad failure models to treat clad ballooning and eutectic interaction with grid spacers, and expansion of the COR package to allow for improved representation of UO2-Zr eutectic behavior, improved melt relocation treatment, greater detail in describing aspects of BWR core degradation (fuel channel, bypass, and lower plenum), and more flexibility in modeling other structures in the core such as the core plate

  11. The status of world biosphere modelling for waste disposal assessments following BIOMOVS II

    International Nuclear Information System (INIS)

    Klos, R.; Reid, J.A.K.; Santucci, P.; Bergstrom, U.

    1996-01-01

    Biosphere modelling for radioactive waste disposal assessments faces unique problems. Models for such applications tend to be quite distinct from other similar environmental assessment tools. Over the past few years, two of the Working Groups in the second international biosphere model validation study (BIOMOVS II) have been considering the special requirements for such models. The BIOMOVS II Reference Biospheres Working Group has concentrated on the elaboration of the methodology for the definition of models for such assessments. The Complementary Studies Working Group has dealt with how the Features, Events and Processes (FEPs) included in the participating models are represented, in the context of the representation of a temperate inland biosphere. The aim of Complementary Studies was to move forward from the first phase of BIOMOVS, with the analysis going further and deeper into the principles on which the participating models are based. Ten of the leading models from around the world have participated in the Complementary Studies model intercomparison exercise. This paper presents some key findings using the international biosphere FEP-list produced by the Reference Biospheres Working Group as a framework for discussing the current state-of-the-art. Common features of the models as well as reasons for the model differences are discussed. Areas where the international community could benefit from a harmonisation of approaches are also identified, setting out possible future requirements and developments. In the Complementary Studies intercomparison, the hypothetical release of radionuclides to an inland valley biosphere was considered. The radionuclides considered in the study were selected because of their relevance for underground repositories for long-lived radioactive wastes and because their individual properties made them suitable probes for many of the important FEPs in long timescale biosphere modelling. The data

  12. The Development of the Assessment for Learning Model of Mathematics for Rajamangala University of Technology Rattanakosin

    Directory of Open Access Journals (Sweden)

    Wannaree Pansiri

    2016-12-01

    Full Text Available The objectives of this research were (1) to develop the assessment for learning model of Mathematics for Rajamangala University of Technology Rattanakosin, and (2) to study the effectiveness of the assessment for learning model of Mathematics for Rajamangala University of Technology Rattanakosin. The research target group consisted of 72 students from 3 classes and 3 General Mathematics teachers. The data were gathered from observations, worksheets, an achievement test, an assessment-for-learning skills measure, and a questionnaire on the assessment for learning model of Mathematics. The statistics used in this research were frequency, percentage, mean, standard deviation, and growth score. The results of this research were: 1. The assessment for learning model of Mathematics for Rajamangala University of Technology Rattanakosin consisted of 3 components: (1) pre-assessment, which consisted of 4 activities: (a) preparation, (b) teacher development, (c) design and creation of the assessment plan and assessment instruments, and (d) creation of the learning experience plan; (2) the assessment process, which consisted of 4 steps: (a) identifying the learning objectives and criteria, (b) identifying the learning experience plan and assessing according to the plan, (c) reflecting on learning and giving feedback, and (d) developing learners based on the information and improving instruction; and (3) giving feedback. 2. Regarding the effectiveness of the assessment for learning model, most students had good scores in concentration, honesty, responsibility, group work, task presentation, worksheets, and exercises. The development of the lecturers' knowledge of learning and their knowledge and skills in assessment for learning was fairly good. The opinions of learners towards assessment for learning, and of teachers towards the assessment for learning model of Mathematics, were at a good level.

  13. Assessing flood risk at the global scale: model setup, results, and sensitivity

    International Nuclear Information System (INIS)

    Ward, Philip J; Jongman, Brenden; Weiland, Frederiek Sperna; Winsemius, Hessel C; Bouwman, Arno; Ligtvoet, Willem; Van Beek, Rens; Bierkens, Marc F P

    2013-01-01

    Globally, economic losses from flooding exceeded $19 billion in 2012, and are rising rapidly. Hence, there is an increasing need for global-scale flood risk assessments, also within the context of integrated global assessments. We have developed and validated a model cascade for producing global flood risk maps, based on numerous flood return-periods. Validation results indicate that the model simulates interannual fluctuations in flood impacts well. The cascade involves: hydrological and hydraulic modelling; extreme value statistics; inundation modelling; flood impact modelling; and estimating annual expected impacts. The initial results estimate global impacts for several indicators, for example annual expected exposed population (169 million); and annual expected exposed GDP ($1383 billion). These results are relatively insensitive to the extreme value distribution employed to estimate low frequency flood volumes. However, they are extremely sensitive to the assumed flood protection standard; developing a database of such standards should be a research priority. Also, results are sensitive to the use of two different climate forcing datasets. The impact model can easily accommodate new, user-defined, impact indicators. We envisage several applications, for example: identifying risk hotspots; calculating macro-scale risk for the insurance industry and large companies; and assessing potential benefits (and costs) of adaptation measures. (letter)
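
    A hedged sketch of how an annual expected impact of the kind quoted above can be obtained from impacts estimated at a set of return periods, with a crude protection-standard cut-off of the sort the letter identifies as a key sensitivity. All numbers are illustrative and unrelated to the study's results.

```python
# Expected annual impact from return-period impact estimates (illustrative).
import numpy as np

return_periods = np.array([2, 5, 10, 25, 50, 100, 250, 1000])   # years
impact = np.array([0.0, 0.5, 1.2, 2.5, 3.8, 5.0, 7.5, 12.0])    # e.g. billion USD

protection_standard = 25          # events more frequent than 1-in-25 yr are defended
p_exc = 1.0 / return_periods      # annual exceedance probabilities

# Zero out impacts from events the defences are assumed to contain.
protected_impact = np.where(return_periods < protection_standard, 0.0, impact)

# Expected annual impact = integral of impact over exceedance probability.
order = np.argsort(p_exc)
ead = np.trapz(protected_impact[order], p_exc[order])
print(f"Expected annual impact: {ead:.3f} billion USD")
```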

  14. Advancing Integrated Systems Modelling Framework for Life Cycle Sustainability Assessment

    Directory of Open Access Journals (Sweden)

    Anthony Halog

    2011-02-01

    Full Text Available The need for an integrated methodological framework for sustainability assessment has been widely discussed and is urgent due to increasingly complex environmental system problems. These problems have impacts on ecosystems and human well-being which represent a threat to the economic performance of countries and corporations. Integrated assessment crosses issues; spans spatial and temporal scales; looks forward and backward; and incorporates multi-stakeholder inputs. This study aims to develop an integrated methodology by capitalizing on the complementary strengths of different methods used by industrial ecologists and biophysical economists. The computational methodology proposed here is a systems-perspective, integrative and holistic approach for sustainability assessment which attempts to link basic science and technology to policy formulation. The framework adopts life cycle thinking methods—LCA, LCC, and SLCA; stakeholder analysis supported by multi-criteria decision analysis (MCDA); and dynamic system modelling. Following the Pareto principle, the critical sustainability criteria, indicators and metrics (i.e., hotspots) can be identified and further modelled using system dynamics or agent-based modelling and improved by data envelopment analysis (DEA) and sustainability network theory (SNT). The framework is being applied to the development of biofuel supply chain networks. The framework can provide new ways of integrating knowledge across the divides between social and natural sciences as well as between critical and problem-solving research.

  15. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    International Nuclear Information System (INIS)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-01-01

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator

  16. PORFLOW Modeling Supporting The H-Tank Farm Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, J. M.; Flach, G. P.; Westbrook, M. L.

    2012-08-31

    Numerical simulations of groundwater flow and contaminant transport in the vadose and saturated zones have been conducted using the PORFLOW code in support of an overall Performance Assessment (PA) of the H-Tank Farm. This report provides technical detail on selected aspects of PORFLOW model development and describes the structure of the associated electronic files. The PORFLOW models for the H-Tank Farm PA, Rev. 1 were updated with grout, solubility, and inventory changes. The aquifer model was refined. In addition, a set of flow sensitivity runs were performed to allow flow to be varied in the related probabilistic GoldSim models. The final PORFLOW concentration values are used as input into a GoldSim dose calculator.

  17. SR 97 - Alternative models project. Discrete fracture network modelling for performance assessment of Aberg

    International Nuclear Information System (INIS)

    Dershowitz, B.; Eiben, T.; Follin, S.; Andersson, Johan

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modeling approaches for geosphere performance assessment for a single hypothetical site. The hypothetical site, arbitrarily named Aberg is based on parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The Aberg model domain, boundary conditions and canister locations are defined as a common reference case to facilitate comparisons between approaches. This report presents the results of a discrete fracture pathways analysis of the Aberg site, within the context of the SR 97 performance assessment exercise. The Aberg discrete fracture network (DFN) site model is based on consensus Aberg parameters related to the Aespoe HRL site. Discrete fracture pathways are identified from canister locations in a prototype repository design to the surface of the island or to the sea bottom. The discrete fracture pathways analysis presented in this report is used to provide the following parameters for SKB's performance assessment transport codes FARF31 and COMP23: * F-factor: Flow wetted surface normalized with regards to flow rate (yields an appreciation of the contact area available for diffusion and sorption processes) [TL -1 ]. * Travel Time: Advective transport time from a canister location to the environmental discharge [T]. * Canister Flux: Darcy flux (flow rate per unit area) past a representative canister location [LT -1 ]. In addition to the above, the discrete fracture pathways analysis in this report also provides information about: additional pathway parameters such as pathway length, pathway width, transport aperture, reactive surface area and transmissivity, percentage of canister locations with pathways to the surface discharge, spatial pattern of pathways and pathway discharges, visualization of pathways, and statistical
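
    A minimal sketch of how the pathway quantities named above (F-factor and advective travel time) could be accumulated over discrete fracture segments is given below. The segment geometries and flow rates are invented, and taking the flow-wetted surface of a segment as its two fracture walls (2*L*W) is an assumption consistent with the description, not necessarily the exact convention used in the Aberg analysis.

```python
# Accumulate F-factor and advective travel time along a fracture pathway
# (illustrative segment data, hedged flow-wetted-surface convention).
import numpy as np

# One entry per pathway segment: length L (m), width W (m),
# transport aperture e_t (m), volumetric flow rate Q (m^3/yr).
L   = np.array([15.0, 40.0, 120.0])
W   = np.array([2.0, 5.0, 10.0])
e_t = np.array([1e-4, 5e-4, 1e-3])
Q   = np.array([0.05, 0.20, 1.00])

F = np.sum(2.0 * L * W / Q)        # F-factor: flow-wetted surface / flow rate [yr/m]
t_adv = np.sum(L * W * e_t / Q)    # advective travel time [yr]

print(f"F-factor: {F:.2e} yr/m, advective travel time: {t_adv:.1f} yr")
```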

  18. SR 97 - Alternative models project. Discrete fracture network modelling for performance assessment of Aberg

    Energy Technology Data Exchange (ETDEWEB)

    Dershowitz, B.; Eiben, T. [Golder Associates Inc., Seattle (United States); Follin, S.; Andersson, Johan [Golder Grundteknik KB, Stockholm (Sweden)

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modeling approaches for geosphere performance assessment for a single hypothetical site. The hypothetical site, arbitrarily named Aberg is based on parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The Aberg model domain, boundary conditions and canister locations are defined as a common reference case to facilitate comparisons between approaches. This report presents the results of a discrete fracture pathways analysis of the Aberg site, within the context of the SR 97 performance assessment exercise. The Aberg discrete fracture network (DFN) site model is based on consensus Aberg parameters related to the Aespoe HRL site. Discrete fracture pathways are identified from canister locations in a prototype repository design to the surface of the island or to the sea bottom. The discrete fracture pathways analysis presented in this report is used to provide the following parameters for SKB's performance assessment transport codes FARF31 and COMP23: * F-factor: Flow wetted surface normalized with regards to flow rate (yields an appreciation of the contact area available for diffusion and sorption processes) [TL{sup -1}]. * Travel Time: Advective transport time from a canister location to the environmental discharge [T]. * Canister Flux: Darcy flux (flow rate per unit area) past a representative canister location [LT{sup -1}]. In addition to the above, the discrete fracture pathways analysis in this report also provides information about: additional pathway parameters such as pathway length, pathway width, transport aperture, reactive surface area and transmissivity, percentage of canister locations with pathways to the surface discharge, spatial pattern of pathways and pathway discharges, visualization of pathways, and

  19. Model assessment of protective barrier designs: Part 2

    International Nuclear Information System (INIS)

    Fayer, M.J.

    1987-11-01

    Protective barriers are being considered for use at the Hanford Site to enhance the isolation of radioactive wastes from water, plant, and animal intrusion. This study assesses the effectiveness of protective barriers for isolation of wastes from water. In this report, barrier designs are reviewed and several barrier modeling assumptions are tested. 20 refs., 16 figs., 6 tabs

  20. A model for assessing information technology effectiveness in the business environment

    Directory of Open Access Journals (Sweden)

    Sandra Cristina Riascos Erazo

    2008-05-01

    Full Text Available The impact of technology on administrative processes has improved business strategies (especially regarding the effect of information technology - IT), often leading to organisational success. Its effectiveness in this environment was thus modelled due to such importance; this paper describes the study of a series of models aimed at assessing IT, with their advantages and disadvantages. A model is proposed involving different aspects for an integral assessment of IT effectiveness, considering the particular characteristics of administrative activities. This analytical study provides guidelines for identifying IT effectiveness in a business environment and current key strategies in technological innovation. This study was based on the ISO 9126, ISO 9001, ISO 15939 and ISO 25000 standards as well as the COBIT and CMM standards.

  1. Model-based pH monitor for sensor assessment.

    Science.gov (United States)

    van Schagen, Kim; Rietveld, Luuk; Veersma, Alex; Babuska, Robert

    2009-01-01

    Owing to the nature of the treatment processes, monitoring the processes based on individual online measurements is difficult or even impossible. However, the measurements (online and laboratory) can be combined with a priori process knowledge, using mathematical models, to objectively monitor the treatment processes and measurement devices. The pH measurement is commonly used at different stages in a drinking water treatment plant, although the pH sensor is an unreliable instrument that requires significant maintenance. It is shown that, using a grey-box model, it is possible to assess the measurement devices effectively, even if detailed information on the specific processes is unknown.

  2. Report on the model developments in the sectoral assessments

    DEFF Research Database (Denmark)

    Iglesias, Ana; Termansen, Mette; Bouwer, Laurens

    2014-01-01

    into the economic assessments. At the same time, the models will link to the case studies in two ways. First, they use the data in the case studies for model validation and then they provide information to inform stakeholders on adaptation strategies. Therefore, Deliverable 3.2 aims to address three main questions......The Objective of this Deliverable D3.2 is to describe the models developed in BASE that is, the experimental setup for the sectoral modelling. The model development described in this deliverable will then be implemented in the adaptation and economic analysis in WP6 in order to integrate adaptation......: How to address climate adaptation options with the sectoral bottom-up models? - This includes a quantification of the costs of adaptation with the sectoral models, in monetary terms or in other measures of costs. The benefits in this framework will be the avoided damages, therefore a measure...

  3. Equivalent magnetic vector potential model for low-frequency magnetic exposure assessment

    Science.gov (United States)

    Diao, Y. L.; Sun, W. N.; He, Y. Q.; Leung, S. W.; Siu, Y. M.

    2017-10-01

    In this paper, a novel source model based on a magnetic vector potential for the assessment of the induced electric field strength in a human body exposed to the low-frequency (LF) magnetic field of an electrical appliance is presented. The construction of the vector potential model requires only a single-component magnetic field to be measured close to the appliance under test, hence relieving considerable practical measurement effort. Radial basis functions (RBFs) are adopted for the interpolation of the discrete measurements; the magnetic vector potential model can then be directly constructed by summing a set of simple algebraic functions of the RBF parameters. The vector potentials are then incorporated into numerical calculations as the equivalent source for evaluation of the induced electric field in the human body model. The accuracy and effectiveness of the proposed model are demonstrated by comparing the induced electric field in a human model to that of a full-wave simulation. This study presents a simple and effective approach for modelling the LF magnetic source. The result of this study could simplify the compliance test procedure for assessing an electrical appliance with regard to LF magnetic exposure.
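
    The interpolation step described above can be sketched with a generic RBF interpolator. The measurement points and field values below are synthetic, and SciPy's RBFInterpolator with a thin-plate-spline kernel stands in for whichever basis functions the paper actually adopts.

```python
# RBF interpolation of sparse, single-component B-field measurements (illustrative).
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
pts = rng.uniform(-0.2, 0.2, size=(60, 2))      # measurement locations on a plane (m)
r2 = np.sum(pts**2, axis=1)
bz = 1e-6 / (1.0 + 50.0 * r2)                   # synthetic Bz samples (T)

interp = RBFInterpolator(pts, bz, kernel="thin_plate_spline")

# Evaluate the reconstructed field on a regular grid, e.g. for later use as an
# equivalent source in a body model.
gx, gy = np.meshgrid(np.linspace(-0.2, 0.2, 41), np.linspace(-0.2, 0.2, 41))
grid = np.column_stack([gx.ravel(), gy.ravel()])
bz_grid = interp(grid).reshape(gx.shape)
print("Max |Bz| on the grid:", np.abs(bz_grid).max())
```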

  4. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden); Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)

    2001-12-01

    The prevailing document addresses validation of the stochastic continuum model HYDRASTAR designed for Monte Carlo simulations of groundwater flow in fractured rocks. Here, validation is defined as a process to demonstrate that a model concept is fit for its purpose. Preferably, the validation is carried out by comparison of model predictions with independent field observations and experimental measurements. In addition, other sources can also be used to confirm that the model concept gives acceptable results. One method is to compare results with the ones achieved using other model concepts for the same set of input data. Another method is to compare model results with analytical solutions. The model concept HYDRASTAR has been used in several studies including performance assessments of hypothetical repositories for spent nuclear fuel. In the performance assessments, the main tasks for HYDRASTAR have been to calculate groundwater travel time distributions, repository flux distributions, path lines and their exit locations. The results have then been used by other model concepts to calculate the near field release and far field transport. The aim and framework for the validation process includes describing the applicability of the model concept for its purpose in order to build confidence in the concept. Preferably, this is made by comparisons of simulation results with the corresponding field experiments or field measurements. Here, two comparisons with experimental results are reported. In both cases the agreement was reasonably fair. In the broader and more general context of the validation process, HYDRASTAR results have been compared with other models and analytical solutions. Commonly, the approximation calculations agree well with the medians of model ensemble results. Additional indications that HYDRASTAR is suitable for its purpose were obtained from the comparisons with results from other model concepts. Several verification studies have been made for

  5. Biosphere models for safety assessment of radioactive waste disposal

    International Nuclear Information System (INIS)

    Proehl, G.; Olyslaegers, G.; Zeevaert, T.; Kanyar, B.; Bergstroem, U.; Hallberg, B.; Mobbs, S.; Chen, Q.; Kowe, R.

    2004-01-01

    The aim of the BioMoSA project has been to contribute to confidence building in biosphere models for application in performance assessments of radioactive waste disposal. The detailed objectives of this project are: development and testing of practical biosphere models for application in long-term safety studies of radioactive waste disposal at different European locations; identification of features, events and processes that need to be modelled on a site-specific rather than a generic basis; comparison of the results and quantification of the variability of site-specific models developed according to the reference biosphere methodology; development of a generic biosphere tool for application in long-term safety studies; comparison of results from site-specific models with those from the generic one; and identification of possibilities and limitations for the application of the generic biosphere model. (orig.)

  6. Regional drought assessment using a distributed hydrological model coupled with Standardized Runoff Index

    Directory of Open Access Journals (Sweden)

    H. Shen

    2015-05-01

    Full Text Available Drought assessment is essential for coping with today's frequent droughts. Owing to the large spatio-temporal variations in hydrometeorology in most regions of China, it is necessary to use a physically based hydrological model to produce rational spatial and temporal distributions of hydro-meteorological variables for drought assessment. In this study, the large-scale distributed hydrological model Variable Infiltration Capacity (VIC) was coupled with a modified standardized runoff index (SRI) for drought assessment in the Weihe River basin, northwest China. The results indicate that the coupled model is capable of reasonably reproducing the spatial distribution of drought occurrence. It reflects the spatial heterogeneity of regional drought and strengthens the physical basis of the SRI. The model also has potential for drought forecasting, early warning and mitigation, given that accurate meteorological forcing data are available.
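
    A standardized runoff index of the general kind used above can be computed, for one grid cell and calendar month, by fitting a distribution to the simulated runoff series and mapping it to the standard normal. The sketch below uses a gamma fit on a synthetic series; the distribution choice and data are assumptions, not the paper's exact modification of the SRI.

```python
# Standardized runoff index from a synthetic monthly runoff series (illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
runoff = rng.gamma(shape=2.0, scale=15.0, size=40)   # 40 years of July runoff (mm)

# Fit a gamma distribution with the location fixed at zero.
a, loc, scale = stats.gamma.fit(runoff, floc=0.0)

# Non-exceedance probability, then inverse normal transform gives the SRI.
cdf = stats.gamma.cdf(runoff, a, loc=loc, scale=scale)
sri = stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

print("Years at moderate-or-worse drought (SRI <= -1):", int(np.sum(sri <= -1.0)))
```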

  7. IMAGE: An Integrated Model for the Assessment of the Greenhouse Effect

    NARCIS (Netherlands)

    Rotmans J; Boois H de; Swart RJ

    1989-01-01

    This report describes the structure of the RIVM simulation model IMAGE (an Integrated Model for the Assessment of the Greenhouse Effect). The model aims to provide an integrated overview of the greenhouse problem and to give insight into the essential driving forces of the

  8. Practical examples of modeling choices and their consequences for risk assessment

    Science.gov (United States)

    Although benchmark dose (BMD) modeling has become the preferred approach to identifying a point of departure (POD) over the No Observed Adverse Effect Level, there remain challenges to its application in human health risk assessment. BMD modeling, as currently implemented by the...

  9. Automatic creation of Markov models for reliability assessment of safety instrumented systems

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2008-01-01

    After the release of new international functional safety standards such as IEC 61508, more attention is being paid to the safety and availability of safety instrumented systems. Markov analysis is a powerful and flexible technique for assessing the reliability measures of safety instrumented systems, but creating Markov models manually is error-prone and time-consuming. This paper presents a new technique to automatically create Markov models for reliability assessment of safety instrumented systems. Many safety-related factors, such as failure modes, self-diagnostics, restorations, common cause and voting, are included in the Markov models. A framework is generated first based on voting, failure modes and self-diagnostics. Then, repairs and common-cause failures are incorporated into the framework to build a complete Markov model. Eventual simplification of the Markov models can be done by state merging. Examples given in this paper show how explosively the size of a Markov model increases as the system becomes even slightly more complicated, as well as the advantages of automatic creation of Markov models
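
    For scale, a hand-built continuous-time Markov model of a very simple one-channel safety instrumented function might look like the sketch below; the automatic procedure in the paper generates and solves far larger models of this type. The states, failure rates and repair rate are illustrative assumptions, not values from the paper or from IEC 61508.

```python
# Tiny continuous-time Markov model of a 1oo1 safety function (illustrative).
import numpy as np
from scipy.linalg import expm

lam_dd = 2e-6     # dangerous detected failure rate (per hour)
lam_du = 1e-6     # dangerous undetected failure rate (per hour)
mu_dd = 1.0 / 8   # repair rate after a detected failure (MTTR = 8 h)

# States: 0 = OK, 1 = dangerous detected, 2 = dangerous undetected.
Q = np.array([
    [-(lam_dd + lam_du), lam_dd, lam_du],
    [mu_dd,             -mu_dd,  0.0   ],
    [0.0,                0.0,    0.0   ],  # DU failures only revealed at proof test
])

p0 = np.array([1.0, 0.0, 0.0])
t = 8760.0                                 # one-year proof-test interval (hours)
p_t = p0 @ expm(Q * t)                     # state probabilities at time t
print("P(dangerous failure state at proof test):", p_t[1] + p_t[2])
```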

  10. Optimizing Prediction Using Bayesian Model Averaging: Examples Using Large-Scale Educational Assessments.

    Science.gov (United States)

    Kaplan, David; Lee, Chansoon

    2018-01-01

    This article provides a review of Bayesian model averaging as a means of optimizing the predictive performance of common statistical models applied to large-scale educational assessments. The Bayesian framework recognizes that in addition to parameter uncertainty, there is uncertainty in the choice of models themselves. A Bayesian approach to addressing the problem of model uncertainty is the method of Bayesian model averaging. Bayesian model averaging searches the space of possible models for a set of submodels that satisfy certain scientific principles and then averages the coefficients across these submodels weighted by each model's posterior model probability (PMP). Using the weighted coefficients for prediction has been shown to yield optimal predictive performance according to certain scoring rules. We demonstrate the utility of Bayesian model averaging for prediction in education research with three examples: Bayesian regression analysis, Bayesian logistic regression, and a recently developed approach for Bayesian structural equation modeling. In each case, the model-averaged estimates are shown to yield better prediction of the outcome of interest than any submodel based on predictive coverage and the log-score rule. Implications for the design of large-scale assessments when the goal is optimal prediction in a policy context are discussed.
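
    The averaging step itself can be illustrated with a small sketch: fit every subset of predictors, approximate each submodel's posterior model probability from its BIC, and average the predictions with those weights. The data are synthetic and the BIC approximation is one common shortcut, not the specific machinery used in the article.

```python
# Bayesian model averaging over predictor subsets with BIC-based weights (illustrative).
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 4
X = rng.normal(size=(n, p))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=1.0, size=n)

def fit_ols(Xs, y):
    """Return in-sample predictions and BIC for an OLS fit with intercept."""
    A = np.column_stack([np.ones(len(y)), Xs])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    sigma2 = resid @ resid / len(y)
    loglik = -0.5 * len(y) * (np.log(2 * np.pi * sigma2) + 1)
    bic = -2 * loglik + A.shape[1] * np.log(len(y))
    return A @ beta, bic

preds, bics = [], []
for k in range(1, p + 1):
    for subset in itertools.combinations(range(p), k):
        yhat, bic = fit_ols(X[:, subset], y)
        preds.append(yhat)
        bics.append(bic)

bics = np.array(bics)
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()                    # approximate posterior model probabilities
y_bma = np.sum(weights[:, None] * np.array(preds), axis=0)   # model-averaged prediction
print("BMA in-sample RMSE:", np.sqrt(np.mean((y - y_bma) ** 2)))
```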

  11. Dynamic model for the assessment of radiological exposure to marine biota

    Energy Technology Data Exchange (ETDEWEB)

    Vives i Batlle, J. [Westlakes Scientific Consulting Ltd, The Princess Royal Building, Westlakes Science and Technology Park, Moor Row, Cumbria CA24 3LN (United Kingdom)], E-mail: jordi.vives@westlakes.ac.uk; Wilson, R.C.; Watts, S.J.; Jones, S.R.; McDonald, P.; Vives-Lynch, S. [Westlakes Scientific Consulting Ltd, The Princess Royal Building, Westlakes Science and Technology Park, Moor Row, Cumbria CA24 3LN (United Kingdom)

    2008-11-15

    A generic approach has been developed to simulate dynamically the uptake and turnover of radionuclides by marine biota. The approach incorporates a three-compartment biokinetic model based on first order linear kinetics, with interchange rates between the organism and its surrounding environment. Model rate constants are deduced as a function of known parameters: biological half-lives of elimination, concentration factors and a sample point of the retention curve, allowing for the representation of multi-component release. The new methodology has been tested and validated in respect of non-dynamic assessment models developed for regulatory purposes. The approach has also been successfully tested against research dynamic models developed to represent the uptake of technetium and radioiodine by lobsters and winkles. Assessments conducted on two realistic test scenarios demonstrated the importance of simulating time-dependency for ecosystems in which environmental levels of radionuclides are not in equilibrium.
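
    A three-compartment, first-order model of the general type described can be written as a small set of linear ODEs driven by the seawater activity concentration. The rate constants, the pulse-release forcing and the compartment layout below are illustrative placeholders, not the parameters the paper derives from concentration factors and biological half-lives.

```python
# Three-compartment first-order biokinetic uptake/turnover model (illustrative).
import numpy as np
from scipy.integrate import solve_ivp

k_up = 0.5              # uptake from water into compartment 1 (L/kg/d, schematic)
k12, k21 = 0.2, 0.05    # exchange between compartments 1 and 2 (1/d)
k13, k31 = 0.05, 0.01   # exchange between compartments 1 and 3 (1/d)
k_ex = 0.1              # excretion from compartment 1 back to water (1/d)

def c_water(t):
    """Seawater activity concentration (Bq/L): a pulse release decaying away."""
    return 100.0 * np.exp(-0.05 * t)

def rhs(t, q):
    q1, q2, q3 = q
    dq1 = k_up * c_water(t) - (k_ex + k12 + k13) * q1 + k21 * q2 + k31 * q3
    dq2 = k12 * q1 - k21 * q2
    dq3 = k13 * q1 - k31 * q3
    return [dq1, dq2, dq3]

sol = solve_ivp(rhs, (0.0, 365.0), [0.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 365.0, 8)
print("Whole-body activity (Bq/kg):", sol.sol(t).sum(axis=0).round(2))
```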

  12. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    Science.gov (United States)

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  13. Using the Dynamic Model to Identify Stages of Teacher Skills in Assessment

    Science.gov (United States)

    Christoforidou, Margarita; Xirafidou, Elisavet

    2014-01-01

    The article presents the results of two cross-sectional studies that investigate teachers' skills in using various techniques of assessment in mathematics by taking into account the four phases of assessment. The five dimensions of the dynamic model are also taken into account in proposing a framework for measuring teacher skills in assessment.…

  14. A Model for Making Decisions about Ethical Dilemmas in Student Assessment

    Science.gov (United States)

    Johnson, Robert L.; Liu, Jin; Burgess, Yin

    2017-01-01

    In this mixed-methods study we investigated the development of a generalized ethics decision-making model that can be applied in considering ethical dilemmas related to student assessment. For the study, we developed five scenarios that describe ethical dilemmas associated with student assessment. Survey participants (i.e., educators) completed an…

  15. A model for assessing social impacts of nuclear technology

    International Nuclear Information System (INIS)

    Suzuki, Atsuyuki; Kiyose, Ryohei

    1981-01-01

    A theoretical framework is given for assessing the social or environmental impacts of nuclear technology. A two-act problem concerning the incentive-penalty system is posed to formulate the principle of ALAP. An observation plan for making a decision on the problem is optimized with Bayesian decision theory. The optimized solution, which rests on the amount of incentive or penalty, is compared with an actual or practical plan. Then, by finding the indifference point between the two plans, an impact is assessed in monetary terms. As regards the third step, the model does not provide the details since it is beyond the scope of the description. If there exists an actual plan, it can be easily compared with the results from this theory. If there does not, or if it is in the process of being made, its feasibility must be studied by another model or by different approaches. (J.P.N.)

  16. Assessment of tropospheric delay mapping function models in Egypt: Using PTD database model

    Science.gov (United States)

    Abdelfatah, M. A.; Mousa, Ashraf E.; El-Fiky, Gamal S.

    2018-06-01

    For space geodetic measurements, estimates of tropospheric delays are highly correlated with site coordinates and receiver clock biases. Thus, it is important to use the most accurate models for the tropospheric delay to reduce errors in the estimates of the other parameters. Both the zenith delay value and the mapping function should be assigned correctly to reduce such errors. Several mapping function models can treat the troposphere slant delay. The recent models have not been evaluated for Egyptian local climate conditions, so an assessment of these models is needed to choose the most suitable one. The goal of this paper is to test the quality of global mapping functions in terms of their consistency with precise troposphere delay (PTD) mapping functions. The PTD model is derived from radiosonde data using ray tracing and is considered in this paper as the true value. The PTD mapping functions were compared with three recent total mapping function models and another three separate dry and wet mapping function models. The results of the research indicate that the models are very close up to a zenith angle of 80°. The Saastamoinen and 1/cos z models lag behind in accuracy. The Niell model is better than the VMF model. The model of Black and Eisner is a good model. The results also indicate that the geometric range error has an insignificant effect on the slant delay and that the azimuthal anti-symmetric fluctuation is about 1%.
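
    The gap between the simple 1/cos z mapping and the continued-fraction forms used by the newer models can be sketched numerically. The a, b, c coefficients below are rough mid-latitude hydrostatic values chosen for illustration, not the PTD-derived or Niell/VMF coefficients evaluated in the paper.

```python
# Compare 1/cos(z) with a continued-fraction mapping function (illustrative coefficients).
import numpy as np

def mf_cosec(elev_deg):
    """Plane-parallel atmosphere: m = 1 / sin(elevation) = 1 / cos(zenith angle)."""
    return 1.0 / np.sin(np.radians(elev_deg))

def mf_continued_fraction(elev_deg, a=1.2e-3, b=2.9e-3, c=62.6e-3):
    s = np.sin(np.radians(elev_deg))
    top = 1.0 + a / (1.0 + b / (1.0 + c))
    bot = s + a / (s + b / (s + c))
    return top / bot

for elev in (90, 60, 30, 15, 10, 5):
    print(f"elevation {elev:2d} deg:  1/cos z = {mf_cosec(elev):6.3f}   "
          f"continued fraction = {mf_continued_fraction(elev):6.3f}")
```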

  17. Multimedia radionuclide exposure assessment modeling. Annual report, October 1980-September 1981

    International Nuclear Information System (INIS)

    Whelan, G.; Onishi, Y.; Simmons, C.S.; Horst, T.W.; Gupta, S.K.; Orgill, M.M.; Newbill, C.A.

    1982-12-01

    Pacific Northwest Laboratory (PNL) and Los Alamos National Laboratory (LANL) are jointly developing a methodology for assessing exposures of the air, water, and plants to radionuclides as part of an overall development effort of a radionuclide disposal site evaluation methodology. Work in FY-1981 continued the development of the Multimedia Contaminant Environmental Exposure Assessment (MCEA) methodology and initiated an assessment of radionuclide migration in Los Alamos and Pueblo Canyons, New Mexico, using the methodology. The AIRTRAN model was completed, briefly tested, and documented. In addition, a literature search for existing validation data for AIRTRAN was performed. The feasibility and advisability of including the UNSAT moisture flow model as a submodel of the terrestrial code BIOTRAN was assessed. A preliminary application of the proposed MCEA methodology, as it related to the Mortandad-South Mortandad Canyon site in New Mexico is discussed. This preliminary application represented a scaled-down version of the methodology in which only the terrestrial, overland, and surface water components were used. An update describing the progress in the assessment of radionuclide migration in Los Alamos and Pueblo Canyons is presented. 38 references, 47 figures, 11 tables

  18. Annual report, October 1980-September 1981 Multimedia radionuclide exposure assessment modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, G.; Onishi, Y.; Simmons, C.S.; Horst, T.W.; Gupta, S.K.; Orgill, M.M.; Newbill, C.A.

    1982-12-01

    Pacific Northwest Laboratory (PNL) and Los Alamos National Laboratory (LANL) are jointly developing a methodology for assessing exposures of the air, water, and plants to radionuclides as part of an overall development effort of a radionuclide disposal site evaluation methodology. Work in FY-1981 continued the development of the Multimedia Contaminant Environmental Exposure Assessment (MCEA) methodology and initiated an assessment of radionuclide migration in Los Alamos and Pueblo Canyons, New Mexico, using the methodology. The AIRTRAN model was completed, briefly tested, and documented. In addition, a literature search for existing validation data for AIRTRAN was performed. The feasibility and advisability of including the UNSAT moisture flow model as a submodel of the terrestrial code BIOTRAN was assessed. A preliminary application of the proposed MCEA methodology, as it related to the Mortandad-South Mortandad Canyon site in New Mexico is discussed. This preliminary application represented a scaled-down version of the methodology in which only the terrestrial, overland, and surface water components were used. An update describing the progress in the assessment of radionuclide migration in Los Alamos and Pueblo Canyons is presented. 38 references, 47 figures, 11 tables.

  19. Mathematical models of cancer and their use in risk assessment. Technical report No. 27

    International Nuclear Information System (INIS)

    Whittemore, A.S.

    1979-08-01

    The sensitivity of risk predictions to certain assumptions in the underlying mathematical model is illustrated. To avoid the misleading and erroneous predictions that can result from the use of models incorporating assumptions whose validity is questionable, the following steps should be taken. First, state the assumptions used in a proposed model in terms that are clear to all who will use the model to assess risk. Second, assess the sensitivity of predictions to changes in model assumptions. Third, scrutinize pivotal assumptions in light of the best available human and animal data. Fourth, stress inconsistencies between model assumptions and experimental or epidemiological observations. The model fitting procedure will yield the most information when the data discriminates between theories because of their inconsistency with one or more assumptions. In this sense, mathematical theories are most successful when they fail. Finally, exclude value judgments from the quantitative procedures used to assess risk; instead include them explicitly in that part of the decision process concerned with cost-benefit analysis

  20. A manufacturing quality assessment model based-on two stages interval type-2 fuzzy logic

    Science.gov (United States)

    Purnomo, Muhammad Ridwan Andi; Helmi Shintya Dewi, Intan

    2016-01-01

    This paper presents the development of an assessment model for manufacturing quality using Interval Type-2 Fuzzy Logic (IT2-FL). The proposed model is built on one of the building blocks of sustainable supply chain management (SSCM), namely the benefits of SCM, and focuses on quality. The proposed model can be used to predict the quality level of a production chain in a company. The quality of production affects the quality of the product. In practice, the quality of production is unique to every type of production system; hence, expert opinion plays a major role in developing the assessment model. The model becomes more complicated when the data contain ambiguity and uncertainty, so in this study IT2-FL is used to model that ambiguity and uncertainty. A case study taken from a company in Yogyakarta shows that the proposed manufacturing quality assessment model works well in determining the quality level of production.

  1. A Zebrafish Heart Failure Model for Assessing Therapeutic Agents.

    Science.gov (United States)

    Zhu, Xiao-Yu; Wu, Si-Qi; Guo, Sheng-Ya; Yang, Hua; Xia, Bo; Li, Ping; Li, Chun-Qi

    2018-03-20

    Heart failure is a leading cause of death, and the development of effective and safe therapeutic agents for heart failure has proven challenging. In this study, taking advantage of larval zebrafish, we developed a zebrafish heart failure model for drug screening and efficacy assessment. Zebrafish at 2 dpf (days postfertilization) were treated with verapamil at a concentration of 200 μM for 30 min, conditions determined to be optimal for model development. Tested drugs were administered into zebrafish either by direct soaking or by circulation microinjection. After treatment, zebrafish were randomly selected and subjected either to visual observation and image acquisition or to video recording under a Zebralab Blood Flow System. The therapeutic effects of drugs on zebrafish heart failure were quantified by calculating the efficiency of heart dilatation, venous congestion, cardiac output, and blood flow dynamics. All 8 human heart failure therapeutic drugs (LCZ696, digoxin, irbesartan, metoprolol, qiliqiangxin capsule, enalapril, shenmai injection, and hydrochlorothiazide) showed significant preventive and therapeutic effects on zebrafish heart failure. The zebrafish heart failure model developed and validated in this study could be used for in vivo heart failure studies and for rapid screening and efficacy assessment of preventive and therapeutic drugs.

  2. Statistical multi-model approach for performance assessment of cooling tower

    International Nuclear Information System (INIS)

    Pan, Tian-Hong; Shieh, Shyan-Shu; Jang, Shi-Shang; Tseng, Wen-Hung; Wu, Chan-Wei; Ou, Jenq-Jang

    2011-01-01

    This paper presents a data-driven, model-based assessment strategy to investigate the performance of a cooling tower. To achieve this objective, the operations of a cooling tower are first characterized using a data-driven multiple-model method, which represents the system as a set of local models in the form of linear equations. A fuzzy c-means clustering algorithm is used to classify operating data into several groups from which the local models are built. The developed models are then applied to predict the performance of the system based on design input parameters provided by the manufacturer. The tower characteristics are also investigated using the proposed models via the effects of the water/air flow ratio. The predicted results agree well with the tower characteristics calculated from actual measured operating data from an industrial plant. By comparison with the design characteristic curve provided by the manufacturer, the effectiveness of the cooling tower can then be obtained. A case study conducted in a commercial plant demonstrates the validity of the proposed approach. To our knowledge, this is the first attempt to assess, using operating data from an industrial-scale process, how far the cooling efficiency has deviated from its original design value. Moreover, the evaluation does not interrupt the normal operation of the cooling tower, which should be of particular interest in industrial applications.
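
    As a minimal sketch of the multi-model idea summarized above, the following Python code clusters synthetic operating data with a small hand-written fuzzy c-means routine and fits one local linear model per cluster; the data, the number of clusters and the fuzzifier are illustrative assumptions rather than values from the cited study.

        import numpy as np

        def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
            """Minimal fuzzy c-means: returns cluster centres and membership matrix U."""
            rng = np.random.default_rng(seed)
            U = rng.random((len(X), c))
            U /= U.sum(axis=1, keepdims=True)
            for _ in range(iters):
                Um = U ** m
                centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
                d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
                U = 1.0 / (d ** (2.0 / (m - 1.0)))
                U /= U.sum(axis=1, keepdims=True)
            return centres, U

        # Synthetic operating data: [water/air flow ratio, inlet wet-bulb temp] -> cooling range
        rng = np.random.default_rng(1)
        X = rng.uniform([0.5, 20.0], [1.5, 30.0], size=(300, 2))
        y = 8.0 - 3.0 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0.0, 0.2, 300)

        centres, U = fuzzy_cmeans(X, c=3)
        labels = U.argmax(axis=1)

        # Fit one local linear model per cluster, predict with the model of the nearest centre
        coefs = []
        for k in range(3):
            A = np.c_[np.ones((labels == k).sum()), X[labels == k]]
            coefs.append(np.linalg.lstsq(A, y[labels == k], rcond=None)[0])

        x_new = np.array([1.0, 25.0])                      # a hypothetical design operating point
        k = np.linalg.norm(centres - x_new, axis=1).argmin()
        print("predicted cooling range:", np.r_[1.0, x_new] @ coefs[k])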

  3. Student Generated Rubrics: An Assessment Model To Help All Students Succeed. Assessment Bookshelf Series.

    Science.gov (United States)

    Ainsworth, Larry; Christinson, Jan

    The assessment model described in this guide was initially developed by a team of fifth-grade teachers who wrote objectives of integrating social studies and language arts. It helps the teacher guide students to create a task-specific rubric that they use to evaluate their own and peers' work. Teachers review the student evaluations, determine the…

  4. The development of a surface hydrology model for use in radiological safety assessments

    International Nuclear Information System (INIS)

    Little, R.H.; Ashton, J.

    1991-01-01

    A detailed understanding and quantification of geosphere and biosphere water movements is vital when assessing the impact of a radioactive waste repository. Not only is water important in the transport of radionuclides from the repository into the geosphere and hence into the biosphere, but it is also important in the transport of radionuclides within the biosphere and their transport to humans. Although geosphere water fluxes have traditionally been rigorously quantified, the quantification of biosphere water fluxes has been far less rigorous. In order to redress the balance, Associated Nuclear Services Ltd (ANS) have proposed to develop a surface hydrology model for use within radiological assessments undertaken by Her Majesty's Inspectorate of Pollution (HMIP) of the United Kingdom Department of the Environment (UKDoE). It is proposed that the deterministic, lumped, quasi-physical/semi-empirical approach of conceptual models should be adopted for the model. The model will be sufficiently flexible to be applicable to a wide range of catchments, as well as a variety of temporal and spatial scales. It is envisaged that the model will have a variety of uses within the HMIP assessment methodology including the identification of significant surface hydrological processes, the provision of input data for assessment codes and the study of the biosphere-geosphere interface. (17 refs., 4 figs.)
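
    The lumped, conceptual approach proposed above can be illustrated with a single linear-reservoir rainfall-runoff sketch in Python; the storage coefficient, initial storage and rainfall series below are invented for illustration and do not come from the ANS/HMIP model.

        # Minimal lumped conceptual catchment: one linear reservoir, dS/dt = P - E - Q, Q = S / k
        import numpy as np

        def linear_reservoir(rain_mm, pet_mm, k_days=5.0, s0_mm=10.0, dt_days=1.0):
            """Return daily runoff (mm/day) for a single-store conceptual catchment."""
            s, runoff = s0_mm, []
            for p, e in zip(rain_mm, pet_mm):
                aet = min(e, s / dt_days)          # evaporation limited by available storage
                q = s / k_days                     # linear-reservoir outflow
                s = max(s + (p - aet - q) * dt_days, 0.0)
                runoff.append(q)
            return np.array(runoff)

        rng = np.random.default_rng(0)
        rain = rng.gamma(shape=0.4, scale=8.0, size=60)   # hypothetical 60-day rainfall series
        pet = np.full(60, 2.0)                            # hypothetical constant potential ET
        print(linear_reservoir(rain, pet)[:10].round(2))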

  5. The 1993 timber assessment market model: structure, projections, and policy simulations.

    Science.gov (United States)

    Darius M. Adams; Richard W. Haynes

    1996-01-01

    The 1993 timber assessment market model (TAMM) is a spatial model of the solidwood and timber inventory elements of the U.S. forest products sector. The TAMM model provides annual projections of volumes and prices in the solidwood products and sawtimber stumpage markets and estimates of total timber harvest and inventory by geographic region for periods of up to 50...

  6. Integrated Model to Assess Cloud Deployment Effectiveness When Developing an IT-strategy

    Science.gov (United States)

    Razumnikov, S.; Prankevich, D.

    2016-04-01

    Developing an IT-strategy for cloud deployment is a complex issue: even at the formation stage it is necessary to identify which applications would best meet the requirements of the company's business strategy, to evaluate the reliability and safety of cloud providers, and to analyze staff satisfaction. A system of criteria, as well as an integrated model for assessing cloud deployment effectiveness, is offered. The model makes it possible to identify which applications already at the disposal of a company, as well as which new tools to be deployed, are reliable and safe enough for implementation in the cloud environment. Data on the practical use of the procedure to assess cloud deployment effectiveness by a provider of telecommunication services are presented. The model was used to calculate integral indexes for the services under assessment; the services meeting the criteria and matching the company's business strategy were then selected.
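
    A sketch of the kind of integral index described above is given below as a simple weighted sum of criterion scores; the criteria names, weights, scores and acceptance threshold are hypothetical and only illustrate how services might be screened.

        # Hypothetical integral index: weighted sum of normalised criterion scores per service
        criteria_weights = {"reliability": 0.35, "safety": 0.35, "business_fit": 0.20, "staff_satisfaction": 0.10}

        services = {   # scores on a 0-1 scale (assumed, for illustration only)
            "email":      {"reliability": 0.9, "safety": 0.8, "business_fit": 0.7, "staff_satisfaction": 0.8},
            "billing":    {"reliability": 0.6, "safety": 0.5, "business_fit": 0.9, "staff_satisfaction": 0.6},
            "monitoring": {"reliability": 0.8, "safety": 0.9, "business_fit": 0.6, "staff_satisfaction": 0.7},
        }

        def integral_index(scores, weights):
            return sum(weights[c] * scores[c] for c in weights)

        threshold = 0.75   # assumed acceptance criterion
        for name, scores in services.items():
            idx = integral_index(scores, criteria_weights)
            verdict = "deploy to cloud" if idx >= threshold else "keep on-premises"
            print(f"{name:11s} index={idx:.2f}  {verdict}")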

  7. A meta model-based methodology for an energy savings uncertainty assessment of building retrofitting

    Directory of Open Access Journals (Sweden)

    Caucheteux Antoine

    2016-01-01

    Full Text Available To reduce greenhouse gas emissions, energy retrofitting of the building stock offers significant potential for energy savings. At the design stage, energy savings are usually assessed through Building Energy Simulation (BES). The main difficulty is to first assess the energy efficiency of the existing buildings, in other words, to calibrate the model. As calibration is an underdetermined problem, there are many possible representations of the building in simulation tools. In this paper, a method is proposed to assess not only energy savings but also their uncertainty. Meta models, built using experimental designs, are used to identify many acceptable calibrations: the sets of parameters that provide the most accurate representation of the building are retained to calculate energy savings. The method was applied to an existing office building modeled with the TRNsys BES. The meta model, using 13 parameters, is built with no more than 105 simulations. The evaluation of the meta model on thousands of new simulations gives a normalized mean bias error between the meta model and BES of <4%. Energy savings are assessed for six energy savings concepts, which indicate savings of 2–45% with a standard deviation ranging between 1.3% and 2.5%.
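
    The meta-model workflow summarized above can be sketched by fitting a polynomial surrogate to a modest number of simulation runs and checking the normalized mean bias error; in the Python sketch below the toy "BES" function and its three parameters are assumptions standing in for the TRNsys model and its 13 calibration parameters.

        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        def toy_bes(x):
            """Stand-in for a building energy simulation: annual heating demand (kWh/m2)."""
            u_wall, ach, setpoint = x.T
            return 40 + 60 * u_wall + 15 * ach + 6 * (setpoint - 19) + 5 * u_wall * ach

        rng = np.random.default_rng(0)
        lo, hi = np.array([0.2, 0.2, 18.0]), np.array([2.0, 1.5, 22.0])

        X_train = rng.uniform(lo, hi, size=(105, 3))      # ~100 "simulations" to build the meta model
        y_train = toy_bes(X_train)

        meta = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
        meta.fit(X_train, y_train)

        X_test = rng.uniform(lo, hi, size=(5000, 3))      # cheap meta-model evaluations
        y_bes, y_meta = toy_bes(X_test), meta.predict(X_test)

        nmbe = 100 * np.sum(y_meta - y_bes) / (len(y_bes) * y_bes.mean())
        print(f"normalized mean bias error: {nmbe:.2f} %")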

  8. A multi-model assessment of the co-benefits of climate mitigation for global air quality

    NARCIS (Netherlands)

    Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana; Riahi, Keywan; van Dingenen, Rita; Aleluia Reis, Lara; Calvin, Katherine; Dentener, Frank; Drouet, Laurent; Fujimori, Shinichiro; Harmsen, Mathijs; Luderer, Gunnar; Heyes, Chris; Strefler, Jessica; Tavoni, Massimo; van Vuuren, Detlef P.

    2016-01-01

    We present a model comparison study that combines multiple integrated assessment models with a reduced-form global air quality model to assess the potential co-benefits of global climate mitigation policies in relation to the World Health Organization (WHO) goals on air quality and health. We

  9. Assessing and improving the quality of modeling : a series of empirical studies about the UML

    NARCIS (Netherlands)

    Lange, C.F.J.

    2007-01-01

    Assessing and Improving the Quality of Modeling A Series of Empirical Studies about the UML This thesis addresses the assessment and improvement of the quality of modeling in software engineering. In particular, we focus on the Unified Modeling Language (UML), which is the de facto standard in

  10. Review and assessment of models for predicting the migration of radionuclides through rivers

    International Nuclear Information System (INIS)

    Monte, Luigi; Boyer, Patrick; Brittain, John E.; Haakanson, Lars; Lepicard, Samuel; Smith, Jim T.

    2005-01-01

    The present paper summarises the results of the review and assessment of state-of-the-art models developed for predicting the migration of radionuclides through rivers. The different approaches of the models to predict the behaviour of radionuclides in lotic ecosystems are presented and compared. The models were classified and evaluated according to their main methodological approaches. The results of an exercise of model application to specific contamination scenarios aimed at assessing and comparing the model performances were described. A critical evaluation and analysis of the uncertainty of the models was carried out. The main factors influencing the inherent uncertainty of the models, such as the incompleteness of the actual knowledge and the intrinsic environmental and biological variability of the processes controlling the behaviour of radionuclides in rivers, are analysed

  11. Curonian Lagoon drainage basin modelling and assessment of climate change impact

    Directory of Open Access Journals (Sweden)

    Natalja Čerkasova

    2016-04-01

    Full Text Available The Curonian Lagoon, which is the largest European coastal lagoon with a surface area of 1578 km2 and a drainage area of 100,458 km2, is facing a severe eutrophication problem. With its increasing water management difficulties, the need arose for a sophisticated hydrological model of the Curonian Lagoon's drainage area, in order to assess possible changes resulting from local and global processes. In this study, we developed and calibrated a sophisticated hydrological model with the required accuracy, as an initial step toward a future modelling framework that aims to correctly predict the movement of pesticides, sediments or nutrients, and to evaluate water-management practices. The Soil and Water Assessment Tool was used to implement a model of the study area and to assess the impact of climate-change scenarios on the run-off of the Nemunas River and the Minija River, which are located in the Curonian Lagoon's drainage basin. The model's calibration and validation were performed using monthly streamflow data, and evaluated using the coefficient of determination (R2) and the Nash-Sutcliffe model efficiency coefficient (NSE). The calculated values of R2 and NSE for the Nemunas and Minija River stations were 0.81 and 0.79 for the calibration period, and 0.679 and 0.602 for the validation period. Two potential climate-change scenarios were developed within the general patterns of near-term climate projections, as defined by the Intergovernmental Panel on Climate Change Fifth Assessment Report: a pessimistic scenario (substantial changes in precipitation and temperature) and an optimistic one (insubstantial changes in precipitation and temperature). Both simulations produce similar general patterns in river-discharge change: a strong increase (up to 22%) in the winter months, especially in February, a decrease during the spring (up to 10%) and summer (up to 18%), and a slight increase during the autumn (up to 10%).
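
    Because the calibration above is judged with the coefficient of determination and the Nash-Sutcliffe efficiency, a minimal Python sketch of both statistics follows; the observed and simulated monthly discharges are invented placeholders, not Nemunas or Minija data.

        import numpy as np

        def nash_sutcliffe(obs, sim):
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def r_squared(obs, sim):
            r = np.corrcoef(obs, sim)[0, 1]
            return r ** 2

        # Placeholder monthly streamflow (m3/s) - illustrative only
        obs = [420, 510, 880, 1200, 760, 430, 300, 280, 330, 450, 560, 610]
        sim = [400, 540, 820, 1150, 800, 460, 320, 260, 350, 430, 590, 640]

        print(f"NSE = {nash_sutcliffe(obs, sim):.3f}, R2 = {r_squared(obs, sim):.3f}")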

  12. Task-based dermal exposure models for regulatory risk assessment

    NARCIS (Netherlands)

    Warren, N.D.; Marquart, H.; Christopher, Y.; Laitinen, J.; Hemmen, J.J. van

    2006-01-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of

  13. The Active Lava Flows of Kilauea Volcano, Hawaii

    Indian Academy of Sciences (India)

    'lahar' is from Indonesia, a country with some of the most active and destructive volcanoes .... tourist-dependent businesses such as airlines, rental car companies, and hotels. ... excellent viewing conditions and photo opportunities. The heat.

  14. Assessing Impacts of Climate Change on Forests: The State of Biological Modeling

    Science.gov (United States)

    Dale, V. H.; Rauscher, H. M.

    1993-04-06

    Models that address the impacts of climate change on forests are reviewed at four levels of biological organization: global, regional or landscape, community, and tree. The models are compared as to their ability to assess changes in greenhouse gas flux, land use, maps of forest type or species composition, forest resource productivity, forest health, biodiversity, and wildlife habitat. No one model can address all of these impacts, but landscape transition models and regional vegetation and land-use models consider the largest number of impacts. Developing landscape vegetation dynamics models of functional groups is suggested as a means to integrate the theory of both landscape ecology and individual tree responses to climate change. Risk assessment methodologies can be adapted to deal with the impacts of climate change at various spatial and temporal scales. Four areas of research development are identified: (1) linking socioeconomic and ecologic models, (2) interfacing forest models at different scales, (3) obtaining data on susceptibility of trees and forests to changes in climate and disturbance regimes, and (4) relating information from different scales.

  15. A sensitivity analysis of a radiological assessment model for Arctic waters

    DEFF Research Database (Denmark)

    Nielsen, S.P.

    1998-01-01

    A model based on compartment analysis has been developed to simulate the dispersion of radionuclides in Arctic waters for an assessment of doses to man. The model predicts concentrations of radionuclides in the marine environment and doses to man from a range of exposure pathways. A parameter sensitivity analysis has identified components of the model that are potentially important contributors to the predictive accuracy of doses to individuals of critical groups as well as to the world population. The components investigated include features associated with water transport and mixing, particle scavenging, water-sediment interaction, biological uptake, ice transport and fish migration. Two independent evaluations of the release of radioactivity from dumped nuclear waste in the Kara Sea have been used as source terms for the dose calculations.
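
    A compartment model of the kind described above reduces to a small set of linear ordinary differential equations; the two-box sketch below (water and sediment) with one-at-a-time perturbation of the transfer rates is only a schematic illustration with assumed rate constants, not the Arctic assessment model itself.

        import numpy as np
        from scipy.integrate import solve_ivp

        def two_box(t, y, k12, k21, k_decay):
            """y[0]: activity in water, y[1]: activity in sediment (arbitrary units)."""
            water, sed = y
            return [-(k12 + k_decay) * water + k21 * sed,
                    k12 * water - (k21 + k_decay) * sed]

        base = dict(k12=0.30, k21=0.05, k_decay=0.025)    # assumed transfer/decay rates (1/yr)

        def water_conc_after(years, **k):
            sol = solve_ivp(two_box, (0, years), [1.0, 0.0],
                            args=(k["k12"], k["k21"], k["k_decay"]), rtol=1e-8)
            return sol.y[0, -1]

        ref = water_conc_after(20, **base)
        for name in base:                                 # one-at-a-time sensitivity (+10% in each rate)
            pert = dict(base, **{name: base[name] * 1.1})
            change = 100 * (water_conc_after(20, **pert) - ref) / ref
            print(f"+10% in {name:7s} -> {change:+.1f} % change in water concentration at 20 yr")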

  16. Regional soil erosion assessment in Slovakia using modelling and farmer's participation

    DEFF Research Database (Denmark)

    Kenderessy, Pavol; Veihe, Anita

    ... for erosion risk assessments at the landscape scale in Slovakia using a combination of quantitative and qualitative methods for assessing spatial prediction patterns. The model was set up for the Paríž catchment (239.93 km2) in south-western Slovakia. The area has been intensively cultivated primarily with cereals, sunflowers and corn and is characterised by poor cultivation practices and use of fertilizers leading to land degradation. As a first step, the initial raster-based modelling of soil loss and deposition has provided acceptable and realistic values. The predicted spatial patterns of erosion are now being identified using farmer participation to ensure that the 'correct' hot spot areas are being identified. In the end, scenarios will be set up to assess the effect of farming practices and/or conservation measures on soil erosion rates in the area.

  17. Developing a Sustainability Assessment Model to Analyze China’s Municipal Solid Waste Management Enhancement Strategy

    Directory of Open Access Journals (Sweden)

    Hua Li

    2015-01-01

    Full Text Available This study develops a sustainability assessment model for analysis and decision-making on the impact of China's municipal solid waste management enhancement strategy options, based on three waste treatment scenarios: landfill disposal, waste-to-energy incineration, and a combination of a material recovery facility and composting. The model employs life cycle assessment, health risk assessment, and full cost accounting to evaluate the treatment scenarios with regard to safeguarding public health, protecting the environment and conserving resources, and economic feasibility. The model then uses an analytic hierarchy process for an overall appraisal of sustainability. Results suggest that a combination of material recovery and composting is the most efficient option. The study results clarify sustainable attributes, suitable predications, evaluation modeling, and stakeholder involvement issues in solid waste management. The demonstration of the sustainability assessment model (SAM) provides flexibility by allowing assessment of a municipal solid waste management (MSWM) strategy on a case-by-case basis, taking into account site-specific factors; it therefore has the potential for flexible application in different communities and regions.
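
    Since the overall appraisal above relies on the analytic hierarchy process, the short sketch below shows the standard AHP priority calculation (principal eigenvector of a pairwise comparison matrix plus a consistency check); the pairwise judgments comparing the three treatment scenarios are hypothetical.

        import numpy as np

        # Hypothetical pairwise comparison of the three scenarios on overall sustainability
        # order: [landfill, waste-to-energy, material recovery + composting]
        A = np.array([[1.0, 1/3, 1/5],
                      [3.0, 1.0, 1/2],
                      [5.0, 2.0, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                     # AHP priority vector

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
        cr = ci / 0.58                               # Saaty's random index for n = 3 is 0.58
        print("priorities:", weights.round(3), " consistency ratio:", round(cr, 3))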

  18. Assessment and improvement of biotransfer models to cow’s milk and beef used in exposure assessment tools for organic pollutants

    OpenAIRE

    Takaki, Koki; Wade, Andrew J.; Collins, Christopher D.

    2015-01-01

    The aim of this study was to assess and improve the accuracy of biotransfer models for organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow's milk and beef used in human exposure assessment. The metabolic rate in cattle is known to be a key parameter for this biotransfer; however, few experimental data and no simulation methods are currently available. In this research, the metabolic rate was estimated using existing QSAR biodegradation models of microorganisms (BioWIN) and fish ...

  19. Modeling and Stability Assessment of Single-Phase Grid Synchronization Techniques

    DEFF Research Database (Denmark)

    Golestan, Saeed; Guerrero, Josep M.; Vasquez, Juan

    2018-01-01

    Modeling and stability assessment of grid synchronization techniques (GSTs) is of vital importance. This task is most often based on obtaining a linear time-invariant (LTI) model for the GST and applying standard stability tests to it. Another option is modeling and dynamics/stability assessment of GSTs in the linear time-periodic (LTP) framework, which has received very little attention. In this letter, the procedure of deriving the LTP model for single-phase GSTs is first demonstrated. The accuracy of the LTP model in predicting the GST dynamic behavior and stability is then evaluated and compared with that of the LTI one. Two well-known single-phase GSTs, i...

  20. Specifying and Refining a Measurement Model for a Simulation-Based Assessment. CSE Report 619.

    Science.gov (United States)

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in simulation-based assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance in a complex assessment. This paper describes a Bayesian approach to modeling and estimating…

  1. Triangular model integrating clinical teaching and assessment.

    Science.gov (United States)

    Abdelaziz, Adel; Koshak, Emad

    2014-01-01

    Structuring clinical teaching is a challenge facing medical education curriculum designers. A variety of instructional methods on different domains of learning are indicated to accommodate different learning styles. Conventional methods of clinical teaching, like training in ambulatory care settings, are prone to the factor of coincidence in having varieties of patient presentations. Accordingly, alternative methods of instruction are indicated to compensate for the deficiencies of these conventional methods. This paper presents an initiative that can be used to design a checklist as a blueprint to guide appropriate selection and implementation of teaching/learning and assessment methods in each of the educational courses and modules based on educational objectives. Three categories of instructional methods were identified, and within each a variety of methods were included. These categories are classroom-type settings, health services-based settings, and community service-based settings. Such categories have framed our triangular model of clinical teaching and assessment.

  2. Practical utilization of modeling and simulation in laboratory process waste assessments

    International Nuclear Information System (INIS)

    Lyttle, T.W.; Smith, D.M.; Weinrach, J.B.; Burns, M.L.

    1993-01-01

    At Los Alamos National Laboratory (LANL), facility waste streams tend to be small but highly diverse. Initial characterization of such waste streams is difficult, in part due to a lack of tools to assist the waste generators in completing such assessments. A methodology has been developed at LANL to allow process-knowledgeable field personnel to develop baseline waste generation assessments and to evaluate potential waste minimization technology. This process waste assessment (PWA) system is an application constructed within the process modeling system. The Process Modeling System (PMS) is an object-oriented, mass balance-based, discrete-event simulation using the Common Lisp Object System (CLOS). Analytical capabilities supported within the PWA system include: complete mass balance specifications, historical characterization of selected waste streams and generation of facility profiles for materials consumption, resource utilization and worker exposure. Anticipated development activities include provisions for a best available technologies (BAT) database and integration with the LANL facilities management Geographic Information System (GIS). The environments used to develop these assessment tools will be discussed in addition to a review of initial implementation results

  3. Psychometric model for safety culture assessment in nuclear research facilities

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, C.S. do, E-mail: claudio.souza@ctmsp.mar.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP), Av. Professor Lineu Prestes 2468, 05508-000 São Paulo, SP (Brazil); Andrade, D.A., E-mail: delvonei@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN – SP), Av. Professor Lineu Prestes 2242, 05508-000 São Paulo, SP (Brazil); Mesquita, R.N. de, E-mail: rnavarro@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN – SP), Av. Professor Lineu Prestes 2242, 05508-000 São Paulo, SP (Brazil)

    2017-04-01

    Highlights: • A psychometric model to evaluate ‘safety climate’ at nuclear research facilities. • The model presented evidence of good psychometric qualities. • The model was applied to nuclear research facilities in Brazil. • Some ‘safety culture’ weaknesses were detected in the assessed organization. • A potential tool to develop safety management programs in nuclear facilities. - Abstract: A safe and reliable operation of nuclear power plants depends not only on technical performance, but also on the people and on the organization. Organizational factors have been recognized as the main causal mechanisms of accidents by research organizations throughout the USA, Europe and Japan. Deficiencies related to these factors reveal weaknesses in the organization’s safety culture. A significant number of instruments to assess safety culture, based on psychometric models that evaluate safety climate through questionnaires and that are grounded in reliability and validity evidence, have been published in the health and ‘safety at work’ areas. However, there are few safety culture assessment instruments with these characteristics (reliability and validity) available in the nuclear literature. Therefore, this work proposes an instrument to evaluate, with valid and reliable measures, the safety climate of nuclear research facilities. The instrument was developed based on methodological principles applied to research modeling, and its psychometric properties were evaluated by a reliability analysis and validation of content, face and construct. The instrument was applied to an important nuclear research organization in Brazil. This organization comprises 4 research reactors and many nuclear laboratories. The survey results made possible a demographic characterization and the identification of some possible safety culture weaknesses, pointing out potential areas to be improved in the assessed organization. Good evidence of reliability with Cronbach's alpha

  4. Psychometric model for safety culture assessment in nuclear research facilities

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Andrade, D.A.; Mesquita, R.N. de

    2017-01-01

    Highlights: • A psychometric model to evaluate ‘safety climate’ at nuclear research facilities. • The model presented evidence of good psychometric qualities. • The model was applied to nuclear research facilities in Brazil. • Some ‘safety culture’ weaknesses were detected in the assessed organization. • A potential tool to develop safety management programs in nuclear facilities. - Abstract: A safe and reliable operation of nuclear power plants depends not only on technical performance, but also on the people and on the organization. Organizational factors have been recognized as the main causal mechanisms of accidents by research organizations throughout the USA, Europe and Japan. Deficiencies related to these factors reveal weaknesses in the organization’s safety culture. A significant number of instruments to assess safety culture, based on psychometric models that evaluate safety climate through questionnaires and that are grounded in reliability and validity evidence, have been published in the health and ‘safety at work’ areas. However, there are few safety culture assessment instruments with these characteristics (reliability and validity) available in the nuclear literature. Therefore, this work proposes an instrument to evaluate, with valid and reliable measures, the safety climate of nuclear research facilities. The instrument was developed based on methodological principles applied to research modeling, and its psychometric properties were evaluated by a reliability analysis and validation of content, face and construct. The instrument was applied to an important nuclear research organization in Brazil. This organization comprises 4 research reactors and many nuclear laboratories. The survey results made possible a demographic characterization and the identification of some possible safety culture weaknesses, pointing out potential areas to be improved in the assessed organization. Good evidence of reliability with Cronbach's alpha
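
    The reliability analysis mentioned in both records above is typically reported as Cronbach's alpha; the sketch below computes it from a small matrix of hypothetical Likert-type responses (respondents by items) and is not based on the survey data of the cited study.

        import numpy as np

        def cronbach_alpha(scores):
            """scores: 2-D array, rows = respondents, columns = questionnaire items."""
            scores = np.asarray(scores, float)
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1)
            total_var = scores.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        # Hypothetical 1-5 Likert responses from 8 respondents to 5 safety-climate items
        responses = np.array([[4, 5, 4, 4, 5],
                              [3, 3, 4, 3, 3],
                              [5, 5, 5, 4, 5],
                              [2, 3, 2, 2, 3],
                              [4, 4, 4, 5, 4],
                              [3, 4, 3, 3, 3],
                              [5, 4, 5, 5, 5],
                              [2, 2, 3, 2, 2]])
        print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")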

  5. CAirTOX, An inter-media transfer model for assessing indirect exposures to hazardous air contaminants

    International Nuclear Information System (INIS)

    McKone, T.E.

    1994-01-01

    Risk assessment is a quantitative evaluation of information on the potential health hazards of environmental contaminants and the extent of human exposure to these contaminants. As applied to toxic chemical emissions to air, risk assessment involves four interrelated steps. These are (1) determination of source concentrations or emission characteristics, (2) exposure assessment, (3) toxicity assessment, and (4) risk characterization. These steps can be carried out with assistance from analytical models in order to estimate the potential risk associated with existing and future releases. CAirTOX has been developed as a spreadsheet model to assist in making these types of calculations. CAirTOX follows an approach that has been incorporated into the CalTOX model, which was developed for the California Department of Toxic Substances Control. With CAirTOX, we can address how contaminants released to an air basin can lead to contamination of soil, food, surface water, and sediments. The modeling effort includes a multimedia transport and transformation model, exposure scenario models, and efforts to quantify uncertainty in multimedia, multiple-pathway exposure assessments. The capacity to explicitly address uncertainty has been incorporated into the model in two ways. First, the spreadsheet form of the model makes it compatible with Monte Carlo add-on programs that are available for uncertainty analysis. Second, all model inputs are specified in terms of an arithmetic mean and coefficient of variation so that uncertainty analyses can be carried out.
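
    Because all CAirTOX inputs are specified as an arithmetic mean and coefficient of variation, the sketch below shows one common way such inputs can be sampled (lognormal distributions parameterized from mean and CV) and propagated through a simple intake expression; the parameter values and the dose formula are illustrative assumptions, not the CAirTOX equations.

        import numpy as np

        def lognormal_from_mean_cv(mean, cv, size, rng):
            """Sample a lognormal variate whose arithmetic mean and CV match the inputs."""
            sigma2 = np.log(1.0 + cv ** 2)
            mu = np.log(mean) - 0.5 * sigma2
            return rng.lognormal(mu, np.sqrt(sigma2), size)

        rng = np.random.default_rng(42)
        n = 100_000

        # Illustrative inputs: (arithmetic mean, coefficient of variation)
        air_conc  = lognormal_from_mean_cv(1.0e-3, 0.8, n, rng)   # mg/m3
        intake    = lognormal_from_mean_cv(20.0,   0.3, n, rng)   # m3/day inhaled
        body_mass = lognormal_from_mean_cv(70.0,   0.2, n, rng)   # kg

        dose = air_conc * intake / body_mass                      # mg/(kg day), toy pathway
        print("median dose:", np.median(dose), " 95th percentile:", np.percentile(dose, 95))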

  6. Application of mixed models for the assessment genotype and ...

    African Journals Online (AJOL)

    Application of mixed models for the assessment of genotype and environment interactions in cotton (Gossypium hirsutum) cultivars in Mozambique. ... The cultivars ISA 205, STAM 42 and REMU 40 showed superior productivity when they were selected by the Harmonic Mean of Genotypic Values (HMGV) criterion in relation ...

  7. Invited review: A position on the Global Livestock Environmental Assessment Model (GLEAM).

    Science.gov (United States)

    MacLeod, M J; Vellinga, T; Opio, C; Falcucci, A; Tempio, G; Henderson, B; Makkar, H; Mottet, A; Robinson, T; Steinfeld, H; Gerber, P J

    2018-02-01

    The livestock sector is one of the fastest growing subsectors of the agricultural economy and, while it makes a major contribution to global food supply and economic development, it also consumes significant amounts of natural resources and alters the environment. In order to improve our understanding of the global environmental impact of livestock supply chains, the Food and Agriculture Organization of the United Nations has developed the Global Livestock Environmental Assessment Model (GLEAM). The purpose of this paper is to provide a review of GLEAM. Specifically, it explains the model architecture, methods and functionality, that is, the types of analysis that the model can perform. The model focuses primarily on the quantification of greenhouse gas emissions arising from the production of the 11 main livestock commodities. The model inputs and outputs are managed and produced as raster data sets, with a spatial resolution of 0.05 decimal degrees. The Global Livestock Environmental Assessment Model v1.0 consists of five distinct modules: (a) the Herd Module; (b) the Manure Module; (c) the Feed Module; (d) the System Module; (e) the Allocation Module. In terms of the modelling approach, GLEAM has several advantages. For example, spatial information on livestock distributions and crop yields enables rations to be derived that reflect the local availability of feed resources in developing countries. The Global Livestock Environmental Assessment Model also contains a herd model that enables livestock statistics to be disaggregated and variation in livestock performance and management to be captured. Priorities for future development of GLEAM include: improving data quality and the methods used to perform emissions calculations; extending the scope of the model to include selected additional environmental impacts and to enable predictive modelling; and improving the utility of GLEAM output.

  8. Predicting the natural flow regime: Models for assessing hydrological alteration in streams

    Science.gov (United States)

    Carlisle, D.M.; Falcone, J.; Wolock, D.M.; Meador, M.R.; Norris, R.H.

    2009-01-01

    Understanding the extent to which natural streamflow characteristics have been altered is an important consideration for ecological assessments of streams. Assessing hydrologic condition requires that we quantify the attributes of the flow regime that would be expected in the absence of anthropogenic modifications. The objective of this study was to evaluate whether selected streamflow characteristics could be predicted at regional and national scales using geospatial data. Long-term, gaged river basins distributed throughout the contiguous US that had streamflow characteristics representing least-disturbed or near-pristine conditions were identified. Thirteen metrics of the magnitude, frequency, duration, timing and rate of change of streamflow were calculated using a 20-50 year period of record for each site. We used random forests (RF), a robust statistical modelling approach, to develop models that predicted the value for each streamflow metric using natural watershed characteristics. We compared the performance (i.e. bias and precision) of national- and regional-scale predictive models to that of models based on landscape classifications, including major river basins, ecoregions and hydrologic landscape regions (HLR). For all hydrologic metrics, landscape stratification models produced estimates that were less biased and more precise than a null model that accounted for no natural variability. Predictive models at the national and regional scale performed equally well, and substantially improved predictions of all hydrologic metrics relative to landscape stratification models. Prediction error rates ranged from 15 to 40%, but were <25% for most metrics. We selected three gaged, non-reference sites to illustrate how predictive models could be used to assess hydrologic condition. These examples show how the models accurately estimate predisturbance conditions and are sensitive to changes in streamflow variability associated with long-term land-use change. We also
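
    A minimal sketch of the random-forest approach described above, using scikit-learn to predict a single streamflow metric from watershed characteristics, is given below; the predictors (drainage area, precipitation, slope) and the response are synthetic so the example stays self-contained.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 400                                           # synthetic "reference" gaged basins
        area   = rng.lognormal(5.0, 1.0, n)               # km2
        precip = rng.uniform(400, 2500, n)                # mm/yr
        slope  = rng.uniform(0.001, 0.3, n)               # m/m

        # Fabricated streamflow metric (e.g. mean annual flow) with noise
        q_mean = 0.0005 * area * precip * (1 + 2 * slope) * rng.lognormal(0, 0.2, n)

        X = np.column_stack([area, precip, slope])
        X_tr, X_te, y_tr, y_te = train_test_split(X, q_mean, test_size=0.25, random_state=0)

        rf = RandomForestRegressor(n_estimators=300, random_state=0)
        rf.fit(X_tr, y_tr)

        pred = rf.predict(X_te)
        error = 100 * np.abs(pred - y_te) / y_te
        print("median prediction error: %.1f %%" % np.median(error))
        print("feature importances (area, precip, slope):", rf.feature_importances_.round(2))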

  9. Conceptual adsorption models and open issues pertaining to performance assessment

    International Nuclear Information System (INIS)

    Serne, R.J.

    1992-01-01

    Recently, several articles have been published that question the appropriateness of the distribution coefficient, Rd, concept for quantifying radionuclide migration. Several distinct issues surrounding the modeling of nuclide retardation are discussed here. The first section defines adsorption terminology and discusses various adsorption processes. The next section describes five commonly used adsorption conceptual models, specifically emphasizing which attributes that affect adsorption are explicitly accommodated in each model. I also review efforts to incorporate each adsorption model into performance assessment transport computer codes. The five adsorption conceptual models are (1) the constant Rd model, (2) the parametric Rd model, (3) isotherm adsorption models, (4) mass action adsorption models, and (5) surface-complexation with electrostatics models. The final section discusses the adequacy of the distribution ratio concept, the adequacy of transport calculations that rely on constant retardation factors and the status of incorporating sophisticated adsorption models into transport codes. 86 refs., 1 fig., 1 tab
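
    The constant-Rd model discussed above enters transport calculations through the familiar retardation factor R = 1 + (rho_b/theta)*Kd; the short sketch below evaluates R and the retarded velocity for a few illustrative Kd values, with assumed (not site-specific) bulk density, porosity and groundwater velocity.

        # Retardation factor for the constant-Rd (linear, reversible sorption) model:
        #   R = 1 + (rho_b / theta) * Kd, and retarded radionuclide velocity v_r = v / R
        rho_b = 1.6    # bulk density, g/cm3 (assumed)
        theta = 0.30   # effective porosity, dimensionless (assumed)
        v = 10.0       # groundwater velocity, m/yr (assumed)

        for kd_ml_per_g in [0.0, 0.1, 1.0, 10.0]:          # distribution coefficient, mL/g
            R = 1.0 + (rho_b / theta) * kd_ml_per_g
            print(f"Kd = {kd_ml_per_g:5.1f} mL/g  ->  R = {R:6.1f},  radionuclide velocity = {v / R:6.3f} m/yr")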

  10. Ensemble atmospheric dispersion modeling for emergency response consequence assessments

    International Nuclear Information System (INIS)

    Addis, R.P.; Buckley, R.L.

    2003-01-01

    Full text: Prognostic atmospheric dispersion models are used to generate consequence assessments, which assist decision-makers in the event of a release from a nuclear facility. Differences in the forecast wind fields generated by various meteorological agencies, differences in the transport and diffusion models themselves, as well as differences in the way these models treat the release source term, all may result in differences in the simulated plumes. This talk will address the U.S. participation in the European ENSEMBLE project, and present a perspective on how ensemble techniques may be used to enable atmospheric modelers to provide decision-makers with a more realistic understanding of how both the atmosphere and the models behave. Meteorological forecasts generated by numerical models from national and multinational meteorological agencies provide individual realizations of three-dimensional, time-dependent atmospheric wind fields. These wind fields may be used to drive atmospheric dispersion (transport and diffusion) models, or they may be used to initiate other, finer-resolution meteorological models, which in turn drive dispersion models. Many modeling agencies now utilize ensemble-modeling techniques to determine how sensitive the prognostic fields are to minor perturbations in the model parameters. However, the European Union programs RTMOD and ENSEMBLE are the first projects to utilize a WEB-based ensemble approach to interpret the output from atmospheric dispersion models. The ensembles produced are different from those generated by meteorological forecasting centers in that they are ensembles of dispersion model outputs from many different atmospheric transport and diffusion models utilizing prognostic atmospheric fields from several different forecast centers. As such, they enable a decision-maker to consider the uncertainty in the plume transport and growth as a result of the differences in the forecast wind fields as well as the differences in the

  11. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    Energy Technology Data Exchange (ETDEWEB)

    G. Saulnier and W. Statham

    2006-04-16

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Performance Assessment Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash-flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following analogous characteristics as compared to the Yucca Mountain repository site: (1) Analogous source--UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geology--(i.e. fractured, welded, and altered rhyolitic ash-flow tuffs); (3) Analogous climate--Semiarid to arid; (4) Analogous setting--Volcanic tuffs overlie carbonate rocks; and (5) Analogous geochemistry--Oxidizing conditions. Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table.

  12. THE PENA BLANCA NATURAL ANALOGUE PERFORMANCE ASSESSMENT MODEL

    International Nuclear Information System (INIS)

    G. Saulnier; W. Statham

    2006-01-01

    The Nopal I uranium mine in the Sierra Pena Blanca, Chihuahua, Mexico serves as a natural analogue to the Yucca Mountain repository. The Pena Blanca Natural Analogue Performance Assessment Model simulates the mobilization and transport of radionuclides that are released from the mine and transported to the saturated zone. The Pena Blanca Natural Analogue Performance Assessment Model uses probabilistic simulations of hydrogeologic processes that are analogous to the processes that occur at the Yucca Mountain site. The Nopal I uranium deposit lies in fractured, welded, and altered rhyolitic ash-flow tuffs that overlie carbonate rocks, a setting analogous to the geologic formations at the Yucca Mountain site. The Nopal I mine site has the following analogous characteristics as compared to the Yucca Mountain repository site: (1) Analogous source--UO2 uranium ore deposit = spent nuclear fuel in the repository; (2) Analogous geology--(i.e. fractured, welded, and altered rhyolitic ash-flow tuffs); (3) Analogous climate--Semiarid to arid; (4) Analogous setting--Volcanic tuffs overlie carbonate rocks; and (5) Analogous geochemistry--Oxidizing conditions. Analogous hydrogeology: The ore deposit lies in the unsaturated zone above the water table.

  13. Review of Project SAFE: Comments on biosphere conceptual model description and risk assessment methodology

    International Nuclear Information System (INIS)

    Klos, Richard; Wilmot, Roger

    2002-09-01

    The Swedish Nuclear Fuel and Waste Management Company's (SKB's) most recent assessment of the safety of the Forsmark repository for low-level and intermediate-level waste (Project SAFE) is currently undergoing review by the Swedish regulators. As part of its review, the Swedish Radiation Protection Institute (SSI) identified that two components of SAFE require more detailed review: (i) the conceptual model description of the biosphere system, and (ii) SKB's risk assessment methodology. We have reviewed the biosphere system interaction matrix and how this has been used in the identification, justification and description of biosphere models for radiological assessment purposes. The risk assessment methodology has been reviewed considering in particular issues associated with scenario selection, assessment timescale, and the probability and risk associated with the well scenario. There is an extensive range of supporting information on which biosphere modelling in Project SAFE is based. However, the link between this material and the biosphere models themselves is not clearly set out. This leads to some contradictions and mis-matches between description and implementation. One example concerns the representation of the geosphere-biosphere interface. The supporting description of lakes indicates that interaction between groundwaters entering the biosphere through lake bed sediments could lead to accumulations of radionuclides in sediments. These sediments may become agricultural areas at some time in the future. In the numerical modelling of the biosphere carried out in Project SAFE, the direct accumulation of contaminants in bed sediments is not represented. Application of a more rigorous procedure to ensure numerical models are fit for purpose is recommended, paying more attention to issues associated with the geosphere-biosphere interface. A more structured approach to risk assessment would be beneficial, with a better explanation of the difference between

  14. Description of codes and models to be used in risk assessment

    International Nuclear Information System (INIS)

    1991-09-01

    Human health and environmental risk assessments will be performed as part of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA) remedial investigation/feasibility study (RI/FS) activities at the Hanford Site. Analytical and computer-encoded numerical models are commonly used during both the remedial investigation (RI) and feasibility study (FS) to predict or estimate the concentration of contaminants at the point of exposure to humans and/or the environment. This document has been prepared to identify the computer codes that will be used in support of RI/FS human health and environmental risk assessments at the Hanford Site. In addition to the CERCLA RI/FS process, it is recommended that these computer codes be used when fate and transport analyses are required for other activities. Additional computer codes may be used for other purposes (e.g., design of tracer tests, location of observation wells, etc.). This document provides guidance for unit managers in charge of RI/FS activities. Use of the same computer codes for all analytical activities at the Hanford Site will promote consistency, reduce the effort required to develop, validate, and implement models to simulate Hanford Site conditions, and expedite regulatory review. The discussion provides a description of how models will likely be developed and utilized at the Hanford Site. It is intended to summarize previous environmental-related modeling at the Hanford Site and provide background for future model development. The modeling capabilities that are desirable for the Hanford Site and the codes that were evaluated are also described. The recommendations include the codes proposed to support future risk assessment modeling at the Hanford Site and provide the rationale for the codes selected. 27 refs., 3 figs., 1 tab

  15. Teachers’ design and use of rubrics and modeling activities for formative assessment of lower secondary school students’ modeling competence in science

    DEFF Research Database (Denmark)

    Nielsen, Sanne Schnell

    Modeling competence plays a central role in the recently revised science curriculum in Denmark. Teachers are requested to assess students' learning progress targeting the modeling competence in their daily teaching. Accordingly, the teachers must understand this competence and have suitable assessment criteria and methods at hand. However, the curriculum descriptions of the modeling competence concept are only phrased in general terms and not based on a systematic framework.

  16. Towards a realistic approach to validation of reactive transport models for performance assessment

    International Nuclear Information System (INIS)

    Siegel, M.D.

    1993-01-01

    Performance assessment calculations are based on geochemical models that assume that interactions among radionuclides, rocks and groundwaters under natural conditions can be estimated or bound by data obtained from laboratory-scale studies. The data include radionuclide distribution coefficients, measured in saturated batch systems of powdered rocks, and retardation factors measured in short-term column experiments. Traditional approaches to model validation cannot be applied in a straightforward manner to the simple reactive transport models that use these data. An approach to model validation in support of performance assessment is described in this paper. It is based on a recognition of different levels of model validity and is compatible with the requirements of current regulations for high-level waste disposal. Activities that are being carried out in support of this approach include (1) laboratory and numerical experiments to test the validity of important assumptions inherent in current performance assessment methodologies, (2) integrated transport experiments, and (3) development of a robust coupled reaction/transport code for sensitivity analyses using massively parallel computers

  17. Quantifying the impact of model inaccuracy in climate change impact assessment studies using an agro-hydrological model

    NARCIS (Netherlands)

    Droogers, P.; Loon, van A.F.; Immerzeel, W.W.

    2008-01-01

    Numerical simulation models are frequently applied to assess the impact of climate change on hydrology and agriculture. A common hypothesis is that unavoidable model errors are reflected in the reference situation as well as in the climate change situation so that by comparing reference to scenario

  18. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.

  19. Validation study of safety assessment model for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Munakata, Masahiro; Takeda, Seiji; Kimura, Hideo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2001-12-01

    The JAERI-AECL collaborative research program has been conducted to validate groundwater flow and radionuclide transport models for safety assessment. JAERI has developed a geostatistical model for radionuclide transport through heterogeneous geological media and verified it using the results of field tracer tests. The simulated tracer plumes explain the experimental tracer plumes favorably. A regional groundwater flow and transport model using site-scale parameters obtained from the tracer tests has been verified by comparing simulation results with observations of a natural environmental tracer. (author)

  20. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  1. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
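
    Two of the approaches summarized in the two records above, maximum-likelihood fitting of a parametric distribution and Bayesian updating of a prior when new data arrive, are sketched below; the sample data and the normal-mean conjugate update are illustrative choices, not the elicitation procedures of the cited report.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # (a) Fit a continuous distribution to data by maximum likelihood
        data = rng.lognormal(mean=0.5, sigma=0.4, size=200)     # e.g. a hydraulic property
        shape, loc, scale = stats.lognorm.fit(data, floc=0)
        print("fitted lognormal: sigma=%.3f, median=%.3f" % (shape, scale))

        # (b) Bayesian updating of a normal mean with known variance (conjugate update)
        prior_mean, prior_var = 2.0, 1.0        # prior belief about a parameter (assumed)
        obs, obs_var = np.array([2.6, 2.4, 2.9]), 0.25
        n = len(obs)
        post_var = 1.0 / (1.0 / prior_var + n / obs_var)
        post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)
        print("posterior mean=%.2f, sd=%.2f" % (post_mean, post_var ** 0.5))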

  2. A Remote Sensing-Derived Corn Yield Assessment Model

    Science.gov (United States)

    Shrestha, Ranjay Man

    ... be further associated with the actual yield. Utilizing satellite remote sensing products, such as daily NDVI derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) at 250 m pixel size, crop yield estimation can be performed at a very fine spatial resolution. Therefore, this study examined the potential of these daily NDVI products within agricultural studies and crop yield assessments. In this study, a regression-based approach was proposed to estimate annual corn yield through changes in the MODIS daily NDVI time series. The relationship between daily NDVI and corn yield was well defined and established, and as changes in corn phenology and yield were directly reflected by changes in NDVI within the growing season, these two entities were combined to develop a relational model. The model was trained using 15 years (2000-2014) of historical NDVI and county-level corn yield data for four major corn producing states: Kansas, Nebraska, Iowa, and Indiana, representing four climatic regions as South, West North Central, East North Central, and Central, respectively, within the U.S. Corn Belt area. The model's goodness of fit was well defined with a high coefficient of determination (R2>0.81). Similarly, using 2015 yield data for validation, an average accuracy of 92% signified the performance of the model in estimating corn yield at the county level. Besides providing county-level corn yield estimations, the derived model was also accurate enough to estimate the yield at a finer spatial resolution (field level). The model's assessment accuracy was evaluated using randomly selected field-level corn yields within the study area for 2014, 2015, and 2016. A total of over 120 plot-level corn yield records were used for validation, and the overall average accuracy was 87%, which statistically justified the model's capability to estimate plot-level corn yield. Additionally, the proposed model was applied to impact estimation by examining the changes in corn yield
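
    A regression of the kind outlined above can be sketched by reducing each season's daily NDVI curve to a few features and relating them to yield; the NDVI curves and yields below are fabricated stand-ins for the MODIS and county yield data used in the study.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        n_county_years, n_days = 200, 180            # synthetic growing seasons of daily NDVI

        peak = rng.uniform(0.55, 0.9, n_county_years)
        t = np.arange(n_days)
        ndvi = peak[:, None] * np.exp(-((t - 100) / 45.0) ** 2) + rng.normal(0, 0.02, (n_county_years, n_days))

        # Fabricated yields (t/ha) loosely tied to seasonal NDVI peak and integral
        yield_t_ha = 2.0 + 9.0 * peak + 0.002 * ndvi.sum(axis=1) + rng.normal(0, 0.3, n_county_years)

        features = np.column_stack([ndvi.max(axis=1), ndvi.sum(axis=1)])   # peak and time-integrated NDVI
        model = LinearRegression().fit(features[:150], yield_t_ha[:150])

        pred = model.predict(features[150:])
        accuracy = 100 * (1 - np.abs(pred - yield_t_ha[150:]) / yield_t_ha[150:])
        print("R2 on held-out years: %.2f, mean accuracy: %.0f %%"
              % (model.score(features[150:], yield_t_ha[150:]), accuracy.mean()))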

  3. OMNIITOX - operational life-cycle impact assessment models and information tools for practitioners

    DEFF Research Database (Denmark)

    Molander, S; Lidholm, Peter; Schowanek, Diederik

    2004-01-01

    This article is the preamble to a set of articles describing initial results from an on-going European Commission funded, 5th Framework project called OMNIITOX, Operational Models aNd Information tools for Industrial applications of eco/TOXicological impact assessments. The different parts […] of the characterisation model(s) and limited input data on chemical properties, which often has resulted in the omission of toxicants from the LCIA, or at best a focus on well-characterised chemicals. The project addresses both problems and integrates models, as well as data, in an information system – the OMNIITOX IS. There is also a need for clarification of the relations between the (environmental) risk assessments of toxicants and LCIA, in addition to investigating the feasibility of introducing LCA into European chemicals legislation, tasks that were also addressed in the project.

  4. Assessing accuracy of point fire intervals across landscapes with simulation modelling

    Science.gov (United States)

    Russell A. Parsons; Emily K. Heyerdahl; Robert E. Keane; Brigitte Dorner; Joseph Fall

    2007-01-01

    We assessed accuracy in point fire intervals using a simulation model that sampled four spatially explicit simulated fire histories. These histories varied in fire frequency and size and were simulated on a flat landscape with two forest types (dry versus mesic). We used three sampling designs (random, systematic grids, and stratified). We assessed the sensitivity of...
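
    The authors' simulation model is not reproduced here; the sketch below only illustrates the underlying idea of drawing point fire histories with a known mean fire return interval and checking how well a random sample of points recovers it. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n_points, n_years = 2500, 500       # hypothetical landscape of sample points
true_mean_interval = 25.0           # "known" fire return interval (years)

# Simulate a fire history at each point as independent annual Bernoulli trials.
burned = rng.random((n_points, n_years)) < 1.0 / true_mean_interval

def point_intervals(history):
    """Return the fire-free intervals (in years) recorded at one point."""
    years = np.flatnonzero(history)
    return np.diff(years) if years.size > 1 else np.array([])

# Random sampling design: estimate the mean interval from a subset of points.
sample = rng.choice(n_points, size=100, replace=False)
intervals = np.concatenate([point_intervals(burned[i]) for i in sample])
print(f"true = {true_mean_interval:.1f} yr, estimated = {intervals.mean():.1f} yr "
      f"from {intervals.size} intervals")
```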

  5. Utility of Social Modeling in Assessment of a State's Propensity for Nuclear Proliferation

    International Nuclear Information System (INIS)

    Coles, Garill A.; Brothers, Alan J.; Whitney, Paul D.; Dalton, Angela C.; Olson, Jarrod; White, Amanda M.; Cooley, Scott K.; Youchak, Paul M.; Stafford, Samuel V.

    2011-01-01

    This report is the third and final report of a set of three reports documenting research for the U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA) Office of Nonproliferation Research and Development NA-22 Simulations, Algorithms, and Modeling program, which investigates how social modeling can be used to improve proliferation assessment for informing nuclear security, policy, safeguards, design of nuclear systems, and research decisions. Social modeling has not been used to any significant extent in proliferation studies. This report focuses on the utility of social modeling as applied to the assessment of a State's propensity to develop a nuclear weapons program.

  6. Water quality assessment and meta model development in Melen watershed - Turkey.

    Science.gov (United States)

    Erturk, Ali; Gurel, Melike; Ekdal, Alpaslan; Tavsan, Cigdem; Ugurluoglu, Aysegul; Seker, Dursun Zafer; Tanik, Aysegul; Ozturk, Izzet

    2010-07-01

    Istanbul, being one of the most highly populated metropolitan areas of the world, has been facing water scarcity for the past decade. Water transfer from the Melen Watershed was considered the most feasible option to supply water to Istanbul due to its high water potential and relatively less degraded water quality. This study consists of two parts. In the first part, water quality data covering 26 parameters from 5 monitoring stations were analyzed and assessed according to the requirements of the "Quality Required of Surface Water Intended for the Abstraction of Drinking Water" regulation. In the second part, a one-dimensional stream water quality model with simple water quality kinetics was developed. It formed a basic design for more advanced water quality models for the watershed. The reason for assessing the water quality data and developing a model was to provide information for decision making on preliminary actions to prevent any further deterioration of the existing water quality. According to the water quality assessment at the water abstraction point, the Melen River has relatively poor water quality with regard to NH₄⁺, BOD₅, faecal streptococcus, manganese and phenol, and is unsuitable for drinking water abstraction in terms of COD, PO₄³⁻, total coliform, total suspended solids, mercury and total chromium. The results derived from the model were found to be consistent with the water quality assessment. The model also showed that the relatively high inorganic nitrogen and phosphorus concentrations along the streams are related to diffuse nutrient loads that should be managed together with municipal and industrial wastewaters. Copyright 2010 Elsevier Ltd. All rights reserved.
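
    The structure of the Melen model is not detailed in the abstract; the snippet below is only a generic sketch of the kind of one-dimensional, first-order-kinetics stream model it describes, here a steady plug-flow decay of a single constituent. The rate constant, velocity, and upstream concentration are assumptions.

```python
import numpy as np

# Generic plug-flow stream segment with first-order decay of one constituent
# (e.g., BOD): dC/dx = -(k / u) * C under steady flow.
k = 0.3 / 86400.0      # first-order decay rate (1/s), assumed
u = 0.4                # mean stream velocity (m/s), assumed
c0 = 8.0               # upstream concentration (mg/L), assumed
x = np.linspace(0.0, 50_000.0, 11)   # distance downstream (m)

c = c0 * np.exp(-k * x / u)
for xi, ci in zip(x, c):
    print(f"x = {xi / 1000.0:5.1f} km  C = {ci:5.2f} mg/L")
```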

  7. Model Evaluation and Uncertainty in Agricultural Impacts Assessments: Results and Strategies from the Agricultural Model Intercomparison and Improvement Project (AgMIP)

    Science.gov (United States)

    Rosenzweig, C.; Hatfield, J.; Jones, J. W.; Ruane, A. C.

    2012-12-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) is an international effort to assess the state of global agricultural modeling and to understand climate impacts on the agricultural sector. AgMIP connects the climate science, crop modeling, and agricultural economic modeling communities to generate probabilistic projections of current and future climate impacts. The goals of AgMIP are to improve substantially the characterization of risk of hunger and world food security due to climate change and to enhance adaptation capacity in both developing and developed countries. This presentation will describe the general approach of AgMIP, highlight AgMIP efforts to evaluate climate, crop, and economic models, and discuss AgMIP uncertainty assessments. Model evaluation efforts will be outlined using examples from various facets of AgMIP, including climate scenario generation, the wheat crop model intercomparison, and the global agricultural economics model intercomparison being led in collaboration with the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP). Strategies developed to quantify uncertainty in each component of AgMIP, as well as the propagation of uncertainty through the climate-crop-economic modeling framework, will be detailed and preliminary uncertainty assessments that highlight crucial areas requiring improved models and data collection will be introduced.

  8. Empirical assessment of a threshold model for sylvatic plague

    DEFF Research Database (Denmark)

    Davis, Stephen; Leirs, Herwig; Viljugrein, H.

    2007-01-01

    Plague surveillance programmes established in Kazakhstan, Central Asia, during the previous century have generated large plague archives that have been used to parameterize an abundance threshold model for sylvatic plague in great gerbil (Rhombomys opimus) populations. Here, we assess the model […] examine six hypotheses that could explain the resulting false positive predictions, namely (i) including end-of-outbreak data erroneously lowers the estimated threshold, (ii) too few gerbils were tested, (iii) plague becomes locally extinct, (iv) the abundance of fleas was too low, (v) the climate
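
    As a hedged illustration of how such an abundance threshold model can be scored against surveillance records, the sketch below counts true positives, false positives, and false negatives for threshold-based predictions; the threshold value and the synthetic records are assumptions, not the Kazakh archive data.

```python
import numpy as np

rng = np.random.default_rng(3)
threshold = 1.0                         # assumed abundance threshold (gerbils/ha)

# Synthetic surveillance records: host abundance and whether plague was observed.
abundance = rng.gamma(shape=2.0, scale=0.6, size=400)
p_outbreak = 1.0 / (1.0 + np.exp(-4.0 * (abundance - threshold)))
observed = rng.random(400) < p_outbreak

predicted = abundance > threshold
tp = np.sum(predicted & observed)
fp = np.sum(predicted & ~observed)      # false positives: predicted, not observed
fn = np.sum(~predicted & observed)
print(f"sensitivity = {tp / (tp + fn):.2f}, "
      f"false-positive fraction of predictions = {fp / (tp + fp):.2f}")
```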

  9. The implementation of assessment model based on character building to improve students’ discipline and achievement

    Science.gov (United States)

    Rusijono; Khotimah, K.

    2018-01-01

    The purpose of this research was to investigate the effect of implementing an assessment model based on character building to improve discipline and students' achievement. The assessment model based on character building includes three components: student behaviour, effort, and achievement. The model was implemented in the science philosophy and educational assessment courses of the Graduate Program of the Educational Technology Department, Educational Faculty, Universitas Negeri Surabaya. This research used a control group pre-test and post-test design. The data collection methods were observation and testing: observation was used to collect data on student discipline during instruction, while the test was used to collect data on student achievement. Moreover, the study applied a t-test to analyze the data. The results showed that the assessment model based on character building improved discipline and student achievement.

  10. State-of-the-art and research needs for oil spill impact assessment modelling

    Energy Technology Data Exchange (ETDEWEB)

    French-McCay, D. [Applied Science Associates Inc., South Kingstown, RI (United States)

    2009-07-01

    Many oil spill models focus on trajectory and fate in aquatic environments. Models designed to address subsurface oil concentrations typically overlay fates model concentration results on maps or grids of biological distributions to assess impacts. This paper discussed a state-of-the-art biological effects model designed to evaluate the impacts and dose of oil spill hydrocarbons on aquatic biota including birds, mammals, reptiles, fish, invertebrates and plants. The biological effects model was coupled to an oil trajectory and fates spill impact model application package (SIMAP) in order to obtain accurate spatial and temporal quantifications of oil distributions and hydrocarbon component concentrations. Processes simulated in the model included slick spreading, evaporation of volatiles from surface oil, transport on the water surface, and various types of oil dispersion and emulsification. The design of the model was discussed, as well as strategies used for applying the model for hindcasts and risk assessments. 204 refs., 3 tabs., 5 figs.

  11. State-of-the-art and research needs for oil spill impact assessment modelling

    International Nuclear Information System (INIS)

    French-McCay, D.

    2009-01-01

    Many oil spill models focus on trajectory and fate in aquatic environments. Models designed to address subsurface oil concentrations typically overlay fates model concentration results on maps or grids of biological distributions to assess impacts. This paper discussed a state-of-the-art biological effects model designed to evaluate the impacts and dose of oil spill hydrocarbons on aquatic biota including birds, mammals, reptiles, fish, invertebrates and plants. The biological effects model was coupled to an oil trajectory and fates spill impact model application package (SIMAP) in order to obtain accurate spatial and temporal quantifications of oil distributions and hydrocarbon component concentrations. Processes simulated in the model included slick spreading, evaporation of volatiles from surface oil, transport on the water surface, and various types of oil dispersion and emulsification. The design of the model was discussed, as well as strategies used for applying the model for hindcasts and risk assessments. 204 refs., 3 tabs., 5 figs

  12. A model for assessing the systemic vulnerability in landslide prone areas

    Directory of Open Access Journals (Sweden)

    S. Pascale

    2010-07-01

    The objectives of spatial planning should include the definition and assessment of possible mitigation strategies regarding the effects of natural hazards on the surrounding territory. Unfortunately, however, there is often a lack of adequate tools to provide the necessary support to the local bodies responsible for land management. This paper deals with the conception, development and validation of an integrated numerical model for assessing systemic vulnerability in complex and urbanized landslide-prone areas. The proposed model considers this vulnerability not as a characteristic of a particular element at risk, but as a peculiarity of a complex territorial system in which the elements are reciprocally linked in a functional way. It is an index of the tendency of a given territorial element to suffer damage (usually of a functional kind) due to its interconnections with other elements of the same territorial system. The innovative nature of this work also lies in the formalization of a procedure based on a network of influences for an adequate assessment of such "systemic" vulnerability.

    This approach can be used to obtain information which is useful, in any given situation of a territory hit by a landslide event, for identifying the element which has suffered the most functional damage, i.e. the most "critical" element, and the element which has the greatest repercussions on other elements of the system and thus a "decisive" role in the management of the emergency.

    This model was developed within a GIS system through the following phases:

    1. the topological characterization of the territorial system studied and the assessment of the scenarios in terms of spatial landslide hazard. A statistical method, based on neural networks, was proposed for the assessment of landslide hazard;

    2. the analysis of the direct consequences of a scenario event on the system;

    3. the definition of the

  13. Application of an Integrated Assessment Model to the Kevin Dome site, Montana

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Minh [Univ. of Wyoming, Laramie, WY (United States); Zhang, Ye [Univ. of Wyoming, Laramie, WY (United States); Carey, James William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stauffer, Philip H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-30

    The objectives of the Integrated Assessment Model are to enable the Fault Swarm algorithm in the National Risk Assessment Partnership (NRAP), ensure that faults are working in the NRAP-IAM tool, calculate hypothetical fault leakage in NRAP-IAM, and compare leakage rates to Eclipse simulations.

  14. Permafrost Degradation Risk Zone Assessment using Simulation Models

    DEFF Research Database (Denmark)

    Daanen, R.P.; Ingeman-Nielsen, Thomas; Marchenko, S.

    2011-01-01

    In this proof-of-concept study we focus on linking large-scale climate and permafrost simulations to small-scale engineering projects, bridging the gap between climate and permafrost sciences on the one hand and, on the other, technical recommendations for adapting planned infrastructure to climate change in a region generally underlain by permafrost. We present the current and future state of permafrost in Greenland as modelled numerically with the GIPL model driven by HIRHAM climate projections up to 2080. We develop a concept called Permafrost Thaw Potential (PTP), defined as the potential active layer increase due to climate warming and surface alterations. PTP is then used in a simple risk assessment procedure useful for engineering applications. The modelling shows that climate warming will result in continuing wide-spread permafrost warming and degradation in Greenland.
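
    The GIPL permafrost model is far more detailed than anything shown here; the toy sketch below only illustrates the kind of quantity PTP represents, using the Stefan equation to compare active-layer thickness under present and warmed thawing-degree-day totals. The soil properties and degree-day values are assumptions.

```python
import math

def active_layer_thickness(ddt_c_days, k_thawed=1.2, vwc=0.3):
    """Stefan-equation estimate of active layer thickness (m).

    ddt_c_days : thawing degree-days at the ground surface (degC*day), assumed
    k_thawed   : thermal conductivity of thawed soil (W/m/K), assumed
    vwc        : volumetric water (ice) content of the soil (-), assumed
    """
    latent = 334_000.0 * 1000.0 * vwc          # volumetric latent heat (J/m^3)
    i_t = ddt_c_days * 86_400.0                # thawing index in degC*s
    return math.sqrt(2.0 * k_thawed * i_t / latent)

alt_now = active_layer_thickness(ddt_c_days=800.0)     # assumed present climate
alt_2080 = active_layer_thickness(ddt_c_days=1100.0)   # assumed warmer projection
print(f"PTP-like increase: {alt_2080 - alt_now:.2f} m "
      f"({alt_now:.2f} m -> {alt_2080:.2f} m)")
```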

  15. Assessment of the Eu migration experiments and their modelling

    International Nuclear Information System (INIS)

    Klotz, D.

    2001-01-01

    The transport of heavy metals by humic acids in groundwater was investigated in laboratory experiments using the lanthanide Eu in the form of ¹⁵²Eu³⁺, which serves both as a model heavy metal and as an indicator for assessing the potential hazards of final repositories for radioactive waste.

  16. An Empirical Study of a Solo Performance Assessment Model

    Science.gov (United States)

    Russell, Brian E.

    2015-01-01

    The purpose of this study was to test a hypothesized model of solo music performance assessment. Specifically, this study investigates the influence of technique and musical expression on perceptions of overall performance quality. The Aural Musical Performance Quality (AMPQ) measure was created to measure overall performance quality, technique,…

  17. A model for assessing the radiological impacts of deep sea disposal of radioactive wastes: development of the model and preliminary results

    International Nuclear Information System (INIS)

    Poulin, M.; Chartier, M.; Durrieu de Madron, X.

    1987-10-01

    A new numerical model has been developed in France to assess the radiological consequences of low-level radioactive waste disposal on the sea bottom of the Atlantic Ocean. It is a box model covering the world ocean, with a finer resolution in the North Atlantic and near the Nuclear Energy Agency dumpsite. The main processes involved in the transfer of nuclides from the drums to man are modelled or parameterized: time variations of the nuclide release, advection and diffusion by the ocean fluid, adsorption-desorption on particles, sedimentation and burial of sediments, and transfers through organisms living in the sea (fishes, crustaceans, molluscs, seaweeds, plankton, ...). The dose equivalent to critical group members and the collective dose equivalent are assessed. A sensitivity analysis of the model has been performed to assess the reliability of the dose calculations. An intercomparison exercise has been carried out with two other independent models on a benchmark problem.
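
    The French model covers the world ocean with many boxes and processes; the sketch below shows only the skeleton of such a box model, i.e., a small linear system of water-exchange and radioactive-decay terms integrated in time. The box volumes, exchange rates, half-life, and release term are invented for illustration.

```python
import numpy as np

# Three-box sketch (dump site, deep Atlantic, surface ocean); concentrations in Bq/m^3.
volumes = np.array([1e13, 1e15, 1e16])        # box volumes (m^3), assumed
decay = np.log(2.0) / (30.0 * 3.15e7)         # decay constant for a ~30-yr nuclide (1/s)

# Q[i, j] = water flow from box i to box j (m^3/s), assumed symmetric for simplicity.
Q = np.array([[0.0, 1e5, 0.0],
              [1e5, 0.0, 1e6],
              [0.0, 1e6, 0.0]])

release = np.array([1e6, 0.0, 0.0])           # source term into the dump-site box (Bq/s)
c = np.zeros(3)
dt = 3.15e7 / 12.0                            # one-month time step (s)

for step in range(12 * 200):                  # integrate 200 years, explicit Euler
    flux_in = Q.T @ c                         # Bq per m^3/s entering each box, times flows
    flux_out = Q.sum(axis=1) * c              # activity leaving each box (Bq/s)
    c = c + dt * ((release + flux_in - flux_out) / volumes - decay * c)

print("concentrations after 200 yr (Bq/m^3):", np.round(c, 4))
```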

  18. A time dependent zonally averaged energy balance model to be incorporated into IMAGE (Integrated Model to Assess the Greenhouse Effect). Collaborative Paper

    International Nuclear Information System (INIS)

    Jonas, M.; Olendrzynski, K.; Elzen, M. den

    1991-10-01

    The Intergovernmental Panel on Climate Change (IPCC) is placing increasing emphasis on the use of time-dependent impact models that are linked with energy-emission accounting frameworks and models that predict in a time-dependent fashion important variables such as atmospheric concentrations of greenhouse gases, surface temperature and precipitation. Integrating these tools (greenhouse gas emission strategies, atmospheric processes, ecological impacts) into what is called an integrated assessment model will assist policymakers in the IPCC and elsewhere to assess the impacts of a wide variety of emission strategies. The Integrated Model to Assess the Greenhouse Effect (IMAGE; developed at RIVM) represents such an integrated assessment model which already calculates historical and future effects of greenhouse gas emissions on global surface temperature, sea level rise and other ecological and socioeconomic impacts. However, to be linked to environmental impact models such as the Global Vegetation Model and the Timber Assessment Model, both of which are under development at RIVM and IIASA, IMAGE needs to be regionalized in terms of temperature and precipitation output. These key parameters will then enable the above environmental impact models to be run in a time-dependent mode. In this paper we lay the scientific and numerical basis for a two-dimensional Energy Balance Model (EBM) to be integrated into the climate module of IMAGE which will ultimately provide scenarios of surface temperature and precipitation, resolved with respect to latitude and height. This paper will deal specifically with temperature; following papers will deal with precipitation. So far, the relatively simple EBM set up in this paper resolves mean annual surface temperatures on a regional scale defined by 10 deg latitude bands. In addition, we can concentrate on the implementation of the EBM into IMAGE, i.e., on the steering mechanism itself. Both reasons justify the time and effort put into
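
    The EBM developed for IMAGE is considerably richer than anything shown here; the following is only a textbook-style sketch of a zonally averaged, Budyko-type energy balance solved on 10° latitude bands, and the coefficient values are standard illustrative numbers, not the paper's.

```python
import numpy as np

# Zonally averaged Budyko-type energy balance on 10-degree latitude bands:
#   S(phi) * (1 - albedo) = A + B*T + gamma*(T - T_mean)
# Coefficient values are generic textbook numbers, assumed for illustration.
lat = np.arange(-85.0, 86.0, 10.0)                     # band centres (deg)
x = np.sin(np.radians(lat))
weights = np.cos(np.radians(lat))
weights = weights / weights.sum()                      # area weights of the bands

S = 340.25 * (1.0 - 0.482 * 0.5 * (3.0 * x**2 - 1.0))  # annual-mean insolation (W/m^2)
A, B, gamma = 203.3, 2.09, 3.8                         # OLR and heat-transport coefficients

T = np.zeros_like(lat)                                 # initial guess (deg C)
for _ in range(200):                                   # fixed-point iteration to equilibrium
    albedo = np.where(T < -10.0, 0.62, 0.30)           # crude ice-albedo switch
    T_mean = np.sum(weights * T)
    T = (S * (1.0 - albedo) - A + gamma * T_mean) / (B + gamma)

for phi, t_band in zip(lat, T):
    print(f"lat {phi:+5.0f}  T = {t_band:6.1f} C")
```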

  19. Assessment of the Suitability of High Resolution Numerical Weather Model Outputs for Hydrological Modelling in Mountainous Cold Regions

    Science.gov (United States)

    Rasouli, K.; Pomeroy, J. W.; Hayashi, M.; Fang, X.; Gutmann, E. D.; Li, Y.

    2017-12-01

    The hydrology of mountainous cold regions has a large spatial variability that is driven both by climate variability and by near-surface process variability associated with complex terrain and patterns of vegetation, soils, and hydrogeology. There is a need to downscale large-scale atmospheric circulations towards the fine scales at which cold regions hydrological processes operate, in order to assess their spatial variability in complex terrain and quantify uncertainties by comparison to field observations. In this research, three high resolution numerical weather prediction models, namely the Intermediate Complexity Atmospheric Research (ICAR), Weather Research and Forecasting (WRF), and Global Environmental Multiscale (GEM) models, are used to represent spatial and temporal patterns of atmospheric conditions appropriate for hydrological modelling. An area covering the high mountains and foothills of the Canadian Rockies was selected to assess and compare high resolution ICAR (1 km × 1 km), WRF (4 km × 4 km), and GEM (2.5 km × 2.5 km) model outputs with station-based meteorological measurements. ICAR, with its very low computational cost, was run with different initial and boundary conditions and with finer spatial resolution, which allowed an assessment of modelling uncertainty and scaling that was difficult with WRF. Results show that ICAR, when compared with WRF and GEM, performs very well in precipitation and air temperature modelling in the Canadian Rockies, while all three models show a fair performance in simulating wind and humidity fields. The representation of local-scale atmospheric dynamics leading to realistic fields of temperature and precipitation by ICAR, WRF, and GEM makes these models suitable for high resolution cold regions hydrological predictions in complex terrain, which is a key factor in estimating water security in western Canada.
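
    A minimal sketch of the kind of station-based comparison described above is given below, computing bias and RMSE of modelled air temperature against observations; the numbers and the per-model offsets are invented for illustration.

```python
import numpy as np

def bias_rmse(model, obs):
    """Return (bias, RMSE) of model values against station observations."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    diff = model - obs
    return diff.mean(), np.sqrt(np.mean(diff**2))

# Hypothetical daily air temperatures (deg C) at one station for three model runs.
obs = np.array([-8.2, -6.5, -3.1, 0.4, 2.2, 5.6, 7.9])
runs = {"ICAR": obs + np.array([0.5, -0.2, 0.1, 0.8, -0.4, 0.3, 0.6]),
        "WRF": obs + 1.2,
        "GEM": obs - 0.9}

for name, sim in runs.items():
    b, r = bias_rmse(sim, obs)
    print(f"{name:4s}  bias = {b:+.2f} C  RMSE = {r:.2f} C")
```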

  20. Source-term development for a contaminant plume for use by multimedia risk assessment models

    International Nuclear Information System (INIS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    1999-01-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment with this class of analytical tool.
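
    None of the four codes is reproduced here; the sketch below only illustrates the class of analytical transport solution such tools rely on, using the Ogata-Banks solution for one-dimensional advection-dispersion from a continuous source. All parameter values are assumptions.

```python
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0):
    """1-D advection-dispersion, continuous source at x=0 (Ogata-Banks solution)."""
    arg1 = (x - v * t) / (2.0 * np.sqrt(D * t))
    arg2 = (x + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * c0 * (erfc(arg1) + np.exp(v * x / D) * erfc(arg2))

# Assumed groundwater velocity (m/d), dispersion coefficient (m^2/d), and source
# concentration (Bq/L); decay and sorption are omitted in this conservative sketch.
v, D, c0 = 0.05, 0.5, 100.0
t = 10.0 * 365.25                # 10 years, in days
for x in (10.0, 50.0, 100.0, 200.0):
    print(f"x = {x:5.0f} m   C = {ogata_banks(x, t, v, D, c0):8.3f} Bq/L")
```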